KR20170022334A - Digital device and method of processing data the same - Google Patents
Digital device and method of processing data the same
- Publication number
- KR20170022334A
- Authority
- KR
- South Korea
- Prior art keywords
- image frame
- data
- tile
- unit
- image
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/005—General purpose rendering architectures
Landscapes
- Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
Description
The present invention relates to a digital device, and more particularly, to the processing of data in a digital device.
Digital devices now include mobile devices such as smartphones, tablet PCs, and wearable devices, in addition to standing devices such as personal computers (PCs) and televisions (TVs). Conventionally, fixed devices and mobile devices developed separately, each in its own area. However, with the recent boom in digital convergence, the boundaries established between them are becoming blurred.
Recently, portable devices such as smartphones and tablet PCs have grown in popularity, leading to a significant increase in the number of users and the usage time of mobile devices. Accordingly, there is an increasing need not only for simple web surfing but also for high-quality multimedia content on mobile devices. Although device performance is improving faster than before, services can still fail to satisfy users because of constraints such as the device's hardware capacity, power, and temperature. Since there is a limit to improving the hardware performance of a device, another approach is needed to solve this problem.
In this specification, a digital device and a method for processing data in the digital device are disclosed to address the above problems and needs.
One object of the present invention is to provide a digital device and method that maintain or improve the quality of service (QoS) of application data without requiring hardware performance improvements or additional components in the processing unit or its associated linkage device(s).
Another object of the present invention is to maintain or improve performance such as QoS by processing application data in software, while also improving the overall environment of the device itself, such as hardware performance, power, and temperature.
Still another object of the present invention is to minimize the cost increase caused by hardware by maintaining or improving the processing performance of application data in software, regardless of hardware performance improvements or added components, thereby improving user satisfaction with the device and encouraging the desire to purchase it.
The technical problems to be solved by the present invention are not limited to those described above, and other technical problems not mentioned herein will be clearly understood by those skilled in the art from the following description.
Various embodiment(s) of digital devices and of methods of processing application data in the digital devices are disclosed herein.
A method for processing application data in a digital device according to an embodiment of the present invention includes receiving application data, setting a tile rendering sequence for image frames of the application data, rendering each tile of a first image frame according to the set tile rendering sequence, generating an interpolated image frame by compositing, based on tile indices, the tile data of some tiles rendered among all the tiles of the first image frame with a previously rendered image frame, and displaying the previously rendered image frame, the interpolated image frame, and the first image frame.
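The sequence of steps in this embodiment can be sketched, purely for illustration, in Python. Every name below (render_tile, process_frame, the 2x2 tile grid, the threshold value) is a hypothetical stand-in chosen for this sketch, not the patent's actual implementation:

```python
# Illustrative sketch of the claimed steps: render the tiles of Frame N+1 in a
# set order, and once a threshold number of tiles is ready, composite them over
# the previous frame (Frame N) to form an interpolated frame.

def render_tile(frame_id, tile_index):
    """Stand-in for the GPU tile renderer: returns the tile's 'pixel data'."""
    return f"frame{frame_id}-tile{tile_index}"

def process_frame(prev_frame, frame_id, order, threshold):
    """Render tiles of `frame_id` in `order`; snapshot an interpolated frame
    (fresh tiles over `prev_frame`) once `threshold` tiles are rendered."""
    tiles = dict(prev_frame)            # start from the previous frame's tiles
    interpolated = None
    for count, idx in enumerate(order, start=1):
        tiles[idx] = render_tile(frame_id, idx)
        if count == threshold:          # enough tiles: snapshot Frame N+0.5
            interpolated = dict(tiles)
    return interpolated, tiles          # interpolated frame and full Frame N+1

frame_n = {i: f"frame0-tile{i}" for i in range(4)}   # Frame N, 2x2 tile grid
order = [0, 1, 2, 3]                                  # scan-line tile sequence
mid, frame_n1 = process_frame(frame_n, 1, order, threshold=2)
```

Here the interpolated frame `mid` mixes two freshly rendered tiles of Frame N+1 with two tiles carried over from Frame N, which is the core of the compositing step described in the claim.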
A digital device according to an embodiment of the present invention includes a receiving unit for receiving application data, an image processing unit for rendering each tile of a first image frame according to a set tile rendering sequence, a control unit configured to set the tile rendering sequence for image frames of the application data and to generate an interpolated image frame by compositing, based on tile indices, the tile data of the rendered tiles among all the tiles of the first image frame with a previously rendered image frame, and a display unit for displaying the previously rendered image frame, the interpolated image frame, and the first image frame.
The technical solutions obtained by the present invention are not limited to those mentioned above, and other solutions not mentioned will be clearly understood by those skilled in the art from the following description.
The effects of the present invention are as follows.
According to one of the various embodiments of the present invention, the quality of service (QoS) of application data can be maintained or improved without hardware performance enhancements or added components in the processor that processes the application data or in its linkage device(s).
According to another embodiment of the present invention, application data can be processed in software to improve the overall environment of the device itself, such as hardware performance, power, and temperature, while also maintaining or improving performance such as QoS.
According to still another of the various embodiments of the present invention, the cost burden of additional hardware can be minimized by maintaining or improving the processing performance of application data in software, regardless of hardware performance improvements or added components, thereby improving user satisfaction with the device and enhancing the desire to purchase it.
The effects obtained by the present invention are not limited to those mentioned above, and other effects not mentioned will be clearly understood by those skilled in the art from the following description.
1 schematically illustrates a service system according to an embodiment of the present invention;
2 is a block diagram illustrating a digital device according to one embodiment of the present invention;
FIG. 3 is a block diagram showing another configuration or a detailed configuration of FIG. 2;
4 is a block diagram illustrating an external device according to one embodiment of the present invention;
5 is a block diagram illustrating a digital device or external device according to another embodiment of the present invention;
6 illustrates control means for digital device control according to an embodiment of the present invention;
7 is a diagram illustrating a method of processing application data in a digital device according to an embodiment of the present invention;
8 is a block diagram of an image processing configuration of a digital device for processing application data according to an embodiment of the present invention;
FIG. 9 is a diagram for explaining a processing procedure between the
FIG. 10 is a diagram illustrating a tile rendering sequence according to an embodiment of the present invention;
11 is a view for explaining an intermediate image frame synthesis method according to an embodiment of the present invention;
12 is a block diagram of an image processing configuration of a digital device for processing application data according to another embodiment of the present invention;
13 is a diagram illustrating a method of processing image frames according to a tile rendering sequence according to an embodiment of the present invention;
14 is a diagram illustrating an interpolated image frame generated using previously processed tile(s) of a corresponding image frame (Frame N+1) and a previous image frame (Frame N) according to an embodiment of the present invention;
15 is a diagram for explaining an image frame interpolation method according to an embodiment of the present invention;
16 is a view for explaining an image frame interpolation method according to another embodiment of the present invention;
17 is a diagram illustrating the determination of the number of image frames to be interpolated between an image frame (Frame N) and an image frame (Frame N + 1) based on motion prediction data in accordance with an embodiment of the present invention; And
18 is a flowchart illustrating a method of processing data in a digital device according to an embodiment of the present invention.
Hereinafter, various embodiments (s) of a digital device according to the present invention and a method of processing image data in the digital device will be described with reference to the drawings.
The suffixes "module" and "unit" for components used in this specification are given only for ease of description, and the two may be used interchangeably as needed. Also, even when components are described with ordinal numbers such as "first" and "second", they are not limited by such terms or ordinals. The terms used in this specification have been selected from general terms widely used at present, in consideration of their functions according to the technical idea of the present invention, but they may vary depending on the intentions or customs of those skilled in the art or the emergence of new technology. In certain cases, some terms have been arbitrarily selected by the applicant, and their meanings are described in the relevant description sections. Accordingly, each term should be interpreted based not merely on its name but on its practical meaning and on the contents described throughout this specification. The contents of this specification and/or the drawings are not intended to limit the scope of the present invention.
As used herein, a "digital device" includes any device that performs at least one of receiving, processing, and outputting data, content, services, applications, and the like. The digital device may stream or download content, or information about content, through a server such as a broadcasting station or through an external input, and can transmit/receive data, including the content, to and from the server through a wired/wireless network. The digital device may be either a standing (fixed) device or a mobile (handheld) device. Fixed devices include a network TV, an HbbTV, a smart TV, an IPTV, a PC, and the like. Mobile devices include a personal digital assistant (PDA), a smartphone, a tablet PC, a notebook, a digital broadcasting terminal, a portable multimedia player (PMP), a navigation device, a slate PC, an ultrabook, and wearable devices (e.g., a smart watch, smart glasses, a head mounted display (HMD), etc.). FIGS. 2 and 3 show a digital TV as one example of a fixed device, and FIGS. 4 and 5 show a mobile terminal and a wearable device (for example, a smart watch) as examples of mobile digital devices; these are described in detail in the corresponding sections. When the digital device is a fixed device, it may take the form of a signage consisting only of a display panel, or it may be configured as a set (SET) together with another component, for example a set-top box (STB).
The wired/wireless network collectively refers to networks that support connection or pairing between the server and the digital device, data communication, and the like, including networks currently supported or to be supported in the future, together with the hardware and/or software for them. The wired/wireless network may support one or more communication protocols for data communication. Such wired/wireless networks include, for wired connection, standards or protocols such as Universal Serial Bus (USB), Composite Video Banking Sync (CVBS), component, S-video (analog), Digital Visual Interface (DVI), High Definition Multimedia Interface (HDMI), RGB, and D-SUB, and, for wireless connection, standards or protocols such as Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Digital Living Network Alliance (DLNA), Wireless LAN (WLAN, Wi-Fi), Wi-Fi Direct, Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and Long Term Evolution/LTE-Advanced (LTE/LTE-A).
On the other hand, a digital device may use a general-purpose operating system (OS), a Web OS, or the like, and can add, delete, amend, and update various services or applications on a general-purpose OS kernel or a Linux kernel, thereby providing a more user-friendly environment.
1 is a schematic diagram illustrating a service system according to an embodiment of the present invention.
Referring to FIG. 1, a service system may be basically implemented including a
2 is a block diagram illustrating a digital TV according to an embodiment of the present invention.
The
The
The TCP /
The
The
The
The audio /
The application manager includes, for example, a
The
The
The
The
The
The SI &
Meanwhile, the
FIG. 3 is a block diagram of another configuration or detailed configuration of FIG. 2;
3A, the digital TV includes a
The
The
The external
The A/V input/output section may include a USB terminal, a CVBS (Composite Video Banking Sync) terminal, a component terminal, an S-video terminal (analog), a DVI (Digital Visual Interface) terminal, an HDMI (High Definition Multimedia Interface) terminal, an RGB terminal, a D-SUB terminal, and the like.
The wireless communication unit can perform short-range wireless communication with another digital device. Digital TVs are used for communication such as, for example, Bluetooth, Radio Frequency Identification (RFID), infrared data association (IrDA), Ultra Wideband (UWB), ZigBee, DLNA (Digital Living Network Alliance) Depending on the protocol, it can be networked with other digital devices.
Also, the external
The
The
The user
The
Although not shown in the figure, the digital TV may further include a channel browsing processing unit for generating a channel signal or a thumbnail image corresponding to an external input signal. The channel browsing processing unit receives a stream signal TS output from the
The
In order to detect the user's gesture, a sensing unit (not shown) having at least one of a touch sensor, a voice sensor, a position sensor, and an operation sensor may be further included in the digital TV, as described above. The signal sensed by the sensing unit may be transmitted to the
The
The
In addition, the digital TV according to the present invention may further include configurations not shown, or may omit some of the configurations shown, as needed. On the other hand, the digital TV may not have a tuner and a demodulator, and may instead receive and reproduce content through the network interface unit or the external device interface unit.
3B, an example of the control unit includes a
The
The image processing unit performs image processing of the demultiplexed video signal. To this end, the image processing unit may include a
The
The
A frame rate converter (FRC) 380 converts a frame rate of an input image. For example, the frame
The
On the other hand, the voice processing unit (not shown) in the control unit can perform voice processing of the demultiplexed voice signal. Such a voice processing unit can support processing of various audio formats. For example, even when a voice signal is encoded in a format such as MPEG-2, MPEG-4, AAC, HE-AAC, AC-3, or BSAC, a corresponding decoder can be provided. Further, the audio processing unit in the control unit can process bass, treble, volume control, and the like. A data processing unit (not shown) in the control unit can perform data processing of the demultiplexed data signal. For example, the data processing unit can decode the demultiplexed data signal even when it is encoded. Here, the encoded data signal may be EPG information including broadcast information such as the start time and end time of a broadcast program broadcast on each channel.
Meanwhile, the above-described digital TV is an example according to the present invention, and each component can be integrated, added, or omitted according to specifications of a digital TV actually implemented. That is, if necessary, two or more components may be combined into one component, or one component may be divided into two or more components. In addition, the functions performed in each block are intended to illustrate the embodiments of the present invention, and the specific operations and devices thereof do not limit the scope of rights of the present invention. On the other hand, the digital TV may be a video signal processing device that performs signal processing of an image stored in the device or an input image. Other examples of the video signal processing device include a set-top box (STB) excluding the
4 is a block diagram illustrating a mobile terminal according to an embodiment of the present invention.
4, the
The
The
The
The
The short-
The
The A /
The image frame processed by the
The
The user input unit 430 generates input data for the user's operation control of the terminal. The user input unit 430 may include a key pad, a dome switch, a touch pad (static / static), a jog wheel, a jog switch, and the like.
The
The
The
The
Some of these displays may be transparent or light transmissive so that they can be seen through. This can be referred to as a transparent display, and a typical example of the transparent display is TOLED (Transparent OLED) and the like. The rear structure of the
There may be two or
The
The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the
If there is a touch input to the touch sensor, the corresponding signal (s) is sent to the touch controller. The touch controller processes the signal (s) and transmits the corresponding data to the
A
Examples of the proximity sensor include a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is electrostatic, it is configured to detect the proximity of the pointer by the change of the electric field according to the pointer's approach. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.
Hereinafter, for convenience of explanation, the act of recognizing that the pointer is positioned over the touch screen without contacting it is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position at which the pointer is proximity-touched on the touch screen means the position at which the pointer corresponds vertically to the touch screen when the proximity touch is made.
The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.
The
The
The
The
The
The
The identification module is a chip for storing various information for authenticating the use right of the
When the
The
The
The various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.
According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. In some cases, such embodiments may be implemented by the
According to a software implementation, embodiments such as the procedures and functions described herein may be implemented with separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. Software code may be implemented in a software application written in a suitable programming language. Here, the software code is stored in the
Meanwhile, a wearable device that can be worn on the body, going beyond devices that the user mainly holds in the hand, can operate or function as a digital device or an external device in this specification. Such wearable devices include a smart watch, smart glasses, and a head mounted display (HMD).
As shown in Fig. 1, the wearable device can mutually exchange (or interlock) data with another device. The short
5 is a block diagram illustrating a digital device or external device in accordance with another embodiment of the present invention.
5, a watch-type mobile terminal, that is, a
The
The
A
The
The
On the other hand, the
The
6 is a diagram illustrating control means for digital device control according to an embodiment of the present invention.
A front panel (not shown) or a control means (input means) provided on the
The control means includes a
The input means may employ, as needed, at least one of communication protocols such as Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and Digital Living Network Alliance (DLNA) to communicate with the digital device.
The
The
Since the
The control means such as the
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Reference will now be made in detail to various embodiments (s) of processing data in a digital device in accordance with the present invention with reference to the accompanying drawings.
In processing application data and the like, the hardware performance, power consumption, temperature, and the like of the digital device impose limitations. These limitations become more evident as the demand for high-end applications keeps increasing. In such a situation, considering cost and the like, simply purchasing a higher-performance digital device is unlikely to be a fundamental solution.
To cope with this problem, a digital device may reduce quality of service (QoS), for example by skipping some image frames when processing application data such as the image frames of an application. In that case, however, flickering or lagging occurs when the application is played, which inevitably degrades the overall playback quality of the application.
In the present specification, various embodiment(s) are described of a method for maintaining or improving application quality through software processing rather than hardware performance improvement, component addition, or image frame skipping.
In the present invention, a Frame Rate Up-Conversion (FRUC) method is described as an example. The FRUC method can be used in a graphics processing unit (GPU), a central processing unit (CPU), video decoding, image signal processing (ISP), or the like for processing or displaying the image frames of a digital device.
The FRUC method according to the present invention may use at least one of a rendering method using motion vectors and a tile rendering method. In the former, the motion of an object in an image frame is estimated and a new image frame is generated using that motion. The latter, the tile rendering method, processes an image frame in units of tiles and uses the processed tile data.
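The motion-vector branch mentioned above can be illustrated with a deliberately tiny sketch: estimate an object's displacement between two frames, then place it halfway to synthesize the in-between frame. This is our own toy example (1-D "frames", a single rigid object); real FRUC estimates per-block motion vectors over 2-D images.

```python
# Minimal sketch of motion-vector-based frame interpolation: find the object
# in Frame N and Frame N+1, take the displacement as the motion vector, and
# synthesize Frame N+0.5 with the object moved half of that vector.

def find_object(frame, obj):
    """Return the position of `obj` (a pixel pattern) in a 1-D frame."""
    for pos in range(len(frame) - len(obj) + 1):
        if frame[pos:pos + len(obj)] == obj:
            return pos
    raise ValueError("object not found")

def interpolate_midframe(frame_n, frame_n1, obj, background=0):
    """Synthesize Frame N+0.5 by moving `obj` half of its motion vector."""
    p0, p1 = find_object(frame_n, obj), find_object(frame_n1, obj)
    motion = p1 - p0                    # estimated motion vector
    mid_pos = p0 + motion // 2          # halfway position
    mid = [background] * len(frame_n)
    mid[mid_pos:mid_pos + len(obj)] = obj
    return mid

obj = [9, 9]
frame_n  = [9, 9, 0, 0, 0, 0]           # object at position 0
frame_n1 = [0, 0, 0, 0, 9, 9]           # object moved to position 4
mid = interpolate_midframe(frame_n, frame_n1, obj)
```

The specification's focus, however, is the tile rendering branch, described next.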
Hereinafter, for understanding of the present invention and for convenience of explanation, the digital device is assumed to be a mobile device and the tile rendering method is taken as the example of the FRUC method, but the present invention is not limited thereto.
FIG. 7 is a view for explaining a tile rendering method according to the present invention.
FIG. 7A illustrates a
FIG. 8 is a block diagram showing an image processing configuration of a digital device for processing application data according to an embodiment of the present invention. FIG. 9 shows a processing procedure between the
The power consumption of the image processing unit rises and falls with its workload, which depends on the frame rate at which the application's image frames are processed. In other words, as the frame rate increases, the power consumption of the image processing unit increases. This is an important problem in devices sensitive to power consumption, such as mobile devices, and it is a constraint because hardware performance, power consumption, temperature, and the like must all be considered. Insufficient hardware performance is a cause of flickering or lagging. However, as described above, merely improving the hardware performance of the device is not a solution, and the problem may remain unsolved for other reasons.
If the digital device can provide visual quality at a level that satisfies the user, then the lower the frame rate, the smaller the workload of the image processing unit and the lower the power consumed. In other words, as described above, QoS can be degraded by quality reduction, flickering, or lagging caused by insufficient hardware rendering performance, while improving the hardware performance of the image processing unit increases power consumption. The present invention therefore sets the hardware side aside and aims to improve the user's perceived visual quality through the FRUC method, that is, the tile rendering method.
For example, when the frame rate is low, flickering often occurs in the output image. In this case, flickering can be mitigated to some extent by inserting an intermediate image frame between consecutive image frames.
As described above, the present invention uses a tile rendering method. Here, the tile rendering method refers to dividing an image frame into a predetermined number of tiles and rendering it tile by tile, instead of rendering the entire image frame at once. Such tile rendering can increase the efficiency of memory bandwidth and data caching. In the tile rendering method, the final pixels of each tile are rendered inside the image processing unit and are then stored in a frame buffer allocated in external memory (for example, DRAM).
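The tile buffer/frame buffer flow just described can be sketched as follows. The grid size, tile size, and the `render_pixel` shading stand-in are our own illustrative choices, not values from the patent:

```python
# Sketch of tile rendering: the frame is divided into a grid of tiles, each
# tile is rendered into a small on-chip tile buffer, and the finished tile is
# then flushed to the frame buffer in external memory.

TILE = 2                                      # 2x2-pixel tiles (illustrative)

def render_pixel(x, y):
    return x + 10 * y                         # placeholder shading function

def render_frame_tiled(width, height):
    frame_buffer = [[0] * width for _ in range(height)]
    for ty in range(0, height, TILE):         # iterate tiles in scan order
        for tx in range(0, width, TILE):
            # render the whole tile into the (on-chip) tile buffer
            tile_buffer = [[render_pixel(tx + x, ty + y) for x in range(TILE)]
                           for y in range(TILE)]
            # flush the finished tile to the (external) frame buffer
            for y in range(TILE):
                frame_buffer[ty + y][tx:tx + TILE] = tile_buffer[y]
    return frame_buffer

fb = render_frame_tiled(4, 4)
```

Because each tile is completed before it is written out, memory traffic happens in compact tile-sized bursts, which is the bandwidth/caching benefit noted above.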
The FRUC method, or tile rendering method, according to the present invention basically interpolates or inserts (hereinafter, "interpolation") one or more newly created image frames, generated from tile-rendered data, between an image frame (Frame N) and the next image frame (Frame N+1). In the image frame interpolation process, motion vectors, edge detection, application profile data, FRUC management data, and the like may be used during tile rendering of the image frames. For convenience, the following description takes the use of application profile data or FRUC management data as an example.
FIGS. 8 and 9 show only the processing configurations necessary for rendering image frames in the digital device, particularly in the context of the present invention. Depending on the system, some of the configurations shown may be omitted, merged with other configurations, or modularized; conversely, configuration(s) not shown may be further added.
8, the image processing unit of the digital device includes a
The
The
The tile
FIG. 10A shows a scan-line order, FIG. 10B a Z-order, FIG. 10C a spiral order, and FIG. 10D an eye-tracking order, as examples of tile rendering sequences. Although four tile rendering sequences are illustrated in FIG. 10, the present invention is not limited to these, and other sequences may be used for rendering tiles.
Meanwhile, all the image frames in one application may be processed according to any one of the tile rendering sequences of FIGS. 10A to 10D, or a different tile rendering sequence may be applied per image frame. In other words, in processing the image frames of an application, one or more tile rendering sequences may be applied.
The tile rendering sequence can follow the scan-line order of the image frame. Referring to FIG. 10A, the tiles of an image frame may be rendered row by row from the top row to the bottom row, and within each row from the left column to the right column.
The tile rendering sequence may render the tiles of the image frame into a Z-order. Referring to FIG. 10B, the tiles belonging to the first and second rows and the tiles belonging to the first row and the second row (
The tile rendering sequence may follow a spiral order, for example with respect to a particular point or tile of the image frame, as shown in FIG. 10C.
The tile rendering sequence may also follow the user's gaze point or tile sensed by an eye-tracking sensor, for example, as shown in FIG. 10D.
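Three of the sequences above (scan-line, Z-order, spiral) can be generated programmatically; the eye-tracking order depends on live sensor input and is omitted. This is a sketch on an NxN tile grid under our own conventions (Z-order via Morton bit interleaving, spiral built outside-in and then reversed to run center-outward); the device's exact orders may differ:

```python
# Illustrative generators for the tile rendering sequences of FIG. 10,
# returning (row, col) tile indices on an n x n grid.

def scanline_order(n):
    return [(r, c) for r in range(n) for c in range(n)]

def z_order(n):
    def key(rc):                        # interleave row/column bits (Morton)
        r, c = rc
        k = 0
        for bit in range(n.bit_length()):
            k |= ((c >> bit) & 1) << (2 * bit)
            k |= ((r >> bit) & 1) << (2 * bit + 1)
        return k
    return sorted(scanline_order(n), key=key)

def spiral_order(n):
    # build a clockwise spiral from the outer ring inward, then reverse it
    # so rendering proceeds from the center tile outward (as in FIG. 10C)
    top, bottom, left, right, out = 0, n - 1, 0, n - 1, []
    while top <= bottom and left <= right:
        out += [(top, c) for c in range(left, right + 1)]
        out += [(r, right) for r in range(top + 1, bottom + 1)]
        if top < bottom:
            out += [(bottom, c) for c in range(right - 1, left - 1, -1)]
        if left < right:
            out += [(r, left) for r in range(bottom - 1, top, -1)]
        top, bottom, left, right = top + 1, bottom - 1, left + 1, right - 1
    return out[::-1]
```

For example, `z_order(4)` visits the top-left 2x2 quadrant's tiles before moving to the top-right quadrant, matching the recursive "Z" pattern of FIG. 10B.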
As described above, when the tile
The combining
The
12 is a block diagram of an image processing configuration of a digital device for processing application data according to another embodiment of the present invention.
Fig. 12 is a block diagram of a digital device for rendering an image frame in addition to the edge detection method in the digital device of Fig. 8 described above.
Hereinafter, the description focuses on the parts that differ from FIGS. 8 to 11; for overlapping content, the above description applies.
The digital device of FIG. 12 further includes an edge
The edge
The
The
The
The tile
The tile rendering
The tile rendering
The GPU continuously stores, in the tile buffer, the tile data rendered according to the tile rendering priority for the image frame (Frame N+1). When rendering is completed for all the tiles of the image frame (Frame N+1), the tile buffer transfers the frame to the frame buffer, where it is stored.
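One plausible way to derive the tile rendering priority from edge-detection data is to render edge-dense tiles first. The assumption that edge density is the ranking criterion, the 2x2 tiles, and the difference threshold of 8 are all our own illustrative choices, not details stated by the patent:

```python
# Sketch: rank tiles for rendering by a simple edge measure, so tiles with
# more detected edges come first in the rendering priority.

def edge_count(tile):
    """Count horizontal neighbor pairs whose difference exceeds a threshold."""
    edges = 0
    for row in tile:
        edges += sum(1 for a, b in zip(row, row[1:]) if abs(a - b) > 8)
    return edges

def tile_priority(tiles):
    """Return tile indices ordered by descending edge count (ties: by index)."""
    scored = [(edge_count(t), i) for i, t in enumerate(tiles)]
    return [i for score, i in sorted(scored, key=lambda s: (-s[0], s[1]))]

flat   = [[5, 5], [5, 5]]               # no edges
edgy   = [[0, 90], [90, 0]]             # two strong edges
medium = [[0, 20], [5, 5]]              # one edge
priority = tile_priority([flat, edgy, medium])
```

With this ordering, the tiles most likely to matter visually are available earliest, which in turn lets an interpolated frame composited from partial tile data carry the most informative content.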
If the tile data of the image frame (Frame N + 1) stored in the tile buffer exceeds a predetermined threshold value, the combining
Meanwhile, all of the above-described configurations can be controlled by the
In FIG. 12, the
13 is a view for explaining a method of processing image frames according to a tile rendering sequence according to an embodiment of the present invention.
FIG. 13A shows the case of a normal display (30 fps), and FIG. 13C shows the case of the HFF display (60 fps) according to the present invention. Meanwhile, FIG. 13 is described taking the Z-order tile rendering sequence of FIG. 10B as an example.
Referring to FIG. 13A, after rendering of all tiles of an image frame (Frame N) is completed, the tiles of the next image frame (Frame N + 1) are rendered. In this case, as described above, the tiles of the image frame are rendered in Z-order.
When tile data is temporarily stored in the tile buffer as the GPU renders the tiles of the image frame (Frame N + 1) in the Z-order sequence, the controller accesses the tile buffer and determines whether the number of stored tile data items of the image frame (Frame N + 1) exceeds a predetermined threshold value. If the number of tile data items stored in the tile buffer exceeds the threshold value, the control unit controls the combining unit to synthesize a new image frame (Frame N + 0.5).
The combining unit synthesizes the image frame (Frame N + 0.5) under the synthesis control of the controller, as shown in FIG. 13C.
The thus synthesized image frame (Frame N + 0.5) is displayed between the image frame (Frame N) and the image frame (Frame N + 1).
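A minimal sketch of this threshold check and composition, assuming tiles are held in a mapping keyed by tile index and the threshold is a tile count (both assumptions, not details fixed by the specification):

```python
def compose_interpolated_frame(prev_frame, tile_buffer, threshold):
    """Synthesize Frame N+0.5 once enough tiles of Frame N+1 are rendered.

    prev_frame:  {tile_index: tile_data} for the completed Frame N
    tile_buffer: {tile_index: tile_data} for tiles of Frame N+1
                 rendered so far
    Returns the composed frame, or None while below the threshold.
    """
    if len(tile_buffer) <= threshold:
        return None  # not enough new tiles yet; keep rendering
    # Start from the previous frame and overwrite, per tile index,
    # every tile already rendered for the next frame.
    composed = dict(prev_frame)
    composed.update(tile_buffer)
    return composed

frame_n = {i: f"N:{i}" for i in range(16)}
partial = {i: f"N+1:{i}" for i in range(10)}  # 10 of 16 tiles rendered
frame = compose_interpolated_frame(frame_n, partial, threshold=8)
print(frame[0], frame[15])  # N+1:0 N:15
```

The composed frame thus mixes the freshest available tiles of Frame N + 1 with the remaining tiles of Frame N, matched by tile index.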
Meanwhile, the threshold value need not be the same for every image frame. For example, as described above, when motion of an object is detected in a specific image frame with reference to the object motion prediction data, image frame synthesis may be performed according to a threshold value different from that of other image frames. The threshold value may be set, for example, by the control unit. For similar reasons, not only the threshold but also the tile rendering sequence may be changed from one image frame to another.
FIG. 14 is a diagram illustrating generation of an interpolated image frame using previously processed tile(s) of the current image frame (Frame N + 1) and the previous image frame (Frame N) according to an embodiment of the present invention.
Every image frame is subdivided into a plurality of tiles, and each processed tile is stored in the buffer. When the number of processed tiles of the (N + 1)-th image frame reaches a system-defined threshold value, they are combined with the data of the N-th frame, without waiting for rendering of the (N + 1)-th image frame to complete.
At the time of composition, when the inter-frame change between tile(s) of the same index exceeds a predetermined ratio, composition for that tile may be skipped. This is to reduce artifacts due to rapid lighting and motion changes, for example.
In addition, when tiles are composited, seam artifacts, such as a visible cut along the tile boundary caused by an abrupt difference in color or motion, may occur as shown in FIG. 14A. To mitigate such seam artifacts, various interpolation functions, such as the saw-tooth composition method of FIG. 14B and the blending method of FIG. 14C, can be supported and selectively enabled or disabled according to adaptive information.
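One way to realize such blending at a tile boundary (a sketch of the general idea, not the specific saw-tooth or blending functions of FIGS. 14B and 14C) is to cross-fade pixel values over a narrow band around the seam:

```python
def blend_seam(left_tile_row, right_tile_row, band=4):
    """Cross-fade two horizontally adjacent rows of pixel values
    over `band` pixels on each side of their shared boundary to
    soften seam artifacts. Pixels are plain grayscale floats here.
    """
    out = left_tile_row + right_tile_row
    width = len(left_tile_row)
    for i in range(band):
        # weight goes from ~0.5 at the seam toward 1.0 farther away
        w = 0.5 + 0.5 * (i + 1) / (band + 1)
        l_idx = width - 1 - i          # pixel in the left tile
        r_idx = width + i              # mirrored pixel in the right tile
        l, r = out[l_idx], out[r_idx]
        out[l_idx] = w * l + (1 - w) * r
        out[r_idx] = w * r + (1 - w) * l
    return out

row = blend_seam([1.0] * 8, [0.0] * 8, band=2)
# values near the seam move toward each other; far pixels stay untouched
print(row[6], row[7], row[8], row[9])
```

Adaptive information could then choose the band width, or disable blending entirely for tiles where the inter-frame change is small.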
FIG. 15 is a view for explaining an image frame interpolation method according to an embodiment of the present invention.
FIG. 15 illustrates how power consumption can be reduced through image frame interpolation according to the present invention. For example, if the frame rate of the image processing unit (GPU) exceeds 60 fps under the V-sync limit, as shown in FIG. 15A, the digital device can reduce power consumption as follows.
As shown in FIG. 15B, by reducing the rendering rate of the image processing unit to 30 fps and then raising the effective frame rate through interpolation of image frames according to the method disclosed in this specification, the power consumption of the image processing unit can be reduced.
FIG. 16 is a view for explaining an image frame interpolation method according to another embodiment of the present invention.
Unlike FIG. 15 described above, FIG. 16 illustrates improving QoS through image frame interpolation according to the present invention. For example, the digital device can improve QoS as described above when the frame rate of the image processing unit is below 60 fps under the V-sync limit, as shown in FIG. 16A.
This improves QoS by interpolating the image frame(s) synthesized according to the present invention between image frames rendered at less than 60 fps, through a configuration such as that of FIG. 8 or FIG. 12.
In the present specification, 60 fps, 30 fps, and the like are illustrated for convenience of explanation, and the present invention is not limited by these numerical values.
FIG. 17 is a diagram for explaining the determination of the number of image frames to be interpolated between an image frame (Frame N) and an image frame (Frame N + 1) based on motion prediction data according to an embodiment of the present invention.
FIG. 17A shows a case where there is one image frame 1730 to be interpolated between an image frame (Frame N) 1710 and an image frame (Frame N + 1) 1720.
FIG. 17B shows a case where there are two image frames 1740 and 1750 to be interpolated between the image frame (Frame N) 1710 and the image frame (Frame N + 1) 1720.
In the above, the number of image frames to be interpolated may be preset to a constant value, or it may vary from one image frame to the next. That is, the number of image frames to be interpolated can be determined adaptively.
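As a hedged sketch of such adaptive determination, the mapping from predicted motion magnitude to frame count below is an assumption; the specification only says the count may vary per frame:

```python
def frames_to_interpolate(motion_magnitude, max_frames=2):
    """Pick how many frames to interpolate between Frame N and
    Frame N+1 from a predicted motion magnitude in [0.0, 1.0].
    More motion -> more intermediate frames, up to max_frames
    (cf. FIG. 17A with one frame, FIG. 17B with two).
    """
    if motion_magnitude < 0.1:
        return 0              # nearly static: no interpolation needed
    scaled = round(motion_magnitude * max_frames)
    return min(max(scaled, 1), max_frames)

print(frames_to_interpolate(0.05))  # 0
print(frames_to_interpolate(0.4))   # 1
print(frames_to_interpolate(0.9))   # 2
```

The motion magnitude itself could come from the object motion prediction data mentioned above.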
FIG. 18 is a flowchart illustrating a method of processing data in a digital device according to an embodiment of the present invention.
The digital device receives the application data (S1802), and sets a tile rendering sequence for the image frames of the application data (S1804).
The digital device renders each tile of the first image frame according to the set tile rendering sequence (S1806), and generates an interpolated image frame by compositing, based on a tile index, the tile data of some rendered tiles among all the tiles of the first image frame with a pre-rendered image frame (S1808).
The digital device displays the pre-rendered image frame, the interpolated image frame, and the first image frame (S1810).
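The steps S1802 to S1810 above can be sketched end-to-end as follows; this is a simplified model in which tiles are integers, "rendering" is a stub, and the scan-line sequence, tile count, and threshold are all assumed values:

```python
def process_application_data(frames, tiles_per_frame=4, threshold=2):
    """Simplified pipeline: set a tile rendering sequence (scan-line
    here), render tiles one by one, and emit an interpolated frame
    once enough tiles of the current frame are ready (S1802..S1810).
    """
    display = []
    prev = None  # pre-rendered (previous) image frame
    for frame_id in frames:                       # S1802: receive data
        order = list(range(tiles_per_frame))      # S1804: set sequence
        rendered = {}
        interpolated_done = False
        for tile in order:                        # S1806: render each tile
            rendered[tile] = (frame_id, tile)
            if (prev is not None and not interpolated_done
                    and len(rendered) > threshold):
                # S1808: composite rendered tiles with previous frame
                interp = dict(prev)
                interp.update(rendered)
                display.append(("interp", interp))
                interpolated_done = True
        prev = rendered
        display.append(("frame", rendered))       # S1810: display frame
    return display

out = process_application_data(["N", "N+1"])
print([kind for kind, _ in out])  # ['frame', 'interp', 'frame']
```

The interpolated frame appears in the display sequence between Frame N and Frame N + 1, matching the ordering described below.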
In the above, the interpolated image frame may be displayed before the first image frame, and in particular between the pre-rendered image frame and the first image frame. The interpolated image frame may be generated by compositing, based on a tile index, with tiles of the pre-rendered image frame when the number of rendered tiles among all tiles of the first image frame exceeds a predetermined threshold value.
The set tile rendering sequence may be determined with reference to at least one of FRUC management data, currently executing application information, and profile data of the application, and may include at least one of a scan-line order, a Z-order, a spiral order, and an eye-tracking order.
In addition, the tile rendering sequence to be set may be different in units of an application or an image frame.
Meanwhile, the set tile rendering sequence may further refer to tile rendering priority data based on at least one of object motion prediction data and object type data derived from edge data of the pre-rendered image frame.
Alternatively, the pre-rendered image frame, the interpolated image frame, and the first image frame may be displayed sequentially, and a plurality of interpolated image frames may be generated and displayed between the pre-rendered image frame and the first image frame.
Therefore, according to the various embodiments of the present invention described above, the processing performance for application data, such as QoS, can be maintained or improved in software without requiring hardware performance improvements or additional components in the device or its linked device(s). Processing application data in software improves the overall environment of the device itself, such as hardware load, power, and temperature, while maintaining or improving performance such as QoS. Because this is achieved without adding hardware, the cost burden of hardware upgrades is minimized, which in turn improves user satisfaction with the device.
In addition, according to various embodiments of the present invention, improved visual quality can be provided to the user at low power and on low-performance hardware, and QoS can be provided even for high-end applications that demand more than the hardware limits of the image processing unit. Constraints of connected components, such as CPU performance, memory size, and the display, as well as environmental constraints such as hardware thermal limits and power, can also be alleviated. Moreover, QoS can be guaranteed even when the load on the image processing unit, CPU, memory, and the like is lowered.
The digital device disclosed in this specification and the data processing method in the digital device are not limited to the configurations and methods of the embodiments described above; all or some of the embodiments may be selectively combined so that various modifications can be made.
Meanwhile, the operating method of the digital device disclosed in this specification can be implemented as processor-readable code on a recording medium readable by a processor included in the digital device. The processor-readable recording medium includes all kinds of recording devices in which data readable by the processor is stored. Examples of the processor-readable recording medium include ROM (Read Only Memory), RAM (Random Access Memory), CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and also include implementations in the form of a carrier wave, such as transmission over the Internet. In addition, the processor-readable recording medium may be distributed over network-connected computer systems so that the processor-readable code can be stored and executed in a distributed fashion.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Such modifications should not be understood separately from the technical idea of the present invention.
201: network interface unit 202: TCP / IP manager
203: service delivery manager 204: SI decoder
205
207: Video decoder 208:
209: Service Control Manager 210: Service Discovery Manager
211: SI & Metadata Database 212: Metadata Manager
213: service manager 214: UI manager
Claims (20)
Receiving application data;
Setting a tile rendering sequence for image frames of the application data;
Rendering each tile of the first image frame according to the set tile rendering sequence;
Generating an interpolated image frame by compositing tile data of some rendered tiles among all tiles of the first image frame and pre-rendered image frames based on a tile index; And
And displaying the pre-rendered image frame, the interpolated image frame, and the first image frame.
Wherein the interpolated image frame is displayed prior to the first image frame.
Wherein the interpolated image frame is displayed between the pre-rendered image frame and the first image frame.
Wherein the interpolated image frame is generated by compositing, based on a tile index, with tiles of the pre-rendered image frame when the number of rendered tiles among all tiles of the first image frame exceeds a preset threshold value.
Wherein the set tile rendering sequence is set with reference to at least one of FRUC management data, currently executing application information, and profile data of the application.
Wherein the set tile rendering sequence includes at least one of a scan-line order, a Z-order, a spiral order, and an eye-tracking order.
Wherein the set tile rendering sequence differs in units of an application or an image frame.
Wherein the set tile rendering sequence is set with further reference to tile rendering priority data based on at least one of object motion prediction data and object type data derived from edge data of the pre-rendered image frame.
Wherein the pre-rendered image frame, the interpolated image frame, and the first image frame are sequentially displayed.
Wherein a plurality of the generated interpolated image frames are displayed between the pre-rendered image frame and the first image frame.
A receiving unit for receiving application data;
An image processing unit that renders each tile of the first image frame according to a set tile rendering sequence;
A control unit configured to set a tile rendering sequence for the image frames of the application data, and to control generation of an interpolated image frame by compositing, based on a tile index, tile data of some rendered tiles among all the tiles of the first image frame with a pre-rendered image frame; And
And a display unit for displaying the pre-rendered image frame, the interpolated image frame, and the first image frame.
Wherein the control unit controls the interpolated image frame to be displayed prior to the first image frame.
Wherein the control unit controls the interpolated image frame to be displayed between the pre-rendered image frame and the first image frame.
Wherein the control unit generates the interpolated image frame by compositing, based on a tile index, with tiles of the pre-rendered image frame when the number of rendered tiles among all tiles of the first image frame exceeds a preset threshold value.
Wherein the control unit sets the tile rendering sequence with reference to at least one of FRUC management data, currently executing application information, and profile data of the application.
Wherein the control unit sets at least one of a scan-line order, a Z-order, a spiral order, and an eye-tracking order as the tile rendering sequence.
Wherein the control unit sets the tile rendering sequence differently in units of an application or an image frame.
Wherein the control unit sets the tile rendering sequence with further reference to tile rendering priority data based on at least one of object motion prediction data and object type data derived from edge data of the pre-rendered image frame.
Wherein the control unit controls the pre-rendered image frame, the interpolated image frame, and the first image frame to be displayed in sequence.
Wherein the control unit generates a plurality of interpolated image frames between the pre-rendered image frame and the first image frame.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150117163A KR20170022334A (en) | 2015-08-20 | 2015-08-20 | Digital device and method of processing data the same |
PCT/KR2016/009075 WO2017030380A1 (en) | 2015-08-20 | 2016-08-18 | Digital device and method of processing data therein |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150117163A KR20170022334A (en) | 2015-08-20 | 2015-08-20 | Digital device and method of processing data the same |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20170022334A true KR20170022334A (en) | 2017-03-02 |
Family
ID=58427186
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150117163A KR20170022334A (en) | 2015-08-20 | 2015-08-20 | Digital device and method of processing data the same |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20170022334A (en) |