KR20170066844A - Apparatus for processing graphics and operating method thereof, and terminal including same - Google Patents
- Publication number
- KR20170066844A (application number KR1020150172997A)
- Authority
- KR
- South Korea
- Prior art keywords
- image frame
- rendering
- lod
- module
- scene
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/005—General purpose rendering architectures
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/36—Level of detail
Landscapes
- Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Telephone Function (AREA)
Abstract
A graphics processing apparatus according to an embodiment of the present invention includes a focal length extraction module for extracting a focal length of a first image frame, a depth map extraction module for extracting a depth map of the first image frame, a rendering LoD generation module for generating a rendering level of detail (LoD) for a second image frame, which is the frame following the first image frame, based on the extracted focal length and depth map, and a rendering module for performing rendering on the second image frame by applying the rendering LoD based on a scene change between the first image frame and the second image frame.
Description
An embodiment according to the concept of the present invention relates to a graphics processing apparatus, and more particularly, to a graphics processing apparatus capable of performing rendering by applying different rendering levels of detail (LoD) to regions or pixels in an image frame, to an operating method thereof, and to a terminal including the same.
A graphics processing unit (GPU) included in a terminal such as a PC, a laptop computer, a smartphone, or a tablet PC performs graphics processing for generating 2D or 3D images, including image processing operations such as rendering.
Rendering is the process of creating an image from a scene. It is also used as a method of adding stereoscopic effect and realism to a two-dimensional image by taking into account information such as light source, position, color, shadow, and density change.
The quality of the image frame displayed through the display may vary depending on the rendering level of detail (LoD). The higher the rendering LoD, the higher the quality of the image frame; the lower the rendering LoD, the lower the quality. On the other hand, the higher the rendering LoD, the higher the power consumption of the graphics processing apparatus and the lower its processing speed.
Because one image frame is displayed to the user for only a very short time, the area viewed by the user can be limited to a specific area of the image frame, for example, an object displayed in the center area or the focus area. Since the conventional rendering method applies the same rendering LoD to the entire image frame, applying a high rendering LoD to all areas can be inefficient.
According to an aspect of the present invention, there are provided a graphics processing apparatus capable of performing rendering on an image frame by applying a different rendering LoD to each region or pixel of the image frame, and an operating method thereof.
A graphics processing apparatus according to an embodiment of the present invention includes a focal length extraction module for extracting a focal length of a first image frame, a depth map extraction module for extracting a depth map of the first image frame, a rendering LoD generation module for generating a rendering level of detail (LoD) for a second image frame, which is the frame following the first image frame, based on the extracted focal length and depth map, and a rendering module for performing rendering on the second image frame by applying the rendering LoD based on a scene change between the first image frame and the second image frame.
The focal length extraction module and the depth map extraction module may extract the focal length and the depth map from the first image frame being rendered by the rendering module.
According to one embodiment, the graphics processing apparatus further comprises a blur area detection module for detecting a blur area from the rendered first image frame, wherein the rendering LoD generation module generates the rendering LoD based on the focal length, the depth map, and the blur area.
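The text does not specify how the blur area is detected. As a hedged illustration, one common proxy is low local intensity variance: tiles with little contrast are flagged as blurred. The function name, tile size, and threshold below are assumptions, not part of the patent.

```python
def detect_blur_tiles(gray, tile, var_threshold):
    """Flag tiles of a grayscale frame whose intensity variance falls
    below var_threshold (low local contrast as a crude proxy for blur)."""
    rows, cols = len(gray), len(gray[0])
    blurred = []
    for r0 in range(0, rows, tile):
        row_flags = []
        for c0 in range(0, cols, tile):
            vals = [gray[r][c]
                    for r in range(r0, min(r0 + tile, rows))
                    for c in range(c0, min(c0 + tile, cols))]
            m = sum(vals) / len(vals)
            var = sum((v - m) ** 2 for v in vals) / len(vals)
            row_flags.append(var < var_threshold)
        blurred.append(row_flags)
    return blurred

# A frame whose left half is flat (blur-like) and right half is high-contrast.
frame = [[10, 10, 0, 255],
         [10, 10, 255, 0]]
print(detect_blur_tiles(frame, 2, 100.0))  # [[True, False]]
```

Tiles flagged this way could then be assigned a lower rendering LoD.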
According to one embodiment, the graphics processing apparatus further comprises a representative depth value extraction module for dividing the depth map into a plurality of tiles and extracting a representative depth value of each of the plurality of divided tiles, wherein the rendering LoD generation module generates the rendering LoD based on the focal length and the representative depth value of each of the plurality of tiles.
The representative depth value of each of the plurality of tiles may be an average value, a middle value, or a mode value of depth values included in each of the plurality of tiles.
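The average, middle, and mode variants above can be sketched with Python's statistics reducers; the tile size and the example depth map are illustrative assumptions.

```python
from statistics import mean, median, mode

def tile_representative_depths(depth_map, tile_h, tile_w, reducer=mean):
    """Divide a depth map (list of rows) into tiles and reduce each tile's
    depth values to one representative value (mean, median, or mode)."""
    rows, cols = len(depth_map), len(depth_map[0])
    reps = []
    for r0 in range(0, rows, tile_h):
        rep_row = []
        for c0 in range(0, cols, tile_w):
            vals = [depth_map[r][c]
                    for r in range(r0, min(r0 + tile_h, rows))
                    for c in range(c0, min(c0 + tile_w, cols))]
            rep_row.append(reducer(vals))
        reps.append(rep_row)
    return reps

depth = [[0.1, 0.1, 0.8, 0.8],
         [0.1, 0.3, 0.8, 0.8],
         [0.5, 0.5, 0.9, 0.9],
         [0.5, 0.5, 0.9, 0.7]]
print(tile_representative_depths(depth, 2, 2, reducer=mode))
# [[0.1, 0.8], [0.5, 0.9]]
```

Swapping `reducer=mode` for `mean` or `median` yields the other two representative-value choices named above.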
According to one embodiment, the graphics processing apparatus further comprises a scene change detection module for determining whether the scene of the first image frame and the scene of the second image frame are the same, wherein the rendering module performs rendering of the second image frame by applying the generated rendering LoD if the scene of the first image frame and the scene of the second image frame are the same.
The case where the scene of the first image frame and the scene of the second image frame are the same may include the case where the degree of change between the scene of the first image frame and the scene of the second image frame is lower than a reference degree.
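The patent leaves the change metric between scenes open. One simple stand-in, shown as a hedged sketch, is the mean absolute per-pixel difference compared against a reference degree.

```python
def same_scene(frame_a, frame_b, reference_degree=10.0):
    """Treat two frames as showing the same scene when the mean absolute
    per-pixel difference is below reference_degree. The metric and the
    default threshold are illustrative choices, not from the patent."""
    total = count = 0
    for row_a, row_b in zip(frame_a, frame_b):
        for a, b in zip(row_a, row_b):
            total += abs(a - b)
            count += 1
    return total / count < reference_degree

frame1 = [[100, 100], [100, 100]]
frame2 = [[102, 99], [101, 100]]   # small motion: same scene
frame3 = [[0, 255], [255, 0]]      # hard cut: different scene
print(same_scene(frame1, frame2))  # True
print(same_scene(frame1, frame3))  # False
```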
A method of operating a graphics processing apparatus according to an embodiment of the present invention includes extracting a focal length of a first image frame, extracting a depth map of the first image frame, generating a rendering level of detail (LoD) for a second image frame, which is the frame following the first image frame, based on the extracted focal length and depth map, and performing rendering on the second image frame by applying the rendering LoD based on a scene change between the first image frame and the second image frame.
A terminal according to an exemplary embodiment of the present invention includes a graphics processing apparatus that renders a first image frame, a display unit that displays the rendered first image frame, and a controller that controls the graphics processing apparatus and the display unit. The graphics processing apparatus includes a focal length extraction module for extracting a focal length of the first image frame, a depth map extraction module for extracting a depth map of the first image frame, a rendering LoD generation module for generating a rendering LoD for a second image frame, which is the frame following the first image frame, based on the extracted focal length and depth map, and a rendering module that performs rendering on the second image frame by applying the rendering LoD based on a scene change between the first image frame and the second image frame.
The graphics processing apparatus according to an embodiment of the present invention can lower the rendering LoD for a region that is blurred because it lies beyond the focal distance, or for a region of low visual interest, and can thereby reduce the power consumption of the graphics processing apparatus and improve its processing speed.
FIG. 1 is a schematic block diagram of a terminal according to an embodiment of the present invention.
FIGS. 2A and 2B are schematic block diagrams of a graphics processing apparatus according to an embodiment of the present invention.
FIG. 3 is a flowchart for explaining the operation of the graphics processing apparatus shown in FIG. 2A.
FIG. 4 is an illustration of a depth map extracted from an image frame being rendered.
FIGS. 5 and 6 are diagrams illustrating an operation in which the graphics processing apparatus according to an embodiment of the present invention sets a rendering LoD based on a focal length and a depth map.
FIG. 7 is a flowchart for explaining another embodiment of the operation of setting the rendering LoD based on the focal length and the depth map.
FIG. 8 is an exemplary view showing the operation of FIG. 7.
FIGS. 9 to 11 show the operation of the graphics processing apparatus shown in FIG. 2A in more detail.
FIG. 12 is a flowchart for explaining the operation of the graphics processing apparatus shown in FIG. 2B.
FIGS. 13 and 14 show the operation of the graphics processing apparatus shown in FIG. 2B in more detail.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements, and redundant description thereof will be omitted. The suffixes "module" and "part" for components used in the following description are given or used interchangeably only for ease of drafting the specification, and do not themselves have distinct meanings or roles. In describing the embodiments of the present invention, a detailed description of related known art will be omitted when it is determined that it may obscure the gist of the embodiments disclosed herein. The accompanying drawings are intended only to facilitate understanding of the embodiments disclosed herein; the technical idea disclosed in this specification is not limited by them, and covers all modifications, equivalents, and alternatives falling within the spirit and scope of the present invention.
Terms including ordinals, such as first and second, may be used to describe various elements, but the elements are not limited by these terms. The terms are used only to distinguish one component from another.
It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.
The singular expressions include plural expressions unless the context clearly dictates otherwise.
In the present application, terms such as "comprises" and "having" are used to specify the presence of a stated feature, number, step, operation, element, component, or combination thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
The terminal described in this specification may include a mobile phone, a smartphone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, a wearable device (e.g., a smartwatch, smart glasses, or a head mounted display (HMD)), and the like.
However, it will be readily apparent to those skilled in the art that the configuration according to the embodiments described herein may also be applied to fixed terminals such as a digital TV, a desktop computer, and digital signage.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings attached hereto.
1 is a schematic block diagram of a terminal according to an embodiment of the present invention.
At least some of the components may operate in cooperation with each other to implement an operation, control, or control method of a terminal according to various embodiments described below. In addition, the operation, control, or control method of the terminal may be implemented on the terminal by driving at least one application program stored in the memory.
Before describing various embodiments implemented through the terminal 100 as described above, the components listed above will be described in more detail with reference to FIG. 1.
The wireless signal may include various types of data depending on a voice call signal, a video call signal or a text / multimedia message transmission / reception.
Wireless Internet technologies include, for example, wireless LAN (WLAN), wireless fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), and Long Term Evolution-Advanced (LTE-A); data is transmitted and received according to at least one wireless Internet technology, including Internet technologies not listed above.
On the other hand, for convenience of explanation, the act of recognizing that an object is located on the touch screen in proximity, without the object touching the touch screen, is referred to as a "proximity touch," while the act of the object actually touching the touch screen is referred to as a "contact touch." The position at which an object is proximity-touched on the touch screen is the position at which the object corresponds vertically to the touch screen when the object is proximity-touched.
The touch sensor detects a touch (or touch input) applied to the touch screen (or the display unit 151) using at least one of various touch methods, such as a resistive type, a capacitive type, an infrared type, and an ultrasonic type.
For example, the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen, or a change in capacitance generated at a specific portion, into an electrical input signal. The touch sensor may be configured to detect the position and area at which a touch object touches the touch sensor, the pressure at the time of touch, the capacitance at the time of touch, and the like. Here, the touch object may be a finger, a touch pen, a stylus pen, a pointer, or the like.
Thus, when there is a touch input to the touch sensor, the corresponding signal(s) are sent to the touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the controller.
On the other hand, the touch sensor and the proximity sensor discussed above can be used independently or in combination to sense various types of touches, such as a short touch (tap), a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.
The ultrasonic sensor can recognize the position information of an object to be sensed by using ultrasonic waves.
The identification module is a chip storing various kinds of information for authenticating the usage right of the terminal 100, and includes a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device equipped with an identification module (hereinafter referred to as an "identification device") can be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through the interface unit.
Next, a communication system that can be implemented through the terminal 100 according to the present invention will be described.
First, the communication system may use different wireless interfaces and/or physical layers. For example, wireless interfaces usable by the communication system may include Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE) and Long Term Evolution-Advanced (LTE-A)), and Global System for Mobile Communications (GSM).
Hereinafter, for convenience of description, the description will be limited to CDMA. However, it is apparent that the present invention can be applied to any communication system, including an Orthogonal Frequency Division Multiplexing (OFDM) wireless communication system as well as a CDMA wireless communication system.
A CDMA wireless communication system includes at least one terminal, at least one base station (BS) (also referred to as a Node B or an Evolved Node B), at least one base station controller (BSC), and a mobile switching center (MSC). The MSC is configured to be coupled to a Public Switched Telephone Network (PSTN) and to the BSCs. The BSCs may be paired with a BS via a backhaul line. The backhaul line may be provided according to at least one of E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. Thus, a plurality of BSCs may be included in a CDMA wireless communication system.
Each of the plurality of BSs may comprise at least one sector, and each sector may comprise an omnidirectional antenna or an antenna pointing to a particular direction of radial emission from the BS. In addition, each sector may include two or more antennas of various types. Each BS may be configured to support a plurality of frequency assignments, and a plurality of frequency assignments may each have a specific spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency assignment may be referred to as a CDMA channel. A BS may also be referred to as a base transceiver subsystem (BTS). In this case, the combination of one BSC and at least one BS may be referred to as a "base station." The base station may also indicate a "cell site." Alternatively, each of the plurality of sectors for a particular BS may be referred to as a plurality of cell sites.
A broadcasting transmitter (BT) transmits a broadcast signal to terminals operating in the system.
In addition, a CDMA wireless communication system may be associated with a Global Positioning System (GPS) for identifying the location of the terminal 100. The satellite aids in locating the terminal 100.
The WiFi Positioning System (WPS) is a technology for tracking the location of the terminal 100 using the WiFi module included in the terminal 100 and a wireless access point (AP) that transmits to or receives from the WiFi module.
The WiFi location tracking system may include a Wi-Fi location server, the terminal 100, a wireless AP connected to the terminal 100, and a database in which wireless AP information is stored.
The terminal 100 connected to the wireless AP can transmit a location information request message to the Wi-Fi location server.
The Wi-Fi location server extracts information of the wireless AP connected to the terminal 100 based on the location information request message (or signal) of the terminal 100. The information of the wireless AP connected to the terminal 100 may be transmitted to the Wi-Fi location server through the terminal 100, or may be transmitted from the wireless AP to the Wi-Fi location server.
The information of the wireless AP extracted based on the location information request message of the terminal 100 may include at least one of the MAC address, SSID, and signal strength of the wireless AP.
As described above, the Wi-Fi location server can receive the information of the wireless AP connected to the terminal 100 and extract, from the pre-established database, the wireless AP information corresponding to the wireless AP to which the terminal is connected. The information of the wireless APs stored in the database may include at least one of MAC address, SSID, channel information, privacy, network type, radius coordinates of the wireless AP, building name, floor number, detailed indoor location information (where GPS coordinates are available), the AP owner's address, and telephone number. At this time, in order to exclude wireless APs provided using a mobile AP or an illegal MAC address from the positioning process, the Wi-Fi location server may extract only a predetermined number of wireless AP information entries in descending order of RSSI.
Thereafter, the Wi-Fi location server may extract (or analyze) the location information of the terminal 100 by comparing at least one piece of wireless AP information extracted from the database with the received wireless AP information.
As methods for extracting (or analyzing) the position information of the terminal 100, a Cell-ID method, a fingerprint method, a triangulation method, and a landmark method can be utilized.
The Cell-ID method determines the location of the wireless AP having the strongest signal strength, among the neighboring wireless AP information collected by the terminal, as the location of the terminal. The implementation is simple, it incurs no extra cost, and it can acquire location information quickly; however, positioning accuracy is low when the installation density of wireless APs is low.
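The Cell-ID selection described above amounts to a maximum-by-signal-strength lookup. A hedged sketch, with an assumed (x, y, RSSI) record layout:

```python
def cell_id_position(neighbor_aps):
    """Cell-ID positioning: adopt the coordinates of the AP with the
    strongest RSSI as the terminal's position. AP records are assumed
    to be (x, y, rssi_dbm) tuples for illustration."""
    strongest = max(neighbor_aps, key=lambda ap: ap[2])
    return strongest[0], strongest[1]

aps = [(0.0, 0.0, -70), (25.0, 10.0, -42), (40.0, 5.0, -60)]
print(cell_id_position(aps))  # (25.0, 10.0)
```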
The fingerprint method selects reference positions in a service area, collects signal strength information there, and estimates the position based on the signal strength information transmitted from the terminal. To use the fingerprint method, the propagation characteristics must be stored in a database in advance.
The triangulation method calculates the position of the terminal based on the coordinates of at least three wireless APs and the distances between the terminal and each AP. To measure these distances, signal strength converted into distance information, the time of arrival (ToA) of a wireless signal, the time difference of arrival (TDoA), the angle of arrival (AoA), or the like can be used.
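The triangulation described above can be sketched by subtracting pairs of circle equations, which leaves a 2x2 linear system solvable with Cramer's rule; the sketch assumes noise-free distance measurements and is not taken from the patent.

```python
def trilaterate(p1, d1, p2, d2, p3, d3):
    """Solve for the terminal position (x, y) from three AP positions
    and measured distances. Subtracting the circle equations pairwise
    linearizes the problem into a 2x2 system."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # zero when the APs are collinear
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

# Terminal actually at (1, 1); distances computed from three APs.
x, y = trilaterate((0, 0), 2**0.5, (4, 0), 10**0.5, (0, 4), 10**0.5)
print(round(x, 6), round(y, 6))  # 1.0 1.0
```

With noisy distances a least-squares fit over more than three APs is the usual refinement.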
The landmark method measures the position of a terminal using a landmark transmitter whose location is known.
Various algorithms can be utilized as a method for extracting (or analyzing) the location information of the terminal.
The extracted location information of the terminal 100 is transmitted to the terminal 100 through the Wi-Fi location server, so that the terminal 100 can acquire the location information.
The terminal 100 may be connected to at least one wireless AP to obtain location information. At this time, the number of wireless APs required to acquire the location information of the terminal 100 can be variously changed according to the wireless communication environment in which the terminal 100 is located.
In the following, various embodiments may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.
FIGS. 2A and 2B are schematic block diagrams of a graphics processing apparatus according to an embodiment of the present invention.
Referring to FIG. 2A, the graphics processing apparatus may include a focal length extraction module, a depth map extraction module, a rendering LoD generation module, and a rendering module.
The quality of the image IM to be output through the display may vary depending on the rendering LoD.
Conventionally, when an image IM is generated, the same rendering LoD is applied to the entire image frame.
According to the embodiment, even for an image in which no blurred region exists, the region corresponding to the focal distance attracts higher interest or attention than regions not corresponding to the focal distance, so there may still be a need to extract the focal length in order to set a different rendering LoD for each region.
For example, the focal length may be extracted in accordance with the OpenGL ES (OpenGL for Embedded Systems) standard.
Referring to FIG. 2B, the graphics processing apparatus according to another embodiment includes a scene change detection module 260B.
Before the rendering LoD is generated from the currently rendered image frame IM1, the scene change detection module 260B can judge whether the scene of the graphic data GD1 corresponding to the current image frame and the scene of the graphic data GD2 corresponding to the next image frame are the same.
If the scenes are the same, or the degree of change is lower than the reference level, the scene change detection module 260B may cause the rendering LoD to be generated and applied to the rendering of the next image frame.
FIG. 3 is a flowchart for explaining the operation of the graphic processing apparatus shown in FIG. 2A.
Referring to FIGS. 2A and 3, the graphics processing apparatus extracts the focal length and the depth map of the first image frame, and generates the rendering LoD for the second image frame based on them.
For example, the rendering LoD may be set high for a pixel whose depth value corresponds to a distance equal to the focal length, or whose difference from the focal length is less than a reference value. On the other hand, the rendering LoD may be set low for a pixel whose depth value corresponds to a distance whose difference from the focal length is greater than the reference value. According to an embodiment, the reference value may comprise a plurality of reference values, whereby the rendering LoD may be subdivided.
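The reference-value scheme above can be sketched as follows; the two thresholds and the H/M/L labels are illustrative choices, not values fixed by the text.

```python
def lod_for_pixel(depth, focal_depth, refs=(0.1, 0.25)):
    """Map |pixel depth - focal-plane depth| to a rendering LoD using a
    pair of reference values. The thresholds 0.1 and 0.25 are assumed
    for illustration only."""
    diff = abs(depth - focal_depth)
    if diff <= refs[0]:
        return 'H'   # at or near the focal plane: full detail
    if diff <= refs[1]:
        return 'M'
    return 'L'       # far from the focal plane: reduced detail

focal = 0.3  # depth value assumed to correspond to the focal length
print([lod_for_pixel(d, focal) for d in (0.3, 0.35, 0.5, 0.9)])
# ['H', 'H', 'M', 'L']
```

Adding more reference values to `refs` subdivides the LoD further, as the text suggests.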
The steps S120 to S140 will be described in detail with reference to FIGS. 4 to 8.
Figure 4 is an illustration of a depth map extracted from an image frame being rendered.
Referring to FIGS. 4A and 4B, the depth map extraction module may extract the depth map from the image frame being rendered.
FIGS. 5 and 6 are diagrams illustrating an operation in which the graphics processing apparatus according to an embodiment of the present invention sets a rendering LoD based on a focal length and a depth map.
Referring to FIG. 5, the rendering LoD generation module may generate the rendering LoD based on the extracted focal length and depth map.
For example, assume that the depth values of the pixels included in the depth map range from 0 to 1 (the depth value of the closest pixel being 0 and the depth value of the furthest pixel being 1), and that the focal length corresponds to a depth value of 0.3.
If the rendering LoD has three levels H, M, and L, the depth value range may be divided into areas, and one of the three levels may be assigned to each area according to its distance from the depth value corresponding to the focal length.
According to the embodiment, the range of each area (or the reference values for distinguishing the areas) can be freely changed by setting.
Referring to FIG. 6, the rendering LoD for the pixels included in the area corresponding to the focal length may be set high, and the rendering LoD for the pixels outside that area may be set low.
That is, the rendering LoD of pixels whose depth values correspond or are adjacent to the focal length FD is set high, while the rendering LoD of pixels whose depth values differ greatly from the depth value corresponding to the focal length FD can be set low. Since objects at distances far from the focal distance in an image frame typically attract little attention and are displayed blurred, setting the rendering LoD of the pixels representing such objects low can reduce power consumption and prevent degradation of processing performance.
FIG. 7 is a flowchart for explaining another embodiment of the operation of setting the rendering LoD based on the focal length and the depth map. In particular, the embodiment shown in FIG. 7 may be for the case where the depth map is divided into a plurality of tiles.
Referring to FIG. 7, the graphics processing apparatus divides the extracted depth map into a plurality of tiles, extracts a representative depth value of each of the divided tiles, and generates the rendering LoD based on the focal length and the representative depth values (S142 to S146).
Steps S142 to S146 will be described in detail with reference to FIG. 8.
FIG. 8 is an exemplary view showing the operation of FIG. 7.
Referring to FIG. 8A, the depth map may be divided into a plurality of tiles, and a representative depth value may be extracted for each of the divided tiles.
In the case of generating the rendering LoD information for each pixel according to the embodiment shown in FIG. 6, the accuracy of the generated rendering LoD information may be high, but its size may be excessively large. On the other hand, in the case of generating the rendering LoD information for each tile according to the embodiment shown in FIG. 8, the accuracy of the generated rendering LoD information may be lower than that of the per-pixel information shown in FIG. 6, but its smaller size can be more efficient. The accuracy and size of the rendering LoD information may also be adjusted by changing the number of tiles as needed.
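The per-tile trade-off above can be sketched by reusing a depth-to-LoD mapping on representative tile depths and comparing the number of LoD entries; the tile size, thresholds, and example values are assumptions.

```python
import math

def per_tile_lod_map(rep_depths, focal_depth, refs=(0.1, 0.25)):
    """Turn a grid of representative tile depths into a per-tile LoD map.
    The thresholds and H/M/L labels are illustrative, not from the patent."""
    def lod(diff):
        return 'H' if diff <= refs[0] else 'M' if diff <= refs[1] else 'L'
    return [[lod(abs(d - focal_depth)) for d in row] for row in rep_depths]

reps = [[0.3, 0.8],
        [0.5, 0.9]]
print(per_tile_lod_map(reps, 0.3))  # [['H', 'L'], ['M', 'L']]

# Size trade-off for a 1920x1080 frame: per-pixel vs 16x16-tile LoD entries.
pixels = 1920 * 1080
tiles = math.ceil(1920 / 16) * math.ceil(1080 / 16)
print(pixels, tiles)  # 2073600 8160
```

The per-tile map stores roughly 250x fewer entries here, at the cost of one LoD decision per tile instead of per pixel.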
FIG. 3 will now be described again.
If it is determined that the scene of the first image frame is identical to the scene of the second image frame (YES in S160), the graphics processing apparatus performs rendering on the second image frame by applying the generated rendering LoD.
On the other hand, when it is determined that the scene of the first image frame is not the same as the scene of the second image frame (NO in S160), the graphics processing apparatus may render the second image frame without applying the generated rendering LoD.
FIGS. 9 to 11 show the operation of the graphics processing apparatus shown in FIG. 2A in more detail.
FIGS. 9 to 11 are provided for convenience of description of the operation of the modules included in the graphics processing apparatus shown in FIG. 2A.
Referring to FIG. 9, the graphic data GD1 of the first image frame IM1 is input to the graphics processing apparatus.
Referring to FIG. 10, the graphic data GD2 of the second image frame IM2 is input to the graphics processing apparatus.
If, as a result of the detection, the scene is not changed (the scenes are the same, or the degree of change is lower than the reference level), the scene change detection module may cause the rendering module to apply the generated rendering LoD when rendering the second image frame.
FIG. 12 is a flowchart for explaining the operation of the graphics processing apparatus shown in FIG. 2B.
Referring to FIG. 12, the graphics processing apparatus detects whether a scene change occurs between the first image frame and the second image frame.
If the scene of the first image frame and the scene of the second image frame are the same (or the degree of change of the scene is lower than the reference level) as a result of the detection (YES in S320), the graphics processing apparatus extracts the focal length and the depth map of the first image frame.
Based on the extracted focal length and depth map, the graphics processing apparatus generates the rendering LoD for the second image frame and performs rendering on the second image frame by applying it.
If it is determined that the scene of the first image frame and the scene of the second image frame are not the same (NO in S320), the graphics processing apparatus may render the second image frame without applying a previously generated rendering LoD.
FIGS. 13 and 14 show the operation of the graphics processing apparatus shown in FIG. 2B in more detail.
Referring to FIG. 13, the graphic data GD1 of the first image frame IM1 is input to the graphics processing apparatus.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed embodiments but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims. Accordingly, the true scope of the present invention should be determined by the technical idea of the appended claims.
Claims (18)
A graphics processing apparatus comprising:
A focal length extraction module for extracting a focal length of a first image frame;
A depth map extraction module for extracting a depth map of the first image frame;
A rendering LoD generation module for generating a rendering level of detail (LoD) for a second image frame, which is the next frame of the first image frame, based on the extracted focal length and depth map; And
And a rendering module that performs rendering on the second image frame by applying the rendering LoD based on a scene change between the first image frame and the second image frame.
Wherein the focal length extraction module and the depth map extraction module extract the focal length and the depth map from the first image frame being rendered by the rendering module.
Further comprising a blur area detection module for detecting a blur area from the rendered first image frame, wherein the rendering LoD generation module generates the rendering LoD based on the focal length, the depth map, and the blur area.
Further comprising a representative depth value extraction module for dividing the depth map into a plurality of tiles and extracting a representative depth value of each of the plurality of divided tiles, wherein the rendering LoD generation module generates the rendering LoD based on the focal length and the representative depth value of each of the plurality of tiles.
Wherein the representative depth value of each of the plurality of tiles is an average value, a middle value, or a mode value of the depth values included in each of the plurality of tiles.
Further comprising a scene change detection module for determining whether the scene is the same between the first image frame and the second image frame,
The rendering module includes:
And performs rendering of the second image frame by applying the generated rendering LoD if the scene of the first image frame and the scene of the second image frame are the same.
Wherein the scene of the first image frame and the scene of the second image frame are determined to be the same when the degree of change between the scene of the first image frame and the scene of the second image frame is lower than a reference degree.
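A minimal sketch of the scene-sameness test above (not from the patent): treat two frames as the same scene when a normalized frame difference stays below a reference degree. The mean-absolute-difference metric and the 0.1 threshold are assumptions; the claims leave both the change metric and the reference degree unspecified.

```python
import numpy as np

REFERENCE_DEGREE = 0.1  # assumed threshold; the claims do not fix a value

def same_scene(frame_a, frame_b, threshold=REFERENCE_DEGREE):
    """Treat two frames as the same scene when their mean absolute
    pixel difference, normalized to [0, 1] for 8-bit data, is below
    the threshold."""
    change = np.mean(np.abs(frame_a.astype(float) - frame_b.astype(float))) / 255.0
    return change < threshold
```

When this test passes, the LoD generated from the first frame can be reused for the second frame; when it fails, the renderer falls back to computing detail from scratch.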
Extracting a focal length of the first image frame;
Extracting a depth map of the first image frame;
Generating a rendering level of detail (LoD) for a second image frame, which is the next frame of the first image frame, based on the extracted focal length and depth map; And
And performing rendering on the second image frame by applying the rendering LoD based on a scene change between the first image frame and the second image frame.
Wherein the extracting of the focal length and the extracting of the depth map comprise:
And extracting the focal length and the depth map from the first image frame being rendered.
Further comprising detecting a blur area from the rendered first image frame,
Wherein the generating the rendering LoD comprises:
And generating the rendering LoD based on the focal length, the depth map, and the blur area.
Dividing the extracted depth map into a plurality of tiles;
Further comprising extracting a representative depth value of each of a plurality of divided tiles,
Wherein the generating the rendering LoD comprises:
And generating the rendering LoD based on the focal length and the representative depth value of each of the plurality of tiles.
A median value, or a mode value of the depth values included in each of the plurality of tiles.
Comparing scene identicalness between the first image frame and the second image frame; And
And performing rendering on the second image frame by applying the generated rendering LoD if, as a result of the comparison, the scene of the first image frame is identical to the scene of the second image frame.
Wherein the scene of the first image frame and the scene of the second image frame are determined to be the same when the degree of change between the scene of the first image frame and the scene of the second image frame is lower than a reference degree.
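The method claims above can be tied together in one self-contained sketch (an illustration, not the patent's implementation): check the scene change between consecutive frames, and only when the change stays below the reference degree derive a depth-based LoD for the next frame; otherwise render everything at full detail. The metric, threshold, and quantization are all assumptions for illustration.

```python
import numpy as np

def render_next_frame_lod(prev_frame, next_frame, depth_map, focal_depth,
                          reference_degree=0.1, max_lod=3):
    """End-to-end sketch of the claimed method: reuse a depth/focus-based
    LoD for the next frame only if the scene has not changed appreciably."""
    change = np.mean(np.abs(prev_frame.astype(float)
                            - next_frame.astype(float))) / 255.0
    if change >= reference_degree:
        # Scene changed: fall back to full detail (LoD 0) everywhere.
        return np.zeros_like(depth_map, dtype=int)
    dist = np.abs(depth_map - focal_depth)
    peak = dist.max()
    if peak == 0:
        return np.zeros_like(depth_map, dtype=int)
    # Quantize focal-plane distance into discrete LoD levels.
    return np.rint(dist / peak * max_lod).astype(int)
```

Usage: feeding two identical frames reuses the depth-derived LoD map, while a large frame-to-frame difference collapses the LoD map back to full detail.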
A terminal comprising:
A graphic processing apparatus;
A display unit displaying a rendered first image frame; And
And a control unit for controlling the graphic processing apparatus and the display unit,
The graphic processing apparatus includes:
A focal length extraction module for extracting a focal length of the first image frame;
A depth map extraction module for extracting a depth map of the first image frame;
A rendering LoD generation module for generating a rendering LoD for a second image frame, which is the next frame of the first image frame, based on the extracted focal length and depth map; And
And a rendering module for performing rendering on the second image frame by applying the rendering LoD based on a scene change between the first image frame and the second image frame.
The graphic processing apparatus includes:
Further comprising a blur area detection module for detecting a blur area from the rendered first image frame,
Wherein the rendering LoD generation module comprises:
And generates the rendering LoD based on the focal length, the depth map, and the blur area.
The graphic processing apparatus includes:
Further comprising a representative depth value extracting module for dividing the depth map into a plurality of tiles and extracting a representative depth value of each of the plurality of divided tiles,
Wherein the rendering LoD generation module comprises:
And generates the rendering LoD based on the focal length and the representative depth value of each of the plurality of tiles.
The graphic processing apparatus includes:
Further comprising a scene change detection module for comparing a scene identical between the first image frame and the second image frame,
The rendering module includes:
And performs rendering on the second image frame by applying the generated rendering LoD when the scene of the first image frame and the scene of the second image frame are the same.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150172997A KR20170066844A (en) | 2015-12-07 | 2015-12-07 | Apparatus for processing graphics and operating method thereof, and terminal including same |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20170066844A true KR20170066844A (en) | 2017-06-15 |
Family
ID=59217384
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20170066844A (en) |
2015-12-07: Application KR1020150172997A filed in KR; published as KR20170066844A; legal status unknown.