KR20170098641A - Image processing device and Image display apparatus including the same - Google Patents

Image processing device and Image display apparatus including the same Download PDF

Info

Publication number
KR20170098641A
Authority
KR
South Korea
Prior art keywords
frame rate
processor
frame
image
changed
Prior art date
Application number
KR1020160020830A
Other languages
Korean (ko)
Inventor
박진홍
나재호
임영규
황지운
민성욱
김민규
Original Assignee
LG Electronics Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc. filed Critical LG Electronics Inc.
Priority to KR1020160020830A priority Critical patent/KR20170098641A/en
Publication of KR20170098641A publication Critical patent/KR20170098641A/en

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4007 Scaling of whole images or parts thereof, e.g. expanding or contracting based on interpolation, e.g. bilinear interpolation
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/803 Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4092 Image resolution transcoding, e.g. by using client-server architectures
    • H04M1/72544

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present invention relates to an image processing device and an image display apparatus including the same. The image processing device according to an embodiment of the present invention comprises a processor that sets a frame rate for a graphic image frame, and a graphics processing unit that renders a current image frame according to the set frame rate. The graphics processing unit compares the current image frame with a previous image frame and outputs information on the changed tiles. The processor varies the maximum frame rate based on the number of changed tiles or the ratio of changed tiles calculated from that information. In this way, power consumption during graphic image processing can be reduced.

Description

TECHNICAL FIELD: The present invention relates to an image processing apparatus and an image display apparatus having the same.

BACKGROUND OF THE INVENTION: The present invention relates to an image processing apparatus and a video display apparatus having the same, and more particularly, to an image processing apparatus and a video display apparatus having the same that can reduce power consumption during graphic image processing.

The video display device is a device having a function of providing an image that a user can view. The user can view various images through the video display device.

In particular, when an application such as a game is executed, the video display device can generate a graphic image based on the game data and display the generated graphic image.

On the other hand, various attempts have been made in hardware and software to implement the complex functions of graphic image processing.

It is an object of the present invention to provide an image processing apparatus capable of reducing power consumption in graphic image processing and a video display apparatus having the same.

According to an aspect of the present invention, there is provided an image processing apparatus including a processor for setting a frame rate for a graphic image frame, and a graphics processing unit for rendering a current image frame according to the set frame rate. The graphics processing unit compares the current image frame with a previous image frame and outputs information on the changed tiles, and the processor sets the maximum frame rate to be variable based on the number of changed tiles or the ratio of changed tiles calculated according to that information.

According to another aspect of the present invention, there is provided an image display apparatus including a processor for setting a frame rate for a graphic image frame, and a graphics processing unit for rendering a current image frame according to the set frame rate. The graphics processing unit compares the current image frame with a previous image frame and outputs information on the changed tiles, and the processor sets the maximum frame rate to be variable based on the number of changed tiles or the ratio of changed tiles calculated according to that information.

According to an embodiment of the present invention, an image processing apparatus includes a processor for setting a frame rate for a graphic image frame, and a graphics processing unit for rendering a current image frame according to the set frame rate. The graphics processing unit compares the current image frame with a previous image frame and outputs information on the changed tiles, and the processor sets the maximum frame rate to be variable based on the number of changed tiles or the ratio of changed tiles, so that power consumption during graphic image processing can be reduced.

Meanwhile, by setting the maximum frame rate to increase as the number or ratio of changed tiles increases, the maximum frame rate can be maintained in a dynamic scene in which most objects move, so that the breaks in the graphic image perceptible to the user can be minimized.

Meanwhile, when the number or ratio of changed tiles increases, the maximum frame rate is controlled to change by a first variation amount, which prevents a delay perceptible to the user, and when the number or ratio of changed tiles decreases, the maximum frame rate is changed by a second variation amount smaller than the first, which prevents the graphic image from appearing jerky due to an abrupt drop in the frame rate.

Meanwhile, since color-change comparison is performed only for each tile rather than comparing the color values of all pixels, the additional components required for frame rate adjustment can be reduced.

FIG. 1 illustrates execution of a game application in a video display device according to an embodiment of the present invention.
FIG. 2 is an example of a block diagram of the video display device of FIG. 1.
FIG. 3 is an example of an internal block diagram of the processor of FIG. 2.
FIGS. 4A to 4C show various examples of the image processing apparatus provided in the image display apparatus of FIG. 2.
FIG. 5 illustrates an example of an internal block diagram of an image processing apparatus according to an embodiment of the present invention.
FIG. 6 is a flowchart illustrating an operating method of an image processing apparatus according to an embodiment of the present invention.
FIGS. 7A to 10 are views referred to in explaining various examples of the operating method of FIG. 6.

Hereinafter, the present invention will be described in detail with reference to the drawings.

The video display device described in this specification may be any of various devices such as a mobile phone, a smartphone, a notebook computer, a TV, a monitor, a digital broadcasting terminal, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), a navigation device, a tablet computer, an e-book terminal, and a wearable device such as a smart watch.

The suffix "module" and " part "for components used in the following description are given merely for convenience of description and do not give special significance or role in themselves. Accordingly, the terms "module" and "part" may be used interchangeably.

FIG. 1 illustrates execution of a game application in a video display device according to an embodiment of the present invention.

Referring to the drawings, the video display device 100 of FIG. 1 can display a game image on the display 180 by executing a game application.

To this end, the image display apparatus 100 generates a graphic image frame using the graphic data and displays the graphic image frame on the display 180 when the game application is executed.

On the other hand, when signal processing for a graphic image is performed every frame, power consumption in the image display apparatus 100 is significantly increased.

Particularly, when the image display apparatus 100 displays a graphic image using power from an internal battery, the battery usage due to power consumption increases.

In order to solve this problem, the present invention proposes a method for reducing power consumption in the processing of a graphic image.

The image display apparatus 100 according to an exemplary embodiment of the present invention sets a frame rate for a graphic image frame, performs rendering on the current image frame according to the set frame rate, compares the current image frame with the previous image frame, and sets the maximum frame rate to be variable based on the number or ratio of changed tiles calculated according to the information on the changed tiles. By varying the maximum frame rate in this way, power consumption during graphic image processing can be reduced.

In particular, by setting the maximum frame rate to become smaller as the number or ratio of changed tiles becomes smaller, and larger as the number or ratio of changed tiles becomes larger, power consumption during graphic image processing can be reduced.

Meanwhile, in the image display apparatus 100 according to an embodiment of the present invention, by setting the maximum frame rate to increase as the number or ratio of changed tiles increases, the maximum frame rate can be maintained in a dynamic scene in which most objects move, so that the degradation of the graphic image perceptible to the user due to the variable frame rate can be minimized.

Meanwhile, the image display apparatus 100 according to an embodiment of the present invention controls the maximum frame rate to change by a first variation amount when the number or ratio of changed tiles increases, which prevents a delay perceptible to the user, and changes the maximum frame rate by a second variation amount smaller than the first variation amount when the number or ratio of changed tiles decreases, which prevents the graphic image from appearing jerky due to an abrupt drop in the frame rate.

Meanwhile, since the image display apparatus 100 according to the embodiment of the present invention performs color-change comparison only per tile rather than comparing the color values of all pixels, the separate additional components required can be reduced.
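
By way of illustration only, the following C++ sketch shows one way such per-tile change detection could look in software; the tile size, the use of a simple per-tile checksum, and all identifiers are assumptions made for the example and are not taken from the patent.

    // Hypothetical sketch of per-tile change detection (not the patent's actual code).
    // A frame is split into TILE x TILE blocks; each tile is reduced to a cheap
    // checksum, and only the checksums of the current and previous frame are compared.
    #include <algorithm>
    #include <cstdint>
    #include <iostream>
    #include <vector>

    constexpr int TILE = 16;  // assumed tile edge length in pixels

    struct Frame {
        int width, height;
        std::vector<std::uint32_t> pixels;  // packed RGBA, row-major
    };

    // Reduce one tile to a checksum; any cheap per-tile summary would do.
    static std::uint64_t TileChecksum(const Frame& f, int tx, int ty) {
        std::uint64_t sum = 0;
        for (int y = ty * TILE; y < std::min((ty + 1) * TILE, f.height); ++y)
            for (int x = tx * TILE; x < std::min((tx + 1) * TILE, f.width); ++x)
                sum = sum * 131 + f.pixels[y * f.width + x];
        return sum;
    }

    // Returns indices of tiles whose checksum differs from the previous frame.
    std::vector<int> ChangedTiles(const Frame& prev, const Frame& cur) {
        const int tilesX = (cur.width + TILE - 1) / TILE;
        const int tilesY = (cur.height + TILE - 1) / TILE;
        std::vector<int> changed;
        for (int ty = 0; ty < tilesY; ++ty)
            for (int tx = 0; tx < tilesX; ++tx)
                if (TileChecksum(prev, tx, ty) != TileChecksum(cur, tx, ty))
                    changed.push_back(ty * tilesX + tx);
        return changed;
    }

    int main() {
        Frame prev{64, 64, std::vector<std::uint32_t>(64 * 64, 0)};
        Frame cur = prev;
        cur.pixels[5] = 0xFFFFFFFF;  // touch one pixel -> exactly one tile changes
        std::cout << "changed tiles: " << ChangedTiles(prev, cur).size() << '\n';
        return 0;
    }

In the apparatus described here the comparison is performed by a tile comparer inside the graphics processing unit, so a hardware implementation would likely reuse per-tile data that a tile-based renderer already produces rather than a software checksum.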

Meanwhile, the above-described graphic image may be a concept including a game image, a user interface image, a web screen image, an application screen image, and the like.

A method for processing graphic images in the image display apparatus 100 according to an embodiment of the present invention will be described in detail with reference to FIG. 5 and the following figures.

FIG. 2 is an example of a block diagram of the video display device of FIG. 1.

Referring to FIG. 2, the image display apparatus 100 includes a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 125, a processor 170, and a power supply unit 190. When these components are implemented in practice, two or more components may be combined into one component, or one component may be divided into two or more components as necessary.

The wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 113, a wireless Internet module 115, a short distance communication module 117, and a GPS module 119.

The broadcast receiving module 111 receives at least one of a broadcast signal and broadcast related information from an external broadcast management server through a broadcast channel. At this time, the broadcast channel may include a satellite channel, a terrestrial channel, and the like. The broadcast management server may refer to a server for generating and transmitting at least one of a broadcast signal and broadcast related information and a server for receiving at least one of the generated broadcast signal and broadcast related information and transmitting the broadcast signal to the terminal.

The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal. The broadcast-related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast-related information can also be provided through a mobile communication network, in which case it can be received by the mobile communication module 113. Broadcast-related information can exist in various forms.

The broadcast receiving module 111 receives broadcast signals using various broadcasting systems. In particular, it can receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). The broadcast receiving module 111 may also be configured to be suitable not only for these digital broadcasting systems but for any broadcasting system that provides broadcast signals. The broadcast signal and/or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 113 transmits and receives a radio signal to at least one of a base station, an external terminal, and a server on a mobile communication network. Here, the wireless signal may include various types of data according to a voice call signal, a video call signal, or a text / multimedia message transmission / reception.

The wireless Internet module 115 is a module for wireless Internet access, and may be built into or externally attached to the image display device 100. WLAN (Wi-Fi), WiBro (Wireless Broadband), WiMAX (Worldwide Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like can be used as wireless Internet technologies.

The short-range communication module 117 refers to a module for short-range communication. Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB), ZigBee, and Near Field Communication (NFC) may be used as the short distance communication technology.

A GPS (Global Positioning System) module 119 receives position information from a plurality of GPS satellites.

The A/V (Audio/Video) input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 123. The camera 121 processes image frames such as still images or moving images obtained by the image sensor in the video communication mode or the photographing mode. Then, the processed image frame can be displayed on the display 180.

The image frame processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided depending on the configuration of the terminal.

The microphone 123 receives an external audio signal by a microphone in an audio reception mode, for example, a communication mode, a recording mode, or a voice recognition mode, and processes the audio signal as electrical voice data. The processed voice data can be converted into a form that can be transmitted to the mobile communication base station through the mobile communication module 113 and output when the voice data is in the call mode. The microphone 123 may use various noise reduction algorithms for eliminating noise generated in receiving an external audio signal.

On the other hand, the plurality of microphones 123 may be arranged at different positions. The audio signal received at each microphone can be processed as an audio signal at the processor 170 or the like.

The user input unit 130 generates key input data that the user inputs to control the operation of the terminal. The user input unit 130 may include a keypad, a dome switch, and a touch pad (pressure-sensitive/capacitive) capable of receiving commands or information through a user's pressing or touching operation. The user input unit 130 may also be configured as a jog wheel, a jog switch, a joystick, or a finger mouse. In particular, when the touch pad forms a layered structure with the display 180 described later, it can be called a touch screen.

The sensing unit 140 senses the current state of the image display apparatus 100, such as its open/closed state and its position, and generates a sensing signal for controlling the operation of the image display apparatus 100. For example, when the video display device 100 is in the form of a slide phone, it can sense whether the slide phone is opened or closed. In addition, it can handle sensing functions related to whether power is supplied by the power supply unit 190, whether the interface unit 125 is connected to an external device, and the like.

The sensing unit 140 may include a proximity sensor 141, a pressure sensor 143, a motion sensor 145, and the like. The proximity sensor 141 can detect the presence of an object approaching the image display device 100 or an object existing in the vicinity of the image display device 100 without mechanical contact. The proximity sensor 141 can detect a nearby object by using a change in the alternating magnetic field or a change in the static magnetic field, or a rate of change in capacitance. The proximity sensor 141 may be equipped with two or more sensors according to the configuration.

The pressure sensor 143 can detect whether pressure is applied to the image display device 100, the magnitude of the pressure, and so on. The pressure sensor 143 may be installed at a position of the image display device 100 where detection of pressure is required depending on the usage environment. If the pressure sensor 143 is installed on the display 180, a touch input through the display 180 and a pressure touch input applied with a greater pressure than the touch input can be distinguished according to the signal output from the pressure sensor 143. In addition, the magnitude of the pressure applied to the display 180 at the time of the pressure touch input can be determined from the signal output from the pressure sensor 143.

The motion sensor 145 detects the position, motion, and the like of the image display device 100 using an acceleration sensor, a gyro sensor, or the like. An acceleration sensor that can be used for the motion sensor 145 is a device that converts an acceleration change in one direction into an electric signal and is widely used along with the development of MEMS (micro-electromechanical systems) technology.

Acceleration sensors come in various types, from sensors that measure large accelerations, such as those built into automobile airbag systems to detect collisions, to sensors that measure small accelerations, such as those that recognize minute motions of a human hand for use as an input means in games. Acceleration sensors are usually configured with two or three axes mounted in one package, and depending on the usage environment only a single Z axis may be required. Therefore, when an acceleration sensor in the X-axis or Y-axis direction is to be used instead of the Z-axis direction, the acceleration sensor may be mounted on the main substrate using a separate piece of substrate.

The gyro sensor is a sensor for measuring the angular velocity, and it can sense the direction of rotation with respect to the reference direction.

The output unit 150 is for outputting an audio signal, a video signal, or an alarm signal. The output unit 150 may include a display 180, an audio output module 153, an alarm unit 155, and a haptic module 157.

The display 180 displays and outputs information processed in the image display apparatus 100. For example, when the video display device 100 is in the call mode, a UI (User Interface) or GUI (Graphic User Interface) associated with the call is displayed. When the image display apparatus 100 is in the video communication mode or the image capture mode, the captured or received images can be displayed individually or simultaneously, and the UI or GUI is displayed.

Meanwhile, as described above, when the display 180 and the touch pad have a mutual layer structure to constitute a touch screen, the display 180 may be used as an input device capable of inputting information by a user's touch in addition to the output device .

If the display 180 is configured as a touch screen, it may include a touch screen panel, a touch screen panel controller, and the like. In this case, the touch screen panel is a transparent panel attached to the exterior and can be connected to the internal bus of the image display apparatus 100. The touch screen panel monitors contacts, and when there is a touch input, it sends the corresponding signals to the touch screen panel controller. The touch screen panel controller processes the signals and transmits the corresponding data to the processor 170, so that the processor 170 can determine whether there was a touch input and which area of the touch screen was touched.

The display 180 may also be configured as electronic paper (e-paper). Electronic paper is a kind of reflective display that offers excellent visual characteristics, such as high resolution, a wide viewing angle, and a bright white background, like conventional paper and ink. Electronic paper can be implemented on any substrate, such as plastic, metal, or paper, and since the image is retained even after the power is turned off, the battery life of the image display apparatus 100 can be maintained for a long time. As the electronic paper, hemispherical twist balls filled with electric charge may be used, or an electrophoretic method and microcapsules may be used.

In addition, the display 180 may include at least one of a liquid crystal display, a thin film transistor-liquid crystal display, an organic light-emitting diode display, a flexible display, and a three-dimensional (3D) display. There may also be two or more displays 180 depending on the implementation of the image display apparatus 100. For example, the image display apparatus 100 may include both an external display (not shown) and an internal display (not shown).

The audio output module 153 outputs audio data received from the wireless communication unit 110 or stored in the memory 160 in the call signal reception mode, call mode, recording mode, voice recognition mode, broadcast reception mode, and the like. The audio output module 153 also outputs audio signals related to functions performed in the image display apparatus 100, for example, a call signal reception tone or a message reception tone. The audio output module 153 may include a speaker, a buzzer, and the like.

The alarm unit 155 outputs a signal for notifying the occurrence of an event in the image display apparatus 100. Examples of events occurring in the video display device 100 include call signal reception, message reception, and key signal input. The alarm unit 155 outputs a signal for notifying the occurrence of an event in a form other than an audio or video signal, for example, as vibration. The alarm unit 155 can output a signal when a call signal is received or a message is received, and when a key signal is input, it can output a signal as feedback to the key signal input. The user can recognize the occurrence of an event through the signal output by the alarm unit 155. A signal for notifying the occurrence of an event in the image display apparatus 100 may also be output through the display 180 or the audio output module 153.

The haptic module 157 generates various tactile effects that the user can feel. A typical example of the haptic effect generated by the haptic module 157 is a vibration effect. When the haptic module 157 generates vibration with a haptic effect, the intensity and pattern of the vibration generated by the haptic module 157 can be converted, and the different vibrations can be synthesized and output or sequentially output.

In addition to vibration, the haptic module 157 can generate various tactile effects, such as stimulation by a pin arrangement moving vertically against the contacted skin surface, stimulation by air injected or sucked through an injection port or a suction port, stimulation by contact with an electrode, stimulation by electrostatic force, and the sensation of cold or warmth using an element capable of absorbing or generating heat. The haptic module 157 can be implemented not only to transmit a tactile effect through direct contact but also to allow the user to feel a tactile effect through the muscular sense of a finger or arm. Two or more haptic modules 157 may be provided depending on the configuration of the image display device 100.

The memory 160 may store programs for the processing and control of the processor 170, and may also temporarily store input or output data (e.g., a phone book, messages, still images, and moving pictures).

The memory 160 may include at least one storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM, and ROM. Also, the image display apparatus 100 may operate a web storage that performs the storage function of the memory 160 over the Internet.

The interface unit 125 serves as an interface with all external devices connected to the video display device 100. Examples of such external devices include a wireless data port, a card socket for a memory card, a SIM (Subscriber Identification Module) card, or a UIM (User Identity Module) card, an audio I/O (input/output) terminal, a video I/O (input/output) terminal, and an earphone. The interface unit 125 may receive data or power from such an external device and deliver it to each component in the image display device 100, and may transmit data in the image display device 100 to the external device.

The interface unit 125 may serve as a path through which power from an external cradle is supplied to the image display apparatus 100 when the image display apparatus 100 is connected to the cradle, or as a path through which various signals input from the cradle are delivered to the image display apparatus 100.

The processor 170 typically controls the operation of each of the above components and controls the overall operation of the image display device 100, for example, performing control and processing related to voice calls, data communication, and video calls. In addition, the processor 170 may include a multimedia playback module 181 for multimedia playback. The multimedia playback module 181 may be implemented in hardware within the processor 170 or in software separately from the processor 170. Meanwhile, the processor 170 may include an application processor (not shown) for running applications, or such an application processor (not shown) may be provided separately from the processor 170.

The power supply unit 190 receives external power and internal power under the control of the processor 170 and supplies power necessary for operation of the respective components.

The image display apparatus 100 having such a configuration may be configured to operate in communication systems capable of transmitting data through frames or packets, including wired/wireless communication systems and satellite-based communication systems.

FIG. 3 is an example of an internal block diagram of the processor of FIG. 2.

The processor 170 according to an exemplary embodiment of the present invention may include a demultiplexer 310, an image processing unit 320, a graphics processing unit 340, a mixer 345, a frame rate converter 350, a formatter 360, an audio processing unit (not shown), and a data processing unit (not shown).

The demultiplexer 310 demultiplexes the input stream. For example, when an MPEG-2 TS is input, it can be demultiplexed into video, audio, and data signals, respectively. The stream signal input to the demultiplexer 310 may be a stream signal output from the tuner 110 or the demodulator 120 or the external device interface 130.

The image processing unit 320 may perform image processing of the demultiplexed image signal. For this, the image processing unit 320 may include a video decoder 325 and a scaler 335.

The video decoder 325 decodes the demultiplexed video signal and the scaler 335 performs scaling so that the resolution of the decoded video signal can be output from the display 180.

The video decoder 325 can include decoders of various standards. For example, an MPEG-2 decoder, an H.264 decoder, a 3D image decoder for a color image and a depth image, and a decoder for multi-view images may be provided.

The graphic processing unit 340 generates a graphic signal according to a user input or itself. For example, based on a user input signal, a signal for displaying various information in a graphic or text form on the screen of the display 180 can be generated. The generated graphic signal may include various data such as a user interface screen of the video display device 100, various menu screens, a widget, and an icon. In addition, the generated graphic signal may include a 2D object or a 3D object.

The graphic processing unit 340 can generate a pointer to be displayed on the display based on a pointing signal input from the remote control device 200. In particular, such a pointer may be generated by a pointing signal processing unit, and the OSD generating unit 240 may include such a pointing signal processing unit (not shown). Of course, the pointing signal processing unit (not shown) may also be provided separately from the OSD generating unit 240.

The mixer 345 may mix the graphic signal generated by the graphic processing unit 340 and the decoded video signal processed by the image processing unit 320. The mixed video signal is supplied to a frame rate converter 350.

A frame rate converter (FRC) 350 can convert the frame rate of an input image. On the other hand, the frame rate converter 350 can output the frame rate without conversion.

The formatter 360 may arrange the left-eye image frame and the right-eye image frame of the frame-rate-converted 3D image. It can also output a synchronization signal Vsync for opening the left-eye glass and the right-eye glass of a 3D viewing apparatus (not shown).

On the other hand, the formatter 360 may convert the format of the input video signal into a video signal for display on a display and output the video signal.

In addition, the formatter 360 can change the format of the 3D video signal, for example, into any one of various 3D formats such as side-by-side, top/down, frame sequential, interlaced, and checker box formats.

Meanwhile, the formatter 360 may convert a 2D video signal into a 3D video signal. For example, according to a 3D image generation algorithm, an edge or a selectable object may be detected in the 2D video signal, and the object according to the detected edge or the selectable object may be separated to generate a 3D video signal. At this time, the generated 3D video signal can be separated into a left-eye image signal L and a right-eye image signal R, as described above.

Although not shown in the drawing, a 3D processor (not shown) for three-dimensional effect signal processing may be further disposed after the formatter 360. The 3D processor (not shown) can process the brightness, tint, and color of the image signal to improve the 3D effect. For example, it can perform signal processing such as making near objects clear and distant objects blurred. The functions of such a 3D processor may be merged into the formatter 360 or into the image processing unit 320.

Meanwhile, the audio processing unit (not shown) in the processor 170 can perform the audio processing of the demultiplexed audio signal. To this end, the audio processing unit (not shown) may include various decoders.

In addition, the audio processing unit (not shown) in the processor 170 can process bass, treble, volume control, and the like.

The data processor (not shown) in the processor 170 may perform data processing of the demultiplexed data signal. For example, if the demultiplexed data signal is a coded data signal, it can be decoded. The encoded data signal may be electronic program guide information including broadcast information such as a start time and an end time of a broadcast program broadcasted on each channel.

Meanwhile, a block diagram of the processor 170 shown in FIG. 3 is a block diagram for one embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted according to the specifications of the processor 170 that is actually implemented.

In particular, the frame rate converter 350 and the formatter 360 may not be provided in the processor 170 but may instead be provided separately, or may be provided separately as a single module.

FIGS. 4A to 4C show various examples of the image processing apparatus provided in the image display apparatus of FIG. 2.

Referring to FIG. 4A, an image processing apparatus 410a includes a processor 170 for setting a frame rate for a graphic image frame, a graphics processing unit 340 for rendering a current image frame according to the set frame rate, and a memory 460 for storing the frame rate for the previous image frame.

Meanwhile, the graphic processing unit 340 outputs the rendered graphic image frame, and the display 180 can display the graphic image.

Referring to FIG. 4A, the processor 170, the graphics processing unit 340, and the memory 460 may be separately provided in the image processing apparatus 410a.

Meanwhile, the graphic processing unit 340 may compare the current image frame with the previous image frame, and output information on the changed tile.

On the other hand, the processor 170 can set the maximum frame rate to be variable based on the number information of the changed tiles or the ratio information of the changed tiles, which is calculated according to the information on the changed tiles. As a result, it is possible to reduce the power consumption in processing the graphic image.

In particular, the processor 170 may be configured to increase the maximum frame rate as the number or percentage of changed tiles increases. Accordingly, in a dynamic scene in which most objects move, the maximum frame rate can be maintained, and it is possible to minimize the disruption of the graphic image that the user can feel due to the variable frame rate.

Meanwhile, when the number or ratio of the changed tiles increases, the processor 170 may set the maximum frame rate to a first frame rate based on the number or ratio of changed tiles in the previous image frame; when the number or ratio of the changed tiles decreases, it may set the maximum frame rate to a second frame rate based on an average of the number or ratio of changed tiles over a plurality of frames including the previous image frame, or on an average of the maximum frame rate over a plurality of frames including the previous image frame.

Meanwhile, the processor 170 may control the maximum frame rate to change by a first variation amount when the number or ratio of the changed tiles increases, and by a second variation amount smaller than the first variation amount when the number or ratio of the changed tiles decreases. As a result, it is possible to prevent the graphic image from appearing jerky due to an abrupt drop in the frame rate.

Meanwhile, if the set maximum frame rate is larger than the current frame rate, the processor 170 controls rendering of the next frame to be performed; if the set maximum frame rate is smaller than the current frame rate, it controls rendering of the next frame to be skipped.
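
For illustration, a minimal C++ sketch of this render-or-skip decision follows; the fixed display refresh rate, the way the achieved frame rate is measured, and all identifiers are assumptions, not the patent's implementation.

    // Hypothetical render-or-skip decision corresponding to steps S640/S645/S647.
    // The display ticks at a fixed refresh rate; a frame is rendered only while the
    // achieved frame rate stays below the currently set maximum frame rate, otherwise
    // rendering is skipped and the previous output is shown again.
    #include <iostream>

    int SimulateOneSecond(int refreshHz, double maxFps) {
        int rendered = 0;
        for (int tick = 0; tick < refreshHz; ++tick) {
            double achievedFps = rendered * refreshHz / double(tick + 1);  // so far
            if (maxFps > achievedFps)
                ++rendered;      // S645: render the next frame
            // else: S647, skip rendering; the previous frame remains on screen
        }
        return rendered;
    }

    int main() {
        std::cout << SimulateOneSecond(60, 60.0) << " frames rendered at cap 60\n";  // ~60
        std::cout << SimulateOneSecond(60, 20.0) << " frames rendered at cap 20\n";  // ~20
        return 0;
    }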

Meanwhile, based on the frame rate for the previous image frame stored in the memory 460, the processor 170 may set the maximum frame rate to the first frame rate when the number or ratio of changed tiles increases, and to the second frame rate when the number or ratio of changed tiles decreases.

Meanwhile, since color-change comparison is performed only for each tile rather than comparing the color values of all pixels, the additional components required for frame rate adjustment can be reduced.

Next, referring to FIG. 4B, the image processing apparatus 410b of FIG. 4B is provided with a graphic processing unit 340 in the processor 170, which is distinguished from the image processing apparatus 410a of FIG. 4A. The operation in each unit is the same as that in Fig. 4A, and a description thereof will be omitted.

Next, referring to FIG. 4C, the image processing apparatus 410c of FIG. 4C has the graphics processing unit 340 and the memory 460 provided within the processor 170, which distinguishes it from the image processing apparatus 410a of FIG. 4A. The operation of each unit is the same as in FIG. 4A, and a description thereof will be omitted.

FIG. 5 illustrates an example of an internal block diagram of an image processing apparatus according to an embodiment of the present invention.

Referring to FIG. 5, the image processing apparatus 410 may include a processor 170 for setting a frame rate for a graphic image frame and a graphics processing unit 340 for rendering a current image frame according to the set frame rate.

The processor 170 may include an application unit 171 for driving an application program, a frame rate control unit 173 for setting a frame rate for the graphic image frame, and a driver 174 for driving the graphics processing unit (GPU) 340.

Meanwhile, the frame rate control unit 173 may be provided in a graphic library 172 in the processor 170.

The driving unit 174 may include a counter collector 177 for counting or calculating the number of changed tiles or the ratio of the changed tiles.

The graphics processing unit 340 may include a frame renderer 341 for rendering the current image frame, and a tile comparer 342 for comparing the current image frame with the previous image frame and outputting information on the changed tiles.

In the figure, the first area 1020a and the second area 1020b in the image 1020 are illustrated as being changed compared to the black frame.

Here, the information on the changed tile may include location information of the changed tile, size information, luminance information, color information, and the like.

The information on the changed tiles output from the tile comparer 342 is input to the counter collector 177 in the driving unit 174, and the counter collector 177 may count or calculate the number of changed tiles or the ratio of changed tiles based on that information.

The number information of the changed tiles or the ratio information of the changed tiles calculated by the counter collector 177 can be input to the frame rate control unit 173.

The frame rate control unit 173 can set the maximum frame rate to be variable based on the changed tile number information or the changed tile ratio information.

Then, the frame renderer 341 performs rendering on the current image frame according to the set maximum frame rate.

In this way, by setting the maximum frame rate in accordance with the number information of the changed tiles or the ratio information of the changed tiles, it is possible to reduce the power consumption in the graphic image processing.
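
The data flow just described, from the tile comparer through the counter collector and the frame rate control unit back to the frame renderer, can be pictured with the following hedged C++ sketch. The stubbed-out rendering, the specific frame-rate policy, and all identifiers are assumptions; only the division of labor mirrors the description of FIG. 5.

    // Hypothetical end-to-end loop mirroring the data flow of FIG. 5 (illustrative).
    // GPU side: render and per-tile comparison; processor side: count changed tiles,
    // derive a ratio, and set the maximum frame rate used for the next frame.
    #include <cstdio>
    #include <vector>

    struct TileChangeInfo {
        std::vector<int> changedTileIds;
        int totalTiles;
    };

    // GPU side (frame renderer 341 + tile comparer 342), stubbed out here:
    // pretend early frames are mostly static and later frames are highly dynamic.
    TileChangeInfo RenderAndCompare(int frameIndex) {
        TileChangeInfo info{{}, 100};
        int changed = frameIndex < 5 ? 3 : 80;
        for (int i = 0; i < changed; ++i) info.changedTileIds.push_back(i);
        return info;
    }

    // Processor side (counter collector 177): ratio of changed tiles.
    double ChangedTileRatio(const TileChangeInfo& info) {
        return double(info.changedTileIds.size()) / info.totalTiles;
    }

    // Processor side (frame rate control unit 173): assumed policy, more changed
    // tiles -> higher maximum frame rate (compare FIG. 7A).
    double SetMaxFrameRate(double ratio) {
        if (ratio <= 0.10) return 20.0;
        if (ratio >= 0.60) return 60.0;
        return 20.0 + (ratio - 0.10) / 0.50 * 40.0;  // linear ramp in between
    }

    int main() {
        for (int frame = 0; frame < 10; ++frame) {
            TileChangeInfo info = RenderAndCompare(frame);
            double ratio = ChangedTileRatio(info);
            double maxFps = SetMaxFrameRate(ratio);
            std::printf("frame %d: changed ratio %.2f -> max fps %.1f\n",
                        frame, ratio, maxFps);
        }
        return 0;
    }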

In particular, the processor 170 may be configured to increase the maximum frame rate as the number or percentage of changed tiles increases. Accordingly, in a dynamic scene in which most objects move, the maximum frame rate can be maintained, and it is possible to minimize the disruption of the graphic image that the user can feel due to the variable frame rate.

Meanwhile, when the number or ratio of the changed tiles increases, the processor 170 may set the maximum frame rate to a first frame rate based on the number or ratio of changed tiles in the previous image frame; when the number or ratio of the changed tiles decreases, it may set the maximum frame rate to a second frame rate based on an average of the number or ratio of changed tiles over a plurality of frames including the previous image frame, or on an average of the maximum frame rate over a plurality of frames including the previous image frame.

Meanwhile, the processor 170 may control the maximum frame rate to change by a first variation amount when the number or ratio of the changed tiles increases, and by a second variation amount smaller than the first variation amount when the number or ratio of the changed tiles decreases. As a result, it is possible to prevent the graphic image from appearing jerky due to an abrupt drop in the frame rate.

Meanwhile, if the set maximum frame rate is larger than the current frame rate, the processor 170 controls rendering of the next frame to be performed; if the set maximum frame rate is smaller than the current frame rate, it controls rendering of the next frame to be skipped.

Meanwhile, based on the frame rate for the previous image frame stored in the memory 460, the processor 170 may set the maximum frame rate to the first frame rate when the number or ratio of changed tiles increases, and to the second frame rate when the number or ratio of changed tiles decreases.

Meanwhile, since color-change comparison is performed only for each tile rather than comparing the color values of all pixels, the additional components required for frame rate adjustment can be reduced.

FIG. 6 is a flowchart showing an operating method of an image processing apparatus according to an embodiment of the present invention, and FIGS. 7A to 10 are views referred to in explaining various examples of the operating method of FIG. 6.

Referring to FIG. 6, the graphics processing unit 340 in the image processing apparatus 410 performs rendering on the current image frame (S610).

The graphics processing unit 340 in the image processing apparatus 410, in particular the tile comparer 342, compares the current image frame with the previous image frame to determine the tile changes.

Next, the processor 170 in the image processing apparatus 410, in particular the counter collector 177, receives the information on the tile changes from the graphics processing unit 340 and, based on this information, measures the amount of tile change (S620).

Next, the processor 170 in the image processing apparatus 410, in particular the frame rate control unit 173, sets the maximum frame rate based on the measured tile change amount (S630).

For example, the processor 170 may be set to increase the maximum frame rate as the number or ratio of changed tiles increases, as shown in FIG. 7A.

Referring to FIG. 7A, when the changed-tile ratio is equal to or smaller than RA1, the maximum frame rate is FPS1; when the changed-tile ratio is between RA1 and RA2, the maximum frame rate increases sequentially from FPS1; and when the changed-tile ratio is RA2 or more, the maximum frame rate is set to FPS2.
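
Read literally, FIG. 7A describes a piecewise mapping from the changed-tile ratio to the maximum frame rate: FPS1 up to RA1, a sequential increase between RA1 and RA2, and FPS2 at or above RA2. The following C++ sketch illustrates such a mapping; the numeric values chosen for RA1, RA2, FPS1, and FPS2 are assumptions, since the patent does not specify them.

    // Hypothetical mapping of the changed-tile ratio to the maximum frame rate per
    // FIG. 7A: FPS1 below RA1, FPS2 at or above RA2, and a sequential (here linear)
    // increase in between. All four constants are assumed values.
    #include <initializer_list>
    #include <iostream>

    constexpr double RA1 = 0.20, RA2 = 0.70;    // assumed ratio thresholds
    constexpr double FPS1 = 24.0, FPS2 = 60.0;  // assumed frame rate bounds

    double MaxFrameRateForRatio(double ratio) {
        if (ratio <= RA1) return FPS1;
        if (ratio >= RA2) return FPS2;
        return FPS1 + (ratio - RA1) / (RA2 - RA1) * (FPS2 - FPS1);
    }

    int main() {
        for (double r : {0.05, 0.20, 0.45, 0.70, 0.90})
            std::cout << "ratio " << r << " -> max fps "
                      << MaxFrameRateForRatio(r) << '\n';
        return 0;
    }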

Next, the processor 170 in the image processing apparatus 410, in particular the frame rate control unit 173, determines whether the set maximum frame rate has increased compared to the previous frame (S635). If so, it determines whether the set maximum frame rate is greater than the actual frame rate (S640), and if so, rendering of the next frame may be performed (S645).

Meanwhile, if the determination in step S635 is negative, that is, when the set maximum frame rate has decreased compared with the previous frame, the processor 170 in the image processing apparatus 410, in particular the frame rate control unit 173, can reset the maximum frame rate (S637).

Meanwhile, when the number or ratio of the changed tiles increases, the processor 170 may set the maximum frame rate to a first frame rate based on the number or ratio of changed tiles in the previous image frame; when the number or ratio of the changed tiles decreases, it may set the maximum frame rate to a second frame rate based on an average of the number or ratio of changed tiles over a plurality of frames including the previous image frame, or on an average of the maximum frame rate over a plurality of frames including the previous image frame.

Meanwhile, the processor 170 may control the maximum frame rate to change by a first variation amount when the number or ratio of the changed tiles increases, and by a second variation amount smaller than the first variation amount when the number or ratio of the changed tiles decreases.

FIG. 7B illustrates that the maximum frame rate is varied sequentially, in accordance with the number or ratio of changed tiles.

Referring to FIG. 7B, the maximum frame rate may start at FPSa and then be set to FPSb, FPSa, FPSc, and FPSd at times T11, T12, T13, and T14, respectively.

Referring to FIG. 7C, the maximum frame rate may start at FPSa and then be set to FPSb, FPSm, FPSn, and FPSo at times T21, T22, T23, and T24, respectively. Here, FPSm, FPSn, and FPSo may have a smaller rate of change than FPSa, FPSc, and FPSd in FIG. 7B.

In FIGS. 7B and 7C, during a rising period the maximum frame rate is set to increase based on the number or ratio of changed tiles in the previous image frame, while during a falling period the maximum frame rate is set based on an average of the number or ratio of changed tiles over a plurality of frames including the previous image frame, or on an average of the maximum frame rate over a plurality of frames including the previous image frame, so that a gentle descent becomes possible.

That is, the resetting of the maximum frame rate in step S637 may be performed based on the number or ratio of changed tiles in the previous image frame, an average of the number or ratio of changed tiles over a plurality of frames including the previous image frame, or an average of the maximum frame rate over a plurality of frames including the previous image frame.

In this way, the gradual decline of the maximum frame rate prevents the graphic image from appearing jerky due to an abrupt change in the frame rate.
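
One way to picture the asymmetric behaviour of FIGS. 7B and 7C, a quick rise driven by the latest frame and a gentle fall driven by an average over recent frames, is the following C++ sketch. The averaging window, the ratio-to-frame-rate mapping, and the step limits are assumptions; the sketch only reproduces the idea that the downward variation amount is smaller than the upward one.

    // Hypothetical asymmetric update of the maximum frame rate (FIGS. 7B/7C).
    // Rising: follow the target derived from the latest changed-tile ratio with a
    // large step limit. Falling: follow an average over the last N frames with a
    // smaller step limit, so the cap declines gently instead of dropping abruptly.
    #include <algorithm>
    #include <cstddef>
    #include <deque>
    #include <iostream>
    #include <numeric>

    class MaxFpsController {
    public:
        double Update(double changedTileRatio) {
            history_.push_back(changedTileRatio);
            if (history_.size() > kWindow) history_.pop_front();

            double target = 20.0 + 40.0 * changedTileRatio;  // assumed mapping
            if (target >= maxFps_) {
                // Rise: first (larger) variation amount, based on the latest frame.
                maxFps_ = std::min(target, maxFps_ + kRiseStep);
            } else {
                // Fall: second (smaller) variation amount, based on a recent average.
                double avgRatio =
                    std::accumulate(history_.begin(), history_.end(), 0.0) / history_.size();
                maxFps_ = std::max(20.0 + 40.0 * avgRatio, maxFps_ - kFallStep);
            }
            return maxFps_;
        }

    private:
        static constexpr std::size_t kWindow = 8;   // assumed averaging window
        static constexpr double kRiseStep = 15.0;   // first variation amount
        static constexpr double kFallStep = 3.0;    // second, smaller variation amount
        std::deque<double> history_;
        double maxFps_ = 20.0;
    };

    int main() {
        MaxFpsController ctrl;
        double ratios[] = {0.9, 0.9, 0.9, 0.05, 0.05, 0.05, 0.05, 0.05};  // burst, then static
        for (double r : ratios)
            std::cout << "ratio " << r << " -> max fps cap " << ctrl.Update(r) << '\n';
        return 0;
    }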

Meanwhile, if the set maximum frame rate is not greater than the actual frame rate in step S640, the processor 170 in the image processing apparatus 410 skips rendering of the next frame (S647).

FIGS. 8 to 10 show various examples of the image processing apparatuses 410aa, 410ab, and 410ac of the present invention.

The image processing apparatus 410aa of FIG. 8 includes a processor 170 for setting a frame rate for a graphic image frame, a graphics processing unit 340 for rendering a current image frame according to the set frame rate, and a memory 460 for storing the frame rate for the previous image frame.

The processor 170 includes an application unit 171 for driving an application, a frame rate control unit 173 for setting a frame rate for the graphic image frame, and a GPU driver 174 for driving the graphics processing unit (GPU) 340.

Meanwhile, the frame rate control unit 173 may be provided in a graphic library 172 in the processor 170.

The driving unit 174 may include a counter collector 177 for counting or calculating the number of changed tiles or the ratio of the changed tiles.

The graphic processing unit 340 may include a vertex shader 343, a tiler 344, a fragment shader 346, a local tile memory 347, and counter registers 348.

Meanwhile, the local tile memory 347 may include a tile comparer 347a, and the counter registers 348 may include a removed tile number calculating unit 348a.

The tile comparer 347a may correspond to the tile comparer 342 of FIG. 5.

The memory 460 may include an attribute, a geometry working set, a texture, a frame buffer, and an FPS history buffer 465.

Here, the FPS history buffer 465 may store the frame rate for the previous image frame.

Meanwhile, FPS is an abbreviation of frames per second. In this specification and the drawings, the terms FPS and frame rate are used with the same or similar meaning.

Meanwhile, the operation of the processor 170 of FIG. 8 is similar to that of FIG. 5, and the description thereof will be omitted.

Next, the image processing apparatus 410ab of FIG. 9 is similar to the image processing apparatus 410aa of FIG. 8, and the differences between them will be mainly described. In the image processing apparatus 410ab of FIG. 9, the frame rate control unit 173 is provided together with the counter collector 177 in the driver 174, rather than in the graphic library 172.

That is, the driving unit 174 may include a frame rate control unit 173.

Next, the image processing apparatus 410ac shown in FIG. 10 is similar to the image processing apparatus 410aa shown in FIG. 8, and the differences between them will be mainly described. In the image processing apparatus 410ac shown in FIG. 10, ( ) is provided in the processor 170, and the removed tile number calculating unit 348a is provided in the memory 460.

Meanwhile, the operating method of the image processing apparatus of the present invention can be implemented as processor-readable code on a processor-readable recording medium included in the image display apparatus. The processor-readable recording medium includes all kinds of recording devices in which data readable by the processor is stored. Examples of the processor-readable recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and it may also be implemented in the form of a carrier wave, such as transmission over the Internet. In addition, the processor-readable recording medium may be distributed over network-connected computer systems so that processor-readable code can be stored and executed in a distributed fashion.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments, and it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention.

Claims (11)

An image processing apparatus comprising:
a processor for setting a frame rate for a graphic image frame; and
a graphics processing unit for rendering a current image frame according to the set frame rate,
wherein the graphics processing unit compares the current image frame with a previous image frame and outputs information on changed tiles, and
wherein the processor sets a maximum frame rate to be variable based on number information of the changed tiles or ratio information of the changed tiles calculated according to the information on the changed tiles.
The image processing apparatus according to claim 1,
wherein the processor sets the maximum frame rate to increase as the number or ratio of the changed tiles increases.
The image processing apparatus according to claim 1,
wherein the processor sets the maximum frame rate to a first frame rate based on the number or ratio of changed tiles in the previous image frame when the number or ratio of the changed tiles increases, and
sets the maximum frame rate to a second frame rate based on an average of the number or ratio of changed tiles in a plurality of frames including the previous image frame, or an average of the maximum frame rate in the plurality of frames including the previous image frame, when the number or ratio of the changed tiles decreases.
The image processing apparatus according to claim 1,
wherein the processor controls the maximum frame rate to be changed by a first variation amount when the number or ratio of the changed tiles increases, and
controls the maximum frame rate to be changed by a second variation amount smaller than the first variation amount when the number or ratio of the changed tiles decreases.
The image processing apparatus according to claim 1,
wherein the processor controls rendering of a next frame to be performed when the set maximum frame rate is larger than a current frame rate, and
controls rendering of the next frame to be skipped when the set maximum frame rate is smaller than the current frame rate.
The image processing apparatus according to claim 3, further comprising a memory for storing a frame rate for the previous image frame,
wherein the processor sets the maximum frame rate to the first frame rate when the number or ratio of the changed tiles increases, and sets the maximum frame rate to the second frame rate when the number or ratio of the changed tiles decreases, based on the frame rate for the previous image frame stored in the memory.
The image processing apparatus according to claim 1,
wherein the processor comprises:
an application unit for driving a game application;
a frame rate control unit for setting the frame rate for the graphic image frame; and
a driving unit for driving the graphics processing unit,
wherein the graphics processing unit comprises a tile comparison unit for comparing the current image frame with the previous image frame and outputting the information on the changed tiles.
The image processing apparatus according to claim 7,
wherein the driving unit includes the frame rate control unit.
The image processing apparatus according to claim 1,
wherein the processor comprises:
an application unit for driving an application;
a frame rate control unit for setting the frame rate for the graphic image frame;
a driving unit for driving the graphics processing unit; and
a tile comparison unit for comparing the current image frame with the previous image frame and outputting the information on the changed tiles.
The image processing apparatus according to claim 9, further comprising a memory for storing a frame rate for the previous image frame.
An image display apparatus comprising the image processing apparatus according to any one of claims 1 to 10.
KR1020160020830A 2016-02-22 2016-02-22 Image processing device and Image display apparatus including the same KR20170098641A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160020830A KR20170098641A (en) 2016-02-22 2016-02-22 Image processing device and Image display apparatus including the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160020830A KR20170098641A (en) 2016-02-22 2016-02-22 Image processing device and Image display apparatus including the same

Publications (1)

Publication Number Publication Date
KR20170098641A true KR20170098641A (en) 2017-08-30

Family

ID=59760637

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160020830A KR20170098641A (en) 2016-02-22 2016-02-22 Image processing device and Image display apparatus including the same

Country Status (1)

Country Link
KR (1) KR20170098641A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11328649B2 (en) 2018-12-19 2022-05-10 Samsung Display Co., Ltd. Driving controller, display device having the same, and driving method of display device
US11823607B2 (en) 2018-12-19 2023-11-21 Samsung Display Co., Ltd. Driving controller, display device having the same, and driving method of display device

Similar Documents

Publication Publication Date Title
CN109712224B (en) Virtual scene rendering method and device and intelligent device
CN108595239B (en) Picture processing method, device, terminal and computer readable storage medium
CN109191549B (en) Method and device for displaying animation
CN111372126B (en) Video playing method, device and storage medium
CN110368689B (en) Game interface display method, system, electronic equipment and storage medium
CN108449641B (en) Method, device, computer equipment and storage medium for playing media stream
CN110856019B (en) Code rate allocation method, device, terminal and storage medium
US10732792B2 (en) Image display apparatus and method for changing properties of a highlighted item and surrounding items
CN108196755B (en) Background picture display method and device
CN108734662B (en) Method and device for displaying icons
US20230179734A1 (en) Video image display method and apparatus, multimedia device and storage medium
KR20170092868A (en) A display apparatus and a display method
CN111010588B (en) Live broadcast processing method and device, storage medium and equipment
CN109636715B (en) Image data transmission method, device and storage medium
CN111083554A (en) Method and device for displaying live gift
CN108664300B (en) Application interface display method and device in picture-in-picture mode
KR20180071775A (en) Power supply device and mobile terminal including the same
KR20170098641A (en) Image processing device and Image display apparatus including the same
KR20170098637A (en) Image processing device and Image display apparatus including the same
CN115798418A (en) Image display method, device, terminal and storage medium
JP7005160B2 (en) Electronic devices and their control methods
KR20170013738A (en) Image display apparatus, and mobile termial
CN110119233B (en) Content pushing method and device, terminal and storage medium
EP3032393A1 (en) Display apparatus and display method
EP3319308B1 (en) Spherical image display apparatus and method of displaying spherical image on a planar display