KR20130042958A - Image display apparatus, server and method for operating the same - Google Patents


Info

Publication number
KR20130042958A
Authority
KR
South Korea
Prior art keywords
image
osd
server
receiving
display device
Prior art date
Application number
KR1020110107139A
Other languages
Korean (ko)
Inventor
임종현
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date
Filing date
Publication date
Application filed by LG Electronics Inc. (엘지전자 주식회사)
Priority to KR1020110107139A
Publication of KR20130042958A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04H BROADCAST COMMUNICATION
    • H04H60/00 Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
    • H04H60/09 Arrangements for device control with a direct linkage to broadcast information or to broadcast space-time; Arrangements for control of broadcast-related services
    • H04H60/14 Arrangements for conditional access to broadcast information or to broadcast-related services
    • H04H60/20 Arrangements for conditional access to broadcast information or to broadcast-related services on secondary editing information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/235 Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • H04N21/2355 Processing of additional data involving reformatting operations of additional data, e.g. HTML pages
    • H04N21/2358 Processing of additional data involving reformatting operations for generating different versions, e.g. for different recipient devices
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/4227 Providing remote input by a user located remotely from the client device, e.g. at work

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

PURPOSE: An image display device, a server, and an operating method thereof are provided to efficiently manage resources between the image display device and the server by synthesizing an image or an OSD (On Screen Display) received from the server with an external input image and displaying the result.

CONSTITUTION: An image display device receives an external input image from an external device (S610). The image display device receives an image or OSD from a server (S620). The image display device synthesizes the image or OSD with the external input image (S630). The image display device displays the synthesized image on a display (S640).

Reference numerals: (AA) Start; (BB) No; (CC) Yes; (DD) End; (S610) Receiving an external input image from an external device; (S620) Receiving an OSD or image from a network; (S625) Broadcasting image received?; (S630) Synthesizing the OSD or image with the external input image; (S635) Synthesizing the OSD or image with the broadcasting image; (S640) Displaying the synthesized image.
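The S610–S640 (and S625/S635 branch) flow in the abstract can be sketched as a short client-side routine. Everything below — the `synthesize` and `display_flow` names, the use of `None` as a transparent OSD pixel, and the toy list-of-lists frames — is a hypothetical illustration, not part of the patent.

```python
def synthesize(base, overlay):
    """Overlay OSD pixels on a base image; None in the overlay is transparent."""
    return [[o if o is not None else b for b, o in zip(brow, orow)]
            for brow, orow in zip(base, overlay)]

def display_flow(external_frame, broadcast_frame, osd):
    """Steps S610-S640: pick the base image, composite the OSD, return the result.

    S625: if a broadcast frame was received, the OSD is composited onto it (S635);
    otherwise it is composited onto the external input frame (S630).
    Displaying the result (S640) is left to the caller.
    """
    base = broadcast_frame if broadcast_frame is not None else external_frame
    return synthesize(base, osd)

# Toy 2x2 frames: integers stand in for pixel values.
ext = [[1, 1], [1, 1]]
osd = [[None, 9], [None, None]]        # one opaque OSD pixel
print(display_flow(ext, None, osd))    # -> [[1, 9], [1, 1]]
```

With a broadcast frame present, the same OSD lands on the broadcast image instead, mirroring the S625 branch in the flowchart.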

Description

[0001] The present invention relates to an image display apparatus, a server, and a method of operating the same.

The present invention relates to an image display apparatus, a server, and an operation method thereof, and more particularly, to an image display apparatus, a server, and an operation method capable of performing a quick boot.

An image display device is a device having a function of displaying an image that a user can watch. The user can watch broadcasts through the image display device. An image display device displays, on a display, the broadcast selected by the user from among the broadcast signals transmitted from broadcast stations. Currently, broadcasting is shifting from analog to digital broadcasting worldwide.

Digital broadcasting refers to broadcasting for transmitting digital video and audio signals. Digital broadcasting is more resistant to external noise than analog broadcasting, so it has less data loss, is advantageous for error correction, has a higher resolution, and provides a clearer picture. In addition, unlike analog broadcasting, digital broadcasting is capable of bidirectional services.

An object of the present invention is to provide an image display apparatus, a server, and a method of operating the same, which can efficiently manage resources between the image display apparatus and the server.

Another object of the present invention is to provide an image display apparatus, a server, and an operation method thereof, which may improve user convenience.

According to an aspect of the present invention, there is provided a method of operating an image display device, the method comprising: receiving an external input image from an external device; receiving an OSD or an image from a server; synthesizing the received OSD or image with the external input image; and displaying the synthesized image.

In addition, a method of operating an image display apparatus according to an embodiment of the present invention for achieving the above object comprises: receiving an OSD or an image from a server; receiving a broadcast image; synthesizing the received OSD or image with the received broadcast image; and displaying the synthesized image.

In addition, a method of operating a server according to an embodiment of the present invention for achieving the above object comprises: receiving an OSD or image transmission request from the image display device; generating the corresponding OSD or image according to the request; and transmitting the generated OSD or image to the image display apparatus, wherein the OSD or image includes a pointer image corresponding to a pointing signal of a remote controller.

In addition, a method of operating a server according to an embodiment of the present invention for achieving the above object comprises: receiving an external input image; receiving an OSD or image transmission request from the image display device; generating the corresponding OSD or image according to the request; synthesizing the generated OSD or image with the received external input image; and transmitting the synthesized image to the image display apparatus.

In addition, an image display apparatus according to an embodiment of the present invention for achieving the above object comprises: an external device interface unit for receiving an external input image from an external device; a network interface unit for receiving an OSD or an image from a server; an image synthesizer for synthesizing the OSD or image with the external input image; and a display for displaying the image synthesized by the image synthesizer.

In addition, an image display device according to an embodiment of the present invention for achieving the above object comprises: a network interface unit for receiving an OSD or an image from a server; a broadcast receiving unit for receiving a broadcast image; an image synthesizer for synthesizing the OSD or image with the broadcast image; and a display for displaying the image synthesized by the image synthesizer.

In addition, a server according to an embodiment of the present invention for achieving the above object comprises: a network interface unit for receiving an OSD or image transmission request from the image display device; and a graphic processing unit for generating the corresponding OSD or image according to the request, wherein the network interface unit transmits the generated OSD or image to the image display device, and the OSD or image may include a pointer image corresponding to a pointing signal of a remote controller.

In addition, a server according to an embodiment of the present invention for achieving the above object comprises: an external device interface unit for receiving an external input image; a network interface unit for receiving an OSD or image transmission request from the image display device; a graphic processing unit for generating the corresponding OSD or image according to the request; and an image synthesizing unit for synthesizing the generated OSD or image with the received external input image, wherein the network interface unit transmits the synthesized image to the image display apparatus.

According to an embodiment of the present invention, by receiving an external input image from an external device, receiving an OSD or an image from a server, and synthesizing and displaying the received OSD or image with the received external input image, resources can be managed efficiently between the image display apparatus and the server.

Meanwhile, since the broadcast image is signal-processed in the video display device while the OSD or image is received from the server, resources between the video display device and the server can be managed efficiently.

The OSD or image from the server may be a graphic OSD or image for the home screen, or the like. By using such a server, signal processing for various functions in the smart TV can be performed in the server.

According to this method, various functions in the smart TV can be easily updated through the server, thereby increasing user convenience. In addition, it is possible to continuously improve performance without replacing the image display device.

In addition, signal processing and data provision for a plurality of video display devices can be handled by a single server.

FIG. 1 is a diagram schematically illustrating a video display device system according to an exemplary embodiment of the present invention.
FIG. 2 is an internal block diagram of the image display device of FIG. 1.
FIG. 3 is an internal block diagram of the controller of FIG. 2.
FIG. 4 is a diagram illustrating an example of a legacy system platform in the image display device of FIG. 1.
FIG. 5 is a diagram illustrating an example of the structure of a smart system platform in the server of FIG. 1.
FIG. 6 is a flowchart illustrating a method of operating an image display apparatus according to an exemplary embodiment of the present invention.
FIGS. 7 to 13 are views for explaining various examples of the operating method of the image display apparatus of FIG. 6.
FIG. 14 is a flowchart illustrating a method of operating a server according to an embodiment of the present invention.
FIG. 15 is a flowchart illustrating a method of operating a server according to another embodiment of the present invention.
FIGS. 16 and 17 are views referred to in describing various examples of the method of operating the server of FIG. 15.

Hereinafter, the present invention will be described in more detail with reference to the drawings.

The suffixes "module" and "part" for components used in the following description are given merely for convenience of description and do not carry special significance or roles in themselves. Accordingly, the terms "module" and "part" may be used interchangeably.

FIG. 1 is a diagram schematically illustrating a video display device system according to an exemplary embodiment of the present invention.

Referring to the drawings, the image display device system 50 according to an embodiment of the present invention may include an image display device 100, an external device 300, and a server 500.

The video display device system 50 according to an embodiment of the present invention may be a virtual desktop infrastructure (VDI) system.

In such a VDI environment, the video display device 100 can operate as a client device. Meanwhile, the server 500 transmits predetermined data to the image display apparatus 100, which is a client device.

Typically, in a VDI environment, the server is responsible for controlling the peripherals of the client device; in the embodiment of the present invention, however, it is assumed that the server 500 is only selectively responsible for controlling the peripherals.

In particular, when the image display apparatus 100 receives and displays a broadcast image, since the amount of broadcast image data is considerable, it is assumed that the processing of the broadcast image is handled not by the server 500 but by the image display apparatus 100 or the external device 300.

To this end, the video display device 100 may include a legacy system platform 400, and the server 500 may include a smart system platform 405. This will be described later with reference to FIGS. 4 and 5.

Meanwhile, the above-described image display apparatus 100 may be a digital broadcast receiver capable of receiving fixed or mobile digital broadcasting.

On the other hand, the image display device 100 described in this specification may be a TV receiver capable of displaying a broadcast image, a monitor, a mobile phone, a smart phone, a notebook computer, a digital broadcasting terminal, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), and the like.

The external device 300 may be a digital versatile disk (DVD) player, a Blu-ray player, a game device, a camera, a camcorder, a computer (laptop), a set top box, or the like. The external device 300 may be connected to the image display apparatus 100 by wire / wireless and perform an input / output operation with the image display apparatus 100.

The server 500 may be connected to the image display apparatus 100 by wire or wirelessly, generate a corresponding OSD or image according to a data transmission request from the image display apparatus 100, and transmit it to the image display apparatus 100. In addition, the server 500 may perform various control operations for the image display apparatus 100.
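The request/response exchange described here can be sketched as follows. The JSON request format, the field names, and the trivial 3×3 "pointer image" are all assumptions made for illustration; the patent does not specify a wire format.

```python
import json

def handle_osd_request(request_json):
    """Server-side sketch: receive an OSD/image transmission request from the
    display device, generate the corresponding graphic, and return it.
    The request schema ('type', 'x', 'y') is hypothetical."""
    req = json.loads(request_json)
    if req["type"] == "pointer":
        # Generate a trivial 3x3 'pointer image' centered on the pointing
        # coordinates reported by the remote controller.
        x, y = req["x"], req["y"]
        return {"kind": "pointer",
                "pixels": [[x + dx, y + dy]
                           for dy in (-1, 0, 1)
                           for dx in (-1, 0, 1)]}
    return {"kind": "osd", "pixels": []}  # fallback: an empty OSD graphic

reply = handle_osd_request(json.dumps({"type": "pointer", "x": 10, "y": 20}))
print(reply["kind"], len(reply["pixels"]))   # -> pointer 9
```

In a real deployment the reply would be an encoded image sent over the network interface rather than a coordinate list; the point of the sketch is only the request-then-generate-then-transmit sequence.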

The server 500 is a server adjacent to the image display apparatus 100 and may be a home server. In particular, the server 500 may perform signal processing for various functions of the smart TV and transmit the processed data to the image display apparatus 100.

In particular, in a VDI environment, all processing resources reside in the server 500, and the image display apparatus 100, which is a client device, may combine and display images or OSDs processed by the server 500, the external device 300, and the like.

According to this method, various functions of the smart TV can be easily updated through the server 500, so that user convenience can be increased. In addition, it is possible to continuously improve performance without replacing the image display device.

In addition, signal processing and data can be provided for a plurality of image display apparatuses through one server 500.

FIG. 2 is an internal block diagram of the image display device of FIG. 1.

Referring to FIG. 2, an image display apparatus 100 according to an exemplary embodiment of the present invention includes a broadcast receiving unit 105, a network interface unit 130, an external device interface unit 135, a storage unit 140, a user input interface unit 150, a control unit 170, a display 180, an audio output unit 185, and a power supply unit 190. Among these, the broadcast receiving unit 105 may include a tuner 110 and a demodulator 120. Meanwhile, the broadcast receiving unit 105 may further include the network interface unit 130.

The tuner 110 selects, from among the RF (Radio Frequency) broadcast signals received through an antenna, an RF broadcast signal corresponding to a channel selected by the user, or the RF broadcast signals of all pre-stored channels. The selected RF broadcast signal is converted into an intermediate frequency signal or a baseband video or audio signal.

For example, if the selected RF broadcast signal is a digital broadcast signal, it may be converted into a digital IF signal (DIF); if it is an analog broadcast signal, it may be converted into an analog baseband video or audio signal (CVBS/SIF).

Meanwhile, the tuner 110 may sequentially select the RF broadcast signals of all broadcast channels stored through a channel memory function from among the RF broadcast signals received through the antenna, and convert them into intermediate frequency signals or baseband video or audio signals.

The demodulator 120 receives the digital IF signal DIF converted by the tuner 110 and performs a demodulation operation.

The demodulator 120 may output a stream signal TS after performing demodulation and channel decoding. At this time, the stream signal may be a signal in which a video signal, a voice signal, or a data signal is multiplexed.

The stream signal output from the demodulator 120 may be input to the controller 170. After performing demultiplexing, image / audio signal processing, and the like, the controller 170 outputs an image to the display 180 and outputs audio to the audio output unit 185.

The external device interface unit 135 may connect the external device to the image display device 100. To this end, the external device interface unit 135 may include an A / V input / output unit (not shown) or a wireless communication unit (not shown).

The external device interface unit 135 may be connected by wire or wirelessly to an external device such as a DVD (Digital Versatile Disk) player, a Blu-ray player, a game device, a camera, a camcorder, a computer (laptop), or a set-top box, and may perform input/output operations with the external device.

The A / V input / output unit may receive a video and audio signal of an external device. The wireless communication unit may perform short range wireless communication with another electronic device.

The network interface unit 130 provides an interface for connecting the image display apparatus 100 to a wired/wireless network including the Internet. For example, the network interface unit 130 may receive, through the network, content or data provided by the Internet, a content provider, or a network operator.

Meanwhile, the network interface unit 130 may access a predetermined web page through a connected network or another network linked to the connected network. That is, by accessing a predetermined web page through the network, it is possible to send or receive data with the server. In addition, content or data provided by a content provider or a network operator may be received.

In addition, the network interface unit 130 may select and receive a desired application from among applications that are open to the public through the network.

The storage 140 may store a program for processing and controlling each signal in the controller 170, or may store a signal processed video, audio, or data signal.

In addition, the storage 140 may perform a function for temporarily storing an image, audio, or data signal input from the external device interface 135 or the network interface 130. In addition, the storage 140 may store information on a predetermined broadcast channel through a channel storage function.

In addition, the storage 140 may store an application or a list of applications input from the external device interface 135 or the network interface 130.

The image display apparatus 100 may reproduce and provide a content file (video file, still image file, music file, document file, application file, etc.) stored in the storage 140 to a user.

FIG. 2 illustrates an embodiment in which the storage 140 is provided separately from the controller 170, but the scope of the present invention is not limited thereto. The storage unit 140 may be included in the control unit 170.

The user input interface unit 150 transmits a signal input by the user to the controller 170 or transmits a signal from the controller 170 to the user.

For example, the user input interface unit 150 may transmit to the controller 170 a user input signal, such as power on/off, channel selection, or screen setting, received from the remote controller 200; transmit to the controller 170 a user input signal input from a local key (not shown) such as a power key, a channel key, a volume key, or a setting key; transmit to the controller 170 a user input signal input from a sensor unit (not shown) that senses the user's gesture; or transmit a signal from the controller 170 to the sensor unit (not shown).

The control unit 170 may demultiplex a stream input through the tuner 110, the demodulator 120, or the external device interface unit 135, or process the demultiplexed signals, to generate and output video or audio signals.

The image signal processed by the controller 170 may be input to the display 180 and displayed as an image corresponding to the image signal. The video signal processed by the controller 170 may also be input to an external output device through the external device interface unit 135.

The audio signal processed by the control unit 170 may be output as audio through the audio output unit 185. In addition, the audio signal processed by the controller 170 may be input to an external output device through the external device interface unit 135.

Although not shown in FIG. 2, the controller 170 may include a demultiplexer, an image processor, and the like. This will be described later with reference to FIG. 3.

In addition, the controller 170 may control the overall operations of the image display apparatus 100. For example, the controller 170 may control the tuner 110 to select an RF broadcast corresponding to a channel selected by the user or a pre-stored channel.

In addition, the controller 170 may control the image display apparatus 100 by a user command or an internal program input through the user input interface unit 150. In particular, the user may access the network to download the desired application or application list into the image display apparatus 100.

For example, the controller 170 controls the tuner 110 to input a signal of a selected channel according to a predetermined channel selection command received through the user input interface unit 150. Then, video, audio, or data signals of the selected channel are processed. The controller 170 may output the channel information selected by the user together with the processed video or audio signal through the display 180 or the audio output unit 185.

As another example, the controller 170 may, according to an external-device image playback command received through the user input interface unit 150, output a video or audio signal received from an external device, for example a camera or a camcorder, through the display 180 or the audio output unit 185.

The controller 170 may control the display 180 to display an image. In this case, the image displayed on the display 180 may be a still image or a video, and may be a 2D image or a 3D image.

Meanwhile, the controller 170 may generate a 3D object for a predetermined 2D object among the images displayed on the display 180, and display the 3D object. For example, the object may be at least one of a connected web screen (newspaper, magazine, etc.), an EPG (Electronic Program Guide), various menus, widgets, icons, still images, videos, and text.

The controller 170 may recognize a location of a user based on an image photographed by a photographing unit (not shown). For example, the distance (z-axis coordinate) between the user and the image display apparatus 100 may be determined. In addition, the x-axis coordinates and the y-axis coordinates in the display 180 corresponding to the user position may be determined.

Meanwhile, when an application view item is entered, the controller 170 may control the display of applications or a list of applications that are stored in the image display apparatus 100 or downloadable from an external network.

The controller 170 may control an application downloaded from an external network to be installed and run, together with various user interfaces. In addition, upon user selection, it may control an image related to the executed application to be displayed on the display 180.

The display 180 converts an image signal, a data signal, or an OSD signal processed by the controller 170, or an image signal, a data signal, or the like received from the external device interface unit 135, into R, G, and B signals to generate a drive signal.

The display 180 may be a PDP, an LCD, an OLED, a flexible display, or a 3D display.

The display 180 may be configured as a touch screen and used as an input device in addition to the output device.

The audio output unit 185 receives an audio-processed signal from the controller 170 and outputs it as audio.

The power supply unit 190 supplies power throughout the image display apparatus 100. In particular, power may be supplied to the controller 170, which may be implemented in the form of a System On Chip (SOC), to the display 180 for displaying an image, and to the audio output unit 185 for audio output.

To this end, the power supply unit 190 may include a converter (not shown) for converting AC power into DC power. Meanwhile, for example, when the display 180 is implemented as a liquid crystal panel having a plurality of backlight lamps, an inverter (not shown) capable of PWM operation may further be provided for variable-brightness driving or dimming.

The remote control apparatus 200 transmits a user input to the user input interface unit 150. To this end, the remote control apparatus 200 may use RF (Radio Frequency) communication, infrared (IR) communication, Bluetooth, Ultra Wideband (UWB), ZigBee, and the like.

In addition, the remote control apparatus 200 may receive a video, audio, or data signal output from the user input interface unit 150, display it on the remote control apparatus 200, or output audio or vibration.

The remote control apparatus 200 may transmit coordinate information corresponding to the movement of the remote control apparatus 200 to the image display apparatus 100. As a result, a pointer corresponding to the movement of the remote control apparatus 200 may be displayed on the display of the image display apparatus. As such, since the corresponding pointer is moved and displayed according to the movement in the 3D space, this may be referred to as a 3D pointing device.
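A minimal sketch of how motion reported by such a 3D pointing device could be mapped to on-screen pointer coordinates. The gain factor, the 1920×1080 display size, and the coordinate convention are illustrative assumptions, not values from the patent.

```python
def update_pointer(pos, delta, width=1920, height=1080, gain=8.0):
    """Map a motion delta reported by the remote (e.g. gyro readings) to a
    new on-screen pointer position, clamped to the display bounds.
    'gain' scales raw motion units into pixels and is an assumption."""
    x = min(max(pos[0] + gain * delta[0], 0), width - 1)
    y = min(max(pos[1] + gain * delta[1], 0), height - 1)
    return (x, y)

p = (960, 540)                       # start at screen center
p = update_pointer(p, (2.0, -1.0))   # remote moved right and up
print(p)                             # -> (976.0, 532.0)
```

The clamping keeps the pointer on-screen no matter how far the remote is swung, which is the usual behavior for this class of device.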

Meanwhile, a block diagram of the image display device 100 shown in FIG. 2 is a block diagram for an embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted according to the specifications of the image display apparatus 100 that is actually implemented. That is, two or more constituent elements may be combined into one constituent element, or one constituent element may be constituted by two or more constituent elements, if necessary. In addition, the functions performed in each block are intended to illustrate the embodiments of the present invention, and the specific operations and apparatuses do not limit the scope of the present invention.

On the other hand, unlike as shown in FIG. 2, the image display device 100 may not include the tuner 110 and the demodulator 120 shown in FIG. 2, and may instead receive and reproduce a broadcast image through the network interface unit 130 or the external device interface unit 135.

FIG. 3 is an internal block diagram of the controller of FIG. 2.

Referring to the drawing, the control unit 170 according to an embodiment of the present invention may include a demultiplexer 310, an image processor 320, an OSD generator 340, a mixer 350, a frame rate converter 355, and a formatter 360. In addition, it may further include a voice processor (not shown) and a data processor (not shown).

The demultiplexer 310 demultiplexes an input stream. For example, when an MPEG-2 TS is input, it may be demultiplexed and separated into video, audio, and data signals, respectively. Here, the stream signal input to the demultiplexer 310 may be a stream signal output from the tuner 110, the demodulator 120, or the external device interface unit 135.
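The PID-based separation the demultiplexer 310 performs on an MPEG-2 TS input can be sketched as below. Real streams announce the PID-to-elementary-stream mapping in PAT/PMT tables and may carry adaptation fields; this sketch assumes a fixed `pid_map` and ignores adaptation fields, which is a simplification.

```python
TS_PACKET = 188   # MPEG-2 transport stream packet size in bytes
SYNC = 0x47       # sync byte at the start of every TS packet

def demux(stream, pid_map):
    """Split a transport stream into per-type byte buffers by PID.
    pid_map assigns PIDs to 'video'/'audio'/'data' categories."""
    out = {kind: bytearray() for kind in set(pid_map.values())}
    for i in range(0, len(stream) - TS_PACKET + 1, TS_PACKET):
        pkt = stream[i:i + TS_PACKET]
        if pkt[0] != SYNC:
            continue  # lost sync; a real demuxer would resynchronize
        # 13-bit PID sits in the low 5 bits of byte 1 and all of byte 2.
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]
        kind = pid_map.get(pid)
        if kind:
            out[kind] += pkt[4:]  # payload (adaptation fields ignored)
    return out

# Build one fake video packet (PID 0x100) and demux it.
pkt = bytes([SYNC, 0x01, 0x00, 0x10]) + bytes(184)
buffers = demux(pkt, {0x100: "video", 0x101: "audio"})
print(len(buffers["video"]))   # -> 184
```

The separated video, audio, and data buffers then feed the image processor, voice processor, and data processor described below.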

The image processor 320 may perform image processing of the demultiplexed image signal. To this end, the image processor 320 may include an image decoder 325 and a scaler 335.

The image decoder 325 decodes the demultiplexed video signal, and the scaler 335 performs scaling so that the decoded video signal can be output at the resolution of the display 180.

The image decoder 325 may include decoders of various standards.

On the other hand, the video signal decoded by the image processor 320 is input to the mixer 350.

The processor 330 may control overall operations in the image display apparatus 100 or the controller 170. For example, the processor 330 may control the tuner 110 to control tuning of an RF broadcast corresponding to a channel selected by a user or a previously stored channel.

In addition, the processor 330 may control the image display apparatus 100 by a user command or an internal program input through the user input interface unit 150.

In addition, the processor 330 may perform data transmission control with the network interface unit 130 or the external device interface unit 135.

The processor 330 may control operations of the demultiplexing unit 310, the image processing unit 320, the OSD generating unit 340, and the like in the controller 170.

The OSD generator 340 generates an OSD signal according to a user input or on its own. For example, based on a user input signal or a control signal, it may generate a signal for displaying various kinds of information as graphics or text on the screen of the display 180. The generated OSD signal may include various data such as a user interface screen of the image display apparatus 100, various menu screens, widgets, and icons.

For example, the OSD generator 340 may generate a signal for displaying broadcast information based on subtitles or EPGs of a broadcast image.

On the other hand, since the OSD generation unit 340 generates an OSD signal or a graphic signal, it may be referred to as a graphic processing unit.

The mixer 350 may mix the OSD signal generated by the OSD generator 340 with the decoded image signal processed by the image processor 320. The mixed signal is provided to the formatter 360. Since the decoded broadcast video signal or external input signal is mixed with the OSD signal, the OSD may be displayed overlaid on the broadcast video or the external input video.
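Per pixel, the overlay behavior of such a mixer amounts to standard alpha blending; the one-pixel helper below is an illustrative sketch, not the patent's implementation.

```python
def mix(video_px, osd_px, alpha):
    """Blend one OSD pixel over one decoded-video pixel.
    alpha=1.0 shows only the OSD, alpha=0.0 only the video."""
    return tuple(round(alpha * o + (1 - alpha) * v)
                 for v, o in zip(video_px, osd_px))

# 50% blend of a mid-gray video pixel and a lighter OSD pixel:
print(mix((100, 100, 100), (200, 200, 200), 0.5))   # -> (150, 150, 150)
```

Applying this over every pixel of the OSD's opaque region produces the overlaid picture that goes on to the formatter.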

The frame rate converter (FRC) 355 may convert the frame rate of the input video. Alternatively, the frame rate converter 355 may output the input as it is, without separate frame rate conversion.
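Without motion interpolation, frame-rate conversion reduces to reusing (or dropping) input frames. The index arithmetic below is an illustrative simplification of what an FRC block does, not the patent's method.

```python
def convert_frame_rate(frames, in_fps, out_fps):
    """Frame-rate conversion by repetition/dropping: each output frame
    reuses the nearest earlier input frame (no motion interpolation)."""
    n_out = len(frames) * out_fps // in_fps
    return [frames[i * in_fps // out_fps] for i in range(n_out)]

# 30 fps -> 60 fps: each input frame is shown twice.
print(convert_frame_rate(["a", "b"], 30, 60))   # -> ['a', 'a', 'b', 'b']
```

A pass-through FRC is simply the case `in_fps == out_fps`, where the list comes back unchanged.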

The formatter 360 receives the output signal of the frame rate converter 355, changes the format of the signal to suit the display 180, and outputs it. For example, it may output R, G, and B data signals, which may be output as low voltage differential signaling (LVDS) or mini-LVDS signals.
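The format change typically includes converting decoded YCbCr pixels to the R, G, B values that drive the panel. The sketch below uses the standard BT.601 coefficients with full-range 8-bit values; the assumption that the decoded video is BT.601 full-range is illustrative.

```python
def yuv_to_rgb(y, cb, cr):
    """BT.601 full-range YCbCr -> RGB, the kind of conversion performed
    before driving an RGB panel, with clamping to the 8-bit range."""
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(r), clamp(g), clamp(b)

# Neutral chroma (Cb=Cr=128) maps luma straight to gray:
print(yuv_to_rgb(128, 128, 128))   # -> (128, 128, 128)
```

Serializing the resulting R, G, B words over LVDS or mini-LVDS lanes is then a purely electrical formatting step.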

The formatter 360 may change the format of the 3D video signal or convert the 2D video into a 3D video.

The voice processing unit (not shown) in the controller 170 may perform voice processing of the demultiplexed voice signal. To this end, the voice processing unit (not shown) may include various decoders.

Also, the voice processing unit (not shown) in the controller 170 may process bass, treble, volume control, and the like.

The data processor (not shown) in the controller 170 may perform data processing of the demultiplexed data signal. For example, when the demultiplexed data signal is an encoded data signal, it may be decoded. The encoded data signal may be EPG (Electronic Program Guide) information including broadcast information such as the start time and end time of broadcast programs broadcast on each channel.

Meanwhile, a block diagram of the controller 170 shown in FIG. 3 is a block diagram for one embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted according to the specification of the controller 170 that is actually implemented.

In particular, the frame rate converter 355 and the formatter 360 may not be provided in the controller 170 but may instead be provided separately.

FIG. 4 is a diagram illustrating an example of a legacy system platform in the image display device of FIG. 1.

Referring to FIG. 4, the legacy system platform 400 in the image display apparatus 100 according to an exemplary embodiment of the present invention may include, on the OS kernel 410, a driver 420, middleware 430, and an application layer 450.

The OS kernel 410 is the core of the operating system. When the image display apparatus 100 is driven, it may drive the hardware drivers and perform at least one of: security of the hardware and processor in the image display apparatus 100, efficient management of system resources, memory management, provision of an interface to hardware through hardware abstraction, multiprocessing, and schedule management according to multiprocessing. The OS kernel 410 may further provide power management and the like.

The hardware drivers in the OS kernel 410 may include, for example, at least one of a display driver, a WiFi driver, a Bluetooth driver, a USB driver, an audio driver, a power manager, a binder driver, and a memory driver.

In addition, a hardware driver in the OS kernel 410 is a driver for a hardware device and may be provided as a character device driver, a block device driver, or a network device driver. Because a block device driver transmits data in units of a specific block size, it may need a buffer that can hold one unit; a character device driver transmits data in the basic data unit, that is, character by character, and so may not need a buffer.
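The buffering contrast between block and character device drivers can be sketched as follows. This is a purely illustrative Python model with hypothetical function names, not an actual kernel driver interface:

```python
def write_block_device(data, block_size=512):
    """Group the payload into fixed-size blocks, padding the tail:
    a block device driver transfers whole blocks, so it needs a
    buffer at least block_size long.  (Illustrative model only.)"""
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    if blocks and len(blocks[-1]) < block_size:
        blocks[-1] = blocks[-1].ljust(block_size, b"\x00")
    return blocks

def write_char_device(data):
    """A character device driver hands data through one basic unit
    (one character) at a time, so no unit-size buffer is required."""
    return [bytes([b]) for b in data]

# 600 bytes fill one 512-byte block and a second, padded block.
assert len(write_block_device(b"x" * 600)) == 2
# A character device emits the same payload byte by byte.
assert write_char_device(b"hi") == [b"h", b"i"]
```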

The OS kernel 410 may be implemented as a kernel based on various operating systems (OS), such as a Unix-based (e.g., Linux) or Windows-based OS. In addition, the OS kernel 410 may be an open OS kernel, general-purpose enough to be used in other electronic devices.

The driver 420 is located between the OS kernel 410 and the middleware 430 and, together with the middleware 430, drives devices for the operation of the application layer 450. For example, the driver 420 may include drivers for a micom, a display module, a graphics processing unit (GPU), a frame rate converter (FRC), general purpose input/output pins (GPIO), HDMI, an SDEC (system decoder or demultiplexer), a VDEC (video decoder), an ADEC (audio decoder), a PVR (personal video recorder), or an I2C (Inter-Integrated Circuit) bus in the image display apparatus 100. These drivers operate in conjunction with the hardware drivers in the OS kernel 410.

In addition, the driver 420 may further include a driver of the remote control apparatus 200, particularly, a 3D pointing device to be described later. The driver of the 3D pointing device may be provided in various ways in the OS kernel 410 or the middleware 430 in addition to the driver 420.

The middleware 430 may be located between the OS kernel 410 and the application layer 450, and may serve as an intermediary to exchange data between different hardware or software. As a result, a standardized interface can be provided, and various environments can be supported and systems can interoperate with other tasks.

Examples of the middleware 430 in the legacy system platform 400 may include MHEG (Multimedia and Hypermedia information coding Expert Group) and ACAP (Advanced Common Application Platform) middleware, which relate to data broadcasting; PSIP or SI middleware, which relates to broadcast information; and DLNA middleware, which relates to peripheral-device communication.

The application layer 450 on the middleware 430, that is, the application layer in the legacy system platform 400, may include, for example, user interface applications for various menus in the image display apparatus 100. The application layer 450 on the middleware 430 may be editable by a user's selection and may be updated through a network. By using the application layer 450, a user can enter a desired menu among various user interfaces according to input from the remote control apparatus 200 while watching a broadcast image.

In addition, the application layer 450 in the legacy system platform 400 may further include at least one of a TV guide application, a Bluetooth application, a reservation application, a digital video recorder (DVR) application, and a hotkey application.

The aforementioned platform of FIG. 4 may be loaded into the storage unit 140, the controller 170, or a separate processor (not shown).

FIG. 5 is a diagram illustrating an example of a structure diagram of a smart system platform in the server of FIG. 1.

Referring to FIG. 5, the smart system platform 405 in the server 500 according to an embodiment of the present invention may include, on the OS kernel 415, a library 435, a framework 440, and an application layer 455.

The OS kernel 415 is the core of the operating system. When the server 500 is driven, it may drive the hardware drivers and perform at least one of: security of the hardware and processor in the server 500, efficient management of system resources, memory management, provision of an interface to hardware through hardware abstraction, multiprocessing, and schedule management according to multiprocessing. The OS kernel 415 may further provide power management and the like.

A hardware driver in the OS kernel 415 is a driver for a hardware device and may be provided as a character device driver, a block device driver, or a network device driver. Because a block device driver transmits data in units of a specific block size, it may need a buffer that can hold one unit; a character device driver transmits data in the basic data unit, that is, character by character, and so may not need a buffer.

The OS kernel 415 may be implemented as a kernel based on various operating systems (OS), such as a Unix-based (e.g., Linux) or Windows-based OS. In addition, the OS kernel 415 may be an open OS kernel, general-purpose enough to be used in other electronic devices.

The library 435 may be located between the OS kernel 415 and the framework 440, and forms the basis of the framework 440. For example, the library 435 may include SSL (Secure Socket Layer), a security-related library; WebKit, a web-engine-related library; libc (the C library); and a media framework, a media-related library supporting video formats, audio formats, and the like. The library 435 may be written in C or C++ and may be exposed to developers through the framework 440.

The library 435 may include a runtime 437 having a core java library and a virtual machine (VM). This runtime 437, together with the library 435, forms the basis of the framework 440.

The virtual machine (VM) may run a plurality of instances, that is, it may be a virtual machine capable of multitasking. Each virtual machine (VM) may be allocated and executed for each application in the application layer 455. In this case, a binder driver (not shown) in the OS kernel 415 may operate for scheduling or interconnection among the plurality of instances.

The binder driver and the runtime 437 may connect a Java-based application to a C-based library.

The library 435 and the runtime 437 may correspond to the middleware 430 of the legacy system platform.

Meanwhile, the framework 440 in the smart system platform 405 includes the programs on which the applications in the application layer 455 are based. The framework 440 is compatible with any application and allows components to be reused, moved, or exchanged. The framework 440 may include support programs, programs that tie together other software components, and the like. For example, it may include a resource manager, an activity manager related to application activities, a notification manager, and a content provider that mediates information sharing between applications. The framework 440 may be written in Java.

The application layer 455 on the framework 440 includes various programs that can run and be displayed in the image display apparatus 100. For example, it may include core applications such as email, short message service (SMS), a calendar, a map, and a browser. The application layer 455 may be written in Java.

In addition, the application layer 455 may be divided into applications 465 that are stored in the image display apparatus 100 and cannot be deleted by the user, and applications 475 that are downloaded and stored through an external device or a network and can be freely installed or deleted by the user.

Through the applications in the application layer 455, services such as Internet telephony, video on demand (VOD), web albums, social networking services (SNS), location-based services (LBS), maps, web search, and application search may be performed over an Internet connection. In addition, various functions such as games and schedule management may be performed.

The aforementioned platform of FIG. 5 may be used for general purposes in the server 500 as well as various other electronic devices.

Meanwhile, the platform of FIG. 5 may be loaded into a storage unit (not shown) or a processor (see 740 of FIG. 7) in the server 500.

FIG. 6 is a flowchart illustrating a method of operating an image display apparatus according to an exemplary embodiment of the present invention, and FIGS. 7 to 13 are views for explaining various examples of the operating method of FIG. 6.

Referring to the drawings, first, an external input image is received from an external device (S610). The external device interface unit 135 of FIG. 7 receives the external input image from the external device 300. The external device 300 may be, for example, an optical disc player such as a Blu-ray player, a set-top box, a camera, or a game device. The following description focuses on the optical disc player.

When the content is played in the optical disc player 300, the played content may be input to the HDMI terminal of the external device interface unit 135.

FIG. 8 illustrates that a content image 810 reproduced in the optical disc player 300 is transmitted to the image display apparatus 100.

Meanwhile, the playback content image 810 input to the external device interface unit 135 is transmitted to the image synthesizing unit 710. The image synthesizing unit 710 receives the external input image 810 from the external device interface unit 135.

Next, an OSD or image is received from the server (S620). The network interface unit 130 of FIG. 7 receives the OSD or image from the server 500 through a network.

For example, when there is a home screen display input to the image display apparatus 100, the image display apparatus 100 may transmit a request for a home-screen-related OSD or graphic image to the server 500. Accordingly, the server 500 generates the home-screen-related OSD or graphic image in the graphic processing unit 750 and transmits it to the image display apparatus 100 through the network interface unit 730.

FIG. 8 illustrates that the home-screen-related OSD 820 generated by the server 500 is transmitted to the image display apparatus 100.

The network interface unit 130 of the image display apparatus 100 receives the generated home-screen-related OSD or graphic image 820 and transmits it to the image synthesizing unit 710; the image synthesizing unit 710 thus receives the OSD or graphic image 820 associated with the home screen.

Next, it is determined whether a broadcast image is received by the image display apparatus 100 (S625). If a broadcast image is not received, step S630 may be performed. When the broadcast video is received, operation S635 may be performed.

If no broadcast image is received, the image synthesizer 710 synthesizes the received OSD or image with the received external input image (S630), and the display 180 displays the synthesized image (S640).

The image synthesizer 710 synthesizes the received external input image 810 and the OSD 820 related to the received home screen. To this end, scaling of the received external input image 810 may be performed. In FIG. 8, it can be seen that the size of the external input image 815 in the home screen displayed in the display 180 is smaller than that of the external input image 810 input from the optical disc player 300.

The image synthesizing unit 710 may position the scaled external input image 815 according to the home screen settings. In FIG. 8, the scaled external input image 815 is placed in the upper left portion of the display 180.
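The scale-then-place step described above can be sketched as follows. The function and its parameters are hypothetical names introduced only for illustration; the specification does not define this interface:

```python
def place_live_image(src_w, src_h, area_w, area_h, area_x=0, area_y=0):
    """Scale the input image to fit the home screen's live-image area
    while preserving aspect ratio, and return the rectangle
    (x, y, w, h) where the scaled image is placed -- upper left of the
    area by default, as in FIG. 8.  (Illustrative sketch only.)"""
    scale = min(area_w / src_w, area_h / src_h)
    w, h = int(src_w * scale), int(src_h * scale)
    return (area_x, area_y, w, h)

# A 1920x1080 external input scaled into a 640x360 live-image area
# ends up smaller than the original, as FIG. 8 shows.
assert place_live_image(1920, 1080, 640, 360) == (0, 0, 640, 360)
```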

In this way, the video display device 100 synthesizes and displays the received external device input video and the received OSD or image, thereby efficiently managing resources between the video display device and the server.

If a broadcast image is received, the image synthesizer 710 synthesizes the received OSD or image with the received broadcast image (S635), and the display 180 displays the synthesized image (S640).

The broadcast signal may be received through the broadcast receiver 105 in the image display apparatus 100, for example, through the tuner 110 and the demodulator 120. The received broadcast signal is processed through demultiplexing and the like, and the processed broadcast image may be input to the image synthesizer 710.

FIG. 11 illustrates a broadcast image 1110 received by the video display device 100, in particular, by the broadcast receiver 105.

The image synthesizer 710 synthesizes the received broadcast image 1110 and the OSD 820 related to the received home screen. To this end, scaling of the received broadcast image 1110 may be performed. In FIG. 11, it can be seen that the size of the broadcast video 1115 in the home screen displayed in the display 180 is smaller than the broadcast video 1110 received by the broadcast receiver 105.

The image synthesizing unit 710 may arrange the positions of the scaled broadcast image 1115 according to the home screen setting. In FIG. 11, the scaled broadcast image 1115 is disposed on the upper left portion of the display 180.

In this way, the image display apparatus 100 synthesizes and displays the received broadcast image and the received OSD or image, thereby efficiently managing resources between the image display apparatus 100 and the server 500.

As shown in FIG. 8, when the image synthesizer 710 in the image display apparatus 100 synthesizes the home-screen-related OSD 820 generated by the server 500 with the external input image 810 received from the optical disc player 300, the home screen 900 shown in FIG. 9 may be displayed on the display 180 of the image display apparatus 100.

FIG. 9 illustrates that a home screen is displayed on the display 180 of the image display apparatus.

The home screen 900 of FIG. 9 may be an initial screen displayed when the power is turned on or when the apparatus leaves standby mode, or may be set as a basic screen invoked by operating a home key (for example, a 'menu' button) provided on a local key (not shown) or on the remote controller 200.

In order to implement such a home screen, the smart system platform may be mounted in the processor 740 or a storage unit (not shown) in the server 500.

For example, the smart system platform may include an OS kernel, and a library, framework, and application layer on the OS kernel. Under this smart system platform, applications can be freely downloaded, installed, executed, deleted, and so on.

The home screen 900 of FIG. 9 broadly includes a live image area 910 displaying a live image 815; a card object area 920 including card objects 921 and 922 that separately list items from various sources (e.g., content providers (CPs) or applications); and an application menu area 930 including shortcut menus for application items.

In the drawing, the application menu area 930 is displayed at the bottom of the screen, and a login item and an exit item are further displayed.

In this case, the items or objects in the live image area 910 and the application menu area 930 may be fixedly displayed.

Meanwhile, the card object area 920 may be displayed by moving or replacing the card objects 921 and 922 therein. Alternatively, each item (eg, an item “yakoo”) in the card objects 921 and 922 may be displayed by being moved or replaced.

Meanwhile, in the drawing, the scaled external image 815 in which the external input image input from the external device is scaled is displayed in the live image area 910.

Meanwhile, as shown in FIG. 9, when the location of the pointer 905 is in the live image 915 while the home screen is displayed on the display 180, an arrow-shaped pointer 905 may be displayed.

For example, when a pointing signal (motion information or the like) of the remote control apparatus 200 is input through the user input interface unit 150 of the image display apparatus 100 of FIG. 7, the user input interface unit 150 or the processor 330 may extract pointer coordinate information of the remote controller based on the pointing signal. The extracted pointer coordinate information may be transmitted to the server 500 through the network interface unit 130.

The server 500 may generate a pointer image together when generating an OSD or an image related to a home screen in the graphic processor 750 based on the received pointer coordinate information. Accordingly, as illustrated in FIG. 9, the pointer image 905 may be generated in the external image 815.

As another example, when a pointing signal (movement information or the like) of the remote control apparatus 200 is input through the user input interface unit 150 of the image display apparatus 100 of FIG. 7, the processor 330 may transmit the pointing signal to the server 500 as it is.

The processor 740 of the server 500 may calculate coordinate information based on the received pointing signal. The graphic processor 750 may also generate a pointer image when generating an OSD or an image related to a home screen. Accordingly, as illustrated in FIG. 9, the pointer image 905 may be generated in the external image 815.

On the other hand, the image synthesizing unit 710 in the image display apparatus 100 gives the pointer image 905 the highest priority, so that when images are synthesized, the pointer image is superimposed over any other image or OSD image.
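The coordinate computation the server's processor performs on a received pointing signal might look like the following sketch — applying a motion delta and clamping to the screen before the graphics unit draws the pointer image. The function, its parameters, and the default resolution are illustrative assumptions, not part of the specification:

```python
def update_pointer(x, y, dx, dy, width=1920, height=1080):
    """Apply a motion delta from the remote control's pointing signal
    and clamp the pointer position to the screen, as the server might
    when computing coordinate information for the pointer image.
    (Hypothetical names; illustrative sketch only.)"""
    nx = max(0, min(width - 1, x + dx))
    ny = max(0, min(height - 1, y + dy))
    return nx, ny

assert update_pointer(100, 100, 50, -30) == (150, 70)
# Movement past the screen edge is clamped, so the pointer image
# always lands at a valid position inside the generated OSD.
assert update_pointer(10, 10, -100, -100) == (0, 0)
```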

On the other hand, when the card object name 923 in the application card object 922 is selected using the pointer 905, which moves in correspondence with the movement of the remote controller 200, an app store screen (not shown) may be displayed.

Meanwhile, an app store screen (not shown) may also be generated in the server 500.

FIG. 10 illustrates that, when the camera 301 is used as the external device, the user image 1010 captured by the camera 301 is transmitted to the image display device, and another user's image 1020 is transmitted from the server 500 to the image display device.

The other user image 1020 may be an image of a user of another image display apparatus remotely connected to the image display apparatus 100 through a network.

When the video call function of the image display apparatus 100 is activated, the image display apparatus 100 may, as shown in the figure, receive the user image 1010 captured by the camera 301 and another user's image 1020 from the server 500.

The image synthesizer 710 synthesizes the received external input image 1010 with the other user's image 1020 received from the server 500. To this end, scaling of the other user's image 1020 received from the server 500 may be performed. In FIG. 10, it can be seen that the size of the other user's image 1036 in the video call screen 1030 displayed on the display 180 is smaller than that of the other user's image 1020 input from the server 500.

Meanwhile, the image synthesizing unit 710 may position the scaled image 1036 of the other user according to activation of the video call function. FIG. 10 illustrates placing the scaled image 1036 of the other user in the lower left portion of the display 180.

As such, the image display apparatus 100 can efficiently manage resources between the image display apparatus 100 and the server 500 by synthesizing and displaying the external input image received from the external device and the image received from the server.

Meanwhile, as shown in FIG. 11, when the image synthesizer 710 in the image display apparatus 100 synthesizes the home-screen-related OSD 820 generated by the server 500 with the broadcast image 1110 received by the image display apparatus 100, the home screen 1200 shown in FIG. 12 may be displayed on the display 180 of the image display apparatus 100.

FIG. 12 illustrates that a home screen is displayed on the display 180 of the image display apparatus.

The home screen 1200 of FIG. 12 is similar to the home screen of FIG. 9, except that the broadcast image 1115 is displayed in the live image area instead of the external input image 815. Accordingly, broadcast image information is displayed within the broadcast image 1115.

Meanwhile, as shown in FIG. 12, when the location of the pointer 1205 is in the broadcast image 1115 while the home screen is displayed on the display 180, an arrow-shaped pointer 1205 may be displayed.

For example, when a pointing signal (motion information or the like) of the remote control apparatus 200 is input through the user input interface unit 150 of the image display apparatus 100 of FIG. 7, the user input interface unit 150 or the processor 330 may extract pointer coordinate information of the remote controller based on the pointing signal. The extracted pointer coordinate information may be transmitted to the server 500 through the network interface unit 130.

Based on the received pointer coordinate information, the server 500 may generate a pointer image together when the graphic processor 750 generates the home-screen-related OSD or image. Accordingly, as illustrated in FIG. 12, the pointer image 1205 may be generated within the broadcast image 1115.

As another example, when a pointing signal (movement information or the like) of the remote control apparatus 200 is input through the user input interface unit 150 of the image display apparatus 100 of FIG. 7, the processor 330 may transmit the pointing signal to the server 500 as it is.

The processor 740 of the server 500 may calculate coordinate information based on the received pointing signal. The graphic processor 750 may also generate a pointer image when generating the home-screen-related OSD or image. Accordingly, as illustrated in FIG. 12, the pointer image 1205 may be generated within the broadcast image 1115.

On the other hand, the image synthesizing unit 710 in the image display apparatus 100 gives the pointer image 1205 the highest priority, so that when images are synthesized, the pointer image is superimposed over any other image or OSD image.

Meanwhile, FIG. 13 illustrates that one server 500 transmits an OSD or an image to the plurality of image display apparatuses 100a and 100b.

The image display device system 50 according to an exemplary embodiment of the present invention may control the image display devices 100a and 100b by using the server 500.

In detail, when the server 500 receives an OSD or image transmission request from the first image display apparatus 100a, the server 500 generates the corresponding OSD or image 1310 in response to the request. The generated OSD or image 1310 is transmitted to the first image display apparatus 100a through the network interface unit 730.

On the other hand, when the server 500 simultaneously receives an OSD or image transmission request from the second image display apparatus 100b, the graphic processing unit 750 generates the corresponding OSD or image 1320 in response to that request. The generated OSD or image 1320 is transmitted to the second image display apparatus 100b through the network interface unit 730.

As such, when the server 500 receives OSD or image transmission requests from the plurality of image display apparatuses 100a and 100b, the graphic processing unit 750 in the server 500 can generate the corresponding OSDs or images 1310 and 1320 in parallel, enabling fast data processing.
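The parallel generation of per-device OSDs can be sketched as follows, using a thread pool so that simultaneous requests from several display devices do not serialize behind one another. The function names, request shape, and pool size are illustrative assumptions, not part of the specification:

```python
from concurrent.futures import ThreadPoolExecutor

def render_osd(request):
    """Stand-in for the graphics unit generating the OSD or image
    that one particular display device asked for."""
    return f"osd-for-{request['device']}"

def serve_requests(requests):
    """Generate the requested OSDs in parallel, one task per
    requesting display device.  Results come back in request order,
    so each device receives its own OSD."""
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(render_osd, requests))

reqs = [{"device": "100a"}, {"device": "100b"}]
assert serve_requests(reqs) == ["osd-for-100a", "osd-for-100b"]
```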

FIG. 14 is a flowchart illustrating a method of operating a server according to an embodiment of the present invention.

Referring to the drawing, first, an OSD or an image transmission request is received from an image display apparatus (S1410).

For example, when there is a home screen display input, the video display device 100 may transmit a home screen-related OSD or image transmission request to the server 500.

As another example, when the video call function is activated, the image display apparatus 100 may transmit to the server 500 a request to transmit another user's captured image.

The network interface unit 730 in the server 500 may receive this request and forward it to the processor 740.

Next, according to the request, the corresponding OSD or image is generated (S1420).

For example, the graphic processor 750 in the server 500 generates the corresponding home screen OSD or image in response to a home screen OSD or image transmission request.

As another example, the network interface unit 730 in the server 500 may receive another user's captured image from another image display device through a network, in response to a request to transmit the other user's captured image.

Next, the generated OSD or image is transmitted to the image display apparatus (S1430).

The network interface unit 730 in the server 500 may transmit the generated home-screen-related OSD or image to the image display apparatus 100, or may transmit the received captured image of the other user to the image display apparatus.

On the other hand, when the pointing remote control apparatus 200 is used, the image display apparatus 100 may transmit the pointing signal or pointing coordinate information to the server 500, and the graphic processor 750 in the server 500 may generate the pointer image based on the received pointing signal or pointing coordinate information. In particular, the pointer image may be placed at a specific position within the generated OSD or image, based on the coordinate information.

The generated OSD or image including the pointer image is transmitted to the video display device 100 through the network interface unit 730 in the server 500.
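The request-handling flow of FIG. 14 (receive a request S1410, generate the corresponding OSD or fetch the other user's image S1420, return the result for transmission S1430) can be sketched as a simple dispatcher. The request shapes, field names, and the `fetch_remote_user_image` callback are hypothetical, introduced only for illustration:

```python
def handle_request(request, fetch_remote_user_image=None):
    """Dispatch one request from a display device (S1410): a
    home-screen request yields a generated OSD (S1420), while a
    video-call request fetches the other user's captured image over
    the network; the returned value is what would be transmitted back
    to the display device (S1430).  (Illustrative sketch only.)"""
    if request["type"] == "home_screen":
        return {"kind": "osd", "payload": "home-screen-osd"}
    if request["type"] == "video_call":
        image = fetch_remote_user_image(request["peer"])
        return {"kind": "image", "payload": image}
    raise ValueError("unknown request type")

assert handle_request({"type": "home_screen"})["payload"] == "home-screen-osd"
assert handle_request(
    {"type": "video_call", "peer": "tv-b"},
    fetch_remote_user_image=lambda peer: f"frame-from-{peer}",
)["payload"] == "frame-from-tv-b"
```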

FIG. 15 is a flowchart illustrating a method of operating a server according to another exemplary embodiment of the present invention, and FIGS. 16 to 17 are diagrams for explaining various examples of the server operating method of FIG. 15.

Referring to FIG. 15, the server operating method of FIG. 15 differs from that of FIG. 14 in that the server 500 receives the external input image, synthesizes the OSD or image generated by the server 500 with the external input image, and transmits the synthesized image to the image display apparatus 100.

First, an external input image is received from the external device 300 (S1505).

The server 500 may include an external device interface unit (not shown), through which an external input image may be received from the external device 300.

FIG. 16 illustrates that the content image 810 reproduced in the optical disc player 300 is transmitted to the server 500. This is similar to FIG. 8, except that the reproduced content image 810 is transmitted to the server 500 instead of to the image display apparatus 100.

FIG. 17 illustrates that the user image 1010 captured by the camera is transmitted to the server 500. This is similar to FIG. 10, except that the captured user image 1010 is transmitted to the server 500 instead of to the image display apparatus 100.

In operation S1510, an OSD or image transmission request is received from the image display apparatus 100. In operation S1520, the corresponding OSD or image is generated according to the request. In operation S1525, the OSD or the image and the external input image are synthesized.

For example, when there is a home screen display input, the video display device 100 may transmit a home screen-related OSD or image transmission request to the server 500.

According to the home screen display input, the server 500 generates an OSD or an image 820 related to the home screen. The image synthesizer (not shown) in the server 500 synthesizes the generated OSD or image 820 and the external input image 810.

FIG. 16 illustrates that the server 500 generates an image 830 by synthesizing the home-screen-related OSD or image 820 with the external input image 810.

As another example, when the video call function is activated, the image display apparatus 100 may transmit to the server 500 a request to transmit another user's captured image.

In response to the request to transmit the other user's captured image, the network interface unit 730 in the server 500 may receive the other user's captured image 1020 from another image display device through a network. The image synthesizer (not shown) in the server 500 synthesizes the captured user image 1010 received from the external device with the received captured image 1020 of the other user.

FIG. 17 illustrates that the server 500 generates a synthesized image 1030 by synthesizing the captured user image 1010 received from the external device with the received captured image 1020 of the other user.

Next, the synthesized image is transmitted to the image display apparatus (S1530).

The network interface unit 730 in the server 500 may transmit the synthesized home screen image 830 of FIG. 16 to the image display apparatus 100, or may transmit the synthesized video call image 1030 of FIG. 17 to the image display apparatus 100.

On the other hand, when the pointing remote control apparatus 200 is used, the image display apparatus 100 may transmit the pointing signal or pointing coordinate information to the server 500, and the graphic processor 750 in the server 500 may generate the pointer image based on the received pointing signal or pointing coordinate information.

In particular, the image synthesizing unit (not shown) may arrange the pointer image at a specific position in the synthesized image based on the coordinate information.

The synthesized image including the pointer image is transmitted to the image display apparatus 100 through the network interface unit 730 in the server 500.
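The pointer handling above amounts to mapping the remote control's coordinate information to a pixel position and stamping the pointer image there in the synthesized frame. The normalized 0.0-1.0 coordinate convention below is an assumption for illustration; the patent does not define the pointing signal's coordinate format.

```python
def pointer_position(norm_x, norm_y, width, height):
    """Map normalized pointing coordinates (0.0-1.0) to a pixel position,
    clamped to the bounds of a width x height synthesized frame."""
    x = min(max(int(norm_x * (width - 1)), 0), width - 1)
    y = min(max(int(norm_y * (height - 1)), 0), height - 1)
    return x, y

def place_pointer(frame, pointer_sprite, norm_x, norm_y):
    """Stamp a small pointer sprite into the synthesized frame at the
    position derived from the pointing coordinate information."""
    x, y = pointer_position(norm_x, norm_y, len(frame[0]), len(frame))
    out = [row[:] for row in frame]
    for r, sprite_row in enumerate(pointer_sprite):
        for c, px in enumerate(sprite_row):
            if y + r < len(out) and x + c < len(out[0]):
                out[y + r][x + c] = px
    return out
```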

Unlike the cases of FIGS. 16 and 17, the video display device 100 may receive a broadcast video. When the broadcast video, rather than the external input video, is displayed in the live video area of the home screen, the server 500 of FIG. 16 may transmit only the OSD or image related to the home screen, not the synthesized image, to the display device 100. That is, transmission of the synthesized image may be performed only when the external input image is displayed on the home screen.
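The decision described in this paragraph can be summarized as a small dispatch rule on the server side. The function and source labels below are hypothetical names for illustration; only the branching behavior comes from the text.

```python
def choose_server_payload(live_area_source, osd_image, synthesized_image):
    """Return what the server should send: the fully synthesized image only
    when the home screen's live area shows the external input; otherwise
    only the home screen OSD/image, leaving the display device to combine
    it with its locally received broadcast video."""
    if live_area_source == "external_input":
        return synthesized_image
    return osd_image
```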

The image display apparatus, the server, and the operating methods thereof according to the present invention are not limited to the configurations and methods of the embodiments described above; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.

Meanwhile, the operating method of the image display device or the server of the present invention can be implemented as processor-readable code on a processor-readable recording medium provided in the image display device or the server. The processor-readable recording medium includes all kinds of recording devices in which data readable by the processor is stored. Examples of the processor-readable recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and it may also be implemented in the form of a carrier wave, such as transmission over the Internet. The processor-readable recording medium can also be distributed over network-coupled computer systems so that the processor-readable code is stored and executed in a distributed fashion.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments. It will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention.

Claims (26)

Receiving an external input image from an external device;
Receiving an OSD or image from a server;
Synthesizing the received OSD or image with the received external input image; And
Displaying the synthesized image; an operating method of an image display device.
The method of claim 1,
Receiving a broadcast image;
Synthesizing the received OSD or image with the received broadcast image; And
And displaying the synthesized image.
The method of claim 1,
The received OSD or image is,
An operating method of an image display device comprising an OSD or an image related to a home screen.
The method of claim 1,
The received OSD or image is,
An operating method of an image display device comprising a captured image of another user.
The method of claim 1,
Receiving a pointing signal of a remote control device;
Sending the pointing signal to the server;
And receiving a pointer image related to the pointing signal from the server.
The method of claim 1,
And further receiving a pointer image related to a pointing signal of a remote control device from the server.
The method of claim 1,
Receiving a pointing signal of a remote control device; And
Generating a pointer image related to the pointing signal;
The image synthesis step,
And synthesizing the received external input image, the received OSD or image, and the pointer image.
Receiving an OSD or image from a server;
Receiving a broadcast image;
Synthesizing the received OSD or image with the received broadcast image; And
Displaying the synthesized image; an operating method of an image display device.
Receiving an OSD or image transmission request from an image display device;
Generating the corresponding OSD or image according to the request; And
And transmitting the generated OSD or image to the image display device.
Wherein the OSD or image includes a pointer image corresponding to a pointing signal of a remote control device.
10. The method of claim 9,
Receiving the pointing signal of the remote control device from the image display device.
10. The method of claim 9,
Further comprising receiving a captured image of another user through a network,
Wherein the transmitting includes transmitting the captured image of the other user; a method of operating a server.
10. The method of claim 9,
Receiving a second OSD or second image transmission request from a second image display device;
Generating the corresponding OSD or image according to the request; And
And transmitting the generated OSD or image to the second image display device.
Receiving an external input image;
Receiving an OSD or image transmission request from an image display device;
Generating the corresponding OSD or image according to the request;
Synthesizing the generated OSD or image with the received external input image; And
And transmitting the synthesized image to the image display device.
The method of claim 13,
The OSD or image includes a pointer image corresponding to a pointing signal of a remote control device.
The method of claim 13,
Further comprising receiving a pointing signal of a remote control device from the image display device.
The method of claim 13,
The external input image,
Includes at least one of a broadcast image, a captured image of another user, and an external device input image.
An external device interface unit for receiving an external input image from an external device;
A network interface unit for receiving an OSD or an image from a server;
An image synthesizer configured to synthesize the OSD or image and the external input image; And
And a display for displaying the image synthesized by the image synthesizing unit.
18. The method of claim 17,
It further comprises a broadcast receiving unit for receiving a broadcast video,
The video synthesizing unit synthesizes the OSD or image and the broadcast video,
And the display is configured to display an image synthesized by the image synthesizing unit.
18. The method of claim 17,
Further comprising a user input interface unit for receiving a pointing signal of a remote control device,
The network interface unit,
Transmits the pointing signal to the server, and further receives a pointer image related to the pointing signal of the remote control device from the server.
18. The method of claim 17,
The network interface unit,
Receives a pointer image related to a pointing signal of a remote control device from the server.
A network interface unit for receiving an OSD or an image from a server;
A broadcast receiver for receiving a broadcast image;
An image synthesizer for synthesizing the OSD or image with the broadcast image; And
And a display for displaying the image synthesized by the image synthesizing unit.
A network interface unit receiving an OSD or image transmission request from an image display device; And
And a graphic processor for generating a corresponding OSD or image according to the request,
Wherein the network interface unit transmits the generated OSD or image to the video display device.
Wherein the OSD or image includes a pointer image corresponding to a pointing signal of a remote control device.
The method of claim 22,
The network interface unit,
Receives the pointing signal of the remote control device from the image display device.
The method of claim 22,
The network interface unit receives a second OSD or second image transmission request from a second image display device,
The graphic processor, according to the request, generates the corresponding OSD or image,
The network interface unit transmits the generated OSD or image to the second image display device.
The method of claim 22,
And a processor equipped with a smart system platform.
An external device interface unit for receiving an external input image;
A network interface unit receiving an OSD or image transmission request from an image display device;
A graphic processor for generating a corresponding OSD or an image according to the request; And
And an image synthesizer configured to synthesize the generated OSD or image and the received external input image.
Wherein the network interface unit transmits the synthesized image to the image display device.
KR1020110107139A 2011-10-19 2011-10-19 Image display apparatus, server and method for operating the same KR20130042958A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020110107139A KR20130042958A (en) 2011-10-19 2011-10-19 Image display apparatus, server and method for operating the same

Publications (1)

Publication Number Publication Date
KR20130042958A true KR20130042958A (en) 2013-04-29

Family

ID=48441416

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020110107139A KR20130042958A (en) 2011-10-19 2011-10-19 Image display apparatus, server and method for operating the same

Country Status (1)

Country Link
KR (1) KR20130042958A (en)

Similar Documents

Publication Publication Date Title
KR101742986B1 (en) Image display apparatus and method for operating the same
KR101507866B1 (en) Method for producing advertisement content using a display device and display device for same
US10200738B2 (en) Remote controller and image display apparatus having the same
KR101708691B1 (en) Image Display Device and Method for Operating the Same
US9363570B2 (en) Broadcast receiving apparatus for receiving a shared home screen
KR102058041B1 (en) Image display apparatus, and method for operating the same
KR101000063B1 (en) Image display apparatus and method for operating the same
KR101003506B1 (en) Operating an image display device capable of application installation
KR102104438B1 (en) Image display apparatus, and method for operating the same
KR20110128072A (en) Apparatus for executing application and method for controlling operation of the same
KR101916437B1 (en) Mobile apparatus, image display apparatus, server and method for operating the same
KR101708646B1 (en) Image Display Device and Method for Operating the Same
KR101000062B1 (en) Image display apparatus and method for operating the same
KR101847616B1 (en) Image display apparatus, server and method for operating the same
KR102056165B1 (en) Apparatus for receiving broadcasting and method for operating the same
KR102046642B1 (en) Image display apparatus, and method for operating the same
KR102110532B1 (en) Image display apparatus, and method for operating the same
KR20130079926A (en) Image display apparatus, server and method for operating the same
US9542008B2 (en) Image display apparatus and method for operating the same
KR20130033813A (en) Image display apparatus, and method for operating the same
KR102039486B1 (en) Image display apparatus, and method for operating the same
KR20130042958A (en) Image display apparatus, server and method for operating the same
KR101778279B1 (en) Image display apparatus and method for operating the same
KR101711840B1 (en) Image display apparatus and method for operating the same
KR20110134090A (en) Image display apparatus and method for operating the same

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination