KR20130066339A - Remote control device, and image display apparatus including the same - Google Patents

Remote control device, and image display apparatus including the same

Info

Publication number
KR20130066339A
Authority
KR
South Korea
Prior art keywords
remote control
signal
coordinate information
display panel
light
Prior art date
Application number
KR1020110133125A
Other languages
Korean (ko)
Inventor
권오규
천영호
Original Assignee
엘지전자 주식회사
Priority date
Filing date
Publication date
Application filed by 엘지전자 주식회사
Priority to KR1020110133125A
Publication of KR20130066339A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G06F3/0317 Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
    • G06F3/0321 Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface by optically sensing the absolute position with respect to a regularly patterned surface forming a passive digitiser, e.g. pen optically detecting position indicative tags printed on a paper sheet
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03543 Mice or pucks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F3/0383 Signal control means within the pointing device
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/038 Indexing scheme relating to G06F3/038
    • G06F2203/0383 Remote input, i.e. interface arrangements in which the signals generated by a pointing device are transmitted to a PC at a remote location, e.g. to a PC in a LAN
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/22 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources
    • G09G3/28 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources using luminous gas-discharge panels, e.g. plasma panels

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
  • Position Input By Displaying (AREA)

Abstract

PURPOSE: A remote control device and an image display device including the same are provided to erase, from an image displayed on a panel, an area corresponding to the area of the remote control device, based on timing signals sensed by optical sensors. CONSTITUTION: A first optical sensor (810) senses light and outputs a first timing signal, and a second optical sensor (812), spaced apart from the first optical sensor, outputs a second timing signal. A control unit (880) calculates first coordinate information based on the first timing signal and calculates second coordinate information based on the second timing signal. A wireless communication unit (300) transmits the first and second coordinate information, or area information based on the coordinate information, to the outside. The control unit calculates area information of the area in which the remote control device faces the plasma display panel, based on the first and second coordinate information. [Reference numerals] (400) Pointing signal processing device; (815) Amplifying unit; (820) Comparing unit; (821, 765) RF module; (830, 760) Antenna; (880) Control unit

Description

Remote control device, and image display apparatus including the same

The present invention relates to a remote control device and an image display apparatus including the same, and more particularly, to a touch pen type remote control device that can be used in an erase mode by sensing light emitted from a discharge cell of a plasma display panel, and an image display apparatus including the same.

An image display apparatus is a device having a function of displaying an image that a user can view. The user can view broadcasts through the image display apparatus. The image display apparatus displays, on a display, the broadcast selected by the user from among the broadcast signals transmitted from broadcasting stations. Currently, broadcasting is changing from analog broadcasting to digital broadcasting around the world.

Digital broadcasting refers to broadcasting in which digital video and audio signals are transmitted. Digital broadcasting is more resistant to external noise than analog broadcasting, so it has less data loss, is advantageous for error correction, has a higher resolution, and provides a clearer picture. Also, unlike analog broadcasting, digital broadcasting is capable of bidirectional service.

Meanwhile, research is ongoing on remote control devices for remotely controlling such image display apparatuses.

An object of the present invention is to provide a remote control device that senses light emitted from the discharge cells of a plasma display panel and can thereby easily erase an image displayed on the panel, and an image display apparatus including the same.

A remote control device according to an embodiment of the present invention for achieving the above object is a remote control device that senses light emitted from discharge cells of a plasma display panel, and includes: a first optical sensor that senses the light and outputs a first timing signal based on the light sensing; a second optical sensor disposed to be spaced apart from the first optical sensor, which senses the light and outputs a second timing signal based on the light sensing; a controller that calculates first coordinate information based on the first timing signal and calculates second coordinate information based on the second timing signal; and a wireless communication unit that transmits the calculated first and second coordinate information, or area information based on the first and second coordinate information, to the outside.

In addition, a remote control device according to an embodiment of the present invention for achieving the above object is a remote control device that senses light emitted from discharge cells of a plasma display panel, and includes: a first optical sensor that senses the light and outputs a timing signal based on the light sensing; a pressure sensor disposed to be spaced apart from the first optical sensor, which senses pressure and outputs a pressure sensing signal based on the pressure sensing; a controller that calculates coordinate information based on the timing signal; and a wireless communication unit that transmits the coordinate information, or area information based on the coordinate information, to the outside.

In addition, an image display apparatus according to an embodiment of the present invention for achieving the above object includes: a plasma display panel that, in the touch pen mode, sequentially emits vertical address light during a vertical scan subfield period and sequentially emits horizontal address light during a horizontal scan subfield period; an interface unit that receives an image signal corresponding to the position of a first remote control device; and a controller that, in the erase mode of the touch pen mode, controls the displayed image to be erased in correspondence with the position of the first remote control device.

According to the exemplary embodiment of the present invention, in the erase mode of the touch pen mode, the image displayed on the panel can be easily deleted using the remote controller.

In particular, when the remote control device includes two optical sensors, an area corresponding to the area of the remote control device can be deleted from the image displayed on the panel based on the timing signal detected by each sensor.

Meanwhile, when the remote control device supports both the erase mode and the write mode of the touch pen mode, a coordinate signal based on the timing signals sensed by the two optical sensors is transmitted to the outside in the erase mode, and a coordinate signal based on the timing signal sensed by either one of the two optical sensors is transmitted to the outside in the write mode, so that both modes can be performed.

Meanwhile, when the remote control device includes one optical sensor and one pressure sensor, an area corresponding to the area of the remote control device can be erased from the image displayed on the panel, based on the timing signal from the optical sensor and the pressure sensing signal from the pressure sensor.
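
For illustration only, the sketch below shows one plausible way of turning two sensed coordinates into area information, namely as the bounding rectangle spanned by them; the patent only states that area information is calculated from the first and second coordinate information, so the bounding-box rule, the type names, and the example values are assumptions.

/* Illustrative sketch: derive a rectangular erase region from the two
 * sensed coordinates. The bounding-box rule below is an assumption. */
#include <stdio.h>

typedef struct { int x; int y; } cell_coord;       /* discharge-cell coordinate */
typedef struct { int x0, y0, x1, y1; } erase_area; /* inclusive cell range      */

static erase_area area_from_sensors(cell_coord first, cell_coord second)
{
    erase_area a;
    a.x0 = first.x < second.x ? first.x : second.x;
    a.x1 = first.x < second.x ? second.x : first.x;
    a.y0 = first.y < second.y ? first.y : second.y;
    a.y1 = first.y < second.y ? second.y : first.y;
    return a;
}

int main(void)
{
    cell_coord s1 = { 120, 300 };   /* coordinate from first optical sensor  */
    cell_coord s2 = { 140, 320 };   /* coordinate from second optical sensor */
    erase_area a = area_from_sensors(s1, s2);
    printf("erase cells x:%d..%d y:%d..%d\n", a.x0, a.x1, a.y0, a.y1);
    return 0;
}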

In addition, various user interfaces are possible in the touch pen mode, thereby improving user convenience.

1 is a block diagram of an image display apparatus according to an embodiment of the present invention.
2 to 3 illustrate various examples of an internal block diagram of the image display apparatus of FIG. 1.
4 is a diagram illustrating an example of an interior of the display of FIG. 2.
5 is an internal block diagram of the controller of FIG. 2.
6 is a view for explaining an example of the operation of the second remote control device for controlling the image display device of FIG. 2.
FIG. 7 is an internal perspective view of the second remote control device of FIG. 2.
FIG. 8 shows an example of an internal block diagram of the second remote control device of FIG. 2 and a simplified internal block diagram of the pointing signal receiving device.
9 is a view referred to for explaining light sensing in the second remote control apparatus.
10 to 12 illustrate an operation of the plasma display panel in the touch pen mode according to an embodiment of the present invention.
FIG. 13 is a view for explaining an example of the operation of the first remote control apparatus for controlling the image display apparatus of FIG. 2.
14A to 14B illustrate various examples of the first remote control apparatus of FIG. 13.
FIG. 15 is an internal block diagram of the first remote control device of FIG. 14A.
FIG. 16 is an internal block diagram of the first remote control device of FIG. 14B.
FIG. 17 is a view referred to for describing light sensing in the first remote control device of FIG. 15.
18A to 18B are views referred to for describing the operation of the first remote control device of FIG. 15.
19 is a view referred to for describing the operation of the first remote control apparatus according to another embodiment of the present invention.
20 is a view referred to for describing the operation of the first remote control apparatus according to another embodiment of the present invention.

Hereinafter, the present invention will be described in more detail with reference to the drawings.

The suffix "module" and " part "for components used in the following description are given merely for convenience of description, and do not give special significance or role in themselves. Accordingly, the terms "module" and "part" may be used interchangeably.

1 is a block diagram of an image display apparatus according to an embodiment of the present invention.

Referring to FIG. 1, an image display apparatus 100 according to an exemplary embodiment of the present invention may form an image display system together with a touch pen type second remote control apparatus 200, a pointing signal receiving apparatus 300, a pointing signal processing apparatus 400, and a touch pen type first remote control apparatus 500.

The image display apparatus 100 may include a plasma display panel to enable the touch pen scheme. The plasma display panel includes phosphor layers formed in discharge cells partitioned by barrier ribs, and includes a plurality of electrodes.

When a drive signal is supplied to each electrode of the plasma display panel, a discharge is generated in the discharge cell by the supplied drive signal. When the discharge occurs in the discharge cell, the discharge gas filling the discharge cell generates vacuum ultraviolet rays, and the vacuum ultraviolet rays excite the phosphor formed in the discharge cell to generate visible light. This visible light displays an image on the screen of the plasma display panel.

Meanwhile, an inert mixed gas such as He + Xe, Ne + Xe, He + Ne + Xe, or the like may be injected into the discharge space in the discharge cell of the plasma display panel.

During the gas discharge described above, in addition to visible light, the plasma display panel also emits infrared rays due to the xenon (Xe).

According to an exemplary embodiment of the present invention, the touch pen type second remote control apparatus 200 is a remote controller operable in a write mode and senses light emitted from a discharge cell of the plasma display panel; specifically, it senses infrared (IR) light. For example, when the second remote control apparatus 200 approaches or contacts a specific discharge cell of the plasma display panel, the second remote control apparatus 200 outputs a timing signal based on the sensed light. Based on the timing signal, the x, y coordinate signal of the corresponding discharge cell can be calculated. The calculated x, y coordinate signal of the discharge cell is converted into an RF signal and transmitted to the pointing signal receiving apparatus 300.

According to an exemplary embodiment of the present invention, the first touch pen type remote controller 500 is a remote controller operable in an erasing mode and detects light emitted from a discharge cell of a plasma display panel. Specifically, infrared (IR) is detected. For example, when the first remote controller 500 approaches or contacts a specific discharge cell of the plasma display panel, the first remote controller 500 outputs a timing signal based on the detected light. Based on the timing signal, the x, y coordinate signal of the corresponding discharge cell can be calculated. The calculated x, y coordinate signals of the discharge cells are converted into RF signals and transmitted to the pointing signal receiver 300.

Meanwhile, the x, y coordinate signal output from the first remote control apparatus 500 and the x, y coordinate signal output from the second remote control apparatus 200 may be transmitted separately from each other. For example, the x, y coordinate signal output from the first remote control apparatus 500 and the x, y coordinate signal output from the second remote control apparatus 200 may be transmitted through different frequency channels, may have different signal levels, or may be transmitted at different timings.

The pointing signal receiving apparatus 300 receives the RF x, y coordinate signal from the first remote control apparatus 500 or the second remote control apparatus 200 and transmits it to the pointing signal processing apparatus 400. To this end, the pointing signal receiving apparatus 300 may include an antenna for receiving the RF signal and an RF module for processing it. The received x, y coordinate signal may be transmitted to the pointing signal processing apparatus 400 by wire or wirelessly. For example, the pointing signal receiving apparatus 300 may be a USB or Bluetooth dongle.

The pointing signal processing apparatus 400 receives an x, y coordinate signal from the pointing signal receiving apparatus 300, processes the signal, and transmits a predetermined image signal to the image display apparatus 100. As a result, the image display apparatus 100, specifically, the plasma display panel, displays a predetermined image (a pointing image, etc.) in a specific discharge cell, that is, in a discharge cell corresponding to the corresponding coordinate (x, y coordinate).

In the write mode, that is, based on the x, y coordinate signal from the second remote control apparatus 200, the pointing signal processing apparatus 400 may generate a write image corresponding to the coordinates. The write image may be, for example, a black image.

Meanwhile, in the erase mode, the pointing signal processing apparatus 400 may generate an erase image corresponding to the coordinates based on the x, y coordinate signals from the first remote control apparatus 500. At this time, the erase image may be a white image.

Meanwhile, the pointing signal processing apparatus 400 may include a program for executing the touch pen mode, and may execute this program to perform signal processing and transmission on the received x, y coordinates. For example, the pointing signal processing apparatus 400 may be a computer or the like.
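
As a rough illustration of this processing, the sketch below maps a received coordinate to a black (write) or white (erase) update of a small grayscale frame buffer; the buffer size, the brush size, and the function names are assumptions and not part of the patent.

/* Minimal frame-buffer sketch, assuming an 8-bit grayscale buffer and a
 * square brush; the patent only states that a black image is generated
 * in the write mode and a white image in the erase mode. */
#include <stdio.h>
#include <string.h>

#define PANEL_W 64
#define PANEL_H 32
#define BRUSH   1            /* half-width of the drawn/erased spot */

enum pen_mode { MODE_WRITE, MODE_ERASE };

static unsigned char frame[PANEL_H][PANEL_W];

static void apply_coordinate(int x, int y, enum pen_mode mode)
{
    unsigned char value = (mode == MODE_WRITE) ? 0x00 : 0xFF; /* black or white */
    for (int dy = -BRUSH; dy <= BRUSH; dy++)
        for (int dx = -BRUSH; dx <= BRUSH; dx++) {
            int px = x + dx, py = y + dy;
            if (px >= 0 && px < PANEL_W && py >= 0 && py < PANEL_H)
                frame[py][px] = value;
        }
}

int main(void)
{
    memset(frame, 0xFF, sizeof(frame));       /* start with a white board     */
    apply_coordinate(10, 5, MODE_WRITE);      /* second remote: write (black) */
    apply_coordinate(10, 5, MODE_ERASE);      /* first remote: erase (white)  */
    printf("cell (10,5) = %u\n", frame[5][10]);
    return 0;
}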

As described above, by using the pen-shaped second remote control apparatus 200, a predetermined image (a pointing image or the like) can be displayed at specific coordinates on the display panel, in a contact or non-contact manner. That is, when the second remote control apparatus 200 is moved, writing may be performed along the movement path, as if writing on the plasma display panel of the image display apparatus 100 with a touch pen.

On the other hand, by using the eraser-shaped first remote control apparatus 500, the image corresponding to specific coordinates on the display panel can be erased, in a contact or non-contact manner. That is, when the first remote control apparatus 500 is moved, the image in the corresponding area may be erased along the movement path, as if erasing an image displayed on the plasma display panel of the image display apparatus 100 with an eraser.

In the embodiments of the present invention, such a remote control apparatus 200 or 500 is referred to as a touch pen type remote control apparatus, and the touch pen mode according to the embodiments of the present invention is distinct from a touch mode based on a pressure-sensitive contact scheme or a touch mode based on a capacitive contact scheme.

Meanwhile, although the image display apparatus 100, the pointing signal receiving apparatus 300, and the pointing signal processing apparatus 400 are illustrated separately in the drawing, at least the pointing signal processing apparatus 400 among the pointing signal receiving apparatus 300 and the pointing signal processing apparatus 400 may be provided in the image display apparatus 100. As a result, the touch pen mode can easily be performed within a single image display apparatus.

2 to 3 illustrate various examples of an internal block diagram of the image display apparatus of FIG. 1.

First, referring to FIG. 2, the image display apparatus 100 according to an embodiment of the present invention may include a broadcast receiving unit 105, an external device interface unit 130, a network interface unit 135, a storage unit 140, a user input interface unit 150, a controller 170, a display 180, an audio output unit 185, and a power supply unit 190.

The broadcast receiving unit 105 may include a tuner 110, a demodulator 120, and the network interface unit 135. Of course, where necessary, it may be designed to include the tuner 110 and the demodulator 120 without the network interface unit 135, or conversely, to include the network interface unit 135 without the tuner 110 and the demodulator 120.

The tuner 110 selects the RF broadcast signal corresponding to a channel selected by the user, or the RF broadcast signals of all pre-stored channels, from among the RF (Radio Frequency) broadcast signals received through an antenna. The selected RF broadcast signal is converted into an intermediate frequency signal or a baseband video or audio signal.

The demodulator 120 receives the digital IF signal DIF converted by the tuner 110 and performs a demodulation operation.

The demodulation unit 120 may perform demodulation and channel decoding, and then output a stream signal TS. In this case, the stream signal may be a signal multiplexed with a video signal, an audio signal, or a data signal.

The stream signal output from the demodulator 120 may be input to the controller 170. The control unit 170 performs demultiplexing, video / audio signal processing, and the like, and then outputs an image to the display 180 and outputs audio to the audio output unit 185.

The external device interface unit 130 may connect the external device to the image display device 100. To this end, the external device interface unit 130 may include an A / V input / output unit (not shown) or a wireless communication unit (not shown).

The external device interface unit 130 may be connected to an external device such as a digital versatile disk (DVD), a Blu-ray, a game device, a camera, a camcorder, a computer (laptop), or the like by wire / wireless.

The A / V input / output unit can receive video and audio signals from an external device. Meanwhile, the wireless communication unit can perform short-range wireless communication with other electronic devices.

Also, the external device interface unit 130 may be connected to various set-top boxes via at least one of the various terminals described above to perform input / output operations with the set-top box.

The external device interface unit 130 may transmit / receive data with the pointing signal processing apparatus 400.

The network interface unit 135 provides an interface for connecting the video display device 100 to a wired / wireless network including the Internet network. For example, the network interface unit 135 can receive, via the network, content or data provided by the Internet or a content provider or a network operator.

The storage unit 140 may store a program for each signal processing and control in the control unit 170 or may store the processed video, audio, or data signals.

In addition, the storage unit 140 may perform a function for temporarily storing video, audio, or data signals input to the external device interface unit 130. In addition, the storage unit 140 may store information on a predetermined broadcast channel through a channel memory function such as a channel map.

Although the storage unit 140 of FIG. 2 is provided separately from the control unit 170, the scope of the present invention is not limited thereto. The storage unit 140 may be included in the controller 170.

The user input interface unit 150 transmits a signal input by the user to the control unit 170 or a signal from the control unit 170 to the user.

For example, the user input interface unit 150 may transmit and receive user input signals such as power on/off, channel selection, and screen setting to and from the second remote control apparatus 200 or the first remote control apparatus 500, may transmit to the controller 170 a user input signal input from a local key (not shown) such as a power key, a channel key, a volume key, or a setting key, may transmit to the controller 170 a user input signal input from a sensing unit (not shown) that senses a user's gesture, or may transmit a signal from the controller 170 to the sensing unit (not shown).

The controller 170 may demultiplex the stream input through the tuner 110, the demodulator 120, or the external device interface unit 130, or process the demultiplexed signals, to generate and output signals for video or audio output.

The video signal processed by the controller 170 may be input to the display 180 and displayed as an image corresponding to the video signal. Also, the image signal processed by the controller 170 may be input to the external output device through the external device interface unit 130.

The audio signal processed by the controller 170 may be audibly output through the audio output unit 185. The audio signal processed by the controller 170 may also be input to an external output device through the external device interface unit 130.

Although not shown in FIG. 2, the controller 170 may include a demultiplexer, an image processor, and the like. This will be described later with reference to FIG. 5.

In addition, the controller 170 can control the overall operation of the image display apparatus 100. For example, the controller 170 may control the tuner 110 to select an RF broadcast corresponding to a channel selected by the user or a pre-stored channel.

In addition, the controller 170 may control the image display apparatus 100 by a user command or an internal program input through the user input interface unit 150.

Meanwhile, the control unit 170 may control the display 180 to display an image. In this case, the image displayed on the display 180 may be a still image or a video, and may be a 2D image or a 3D image.

The display 180 converts the video signal, data signal, OSD signal, and control signal processed by the controller 170, or the video signal, data signal, and control signal received from the external device interface unit 130, to generate a driving signal.

The following description assumes that the display 180 is a plasma display panel capable of the touch pen scheme according to an embodiment of the present invention.

The audio output unit 185 receives the signal processed by the control unit 170 and outputs it as a voice.

Meanwhile, in order to detect a user's gesture, as described above, a sensing unit (not shown) including at least one of a touch sensor, a voice sensor, a position sensor, and a motion sensor may be further provided in the image display apparatus 100. The signal detected by the sensing unit (not shown) is transmitted to the controller 170 through the user input interface unit 150.

The controller 170 may detect a user's gesture from the image photographed by a photographing unit (not shown) or the signal detected by the sensing unit (not shown), either individually or in combination.

The power supply unit 190 supplies power to the entire image display apparatus 100. In particular, it can supply power to the controller 170, which can be implemented in the form of a system on chip (SOC), the display 180 for displaying an image, and the audio output unit 185 for audio output.

To this end, the power supply unit 190 may include a converter (not shown) for converting AC power into DC power, and may further include a DC/DC converter (not shown) for converting the level of the DC power and outputting the level-converted DC power.

The second remote control apparatus 200 is used to input user commands through the user input interface unit 150. In particular, according to an embodiment of the present invention, it senses light emitted from a specific discharge cell of the plasma display panel, and is used so that an image signal based on the corresponding coordinate information is input to the image display apparatus 100 through the pointing signal receiving apparatus 300 and the pointing signal processing apparatus 400.

The first remote control apparatus 500 is used to input user commands through the user input interface unit 150. In particular, according to an embodiment of the present invention, it senses light emitted from a specific discharge cell of the plasma display panel, and an image signal for erasing the area corresponding to that coordinate information is input to the image display apparatus 100 through the pointing signal receiving apparatus 300 and the pointing signal processing apparatus 400.

Next, the image display apparatus 100 of FIG. 3 is similar to that of FIG. 2, but differs in that the pointing signal receiving apparatus 300 and the pointing signal processing apparatus 400 of FIG. 2 are provided inside the image display apparatus 100.

Accordingly, coordinate information based on the optical signal sensed by the second remote control apparatus 200 or the first remote control apparatus 500 may be received by the pointing signal receiving unit 300 and the pointing signal processing unit 400 in the image display apparatus 100. The pointing signal processing unit 400 may generate an image signal based on the coordinate information and transmit the image signal to the controller 170. The controller 170 may control a predetermined image corresponding to the image signal to be displayed on the plasma display panel. Meanwhile, the predetermined program described with reference to FIG. 1 may be installed in the pointing signal processing unit 400. Meanwhile, unlike FIG. 3, the pointing signal receiving unit 300 and the pointing signal processing unit 400 may be provided in the user input interface unit 150.

Meanwhile, the video display device 100 may be a digital broadcast receiver capable of receiving a fixed or mobile digital broadcast.

Meanwhile, the image display apparatus described in this specification may include a TV receiver, a mobile phone, a smart phone, a notebook computer, a digital broadcasting terminal, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), and the like.

Meanwhile, a block diagram of the image display apparatus 100 shown in FIGS. 2 to 3 is a block diagram for an embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted according to the specifications of the image display apparatus 100 actually implemented. That is, two or more constituent elements may be combined into one constituent element, or one constituent element may be constituted by two or more constituent elements, if necessary. In addition, the functions performed in each block are intended to illustrate the embodiments of the present invention, and the specific operations and apparatuses do not limit the scope of the present invention.

4 is a diagram illustrating an example of an interior of the display of FIG. 2.

Referring to the drawing, the plasma display panel based display 180 includes a plasma display panel 210 and a driving circuit 230.

The plasma display panel 210 includes scan electrodes Y and sustain electrodes Z formed on a first substrate in parallel with each other, and address electrodes X formed on a second substrate and crossing the scan electrodes Y and the sustain electrodes Z.

In order to display an image, a plurality of scan electrode lines Y, a sustain electrode line Z, and an address electrode line X are arranged to cross each other in a matrix form, and discharge cells are formed in the crossing regions. Meanwhile, the discharge cells may be generated for each of R, G, and B.

The driving circuit unit 230 drives the plasma display panel 210 through a control signal and a data signal supplied from the controller 170 of FIG. 1. To this end, the driving circuit unit 230 includes a timing controller 232, a scan driver 234, a sustain driver 238, and an address driver 236. The operation of the scan driver 234, the sustain driver 238, and the address driver 236 will be described later with reference to FIG. 9 and the figures that follow.

The timing controller 232 receives a control signal, R, G, B data signals, a vertical synchronization signal Vsync, and the like from the controller 170, controls the scan driver 234 and the sustain driver 238 in response to the control signal, and rearranges the R, G, B data signals and provides them to the address driver 236.

The power supply unit 190 may supply a plurality of levels of DC power required for the plasma display panel 210 to the scan driver 234, the sustain driver 238, and the address driver 236, respectively.

5 is an internal block diagram of the controller of FIG. 2.

Referring to the drawing, the controller 170 according to an embodiment of the present invention may include a demultiplexer 410, an image processor 420, an OSD generator 440, a mixer 445, a frame rate converter 450, a formatter 460, an audio processing unit (not shown), and a data processing unit (not shown).

The demultiplexer 410 demultiplexes the input stream. For example, when an MPEG-2 TS is input, it can be demultiplexed into video, audio, and data signals, respectively. Here, the stream signal input to the demultiplexer 410 may be a stream signal output from the tuner 110, the demodulator 120, or the external device interface unit 130.

The image processor 420 may perform image processing of the demultiplexed image signal. To this end, the image processor 420 may include an image decoder 425 and a scaler 435.

The image decoder 425 decodes the demultiplexed video signal, and the scaler 435 scales the resolution of the decoded video signal so that it can be output on the display 180.

The video decoder 425 may include decoders of various standards.

The OSD generator 440 generates an OSD signal according to a user input or on its own. For example, based on a user input signal, a signal for displaying various information in graphic or text form on the screen of the display 180 can be generated. The generated OSD signal may include various data such as a user interface screen of the image display apparatus 100, various menu screens, widgets, and icons. In addition, the generated OSD signal may include a 2D object or a 3D object.

The mixer 445 may mix the OSD signal generated by the OSD generator 440 and the decoded image signal processed by the image processor 420. At this time, the OSD signal and the decoded video signal may include at least one of a 2D signal and a 3D signal. The mixed video signal is provided to the frame rate converter 450.

The frame rate converter 450 converts the frame rate of the input video. On the other hand, the frame rate converter 450 may output the data as it is without additional frame rate conversion.

The formatter 460 receives a mixed signal from the mixer 445, that is, an OSD signal and a decoded video signal, and changes the format of the signal to be suitable for the display 180. For example, the R, G, and B data signals may be output as low voltage differential signaling (LVDS) signals or mini-LVDS signals.

The formatter 460 may separate a 2D video signal and a 3D video signal for displaying a 3D video. It is also possible to change the format of the 3D video signal or convert the 2D video signal to the 3D video signal.

On the other hand, the audio processing unit (not shown) in the control unit 170 can perform the audio processing of the demultiplexed audio signal. To this end, the voice processing unit (not shown) may include various decoders.

In addition, the audio processing unit (not shown) in the controller 170 can process bass, treble, volume control, and the like.

The data processing unit (not shown) in the control unit 170 can perform data processing of the demultiplexed data signal. For example, if the demultiplexed data signal is a coded data signal, it can be decoded. The encoded data signal may be EPG (Electronic Program Guide) information including broadcast information such as a start time and an end time of a broadcast program broadcasted on each channel.

Meanwhile, a block diagram of the controller 170 shown in FIG. 5 is a block diagram for an embodiment of the present invention. Each component of the block diagram can be integrated, added, or omitted according to the specifications of the control unit 170 actually implemented.

In particular, the frame rate converter 450 and the formatter 460 may not be provided in the controller 170, but may be provided separately.

6 is a view for explaining an example of the operation of the second remote control device for controlling the image display device of FIG. 2.

As shown in FIG. 6(a), when the touch pen type second remote control apparatus 200 is moved from a first point to a second point on or near the plasma display panel 180, an image corresponding to the movement is displayed on the display 180 according to the movement, as shown in FIG. 6(b). The figure illustrates that an image of a '-' shape is displayed.

As described above, in the touch pen mode, the touch pen type second remote control apparatus 200 senses the infrared rays (IR) emitted from a specific discharge cell of the plasma display panel 180, and the coordinates of that discharge cell are calculated based on the sensed light. As a result, an image is displayed on the plasma display panel 180 according to the calculated coordinates.

Next, as shown in FIG. 6(c), when the touch pen type second remote control apparatus 200 is moved from a third point to a fourth point on or near the plasma display panel 180, an image corresponding to the movement is displayed on the display 180 according to the movement, as shown in FIG. 6(d). As a result, the figure illustrates that an image of a 'T' shape is displayed.

On the other hand, unlike what is illustrated in the figure, when the touch pen type second remote control apparatus 200 remains stationary at a specific discharge cell, an image of a '.' shape will be displayed on the plasma display panel 180.

By such a touch pen method, a user can easily display an image having a desired shape on the plasma display panel.

Hereinafter, the touch pen type second remote control apparatus 200 will be described in more detail.

FIG. 7 is an internal perspective view of the second remote control device of FIG. 2, FIG. 8 shows an example of an internal block diagram of the second remote control device of FIG. 2 and a simplified internal block diagram of the pointing signal receiving device, and FIG. 9 is a view referred to for explaining light sensing in the second remote control device.

Referring to FIGS. 7 to 9, the touch pen type second remote control apparatus 200 may include a wireless communication unit 225, a user input unit 235, an optical sensor unit 240, an output unit 250, a power supply unit 260, a storage unit 270, and a controller 280. In addition, the touch pen type second remote control apparatus 200 may include a rotating ball 780.

The wireless communication unit 225 may include an RF module 221 or an IR module 223 for communication with the pointing signal receiving apparatus 300.

The IR module 223 or the RF module 221 may transmit the coordinate signal (x, y) of the discharge cell, calculated based on the light sensed by the optical sensor unit 240, to the pointing signal receiving apparatus 300 according to the IR scheme or the RF scheme. In addition, the IR module 223 or the RF module 221 may transmit a control signal such as a power on/off signal of the second remote control apparatus 200. In particular, in the embodiment of the present invention, communication with the pointing signal receiving apparatus 300 is performed through the RF module 221, using various channels for stable communication.

The user input unit 235 may be configured as a keypad, a button, a touch pad, or a touch screen. The user may input a command related to the image display apparatus 100 to the second remote control apparatus 200 by manipulating the user input unit 235. When the user input unit 235 includes a hard key button, the user may input a command related to the image display apparatus 100 to the second remote control apparatus 200 by pushing a hard key button.

The user input unit 235 may include a home key 740 as illustrated in FIGS. 7 to 8.

For example, when the home key 740 is briefly pressed once, the second remote control apparatus 200 may be powered on, and when it is briefly pressed once again, the second remote control apparatus 200 may be powered off.

Then, when the home key 740 is pressed once for a long time while the second remote control apparatus 200 is powered on, pairing may be performed, and when it is pressed once again, the pairing may be released.

On the other hand, the display unit 745 may output different light for power on/off and for pairing/unpairing.

On the other hand, when the home key 740 is pressed once and the second remote control apparatus 200 is powered on, pairing may be performed immediately, and when it is pressed once again and the second remote control apparatus 200 is powered off, the pairing may also be released.

Meanwhile, unlike the drawing, a power on / off key (not shown) and a touch pen mode key (not shown) for performing pairing may be separated.

For example, the power of the second remote control apparatus 200 may be turned on or off according to the operation of the power on/off key; when the touch pen mode key is operated once, the second remote control apparatus 200 wakes up and enters the touch pen mode, and when the touch pen mode key is pressed again, the touch pen mode may end.

In addition, the user input unit 235 may include various types of input means that can be operated by the user, and this embodiment does not limit the scope of the present invention.

The optical sensor unit 240 may detect light emitted from a specific discharge cell of the plasma display panel of the image display apparatus 100, for example, infrared rays. To this end, the optical sensor unit 240, as shown in FIG. 8, may include an optical sensor 710, an amplifier 715, and a comparator 720.

In the touch pen mode, when the optical sensor 710 approaches or contacts a specific discharge cell of the plasma display panel, it may sense the light emitted from the corresponding discharge cell. In particular, infrared (IR) light can be sensed. The sensed signal SIR may be, for example, as shown in FIG. 9(a).

The amplifier 715 amplifies the optical signal SIR detected by the optical sensor 710. To this end, the amplifier 715 may include an OP AMP. The amplified signal Samp may be, for example, as shown in FIG. 9(b).

Next, the comparator 720 compares the signal Samp amplified by the amplifier 715 with a reference signal Sref, and outputs a timing signal Sf corresponding to the section of the amplified signal Samp that is at or above the level of the reference signal Sref. In FIG. 9(c), the section of the amplified signal Samp that is at or above the level of the reference signal Sref is represented as a low level.

The timing signal Sf corresponds to the position of a specific discharge cell, in particular, the x and y coordinates, and is input to the control unit 280 and used for the x and y coordinate calculation.

Meanwhile, as shown in FIG. 9, the low level section of the timing signal Sf corresponds to a lower-level section of the sensed optical signal SIR, not to its peak section. The signal could be detected more accurately by setting the level of the reference signal Sref higher, but depending on the surrounding environment during infrared sensing, the optical signal SIR sensed by the optical sensor 710 may include noise. Therefore, the optical sensor unit 240 or the controller 280 may further perform signal processing on the timing signal Sf of FIG. 9(c).

For example, the falling edge and the rising edge of the timing signal Sf of FIG. 9(c) may be detected and their average set as the low level. That is, the section midway between the falling edge and the rising edge may be set to a low level. In this way, a digital signal that closely matches the actual waveform of the infrared signal can be obtained.
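
A minimal sketch of this edge-midpoint processing is given below, assuming the comparator output Sf is available as an array of high/low samples; the sampling representation and the function name are illustrative assumptions.

/* Sketch of the edge-midpoint processing: the midpoint of the falling
 * and rising edges of the comparator output Sf is taken as the sensing
 * timing used for coordinate calculation. */
#include <stdio.h>

/* Returns the sample index halfway between the falling edge and the
 * rising edge of the first low pulse, or -1 if no pulse is found. */
static int pulse_center(const int *sf, int n)
{
    int fall = -1, rise = -1;
    for (int i = 1; i < n; i++) {
        if (fall < 0 && sf[i - 1] == 1 && sf[i] == 0)
            fall = i;                    /* falling edge: high -> low */
        else if (fall >= 0 && sf[i - 1] == 0 && sf[i] == 1) {
            rise = i;                    /* rising edge: low -> high  */
            break;
        }
    }
    return (fall >= 0 && rise >= 0) ? (fall + rise) / 2 : -1;
}

int main(void)
{
    int sf[] = { 1, 1, 1, 0, 0, 0, 0, 0, 1, 1 };   /* comparator output Sf */
    printf("sensing timing at sample %d\n", pulse_center(sf, 10));
    return 0;
}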

The output unit 250 may output a video or audio signal corresponding to an operation of the user input unit 235 or a signal transmitted from the image display apparatus 100. The user may recognize whether the user input unit 235 is manipulated or whether the image display apparatus 100 is controlled through the output unit 250.

For example, the output unit 250 may include an LED module 251 that lights up when the user input unit 235 is operated or when a signal is transmitted to or received from the image display apparatus 100 through the wireless communication unit 225, a vibration module 253 that generates vibration, a sound output module 255 that outputs sound, or a display module 257 that outputs an image.

The power supply unit 260 supplies power to the second remote control apparatus 200. When the second remote control apparatus 200 does not sense light for more than a first predetermined time, the power supply unit 260 may enter a standby mode and limit the power supplied to some modules. In addition, when no light is sensed for more than a second predetermined time in the standby mode, the power supply may be stopped. The power supply unit 260 may resume power supply when a predetermined key provided in the second remote control apparatus 200 is operated or when light is sensed again by the optical sensor unit 240.
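
The following is a rough state sketch of this power management under stated assumptions: the timeout values, the event model, and the transition out of the powered-off state are illustrative choices, not values taken from the patent.

/* Illustrative power-state sketch. T1_MS and T2_MS stand in for the
 * first and second predetermined times, which the patent leaves open. */
#include <stdio.h>

enum power_state { ACTIVE, STANDBY, POWERED_OFF };

#define T1_MS 30000   /* assumed first predetermined time  */
#define T2_MS 60000   /* assumed second predetermined time */

/* Advance the power state given the idle time since the last light
 * sensing. A key operation or renewed light sensing (event != 0)
 * returns the device to ACTIVE, as described above. */
static enum power_state next_state(enum power_state s, long idle_ms, int event)
{
    if (event)
        return ACTIVE;
    if (s == ACTIVE && idle_ms > T1_MS)
        return STANDBY;                  /* limit power of some modules */
    if (s == STANDBY && idle_ms > T1_MS + T2_MS)
        return POWERED_OFF;              /* stop the power supply       */
    return s;
}

int main(void)
{
    enum power_state s = ACTIVE;
    s = next_state(s, 45000, 0);    /* no light for 45 s -> STANDBY  */
    s = next_state(s, 100000, 0);   /* still idle -> POWERED_OFF     */
    s = next_state(s, 0, 1);        /* key pressed -> back to ACTIVE */
    printf("final state: %d\n", (int)s);
    return 0;
}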

The storage unit 270 may store various types of programs, application data, and the like necessary for controlling or operating the second remote control apparatus 200. In particular, for a pairing operation with the pointing signal processing apparatus 400, information about a specific frequency band or a transmission data unit for a plurality of channels may be stored.

In the touch pen mode, the controller 280 receives from the optical sensor unit 240 a timing signal corresponding to the sensing of light emitted from a specific discharge cell of the plasma display panel. For example, a timing signal Sf as shown in FIG. 9(c) may be input.

The controller 280 performs signal processing on the received timing signal to calculate x, y coordinate signals in the plasma display panel.

In addition, the controller 280 may perform signal conversion to transmit the calculated x, y coordinate signal in an RF manner. In addition, the converted RF x, y coordinate signal may be output to the RF module 221.

Meanwhile, when the power on/off key 775 is operated and the second remote control apparatus 200 is powered on, the controller 280 may control a pairing operation to be performed with the pointing signal processing apparatus 400 through the pointing signal receiving apparatus 300. The pairing operation may be performed before entering the touch pen mode by operation of the touch pen mode key 235.

As illustrated in FIG. 8, the second remote control apparatus 200 may further include an antenna 730 for transmitting data signals, such as the RF coordinate signal or other pairing signals, output from the RF module.

As illustrated in FIG. 8, the pointing signal receiver 300 may include an antenna 760 and an RF module 765. The antenna 760 receives the RF signal, and the RF module 765 may process the received RF signal and output an x, y coordinate signal. The output coordinate signal is input to the pointing signal processing apparatus 400, which is connected by wire or wirelessly.

The pointing signal processing apparatus 400 processes the signal based on the input coordinate signal and transmits a predetermined image signal to the image display apparatus 100. Accordingly, the image display apparatus 100, specifically the plasma display panel, can display a predetermined image (a pointing image, etc.) in a specific discharge cell, that is, in the discharge cell corresponding to the coordinates (x, y).

The rotating ball 780 is disposed in front of the optical sensor 710 to rotate when in contact with the plasma display panel. As such, by implementing the part in contact with the plasma display panel with the rotating ball, wear of the contact part can be reduced, thereby improving the durability of the remote control device and the surface durability of the plasma display panel.

On the other hand, for smooth light sensing in the optical sensor 710, the rotating ball 780 is preferably made of a transparent material. That is, light emitted from the plasma display panel may pass through the transparent rotating ball 780 and may be detected by the optical sensor 710.

Although not shown in FIG. 8, the second remote control apparatus 200 may further include a pressure detector (not shown) or a rotation detector (not shown).

For example, when the second remote control apparatus 200 is in contact with the plasma display panel, the rotating ball 780 of the second remote control apparatus 200 may be pushed back toward the optical sensor 710 by the force pressing the second remote control apparatus 200. The pressure sensing unit may output signals of different levels according to the applied pressure, and the sensed pressure signal is transmitted to the controller 280. Accordingly, the image display apparatus 100 may display images of different sizes or thicknesses based on the different pressure signals.

Meanwhile, a driving method for driving the plasma display panel provided in the display of the image display apparatus will be described below.

In the plasma display panel, the unit frame for implementing the gray level of the image may include a plurality of subfields.

Each of the plurality of subfields may include an address period for selecting the discharge cells in which a discharge will or will not occur, and a sustain period for implementing gray scale according to the number of discharges.

Alternatively, at least one subfield of the plurality of subfields of the frame may further include a reset period for initialization.

10 to 12 illustrate an operation of the plasma display panel in the touch pen mode according to an embodiment of the present invention.

Referring to FIG. 10, in the touch pen mode, at least one of a plurality of subfields forming one frame may be set as a scan subfield.

For example, a first subfield and a second subfield among the plurality of subfields of a frame may be used as scan subfields for detecting a touch position. In addition, the remaining subfields of the frame other than the scan subfields may be normal subfields (Normal SF). Here, a normal subfield is a subfield for image display, that is, for displaying image gray scale, and may also be referred to as a display subfield in contrast to the scan subfield.

In addition, in the normal mode other than the touch pen mode, the frame does not include a scan subfield, and all subfields included in the frame may be normal subfields.

In other words, in the touch pen mode, when the touch pen type second remote control apparatus 200 operates, at least one of the plurality of subfields of the frame may be set as a scan subfield.

Referring to FIG. 11, the scan subfield may include a vertical scan subfield VSSF for detecting the vertical position of the touch position and a horizontal scan subfield HSSF for detecting the horizontal position of the touch position.

For example, in the touch pen mode, the first subfield of the plurality of subfields of the frame may be a vertical scan subfield, and the second subfield may be a horizontal scan subfield. As such, the vertical scan subfield and the horizontal scan subfield may be continuously arranged in one frame.
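
To make this frame organization concrete, the sketch below models one frame in the touch pen mode as an array of subfields whose first two entries are the vertical and horizontal scan subfields; the number of subfields, the struct fields, and the sustain weighting are illustrative assumptions only.

/* Illustrative data model of one touch-pen-mode frame: subfields 1 and 2
 * are the vertical and horizontal scan subfields, the rest are normal
 * (display) subfields. Field values are placeholders. */
#include <stdio.h>

enum subfield_type { SF_NORMAL, SF_VSCAN, SF_HSCAN };

typedef struct {
    enum subfield_type type;   /* normal, vertical scan, or horizontal scan */
    int has_reset_period;      /* reset period for initialization           */
    int sustain_pulses;        /* number of sustain discharges (gray scale) */
} subfield;

#define SF_PER_FRAME 12        /* assumed number of subfields per frame */

int main(void)
{
    subfield frame[SF_PER_FRAME];

    /* Touch pen mode: first subfield = vertical scan, second = horizontal
     * scan, and the remaining subfields are normal display subfields. */
    frame[0] = (subfield){ SF_VSCAN, 0, 0 };
    frame[1] = (subfield){ SF_HSCAN, 0, 0 };
    for (int i = 2; i < SF_PER_FRAME; i++)
        frame[i] = (subfield){ SF_NORMAL, i == 2, 1 << (i - 2) }; /* weighted sustains */

    for (int i = 0; i < SF_PER_FRAME; i++)
        printf("SF%-2d type=%d sustains=%d\n",
               i + 1, (int)frame[i].type, frame[i].sustain_pulses);
    return 0;
}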

In the vertical scan address period VSAP of the vertical scan subfield VSSF, the touch scan signal TSP descending from the scan reference voltage Vsc may be supplied to the scan electrode.

Preferably, the touch scan signal TSP may be sequentially supplied to the plurality of scan electrodes Y. Alternatively, the touch scan signal TSP may be supplied to at least two scan electrodes Y at substantially the same time.

As such, when the touch scan signal TSP is supplied to the scan electrode Y, the voltages of the address electrode X and the sustain electrode Z may be kept substantially constant.

When the touch scan signal TSP is supplied to the scan electrode Y in the vertical scan address period VSAP, if the voltage of the address electrode X is set higher than the voltage of the sustain electrode Z, a discharge may be generated between the scan electrode Y and the address electrode X. In the following description, the discharges sequentially generated in the vertical scan address period VSAP in this way are referred to as vertical address discharges.

In the address period of the horizontal scan subfield HSSF (hereinafter referred to as the horizontal scan address period HSAP), the touch data signal TDP may be supplied to the address electrode X.

Preferably, the touch data signal TDP may be sequentially supplied to the plurality of address electrodes X. Alternatively, the touch data signal TDP may be supplied to at least two address electrodes X at substantially the same time point.

As such, when the touch data signal TDP is supplied to the address electrode X, the voltages of the scan electrode Y and the sustain electrode Z may be kept substantially constant.

When the touch data signal TDP is supplied to the address electrode X in the horizontal scan address period HSAP, with the voltages of the scan electrode Y and the sustain electrode Z kept constant, a discharge may occur between the address electrode X and the scan electrode Y or the sustain electrode Z. Hereinafter, the discharge generated in the horizontal scan address period HSAP in this way is referred to as a horizontal address discharge.

Meanwhile, the remote controller, for example, the second remote control apparatus 200 described above, may obtain information corresponding to the vertical coordinate (y coordinate) of the touch position based on the vertical address discharge generated in the vertical scan address period (VSAP), that is, the vertical address light, and information corresponding to the horizontal coordinate (x coordinate) of the touch position based on the horizontal address discharge generated in the horizontal scan address period (HSAP), that is, the horizontal address light.

For example, assume that, in the touch pen mode, the second remote control apparatus 200 is located at the third scan electrode line Y3 and the second address electrode line X2, as shown in FIG. 12. The second remote control apparatus 200 detects the vertical address light generated in the third scan electrode line Y3 during the vertical scan subfield VSSF of the scan subfield, and senses the horizontal address light generated in the second address electrode line X2 during the horizontal scan subfield HSSF of the scan subfield.

In particular, it can be seen that the vertical coordinate of the touch position is Y3 based on the sensing timing of the vertical address light generated in the third scan electrode line Y3, and that the horizontal coordinate of the touch position is X2 based on the sensing timing of the horizontal address light generated in the second address electrode line X2.

The vertical light sensing timing and the horizontal light sensing timing may each be calculated with reference to the scan sustain period SSP. Accordingly, the coordinate information of the touch position can be obtained simply.

Meanwhile, as in FIG. 11, the touch sustain signal TSUS may be supplied to at least one of the scan electrode Y and the sustain electrode Z in the scan sustain period SSP between the vertical scan address period VSAP and the horizontal scan address period HSAP.

On the other hand, unlike the drawing, it is also possible to alternately supply the touch sustain signal TSUS to the scan electrode Y and the sustain electrode Z in the scan sustain period SSP.

The scan sustain period SSP of FIG. 11 may include a synchronous sustain period and an identification sustain period. The scan sustain period may also be referred to as a reference sustain period.

In FIG. 11, two synchronous sustain pulses are applied to the scan electrode Y in the synchronous sustain period, but various modifications are possible depending on the setting.

In FIG. 11, an identification sustain pulse is applied to the scan electrode Y after the synchronous sustain pulse, that is, after the second synchronous sustain pulse.

On the basis of such an identification sustain pulse, or more precisely on the basis of the identification sustain light, the vertical coordinate and the horizontal coordinate may be calculated. For example, as shown in FIG. 12, the vertical coordinate, that is, the Y3 coordinate, may be calculated using the time difference between the identification sustain light and the vertical address light corresponding to the Y3 position. Then, the horizontal coordinate, that is, the X2 coordinate, can be calculated using the time difference between the identification sustain light and the horizontal address light corresponding to the X2 position.
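A minimal Python sketch of this time-difference calculation, assuming the electrode lines are driven sequentially at a constant per-line interval and that the identification sustain light serves as the time reference; the numeric values and names are illustrative assumptions.

    def line_index_from_timing(t_address_light, t_id_sustain_light, t_per_line):
        """Estimate the 1-based electrode line index from light timings.

        t_address_light    : time at which the address light was sensed
        t_id_sustain_light : time at which the identification sustain light
                             was sensed (used as the reference point)
        t_per_line         : assumed constant scan time per electrode line
        """
        dt = t_address_light - t_id_sustain_light
        return int(round(dt / t_per_line))

    # Example with illustrative numbers: the vertical address light arrives
    # 300 us after the identification sustain light, each line takes 100 us.
    y = line_index_from_timing(t_address_light=300, t_id_sustain_light=0,
                               t_per_line=100)
    print(y)  # 3, i.e. the Y3 scan electrode line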

Of course, in addition to the identification sustain pulse, it is also possible to calculate the horizontal coordinate and the vertical coordinate by further utilizing the synchronous sustain pulse.

FIG. 13 is a view for explaining an example of the operation of the first remote control apparatus for controlling the image display apparatus of FIG. 2.

As shown in FIG. 13(a), when the touch pen type first remote control device 500 is moved from a first point to a second point on or near the plasma display panel 180, the image corresponding to the moved-over area is erased and displayed on the display 180 according to the movement, as shown in FIG. 13(b). In the drawing, the image of the '-' shaped region is erased and displayed.

As described above, in the touch pen mode, the touch pen type first remote control apparatus 500 detects infrared rays (IR) output from a specific discharge cell of the plasma display panel 180, and calculates the coordinates of the discharge cell based on the light detection. As a result, the image is erased and displayed on the plasma display panel 180 according to the calculated coordinates.

Next, as shown in FIG. 13(c), when the touch pen type first remote control apparatus 500 is moved from a third point to another point on or near the plasma display panel 180, the image corresponding to the moved-over area is erased and displayed on the display 180 according to the movement, as shown in FIG. 13(d). As a result, in the drawing, the image of the 'T' shaped region is erased and displayed.

By the touch pen method as described above, the user can easily erase and display an image of a desired area on the plasma display panel. That is, the first remote control device 500 can be used like an eraser.

Hereinafter, the first remote control device 500 will be described in more detail.

14A to 14B illustrate various examples of the first remote control apparatus of FIG. 13, FIG. 15 is an internal block diagram of the first remote control apparatus of FIG. 14A, and FIG. 16 is an internal block diagram of the first remote control apparatus of FIG. 14B.

First, referring to FIGS. 14A and 15, the first remote control apparatus 500 according to an exemplary embodiment of the present invention, which may be used in the erase mode of the touch pen mode, may include the optical sensor unit 840, the controller 880, the RF module 821, the antenna 830, the home key 840, and the display unit 845.

Compared to the second remote control apparatus 200 of FIG. 8, there is a difference in that two optical sensors are disposed in the optical sensor unit 840. Accordingly, the description of the controller 880, the RF module 821, the antenna 830, the home key 840, the display unit 845, and the like, which are the same as in FIG. 8, is omitted with reference to FIG. 8, and the description below focuses on the differences.

The first optical sensor 810 detects light emitted from the discharge cells of the plasma display panel and outputs a first timing signal based on the light sensing.

The second optical sensor 812 is spaced apart from the first optical sensor 810, and detects light emitted from the discharge cells of the plasma display panel, and outputs a second timing signal based on the light sensing.

As described above, the first optical sensor 810 and the second optical sensor 812 may sense the vertical address light emitted during the vertical scan subfield period of the plasma display panel and the horizontal address light emitted during the horizontal scan subfield period.

The control unit 880 calculates the first coordinate information based on the first timing signal, and calculates the second coordinate information based on the second timing signal.

The controller 880 may process the received first and second timing signals, respectively, to calculate first and second x, y coordinate signals in the plasma display panel. In addition, the controller 880 may perform signal conversion to transmit the calculated x, y coordinate signal in an RF manner. In addition, the converted RF x, y coordinate signal may be output to the RF module 821.

On the other hand, the controller 880 may calculate area information of the opposing area where the plasma display panel and the first remote control apparatus 500 face each other, based on the first and second coordinate information. The opposing area information here may be information corresponding to the area of the first remote control apparatus 500.

That is, the controller 880 may calculate area information to be erased based on the calculated first and second coordinate information.

The wireless communication unit including the RF module 821 and the antenna 830 may transmit the calculated first and second coordinate information, or area information based on the first and second coordinate information, to an external device, for example, the pointing signal receiver 300.

Referring to FIG. 14A, the first remote control apparatus 500 may have a rectangular shape, and the first optical sensor 810 and the second optical sensor 812 may be disposed on one surface thereof so as to be spaced apart from each other.

In the drawing, the first optical sensor 810 is located at the center of the rectangular shape, and the second optical sensor 812 is located at a corner of the rectangular shape. It can be seen that the distance between the first optical sensor 810 and the first edge where the second optical sensor 812 is located is B, and the distance between the first optical sensor 810 and the second edge (intersecting the first edge) is A.

Accordingly, the controller 880 may calculate the area information to be erased based on the first coordinate information from the first optical sensor 810 and the second coordinate information from the second optical sensor 812. That is, as shown in FIG. 14A, the area occupied by the first remote control apparatus 500 may be calculated as the area information to be erased.

On the other hand, unlike the drawing, the two photosensors 810 and 812 may be arranged in various positions; preferably, they are arranged asymmetrically. When the two photosensors 810 and 812 are arranged asymmetrically, the controller 880 may distinguish whether the first remote control apparatus 500 is disposed horizontally or vertically.
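As a rough Python sketch of this horizontal/vertical discrimination, assuming the two sensors have already yielded (x, y) line indices; the function name and the tie-breaking rule are assumptions, not part of the disclosure.

    def detect_orientation(coord1, coord2):
        """Guess whether the remote control device lies horizontally or vertically.

        coord1, coord2 : (x, y) line indices sensed by the two optical sensors.
        With an asymmetric sensor layout, a larger span along x suggests a
        horizontal placement and a larger span along y a vertical one.
        """
        dx = abs(coord1[0] - coord2[0])
        dy = abs(coord1[1] - coord2[1])
        return "horizontal" if dx >= dy else "vertical"

    print(detect_orientation((3, 3), (3, 1)))  # 'vertical'   (cf. FIG. 18A)
    print(detect_orientation((3, 3), (5, 3)))  # 'horizontal' (cf. FIG. 18B)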

On the other hand, the number of optical sensors is preferably two or more. As shown in the figure, there may be two, but it is also possible to have more. For example, four optical sensors may be disposed, one at each corner of the first remote control apparatus 500.

On the other hand, when the first remote control device 500 has a circular shape, it is also possible to use only one optical sensor.

Meanwhile, in the erase mode, the first remote control apparatus 500 of FIGS. 14A and 15 transmits coordinate information or area information to the outside using the two optical sensors 810 and 812, while in the writing mode it is possible to transmit only coordinate information to the outside using just one of the optical sensors 810 or 812.

Next, referring to FIGS. 14B and 16, the first remote control apparatus 500 according to another embodiment of the present invention, which may be used in the erase mode of the touch pen mode, may include the optical sensor unit 840, the controller 880, the RF module 821, the antenna 830, the home key 840, and the display unit 845.

Compared with the first remote control apparatus 500 of FIGS. 14A and 15, there is a difference in that one optical sensor 810 and one pressure sensor 813 are disposed in the optical sensor unit 840. Hereinafter, the differences will be described.

The photosensor 810 senses light emitted from the discharge cells of the plasma display panel and outputs a timing signal based on the photosense.

The pressure sensor 813 is disposed to be spaced apart from the optical sensor 810 and senses a pressure, and outputs a pressure sensing signal based on the pressure sensing.

The control unit 880 calculates coordinate information based on the timing signal.

The controller 880 may process the received timing signal to calculate a coordinate signal in the plasma display panel. In addition, the controller 880 may perform signal conversion to transmit the calculated coordinate signal in an RF manner. In addition, the converted RF x, y coordinate signal may be output to the RF module 821.

The controller 880 may calculate area information of the opposing area where the plasma display panel and the first remote controller 500 face each other, based on the coordinate information and the pressure sensing signal. The opposing area information here may be information corresponding to the area of the first remote control apparatus 500.

That is, the controller 880 may calculate the area information to be erased based on the coordinate information and the pressure sensing signal.

The wireless communication unit including the RF module 821 and the antenna 830 may transmit the calculated coordinate information or region information based on the coordinate information to an external device, for example, the pointing signal receiver 300.

Referring to FIG. 14B, the first remote control apparatus 500 may have a rectangular shape, and the optical sensor 810 and the pressure sensor 813 may be disposed on one surface thereof so as to be spaced apart from each other.

In the drawing, the photosensor 810 is located at the center of the rectangular shape, and the pressure sensor 813 is illustrated as being located at a corner of the rectangular shape. It can be seen that the distance between the light sensor 810 and the first edge where the pressure sensor 813 is located is D, and the distance between the light sensor 810 and the second edge (intersecting the first edge) is C.

Accordingly, the controller 880 may calculate the area information to be erased based on the coordinate information from the optical sensor 810 and the pressure detection signal from the pressure sensor 813. That is, as shown in FIG. 14B, the area occupied by the first remote control apparatus 500 may be calculated as the area information to be erased.
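A Python sketch of this single-optical-sensor plus pressure-sensor case, under the assumption that the device's extent on each side of the sensor is known in electrode lines and that the pressure signal only gates whether an erase area is reported; all names and units are illustrative assumptions.

    def erase_area_from_pressure(coord, half_x, half_y, pressure_detected):
        """Return the (x_range, y_range) to erase, or None when not pressed.

        coord             : (x, y) line indices sensed by the optical sensor
        half_x, half_y    : assumed half-extents of the device face, in
                            electrode lines, around the sensor position
        pressure_detected : True when the pressure sensor confirms contact
        """
        if not pressure_detected:
            return None
        x, y = coord
        return ((x - half_x, x + half_x), (y - half_y, y + half_y))

    # Example: sensor at (X3, Y3), device spans 1 line horizontally and
    # 2 lines vertically on each side, and contact is confirmed.
    print(erase_area_from_pressure((3, 3), 1, 2, True))  # ((2, 4), (1, 5))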

On the other hand, unlike the drawing, the optical sensor 810 and the pressure sensor 813 may be arranged in various positions; preferably, they are arranged asymmetrically. When the light sensor 810 and the pressure sensor 813 are arranged asymmetrically, the controller 880 may distinguish whether the first remote control device 500 is disposed horizontally or vertically.

On the other hand, the number of optical sensors 810 and pressure sensors 813 is preferably one or more each, and it is also possible to have more than that. For example, two optical sensors and two pressure sensors may be disposed at respective corners of the first remote control apparatus 500.

Meanwhile, in the erase mode, the first remote control apparatus 500 of FIGS. 14B and 16 transmits coordinate information or area information to the outside using the optical sensor 810 and the pressure sensor 813, while in the writing mode it is possible to transmit only coordinate information to the outside using only the optical sensor 810.

FIG. 17 is a view referred to for describing light sensing in the first remote control device of FIG. 15.

Referring to the drawing, in the touch pen mode, the two optical sensors 910 and 912 in the first remote control apparatus 500 of FIG. 15 may each sense the light emitted from a specific discharge cell of the plasma display panel while in contact with or in proximity to that discharge cell. In particular, infrared (IR) light can be detected. The sensed signal S IR1 or S IR2 may be, for example, as shown in FIG. 17(a).

The amplifier 915 amplifies the optical signal S IR1 or S IR2 detected by the two optical sensors 910 and 912. To this end, the amplifier 915 may be provided with an OP AMP. The amplified signal Samp1 or Samp2 may be, for example, as shown in FIG. 17(b).

Next, the comparator 920 compares the signal Samp1 or Samp2 amplified by the amplifier 915 with the reference signal Sref, and outputs the timing signal Sf1 or Sf2 corresponding to the interval in which the amplified signal is equal to or greater than the level of the reference signal Sref. In FIG. 17(c), the interval in which the amplified signal Samp1 or Samp2 is equal to or greater than the level of the reference signal Sref is represented as a low level.

The timing signal Sf1 or Sf2 corresponds to the position of the specific discharge cell, in particular, the x, y coordinates, and is input to the controller 880 and used for the x, y coordinate calculation.

On the other hand, as the first remote control device 500 moves farther from the panel, the optical signal level decreases, and accordingly the low-level section of the timing signal may become shorter.
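The amplify-and-compare chain of FIG. 17, together with the pulse-width behavior just noted, can be sketched in Python as follows; the sample values, threshold, and active-low convention follow the description above, while the concrete numbers are illustrative assumptions.

    def to_timing_signal(samples, s_ref):
        """Comparator sketch: active-low timing signal from amplified samples.

        Outputs 0 (low) while a sample is at or above the reference level
        s_ref, and 1 (high) otherwise, mirroring FIG. 17(c).
        """
        return [0 if s >= s_ref else 1 for s in samples]

    def low_pulse_width(timing, dt):
        """Width of the first low-level section, given the sample period dt."""
        width = 0.0
        for level in timing:
            if level == 0:
                width += dt
            elif width > 0.0:
                break
        return width

    samples = [0.1, 0.2, 0.9, 1.1, 1.0, 0.3, 0.1]   # amplified light signal
    timing = to_timing_signal(samples, s_ref=0.8)    # [1, 1, 0, 0, 0, 1, 1]
    print(low_pulse_width(timing, dt=1e-6))          # 3e-06 (seconds)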

18A to 18B are views referred to for describing the operation of the first remote control device of FIG. 15.

FIG. 18A illustrates a case in which the first remote control apparatus 500 including two optical sensors is disposed vertically, and FIG. 18B illustrates a case in which the first remote control apparatus 500 is arranged horizontally.

Meanwhile, as described above, the first remote control apparatus 500 can obtain information corresponding to the vertical coordinate (y coordinate) of the touch position based on the vertical address discharge generated in the vertical scan address period (VSAP), that is, the vertical address light, and information corresponding to the horizontal coordinate (x coordinate) of the touch position based on the horizontal address discharge generated in the horizontal scan address period (HSAP), that is, the horizontal address light.

First, FIG. 18A illustrates that the position of the first optical sensor 810 of the first remote control apparatus 500 is (X3, Y3) and the position of the second optical sensor 812 is (X3, Y1).

The first optical sensor 810 detects the vertical address light generated in the third scan electrode line Y3 during the vertical scan subfield VSSF of the scan subfield, and senses the horizontal address light generated in the third address electrode line X3 during the horizontal scan subfield HSSF of the scan subfield.

In particular, it can be seen that the vertical coordinate of the touch position is Y3 based on the sensing timing of the vertical address light generated in the third scan electrode line Y3, and that the horizontal coordinate of the touch position is X3 based on the sensing timing of the horizontal address light generated in the third address electrode line X3.

The second optical sensor 812 of the first remote control apparatus 500 detects the vertical address light generated in the first scan electrode line Y1 during the vertical scan subfield VSSF of the scan subfield, and senses the horizontal address light generated in the third address electrode line X3 during the horizontal scan subfield HSSF of the scan subfield.

In particular, it can be seen that the vertical coordinate of the touch position is Y1 based on the sensing timing of the vertical address light generated in the first scan electrode line Y1, and that the horizontal coordinate of the touch position is X3 based on the sensing timing of the horizontal address light generated in the third address electrode line X3.

Accordingly, using the (X3, Y3) coordinate information from the first optical sensor 810 and the (X3, Y1) coordinate information from the second optical sensor 812, the control unit 170 can calculate the area corresponding to (X2 to X4) and (Y1 to Y5) as the area information of the first remote control apparatus 500. The image display apparatus 100 may display the image of the corresponding area so as to be erased based on the area information.

Next, FIG. 18B illustrates that the position of the first optical sensor 810 of the first remote control apparatus 500 is (X3, Y3) and the position of the second optical sensor 812 is (X5, Y3).

The first photosensor 810 detects the vertical address light generated by the third scan electrode line Y3 and the horizontal address light generated by the third address electrode line X3.

The second optical sensor 812 senses the vertical address light generated by the third scan electrode line Y3 and the horizontal address light generated by the fifth address electrode line X5.

Accordingly, using the (X3, Y3) coordinate information from the first optical sensor 810 and the (X5, Y3) coordinate information from the second optical sensor 812, the control unit 170 can calculate the area corresponding to (X1 to X5) and (Y2 to Y4) as the area information of the first remote control apparatus 500. The image display apparatus 100 may display the image of the corresponding area so as to be erased based on the area information.

Comparing FIGS. 18A and 18B, the coordinate information from the first optical sensor 810 is the same, (X3, Y3), while the coordinate information from the second optical sensor 812 differs, (X3, Y1) versus (X5, Y3). It can thus be seen that, in order to calculate the area information of the first remote control apparatus 500 according to an embodiment of the present invention, the coordinate information of the second optical sensor 812 is needed in addition to the coordinate information of the first optical sensor 810.
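A Python sketch reproducing the regions of FIGS. 18A and 18B from the two sensed coordinates, under the assumption that the first optical sensor sits at the center of the device face and the second at a corner (as in FIG. 14A), and that the half-extent along the remaining axis comes from stored size information; the default value of one electrode line is chosen only to match the ranges above and is not taken from the disclosure.

    def erase_region(center, corner, short_half_extent=1):
        """Erase region from two sensor coordinates.

        The span along the axis where the two coordinates differ comes from
        that difference; the span along the other axis comes from the stored
        size of the device (short_half_extent electrode lines on each side
        of the center).
        """
        dx = abs(center[0] - corner[0])
        dy = abs(center[1] - corner[1])
        half_x = dx if dx else short_half_extent
        half_y = dy if dy else short_half_extent
        return ((center[0] - half_x, center[0] + half_x),
                (center[1] - half_y, center[1] + half_y))

    # FIG. 18A: sensors at (X3, Y3) and (X3, Y1)
    print(erase_region((3, 3), (3, 1)))   # ((2, 4), (1, 5))
    # FIG. 18B: sensors at (X3, Y3) and (X5, Y3)
    print(erase_region((3, 3), (5, 3)))   # ((1, 5), (2, 4))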

As described with reference to FIG. 17, as the distance between the first remote control apparatus 500 and the panel increases, the low-level width of the timing signal input to the controller 880 becomes smaller. Using this, the controller 880 may calculate area intensity information separately from the area information. That is, as the distance between the first remote control apparatus 500 and the panel increases, the area intensity information may be set lower and transmitted to the outside. As a result, the image display apparatus 100 may erase the image of the corresponding area in proportion to the area intensity information. That is, the farther the first remote control device 500 is from the panel, the less the image is erased.
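A Python sketch of the area intensity calculation, assuming the low-level pulse width measured at contact is available as a reference and that the mapping to intensity is linear; both assumptions are illustrative, not taken from the disclosure.

    def area_intensity(low_width, contact_width):
        """Map the measured low-level pulse width to an erase intensity in
        [0.0, 1.0]; contact_width is the width observed when the device
        touches the panel (assumed reference)."""
        if contact_width <= 0:
            return 0.0
        return max(0.0, min(1.0, low_width / contact_width))

    # The farther from the panel, the narrower the pulse, the weaker the erase.
    print(area_intensity(3e-6, 3e-6))    # 1.0 -> fully erased
    print(area_intensity(1.5e-6, 3e-6))  # 0.5 -> partially erased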

19 is a view referred to for describing the operation of the first remote control apparatus according to another embodiment of the present invention.

Referring to the drawings, the first remote control apparatus 500 according to an embodiment of the present invention may be used in a write mode rather than an erase mode.

Accordingly, the coordinate information or the area information output from the first remote control apparatus 500 may be used for displaying a predetermined image.

In the figure, a specific image 1710 corresponding to the area information of the first remote control apparatus 500 is illustrated as being displayed. That is, the first remote control device 500 can be used like a painting tool.

20 is a view referred to for describing the operation of the second remote control apparatus according to another embodiment of the present invention.

Unlike the above, the second remote control apparatus 200 may also be used in the erase mode of the touch pen mode.

When the second remote control apparatus 200 includes one optical sensor 720, the controller 280 may calculate the area information of the region where the second remote control device is located, using the timing signal of the optical sensor 720 and the size information of the second remote control apparatus, and may control the area information to be transmitted to the outside. Accordingly, the second remote control apparatus 200 can be used in the erase mode.
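A Python sketch of this single-sensor case, assuming the stored size information is expressed as half-extents in electrode lines around the sensed coordinate; the names and values are illustrative assumptions.

    def area_from_single_sensor(coord, half_x, half_y):
        """Erase area centered on the sensed (x, y) line indices, using the
        known size of the second remote control apparatus (half-extents in
        electrode lines)."""
        x, y = coord
        return ((x - half_x, x + half_x), (y - half_y, y + half_y))

    # Example: light sensed at (X4, Y2), device covers +/-1 line each way.
    print(area_from_single_sensor((4, 2), 1, 1))  # ((3, 5), (1, 3))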

In the drawing, the image of the region 1805, corresponding to the region where the second remote control apparatus 200 is located, is erased and displayed on the image display apparatus 100.

Meanwhile, an object 1815 indicating the erasing mode (for entering the erasing mode) and an object 1818 indicating the writing mode may be displayed on the image display apparatus 100.

For example, when the object 1815 indicating the erasing mode is selected, that is, when the light of the discharge cells in the object 1815 is sensed using the second remote control apparatus 200, the erase mode of the touch pen mode may be entered.

In addition, the second remote control apparatus 200 can also be equipped with two optical sensors. One optical sensor may be disposed at the front end of the second remote control apparatus 200, and the other at the rear end.

In the erase mode of the touch pen mode, when the side surface of the pen-shaped second remote controller is placed against the display panel, the image corresponding to the area where the second remote controller is located may be erased and displayed, based on the light detection by the two optical sensors.

The remote control apparatus and the image display apparatus including the same according to the present invention are not limited to the configuration and method of the embodiments described above; all or some of the embodiments may be selectively combined so that various modifications can be made.

On the other hand, the operation method of the remote control apparatus or the image display apparatus of the present invention can be implemented as processor-readable code on a processor-readable recording medium provided in the image display apparatus. The processor-readable recording medium includes all kinds of recording devices in which data that can be read by the processor is stored. Examples of the processor-readable recording medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like, and it may also be implemented in the form of a carrier wave such as transmission over the Internet. In addition, the processor-readable recording medium may be distributed over network-connected computer systems so that code readable by the processor can be stored and executed in a distributed fashion.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention.

Claims (20)

In the remote control device for sensing the light emitted from the discharge cells of the plasma display panel,
A first photosensor which senses the light and outputs a first timing signal based on the photosense;
A second optical sensor disposed to be spaced apart from the first optical sensor and sensing the light and outputting a second timing signal based on the light sensing;
A controller configured to calculate first coordinate information based on the first timing signal, and calculate second coordinate information based on the second timing signal; And
And a wireless communication unit which transmits the calculated first and second coordinate information or area information based on the first and second coordinate information to the outside.
The method of claim 1,
The control unit,
And based on the first and second coordinate information, calculate area information of an opposing area where the plasma display panel and the remote control apparatus face each other.
The method of claim 1,
The control unit,
In the erase mode of the touch pen mode,
The remote control device characterized in that the controller controls the area information to be transmitted to the outside.
The method of claim 1,
The control unit,
In the writing mode of the touch pen mode,
Controlling to transmit the calculated first and second coordinate information or region information based on the first and second coordinate information to the outside,
In the writing mode of the touch pen mode,
And controlling to transmit any one of the first coordinate information and the second coordinate information to the outside.
The method of claim 1,
The first and second optical sensors,
And detecting vertical address light emitted during the vertical scan subfield period of the plasma display panel, and sensing horizontal address light emitted during the horizontal scan subfield period.
In the remote control device for sensing the light emitted from the discharge cells of the plasma display panel,
A first optical sensor for sensing the light and outputting a timing signal based on the light sensing;
A pressure sensor disposed to be spaced apart from the first optical sensor and configured to sense a pressure and output a pressure sensing signal based on the pressure sensing; And
A controller configured to calculate coordinate information based on the timing signal; And
And a wireless communication unit for transmitting the coordinate information or the area information based on the coordinate information to the outside.
The method according to claim 6,
The control unit,
And based on the coordinate information and the pressure sensing signal, calculate area information of an opposing area facing the plasma display panel and the remote control device.
The method according to claim 6,
The control unit,
In the erase mode of the touch pen mode,
The remote control device characterized in that the controller controls the area information to be transmitted to the outside.
The method according to claim 6,
The control unit,
In the writing mode of the touch pen mode,
The remote control device characterized in that the controller controls the coordinate information to be transmitted to the outside.
The method according to claim 6,
The optical sensor,
And detecting vertical address light emitted during the vertical scan subfield period of the plasma display panel, and sensing horizontal address light emitted during the horizontal scan subfield period.
A plasma display panel having a plurality of discharge cells, in the touch pen mode, sequentially emitting vertical address light during a vertical scan subfield period, and sequentially emitting horizontal address light during a horizontal scan subfield period; And
An interface unit for receiving an image signal corresponding to the position of the first remote control apparatus;
And a controller configured to erase an image displayed in correspondence with the position of the first remote controller in the erase mode of the touch pen mode.
12. The method of claim 11,
The control unit,
And displaying a predetermined image corresponding to the position of the second remote control device on the plasma display panel in the write mode of the touch pen mode.
12. The method of claim 11,
The control unit,
And displaying the predetermined image corresponding to the position of the first remote control device on the plasma display panel in the write mode of the touch pen mode.
12. The method of claim 11,
A pointing signal receiver configured to receive coordinate information from the first remote controller; And
And a pointing signal processor for outputting a predetermined image signal based on the coordinate information received from the pointing signal receiving unit.
12. The method of claim 11,
The first remote control device,
A first photosensor which senses the light and outputs a first timing signal based on the photosense;
A second optical sensor disposed to be spaced apart from the first optical sensor and sensing the light and outputting a second timing signal based on the light sensing; And
And a wireless communication unit configured to transmit first coordinate information based on the first timing signal, second coordinate information based on the second timing signal, or region information based on the first and second coordinate information to the outside. An image display device.
12. The method of claim 11,
The first remote control device,
A first photosensor which senses the light and outputs a first timing signal based on the photosense;
A pressure sensor disposed to be spaced apart from the first optical sensor and configured to sense a pressure and output a pressure sensing signal based on the pressure sensing; And
And a wireless communication unit which transmits coordinate information based on the timing signal or region information based on the coordinate information to the outside.
17. The method according to claim 15 or 16,
The area information is,
And area information of an opposing area where the plasma display panel and the remote control device face each other.
12. The method of claim 11,
The plasma display panel,
And, in the touch pen mode, emits at least one synchronous sustain light during the scan sustain period between the vertical scan subfield period and the horizontal scan subfield period.
19. The method of claim 18,
The plasma display panel,
And further emits identification sustain light during the scan sustain period between the vertical scan subfield period and the horizontal scan subfield period.
The method of claim 12,
The second remote control device,
An optical sensor for sensing the light and outputting a timing signal based on the light sensing; And
And a wireless communication unit which transmits coordinate information based on the timing signal to the outside.
KR1020110133125A 2011-12-12 2011-12-12 Remote control device, and image display apparatus including the same KR20130066339A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020110133125A KR20130066339A (en) 2011-12-12 2011-12-12 Remote control device, and image display apparatus including the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020110133125A KR20130066339A (en) 2011-12-12 2011-12-12 Remote control device, and image display apparatus including the same

Publications (1)

Publication Number Publication Date
KR20130066339A true KR20130066339A (en) 2013-06-20

Family

ID=48862652

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020110133125A KR20130066339A (en) 2011-12-12 2011-12-12 Remote control device, and image display apparatus including the same

Country Status (1)

Country Link
KR (1) KR20130066339A (en)

Similar Documents

Publication Publication Date Title
US10706774B2 (en) Image display apparatus
KR20120136628A (en) Apparatus for displaying image and method for operating the same
US20180364881A1 (en) Image display apparatus
EP2685354A2 (en) Touch display device and multi-touch display device
KR20150117018A (en) Computing apparatus, method for controlling computing apparatus thereof, and multi-display system
KR101577331B1 (en) Display apparatus and method for operating the same
KR101233215B1 (en) Image display apparatus and method for operating the same
KR102333764B1 (en) Image display apparatus
US20130113725A1 (en) Display system and control method thereof
US20220319458A1 (en) Display device and method for operating same
KR102431503B1 (en) Image display apparatus
US11984087B2 (en) Display device and method of performing local dimming thereof
KR102483086B1 (en) Image display apparatus
KR20130053513A (en) Image display apparatus and method for operating the same
KR102313306B1 (en) Image display apparatus, and mobile termial
US10803793B2 (en) Image display apparatus
KR20130066339A (en) Remote control device, and image display apparatus including the same
EP2568360B1 (en) Electronic chalkboard system, control method thereof, and pointing device
KR20120138988A (en) Remote control device, and image display apparatus including the same
KR20120140025A (en) Method for power management in relation to a remote control device and image display apparatus including the same
KR101595573B1 (en) Space remote controller Display device and controlling method thereof
KR102254857B1 (en) Image display apparatus
KR20130025243A (en) Image display apparatus and method for operating the same
US10488576B2 (en) Image display apparatus
KR20120109895A (en) Remote control device, and image display apparatus including the same

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination