KR20130025243A - Image display apparatus and method for operating the same - Google Patents

Image display apparatus and method for operating the same

Info

Publication number
KR20130025243A
Authority
KR
South Korea
Prior art keywords
screen
signal
image
display
movement
Prior art date
Application number
KR1020110088649A
Other languages
Korean (ko)
Inventor
박현대
전진욱
김지성
Original Assignee
엘지전자 주식회사
Priority date
Filing date
Publication date
Application filed by 엘지전자 주식회사
Priority to KR1020110088649A
Publication of KR20130025243A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545 - Pens or stylus
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038 - Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F3/0383 - Signal control means within the pointing device
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/22 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources
    • G09G3/28 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources using luminous gas-discharge panels, e.g. plasma panels
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/038 - Indexing scheme relating to G06F3/038

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Plasma & Fusion (AREA)
  • Computer Hardware Design (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
  • Position Input By Displaying (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present invention relates to an image display apparatus and an operating method thereof. An operating method of an image display apparatus according to an embodiment of the present invention is a method of operating an image display apparatus that displays an image using a remote control device which senses light emitted from a discharge cell of a plasma display panel. The method includes receiving pointer coordinate information corresponding to movement of the remote control device, displaying an image corresponding to the movement of the remote control device on a display, receiving a screen switching signal or a screen movement signal, and, based on the screen switching signal or the screen movement signal, switching the image screen displayed on the display or moving the screen so that a part of the image is displayed. Accordingly, screen switching can be performed easily in the touch pen method.

Description

Image display apparatus and method for operating the same

The present invention relates to an image display apparatus and a method of operating the same, and more particularly, to an image display apparatus and an operating method thereof that can easily switch screens in a touch pen method which senses light emitted from a discharge cell of a plasma display panel.

The image display device is a device having a function of displaying an image that a user can watch. The user can watch the broadcast through the image display device. A video display device displays a broadcast selected by a user among broadcast signals transmitted from a broadcast station on a display. Currently, broadcasting is shifting from analog broadcasting to digital broadcasting worldwide.

Digital broadcasting refers to broadcasting for transmitting digital video and audio signals. Digital broadcasting is more resistant to external noise than analog broadcasting, so it has less data loss, is advantageous for error correction, has a higher resolution, and provides a clearer picture. In addition, unlike analog broadcasting, digital broadcasting is capable of bidirectional services.

Meanwhile, research is being conducted on remote control devices for remotely controlling the image display device.

An object of the present invention is to provide an image display apparatus and an operation method thereof that can easily switch screens in a touch pen method for sensing light emitted from a discharge cell of a plasma display panel.

To achieve the above object, an operating method of an image display device according to an embodiment of the present invention is a method of operating an image display device that displays an image using a remote control device which senses light emitted from discharge cells of a plasma display panel. In the touch pen mode, the method includes receiving pointer coordinate information corresponding to movement of the remote control device, displaying an image corresponding to the movement of the remote control device on a display, receiving a screen switching signal or a screen movement signal, and, based on the screen switching signal or the screen movement signal, switching the image screen displayed on the display or moving the screen so that a part of the image is displayed.

To achieve the above object, an image display device according to an embodiment of the present invention includes a plasma display panel provided with a plurality of discharge cells, which, in the touch pen mode, sets at least one of the plurality of subfields constituting a frame as a scan subfield for detecting the coordinates of a discharge cell at the remote control device and sequentially emits vertical address light and horizontal address light during the scan subfield period, and a controller which controls a predetermined image corresponding to the position of the remote control device to be displayed on the plasma display panel and, based on a screen switching signal or a screen movement signal, controls the image screen displayed on the plasma display panel to be switched or controls the screen to be moved so that a part of the image is displayed.

According to an embodiment of the present invention, the image screen displayed on the display is switched, or the screen is moved so that a part of the image is displayed, based on the screen switching signal or the screen movement signal, so that screen switching can be performed easily in the touch pen method. Accordingly, unlimited (infinite) writing becomes possible.

In particular, the image screen displayed on the display is switched, or the screen is moved so that a part of the image is displayed, according to the movement of the user or the degree of movement of the remote control device, so that screen movement suited to the user's position and the like can be performed.

Meanwhile, when the displayed image reaches a boundary area of the display, the image screen displayed on the display is switched, or the screen is moved so that a part of the image is displayed, so that screen movement and the like can be performed according to the displayed image.

Meanwhile, after the screen is switched or moved, an object indicating the location of the currently displayed area within the entire image area is displayed on a portion of the display, so that the position of the displayed area can be identified simply.

Meanwhile, according to a movement input applied to an image or object displayed on the display, the displayed image or object is moved and the screen movement is performed accordingly, so that the user can easily move the image or object.

Meanwhile, the screen setting object may be used to select a desired screen among a plurality of screens, thereby allowing the user to move directly to a desired area.

In addition, various user interfaces are possible in the touch pen mode, thereby improving user convenience.

FIG. 1 is a block diagram of an image display apparatus according to an embodiment of the present invention.
FIGS. 2 and 3 illustrate various examples of an internal block diagram of the image display apparatus of FIG. 1.
FIG. 4 is a diagram illustrating an example of the interior of the display of FIG. 2.
FIG. 5 is an internal block diagram of the controller of FIG. 2.
FIG. 6 is a view referred to for explaining an example of an operation of the remote control device for controlling the image display apparatus.
FIG. 7 is an internal view of the remote control device of FIG. 2.
FIG. 8 shows an example of an internal block diagram of the remote control device of FIG. 2 and a simplified internal block diagram of a pointing signal receiving device.
FIG. 9 is a view referred to for explaining light sensing in the remote control device.
FIGS. 10 to 12 illustrate an operation of the plasma display panel in the touch pen mode according to an embodiment of the present invention.
FIG. 13 is a flowchart illustrating a method of operating an image display apparatus according to an exemplary embodiment of the present invention.
FIGS. 14 to 22B are views referred to for describing the operating method of the image display apparatus of FIG. 13.

Hereinafter, the present invention will be described in more detail with reference to the drawings.

The suffix "module" and " part "for components used in the following description are given merely for convenience of description, and do not give special significance or role in themselves. Accordingly, the terms "module" and "part" may be used interchangeably.

FIG. 1 is a block diagram of an image display apparatus according to an embodiment of the present invention.

Referring to FIG. 1, an image display apparatus 100 according to an exemplary embodiment of the present invention may constitute an image display system together with a touch pen type remote control apparatus 200, a pointing signal receiving apparatus 300, and a pointing signal processing apparatus 400.

The image display apparatus 100 may include a plasma display panel to enable a touch pen method. The plasma display panel includes a phosphor layer formed in a discharge cell divided by a partition wall, and includes a plurality of electrodes.

When a drive signal is supplied to each electrode of the plasma display panel, a discharge is generated in the discharge cell by the supplied drive signal. When the discharge occurs, the discharge gas filled in the discharge cell generates vacuum ultraviolet rays, and the vacuum ultraviolet rays excite the phosphor formed in the discharge cell so that visible light is emitted. The visible light displays an image on the screen of the plasma display panel.

Meanwhile, an inert mixed gas such as He + Xe, Ne + Xe, He + Ne + Xe, or the like may be injected into the discharge space in the discharge cell of the plasma display panel.

In the gas discharge described above, in addition to emitting visible light, the plasma display panel also emits infrared rays by xenon (Xe).

According to an embodiment of the present invention, the touch pen type remote control device 200 senses light emitted from a discharge cell of the plasma display panel, specifically infrared (IR) light. For example, when the remote control device 200 approaches or contacts a specific discharge cell of the plasma display panel, it outputs a timing signal based on the detected light and calculates the x, y coordinate signal of the corresponding discharge cell based on the timing signal. The calculated x, y coordinate signal of the discharge cell is converted into an RF signal and transmitted to the pointing signal receiving apparatus 300.

The pointing signal receiving apparatus 300 receives the RF x, y coordinate signal and transmits it to the pointing signal processing apparatus 400. To this end, the pointing signal receiving apparatus 300 may include an antenna for receiving the RF signal and an RF module for processing it. The received x, y coordinate signal may be transmitted to the pointing signal processing apparatus 400 by wire or wirelessly. For example, the pointing signal receiving apparatus 300 may be a USB dongle or a Bluetooth dongle.

The pointing signal processing apparatus 400 processes the received x, y coordinate signal and transmits a corresponding image signal to the image display apparatus 100. As a result, the image display apparatus 100, specifically the plasma display panel, displays a predetermined image (a pointing image or the like) in the specific discharge cell, that is, in the discharge cell corresponding to the x, y coordinates.

Meanwhile, the pointing signal processing apparatus 400 may include a program for executing the touch pen mode, and may execute the program to perform signal processing and transmission for the received x, y coordinates. For example, the pointing signal processing apparatus 400 may be a computer or the like.
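
The chain described above (remote control device 200, pointing signal receiving apparatus 300, pointing signal processing apparatus 400, image display apparatus 100) can be summarized in a minimal sketch of the data flow. All class and function names below are hypothetical; the patent does not define a software interface.

```python
# Minimal sketch of the coordinate flow described above: remote control device ->
# pointing signal receiver -> pointing signal processor -> image display apparatus.
# Names are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class CoordinateSignal:
    x: int  # address electrode (column) index of the sensed discharge cell
    y: int  # scan electrode (row) index of the sensed discharge cell

class TouchPenRemote:
    """Models the remote control device 200: senses IR and reports coordinates over RF."""
    def sense_and_report(self, sensed_x: int, sensed_y: int) -> CoordinateSignal:
        # In the real device x, y come from timing measurements on the detected
        # address light (see FIGS. 10 to 12); here they are passed in directly.
        return CoordinateSignal(sensed_x, sensed_y)

class PointingSignalReceiver:
    """Models the pointing signal receiving apparatus 300 (e.g. a USB or Bluetooth dongle)."""
    def receive(self, rf_packet: CoordinateSignal) -> CoordinateSignal:
        return rf_packet  # forwards the RF coordinate signal by wire or wirelessly

class PointingSignalProcessor:
    """Models the pointing signal processing apparatus 400 (e.g. a computer)."""
    def to_image_signal(self, coord: CoordinateSignal) -> dict:
        # Produces a predetermined image (a pointing/writing image) at the coordinate.
        return {"draw_at": (coord.x, coord.y), "shape": "dot"}

# Usage: a stroke is a sequence of sensed coordinates turned into image signals.
remote, receiver, processor = TouchPenRemote(), PointingSignalReceiver(), PointingSignalProcessor()
for x, y in [(10, 20), (11, 20), (12, 21)]:
    image_signal = processor.to_image_signal(receiver.receive(remote.sense_and_report(x, y)))
    print(image_signal)  # the display would light the corresponding discharge cells
```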

In this manner, by using the pen-shaped remote control apparatus 200, a predetermined image (a pointing image or the like) can be displayed at specific coordinates of the display panel in a contact or non-contact manner. That is, when the remote control apparatus 200 is moved over the plasma display panel of the image display apparatus 100 like a pen for handwriting, writing is performed along the movement path.

In an embodiment of the present invention, such a remote control device is referred to as a touch pen type remote control device, and the touch pen mode according to the embodiment of the present invention is distinguished from a contact-based touch mode such as a capacitive touch mode.

Meanwhile, although the image display apparatus 100, the pointing signal receiving apparatus 300, and the pointing signal processing apparatus 400 are illustrated separately in the drawing, at least the pointing signal processing apparatus 400, among the pointing signal receiving apparatus 300 and the pointing signal processing apparatus 400, may be provided inside the image display apparatus 100. As a result, the touch pen mode can be easily performed within a single image display apparatus.

FIGS. 2 and 3 illustrate various examples of an internal block diagram of the image display apparatus of FIG. 1.

First, referring to FIG. 2, the video display device 100 according to an embodiment of the present invention may include a broadcast receiving unit 105, an external device interface unit 130, a network interface unit 135, and a storage unit 140. , A user input interface unit 150, a controller 170, a display 180, an audio output unit 185, and a power supply unit 190.

The broadcast receiving unit 105 may include a tuner 110, a demodulator 120, and a network interface unit 135. Of course, as needed, it may be designed to include the tuner 110 and the demodulator 120 without the network interface unit 135, or conversely to include the network interface unit 135 without the tuner 110 and the demodulator 120.

The tuner 110 selects an RF broadcast signal corresponding to a channel selected by a user or all pre-stored channels among RF (Radio Frequency) broadcast signals received through an antenna. Also, the selected RF broadcast signal is converted into an intermediate frequency signal, a baseband image, or a voice signal.

The demodulator 120 receives the digital IF signal DIF converted by the tuner 110 and performs a demodulation operation.

The demodulation unit 120 may perform demodulation and channel decoding, and then output a stream signal TS. In this case, the stream signal may be a signal multiplexed with a video signal, an audio signal, or a data signal.

The stream signal output from the demodulator 120 may be input to the controller 170. After performing demultiplexing, image / audio signal processing, and the like, the controller 170 outputs an image to the display 180 and outputs audio to the audio output unit 185.

The external device interface unit 130 may connect the external device to the image display device 100. To this end, the external device interface unit 130 may include an A / V input / output unit (not shown) or a wireless communication unit (not shown).

The external device interface unit 130 may be connected to an external device such as a digital versatile disk (DVD), a Blu-ray, a game device, a camera, a camcorder, a computer (laptop), or the like by wire / wireless.

The A / V input / output unit may receive a video and audio signal of an external device. The wireless communication unit may perform short range wireless communication with another electronic device.

In addition, the external device interface unit 130 may be connected through at least one of the various set top boxes and the various terminals described above to perform input / output operations with the set top box.

The external device interface unit 130 may transmit / receive data with the pointing signal processing apparatus 400.

The network interface unit 135 provides an interface for connecting the image display apparatus 100 to a wired / wireless network including an internet network. For example, the network interface unit 135 may receive content or data provided by the Internet or a content provider or a network operator through a network.

The storage 140 may store a program for processing and controlling each signal in the controller 170, or may store a signal-processed video, audio, or data signal.

In addition, the storage unit 140 may perform a function for temporarily storing an image, audio, or data signal input to the external device interface unit 130. In addition, the storage 140 may store information on a predetermined broadcast channel through a channel storage function such as a channel map.

Although the storage unit 140 of FIG. 2 is provided separately from the control unit 170, the scope of the present invention is not limited thereto. The storage 140 may be included in the controller 170.

The user input interface unit 150 transmits a signal input by the user to the control unit 170 or a signal from the control unit 170 to the user.

For example, the user input interface unit 150 may transmit and receive user input signals such as power on/off, channel selection, and screen setting to and from the remote control apparatus 200, may transmit to the controller 170 a user input signal input through a local key (not shown) such as a power key, a channel key, a volume key, or a setting key, may transmit to the controller 170 a user input signal input from a sensing unit (not shown) that senses a user's gesture, or may transmit a signal from the controller 170 to the sensing unit (not shown).

The controller 170 may demultiplex a stream input through the tuner 110, the demodulator 120, or the external device interface unit 130, or may process the demultiplexed signals, to generate and output signals for video or audio output.

The image signal processed by the controller 170 may be input to the display 180 and displayed as an image corresponding to the image signal. In addition, the image signal processed by the controller 170 may be input to the external output device through the external device interface unit 130.

The voice signal processed by the controller 170 may be sound output to the audio output unit 185. In addition, the voice signal processed by the controller 170 may be input to the external output device through the external device interface unit 130.

Although not shown in FIG. 2, the controller 170 may include a demultiplexer, an image processor, and the like. This will be described later with reference to FIG. 5.

In addition, the controller 170 may control overall operations of the image display apparatus 100. For example, the controller 170 may control the tuner 110 to tune to an RF broadcast corresponding to a channel selected by the user or a pre-stored channel.

In addition, the controller 170 may control the image display apparatus 100 by a user command or an internal program input through the user input interface unit 150.

The controller 170 may control the display 180 to display an image. In this case, the image displayed on the display 180 may be a still image or a video, and may be a 2D image or a 3D image.

The display 180 converts the video signal, data signal, and OSD signal processed by the controller 170, or the video signal, data signal, and control signal received from the external device interface unit 130, to generate driving signals.

In the following, it is assumed that the display 180 is based on a plasma display panel capable of the touch pen method according to an embodiment of the present invention.

The audio output unit 185 receives a signal processed by the controller 170 and outputs the audio signal.

Meanwhile, in order to detect a gesture of the user, as described above, a sensing unit (not shown) including at least one of a touch sensor, a voice sensor, a position sensor, and a motion sensor may be further provided in the image display apparatus 100. The signal detected by the sensing unit (not shown) is transmitted to the controller 170 through the user input interface unit 150.

The controller 170 may detect a gesture of the user by using an image captured by a photographing unit (not shown), a signal detected by the sensing unit (not shown), or a combination thereof.

The power supply unit 190 supplies power throughout the image display apparatus 100. In particular, power may be supplied to the controller 170, which may be implemented in the form of a System on Chip (SoC), the display 180 for displaying an image, and the audio output unit 185 for audio output.

To this end, the power supply unit 190 may include a converter (not shown) for converting AC power into DC power, and may further include a DC/DC converter for converting the level of the DC power and outputting the level-converted DC power.

The remote control apparatus 200 transmits user input to the user input interface unit 150. In particular, according to an embodiment of the present invention, the remote control apparatus 200 detects light emitted from a specific discharge cell of the plasma display panel, and the corresponding coordinate information is used, via the pointing signal receiving apparatus 300 and the pointing signal processing apparatus 400, to cause an image signal to be input to the image display apparatus 100.

Next, the image display apparatus 100 of FIG. 3 is similar to that of FIG. 2, except that the pointing signal receiving apparatus 300 and the pointing signal processing apparatus 400 of FIG. 2 are provided inside the image display apparatus 100.

Accordingly, coordinate information based on the optical signal sensed by the remote control apparatus 200 may be input to the pointing signal receiver 300 and the pointing signal processor 400 in the image display apparatus 100. The pointing signal processor 400 may generate an image signal based on the coordinate information and transmit the image signal to the controller 170. The controller 170 may control a predetermined image corresponding to the image signal to be displayed on the plasma display panel. Meanwhile, the program described with reference to FIG. 1 may be installed in the pointing signal processor 400. Also, unlike FIG. 3, the pointing signal receiver 300 and the pointing signal processor 400 may be provided in the user input interface unit 150.

Meanwhile, the above-described image display apparatus 100 may be a digital broadcast receiver capable of receiving fixed or mobile digital broadcasting.

Meanwhile, the image display device described in this specification may include a TV receiver, a mobile phone, a smartphone, a notebook computer, a digital broadcasting terminal, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), and the like.

Meanwhile, a block diagram of the image display apparatus 100 shown in FIGS. 2 to 3 is a block diagram for an embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted according to the specifications of the image display apparatus 100 that is actually implemented. That is, two or more constituent elements may be combined into one constituent element, or one constituent element may be constituted by two or more constituent elements, if necessary. In addition, the functions performed in each block are intended to illustrate the embodiments of the present invention, and the specific operations and apparatuses do not limit the scope of the present invention.

FIG. 4 is a diagram illustrating an example of the interior of the display of FIG. 2.

Referring to the drawing, the plasma display panel based display 180 includes a plasma display panel 210 and a driving circuit 230.

The plasma display panel 210 includes scan electrodes Y and sustain electrodes Z formed on a first substrate in parallel with each other, and address electrodes X formed on a second substrate so as to intersect the scan electrodes Y and the sustain electrodes Z.

In order to display an image, a plurality of scan electrode lines Y, a sustain electrode line Z, and an address electrode line X are arranged to cross each other in a matrix form, and discharge cells are formed in the crossing regions. Meanwhile, the discharge cells may be generated for each of R, G, and B.

The driving circuit unit 230 drives the plasma display panel 210 through a control signal and a data signal supplied from the controller 170 of FIG. 1. To this end, the driving circuit unit 230 includes a timing controller 232, a scan driver 234, a sustain driver 238, and an address driver 236. The operation of the scan driver 234, the sustain driver 238, and the address driver 236 will be described later with reference to FIG. 9 or below.

The timing controller 232 receives a control signal, R, G, B data signals, a vertical synchronization signal Vsync, and the like from the controller 170, controls the scan driver 234 and the sustain driver 238 in response to the control signal, and rearranges the R, G, B data signals and provides them to the address driver 236.

The power supply unit 190 may supply a plurality of levels of DC power required for the plasma display panel 210 to the scan driver 234, the sustain driver 238, and the address driver 236, respectively.

FIG. 5 is an internal block diagram of the controller of FIG. 2.

Referring to the drawing, the controller 170 according to an embodiment of the present invention may include a demultiplexer 410, an image processor 420, an OSD generator 440, a mixer 445, a frame rate converter 450, and a formatter 460. In addition, it may further include a voice processor (not shown) and a data processor (not shown).

The demultiplexer 410 demultiplexes the input stream. For example, when an MPEG-2 TS is input, it may be demultiplexed and separated into video, audio, and data signals, respectively. Here, the stream signal input to the demultiplexer 410 may be a stream signal output from the tuner 110, the demodulator 120, or the external device interface unit 130.

The image processor 420 may perform image processing of the demultiplexed image signal. To this end, the image processor 420 may include an image decoder 425 and a scaler 435.

The image decoder 425 decodes the demultiplexed image signal, and the scaler 435 performs scaling to output the resolution of the decoded image signal on the display 180.

The video decoder 425 may include decoders of various standards.

The OSD generator 440 generates an OSD signal according to a user input or itself. For example, a signal for displaying various types of information on a screen of the display 180 as a graphic or text may be generated based on a user input signal. The generated OSD signal may include various data such as a user interface screen, various menu screens, widgets, and icons of the image display apparatus 100. In addition, the generated OSD signal may include a 2D object or a 3D object.

The mixer 445 may mix the OSD signal generated by the OSD generator 440 and the decoded image signal processed by the image processor 420. In this case, the OSD signal and the decoded video signal may each include at least one of a 2D signal and a 3D signal. The mixed video signal is provided to the frame rate converter 450.

The frame rate converter 450 converts the frame rate of the input video. On the other hand, the frame rate converter 450 may output the data as it is without additional frame rate conversion.

The formatter 460 receives a mixed signal from the mixer 445, that is, an OSD signal and a decoded video signal, and changes the format of the signal to be suitable for the display 180. For example, the R, G, B data signals may be output, and the R, G, B data signals may be output as low voltage differential signaling (LVDS) or mini-LVDS.

The formatter 460 may separate a 2D video signal and a 3D video signal for displaying a 3D video. In addition, the format of the 3D video signal may be changed or the 2D video signal may be converted into a 3D video signal.

The voice processing unit (not shown) in the controller 170 may perform voice processing of the demultiplexed voice signal. To this end, the voice processing unit (not shown) may include various decoders.

Also, the voice processing unit (not shown) in the controller 170 may process bass, treble, volume control, and the like.

The data processor (not shown) in the controller 170 may perform data processing of the demultiplexed data signal. For example, when the demultiplexed data signal is an encoded data signal, it may be decoded. The encoded data signal may be EPG (Electronic Program Guide) information including broadcast information such as the start time and end time of a broadcast program broadcast on each channel.

Meanwhile, a block diagram of the controller 170 shown in FIG. 5 is a block diagram for an embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted according to the specification of the controller 170 that is actually implemented.

In particular, the frame rate converter 450 and the formatter 460 may not be provided in the controller 170, but may be provided separately.

FIG. 6 is a view referred to for explaining an example of an operation of the remote control device for controlling the image display apparatus.

As shown in FIG. 6(a), when the touch pen type remote control device 200 is moved from a first point to a second point on or near the plasma display panel of the display 180, an image corresponding to the movement is displayed on the display 180 as shown in FIG. 6(b). The figure illustrates that an image of a '-' shape is displayed.

As described above, in the touch pen mode, the touch pen type remote control device 200 detects infrared (IR) light emitted from a specific discharge cell of the plasma display panel and calculates the coordinates of the discharge cell based on the detected light. As a result, the image is displayed on the plasma display panel according to the calculated coordinates.

Next, as shown in FIG. 6(c), when the touch pen type remote control device 200 is moved from a third point to a fourth point on or near the plasma display panel, an image corresponding to the movement is displayed on the display 180 as shown in FIG. 6(d). As a result, the figure illustrates that an image having a 'T' shape is displayed.

Meanwhile, unlike what is illustrated in the figure, when the touch pen type remote control device 200 remains stationary at a specific discharge cell, an image of a '.' shape will be displayed on the plasma display panel.

By such a touch pen method, a user can easily display an image having a desired shape on the plasma display panel.

Hereinafter, the touch pen type remote controller 200 will be described in more detail.

FIG. 7 is an internal view of the remote control device of FIG. 2, FIG. 8 shows an example of an internal block diagram of the remote control device of FIG. 2 and a simplified internal block diagram of a pointing signal receiving device, and FIG. 9 is a view referred to for explaining light sensing in the remote control device.

Referring to FIGS. 7 to 9, the touch pen type remote control apparatus 200 includes a wireless communication unit 225, a user input unit 235, an optical sensor unit 240, an output unit 250, a power supply unit 260, a storage unit 270, and a controller 280. In addition, the touch pen type remote control device 200 may include a rotating ball 780.

The wireless communication unit 225 may include an RF module 221 or an IR module 223 for communication with the pointing signal receiving apparatus 300.

The IR module 223 or the RF module 221 transmits, to the pointing signal receiving apparatus 300 according to the IR method or the RF method, the coordinate signal (x, y) of the discharge cell calculated based on the light detected by the optical sensor unit 240. In addition, the IR module 223 or the RF module 221 may transmit a control signal such as a power on/off signal of the remote control device 200. In particular, in the embodiment of the present invention, communication with the pointing signal receiving apparatus 300 is performed through the RF module 221, and various channels may be used for stable communication.

The user input unit 235 may be configured as a keypad, a button, a touch pad, or a touch screen. The user may input a command related to the image display apparatus 100 to the remote control apparatus 200 by manipulating the user input unit 235. When the user input unit 235 includes a hard key button, the user may input a command related to the image display apparatus 100 to the remote control apparatus 200 through a push operation of the hard key button.

As illustrated in FIGS. 7 to 8, the user input unit 235 may include a power on / off key 775, a touch pen mode key 773, and the like.

For example, according to the operation of the power on / off key 775, the power of the remote controller 200 may be turned on or off, and according to the operation of the touch pen mode key 773, the remote controller 200 may be used. ) May wake up to enter touch pen mode.

Specifically, when the touch pen mode key 773 is pressed once, the remote control device 200 wakes up and enters the touch pen mode, and when it is pressed again, the touch pen mode may end.

As another example, while the touch pen mode key 773 is pressed, the remote control device 200 wakes up and enters the touch pen mode, and when the touch pen mode key 773 is no longer pressed, the touch pen mode may end.

In addition, the user input unit 235 may include various types of input means that can be operated by the user, and this embodiment does not limit the scope of the present invention.

The optical sensor unit 240 may detect light emitted from a specific discharge cell of the plasma display panel of the image display apparatus 100, for example, infrared rays. To this end, the optical sensor unit 240, as shown in FIG. 8, may include an optical sensor 710, an amplifier 715, and a comparator 720.

In the touch pen mode, when the optical sensor 710 is near or in contact with a specific discharge cell of the plasma display panel, it may detect light emitted from the corresponding discharge cell. In particular, infrared (IR) light can be detected. The detected signal SIR may be, for example, as shown in FIG. 9(a).

The amplifier 715 amplifies the optical signal SIR detected by the optical sensor 710. To this end, the amplifier 715 may include an op amp. The amplified signal Samp may be, for example, as shown in FIG. 9(b).

Next, the comparator 720 compares the signal Samp amplified by the amplifier 715 with a reference signal Sref, and outputs a timing signal Sf corresponding to the sections of the amplified signal Samp that are equal to or greater than the level of the reference signal Sref. In FIG. 9(c), the sections of the amplified signal Samp whose level is equal to or greater than the reference signal Sref level are represented as a low level.

The timing signal Sf corresponds to the position of a specific discharge cell, in particular, the x and y coordinates, and is input to the control unit 280 and used for the x and y coordinate calculation.

Referring to FIG. 9, the low-level section of the timing signal Sf corresponds to a lower-level section of the detected optical signal SIR, not its peak section. To detect the signal more accurately, the reference signal Sref level could be set higher, but depending on the surrounding environment when the infrared light is detected, the optical signal SIR detected by the optical sensor 710 may include noise. Therefore, the optical sensor unit 240 or the controller 280 may further perform signal processing on the timing signal Sf of FIG. 9(c).

For example, the falling edge and the rising edge of the timing signal Sf of FIG. 9(c) may be detected and their average taken as the low-level point. That is, the intermediate point between the falling edge and the rising edge may be set to the low level. In this way, a digital signal that closely approximates the actual waveform of the infrared signal can be obtained.
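
A small sketch of this post-processing step is shown below: the midpoint between the falling edge and the rising edge of each low-level run of Sf is used as the pulse timing. The sampled waveform is a made-up example for illustration only.

```python
# Sketch: approximate each IR pulse position as the midpoint of a low-level run of Sf.
# Sf is sampled here as 1 (high) / 0 (low); sample values are illustrative assumptions.

def low_sections(sf):
    """Yield (falling_index, rising_index) pairs for each low-level run in Sf."""
    start = None
    for i, level in enumerate(sf):
        if level == 0 and start is None:
            start = i                    # falling edge
        elif level == 1 and start is not None:
            yield (start, i)             # rising edge
            start = None
    if start is not None:
        yield (start, len(sf))

def pulse_centers(sf):
    """Return the midpoint index of each low-level section (approximate IR pulse peak)."""
    return [(fall + rise) // 2 for fall, rise in low_sections(sf)]

sf = [1, 1, 0, 0, 0, 1, 1, 1, 0, 0, 1]   # two detected pulses
print(pulse_centers(sf))                  # [3, 9] -> timings used for x, y calculation
```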

The output unit 250 may output a video or audio signal corresponding to an operation of the user input unit 235 or a signal transmitted from the image display apparatus 100. The user may recognize whether the user input unit 235 is manipulated or whether the image display apparatus 100 is controlled through the output unit 250.

For example, the output unit 250 may include an LED module 251 that lights up when the user input unit 235 is operated or when a signal is transmitted to or received from the image display apparatus 100 through the wireless communication unit 225, a vibration module 253 for generating vibration, a sound output module 255 for outputting sound, or a display module 257 for outputting an image.

The power supply unit 260 supplies power to the remote control apparatus 200. When the remote control apparatus 200 does not detect light for more than a first predetermined time, the power supply unit 260 may enter a standby mode and limit the power supplied to some modules. In addition, when light is still not detected for more than a second predetermined time in the standby mode, power waste may be reduced by stopping the power supply. The power supply unit 260 may resume power supply when a predetermined key of the remote control apparatus 200 is operated or when light sensing by the optical sensor unit 240 occurs again.
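
The power management behaviour described above amounts to a simple two-stage timeout. The sketch below illustrates it with hypothetical timeout values; the actual first and second predetermined times are design parameters not given in the patent.

```python
# Sketch of the two-stage power-down behaviour. Timeout values are assumptions.

FIRST_TIMEOUT_S = 30.0    # no light sensed -> standby (hypothetical value)
SECOND_TIMEOUT_S = 120.0  # still nothing while in standby -> power off (hypothetical value)

def power_state(seconds_since_last_light: float) -> str:
    """Return the power state of the remote control device."""
    if seconds_since_last_light < FIRST_TIMEOUT_S:
        return "active"
    if seconds_since_last_light < FIRST_TIMEOUT_S + SECOND_TIMEOUT_S:
        return "standby"      # power to some modules is limited
    return "off"              # power supply stopped to avoid waste

def on_wakeup_event() -> str:
    """A key operation or renewed light sensing resumes the power supply."""
    return "active"

for t in (5.0, 60.0, 200.0):
    print(t, power_state(t))   # active / standby / off
print(on_wakeup_event())
```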

The storage unit 270 may store various types of programs, application data, and the like necessary for controlling or operating the remote control apparatus 200. In particular, for a pairing operation with the pointing signal processing apparatus 400, information about a specific frequency band or a transmission data unit for a plurality of channels may be stored.

In the touch pen mode, the controller 280 receives, from the optical sensor unit 240, a timing signal corresponding to the light detection signal obtained by sensing light emitted from a specific discharge cell of the plasma display panel. For example, a timing signal Sf as shown in FIG. 9(c) may be input.

The controller 280 performs signal processing on the received timing signal to calculate x, y coordinate signals in the plasma display panel.

In addition, the controller 280 may perform signal conversion to transmit the calculated x, y coordinate signal in an RF manner. In addition, the converted RF x, y coordinate signal may be output to the RF module 221.

Meanwhile, when the power on/off key 775 is operated and the remote control device 200 is powered on, the controller 280 controls a pairing operation with the pointing signal processing apparatus 400 to be performed via the pointing signal receiving apparatus 300. The pairing operation may be performed before entering the touch pen mode according to operation of the touch pen mode key 773.

As illustrated in FIG. 8, the remote control device 200 may further include an antenna 730, through which data signals such as the RF coordinate signal or pairing signals output from the RF module may be transmitted.

As illustrated in FIG. 8, the pointing signal receiver 300 may include an antenna 760 and an RF module 765. The antenna 760 receives the RF signal, and the RF module 765 may process the received RF signal and output the x, y coordinate signal. The output coordinate signal is input to the pointing signal processing apparatus 400, which is connected by wire or wirelessly.

The pointing signal processing apparatus 400 processes the input coordinate signal and transmits a predetermined image signal to the image display apparatus 100. Accordingly, the image display apparatus 100, specifically the plasma display panel, can display a predetermined image (a pointing image or the like) in the specific discharge cell, that is, in the discharge cell corresponding to the x, y coordinates.

The rotating ball 780 is disposed in front of the optical sensor 710 and rotates when in contact with the plasma display panel. By implementing the part that contacts the plasma display panel as a rotating ball, wear of the contact part can be reduced, which improves the durability of the remote control device and the surface durability of the plasma display panel.

On the other hand, for smooth light sensing in the optical sensor 710, the rotating ball 780 is preferably made of a transparent material. That is, light emitted from the plasma display panel may pass through the transparent rotating ball 780 and may be detected by the optical sensor 710.

Although not shown in FIG. 8, the remote control apparatus 200 may further include a pressure detector (not shown) or a rotation detector (not shown).

For example, when the remote control device 200 is in contact with the plasma display panel, the rotating ball 780 of the remote control device 200 may be pushed back toward the optical sensor 710 by the force with which the remote control device 200 is pressed. The pressure detector may output signals of different levels according to the pressure, and the detected pressure signal is transmitted to the controller 280. Accordingly, the image display apparatus 100 may display images of different sizes or thicknesses based on the different pressure signals.
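
As a simple illustration of this behaviour, the sketch below maps a quantised pressure level to a stroke width. The number of levels and the width values are assumptions; the patent only states that different pressure signals lead to images of different sizes or thicknesses.

```python
# Sketch: stronger pen pressure -> thicker stroke. Levels and widths are hypothetical.

def stroke_thickness(pressure_level: int) -> int:
    """Map a quantised pressure level (0..3) from the pressure detector to a pen width."""
    widths = {0: 1, 1: 2, 2: 4, 3: 6}   # hypothetical widths, e.g. in discharge cells
    return widths.get(pressure_level, 1)

for level in range(4):
    print(level, stroke_thickness(level))  # 0->1, 1->2, 2->4, 3->6
```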

Meanwhile, a driving method for driving the plasma display panel provided in the display of the image display apparatus will be described below.

In the plasma display panel, the unit frame for implementing the gray level of the image may include a plurality of subfields.

In addition, each of the plurality of subfields may include an address period for selecting the discharge cells in which a discharge will occur and the discharge cells in which a discharge will not occur, and a sustain period for implementing gray levels according to the number of discharges.

Alternatively, at least one subfield of the plurality of subfields of the frame may further include a reset period for initialization.

FIGS. 10 to 12 are diagrams illustrating an operation of the plasma display panel in the touch pen mode according to an embodiment of the present invention.

Referring to FIG. 10, in the touch pen mode, at least one of a plurality of subfields forming one frame may be set as a scan subfield.

For example, a first subfield and a second subfield among the plurality of subfields of a frame may be used as scan subfields for detecting a touch position. The remaining subfields of the frame, other than the scan subfields, may be normal subfields (Normal SF). Here, a normal subfield is a subfield for image display, that is, for displaying image gray levels, and may also be referred to as a display subfield in contrast to the scan subfield.

In addition, in the normal mode other than the touch pen mode, the frame does not include the scan subfield, and all subfields included in the frame may be the general subfield.

In other words, in the touch pen mode, that is, when the touch pen type remote control apparatus 200 operates, at least one of the plurality of subfields of the frame may be set as a scan subfield.

Referring to FIG. 11, the scan subfield may include a vertical scan subfield VSSF for detecting the vertical position of the touch position and a horizontal scan subfield HSSF for detecting the horizontal position of the touch position.

For example, in the touch pen mode, the first subfield of the plurality of subfields of the frame may be a vertical scan subfield, and the second subfield may be a horizontal scan subfield. As such, the vertical scan subfield and the horizontal scan subfield may be continuously arranged in one frame.
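
The subfield arrangement just described can be sketched as a simple frame configuration. This is a minimal illustration assuming a 10-subfield frame (consistent with the display subfields SF3 to SF10 mentioned later); the actual number and ordering of subfields are design choices of the panel driver.

```python
# Sketch of the frame layout in touch pen mode: vertical scan subfield (VSSF),
# horizontal scan subfield (HSSF), then normal (display) subfields.
# The total subfield count is an assumption for illustration.

def build_frame(touch_pen_mode: bool, total_subfields: int = 10):
    """Return the ordered list of subfield roles for one frame."""
    if not touch_pen_mode:
        # Normal mode: no scan subfields, every subfield displays image gray levels.
        return ["NORMAL_SF"] * total_subfields
    # Touch pen mode: scan subfields first, then the display subfields (e.g. SF3..SF10).
    return ["VSSF", "HSSF"] + ["NORMAL_SF"] * (total_subfields - 2)

print(build_frame(False))  # 10 normal subfields
print(build_frame(True))   # ['VSSF', 'HSSF', 'NORMAL_SF', ...]
```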

In the vertical scan address period VSAP of the vertical scan subfield VSSF, the touch scan signal TSP descending from the scan reference voltage Vsc may be supplied to the scan electrode.

Preferably, the touch scan signal TSP may be sequentially supplied to the plurality of scan electrodes Y. Alternatively, the touch scan signal TSP may be supplied to at least two scan electrodes Y at substantially the same time.

As such, when the touch scan signal TSP is supplied to the scan electrode Y, the voltages of the address electrode X and the sustain electrode Z may be kept substantially constant.

When the touch scan signal TSP is supplied to the scan electrode Y in the vertical scan address period VSAP and the voltage of the address electrode X is set higher than the voltage of the sustain electrode Z, a discharge may occur between the scan electrode Y and the address electrode X. In the following description, the discharges sequentially generated in the vertical scan address period VSAP in this manner are referred to as vertical address discharges.

In the address period of the horizontal scan subfield HSSF (hereinafter referred to as the horizontal scan address period HSAP), the touch data signal TDP may be supplied to the address electrode X.

Preferably, the touch data signal TDP may be sequentially supplied to the plurality of address electrodes X. Alternatively, the touch data signal TDP may be supplied to at least two address electrodes X at substantially the same time point.

As such, when the touch data signal TDP is supplied to the address electrode X, the voltages of the scan electrode Y and the sustain electrode Z may be kept substantially constant.

When the touch data signal TDP is supplied to the address electrode X in the horizontal scan address period HSAP while the voltages of the scan electrode Y and the sustain electrode Z are kept constant, a discharge may occur between at least one of the scan electrode Y and the sustain electrode Z, and the address electrode X. Hereinafter, the discharge generated in the horizontal scan address period HSAP in this manner is referred to as a horizontal address discharge.

Meanwhile, the remote control device described above, for example the remote control device 200, may obtain information corresponding to the vertical coordinate (y coordinate) of the touch position based on the vertical address discharge generated in the vertical scan address period VSAP, that is, the vertical address light, and may obtain information corresponding to the horizontal coordinate (x coordinate) of the touch position based on the horizontal address discharge generated in the horizontal scan address period HSAP, that is, the horizontal address light.

For example, assume that, in the touch pen mode, the remote control device 200 is positioned at the third scan electrode line Y3 and the second address electrode line X2 as shown in FIG. 12. The remote control device 200 then senses the vertical address light generated at the third scan electrode line Y3 during the vertical scan subfield VSSF of the scan subfields, and senses the horizontal address light generated at the second address electrode line X2 during the horizontal scan subfield HSSF of the scan subfields.

In particular, it can be determined that the vertical coordinate of the touch position is Y3 based on the sensing timing of the vertical address light generated at the third scan electrode line Y3, and that the horizontal coordinate of the touch position is X2 based on the sensing timing of the horizontal address light generated at the second address electrode line X2.

The vertical light sensing timing and the horizontal light sensing timing may be calculated based on the scan sustain period SSP, respectively. Accordingly, the coordinate information of the touch position can be obtained simply.

Meanwhile, as shown in FIG. 11, the touch sustain signal TSUS may be supplied to at least one of the scan electrode Y and the sustain electrode Z in the scan sustain period SSP between the vertical scan address period VSAP and the horizontal scan address period HSAP.

On the other hand, unlike the drawing, it is also possible to alternately supply the touch sustain signal TSUS to the scan electrode Y and the sustain electrode Z in the scan sustain period SSP.

The scan sustain period SSP of FIG. 11 may include a synchronous sustain period and an identification sustain period. The scan sustain period may also be referred to as a reference sustain period in other terms.

In FIG. 11, two sync sustain pulses are applied to the scan electrode Y in the sync sustain period, but various examples are possible depending on the setting.

In FIG. 11, an identification sustain pulse is applied to the scan electrode Y after the synchronous sustain pulse, that is, after the second synchronous sustain pulse.

Based on such an identification sustain pulse, more precisely on the identification sustain light, the vertical coordinate and the horizontal coordinate can be calculated. For example, as shown in FIG. 12, the vertical coordinate, that is, the Y3 coordinate, may be calculated using the time difference between the identification sustain light and the vertical address light corresponding to the Y3 position, and the horizontal coordinate, that is, the X2 coordinate, may be calculated using the time difference between the identification sustain light and the horizontal address light corresponding to the X2 position.

Of course, in addition to the identification sustain pulse, it is also possible to calculate the horizontal coordinate and the vertical coordinate by further utilizing the synchronous sustain pulse.
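
As a worked illustration of the calculation described above, the sketch below converts the sensed light timings into line indices. The reference time and the per-line scan intervals are hypothetical values chosen only to reproduce the FIG. 12 example (Y3, X2); they are not specified by the patent, and the real timing would also account for the offsets of the subfield periods.

```python
# Sketch: derive (x, y) line indices from the time differences between the identification
# sustain light and the vertical / horizontal address light. Interval values are assumed.

Y_SCAN_INTERVAL_US = 3.0   # hypothetical time between touch scan pulses on adjacent Y lines
X_SCAN_INTERVAL_US = 3.0   # hypothetical time between touch data pulses on adjacent X lines

def coordinates_from_timing(t_id_sustain_us, t_vertical_light_us, t_horizontal_light_us):
    """Return (x_line, y_line), 1-indexed, from sensed light timings within one frame."""
    y_line = round((t_vertical_light_us - t_id_sustain_us) / Y_SCAN_INTERVAL_US)
    x_line = round((t_horizontal_light_us - t_id_sustain_us) / X_SCAN_INTERVAL_US)
    return x_line, y_line

# Example matching FIG. 12: light sensed at the third scan line and second address line.
print(coordinates_from_timing(t_id_sustain_us=0.0,
                              t_vertical_light_us=9.0,     # 3 intervals -> Y3
                              t_horizontal_light_us=6.0))  # 2 intervals -> X2
# -> (2, 3), i.e. x = X2, y = Y3
```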

On the other hand, when a plurality of remote control devices are used, a method for easily recognizing an input from each remote control device will be discussed below.

FIG. 13 is a flowchart illustrating a method of operating an image display apparatus according to an exemplary embodiment of the present invention, and FIGS. 14 to 22B are views referred to for describing the operating method of the image display apparatus of FIG. 13.

Referring to the drawings, first, the image display apparatus 100 enters the touch pen mode (S1310). For example, when the touch pen type remote control device 200 is powered on, that is, when the touch pen mode key 773 of the remote control device 200 is operated, the image display apparatus 100 may receive a corresponding signal through the pointing signal receiving apparatus 300 and the pointing signal processing apparatus 400, and may automatically enter the touch pen mode.

Alternatively, when the touch pen type remote control apparatus 200 is powered on, that is, when the touch pen mode key 773 of the remote control apparatus 200 is operated, the image display apparatus 100 may receive a corresponding signal through the pointing signal receiving apparatus 300 and the pointing signal processing apparatus 400, and accordingly a main menu including a touch pen item may be displayed on the display 180.

When the touch pen item is selected in the main menu, an external input selection menu for connecting to the pointing signal processing apparatus 400 may be displayed. The external input selection menu may include, for example, items such as 'HDMI1', 'HDMI2', 'RGB1', and 'RGB2'.

If the 'HDMI1' item is selected, the touch pen home screen may be displayed. As illustrated in FIG. 22A, the touch pen home screen may include a touch pen menu including a sketchbook item, a photo decorating item, a My Gallery item, an Internet item, a family calendar item, and the like.

Hereinafter, it is assumed that the 'sketchbook' item, on which writing is possible, is selected on the touch pen home screen in the touch pen mode.

When entering the touch pen mode, the video display device 100 sets at least one subfield among a plurality of subfields forming a frame as a scan subfield, and sets another subfield as a display subfield.

For example, as shown in FIG. 10, in the touch pen mode, a first subfield and a second subfield among a plurality of subfields of a frame may be used as a scan subfield for detecting a touch position. In addition, the remaining subfields except the scan subfield among the plurality of subfields of the frame may be normal subfields (Normal SF).

Next, pointer coordinate information corresponding to the movement of the remote control device is received (S1320), and an image corresponding to the movement of the remote control device is displayed on the display (S1330).

In the touch pen mode, the remote control device 200 detects light emitted from the discharge cells of the plasma display panel. The remote control device 200 may output a timing signal based on the sensed light and calculate an x, y coordinate signal of the corresponding discharge cell based on the timing signal. The calculated x, y coordinate signal of the discharge cell is converted into an RF signal and transmitted to the pointing signal receiving device 300. This corresponds to step S1417 of FIG. 14.

The pointing signal receiving device 300 receives the x, y coordinate signal transmitted by the RF method and transmits it to the pointing signal processing device 400.

The pointing signal processing device 400 processes the received x, y coordinate signal and transmits a predetermined image signal to the image display apparatus 100. This corresponds to step S1420 of FIG. 14.

The user input interface unit 150 of the image display apparatus 100 receives the image signal and transmits it to the controller 170. The controller 170 processes the image signal and controls the display accordingly.

Accordingly, the timing controller 232 in the display 180 resets the subfields, particularly the display subfields SF3 to SF10, in response to the image signal. As a result, an image corresponding to the user's movement can be displayed at the corresponding discharge cell of the plasma display panel, that is, at the corresponding (x, y) coordinate. This corresponds to step S1430 of FIG. 14.
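Putting steps S1417 through S1430 together, a hypothetical end-to-end flow might look like the sketch below. The class and method names are invented for illustration and do not correspond to any actual interface of the devices 200, 300, 400, or 100; only the division of roles follows the description.

# Hedged sketch of the coordinate pipeline: remote control device -> pointing
# signal receiver -> pointing signal processor -> image display apparatus.

class RemoteControl:                      # stands in for the remote control device 200
    def sense_and_report(self, x: int, y: int) -> dict:
        # In practice the pen senses light and derives (x, y) from timing (S1417).
        return {"type": "rf_coords", "x": x, "y": y}

class PointingSignalReceiver:             # stands in for the receiving device 300
    def forward(self, rf_packet: dict) -> dict:
        return {"x": rf_packet["x"], "y": rf_packet["y"]}

class PointingSignalProcessor:            # stands in for the processing device 400
    def to_image_signal(self, coords: dict) -> dict:
        # Produces a predetermined image signal for the display apparatus (S1420).
        return {"draw_at": (coords["x"], coords["y"])}

class ImageDisplayApparatus:              # stands in for the image display apparatus 100
    def render(self, image_signal: dict) -> None:
        # The controller and timing controller would reset the display subfields
        # so that the written dot appears at the reported coordinate (S1430).
        print("drawing at", image_signal["draw_at"])

remote = RemoteControl()
receiver = PointingSignalReceiver()
processor = PointingSignalProcessor()
apparatus = ImageDisplayApparatus()
apparatus.render(processor.to_image_signal(receiver.forward(remote.sense_and_report(2, 3))))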

Meanwhile, it is determined whether a screen switching signal or a screen moving signal is received (S1340). If so, the displayed image is stored (S1350), and, based on the screen switching signal or the screen moving signal, the image screen displayed on the display is switched and displayed, or the screen is moved so that a part of the image is displayed (S1360).

The controller 170 of the image display apparatus 100 receives a screen switching signal or a screen moving signal and controls to store the displayed image in the storage 140 according to the corresponding signal.

Then, the controller 170 of the image display apparatus 100 controls the image screen displayed on the display to be switched and displayed, or the screen to be moved so that a part of the image is displayed, based on the screen switching signal or the screen moving signal.

When a sensor unit (not shown), specifically a camera (not shown), is attached to the image display apparatus 100 or provided outside the image display apparatus 100, the camera may capture the user writing on the display 180 in the pen touch mode, or the remote control device used by the user.

The captured image may be input to the controller 170 of the image display apparatus 100, and the controller 170 may determine the movement of the user or the movement of the remote control device 200 based on the captured image.

The controller 170 may generate a screen change signal or a screen move signal based on the user's movement or the movement of the remote control apparatus 200.

When the screen change signal or the screen shift signal is generated, the controller 170 controls to store the image already displayed on the display 180 in the storage 140 for screen change or screen shift. This corresponds to step 1450 of FIG. 14.

In addition, the controller 170 controls to display the image screen displayed on the display by switching the screen according to the screen switching signal or the screen moving signal, or by moving the screen to display a part of the image. This corresponds to step 1460 of FIG. 14.
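The store-then-switch-or-shift behaviour of steps S1450 and S1460 can be summarized in a short sketch. The screen container, the column shift amount, and the storage list below are assumptions introduced only to illustrate the control flow.

# Hedged sketch: on a screen switching or screen moving signal, first store the
# current writing, then either clear to a new screen or shift the content left.

saved_screens = []                        # stands in for the storage unit 140

def handle_signal(signal: str, screen: list, shift_cols: int = 4) -> list:
    """signal is 'switch' or 'move'; screen is a list of equal-length text rows."""
    saved_screens.append(list(screen))    # S1450: store the displayed image

    if signal == "switch":                # S1460: show a fresh, empty screen
        return [" " * len(row) for row in screen]

    # 'move': shift the content left so only its tail stays visible,
    # leaving blank space on the right for further writing.
    return [row[shift_cols:] + " " * shift_cols for row in screen]

screen = ["hello world "]
print(handle_signal("move", screen))      # ['o world     ']
print(handle_signal("switch", screen))    # ['            ']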

FIGS. 15A to 15D illustrate screen movement or screen switching according to a user's movement or a movement of the remote control device.

First, FIG. 15A illustrates that, when the user 1500 writes using the pen touch type remote control device 200, that is, when the user moves the remote control device 200, a predetermined image screen 1510 is displayed on the display 180. In the drawing, the user 1500 is positioned in the middle of the display 180, and thus an image corresponding to the writing content is displayed up to the middle of the display 180.

Next, FIG. 15B illustrates an image screen 1520 in which the user 1500 has continued to write, so that the image extends to the right end of the display 180. In this case, the user is located at the right end of the display 180.

Next, FIG. 15C illustrates that the user 1500 holding the pen touch type remote control device 200 moves slightly to the left. The controller 170 detects the user's movement (leftward movement) or the movement of the remote control device and generates a screen switching signal or a screen moving signal.

Specifically, since the user has moved to the left, the controller stores the already written image screen 1520 in the storage unit 140 and controls at least a part of the image screen 1520 of FIG. 15B displayed on the display 180 to be moved to the left.

In FIG. 15C, only the last portion of the writing contents of FIG. 15B is displayed on the display 180, and the other portions are not displayed. That is, a new image screen 1530 is displayed on the display 180. Accordingly, an area in which the user can write is easily secured, and the user can continue writing in the empty space.

Next, FIG. 15D illustrates that the user 1500 holding the pen touch type remote controller 200 moves to the left end of the display 180. The controller 170 detects a user's movement (left movement) or a movement of the remote controller and generates a screen switching signal or a screen movement signal.

Specifically, since the user has moved to the left end, the controller stores the already written image screen 1520 in the storage 140 and controls the image screen (1520 of FIG. 15B) displayed on the display 180 to be switched and displayed.

In FIG. 15D, none of the writing contents of FIG. 15B is displayed, and a new image screen 1540 in which nothing is written is displayed on the display 180. Accordingly, an area in which the user can write is easily secured, and the user can continue writing in the empty space.

On the other hand, comparing FIG. 15C with FIG. 15D, it can be seen that the degree of movement of the image screen or whether to switch the screen is determined according to the degree of movement of the user 1500 or the remote control apparatus 200.

That is, when the movement is relatively small as shown in FIG. 15C, the image screen may be moved and displayed so that only a part of the displayed image screen 1520 remains visible. When the movement is relatively large as shown in FIG. 15D, in particular when the user moves to the left end of the display 180, the displayed image screen 1520 may be switched and the changed screen may be displayed.

Meanwhile, unlike FIG. 15C, when the user's movement is even smaller, the last two portions of the writing content may remain displayed on the display 180, with only the first portion of the writing content no longer displayed.
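One way to read FIGS. 15C and 15D is that a small movement produces a partial shift while a large movement, roughly to the far end of the panel, produces a full switch. The threshold value below is an assumption; the description only states that the degree of movement determines the degree of screen movement or whether the screen is switched.

def decide_screen_action(movement_fraction: float):
    """Map the movement of the user (or of the remote control device), expressed as a
    fraction of the display width, to a screen action. The 0.9 threshold is assumed.
    Returns ('move', shift_fraction) or ('switch', 1.0)."""
    if movement_fraction >= 0.9:          # moved roughly to the far end of the display
        return "switch", 1.0              # FIG. 15D: show a fresh screen
    # FIG. 15C: shift proportionally, keeping the tail of the writing visible.
    return "move", movement_fraction

print(decide_screen_action(0.3))   # ('move', 0.3)  -> small shift, writing tail remains
print(decide_screen_action(0.95))  # ('switch', 1.0) -> an entirely new screen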

FIGS. 16A to 16D illustrate screen movement or screen switching by a screen switching object or a screen moving object.

First, FIG. 16A illustrates that, when the user 1500 writes using the pen touch type remote control device 200, that is, when the user moves the remote control device 200, a predetermined image screen 1610 is displayed on the display 180.

Next, FIG. 16B illustrates that the user 1500 holds the pen touch type remote control device 200 away from the display 180 for a predetermined period or more without contacting it. Accordingly, as shown in the drawing, the confirmation key object 1622 and the left, right, down, and up moving objects 1624, 1626, 1628, and 1630 may be displayed in one region of the display 180.

That is, in the writing mode, the user 1500 writes by touching the pen touch type remote control device 200 to the display 180. If the remote control device 200 does not touch the display 180 for a predetermined time or more, the controller 170 of the image display apparatus 100 detects this and controls the confirmation key object 1622 and the left, right, down, and up moving objects 1624, 1626, 1628, and 1630 to be displayed on the display 180.
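The lift-timeout behaviour can be sketched as a small watcher polled every frame; the 0.5-second threshold and the class name are assumptions, since the description only says 'a predetermined time or more'.

import time

PREDETERMINED_LIFT_TIME_S = 0.5   # assumed value for "a predetermined time or more"

class PenLiftWatcher:
    """Show the navigation objects (1622, 1624 to 1630) when the pen stays lifted."""

    def __init__(self) -> None:
        self.last_touch_time = time.monotonic()
        self.objects_visible = False

    def on_pen_touch(self) -> None:
        self.last_touch_time = time.monotonic()
        self.objects_visible = False          # writing resumed: hide the objects

    def poll(self) -> bool:
        lifted_for = time.monotonic() - self.last_touch_time
        if lifted_for >= PREDETERMINED_LIFT_TIME_S:
            self.objects_visible = True       # show confirm/left/right/down/up objects
        return self.objects_visible

watcher = PenLiftWatcher()
watcher.on_pen_touch()
time.sleep(0.6)
print(watcher.poll())   # True -> the moving objects would now be displayed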

Next, FIG. 16C illustrates that the user 1500 selects the right moving object 1626 using the pen touch type remote control apparatus 200.

When the right moving object 1626 is selected, the controller 170 generates a screen switching signal or a screen moving signal.

Accordingly, as shown in FIG. 16D, the changed screen 1640 may be displayed according to the right movement. In this case, the existing writing content may be moved to the left or switched to the left.

Meanwhile, in the drawing, after the screen change or the screen move, the object 1650 indicating the position of the currently displayed area of the entire image area is illustrated in one area of the display 180.

The controller 170 generates an object 1650 that informs the position of the currently displayed area of the entire image area.

In the figure, numbering is given to the entire image area. That is, the image area 1 and the image area 2 may be divided and displayed in the object 1650, and the image area 2, which corresponds to the image area currently displayed on the display 180, is focused and displayed. This makes it possible to easily know the position of the current image area.
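A minimal way to render the numbered position object as text is shown below; the bracket-based focus marker and the two-area layout are assumptions used only to illustrate focusing the currently displayed area within the object 1650.

def render_position_object(total_areas: int, current_area: int) -> str:
    """Render a one-row position indicator; the current area is focused with brackets."""
    cells = []
    for area in range(1, total_areas + 1):
        cells.append(f"[{area}]" if area == current_area else f" {area} ")
    return "|".join(cells)

# Image areas 1 and 2, with image area 2 currently displayed, as in the object 1650.
print(render_position_object(total_areas=2, current_area=2))   # ' 1 |[2]'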

FIGS. 17A to 17C illustrate selecting and displaying a desired image area while an object indicating the entire image area is displayed on a partial area of the display.

First, FIG. 17A illustrates that an object 1710 indicating an entire image area is displayed on one area of the display 180.

For example, when the screen is switched by the user's movement and a new screen is displayed on the display 180 as shown in FIG. 15D, or when the screen is switched by selection of the right moving object and a new screen is displayed on the display 180 as shown in FIG. 16D, an object 1710 indicating the entire image area may be displayed as shown in FIG. 17A.

In the drawing, the entire image area has a 3 x 3 size, and the object 1710 corresponding to the 3 x 3 size is displayed.

Next, FIG. 17B illustrates that the user 1500 selects an area in the object 1710 indicating the entire image area by using the pen touch type remote control device 200. In the drawing, the fifth region 1715 is selected.

Accordingly, the controller 170 controls the image 1720 corresponding to the selected fifth area to be displayed on the display 180 as shown in FIG. 17C. As a result, by displaying the object 1710 indicating the entire image area and selecting a portion of it, the image of the desired area can easily be displayed on the display 180.
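Selecting a cell of the 3 x 3 object 1710 and showing the matching part of the whole image amounts to simple index arithmetic, as in the sketch below; the row-major numbering from 1 to 9 and the viewport size are assumptions.

def viewport_for_region(region_number: int, grid: int = 3,
                        display_w: int = 1920, display_h: int = 1080):
    """Map a region number (1..grid*grid, row-major) of the entire image area to the
    pixel rectangle of the whole image that should fill the display."""
    row, col = divmod(region_number - 1, grid)
    x0, y0 = col * display_w, row * display_h
    return (x0, y0, x0 + display_w, y0 + display_h)

# The fifth region 1715 of FIG. 17B is the centre cell of the assumed 3 x 3 layout.
print(viewport_for_region(5))   # (1920, 1080, 3840, 2160)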

FIGS. 18A to 19B illustrate that screen switching is automatically performed when an image written by a user is displayed up to a boundary area of the display.

First, FIG. 18A illustrates that an image 1810 written by the user is displayed up to the lower boundary area 1815 of the display. When the image is displayed up to the lower boundary area 1815, the controller 170 may generate a screen switching signal or a screen moving signal. In detail, a signal for moving the screen downward may be generated.

Accordingly, as shown in FIG. 18B, the changed screen 1820 may be displayed according to the downward movement. Meanwhile, in the drawing, an object 1830 indicating the location of the currently displayed area within the entire image area after the screen switching is illustrated in one area of the display 180. In this case, the area corresponding to the image area currently displayed on the display 180 may be focused and displayed.

FIG. 19A illustrates that an image 1910 written by the user is displayed up to the right boundary area 1915 of the display. When the image is displayed up to the right boundary area 1915, the controller 170 may generate a screen switching signal or a screen moving signal. In detail, a signal for moving the screen to the right may be generated.

Accordingly, as shown in FIG. 19B, the changed screen 1920 may be displayed according to the right movement. Meanwhile, in the drawing, after switching the screen, an object 1930 indicating the location of the currently displayed area of the entire image area is displayed on one area of the display 180. The controller 170 generates such an object 1930. In this case, it is possible to focus and display an area corresponding to the image area currently displayed on the display 180.
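The automatic switching of FIGS. 18A to 19B amounts to checking whether the most recently written point falls inside a border band of the panel; the band width of 20 pixels below is an assumed value.

from typing import Optional

BOUNDARY_BAND_PX = 20   # assumed width of the boundary areas 1815 and 1915

def boundary_signal(x: int, y: int, display_w: int, display_h: int) -> Optional[str]:
    """Return a screen-moving signal when writing reaches a boundary area, else None."""
    if y >= display_h - BOUNDARY_BAND_PX:
        return "move_down"    # FIG. 18A: writing reached the lower boundary area
    if x >= display_w - BOUNDARY_BAND_PX:
        return "move_right"   # FIG. 19A: writing reached the right boundary area
    return None

print(boundary_signal(900, 1070, 1920, 1080))   # 'move_down'
print(boundary_signal(1910, 500, 1920, 1080))   # 'move_right'
print(boundary_signal(100, 100, 1920, 1080))    # None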

FIGS. 20A to 21D illustrate screen switching automatically performed while an image or an object displayed on the display is moved according to a movement input using the remote control device.

First, FIG. 20A illustrates selecting a predetermined object 2010 displayed on the display using the remote control apparatus 200. In detail, when the object 2010 is touched for a predetermined time, the object 2010 may be selected and focused.

Next, FIG. 20B illustrates that there is a right moving input from the remote controller 200 for the selected object 2010. The controller 170 may generate a screen change signal or a screen move signal when there is a right move input from the remote controller 200.

Accordingly, as shown in FIG. 20C, the changed screen 2050 may be displayed according to the right movement. Meanwhile, in the drawing, an object 2030 indicating the position of the currently displayed area of the entire image area after the screen change is illustrated in one area of the display 180. At this time, it is possible to focus and display the area 2034 corresponding to the image area currently displayed on the display 180.

Meanwhile, if the movement input speed or intensity of FIG. 20B is a first movement speed or a first movement intensity, then, when the movement input speed or intensity is smaller than that, the screen may be moved only from the first screen area to the third screen area. That is, the object 2030 indicating the location of the currently displayed area within the entire image area may indicate a movement only from the first area to the third area.

The controller 170 may change the screen area to be moved and displayed according to the movement input speed or intensity.
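A possible reading of FIGS. 20B and 20C is that a faster or stronger drag carries the view across more screen areas. The speed thresholds and area counts below are purely illustrative assumptions; the description only states that the screen area to be moved depends on the movement input speed or intensity.

def areas_to_move(speed_px_per_s: float) -> int:
    """Map the speed (or intensity) of a movement input to the number of screen areas
    the displayed region advances. The threshold values are assumed."""
    if speed_px_per_s >= 2000:
        return 3        # strong input: jump several areas at once
    if speed_px_per_s >= 1000:
        return 2        # e.g. a move from the first area to the third area
    return 1            # slow drag: advance a single area

print(areas_to_move(2500))   # 3
print(areas_to_move(1200))   # 2
print(areas_to_move(300))    # 1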

Next, FIGS. 21A to 21D illustrate, similarly to FIGS. 20A to 20C, that screen switching is automatically performed while an image or an object displayed on the display 180 is moved according to a movement input using the remote control device.

FIG. 21A illustrates that there is a right movement input from the remote control device 200 for the selected object 2110.

Accordingly, as shown in FIG. 21B, the changed screen 2150 may be displayed according to the right movement. Meanwhile, in the drawing, after switching the screen, an object 2130 indicating a location of a currently displayed area of all image areas is displayed on one area of the display 180. In this case, it is possible to focus and display the area 2134 corresponding to the image area currently displayed on the display 180.

Next, FIG. 21C illustrates that, after the first area is selected from the object 2130 indicating the location of the currently displayed area within the entire image area and a new object 2115 is then selected, there is a right movement input from the remote control device 200 for the selected object 2115.

Accordingly, as shown in FIG. 21D, the changed screen 2160 may be displayed according to the right movement. Meanwhile, in the drawing, after switching the screen, an object 2130 indicating a location of a currently displayed area of all image areas is displayed on one area of the display 180. In this case, it is possible to focus and display the area 2135 corresponding to the image area currently displayed on the display 180.

Meanwhile, since FIG. 21B already contains an object that was moved and displayed first, the regions 2134 and 2135 may be displayed separated by bookmarks so as to be distinguished from each other. Accordingly, the image displayed in each area can be distinguished.

FIGS. 22A and 22B illustrate selecting one of a plurality of screens through a screen setting object.

As described above, when entering the touch pen mode, the touch pen home screen may be displayed. As illustrated in FIG. 22A, the touch pen home screen 2200 may include a touch pen menu 2210 including a 'sketchbook' item, a 'photo decorating' item, a 'my gallery' item, an 'internet' item, a 'family calendar' item, a 'screen setting' item, and the like. In addition, it may further include a setup item 2230 and an exit item 2235.

In this case, when the 'screen setting' item 2215 is selected, the controller 170 may control an object for selecting one of a plurality of screens to be displayed on the display 180, as illustrated in FIG. 22B. When an object corresponding to any one screen is selected, the controller 170 may control the screen corresponding to the selected object to be displayed on the display 180.

For example, similar to the object 1710 of FIG. 17A, an object 2250 for selecting one of a plurality of screens may be displayed on the display 180, and when any one of the objects is selected, the corresponding screen may be displayed on the display, similar to FIG. 17B.

In addition, various user interfaces are possible in the touch pen mode, thereby improving user convenience.

The image display apparatus and the method of operating the same according to the present invention are not limited to the configurations and methods of the embodiments described above; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.

Meanwhile, the operation method of the remote control device or the image display apparatus of the present invention can be implemented as processor-readable code on a processor-readable recording medium provided in the image display apparatus. The processor-readable recording medium includes all kinds of recording devices in which data readable by a processor is stored. Examples of the processor-readable recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and it may also be implemented in the form of a carrier wave such as transmission over the Internet. The processor-readable recording medium can also be distributed over network-coupled computer systems so that the processor-readable code is stored and executed in a distributed fashion.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. On the contrary, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention.

Claims (20)

A method of operating an image display apparatus which displays an image using a remote control device that senses light emitted from discharge cells of a plasma display panel, the method comprising:
receiving pointer coordinate information corresponding to a movement of the remote control device in a touch pen mode;
displaying an image corresponding to the movement of the remote control device on a display;
receiving a screen switching signal or a screen moving signal; and
switching and displaying an image screen displayed on the display based on the screen switching signal or the screen moving signal, or moving the screen so that a part of the image is displayed.
The method of claim 1,
further comprising storing the image displayed on the display when the screen switching signal or the screen moving signal is received.
The method of claim 1,
further comprising detecting a movement of a user or the remote control device,
wherein the screen switching signal or the screen moving signal is generated based on the movement of the user or the movement of the remote control device.
The method of claim 3,
wherein the degree of movement of the image screen, or whether to switch the screen, is determined according to the degree of movement of the user or the remote control device.
The method of claim 1,
wherein the screen switching signal or the screen moving signal is generated when the displayed image reaches a boundary area of the display.
The method of claim 1,
further comprising displaying a screen switching object or a screen moving object on the display,
wherein the screen switching signal or the screen moving signal is generated when the object is selected.
The method of claim 1,
further comprising displaying, on a partial area of the display, an object indicating the position of the currently displayed area within the entire image area after the screen switching or screen movement.
The method of claim 1,
further comprising: displaying an object indicating the entire image area on a partial area of the display after the screen switching or screen movement; and
displaying a screen corresponding to a selected area on the display when a specific area is selected from the entire image area.
The method of claim 1,
further comprising receiving a movement input for an image or an object displayed on the display,
wherein the screen switching signal or the screen moving signal is generated according to the movement input.
The method of claim 9,
wherein the degree of movement of the image screen, or whether to switch the screen, is determined according to the speed or intensity of the movement input.
The method of claim 1,
further comprising: displaying a screen setting object;
displaying an object for selecting one of a plurality of screens when the screen setting object is selected; and
displaying, on the display, a screen corresponding to an object selected from among the plurality of screen objects.
An image display apparatus comprising:
a plasma display panel including a plurality of discharge cells, wherein, in a touch pen mode, at least one subfield among a plurality of subfields forming a frame is set as a scan subfield for detecting coordinates of a discharge cell by a remote control device, and vertical address light and horizontal address light are sequentially emitted during the scan subfield period; and
a controller which controls a predetermined image corresponding to a position of the remote control device to be displayed on the plasma display panel, and controls an image screen displayed on the plasma display panel to be switched and displayed based on a screen switching signal or a screen moving signal, or controls the screen to be moved so that a part of the image is displayed.
The image display apparatus of claim 12,
further comprising a storage unit which stores the image displayed on the plasma display panel when the screen switching signal or the screen moving signal is received.
The image display apparatus of claim 12,
further comprising a sensor unit for detecting a movement of a user or the remote control device,
wherein the controller generates the screen switching signal or the screen moving signal based on the movement of the user or the movement of the remote control device.
The image display apparatus of claim 12,
wherein the controller generates the screen switching signal or the screen moving signal when the displayed image reaches a boundary area of the plasma display panel.
The image display apparatus of claim 12,
wherein the controller generates the screen switching signal or the screen moving signal when the object is selected while a screen switching object or a screen moving object is displayed on the plasma display panel.
The image display apparatus of claim 12,
wherein the controller receives a movement input for an image or an object displayed on the plasma display panel and generates the screen switching signal or the screen moving signal according to the movement input.
The image display apparatus of claim 12,
wherein the controller controls a screen setting object to be displayed, controls an object for selecting one of a plurality of screens to be displayed when the screen setting object is selected, and controls a screen corresponding to an object selected from among the plurality of screen objects to be displayed on the plasma display panel.
The image display apparatus of claim 12,
further comprising an interface unit configured to receive an image signal corresponding to the position of the remote control device from an external pointing signal processing device,
wherein the controller controls the image to be displayed according to the image signal.
The image display apparatus of claim 12, further comprising:
a pointing signal receiving device which receives coordinate information from the remote control device; and
a pointing signal processing device which outputs a predetermined image signal based on the coordinate information received from the pointing signal receiving device,
wherein the controller controls the image to be displayed according to the image signal.
KR1020110088649A 2011-09-01 2011-09-01 Image display apparatus and method for operating the same KR20130025243A (en)


Publications (1)

Publication Number: KR20130025243A; Publication Date: 2013-03-11

Family

ID=48176994

Family Applications (1)

Application Number: KR1020110088649A; Publication: KR20130025243A (en); Priority Date: 2011-09-01; Filing Date: 2011-09-01; Title: Image display apparatus and method for operating the same

Country Status (1)

Country: KR (1); Link: KR20130025243A (en)


Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination