CN109192129B - Display device and display method

Info

Publication number: CN109192129B
Application number: CN201811317065.3A
Authority: CN (China)
Prior art keywords: sensing, signal, display panel, manipulation, display
Legal status: Active
Other versions: CN109192129A (Chinese)
Inventors: 吕绍平, 郭家玮
Current and original assignee: AU Optronics Corp
Application filed by AU Optronics Corp

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/006Electronic inspection or testing of displays and display drivers, e.g. of LED or LCD displays

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)

Abstract

The display device comprises a display panel and a controller. The display panel includes a plurality of sensing circuits. The sensing circuit comprises a first sensing circuit and a second sensing circuit. The first sensing circuit is disposed at an edge of an image area of the display panel and configured to detect an optical signal to generate a first sensing signal. The second sensing circuit is disposed in the image area and is used for detecting the optical signal to generate a second sensing signal. The controller is coupled to the sensing circuit and used for identifying a control track formed by the optical signal on the image area according to at least one of the first sensing signal or the second sensing signal and driving the display panel to display a control command according to the control track.

Description

Display device and display method
Technical Field
The present disclosure relates to a display device, and more particularly, to a display device including a display panel for recognizing an optical signal.
Background
Display devices are commonly found in a variety of applications. For example, in a presentation application, a display device may be used to display a presentation image. However, in current presentation applications, the user can only project a light spot onto the displayed image with a laser pointer or a similar device to point at the content; the display device and the user cannot have any other interaction or more convenient corresponding operations.
Disclosure of Invention
In view of the above, the present disclosure provides a display apparatus capable of recognizing an optical signal and a display method thereof, so as to address the problems described above.
An embodiment of the present disclosure relates to a display device. The display device comprises a display panel and a controller. The display panel includes a plurality of sensing circuits. The sensing circuit comprises a first sensing circuit and a second sensing circuit. The first sensing circuit is disposed at an edge of an image area of the display panel and configured to detect an optical signal to generate a first sensing signal. The second sensing circuit is disposed in the image area and is used for detecting the optical signal to generate a second sensing signal. The controller is coupled to the sensing circuit and used for identifying a control track formed by the optical signal on the image area according to at least one of the first sensing signal or the second sensing signal and driving the display panel to display a control command according to the control track.
An embodiment of the present disclosure relates to a display method. The display method comprises the following steps: detecting an optical signal through at least one first sensing circuit in the display panel and generating at least one first sensing signal; detecting the optical signal through at least one second sensing circuit in the display panel and generating at least one second sensing signal; identifying a control track formed by the optical signal on an image area of the display panel according to at least one of the at least one first sensing signal or the at least one second sensing signal; and driving the display panel to display the control command according to the control track, wherein the at least one first sensing circuit is arranged on at least one edge of an image area of the display panel, and the at least one second sensing circuit is arranged in the image area.
Drawings
The disclosure may be more completely understood in consideration of the following detailed description of embodiments in connection with the accompanying drawings, in which:
FIG. 1 is a schematic diagram of a circuit shown in accordance with some embodiments of the present disclosure;
FIG. 2A is a schematic diagram of a display panel shown in accordance with some embodiments of the present disclosure;
FIG. 2B is a schematic diagram of a display system shown in accordance with some embodiments of the present disclosure;
FIG. 3 is a schematic diagram of an optical signal shown in accordance with some embodiments of the present disclosure; and
FIG. 4 is a flow chart of a method shown in accordance with some embodiments of the present disclosure.
Description of reference numerals:
100: Display unit
120: Sensing circuit
140: Pixel circuit
T1-T5: Transistors
C: Capacitor
G0, G1, S1, V1, GND: Voltages
L: Readout line
D: Data line
G: Gate line
200: Display panel
220: Controller
240: Control device
VC: Driving signal
VS1, VS2: Sensing signals
SL, SL1, SL2: Optical signals
202: First sensing region
204: Second sensing region
A-F: Manipulation trajectories
CW, P: Operation modes
TCW, TP, TH: Time periods
Detailed Description
The following embodiments are described in detail with reference to the drawings. The embodiments are provided only to explain the present disclosure and are not intended to limit it; the description of structural operations does not limit their order of execution, and any structure reassembled from these elements that produces equivalent technical effects falls within the scope of the present disclosure.
As used herein, the term "circuit" broadly refers to an object that is connected in some manner by one or more transistors and/or one or more passive components for processing signals, and also broadly refers to a single system that includes one or more circuits.
Refer to fig. 1. Fig. 1 is a schematic diagram of a display unit 100 shown in accordance with some embodiments of the present disclosure. In some embodiments, the display unit 100 can be used to display an image to be presented while sensing a light signal and displaying an image and an instruction corresponding to the light signal. The display unit 100 includes a sensing circuit 120 and a pixel circuit 140. In some embodiments, the display unit 100 is formed by thin film transistors (TFTs), i.e., the sensing circuit 120 and the pixel circuit 140 are TFT circuits.
In some embodiments, the display unit 100 is part of a circuit in the display panel 200 (shown in fig. 2A later). In some embodiments, the display panel 200 includes a plurality of display units 100 having the same structure. In some embodiments, the display units 100 with the same structure are arranged in the display panel 200 in a matrix. More of the display panel 200 will be discussed later with respect to FIG. 2A.
In some embodiments, the sensing circuit 120 is a photo sensing circuit, and the sensing circuit 120 is configured to receive a light signal (not shown in fig. 1) to generate a corresponding sensing signal. In some embodiments, the sensing circuit 120 is configured to identify the optical signal; for example, the sensing circuit 120 is configured to identify different wavelengths of the optical signal and generate corresponding sensing signals (for example, the sensing signal VS1 or VS2) in response to the different received wavelengths. In other examples, the sensing circuit 120 is used to identify different repetition rates of the optical signal. In a further example, the sensing circuit 120 is used to identify the trace of the optical signal moving on the display panel 200. Embodiments of recognizing the trace of the optical signal are discussed later.
In some embodiments, the optical signal is visible light, e.g., the optical signal is narrow band laser light or a combined light combining multiple narrow band laser lights. In other embodiments, the light signal is a combination of visible light and invisible light, such as a combination of infrared light and red light or a combination of ultraviolet light and blue light. In some embodiments, the light source of the optical signal is a laser light source. In some embodiments, the light source of the optical signal is a Light Emitting Diode (LED) light source.
In some embodiments, the light source of the optical signal is a substantial distance away from the sensing circuit 120, in other words, there is no physical contact between the light source of the optical signal and the sensing circuit 120. In other embodiments, the light source of the optical signal is in physical contact with the sensing circuit 120. The types of optical signals described above are for illustrative purposes only, and various types of optical signals are contemplated within the scope of the present disclosure.
In some embodiments, the light source of the light signal is controlled by a user. In other words, the user can hold the light source (e.g., a laser pen or the control device 240) to irradiate the light signal on the display panel 200, so that the display unit 100 generates the corresponding operation according to the light signal. Thus, the user can generate an interactive relationship with the display panel 200.
As shown in fig. 1, the sensing circuit 120 is coupled to the pixel circuit 140. In some embodiments, the sensing circuit 120 includes a capacitor C, a first transistor T1, a second transistor T2, a third transistor T3, a fourth transistor T4 and a fifth transistor T5.
The first terminal of the capacitor C is coupled to the first terminal of the first transistor T1 and the first terminal of the fifth transistor T5. The second terminal of the capacitor C receives a voltage (e.g., the ground voltage GND). The control terminal of the first transistor T1 receives the driving voltage G1. The first terminal of the second transistor T2 is coupled to the second terminal of the first transistor T1. The control terminal of the second transistor T2 receives the driving voltage S1. The second terminal of the second transistor T2 is coupled to the control terminal thereof. The second terminal of the third transistor T3 is coupled to the second terminal of the fourth transistor T4 and receives a voltage V1. The first terminal of the third transistor T3, the first terminal of the fourth transistor T4, and the first terminal of the second transistor T2 are coupled together. The first terminal of the third transistor T3 is coupled to the control terminal thereof. The first terminal of the fourth transistor T4 is coupled to the control terminal thereof. The second terminal of the fifth transistor T5 is coupled to the readout line L. The control terminal of the fifth transistor T5 receives the driving voltage G0.
In some embodiments, the first transistor T1, the second transistor T2, the third transistor T3 and the fourth transistor T4 are configured to receive the optical signal to generate a corresponding signal. In some embodiments, the fifth transistor T5 is used to control the voltage at the first terminal of the capacitor C according to the driving voltage G0. In some embodiments, the fifth transistor T5 operates as a switch. In some embodiments, the readout line L is used to receive the sensing signal VS1 and/or VS2 generated in response to the light signal.
In some embodiments, when the fifth transistor T5 is turned off and the first transistor T1 is turned on, the voltage at the first terminal of the capacitor C is approximately equal to the voltage V1; this is referred to as a restart mode of the sensing circuit 120. In some embodiments, when the fifth transistor T5, the first transistor T1 and the second transistor T2 are all turned off, the voltage at the first terminal of the capacitor C drops through the leakage path provided by the first transistor T1 and the second transistor T2; this is referred to as an operation mode of the sensing circuit 120. In some embodiments, when the fifth transistor T5 is turned on, the voltage on the first terminal of the capacitor C is transmitted to the readout line L through the fifth transistor T5; this is referred to as a sampling mode of the sensing circuit 120.
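As a rough illustration only, the following Python sketch models these three modes as a behavioral simulation: the capacitor is charged toward V1 in the restart mode, discharged through a light-dependent leakage in the operation mode, and read out in the sampling mode. The class name, the leakage constants, and the normalized light level are assumptions for illustration, not part of the disclosed circuit.

# Hypothetical behavioral model of sensing circuit 120 (not the TFT circuit itself).
class SensingCircuit:
    def __init__(self, v1=5.0):
        self.v1 = v1        # voltage V1 applied in the restart mode
        self.v_cap = 0.0    # voltage at the first terminal of capacitor C

    def restart(self):
        # Restart mode: the capacitor is charged to approximately V1.
        self.v_cap = self.v1

    def operate(self, light_intensity, duration):
        # Operation mode: illumination increases the leakage through T1/T2,
        # so the capacitor voltage drops faster under stronger light.
        leak_rate = 0.1 + 0.9 * light_intensity   # arbitrary model constants
        self.v_cap = max(0.0, self.v_cap - leak_rate * duration)

    def sample(self):
        # Sampling mode: T5 is turned on and the capacitor voltage is passed
        # to the readout line L as the sensing signal (VS1 or VS2).
        return self.v_cap

cell = SensingCircuit()
cell.restart()
cell.operate(light_intensity=0.8, duration=1.0)
print("sensing signal on readout line L:", cell.sample())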
The configuration of the sensing circuit 120 described above is for exemplary purposes only, and various configurations of the sensing circuit 120 are within the scope of the present disclosure. For example, the sensing circuit further includes a sixth transistor (not shown), the sixth transistor is coupled between the capacitor C and the first transistor T1, and a control terminal of the sixth transistor is coupled to the first terminal of the second transistor T2.
In some embodiments, the pixel circuit 140 is used for displaying an image and an instruction corresponding to the optical signal sensed by the sensing circuit 120. As shown in fig. 1, the pixel circuit 140 is coupled to a data line D and a gate line G, and displays images and commands through data transmitted by the data line D.
The configuration of the display unit 100 described above is for exemplary purposes only, and various configurations of the display unit 100 are within the scope of the present disclosure.
Refer to fig. 2A. Fig. 2A is a schematic diagram of a display panel 200 shown in accordance with some embodiments of the present disclosure. In some embodiments, the display panel 200 includes a first sensing region 202 and a second sensing region 204. In some embodiments, the first sensing region 202 is an edge sensing region and the second sensing region 204 is a central sensing region (e.g., the region within the dashed frame in fig. 2A), wherein a plurality of the display units 100 in fig. 1 are disposed within the first sensing region 202 and the second sensing region 204. In some embodiments, first sensing region 202 and second sensing region 204 have similar structures and similar circuit functions. In some embodiments, first sensing region 202 and second sensing region 204 are an integral, indivisible structure.
As shown in fig. 2A, the first sensing region 202 is disposed at the edge of the image region of the display panel 200, the second sensing region 204 is disposed in the central region (i.e., the image region) of the display panel 200, and the second sensing region 204 is surrounded by the first sensing region 202.
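As an illustration of this layout, a display unit can be assigned to the first (edge) sensing region 202 or the second (central) sensing region 204 from its position in the matrix. The sketch below assumes a rectangular matrix of display units and a one-unit-wide edge band; neither the matrix size nor the edge width is specified in the disclosure.

# Hypothetical mapping from a display-unit position (row, col) to the
# first sensing region 202 (edge) or the second sensing region 204 (center).
def sensing_region(row, col, rows, cols, edge=1):
    """Return 202 for the edge region, 204 for the central region."""
    on_edge = (row < edge or row >= rows - edge or
               col < edge or col >= cols - edge)
    return 202 if on_edge else 204

# Example: a 6 x 8 matrix of display units with a one-unit-wide edge band.
print(sensing_region(0, 3, rows=6, cols=8))  # 202 (edge)
print(sensing_region(3, 4, rows=6, cols=8))  # 204 (center)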
Refer to fig. 2B. Fig. 2B is a schematic diagram of a display system shown in accordance with some embodiments of the present disclosure. The display system includes a display panel 200, a controller 220 and a control device 240. In some applications, the display system may be used as an electronic blackboard, but the disclosure is not limited thereto.
As shown in fig. 2B, the controller 220 is coupled to the display panel 200. The control device 240 is configured with a light source for projecting a light signal SL onto the display panel 200. In some embodiments, the user can use the control device 240 to, in effect, input an instruction to be executed to the display panel 200.
The display panel 200 generates at least one sensing signal VS1 and/or VS2 according to the detected light signal SL. The controller 220 is configured to receive the at least one sensing signal VS1 and/or VS2 generated by the display panel 200 and to output a driving signal VC to the display panel 200 according to the at least one sensing signal VS1 and/or VS2.
For example, in some embodiments, the display units 100 in the first sensing region 202 are configured to detect the optical signal SL to generate one or more sensing signals VS1. For example, when one of the display units 100 in the first sensing region 202 receives the light signal SL, the display panel 200 generates a sensing signal VS1 for the controller 220 to identify the position of the light signal SL on the first sensing region 202. Alternatively, when the light signal SL forms a continuous moving track (e.g., the manipulation track A) on the first sensing region 202, the display units 100 in the first sensing region 202 detect the light signal SL to generate a plurality of sensing signals VS1. Thus, the controller 220 can determine the trace of the optical signal SL from the at least one sensing signal VS1 transmitted over the readout line L.
Similarly, the display units 100 in the second sensing region 204 detect the optical signal SL to generate one or more sensing signals VS2. In some embodiments, the controller 220 receives the at least one sensing signal VS2 through the readout line L. In some embodiments, the controller 220 is configured to receive a plurality of electrical signals from different sensing regions. In some embodiments, the controller 220 is configured to distinguish the sensing signals VS1 and VS2 generated by the first sensing region 202 and the second sensing region 204, i.e., the controller 220 can distinguish whether the light signal SL is incident on the first sensing region 202 or the second sensing region 204.
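A minimal sketch of how the controller 220 might tag incoming readout samples with their originating region is given below, assuming each sample carries the position of the display unit that produced it; the field names and the detection threshold are illustrative only.

# Hypothetical sketch: label each detected readout sample as VS1 (edge region
# 202) or VS2 (central region 204) to build an ordered event stream for
# trajectory recognition.
def to_events(raw_samples, rows, cols, edge=1, threshold=0.5):
    """raw_samples: iterable of (time, row, col, level) readout samples."""
    events = []
    for t, row, col, level in sorted(raw_samples):
        if level < threshold:                      # no light at this unit
            continue
        on_edge = (row < edge or row >= rows - edge or
                   col < edge or col >= cols - edge)
        signal = "VS1" if on_edge else "VS2"       # 202 -> VS1, 204 -> VS2
        events.append((t, signal, (row, col)))
    return events

samples = [(0.0, 0, 2, 0.9), (0.1, 1, 2, 0.8), (0.2, 2, 3, 0.7)]
print(to_events(samples, rows=6, cols=8))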
In some embodiments, the controller 220 is used to receive the sensing signals VS1 and VS2 to identify the corresponding trajectories. In a further embodiment, the controller 220 is used to identify the position of the light signal SL on the display panel 200 according to the sensing signals VS1 and/or VS2.
In various embodiments, the user can input the manipulation command to be executed by projecting the light signal SL through the control device 240 to form a single point or different manipulation trajectories (e.g., the manipulation trajectories A-F) on the display panel 200. By identifying the type of the manipulation trajectories A-F according to at least one of the sensing signals VS1 or VS2, the controller 220 can output a corresponding driving signal VC to drive the display panel 200 to display a corresponding manipulation command. The manipulation trajectories A-F are described in order below.
Regarding the manipulation trajectory A, the manipulation trajectory A is a movement of the optical signal SL from the outside of the display panel 200 into the second sensing region 204 of the display panel 200; that is, the optical signal SL passes through the first sensing region 202 and then enters the second sensing region 204. Under this condition, in response to the manipulation trajectory A, the first sensing region 202 and the second sensing region 204 sequentially generate a plurality of corresponding sensing signals VS1 and VS2 to the controller 220. The controller 220 can identify the corresponding path of the manipulation trajectory A by receiving the sensing signal VS1 generated by the first sensing region 202 and then receiving the sensing signal VS2 generated by the second sensing region 204. As shown in fig. 2A, the moving path of the manipulation trajectory A passes through the first sensing region 202 and then the second sensing region 204 from the outside of the display panel 200.
Regarding the manipulation trajectory B, the manipulation trajectory B is that the optical signal SL moves from the second sensing region 204 of the display panel 200 to the outside of the display panel 200, i.e., the optical signal SL passes through the second sensing region 204 and then passes through the first sensing region 202. Under this condition, in response to the manipulation trajectory B, the second sensing region 204 and the first sensing region 202 sequentially generate corresponding sensing signals VS2 and VS1 to be transmitted to the controller 220. After receiving the sensing signal VS2 generated by the second sensing region 204, the controller 220 receives the sensing signal VS1 generated by the first sensing region 202, and thus the moving path of the control track B can be identified. As shown in fig. 2A, the moving path of the manipulation track B passes through the second sensing region 204 first, and then passes through the first sensing region 202 to the outside of the display panel 200.
Regarding the manipulation trajectory C, the manipulation trajectory C is a movement of the optical signal SL from the outside of the display panel 200 into the second sensing region 204 and then back out of the display panel 200; that is, the optical signal SL passes through the first sensing region 202, then through the second sensing region 204, and then through the first sensing region 202 again. Under this condition, in response to the manipulation trajectory C, the first sensing region 202 passed the first time, the second sensing region 204, and the first sensing region 202 passed the second time sequentially generate the corresponding sensing signals VS1, VS2 and VS1 to the controller 220. If the controller 220 receives the sensing signal VS1 generated by the first sensing region 202, then receives the sensing signal VS2 generated by the second sensing region 204, and then receives the sensing signal VS1 generated by the first sensing region 202 again, the moving path of the manipulation trajectory C can be identified. As shown in fig. 2A, the movement path of the manipulation trajectory C passes from the outside of the display panel 200 through the first sensing region 202, the second sensing region 204 and the first sensing region 202. That is, the manipulation trajectory C touches the edge of the display panel 200 at least twice.
Taking the above-mentioned manipulation trajectories A-C as an example, in some embodiments, the controller 220 can determine the order in which the manipulation trajectories A-C touch the edge and the image area of the display panel 200 according to the at least one sensing signal VS1 and/or the at least one sensing signal VS2. Thus, according to the different touch orders, the controller 220 can select one of a plurality of preset instructions as the manipulation instruction to be executed and drive the display panel 200 to display the manipulation instruction.
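A minimal sketch of this order-based classification is given below; it assumes the controller receives a time-ordered list of VS1/VS2 labels for one stroke, and the function name and return labels are illustrative rather than taken from the disclosure.

# Hypothetical classification of trajectories A-C from the order in which
# VS1 (edge region 202) and VS2 (central region 204) signals arrive.
def classify_edge_order(signals):
    """signals: time-ordered list of "VS1"/"VS2" labels for one stroke."""
    # Collapse consecutive repeats so only region transitions remain.
    regions = [s for i, s in enumerate(signals) if i == 0 or s != signals[i - 1]]
    if regions[:2] == ["VS1", "VS2"] and regions[-1] == "VS1":
        return "C"   # edge -> center -> edge (touches the edge twice)
    if regions[:2] == ["VS1", "VS2"]:
        return "A"   # enters from outside: edge first, then center
    if regions[:2] == ["VS2", "VS1"]:
        return "B"   # leaves the panel: center first, then edge
    return None      # handled by the closed-region checks for D-F

print(classify_edge_order(["VS1", "VS2", "VS2"]))          # A
print(classify_edge_order(["VS2", "VS2", "VS1"]))          # B
print(classify_edge_order(["VS1", "VS2", "VS2", "VS1"]))   # C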
With reference to fig. 2A, regarding the manipulation trajectory D, the manipulation trajectory D is a closed region formed by the optical signal SL moving clockwise on the second sensing region 204 of the display panel 200; that is, the optical signal SL passes through a plurality of display units 100 of the second sensing region 204 and then passes again through a display unit 100 it has already passed, forming a closed region in the clockwise direction. Under this condition, in response to the manipulation trajectory D, the display units 100 in the second sensing region 204 sequentially generate a plurality of corresponding sensing signals VS2 to the controller 220. If the controller 220 receives the sensing signals VS2 generated by the second sensing region 204 and then receives the sensing signal VS2 generated by a repeated display unit 100, and the position sequence of the display units 100 is clockwise, the moving track of the manipulation trajectory D can be identified. As shown in fig. 2A, the manipulation trajectory D passes through the second sensing region 204 and passes through repeated display units 100 in the clockwise direction.
Regarding the manipulation trajectory E, the manipulation trajectory E is a closed region formed by the optical signal SL moving counterclockwise on the second sensing region 204 of the display panel 200; that is, the optical signal SL passes through a plurality of display units 100 of the second sensing region 204 and then passes again through a display unit 100 it has already passed, forming a closed region in the counterclockwise direction. Under this condition, in response to the manipulation trajectory E, the display units 100 in the second sensing region 204 sequentially generate a plurality of corresponding sensing signals VS2 to the controller 220. If the controller 220 receives the sensing signals VS2 generated by the second sensing region 204 and then receives the sensing signal VS2 generated by a repeated display unit 100, and the position sequence of the display units 100 is counterclockwise, the moving track of the manipulation trajectory E can be identified. As shown in fig. 2A, the manipulation trajectory E passes through the second sensing region 204 and passes through repeated display units 100 in the counterclockwise direction.
Regarding the manipulation trajectory F, the manipulation trajectory F is an unclosed region formed by the movement of the light signal SL on the second sensing region 204 of the display panel 200; that is, the light signal SL passes through a plurality of display units 100 of the second sensing region 204 without passing through any repeated display unit 100, and therefore does not form a closed region. Under this condition, in response to the manipulation trajectory F, the display units 100 in the second sensing region 204 sequentially generate a plurality of corresponding sensing signals VS2 to the controller 220. If the controller 220 receives the sensing signals VS2 generated by the second sensing region 204 and does not receive a sensing signal VS2 generated by a repeated display unit 100, the moving track of the manipulation trajectory F can be identified. As shown in fig. 2A, the manipulation trajectory F passes through the second sensing region 204 and does not pass through repeated display units 100.
Taking the above-mentioned manipulation trajectories D-F as an example, in some embodiments, the controller 220 can determine the rotation direction of the manipulation trajectories D-F on the image area of the display panel 200 and whether a closed region is formed according to the at least one sensing signal VS2. Thus, according to the different rotation directions and whether a closed region is formed, the controller 220 may select one of the plurality of preset instructions as the manipulation instruction to be executed and drive the display panel 200 to display the manipulation instruction.
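One common way to implement the closure and rotation-direction checks is to test whether the stroke revisits a display unit and, if it does, to take the sign of the shoelace (signed) area of the sampled positions. This is an assumed implementation for illustration, not an algorithm stated in the disclosure.

# Hypothetical sketch for trajectories D-F: decide whether the sampled points
# close on themselves (revisit an earlier display unit) and, if so, whether
# the loop runs clockwise or counterclockwise via the signed (shoelace) area.
def classify_loop(points):
    """points: time-ordered (row, col) positions reported from region 204."""
    closed = len(points) > 2 and len(set(points)) < len(points)
    if not closed:
        return "F"                       # open stroke: other lines
    area2 = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:] + points[:1]):
        area2 += x0 * y1 - x1 * y0       # twice the signed area
    return "D" if area2 < 0 else "E"     # sign convention is an assumption

square_cw = [(0, 0), (0, 2), (2, 2), (2, 0), (0, 0)]
print(classify_loop(square_cw))          # clockwise square in (row, col) screen coordinates -> "D"
print(classify_loop([(0, 0), (1, 1), (2, 3)]))   # open stroke -> "F"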
The above-mentioned manipulation trajectories A-F are for illustrative purposes only, and various manipulation trajectories are within the scope of the present disclosure.
In some embodiments, the controller 220 is further configured to output the driving signal VC to drive the display panel 200 to display a corresponding manipulation instruction in response to the manipulation trajectory identified by the controller 220. As described above, in one or more embodiments, the controller 220 can identify the manipulation trajectory generated by the light signal SL on the display panel 200 according to the at least one sensing signal VS1 and/or the at least one sensing signal VS2. According to one or more conditions, such as the order in which the manipulation trajectory touches the edge and the image area of the display panel 200, whether a closed region is formed, and/or the rotation direction, the controller 220 outputs the corresponding driving signal VC to the display panel 200, so that the display panel 200 displays the corresponding operation.
For example, in some embodiments, the controller 220 is configured to identify a manipulation trajectory generated by the optical signal SL on the display panel 200 as follows: first, determine whether the manipulation trajectory touches the first sensing region 202; if the first sensing region 202 is not touched, determine whether the manipulation trajectory forms a closed region; if a closed region is formed, determine the rotation direction of the manipulation trajectory; alternatively, if the first sensing region 202 is touched, determine whether the first sensing region 202 or the second sensing region 204 is touched first, and determine whether the first sensing region 202 or the second sensing region 204 is touched last. The above is merely an example, and the present disclosure is not limited to this determination order.
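Assuming the helper functions classify_edge_order and classify_loop from the two preceding sketches are in scope, this determination order can be tied together roughly as follows; the function name is illustrative.

# Hypothetical sketch of the determination order: check whether the edge
# region 202 was touched; if not, classify by closure and rotation (D, E, F);
# otherwise classify by the order of region touches (A, B, C).
def identify_trajectory(signals, points):
    """signals: ordered "VS1"/"VS2" labels; points: ordered (row, col) hits."""
    if "VS1" not in signals:              # the edge region 202 was never touched
        return classify_loop(points)      # D, E, or F
    return classify_edge_order(signals)   # A, B, or C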
For ease of understanding, in the present disclosure, the operation corresponding to the manipulation trajectory A is defined as manipulation instruction 1, the operation corresponding to the manipulation trajectory B as manipulation instruction 2, the operation corresponding to the manipulation trajectory C as manipulation instruction 3, the operation corresponding to the manipulation trajectory D as manipulation instruction 4, the operation corresponding to the manipulation trajectory E as manipulation instruction 5, and the operation corresponding to the manipulation trajectory F as manipulation instruction 6. In some embodiments, the display panel 200 displays the corresponding manipulation commands 1-6 according to the received driving signal VC.
Refer to fig. 3. Fig. 3 is a schematic diagram of the optical signal SL shown in accordance with some embodiments of the present disclosure. In some embodiments, the light source of the control device 240 can output different light signals SL in different modes, and the enable period (e.g., the period at the high level) of the optical signal SL is set differently in each mode. For example, as shown in fig. 3, the control device 240 can generate the optical signal SL with different enable periods in a continuous wave (CW) mode or a burst wave (P) mode.
In fig. 3, the optical signal SL1 sent by the control device 240 in the continuous mode CW has an enable time TCW. In some embodiments, the enable time TCW is greater than 16.6 milliseconds (ms); in other words, the repetition rate of the optical signal in the continuous mode CW is less than 60 hertz (Hz).
In fig. 3, the optical signal SL2 sent by the control device 240 in the burst mode P has a period TP. In some embodiments, the period TP of the burst mode P is less than 16.6 ms; in other words, in the burst mode P, the time difference between every two adjacent enable periods TH in the optical signal SL is less than 16.6 ms. As shown in fig. 3, since the period TP of the optical signal SL2 in the burst mode P is less than 16.6 ms, the enable time TH is also less than 16.6 ms. In other words, the repetition rate of the optical signal in the burst mode P is greater than 60 Hz.
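The following sketch classifies the operation mode from the timing of the detected enable periods using the 16.6 ms (60 Hz) boundary described above; the use of rising-edge timestamps as input is an assumption for illustration.

# Hypothetical sketch: classify the light-source mode from the timing of the
# sensed enable periods, using the 16.6 ms (60 Hz) boundary given above.
def classify_mode(enable_times_ms):
    """enable_times_ms: rising-edge timestamps (ms) of the detected signal."""
    if len(enable_times_ms) < 2:
        return "CW"                      # a single long pulse: continuous mode
    gaps = [b - a for a, b in zip(enable_times_ms, enable_times_ms[1:])]
    # Burst mode P repeats faster than 60 Hz, i.e. adjacent pulses < 16.6 ms apart.
    return "P" if max(gaps) < 16.6 else "CW"

print(classify_mode([0.0]))                       # CW
print(classify_mode([0.0, 5.0, 10.0, 15.0]))      # P
print(classify_mode([0.0, 20.0, 40.0]))           # CW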
In some embodiments, the plurality of display units 100 in the display panel 200 may generate the sensing signals VS1 or VS2 with different timings according to the enable periods of the light signal SL1 or SL2. In some embodiments, the controller 220 is further configured to identify the enable period of the optical signal SL according to the timing of the at least one sensing signal VS1 and/or VS2, so as to confirm the operation mode of the control device 240. In some embodiments, the controller 220 may also output the corresponding driving signal VC according to the different enable periods of the optical signal SL, so that the display panel 200 displays different manipulation commands.
Accordingly, according to one or more embodiments described above, in addition to one or more conditions such as the sequence of touching the edge and the image area of the display panel 200 by the manipulation trajectory, the existence of the closed area and/or the rotation direction, the controller 220 may further classify the predetermined instruction set according to the different enabling periods of the light signal SL to select the corresponding operation.
In some embodiments, according to fig. 2A and fig. 3, the controller 220 identifies the manipulation trajectories A-F and identifies the mode CW or P of the optical signal, such that each of the above-mentioned manipulation commands 1-6 can be further divided into two manipulation commands. That is, the manipulation trajectories A-F shown in fig. 2A, in combination with the recognized mode of the optical signal, may correspond to at most 12 manipulation instructions 1-12.
For ease of understanding, the classification of the manipulation commands 1-12 is summarized in the following Table I and Table II, where Table I corresponds to the optical signal SL1 operating in the continuous mode CW and Table II corresponds to the optical signal SL2 operating in the burst mode P. Counterclockwise/clockwise represents the rotation direction of the manipulation trajectory; inner-to-outer and outer-to-inner represent that the manipulation trajectory moves from inside the display panel 200 to the outside through its edge, or from outside the edge of the display panel 200 into the display panel 200; and touching twice represents that the manipulation trajectory touches the edge of the display panel 200 at least twice. Manipulation trajectories that do not belong to any of the above cases are classified as "other lines".
Table I (optical signal SL1, continuous mode CW):

Manipulation trajectory                      Manipulation instruction
A (outer-to-inner)                           1
B (inner-to-outer)                           2
C (touching the edge twice)                  3
D (clockwise, closed region)                 4
E (counterclockwise, closed region)          5
F (other lines)                              6
In some embodiments, the manipulation command 1 corresponds to the manipulation trajectory A in combination with the optical signal SL1 in the continuous mode CW. In some embodiments, the manipulation command 2 corresponds to the manipulation trajectory B in combination with the optical signal SL1 in the continuous mode CW. In some embodiments, the manipulation command 3 corresponds to the manipulation trajectory C in combination with the optical signal SL1 in the continuous mode CW. In some embodiments, the manipulation command 4 corresponds to the manipulation trajectory D in combination with the optical signal SL1 in the continuous mode CW. In some embodiments, the manipulation command 5 corresponds to the manipulation trajectory E in combination with the optical signal SL1 in the continuous mode CW. In some embodiments, the manipulation command 6 corresponds to the manipulation trajectory F in combination with the optical signal SL1 in the continuous mode CW.
Table II (optical signal SL2, burst mode P):

Manipulation trajectory                      Manipulation instruction
A (outer-to-inner)                           7
B (inner-to-outer)                           8
C (touching the edge twice)                  9
D (clockwise, closed region)                 10
E (counterclockwise, closed region)          11
F (other lines)                              12
In some embodiments, the manipulation command 7 corresponds to the manipulation trajectory A in combination with the optical signal SL2 in the burst mode P. In some embodiments, the manipulation command 8 corresponds to the manipulation trajectory B in combination with the optical signal SL2 in the burst mode P. In some embodiments, the manipulation command 9 corresponds to the manipulation trajectory C in combination with the optical signal SL2 in the burst mode P. In some embodiments, the manipulation command 10 corresponds to the manipulation trajectory D in combination with the optical signal SL2 in the burst mode P. In some embodiments, the manipulation command 11 corresponds to the manipulation trajectory E in combination with the optical signal SL2 in the burst mode P. In some embodiments, the manipulation command 12 corresponds to the manipulation trajectory F in combination with the optical signal SL2 in the burst mode P.
In some embodiments, the manipulation commands 1-12 delete, enlarge, or reduce a portion of the image displayed on the display panel 200. In some embodiments, the manipulation commands 1-12 delete, enlarge, or reduce the entire image displayed on the display panel 200. In some embodiments, the manipulation commands 1-12 move part or all of the image displayed on the display panel 200. In some embodiments, the manipulation commands 1-12 display an image or a menu not yet displayed on the display panel 200. In some embodiments, the manipulation commands 1-12 switch the display pages of the display panel 200, for example, turning to the next page. In some embodiments, the manipulation commands 1-12 display the recognized manipulation trajectory on the display panel 200, i.e., draw lines on the display panel 200.
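For illustration, the controller-side dispatch from a recognized manipulation command to a display operation could look like the sketch below. The disclosure lists possible operations but does not fix which command number performs which operation, so this particular mapping is hypothetical.

# Hypothetical dispatch from a recognized manipulation command to a display
# operation; the command-to-operation assignment is illustrative only.
OPERATIONS = {
    1: "enlarge selected image",
    2: "reduce selected image",
    3: "delete selected image",
    4: "switch to next page",
    5: "switch to previous page",
    6: "draw the recognized trajectory",
    # ... commands 7-12 would be filled in the same way for burst mode P
}

def execute(command_id):
    """Return the operation to perform for a manipulation command number."""
    return OPERATIONS.get(command_id, "no-op")

print(execute(4))   # "switch to next page"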
In some embodiments, some of the operations corresponding to the manipulation commands 1-12 are displayed temporarily on the display panel 200, and after a predetermined time the display panel 200 returns to the original display image. In other embodiments, some of the operations corresponding to the manipulation commands 1-12 remain displayed on the display panel 200; if there is no next command, the display panel 200 keeps the displayed screen.
The above-mentioned control commands 1-12 are only for illustrative purposes, and various control commands are within the scope of the disclosure.
In some embodiments, the controller 220 may include at least one processing circuit and a memory circuit. In some embodiments, the instruction sets categorized in Table I and Table II may be created in the form of look-up tables and stored in the memory circuit. The at least one processing circuit is configured to identify the enable period and/or the manipulation trajectory of the optical signal SL based on the at least one sensing signal VS1 and/or the at least one sensing signal VS2, to select the corresponding manipulation instruction from the look-up table of the memory circuit, and to output the corresponding driving signal VC. The above arrangement is merely an example, and the disclosure is not limited thereto.
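A minimal sketch of such a look-up table, keyed by the recognized trajectory (A-F) and the light-signal mode (CW or P) and returning the manipulation instruction number from Table I and Table II, is given below; the data-structure choice is an assumption.

# Minimal sketch of the look-up table described above: the key combines the
# recognized trajectory (A-F) with the light-signal mode (CW or P), and the
# value is the manipulation instruction number from Table I / Table II.
LOOKUP = {("A", "CW"): 1, ("B", "CW"): 2, ("C", "CW"): 3,
          ("D", "CW"): 4, ("E", "CW"): 5, ("F", "CW"): 6,
          ("A", "P"): 7,  ("B", "P"): 8,  ("C", "P"): 9,
          ("D", "P"): 10, ("E", "P"): 11, ("F", "P"): 12}

def select_instruction(trajectory, mode):
    """Return the manipulation instruction for a (trajectory, mode) pair."""
    return LOOKUP[(trajectory, mode)]

print(select_instruction("C", "P"))   # 9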
Refer to fig. 4. Fig. 4 is a flow chart of a method 400 shown in accordance with some embodiments of the present disclosure. The method 400 includes steps S402-S416. In some embodiments, the method 400 may be applied to the embodiments of fig. 1-3. For ease of understanding, the method 400 is discussed in conjunction with the embodiments of fig. 1-3, but the present disclosure is not so limited.
In step S402, the operation mode of the control device 240 is set to be the continuous mode CW or the burst mode P.
In step S404, the light signal SL is irradiated on the display panel 200 through the manipulation device 240.
In step S406, the sensing circuit 120 of the display panel 200 senses the light signal SL and generates at least one sensing signal VS1 and/or VS2 to be transmitted to the controller 220, and the controller 220 identifies the position of the light signal SL from the at least one sensing signal VS1 and/or VS2. In some embodiments, step S406 is also referred to as an initial position reporting step.
In step S408, the control device 240 stops moving, or the control device 240 is moved so that the light signal SL moves outside the display panel 200.
In step S410, the sensing circuit 120 of the display panel 200 senses a moving trajectory of the light signal SL and transmits at least one sensing signal VS1 and/or VS2 to the controller 220, such that the controller 220 identifies the sensed manipulation trajectory.
In step S412, the sensing circuit 120 of the display panel 200 senses the light signal SL, generates a sensing signal with a corresponding timing according to the pattern of the light signal SL, and transmits the sensing signal with the corresponding timing to the controller 220, so that the controller 220 can distinguish the pattern of the light signal.
In step S414, the controller 220 determines the manipulation command according to the manipulation trajectory and the mode of the optical signal SL, with reference to Table I and Table II, and generates the corresponding driving signal VC to transmit to the display panel 200.
In step S416, the pixel circuit 140 of the display panel 200 displays the corresponding manipulation instruction.
In some related technologies, the display panel receives a touch-generated signal and performs a corresponding operation. In such related art, the display panel cannot additionally receive light signals and perform corresponding operations according to the light signals. Therefore, the operation of the display panel is limited to contact-based execution and interaction, which greatly reduces the convenience of the display panel.
Compared to the related art, the display panel 200 of the present disclosure, by combining the sensing circuit 120 and the pixel circuit 140, can be operated remotely through the optical signal and can interact with a remote user, thereby increasing the convenience and operating range of the display panel 200.
The description of method 400 above includes exemplary operations, but the operations of method 400 need not be performed in the order shown. It is within the spirit and scope of the embodiments of the present disclosure that the order of the operations of method 400 be altered or that the operations be performed concurrently, with partial concurrence or omitted, where appropriate.
Although the embodiments of the present invention have been described in detail, it should be understood that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the embodiments of the invention as defined by the appended claims.

Claims (8)

1. A display device, comprising:
a display panel comprising a plurality of sensing circuits, and the sensing circuits comprising:
at least one first sensing circuit, disposed on at least one edge of an image area of the display panel, for detecting an optical signal to generate at least one first sensing signal; and
at least one second sensing circuit, which is arranged in the image area and is used for detecting the optical signal to generate at least one second sensing signal; and
a controller, coupled to the sensing circuit, for identifying a manipulation track formed by the optical signal on the image area according to at least one of the at least one first sensing signal or the at least one second sensing signal, and driving the display panel to display a manipulation instruction according to the manipulation track;
further comprising:
a control device for outputting the optical signal in a first mode or a second mode, the optical signal having an enable period, the enable period in the first mode being different from the enable period in the second mode,
and the controller further identifies the enable period of the optical signal according to the at least one of the at least one first sensing signal or the at least one second sensing signal, so as to select one of a plurality of preset instructions as the control instruction based on the enable period.
2. The display apparatus according to claim 1, wherein the controller is further configured to identify a location of the light signal on the image area according to the at least one of the at least one first sensing signal or the at least one second sensing signal.
3. The display apparatus according to claim 1, wherein the controller is configured to determine whether the manipulation trajectory forms a closed region in the image region according to the at least one second sensing signal, and if the manipulation trajectory forms the closed region, the controller is configured to select one of a plurality of predetermined commands as the manipulation command according to a rotation direction of the manipulation trajectory,
the display panel further comprises a plurality of pixel circuits, the pixel circuits are respectively arranged corresponding to the sensing circuits, and if the manipulation track does not form the closed area, the controller is used for driving at least one of the pixel circuits corresponding to the at least one second sensing circuit to display the manipulation track.
4. The display apparatus according to claim 1, wherein the controller is further configured to determine an order in which the manipulation trajectory touches the at least one edge and the image area according to the at least one first sensing signal and the at least one second sensing signal, and select one of a plurality of predetermined commands as the manipulation command according to the order.
5. A display method, comprising:
detecting an optical signal through at least one first sensing circuit in a display panel and generating at least one first sensing signal;
detecting the optical signal through at least one second sensing circuit in the display panel and generating at least one second sensing signal;
identifying a control track formed by the optical signal on an image area of the display panel according to at least one of the at least one first sensing signal or the at least one second sensing signal; and
driving the display panel to display a control command according to the control track,
the at least one first sensing circuit is arranged on at least one edge of the image area of the display panel, and the at least one second sensing circuit is arranged in the image area;
further comprising:
outputting the optical signal in a first mode or a second mode through an operation device, wherein the optical signal has an enable period, and the enable period in the first mode is different from the enable period in the second mode; identifying the enabled period of the optical signal according to the at least one of the at least one first sensing signal or the at least one second sensing signal; and
one of a plurality of preset instructions is selected as the control instruction based on the enabling period.
6. The display method of claim 5, wherein identifying the light signal further comprises:
identifying a position of the optical signal on the image area according to the at least one of the at least one first sensing signal or the at least one second sensing signal.
7. The display method of claim 5, wherein driving the display panel to display the manipulation instruction comprises:
determining whether the control track forms a closed area in the image area according to the at least one second sensing signal;
if the closed area is formed on the control track, selecting one from a plurality of preset instructions as the control instruction according to a rotating direction of the control track; and
and if the control track does not form the closed area, driving the display panel to display the control track.
8. The display method of claim 5, wherein driving the display panel to display the manipulation instruction comprises:
determining an order in which the control track touches the at least one edge and the image area according to the at least one first sensing signal and the at least one second sensing signal; and
according to the sequence, one of a plurality of preset instructions is selected as the control instruction.
CN201811317065.3A 2018-08-13 2018-11-07 Display device and display method Active CN109192129B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TW107128212 2018-08-13
TW107128212A TWI667603B (en) 2018-08-13 2018-08-13 Display device and displaying method

Publications (2)

Publication Number Publication Date
CN109192129A CN109192129A (en) 2019-01-11
CN109192129B true CN109192129B (en) 2022-05-24

Family

ID=64942190

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811317065.3A Active CN109192129B (en) 2018-08-13 2018-11-07 Display device and display method

Country Status (2)

Country Link
CN (1) CN109192129B (en)
TW (1) TWI667603B (en)

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8310504B2 (en) * 2009-08-31 2012-11-13 Fuji Xerox Co., Ltd. System and method for panning and selecting on large displays using mobile devices without client software
WO2011066343A2 (en) * 2009-11-24 2011-06-03 Next Holdings Limited Methods and apparatus for gesture recognition mode control
TW201211859A (en) * 2010-09-02 2012-03-16 Inst Information Industry Laser light spot trajectory tracking event triggering method, system and computer program product thereof
EP2474894A1 (en) * 2011-01-06 2012-07-11 Research In Motion Limited Electronic device and method of controlling same
CN105718192B (en) * 2011-06-07 2023-05-02 联想(北京)有限公司 Mobile terminal and touch input method thereof
CN102323868B (en) * 2011-10-26 2013-07-31 中国人民解放军国防科学技术大学 Man-machine multipoint interaction method and device based on infrared image
TWI450150B (en) * 2011-12-21 2014-08-21 Wistron Corp Touch method and touch system
TW201407444A (en) * 2012-08-01 2014-02-16 Aver Information Inc Control panel and operation method of the same for controlling electronic apparatus
CN103631446B (en) * 2012-08-23 2016-08-17 联想(北京)有限公司 A kind of optical touch displayer and electronic equipment
TW201426511A (en) * 2012-12-27 2014-07-01 Au Optronics Corp Display system and control method thereof
TWI498792B (en) * 2013-08-06 2015-09-01 Wistron Corp Optical touch system and touch and display system
CN103729096A (en) * 2013-12-25 2014-04-16 京东方科技集团股份有限公司 Interaction recognition system and display unit provided with same
CN107943348B (en) * 2017-12-14 2021-07-20 广州视源电子科技股份有限公司 Control method, device and equipment of intelligent tablet and storage medium

Also Published As

Publication number Publication date
TWI667603B (en) 2019-08-01
CN109192129A (en) 2019-01-11
TW202009675A (en) 2020-03-01


Legal Events

Code   Description
PB01   Publication
SE01   Entry into force of request for substantive examination
GR01   Patent grant