US20130265283A1 - Optical operation system - Google Patents

Optical operation system

Info

Publication number
US20130265283A1
US20130265283A1 (application US 13/586,823)
Authority
US
United States
Prior art keywords
sensing array
operation system
optical operation
output signal
sensing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/586,823
Inventor
Chun-Yi Lu
Yuan-Yu Peng
Yu-Chia Lin
Tzung-Min SU
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pixart Imaging Inc
Original Assignee
Pixart Imaging Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pixart Imaging Inc filed Critical Pixart Imaging Inc
Assigned to PIXART IMAGING INC. reassignment PIXART IMAGING INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LIN, YU-CHIA, LU, CHUN-YI, PENG, YUAN-YU, SU, TZUNG-MIN
Publication of US20130265283A1 publication Critical patent/US20130265283A1/en
Abandoned legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26Power supply means, e.g. regulation thereof
    • G06F1/32Means for saving power
    • G06F1/3203Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3234Power saving characterised by the action undertaken
    • G06F1/325Power saving in peripheral device
    • G06F1/3262Power saving in digitizer or tablet
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04108Touchless 2D- digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface without distance measurement in the Z direction

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Position Input By Displaying (AREA)
  • Image Input (AREA)

Abstract

An optical operation system includes an image sensing apparatus and a processing circuit. The image sensing apparatus is disposed at an edge of an operation plane and includes a first sensing array and a second sensing array. The first sensing array is configured to capture images at a first height above the operation plane and accordingly generate a first output signal. The second sensing array is configured to capture images at a second height above the operation plane and accordingly generate a second output signal; wherein the first height is greater than the second height. The processing circuit is electrically connected to the image sensing apparatus and configured to receive the first output signal and the second output signal and accordingly generate a first control command and a second control command, respectively.

Description

    TECHNICAL FIELD
  • The present invention relates to the field of optical operation systems, and more particularly to an optical operation system capable of determining whether an object is in a hover state.
  • BACKGROUND
  • A multi-touch mouse (for example, the EvoMouse) is a type of optical operation system that allows a user to control a computer system with finger gestures. Such an optical operation system can be implemented as a virtual input device, such as a keyboard or mouse, thereby providing a convenient operation interface to users.
  • However, the conventional optical operation system can only determine two-dimensional positions and cannot distinguish between the "touch" and "hover" operations of a sensed object.
  • SUMMARY OF EMBODIMENTS
  • Therefore, one object of the present invention is to provide an optical operation system capable of determining whether or not an object is in a hover state.
  • The present invention provides an optical operation system, which includes an image sensing apparatus and a processing circuit. The image sensing apparatus is disposed at an edge of an operation plane and includes a first sensing array and a second sensing array. The first sensing array is configured to capture images at a first height above the operation plane and accordingly generate a first output signal. The second sensing array is configured to capture images at a second height above the operation plane and accordingly generate a second output signal; wherein the first height is greater than the second height. The processing circuit is electrically connected to the image sensing apparatus and configured to receive the first output signal and the second output signal and accordingly generate a first control command and a second control command, respectively.
  • The present invention further provides an optical operation system, which includes an image sensing apparatus and a processing circuit. The image sensing apparatus is disposed at an edge of an operation plane and includes a first sensing array and a second sensing array. The first sensing array is configured to capture images at a first height above the operation plane and accordingly generate a first output signal. The second sensing array is configured to capture images at a second height above the operation plane and accordingly generate a second output signal; wherein the first height is greater than the second height. The processing circuit is electrically connected to the image sensing apparatus and configured to activate the first sensing array so as to receive the first output signal and determine whether or not to activate the second sensing array according to the first output signal.
  • The present invention still further provides an optical operation system, which includes an image sensing apparatus and a processing circuit. The image sensing apparatus is disposed at an edge of an operation plane and includes a first sensing array and a second sensing array; wherein the first sensing array is arranged above the second sensing array. The first sensing array and the second sensing array are configured to capture images above the operation plane and accordingly generate a first output signal and a second output signal, respectively. The processing circuit is electrically connected to the image sensing apparatus and configured to receive the first output signal and the second output signal and accordingly generate a first control command and a second control command, respectively.
  • In summary, by arranging two sensing arrays in an image sensing apparatus and using the two sensing arrays to respectively capture images at two different heights above an operation plane, the optical operation system according to the present invention can effectively determine whether or not an object above the operation plane is in a hover state according to the images captured by the two sensing arrays.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above embodiments will become more readily apparent to those ordinarily skilled in the art after reviewing the following detailed description and accompanying drawings, in which:
  • FIG. 1 is a schematic top view of an optical operation system in accordance with an embodiment of the present invention;
  • FIG. 2 is a schematic view illustrating the disposing position of the two sensing arrays in the image sensing apparatus shown in FIG. 1; and
  • FIG. 3 is a schematic top view of an optical operation system in accordance with another embodiment of the present invention.
  • DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
  • The disclosure will now be described more specifically with reference to the following embodiments. It is to be noted that the following descriptions of preferred embodiments are presented herein for purpose of illustration and description only. It is not intended to be exhaustive or to be limited to the precise form disclosed.
  • First Embodiment
  • FIG. 1 is a schematic top view of an optical operation system in accordance with an embodiment of the present invention. As shown, the optical operation system in this embodiment includes an image sensing apparatus 108 and a processing circuit 110. The image sensing apparatus 108 is disposed on an edge of an operation plane 112 and configured to sense an object 102 (for example, a user's index finger) on the operation plane 112. In this embodiment, the operation plane 112 is a parallelogram; in a preferred embodiment, it is specifically a rectangle. In addition, the operation plane 112 can be a real plane (for example, a display panel of a displaying apparatus) or a virtual plane.
  • In this embodiment, the image sensing apparatus 108 includes two sensing arrays, as illustrated in FIG. 2, which is a schematic view illustrating the disposing positions of the two sensing arrays in the image sensing apparatus 108. As shown, the image sensing apparatus 108 includes two sensing arrays 108-1, 108-2, each of which is configured to capture images located above the operation plane 112. Specifically, the higher sensing array 108-1 is configured to capture images at a first height relative to the operation plane 112 and generate an output signal S1 according to the captured images, and the lower sensing array 108-2 is configured to capture images at a second height relative to the operation plane 112 and generate an output signal S2 according to the captured images; wherein the first height is greater than the second height. Please refer to FIGS. 1 and 2. The processing circuit 110 is electrically connected to the image sensing apparatus 108 and configured to receive the output signals S1, S2 from the sensing arrays 108-1, 108-2 and accordingly output control commands C1, C2, respectively.
  • Because the two sensing arrays 108-1, 108-2 in the image sensing apparatus 108 are configured to capture images at two different heights relative to the operation plane 112, the processing circuit 110 can determine whether or not the object 102 is in a hover state according to the images captured by the two sensing arrays 108-1, 108-2 (specifically, according to the output signals S1, S2 outputted from the sensing arrays 108-1, 108-2, respectively). In other words, the object 102 is determined to be in a hover state if the object 102 is captured by the sensing array 108-1 only; accordingly, the processing circuit 110 can indicate that the object 102 is in a hover state through the control command C1. It is to be noted that the processing circuit 110 can further use the control command C1 to indicate a movement state of the object 102. For example, instead of indicating a hover state, the control command C1 can be used to indicate a movement state of the object 102 if the object 102 has been in the hover state for a predetermined time.
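The hover/touch decision described above reduces to a simple rule on the two arrays' outputs. The following Python sketch is illustrative only: the boolean detection flags and the function name are assumptions, not part of the patent.

```python
def classify_object(seen_by_upper: bool, seen_by_lower: bool) -> str:
    """Classify an object from the two sensing arrays' outputs.

    The upper array (108-1) images a plane at the greater height;
    the lower array (108-2) images a plane near the operation plane.
    """
    if seen_by_upper and not seen_by_lower:
        return "hover"   # object intersects only the upper imaging plane
    if seen_by_lower:
        return "touch"   # object reaches down to the lower imaging plane
    return "none"        # no object sensed by either array

print(classify_object(True, False))  # hover
print(classify_object(True, True))   # touch
```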
  • Additionally, to provide a power-saving feature, the processing circuit 110 can be configured to activate the sensing array 108-2 only when specific object information (for example, some specific information associated with the object 102) is delivered in the output signal S1 outputted from the sensing array 108-1. In other words, the processing circuit 110 may activate the sensing array 108-1 first, and then determine whether or not to activate the sensing array 108-2 according to the output signal S1 outputted from the sensing array 108-1. Therefore, the sensing array 108-1 only needs to determine whether the object 102 is sensed; there is no need for the sensing array 108-1 to determine the coordinates of the object 102.
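The power-saving scheme above can be sketched as follows. The `SensingArray` class and its methods are hypothetical stand-ins for the hardware interface, assumed here for illustration.

```python
class SensingArray:
    """Hypothetical stand-in for one sensing array's power control."""
    def __init__(self, name: str):
        self.name = name
        self.active = False

    def activate(self):
        self.active = True

    def deactivate(self):
        self.active = False

def update_power_state(upper: SensingArray, lower: SensingArray,
                       object_in_s1: bool) -> None:
    """Activate the lower array only when S1 carries object information."""
    upper.activate()          # the upper array is always running first
    if object_in_s1:
        lower.activate()      # object present: enable the lower array
    else:
        lower.deactivate()    # idle: keep the lower array powered down

upper = SensingArray("108-1")
lower = SensingArray("108-2")
update_power_state(upper, lower, object_in_s1=False)
print(lower.active)  # False
update_power_state(upper, lower, object_in_s1=True)
print(lower.active)  # True
```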
  • It is to be noted that the sensing arrays 108-1, 108-2 can both be integrated on a single sensor chip; alternatively, the sensing arrays 108-1, 108-2 can be implemented on two individual sensor chips, respectively. Moreover, it is to be noted that the sensing arrays 108-1, 108-2 can be respectively disposed at two different positions on an edge of the operation plane 112. In addition, the optical operation system of the present invention can be a touch system or a handwriting system.
  • Based on the aforementioned descriptions, it is understood that the optical operation system in the present invention is not limited to one image sensing apparatus and one processing circuit. In other words, the optical operation system in another embodiment can be implemented by more than one image sensing apparatus and more than one processing circuit, as illustrated in FIG. 3.
  • FIG. 3 is a schematic top view of an optical operation system in accordance with another embodiment of the present invention. As shown, the optical operation system in this embodiment includes, in addition to a pair consisting of an image sensing apparatus 308 and a corresponding processing circuit 310, another pair consisting of an image sensing apparatus 328 and a corresponding processing circuit 330. The image sensing apparatuses 308, 328 are each disposed on an edge of an operation plane 312 and configured to individually sense an object 302 on the operation plane 312. In this embodiment, the image sensing apparatuses 308, 328 each include two sensing arrays, arranged in the same manner as illustrated in FIG. 2. The processing circuit 310 is electrically connected to the image sensing apparatus 308 and configured to receive the two output signals from its two sensing arrays and accordingly output the control commands C1, C2, respectively. The processing circuit 330 is electrically connected to the image sensing apparatus 328 and configured to receive the two output signals from its two sensing arrays and accordingly output the control commands C3, C4, respectively.
  • In this embodiment, the processing circuit 310 is configured to activate the higher sensing array in the image sensing apparatus 308 first, and then determine whether or not to activate the lower sensing array in the image sensing apparatus 308 according to the output signal outputted from the higher sensing array. In addition, while activating the lower sensing array in the image sensing apparatus 308, the processing circuit 310 is configured to issue a trigger signal TS to the processing circuit 330 so as to control the processing circuit 330 to simultaneously activate the two sensing arrays in the image sensing apparatus 328. It is to be noted that the image sensing apparatus 328 may include the lower sensing array only, which cooperates with the lower sensing array in the image sensing apparatus 308 to sense a specific movement (for example, a two-dimensional movement) of the object 302 on the operation plane 312. It is understood that the control of the sensing array(s) in the image sensing apparatus 328 is similar to that in the image sensing apparatus 308, and redundant details are not repeated here.
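The trigger-signal coordination between the two processing circuits can be sketched as below, assuming a simple callback wiring; the class and attribute names are illustrative, not from the patent.

```python
class ProcessingCircuit:
    """Hypothetical model of one processing circuit's activation state."""
    def __init__(self, name: str):
        self.name = name
        self.upper_active = False
        self.lower_active = False
        self.peer = None  # peer circuit that receives the trigger signal TS

    def on_trigger(self):
        # Receiving TS activates both sensing arrays simultaneously.
        self.upper_active = True
        self.lower_active = True

    def activate_lower(self):
        # Activating the lower array also issues TS to the peer circuit.
        self.lower_active = True
        if self.peer is not None:
            self.peer.on_trigger()

pc310 = ProcessingCircuit("310")
pc330 = ProcessingCircuit("330")
pc310.peer = pc330
pc310.upper_active = True   # the higher array was activated first
pc310.activate_lower()      # object detected: wake the peer's arrays too
print(pc330.upper_active, pc330.lower_active)  # True True
```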
  • In summary, by arranging two sensing arrays in an image sensing apparatus and using the two sensing arrays to respectively capture images at two different heights above an operation plane, the optical operation system according to the present invention can effectively determine whether or not an object above the operation plane is in a hover state according to the images captured by the two sensing arrays. In addition, it is to be noted that the optical operation system according to the present invention can also be used in a one-camera touch sensing system. In other words, an object's coordinates can be determined from a real image and a mirror image if only one camera is employed together with one mirror. Moreover, it is understood that hover state detection can still be performed if the camera has two sensing arrays as disclosed in the aforementioned descriptions.
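The one-camera-plus-mirror positioning mentioned above can be illustrated with a ray-intersection sketch: the mirror image is equivalent to a view from a virtual camera reflected across the mirror line, so the object lies at the intersection of the two viewing rays. The geometry below (camera at the origin, mirror along the line y = H) is an assumed layout for illustration, not taken from the patent.

```python
def intersect_rays(p1, d1, p2, d2):
    """Intersect 2D rays p1 + t*d1 and p2 + s*d2 via Cramer's rule."""
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    bx, by = p2[0] - p1[0], p2[1] - p1[1]
    t = (bx * (-d2[1]) - (-d2[0]) * by) / det
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

H = 1.0                    # mirror line y = H (assumed layout)
obj = (0.6, 0.4)           # ground-truth object position, for illustration
# Ray seen directly by the camera at (0, 0):
d_real = obj
# The mirror image is seen along a ray from the virtual camera at (0, 2H),
# i.e. the reflection of the real camera across the mirror line:
virt = (0.0, 2 * H)
d_mirror = (obj[0] - virt[0], obj[1] - virt[1])
x, y = intersect_rays((0.0, 0.0), d_real, virt, d_mirror)
print(round(x, 6), round(y, 6))  # recovers (0.6, 0.4)
```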
  • While the disclosure has been described in terms of what is presently considered to be the most practical and preferred embodiments, it is to be understood that the disclosure need not be limited to the disclosed embodiments. On the contrary, it is intended to cover various modifications and similar arrangements included within the spirit and scope of the appended claims, which are to be accorded the broadest interpretation so as to encompass all such modifications and similar structures.

Claims (27)

What is claimed is:
1. An optical operation system, comprising:
an image sensing apparatus disposed at an edge of an operation plane and comprising a first sensing array and a second sensing array, wherein the first sensing array is configured to capture images at a first height above the operation plane and accordingly generate a first output signal, the second sensing array is configured to capture images at a second height above the operation plane and accordingly generate a second output signal, and the first height is greater than the second height; and
a processing circuit electrically connected to the image sensing apparatus and configured to receive the first output signal and the second output signal and accordingly generate a first control command and a second control command, respectively.
2. The optical operation system according to claim 1, wherein the optical operation system is a touch system.
3. The optical operation system according to claim 1, wherein the optical operation system is a handwriting system.
4. The optical operation system according to claim 1, wherein the first sensing array and the second sensing array are integrated on a single sensor chip.
5. The optical operation system according to claim 1, wherein the first sensing array and the second sensing array are implemented on two individual sensor chips, respectively.
6. The optical operation system according to claim 1, wherein the first sensing array and the second sensing array are respectively disposed at two different positions on an edge of the operation plane.
7. The optical operation system according to claim 1, wherein the second sensing array is activated if the first output signal contains object information.
8. The optical operation system according to claim 1, wherein the first control command is for indicating that an object is in a hover state.
9. The optical operation system according to claim 1, wherein the first control command is for indicating a movement state of an object.
10. An optical operation system, comprising:
an image sensing apparatus disposed at an edge of an operation plane and comprising a first sensing array and a second sensing array, wherein the first sensing array is configured to capture images at a first height above the operation plane and accordingly generate a first output signal, the second sensing array is configured to capture images at a second height above the operation plane and accordingly generate a second output signal, and the first height is greater than the second height; and
a processing circuit electrically connected to the image sensing apparatus and configured to activate the first sensing array so as to receive the first output signal and determine whether or not to activate the second sensing array according to the first output signal.
11. The optical operation system according to claim 10, wherein the optical operation system is a touch system.
12. The optical operation system according to claim 10, wherein the optical operation system is a handwriting system.
13. The optical operation system according to claim 10, wherein the first sensing array and the second sensing array are integrated on a single sensor chip.
14. The optical operation system according to claim 10, wherein the first sensing array and the second sensing array are implemented on two individual sensor chips, respectively.
15. The optical operation system according to claim 10, wherein the first sensing array and the second sensing array are respectively disposed at two different positions on an edge of the operation plane.
16. The optical operation system according to claim 10, wherein the second sensing array is activated if the first output signal contains object information.
17. The optical operation system according to claim 10, wherein the processing circuit is further configured to output a first control command, for indicating that an object is in a hover state, in response to the first output signal.
18. The optical operation system according to claim 10, wherein the processing circuit is further configured to output a first control command, for indicating a movement state of an object, in response to the first output signal.
19. An optical operation system, comprising:
an image sensing apparatus disposed at an edge of an operation plane and comprising a first sensing array and a second sensing array, wherein the first sensing array is arranged above the second sensing array, the first sensing array and the second sensing array are configured to capture images above the operation plane and accordingly generate a first output signal and a second output signal, respectively; and
a processing circuit electrically connected to the image sensing apparatus and configured to receive the first output signal and the second output signal and accordingly generate a first control command and a second control command, respectively.
20. The optical operation system according to claim 19, wherein the optical operation system is a touch system.
21. The optical operation system according to claim 19, wherein the optical operation system is a handwriting system.
22. The optical operation system according to claim 19, wherein the first sensing array and the second sensing array are integrated on a single sensor chip.
23. The optical operation system according to claim 19, wherein the first sensing array and the second sensing array are implemented on two individual sensor chips, respectively.
24. The optical operation system according to claim 19, wherein the first sensing array and the second sensing array are respectively disposed at two different positions on an edge of the operation plane.
25. The optical operation system according to claim 19, wherein the second sensing array is activated if the first output signal contains object information.
26. The optical operation system according to claim 19, wherein the first control command is for indicating that an object is in a hover state.
27. The optical operation system according to claim 19, wherein the first control command is for indicating a movement state of an object.
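The staged activation the claims describe (an upper array that detects hovering objects and conditionally wakes a lower, near-surface array) can be sketched as a toy model. Everything below — the `SensingArray` and `ProcessingCircuit` names, the heights, and the `HOVER`/`TOUCH` command strings — is an illustrative assumption, not language or an API from the patent itself.

```python
class SensingArray:
    """Toy model of one sensing array capturing images at a fixed height."""

    def __init__(self, height_mm):
        self.height_mm = height_mm

    def capture(self, object_height_mm):
        # The output signal "contains object information" when an object
        # is at or below this array's sensing plane (hypothetical model).
        return object_height_mm is not None and object_height_mm <= self.height_mm


class ProcessingCircuit:
    """Activates the first (upper) array, then conditionally the second."""

    def __init__(self, first_array, second_array):
        self.first = first_array    # greater height: sees hovering objects
        self.second = second_array  # lesser height: sees near-touch objects

    def scan(self, object_height_mm):
        commands = []
        first_signal = self.first.capture(object_height_mm)
        if first_signal:
            # Claims 8 and 17: an object crossing only the upper plane
            # yields a command indicating a hover state.
            commands.append("HOVER")
            # Claims 7 and 16: the second array is activated only when the
            # first output signal contains object information.
            if self.second.capture(object_height_mm):
                commands.append("TOUCH")
        return commands


circuit = ProcessingCircuit(SensingArray(height_mm=20), SensingArray(height_mm=2))
print(circuit.scan(None))  # -> []  (no object detected)
print(circuit.scan(10))    # -> ['HOVER']  (between the two sensing planes)
print(circuit.scan(1))     # -> ['HOVER', 'TOUCH']  (near the operation plane)
```

The design point the claims emphasize is power and processing economy: the lower array stays idle until the upper array's output signal actually contains object information.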
US13/586,823 2012-04-10 2012-08-15 Optical operation system Abandoned US20130265283A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TW101112664A TW201342137A (en) 2012-04-10 2012-04-10 Optical operation system
TW101112664 2012-04-10

Publications (1)

Publication Number Publication Date
US20130265283A1 true US20130265283A1 (en) 2013-10-10

Family

ID=49291914

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/586,823 Abandoned US20130265283A1 (en) 2012-04-10 2012-08-15 Optical operation system

Country Status (2)

Country Link
US (1) US20130265283A1 (en)
TW (1) TW201342137A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015165620A1 (en) * 2014-04-28 2015-11-05 Robert Bosch Gmbh Electrical device and method for operating an electrical device
US20160139735A1 (en) * 2014-11-14 2016-05-19 Quanta Computer Inc. Optical touch screen


Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6594023B1 (en) * 1999-09-10 2003-07-15 Ricoh Company, Ltd. Coordinate inputting/detecting apparatus, method and computer program product designed to precisely recognize a designating state of a designating device designating a position
US7342574B1 (en) * 1999-10-29 2008-03-11 Ricoh Company, Ltd. Method and apparatus for inputting information including coordinate data
US20050024324A1 (en) * 2000-02-11 2005-02-03 Carlo Tomasi Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device
US7629967B2 (en) * 2003-02-14 2009-12-08 Next Holdings Limited Touch screen signal processing
US20100090985A1 (en) * 2003-02-14 2010-04-15 Next Holdings Limited Touch screen signal processing
US20100097353A1 (en) * 2003-02-14 2010-04-22 Next Holdings Limited Touch screen signal processing
US20100207911A1 * 2003-02-14 2010-08-19 Next Holdings Limited Touch screen signal processing with single-point calibration
US6947032B2 (en) * 2003-03-11 2005-09-20 Smart Technologies Inc. Touch system and method for determining pointer contacts on a touch surface
US7557935B2 (en) * 2003-05-19 2009-07-07 Itzhak Baruch Optical coordinate input device comprising few elements
US20080068352A1 (en) * 2004-02-17 2008-03-20 Smart Technologies Inc. Apparatus for detecting a pointer within a region of interest
US20090144668A1 (en) * 2007-12-03 2009-06-04 Tse-Hsien Yeh Sensing apparatus and operating method thereof
US8451252B2 (en) * 2008-04-30 2013-05-28 Beijing Intouch Co., Ltd. Image sensor for touch screen and image sensing apparatus
US20100020201A1 (en) * 2008-07-23 2010-01-28 Pixart Imaging Inc. Sensor array module with wide angle, and image calibration method, operation method and application for the same
US20100245289A1 (en) * 2009-03-31 2010-09-30 Miroslav Svajda Apparatus and method for optical proximity sensing and touch input control
US20110050644A1 (en) * 2009-08-28 2011-03-03 Pixart Imaging Inc. Touch system and pointer coordinate detection method therefor
TW201122966A (en) * 2009-12-31 2011-07-01 Acer Inc Touch-sensed controlled monitor system


Also Published As

Publication number Publication date
TW201342137A (en) 2013-10-16

Similar Documents

Publication Publication Date Title
US11886699B2 (en) Selective rejection of touch contacts in an edge region of a touch surface
US9317130B2 (en) Visual feedback by identifying anatomical features of a hand
US9507521B2 (en) Input apparatus, input mode switching method and computer apparatus
KR102331888B1 (en) Conductive trace routing for display and bezel sensors
US20140189579A1 (en) System and method for controlling zooming and/or scrolling
US20150268789A1 (en) Method for preventing accidentally triggering edge swipe gesture and gesture triggering
US20120113044A1 (en) Multi-Sensor Device
US9454260B2 (en) System and method for enabling multi-display input
US20140043265A1 (en) System and method for detecting and interpreting on and off-screen gestures
US20100328351A1 (en) User interface
CN102341814A (en) Gesture recognition method and interactive input system employing same
KR20160023298A (en) Electronic device and method for providing input interface thereof
JP2011503709A (en) Gesture detection for digitizer
US9639167B2 (en) Control method of electronic apparatus having non-contact gesture sensitive region
US20160202840A1 (en) Electronic device, control method for the same, and non-transitory computer-readable storage medium
US20130257809A1 (en) Optical touch sensing apparatus
KR200477008Y1 (en) Smart phone with mouse module
CN104978018B (en) Touch system and touch method
US20130113728A1 (en) Single-point-multi-finger gestures for touch panel
US20130265283A1 (en) Optical operation system
US20140210746A1 (en) Display device and method for adjusting display orientation using the same
US10338726B2 (en) Mobile device and method of distinguishing between different touch forces
Krithikaa Touch screen technology–a review
KR100969927B1 (en) Apparatus for touchless interactive display with user orientation
CN110162257A (en) Multiconductor touch control method, device, equipment and computer readable storage medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: PIXART IMAGING INC., TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LU, CHUN-YI;PENG, YUAN-YU;LIN, YU-CHIA;AND OTHERS;REEL/FRAME:028794/0246

Effective date: 20120803

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION