EP3326052A1 - Apparatus and method for detecting gestures on a touchpad - Google Patents

Apparatus and method for detecting gestures on a touchpad

Info

Publication number
EP3326052A1
Authority
EP
European Patent Office
Prior art keywords
touchpad
user
touch
proximity
gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP16741039.8A
Other languages
German (de)
French (fr)
Inventor
Søren Borup Jensen
Finn Ejlersen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bang and Olufsen AS
Original Assignee
Bang and Olufsen AS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bang and Olufsen AS filed Critical Bang and Olufsen AS
Publication of EP3326052A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547Touch pads, in which fingers can move on a surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/04166Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04108Touchless 2D- digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface without distance measurement in the Z direction

Definitions

  • the present invention relates to a touchpad and interpretation of gestures performed on the touchpad or in close proximity thereto.
  • Recent applications include systems that may detect position of objects in proximity fields around an apparatus, also called “proximity sensing”.
  • A system is disclosed by Synaptics Inc., as published in US patent application number US 2007/0262951.
  • US patent application number US2011/0279397 discloses a monitoring unit for monitoring a hand or finger in three dimensions in the vicinity of a touch screen such that the monitoring unit is working in contact mode or in contact-less mode.
  • Other publications disclosing user interfaces, for example interpreting gestures, comprise US patent US8830181 and US patent applications US2008/0168403, US2010/0245289, and US2012/0162073.
  • a gesture for example a swiping action with a finger
  • a gesture is interpreted according to a predetermined orientation and location of the user relative to the detector.
  • the gesture has to be performed by the user in relation to the actual orientation of the user interface of the touchpad relative to the user.
  • symbols or text need to be printed or applied otherwise to the surface of the detector to ensure the correct orientation of the detector relative to the user. This requires attention by the user and limits user-friendliness. It would be desirable to increase the user-friendliness of touchpads and to remove the need for symbols or text on the surface of the detector.
  • the present invention provides an apparatus and a method for detecting user-supplied control commands given via gestures on a touch-sensitive surface, also called touch- pad, of the apparatus, for example multimedia apparatus, AV system, loudspeaker, remote control or media player.
  • the touchpad comprises a proximity detection system for detecting the location of an object, for example a user's finger, in the proximity of the touchpad, and for detecting a movement performed with the object by the user in the proximity of the touchpad.
  • the apparatus further comprises a touch detection system for detecting contact by said object with the surface of the touchpad and for detecting a gesture performed with the object by the user while the object is in contact with the touchpad.
  • the function of the touch detection system is combined with the function of the proximity detection system, where the latter detects the presence of a finger or hand or pointer of a user before, during or after the gesture on the touchpad in order to determine the location of the user relative to the touchpad.
  • This information is used to interpret the intended direction of the gesture. For example, when a user swipes a finger across the touchpad, the line of the swipe is calculated from the touch data, and the position of the user is calculated from the related proximity data. Thus, it is possible to determine if the user is swiping right or left as seen from the user's own perspective. For example, when the finger moves across the surface of the touchpad along a path being linear or curved and performed like a swipe from one position to another position, the left or right orientation of said object movement is interpreted to be left or right according to the actual user position in front of the omnidirectional touchpad.
  • the movement is interpreted as being a swipe that is directed to the right, also called a right-swipe, not only if the user is on one side of the touchpad but also if the user is located on an opposite side of the touchpad.
  • the system detects the location of the user relative to the touchpad and adjusts the gesture interpretation accordingly. This is in contrast to the prior art, where the user interface has to be oriented correctly, relative to the user, or the user has to adjust the gesture, for example swipe, to match the direction of the user interface. Also, in the invention, there is no need for symbols or text on the surface of the user interface. Typical applications are operation of multimedia apparatus, AV systems, loudspeakers, media players, remote controls and similar equipment.
  • proximity sensing of a finger or hand is done by a capacitive sensor or by a light beam emitter in combination with a sensor that detects reflected light beams.
  • the position of the person relative to the apparatus is sensed by a reflected light beam, especially an infrared (IR) reflected light beam.
  • Detection of gestures is performed in a 3-dimensional space around the apparatus and by touching directly on the apparatus.
  • Dedicated means are used for detecting the position of objects, for example a finger, a hand, or a pointer device, located at a distance close to the apparatus and in direct physical contact with the apparatus.
  • the invention has properties making it suitable for mounting on printed circuit boards and improving the quality of the detection system such that user-supplied commands may certainly and unambiguously be detected, interpreted and executed as related functions in a given apparatus.
  • the user-direction is found relative to a predetermined orientation of the touchpad, and the orientation of the detected gesture on the touchpad is adjusted according to this difference prior to interpreting the gesture with respect to a gesture-associated command and executing the command.
  • this can be achieved by detecting a proximity-position P of an object in close proximity to the touchpad, detecting a touch-position T of the object while in contact with the touchpad, and from the proximity-position P and the touch-position T determining a user-direction towards the user.
  • a sequence of touch positions T can be used, for example in the case of a swipe.
  • the proximity-position P and/or the touch-position T are averaged positions, for example achieved by a weighted averaging, which in the following are called the dominant proximity-position P and the dominant touch-position T.
  • the method comprises
  • the proximity detection system detecting a movement of the object in close proximity to the touchpad and averaging this movement to a dominant proximity-position P
  • the method also contains the step of detecting a gesture of the object by the touch detection system while in contact with the touchpad and averaging the gesture to a dominant touch-position T.
  • the detected proximity movement of the object and the gesture on the touchpad are translated to coordinate sequences in a pre-defined coordinate system; in this case, a practical embodiment of the method comprises
  • the method comprises adjusting the orientation of the gesture with respect to right-to-left or left-to-right direction prior to interpreting the gesture.
  • the apparatus determines whether the swipe-path is a left-swipe or right-swipe depending on the determined user-direction in the coordinate-system.
  • the proximity detection system comprises a plurality of proximity sensors organized along an outer perimeter of the touchpad, optionally a circular touchpad.
  • the touch detection system comprises a plurality of touch sensors organized on the surface of the touchpad and surrounded by the proximity sensors.
  • the proximity detection system comprises a plurality of infrared light emitters and a plurality of infrared light receivers, and said receivers are configured for measuring a background level of infrared light and correcting infrared proximity signals by subtracting the background level.
  • the plurality of infrared light emitters are configured for one emitter of the plurality of emitters being active at a time, and wherein an electronic control circuit is configured for receiving a separate set of proximity signals from each one of the plurality of infrared receivers, for every subsequent activation of each further infrared emitter of the plurality of emitters.
  • the method comprises receiving the proximity signals from two infrared receivers, one on either side of the corresponding emitter, for every subsequent activation of each further infrared emitter of the plurality of emitters.
  • the apparatus comprises an input device having a primary surface in an X,Y plane.
  • a plurality of capacitive means operable to generate a plurality of electric fields, wherein at least two of said capacitive means are positioned on the X,Y plane of said surface.
  • At least one infrared transmitter means and at least two infrared receiver means are positioned on said surface and are configured to issue IR light beams, primarily orthogonally out from the said surface, and receive IR light beams caused by the reflection from the object, for example a finger, of a user, primarily above and orthogonally to the X,Y plane of said surface, wherein the method comprises the steps of:
  • An aspect of the invention is an omnidirectional touchpad, integrated into an apparatus, enabled to detect user-given commands and to determine if the user is making a right or left swipe gesture independent of where the user is positioned relative to the touchpad.
  • the omnidirectional touchpad is configured with means to detect the proximity of an object, e.g. a user finger, and is configured with means to detect the touch pressure by said object onto the surface of the omnidirectional touchpad, this characterized by:
  • a first X,Y position of the object is determined as sensed by the proximity means, and validated accordingly,
  • o a second X,Y position of the object is determined as sensed by the touch means and based on the values of the first X,Y position,
  • o a third X,Y position of the object is determined as sensed by the touch means, and validated accordingly,
  • o a resulting X,Y is calculated based on the second X,Y position/value and the third X,Y position/value.
  • a further aspect of the invention is:
  • An even further aspect of the invention is an omnidirectional touchpad, integrated into an apparatus, enabled to detect user-given commands and to determine if the user is making a right or left swipe gesture independent of where the user is positioned relative to the touchpad.
  • the omnidirectional touchpad is configured with means to detect the proximity of an object, e.g. a user finger or hand, and is configured with means to detect the touch/pressure by said object onto the surface of the omnidirectional touchpad, this characterized by:
  • a first X,Y position of the object is determined as sensed by the proximity means, and validated accordingly,
  • a second, third and subsequent X,Y positions of the object are determined as sensed by the touch means and validated accordingly, until no further user interaction is detected by the touch means,
  • a second, third and subsequent X,Y positions of the object are determined as sensed by the proximity means and validated accordingly, until no further user interaction is detected by the proximity means; a resulting dominant X,Y touch position or a resulting X,Y touch swipe vector is calculated relative to the fixed orthogonal X,Y coordinate system, based on the sequence of detected touch X,Y values; a resulting dominant X,Y proximity position of the user's hand or finger is calculated relative to the fixed orthogonal X,Y coordinate system, based on the sequence of detected proximity X,Y values; a corrected dominant X,Y touch position or a corrected X,Y swipe vector is calculated relative to an orthogonal X,Y coordinate system rotated towards the user,
  • a command corresponding to the resulting corrected dominant X,Y touch position or the resulting corrected X,Y touch vector is interpreted by the apparatus and executed accordingly and
  • one or more of the capacitive means are divided into two or more segments, which are individually receptive to user input commands, and wherein the method includes the step of determining at which segment said user input command is detected.
  • said input device comprises a substantially planar body being integrated into an apparatus, e.g. a media player, or alternatively the means are configured as a standalone product, e.g. a remote controller, smartphone, tablet or the like.
  • the touch commands may be input on different adjacent surfaces of a three-dimensional object; such a system allows for a greater combination or arrangement of acceptable input commands.
  • At least one of said infrared means is operable to generate an infrared light beam field substantially in front of said primary surface, wherein the method comprises the step of detecting that an object is moved into said at least one infrared light beam field and/or moved out of said at least one infrared light beam field or moved within said at least one infrared light beam field, wherein said step of generating a control command is based in part on said detection.
  • at least one of said infrared means is operable to detect an infrared light beam field substantially in front of said primary surface.
  • object may refer to an object or token held by a user, for example a pointer device, or may also refer to at least a portion of the user's body detected by the system, e.g. a finger.
  • As detection can be made based on the movement of an object relative to a proximity field, this allows commands to be entered by the user without touching the input device.
  • different gestures may be interpretable as different user commands.
  • different commands may be generated based on how close the user/object is to the input device, e.g. a display image may be adjusted based on the user's proximity to the display.
  • the method comprises the step of detecting that an object touches a touch sensitive field, and generating a related action.
  • a touch-based input apparatus comprising:
  • said control unit is operable to detect a user command in the form of a touch command or a non-touch command by a gesture remote from the touchpad surface and to generate a control command based on the user command detected.
  • one or more of said capacitive means are divided into a plurality of segments individually receptive to user input commands, and wherein said control unit is operable to determine at which segment said user input command is detected.
  • alternatively to the capacitive means, a resistive-based or other touch system may be applied.
  • said apparatus is configured with a substantially planar body selected from one of the following materials: a glass panel, a plastic panel, or a combination of the two materials.
  • said control unit is operable to detect a touch command applied directly to the surface of said substantially planar body.
  • two or more of the capacitive means are positioned in the same X,Y plane, each disposed along a line and mutually in parallel along the X-axis or along the Y-axis; or alternatively arranged within two or more concentric circles.
  • two or more of the infrared means are positioned in the same X,Y plane, each disposed along a line arranged within two or more concentric circles.
  • one or more of the infrared means are divided into two or more segments, which are individually receptive to active input signals.
  • one or more of the infrared means are configured in one or more pairs, a pair including at least one IR sender and one IR receiver and/or one IR transceiver.
  • the infrared (IR) emitting means and reception means detect user-supplied control commands issued in a remote field at a distance from the apparatus which is within the defined proximity field distance.
  • the invention comprises use of any suitable capacitive sensor technology, e.g. surface capacitance, projected capacitive touch, etc.
  • the invention operates with a number of functional properties:
  • the invention operates with a number of control commands executed in the apparatus, control commands related to the detected user-supplied commands, and examples are, but not limited to:
  • the surface of the device does not have an x-y orientation as such; thus, the L/R commands given as above are relative to the user's position in front of the device to be controlled, and with the user finger at any position along the outer perimeter of the top surface of the device.
  • the device type of the invention is defined to be an Omnidirectional Touchpad.
  • An omnidirectional touchpad, integrated into an apparatus, is enabled to detect user-given commands and to determine if the user is making a right or left swipe gesture independent of where the user is positioned relative to the touchpad; the omnidirectional touchpad is configured with means to detect the proximity of an object, e.g. a user finger, and is configured with means to detect the touch pressure by said object onto the surface of the omnidirectional touchpad, this characterized by:
  • a second X,Y position of the object is determined as sensed by the touch means and based on the values of the first X,Y position.
  • a resulting X,Y is calculated based on the second X,Y position/value and the third X,Y position/value.
  • a command corresponding to the resulting X,Y is interpreted by the apparatus and executed accordingly.
  • the object moves across the surface of the touchpad along a path being linear or curved and performed like a swipe from one X,Y position to another X,Y position, and
  • a start vector is initialized and oriented from one point P in a detected proximity X,Y position to another point T in a detected touch X,Y position, and
  • a first movement vector is initialized and oriented from one point T in a detected touch X,Y position to another point T in another detected touch X,Y position, and
  • a second movement vector is initialized and oriented from one point P in a detected proximity X,Y position to another point P in another detected proximity X,Y position, and
  • the first movement vector is substantially parallel with the second movement vector.
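
As an illustration only (the patent gives no implementation), the "substantially parallel" condition between the touch-derived and proximity-derived movement vectors can be checked with a simple angular tolerance; the tolerance value and all names below are assumptions:

    import math

    PARALLEL_TOLERANCE_DEG = 15.0  # assumed bound on "substantially parallel"

    def substantially_parallel(v1, v2):
        """True if two movement vectors (dx, dy) point in substantially
        the same direction, within the assumed angular tolerance."""
        diff = math.degrees(abs(math.atan2(v1[1], v1[0])
                                - math.atan2(v2[1], v2[0]))) % 360.0
        diff = min(diff, 360.0 - diff)  # wrap to the shorter arc
        return diff <= PARALLEL_TOLERANCE_DEG
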
  • ASPECT 5. An omnidirectional touchpad according to any aspect above, where the proximity means are organized on or along an outer perimeter of the omnidirectional touchpad.
  • ASPECT 6. An omnidirectional touchpad according to aspect 5, where the touch means are organized on the surface of the omnidirectional touchpad, and the touch means surrounded by the proximity means.
  • ASPECT 7. An omnidirectional touchpad according to aspect 6, where the touch means are based on capacitive means, or resistive means, or a combination of the two.
  • ASPECT 8. An omnidirectional touchpad according to aspect 7, where the proximity means are based on capacitive means, or light means, infrared or laser, or a combination of the two.
  • ASPECT 9. An omnidirectional touchpad according to aspect 8, where proximity detection is implemented by one or more light emitters and a number of light receivers, and said receivers detect if an object is in proximity.
  • ASPECT 10. An omnidirectional touchpad according to aspect 9, where one emitter is active at a time and thus the electronic control circuit gets a separate set of proximity signals from each receiver for every subsequent emitter activation. ASPECT 11. An omnidirectional touchpad according to aspect 10, where the emitters and receivers closest to the object give the highest signal.
  • ASPECT 12. An omnidirectional touchpad according to aspect 11, where the touch area is:
  • each conductive pad is connected to the input of a capacitance-to-digital converter (CDC), and the digital signals are fed into a microprocessor (μP).
  • ASPECT 13. An omnidirectional touchpad according to aspect 12, where the sensing means, including the touch area and the proximity detectors, are scanned at a relatively high rate (50-100 Hz) and all data is continuously processed by the μP.
  • Figure 2a shows the layout of the touchpad, and Figure 2b illustrates definitions of directions
  • Figure 3 shows a block diagram of the electronics in the detection means
  • Figures 4 and 5 show the layout of the touchpad and the reflection caused by an object
  • Figure 6 shows an alternative layout of the detection means
  • Figure 7 displays principles of command detection
  • the omnidirectional touchpad is primarily intended to be positioned in the horizontal plane.
  • the omnidirectional touchpad can detect whether the user is making a right or left swipe gesture as seen from the user's own perspective, independent of where the user is positioned relative to the touch pad (see Figure 1).
  • the omnidirectional touchpad therefore does not require any print of logos to indicate a certain swipe direction or touch area. This is a great advantage as compared to user interfaces where the user either has to perform the swiping action from a certain location or the user has to adjust the direction relative to the orientation of the user interface of the touchpad, which could possibly be upside-down for the user from the specific user position. Also, this eliminates constraints, especially for circular touchpads. If the omnidirectional touchpad is mounted vertically (e.g. on the wall), this property allows for simple mounting without the need for ensuring a certain orientation.
  • the omnidirectional touchpad is realized by a combination of a touch area and a number of proximity detectors placed around the perimeter of the touch area as shown in Figure 2a.
  • the number of IR emitters and receivers in the illustrated case is three, but a different number of emitters and receiver pairs is possible, such as two or four of each.
  • One possible implementation of proximity detection is by means of IR technology.
  • an IR proximity detection system consists of one IR emitter and one IR receiver. If a hand or finger is in proximity, the emitted IR light is reflected and the IR receiver can detect this reflection. The closer the hand is, the higher the reflection.
  • background IR radiation will be present, for example due to sunlight or artificial IR sources in the room. Therefore, the proximity detection method needs to cancel out the contribution of such background IR radiation.
  • the emitters and receivers closest to the hand or finger will give the highest signal.
  • the μP can calculate from which side the hand is approaching.
  • With receiver signals S1, S2, S3, ..., Sn, the criteria for a proximity sensing related to an X,Y position is:
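
A minimal sketch of one such scan cycle is given below, assuming hypothetical driver objects with on()/off()/read() methods; the patent itself only states that one emitter is active at a time and that the background level is subtracted:

    def scan_proximity(emitters, receivers):
        """One time-multiplexed IR scan: returns a background-corrected
        reflection signal per (emitter, receiver) pair, the S1..Sn values."""
        signals = []
        for emitter in emitters:
            background = [rx.read() for rx in receivers]  # all emitters off: ambient only
            emitter.on()
            total = [rx.read() for rx in receivers]       # ambient + reflected
            emitter.off()
            # "total" minus "background" leaves the reflection from the object.
            signals.append([t - b for t, b in zip(total, background)])
        return signals
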
  • Figure 2b illustrates further details of a proximity detection system.
  • the term “background” is used instead of the term “ambient”, and the term “total” is used as a substitution for the term “ambient+reflected”.
  • the proximity detection system based on the IR emitter and IR receiver means is used to determine the position of the touching object (finger or hand) relative to the touch swipe or touch position on the omnidirectional touchpad.
  • the proximity detection system continuously and repeatedly calculates a dominant X,Y position of the touching object, typically a finger or hand of the user.
  • the procedure for calculating the dominant X,Y can be as follows.
  • one position vector is found, representing the dominant position of the object (hand or finger), relative to the orthogonal X,Y coordinate system.
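
A sketch of one plausible calculation follows, reading the dominant position as the signal-weighted average over the perimeter sensors (weighted averaging is mentioned above; the concrete formula and names are assumptions):

    def dominant_position(sensor_positions, signals):
        """Signal-weighted average of the perimeter sensor coordinates,
        giving one position vector for the hand or finger in X,Y.

        sensor_positions: [(x, y), ...] fixed sensor locations on the perimeter.
        signals: background-corrected reflection strengths, same order.
        """
        total = sum(signals)
        if total <= 0:
            return None  # no object in proximity
        x = sum(px * s for (px, _), s in zip(sensor_positions, signals)) / total
        y = sum(py * s for (_, py), s in zip(sensor_positions, signals)) / total
        return (x, y)
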
  • An alternative technology for the proximity detection could be capacitive proximity technology.
  • capacitive proximity technology is very sensitive to common mode noise entering the product (through e.g. the mains cord). If the common mode noise is low, capacitive proximity may work, but in many cases the common mode noise will interfere with the proximity signal, causing the detection to be unreliable.
  • IR proximity is not sensitive to common mode noise and therefore IR proximity is in some cases preferred for a reliable detection.
  • Suitable proximity detection technologies without this drawback are also, for example: ultrasonic sound or RF/radar.
  • the touch area is implemented with known capacitive technology.
  • Other suitable touch detection technologies are: resistive touch, force-sensing resistor touch, acoustical touch (e.g. Surface Acoustic Wave), strain gauge, etc.
  • both the touch area and the proximity detectors are scanned at a relatively high rate (50-100 Hz) and all data is continuously processed by the μP.
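
As a sketch of the firmware loop implied here (the device objects and their read() methods are hypothetical):

    import time

    SCAN_RATE_HZ = 50  # within the 50-100 Hz range given above

    def scan_loop(touch_area, proximity_detectors, process):
        """Poll both sensing systems at a fixed rate and hand each pair of
        samples to the gesture-processing step running on the uP."""
        period = 1.0 / SCAN_RATE_HZ
        while True:
            started = time.monotonic()
            process(touch_area.read(), proximity_detectors.read())
            time.sleep(max(0.0, period - (time.monotonic() - started)))
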
  • the line of the swipe is calculated from the touch data, and the proximity data is used to calculate the position of the user.
  • Other gestures such as single tap, double tap etc. can also be detected.
  • if the user is tapping off-center, it is possible to detect in which position (seen from the user's perspective) the tap is applied.
  • Figure 3 further shows a block diagram of the omnidirectional touchpad circuit in an apparatus equipped with means for generating electric signals and fields used for detecting control commands issued by a user.
  • the means are a combination of IR signal generators and IR detectors and electric fields generated via capacitive controlled signal generators.
  • the IR system is used for detecting the presence of an object and/or movement in a remote field.
  • a pulse-based IR proximity detection method is used here.
  • An implementation can be based on a standard chip, e.g. Si114x from Silicon Labs.
  • a number of conductive pads are placed below the surface, as shown in Figure 3.
  • Each conductive pad is connected to the input of a capacitance-to-digital converter (CDC) and the digital signals are fed into a microprocessor (μP).
  • the "capacitive touch system” is based on conducting areas or conducting strips being applied to the printed circuit board (PCB) or other carrier, which is hidden behind the front plate of the apparatus, which may be display screen, or a plate of glass, plastic or similar.
  • the conducting strips can be made of copper, carbon or other conducting material, which is applied or vaporized on the PCB. Depending on the functional demands to a given apparatus, two or more conducting areas or strips are applied.
  • the touch area is divided into a number of segments, each representing a touch sensi- tive area.
  • the system may detect that the user touches one or more areas at the same time, and detects a movement, like a sweep done by an object/finger across the touch sensitive surface.
  • Figure 3 shows an apparatus having a touch field divided into 12 fields, e.g. corresponding a specific function, which is activated by the user touch/swipe of the respective fields.
  • the criteria for a capacitive sensing related to X,Y position is:
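
The extract ends before stating the criteria; one plausible criterion, offered purely as an assumption, reports a touch X,Y as the reading-weighted centroid of the segments whose CDC value rises above a threshold:

    TOUCH_THRESHOLD = 30  # assumed CDC counts above the per-segment baseline

    def capacitive_touch_position(segment_centers, cdc_readings):
        """Return the X,Y of a touch, or None when no segment is active.

        segment_centers: [(x, y), ...] centre of each of the e.g. 12 fields.
        cdc_readings: baseline-corrected CDC value per segment, same order.
        """
        active = [(c, r) for c, r in zip(segment_centers, cdc_readings)
                  if r > TOUCH_THRESHOLD]
        if not active:
            return None
        total = sum(r for _, r in active)
        x = sum(cx * r for (cx, _), r in active) / total
        y = sum(cy * r for (_, cy), r in active) / total
        return (x, y)
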
  • Figure 4 displays how an object, e.g. a finger of a user, is detected by the proximity means at a given distance from the surface of the device to the finger.
  • the user's physical touch on the surface is the trigger for a device command accordingly.
  • Figure 5 displays how an object, e.g. a finger of a user, is detected by the proximity means at a given distance from the surface of the device to the finger.
  • the object reflects an emitter light beam and a light sensor accordingly detects the presence of the object at this given position.
  • Touchpad with resistive matrix means or capacitive means, and proximity with light means emitting from the edge of the surface of the device (Figure 6c).
  • Light emission and detection from the edge of the device are, optionally, used for detection of the user's position relative to the touchpad. Accordingly, the gesture, for example a swipe action, by the user can be correctly interpreted by the apparatus with respect to a correct direction. For the latter, the detected gesture is rotated into a direction that matches the calculated location of the user relative to the touchpad. This is one way of interpreting the correct direction of the gesture, for example a swiping action.
  • Figure 7 displays one embodiment of how the device detects and interprets a user-given command.
  • An object, e.g. the finger of a user
  • P X,Y position
  • T X,Y position
  • the X,Y positions are relative to a fixed coordinate system with origin at the center of the device surface or, alternatively, relative to a floating coordinate system with the origin created at the point of the detected proximity position.
  • With origin in P (71), a start vector (75) connects to (T), where the vector is substantially orthogonal to the X axis in the detected P point.
  • Movements done by the finger, and detected by the proximity means and the touch means, define the movement vectors (76, 77). The movement will typically be substantially along the X-axis with predefined acceptance limit(s) of the related values along the Y-axis.
  • the accept angles (v1, v2, v3, v4) define the tolerance of X,Y values within which detected touch positions and proximity positions are validated to be legal values applied in the evaluation of the X,Y values of "P and T" corresponding to a specific functional command.
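
A sketch of such a validation, treating the accept angles as a symmetric tolerance around the expected movement axis (the numeric tolerance and names are assumptions):

    import math

    ACCEPT_ANGLE_DEG = 30.0  # assumed value for the accept angles v1..v4

    def movement_within_accept_angle(start_xy, end_xy):
        """Validate that a detected movement vector deviates from the X axis
        by no more than the accept angle, so its X,Y values count as legal."""
        dx = end_xy[0] - start_xy[0]
        dy = end_xy[1] - start_xy[1]
        if dx == 0 and dy == 0:
            return False  # no movement to validate
        deviation = abs(math.degrees(math.atan2(dy, abs(dx))))
        return deviation <= ACCEPT_ANGLE_DEG
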
  • one or more intermediate sensor values are detected and applied in determining the executed path and the resulting X,Y positions of P and T.
  • the resulting position X,Y of the object, e.g. the finger touching the surface from a given user position, is calculated from the X,Y position of the touch and the X,Y position as detected by the proximity means.
  • a legal activation includes that the user executes a single touch onto the surface in "a single point", without moving the finger to another touch position; a pressure performed with the finger at the same point, for a period short or long as applicable, may follow the touch.
  • the touch and proximity concept as disclosed enables the user to activate commands on the user interface, such commands interpreted to be “Left to Right” or “Right to Left” relative to the user, with the user at any position along or around the border of the device the user is controlling.
  • Figure 7 illustrates that commands executed at the lower half (80), or along the middle (90), or at the upper half (100) of a circular device are all interpreted equally to be from Left to Right. The same applies for commands operated from Right to Left.
  • the method includes likewise operation performed by the user (110) along the complete perimeter of the device; see also Figure 1.
  • Typical applications are support of operation of multimedia apparatuses, AV systems, media players, remote controllers and similar equipment.
  • the invention has properties making it suitable for mounting on printed circuit boards and improving the quality of the detection system such that user-supplied commands may be certainly and unambiguously detected, interpreted and executed as related functions in a given apparatus.
  • Figure 8 further illustrates the proximity detection system when combined with the touch detection system, for example a capacitive touch detection system.
  • the dominant position found with the proximity detection system will in the vast majority of cases be closer to the user than the positions touched on the touch detection system. This enables correct detection of the touch positions and/or touch swipes, regardless of the user's position and orientation relative to the omnidirectional touchpad.
  • the user 110 operates a touch swipe T 200 from Left to Right over the surface of the touchpad, parallel with the X-axis.
  • the measured averaged dominant position of the object (hand or finger) making the swipe is marked with P 201.
  • With the touch swipe being predominantly in the X direction, the averaged dominant position of the object is evaluated relative to the Y axis to determine the position of the user. In most cases it is sufficient to detect whether the averaged dominant position of the object has a positive or negative Y coordinate.
  • the Y-position of the averaged dominant position can be compared with the average Y-position of the touch swipe.
  • the touch swipe may be tilted compared to the X-axis, even with the user aligned with the Y axis. This may be due to the omnidirectional touchpad being operated by either the user's left hand or right hand. In the middle drawing of Figure 8, the touchpad is operated from the left with the user's left hand.
  • the touch swipe T 202 is tilted clockwise relative to the X axis.
  • the measured averaged dominant position of the object (hand or finger) making the swipe is marked with P 203. With the touch swipe still being predominantly in the X direction, the averaged dominant position of the object is evaluated relative to the Y axis to determine the position of the user.
  • the touchpad is operated from the right with the user's right hand.
  • the touch swipe T (204) is tilted counterclockwise relative to the X axis.
  • the measured averaged dominant position of the object (hand or finger) making the swipe is marked with P (205). Also in this case the same detection principle can be used.
  • if the tilt angle of the touch swipe exceeds +45 or -45 degrees relative to the X-axis, the touch swipe will be predominantly in the Y direction. In that case, the averaged dominant position of the object must be evaluated relative to the X axis to determine the position of the user. Again, in most cases it will then be sufficient to detect whether the averaged dominant position of the object has a positive or negative X coordinate.
  • the X-position of the averaged dominant position can be compared with the average X-position of the touch swipe.
  • a situation where the tilt angle of the touch swipe exceeds +45 or -45 degrees relative to the X-axis will be consistent with a different position of the user, for example position 210 rather than position 110 indicated in Figure 8, so the method for determining the position of the user remains valid.
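
A compact sketch of this decision logic (names are assumptions; the patent describes the comparisons only in prose):

    def user_side_of_swipe(swipe_points, dominant_p):
        """Locate the user relative to a swipe, per the Figure 8 description.

        swipe_points: [(x, y), ...] touch samples of the swipe.
        dominant_p:   averaged dominant proximity position of the hand/finger.
        Returns the axis and sign on which the user is located.
        """
        dx = swipe_points[-1][0] - swipe_points[0][0]
        dy = swipe_points[-1][1] - swipe_points[0][1]
        avg_x = sum(p[0] for p in swipe_points) / len(swipe_points)
        avg_y = sum(p[1] for p in swipe_points) / len(swipe_points)
        if abs(dx) >= abs(dy):
            # Tilt within +/-45 degrees of the X axis: compare along Y.
            return "+Y" if dominant_p[1] > avg_y else "-Y"
        # Swipe predominantly along Y: compare along X instead.
        return "+X" if dominant_p[0] > avg_x else "-X"
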

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

An apparatus with a touchpad for detecting gestures performed by a user and interpreting the gestures independently of the user's actual, arbitrary location and the gestures' corresponding orientation relative to the touchpad. The touchpad comprises a proximity detection system for detecting a movement performed with a finger by the user in the proximity of the touchpad and a touch detection system for detecting a gesture performed by the user while the finger is in contact with the touchpad. In operation the following steps are performed: - by the proximity detection system detecting a proximity-position P of an object in close proximity to the touchpad; - by the touch detection system detecting a touch-position T of the object while in contact with the touchpad; - determining from the proximity-position P relative to the touch-position T a user-direction towards the user; - from the user-direction relative to a predetermined touchpad default-orientation adjusting the orientation of the gesture prior to interpreting the gesture with respect to a gesture-associated command and finally executing the command.

Description

Apparatus and method for detecting gestures on a touchpad
FIELD OF THE INVENTION The present invention relates to a touchpad and the interpretation of gestures performed on the touchpad or in close proximity thereto.
BACKGROUND OF THE INVENTION Control systems based on touch-sensitive surfaces are described in prior art also called "sensi touch" and have been used for a number of years.
Recent applications include systems that may detect the position of objects in proximity fields around an apparatus, also called "proximity sensing". A system is disclosed by Synaptics Inc., as published in US patent application number US 2007/0262951.
For detecting objects at a greater distance from an apparatus, IR- or RF- or ultrasonic sound based systems are known. US patent application number US2011/0279397 discloses a monitoring unit for monitoring a hand or finger in three dimensions in the vicinity of a touch screen such that the monitoring unit is working in contact mode or in contact-less mode. Other publications disclosing user interfaces, for example interpreting gestures, comprise US patent US8830181 and US patent applications US2008/0168403, US2010/0245289, and US2012/0162073.
In this prior art, a gesture, for example a swiping action with a finger, is interpreted according to a predetermined orientation and location of the user relative to the detector. In order to get a gesture correctly interpreted by a touchpad that lies down flat, the gesture has to be performed by the user in relation to the actual orientation of the user interface of the touchpad relative to the user. In case of a symmetrical detector, symbols or text need to be printed or applied otherwise to the surface of the detector to ensure the correct orientation of the detector relative to the user. This requires attention by the user and limits user-friendliness. It would be desirable to increase the user-friendliness of touchpads and to remove the need for symbols or text on the surface of the detector.
DESCRIPTION / SUMMARY OF THE INVENTION It is therefore the objective of the invention to provide an improvement in the art. In particular, it is an objective to increase the user-friendliness of touchpads. This objective is achieved by the system described in the following.
The present invention provides an apparatus and a method for detecting user-supplied control commands given via gestures on a touch-sensitive surface, also called touch- pad, of the apparatus, for example multimedia apparatus, AV system, loudspeaker, remote control or media player.
The touchpad comprises a proximity detection system for detecting the location of an object, for example a user's finger, in the proximity of the touchpad, and for detecting a movement performed with the object by the user in the proximity of the touchpad. The apparatus further comprises a touch detection system for detecting contact by said object with the surface of the touchpad and for detecting a gesture performed with the object by the user while the object is in contact with the touchpad. In operation, the function of the touch detection system is combined with the function of the proximity detection system, where the latter detects the presence of a finger or hand or pointer of a user before, during or after the gesture on the touchpad in order to determine the location of the user relative to the touchpad. This information is used to interpret the intended direction of the gesture. For example, when a user swipes a finger across the touchpad, the line of the swipe is calculated from the touch data, and the position of the user is calculated from the related proximity data. Thus, it is possible to determine if the user is swiping right or left as seen from the user's own perspective. For example, when the finger moves across the surface of the touchpad along a path being linear or curved and performed like a swipe from one position to another position, the left or right orientation of said object movement is interpreted to be left or right according to the actual user position in front of the omnidirectional touchpad. In other words, when the user performs a movement of the finger from the left to the right on the touchpad, the movement is interpreted as being a swipe that is directed to the right, also called a right-swipe, not only if the user is on one side of the touchpad but also if the user is located on an opposite side of the touchpad. The system detects the location of the user relative to the touchpad and adjusts the gesture interpretation accordingly. This is in contrast to the prior art, where the user interface has to be oriented correctly, relative to the user, or the user has to adjust the gesture, for example a swipe, to match the direction of the user interface. Also, in the invention, there is no need for symbols or text on the surface of the user interface. Typical applications are operation of multimedia apparatus, AV systems, loudspeakers, media players, remote controls and similar equipment.
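
As an illustration of this interpretation step, the sketch below (not part of the patent text; all names and the vector convention are assumptions) projects the swipe vector, taken from the touch data, onto the "rightwards" axis of a user standing in the direction of the dominant proximity position P:

    import math

    def interpret_swipe(touch_points, proximity_p):
        """Classify a swipe as seen from the user's own perspective.

        touch_points: [(x, y), ...] sampled while the finger is on the pad.
        proximity_p:  dominant proximity position P, taken as the direction
                      from the pad centre towards the user.
        """
        sx = touch_points[-1][0] - touch_points[0][0]
        sy = touch_points[-1][1] - touch_points[0][1]
        ux, uy = proximity_p
        norm = math.hypot(ux, uy)
        ux, uy = ux / norm, uy / norm
        # For a user at direction (ux, uy) facing the pad, the user's own
        # 'rightwards' axis expressed in pad coordinates is (-uy, ux).
        rightwards = sy * ux - sx * uy
        return "right-swipe" if rightwards > 0 else "left-swipe"

With the user on opposite sides of the pad, the same physical movement thus yields opposite classifications, which is exactly the behaviour described above.
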
For example, proximity sensing of a finger or hand is done by a capacitive sensor or by a light beam emitter in combination with a sensor that detects reflected light beams. For example, the position of the person relative to the apparatus is sensed by a reflected light beam, especially an infrared (IR) reflected light beam. Detection of gestures is performed in a 3-dimensional space around the apparatus and by touching directly on the apparatus. Dedicated means are used for detecting the position of objects, for example a finger, a hand, or a pointer device, located at a distance close to the apparatus and in direct physical contact with the apparatus.
The invention has properties making it suitable for mounting on printed circuit boards and improving the quality of the detection system such that user-supplied commands may certainly and unambiguously be detected, interpreted and executed as related functions in a given apparatus.
In a concrete embodiment, the user-direction is found relative to a predetermined orientation of the touchpad, and the orientation of the detected gesture on the touchpad is adjusted according to this difference prior to interpreting the gesture with respect to a gesture-associated command and executing the command.
Advantageously, this can be achieved by detecting a proximity-position P of an object in close proximity to the touchpad, detecting a touch-position T of the object while in contact with the touchpad, and from the proximity-position P and the touch-position T determining a user-direction towards the user. Instead of the touch-position T, a sequence of touch positions T can be used, for example in the case of a swipe. For example, the proximity-position P and/or the touch-position T are averaged positions, for example achieved by a weighted averaging, which in the following are called the dominant proximity-position P and the dominant touch-position T. In practice, the method comprises
- by the proximity detection system detecting a movement of the object in close proximity to the touchpad and averaging this movement to a dominant proximity-position P;
- determining from the dominant proximity-position P relative to the touch-position T or sequence of touch positions T a user-direction towards the user.
Optionally, the method also contains the step of detecting a gesture of the object by the touch detection system while in contact with the touchpad and averaging the gesture to a dominant touch-position T.
Advantageously, the detected proximity movement of the object and the gesture on the touchpad are translated to coordinate sequences in a pre-defined coordinate system. In this case, a practical embodiment of the method comprises
- providing a two-dimensional X,Y coordinate system parallel to the surface of the touchpad;
- translating the movement to a sequence of proximity-coordinates in the coordinate-system and averaging this movement to a dominant proximity-position P in the coordinate-system;
- translating the gesture to touch-coordinates in the coordinate-system, optionally averaging the gesture to a dominant touch-position T in the coordinate-system;
- determining from the dominant proximity-position P relative to the touch-position T or the sequence of touch positions T, optionally the dominant touch-position T, a user-direction towards the user in the coordinate system;
- from the user-direction relative to the axes of the coordinate-system adjusting the orientation or position or both of the gesture in the coordinate system prior to interpreting the gesture with respect to a gesture-associated command and executing the command.
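
One way to realise the final adjustment step is sketched below, under the assumed convention that the user-direction is rotated onto the negative Y axis before interpretation (the convention and all names are assumptions, not the patent's wording):

    import math

    def rotate_gesture_towards_user(points, user_direction):
        """Rotate gesture coordinates so the user-direction maps onto the
        negative Y axis, i.e. as if the user stood 'below' the touchpad."""
        ux, uy = user_direction
        angle = -math.pi / 2.0 - math.atan2(uy, ux)  # takes (ux, uy) onto (0, -1)
        c, s = math.cos(angle), math.sin(angle)
        return [(c * x - s * y, s * x + c * y) for (x, y) in points]

After this rotation, a right-swipe from the user's perspective is simply a gesture with increasing X, so the left/right decision reduces to the sign of the X displacement.
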
For example, the method comprises adjusting the orientation of the gesture with respect to right-to-left or left-to-right direction prior to interpreting the gesture. In the case of a swipe, where the object is moved across the surface of the touchpad along a linear or curved swipe-path from one X,Y position to another X,Y position, the apparatus determines whether the swipe-path is a left-swipe or a right-swipe depending on the determined user-direction in the coordinate-system.
For example, the proximity detection system comprises a plurality of proximity sensors organized along an outer perimeter of the touchpad, optionally a circular touchpad. For example, the touch detection system comprises a plurality of touch sensors organized on the surface of the touchpad and surrounded by the proximity sensors.
In a specific embodiment, the proximity detection system comprises a plurality of infrared light emitters and a plurality of infrared light receivers, and said receivers are configured for measuring a background level of infrared light and correcting infrared proximity signals by subtracting the background level. Optionally, the plurality of infrared light emitters are configured for one emitter of the plurality of emitters being active at a time, and wherein an electronic control circuit is configured for receiving a separate set of proximity signals from each one of the plurality of infrared receivers, for every subsequent activation of each further infrared emitter of the plurality of emitters. Alternatively, the method comprises receiving the proximity signals from two infrared receivers, one on either side of the corresponding emitter, for every subsequent activation of each further infrared emitter of the plurality of emitters. Accordingly, there is provided a method for detecting user-supplied input control commands for an apparatus, wherein the apparatus comprises an input device having a primary surface in an X,Y plane, and a plurality of capacitive means operable to generate a plurality of electric fields, wherein at least two of said capacitive means are positioned on the X,Y plane of said surface, and wherein at least one infrared transmitter means and at least two infrared receiver means are positioned on said surface and are configured to issue IR light beams, primarily orthogonally out from the said surface, and receive IR light beams caused by the reflection from the object, for example a finger, of a user, primarily above and orthogonally to the X,Y plane of said surface, wherein the method comprises the steps of:
• detecting the presence of a finger of a user that is in close proximity of the surface of the device, but not in direct physical connection, i.e. not touching the device;
• detecting a user input command in the form of a touch command performed on the surface of the device;
• generating a control command based on the user input command detected.
An aspect of the invention is an omnidirectional touchpad, integrated into an apparatus, enabled to detect user-given commands and to determine if the user is making a right or left swipe gesture independent of where the user is positioned relative to the touchpad. The omnidirectional touchpad is configured with means to detect the proximity of an object, e.g. a user finger, and is configured with means to detect the touch pressure by said object onto the surface of the omnidirectional touchpad, this characterized by:
o an upper surface in the X-Y plane,
o a first X,Y position of the object is determined as sensed by the proximity means, and validated accordingly,
o a second X,Y position of the object is determined as sensed by the touch means and based on the values of the first X,Y position,
o a third X,Y position of the object is determined as sensed by the touch means, and validated accordingly,
o a resulting X,Y is calculated based on the second X,Y position/value and the third X,Y position/value,
o a command corresponding to the resulting X,Y is interpreted by the apparatus and executed accordingly, and where:
o the second X,Y position of the object determined as sensed by the touch means, and
o the third X,Y position of the object determined as sensed by the touch means,
o are substantially the same position.
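
A minimal sketch of the "substantially the same position" test for such a single-point activation follows; the tolerance value and names are assumptions:

    TAP_RADIUS = 3.0  # assumed tolerance, e.g. in millimetres

    def is_single_point_activation(second_xy, third_xy):
        """True when the second and third touch-sensed X,Y positions are
        substantially the same, i.e. the finger did not move between them."""
        dx = second_xy[0] - third_xy[0]
        dy = second_xy[1] - third_xy[1]
        return (dx * dx + dy * dy) ** 0.5 <= TAP_RADIUS
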
A further aspect of the invention is:
o the object moves across the surface of the touchpad along a path being linear or curved and performed like a swipe from one X,Y position to another X,Y position, and
o where the left- or right orientation, of said object movement is interpreted to be left or right according to the user positioned in front of the omnidirectional touchpad.
An even further aspect of the invention is an omnidirectional touchpad, integrated into an apparatus, enabled to detect user-given commands and to determine if the user is making a right or left swipe gesture independent of where the user is positioned relative to the touchpad. The omnidirectional touchpad is configured with means to detect the proximity of an object, e.g. a user finger or hand, and is configured with means to detect the touch/pressure by said object onto the surface of the omnidirectional touchpad, this characterized by:
o an upper surface in the X-Y plane with a fixed orthogonal X,Y coordinate system defined, for example centered around the middle of the omnidirectional touchpad,
o a first X,Y position of the object is determined as sensed by the touch means, and validated accordingly,
o a first X,Y position of the object is determined as sensed by the proximity means, and validated accordingly,
o a second, third and subsequent X,Y positions of the object are determined as sensed by the touch means and validated accordingly, until no further user interaction is detected by the touch means,
o a second, third and subsequent X,Y positions of the object are determined as sensed by the proximity means and validated accordingly, until no further user interaction is detected by the proximity means,
o a resulting dominant X,Y touch position or a resulting X,Y touch swipe vector is calculated relative to the fixed orthogonal X,Y coordinate system, based on the sequence of detected touch X,Y values,
o a resulting dominant X,Y proximity position of the user's hand or finger is calculated relative to the fixed orthogonal X,Y coordinate system, based on the sequence of detected proximity X,Y values,
o a corrected dominant X,Y touch position or a corrected X,Y swipe vector is calculated relative to an orthogonal X,Y coordinate system rotated towards the user,
o a command corresponding to the resulting corrected dominant X,Y touch position or the resulting corrected X,Y touch vector is interpreted by the apparatus and executed accordingly, and
o where the left or right orientation of said object movement is interpreted to be left or right according to the user positioned in front of the omnidirectional touchpad.
The use of touch commands and the simultaneous proximity detection, as wel l as the different types of commands within these categories, allow for a w ide v ariety of possible input command methods for the input device.
Optional ly, one or more of the capacitive means arc divided into two or more segments, which are indiv idually receptive to user input commands, and wherein the method includes the step of determining at which segment said user input command is detected. As the capacitive means arc segmented, this allows di fferent commands to be selected based on the detected location of the touch command (i.e. the segment where the touch command was detected ). Optionally, said input device comprises a substantially planar body being integrated into an apparatus, e.g. a media player, or alternatively the means are configured as a standalone product, e.g. a remote control ler, smartphone. tablet or alike. As the touch commands may be input on different adjacent surfaces of a three- dimensional object, such a system al lows for a greater combination or arrangement of acceptable input commands.
Optionally, at least one of said infrared means is operable to generate an infrared light beam field substantially in front of said primary surface, wherein the method comprises the step of detecting that an object is moved into said at least one infrared light beam field and/or moved out of said at least one infrared light beam field or moved within said at least one infrared light beam field, wherein said step of generating a control command is based in part on said detection. Optionally, at least one of said infrared means is operable to detect an infrared light beam field substantially in front of said primary surface.
It will be understood that the term "object" may refer to an object or token held by a user, for example a pointer device, or may also refer to at least a portion of the user's body detected by the system, e.g. a finger.
As detection can be made based on the movement of an object relative to a proximity field, this allows commands to be entered by the user without touching the input device. In particular, different gestures may be interpretable as different user commands. Furthermore, different commands may be generated based on how close the user/object is to the input device, e.g. a display image may be adjusted based on the user's proximity to the display.
Optionally, the method comprises the step of detecting that an object touches a touch sensitive field, and generating a related action.
There is also provided a touch-based input apparatus comprising:
• a primary surface provided in an X,Y plane,
• a plurality of capacitive means operable to generate a plurality of electric fields, and
• a control unit, wherein at least two of said capacitive means are positioned on the X,Y plane of said surface, and
• wherein said control unit is operable to detect a user command in the form of a touch command or a non-touch command by a gesture remote from the touchpad surface and to generate a control command based on the user command detected.
Optionally, one or more of said capacitive means are divided into a plurality of segments individually receptive to user input commands, and wherein the said control unit is operable to determine at which segment said user input command is detected. Alternatively to the capacitive means, a resistive based or other touch system may be applied.
Optionally, said apparatus is configured with a substantially planar body selected from one of the following materials: a glass panel, a plastic panel, or a combination of the two materials. Optionally, said control unit is operable to detect a touch command applied directly to the surface of said substantially planar body.
Optionally two or more of the capacitive means are positioned in the same X,Y plane, each disposed along a line and mutually in parallel along the X-axis or along the Y-axis; or alternatively arranged within two or more concentric circles.
Optionally two or more of the infrared means are positioned in the same X,Y plane, each disposed along a line arranged within two or more concentric circles. Optionally, one or more of the infrared means are divided into two or more segments, which are individually receptive to active input signals. Optionally, one or more of the infrared means are configured in one or more pairs, a pair including at least one IR sender and one IR receiver and/or one IR transceiver.
Optionally, the infrared means are emitting and the reception means are detecting user-supplied control commands issued in a remote field at a distance from the apparatus which is within the defined proximity field distance.
Optionally, it is detected that an object approaches or an object is moving away from one or more proximity fields, and a related action is generated.
Optionally, it is detected that an object is moved in a proximity field, and a related action is generated.
Optionally, it is detected that an object touches a touch-sensitive field, and a related action is generated.
It will be understood that the invention comprises use of any suitable capacitive sensor technology, e.g. surface capacitance, projected capacitive touch, etc. The invention operates with a number of functional properties:
• May detect a direct physical touch on the apparatus, e.g. by a user touching the apparatus with a finger; this is termed a touch field.
• May discern between "close to" and "directly touching" the surface, where "close to" is e.g. 1 - 5 cm from the surface; this is to be regarded as a proximity field.
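As a purely illustrative sketch (the 1 - 5 cm range is taken from the text; the function and its exact thresholds are otherwise assumed), the two field types can be modelled as a simple classification on the sensed distance:

```python
def classify_field(distance_cm):
    # 0 cm        -> touch field (direct physical contact)
    # up to ~5 cm -> proximity field ("close to" the surface)
    # otherwise   -> outside both fields
    if distance_cm <= 0.0:
        return "touch"
    if distance_cm <= 5.0:
        return "proximity"
    return "none"
```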
The invention operates with a number of control commands executed in the apparatus, control commands related to the detected user-supplied commands; examples are, but are not limited to:
• Object/finger touch/press on the surface, continuously or by "tapping";
• Object/finger moving from left (L) to right (R);
• Object/finger moving from right to left;
• Object/finger moving from any first position to any second position along a predefined path, which includes one or more partial fragments of the geometrical types: line and curve;
• Object/finger moving on a surface whose outer perimeter has a geometrical form of a circle, an ellipse or any geometrical form that is symmetrical around a common x-axis and/or symmetrical around a common y-axis in the plane of the device surface.
• The surface on the device does not have an x-y orientation as such; thus, the commands L/R given above are relative to the user's position in front of the device to be controlled, and with the user finger at any position along the outer perimeter of the top surface of the device.
• The device type of the invention is defined to be an Omnidirectional Touchpad.
ASPECTS
In the following, various interrelated aspects are described.
ASPECT 1. An omnidirectional touchpad, integrated into an apparatus, is enabled to detect user given commands and to determine if the user is making a right or left swipe gesture independent of where the user is positioned relative to the touchpad; the omnidirectional touchpad is configured with means to detect the proximity of an object, e.g. a user finger, and is configured with means to detect the touch pressure by said object onto the surface of the omnidirectional touchpad, this characterized by:
a. an upper surface in the X-Y plane,
b. a first X,Y position of the object is determined as sensed by the proximity means, and validated accordingly,
c. a second X,Y position of the object is determined as sensed by the touch means and based on the values of the first X,Y position,
d. a third X,Y position of the object is determined as sensed by the touch means, and validated accordingly,
e. a resulting X,Y is calculated based on the second X,Y position/value and the third X,Y position value,
f. a command corresponding to the resulting X,Y is interpreted by the apparatus and executed accordingly.
ASPECT 2. An omnidirectional touchpad according to aspect 1, where:
a. the second X,Y position of the object determined as sensed by the touch means, and
b. the third X,Y position of the object determined as sensed by the touch means,
c. are substantially the same position.
ASPECT 3. An omnidirectional touchpad according to aspect 1, where:
a. the object moves across the surface of the touchpad along a path being linear or curved and performed like a swipe from one X,Y position to another X,Y position, and
b. where the left or right orientation of said object movement is interpreted to be left or right according to the user positioned in front of the omnidirectional touchpad.
ASPECT 4. An omnidirectional touchpad according to aspect 1, where
a. a start vector is initialized and oriented from one point P in a detected proximity X,Y position to another point T in a detected touch X,Y position, and
b. a first movement vector is initialized and oriented from one point T in a detected touch X,Y position to another point T in another detected touch X,Y position, and
c. a second movement vector is initialized and oriented from one point P in a detected proximity X,Y position to another point P in another detected proximity X,Y position, and
d. the first movement vector is substantially parallel to the second movement vector.
ASPECT 5. An omnidirectional touchpad according to any aspect above, where the proximity means are organized on or along an outer perimeter of the omnidirectional touchpad.
ASPECT 6. An omnidirectional touchpad according to aspect 5, where the touch means are organized on the surface of the omnidirectional touchpad, and the touch means are surrounded by the proximity means.
ASPECT 7. An omnidirectional touchpad according to aspect 6, where the touch means are based on capacitive means, or resistive means, or a combination of the two.
ASPECT 8. An omnidirectional touchpad according to aspect 7, where the proximity means are based on capacitive means, or light means, infrared or laser, or a combination of the two.
ASPECT 9. An omnidirectional touchpad according to aspect 8, where proximity detection is implemented by one or more light emitters and a number of light receivers, and said receivers detect if an object is in proximity.
ASPECT 10. An omnidirectional touchpad according to aspect 9, where one emitter is active at a time and thus the electronic control circuit gets a separate set of proximity signals from each receiver for every subsequent emitter activation.
ASPECT 11. An omnidirectional touchpad according to aspect 10, where the emitters and receivers closest to the object give the highest signal.
ASPECT 12. An omnidirectional touchpad according to aspect 11, where the touch area is:
a. capacitive with a number of conductive pads placed below the surface, and
b. each conductive pad is connected to the input of a capacitance to digital converter (CDC), and
c. the digital signals are fed into a microprocessor (μP).
ASPECT 13. An omnidirectional touchpad according to aspect 12, where the sensing means, including the touch area and the proximity detectors, are scanned at a relatively high rate (50-100 Hz) and all data is continuously processed by the μP.
SHORT DESCRIPTION OF THE DRAWINGS
The invention will be explained in more detail with reference to the drawings, where Figure 1 shows the principle in swipe direction relative to user position;
Figure 2a shows the layout of the touchpad, and Figure 2b illustrates definitions of directions;
Figure 3 shows a block diagram of the electronics in the detection means;
Figures 4 and 5 show the layout of the touchpad and the reflection caused by an object;
Figure 6 shows an alternative layout of the detection means;
Figure 7 displays principles of command detection;
Figure 8 explains principles of the method.
DETAILED DESCRIPTION / PREFERRED EMBODIMENT
The omnidirectional touchpad is primarily intended to be positioned in the horizontal plane. The omnidirectional touchpad can detect whether the user is making a right or left swipe gesture as seen from the user's own perspective, independent of where the user is positioned relative to the touchpad (see Figure 1). The omnidirectional touchpad therefore does not require any print of logos to indicate a certain swipe direction or touch area. This is a great advantage as compared to user interfaces where the user either has to perform the swiping action from a certain location or the user has to adjust the direction relative to the orientation of the user interface of the touchpad, which could possibly be upside-down for the user from the specific user position. Also, this eliminates constraints, especially for circular touchpads. If the omnidirectional touchpad is mounted vertically (e.g. on a wall), this property allows for simple mounting without the need for ensuring a certain orientation.
The omnidirectional touchpad is realized by a combination of a touch area and a number of proximity detectors placed around the perimeter of the touch area as shown in Figure 2a. The number of IR emitters and receivers in the illustrated case is three, but a different number of emitter and receiver pairs is possible, such as two or four of each. One possible implementation of proximity detection is by means of IR technology. In its simplest form, an IR proximity detection system consists of one IR emitter and one IR receiver. If a hand or finger is in proximity, the emitted IR light is reflected and the IR receiver can detect this reflection. The closer the hand is, the higher the reflection. In practical situations, background IR radiation will be present, for example due to sunlight or artificial IR sources in the room. Therefore, the proximity detection method needs to cancel out the contribution of such background IR radiation.
To obtain reliable detection of the proximity of a hand, first the level of background IR radiation is detected with the IR emitter switched off. This will give the level of the ambient IR level (Sambient). Then the IR emitter is turned on and the IR signal level is measured again. This will give the sum of the ambient IR level and the portion of the IR emitter light reflected by the hand (Sambient+reflected). The reflected IR light (Sreflected) is calculated from the formula: Sreflected = Sambient+reflected - Sambient.
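In code form, one background-compensated measurement might look like the following minimal sketch; the set_emitter and read_ir_level functions stand in for hardware access and are assumptions, not part of the specification:

```python
def measure_reflected(set_emitter, read_ir_level):
    # One background-compensated IR proximity measurement:
    # Sreflected = Sambient+reflected - Sambient
    set_emitter(False)
    s_ambient = read_ir_level()       # background IR only
    set_emitter(True)
    s_total = read_ir_level()         # background + hand reflection
    set_emitter(False)
    return s_total - s_ambient        # Sreflected
```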
To obtain spatial information about the position of the hand, multiple emitters and/or receivers need to be used. A number of infrared (IR) emitters and a number of IR receivers are used to implement proximity detection. Only one emitter is active at a time and thus the electronic control circuit gets a separate set of proximity signals from each receiver for every subsequent emitter activation. These receiver signals are fed into a microprocessor (μP). The emitters and receivers closest to the hand or finger will give the highest signal. Thus, the μP can calculate from which side the hand is approaching. Thus, in an embodiment including a plurality of sensing means (S1, S2, S3 ... Sn), the criteria for a proximity sensing related to an X,Y position is:
Figure 2b illustrates further details of a proximity detection system. In this case, the term "background" is used instead of the term "ambient", and the term "total" is used as a substitution for the term "ambient+reflected".
The proximity detection system based on the IR emitter and IR receiver means is used to determine the position of the touching object (finger or hand) relative to the touch swipe or touch position on the omnidirectional touchpad.
To achieve this, the proximity detection system continuously and repeatedly calculates a dominant X,Y position of the touching object, typically a finger or hand of the user. In case of the IR emitter and IR receiver layout as in Figure 2b, the procedure for calculating the dominant X,Y can be as follows.
In the X,Y plane of the omnidirectional touchpad, a set of 6 unit vectors U13, U11, U21, U22, U32, U33 is defined as depicted in Figure 2b. These unit vectors have the following values:
Detection using IR receiver R1:
1. Measure the IR light level hitting IR receiver R1, while IR emitters E1, E2 and E3 are switched off, giving the signal S(R1)background. This signal represents the background IR light level, with the object (hand or finger) in place. The background IR light level can be caused by sunlight or other IR light sources in the vicinity of the omnidirectional touchpad.
2. Switch on E3 and measure the IR light level hitting R1, giving the signal S(R1)E3 total. This signal represents the IR light reflected by the object (hand or finger), including the background IR light level. When switching on an IR emitter, the emitter may be emitting a single IR light pulse, or it may be used in burst mode or continuous wave mode. The IR detection system will need to be arranged accordingly.
3. Subtract S(R1)background from S(R1)E3 total, giving the signal S(R1)E3 reflected. This signal represents the actual proximity signal related to the IR emitter/receiver pair E3/R1. Multiply unit vector U13 with S(R1)E3 reflected, giving output vector V13 = S(R1)E3 reflected * U13.
4. Switch on E1 and measure the IR light level hitting R1, giving the signal S(R1)E1 total. This signal represents the IR light reflected by the object (hand or finger), including the background IR light level.
5. Subtract S(R1)background from S(R1)E1 total, giving the signal S(R1)E1 reflected. This signal represents the actual proximity signal related to the IR emitter/receiver pair E1/R1. Multiply unit vector U11 with S(R1)E1 reflected, giving output vector V11 = S(R1)E1 reflected * U11.
Similarly, detection using IR receiver R2 and IR receiver R3 is performed.
Note: In case of alternative layouts of the optical proximity detection system, either with fewer or more IR emitters and receivers, fewer or more unit vectors are defined and, correspondingly, fewer or more measurements are performed to cover all adjacent emitter/receiver pairs.
In this case, as depicted in Figures 2a and 2b, in one proximity measurement cycle, six proximity vectors are found. After adding these 6 proximity vectors and applying an averaging/normalisation process to compensate for emitter signal strength and other factors, one position vector is found, representing the dominant position of the object (hand or finger), relative to the orthogonal X,Y coordinate system.
To obtain one single dominant position of the object (hand or finger) even during the movement of the object during a swipe, subsequent position vectors as found using the procedure mentioned above will be averaged during the duration of the swipe as detected by the touch detection system.
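The calculation of the dominant position can be sketched as follows. The actual unit vector values are defined by the geometry of Figure 2b and are not reproduced here; the equally spaced directions below, like the simple normalisation by total signal, are assumptions made only for this sketch:

```python
import math

# Illustrative unit vectors for the six adjacent emitter/receiver pairs
# of a circular three-emitter / three-receiver layout; equally spaced
# directions are assumed here, the real values follow Figure 2b.
PAIRS = ["E3/R1", "E1/R1", "E1/R2", "E2/R2", "E2/R3", "E3/R3"]
UNIT_VECTORS = {
    pair: (math.cos(i * math.pi / 3), math.sin(i * math.pi / 3))
    for i, pair in enumerate(PAIRS)
}

def dominant_position(reflected):
    # Weighted vector sum of the six background-compensated proximity
    # signals, normalised to one dominant X,Y position of the hand.
    # `reflected` maps a pair name to its Sreflected value.
    x = sum(reflected[p] * UNIT_VECTORS[p][0] for p in PAIRS)
    y = sum(reflected[p] * UNIT_VECTORS[p][1] for p in PAIRS)
    total = sum(reflected.values()) or 1.0    # avoid division by zero
    return (x / total, y / total)

def averaged_dominant_position(positions):
    # Average the subsequent dominant positions found during a swipe,
    # yielding the single position P used to locate the user.
    n = len(positions)
    return (sum(p[0] for p in positions) / n,
            sum(p[1] for p in positions) / n)
```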
An alternative technology for the proximity detection could be capacitive proximity technology. However, capacitive proximity technology is very sensitive to common mode noise entering the product (e.g. through the mains cord). If the common mode noise is low, capacitive proximity may work, but in many cases the common mode noise will interfere with the proximity signal, causing the detection to be unreliable. IR proximity is not sensitive to common mode noise and therefore IR proximity is in some cases preferred for a reliable detection. Suitable proximity detection technologies without this drawback are also, for example: ultrasonic sound or RF/radar.
Optionally, the touch area is implemented with known capacitive technology. Other suitable touch detection technologies are: resistive touch, force-sensing resistor touch, optical touch, acoustical touch (e.g. Surface Acoustic Wave), strain gauge, etc.
Optionally, both the touch area and the proximity detectors are scanned at a relatively high rate (50-100 Hz) and all data is continuously processed by the μP. When a user swipes a finger across the touchpad, the line of the swipe, calculated from the touch data and from the proximity data, will be used to calculate the position of the user. Thus, it is possible to determine if the user is swiping right or left (seen from his/her own perspective). Other gestures such as single tap, double tap etc. can also be detected. In addition, if the user is tapping off-center, it is possible to detect in which position (seen from the user's perspective) the tap is applied.
Thus, a number of different gestures can be recognized independently of the angular position of the user relative to the touchpad.
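A minimal sketch of such a scan loop is given below. The 75 Hz rate is merely an assumed value within the 50-100 Hz range mentioned above, and read_touch_xy/read_proximity_xy are hypothetical driver calls, not functions from the specification:

```python
import time

SCAN_RATE_HZ = 75    # an assumed value within the 50-100 Hz range

def collect_gesture(read_touch_xy, read_proximity_xy):
    # Accumulate touch and proximity samples for one gesture. The loop
    # runs until a touch has been seen and then released.
    touch_samples, proximity_samples = [], []
    while True:
        t = read_touch_xy()            # (x, y) or None when not touched
        if t is None:
            if touch_samples:          # touch ended: gesture complete
                return touch_samples, proximity_samples
        else:
            touch_samples.append(t)
            p = read_proximity_xy()    # (x, y) from the IR system, or None
            if p is not None:
                proximity_samples.append(p)
        time.sleep(1.0 / SCAN_RATE_HZ)
```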
Figure 3 further shows a block diagram of the omnidirectional touchpad circuit in an apparatus equipped with means for generating electric signals and fields used for detecting control commands issued by a user. The means are a combination of IR signal generators and IR detectors and electric fields generated via capacitive controlled signal generators. The IR system is used for detecting the presence of an object and/or movement in a remote field. A pulse-based IR proximity detection method is used here. An implementation can be based on a standard chip, e.g. the Si114x from Silicon Labs.
In some embodiments, a number of conductive pads are placed below the surface, as shown in Figure 3. Each conductive pad is connected to the input of a capacitance to digital converter (CDC) and the digital signals are fed into a microprocessor (μP). For example, the "capacitive touch system" is based on conducting areas or conducting strips being applied to the printed circuit board (PCB) or other carrier, which is hidden behind the front plate of the apparatus, which may be a display screen, or a plate of glass, plastic or similar. The conducting strips can be made of copper, carbon or other conducting material, which is applied or vaporized onto the PCB. Depending on the functional demands of a given apparatus, two or more conducting areas or strips are applied.
The touch area is divided into a number of segments, each representing a touch sensitive area. The system may detect that the user touches one or more areas at the same time, and detects a movement, like a sweep done by an object/finger across the touch sensitive surface. Figure 3 shows an apparatus having a touch field divided into 12 fields, each e.g. corresponding to a specific function, which is activated by the user's touch/swipe of the respective fields.
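A sketch of how the μP might scan such a segmented touch area is given below; read_cdc and the threshold are assumed placeholders for the actual CDC interface and design-specific calibration:

```python
NUM_SEGMENTS = 12    # the example layout of Figure 3

def touched_segments(read_cdc, threshold):
    # Return the indices of the touch segments currently touched;
    # read_cdc(channel) is a hypothetical call returning the digital
    # capacitance value of one conductive pad.
    return [seg for seg in range(NUM_SEGMENTS)
            if read_cdc(seg) > threshold]
```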
For example, in an embodiment, the criteria for a capacitive sensing related to X,Y position is:
Figure 4 displays how an object, e.g. a finger of a user, is detected by the proximity means at a given distance from the surface of the device to the finger. The user's physical touch on the surface is the trigger for a device command accordingly.
Figure 5 displays how an object, e.g. a finger of a user, is detected by the proximity means at a given distance from the surface of the device to the finger. The object reflects an emitter light beam and a light sensor accordingly detects the presence of the object at this given position.
Figure 6 displays alternative embodiments of the invention:
• Touchpad with resistive matrix means or capacitive means, and proximity with light means emitting from the surface of the device (Figure 6a);
• Touchpad with resistive matrix means or capacitive means, and proximity with capacitive means (Figure 6b);
• Touchpad with resistive matrix means or capacitive means, and proximity with light means emitting from the edge of the surface of the device (Figure 6c). Light emission and detection from the edge of the device are, optionally, used for detection of the user's position relative to the touchpad. Accordingly, the gesture, for example a swipe action, by the user can be correctly interpreted by the apparatus with respect to the correct direction. For the latter, the detected gesture is rotated into a direction that matches the calculated location of the user relative to the touchpad. This is one way of interpreting the correct direction of the gesture, for example a swiping action.
Figure 7 displays one embodiment of how the device detects and interprets a user given command. An object, e.g. the finger of a user, is detected by the proximity means at an X,Y position (P) and a touch on the surface is detected by the touch means at another X,Y position (T). The X,Y positions are relative to a fixed coordinate system with origin at the center of the device surface, or alternatively relative to a floating coordinate system with the origin created at the point of the detected proximity position P. With origin in P (71), a start vector (75) is formed connecting to (T), where the vector is substantially orthogonal to the X axis in the detected P point. Movements done by the finger, and detected by the proximity means and the touch means, define the movement vectors (76, 77). The movement will typically be substantially along the X-axis with predefined acceptance limit(s) of the related values along the Y-axis.
The accept angles (v1, v2, v3, v4) define the tolerance of X,Y values within which detected touch positions and proximity positions are validated to be legal values applied in the evaluation of the X,Y values of "P and T" corresponding to a specific functional command.
In the movement of the finger from one position to another position, e.g. moving from P (71) to P (72), and T (73) to T (74), one or more intermediate sensor values are detected and applied in determining the executed path and the resulting X,Y positions of P and T. Thus, the resulting position X,Y of the object, e.g. the finger touching the surface from a given user position, is calculated from the X,Y position of the touch and the X,Y position as detected by the proximity means. A legal activation includes that the user executes a single touch onto the surface in "a single point", without moving the finger to another touch position; a pressure performed with the finger at the same point, for a period - short or long as applicable, may follow the touch.
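The validation against the accept angles can be sketched as follows; the single symmetric tolerance used here is a simplification, since the exact accept angles (v1..v4) are design parameters defined in Figure 7:

```python
import math

def within_accept_angle(p, t, expected_angle, tolerance_rad):
    # Accept a P/T pair when the direction of the start vector from the
    # proximity position P to the touch position T deviates from the
    # expected direction by no more than the tolerance.
    dx, dy = t[0] - p[0], t[1] - p[1]
    angle = math.atan2(dy, dx)
    # Signed angular difference wrapped into (-pi, pi]
    deviation = (angle - expected_angle + math.pi) % (2 * math.pi) - math.pi
    return abs(deviation) <= tolerance_rad
```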
The touch and proximity concept as disclosed enables the user to activate commands on the user interface, such commands being interpreted to be "Left to Right" or "Right to Left" relative to the user, with the user at any position along or around the border of the device the user is controlling.
Figure 7 illustrates that commands executed at the lower half (80), or along the middle (90), or at the upper half (100) of a circular device are all interpreted equally to be from Left to Right. The same applies for commands operated from Right to Left. As the device is rotational and symmetric, the method likewise includes operation performed by the user (110) along the complete perimeter of the device; see also Figure 1.
Typical applications are support of operation of multimedia apparatuses, AV systems, media players, remote controllers and similar equipment. In addition to the features of having an omnidirectional touchpad, the invention has properties making it suitable for mounting on printed circuit boards and improving the quality of the detection system such that user-supplied commands may be certainly and unambiguously detected, interpreted and executed as related functions in a given apparatus.
Figure 8 further illustrates the proximity detection system when combined with the touch detection system, for example a capacitive touch detection system.
Since the object (hand or finger) in most cases will approach the omnidirectional touchpad from the direction of the user, the dominant position found with the proximity detection system will in the vast majority of cases be closer to the user than the positions touched on the touch detection system. This enables correct detection of the touch positions and/or touch swipes, regardless of the user's position and orientation relative to the omnidirectional touchpad.
This principle is illustrated in the left drawing of Figure 8. The user 110 operates a touch swipe T 200 from Left to Right over the surface of the touchpad, parallel with the X-axis. The measured averaged dominant position of the object (hand or finger) making the swipe is marked with P 201. With the touch swipe being predominantly in the X direction, the averaged dominant position of the object is evaluated relative to the Y axis to determine the position of the user. In most cases it is sufficient to detect whether the averaged dominant position of the object has a positive or negative Y coordinate.
For more robustness, the Y-position of the averaged dominant position can be compared with the average Y-position of the touch swipe.
In practical cases, the touch swipe will be tilted compared to the X-axis, even with the user aligned with the Y axis. This may be due to the omnidirectional touchpad being operated by either the user's left hand or right hand. In the middle drawing of Figure 8, the touchpad is operated from the left with the user's left hand. The touch swipe T 202 is tilted clockwise relative to the X axis. The measured averaged dominant position of the object (hand or finger) making the swipe is marked with P 203. With the touch swipe still being predominantly in the X direction, the averaged dominant position of the object is evaluated relative to the Y axis to determine the position of the user.
In the right drawing of Figure 8, the touchpad is operated from the right with the user's right hand. The touch swipe T (204) is tilted counter-clockwise relative to the X axis. The measured averaged dominant position of the object (hand or finger) making the swipe is marked with P (205). Also in this case the same detection principle can be used. When the tilt angle of the touch swipe exceeds +45 or -45 degrees relative to the X-axis, the touch swipe will be predominantly in the Y direction. In that case, the averaged dominant position of the object must be evaluated relative to the X axis to determine the position of the user. Again, in most cases it will then be sufficient to detect whether the averaged dominant position of the object has a positive or negative X coordinate.
For more robustness, the X-position of the averaged dominant position can be compared with the average X-position of the touch swipe.
In practice, the case where the tilt angle of the touch swipe exceeds +45 or -45 degrees relative to the X-axis will be consistent with a different position of the user, for example position 210 rather than the position 110 indicated in Figure 8, causing the method for determining the position of the user to remain valid.
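Taken together, the decision logic of Figure 8 can be summarised in the following sketch, assuming lists of (x, y) tuples for the touch samples; the comparison against the swipe's own average coordinate implements the robustness refinement mentioned above:

```python
def swipe_direction(touch_samples, p_dominant):
    # Decide left/right swipe as seen from the user's own perspective.
    # touch_samples: (x, y) touch positions of the swipe, in order.
    # p_dominant:    averaged dominant proximity position P.
    x0, y0 = touch_samples[0]
    x1, y1 = touch_samples[-1]
    dx, dy = x1 - x0, y1 - y0
    avg_x = sum(x for x, _ in touch_samples) / len(touch_samples)
    avg_y = sum(y for _, y in touch_samples) / len(touch_samples)

    if abs(dx) >= abs(dy):
        # Tilt within +/-45 degrees: the swipe is predominantly along X,
        # so the user's side is read off the Y coordinate of P, compared
        # with the swipe's average Y position for robustness.
        user_on_negative_y = p_dominant[1] < avg_y
        return "right" if (dx > 0) == user_on_negative_y else "left"
    else:
        # Tilt beyond +/-45 degrees: the swipe is predominantly along Y,
        # so the user's side is read off the X coordinate of P instead.
        user_on_negative_x = p_dominant[0] < avg_x
        return "right" if (dy > 0) != user_on_negative_x else "left"
```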

Claims

1. A method of operating an apparatus with a touchpad, the touchpad being configured for detecting gestures performed by a user and interpreting the gestures independently of the user's actual arbitrary location and the gestures' corresponding gesture-orientation relative to the touchpad; the touchpad comprising a proximity detection system for detecting the location of an object, for example a user's finger, in the proximity of the touchpad, and for detecting a movement performed with the object by the user in the proximity of the touchpad; the apparatus further comprising a touch detection system for detecting contact by said object with the surface of the touchpad and for detecting a gesture performed with the object by the user while the object is in contact with the touchpad; the method comprising:
- by the proximity detection system detecting a proximity-position P of an object in close proximity to the touchpad;
- by the touch detection system detecting a touch-position T or sequence of touch positions T of the object while in contact with the touchpad;
- determining from the proximity location P relative to the touch-position T or sequence of touch positions T a user-direction towards the user;
- from the user-direction relative to a predetermined touchpad default-orientation adjusting the orientation or position or both of the gesture prior to interpreting the gesture with respect to a gesture-associated command and finally executing the command.
2. A method according to claim 1, wherein the method comprises adjusting the orientation of the gesture with respect to right-to-left or left-to-right direction prior to interpreting the gesture.
3. A method according to claim 1 or 2, wherein the method comprises
- by the proximity detection system detecting a movement of the object in close proximity to the touchpad and averaging this movement to a dominant proximity-position P;
- determining from the dominant proximity-position P relative to the touch-position T or the sequence of touch positions T a user-direction towards the user.
4. A method according to claim 3, the method comprising
- providing a two dimensional X,Y coordinate system parallel to the surface of the touchpad,
- translating the movement to a sequence of proximity-coordinates in the coordinate-system and averaging this movement to a dominant proximity-position P in the coordinate-system;
- translating the gesture to touch-coordinates in the coordinate-system, optionally averaging the gesture to a dominant touch-position T in the coordinate-system;
- determining from the dominant proximity-position P relative to the touch-position T or the sequence of touch positions T, optionally the dominant touch-position T, a user-direction towards the user in the coordinate system;
- from the user-direction relative to the axes of the coordinate-system adjusting the orientation or position or both of the gesture in the coordinate system prior to interpreting the gesture with respect to a gesture-associated command and executing the command.
5. A method according to claim 4, wherein the method comprises moving the object across the surface of the touchpad along a linear or curved swipe-path from one position X,Y to another position X,Y, and determining whether the swipe-path is a left-swipe or right-swipe depending on the determined user-direction in the coordinate-system.
6. An apparatus for a method according to any preceding claim, the apparatus comprising a touchpad, the touchpad being configured for detecting gestures performed by a user and interpreting the gestures independently of the user's actual arbitrary location and the gestures' corresponding gesture orientation relative to the touchpad; the touchpad comprising a proximity detection system for detecting the location of an object, for example a user's finger, in the proximity of the touchpad, and for detecting a movement performed with the object by the user in the proximity of the touchpad; the apparatus further comprising a touch detection system for detecting contact by said object with the surface of the touchpad and for detecting a gesture performed with the object by the user while the object is in contact with the touchpad, characterized in that the apparatus is configured for the following:
- by the proximity detection system detecting a proximity-position P of an object in close proximity to the touchpad;
- by the touch detection system detecting a touch-position T or sequence of touch positions T of the object while in contact with the touchpad;
- determining from the proximity location P relative to the touch-position T or sequence of touch positions T a user-direction towards the user;
- from the user-direction relative to a predetermined touchpad default-orientation adjusting the orientation of the gesture prior to interpreting the gesture with respect to a gesture-associated command and finally executing the command.
7. An apparatus according to claim 6, wherein the proximity detection system comprises a plurality of proximity sensors organized along an outer perimeter of the touchpad.
8. An apparatus according to claim 7, wherein the touch detection system comprises a plurality of touch sensors organized on the surface of the touchpad and surrounded by the proximity sensors.
9. An apparatus according to any one of claims 6-8, wherein the proximity detection system comprises a plurality of infrared light emitters and a plurality of infrared light receivers, and said receivers are configured for measuring a background level of infrared light and correcting infrared proximity signals by subtracting the background level.
10. An apparatus according to claim 9, wherein the plurality of infrared light emitters are configured for one emitter of the plurality of emitters being active at a time, and wherein an electronic control circuit is configured for
- receiving a separate set of proximity signals from each one of the plurality of infrared receivers for every subsequent activation of each further infrared emitter of the plurality of emitters, or
- receiving the proximity signals from two infrared receivers, one on either side of the corresponding emitter, for every subsequent activation of each further infrared emitter of the plurality of emitters.
11. An apparatus according to any one of claims 6-10, wherein the touchpad is circular.
EP16741039.8A 2015-07-20 2016-07-20 Apparatus and method for detecting gestures on a touchpad Withdrawn EP3326052A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DKPA201500422 2015-07-20
PCT/EP2016/067335 WO2017013186A1 (en) 2015-07-20 2016-07-20 Apparatus and method for detecting gestures on a touchpad

Publications (1)

Publication Number Publication Date
EP3326052A1 true EP3326052A1 (en) 2018-05-30

Family

ID=56464223

Family Applications (1)

Application Number Title Priority Date Filing Date
EP16741039.8A Withdrawn EP3326052A1 (en) 2015-07-20 2016-07-20 Apparatus and method for detecting gestures on a touchpad

Country Status (3)

Country Link
EP (1) EP3326052A1 (en)
CN (1) CN107850969B (en)
WO (1) WO2017013186A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7442940B2 (en) 2020-07-07 2024-03-05 アルプスアルパイン株式会社 Proximity detection device
CN113190164A (en) * 2021-05-14 2021-07-30 歌尔股份有限公司 Operation method, system and equipment of equipment
CN115856912B (en) * 2023-02-06 2023-05-30 宜科(天津)电子有限公司 Data processing system for detecting movement direction of object

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8339379B2 (en) * 2004-04-29 2012-12-25 Neonode Inc. Light-based touch screen
US7519223B2 (en) * 2004-06-28 2009-04-14 Microsoft Corporation Recognizing gestures and using gestures for interacting with software applications
US8558161B2 (en) * 2010-08-10 2013-10-15 Avago Technologies General Ip (Singapore) Pte. Ltd. Lens having multiple conic sections for LEDs and proximity sensors
WO2013056157A1 (en) * 2011-10-13 2013-04-18 Autodesk, Inc. Proximity-aware multi-touch tabletop
US9223340B2 (en) * 2013-08-14 2015-12-29 Lenovo (Singapore) Pte. Ltd. Organizing display data on a multiuser display

Also Published As

Publication number Publication date
CN107850969A (en) 2018-03-27
CN107850969B (en) 2021-06-08
WO2017013186A1 (en) 2017-01-26

Similar Documents

Publication Publication Date Title
US9477324B2 (en) Gesture processing
US9448645B2 (en) Digitizer using multiple stylus sensing techniques
US8169404B1 (en) Method and device for planary sensory detection
US8902193B2 (en) Interactive input system and bezel therefor
US20100207910A1 (en) Optical Sensing Screen and Panel Sensing Method
EP2274666A1 (en) Interactive input system and pen tool therefor
US20120249487A1 (en) Method of identifying a multi-touch shifting gesture and device using the same
US20210389818A1 (en) System and method for human interaction with virtual objects
EP3326052A1 (en) Apparatus and method for detecting gestures on a touchpad
US20170170826A1 (en) Optical sensor based mechanical keyboard input system and method
US9703410B2 (en) Remote sensing touchscreen
US20140111478A1 (en) Optical Touch Control Apparatus
KR101672731B1 (en) 3d hovering digitizer system using pen tilt
US20130120361A1 (en) Spatial 3d interactive instrument
JP5692764B2 (en) Object detection method and apparatus using the same
KR102169236B1 (en) Touchscreen device and method for controlling the same and display apparatus
EP2315106A2 (en) Method and system for detecting control commands
CN113906372A (en) Aerial imaging interaction system
JP4136584B2 (en) Coordinate input device, coordinate value output method and program
KR101652973B1 (en) Digitizer using gradient of stylus pen
US10372268B2 (en) Spatial image display apparatus and spatial image display method
KR102254091B1 (en) Touchscreen device and method for controlling the same and display apparatus
KR20140081425A (en) Input method with touch panel
KR20140066378A (en) Display apparatus and method of controlling the same
KR101966585B1 (en) Space touch device and and display device comprising the same

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20180201

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20190909

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: BANG & OLUFSEN A/S

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20220104