WO2023201386A1 - An input system - Google Patents

An input system

Info

Publication number
WO2023201386A1
Authority
WO
WIPO (PCT)
Prior art keywords
input
interactions
positional
controller
interaction
Prior art date
2022-04-21
Application number
PCT/AU2023/050272
Other languages
French (fr)
Inventor
Clement Koh
Original Assignee
Clement Koh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2022-04-21
Filing date
2023-04-05
Publication date
Priority claimed from AU2022901055A0
Application filed by Clement Koh
Priority to AU2023255407A (published as AU2023255407B2)
Publication of WO2023201386A1

Classifications

    • G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06F ELECTRIC DIGITAL DATA PROCESSING > G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements > G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer:
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality > G06F3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form > G06F3/0304 Detection arrangements using opto-electronic means
    • G06F3/03 > G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor > G06F3/0354 Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks > G06F3/03547 Touch pads, in which fingers can move on a surface
    • G06F3/03 > G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means > G06F3/042 Digitisers characterised by opto-electronic transducing means > G06F3/0425 Digitisers using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G06F3/03 > G06F3/041 > G06F3/044 Digitisers characterised by capacitive transducing means
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048 > G06F2203/041 Indexing scheme relating to G06F3/041 - G06F3/045 > G06F2203/04104 Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
    • G06F2203/00 > G06F2203/048 Indexing scheme relating to G06F3/048 > G06F2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Position Input By Displaying (AREA)

Abstract

An input system has a positional input configured for detecting positional input interactions with a surface thereof, and a plurality of input devices configured for detecting touch interactions. A controller is configured to receive signals from the positional input and the input devices, to match positional input interactions with touch interactions by temporally correlating the timing of the interactions, and to match any remaining unmatched interactions by spatially correlating an assigned spatial ordering of the input devices with coordinates of the positional input interactions, so as to generate input interaction signals for a computer device.

Description

An input system
Field of the Invention
[0001] This invention relates generally to a computer input system.
Background of the Invention
[0002] Input systems allow users to interact with electronic devices using a variety of input devices. These input devices may comprise mice, keyboards, touch sensitive pads, touch sensitive screen overlays and the like.
[0003] US 2022/0026998 A1 (APPLE INC.) 27 January 2022 gives an example of a user input system including a stylus and an electronic device, wherein a user may manipulate the stylus across an input surface of the electronic device and the movement may be detected using axially-aligned electric fields generated by the stylus.
[0004] According to this document (D1), the stylus may be identified by a “ring signal” which is detected through the surface of the electronic device. In this way, more than one stylus may interact simultaneously with the electronic device.
[0005] The present invention seeks to provide an input system which will overcome or substantially ameliorate at least some of the deficiencies of the prior art, or at least provide an alternative.
[0006] It is to be understood that, if any prior art information is referred to herein, such reference does not constitute an admission that the information forms part of the common general knowledge in the art, in Australia or any other country.
Summary of the Disclosure
[0007] There is provided herein an input system interfacing a device. The system comprises a positional input configured for detecting positional input interactions with a surface thereof. Each positional input interaction is assigned respective coordinates with respect to the surface.
[0008] The system further comprises a plurality of input devices configured for detecting touch interactions.
[0009] Each touch interaction is assigned a respective input device ID. Furthermore, the input devices are assigned a spatial ordering, which is configured with respect to an axis of the surface.
[0010] The input devices may comprise finger wearable devices, wherein each input device may comprise a slimline capacitive sensing pad at each fingertip which is wired to a central wrist-worn controller. The wrist-worn controller may receive capacitance signals from each pad to detect the touch interactions and to assign a touch ID to each touch interaction.
[0011] The system further comprises a controller configured to receive signals from the positional input and the input devices.
[0012] The controller is configured for matching positional input interactions with respective touch interactions by temporally correlating timing of the interactions.
[0013] However, for simultaneous touches, for example wherein the user presses several fingers simultaneously against the surface of the positional input so that more than one touch interaction appears within a sampling window of a positional input interaction, the controller may be further configured for matching any remaining unmatched interactions by spatially correlating the assigned spatial ordering of the input devices and the coordinates of the positional input interactions.
[0014] Having matched the positional input interactions and the touch interactions, the controller is able to generate input interaction signals for the device, each input interaction comprising respective coordinates and an ID of a respective input device.
[0015] In embodiments, the controller may be configured according to an application-specific context requiring specific hand positioning and input device configurations. A controller interface may be installed on an operating system of the device which allows an application to configure the controller with an application-specific context of the application.
[0016] The application-specific context may be used to determine the spatial ordering and the orientation of the spatial axis. As such, the controller is able to generate the input interaction signals by processing the interactions according to the provided application-specific context.
[0017] In embodiments, the controller may process the interactions according to defined regions which may be specific to areas of a surface of a positional input.
[0018] Other aspects of the invention are also disclosed.
Brief Description of the Drawings
[0019] Notwithstanding any other forms which may fall within the scope of the present invention, preferred embodiments of the disclosure will now be described, by way of example only, with reference to the accompanying drawings in which:
[0020] Figure 1 shows a schematic of an input system interfacing the device in accordance with an embodiment;
[0021] Figure 2 shows a physical example of the input system interfacing the device;
[0022] Figure 3A shows examples of four input devices used on a single positional input;
[0023] Figure 4A shows an example wherein a user uses a pair of input devices 104 on each hand and wherein the hands are orientated at 90°;
[0024] Figure 5A shows an example wherein input devices 104 are restricted to particular regions 125;
[0025] Figure 6 shows examples of nonorthogonal spatial ordering axes and nonrectangular regions;
[0026] Figure 7 shows a conventional paint application being configured with the present input system;
[0027] Figure 8 shows processing 128 by the controller for generating the input interactions 108;
[0028] Figure 9 shows processing 134 by the controller 105 to match the positional input interactions 106 with the touch interactions 107;
[0029] Figure 10 shows temporal matching processing 139 by the controller 105;
[0030] Figure 11 shows spatial matching processing by the controller; and
[0031] Figure 12 shows application-specific context processing 150.
Description of Embodiments
[0032] Figure 1 shows a functional block diagram of an input system 101 interfacing a computer device 102. With reference to Figure 2, the device 102 may comprise a digital display 115 configured to display a user interface 116 of an application 112. The user interface 116 may comprise an input pointer 117.
[0033] The system 101 comprises a positional input 103. The positional input 103 is configured for detecting positional input interactions with a surface thereof. Each positional input interaction is assigned respective XY coordinates with respect to the surface.
[0034] According to the embodiment shown in Figure 2, the positional input 103 comprises a touchpad. However, in alternative embodiments, the positional input 103 may comprise a touch sensitive layer overlaid on the display 115 of the device 102.
[0035] The system 101 may comprise a plurality of positional inputs 103. For example, in the embodiment shown in Figure 2, the system 101 comprises three positional inputs 103.
[0036] The positional inputs 103 may detect positional input interactions therewith capacitively. For example, if an object touches the surface 118 of the positional input 103, a matrix of capacitive sensors underneath detects changes in capacitance at the position of the object, so that the positional input can generate a corresponding positional input interaction comprising XY coordinates.
[0037] Each positional input interaction may comprise a down state when the interaction is first detected (i.e., when an object presses down on the surface 118), a tracking state (i.e., as the object moves across the surface 118) and an up state (i.e., when the object is lifted from the surface 118).
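By way of illustration only, this interaction life cycle may be modelled as follows. This is a minimal Python sketch; the class and field names are assumptions for illustration and are not defined by the specification.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Phase(Enum):
    DOWN = auto()      # first sampling frame: object pressed onto the surface
    TRACKING = auto()  # intermediate frames: object moving across the surface
    UP = auto()        # final frame: object lifted from the surface

@dataclass
class PositionalInteraction:
    x: float           # XY coordinates with respect to the surface
    y: float
    timestamp: float   # time of the sampling frame
    phase: Phase = Phase.DOWN
```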
[0038] The system 101 further comprises a plurality of input devices 104. Each input device 104 is configured for detecting touch interactions. Each touch interaction is assigned a respective input device ID.
[0039] According to the preferred embodiment shown, the input devices 104 are fingertip devices. According to the example shown in Figure 2, four input devices 104 are worn on the index finger and thumb of each of the left and right hands. Each input device 104 would be assigned a respective device ID, such as a numeric index, for example 1-4 for the four input devices 104.
[0040] According to an embodiment, each input device 104 comprises a slimline capacitive sensing pad at each fingertip which is wired to a central wrist-worn controller. The wrist-worn controller may receive analog capacitance signals from each pad to detect the touch interactions and to assign the respective touch ID accordingly. The wrist-worn controller may transmit the detected touch interactions 107 to a controller 105 wirelessly, such as via Bluetooth.
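Purely as a hedged sketch, the wrist-worn controller's conversion of per-pad capacitance readings into touch interactions might resemble the following; the threshold value, the normalisation and all names are assumptions, as the specification does not define the signal processing.

```python
import time

TOUCH_THRESHOLD = 0.6  # hypothetical normalised-capacitance threshold

def detect_touches(samples):
    """`samples` maps input device ID -> normalised capacitance for one frame."""
    now = time.monotonic()
    return [
        {"device_id": dev_id, "timestamp": now, "state": "down"}
        for dev_id, capacitance in samples.items()
        if capacitance >= TOUCH_THRESHOLD  # pad pressed hard enough to register
    ]

# Example frame: pads 1 and 2 pressed, pads 3 and 4 idle.
events = detect_touches({1: 0.82, 2: 0.71, 3: 0.12, 4: 0.05})
```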
[0041] The input device 104 IDs are assigned a spatial ordering with respect to an axis of the surface 118. For example, with reference to Figure 2, the axis may be the X axis of the positional inputs 103. The spatial ordering may be assigned directionally with respect to the axis.
[0042] The axis may also comprise the Y axis. In embodiments, the axis may be non-orthogonal and the orientation thereof may be configurable.
[0043] Given the orientation of the hands and the respective positions of the hands shown in Figure 2, the index finger of the left hand would be assigned spatial ordering 1, the thumb of the left hand would be assigned spatial ordering 2, the thumb of the right hand would be assigned spatial ordering 3 and the index finger of the right hand would be assigned spatial ordering 4.
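As an illustrative sketch only, such a spatial ordering can be derived by projecting each contact's XY coordinates onto a unit vector in the axis direction, which also accommodates the configurable, non-orthogonal orientations mentioned above. The function below is an assumption for illustration, not part of the specification.

```python
import math

def spatial_order(points, axis_angle_deg=0.0):
    """Return the indices of `points` sorted along the spatial axis direction."""
    ax = math.cos(math.radians(axis_angle_deg))
    ay = math.sin(math.radians(axis_angle_deg))
    # Projecting each XY coordinate onto the axis gives its 1D position along it.
    projections = sorted((px * ax + py * ay, i) for i, (px, py) in enumerate(points))
    return [i for _, i in projections]

# Four contacts laid out left to right as in Figure 2 (axis along X):
print(spatial_order([(10, 50), (30, 52), (60, 48), (90, 51)]))  # -> [0, 1, 2, 3]
```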
[0044] The controller 105 is configured to receive signals from the positional input 103 and the input devices 104. As alluded to above, the positional input interaction signals 106 may comprise the XY coordinates with respect to the surface 118 of the positional input 103. Furthermore, the touch interaction signals 107 may comprise respective device IDs.
[0045] The controller 105 is configured to match positional input interactions 106 with touch interactions 107 by temporally correlating timing of the interactions 106, 107.
[0046] The controller 105 is further configured for matching any remaining unmatched interactions 106, 107 by spatially correlating the assigned spatial ordering of the input devices 104 and the coordinates of the positional input interactions 106.
[0047] The controller 105 then generates input interaction signals 108 for the device 102 according to the matched interactions 106, 107. Each input interaction 108 comprises respective XY coordinates and an ID of a respective input device 104.
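A minimal sketch of such a signal, assuming Python dataclasses and illustrative names, might be:

```python
from dataclasses import dataclass

@dataclass
class InputInteraction:
    x: float        # XY coordinates taken from the positional input interaction
    y: float
    device_id: int  # ID carried by the matched touch interaction

def to_signal(positional, touch):
    """Combine a matched pair into the signal sent to the device."""
    return InputInteraction(positional.x, positional.y, touch.device_id)
```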
[0048] The controller 105 may interface a human interface device (HID) interface 110 of the device 102. For example, the controller 105 may interface an HID interface 110 of the device 102 to control a mouse cursor 117.
[0049] Additionally, or alternatively, the controller interface 111 may be installed in an operating system 113 of the device 102. The controller interface 111 is configured to interact with the controller 105 to receive and implement the interaction signals 108.
[0050] In the embodiment shown, the input system 101 is separate from the device 102 and interfaces the device 102 via a wired or wireless interface. Alternatively, the input system 101 may be implemented partially or entirely by software executed by the device 102.
[0051] The controller interface 111 may be used to provide an application-specific context for the controller 105. For example, different applications 112 may require different configurations of input devices 104 and hand orientations.
[0052] For example, Figure 7A shows an interface 122 of a conventional paint application 112 controlled using a keyboard 119 and mouse 120. The mouse 120 controls a mouse pointer 121 on screen.
[0053] The interface 122 comprises object selection controls 123, each of which may be selected using the pointer 121 to select a pencil or an eraser. The interface 122 further comprises layer selection controls 124 wherein layers may be selected using the pointer 121.
[0054] Figure 7B shows the paint application 112 configured using the input system 101, wherein a first positional input 103A is used for interaction with a first layer and a second positional input 103B is used for interaction with a second layer.
[0055] Furthermore, a first input device 104A is used for the pencil and a second input device 104B is used for the eraser. As such, using just one hand, the user may draw and erase on each layer.
[0056] According to this example, the currently executing paint application 112 will use the controller interface 111 to transmit the application-specific context 109 to the controller 105.
[0057] In this example, the application-specific context is used to determine the spatial ordering down the y-axis of the positional input 103 (given the orientation of the hand) wherein the index finger input device 104A is assigned spatial ordering 1 and the thumb input device 104B is assigned spatial ordering 2.
[0058] In embodiments, the controller 105 may interpret the positional input interactions 106 and the touch interactions 107 according to defined regions 125. For example, with reference to Figure 1, the surface 118 of a positional input 103 may be segmented into regions 125.
[0059] For example, with reference to the example of Figure 7, a pair of input devices 104 may be worn on the left hand and another pair of input devices 104 on the right hand.
[0060] The region 125 may represent a positional input ID and corresponding respective device IDs. For example, a region 125A may specify that only the input devices 104 of the left hand may be used on the first positional input 103A, and another region 125B may specify that only the input devices 104 of the right hand may be used on the second positional input 103B. Each region 125 may further be assigned an area, a spatial ordering and a spatial axis.
[0061] Figure 3A gives an example of four input devices 104 used on a single positional input 103. According to this example, both hands are orientated at 0° and the spatial ordering is with respect to the X axis.
[0062] Exemplary interactions 127 are shown at respective coordinates and device IDs.
[0063] As alluded to above, each interaction 127 may comprise a down state (one sampling frame) when the input device 104 presses against the surface 118 of the positional input 103, tracking states (a plurality of sampling frames) as the input device 104 moves across the surface 118 of the positional input 103, and an up state (one sampling frame) wherein the input device 104 is lifted from the surface 118 of the positional input 103.
[0064] The controller 105 may be configured to group down states within a sampling window. The sampling window may be configured to be just long enough so that the controller 105 can receive signals from the positional input 103 and at least one input device 104.
[0065] According to the temporal correlation, if only one positional input interaction 106 and only one touch interaction 107 is received within the sampling window, the interactions 106, 107 can be matched.
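The temporal rule may be sketched as follows, as a slight generalisation that tests each positional interaction's own window. This is illustrative Python under stated assumptions: interactions carry a `timestamp` attribute, touches carry a `device_id`, and the touch buffer is consumed in place.

```python
def temporal_match(positionals, touches, window):
    """Match positional down states to touch down states when unambiguous;
    return (matches, leftover_positionals)."""
    matches, leftovers = [], []
    for p in positionals:
        hits = [t for t in touches
                if p.timestamp <= t.timestamp <= p.timestamp + window]
        if len(hits) == 1:           # exactly one candidate: an unambiguous match
            matches.append((p, hits[0]))
            touches.remove(hits[0])  # a touch may only be consumed once
        else:
            leftovers.append(p)      # zero or several candidates: defer to spatial stage
    return matches, leftovers
```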
[0066] However, if more than one positional input interaction 106 and/or more than one touch interaction 107 is received within the sampling window, the controller 105 performs spatial correlation. For example, the user may press all four fingers onto the surface 118 of the positional input 103 simultaneously. Spatial correlation may reference the down states of unmatched interactions 106, 107, and the controller 105 may perform pairwise matching of the positional input interactions 106 and the touch interactions 107 according to the spatial ordering and the spatial axis 126.
[0067] For example, in Figure 3 the spatial axis is a positive x-axis for both the region and the input devices 104. The input devices 104 are assigned spatial ordering 1 for the index finger of the left hand, spatial ordering 2 for the thumb of the left hand, spatial ordering 3 for the thumb of the right hand and spatial ordering 4 for the forefinger of the right hand. The coordinates generated on the positional input 103 are assigned from left to right, with the leftmost coordinate assigned spatial ordering 1 and the rightmost coordinate assigned spatial ordering 4. Furthermore, the input device 104 IDs match the assigned spatial ordering.
[0068] As such, the controller 105 would perform the spatial correlation by pairwise matching the leftmost coordinate with spatial ordering 1 to determine that the left-hand index finger with spatial ordering 1 performed interaction 127A.
[0069] The controller 105 would then match the next leftmost coordinate with spatial ordering 2 to determine that the left-hand thumb with spatial ordering 2 performed interaction 127B, and so on until all interactions 127 are matched.
[0070] Figure 3B gives an example of an invalid input caused by swapping of the hands, so that the positioning of the input devices 104 no longer correlates with the assigned spatial ordering. Figure 3C further gives an example of an invalid input caused by changing the orientation of the hands.
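In effect, the spatial correlation reduces to sorting the unmatched coordinates along the spatial axis and pairing them, in order, with the device IDs in their assigned spatial ordering. The following Python sketch is illustrative only; `device_ids` is assumed pre-sorted by the assigned ordering (e.g. [1, 2, 3, 4] for Figure 3A).

```python
import math

def spatial_match(positionals, device_ids, axis_angle_deg=0.0):
    """Pair coordinates with device IDs by their order along the spatial axis."""
    ax = math.cos(math.radians(axis_angle_deg))
    ay = math.sin(math.radians(axis_angle_deg))
    ordered = sorted(positionals, key=lambda p: p.x * ax + p.y * ay)
    if len(ordered) != len(device_ids):
        return None                         # invalid input: counts do not agree
    return list(zip(ordered, device_ids))   # leftmost coordinate pairs with ordering 1
```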
[0071] Figure 4A gives an example wherein a user uses a pair of input devices 104 on each hand and wherein the hands are orientated at 90°. Figure 4 further gives an example wherein spatial ordering is assigned to respective regions 125 of the positional input 103. In the example shown, each region 125 is rectangular and may be defined by minimum and maximum XY coordinates.
[0072] The first region 125A may have spatial ordering assigned to a downward vertical axis 126A and the second region 125B may have spatial ordering assigned to an upward vertical axis.
[0073] More specifically, the first region 125A may have spatial ordering of {1, 2, 3, 4} down along the vertical axis and the second region 125B may have spatial ordering of {1, 2, 3, 4} up along the vertical axis.
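Such a region may be modelled as below; this is an assumption-laden Python sketch with illustrative field names, covering the rectangle, the per-region ordering and axis, and the optional device-ID restriction introduced with Figure 5.

```python
from dataclasses import dataclass
from typing import List, Optional, Set

@dataclass
class Region:
    x_min: float
    y_min: float
    x_max: float
    y_max: float
    ordering: List[int]                     # expected spatial ordering, e.g. [1, 2, 3, 4]
    axis_angle_deg: float                   # e.g. 90 for "down along the vertical axis"
    allowed_ids: Optional[Set[int]] = None  # optional device-ID restriction

    def contains(self, p) -> bool:
        return self.x_min <= p.x <= self.x_max and self.y_min <= p.y <= self.y_max
```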
[0074] Figure 4B gives an example of an invalid input because the right hand is inverted, wherein the forefinger input device 104C having ID 4 is above the thumb input device 104D having ID 3, thereby being incompatible with the assigned spatial ordering of {1, 2, 3, 4}.
[0075] However, Figure 4C gives an example of a valid input because the right hand is now rotated so that the input device IDs correlate with the assigned spatial ordering.
[0076] Figure 4D gives a further example of an invalid input because the input devices 104 of the right hand are above the input devices 104 of the left hand, thereby being incompatible with the assigned spatial ordering.
[0077] Figure 5A gives an example wherein input devices 104 are restricted to particular regions 125. This is represented by a list of input device IDs next to the region’s spatial ordering. The list of input device IDs is sorted by the input device’s spatial ordering. In the example shown, forefinger and thumb input devices 104 are used on each hand, each hand is orientated at approximately 90° and the hands are crossed over. The resulting match is {3, 4, 1, 2}, i.e. coordinate 1 matches to device 3, coordinate 2 to device 4, and so on.
[0078] First region 125A has spatial ordering assigned up a vertical Y-axis 126A and second region 125B has spatial ordering assigned down a vertical Y-axis 126B.
[0079] Furthermore, according to Figure 5A, the first region 125A is restricted to interacting with input devices 104 having IDs 3 and 4, whereas the second region 125B is restricted to interacting with input devices 104 having IDs 1 and 2.
[0080] Figure 5B gives an example of a valid input: because the first region 125A is restricted to input device IDs {1, 2, 4}, coordinates 1, 2 and 3 are matched to {1, 2, 4}, and coordinate 4 is matched to device 3 as the other IDs have been consumed in the prior coordinate matches. The resulting match is {1, 2, 4, 3}.
[0081] Figure 5C is an example of an invalid input because, when compared to Figure 5B, the region’s spatial ordering is reversed and therefore the coordinates are sorted differently, starting with the right region, then the left region. Further, it is not possible to match coordinate 4 to an input device ID because the valid input device IDs {1, 2, 4} have been previously consumed. The resulting match is {1, 2, 4, ?}.
[0082] Figure 5D gives a further example of an invalid input because coordinate 1 is matched incorrectly to input device 1 instead of 4. The resulting match is {1, 2, 4, 3}.
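The consumption rule that Figures 5B to 5D illustrate may be reconstructed, under stated assumptions, as a walk over the already-sorted coordinates that gives each one the first allowed, not-yet-consumed device ID. This illustrative Python reuses the hypothetical `Region` sketch above.

```python
def region_restricted_match(sorted_coords, regions, all_ids):
    """Assign device IDs to coordinates already sorted by the regions'
    spatial ordering, honouring per-region ID restrictions."""
    consumed, result = set(), []
    for p in sorted_coords:
        region = next((r for r in regions if r.contains(p)), None)
        allowed = region.allowed_ids if region and region.allowed_ids else set(all_ids)
        candidates = [i for i in all_ids if i in allowed and i not in consumed]
        if not candidates:
            result.append(None)        # like coordinate 4 in Figure 5C: no valid ID remains
        else:
            result.append(candidates[0])
            consumed.add(candidates[0])
    return result                       # the Figure 5B layout yields [1, 2, 4, 3]
```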
[0083] Figure 6 gives an example wherein the spatial ordering axes 126 are non-orthogonal and wherein the orientation thereof may be configured, including in accordance with the application-specific context. In the example shown, the spatial ordering axes may be at approximately 45° with respect to the orthogonal axes.
[0084] Furthermore, Figure 6A gives an example of non-rectangular regions wherein a first region 125A is circular and a second region 125B is L-shaped.
[0085] Furthermore, Figure 6 gives the example wherein each region 125 is restricted to certain device identifiers, and the region’s spatial ordering may be affected by the parent positional input’s spatial ordering.
[0086] Whereas Figure 6 shows the regions 125 being separate, in embodiments the regions 125 may be overlapping.
[0087] In embodiments, the regions 125 may be reoriented and/or reshaped according to the application’s requirements.
[0088] Figure 8 shows processing 128 by the controller for generating the input interactions 108.
[0089] At step 129, the controller 105 receives the positional input interactions 106 which may be stored in a buffer. At step 130, the controller 105 receives the touch interactions 107 which may also be stored in a buffer.
[0090] Where the system 101 comprises more than one positional input 103, at step 131 the controller 105 assigns each positional input interaction 106 to a respective positional input 103, whereafter it is removed from the buffer.
[0091] Furthermore, the controller 105 may assign each touch interaction 107 to a respective input device 104, whereafter it is removed from the buffer.
[0092] At step 132, the controller 105 matches the positional input interactions 106 and the touch interactions 107 to generate the input interactions 108 for the device 102.
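Composing the earlier illustrative sketches, the overall flow of steps 129 to 132 might read as follows. This is a sketch under the stated assumptions, not the specification's implementation; in particular it assumes the leftover positional interactions have already been sorted along the applicable region's spatial axis (as step 148, described later, provides).

```python
def generate_input_interactions(pos_buffer, touch_buffer, window, regions, all_ids):
    signals = []
    # Stage 1: unambiguous temporal matches within the sampling window.
    temporal, leftovers = temporal_match(pos_buffer, touch_buffer, window)
    for p, t in temporal:
        signals.append(InputInteraction(p.x, p.y, t.device_id))
    # Stage 2: spatial matching of whatever remains, honouring region restrictions.
    for p, dev_id in zip(leftovers, region_restricted_match(leftovers, regions, all_ids)):
        if dev_id is not None:          # an unmatched coordinate marks an invalid input
            signals.append(InputInteraction(p.x, p.y, dev_id))
    return signals
```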
[0093] Figure 9 shows processing 134 (being a sub-flow of step 132) by the controller 105 to match the positional input interactions 106 with the touch interactions 107.
[0094] At step 135, the controller 105 takes an unmatched positional input interaction 106 and, at step 136, attempts to match it to a corresponding touch interaction 107. As alluded to above, a temporal match may be found where only one touch interaction 107 occurs within a sampling window associated with a positional input interaction 106, such as a sampling window commencing from the down state of the positional input interaction 106.
[0095] If temporal matching fails, such as wherein more than one touch interaction 107 occurs within the same sampling window, the processing 134 may proceed to spatial matching at step 137.
[0096] Temporal and/or spatial matching occurs at steps 136 and 137 for all interactions 106, 107 in the buffer, whereafter the processing 134 exits at step 138.
[0097] Figure 10 shows temporal matching processing 139 (being a sub-flow of step 136) by the controller 105 wherein, at step 140, the controller 105 ascertains the touch interactions 107 falling within a sampling window (T) associated with a positional input interaction 106 (such as a sampling window commencing from the down state of a positional input interaction 106). The controller 105 may determine whether the down state of a touch interaction 107 falls within the window.
[0098] If, at step 141, only one touch interaction 107 is found within the sampling window, a match is assigned at step 142; otherwise, the processing 139 exits at step 143 and proceeds to the spatial matching processing 144 shown in Figure 11 (being a sub-flow of step 137).
[0099] At steps 145 and 146, the controller 105 ascertains positional input interactions 106 and touch interactions 107 falling within a sampling window. The chosen sampling window may commence from the down state of a positional input interaction 106.
[0100] At step 147, the controller 105 may ascertain if any regions are applicable based on the collected positional input interactions.
[0101] At step 148, the controller determines the spatial ordering of each collected positional input interaction 106 using the spatial ordering and spatial axis of a corresponding region 125.
[0102] At step 149, the controller performs pairwise matching of each positional input interaction to an input device 104, in the sorted order. A match is found if the input device is valid for the positional input interaction’s region 125 and has not been consumed in a previous match.
[0103] Figure 12 shows application-specific context processing 150.
[0104] At step 151, the application 112 may detect a change of application-specific context, such as when the paint application of the example of Figure 7B is launched. The application 112 may use the controller interface 111 to transmit the application-specific context 109 to the controller 105 to configure the input system 101 at step 152.
[0105] As alluded to above, the application-specific context 109 may be used to determine the spatial ordering and the orientation of the spatial ordering axis.
[0106] Furthermore, the application-specific context 109 may specify regions 125 wherein device IDs are assigned to an area of a positional input 103.
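By way of a hypothetical example only, a context payload for the Figure 7B paint configuration might resemble the following; the schema and the `configure` call are assumptions, as the specification does not define a wire format.

```python
paint_context = {
    "positional_inputs": [
        {"positional_input_id": "103A",       # first layer's touchpad
         "regions": [{"rect": (0, 0, 100, 100),
                      "ordering": [1, 2],     # 1 = pencil 104A, 2 = eraser 104B
                      "axis_angle_deg": 90,   # spatial ordering down the y-axis
                      "allowed_ids": [1, 2]}]},
        {"positional_input_id": "103B",       # second layer's touchpad
         "regions": [{"rect": (0, 0, 100, 100),
                      "ordering": [1, 2],
                      "axis_angle_deg": 90,
                      "allowed_ids": [1, 2]}]},
    ],
}
# controller_interface.configure(paint_context)  # hypothetical call at step 152
```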
[0107] At step 153, the user may be notified of the change in application-specific context. Such notification may instruct the user as to the correct positioning of the hands and input devices 104, which the user follows at step 154.
[0108] As such, at step 155, the input system 101 processes the positional input interactions 106 and the touch interactions 107 according to the new application-specific context to generate the input interaction signals at step 156.
[0109] The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the invention. However, it will be apparent to one skilled in the art that specific details are not required in order to practise the invention. Thus, the foregoing descriptions of specific embodiments of the invention are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise forms disclosed as obviously many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated. It is intended that the following claims and their equivalents define the scope of the invention.

Claims
1. An input system interfacing a device, the system comprising:
a positional input configured for detecting positional input interactions with a surface thereof, each positional input interaction assigned respective coordinates with respect to the surface;
a plurality of input devices configured for detecting touch interactions, each touch interaction assigned a respective input device ID, the input devices assigned a spatial ordering, the spatial ordering configured with respect to a spatial axis of the surface;
a controller configured to:
receive signals from the positional input and the input devices;
match positional input interactions with touch interactions by temporally correlating timing of the interactions;
match any remaining unmatched interactions by spatially correlating the assigned spatial ordering of the input devices and the coordinates of the positional input interactions; and
generate input interaction signals for the device according to matched interactions, each input interaction comprising respective coordinates and an ID of a respective input device.
2. The system as claimed in claim 1, wherein the spatial ordering is directional with respect to the spatial axis.
3. The system as claimed in claim 1, wherein an orientation of the spatial axis is configurable.
4. The system as claimed in claim 1, wherein a surface of the positional input comprises regions and wherein spatial ordering is assigned to respective regions.
5. The system as claimed in claim 1, wherein the system comprises a plurality of positional inputs and wherein spatial ordering is assigned to respective positional inputs.
6. The system as claimed in claim 1, wherein the controller is configurable with an application-specific context and wherein the application-specific context determines the spatial ordering.
7. The system as claimed in claim 6, wherein the controller is configured to receive application-specific context changes from the device.
8. The system as claimed in claim 7, wherein a controller interface is installed on an operating system of the device which transmits the application-specific context changes to the controller.
9. The system as claimed in claim 1, wherein the controller is configured to process the interactions according to defined regions.
10. The system as claimed in claim 9, wherein a region comprises a positional input ID, respective input device IDs, spatial ordering and spatial axis.
11. The system as claimed in claim 10, wherein the regions overlap.
12. The system as claimed in claim 10, wherein the controller is configured to discard a touch interaction having an ID not matching input device IDs of a region.
13. The system as claimed in claim 10, wherein at least one of the shape and orientation of a region is application configurable.
14. The system as claimed in claim 1, wherein the controller is configured to temporally match a touch interaction and a positional input interaction only if the touch interaction is the only touch interaction occurring within a sampling window associated with a positional input interaction.
15. The system as claimed in claim 14, wherein the sampling window commences from a down state of the positional input interaction.
16. The system as claimed in claim 14, wherein the controller is configured to only perform spatial correlation if more than one touch interaction occurs within the sampling window.

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2023255407A AU2023255407B2 (en) 2022-04-21 2023-04-05 An input system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2022901055A AU2022901055A0 (en) 2022-04-21 A positional input system
AU2022901055 2022-04-21

Publications (1)

Publication Number Publication Date
WO2023201386A1 true WO2023201386A1 (en) 2023-10-26

Family

ID=88418666

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2023/050272 WO2023201386A1 (en) 2022-04-21 2023-04-05 An input system

Country Status (2)

Country Link
AU (1) AU2023255407B2 (en)
WO (1) WO2023201386A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140168142A1 (en) * 2012-12-18 2014-06-19 Logitech Europe S.A. Method and system for discriminating stylus and touch interactions
US20140267078A1 (en) * 2013-03-15 2014-09-18 Adobe Systems Incorporated Input Differentiation for Touch Computing Devices
US20150346847A1 (en) * 2008-12-22 2015-12-03 Microsoft Technology Licensing, Llc Digitizer, stylus and method of synchronization therewith
US20160364052A1 (en) * 2015-06-10 2016-12-15 International Business Machines Corporation Touch interface with person recognition
US20200348817A1 (en) * 2017-08-23 2020-11-05 Flatfrog Laboratories Ab Pen touch matching

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6924793B2 (en) * 2002-07-16 2005-08-02 Hewlett-Packard Development Company, L.P. Multi-styli input device and method of implementation

Also Published As

Publication number Publication date
AU2023255407B2 (en) 2023-12-21
AU2023255407A1 (en) 2023-11-30

Similar Documents

Publication Publication Date Title
US8266529B2 (en) Information processing device and display information editing method of information processing device
JP4795343B2 (en) Automatic switching of dual mode digitizer
US9838527B2 (en) Control circuit of electrostatic capacitive sensor and electronic device using the same
US9454274B1 (en) All points addressable touch sensing surface
KR101471267B1 (en) Method and device for generating dynamically touch keyboard
US20030098858A1 (en) Dual function input device and method
WO2016072823A2 (en) Loop-shaped tactile multi-touch input device and gestures, and method therefor
US20070126711A1 (en) Input device
CN103164067B (en) Judge the method and the electronic equipment that touch input
WO2013064915A1 (en) Single hand multi-touch surface keyboard
CN101382851A (en) Computer system
CN102693035A (en) Modal touch input
WO2004010276A1 (en) Information display input device and information display input method, and information processing device
CN1577383A (en) Touch screen system and control method therefor capable of setting active regions
KR20130099420A (en) Terminal having capacitive type touch screen and method of touch position detecting the same
JPH10228350A (en) Input device
EP2410416B1 (en) Input device and control method thereof
US20160259545A1 (en) Touch-control devices and methods for determining keys of a virtual keyboard
AU2023255407B2 (en) An input system
US9342196B2 (en) Hardware accelerator for touchscreen data processing
US20210325984A1 (en) Wearable information terminal
US10303299B2 (en) Use of groove analysis in a touch screen device to determine occurrence of an elongated touch by a single finger
US9753592B2 (en) Control circuit of electrostatic capacitive sensor and electronic device using the same
CN209803748U (en) Non-contact capacitance type virtual mouse control system
CN215642661U (en) Operation control device for medical equipment

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 2023255407

Country of ref document: AU

Ref document number: AU2023255407

Country of ref document: AU

ENP Entry into the national phase

Ref document number: 2023255407

Country of ref document: AU

Date of ref document: 20230405

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23790223

Country of ref document: EP

Kind code of ref document: A1