WO2020027818A1 - Determining location of touch on touch sensitive surfaces - Google Patents

Determining location of touch on touch sensitive surfaces

Info

Publication number
WO2020027818A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
sensitive surface
touch sensitive
touch
location
Application number
PCT/US2018/044683
Other languages
French (fr)
Inventor
Yow-Wei CHENG
Yun Tang
Hao Meng
Original Assignee
Hewlett-Packard Development Company, L.P.
Application filed by Hewlett-Packard Development Company, L.P.
Priority to PCT/US2018/044683
Publication of WO2020027818A1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected

Definitions

  • Computing systems such as desktop computers, laptops, and smartphones may have touch sensitive surfaces which may allow interaction of users with the computing systems.
  • a touch sensitive surface may facilitate a touch-based input to be provided by a user to a computing system.
  • an image capturing system such as a camera may also be associated with the computing system to capture a video or an image of the user's input on the touch sensitive device.
  • the computing system may receive the captured video or image and may process the captured video or image for various purposes, such as displaying the user’s input on a display of the computing system or projecting the user’s input on a surface.
  • Figure 1 illustrates a computing system, in accordance with an example implementation of the present subject matter
  • Figure 2 illustrates a computing system, in accordance with another example implementation of the present subject matter
  • Figure 3 illustrates a computing system, in accordance with yet another example implementation of the present subject matter
  • Figure 4 illustrates a method for determining a location on a touch sensitive surface, according to an example implementation of the present subject matter
  • Figure 5 illustrates a method for determining a location on a touch sensitive surface, according to another example implementation of the present subject matter
  • Figure 6 illustrates a computing environment for determining a location on a touch sensitive surface, according to an example implementation of the present subject matter.
  • Computing systems such as desktop computers and laptops may be coupled to touch sensitive surfaces, such as touch screens, touchpads, and touchmats, to allow users to interact with the computing systems.
  • the interaction of a user with a computing system through a touch sensitive surface may be facilitated by an image capturing device coupled to the computing system.
  • the user may provide an input, such as a touch input, or may draw or write on the touch sensitive surface using a fingertip or a stylus.
  • the image capturing device may be positioned to capture the input of the user by recording movement of the fingertip or stylus over the touch sensitive surface in a series of images or a video.
  • the computing system may thereafter process the images or video captured by the image capturing device to interpret the input and perform an action corresponding to the input, such as displaying the images or video on a display device associated with the computing system or projecting the images or video on a surface using a projector associated with the computing system.
  • the computing system is to be aware of a location of the touch.
  • the location of the point of touch is determined using touch sensors implemented in the touch sensitive surface. The higher the density of the touch sensors on the touch sensitive surface, the higher the precision of determination of the location of the point of touch. High precision signifies that the location of the touch, as determined by the computing system, is close to an actual point of touch on the touch sensitive surface. Accordingly, to enable determination of the location of a touch precisely, the density of the touch sensors on the touch sensitive surface is increased.
  • the increase in sensor density may, however, result in making the touch sensitive surface bulky as additional touch sensors are accommodated on the touch sensitive surface. Also, the additional touch sensors may increase the cost of the touch sensitive surface.
  • techniques for determining a location of touch on touch sensitive surfaces are described.
  • the examples described herein enable a touch sensitive surface to determine a location of a point of touch more precisely than the precision that may be achieved based on a density of the touch sensors on the touch sensitive surface.
  • a touch sensitive surface is associated with an image capturing device such that the touch sensitive surface is in a field of view of the image capturing device.
  • the touch sensitive surface comprises 'N' touch sensors.
  • an image of the touch surface is obtained from the image capturing device.
  • the image may comprise the object and may have 'X' pixels, wherein 'X' is greater than 'N'.
  • the location of the touch surface corresponding to the touch is identified based on the image.
  • resolution of image capturing devices may be exploited to determine the location of the point on touch sensitive surfaces more precisely than the determination made by the touch sensitive surfaces.
  • implementing an image capturing device having a pixel density, i.e., the number of pixels per unit area in an image captured by the image capturing device, higher than the touch sensor density, rather than increasing the density of touch sensors in the touch sensitive surface, allows the touch sensitive surface to have a compact size.
  • the number of pixels per unit area in an image captured by the image capturing device is greater than the number of touch sensors per unit area of the touch sensitive surface.
  • FIG. 1 illustrates a computing system 100, in accordance with an example implementation of the present subject matter.
  • the computing system 100 comprises a processor 102.
  • processors may be provided through the use of dedicated hardware as well as hardware capable of executing software.
  • the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared.
  • explicit use of the term "processor” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (DSP) hardware, network processor, application specific integrated circuit (ASIC), field programmable gate array (FPGA), read only memory (ROM) for storing software, random access memory (RAM), non-volatile storage.
  • Other hardware, conventional and/or custom, may also be included.
  • the computing system 100 further comprises an image capturing device 104 and a touch sensitive surface 106 coupled to the processor 102.
  • Examples of the computing system 100 may comprise computing devices, such as laptops, smartphones, and tablets that incorporate an integrated image capturing device and touch sensitive surface as well as computing devices, such as desktop computers to which an external image capturing device and a touch sensitive surface may be coupled.
  • the touch sensitive surface 106 may enable touch-based interactions between a user and the computing system 100.
  • the touch sensitive surface 106 may comprise touch sensors to detect a touch on a surface of the touch sensitive surface 106.
  • the touch sensors may be capacitive touch sensors, resistive touch sensors, strain gauge touch sensors or a combination of these touch sensors.
  • the touch sensors detect a touch on the surface of the touch sensitive surface 106 and provide an input indicative of the touch to the processor 102 for further action.
  • the touch sensitive surface 106 may comprise 'N' touch sensors, that may be arranged in the form of an array on the surface of the touch sensitive surface 106.
  • a touch sensor density of the touch sensitive surface 106 may be defined as a number of touch sensors per unit area of the surface of the touch sensitive surface 106.
  • the touch sensor density may be such that the touch sensitive surface 106 can detect a touch on the touch sensitive surface 106 but may not detect a location of the point of touch precisely. Determination of location may be imprecise when the location of the point of touch determined by the touch sensitive surface 106 is away from an actual location of the point of touch by a threshold distance or more. As will be understood, if the touch sensor density is such that two adjacent touch sensors are spaced apart by a distance more than the threshold distance, the determination may be imprecise.
  • the touch-based interactions between the user and the computing system 100 may involve the user touching or actuating the touch sensitive surface 106 to provide inputs to the computing system 100 for performing various actions.
  • a user may use, for example, a stylus or his finger to touch the touch sensitive surface 106 to provide the actuation.
  • Examples of touch-based interactions include, user activities such as writing or drawing on the touch sensitive surface 106 that may be captured by the computing system 100, for example, for storing in a memory of the computing system 100.
  • Examples of touch-based interactions may also include touch-based commands, such as a one-finger right swipe, a two-finger left swipe, or a touch on a predefined location, which may cause the computing system 100 to perform corresponding actions.
  • an image such as that of a webpage may be projected by the computing system 100 on the touch sensitive surface.
  • the user may touch a top right corner of the webpage projected on the touch sensitive surface 106 to minimize the webpage.
  • the user may touch the touch sensitive surface 106 at a location corresponding to a hyperlink included in the webpage to open the hyperlink.
  • an image, such as a map may be projected on the touch sensitive surface 106 and the user may zoom in or zoom out of the map by making two-finger movements in the same or opposite direction on the touch sensitive surface 106.
  • the image capturing device 104 may be employed to facilitate the touch-based interactions of the user with the computing system 100.
  • the image capturing device 104 may be a camera that is inbuilt or integrated into the computing system 100, such as a webcam.
  • a camera may be a complementary metal-oxide semiconductor (CMOS) camera in an example.
  • the image capturing device may be an external camera coupled to the computing device 100, such as an external camera coupled to the computing device 100 through a universal serial bus (USB) port.
  • Examples of the image capturing device 104 include infrared camera, dual infrared camera, digital single lens reflex camera, depth camera, and mirrorless camera.
  • image capturing device 104 may capture an image or video of a user’s interaction with the touch sensitive surface.
  • the touch sensitive surface 106 may be positioned such that the touch sensitive surface 106 lies in a field of view of the image capturing device 104.
  • the image capturing device 104 may record the interaction and provide an indication to the computing system 100, for the computing system 100 to perform the corresponding action.
  • the image capturing device 104 may have an image resolution such that an image captured by the image capturing device 104 may have 'X' pixels.
  • the number of pixels 'X' in the image is greater than the number of touch sensors 'N' in the touch sensitive surface 106.
  • the number of pixels 'X' may be 2073600 (1920*1080).
  • the number of touch sensors on the touch sensitive surface may be 50000.
  • a pixel density of the image captured by the image capturing device 104 may be defined as number of pixels per unit area of the image captured by the image capturing device 104. Thus, the pixel density is higher than the sensor density.
  • the computing system 100 comprises an input detection module 108 coupled to the processor 102.
  • the input detection module 108 receives an indication of actuation of the touch sensitive surface 106.
  • the actuation may be caused by a touch, by an object, such as a finger or a stylus, at a point on the touch sensitive surface 106.
  • the input detection module 108 may cause the image capturing device 104 to capture an image of the touch sensitive surface 106 such that the image comprises the object.
  • a location detection module 110 of the computing system 100 is coupled to the processor 102 to determine a location of the point on the touch sensitive surface 106 based on the image.
  • the location of the point on the touch sensitive surface 106 is a physical location of the point of touch on the touch sensitive surface 106 which, as previously explained, may be further processed by the computing device 100 for performing a corresponding action.
  • FIG. 1 shows the image capturing device 104 and touch sensitive surface 106 integrated in the computing system 100
  • the image capturing device 104 and touch sensitive surface 106 may be implemented as separate devices as well. Accordingly, the techniques for processing images of a touch sensitive surface using significantly low processing resources, described herein, also extend to computing systems comprising computing devices that may be coupled to external image capturing devices and external touch sensitive surfaces.
  • Figure 2 shows a computing system 200 according to an example implementation of the present subject matter.
  • the computing system 200 comprises a computing device 202 coupled to an image capturing device 204 and a touch sensitive surface 206.
  • the image capturing device 204 and touch sensitive surface 206 are similar to the above-explained image capturing device 104 and touch sensitive surface 106.
  • the touch sensitive surface 206 and the image capturing device 204 are positioned such that the touch sensitive surface 206 lies in a field of view of the image capturing device 204.
  • the touch sensitive surface 206 may be positioned on a flat horizontal surface, such as a surface of a desk, and the image capturing device 204 may be attached to a supporting structure which holds the image capturing device 204 in a downward facing orientation, such that the image capturing device 204 faces the touch sensitive surface 206.
  • the computing system 200 includes processor(s) 208, similar to the processor 102, a memory 210, and interface(s) 212.
  • the memory 210 may include any computer-readable medium including, for example, volatile memory (e.g., RAM), and/or non-volatile memory (e.g., EPROM, flash memory, etc.).
  • the interface(s) 212 may include a variety of software and hardware interfaces that enable the computing device 202 to communicate with the image capturing device 204 and touch sensitive surface 206.
  • Modules 214 and data 216 may reside in the memory 210.
  • the modules 214 include routines, programs, objects, components, data structures, and the like, which perform particular tasks or implement particular abstract data types.
  • the modules 214 may include a calibration module 218, a display module 220, and a feature detection module 232 in addition to the aforementioned input detection module 108 and location detection module 110.
  • the modules 214 may also comprise other modules 222 that supplement functions of the computing device 202.
  • the data 216 serves, amongst other things, as a repository for storing data that may be fetched, processed, received, or generated by the modules 214.
  • the data 216 comprises other data 224 corresponding to the other modules 222.
  • the data of the computing device 202 also comprises calibration data 226, image data 228, and location data 230.
  • the image capturing device 204 and the touch sensitive surface 206 may be calibrated for a given position of the image capturing device 204 and the touch sensitive surface 206.
  • Calibration may be understood as a process of mapping locations on the touch sensitive surface 206 to their corresponding locations in a view of the camera or on an image captured by the image capturing device 204.
  • a calibration mode may be initiated on the image capturing device 204 for the calibration.
  • respective locations of a plurality of reference points on a surface of the touch sensitive surface 206 facing the image capturing device 204 are made known to the computing device 202.
  • An image of the touch sensitive surface 206 is captured and correction parameters are used to remove the distortions in the image.
  • a mapping of the respective locations of the reference points on the touch sensitive surface 206 and their corresponding location in the image is provided to the calibration module 218 coupled to the processor 208 of the computing device 202.
  • the calibration module 218 may map locations of various points on the touch sensitive surface 206 to their corresponding locations in an image captured by the image capturing device 204.
  • coordinates of the location of the reference points on the touch sensitive surface 206 may be provided to the calibration module 218.
  • a two-dimensional coordinate system may be used to define physical location of various points on the touch sensitive surface 206.
  • the surface coordinates corresponding to the physical location of the reference points on the touch sensitive surface 206 may be provided to the calibration module 218 as user input.
  • four reference points, say A, B, C, and D, may be used for the calibration and accordingly the respective locations of the four reference points may be provided to the calibration module 218.
  • the surface coordinates (x1, y1), (x2, y2), (x3, y3), and (x4, y4) of the reference points A, B, C, and D, respectively, may be provided to the calibration module 218.
  • a user input may be provided to indicate the respective locations of the reference points in the image.
  • the user may mark respective locations of the reference points in the image.
  • the user may use a graphical user interface provided by the interface(s) 212 of the computing device 202 to provide the surface coordinates and to indicate respective locations of the surface coordinates of the reference points in the image to the calibration module 218.
  • the reference points may be determined by the calibration module 218 and may not be defined by the user.
  • the reference points A, B, C, and D may be four corners of the touch sensitive surface 206.
  • the calibration module 218 may implement corner detection algorithms to detect the four corners of the touch sensitive surface 206.
  • the calibration module 218 may detect the corners of the touch sensitive surface 206 and may accordingly map the respective locations of each of the corners to their corresponding locations in the image, for example, based on the relative distance of the corners from each other.
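  • As an illustration of such corner detection (the document does not prescribe a particular algorithm), the sketch below finds the surface's outline as the largest contour in the frame and approximates it with a quadrilateral. It assumes Python with OpenCV and NumPy and a mat that contrasts with its background.

```python
import cv2

def detect_surface_corners(image_bgr):
    """Return the four corner points of the touch sensitive surface, or None.

    Assumes the surface contrasts with the background and appears as the
    largest contour in the frame (OpenCV 4.x return signatures).
    """
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    # Approximate the outline; a rectangular mat should reduce to four vertices.
    approx = cv2.approxPolyDP(largest, 0.02 * cv2.arcLength(largest, True), True)
    return approx.reshape(-1, 2) if len(approx) == 4 else None
```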
  • the calibration module 218 may determine image coordinates of each of the reference points in the image.
  • the image coordinates indicate the location of the reference points in the image.
  • the processor 208 may determine image coordinates of each of the points on the touch sensitive surface 206.
  • the processor 208 uses the surface coordinates and the image coordinates of the reference points to generate a transformation matrix.
  • the transformation matrix may be used to determine the surface coordinates corresponding to the image coordinates. Accordingly, once calibrated, the image coordinates corresponding to a point on the touch sensitive surface 206 may be determined.
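  • A minimal sketch of how such a transformation matrix might be generated from the four reference correspondences and then applied, assuming Python with OpenCV and NumPy; the coordinate values below are illustrative, not taken from this document.

```python
import numpy as np
import cv2

# Image coordinates (pixels) of the reference points A, B, C, D as seen by the
# camera, and the known surface coordinates of the same points (e.g., in mm).
# All values here are illustrative.
image_pts = np.float32([[212, 118], [1701, 131], [1688, 962], [224, 949]])
surface_pts = np.float32([[0, 0], [400, 0], [400, 300], [0, 300]])

# 3x3 transformation matrix mapping image coordinates to surface coordinates.
transform = cv2.getPerspectiveTransform(image_pts, surface_pts)

def image_to_surface(point_xy, matrix):
    """Convert one (x, y) image coordinate into surface coordinates."""
    pt = np.float32([[point_xy]])                      # shape (1, 1, 2)
    return cv2.perspectiveTransform(pt, matrix)[0, 0]

# Example: a point detected at image coordinates (M', N') = (960, 540).
print(image_to_surface((960, 540), transform))
```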
  • the user input such as the location of the reference points received during calibration, and the corresponding transformation matrix generated may be stored in the calibration data 226 in the memory of the computing device 202.
  • the calibration data 226 may be applicable for a relative position of the image capturing device 204 and the touch sensitive surface 206 for which the calibration process was performed and may be used as long as the relative position between the image capturing device 204 and the touch sensitive surface 206 does not change.
  • a re-calibration may be performed by the user.
  • the calibration data corresponding to the changed position may be stored in the memory.
  • the calibration process may also provide for correcting various distortions, such as radial distortion and tangential distortion, that may be present in an image captured by the image capturing device 204.
  • a tangential distortion may occur in the captured images.
  • the image capturing device 204 may be corrected for the tangential distortion by applying a compensation parameter that offsets the misalignment between the imaging sensor and lens.
  • correction parameters may be defined corresponding to the various types of distortions for eliminating the distortions from the captured image.
  • the correction parameters may also be stored in the calibration data 226 and may be applied to images captured by the image capturing device 204 to remove the distortion in the image. For example, when an image of the touch sensitive surface 206 is captured for indicating the respective location of reference points to the image capturing device 204, the correction parameters may be applied to the image prior to indicating the locations.
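  • The document does not specify a distortion model; the sketch below assumes the common radial/tangential lens model as implemented in OpenCV, with illustrative correction parameters, to show how stored parameters could be applied to a captured image.

```python
import numpy as np
import cv2

# Illustrative intrinsics and distortion coefficients; in practice the
# correction parameters would come from the stored calibration data 226.
camera_matrix = np.array([[1400.0,    0.0, 960.0],
                          [   0.0, 1400.0, 540.0],
                          [   0.0,    0.0,   1.0]])
# OpenCV coefficient order: (k1, k2, p1, p2, k3) - radial and tangential terms.
dist_coeffs = np.array([-0.12, 0.03, 0.001, -0.0005, 0.0])

def correct_distortion(image):
    """Remove radial and tangential distortion from a captured frame."""
    return cv2.undistort(image, camera_matrix, dist_coeffs)

# Usage: corrected = correct_distortion(cv2.imread("touch_surface.png"))
```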
  • the input detection module 108 coupled to the processor 208 receives an indication of the actuation of the touch sensitive surface 206. In an example, the indication of the actuation may be received from the touch sensitive surface 206 that may have sensed the actuation.
  • the input detection module 108 may receive the indication of actuation from the image capturing device 204 that may provide the indication, for example, by detecting a movement, such as an entry of the object in the field of view of the image capturing device 204.
  • the input detection module 108 may receive the indication of actuation from the stylus.
  • the stylus may have a tip which can be actuated when touched on the touch sensitive surface 206.
  • the stylus may be coupled with the computing system 200 and when the tip of the stylus touches the touch sensitive surface 206, the input detection module 108 may receive an indication of the actuation of the touch on the touch sensitive surface 206.
  • the input detection module 108 causes the image capturing device 204 to capture an image of the touch sensitive surface 206.
  • the image comprises the touch sensitive surface 206.
  • the image comprises 'X' pixels.
  • the image of the touch sensitive surface 206 may be stored in the image data 228 in the memory of the computing device 202.
  • the image of the touch sensitive surface 206 captured by the image capturing device 204 is used by the location detection module 110 to determine the location of the point on the touch sensitive surface 206 touched by the user.
  • the user may use an object, for example, a finger to touch the touch sensitive surface 206. Accordingly, a tip of the finger or stylus actuates the touch sensitive surface 206 at the point.
  • the image captured by the image capturing device 204 comprises the object, such that the image includes the object either completely or partially. For example, in case the object is a stylus, the image may not include an entire length but rather a fraction of the length of the stylus.
  • the location detection module 110 identifies the object in the image.
  • the location detection module may identify the object to be a finger or a stylus.
  • object detection algorithms, such as YOLO, region-based convolutional neural network (RCNN), Fast RCNN, Mask RCNN, or Multibox, may be used to identify the object in the image.
  • the feature detection module 232 may further identify a tip of the object in the image.
  • the feature detection module 232 may implement a combination of a geometry-based algorithm and a model-based algorithm to identify the tip of the object.
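  • As one possible geometry-based heuristic (an illustration, not the algorithm of the feature detection module 232), the sketch below takes the contour point farthest from the contour centroid of an already-segmented object mask as the tip; Python with OpenCV and NumPy is assumed.

```python
import numpy as np
import cv2

def find_tip(object_mask):
    """Estimate the tip of a finger or stylus from a binary mask (uint8, 0/255).

    Heuristic: the tip is the contour point farthest from the contour centroid.
    Returns (x, y) image coordinates, or None if nothing is found.
    """
    contours, _ = cv2.findContours(object_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    contour = max(contours, key=cv2.contourArea).reshape(-1, 2)
    centroid = contour.mean(axis=0)
    distances = np.linalg.norm(contour - centroid, axis=1)
    tip_x, tip_y = contour[int(np.argmax(distances))]
    return int(tip_x), int(tip_y)
```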
  • the identification of the tip of the object in the image is used to locate the point where the tip is positioned on the touch sensitive surface in the image.
  • the identification of the tip of the object indicates the location of point of touch on the touch sensitive surface in the image.
  • the calibration module 218 may determine image coordinates corresponding to the point where the tip is positioned on the touch sensitive surface in the image.
  • the image coordinates indicate a location of the point of touch in the image.
  • the image coordinates of the point are, say, (M', N').
  • the calibration module 218 may convert the image coordinates (M', N') into surface coordinates (M, N).
  • the conversion of the image coordinates into surface coordinates may be done using the transformation matrix stored in the calibration data 226.
  • the calibration module 218 may select a transformation matrix from amongst a plurality of transformation matrices stored in the calibration data 226, depending on a current relative position of the touch sensitive surface 206 and the image capturing device 204.
  • the surface coordinates of the point on the touch sensitive surface may be provided as an input to the location detection module 110. Based on the surface coordinates of the point, the location detection module 110 may identify a location of the point on the touch sensitive surface 206. In an example, the identified location, i.e., the surface coordinates, may be stored in the location data 230. As mentioned previously, the determination of the location of the point of touch allows the computing device 202 to perform further actions, for example, displaying the point of touch on a display device 234 coupled to the computing device.
  • the computing system 200 may include the display device 234, as shown in Figure 2.
  • the display device 234 may be, for example, a light emitting diode (LED) device, a liquid crystal display (LCD) device, an organic light emitting diode (OLED) device, or a cathode ray tube (CRT) device.
  • the display device 234 may be integral to the computing device 202 or may be coupled to the computing device 202 via interfaces, such as video graphics array (VGA) or High-Definition Multimedia Interface (HDMI).
  • for depicting the point of touch on the display device 234, the display module 220 may be used.
  • the display module 220 may utilize the location data 230 to determine a location on a display device 234 corresponding to the location of the point on the touch sensitive surface 206. Accordingly, the input provided by the user to the touch sensitive surface 206 may be depicted on the display device 234.
  • the computing device 202 is shown directly coupled to the touch sensitive surface 206, image capturing device 204, and the display device 234.
  • the computing device 202 may be at a location different from the touch sensitive surface 206 and the image capturing device 204.
  • the computing device 202 may be coupled to the touch sensitive surface 206 and the image capturing device 204 via a network.
  • the interface 212 may allow the computing device 202 to interface with the touch sensitive surface 206 and image capturing device 204.
  • the indication of actuation of the touch sensitive surface 206 and the image captured by the image capturing device 204 may be transmitted to the computing device via the network.
  • the network may be a single network or a combination of multiple networks and may use a variety of different communication protocols.
  • the network may be a wireless or a wired network, or a combination thereof. Examples of such individual networks include, but are not limited to, Global System for Mobile Communication (GSM) network, Universal Mobile Telecommunications System (UMTS) network, Personal Communications Service (PCS) network, Time Division Multiple Access (TDMA) network, Code Division Multiple Access (CDMA) network, Next Generation Network (NGN), and Public Switched Telephone Network (PSTN).
  • the image capturing device may include a location detection module and a processor which may be responsible for detecting the location on the touch sensitive surface corresponding to the point.
  • Figure 3 illustrates a computing device 300 according to another example implementation of the present subject matter.
  • the computing device 300 comprises a touch sensitive surface 302, an image capturing device 304, and a display device 314.
  • the touch sensitive surface 302 and the image capturing device 304 are positioned such that the image capturing device 304 faces the touch sensitive surface 302.
  • the touch sensitive surface 302 may be completely or partially in a field of view 306 of the image capturing device 304.
  • the touch sensitive surface 302 is partially in the field of view 306 of the image capturing device 304, such that a portion 302-1 of the touch sensitive surface 302 lies in the field of view 306 of the image capturing device 304.
  • the portion 302-1 of the touch sensitive surface 302 may be understood as a fraction of the entire surface area of the touch sensitive surface 302 that faces the image capturing device 304.
  • the total number of touch sensors in the touch sensitive surface is 'N' and the portion 302-1 of the touch sensitive surface 302 comprises 'M' out of the 'N' touch sensors.
  • the image capturing device 304 when the image capturing device 304 captures an image of the touch sensitive surface 302 upon receiving an indication of actuation of the touch sensitive surface 302, the image comprises the portion 302-1 of the touch sensitive surface 302.
  • the image capturing device 304 has an image resolution such that an image captured by the image capturing device 304 has 'X' pixels. Further, as will be understood, since the touch sensitive surface 302 is partially in the field of view 306, a proportional fraction, say 'P', of the 'X' pixels composes the portion 302-1 in the image.
  • the image may comprise a full or partial view of a stylus 308, used for actuating the touch sensitive surface 302.
  • the image resolution is selected such that 'P' is greater than 'M'.
  • the 'M' touch sensors may be comprised of an array of M1 touch sensors in a first direction 312 along an edge 302-2 of the touch sensitive surface 302 and M2 touch sensors in a second direction 310 along an edge 302-3 of the touch sensitive surface 302.
  • the 'P' pixels may be comprised of an array of P1 and P2 pixels in the first direction 312 and the second direction 310, respectively, such that P1 is greater than M1 and P2 is greater than M2.
  • the resolution allows the computing device 300 to ascertain the location of the point of touch more precisely based on the image than based on inputs from the touch sensors.
  • the resolution of the image capturing device 304 is higher compared to when the touch sensitive surface 302 is entirely in the field of view 306.
  • the image capturing device 304 may have several modes of operation, such as an extended video graphics array (XVGA), quantum extended graphics array (QXGA), high definition (HD), and full high definition (FHD) mode, having various resolutions to capture images.
  • the mode of operation and in turn the resolution of the image capturing device 304 is selected such that the number of pixels covering the part of the touch sensitive surface 302 in the field of view 306 is higher than the number of touch sensors included in the part of the touch sensitive surface 302.
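  • A minimal sketch of such a mode selection is shown below; the candidate mode resolutions and the assumed fraction of the frame covered by the visible portion of the surface are illustrative assumptions, not values from this document.

```python
# Candidate capture modes (width, height in pixels); values are illustrative.
MODES = {"XVGA": (1024, 768), "HD": (1280, 720),
         "FHD": (1920, 1080), "QXGA": (2048, 1536)}

def select_mode(m1_sensors, m2_sensors, frame_fraction=(0.8, 0.8)):
    """Pick the lowest-resolution mode whose pixel coverage of the visible
    portion of the surface exceeds the touch sensor counts in both directions.

    m1_sensors, m2_sensors: sensors along the two edges of the visible portion.
    frame_fraction: assumed fraction of the frame occupied by that portion.
    """
    for name, (w, h) in sorted(MODES.items(), key=lambda kv: kv[1][0] * kv[1][1]):
        p1, p2 = w * frame_fraction[0], h * frame_fraction[1]
        if p1 > m1_sensors and p2 > m2_sensors:
            return name
    return None

print(select_mode(m1_sensors=300, m2_sensors=200))  # e.g. "XVGA"
```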
  • the location of a point of touch on the touch sensitive surface is mapped to a point on the display device 314.
  • the display device 314 may have a display resolution.
  • the display resolution may be such that the number of pixels in the display device 314 is 'Y'.
  • the number of pixels in an image captured by the image capturing device 304 is 'X', such that 'X' is greater than 'Y'.
  • the mapping of the determined location of the touch sensitive surface to the corresponding point on the display device 314 may be done by mapping the pixels on an image of the image capturing device 304 to the pixels of the display device 314.
  • since the number of pixels of the image capturing device 304 is more than the number of pixels of the display device 314, more than one pixel of the image capturing device 304 may represent a single pixel on the display device 314.
  • the pixel location of the point of touch in the image is mapped to the pixel in the display device 314 to identify the point on the display device 314.
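  • A simple proportional mapping from image pixels to display pixels is sketched below; the image and display sizes are illustrative, and a real mapping would also account for where the surface sits within the frame.

```python
def image_to_display(px, py, image_size=(1920, 1080), display_size=(1280, 720)):
    """Map a pixel location in the captured image to a display pixel.

    With more image pixels ('X') than display pixels ('Y'), several image
    pixels collapse onto one display pixel. The sizes here are illustrative.
    """
    iw, ih = image_size
    dw, dh = display_size
    return int(px * dw / iw), int(py * dh / ih)

print(image_to_display(960, 540))  # -> (640, 360)
```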
  • the display in Figure 3 is shown coupled to the touch sensitive surface 302 and the image capturing device 304. However, it should be understood that a display may be projected on any surface including the touch sensitive surface 302.
  • Figure 4 illustrates a method 400 for identifying a location on a touch sensitive surface in accordance with an example implementation of the present subject matter.
  • the method 400 may be implemented in a variety of computing systems, but for the ease of explanation, the present description of the example method 400 to determine the location on the touch sensitive surface corresponding to the touch by an object is provided in reference to the above-described computing system 100.
  • blocks of the method 400 may be performed by the computing system 100.
  • the blocks of the method 400 may be executed based on instructions stored in a non-transitory computer-readable medium, as will be readily understood.
  • the non-transitory computer-readable medium may include, for example, digital memories, magnetic storage media, such as magnetic disks and magnetic tapes, hard drives, or optically readable digital data storage media.
  • an indication of actuation of a touch sensitive surface comprising 'N' touch sensors, such as the touch sensitive surface 206, is received.
  • the actuation of the touch sensitive surface is based on touch by an object on the touch sensitive surface.
  • a user may use an object, such as a stylus to touch the touch sensitive surface.
  • the indication may be received by a processor of a computing device, such as the processor 102 of the computing system 100.
  • an image of the touch sensitive surface is obtained in response to the receiving of the indication.
  • the image of the touch sensitive surface has 'X' pixels and the number of pixels 'X' is greater than the number of touch sensors 'N'.
  • the image may be captured by an image capturing device, such as the image capturing device 104. Further, in an example, the image may be obtained by the computing system 100. The method, thereafter, proceeds to block 406.
  • at block 406, a location on the touch sensitive surface corresponding to the touch is determined based on the image.
  • FIG. 5 illustrates a method 500 for identifying a location on a touch sensitive surface in accordance with another example implementation of the present subject matter.
  • the method 500 may be implemented in a variety of computing systems, but for the ease of explanation, the present description of the example method 500 to determine a location on the touch sensitive surface is provided in reference to the above-described computing system 100.
  • blocks of the method 500 may be performed by the computing system 100.
  • the blocks of the method 500 may be executed based on instructions stored in a non-transitory computer-readable medium, as will be readily understood.
  • the non-transitory computer-readable medium may include, for example, digital memories, magnetic storage media, such as magnetic disks and magnetic tapes, hard drives, or optically readable digital data storage media.
  • an indication of actuation of a touch sensitive surface is received.
  • the touch sensitive surface may be similar to the touch sensitive surface 206 and comprises 'N' touch sensors.
  • the actuation of the touch sensitive surface is based on a touch, by an object, at a point on the touch sensitive surface.
  • the touch by the object may be a single-point touch or a multi-point touch.
  • single-point touch the user may touch at a single point on the touch sensitive surface or the touch may be localized to a few points on the touch sensitive surface.
  • a user interacting with a graphical interface projected on the touch sensitive surface may actuate localized points on the touch sensitive surface.
  • the touch may be a multi-point touch, such as when a user draws something on the touch sensitive surface.
  • the touching of the touch sensitive surface by the object is detected by the touch sensors.
  • the number of touch sensors is such that the touch sensitive surface can detect a touch at a point and is agnostic of a location of the touch.
  • an image of the touch sensitive surface is obtained from an image capturing device, such as the image capturing device 104.
  • the image comprises the object and has 'X' pixels.
  • the number of pixels‘X’ is greater than the number of touch sensors 'N' on the touch sensitive surface.
  • the object in the image is identified to be a stylus.
  • object detection algorithms, such as YOLO, RCNN, Fast RCNN, Mask RCNN, or Multibox, may be used to identify the stylus in the image.
  • the object detection algorithms may use a reference image of the stylus to identify the stylus in the image.
  • the computing system may capture a reference image of the stylus and may store the same in a memory.
  • the object detection algorithms may use the stored reference image of the stylus to identify the stylus in the image.
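  • One way a stored reference image could be used (an illustration; the document does not mandate template matching) is normalized cross-correlation, assuming the stylus appears at a roughly similar scale and orientation as in the reference image.

```python
import cv2

def locate_stylus(frame_gray, reference_gray, threshold=0.7):
    """Locate a stylus in a grayscale frame using a stored reference image.

    Returns the top-left corner of the best match, or None when the match
    score falls below the (illustrative) threshold.
    """
    result = cv2.matchTemplate(frame_gray, reference_gray, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_loc if max_val >= threshold else None
```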
  • a location of the tip of the stylus is identified.
  • the user may touch the touch sensitive surface with the tip of the stylus.
  • a geometry-based algorithm and a model-based algorithm may be used to identify the location of the tip of the stylus in an image.
  • a feature detection algorithm may be used to determine the tip of the stylus.
  • the feature detection algorithm may compare the reference image of the stylus with the identified stylus in the image to identify the tip and the location of the tip of the stylus. In an example, even when a portion of the stylus is identified in the image, the feature detection algorithm may identify the location of the tip based on determining an orientation of the stylus in the image.
  • a location on the touch sensitive surface corresponding to the point is determined based on the location of the tip of the stylus in the image.
  • a transformation matrix may be used to determine the location on the touch sensitive surface corresponding to the point.
  • image coordinates corresponding to the location of the tip may be determined which may be then converted to the surface coordinates using a transformation matrix.
  • the surface coordinates may indicate the location on the touch sensitive surface corresponding to the point.
  • the location of the touch sensitive surface is mapped to a point on a display of a display device.
  • the display device has a display resolution having 'Y' pixels.
  • the number of pixels in an image of the image capturing device is 'X', which is greater than the number of pixels 'Y' of the display device.
  • the pixels 'X' of the image capturing device may be mapped to the pixels 'Y' of the display device.
  • Figure 6 illustrates a computing environment implementing a non- transitory computer-readable medium for determining a location on the touch sensitive surface, according to an example.
  • the computing environment 600 may comprise a computing system, such as the computing system 100.
  • the computing environment 600 includes a processing resource 602 communicatively coupled to the non-transitory computer-readable medium 604 through a communication link 606.
  • the processing resource 602 may be a processor of the computing system 100 that fetches and executes computer-readable instructions from the non-transitory computer-readable medium 604.
  • the non-transitory computer-readable medium 604 can be, for example, an internal memory device or an external memory device.
  • the communication link 606 may be a direct communication link, such as any memory read/write interface.
  • the communication link 606 may be an indirect communication link, such as a network interface.
  • the processing resource 602 can access the non-transitory computer- readable medium 604 through a network 608.
  • the network 608 may be a single network or a combination of multiple networks and may use a variety of different communication protocols.
  • the processing resource 602 and the non-transitory computer-readable medium 604 may also be communicatively coupled to data sources 610.
  • the data source(s) 610 may also be used to store data, such as calibration data, location data. Further the data source(s) 610 may be a database.
  • the non-transitory computer-readable medium 604 comprises executable instructions 612 for determining a location on the touch sensitive surface.
  • the non-transitory computer-readable medium 604 may comprise instructions executable to implement the previously described location detection module 110.
  • the instructions 612 may be executable by the processing resource 602 to receive an indication of actuation of a touch sensitive surface.
  • the actuation of the touch sensitive surface may be based on a touch, by an object, at a point on the touch sensitive surface.
  • the touch sensitive surface comprises 'N' touch sensors.
  • an image of the touch sensitive surface, comprising the object is received from an image capturing device.
  • the image capturing device has an image resolution such that the number of pixels in the image is 'X', which is greater than the number of touch sensors 'N' on the touch sensitive surface.
  • the instructions 612 may be further executable by the processing resource 602 to determine a location of the point on the touch sensitive surface. Further, the instructions 612 may be executable by the processing resource 602 to ascertain a location, on a display device, corresponding to the point on the touch sensitive surface.
  • the display device has a display resolution. The image resolution of the image capturing device is greater than the display resolution of the display device.
  • the instructions 612 may be executable by the processing resource 602 to identify the object to be a stylus and further identify a location of the tip of the stylus. Further, in another example, the instructions 612 may be executable by the processing resource 602 to identify the object to be a finger of a user and further identify a location of the tip of the finger. In an example, a feature detection algorithm may be used to identify the object and locate the tip of the object. Based on the location of the tip of the object, the processing resource 602 may determine image coordinates of the location of the point on the image. The instructions 612 may further be executable by the processing resource 602 to convert the image coordinates to surface coordinates. The surface coordinates indicate the location on the touch sensitive surface corresponding to the point. The instructions 612 may further be executable by the processing resource 602 to convert the image coordinates to display coordinates which would indicate the location on the display device corresponding to the point.
  • the methods and devices of the present subject matter provide techniques to determine a location on the touch sensitive surface.
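Pulling these pieces together, the sketch below outlines one possible end-to-end flow suggested by the description: on an actuation indication, capture an image, locate the object's tip, convert the image coordinates to surface coordinates with a calibration transform, and map them to a display location. All helper logic, names, and sizes are illustrative assumptions, not the modules of the described system.

```python
import numpy as np
import cv2

def handle_touch(capture_image, transform, display_size=(1280, 720)):
    """Illustrative end-to-end flow for one touch event.

    capture_image: callable returning a BGR frame from the camera.
    transform: 3x3 image-to-surface matrix obtained during calibration.
    Segmentation below is a crude Otsu threshold, used only as a stand-in.
    """
    frame = capture_image()
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    contour = max(contours, key=cv2.contourArea).reshape(-1, 2)
    centroid = contour.mean(axis=0)
    tip = contour[int(np.argmax(np.linalg.norm(contour - centroid, axis=1)))]

    # Image coordinates (M', N') -> surface coordinates (M, N).
    surface = cv2.perspectiveTransform(np.float32([[tip]]), transform)[0, 0]

    # Image coordinates -> display coordinates (several image pixels per display pixel).
    h, w = frame.shape[:2]
    display = (int(tip[0] * display_size[0] / w), int(tip[1] * display_size[1] / h))
    return {"surface_xy": tuple(surface), "display_xy": display}
```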

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)

Abstract

Example techniques to determine a location on a touch sensitive surface corresponding to a touch by an object are described. In an example, an indication of actuation of a touch sensitive surface, based on a touch, by an object, at a point on the touch sensitive surface is received. An image of the touch sensitive surface is received from an image capturing device, in response to the receiving of the indication. The touch sensitive surface has "N" touch sensors and the image capturing device has an image resolution comprising "X" pixels, where "X" is greater than "N". A location on the touch sensitive surface corresponding to the touch by the object is identified based on the image.

Description

DETERMINING LOCATION OF TOUCH ON TOUCH SENSITIVE
SURFACES
BACKGROUND
[0001] Computing systems, such as desktop computers, laptops, and smartphones may have touch sensitive surfaces which may allow interaction of users with the computing systems. For instance, a touch sensitive surface may facilitate a touch-based input to be provided by a user to a computing system.
[0002] In some cases, an image capturing system, such as a camera may also be associated with the computing system to capture a video or an image of the user's input on the touch sensitive device. The computing system may receive the captured video or image and may process the captured video or image for various purposes, such as displaying the user’s input on a display of the computing system or projecting the user’s input on a surface.
BRIEF DESCRIPTION OF FIGURES
[0003] The following detailed description references the drawings, wherein:
[0004] Figure 1 illustrates a computing system, in accordance with an example implementation of the present subject matter;
[0005] Figure 2 illustrates a computing system, in accordance with another example implementation of the present subject matter;
[0006] Figure 3 illustrates a computing system, in accordance with yet another example implementation of the present subject matter;
[0007] Figure 4 illustrates a method for determining a location on a touch sensitive surface, according to an example implementation of the present subject matter;
[0008] Figure 5 illustrates a method for determining a location on a touch sensitive surface, according to another example implementation of the present subject matter; and [0009] Figure 6 illustrates a computing environment for determining a location on a touch sensitive surface, according to an example implementation of the present subject matter.
DETAILED DESCRIPTION
[0010] Computing systems, such as desktop computers and laptops, may be coupled to touch sensitive surfaces, such as touch screens, touchpads, and touchmats, to allow users to interact with the computing systems. In some cases, the interaction of a user with a computing system through a touch sensitive surface may be facilitated by an image capturing device coupled to the computing system.
[0011] For example, the user may provide an input, such as a touch input, or may draw or write on the touch sensitive surface using a fingertip or a stylus. The image capturing device may be positioned to capture the input of the user by recording movement of the fingertip or stylus over the touch sensitive surface in a series of images or a video. The computing system may thereafter process the images or video captured by the image capturing device to interpret the input and perform an action corresponding to the input, such as displaying the images or video on a display device associated with the computing system or projecting the images or video on a surface using a projector associated with the computing system.
[0012] Generally, to perform an action corresponding to the input provided by the user on the touch sensitive surface, the computing system is to be aware of a location of the touch. The location of the point of touch is determined using touch sensors implemented in the touch sensitive surface. The higher the density of the touch sensors on the touch sensitive surface, the higher the precision of determination of the location of the point of touch. High precision signifies that the location of the touch, as determined by the computing system, is close to an actual point of touch on the touch sensitive surface. Accordingly, to enable determination of the location of a touch precisely, the density of the touch sensors on the touch sensitive surface is increased. The increase in sensor density may, however, result in making the touch sensitive surface bulky as additional touch sensors are accommodated on the touch sensitive surface. Also, the additional touch sensors may increase the cost of the touch sensitive surface.
[0013] According to examples of the present subject matter, techniques for determining a location of touch on touch sensitive surfaces are described. The examples described herein enable a touch sensitive surface to determine a location of a point of touch more precisely than the precision that may be achieved based on a density of the touch sensors on the touch sensitive surface.
[0014] According to an example of the present subject matter, a touch sensitive surface is associated with an image capturing device such that the touch sensitive surface is in a field of view of the image capturing device. In an example, the touch sensitive surface comprises 'N' touch sensors. In an example, when an indication of actuation of a touch sensitive surface based on a touch by an object is received, an image of the touch sensitive surface is obtained from the image capturing device. The image may comprise the object and may have 'X' pixels, wherein 'X' is greater than 'N'. The location on the touch sensitive surface corresponding to the touch is identified based on the image.
[0015] In accordance with the example techniques described herein, the resolution of image capturing devices may be exploited to determine the location of the point on touch sensitive surfaces more precisely than the determination made by the touch sensitive surfaces. In an example, implementing an image capturing device having a pixel density, i.e., the number of pixels per unit area in an image captured by the image capturing device, higher than the touch sensor density, rather than increasing the density of touch sensors in the touch sensitive surface, allows the touch sensitive surface to have a compact size. In an example, the number of pixels per unit area in an image captured by the image capturing device is greater than the number of touch sensors per unit area of the touch sensitive surface. Also, increasing the resolution of the image capturing device in lieu of the number of touch sensors in the touch sensitive surface incurs less cost in manufacturing of the touch sensitive surface. This is because advances in image processing technology allow high resolution in image capturing devices to be achieved at significantly low cost in comparison to the cost incurred in increasing the density of touch sensors in touch sensitive surfaces.
[0016] The above techniques are further described with reference to Figure
1 to Figure 6. It should be noted that the description and the Figures merely illustrate the principles of the present subject matter along with examples described herein and should not be construed as a limitation to the present subject matter. It is thus understood that various arrangements may be devised that, although not explicitly described or shown herein, embody the principles of the present subject matter. Moreover, all statements herein reciting principles, aspects, and implementations of the present subject matter, as well as specific examples thereof, are intended to encompass equivalents thereof.
[0017] Figure 1 illustrates a computing system 100, in accordance with an example implementation of the present subject matter. According to an implementation of the present subject matter, the computing system 100 comprises a processor 102.
[0018] The functions of the various elements shown in the Figures, including any functional blocks labelled as "processor(s)", may be provided through the use of dedicated hardware as well as hardware capable of executing software. When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared. Moreover, explicit use of the term "processor" should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (DSP) hardware, network processor, application specific integrated circuit (ASIC), field programmable gate array (FPGA), read only memory (ROM) for storing software, random access memory (RAM), and non-volatile storage. Other hardware, conventional and/or custom, may also be included.
[0019] The computing system 100 further comprises an image capturing device 104 and a touch sensitive surface 106 coupled to the processor 102.
Examples of the computing system 100 may comprise computing devices, such as laptops, smartphones, and tablets that incorporate an integrated image capturing device and touch sensitive surface as well as computing devices, such as desktop computers to which an external image capturing device and a touch sensitive surface may be coupled.
[0020] In an example, the touch sensitive surface 106 may enable touch-based interactions between a user and the computing system 100. The touch sensitive surface 106 may comprise touch sensors to detect a touch on a surface of the touch sensitive surface 106. In an example, the touch sensors may be capacitive touch sensors, resistive touch sensors, strain gauge touch sensors or a combination of these touch sensors. The touch sensors detect a touch on the surface of the touch sensitive surface 106 and provide an input indicative of the touch to the processor 102 for further action.
[0021] In an example, the touch sensitive surface 106 may comprise 'N' touch sensors, that may be arranged in the form of an array on the surface of the touch sensitive surface 106. A touch sensor density of the touch sensitive surface 106 may be defined as a number of touch sensors per unit area of the surface of the touch sensitive surface 106. In an example, the touch sensor density may be such that the touch sensitive surface 106 can detect a touch on the touch sensitive surface 106 but may not detect a location of the point of touch precisely. Determination of location may be imprecise when the location of the point of touch determined by the touch sensitive surface 106 is away from an actual location of the point of touch by a threshold distance or more. As will be understood, if the touch sensor density is such that two adjacent touch sensors are spaced apart by a distance more than the threshold distance, the determination may be imprecise.
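As a rough, worked reading of this criterion, the following sketch estimates the spacing between adjacent sensors on a uniform grid and compares it with a precision threshold; the sensor count, surface dimensions, and threshold are illustrative assumptions rather than values prescribed by this document.

```python
import math

def sensor_spacing_mm(num_sensors, width_mm, height_mm):
    """Approximate spacing between adjacent sensors on a uniform grid."""
    sensors_per_mm2 = num_sensors / (width_mm * height_mm)
    return 1.0 / math.sqrt(sensors_per_mm2)

# Illustrative values: 50,000 sensors on an assumed 400 mm x 300 mm surface.
spacing = sensor_spacing_mm(50_000, 400.0, 300.0)
THRESHOLD_MM = 1.0  # assumed precision threshold
print(f"sensor spacing ~{spacing:.2f} mm; precise: {spacing <= THRESHOLD_MM}")
```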
[0022] The touch-based interactions between the user and the computing system 100 may involve the user touching or actuating the touch sensitive surface 106 to provide inputs to the computing system 100 for performing various actions. A user may use, for example, a stylus or a finger to touch the touch sensitive surface 106 to provide the actuation. Examples of touch-based interactions include user activities, such as writing or drawing on the touch sensitive surface 106, that may be captured by the computing system 100, for example, for storing in a memory of the computing system 100. Examples of touch-based interactions may also include touch-based commands, such as a one-finger right swipe, a two-finger left swipe, or a touch on a predefined location, that may cause the computing system 100 to perform corresponding actions. In an example, an image, such as that of a webpage, may be projected by the computing system 100 on the touch sensitive surface 106. The user may touch a top right corner of the webpage projected on the touch sensitive surface 106 to minimize the webpage. Similarly, the user may touch the touch sensitive surface 106 at a location corresponding to a hyperlink included in the webpage to open the hyperlink. In another example, an image, such as a map, may be projected on the touch sensitive surface 106 and the user may zoom in or zoom out of the map by making two-finger movements in the same or opposite directions on the touch sensitive surface 106. In an example, the image capturing device 104 may be employed to facilitate the touch-based interactions of the user with the computing system 100. In an example, the image capturing device 104 may be a camera that is inbuilt or integrated into the computing system 100, such as a webcam. The camera may be a complementary metal-oxide semiconductor (CMOS) camera in an example. In another example, the image capturing device 104 may be an external camera coupled to the computing system 100, such as an external camera coupled to the computing system 100 through a universal serial bus (USB) port. Examples of the image capturing device 104 include an infrared camera, a dual infrared camera, a digital single lens reflex camera, a depth camera, and a mirrorless camera.
[0023] To enable the touch-based interactions of the user with the computing system 100, the image capturing device 104 may capture an image or video of the user's interaction with the touch sensitive surface 106. For this purpose, the touch sensitive surface 106 may be positioned such that the touch sensitive surface 106 lies in a field of view of the image capturing device 104. For example, the image capturing device 104 may record the interaction and provide an indication to the computing system 100, for the computing system 100 to perform the corresponding action.
[0024] In an example, the image capturing device 104 may have an image resolution such that an image captured by the image capturing device 104 may have 'X' pixels. The number of pixels 'X' in the image is greater than the number of touch sensors 'N' in the touch sensitive surface 106. For example, for a high definition image, the number of pixels 'X' may be 2,073,600 (1920 x 1080). Further, in an example, the number of touch sensors on the touch sensitive surface may be 50,000. A pixel density of the image captured by the image capturing device 104 may be defined as the number of pixels per unit area of the image captured by the image capturing device 104. Thus, the pixel density is higher than the touch sensor density.
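By way of illustration only, the following minimal sketch compares the two densities using the example figures above; the physical dimensions of the surface are assumed values introduced solely for the example.

```python
# Minimal sketch: compare the pixel density of the captured image with the
# touch sensor density of the surface. Only the pixel/sensor counts come from
# the example above; the surface dimensions are assumed for illustration.

IMAGE_WIDTH_PX, IMAGE_HEIGHT_PX = 1920, 1080     # full HD image, X = 2,073,600 pixels
NUM_TOUCH_SENSORS = 50_000                       # N touch sensors on the surface

SURFACE_WIDTH_MM, SURFACE_HEIGHT_MM = 600, 400   # assumed physical size of the surface

surface_area_mm2 = SURFACE_WIDTH_MM * SURFACE_HEIGHT_MM
pixel_density = (IMAGE_WIDTH_PX * IMAGE_HEIGHT_PX) / surface_area_mm2    # pixels per mm^2
sensor_density = NUM_TOUCH_SENSORS / surface_area_mm2                    # sensors per mm^2

print(f"pixel density:  {pixel_density:.2f} px/mm^2")        # ~8.64 px/mm^2
print(f"sensor density: {sensor_density:.2f} sensors/mm^2")  # ~0.21 sensors/mm^2
assert pixel_density > sensor_density   # X > N, so the image resolves touch more finely
```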
[0025] According to an example implementation of the present subject matter, the computing system 100 comprises an input detection module 108 coupled to the processor 102. The input detection module 108 receives an indication of actuation of the touch sensitive surface 106. The actuation may be caused by a touch, by an object, such as a finger or a stylus, at a point on the touch sensitive surface 106. In response to the receiving of the indication, the input detection module 108 may cause the image capturing device 104 to capture an image of the touch sensitive surface 106 such that the image comprises the object.
[0026] In an example, a location detection module 110 of the computing system 100 is coupled to the processor 102 to determine a location of the point on the touch sensitive surface 106 based on the image. The location of the point on the touch sensitive surface 106 is the physical location of the point of touch on the touch sensitive surface 106 which, as previously explained, may be further processed by the computing system 100 for performing a corresponding action.
[0027] While the computing system 100 depicted in the example implementation illustrated in Figure 1 shows the image capturing device 104 and the touch sensitive surface 106 integrated in the computing system 100, it should be understood that the image capturing device 104 and the touch sensitive surface 106 may be implemented as separate devices as well. Accordingly, the techniques for processing images of a touch sensitive surface using significantly low processing resources, described herein, also extend to computing systems comprising computing devices that may be coupled to external image capturing devices and external touch sensitive surfaces.

[0028] Figure 2 shows a computing system 200 according to an example implementation of the present subject matter. The computing system 200 comprises a computing device 202 coupled to an image capturing device 204 and a touch sensitive surface 206. The image capturing device 204 and the touch sensitive surface 206 are similar to the above-explained image capturing device 104 and touch sensitive surface 106.
[0029] In an example, the touch sensitive surface 206 and the image capturing device 204 are positioned such that the touch sensitive surface 206 lies in a field of view of the image capturing device 204. In an example, the touch sensitive surface 206 may be positioned on a flat horizontal surface, such as a surface of a desk, and the image capturing device 204 may be attached to a supporting structure which holds the image capturing device 204 in a downward facing orientation, such that the image capturing device 204 faces the touch sensitive surface 206.
[0030] The computing system 200, among other things, includes processor(s) 208, similar to the processor 102, a memory 210, and interface(s) 212 coupled to the processor(s) 208. In an example, the aforementioned image capturing device 204 and the touch sensitive surface 206 may be coupled to the processor(s) 208. The memory 210 may include any computer-readable medium including, for example, volatile memory (e.g., RAM), and/or non-volatile memory (e.g., EPROM, flash memory, etc.). The interface(s) 212 may include a variety of software and hardware interfaces that enable the computing device 202 to communicate with the image capturing device 204 and the touch sensitive surface 206.
[0031] Modules 214 and data 216 may reside in the memory 210. The modules 214 include routines, programs, objects, components, data structures, and the like, which perform particular tasks or implement particular abstract data types. In an example, the modules 214 may include a calibration module 218, a display module 220, and a feature detection module 232 in addition to the aforementioned input detection module 108 and location detection module 110. The modules 214 may also comprise other modules 222 that supplement functions of the computing device 202.
[0032] The data 216 serves, amongst other things, as a repository for storing data that may be fetched, processed, received, or generated by the modules 214. The data 216 comprises other data 224 corresponding to the other modules 222. In the illustrated example implementation, the data 216 of the computing device 202 also comprises calibration data 226, image data 228, and location data 230.
[0033] In an example, in an initial set-up process, the image capturing device 204 and the touch sensitive surface 206 may be calibrated for a given position of the image capturing device 204 and the touch sensitive surface 206. Calibration may be understood as a process of mapping locations on the touch sensitive surface 206 to their corresponding locations in a view of the camera or on an image captured by the image capturing device 204.
[0034] In an example, a calibration mode may be initiated on the image capturing device 204 for the calibration. In the calibration mode, in an example, respective locations of a plurality of reference points on a surface of the touch sensitive surface 206 facing the image capturing device 204 are made known to the computing device 202. An image of the touch sensitive surface 206 is captured and correction parameters are used to remove the distortions in the image. Thereafter, a mapping of the respective locations of the reference points on the touch sensitive surface 206 and their corresponding locations in the image is provided to the calibration module 218 coupled to the processor 208 of the computing device 202. Based on the mapping of the locations of the reference points on the touch sensitive surface 206 to their corresponding locations in the image, the calibration module 218 may map locations of various points on the touch sensitive surface 206 to their corresponding locations in an image captured by the image capturing device 204.
[0035] To indicate the location of the reference points on the touch sensitive surface 206 to the calibration module 218, coordinates of the location of the reference points on the touch sensitive surface 206, also referred to as surface coordinates, may be provided to the calibration module 218. As may be understood, a two-dimensional coordinate system may be used to define the physical location of various points on the touch sensitive surface 206. The surface coordinates corresponding to the physical location of the reference points on the touch sensitive surface 206 may be provided to the calibration module 218 as user input.
[0036] In an example, four reference points, say, A, B, C, and D may be used for the calibration and accordingly respective locations of the four reference points may be provided to the calibration module 218. For instance, the surface coordinates (x1, y1), (x2, y2), (x3, y3), and (x4, y4) of the reference points A, B, C, and D, respectively, may be provided to the calibration module 218. Also, a user input may be provided to indicate the respective locations of the reference points in the image. For example, the user may mark respective locations of the reference points in the image. In an example, the user may use a graphical user interface provided by the interface(s) 212 of the computing device 202 to provide the surface coordinates and to indicate the respective locations of the reference points in the image to the calibration module 218.
[0037] In some example implementations, the reference points may be determined by the calibration module 218 and may not be defined by the user. For example, the reference points A, B, C, and D may be four corners of the touch sensitive surface 206. In an example, the calibration module 218 may implement corner detection algorithms to detect the four corners of the touch sensitive surface 206. The calibration module 218 may detect the corners of the touch sensitive surface 206 and may accordingly map the respective locations of each of the corners to their corresponding locations in the image, for example, based on the relative distance of the corners from each other.
[0038] Based on surface coordinates of the reference points, the calibration module 218 may determine image coordinates of each of the reference points in the image. The image coordinates indicate the location of the reference points in the image. Using the image coordinates of the reference points as a reference, the processor 208 may determine image coordinates of each of the points on the touch sensitive surface 206. In an example, the processor 208 uses the surface coordinates and the image coordinates of the reference points to generate a transformation matrix. The transformation matrix may be used to determine the surface coordinates corresponding to the image coordinates. Accordingly, once calibrated, the image coordinates corresponding to a point on the touch sensitive surface 206 may be determined.
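By way of illustration, a minimal sketch of generating such a transformation matrix from four reference point correspondences is shown below. The use of OpenCV (cv2.getPerspectiveTransform) and the specific coordinate values are assumptions made for the example and are not prescribed by the present subject matter.

```python
# Sketch: build a transformation (homography) matrix from the surface
# coordinates and image coordinates of four reference points A, B, C, D.
# OpenCV is one possible implementation choice; all values are illustrative.
import numpy as np
import cv2

# Surface coordinates (e.g., millimetres) of reference points A, B, C, D.
surface_pts = np.float32([[0, 0], [600, 0], [600, 400], [0, 400]])

# Corresponding image coordinates (pixels) of the same points, obtained during
# calibration from user input or corner detection.
image_pts = np.float32([[102, 87], [1815, 95], [1790, 1002], [118, 990]])

# 3x3 matrix mapping image coordinates to surface coordinates.
image_to_surface = cv2.getPerspectiveTransform(image_pts, surface_pts)
print(image_to_surface)
```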
[0039] In an example, the user input, such as the location of the reference points received during calibration, and the corresponding transformation matrix generated may be stored in the calibration data 226 in the memory of the computing device 202. The calibration data 226 may be applicable for the relative position of the image capturing device 204 and the touch sensitive surface 206 for which the calibration process was performed and may be used as long as the relative position between the image capturing device 204 and the touch sensitive surface 206 does not change. In case the relative position between the image capturing device 204 and the touch sensitive surface 206 changes, a re-calibration may be performed by the user. Accordingly, the calibration data corresponding to the changed position may be stored in the memory.
[0040] In an example, the calibration process may also provide for correcting various distortions, such as radial distortion and tangential distortion, that may be present in an image captured by the image capturing device 204. For example, in case an imaging sensor and a lens of the image capturing device 204 are not aligned to each other, a tangential distortion may occur in the captured images. The image capturing device 204 may be corrected for the tangential distortion by applying a compensation parameter that offsets the misalignment between the imaging sensor and the lens. Similarly, correction parameters may be defined corresponding to the various types of distortions for eliminating the distortions from the captured image.
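As an illustration of applying such correction parameters, the sketch below removes lens distortion from a captured frame using OpenCV's undistort function as one possible implementation; the camera matrix, distortion coefficients, and file names are hypothetical and would, in practice, come from a camera calibration step.

```python
# Sketch: undistort a captured frame before calibration or touch detection.
# The camera matrix and distortion coefficients below are placeholder values.
import numpy as np
import cv2

camera_matrix = np.array([[1400.0,    0.0, 960.0],
                          [   0.0, 1400.0, 540.0],
                          [   0.0,    0.0,   1.0]])
# k1, k2 (radial), p1, p2 (tangential), k3 (radial) correction parameters.
dist_coeffs = np.array([-0.12, 0.05, 0.001, -0.002, 0.0])

frame = cv2.imread("touch_surface.png")            # captured image of the surface
undistorted = cv2.undistort(frame, camera_matrix, dist_coeffs)
cv2.imwrite("touch_surface_corrected.png", undistorted)
```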
[0041] The correction parameters may also be stored in the calibration data 226 and may be applied to images captured by the image capturing device 204 to remove the distortion in the images. For example, when an image of the touch sensitive surface 206 is captured for indicating the respective locations of the reference points to the image capturing device 204, the correction parameters may be applied to the image prior to indicating the locations.

[0042] In operation, when a user touches a point on the touch sensitive surface 206 by an object, the input detection module 108 coupled to the processor 208 receives an indication of the actuation of the touch sensitive surface 206. In an example, the indication of the actuation may be received from the touch sensitive surface 206 that may have sensed the actuation.
[0043] In another example, the input detection module 108 may receive the indication of actuation from the image capturing device 204 that may provide the indication, for example, by detecting a movement, such as an entry of the object in the field of view of the image capturing device 204. In another example, the input detection module 108 may receive the indication of actuation from the stylus. In such cases, the stylus may have a tip which can be actuated when touched on the touch sensitive surface 206. The stylus may be coupled with the computing system 200 and when the tip of the stylus touches the touch sensitive surface 206, the input detection module 108 may receive an indication of the actuation of the touch sensitive surface 206.
[0044] In response to the indication, the input detection module 108 causes the image capturing device 204 to capture an image of the touch sensitive surface 206. Considering that the position of the touch sensitive surface 206 with respect to the image capturing device 204 is such that the touch sensitive surface 206 corresponds to a field of view 236 of the image capturing device 204, the image comprises the touch sensitive surface 206. As mentioned earlier, the image comprises 'X' pixels. In an example, the image of the touch sensitive surface 206 may be stored in the image data 228 in the memory of the computing device 202.
[0045] The image of the touch sensitive surface 206 captured by the image capturing device 204 is used by the location detection module 110 to determine the location of the point on the touch sensitive surface 206 touched by the user. As mentioned previously, the user may use an object, for example, a finger or a stylus, to touch the touch sensitive surface 206. Accordingly, a tip of the finger or stylus actuates the touch sensitive surface 206 at the point. The image captured by the image capturing device 204 comprises the object, such that the image includes the object either completely or partially. For example, in case the object is a stylus, the image may not include the entire length of the stylus but rather a fraction of its length.
[0046] In an example, the location detection module 110 identifies the object in the image. The location detection module 110 may identify the object to be a finger or a stylus. In an example, object detection algorithms, such as YOLO, region-based convolutional neural network (RCNN), Fast RCNN, Mask RCNN, or MultiBox, may be used to identify the object in the image. After the object in the image is identified to be a finger or a stylus, the feature detection module 232 may further identify a tip of the object in the image. In an example, the feature detection module 232 may implement a combination of a geometry-based algorithm and a model-based algorithm to identify the tip of the object. In an example, the identification of the tip of the object in the image is used to locate the point where the tip is positioned on the touch sensitive surface in the image. Evidently, the identification of the tip of the object indicates the location of the point of touch on the touch sensitive surface in the image.
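A minimal sketch of one simple geometry-based heuristic for locating the tip is given below. It is an illustrative stand-in for, not a reproduction of, the geometry-based and model-based algorithms referred to above, and it assumes that a reference frame of the empty touch sensitive surface is available for background subtraction.

```python
# Simple geometry-based heuristic: segment the object (finger or stylus) by
# differencing against an empty-surface reference frame, take its largest
# contour, and treat the contour point farthest from the edge at which the
# object enters the frame as the tip.
import numpy as np
import cv2

def find_tip(frame_gray, background_gray, enter_edge="bottom", thresh=40):
    # Foreground mask from the absolute difference to the empty-surface frame.
    diff = cv2.absdiff(frame_gray, background_gray)
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    obj = max(contours, key=cv2.contourArea).reshape(-1, 2)

    # The tip is taken as the contour point farthest from the entry edge.
    if enter_edge == "bottom":                 # hand enters from the bottom -> tip has minimum y
        return tuple(obj[obj[:, 1].argmin()])
    return tuple(obj[obj[:, 0].argmin()])      # e.g., entering from the right -> minimum x
```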
[0047] The calibration module 218 may determine image coordinates corresponding to the point where the tip is positioned on the touch sensitive surface in the image. The image coordinates indicate a location of the point of touch in the image. In an example, the image coordinates of the point are, say, (M', N'). The calibration module 218 may convert the image coordinates (M', N') into surface coordinates (M, N). In an example, the conversion of the image coordinates into surface coordinates may be done using the transformation matrix stored in the calibration data 226. For example, the calibration module 218 may select a transformation matrix from amongst a plurality of transformation matrices stored in the calibration data 226, depending on a current relative position of the touch sensitive surface 206 and the image capturing device 204.
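Continuing the assumptions of the calibration sketch above, the conversion of the tip's image coordinates (M', N') into surface coordinates (M, N) with the stored transformation matrix might, for illustration, look as follows.

```python
# Sketch: convert the image coordinates of the detected tip into surface
# coordinates using the 3x3 transformation matrix from calibration.
import numpy as np
import cv2

def image_to_surface_coords(tip_px, image_to_surface):
    pt = np.float32([[tip_px]])                       # shape (1, 1, 2) as expected by OpenCV
    mapped = cv2.perspectiveTransform(pt, image_to_surface)
    return tuple(mapped[0, 0])                        # (M, N) on the touch sensitive surface

# Example usage with illustrative image coordinates of the tip:
# surface_xy = image_to_surface_coords((954.0, 512.0), image_to_surface)
```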
[0048] The surface coordinates of the point on the touch sensitive surface, determined by the calibration module 218, may be provided as an input to the location detection module 110. Based on the surface coordinates of the point, the location detection module 110 may identify a location of the point on the touch sensitive surface 206. In an example, the identified location, i.e., the surface coordinates, may be stored in the location data 230. As mentioned previously, the determination of the location of the point of touch allows the computing device 202 to perform further actions, for example, displaying the point of touch on a display device 234 coupled to the computing device 202.
[0049] Accordingly, in an example implementation, the computing system 200 may include the display device 234, as shown in Figure 2. In an example, the display device 234 may be, for example, a light emitting diode (LED) display, a liquid crystal display (LCD), an organic light emitting diode (OLED) display, or a cathode ray tube (CRT) display. The display device 234 may be integral to the computing device 202 or may be coupled to the computing device 202 via interfaces, such as video graphics array (VGA) or High-Definition Multimedia Interface (HDMI).
[0050] To enable display of the point of touch, for example, to depict a drawing being made by the user on the touch sensitive surface 206, the display module 220 may be used. In an example, the display module 220 may utilize the location data 230 to determine a location on a display device 234 corresponding to the location of the point on the touch sensitive surface 206. Accordingly, the input provided by the user to the touch sensitive surface 206 may be depicted on the display device 234.
[0051] In the above description, the computing device 202 is shown directly coupled to the touch sensitive surface 206, image capturing device 204, and the display device 234. In another example, the computing device 202 may be at a location different from the touch sensitive surface 206 and the image capturing device 204. In such cases, the computing device 202 may be coupled to the touch sensitive surface 206 and the image capturing device 204 via a network. The interface 212 may allow the computing device 202 to interface with the touch sensitive surface 206 and image capturing device 204. The indication of actuation of the touch sensitive surface 206 and the image captured by the image capturing device 204 may be transmitted to the computing device via the network.
[0052] In an example, the network may be a single network or a combination of multiple networks and may use a variety of different communication protocols. The network may be a wireless or a wired network, or a combination thereof. Examples of such individual networks include, but are not limited to, Global System for Mobile Communication (GSM) network, Universal Mobile Telecommunications System (UMTS) network, Personal Communications Service (PCS) network, Time Division Multiple Access (TDMA) network, Code Division Multiple Access (CDMA) network, Next Generation Network (NGN), and Public Switched Telephone Network (PSTN). In another example, the image capturing device may include a location detection module and a processor which may be responsible for detecting the location on the touch sensitive surface corresponding to the point.
[0053] Figure 3 illustrates a computing device 300 according to another example implementation of the present subject matter. The computing device 300 comprises a touch sensitive surface 302, an image capturing device 304, and a display device 314. The touch sensitive surface 302 and the image capturing device 304 are positioned such that the image capturing device 304 faces the touch sensitive surface 302. The touch sensitive surface 302 may be completely or partially in a field of view 306 of the image capturing device 304.
[0054] According to the example implementation shown in Figure 3, the touch sensitive surface 302 is partially in the field of view 306 of the image capturing device 304, such that a portion 302-1 of the touch sensitive surface 302 lies in the field of view 306 of the image capturing device 304. The portion 302-1 of the touch sensitive surface 302 may be understood as a fraction of the entire surface area of the touch sensitive surface 302 that faces the image capturing device 304. In an example, the total number of touch sensors in the touch sensitive surface is 'N' and the portion 302-1 of the touch sensitive surface 302 comprises 'M' out of the 'N' touch sensors. In such a positioning of the touch sensitive surface 302 with respect to the image capturing device 304, when the image capturing device 304 captures an image of the touch sensitive surface 302 upon receiving an indication of actuation of the touch sensitive surface 302, the image comprises the portion 302-1 of the touch sensitive surface 302.
[0055] In an example, the image capturing device 304 has an image resolution such that an image captured by the image capturing device 304 has 'X' pixels. Further, as will be understood, since the touch sensitive surface 302 is partially in the field of view 306, a proportional fraction, say 'P', of the 'X' pixels composes the portion 302-1 in the image. In an example, the image may comprise a full or partial view of a stylus 308, used for actuating the touch sensitive surface 302.
[0056] In an example, the image resolution is selected such that 'P' is greater than 'M'. In an example, the 'M' touch sensors may be comprised of an array of M1 touch sensors in a first direction 312 along an edge 302-2 of the touch sensitive surface 302 and M2 touch sensors in a second direction 310 along an edge 302-3 of the touch sensitive surface 302. Similarly, the 'P' pixels may be comprised of an array of P1 and P2 pixels in the first direction 312 and the second direction 310, respectively, such that P1 is greater than M1 and P2 is greater than M2. The resolution allows the computing device 300 to ascertain the location of the point of touch more precisely based on the image than based on inputs from the touch sensors. As will be understood, in cases where the field of view 306 of the image capturing device 304 covers a part of the touch sensitive surface 302, the resolution of the image capturing device 304 is higher compared to when the touch sensitive surface 302 is entirely in the field of view 306.
[0057] In an example, the image capturing device 304 may have several modes of operation, such as an extended video graphics array (XVGA) mode, a quantum extended graphics array (QXGA) mode, a high definition (HD) mode, and a full high definition (FHD) mode, having various resolutions to capture images. The mode of operation, and in turn the resolution of the image capturing device 304, is selected such that the number of pixels covering the part of the touch sensitive surface 302 in the field of view 306 is higher than the number of touch sensors included in that part of the touch sensitive surface 302.
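A minimal sketch of such a mode selection is shown below; the mode names, resolutions, and sensor counts are common or assumed values used only for illustration and are not taken from any particular camera.

```python
# Illustrative mode selection: pick the lowest-resolution capture mode whose
# pixel count over the imaged portion of the surface still exceeds the number
# of touch sensors 'M' in that portion.
MODES = {
    "XVGA": (1024, 768),
    "HD":   (1280, 720),
    "FHD":  (1920, 1080),
    "QXGA": (2048, 1536),
}

def pick_mode(sensors_in_view, surface_fraction_of_frame):
    """surface_fraction_of_frame: fraction of the frame covered by the surface portion."""
    for name, (w, h) in sorted(MODES.items(), key=lambda kv: kv[1][0] * kv[1][1]):
        if w * h * surface_fraction_of_frame > sensors_in_view:
            return name
    return "QXGA"   # fall back to the highest resolution available

print(pick_mode(sensors_in_view=50_000, surface_fraction_of_frame=0.6))  # e.g. "XVGA"
```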
[0058] In operation, when a user actuates the touch sensitive surface 302 by touching a point on the portion 302-1 of the touch sensitive surface 302, an image of the portion 302-1 of the touch sensitive surface 302 is captured by the image capturing device 304. The image is processed in the same manner as explained with reference to Figure 2, and a location on the touch sensitive surface 302 corresponding to the point is determined.
[0059] In an example, the location of a point of touch on the touch sensitive surface is mapped to a point on the display device 314. In an example, the display device 314 may have a display resolution. The display resolution may be such that the number of pixels in the display device 314 is 'Y'. The number of pixels in an image captured by the image capturing device 304 is 'X', such that 'X' is greater than 'Y'. The mapping of the determined location on the touch sensitive surface to the corresponding point on the display device 314 may be done by mapping the pixels of an image of the image capturing device 304 to the pixels of the display device 314. Since the number of pixels of the image capturing device 304 is more than the number of pixels of the display device 314, more than one pixel of the image capturing device 304 may represent a single pixel on the display device 314. The pixel location of the point of touch in the image, as determined above, is mapped to the corresponding pixel in the display device 314 to identify the point on the display device 314. The display device 314 in Figure 3 is shown coupled to the touch sensitive surface 302 and the image capturing device 304. However, it should be understood that a display may be projected on any surface including the touch sensitive surface 302.
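For illustration, a minimal sketch of mapping a pixel location in the captured image to a pixel on the display device by proportional scaling is given below; the resolutions are assumed example values.

```python
# Sketch: map a pixel location in the captured image (X pixels) to a pixel on
# the display device (Y pixels, Y < X) by proportional scaling.
IMAGE_W, IMAGE_H = 1920, 1080       # image resolution of the capturing device
DISPLAY_W, DISPLAY_H = 1280, 720    # display resolution (fewer pixels than the image)

def image_px_to_display_px(x_img, y_img):
    # Several image pixels may map to the same display pixel, since the image
    # has more pixels than the display.
    x_disp = int(x_img * DISPLAY_W / IMAGE_W)
    y_disp = int(y_img * DISPLAY_H / IMAGE_H)
    return x_disp, y_disp

print(image_px_to_display_px(954, 512))   # e.g. (636, 341)
```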
[0060] When a user touches a point on the touch sensitive surface, the corresponding location is detected and subsequently a point on the display device 314 is also mapped to the location, thus providing the user with a touch sensitive surface such that a location on the touch sensitive surface 302 can be precisely mapped to a location on the display device 314 even when the number of touch sensors 'M' in the touch sensitive surface is less than the number of pixels 'Y' of the display device 314. While Figure 3 has been explained with reference to a single touch sensitive surface 302, it should be understood that more than one touch sensitive surface may be employed for interacting with the computing device 300 such that a portion of each of the more than one touch sensitive surfaces is within the field of view 306 of the image capturing device 304. As will be understood, the number of pixels covering the respective portions of each of the more than one touch sensitive surfaces in the field of view 306 is higher than the number of touch sensors included in the respective portions.
[0061] Figure 4 illustrates a method 400 for identifying a location on a touch sensitive surface in accordance with an example implementation of the present subject matter. Although the method 400 may be implemented in a variety of computing systems, for ease of explanation, the present description of the example method 400 to determine the location on the touch sensitive surface corresponding to the touch by an object is provided with reference to the above-described computing system 100.
[0062] The order in which the method 400 is described is not intended to be construed as a limitation, and any number of the described method blocks may be combined in any order to implement the method 400, or an alternative method.
[0063] It may be understood that blocks of the method 400 may be performed by the computing system 100. The blocks of the method 400 may be executed based on instructions stored in a non-transitory computer-readable medium, as will be readily understood. The non-transitory computer-readable medium may include, for example, digital memories, magnetic storage media, such as magnetic disks and magnetic tapes, hard drives, or optically readable digital data storage media.
[0064] At block 402, an indication of actuation of a touch sensitive surface comprising 'N' touch sensors, such as the touch sensitive surface 206, is received. The actuation of the touch sensitive surface is based on a touch by an object on the touch sensitive surface. In an example, a user may use an object, such as a stylus, to touch the touch sensitive surface. The indication may be received by a processor of a computing device, such as the processor 102 of the computing system 100.
[0065] At block 404, an image of the touch sensitive surface is obtained in response to the receiving of the indication. The image of the touch sensitive surface has 'X' pixels and the number of pixels 'X' is greater than the number of touch sensors 'N'. In an example, the image may be captured by an image capturing device, such as the image capturing device 104. Further, in an example, the image may be obtained by the computing system 100. The method, thereafter, proceeds to block 406.
[0066] At block 406, a location of the touch sensitive surface corresponding to the touch by the object is identified based on the image.

[0067] Figure 5 illustrates a method 500 for identifying a location on a touch sensitive surface in accordance with another example implementation of the present subject matter. Although the method 500 may be implemented in a variety of computing systems, for ease of explanation, the present description of the example method 500 to determine a location on the touch sensitive surface is provided with reference to the above-described computing system 100.
[0068] The order in which the method 500 is described is not intended to be construed as a limitation, and any number of the described method blocks may be combined in any order to implement the method 500, or an alternative method.
[0069] It may be understood that blocks of the method 500 may be performed by the computing system 100. The blocks of the method 500 may be executed based on instructions stored in a non-transitory computer-readable medium, as will be readily understood. The non-transitory computer-readable medium may include, for example, digital memories, magnetic storage media, such as magnetic disks and magnetic tapes, hard drives, or optically readable digital data storage media.
[0070] At block 502, an indication of actuation of a touch sensitive surface is received. The touch sensitive surface may be similar to the touch sensitive surface 206 and comprises 'N' touch sensors. The actuation of the touch sensitive surface is based on a touch, by an object, at a point on the touch sensitive surface.
[0071] In an example, the touch by the object may be a single-point touch or a multi-point touch. In a single-point touch, the user may touch a single point on the touch sensitive surface or the touch may be localized to a few points on the touch sensitive surface. For example, a user initiating a graphical interface projected on the touch sensitive surface may actuate localized points on the touch sensitive surface. In another example, the touch may be a multi-point touch, such as when a user draws something on the touch sensitive surface. The touching of the touch sensitive surface by the object is detected by the touch sensors. In an example, the number of touch sensors is such that the touch sensitive surface can detect a touch at a point and is agnostic of a location of the touch.

[0072] At block 504, an image of the touch sensitive surface is obtained from an image capturing device, such as the image capturing device 104. The image comprises the object and has 'X' pixels. The number of pixels 'X' is greater than the number of touch sensors 'N' on the touch sensitive surface.
[0073] At block 506, the object in the image is identified to be a stylus. In an example, object detection algorithms, such as YOLO, RCNN, Fast RCNN, Mask RCNN, or MultiBox, may be used to identify the object in the image. In an example, the object detection algorithms may use a reference image of the stylus to identify the stylus in the image. During initial setup, the computing system may capture a reference image of the stylus and may store the same in a memory. In operation, the object detection algorithms may use the stored reference image of the stylus to identify the stylus in the image. Thus, even when only a portion of the stylus is captured in the image, the object detection algorithms may use the stored reference image of the stylus to identify the stylus in the image.
[0074] At block 508, a location of the tip of the stylus is identified. As understood, during actuation of the touch sensitive surface, the user may touch the touch sensitive surface with the tip of the stylus. In an example, a geometry-based algorithm and a model-based algorithm may be used to identify the location of the tip of the stylus in the image. For example, a feature detection algorithm may be used to determine the tip of the stylus. In an example, the feature detection algorithm may compare the reference image of the stylus with the identified stylus in the image to identify the tip and the location of the tip of the stylus. In an example, even when only a portion of the stylus is identified in the image, the feature detection algorithm may identify the location of the tip based on determining an orientation of the stylus in the image.
[0075] At block 510, a location on the touch sensitive surface corresponding to the point is determined based on the location of the tip of the stylus in the image. In an example, a transformation matrix may be used to determine the location on the touch sensitive surface corresponding to the point. For example, to determine the location on the touch sensitive surface corresponding to the point, image coordinates corresponding to the location of the tip may be determined, which may then be converted to surface coordinates using a transformation matrix. The surface coordinates may indicate the location on the touch sensitive surface corresponding to the point.
[0076] At block 512, the location on the touch sensitive surface is mapped to a point on a display of a display device. In an example, the display device has a display resolution having 'Y' pixels. The number of pixels in an image of the image capturing device is 'X', which is greater than the number of pixels 'Y' of the display device. The 'X' pixels of the image capturing device may be mapped to the 'Y' pixels of the display device. Thus, mapping of the point of touch on the touch sensitive surface to a point on the display device is enabled using the image and is independent of a touch sensor density of the touch sensitive surface.
[0077] Figure 6 illustrates a computing environment implementing a non-transitory computer-readable medium for determining a location on the touch sensitive surface, according to an example. In an example, the computing environment 600 may comprise a computing system, such as the computing system 100. The computing environment 600 includes a processing resource 602 communicatively coupled to the non-transitory computer-readable medium 604 through a communication link 606. In an example, the processing resource 602 may be a processor of the computing system 100 that fetches and executes computer-readable instructions from the non-transitory computer-readable medium 604.
[0078] The non-transitory computer-readable medium 604 can be, for example, an internal memory device or an external memory device. In an example, the communication link 606 may be a direct communication link, such as any memory read/write interface. In another example, the communication link 606 may be an indirect communication link, such as a network interface. In such a case, the processing resource 602 can access the non-transitory computer-readable medium 604 through a network 608. The network 608 may be a single network or a combination of multiple networks and may use a variety of different communication protocols.

[0079] The processing resource 602 and the non-transitory computer-readable medium 604 may also be communicatively coupled to data source(s) 610. The data source(s) 610 may also be used to store data, such as calibration data and location data. Further, the data source(s) 610 may be a database. In an example, the non-transitory computer-readable medium 604 comprises executable instructions 612 for determining a location on the touch sensitive surface. For example, the non-transitory computer-readable medium 604 may comprise instructions executable to implement the previously described location detection module 110.
[0080] In an example, the instructions 612 may be executable by the processing resource 602 to receive an indication of actuation of a touch sensitive surface. In an example, the actuation of the touch sensitive surface may be based on a touch, by an object, at a point on the touch sensitive surface. In an example, the touch sensitive surface comprises 'N' touch sensors. In response to the receiving of the actuation, an image of the touch sensitive surface, comprising the object, is received from an image capturing device. In an example, the image capturing device has an image resolution such that the number of pixels in the image is 'X', which is greater than the number of touch sensors 'N' on the touch sensitive surface. The instructions 612 may be further executable by the processing resource 602 to determine a location of the point on the touch sensitive surface. Further, the instructions 612 may be executable by the processing resource 602 to ascertain a location, on a display device, corresponding to the point on the touch sensitive surface. In an example, the display device has a display resolution. The image resolution of the image capturing device is greater than the display resolution of the display device.
[0081] In an example, to determine the location on the touch sensitive surface, the instructions 612 may be executable by the processing resource 602 to identify the object to be a stylus and further identify a location of the tip of the stylus. Further, in another example, the instructions 612 may be executable by the processing resource 602 to identify the object to be a finger of a user and further identify a location of the tip of the finger. In an example, a feature detection algorithm may be used to identify the object and locate the tip of the object. Based on the location of the tip of the object, the processing resource 602 may determine image coordinates of the location of the point in the image. The instructions 612 may further be executable by the processing resource 602 to convert the image coordinates to surface coordinates. The surface coordinates indicate the location on the touch sensitive surface corresponding to the point. The instructions 612 may further be executable by the processing resource 602 to convert the image coordinates to display coordinates, which would indicate the location on the display device corresponding to the point.
[0082] Thus, the methods and devices of the present subject matter provide techniques to determine a location on the touch sensitive surface. Although examples of determining the location on the touch sensitive surface have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of determining the location on the touch sensitive surface.

Claims

1. A computing system comprising:
a processor;
an image capturing device coupled to the processor, the image capturing device having an image resolution, such that an image captured by the image capturing device comprises 'X' pixels;
a touch sensitive surface coupled to the processor, the touch sensitive surface comprising 'N' touch sensors, wherein 'X' is greater than 'N';
an input detection module coupled to the processor to:
receive an indication of actuation of the touch sensitive surface, the actuation being caused by a touch at a point on the touch sensitive surface by an object; and
in response to receiving the indication, cause the image capturing device to capture an image of the touch sensitive surface, the image comprising the object; and
a location detection module coupled to the processor to:
determine, based on the image, a location of the point on the touch sensitive surface.
2. The system as claimed in claim 1, further comprising:
a display device coupled to the processor and having a display resolution such that the display resolution is less than the image resolution; and
a display module coupled to the processor to determine a location corresponding to the point on the display device.
3. The system as claimed in claim 1, wherein the touch sensitive surface detects a touch at a point on the touch sensitive surface agnostic of a location of the point.
4. The system as claimed in claim 1, wherein the object is a finger or a stylus and wherein the location detection module comprises a feature detection module to identify a tip of the finger or the stylus.
5. The system as claimed in claim 1, further comprising a calibration module coupled to the processor to:
determine image coordinates of the point in the image, the image coordinates being indicative of a location of the point in the image; and
convert the image coordinates into surface coordinates, the surface coordinates being indicative of a location of the point on the touch sensitive surface, wherein the transformation between the image coordinates and the surface coordinates is recorded,
wherein the location detection module is to determine the location of the point on the touch sensitive surface based on the surface coordinates.
6. The system as claimed in claim 1, wherein the image comprises a portion of the touch sensitive surface in a field of view of the image capturing device, the portion comprising 'M' touch sensors of the 'N' touch sensors, wherein the portion of the touch sensitive surface in the image covers 'P' pixels of the 'X' pixels of the image, 'P' being greater than 'M'.
7. A method comprising:
receiving an indication of actuation of a touch sensitive surface, the touch sensitive surface comprising 'N' touch sensors, the actuation being based on a touch, by an object, on the touch sensitive surface;
obtaining, in response to receiving the indication, an image of the touch sensitive surface from an image capturing device, the image comprising the object and having 'X' pixels, wherein 'X' is greater than 'N'; and
identifying, based on the image, a location of the touch sensitive surface corresponding to the touch by the object.
8. The method as claimed in claim 7, further comprising: mapping, based on the image, the identified location to a point on a display of a display device, the display device having 'Y' pixels, wherein 'X' is greater than 'Y'.
9. The method as claimed in claim 7, wherein the indication of actuation of the touch sensitive surface based on the touch is agnostic of a location of the touch.
10. The method as claimed in claim 7, further comprising:
identifying the object in the image to be a finger of a user;
identifying a location of a tip of the finger in the image, wherein the tip of the finger touches the touch sensitive surface; and
determining the location of the touch sensitive surface corresponding to the touch based on the location of the tip of the finger.
11. The method as claimed in claim 7, wherein the method further comprises: identifying the object in the image to be a stylus;
identifying a location of a tip of the stylus in the image, wherein the tip of the stylus touches the touch sensitive surface; and
determining the location of the touch sensitive surface corresponding to the touch based on the location of the tip of the stylus.
12. A non-transitory computer-readable medium comprising instructions executable by a processing resource to:
receive an indication of actuation of a touch sensitive surface, the actuation being caused by a touch at a point on the touch sensitive surface by an object, wherein the touch sensitive surface comprises 'N' touch sensors;
obtain, in response to the indication, an image of the touch sensitive surface comprising the object, wherein the image has an image resolution comprising 'X' pixels, 'X' being greater than 'N';
identify, based on the image, a location on the touch sensitive surface corresponding to the point; and ascertain, based on the image, a location on a display of a display device corresponding to the point, wherein the display device has a display resolution, the image resolution being higher than the display resolution.
13. The non-transitory computer-readable medium as claimed in claim 12, comprising instructions executable by the processing resource to:
determine image coordinates corresponding to the point in the image, the image coordinates being indicative of a location of the point in the image;
convert the image coordinates into surface coordinates, the surface coordinates being indicative of the location on the touch sensitive surface corresponding to the point; and
convert the image coordinates into display coordinates, the display coordinates being indicative of the location on the display corresponding to the point.
14. The non-transitory computer-readable medium as claimed in claim 12, comprising instructions executable by the processing resource to:
identify the object in the image to be a stylus;
identify a location of a tip of the stylus from the image, wherein the tip of the stylus touches the touch sensitive surface; and
determine the location of the touch sensitive surface corresponding to the touch based on the location of the tip of the stylus.
15. The non-transitory computer-readable medium as claimed in claim 12, comprising instructions executable by the processing resource to:
identify the object in the image to be a finger of a user;
identify a location of a tip of the finger from the image, wherein the tip of the finger touches the touch sensitive surface; and
determine the location of the touch sensitive surface corresponding to the touch based on the location of the tip of the finger.
PCT/US2018/044683 2018-07-31 2018-07-31 Determining location of touch on touch sensitive surfaces WO2020027818A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2018/044683 WO2020027818A1 (en) 2018-07-31 2018-07-31 Determining location of touch on touch sensitive surfaces

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2018/044683 WO2020027818A1 (en) 2018-07-31 2018-07-31 Determining location of touch on touch sensitive surfaces

Publications (1)

Publication Number Publication Date
WO2020027818A1 true WO2020027818A1 (en) 2020-02-06

Family

ID=69230890

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/044683 WO2020027818A1 (en) 2018-07-31 2018-07-31 Determining location of touch on touch sensitive surfaces

Country Status (1)

Country Link
WO (1) WO2020027818A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140139717A1 (en) * 2011-07-29 2014-05-22 David Bradley Short Projection capture system, programming and method
US20130113920A1 (en) * 2011-11-04 2013-05-09 Robert D. Blanton Determining position in a projection capture system
US20140152843A1 (en) * 2012-12-04 2014-06-05 Seiko Epson Corporation Overhead camera and method for controlling overhead camera
US20160334938A1 (en) * 2014-01-31 2016-11-17 Hewlett-Packard Development Company, L.P. Touch sensitive mat of a system with a projector unit
US20160334892A1 (en) * 2014-01-31 2016-11-17 Hewlett-Packard Development Company, L.P. Display unit manager

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11467678B2 (en) 2018-07-24 2022-10-11 Shapirten Laboratories Llc Power efficient stylus for an electronic device
US11449175B2 (en) 2020-03-31 2022-09-20 Apple Inc. System and method for multi-frequency projection scan for input device detection
US11460933B2 (en) 2020-09-24 2022-10-04 Apple Inc. Shield electrode for input device
US11997777B2 (en) 2020-09-24 2024-05-28 Apple Inc. Electrostatic discharge robust design for input device
US11287926B1 (en) 2020-09-25 2022-03-29 Apple Inc. System and machine learning method for detecting input device distance from touch sensitive surfaces
US11526240B1 (en) 2020-09-25 2022-12-13 Apple Inc. Reducing sensitivity to leakage variation for passive stylus
US11907475B2 (en) 2020-09-25 2024-02-20 Apple Inc. System and machine learning method for localization of an input device relative to a touch sensitive surface

Similar Documents

Publication Publication Date Title
WO2020027818A1 (en) Determining location of touch on touch sensitive surfaces
JP6417702B2 (en) Image processing apparatus, image processing method, and image processing program
JP4820285B2 (en) Automatic alignment touch system and method
US9218124B2 (en) Information processing apparatus, information processing method, and program
US9454260B2 (en) System and method for enabling multi-display input
US11514650B2 (en) Electronic apparatus and method for controlling thereof
US9207779B2 (en) Method of recognizing contactless user interface motion and system there-of
JP6089722B2 (en) Image processing apparatus, image processing method, and image processing program
US20120319945A1 (en) System and method for reporting data in a computer vision system
US9442607B2 (en) Interactive input system and method
JP5802247B2 (en) Information processing device
CN108027656B (en) Input device, input method, and program
TW201214243A (en) Optical touch system and object detection method therefor
US10817054B2 (en) Eye watch point tracking via binocular and stereo images
US9921054B2 (en) Shooting method for three dimensional modeling and electronic device supporting the same
JP6287382B2 (en) Gesture recognition device and method for controlling gesture recognition device
US20160188950A1 (en) Optical fingerprint recognition device
JP2016103137A (en) User interface system, image processor and control program
TW201621454A (en) Projection alignment
TWI448918B (en) Optical panel touch system
WO2020027813A1 (en) Cropping portions of images
US11257208B2 (en) Defect inspection system for specimen and defect inspection method for specimen
US20160205272A1 (en) Image processing apparatus, information processing method, and non-transitory computer-readable medium
JP6898021B2 (en) Operation input device, operation input method, and program
JP2009222446A (en) Distance measuring device and its program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18928973

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18928973

Country of ref document: EP

Kind code of ref document: A1