WO2017074809A1 - Computing device comprising a user-input accessory - Google Patents

Computing device comprising a user-input accessory

Info

Publication number
WO2017074809A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
computer
identifier
examples
software application
Application number
PCT/US2016/058106
Other languages
English (en)
Inventor
Oscar KOENDERS
Original Assignee
Microsoft Technology Licensing, Llc
Application filed by Microsoft Technology Licensing, Llc filed Critical Microsoft Technology Licensing, Llc
Priority to EP16794799.3A priority Critical patent/EP3368977B1/fr
Priority to CN201680064071.5A priority patent/CN108351791B/zh
Publication of WO2017074809A1 publication Critical patent/WO2017074809A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/445Program loading or initiating
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W8/00Network data management
    • H04W8/005Discovery of network devices, e.g. terminals

Definitions

  • Smartphones, desktop computers, and other computing devices can run a wide variety of software programs. Many of these programs are designed to interact with specific peripherals. For example, many smartphone apps are designed to operate on computing devices that include or are connected to multi-touch-capable touchscreens. Similarly, some graphics applications are configured to operate with pen tablets instead of or in addition to mice or other pointing devices. The user experience of an application can depend significantly on which peripherals are used to interact with the application.
  • a computing device can include a wireless interrogator configured to wirelessly detect a first identifier associated with a tagged user-input accessory in operational proximity to the wireless interrogator and wirelessly detect a second identifier associated with a tagged object in operational proximity to the wireless interrogator.
  • the computing device can include a force sensor having a sensing surface.
  • the computing device can determine a software application corresponding to the first identifier and execute the determined software application.
  • the computing device can detect a force exerted against the sensing surface by the object and provide to the software application information of the second identifier and information of the detected force.
  • Example techniques described herein can detect spatially-varying forces across the sensing surface.
  • a location or shape of the object can be determined and provided to the software application.
  • a representation of the object can be presented for display in a user interface.
  • the software application can display a user interface corresponding to the first identifier.
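  • As a minimal illustrative sketch (not part of the disclosure), the summarized flow could be expressed as follows in Python; the interrogator, force_sensor, and mapping_store objects and their methods are hypothetical stand-ins for the hardware and data described above.

```python
import json
import subprocess

def handle_accessory(interrogator, force_sensor, mapping_store):
    """Detect a tagged accessory, launch its mapped application, and forward object/force data."""
    # First identifier: the tagged user-input accessory in operational proximity.
    accessory_id = interrogator.read_accessory_id()

    # Determine the software application corresponding to that identifier and execute it.
    app_path = mapping_store.lookup(accessory_id)
    app = subprocess.Popen([app_path], stdin=subprocess.PIPE)

    # Second identifier: a tagged object placed over the sensing surface.
    object_id = interrogator.read_object_id()
    force_newtons = force_sensor.read_force()

    # Provide information of the second identifier and of the detected force to
    # the application, here as a JSON message on its standard input.
    app.stdin.write(json.dumps({"object_id": object_id,
                                "force_n": force_newtons}).encode())
    app.stdin.flush()
    return app
```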
  • FIG. 1 is a block diagram depicting an example environment for implementing application selection and operation as described herein.
  • FIG. 2 is a block diagram depicting an example computing device configured to participate in application selection and operation according to various examples described herein.
  • FIG. 3 is a perspective view illustrating an example computing device and example uses thereof.
  • FIG. 4 is a dataflow diagram depicting example module interactions during application selection and operation.
  • FIG. 5 is a flow diagram that illustrates example processes for application selection and operation.
  • FIG. 6 is a flow diagram that illustrates example processes for application operation.
  • FIG. 7 is a flow diagram that illustrates example processes for application operation based on spatial data of one or more objects.
  • FIG. 8 is a flow diagram that illustrates example processes for user-interface selection based on a user-input accessory.
  • Examples described herein provide techniques and constructs to launch or operate an application in response to a user-input accessory (UIA) selected, e.g., by an entity, such as an operator entity or a user.
  • an application specific to the UIA can be launched.
  • user interfaces presented by the application can be adjusted based on the UIA. Either app-specific launching or user-interface customization can improve operational efficiency compared to using an application not designed for that UIA.
  • a UIA is a computer peripheral, an attachment to a computer or computer peripheral, or an accessory (electronic or not) usable with a computer or computer peripheral.
  • a UIA of any of these types is configured to afford specific user inputs or types of user inputs, i.e., to express or represent to a user the possibility of providing those inputs or types of user inputs.
  • a UIA showing a depiction of a painter's palette affords selecting a color, e.g., to be used in a computer paint program such as GIMP. Further examples of user-input accessories are described below.
  • the user experience of an application can change significantly, and operational efficiency with the application can be reduced, when the application is used with a peripheral other than that for which the application was designed.
  • Some examples described herein permit using an application with the peripheral for which that application was designed.
  • Some examples described herein permit automatically executing a software application or automatically configuring a software application based on a connected peripheral, reducing the time required to begin using the peripheral and increasing operational efficiency.
  • Some examples described herein permit effectively manipulating virtual objects using corresponding physical objects, such as wooden or plastic blocks carrying radio-frequency identification (RFID) tags, e.g., placed on a force-sensing surface.
  • the UIA can show outlines indicating where blocks should be placed, or can be arranged horizontally to serve as a tray or other support for blocks. Further examples of objects are described below.
  • the terms “application,” “app,” and “software program” refer generally to any software or portion thereof running on a computing device and responsive to or associated with user-input accessories as described herein.
  • apps can include smartphone downloadable programs, desktop-computer programs such as word processors and spreadsheets, smartphone and other embedded operating systems, programs included with such operating systems such as shells or device-management subsystems, and embedded programs (e.g., firmware or software) running on sensor devices such as Internet of Things (IoT) devices.
  • the terms “application,” “app,” and “software program” also encompass hardwired logic included in a computing device and configured to respond to presence of, or signals from, a UIA as described herein.
  • FIG. 1 shows an example environment 100 in which examples of devices or systems responsive to user-input accessories can operate or in which program-launching or -operation methods such as described below can be performed.
  • various devices and/or components of environment 100 include computing devices 102(1)- 102(N) (individually or collectively referred to herein with reference 102), where N is any integer greater than or equal to 1.
  • computing devices 102 can include a diverse variety of device categories, classes, or types and are not limited to a particular type of device.
  • computing devices 102 can include, but are not limited to, automotive computers such as vehicle control systems, vehicle security systems, or electronic keys for vehicles (e.g., 102(1), represented graphically as an automobile); smartphones, mobile phones, mobile phone-tablet hybrid devices, personal data assistants (PDAs), or other telecommunication devices (e.g., 102(2)); portable or console-based gaming devices or other entertainment devices such as network-enabled televisions, set-top boxes, media players, cameras, or personal video recorders (PVRs) (e.g., 102(3), represented graphically as a gamepad); desktop computers (e.g., 102(4)); laptop computers, thin clients, terminals, or other mobile computers (e.g., 102(5)); tablet computers or tablet hybrid computers (e.g., 102(N)); server computers or blade servers such as Web servers, map/reduce servers or other computation engines, or network-attached- storage units; wearable computers such as smart watches or biometric or medical sensors; implant
  • computing devices 102 or types of computing devices 102 can use different peripherals and can have different uses for those peripherals.
  • peripherals can include user-input accessories such as those discussed below.
  • portable devices such as computing devices 102(2) and 102(N) can use peripherals designed to cooperate with the small sizes of those devices. Larger devices such as computing devices 102(4) and 102(5) can use peripherals that take advantage of physical desktop space, e.g., pen tablets.
  • computing devices 102 can communicate with each other and/or with other computing devices via one or more networks 104. In some examples, computing devices 102 can communicate with external devices via networks 104.
  • networks 104 can include public networks such as the Internet, private networks such as an institutional or personal intranet, cellular networks, or combinations of private and public networks.
  • Networks 104 can also include any type of wired or wireless network, including but not limited to local area networks (LANs), wide area networks (WANs), satellite networks, cable networks, Wi-Fi networks, WiMAX networks, mobile communications networks (e.g., 3G, 4G, and so forth) or any combination thereof.
  • Networks 104 can utilize communications protocols, such as, for example, packet-based or datagram-based protocols such as Internet Protocol (IP), Transmission Control Protocol (TCP), User Datagram Protocol (UDP), other types of protocols, or combinations thereof.
  • networks 104 can also include a number of devices that facilitate network communications or form a hardware infrastructure for the networks, such as switches, routers, gateways, access points, firewalls, base stations, repeaters, backbone devices, and the like.
  • Networks 104 can also include devices that facilitate communications between computing devices 102 using bus protocols of various topologies, e.g., crossbar switches, INFINIBAND switches, or FIBRE CHANNEL switches or hubs.
  • Different networks have different characteristics, e.g., bandwidth, latency, accessibility (open, announced but secured, or not announced), or coverage area.
  • the type of network 104 used for any given connection between, e.g., a computing device 102 and a computing cluster can be selected based on these characteristics and on the type of interaction. For example, a low-power, low-bandwidth network can be selected for IoT sensors, and a low-latency network can be selected for smartphones.
  • networks 104 can further include devices that enable connection to a wireless network, such as a wireless access point (WAP).
  • Examples support connectivity through WAPs that send and receive data over various electromagnetic frequencies (e.g., radio frequencies), including WAPs that support Institute of Electrical and Electronics Engineers (IEEE) 802.11 standards (e.g., 802.11g, 802.11n, and so forth), other standards, e.g., BLUETOOTH, cellular-telephony standards such as GSM, LTE, or WiMAX, or multiples or combinations thereof.
  • computing device 102(N) can include one or more processing units 108 operably connected to one or more computer-readable media 110 such as via a bus 112, which in some instances can include one or more of a system bus, a data bus, an address bus, a Peripheral Component Interconnect (PCI) Express (PCIe) bus, a PCI bus, a Mini-PCI bus, and any variety of local, peripheral, or independent buses, or any combination thereof.
  • plural processing units 108 can exchange data through an internal bus 112 (e.g., PCIe), rather than or in addition to network 104. While in this example the processing units 108 are described as residing on the computing device 102(N), the processing units 108 can also reside on different computing devices 102 in some examples. In some examples, at least two of the processing units 108 can reside on different computing devices 102. In such examples, multiple processing units 108 on the same computing device 102 can use a bus 112 of the computing device 102 to exchange data, while processing units 108 on different computing devices 102 can exchange data via networks 104.
  • Processing units 108 can be or include one or more single-core processors, multi-core processors, central processing units (CPUs), graphics processing units (GPUs), general-purpose GPUs (GPGPUs), or hardware logic components configured, e.g., via specialized programming from modules or Application Programming Interfaces (APIs), to perform functions described herein.
  • illustrative types of hardware logic components that can be used in or as processing units 108 include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), Digital Signal Processors (DSPs), and other types of customizable processors.
  • a processing unit 108 can represent a hybrid device, such as a device from ALTERA or XILINX that includes a CPU core embedded in an FPGA fabric.
  • These or other hardware logic components can operate independently or, in some instances, can be driven by a CPU.
  • computing devices 102 can include a plurality of processing units 108 of multiple types.
  • the processing units 108 in computing device 102(N) can be a combination of one or more GPGPUs and one or more FPGAs.
  • Different processing units 108 can have different execution models, e.g., as is the case for graphics processing units (GPUs) and central processing unit (CPUs).
  • processing units 108, computer-readable media 110, and modules or engines stored on computer-readable media 110 can together represent an ASIC, FPGA, or other logic device configured to carry out the functions of such modules or engines.
  • an engine as described herein can include one or more modules.
  • Computer-readable media described herein includes computer storage media and/or communication media.
  • Computer storage media includes tangible storage units such as volatile memory, nonvolatile memory, and/or other persistent and/or auxiliary computer storage media, removable and non-removable computer storage media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data.
  • Computer storage media includes tangible or physical forms of media included in a device or hardware component that is part of a device or external to a device, including but not limited to random-access memory (RAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), phase change memory (PRAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash memory, compact disc read-only memory (CD-ROM), digital versatile disks (DVDs), optical cards or other optical storage media, magnetic cassettes, magnetic tape, magnetic disk storage, magnetic cards or other magnetic storage devices or media, solid-state memory devices, storage arrays, network-attached storage, storage area networks, hosted computer storage or memories, and/or any other storage devices or media that can be used to store and maintain information for access by a computing device 102.
  • communication media can embody computer-readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave, or other transmission mechanism.
  • computer storage media does not include communication media.
  • computer-readable media 110 can store instructions executable by the processing units 108 that, as discussed above, can represent a processing unit incorporated in computing device 102.
  • Computer-readable media 110 can additionally or alternatively store instructions executable by external processing units such as by an external CPU or external processor of any type discussed above.
  • In some examples, at least one processing unit 108, e.g., a CPU, GPU, or hardware logic device, is incorporated in computing device 102, while in other examples at least one processing unit 108, e.g., one or more of a CPU, GPU, or hardware logic device, is external to computing device 102.
  • Computer-readable media 110 can store, for example, executable instructions of an operating system 114, a launching engine 116, an interaction engine 118, and other modules, programs, or applications that are loadable and executable by processing units 108.
  • one or more of the processing units 108 in one of the computing devices 102 can be operably connected to computer-readable media 110 in a different one of the computing devices 102, e.g., via communications interface 120 and network 104.
  • program code to perform steps of flow diagrams herein can be downloaded from a server, e.g., computing device 102(4), to a client, e.g., computing device 102(N), e.g., via the network 104, and executed by one or more processing units 108 in computing device 102(N).
  • the computer-executable instructions stored on the computer-readable media 110 can upon execution configure a computer such as a computing device 102 to perform operations described herein with reference to the operating system 114, the launching engine 116, or the interaction engine 118.
  • Computer-readable media 110 of the computing device 102 can store an operating system 114.
  • operating system 114 is not used (commonly referred to as a "bare metal" configuration).
  • operating system 114 can include components that enable or direct the computing device 102 to receive data via various inputs (e.g., user controls, network or communications interfaces, memory devices, or sensors), and process the data using the processing units 108 to generate output.
  • the operating system 114 can further include one or more components that present the output (e.g., display an image on an electronic display, store data in memory, transmit data to another computing device, etc.).
  • the operating system 114 can enable a user to interact with apps or with modules of the interaction engine 118 using a user interface (omitted for brevity). Additionally, the operating system 114 can include components that perform various functions generally associated with an operating system, e.g., storage management and internal-device management.
  • Computing device 102 can also include one or more communications interfaces 120 to enable wired or wireless communications between computing devices 102 and other networked computing devices 102 involved in application selection or operation, or other computing devices, over networks 104.
  • Such communications interfaces 120 can include one or more transceiver devices, e.g., network interface controllers (NICs) such as Ethernet NICs or other types of transceiver devices, to send and receive communications over a network.
  • the processing units 108 can exchange data through respective communications interfaces 120.
  • the communications interface 120 can be a PCIe transceiver
  • the network 104 can be a PCIe bus.
  • the communications interface 120 can include, but is not limited to, a transceiver for cellular (3G, 4G, or other), WI-FI, Ultra-wideband (UWB), BLUETOOTH, or satellite transmissions.
  • the communications interface 120 can include a wired I/O interface, such as an Ethernet interface, a serial interface, a Universal Serial Bus (USB) interface, an INFINIBAND interface, or other wired interfaces. For simplicity, these and other components are omitted from the illustrated computing device 102.
  • a computing device 102 such as the computing device 102(N) can include or be connected with a wireless interrogator 122.
  • the wireless interrogator 122 can be configured to wirelessly detect respective identifiers associated with tagged objects, e.g., by transmitting interrogation signals and receiving responses from tagged objects in operational proximity to the wireless interrogator 122.
  • the wireless interrogator 122 can include an RFID or near-field communications (NFC) reader configured to wirelessly detect the identifiers of RFID-tagged or NFC-tagged objects of the tagged objects.
  • the wireless interrogator 122 can include a reader configured to wirelessly detect the identifiers of ones of the tagged objects having transceivers for BLUETOOTH, BLUETOOTH Low-Energy (BLE), or other personal-area-networking (PAN) technologies.
  • the wireless interrogator 122 can include an optical detector configured to locate and decode visual indicia on or in surfaces of the tagged objects, e.g., barcodes, specific colors, or specific patterns.
  • a computing device 102 such as the computing device 102(N) can include or be connected with a force sensor 124.
  • the force sensor 124 can have a sensing surface.
  • the force sensor 124 can be configured to sense force, pressure, or other mechanical actions on or against the sensing surface.
  • the force sensor 124 can be configured to detect spatially-varying forces across the sensing surface.
  • Example force sensors useful with various examples are described in U.S. Patent No. 9,001,082 to Rosenberg et al. ("'082").
  • the force sensor 124 can include a two-dimensional variable-impedance array having column and row electrodes interconnected with nonzero impedances, e.g., as shown in FIG.
  • the force sensor 124 can include one or more optical, resistive, or capacitive touch sensors or one or more capacitive or inductive proximity sensors. In some examples, the force sensor 124 can include one or more deformable membranes or strain gauges.
  • the wireless interrogator 122 can be configured to detect a user-input accessory (UIA) 126 or an object 128 arranged in operational proximity to the wireless interrogator 122, e.g., in a sensing range thereof.
  • the UIA 126 is associated with (e.g., is attached to or includes) a tag 130, e.g., an RFID tag.
  • the object 128 is associated with a tag 132, e.g., an RFID tag.
  • the wireless interrogator 122 can detect identifiers (e.g., class, object, vendor, product, or unique identifiers) or other information stored in the tags 130 or 132, e.g., when the tags 130 or 132 are in operational proximity to the wireless interrogator 122, e.g., in the sensing range thereof.
  • the wireless interrogator 122 can store or update information in the tags 130 or 132.
  • the wireless interrogator 122 can wirelessly detect a first identifier associated with the tagged user-input accessory 126 in operational proximity to the wireless interrogator 122, and wirelessly detect a second identifier associated with the tagged object 128 in operational proximity to the wireless interrogator 122. Further examples are discussed below with reference to FIG. 3.
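  • As an illustrative sketch only, one way to separate interrogator reads into accessory and object identifiers is to reserve a class code in the identifier itself; the one-byte class field below is an assumption for this example, not a format defined by this disclosure.

```python
# Assumed layout: the first byte of each tag identifier is a class code.
ACCESSORY_CLASS = 0x01   # tag 130 on the user-input accessory 126
OBJECT_CLASS = 0x02      # tag 132 on an object 128

def classify_tag_reads(tag_ids):
    """Split raw tag identifiers (bytes) into accessory and object identifiers."""
    accessories, objects = [], []
    for tag_id in tag_ids:
        if tag_id[0] == ACCESSORY_CLASS:
            accessories.append(tag_id)
        elif tag_id[0] == OBJECT_CLASS:
            objects.append(tag_id)
    return accessories, objects

# Example: one accessory tag and one object tag in range.
uia_ids, object_ids = classify_tag_reads([bytes([0x01, 0xAA, 0x10]),
                                          bytes([0x02, 0x42, 0x07])])
```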
  • the force sensor 124 can be configured to detect the UIA 126 or the object 128 arranged in operational proximity to the force sensor 124.
  • the UIA 126 or the object 128 can exert a force against a sensing surface of the force sensor 124.
  • the force sensor 124 can detect, e.g., the presence, magnitude, magnitude per area (i.e., pressure), or direction of such a force, whether exerted against the sensing surface directly or indirectly.
  • Example forces exerted directly against the sensing surface are graphically represented by solid arrow shafts; example forces exerted indirectly against the sensing surface (e.g., by the object 128 through the UIA 126) are graphically represented by stippled arrow shafts.
  • the object 128 can be arranged above and supported by the UIA 126 or the force sensor 124, and exert gravitational force (weight) against the force sensor 124. Examples of the force sensor 124 are discussed in more detail below with reference to FIG. 4.
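  • A brief sketch of how spatially-varying force readings could be reduced to presence, total magnitude, pressure, and a contact location; the per-element grid of readings (in newtons) and the cell area are assumptions for illustration.

```python
def summarize_force_grid(grid, cell_area_m2=1e-4):
    """Return total force (N), average pressure (Pa) over loaded cells, and the force centroid."""
    total = 0.0
    weighted_row = weighted_col = 0.0
    loaded_cells = 0
    for r, row in enumerate(grid):
        for c, force in enumerate(row):
            total += force
            weighted_row += r * force
            weighted_col += c * force
            if force > 0.0:
                loaded_cells += 1
    if total == 0.0:
        return {"total_n": 0.0, "pressure_pa": 0.0, "centroid": None}
    return {"total_n": total,
            "pressure_pa": total / (loaded_cells * cell_area_m2),
            "centroid": (weighted_row / total, weighted_col / total)}

# A small object pressing near the center of a 3x3 patch of sensing elements:
print(summarize_force_grid([[0.0, 0.0, 0.0],
                            [0.0, 1.2, 0.8],
                            [0.0, 0.5, 0.0]]))
```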
  • the UIA 126 can include a pad configured to overlie the force sensor 124 or the sensing surface of the force sensor 124.
  • the pad can include, e.g., a deformable sheet, e.g., comprising a thermoplastic material.
  • the UIA 126 and force sensor 124 can be arranged in any orientation.
  • the UIA 126 and force sensor 124 can be arranged horizontally, e.g., to support object 128 thereupon.
  • the UIA 126 and force sensor 124 can be arranged vertically, e.g., to detect lateral forces exerted by or via object 128.
  • the UIA 126 can be selected by an entity.
  • the UIA 126 can be placed by the entity into operational proximity to, or operational arrangement with, the wireless interrogator 122 or the force sensor 124.
  • the entity can be, for example, a robot or other robotic operator, which may not be directly controlled by a human user.
  • the entity can be a manufacturing robot, e.g., configured to pick and place the UIA 126.
  • the entity can be a robotic assistance device such as a powered prosthesis or exoskeleton or a robotic remote manipulator ("waldo").
  • the operator can be a human user and the UIA 126 can be selected by the human user.
  • the computing device 102(N) is communicatively connected, e.g., via the network 104, with a computing device 134, e.g., a cloud or other server.
  • computing devices e.g., computing devices 102 and 134, can intercommunicate to participate in or carry out application selection or operation as described herein.
  • the computing device 134 can include components shown at inset 106.
  • the computing device 134 can additionally or alternatively include one or more computer-readable media (omitted for brevity) including a mapping store 138 holding one or more mappings from identifiers of user-input accessories such as the UIA 126 to indications of software applications corresponding to those identifiers. Examples of the mapping store 138 and the mappings are discussed in more detail below with reference to FIG. 4.
  • the computing device 134 or one or more of the computing devices 102 can be computing nodes in a computing cluster, e.g., a cloud service such as MICROSOFT AZURE.
  • Cloud computing permits computing resources to be provided as services rather than a deliverable product.
  • computing power, software, information, and/or network connectivity are provided (for example, through a rental agreement) over a network, such as the Internet.
  • computing devices 102 can be clients of a cluster including the computing device 134 and can submit jobs, e.g., including identifiers of UIAs 126, to the cluster and/or receive job results, e.g., indications of software applications, from the cluster.
  • Computing devices in the cluster can, e.g., share resources, balance load, increase performance, or provide fail-over support or redundancy.
  • FIG. 2 is an illustrative diagram that shows example components of a computing device 200, which can represent computing devices 102 or 134, and which can be configured to participate in application selection or operation according to various examples described herein.
  • Computing device 200 can implement a launching engine 202, which can represent launching engine 116, FIG. 1.
  • Computing device 200 can implement an interaction engine 204, which can represent interaction engine 118, FIG. 1.
  • Computing device 200 can implement a mapping engine 206.
  • Computing device 200 can implement a software application 208, e.g., configured for interaction with a user.
  • the computing device 200 can implement mapping engine 206 but not launching engine 202, interaction engine 204, or software application 208.
  • the computing device 200 can implement launching engine 202, interaction engine 204, or software application 208 but not mapping engine 206.
  • the computing device 200 can implement launching engine 202, interaction engine 204, mapping engine 206, and software application 208.
  • the computing device 200 can include or be connected to a communications interface 210, which can represent communications interface 120.
  • communications interface 210 can include a transceiver device such as a network interface controller (NIC) to send and receive communications over a network 104 (shown in phantom), e.g., as discussed above.
  • the computing device 200 can have network capabilities.
  • the computing device 200 can exchange data with other computing devices 102 or 134 (e.g., laptops, computers, and/or servers) via one or more networks 104, such as the Internet.
  • the computing device 200 can include or be connected to user-interface hardware 212.
  • User-interface hardware 212 can include a display 214 or other device(s) configured to present user interfaces, e.g., as described below.
  • Display 214 can include an organic light-emitting-diode (OLED) display, a liquid-crystal display (LCD), a cathode-ray tube (CRT), or another type of visual display.
  • Display 214 can be a component of a touchscreen, or can include a touchscreen.
  • the user-interface hardware 212 can include various types of output devices configured for communication to a user or to another computing device 200. Output devices can be integral or peripheral to computing device 200.
  • Examples of output devices can include a display, a printer, audio speakers, beepers, or other audio output devices, a vibration motor, linear vibrator, or other haptic output device, and the like.
  • the interaction engine 204 is operatively coupled to the display 214 or another output device.
  • User-interface hardware 212 can include a user-operable input device 216 (graphically represented as a gamepad).
  • User-operable input device 216 can include various types of input devices, integral or peripheral to computing device 200. The input devices can be user-operable, or can be configured for input from another computing device 200.
  • Examples of input devices can include, e.g., a keyboard, keypad, a mouse, a trackball, a pen sensor or smart pen, a light pen or light gun, a game controller such as a joystick or game pad, a voice input device such as a microphone, voice-recognition device, or speech- recognition device, a touch input device, a gestural input device such as a touchscreen, a grip sensor, an accelerometer, another haptic input, a visual input device such as one or more cameras or image sensors, and the like.
  • computing device 102 can include one or more measurement units 218.
  • Measurement units 218 can detect physical properties or status of computing device 200 or its environment. Examples of measurement units 218 can include units to detect motion, temperature, force, pressure, light, sound, electromagnetic radiation (e.g., for wireless networking), or any detectable form of energy or matter in or within sensing range of computing device 200.
  • the measurement units 218 can include the wireless interrogator 122 and/or the force sensor 124, FIG. 1.
  • Individual measurement units of the measurement units 218 can be configured to output data corresponding to at least one physical property, e.g., a physical property of the computing device 200, such as acceleration, or of an environment of the computing device 200, such as temperature or humidity.
  • measurement units 218 can include an accelerometer, a microphone, or front- and rear-facing cameras.
  • measurement units 218 can include a motion sensor, a proximity detector (e.g., for nearby life forms, people, or devices), a light sensor (e.g., a CdS photoresistor or a phototransistor), a still imager (e.g., a charge-coupled device, CCD, or complementary metal-oxide-semiconductor, CMOS, sensor), a video imager (e.g., CCD or CMOS), a microphone, a fingerprint reader, a retinal scanner, an iris scanner, or a touchscreen (e.g., in or associated with a display in user-interface hardware 212, such as display 214).
  • computing device 102 can include one or more sensors 220.
  • Components of communications interface 210 e.g., transceivers for BLUETOOTH, WI-FI, RFID, NFC, or LTE, can be examples of sensors 220.
  • Such components can be used to, e.g., detect signals corresponding to characteristics of accessible networks. Such signals can also be detected by automatically locating information in a table of network information (e.g., cell-phone tower locations), or by a combination of detection by a component of communications interface 120 and table lookup.
  • Input components of user-interface hardware 212 e.g., touchscreens or phone mouthpieces, can also be examples of sensors 220.
  • Measurement units 218 can also be examples of sensors 220.
  • a particular device can simultaneously or selectively operate as part of two or more of communications interface 210, user-interface hardware 212, and one or more measurement units 218.
  • a touchscreen can be an element of user-interface hardware 212 and used to present information and receive user commands. Signals from the same touchscreen can also be used in determining a user's grip on computing device 200. Accordingly, that touchscreen in this example is also a sensor 220.
  • Computing device 200 can further include one or more input/output (I/O) interfaces 222 by which computing device 200 can communicate with input, output, or I/O devices (for clarity, some not depicted). Examples of such devices can include components of user-interface hardware 212 or sensors 220. Computing device 200 can communicate via I/O interface 222 with suitable devices or using suitable electronic/software interaction methods. Input data, e.g., of user inputs on user-operable input device 216, can be received via I/O interfaces 222, and output data, e.g., of user interface screens, can be provided via I/O interfaces 222 to display 214, e.g., for viewing by a user.
  • the computing device 200 can include one or more processing units 224, which can represent processing units 108.
  • processing units 224 can include or be connected to a memory 226, e.g., a random-access memory (RAM) or cache.
  • Processing units 224 can be operably coupled, e.g., via the I/O interface 222, to the user-interface hardware 212 and/or the sensors 220.
  • Processing units 224 can be operably coupled to at least one computer-readable media 228, discussed below.
  • Processing units 224 can include, e.g., processing unit types described above such as CPU- or GPGPU-type processing units.
  • the processing units 224 can be configured to execute modules of the plurality of modules.
  • the computer-executable instructions stored on the computer-readable media 228 can upon execution configure a computer such as a computing device 200 to perform operations described herein with reference to the modules of the plurality of modules, e.g., modules of the launching engine 202, interaction engine 204, mapping engine 206, or software application 208.
  • the modules stored in the computer-readable media 228 can include instructions that, when executed by the one or more processing units 224, cause the one or more processing units 224 to perform operations described below. Examples of modules in computer-readable media 228 are discussed below.
  • Computer-readable media 228 can also include an operating system, e.g., operating system 114.
  • computer-readable media 228 includes a data store 230.
  • data store 230 can include data storage, structured or unstructured, such as a database (e.g., a Structured Query Language, SQL, or NoSQL database) or data warehouse.
  • data store 230 can include a corpus or a relational database with one or more tables, arrays, indices, stored procedures, and so forth to enable data access.
  • Data store 230 can store data for the operations of processes, applications, components, or modules stored in computer-readable media 228 or computer instructions in those modules executed by processing units 224.
  • the data store can store computer program instructions (omitted for brevity), e.g., instructions corresponding to apps, to processes described herein, or to other software executable by processing units 224, a mapping store 232, which can represent the mapping store 138, or any combination thereof.
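  • A minimal sketch of how a mapping store such as mapping store 232 could be kept in the data store 230 as a SQLite table; the table name, column names, and sample values are assumptions for illustration.

```python
import sqlite3

conn = sqlite3.connect("mapping_store.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS uia_app_mappings (
        accessory_id   TEXT PRIMARY KEY,  -- identifier read from the accessory's tag
        app_indication TEXT NOT NULL,     -- e.g., a filename, package name, or ProgID
        signature      BLOB               -- optional cryptographic signature of the mapping
    )
""")
conn.execute(
    "INSERT OR REPLACE INTO uia_app_mappings VALUES (?, ?, ?)",
    ("5258ac20-5173-11e5-9f4f-0002a5d5c51b", "com.example.paintapp", None),
)
conn.commit()
conn.close()
```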
  • the computer program instructions include one or more program modules executable by the processing units 224, e.g., program modules of an app.
  • processing units 224 can access the modules on the computer-readable media 228 via a bus 234, which can represent bus 112, FIG. 1.
  • I/O interface 222 and communications interface 210 can also communicate with processing units 224 via bus 234.
  • the launching engine 202 stored on computer-readable media 228 can include one or more modules, e.g., shell modules, or API modules, which are illustrated as an accessory-identifying module 236, an app-determining module 238, and a spawning module 240.
  • the interaction engine 204 stored on computer-readable media 228 can include one or more modules, e.g., shell modules, or API modules, which are illustrated as an object-identifying module 242, a force-detecting module 244, and a force-analysis module 246.
  • the mapping engine 206 stored on computer-readable media 228 can include one or more modules, e.g., shell modules, or API modules, which are illustrated as a mapping module 248.
  • the software application 208 stored on computer-readable media 228 can include one or more modules, e.g., shell modules, or API modules, which are illustrated as a user-interface (UI) presentation module 250, a representation-determining module 252, and an object-presentation module 254.
  • the launching engine 202, interaction engine 204, mapping engine 206, or software application 208 can be embodied in a cloud service, or on a computing device 102 controlled by a user, or any combination thereof. Module(s) stored on the computer- readable media 228 can implement inter-process communications (IPC) functions or networking functions to communicate with servers or other computing devices 102 or 134.
  • the number of modules can be higher or lower, and modules of various types can be used in various combinations.
  • functionality described associated with the illustrated modules can be combined to be performed by a fewer number of modules or APIs or can be split and performed by a larger number of modules or APIs.
  • the object-identifying module 242 and the force-detecting module 244, or the accessory-identifying module 236 and the force-detecting module 244 can be combined in a single module that performs at least some of the example functions described below of those modules.
  • the accessory-identifying module 236 and the object-identifying module 242 can be combined in a single module that performs at least some of the example functions described below of those modules.
  • the app-determining module 238 and the mapping module 248 can be combined in a single module that performs at least some of the example functions described below of those modules.
  • computer-readable media 228 can include a subset of modules 236, 238, 240, 242, 244, 246, 248, 250, 252, or 254.
  • FIG. 3 is a perspective view illustrating an example computing device 300 and example uses thereof.
  • the computing device 300 can represent computing devices 102, 134, or 200.
  • Computing device 300 can additionally or alternatively represent a peripheral, e.g., a user-operable input device 216, communicatively connectable with a computing device 102, 134, or 200.
  • Computing device 300 includes the force sensor 124 (FIG. 1; omitted here for brevity) having a sensing surface 302.
  • the UIA 126 is arranged over the sensing surface 302 in this example; however, other arrangements can be used.
  • the computing device 300 includes mounting features 304 configured to retain the UIA 126 in operational relationship with the sensing surface 302.
  • the UIA 126 can include mating features (omitted for brevity) configured to attach, affix, or otherwise hold to the mounting features 304.
  • the mounting features 304 or the mating features can include, e.g., one or more magnets, pins, sockets, snaps, clips, buttons, zippers, adhesives (permanent, semipermanent, or temporary), nails, screws, bolts, studs, points, or kinematic mounts or mating features of any of those.
  • the UIA 126 can include batteries or other electronics. In some examples, the UIA 126 can omit batteries or other power supplies and operate, e.g., based on inductively-coupled power transfers from the computing device 300.
  • In the illustrated example, the force sensor 124 includes a plurality of sensing elements 306 distributed across the sensing surface 302, graphically represented as ellipses.
  • the sensing elements 306 can be distributed regularly, e.g., in a grid arrangement (as illustrated), irregularly, randomly, or according to any other pattern or arrangement.
  • the sensing elements can include, e.g., resistive or capacitive touch sensors or strain gauges.
  • the UIA 126 includes the outlines 308(1)-308(3) (individually or collectively referred to herein with reference 308) of three abutting irregular pentagons.
  • This UIA 126 can be used, e.g., to assist young children in developing fine-motor skills or learning how to arrange shapes.
  • the UIA 126 can include outlines (or other tactile or visual indicia, and likewise throughout this paragraph) of positions or orientations of objects or groups of objects.
  • Such a UIA 126 can be used, e.g., to teach shape identification, sorting, or arranging, or relationships between entities represented by the objects.
  • the UIA 126 can include the relative positions of the planets in a depiction of the solar system, and the objects can represent the planets.
  • the UIA 126 can include the outline of a completed tangram puzzle but not the inner lines showing how the tangram pieces should be oriented.
  • the UIA 126 can include a schematic outline of a molecule, e.g., a DNA double helix, and the objects can include representations of portions of the molecule. These and other examples can permit using the computing device 300 with the UIA 126 as an educational tool.
  • the UIA 126 can include outlines or other indicia of tools or input interfaces such as, e.g., a painter's palette, a painting or drawing canvas, a music controller such as a MIDI (Musical Instrument Digital Interface) controller, a piano, an organ, a drum pad, a keyboard such as a QWERTY keyboard, a video game controller, or a multiple-choice answer form (e.g., having respective buttons for choices "A,” "B,” and so on).
  • the UIA 126 can include combinations of any of the above examples. In some examples, the UIA 126 can include multiple regions and at least two of the regions can have indicia corresponding to respective, different ones of the above examples.
  • the UIA 126 can correspond with a design environment.
  • the software application 208 can present representations of one or more objects arranged over the UIA 126. For example, multiple gears can be placed on the UIA 126, and the software application 208 can present an animated representation showing how the gears would interlock and interact while turning.
  • objects representing parts of a robot can be placed on the UIA 126, and the software application 208 can present an animated representation showing the robot having those parts assembled according to the spatial relationships between the objects on the UIA 126. As the user moves objects on the UIA 126, the software application 208 can reconfigure the representation of the object accordingly.
  • building blocks can be placed on the UIA 126, and the software application 208 can present a representation showing how the blocks can stack or interlock to form a structure.
  • data of the magnitude of force can be used to determine whether multiple objects are stacked on top of each other.
  • a database of object attributes can be queried with the object identifier to determine a 3-D model or extent of the object or a representation corresponding to the object.
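  • As a sketch of the stacking determination, the measured force at a location can be compared against a per-object weight looked up by object identifier; the attribute table and weights below are illustrative assumptions.

```python
# Hypothetical object-attribute database keyed by object identifier.
OBJECT_ATTRIBUTES = {
    "block-small": {"weight_n": 0.30, "model": "block_small.stl"},
    "block-large": {"weight_n": 0.90, "model": "block_large.stl"},
}

def estimate_stack_count(object_id, measured_force_n):
    """Estimate how many objects of this type are stacked at one location."""
    unit_weight = OBJECT_ATTRIBUTES[object_id]["weight_n"]
    return max(1, round(measured_force_n / unit_weight))

# Roughly three small blocks' worth of force measured at one spot:
print(estimate_stack_count("block-small", 0.92))  # -> 3
```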
  • objects 310 and 312, which can represent object 128, FIG. 1, are illustrated over the corresponding outlines on the UIA 126.
  • objects 310 and 312 include respective tags 314 and 316, e.g., RFID or NFC tags, carrying identifiers of the respective objects 310 and 312.
  • objects 310 and 312 have identifiers printed on surfaces thereof, e.g., in the form of barcodes or specific colors or patterns.
  • when the UIA 126 or one of the objects 310 and 312 is in operational proximity to the wireless interrogator 122, the wireless interrogator 122 can determine a respective identifier of that UIA 126 or that one of the objects 310 and 312.
  • a force can be exerted against the sensing surface 302, e.g., because of gravity or a user's pressing the object 310 or 312 against the sensing surface 302.
  • the force sensor 124 can detect and provide information of this force, as discussed below with reference to the force-analysis module 246, FIG. 4.
  • FIG. 4 is a dataflow diagram 400 illustrating example interactions between the modules illustrated in FIG. 2, e.g., during application selection and operation. Some of the modules make use of a mapping store 138 or 232, e.g., holding mappings between identifiers of UIAs 126 and software applications such as software application 208, FIG. 2.
  • the accessory-identifying module 236 can be configured to, using the wireless interrogator 122, detect a first identifier corresponding to the UIA 126 in operational proximity to the wireless interrogator 122.
  • the accessory-identifying module 236 can read the identifier of the UIA 126 from an RFID or NFC tag or another storage device in, on, or associated with the UIA 126.
  • the accessory-identifying module 236 can be configured to detect (or attempt to detect) the first identifier, e.g., automatically when or after the UIA 126 enters operational proximity to the wireless interrogator 122 (e.g., a sensing range of the wireless interrogator 122), in response to an actuation of a user-input control such as a "connect" or "activate" button, or on a schedule (e.g., every 0.5 s).
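  • A short sketch of the scheduled-detection option (polling every 0.5 s); read_accessory_id is a hypothetical interrogator call assumed to return None when no tagged accessory is in range.

```python
import time

def poll_for_accessory(interrogator, interval_s=0.5):
    """Poll the wireless interrogator until a tagged user-input accessory is detected."""
    while True:
        accessory_id = interrogator.read_accessory_id()
        if accessory_id is not None:
            return accessory_id
        time.sleep(interval_s)
```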
  • the app-determining module 238 can be configured to determine a software application 208 corresponding to the first identifier. For example, the app-determining module can determine the software application 208 using the stored mapping(s) in mapping store 232 of identifiers to software applications, e.g., by querying the mapping store 232 with the first identifier.
  • the app-determining module 238 can be configured to look up software applications in a centralized location, e.g., as is done to map file extensions to applications in MICROSOFT WINDOWS using the Registry (in which HKEY_CLASSES_ROOT has entries mapping extensions to programmatic identifiers, ProgIDs, and entries mapping ProgIDs to Component Object Model, COM, class identifiers, CLSIDs, and entries mapping CLSIDs to filenames of software applications), or as is done to map USB Vendor ID (VID)/Product ID (PID) pairs to device drivers using INF driver-information files.
  • the app-determining module 238 can be configured to determine the software application 208 corresponding to the first identifier by transmitting the first identifier via the communications interface 210, FIG. 2.
  • the app-determining module 238 can transmit the first identifier to a mapping module 248, e.g., executing on a computing device 134, FIG. 1.
  • the mapping module 248 can determine the software application 208 using the mapping(s) stored in mapping store 232, and transmit an indication of the determined software application 208.
  • the app-determining module 238 can then receive the indication of the determined software application 208 via the communications interface 210.
  • the mapping store 232 can store data indicating which software application corresponds to the first identifier.
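  • A sketch of the app-determining step: query a local mapping store first, then fall back to a remote mapping module over the network; the SQLite schema matches the earlier sketch, and the URL and JSON shape are assumptions for illustration.

```python
import json
import sqlite3
import urllib.request

def determine_application(accessory_id,
                          db_path="mapping_store.db",
                          remote_url="https://mapping.example.com/lookup"):
    """Return an indication of the software application mapped to the first identifier."""
    # Local lookup against the mapping store.
    conn = sqlite3.connect(db_path)
    row = conn.execute(
        "SELECT app_indication FROM uia_app_mappings WHERE accessory_id = ?",
        (accessory_id,),
    ).fetchone()
    conn.close()
    if row is not None:
        return row[0]

    # Remote fallback: transmit the first identifier and receive an indication
    # of the determined software application.
    with urllib.request.urlopen(f"{remote_url}?id={accessory_id}") as resp:
        return json.load(resp).get("app_indication")
```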
  • the first identifier can include one or more of, e.g., a universally unique identifier (UUID) or globally unique identifier (GUID); a uniform resource locator (URL), uniform resource name (URN), uniform resource identifier (URI), Digital Object Identifier (DOI), domain name, or reverse domain name; a unique hardware identifier such as an Ethernet Media Access Control (MAC) address, a 1-WIRE device serial number, or a GS1 Electronic Product Code (EPC), e.g., in URI form or stored in EPC Binary Encoding on, e.g., an RFID tag; a platform-specific app identifier (e.g., a GOOGLE PLAY package name); or a public key or public
  • the indication data can include one or more of, e.g., a name, filename, file path, or registry path; a MICROSOFT WINDOWS ProgID, CLSID, or COM interface ID (IID), or a corresponding identifier in Common Object Request Broker Architecture (CORBA) or another object system; or an identifier of any of the types listed above with reference to the first identifier.
  • the first identifier can be a UUID.
  • the first identifier can be stored as a 128-bit value, e.g., the hexadecimal value 0x5258AC20517311E59F4F0002A5D5C51B, or as a human-readable string, e.g., "{5258ac20-5173-11e5-9f4f-0002a5d5c51b}".
  • the mapping store 232 can include a mapping associating the 128-bit first identifier, in either form or another form, with a specific software program, e.g., the reverse domain name "com.microsoft.exchange.mowa" to identify the MICROSOFT OUTLOOK WEB APP for ANDROID.
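  • A brief sketch of the two identifier forms in the example above, using Python's uuid module; the mapping target is the reverse domain name quoted above.

```python
import uuid

raw_value = 0x5258AC20517311E59F4F0002A5D5C51B   # 128-bit form
first_identifier = uuid.UUID(int=raw_value)       # human-readable form
assert str(first_identifier) == "5258ac20-5173-11e5-9f4f-0002a5d5c51b"

# Mapping from the identifier (string form) to an app indication.
mapping = {str(first_identifier): "com.microsoft.exchange.mowa"}
print(mapping[str(first_identifier)])
```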
  • mappings stored in the mapping store 232 can be cryptographically signed, e.g., using signatures according to the Public-Key Cryptography Standard (PKCS) #1 or another public-key cryptosystem.
  • For example, the data of a mapping can be hashed, e.g., using the Secure Hash Algorithm-256 bit (SHA-256), and the hash can then be encrypted with a predetermined private key.
  • the encrypted hash can be stored in the mapping store 232 in association with the hashed mapping.
  • the app-determining module 238 can select only software applications indicated in mappings having valid cryptographic signatures. For example, the app-determining module 238 can locate a candidate one of the mappings in the mapping store 232 based at least in part on the identifier of the UIA 126. The app-determining module 238 can decrypt the signature of the candidate mapping using a predetermined public key and compare the decrypted hash to the hash of the candidate mapping. If the hashes match, the candidate mapping was provided by a holder of the private key corresponding to the predetermined public key. In this situation, the app-determining module 238 can determine that the candidate mapping is valid and that the indicated software application 208 can be used.
  • cryptographically signed mappings can improve system security by reducing the chance that a mapping for, e.g., a malware program or Trojan horse will be selected by the app-determining module 238.
  • the app-determining module 238 can provide a warning, e.g., an "are you sure?" prompt, before selecting a software application indicated in a mapping not having a valid cryptographic signature.
  • determining a software application corresponding to the first identifier can include selecting one of the mappings, e.g., in the mapping store 232, corresponding to the first identifier. A cryptographic signature of the selected one of the mappings can then be verified. If the cryptographic signature is not verified, a different one of the mappings can be selected from the mapping store 232, a warning can be presented, e.g., via a user interface, or processing can terminate without determining an application.
  • the spawning module 240 or the mapping module 248 can perform these or other functions of the app-determining module 238 related to verifying cryptographic signatures of mappings, or any two or more of the spawning module 240, the mapping module 248, and the app-determining module 238 can divide these functions between them.
  • the spawning module 240, in response to verification of the cryptographic signature, can execute the software application indicated in the verified mapping.
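  • A minimal sketch of the signing and verification flow described above, using the third-party Python "cryptography" package with RSA PKCS #1 v1.5 and SHA-256 (one possible choice; the serialized mapping and key handling here are hypothetical and not part of the disclosure):

```python
# pip install cryptography
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Hypothetical serialized mapping: first identifier -> application identifier.
mapping_bytes = b"5258ac20-5173-11e5-9f4f-0002a5d5c51b=com.microsoft.exchange.mowa"

# The mapping distributor hashes the mapping (SHA-256) and encrypts the hash
# with its private key; sign() performs both steps.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
signature = private_key.sign(mapping_bytes, padding.PKCS1v15(), hashes.SHA256())

def mapping_is_valid(mapping: bytes, sig: bytes, public_key) -> bool:
    """Verify a candidate mapping against the predetermined public key."""
    try:
        public_key.verify(sig, mapping, padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False

public_key = private_key.public_key()
print(mapping_is_valid(mapping_bytes, signature, public_key))                # True
print(mapping_is_valid(mapping_bytes + b"tampered", signature, public_key))  # False
```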
  • the spawning module 240 can be configured to execute the determined software application 208.
  • the spawning module 240 can include a loader, such as a relocating or dynamic-linking loader, configured to read processor-executable instructions from a disk image of the determined software application 208 and place those instructions in main memory to be executed.
  • the spawning module 240 can be configured to determine whether the determined software application 208 is available to be executed. Applications can be available if, e.g., they are loaded on local storage of computing device 102 or on storage accessible to computing device 102 via a network. In some examples, if the determined software application 208 is not available to be executed, the spawning module 240 can execute an installation package of the determined software application 208 to make the determined software application 208 available, and can then execute the determined software application 208. In some examples, if the determined software application 208 is not available to be executed, the spawning module 240 can execute a user interface configured to permit the user to locate, download, or purchase the determined software application 208.
  • the user interface is configured to, when executed, present an indication of the determined software application 208; receive payment information; and, at least partly in response to the received payment information (e.g., based at least in part on exchanging payment and authorization data with a payment server), download the determined software application.
  • the spawning module 240 can then execute the determined software application 208 once the determined software application 208 is available. This can permit, e.g., distributing a set of mappings, e.g., via automatic updates to operating system 114.
  • the mappings can be distributed without regard to which applications are installed on a particular computer. This can provide increased user efficiency, since a user employing a new UIA 126 for the first time can be automatically prompted to download the determined software application 208. This can save network bandwidth and user time compared to requiring the user to manually locate the determined software application 208.
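  • One possible sketch (Python; the command names are hypothetical) of the availability check, installation, and launch flow described above:

```python
import shutil
import subprocess
from typing import Optional, Sequence

def ensure_and_launch(app_command: str,
                      installer_command: Optional[Sequence[str]] = None) -> subprocess.Popen:
    """Launch app_command, first running its installer if it is not available."""
    if shutil.which(app_command) is None:
        if installer_command is None:
            raise FileNotFoundError(f"{app_command} is not installed and no installer was given")
        # Run the installation package (or a store/download UI) and wait for it.
        subprocess.run(installer_command, check=True)
    # Execute the determined software application once it is available.
    return subprocess.Popen([app_command])

# Hypothetical usage: install "drawpad" via its installer if needed, then run it.
# ensure_and_launch("drawpad", installer_command=["./drawpad-setup", "--silent"])
```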
  • the object-identifying module 242 can be configured to, using the wireless interrogator 122, detect a second identifier corresponding to an object in operational proximity to the wireless interrogator 122.
  • the object-identifying module 242 can detect the second identifier in any of the ways for detecting identifiers described above with reference to the accessory-identifying module 236, e.g., periodically polling for tags using an RFID reader.
  • the force-detecting module 244 can be configured to detect a force exerted against the sensing surface 302 of the force sensor 124 by the object. This can be done, e.g., as discussed above with reference to FIGS. 2 and 3.
  • the object-identifying module 242 or the force-detecting module 244 can be configured to provide information of the second identifier and information of the detected force, e.g., to the software application 208 or to the force-analysis module 246.
  • the information can include, e.g., spatial data of the object (e.g., the object 310 or 312, FIG. 3).
  • the information can be transmitted via a kernel-process or interprocess communication channel such as a socket, pipe, or callback, or can be retrieved by the software application via an API call.
  • the force sensor 124 can be configured to detect spatially-varying forces across the sensing surface, e.g., using a plurality of sensing elements distributed across the sensing surface.
  • the information of the detected force can include information of the spatially-varying forces.
  • the force-analysis module 246 can be configured to determine a location of the object based at least in part on the detected spatially-varying forces. The force-analysis module 246 can then provide to the software application 208 information of the determined location.
  • the spatial data of the object 310 or 312 can include location data of the object, e.g., with respect to the sensing surface 302 or the UIA 126.
  • bitmap data of magnitudes of the spatially-varying forces can be processed using object- or feature-detection image-processing algorithms such as edge tracing, gradient matching, the Speeded Up Robust Features (SURF) feature detector, or (e.g., for a single object) location-weighted summing. Once a shape is located, the centroid can be determined using location-weighted summing over the pixels of the bitmap data considered to lie within the shape.
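  • A minimal NumPy sketch of location-weighted summing over a force bitmap (the noise floor and the toy 5x5 bitmap are hypothetical) follows:

```python
import numpy as np

def centroid_of_forces(force_map: np.ndarray, noise_floor: float = 0.05):
    """Return the (row, col) centroid of force magnitudes above a noise floor."""
    weights = np.where(force_map > noise_floor, force_map, 0.0)
    total = weights.sum()
    if total == 0.0:
        raise ValueError("no force above the noise floor")
    rows, cols = np.indices(force_map.shape)
    return (rows * weights).sum() / total, (cols * weights).sum() / total

# Toy 5x5 force bitmap with one pressed region covering rows 1-2, columns 2-3.
force_map = np.zeros((5, 5))
force_map[1:3, 2:4] = 1.0
print(centroid_of_forces(force_map))  # (1.5, 2.5)
```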
  • the force-analysis module 246 can be configured to determine a shape of the object based at least in part on the detected spatially-varying forces. The force-analysis module 246 can then provide to the software application 208 information of the determined shape.
  • the spatial data of the object can include shape data of the object, e.g., coordinates of an outline of the shape or a raster map of force readings corresponding to the shape(s) of surface(s) of the object exerting force against the sensing surface 302.
  • shapes can be detected in bitmap force-magnitude data as noted above with respect to shape location.
  • Pixels considered to lie within the shape can then be outlined, e.g., by fitting, e.g., cubic or quadratic Bezier curves or other polynomials to the edges defined by borders between those pixels and pixels not considered to lie within the shape.
  • the Sobel, Canny, or other edge detectors can be used to locate edges.
  • the potrace, Autotrace, or other algorithms can be used to trace, e.g., edges or centerlines of pixel regions to provide the shape information.
  • determined bitmap or outline data of a shape can be compared to a catalog of known shapes to provide shape data such as "triangle" or "square."
  • information associated with the identifier of an object can be used together with the force data for that object to determine where that object is.
  • the determined shapes of the objects can be correlated with the identifiers, e.g., using a database lookup or a shape catalog, to determine spatial data associated with each of the shape identifiers.
  • the spatial data can include one or more bitmap representations of the detected spatially-varying forces or portions thereof.
  • the spatial data can include a raw bitmap (rasterized) representation of the spatially-varying forces, e.g., detected by a regular grid of force sensors.
  • the bitmap can represent the entire active area of the force sensor 124 or only a portion thereof, e.g., a portion in which force greater than a threshold or noise level is present, or a bounding polygon of such a portion.
  • the spatial data can additionally or alternatively include values calculated from force data, e.g., the size or center of a bounding box of a portion in one or more dimensions, the location in one or more dimensions of the highest force in a portion, or the average magnitude or direction of force in a portion.
  • the spatial data can include one or more vector representations of the detected spatially-varying forces or portions thereof.
  • the spatial data can include coordinates of contours of constant force magnitude, coordinates of outlines of areas experiencing a force above a threshold, or coordinates or axis values of centroids or bounding boxes of such areas.
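  • For illustration, a short NumPy sketch computing derived values of the kind mentioned above (bounding box, peak location, and average force magnitude); the threshold value is hypothetical:

```python
import numpy as np

def spatial_summary(force_map: np.ndarray, threshold: float = 0.05) -> dict:
    """Summarize a force bitmap: bounding box, peak location, mean magnitude."""
    active = force_map > threshold
    if not active.any():
        return {}
    rows, cols = np.nonzero(active)
    peak_row, peak_col = np.unravel_index(np.argmax(force_map), force_map.shape)
    return {
        "bounding_box": (int(rows.min()), int(cols.min()), int(rows.max()), int(cols.max())),
        "peak_location": (int(peak_row), int(peak_col)),
        "mean_force": float(force_map[active].mean()),
    }
```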
  • the UI-presentation module 250 of the software application 208 can be configured to present for display a user interface.
  • the user interface can be presented for display on a display 214 of user-interface hardware 212, or on another display.
  • the user interface can have at least one content element or presentation element determined based at least in part on the identifier of the UIA 126, e.g., detected by the accessory-identifying module 236.
  • content elements can include, e.g., text box controls, labels, images, and other areas or representations conveying content not predetermined by the operating system 114 or the software application 208.
  • presentation elements can include, e.g., window borders, menus, shell controls, color schemes, sounds, and other areas or representations conveying structural or other information predetermined by the operating system 114 or the software application 208, or constant from the point of view of the software application 208.
  • the user interface can include one or more screens or windows holding, e.g., content elements or presentation elements; different screens, windows, or elements can be visible at different times. For example, the user interface can show and hide different sets of screens, windows or elements at different times, e.g., in response to user inputs via the UIA 126. Further examples of user interfaces are discussed below.
  • references to presenting user interfaces and representations "for display" are examples and not limiting.
  • User interfaces and representations can alternatively or additionally be presented audibly, e.g., via speakers or headphones, or in a haptic or tactile manner, e.g., using force-feedback devices such as a SPACEBALL or using refreshable braille displays.
  • the UI-presentation module 250 of the software application 208 can be configured to present for display the user interface including a plurality of content elements in an arrangement based at least in part on the detected identifier of the UIA 126.
  • the sequence, relative or absolute sizes, or relative or absolute positions of the content elements can be determined based at least in part on the identifier of the UIA 126.
  • the software application 208 can include a table, program module, or other data or instructions mapping the identifier of the UIA 126 to a list of relative positions of the content elements.
  • the list of relative positions can include an indication of which user-interface screen or window should hold each content element.
  • the order or type of questions on a test or survey for a particular user can be determined using the list of relative positions corresponding to the identifier of that user's UIA 126.
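  • A minimal sketch of such a table-driven arrangement (the UIA identifiers and questions shown are hypothetical):

```python
# Hypothetical table mapping UIA identifiers to the order of content elements.
QUESTION_ORDER_BY_UIA = {
    "uia-survey-form-a": ["q1", "q2", "q3", "q4"],
    "uia-survey-form-b": ["q3", "q1", "q4", "q2"],
}

def arrange_questions(uia_identifier: str, questions: dict) -> list:
    """Return the content elements in the order mapped to the detected UIA."""
    order = QUESTION_ORDER_BY_UIA.get(uia_identifier, sorted(questions))
    return [questions[name] for name in order]

questions = {"q1": "2 + 2 = ?", "q2": "Capital of France?",
             "q3": "Chemical symbol for water?", "q4": "5 x 3 = ?"}
print(arrange_questions("uia-survey-form-b", questions))
```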
  • the representation-determining module 252 of the software application 208 can be configured to determine a representation of an object.
  • the representation can be based at least in part on an identifier of the object, e.g., detected by the object-identifying module 242.
  • the representation can include a two-dimensional (2-D) or three-dimensional (3-D) model of the object determined, e.g., by querying a database of models using the identifier of the object.
  • the database can be stored locally on the computing device 102 running the software application 208, or can be stored on a computing device 134.
  • the object-presentation module 254 of the software application 208 can be configured to present for display the representation of the object.
  • the presented representation can be arranged in the presented user interface based at least in part on spatial data of the object.
  • the spatial data of the object can include at least location data of the object or shape data of the object.
  • the representation can be positioned based on the location data of the object.
  • the representation can be rotated or flipped based on the shape data of the object. For example, in FIG. 3, the objects corresponding to outlines 308(1) and 308(3) are identical, but one is flipped along the long axis of the object compared to the other.
  • the shape data can be used to determine which side of the object is against the force sensor.
  • the representation can then be oriented accordingly.
  • the representation can be presented with colors, shades, outlines, or other marks or attributes corresponding to the magnitude or direction of the force. For example, as the user presses object 310 more forcefully into the sensing surface 302, the representation can change color from green to yellow or red.
  • the object-presentation module 254 can be configured to present the representation of the object in a three-dimensional virtual environment of the user interface.
  • the three-dimensional virtual environment can be a subset of a four- or higher-dimensional virtual environment.
  • a geometry of the representation of the object in a first dimension or set of dimensions, e.g., an extent or a position of the representation of the object in the first dimension or set of dimensions, is determined based at least in part on the spatial data of the object.
  • a geometry (e.g., an extent or a position) of the representation of the object in a second dimension or set of dimensions is determined based at least in part on the identifier of the object.
  • the first and second dimensions can be different, e.g., orthogonal or otherwise nonparallel, or the first set of dimensions can differ in at least one dimension from the second set of dimensions.
  • the position of the representation of the first object in X, Y, or both can be determined based on the location of the object 310 with respect to the UIA 126.
  • the height of the representation, e.g., the Z-axis extent of the representation, can be determined based on an identifier of the object. For example, the height can be retrieved from a datastore mapping identifiers to heights. This can permit modeling three-dimensional environments with the force sensor 124 providing data in only two dimensions, as illustrated in the sketch below.
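  • A minimal sketch of this split between sensed X/Y position and identifier-derived height (the object identifiers and heights are hypothetical):

```python
# Hypothetical datastore mapping object identifiers to model heights (Z extents).
HEIGHT_BY_ID = {"block-short": 10.0, "block-tall": 40.0}

def place_representation(object_id: str, location_xy: tuple) -> dict:
    """X/Y come from the force-sensor spatial data; Z extent comes from the identifier."""
    x, y = location_xy
    return {"x": x, "y": y, "height": HEIGHT_BY_ID.get(object_id, 1.0)}

print(place_representation("block-tall", (120.0, 75.5)))
```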
  • the object-presentation module 254 can be configured to determine that the spatial data of the object correspond to a spatial target.
  • the software application 208 can store data of the locations of the outlines 308 of the pentagons on the UIA 126.
  • the spatial target can correspond to the location of outline 308(1), and the object can be object 310.
  • the object-presentation module 254 can determine that the object 310 has been placed according to the outline 308(1).
  • the object-presentation module 254 can present for display a success indicator, e.g., a highlight, outline, star, or other indicator that the object 310 has been correctly positioned with respect to the UIA 126.
  • the UI-presentation module 250 or the object-presentation module 254 can be configured to present for display a goal representation arranged in the user interface based at least in part on the spatial target. For example, an outline or shaded area shaped as the object 310 or the corresponding outline 308 can be displayed on screen.
  • the goal representation can be positioned with respect to the representation of the object in a way corresponding to the spatial relationship between the object 310 and the corresponding outline 308. For example, when the object 310 is to the right of the outline 308(1), the representation of the object can be presented arranged to the right of the goal representation.
  • the representations of the object or the goal can be presented with colors or other attributes determined based at least in part on, e.g., the distance between the object and the spatial target, or the degree of similarity or difference between an orientation of the object and an orientation of the spatial target. For example, if the object 310 is spaced apart from, or rotated or flipped with respect to, the spatial target, the goal representation can be highlighted more brightly than when the object 310 is close to or oriented as the spatial target.
  • Some examples present representations of multiple objects 310, 312. Functions described above with reference to spatial targets, goal representations, and success indicators can be performed for any number of objects. For example, a success indicator can be displayed when a plurality of objects have been positioned correctly with respect to respective spatial targets, whether or not goal indicator(s) were displayed for any particular one(s) of the objects.
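  • One possible sketch of the distance-based highlighting of a goal representation described above (the distance bands are hypothetical):

```python
import math

def goal_highlight(object_xy: tuple, target_xy: tuple, near: float = 5.0) -> str:
    """Return a brighter highlight the farther the object is from the spatial target."""
    distance = math.dist(object_xy, target_xy)
    if distance <= near:
        return "dim"       # essentially on target
    if distance <= 4 * near:
        return "medium"
    return "bright"

print(goal_highlight((12.0, 8.0), (10.0, 8.0)))  # dim
```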
  • the representation-determining module 252 can be configured to determine that a second object, e.g., object 312, is in a selected spatial relationship with the object, e.g., object 310. The determination can be made based at least in part on the spatial data of the object and spatial data of the second object, e.g., on respective location data.
  • the spatial data of the second object can include, e.g., location data of the second object or shape data of the second object.
  • the selected spatial relationship can be represented, e.g., in data store 230, and can include, e.g., proximity of shapes, edges or vertices; proximity of defined connection points; similarity of orientation or of position along one axis; or coaxial positioning.
  • the representation-determining module 252 can determine a representation of the second object.
  • the representation can be based at least in part on an identifier of the second object.
  • the object-presentation module 254 can present for display the representation of the second object.
  • the representation of the second object can be arranged in the user interface, e.g., based at least in part on the spatial data of the object and the spatial data of the second object.
  • the respective representations of objects 310 and 312 can be presented together in the user interface, or with an indication that they are linked or correctly placed with respect to each other.
  • Such an indication can include, e.g., a green or blue highlight indicating where the representations of objects 310 and 312 abut.
  • Example functions described herein with respect to the object and the second object can be applied for any number of objects.
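  • A minimal sketch of testing one such selected spatial relationship between two objects (proximity of centers plus similarity of orientation; the tolerances are hypothetical):

```python
def in_selected_relationship(obj_xy, second_xy, obj_angle, second_angle,
                             max_gap=5.0, max_angle_diff=10.0) -> bool:
    """True when the centers are close and the orientations are similar."""
    dx, dy = obj_xy[0] - second_xy[0], obj_xy[1] - second_xy[1]
    close_enough = (dx * dx + dy * dy) ** 0.5 <= max_gap
    diff = abs(obj_angle - second_angle) % 360.0
    aligned = min(diff, 360.0 - diff) <= max_angle_diff
    return close_enough and aligned

print(in_selected_relationship((10, 10), (12, 11), 90.0, 95.0))  # True
```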
  • the UI-presentation module 250 is responsive to the UIA 126 or type of UIA 126 detected, as graphically represented by the dashed arrow (the dashes are only for clarity).
  • individual UIAs 126 are associated with specific user interfaces.
  • individual types of UIAs 126, e.g., USB device classes or other groupings of UIAs, can likewise be associated with specific user interfaces.
  • a UIA 126 having a 12-key piano keyboard can correspond to a first user interface of an application, and a UIA 126 having a 24-key piano keyboard can correspond to a second user interface of that application.
  • the application can, e.g., remove controls or UI features not relevant to the current UIA 126, improving operational efficiency, e.g., of users or other entities, by reducing screen clutter and by reducing the amount of time users have to search to locate a desired feature.
  • Relationships between UIAs 126, types of UIAs 126, and user interfaces can be stored in any combination of memory in the UIA 126, memory in the computing device 102, or remote storage such as that hosted by the computing device 134.
  • the UI-presentation module 250, in response to detection by the accessory-identifying module 236 of a first user-input accessory, can present for display a first user interface of an application.
  • the first user interface can include at least one content element or presentation element determined based at least in part on a first identifier corresponding to the first user-input accessory.
  • in response to detection of a second, different user-input accessory, the UI-presentation module 250 can present for display a second, different user interface of the application.
  • the second user interface can have at least one content element or presentation element determined based at least in part on a second identifier corresponding to the second user-input accessory.
  • the UI-presentation module 250 can retrieve a UI layout or a set of UI resources based at least in part on the identifier of the UIA 126, and arrange UI elements according to the retrieved layout or resources.
  • the two user interfaces can include a basic interface and a "pro" interface.
  • the first user interface can include one or more user-interface controls.
  • the second user interface can include the one or more user-interface controls from the first user interface, and one or more additional user-interface controls.
  • the first user interface can include a text-editing control and buttons for bold, italic, and underline.
  • the second user interface can include the text-editing control and the buttons for bold, italic, and underline, and also buttons to invoke complex paragraph- and page-formatting dialogs.
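  • A minimal sketch of selecting between such control sets based on the detected UIA identifier (the identifiers and tier names are hypothetical):

```python
BASIC_CONTROLS = ["text_editor", "bold", "italic", "underline"]
EXTENDED_CONTROLS = BASIC_CONTROLS + ["paragraph_format_dialog", "page_format_dialog"]

# Hypothetical mapping from UIA identifiers to interface tiers.
UI_TIER_BY_UIA = {"uia-basic-0001": "basic", "uia-pro-0001": "pro"}

def controls_for_accessory(uia_identifier: str) -> list:
    """Select the user-interface controls to present for the detected UIA."""
    tier = UI_TIER_BY_UIA.get(uia_identifier, "basic")
    return EXTENDED_CONTROLS if tier == "pro" else BASIC_CONTROLS

print(controls_for_accessory("uia-pro-0001"))
```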
  • the two user interfaces can include a limited-functionality interface and a full-functionality (or not-as-limited functionality) interface.
  • the first user interface can include one or more user-interface controls presented in a disabled state.
  • the user-interface controls can be grayed out, nonresponsive to events or attempts to actuate them, or fixed in value.
  • the second user interface can include the one or more user-interface controls presented in an enabled state.
  • the controls can be presented in colors used for other active controls, can respond to events, or can be varying in value.
  • the first user interface can include a mixer with four volume sliders that are responsive to user inputs and four volume sliders that are locked at -∞ dB.
  • the second user interface can include all eight volume sliders responsive to user inputs.
  • the two user interfaces can present elements of the user interface in respective, different orders.
  • the first user interface can include one or more content elements presented in a first arrangement, e.g., a first order, sequence, or spatial layout.
  • the second user interface can include the one or more content elements presented in a second, different arrangement.
  • the content elements can include survey questions or test questions, and the questions can be presented in different orders to users with different UIAs 126. This can provide randomization of the population of users answering the questions and reduce bias in the answers due to the ordering of the questions.
  • the order of elements can be stored in a database and the UI-presentation module 250 can query the database with the identifier of the UIA 126.
  • FIG. 5 is a flow diagram that illustrates an example process 500 for selecting and operating an application using a computing device, e.g., computing device 200, FIG. 2.
  • Example functions shown in FIG. 5 and other flow diagrams and example processes herein can be implemented on or otherwise embodied in one or more computing devices 102 or 134, e.g., using software running on such devices.
  • functions shown in FIG. 5 can be implemented in an operating system (OS), a hardware abstraction layer (HAL), or a device-management subsystem.
  • in FIG. 5, the example process 500 is described below with reference to components of environment 100, FIG. 1, processing unit 224 and other components of computing device 200, FIG. 2, or computing device 300, FIG. 3.
  • processing unit 108 can carry out steps of described example processes such as process 500.
  • exemplary methods shown in FIGS. 6, 7, and 8 are also not limited to being carried out by any particularly-identified components.
  • each example flow diagram or process is not intended to be construed as a limitation, and any number of the described operations can be combined in any order and/or in parallel to implement each process.
  • either of blocks 506 and 508 can be performed before the other, or block 508 can be performed before block 504.
  • the operations in each of FIGS. 5, 6, 7, and 8 can be implemented in hardware, software, and/or a combination thereof.
  • the operations represent computer-executable instructions that, when executed by one or more processors, cause one or more processors to perform the recited operations.
  • the operations represent logic functions implemented in circuitry, e.g., datapath-control and finite-state-machine sequencing functions.
  • the accessory-identifying module 236 can wirelessly detect a first identifier corresponding to a user-input accessory, e.g., by reading an RFID tag on the UIA 126. This can be done, e.g., as described above with reference to the wireless interrogator 122, FIG. 1, or the accessory-identifying module 236, FIG. 4.
  • the object-identifying module 242 can wirelessly detect a second identifier corresponding to an object, e.g., by reading an RFID tag on the object. This can be done, e.g., as described above with reference to the wireless interrogator 122, FIG. 1, or the object-identifying module 242, FIG. 4.
  • the force-analysis module 246 can determine spatial data of the object using the force sensor 124.
  • the spatial data can include at least location data of the object or shape data of the object. This can be done, e.g., as discussed above with reference to the force sensor 124 or the force-analysis module 246, FIG. 4.
  • the spawning module 240 can execute a software application corresponding to the first identifier, e.g., by calling CreateProcess() on a Windows system or fork() and exec() on a UNIX system. This can be done, e.g., as discussed above with reference to the spawning module 240, FIG. 4.
  • block 508 can include executing an installation package of the software application or executing a user interface configured to permit the user to locate, download, or purchase the software application. Block 508 can then include executing the software application when the installation, download, or purchase is complete.
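  • A minimal Python sketch of block 508 (the spawned command line is hypothetical); on Windows the subprocess module ultimately calls CreateProcess(), and on UNIX-like systems it uses fork()/exec() or an equivalent:

```python
import subprocess
import sys

def execute_application(command: list) -> subprocess.Popen:
    """Spawn the determined software application as a new child process."""
    return subprocess.Popen(command)

# Example: spawn a trivial child process standing in for the application.
proc = execute_application([sys.executable, "-c", "print('application running')"])
proc.wait()
```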
  • the force-analysis module 246 can provide the spatial data to the software application. This can be done, e.g., using IPC or other techniques discussed above with reference to the force-analysis module 246, FIG. 4.
  • FIG. 6 is a flow diagram that illustrates an example process 600 for operation of an application, e.g., by presenting a representation of a physical object using a computing device, e.g., computing device 200, FIG. 2.
  • the UI-presentation module 250 can present for display a user interface.
  • the user interface can have at least one content element or presentation element determined based at least in part on an identifier of a UIA 126. This can be done, e.g., as described above with reference to the UI-presentation module 250, FIG. 4.
  • block 602 can include presenting a plurality of content elements in an arrangement based at least in part on the identifier of the user-input accessory. This can be done, e.g., as described above with reference to the UI-presentation module 250, FIG. 4.
  • the representation-determining module 252 can determine a representation of an object.
  • the representation can be determined based at least in part on an identifier of the object. This can be done, e.g., as described above with reference to the representation-determining module 252, FIG. 4.
  • the object-presentation module 254 can present for display the representation of the object.
  • the representation can be presented arranged in the user interface based at least in part on spatial data of the object.
  • the spatial data of the object can include at least location data of the object or shape data of the object. This can be done, e.g., as described above with reference to the object-presentation module 254, FIG. 4.
  • block 606 can include presenting the representation of the object in a three-dimensional virtual environment of the user interface.
  • An extent or a position of the representation of the object in a first dimension can be determined based at least in part on the spatial data of the object.
  • An extent or a position of the representation of the object in a second, different dimension can be determined based at least in part on the identifier of the object.
  • the location of the representation can be determined based on the spatial data, and the height of the representation can be retrieved from a database queried by the identifier of the object. This can be done, e.g., as described above with reference to the object-presentation module 254, FIG. 4.
  • FIG. 7 is a flow diagram that illustrates an example process 700 for operating an application based on spatial data of one or more objects, e.g., by presenting a representation of a physical object using a computing device, e.g., computing device 200, FIG. 2.
  • Block 602 can also be used with process 700 but is omitted from FIG. 7 for brevity.
  • Block 604 can be followed by block 606, block 702, or block 710.
  • the dash styles of lines in FIG. 7 are solely for clarity.
  • the force-detecting module 244 or the force-analysis module 246 can receive the spatial data of the object from a force sensor having a plurality of sensing elements distributed across a sensing surface. This can be done, e.g., as described above with reference to the force-detecting module 244 or the force-analysis module 246, FIG. 4.
  • Block 702 can be followed by block 606, which can be followed by block 704 or block 706.
  • the UI-presentation module 250 or the object-presentation module 254 can present for display a goal representation arranged in the user interface based at least in part on a spatial target. This can be done, e.g., as described above with reference to the outlines 308, FIG. 3, or the UI-presentation module 250 or the object-presentation module 254, FIG. 4.
  • the goal representation can be used for, e.g., educational software applications 208 such as shape-sorting applications.
  • the UI-presentation module 250 or the object-presentation module 254 can determine whether the spatial data of the object correspond to a spatial target. If so, the next block can be block 708. This determination can be made whether or not a goal representation was displayed (block 704). For example, a goal representation can be displayed for a shape-sorting application and omitted for a tangram application. This can be done, e.g., as described above with reference to the outlines 308, FIG. 3, or the UI-presentation module 250 or the object-presentation module 254, FIG. 4.
  • the UI-presentation module 250 or the object-presentation module 254 can present for display a success indicator. This can be done, e.g., as described above with reference to the object-presentation module 254, FIG. 4.
  • the UI-presentation module 250 or the representation-determining module 252 can determine whether a second object is in a selected spatial relationship with the object based at least in part on the spatial data of the object and spatial data of the second object. If so, the next block can be block 712. This can be done, e.g., as described above with reference to the representation-determining module 252, FIG. 4.
  • the representation-determining module 252 can determine a representation of the second object, the representation based at least in part on an identifier of the second object. This can be done, e.g., as described above with reference to the representation-determining module 252, FIG. 4.
  • the object-presentation module 254 can present for display the representation of the second object arranged in the user interface based at least in part on the spatial data of the object and the spatial data of the second object.
  • the spatial data of the second object can include at least location data of the second object or shape data of the second object. This can be done, e.g., as described above with reference to the object- presentation module 254, FIG. 4.
  • FIG. 8 is a flow diagram that illustrates an example process 800 for selecting a user interface for an application using a computing device, e.g., computing device 200, FIG. 2.
  • the user interface can be selected based upon a user-input accessory. This can provide the user an efficient experience with the particular UIA 126 that user has selected.
  • the accessory-identifying module 236 can detect a UIA 126. This can be done, e.g., by wirelessly detecting an identifier of the UIA 126, e.g., as discussed above with reference to the wireless interrogator 122, FIG. 1.
  • the accessory-identifying module 236, the app-determining module 238, or the UI-presentation module 250 can determine whether the detected UIA 126 is a first UIA or a second, different UIA, e.g., as discussed above with reference to FIG. 4.
  • Block 804 can also distinguish between more than two UIAs or types of UIAs.
  • Block 804 can be followed by, e.g., block 806 or block 808 (or other blocks omitted for brevity) depending on the type of the UIA 126.
  • the UI-presentation module 250 can present for display a first user interface of an application.
  • the first user interface can have at least one content element or presentation element determined based at least in part on a first identifier corresponding to the first user-input accessory.
  • the UI-presentation module 250 can present for display a second user interface of the application.
  • the second user interface can have at least one content element or presentation element determined based at least in part on a second identifier corresponding to the second user-input accessory.
  • the second user interface can be different from the first user interface.
  • the second user interface can have at least one content element or presentation element the first user interface lacks, or vice versa.
  • the first user interface can include one or more user-interface controls and the second user interface can include the one or more user-interface controls and one or more additional user-interface controls. This can be done, e.g., as described above with reference to the UI-presentation module 250, FIG. 4.
  • the first user interface can include one or more user-interface controls presented in a disabled state and the second user interface can include the one or more user-interface controls presented in an enabled state. This can be done, e.g., as described above with reference to the UI-presentation module 250, FIG. 4.
  • the first user interface can include one or more content elements presented in a first arrangement and the second user interface can include the one or more content elements presented in a second, different arrangement. This can be done, e.g., as described above with reference to the UI-presentation module 250, FIG. 4.
  • A A device comprising: a wireless interrogator configured to: wirelessly detect a first identifier associated with a tagged user-input accessory in operational proximity to the wireless interrogator; and wirelessly detect a second identifier associated with a tagged object in operational proximity to the wireless interrogator; a force sensor having a sensing surface; one or more computer-readable media having stored thereon a plurality of modules; and one or more processing units operably coupled to the wireless interrogator, the force sensor, and at least one of the computer-readable media, the processing unit adapted to execute modules of the plurality of modules comprising: a launching engine configured to: determine a software application corresponding to the first identifier; and execute the determined software application; and an interaction engine configured to: detect a force exerted against the sensing surface by the object; and provide to the software application information of the second identifier and information of the detected force.
  • B A device as recited in paragraph A, wherein the force sensor is configured to detect spatially-varying forces across the sensing surface.
  • C A device as recited in paragraph B, wherein the interaction engine is further configured to: determine a location of the object based at least in part on the detected spatially-varying forces; and provide, to the software application, information of the determined location.
  • D A device as recited in paragraph B or C, wherein the interaction engine is further configured to: determine a shape of the object based at least in part on the detected spatially-varying forces; and provide, to the software application, information of the determined shape.
  • E A device as recited in any of paragraphs B-D, wherein the force sensor comprises a plurality of sensing elements distributed across the sensing surface.
  • F A device as recited in any of paragraphs A-E, wherein the one or more computer-readable media have stored thereon one or more mappings of identifiers to respective software applications and the launching engine is further configured to determine the software application corresponding to the first identifier using the one or more mappings.
  • G A device as recited in any of paragraphs A-F, further comprising a communications interface, wherein the launching engine is further configured to determine the software application corresponding to the first identifier by transmitting the first identifier via the communications interface and receiving an indication of the software application via the communications interface.
  • H A device as recited in any of paragraphs A-G, wherein the wireless interrogator comprises a radio-frequency identification (RFID) reader configured to wirelessly detect the identifiers of RFID-tagged ones of the tagged objects.
  • J A device as recited in any of paragraphs A-I, wherein the user-input accessory comprises a pad configured to overlie the force sensor.
  • K A device as recited in any of paragraphs A-J, wherein the one or more computer-readable media have stored thereon one or more mappings of identifiers to respective software applications and the launching engine is further configured to determine the software application corresponding to the first identifier using the one or more mappings, the determining including selecting, from the one or more mappings, a candidate mapping corresponding to the first identifier; and verifying a cryptographic signature of the candidate mapping.
  • L A device as recited in any of paragraphs A-K, wherein the launching engine is configured to, if the determined software application is not available to be executed, execute an installation package of the determined software application.
  • M A device as recited in any of paragraphs A-L, wherein the launching engine is configured to, if the determined software application is not available to be executed, execute a user interface configured to permit the user to locate, download, or purchase the determined software application.
  • N A device as recited in paragraph M, wherein the user interface is configured to, when executed, present an indication of the determined software application; receive payment information; and, at least partly in response to the received payment information, download the determined software application.
  • P A computer-implemented method comprising: presenting for display a user interface having at least one content element or presentation element determined based at least in part on an identifier of a user-input accessory; determining a representation of an object, the representation based at least in part on an identifier of the object; and presenting for display the representation of the object arranged in the user interface based at least in part on spatial data of the object, wherein the spatial data of the object comprises at least location data of the object or shape data of the object.
  • R A computer-implemented method as recited in paragraph Q, further comprising presenting for display a goal representation arranged in the user interface based at least in part on the spatial target.
  • S A computer-implemented method as recited in any of paragraphs P-R, wherein: the presenting the representation of the object comprises presenting the representation of the object in a three-dimensional virtual environment of the user interface; a geometry of the representation of the object in a first dimension is determined based at least in part on the spatial data of the object; and a geometry of the representation of the object in a second, different dimension is determined based at least in part on the identifier of the object.
  • T A computer-implemented method as recited in any of paragraphs P-S, further comprising: determining that a second object is in a selected spatial relationship with the object based at least in part on the spatial data of the object and spatial data of the second object; determining a representation of the second object, the representation based at least in part on an identifier of the second object; and presenting for display the representation of the second object arranged in the user interface based at least in part on the spatial data of the object and the spatial data of the second object, wherein the spatial data of the second object comprises at least location data of the second object or shape data of the second object.
  • V A computer-implemented method as recited in any of paragraphs P-U, further comprising receiving the spatial data of the object from a force sensor having a plurality of sensing elements distributed across a sensing surface.
  • W A computer-readable medium, e.g., a computer storage medium, having thereon computer-executable instructions, the computer-executable instructions upon execution configuring a computer to perform operations as recited in any of paragraphs P- V.
  • X A device comprising: a processor; and a computer-readable medium, e.g., a computer storage medium, having thereon computer-executable instructions, the computer-executable instructions upon execution by the processor configuring the device to perform operations as recited in any of paragraphs P-V.
  • Y A system comprising: means for processing; and means for storing having thereon computer-executable instructions, the computer-executable instructions including means to configure the system to carry out a method as recited in any of paragraphs P-V.
  • Z A computer-readable medium having thereon computer-executable instructions, the computer-executable instructions upon execution configuring a computer to perform operations comprising: detecting a first user-input accessory and, in response, presenting for display a first user interface of an application, the first user interface having at least one content element or presentation element determined based at least in part on a first identifier corresponding to the first user-input accessory; and detecting a second, different user-input accessory and, in response, presenting for display a second, different user interface of the application, the second user interface having at least one content element or presentation element determined based at least in part on a second identifier corresponding to the second user-input accessory.
  • AA A computer-readable medium as recited in paragraph Z, wherein the first user interface includes one or more user-interface controls and the second user interface includes the one or more user-interface controls and one or more additional user-interface controls.
  • AB A computer-readable medium as recited in paragraph Z or AA, wherein the first user interface includes one or more user-interface controls presented in a disabled state and the second user interface includes the one or more user-interface controls presented in an enabled state.
  • AC A computer-readable medium as recited in any of paragraphs Z-AB, wherein the first user interface includes one or more content elements presented in a first arrangement and the second user interface includes the one or more content elements presented in a second, different arrangement.
  • AD A device comprising: a processor; and a computer-readable medium as recited in any of paragraphs Z-AC.
  • AE A system comprising: means for processing; and a computer-readable medium as recited in any of paragraphs Z-AC, the computer-readable medium storing instructions executable by the means for processing.
  • AF A device comprising: one or more computer-readable media having stored thereon a plurality of modules; and one or more processing units operably coupled to at least one of the computer-readable media, the processing unit adapted to execute modules of the plurality of modules comprising: a launching engine configured to: detect a first identifier corresponding to a user-input accessory; determine a software application corresponding to the first identifier; and execute the determined software application; and an interaction engine configured to: detect a second identifier corresponding to an object; detect a force exerted by the object; and provide to the software application information of the second identifier and information of the detected force.
  • AG A device as recited in paragraph AF, wherein the one or more computer-readable media have stored thereon one or more mappings of identifiers to respective software applications and the launching engine is further configured to determine the software application corresponding to the first identifier using the one or more mappings, the determining including selecting, from the one or more mappings, a candidate mapping corresponding to the first identifier; and verifying a cryptographic signature of the candidate mapping.
  • AH A device as recited in paragraph AF or AG, wherein the launching engine is configured to, if the determined software application is not available to be executed, execute an installation package of the determined software application.
  • AI A device as recited in any of paragraphs AF-AH, wherein the launching engine is configured to, if the determined software application is not available to be executed, execute a user interface configured to permit the user to locate, download, or purchase the determined software application.
  • AJ A device as recited in paragraph AI, wherein the user interface is configured to, when executed, present an indication of the determined software application; receive payment information; and, at least partly in response to the received payment information, download the determined software application.
  • AK A device as recited in any of paragraphs AF-AJ, further comprising a sensing surface, wherein the interaction engine is configured to detect spatially-varying forces exerted by the object across the sensing surface.
  • AL A device as recited in paragraph AK, wherein the interaction engine is further configured to: determine a location of the object based at least in part on the detected spatially-varying forces; and provide, to the software application (e.g., the determined software application), information of the determined location.
  • AM A device as recited in paragraph AK or AL, wherein the interaction engine is further configured to: determine a shape of the object based at least in part on the detected spatially-varying forces; and provide, to the software application, information of the determined shape.
  • AN A device as recited in any of paragraphs AK-AM, further comprising a plurality of sensing elements distributed across the sensing surface.
  • AO A device as recited in any of paragraphs AK-AN, wherein the user-input accessory includes a pad configured to overlie the sensing surface.
  • AP A device as recited in any of paragraphs AF-AO, wherein the one or more computer-readable media have stored thereon one or more mappings of identifiers to respective software applications and the launching engine is further configured to determine the software application corresponding to the first identifier using the one or more mappings.
  • AQ A device as recited in any of paragraphs AF-AP, further comprising a communications interface, wherein the launching engine is further configured to determine the software application corresponding to the first identifier by transmitting the first identifier via the communications interface and receiving an indication of the software application via the communications interface.
  • AR A device as recited in any of paragraphs AF-AQ, further comprising a radio- frequency identification (RFID) reader configured to wirelessly detect the identifiers of RFID-tagged ones of the tagged objects.
  • AS A device as recited in any of paragraphs AF-AR, further comprising a mounting feature configured to retain the user-input accessory in operational relationship with the sensing surface.
  • AT A computer-implemented method comprising: wirelessly detecting a first identifier corresponding to a user-input accessory; wirelessly detecting a second identifier corresponding to an object; determining spatial data of the object using a force sensor, wherein the spatial data comprises at least location data of the object or shape data of the object; executing a software application corresponding to the first identifier; and providing the spatial data to the software application.
  • AU A computer-implemented method as recited in paragraph AT, wherein the determining spatial data comprises detecting spatially-varying forces across the sensing surface using the force sensor.
  • AV A computer-implemented method as recited in paragraph AU, wherein the determining spatial data comprises determining the location data of the object based at least in part on the detected spatially-varying forces.
  • AW A computer-implemented method as recited in paragraph AU or AV, wherein the determining spatial data comprises determining the shape data of the object based at least in part on the detected spatially-varying forces.
  • AX A computer-implemented method as recited in any of paragraphs AT-AW, wherein the executing the software application comprises determining the software application corresponding to the first identifier using one or more stored mappings of identifiers to respective software applications.
  • AY A computer-implemented method as recited in any of paragraphs AT-AX, wherein the executing the software application comprises transmitting the first identifier via a communications interface and receiving an indication of the software application via the communications interface.
  • AZ A computer-implemented method as recited in any of paragraphs AT- AY, wherein the wirelessly detecting the first identifier comprises retrieving the first identifier from a radio-frequency identification (RFID) tag of the user-input accessory.
  • BA A computer-implemented method as recited in any of paragraphs AT-AZ, wherein the wirelessly detecting the second identifier comprises retrieving the second identifier from an RFID tag of the object.
  • BB A computer-implemented method as recited in any of paragraphs AT-BA, wherein the executing the software application comprises selecting, from one or more stored mappings of identifiers to respective software applications, a candidate mapping corresponding to the first identifier; verifying a cryptographic signature of the candidate mapping; and, in response to the verifying, executing the software application indicated in the candidate mapping.
  • BC A computer-implemented method as recited in any of paragraphs AT-BB, wherein the executing the software application comprises executing an installation package of the software application in response to the software application being unavailable to be executed.
  • BD A computer-implemented method as recited in any of paragraphs AT-BC, wherein the executing the software application comprises, if the software application is not available to be executed, executing a user interface configured to permit the user to locate, download, or purchase the determined software application.
  • BE A computer-implemented method as recited in paragraph BD, wherein the executing the user interface includes presenting an indication of the determined software application; receiving payment information; and, at least partly in response to the received payment information, downloading the determined software application.
  • BF A computer-readable medium, e.g., a computer storage medium, having thereon computer-executable instructions, the computer-executable instructions upon execution configuring a computer to perform operations as recited in any of paragraphs AT- BE.
  • BG A device comprising: a processor; and a computer-readable medium, e.g., a computer storage medium, having thereon computer-executable instructions, the computer- executable instructions upon execution by the processor configuring the device to perform operations as recited in any of paragraphs AT-BE.
  • BH A system comprising: means for processing; and means for storing having thereon computer-executable instructions, the computer-executable instructions including means to configure the system to carry out a method as recited in any of paragraphs AT- BE.
  • Application selection and operation techniques described herein can reduce the amount of time required to locate and execute a software application corresponding to a user-input accessory. This can provide increases in operational efficiency, e.g., in user efficiency and satisfaction. Application operation techniques described herein can provide rapid feedback or evaluation regarding the placement of objects on a force sensor. This can reduce the effort required for users or other entities to manipulate visual objects, increasing operational efficiency in such manipulations.
  • the order in which the operations are described is not intended to be construed as a limitation, and any number of the described operations can be executed in any order, combined in any order, subdivided into multiple sub-operations, and/or executed in parallel to implement the described processes.
  • the described processes can be performed by resources associated with one or more computing devices 102, 134, 200, or 300 such as one or more internal or external CPUs or GPUs, and/or one or more pieces of hardware logic such as FPGAs, DSPs, or other types described above.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Databases & Information Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An application can be launched and used in conjunction with a user-input accessory (UIA). A launching engine can detect an identifier of the UIA using a wireless interrogator. The launching engine can determine and execute a software application corresponding to the identifier of the UIA. An interaction engine can detect an identifier of an object using the wireless interrogator. The interaction engine can detect a force exerted against a sensing surface of a force sensor by the object. The interaction engine can provide to the software application information of the second identifier and of the detected force. Some examples include presenting a user interface based on the identifier of the UIA, determining a representation of the object using the identifier of the object, and presenting the representation arranged based on the spatial data of the object. Some examples include presenting a user interface specific to the identifier of the UIA.
PCT/US2016/058106 2015-10-28 2016-10-21 Computing device having user-input accessory WO2017074809A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP16794799.3A EP3368977B1 (fr) 2015-10-28 2016-10-21 Computing device having user-input accessory
CN201680064071.5A CN108351791B (zh) 2015-10-28 2016-10-21 Computing device having user-input accessory

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US14/925,272 2015-10-28
US14/925,272 US20170123622A1 (en) 2015-10-28 2015-10-28 Computing device having user-input accessory

Publications (1)

Publication Number Publication Date
WO2017074809A1 true WO2017074809A1 (fr) 2017-05-04

Family

ID=57286807

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/058106 WO2017074809A1 (fr) 2015-10-28 2016-10-21 Computing device having user-input accessory

Country Status (4)

Country Link
US (1) US20170123622A1 (fr)
EP (1) EP3368977B1 (fr)
CN (1) CN108351791B (fr)
WO (1) WO2017074809A1 (fr)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7970870B2 (en) 2005-06-24 2011-06-28 Microsoft Corporation Extending digital artifacts through an interactive surface
US10346611B1 (en) * 2015-11-25 2019-07-09 Symantec Corporation Detecting malicious software
US10331155B1 (en) * 2015-12-11 2019-06-25 Amazon Technologies, Inc. Network addressable power socket automation
US10148497B1 (en) 2015-12-11 2018-12-04 Amazon Technologies, Inc. Network addressable device automation using a beacon
US10033973B1 (en) * 2017-01-25 2018-07-24 Honeywell International Inc. Systems and methods for customizing a personalized user interface using face recognition
EP3646174A1 (fr) 2017-06-28 2020-05-06 Microsoft Technology Licensing, LLC Amélioration de l'expérience utilisateur en fonction des capacités matérielles du dispositif
EP3791351A1 (fr) * 2018-10-23 2021-03-17 Google LLC Réduction de données pour générer des cartes thermiques
CA3124092A1 (fr) 2018-12-21 2020-06-25 Polaris Industries Inc. Gestion de vehicules et d'accessoires recreatifs
CN111381282A (zh) * 2018-12-28 2020-07-07 物流及供应链多元技术研发中心有限公司 基于超宽带的物件识别
US10656763B1 (en) * 2019-01-04 2020-05-19 Sensel, Inc. Dynamic adjustment of a click threshold corresponding to a force-based tactile sensor
US11983697B2 (en) * 2020-07-30 2024-05-14 Block, Inc. Embedded application within a buyer application
US11609665B2 (en) * 2020-08-20 2023-03-21 Peter Andras Vikar System and method for sharing and manipulating digital content
WO2023167948A1 (fr) * 2022-03-01 2023-09-07 Tusi, Llc Puzzle reconfigurable à visées pédagogique, thérapeutique et de divertissement

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006111782A1 (fr) * 2005-04-19 2006-10-26 Nokia Corporation, Procede, dispositif et systeme de commande de l'introduction d'une application dans un dispositif de terminal mobile
US8209628B1 (en) * 2008-04-11 2012-06-26 Perceptive Pixel, Inc. Pressure-sensitive manipulation of displayed objects
US20140317303A1 (en) * 2009-03-16 2014-10-23 Apple Inc. Application launching in conjunction with an accessory
US20150091820A1 (en) * 2013-09-27 2015-04-02 Sensel, Inc. Touch Sensor Detector System and Method

Family Cites Families (58)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5155813A (en) * 1990-01-08 1992-10-13 Wang Laboratories, Inc. Computer apparatus for brush styled writing
US5347620A (en) * 1991-09-05 1994-09-13 Zimmer Mark A System and method for digital rendering of images and printed articulation
US5835096A (en) * 1995-03-24 1998-11-10 3D Labs Rendering system using 3D texture-processing hardware for accelerated 2D rendering
US6161126A (en) * 1995-12-13 2000-12-12 Immersion Corporation Implementing force feedback over the World Wide Web and other computer networks
US7054029B1 (en) * 1999-03-09 2006-05-30 Canon Kabushiki Kaisha Image processing apparatus and method, and storage medium
US6433775B1 (en) * 1999-03-25 2002-08-13 Monkeymedia, Inc. Virtual force feedback interface
US7065242B2 (en) * 2000-03-28 2006-06-20 Viewpoint Corporation System and method of three-dimensional image capture and modeling
US7859519B2 (en) * 2000-05-01 2010-12-28 Tulbert David J Human-machine interface
US6993592B2 (en) * 2002-05-01 2006-01-31 Microsoft Corporation Location measurement process for radio-frequency badges
US7613842B2 (en) * 2004-02-17 2009-11-03 Microsoft Corporation Modular, attachable objects with tags as intuitive physical interface facilitating user interaction with a computer
US7397464B1 (en) * 2004-04-30 2008-07-08 Microsoft Corporation Associating application states with a physical object
US7467380B2 (en) * 2004-05-05 2008-12-16 Microsoft Corporation Invoking applications with virtual objects on an interactive display
US7379047B2 (en) * 2004-06-30 2008-05-27 Microsoft Corporation Using a physical object to control an attribute of an interactive display application
US20060090078A1 (en) * 2004-10-21 2006-04-27 Blythe Michael M Initiation of an application
US7342585B2 (en) * 2004-12-30 2008-03-11 Microsoft Corporation Use of an input overscaled bitmap to generate emboldened overscaled bitmap
US8803894B2 (en) * 2005-04-14 2014-08-12 Hewlett-Packard Development Company, L.P. Object identifier
US8095232B2 (en) * 2005-11-02 2012-01-10 Vistaprint Technologies Limited Printer driver systems and methods for automatic generation of embroidery designs
US7378966B2 (en) * 2006-01-04 2008-05-27 Microsoft Corporation RFID device groups
US7756747B2 (en) * 2006-03-10 2010-07-13 Microsoft Corporation RFID business process-decoupling of design and deployment time activities
US8001613B2 (en) * 2006-06-23 2011-08-16 Microsoft Corporation Security using physical objects
US7636610B2 (en) * 2006-07-19 2009-12-22 Envisiontec Gmbh Method and device for producing a three-dimensional object, and computer and data carrier useful therefor
US8199117B2 (en) * 2007-05-09 2012-06-12 Microsoft Corporation Archive for physical and digital objects
JP5150166B2 (ja) * 2007-08-24 2013-02-20 株式会社東芝 設備機器集合生成支援装置及び方法
US20090109030A1 (en) * 2007-10-24 2009-04-30 International Business Machines Corporation Using a physical object and its position on a surface to control an enablement state of a surface based computing device
US8237550B2 (en) * 2008-03-11 2012-08-07 Microsoft Corporation Action using switched device that transmits data
US20090251285A1 (en) * 2008-04-07 2009-10-08 International Business Machines Corporation Using physical objects to control enablement/disablement of device functionality
US7982609B2 (en) * 2008-06-18 2011-07-19 Microsoft Corporation RFID-based enterprise intelligence
US8847739B2 (en) * 2008-08-04 2014-09-30 Microsoft Corporation Fusing RFID and vision for surface object tracking
US8427424B2 (en) * 2008-09-30 2013-04-23 Microsoft Corporation Using physical objects in conjunction with an interactive surface
ES2575867T3 (es) * 2008-12-19 2016-07-01 Agfa Graphics Nv Método para reducir los artefactos de calidad de imagen en la impresión tridimensional
EP2199065B1 (fr) * 2008-12-19 2012-03-21 Agfa Graphics N.V. Procédé de traitement d'images pour impression tridimensionnelle
US8388151B2 (en) * 2009-07-23 2013-03-05 Kenneth J. Huebner Object aware, transformable projection system
JP5447013B2 (ja) * 2010-03-05 2014-03-19 株式会社リコー 表示装置、画像形成装置、カスタマイズ方法、プログラム
WO2011144596A1 (fr) * 2010-05-18 2011-11-24 Agfa Graphics Nv Procédé pour réaliser une matrice d'impression flexographique
US9244545B2 (en) * 2010-12-17 2016-01-26 Microsoft Technology Licensing, Llc Touch and stylus discrimination and rejection for contact sensitive computing devices
US20150169212A1 (en) * 2011-12-14 2015-06-18 Google Inc. Character Recognition Using a Hybrid Text Display
US8600450B2 (en) * 2011-12-28 2013-12-03 Sony Corporation Receiving user input on a graphical user interface
US8737035B2 (en) * 2012-05-18 2014-05-27 Disney Enterprises, Inc. Magnetically movable objects over a display of an electronic device
US9772889B2 (en) * 2012-10-15 2017-09-26 Famous Industries, Inc. Expedited processing and handling of events
US9589538B2 (en) * 2012-10-17 2017-03-07 Perceptive Pixel, Inc. Controlling virtual objects
US9483146B2 (en) * 2012-10-17 2016-11-01 Perceptive Pixel, Inc. Input classification for multi-touch systems
US20140104194A1 (en) * 2012-10-17 2014-04-17 Perceptive Pixel, Inc. Input Classification for Multi-Touch Systems
US20140104191A1 (en) * 2012-10-17 2014-04-17 Perceptive Pixel, Inc. Input Classification for Multi-Touch Systems
US20140108979A1 (en) * 2012-10-17 2014-04-17 Perceptive Pixel, Inc. Controlling Virtual Objects
US9632605B2 (en) * 2012-10-17 2017-04-25 Perceptive Pixel, Inc. Input classification for multi-touch systems
US9430066B2 (en) * 2012-10-17 2016-08-30 Perceptive Pixel, Inc. Input classification for multi-touch systems
US10078384B2 (en) * 2012-11-20 2018-09-18 Immersion Corporation Method and apparatus for providing haptic cues for guidance and alignment with electrostatic friction
US9473417B2 (en) * 2013-03-14 2016-10-18 Airwatch Llc Controlling resources used by computing devices
KR102072144B1 (ko) * 2013-03-26 2020-01-31 삼성전자주식회사 액세서리를 식별하는 방법 및 그 전자 장치
US10243786B2 (en) * 2013-05-20 2019-03-26 Citrix Systems, Inc. Proximity and context aware mobile workspaces in enterprise systems
EP3047360A4 (fr) * 2013-09-18 2017-07-19 Tactual Labs Co. Systèmes et procédés pour fournir une réponse à une entrée d'utilisateur au moyen d'informations au sujet de changements d'états servant à prédire une entrée d'utilisateur future
CN104750406B (zh) * 2013-12-31 2019-12-24 深圳迈瑞生物医疗电子股份有限公司 监护设备及其显示界面布局调整方法、装置
KR102171389B1 (ko) * 2014-04-21 2020-10-30 삼성디스플레이 주식회사 영상 표시 시스템
CN104035675B (zh) * 2014-06-24 2017-06-27 联想(北京)有限公司 电子设备及其显示控制方法
US9411458B2 (en) * 2014-06-30 2016-08-09 Synaptics Incorporated System and method for determining input object information from proximity and force measurements
US20160182702A1 (en) * 2014-12-20 2016-06-23 Ilan Abehassera Physical component detection and user interface change on a communication device
US20160375354A1 (en) * 2015-06-23 2016-12-29 Intel Corporation Facilitating dynamic game surface adjustment
US20160381171A1 (en) * 2015-06-23 2016-12-29 Intel Corporation Facilitating media play and real-time interaction with smart physical objects

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006111782A1 (fr) * 2005-04-19 2006-10-26 Nokia Corporation, Procede, dispositif et systeme de commande de l'introduction d'une application dans un dispositif de terminal mobile
US8209628B1 (en) * 2008-04-11 2012-06-26 Perceptive Pixel, Inc. Pressure-sensitive manipulation of displayed objects
US20140317303A1 (en) * 2009-03-16 2014-10-23 Apple Inc. Application launching in conjunction with an accessory
US20150091820A1 (en) * 2013-09-27 2015-04-02 Sensel, Inc. Touch Sensor Detector System and Method
US9001082B1 (en) 2013-09-27 2015-04-07 Sensel, Inc. Touch sensor detector system and method

Also Published As

Publication number Publication date
EP3368977A1 (fr) 2018-09-05
CN108351791B (zh) 2021-07-20
US20170123622A1 (en) 2017-05-04
CN108351791A (zh) 2018-07-31
EP3368977B1 (fr) 2021-10-13

Similar Documents

Publication Publication Date Title
EP3368977B1 (fr) Dispositif informatique comprenant un accessoire d'entrée utilisateur
CN110532170B (zh) 搭建测试环境的方法、装置、电子设备及介质
CN103886124A (zh) 虚拟对象的方位校正
CN102695032A (zh) 信息处理装置、信息共享方法、程序以及终端设备
CN103733229A (zh) 信息处理设备、信息处理方法及程序
US10897687B2 (en) Electronic device and method for identifying location by electronic device
CN107368329B (zh) 物理元件的安装
US11347393B2 (en) Electronic device for processing wheel input and operation method thereof
WO2019217126A1 (fr) Vision artificielle par optimisation matérielle simulée
US20170330473A1 (en) Iot enhanced educational system
US20200117308A1 (en) Electronic device and method for determining touch input conditions based on type of touch input
CN108052869B (zh) 车道线识别方法、装置及计算机可读存储介质
US9047244B1 (en) Multi-screen computing device applications
US11240102B2 (en) Peripheral device identification system and method
CN115081643B (zh) 对抗样本生成方法、相关装置及存储介质
CN110737454A (zh) 软件项目更新方法、装置、计算机设备及存储介质
US12011658B2 (en) Single unit deformable controller
CN114083800B (zh) 模型支撑面的3d打印数据生成方法、装置及存储介质
EP4274172A1 (fr) Dispositif électronique et procédé d'enregistrement d'un dispositif externe à l'aide d'informations de dispositif
CN112347568B (zh) 一种仿真测试的方法、相关装置、设备及存储介质
KR102534565B1 (ko) 컨텐츠를 제공하기 위한 방법 및 그 전자 장치
US12023575B2 (en) Multi unit deformable controller
US20230128662A1 (en) Electronic device and method for spatial mapping using the same
EP4357893A1 (fr) Dispositif électronique comprenant un affichage et procédé associé audit dispositif
US20240320666A1 (en) Electronic device comprising plurality of execution environments and operating method thereof

Legal Events

Date Code Title Description
DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16794799

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2016794799

Country of ref document: EP