WO2022023506A1 - Input device - Google Patents

Input device

Info

Publication number
WO2022023506A1
Authority
WO
WIPO (PCT)
Prior art keywords
input device
input
projections
user
data
Prior art date
Application number
PCT/EP2021/071351
Other languages
English (en)
Inventor
Oliver Treadway
Original Assignee
Sphere Research Ltd
Priority date
Filing date
Publication date
Application filed by Sphere Research Ltd
Publication of WO2022023506A1

Classifications

    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044: Digitisers characterised by capacitive transducing means
    • G06F3/045: Digitisers using resistive elements, e.g. a single continuous surface or two parallel surfaces put in contact
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F2203/04105: Pressure sensors for measuring the pressure or force exerted on the touch surface without providing the touch position
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25: Output arrangements for video game devices
    • A63F13/26: Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
    • A63F13/60: Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/69: Generating or modifying game content by enabling or updating specific game elements, e.g. unlocking hidden features, items, levels or versions
    • A63H33/00: Other toys
    • A63H33/04: Building blocks, strips, or similar building parts

Definitions

  • Embodiments disclosed herein relate to an input device for a computing system.
  • Input devices facilitate interaction between a human user and an electronic device such as a computer or a smart TV.
  • Various types of input device such as computer mice, joysticks and trackpads provide means for a user to navigate between objects or regions presented in a graphical user interface (GUI) of an electronic device, for example by controlling a cursor visible within the GUI.
  • Examples of applications in which a user is required to navigate between objects in a three-dimensional scene and in which objects appear at a discrete set of positions include voxel editors for creating or manipulating voxel-based structures for video games or other applications, and applications in which a user builds or modifies modular structures within a GUI, for example a structure formed of object models representing Lego® blocks or Duplo® blocks, either for education or entertainment purposes.
  • According to one aspect of the present disclosure, there is provided an input device for a computing system.
  • the input device includes a monolithic surface member comprising a substantially planar portion and an array of projections extending from the substantially planar portion in a direction perpendicular to the substantially planar portion, with respective input sensors disposed on or in at least some of the projections and operable to detect user input by human touch.
  • the input device further includes one or more indicator elements, the or each indicator element having a plurality of possible activation states, and communication means arranged to transmit, in response to receiving first user input by human touch at one or more of the input sensors, first data to the computing system, the first data indicating said one or more of the input sensors.
  • the input device further comprises feedback circuitry arranged to change an activation state of at least one of the indicator elements in dependence on receiving second data from the computing system via the communication means.
  • the indicator elements can assist a user, such as a child, to make a cognitive connection between input provided at the input device and actions performed within a GUI of the computing device.
  • the indicator elements may include respective light sources disposed on or in at least some of the projections of the input device. By activating the light sources in different configurations, feedback can be provided to relate the projections of the input device to respective positions in the GUI, helping the user to understand how to provide input to control what is displayed in the GUI.
  • the apparatus includes an input device as described above, and further includes one or more modules removably attachable to the projections of the input device. At least some of the projections of the input device comprise means for detecting when one of the modules is attached to those projections, and the input device is arranged to transmit, in response to a detachable module being attached to one or more of the projections, third data to the computing device via the communication means, the third data indicating the one or more projections to which the detachable module is attached.
  • At least one of the one or more modules includes means for detecting second user input by human touch, and means for transmitting, in response to detecting the second user input, data to the input device indicative of said second user input.
  • the input device is arranged to transmit fourth data to the computing device via the communication means indicative of the second user input.
  • Enabling detachable modules to be attached to projections of the input device allows the input device to be customised for a given application.
  • the detachable modules may have different capabilities for detecting user input compared with the input sensors of the input device (for example being able to detect input in a continuous domain, in the manner of a slider or touch pad).
  • the detachable module effectively becomes an extension of the input device, with additional capabilities for receiving user input.
  • the functionality controlled by the detachable module may be made dependent on which projections the module is attached to. In this way, the versatility of the input device may be increased.
  • According to another aspect, there is provided a system comprising a computing device arranged to generate a user interface configured to depict a plurality of positions in which to place a plurality of object models, and an input device as described above, wherein each of the projections corresponds to a respective one or more of said positions depicted in the user interface.
  • the computing device is arranged to cause the user interface to depict a selected object model of the plurality of object models in a position corresponding to the projection on/in which said one or more input sensors are disposed.
  • the input device is used to facilitate building of a modular structure depicted within the user interface.
  • the computing device may bestow additional functionality on the depicted model, for example by animating or otherwise altering the model, providing a rewarding experience for the user.
  • the computing device may be arranged to transmit signals to the input device to activate the feedback circuitry of the input device, causing the indicator elements to further enhance the interactive experience.
  • Figure 1 shows an input device in perspective view;
  • Figure 2 shows a system comprising a computing device connected to a display and the input device of Figure 1;
  • Figures 3 to 7 show further configurations of the system of Figure 2;
  • Figure 8 shows an example of a user interface;
  • Figure 9 shows the system of Figure 2 and a detachable module;
  • Figure 10 shows the system of Figure 2 and two detachable modules;
  • Figure 11 shows an example of a detachable module attached to an input device.
  • Embodiments of the present disclosure relate to an input device for controlling a computing device.
  • embodiments described herein enable a user to interact with a user interface of a computing device in a manner that is sufficiently intuitive to be suitable for a wide range of users, including children.
  • Figure 1 shows an example of an input device 100.
  • the input device 100 has an upper surface, preferably formed of a single monolithic sheet of plastic of substantially rigid construction.
  • the upper surface forms part of a housing 101 containing active elements of the input device, including a battery and various other electrical components as will be described in more detail hereafter.
  • As a result, the input device 100 is durable and resistant to liquids, dust and dirt, which might otherwise fall into gaps in the housing of a user device.
  • the upper surface comprises a substantially planar portion 102 and an array of projections 104 projecting from the substantially planar portion 102.
  • the projections 104 are arranged in a regular grid with equal grid spacings in two perpendicular dimensions.
  • the input device 100 may have a different number and/or configuration of projections 104, for example arranged in a 48x48 grid, a 60x60 grid, a 48x96 grid, or any other suitable size or shape of grid.
  • an input device may include projections arranged on a different type of grid, for example a curvilinear grid or an isometric grid formed of equilateral triangles.
  • the projections 104 are shaped as cylindrical studs, though in other examples projections may be shaped differently, for example as domes, cones, frustums, rods or studs of any cross-sectional shape. Different projections may be identical to one another or may be shaped differently from one another, for example to provide a user with a tactile indication of different regions of the input device.
  • each of the projections 104 includes a respective input sensor for detecting user input at said projection 104.
  • the input sensor for each projection 104 is a surface capacitive sensor for detecting when a user touches an upper surface of the projection 104, for example with a finger or thumb.
  • alternative types of tactile sensor or contact sensor may be provided, for example projected capacitive sensors or resistive touch sensors. Certain types of tactile sensor may be capable of detecting different levels of pressure applied to the projections.
  • an input device may additionally or alternatively include other types of input sensor, such as light sensors or conductive sensors, for detecting other types of user input as will be described in more detail hereinafter.
  • each of the projections 104 includes an indicator element in the form of a light source comprising one or more light emitting diodes (LEDs).
  • the indicator element for each of the projections 104 can be in any of several possible activation states. In this example, one of the activation states corresponds to the indicator element being inactive (off), and a set of further activation states each correspond to the indicator element being active (on) in a particular colour state. In other examples, an indicator element may have only two activation states, "on" and "off". In some examples, an indicator element may have activation states corresponding to different brightness levels.
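  • The activation-state model described above lends itself to a compact data representation. The following is a minimal, illustrative Python sketch; the type and value names are invented for this sketch and are not taken from the patent:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass(frozen=True)
class ActivationState:
    """One possible activation state of an indicator element (an LED here).

    on=False models the inactive ("off") state; when on=True, colour is an
    RGB triple and brightness a level in [0.0, 1.0], covering the variants
    described above (on/off only, colour states, brightness levels).
    """
    on: bool = False
    colour: Optional[Tuple[int, int, int]] = None  # only meaningful when on
    brightness: float = 1.0                        # only meaningful when on

OFF = ActivationState()
RED = ActivationState(on=True, colour=(255, 0, 0))
RED_DIM = ActivationState(on=True, colour=(255, 0, 0), brightness=0.25)
```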
  • an input device may include other types of indicator element in addition to, or as an alternative to, a set of light sources.
  • an input device may include one or more vibrational motors for providing haptic feedback, or one or more audio signalling elements, such as a buzzer or loudspeaker, for providing audible feedback. In this way, an input device may be suitable for blind or partially-sighted users.
  • an input device may include vibrational motors at multiple locations in the input device in order to provide location-specific haptic feedback.
  • the input device 100 is arranged to communicate with a computing device via a communication module 106.
  • the communication module 106 in this example includes a radio modem and transceiver for communicating directly or indirectly with the computing device using wireless signals in accordance with a suitable wireless communications standard, for example Bluetooth® or Wi-Fi.
  • wired connection means may be provided for communicating with a computing device.
  • a wired connection means may further be used to power the input device 100 and/or charge a battery of the input device 100, either via the computing device 200 or directly from a power supply.
  • the input device 100 includes feedback circuitry, which includes logical circuitry that, in this example, is embedded within a printed circuit board located beneath the indicator elements and input sensors of the input device 100.
  • the feedback circuitry is configured to change the activation states of the indicator elements in response to user input as described above and/or signals received via the communication module 106.
  • the feedback circuitry may be configured to change the activation states of one or more of the indicator elements in response to receiving user input at a corresponding one or more of the projections 104.
  • the indicator elements may return to their previous activation states after a certain period of time or alternatively may remain in their changed activation states until further user input is received or until a signal is received via the communications module 106.
  • the feedback circuitry may be configured in response to data received via the communication module 106, and different configurations of the feedback circuitry may result in different sets of indicator elements being activated in response to user input.
  • the feedback circuitry may be configured such that several indicator elements are activated in response to user input at only one of the projections 104.
  • the feedback circuitry may be configured to change the activation states of one or more of the indicator elements in response to data received via the communications module 106, irrespective of whether user input is received. Examples of possible configurations of the feedback circuitry will be described in more detail hereafter.
  • FIG. 2 shows an example in which the input device 100 is used to control a computing device 200.
  • the computing device 200 in this example is integral to a smart TV which further includes a display 202.
  • a smart TV is a television with internet connectivity and processing circuitry capable of executing software applications (“apps”).
  • the input device 100 could be used to control other types of computing device such as an iPad™, a laptop computer or a desktop computer.
  • the computing device 200 includes a communication module 204, such that the respective communication modules 106, 204 of the input device 100 and the computing device 200 enable wireless communication between the input device 100 and the computing device 200.
  • the computing device 200 is configured with an app that causes the computing device 200 to generate a GUI on the display 202, where the GUI depicts a three-dimensional space in which one or more selectable object models (referred to hereinafter as objects) may be rendered.
  • the GUI depicts a base board 206, which in this example is a substantially planar object from which an array of cylindrical studs 208 extend in a regular array, with axes perpendicular to the base board 206.
  • the studs 208 are depicted as having a configuration corresponding to that of the projections 104 of the input device 100.
  • the input device 100 in this example is arranged to transmit identification information to the computing device 200 during a setup process, from which the computing device 200 can derive the configuration of the grid of the projections 104 of the input device 100.
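  • As an illustration of such a setup exchange, the sketch below builds a JSON payload describing the projection grid. The patent does not specify a message format, so the field names and the JSON encoding are assumptions made for this example:

```python
import json

def make_setup_message(device_id: str, rows: int, cols: int,
                       grid_type: str = "rectangular") -> bytes:
    """Build an illustrative setup payload from which a computing device
    could derive the grid configuration of the projections. The JSON shape
    is an assumption; the text only says identification information is
    transmitted during setup."""
    payload = {
        "device_id": device_id,
        "grid": {"type": grid_type, "rows": rows, "cols": cols},
    }
    return json.dumps(payload).encode("utf-8")

# For example, one of the grid sizes mentioned above:
message = make_setup_message("input-device-100", rows=48, cols=48)
```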
  • Each of the studs 208 defines a position at which an object can be connected to the base board 206.
  • a substantially cuboid block 210 is shown as being connected to four mutually adjacent studs 208.
  • the block 210 includes four further cylindrical studs 212 extending in the same direction as the studs 208 of the base board 206, enabling further objects to be connected to the block 210.
  • By positioning multiple objects in the GUI in such a configuration that the objects are connected to one another and/or to the base board 206, a user is able to build a modular structure for rendering in real time within the GUI.
  • the set of possible positions at which an object can be placed is finite and is determined by the positions of any exposed studs to which an object has not already been connected.
  • the selectable objects are three-dimensional computer-aided design (CAD) models of Lego® pieces, which include blocks with varying characteristics such as size, colour, and shape, as well as other pieces which are connectable to the blocks and/or one another using regularly-spaced cylindrical studs.
  • a user interface may depict other types of modular structure, for example structures formed of Duplo® pieces or structures formed of functional modules such as software-simulated synthesiser modules for music generation.
  • a user first selects the block 210 from a set of selectable objects, and further selects an orientation and one or more characteristics for the block 210, for example a colour and/or surface texture for the block 210.
  • An example of how objects can be selected will be discussed in more detail hereinafter.
  • the computing device 200 transmits data to the input device 100 via the communication modules 106, 204 indicating characteristics of the block 210.
  • the characteristics of the block 210 include dimensions of the block 210 and a colour of the block 210.
  • Upon receiving the data from the computing device 200, the input device 100 configures the feedback circuitry in dependence on the characteristics of the block 210, so that the feedback circuitry is ready to activate an appropriate set of indicator elements in response to receiving user input.
  • the input device 100 may receive user input at a projection 104a.
  • the feedback circuitry changes the activation state of the indicator element of the projection 104a, and further changes the activation states of the indicator elements of three further projections 104b, 104c, 104d arranged in a row with the projection 104a.
  • the three further projections 104b, 104c, 104d are determined in accordance with the dimensions of the block 210 and the selected orientation of the block 210.
  • the feedback circuitry is configured such that the indicator elements of the projections 104a-d display a colour resembling the selected colour of the block 210.
  • the input device 100 further transmits data to the computing device 200 via the communication modules 106, 204, indicating the projection 104a at which the user input was received and, optionally, indicating further characteristics of the user input such as the pressure detected by the input sensors and the duration for which the pressure is applied.
  • the computing device 200 updates the GUI to show the block 210 connected to the studs 208 of the base board 206 corresponding to the projections 104a-d of the input device 100.
  • Whilst in the example above the orientation and dimensions of the block 210 are effectively selected using the GUI of the computing device, in other examples the dimensions and/or the orientation may be determined in accordance with the user input received at the input device 100. For example, a user may select a certain height and colour of block, but without specifying the in-plane dimensions or orientation of the block. The user may subsequently provide input at projections 104a and 104d of the input device 100, for example by touching or pressing projections 104a and 104d, from which the feedback circuitry determines that the indicator elements at projections 104a-104d should be activated in a colour state representing the selected block colour.
  • the input device 100 then transmits data to the computing device 200 indicating the projections 104a and 104d (and, optionally, the intervening projections 104b, 104c), causing the computing device 200 to depict the block 210 connected to corresponding studs on the base board 206.
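  • The grid arithmetic implied by this flow is straightforward. Below is a hedged Python sketch of two helpers: one computes the footprint of a block of given dimensions from an anchor projection, and one fills in the projections between two touched corners (as with 104a and 104d above). The (row, column) coordinate convention and the function names are inventions for this sketch:

```python
from typing import List, Tuple

Coord = Tuple[int, int]  # (row, col) index of a projection on the grid

def block_footprint(anchor: Coord, length: int, width: int,
                    horizontal: bool) -> List[Coord]:
    """Projections covered by a length x width block placed at `anchor`
    in one of two in-plane orientations."""
    r0, c0 = anchor
    if horizontal:
        return [(r0 + dr, c0 + dc) for dr in range(width) for dc in range(length)]
    return [(r0 + dr, c0 + dc) for dr in range(length) for dc in range(width)]

def span_between(a: Coord, b: Coord) -> List[Coord]:
    """Fill in the projections between two touched corners sharing a row
    or a column, as when touches at 104a and 104d imply 104b and 104c."""
    (r1, c1), (r2, c2) = sorted([a, b])
    if r1 == r2:
        return [(r1, c) for c in range(c1, c2 + 1)]
    if c1 == c2:
        return [(r, c1) for r in range(r1, r2 + 1)]
    raise ValueError("corners must share a row or a column in this sketch")

# A 1x4 block placed horizontally at the top-left projection:
assert block_footprint((0, 0), 4, 1, horizontal=True) == [(0, 0), (0, 1), (0, 2), (0, 3)]
assert span_between((0, 0), (0, 3)) == [(0, 0), (0, 1), (0, 2), (0, 3)]
```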
  • a user may provide input at projections 104a and 104h shown in Figure 3, for example by touching or pressing projections 104a and 104h, causing the feedback circuitry to determine that the indicator elements at projections 104a-104h should be activated.
  • the input device may then transmit data to the computing device 200 indicating the projections 104a and 104h (and, optionally, the intervening projections 104b-104g), causing the computing device 200 to depict a block 310 connected to corresponding studs on the base board 206.
  • the input device 100 is arranged to receive data from the computing device 200 indicating characteristics of an object that is selected using the GUI, following which the feedback circuitry of the input device 100 is configured to activate the indicator elements in dependence on the received user input and the characteristics of the selected object.
  • the feedback circuitry may store information relating to the characteristics of the object, allowing the feedback circuitry to activate or otherwise change the activation states of the indicator elements without receiving further signals from the computing device 200. In this way, the feedback circuitry can respond rapidly to user input, irrespective of any processing lag at the computing device 200 or signalling lag between the input device 100 and the computing device 200.
  • the user may move his or her finger over the projections 104, causing the corresponding indicator elements to be activated and deactivated in such a way as to give the impression of an object being dragged around the input device 100.
  • signals may be sent to the computing device 200, causing the user interface to depict the block 210 being dragged around in the same way.
  • the input device 100 includes memory circuitry and processing circuitry for storing and executing software associated with the app on the computing device 200.
  • the input device 100 may be configured to behave differently in response to signals received from the computing device 200.
  • In this way, the behaviour of the input device 100 is flexible and can be updated, for example when the app on the computing device 200 is updated.
  • an input device may be hard wired to interpret signals in accordance with a predetermined specification, which developers of apps or software for computing devices must then adhere to.
  • In a first operational mode, the feedback circuitry is configured to change the activation states of indicator elements directly in response to receiving user input.
  • In a second operational mode, the feedback circuitry is configured to change the activation states of one or more indicator elements in response to data received from the computing device 200.
  • the input device 100 may receive user input at one or more of the projections 104, causing the input device 100 to transmit a signal to the computing device 200 indicating those projections 104.
  • the computing device 200 may update the GUI to depict a new arrangement of objects.
  • the computing device 200 may further transmit data to the input device 100 indicating a new configuration of activation states to be implemented by the feedback circuitry of the input device 100.
  • In this case, the feedback circuitry changes the activation states of the indicator elements in dependence on the user input at the projections 104, but not directly in response to the user input.
  • the feedback circuitry only needs to implement instructions received from the computing device 200, simplifying the function of the feedback circuitry.
  • the input device 100 is able to operate in either of the operational modes described above, or in a combination of both operational modes, providing flexibility for different applications of the input device 100. In other examples, an input device may operate exclusively in either one of the operational modes.
  • Configuring the feedback circuitry to change the activation states of the indicator elements in response to data received from the computing device 200 allows the computing device 200 to cause changes of the activation states irrespective of whether user input is received at the input device 100, allowing for further flexibility in the type of feedback provided by the input device 100.
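  • The two operational modes can be summarised in a behavioural Python sketch. All names here are invented; the patent does not prescribe an implementation:

```python
from enum import Enum, auto

class FeedbackMode(Enum):
    LOCAL = auto()   # indicator changes driven directly by touch events
    REMOTE = auto()  # indicator changes driven only by received data
    HYBRID = auto()  # a combination of both modes

class FeedbackCircuitry:
    """Sketch of the operational modes described above."""

    def __init__(self, send, mode: FeedbackMode = FeedbackMode.HYBRID):
        self.send = send   # callable that transmits data to the computing device
        self.mode = mode
        self.states = {}   # projection -> activation state

    def on_touch(self, projection):
        # Touches are always reported to the computing device.
        self.send({"event": "touch", "projection": projection})
        # In LOCAL/HYBRID modes the indicators also react immediately,
        # masking any processing or signalling lag.
        if self.mode in (FeedbackMode.LOCAL, FeedbackMode.HYBRID):
            self.states[projection] = "on"

    def on_remote_update(self, new_states: dict):
        # In REMOTE/HYBRID modes, apply instructions from the computing
        # device irrespective of whether user input was received.
        if self.mode in (FeedbackMode.REMOTE, FeedbackMode.HYBRID):
            self.states.update(new_states)

circuitry = FeedbackCircuitry(send=print)
circuitry.on_touch((2, 3))  # prints the upstream message and lights (2, 3)
```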
  • the computing device 200 may send data to the input device 100 indicating one or more possible positions in which an object can or should be placed.
  • the data may comprise a set of instructions for building a predetermined modular structure.
  • the feedback circuitry may then activate one or more corresponding indicator elements to assist the user in selecting and/or positioning an appropriate object.
  • the feedback circuitry may be configured such that each of the indicator elements of the input device 100 will remain in a given activation state until further user input is detected and/or until a signal is received from the computing device 200.
  • the activation states of the indicator elements may reflect an arrangement of objects already positioned in the user interface.
  • two blocks 410 and 412 are depicted with the larger block 410 connected to the base board 206 and the smaller block 412 connected to the top of the first block 410.
  • the blocks 410, 412 are different colours, represented in Figure 4 by different orientations of diagonal stripes.
  • the indicator elements of the projections 104a, 104b, 104e and 104f are in an activation state corresponding to the colour of the smaller block 412.
  • the indicator elements of the projections 104c, 104d, 104g, 104h are in an activation state corresponding to the larger block 410.
  • the configuration of activation states is indicative of a top-down view of the modular structure depicted in the GUI.
  • the activation states of the indicator elements represent the positions at which further objects can be connected to the modular structure.
  • the indicator elements help a user, such as a child, to make a cognitive link between the modular structure depicted in the user interface, and locations of the projections 104 on the input device 100. This, in turn, may assist the user to build or modify the modular structure in an intuitive manner.
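  • Deriving such a top-down indicator configuration from the structure in the GUI amounts to keeping, for each grid cell, the colour of the highest block covering it. A hedged Python sketch, with a data layout invented for illustration:

```python
from typing import Dict, List, Tuple

Coord = Tuple[int, int]

def top_down_view(blocks: List[dict]) -> Dict[Coord, str]:
    """For each covered grid cell, return the colour of the topmost block,
    mirroring how the indicators above show block 412 where it sits on
    block 410, and block 410 elsewhere."""
    tallest: Dict[Coord, Tuple[int, str]] = {}
    for block in blocks:
        for cell in block["cells"]:
            height, _ = tallest.get(cell, (-1, ""))
            if block["top"] > height:
                tallest[cell] = (block["top"], block["colour"])
    return {cell: colour for cell, (_, colour) in tallest.items()}

view = top_down_view([
    {"cells": [(0, 0), (0, 1), (1, 0), (1, 1)], "top": 1, "colour": "red"},
    {"cells": [(0, 0), (0, 1)], "top": 2, "colour": "blue"},  # stacked on top
])
assert view[(0, 0)] == "blue" and view[(1, 0)] == "red"
```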
  • the input device 100 may include an orientation detector for determining an orientation of the input device 100.
  • the orientation detector may, for example, include a microelectromechanical systems (MEMS) magnetic field sensor employed as an electronic compass and an accelerometer for detecting the orientation of the input device 100 in relation to the gravitational field of the Earth.
  • the relative orientation of the input device 100 with respect to the display 202 can be determined from the outputs of the electronic compass and accelerometer on the basis of a calibration process.
  • the orientation detector may include one or more cameras for determining an orientation of the input device 100 with respect to a local environment using simultaneous location and mapping (SLAM), or an infrared transmitter/receiver for determining an orientation with respect to a remote receiver/transmitter module.
  • a computing device such as the computing device 200 may include or be connected to one or more further devices for determining an orientation of the input device 100, for example one or more cameras for capturing images of the input device 100, from which the orientation of the input device 100 can be determined using pose determination software.
  • the input device 100 is configured to transmit data to the computing device 200 indicating the orientation of the input device 100, and the computing device 200 is arranged to depict the base board 206 in the GUI at an orientation depending on the orientation of the input device 100.
  • the input device 100 may be configured to transmit this orientation data periodically at a sufficiently high frequency that the orientation of the base board 206 appears to respond to any reorientation of the input device 100.
  • the input device 100 may be configured to transmit orientation data only when a change of orientation is detected, resulting in less signalling and accordingly less power use.
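  • The transmit-on-change strategy reduces to a simple threshold test. A sketch; the threshold value, message shape and class name are assumptions, and wrap-around at 360° is ignored for brevity:

```python
class OrientationReporter:
    """Send an orientation update only when the change since the last
    transmission exceeds a threshold, trading a little responsiveness
    for less signalling and lower power use."""

    def __init__(self, send, threshold_degrees: float = 1.0):
        self.send = send
        self.threshold = threshold_degrees
        self.last_yaw = None

    def update(self, yaw_degrees: float):
        if self.last_yaw is None or abs(yaw_degrees - self.last_yaw) >= self.threshold:
            self.send({"event": "orientation", "yaw": yaw_degrees})
            self.last_yaw = yaw_degrees

reporter = OrientationReporter(send=print)
reporter.update(10.0)  # transmitted
reporter.update(10.3)  # suppressed: below the 1-degree threshold
reporter.update(12.0)  # transmitted
```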
  • reorientation of the base board 206 in the GUI is animated to appear as smooth and continuous motion.
  • Providing the computing device 200 with information indicative of the orientation of the input device 100 advantageously allows the user to manipulate the GUI by changing the orientation of the input device 100.
  • the user can thereby rotate the structure to view different parts of the structure during the building process.
  • orienting the base board 206 in dependence on the orientation of the input device 100 may further assist the user to associate the projections 104 on the input device 100 with locations depicted in the GUI, particularly when combined with the use of indicator elements as described above.
  • the GUI is configured such that a user can position objects either directly on the base board 206 or on top of other objects, in positions defined by the cylindrical studs 212 depicted in GUI.
  • the positioning of the objects reflects how a corresponding set of physical objects can be connected together, and the resulting modular structure therefore reflects a physical structure insofar as the objects are supported from below, as opposed to "floating" in an unsupported manner.
  • In this configuration, when positioning an object the user need only specify the in-plane components of the position; the out-of-plane component is uniquely determined by the configuration of objects already positioned in the GUI.
  • a user interface such as the GUI described above may be configured such that a user can position an object in any of a discrete set of positions in three-dimensions, irrespective of whether such positioning results in “unphysical” behaviour such as floating objects or intersecting objects.
  • Such a configuration may provide additional flexibility, and is particularly useful for learning how to build modular structures.
  • the input device 100 may be used for other applications in which positions can be specified in three dimensions, for example within a voxel editor. In such cases, the input device 100 may be used to specify both in-plane components and out-of-plane components of the three-dimensional positions.
  • the feedback circuitry may be configured to change the activation states of the indicator elements in dependence on a specified out-of-plane component. For example, as a user selects different out-of-plane components of the position (corresponding to different layers or slices of the three-dimensional space in which objects can be placed), the indicator elements may be used to represent objects positioned in the currently-selected layer, and/or those in the layer below the currently selected layer. In this way, the input device 100 assists the user to select positions relative to an existing modular structure within the GUI, whilst also providing information about internal portions of the modular structure which are otherwise hidden from view.
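  • One way to realise the layer display described above is to light occupied cells of the selected layer at full brightness and cells occupied only in the layer beneath dimly. A sketch; the dimming scheme and the data layout are assumptions for illustration:

```python
from typing import Dict, Tuple

Voxel = Tuple[int, int, int]  # (row, col, layer)

def layer_indicator_states(voxels: Dict[Voxel, str],
                           selected_layer: int) -> Dict[Tuple[int, int], dict]:
    """Indicator states for one selected layer of a voxel colour map:
    full brightness for the selected layer, dim for the layer below,
    hinting at structure that is otherwise hidden from view."""
    states: Dict[Tuple[int, int], dict] = {}
    for (row, col, layer), colour in voxels.items():
        if layer == selected_layer:
            states[(row, col)] = {"colour": colour, "brightness": 1.0}
        elif layer == selected_layer - 1 and (row, col) not in states:
            states[(row, col)] = {"colour": colour, "brightness": 0.3}
    return states

states = layer_indicator_states({(0, 0, 0): "red", (0, 0, 1): "blue"}, selected_layer=1)
assert states[(0, 0)] == {"colour": "blue", "brightness": 1.0}
```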
  • the computing device 200 generates a GUI in which an object can be positioned in any of a discrete set of three-dimensional positions, where the in-plane components of the positions correspond to the positions of the projections 104 as described with reference to the examples above.
  • a set of projections can be reserved for specifying the out-of-plane component of the position.
  • In this example, the reserved set of projections 104 consists of those disposed in a column 602 of projections 104.
  • the activation states of the indicator elements within the column 602 may optionally be changed to indicate that the column 602 is reserved for this purpose.
  • In order to specify a position in three dimensions, in this example a user first provides input at one of the projections 104 in the column 602 to select the out-of-plane component of the position. The user then provides input at one or more further projections 104 to specify the in-plane components of the position (as described with reference to any of the examples above).
  • Figures 6 and 7 show an example of a block 610 being moved within the GUI of the computing device 200 when a user provides input at two different projections 104 in the column 602 whilst specifying the same in-plane components. It is observed that the block 610 moves between positions of a column extending perpendicular to the base board 206.
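  • Decoding touches under this scheme is a small state machine: a touch in the reserved column updates the selected layer, and any other touch yields a full three-dimensional position. A sketch with an assumed column index and invented names:

```python
RESERVED_COL = 0  # assume the reserved column 602 maps to grid column 0

class PositionDecoder:
    """Interpret touches when one column is reserved for the out-of-plane
    component of a three-dimensional position."""

    def __init__(self):
        self.layer = 0  # default out-of-plane component until one is chosen

    def on_touch(self, row: int, col: int):
        if col == RESERVED_COL:
            self.layer = row   # row within the reserved column encodes height
            return None        # no complete position yet
        return (row, col, self.layer)

decoder = PositionDecoder()
decoder.on_touch(3, 0)             # select layer 3 via the reserved column
assert decoder.on_touch(5, 7) == (5, 7, 3)
```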
  • an input device may include additional or alternative means for specifying an out-of-plane component of the position, for example a slider, a scroll wheel, or a set of additional touch sensors or buttons each corresponding to a selectable value of the out-of-plane component.
  • objects are selectable using the GUI of the computing device 200. Whilst interacting with the GUI, a user may, for example, indicate that he or she wishes to select a new object, for example by providing input at the input device 100 or another input device connected to the computing device 200. The computing device 200 may then present a selection of objects in the GUI. The selection of objects may be stored at the computing device 200 and updated via a network interface of the computing device 200, or alternatively may be stored remotely and downloaded to the computing device 200 in real-time as the user interacts with the GUI.
  • a selection of objects may be presented in a menu, for example a menu with a hierarchical format in which different categories of objects are grouped together.
  • the GUI may depict the selectable objects, and optionally one or more selectable characteristics for the selectable objects, on an interior surface of a three-dimensional structure.
  • An example of a user interface in which objects are presented on a curved interior surface of a hollow three-dimensional structure is discussed in international patent publication WO 2011/151367 A1.
  • the base board 206 and/or any objects already connected to the base board 206 may be displayed simultaneously with the interior surface of the three-dimensional structure, such that the three-dimensional structure appears to partially or completely surround the base board 206, as shown for example in Figure 8.
  • four selectable blocks 810, 812, 814, 816 having different dimensions to one another are presented on a curved interior surface of a three-dimensional structure surrounding the base board 206.
  • the user is able to rotate the base board 206 and the three-dimensional structure by rotating the input device 100, providing a convenient and intuitive way for the user to view different objects on the surface.
  • the user is also provided with controls for selecting and navigating between the selectable objects.
  • In this case, certain projections 104 serve as these controls, and the feedback circuitry changes the activation states of the corresponding indicator elements to indicate that those projections 104 may be used for selecting and navigating.
  • the three-dimensional structure and selectable objects may optionally be hidden until the next time the user wishes to select a new object.
  • Whilst the input device 100 can be used both for selecting and positioning objects, a further input device such as a mouse or trackpad may alternatively be used for selecting objects.
  • the computing device 200 may animate the modular structure in the user interface, providing a rewarding experience for the user.
  • the computing device 200 may further transmit data to the input device 100 including instructions to change the activation states of one or more of the indicator elements, so as to further enhance the interactive experience.
  • the app is configured to navigate the user to a webpage for buying a set of physical blocks corresponding to the object models used in building the modular structure.
  • A physical structure, unlike the virtual model, must support its own weight; the app may therefore include a gravity simulator which can be switched on to test the structural integrity of the modular structure in the GUI. In this way, a user is able to build a model of a modular structure conveniently in a gravity-free environment, then test how a corresponding real structure would behave when subjected to a gravitational field.
  • the input sensors of the input device 100 are operable to detect tactile input at the projections 104.
  • the input device may additionally be capable of detecting when a detachable module is connected to one or more projections of the input device.
  • an input device 900 includes features corresponding to those of the input device 100 described above, but in this example the projections 904 include means for detecting when a detachable module such as the module 908 is attached to one or more of the projections 904.
  • the module 908 is a block-shaped module including a set of recesses arranged to receive the projections 904 of the input device 900, and further including a set of projections 910 to which other detachable modules may be attached.
  • the input device 900 and the module 908 contain circuitry including inductive elements located within the projections 904 and within the portions of the module 908 arranged to connect to the projections 904.
  • the circuitry of the detachable module 908 is logical circuitry which allows the input device 900 to determine characteristics of the module 908 when the module 908 is connected to the input device 900.
  • the module 908 may, for example, include memory circuitry storing an identification code for uniquely identifying the module 908.
  • the input device 900 may then transmit a signal to the computing device 200 indicating the characteristics of the module 908 in addition to the position and orientation of the module 908 on the input device 900.
  • the computing device 200 is arranged to depict a block 912 with characteristics corresponding to those of the module 908 in a position and orientation on the base board 206 corresponding to the position and orientation of the module 908 on the input device 900.
  • the circuitry of the module 908 is passive, and receives power from the input device 900 via inductive coupling.
  • the detachable module 908 may include active circuitry, in which case the module 908 would include a battery.
  • the battery of a detachable module may be charged using inductive charging when connected to the input device 900.
  • the input device 900 may include one or more near-field communication (NFC) transceivers for communicating with a radio frequency identification (RFID) tag of a detachable module such as module 908.
  • In some examples, a detachable module such as the module 908 and a corresponding set of projections of an input device such as the input device 900 include conducting elements arranged to be in contact when the detachable module is connected to the input device, forming a circuit for communication between the detachable module and the input device.
  • the input device 900 may utilize other types of input sensor, for example light sensors based on light dependent resistors (LDRs) for determining when one or more projections are covered by a detachable module.
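  • Whatever the sensing mechanism (inductive coupling, NFC/RFID, conductive contacts or light sensors), attachment detection can be reported to the computing device as a single event carrying the module's identity and the covered projections. The event shape below is an assumption made for illustration:

```python
from dataclasses import dataclass
from typing import FrozenSet, Tuple

@dataclass(frozen=True)
class ModuleAttached:
    """Event sent upstream when a detachable module is detected."""
    module_id: str                           # e.g. an ID code read from the module's memory
    projections: FrozenSet[Tuple[int, int]]  # projections the module covers
    characteristics: Tuple[Tuple[str, str], ...] = ()  # e.g. (("size", "2x2"),)

event = ModuleAttached(
    module_id="module-908",
    projections=frozenset({(0, 0), (0, 1), (1, 0), (1, 1)}),
    characteristics=(("size", "2x2"), ("colour", "yellow")),
)
```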
  • the module 908 is connected to a further detachable module 1008.
  • the modules 908 and 1008 each include circuitry to allow the input device 900 to determine an arrangement of the modules 908, 1008 attached to the input device 900.
  • the input device 900 is arranged to send a signal to the computing device 200 indicating the determined arrangement of the modules 908, 1008 attached to the input device.
  • the GUI depicts two blocks 912, 914 with characteristics corresponding to those of the modules 908, 1008 in an arrangement on the base board 206 corresponding to the arrangement of the modules 908, 1008 on the input device 900. Further modules may be added in the same way.
  • Physical modules attachable to an input device may be capable of performing various functions, for example receiving user input via switches, sliders, knobs, or other types of input sensor, providing the input device with additional functionality for controlling aspects of a computing device.
  • a physical module may include means for detecting tactile user input (for example one or more surface capacitive sensors, projected capacitive sensors and/or resistive touch sensors) and means for transmitting data to the input device indicative of the detected touch input.
  • the input device may then be arranged to convey data to the computing device indicative of the detected touch input.
  • the detachable module effectively becomes an extension of the input device, with additional capabilities for receiving user input.
  • the module may be capable of determining a location on a surface of the module at which user input is received.
  • FIG 11 shows an example of a module 1108 attached to an input device 1100.
  • the module 1108 is elongate with a major axis in the direction of the arrow AB, and has a substantially flat upper surface 1112 housing a surface capacitive sensor capable of detecting and locating touch input on the upper surface 1112.
  • the module 1108 When connected to the input device 1100, the module 1108 functions as a slider on which a user can provide input at any position, extending the functionality of the input device 1100 to receive user input in a continuous domain.
  • sliding a finger along the upper surface 1112 of the module 1108 along the direction of the arrow AB may cause continuously-varying attributes or aspects of objects appearing in the user interface displayed on the display 202, such as colour, to change.
  • the module 1108 may be used to move objects in a continuous manner, for example to scroll through selectable blocks or to rotate the user interface.
  • a module may be arranged to determine a location of user input in two dimensions, providing the functionality of a touch pad. By connecting further modules, the functionality of the input device can be customised depending on the application. The aspects of the computing system controlled by a given detachable module may be made dependent upon where the module is attached to the input device.
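  • Mapping a slider module's continuous position to a continuously varying attribute is a normalisation followed by an interpolation. The sketch below sweeps colour hue along the module's major axis; the choice of hue as the controlled attribute is just one of the possibilities mentioned above:

```python
import colorsys

def slider_to_colour(position: float, length: float) -> tuple:
    """Map a touch position along the slider module's axis (A to B) to an
    RGB colour by sweeping hue across the full range."""
    t = max(0.0, min(1.0, position / length))  # normalise to [0, 1]
    r, g, b = colorsys.hsv_to_rgb(t, 1.0, 1.0)
    return (int(r * 255), int(g * 255), int(b * 255))

assert slider_to_colour(0.0, 10.0) == (255, 0, 0)    # red at end A
assert slider_to_colour(5.0, 10.0) == (0, 255, 255)  # cyan at the midpoint
```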
  • Physical modules may be provided with indicator elements in the form of motors, lights, or audio signalling elements such as loudspeakers, all of which may be activated by feedback circuitry of the input device.
  • an input device as described above, in combination with one or more detachable modules, provides a flexible and convenient means for a user to interact with a computing device.
  • functional modules may be provided for controlling different software-defined synthesiser components. By connecting different configurations of functional modules to the input device, different configurations of modular synthesisers may be created and controlled.
  • an input device and computing device can be components of a single device, for example a portable device which may then be connected to a display.
  • input devices described herein may similarly be used to arrange or otherwise interact with various other types of object, for example irregularly-shaped objects and/or objects with various types of functionality.
  • an input device of the type described above may be used to interact with objects in two dimensions, for example where a GUI is used to depict a grid-based game such as chess, Scrabble® or Go.
  • Whilst the housing 101 of the input device 100 is described as having a monolithic plastic upper surface, alternative constructions are possible.
  • the upper surface may comprise several portions formed of a common material or different materials. All or part of the upper surface may be flexible or otherwise moveable relative to other portions of the upper surface.
  • at least some of the projections 104 may be operable as push buttons for receiving user input.
  • At least part of the housing 101 may be transparent or translucent in order to allow the user to see the activation states of LEDs disposed within the housing.
  • the housing 101 may be formed of any suitable material or combination of materials, such as plastics, polymers, metals or composite materials. Depending on the material used, the housing 101 may be moulded, 3D printed, or constructed using any other suitable manufacturing process.

Abstract

An input device for a computing system comprises: a surface member comprising a substantially planar portion and an array of projections extending from the substantially planar portion in a direction perpendicular to the substantially planar portion; respective input sensors disposed on or in at least some of the projections and operable to detect user input by human touch; one or more indicator elements, the or each indicator element having a plurality of possible activation states; communication means arranged to transmit, in response to receiving first user input by human touch at one or more of the input sensors, first data to the computing system, the first data indicating said one or more input sensors; and feedback circuitry arranged to change an activation state of at least one of the indicator elements in dependence on receiving second data from the computing system via the communication means.
PCT/EP2021/071351 2020-07-29 2021-07-29 Input device WO2022023506A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB2011752.9A GB2597918B (en) 2020-07-29 2020-07-29 Input device
GB2011752.9 2020-07-29

Publications (1)

Publication Number Publication Date
WO2022023506A1 (fr)

Family

Family ID: 72339320

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/071351 WO2022023506A1 (fr) Input device

Country Status (2)

Country Link
GB (1) GB2597918B (fr)
WO (1) WO2022023506A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1271415A1 * 2001-06-20 2003-01-02 Gateway, Inc. Virtual representation of parts assembly and content creation
WO2011151367A1 2010-06-01 2011-12-08 Sphere Technology Limited Method, apparatus and system for a graphical user interface
US20160361662A1 (en) * 2012-02-17 2016-12-15 Technologyone, Inc. Interactive lcd display back light and triangulating toy brick baseplate
US20180280822A1 (en) * 2017-03-31 2018-10-04 Intel Corporation Building blocks with lights for guided assembly
CN109491582A (zh) * 2017-09-13 2019-03-19 施政 一种动态用户界面元素系统
US20190094841A1 * 2017-09-26 2019-03-28 International Business Machines Corporation Assembly of a modular structure

Also Published As

Publication number Publication date
GB2597918A (en) 2022-02-16
GB2597918B (en) 2023-08-09
GB202011752D0 (en) 2020-09-09

Similar Documents

Publication Publication Date Title
US11181984B2 (en) Virtual reality input and haptic feedback system
Le Goc et al. Zooids: Building blocks for swarm user interfaces
CN107708820B (zh) Systems, methods and devices for foot-controlled motion and motion control in virtual reality and simulated environments
CN103092406B (zh) System and method for multi-pressure interaction on touch-sensitive surfaces
US8325138B2 (en) Wireless hand-held electronic device for manipulating an object on a display
Sugiura et al. Detecting shape deformation of soft objects using directional photoreflectivity measurement
RU2475290C1 (ru) Device for games
US20190391647A1 (en) Real-world haptic interactions for a virtual reality user
EP3209401B1 (fr) Toy construction system and method for a spatial structure to be detected by an electronic device comprising a touch screen
CN107427719B (zh) Toy system comprising toy elements detectable by a computing device
EP3364272A1 (fr) Automatic localized haptics generation system
US20140002390A1 (en) Apparatus and method for user input
JP2015506807A (ja) Baseplate assembly for use with toy pieces
GB2533314A (en) Modular robotic system
EP3727623B1 (fr) Game system and method for detecting toys
Gallotti et al. v-Glove: A 3D virtual touch interface
Parilusyan et al. Sensurfaces: A novel approach for embedded touch sensing on everyday surfaces
US20220083138A1 (en) Virtual Reality Input and Haptic Feedback System
WO2016037978A1 (fr) Method for establishing a functional relationship between input and output functions
CN105992993A (zh) User interface
US11498014B1 (en) Configurable devices
WO2022023506A1 (fr) Input device
Jeong et al. iSIG-Blocks: interactive creation blocks for tangible geometric games
US20200166990A1 (en) Device and methodology for the interaction through gestures and movements of human limbs and fingers
EP3073355A1 (fr) System and kit for transmitting a signal from an irregular surface of a structure to a detector registering an electrical impulse, and method for transmitting a signal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21755915

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21755915

Country of ref document: EP

Kind code of ref document: A1