GB2597918A - Input device - Google Patents

Input device

Info

Publication number
GB2597918A
GB2597918A
Authority
GB
United Kingdom
Prior art keywords
input device
input
projections
user
computing device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB2011752.9A
Other versions
GB202011752D0 (en)
GB2597918B (en)
Inventor
Treadway Oliver
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sphere Research Ltd
Original Assignee
Sphere Research Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sphere Research Ltd filed Critical Sphere Research Ltd
Priority to GB2011752.9A priority Critical patent/GB2597918B/en
Publication of GB202011752D0 publication Critical patent/GB202011752D0/en
Priority to PCT/EP2021/071351 priority patent/WO2022023506A1/en
Publication of GB2597918A publication Critical patent/GB2597918A/en
Application granted granted Critical
Publication of GB2597918B publication Critical patent/GB2597918B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • A63F13/26 Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
    • A63F13/69 Generating or modifying game content by enabling or updating specific game elements, e.g. unlocking hidden features, items, levels or versions
    • A63H33/04 Building blocks, strips, or similar building parts
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044 Digitisers characterised by capacitive transducing means
    • G06F3/045 Digitisers using resistive elements, e.g. a single continuous surface or two parallel surfaces put in contact
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0487 Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements
    • G06F3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F2203/04105 Pressure sensors for measuring the pressure or force exerted on the touch surface without providing the touch position

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

An input device for a computing system includes: a surface member comprising a substantially planar portion and an array of projections extending from the substantially planar portion in a direction perpendicular to the substantially planar portion; respective input sensors disposed on or in at least some of the projections and operable to detect user input; one or more indicator elements, the or each indicator element having a plurality of possible activation states; communication means arranged to transmit, in response to receiving the user input at said one or more of the input sensors, first data to the computing system, the first data indicating said one or more of the input sensors; and feedback circuitry arranged to change an activation state of at least one of the indicator elements in dependence on receiving second data from the computing system via the communication means.

Description

INPUT DEVICE
Technical Field
Embodiments disclosed herein relate to an input device for a computing system.
Background
Input devices facilitate interaction between a human user and an electronic device such as a computer or a smart TV. Various types of input device, such as computer mice, joysticks and trackpads, provide means for a user to navigate between objects or regions presented in a graphical user interface (GUI) of an electronic device, for example by controlling a cursor visible within the GUI. Conventional input devices, while versatile, are not optimal for interacting with certain types of GUI. In particular, most input devices are designed to facilitate continuous navigation between objects or regions in a two-dimensional scene, and are not well-suited to applications where a user is required to navigate between objects or regions in a three-dimensional scene, and/or in which objects can be positioned only at a discrete set of positions. In such applications, it can be difficult for a user to associate input provided at the input device with locations in the GUI, meaning that such applications typically require a high level of skill and training and, in particular, are not suitable for all users, for example children. Examples of applications in which a user is required to navigate between objects in a three-dimensional scene and in which objects appear at a discrete set of positions include voxel editors for creating or manipulating voxel-based structures for video games or other applications, and applications in which a user builds or modifies modular structures within a GUI, for example a structure formed of object models representing Lego® blocks or Duplo® blocks, either for education or entertainment purposes.
Summary
According to a first aspect of the present disclosure, there is provided an input device for a computing system. The input device includes a monolithic surface member comprising a substantially planar portion and an array of projections extending from the substantially planar portion in a direction perpendicular to the substantially planar portion, with respective input sensors disposed on or in at least some of the projections and operable to detect user input. The input device further includes one or more indicator elements, the or each indicator element having a plurality of possible activation states, and communication means arranged to transmit, in response to receiving the user input at said one or more of the input sensors, first data to the computing system, the first data indicating said one or more of the input sensors. The input device further comprises feedback circuitry arranged to change an activation state of at least one of the indicator elements in dependence on receiving second data from the computing system via the communication means.
Providing the input device with one or more indicator elements can assist a user, such as a child, to make a cognitive connection between input provided at the input device and actions performed within a GUI of the computing device. For example, the indicator elements may include respective light sources disposed on or in at least some of the projections of the input device. By activating the light sources in different configurations, feedback can be provided to relate the projections of the input device to respective positions in the GUI, helping the user to understand how to provide input to control what is displayed in the GUI.
The first data may be transmitted to the computing system in one or more signals, and may further indicate additional characteristics of the received user input, for example relating to duration or pressure where the input sensor is a touch or pressure sensor, or characteristics of one or more removable modules attached to the projections. The second data may be received from the computing system in one or more signals, and may either include an instruction to change the state of the at least one indicator element, or information for configuring the feedback circuitry to activate the one or at least one indicator element in response to receiving the user input at the one or more input elements.
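As a minimal sketch of how the first and second data described above might be structured (all class and field names here are assumptions for illustration, not part of the disclosed design):

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple

GridPos = Tuple[int, int]  # (row, column) index of a projection

@dataclass
class FirstData:
    """Sent from the input device to the computing system when input is detected."""
    sensor_ids: List[GridPos]                  # which projections were touched
    duration_ms: Optional[int] = None          # optional press duration
    pressure: Optional[float] = None           # optional pressure level, if sensed
    module_ids: List[str] = field(default_factory=list)  # attached removable modules

@dataclass
class SecondData:
    """Sent from the computing system, either as a direct instruction or as
    configuration data for the feedback circuitry."""
    indicator_states: Dict[GridPos, str]       # indicator -> activation state

first = FirstData(sensor_ids=[(2, 3)], duration_ms=120)
second = SecondData(indicator_states={(2, 3): "red"})
```

The optional fields mirror the text: duration and pressure apply only where the input sensor supports them, and the module list applies only when removable modules are attached.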
According to a second aspect of the disclosure, there is provided apparatus for providing input to a computing device. The apparatus includes an input device as described above, and further includes a set of modules removably attachable to the discrete input elements of the input device and removably attachable to one another.
The input device and the modules include detection circuitry to allow the input device to determine an arrangement of the modules when the modules are attached to at least some of the projections of the input device. The first data transmitted to the computing device conveys information indicating the determined arrangement of the modules attached to the input device. The computing device may be arranged to depict a model of the resulting modular structure in a GUI, where the depicted model either appears identical to the physical structure formed by the removable modules or is altered or enhanced in one or more ways, for example by applying colours or textures, adding lighting, modifying the geometry of certain portions of the depicted model, animating certain portions of the depicted model, and/or displaying the depicted model in a scene along with one or more further object models.
In accordance with a third aspect of the disclosure, there is provided a system comprising a computing device arranged to generate a user interface configured to depict a plurality of positions in which to place a plurality of object models, and an input device as described above, wherein each of the projections corresponds to a respective one or more of said positions depicted in the user interface. In response to receiving the signal from the input device indicating said one or more of the input sensors, the computing device is arranged to cause the user interface to depict a selected object model of the plurality of object models in a position corresponding to the projection on/in which said one or more input sensors are disposed.
In some examples, the input device is used to facilitate building of a modular structure depicted within the user interface. Once the modular structure has been completed, the computing device may bestow additional functionality on the depicted model, for example by animating or otherwise altering the model, providing a rewarding experience for the user. Additionally, the computing device may be arranged to transmit signals to the input device to activate the feedback circuitry of the input device, causing the indicator elements to further enhance the interactive experience.
Further features and advantages of the disclosure will become apparent from the following description of preferred embodiments of the disclosure, given by way of example only, which is made with reference to the accompanying drawings.
Brief Description of the Drawings
Figure 1 shows an input device in perspective view; Figure 2 shows a system comprising a computing device connected to a display and the input device of Figure 1; Figures 3 to 7 show further configurations of the system of Figure 2; Figure 8 shows an example of a user interface; Figure 9 shows the system of Figure 2 and a detachable module; Figure 10 shows the system of Figure 2 and two detachable modules.
Detailed Description
Embodiments of the present disclosure relate to an input device for controlling a computing device. In particular, embodiments described herein enable a user to interact with a user interface of a computing device in a manner that is sufficiently intuitive to be suitable for a wide range of users, including children.
Figure 1 shows an example of an input device 100. The input device 100 has an upper surface, preferably formed of a single monolithic sheet of plastic of substantially rigid construction. The upper surface forms part of a housing 101 containing active elements of the input device, including a battery and various other electrical components as will be described in more detail hereafter. By constructing the upper surface of the input device 100 from a single sheet of plastic, the input device 100 is durable and resistant to liquids, dust and dirt which might otherwise fall into gaps in the housing of a user device. The upper surface comprises a substantially planar portion 102 and an array of projections 104 projecting from the substantially planar portion 102. In this example, the projections 104 are arranged in a regular grid with equal grid spacings in two perpendicular dimensions. Although in Figure 1 the grid is shown as a 6x6 grid for convenience, the input device 100 may have a different number and/or configuration of projections 104, for example arranged in a 48x48 grid, a 60x60 grid, a 48x96 grid, or any other suitable size or shape of grid. In other examples, an input device may include projections arranged on a different type of grid, for example a curvilinear grid or an isometric grid formed of equilateral triangles. The projections 104 are shaped as cylindrical studs, though in other examples projections may be shaped differently, for example as domes, cones, frustums, rods or studs of any cross-sectional shape. Different projections may be identical to one another or may be shaped differently from one another, for example to provide a user with a tactile indication of different regions of the input device.
At least some, and preferably each, of the projections 104 includes a respective input sensor for detecting user input at said projection 104. In this example, the input sensor for each projection 104 is a surface capacitive sensor for detecting when a user touches an upper surface of the projection 104, for example with a finger or thumb. In other examples, alternative types of tactile sensor or contact sensor may be provided, for example projected capacitive sensors or resistive touch sensors. Certain types of tactile sensor may be capable of detecting different levels of pressure applied to the projections. In some examples, an input device may additionally or alternatively include other types of input sensor, such as light sensors or conductive sensors, for detecting other types of user input as will be described in more detail hereinafter. At least some, and preferably each, of the projections 104 includes an indicator element in the form of a light source comprising one or more light emitting diodes (LEDs).
The indicator element for each of the projections 104 can be in any of several possible activation states. In this example, one of the activation states corresponds to the indicator element being inactive (off), and a set of further activation states each correspond to the indicator element being active (on) in a particular colour state. In other examples, an indicator element may have only two activation states, "on" and "off". In some examples, an indicator element may have activation states corresponding to different brightness levels. In this example, the upper surface of each of the projections 104 is translucent, and the indicator element is located inside the projection 104 behind the surface capacitive sensor, such that the activation state of the indicator element is visible at least through the upper surface of the projection 104. The surface capacitive sensor is arranged to detect user input directly over the indicator element. In other examples, an input device may include other types of indicator element in addition to, or as an alternative to, a set of light sources. For example, an input device may include one or more vibrational motors for providing haptic feedback, or one or more audio signalling elements, such as a buzzer or loudspeaker, for providing audible feedback. In this way, an input device may be suitable for blind or partially-sighted users. For example, an input device may include vibrational motors at multiple locations in the input device in order to provide location-specific haptic feedback.
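The activation states described above can be modelled as a simple enumeration (a sketch only; the specific colour states listed are assumptions):

```python
from enum import Enum

class ActivationState(Enum):
    """Illustrative activation states for an indicator element: one inactive
    state plus a set of active colour states."""
    OFF = "off"
    RED = "red"
    GREEN = "green"
    BLUE = "blue"
    WHITE = "white"

# An indicator with only two activation states would use just OFF and a single
# active member; brightness levels could be modelled as further members.
state = ActivationState.OFF
```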
The input device 100 is arranged to communicate with a computing device via a communication module 106. The communication module 106 in this example includes a radio modem and transceiver for communicating directly or indirectly with the computing device using wireless signals in accordance with a suitable wireless communications standard, for example Bluetooth® or Wi-Fi. In other examples, wired connection means may be provided for communicating with a computing device. A wired connection means may further be used to power the input device 100 and/or charge a battery of the input device 100, either via the computing device 200 or directly from a power supply.
The input device 100 includes feedback circuitry, which includes logical circuitry that, in this example, is embedded within a printed circuit board located beneath the indicator elements and input sensors of the input device 100. The feedback circuitry is configured to change the activation states of the indicator elements in response to user input as described above and/or signals received via the communication module 106. The feedback circuitry may be configured to change the activation states of one or more of the indicator elements in response to receiving user input at a corresponding one or more of the projections 104. The indicator elements may return to their previous activation states after a certain period of time or alternatively may remain in their changed activation states until further user input is received or until a signal is received via the communications module 106. The feedback circuitry may be configured in response to data received via the communication module 106, and different configurations of the feedback circuitry may result in different sets of indicator elements being activated in response to user input. For example, the feedback circuitry may be configured such that several indicator elements are activated in response to user input at only one of the projections 104.
Furthermore, the feedback circuitry may be configured to change the activation states of one or more of the indicator elements in response to data received via the communications module 106, irrespective of whether user input is received. Examples of possible configurations of the feedback circuitry will be described in more detail hereafter.
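The locally configured behaviour described above can be sketched as follows, assuming the computing system pre-loads a block "footprint" and colour so that the indicator elements respond to a touch without a further round trip (the class and method names are illustrative, not the patented circuit design):

```python
class FeedbackCircuitry:
    """Sketch of locally configured feedback (assumed behaviour).

    The computing system pre-loads a footprint and a colour; on a touch, every
    indicator covered by the footprint changes state with no further signalling.
    """
    def __init__(self, grid_size):
        self.grid_size = grid_size
        self.footprint = [(0, 0)]   # offsets activated relative to the touched stud
        self.colour = "white"
        self.states = {}            # (row, col) -> colour; absent means off

    def configure(self, footprint, colour):
        """Apply configuration received from the computing system."""
        self.footprint, self.colour = footprint, colour

    def on_touch(self, row, col):
        """Activate the configured footprint anchored at the touched stud."""
        rows, cols = self.grid_size
        for dr, dc in self.footprint:
            r, c = row + dr, col + dc
            if 0 <= r < rows and 0 <= c < cols:  # clip to the grid
                self.states[(r, c)] = self.colour
        return sorted(self.states)

fb = FeedbackCircuitry((6, 6))
fb.configure([(0, 0), (0, 1), (0, 2), (0, 3)], "red")  # a 1x4 block footprint
lit = fb.on_touch(2, 1)
# studs (2,1) to (2,4) now show red
```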
Figure 2 shows an example in which the input device 100 is used to control a computing device 200. The computing device 200 in this example is integral to a smart TV which further includes a display 202. A smart TV is a television with internet connectivity and processing circuitry capable of executing software applications ("apps"). In other examples, the input device 100 could be used to control another type of computing device such as an iPad™, a laptop computer or a desktop computer. The computing device 200 includes a communication module 204, such that the respective communication modules 106, 204 of the input device 100 and the computing device 200 enable wireless communication between the input device 100 and the computing device 200.
In the example of Figure 2, the computing device 200 is configured with an app that causes the computing device 200 to generate a GUI on the display 202, where the GUI depicts a three-dimensional space in which one or more selectable object models (referred to hereinafter as objects) may be rendered. In this example, the GUI depicts a base board 206, which in this example is a substantially planar object from which an array of cylindrical studs 208 extend in a regular array, with axes perpendicular to the base board 206. The studs 208 are depicted as having a configuration corresponding to that of the projections 104 of the input device 100. In order to achieve this correspondence, the input device 100 in this example is arranged to transmit identification information to the computing device 200 during a setup process, from which the computing device 200 can derive the configuration of the grid of the projections 104 of the input device 100.
Each of the studs 208 defines a position at which an object can be connected to the base board 206. In the example shown in Figure 2, a substantially cuboid block 210 is shown as being connected to four mutually adjacent studs 208. The block 210 includes four further cylindrical studs 212 extending in the same direction as the studs 208 of the base board 206, enabling further objects to be connected to the block 210.
By positioning multiple objects in the GUI in such a configuration that the objects are connected to one another and/or to the base board 206, a user is able to build a modular structure for rendering in real time within the GUI. The set of possible positions at which an object can be placed is finite and is determined by the positions of any exposed studs to which an object has not already been connected.
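The finite set of placement positions can be illustrated with a simple model in which each placed block covers some studs and exposes new ones on its top face (the coordinate scheme and function name are assumptions for illustration):

```python
def placement_positions(board_studs, placed_blocks):
    """Return the finite set of positions where a new object may attach: studs
    not yet covered by a block, plus studs exposed on top of placed blocks.

    `placed_blocks` is a list of (covered, provided) stud sets per block.
    """
    exposed = set(board_studs)
    for covered, provided in placed_blocks:
        exposed -= covered   # studs now hidden beneath the block
        exposed |= provided  # studs newly exposed on the block's top face
    return exposed

# A 6x6 base board at height 0, with one 2x2 block placed in its corner.
board = {(r, c, 0) for r in range(6) for c in range(6)}
block = ({(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 0)},   # studs it covers
         {(0, 0, 1), (0, 1, 1), (1, 0, 1), (1, 1, 1)})   # studs it exposes
positions = placement_positions(board, [block])
# 36 board studs - 4 covered + 4 newly exposed = 36 positions
```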
In the example of Figure 2, the selectable objects are three-dimensional computer-aided design (CAD) models of Lego® pieces, which include blocks with varying characteristics such as size, colour, and shape, as well as other pieces which are connectable to the blocks and/or one another using regularly-spaced cylindrical studs. In other examples, a user interface may depict other types of modular structure, for example structures formed of Duplo® pieces or structures formed of functional modules such as software-simulated synthesiser modules for music generation.
In the example of Figure 2, a user first selects the block 210 from a set of selectable objects, and further selects an orientation and one or more characteristics for the block 210, for example a colour and/or surface texture for the block 210. An example of how objects can be selected will be discussed in more detail hereinafter.
Once the block 210 has been selected, the computing device 200 transmits data to the input device 100 via the communication modules 106, 204 indicating characteristics of the block 210. In this example, the characteristics of the block 210 include dimensions of the block 210 and a colour of the block 210.
Upon receiving the data from the computing device 200, the input device 100 configures the feedback circuitry in dependence on the characteristics of the block 210, so that the feedback circuitry is ready to activate an appropriate set of indicator elements in response to receiving user input. After configuring the feedback circuitry, the input device 100 may receive user input at a projection 104a. In response to receiving the user input, the feedback circuitry changes the activation state of the indicator element of the projection 104a, and further changes the activation states of the indicator elements of three further projections 104b, 104c, 104d arranged in a row with the projection 104a. In this example, the three further projections 104b, 104c, 104d are determined in accordance with the dimensions of the block 210 and the selected orientation of the block 210. The feedback circuitry is configured such that the indicator elements of the projections 104a-d display a colour resembling the selected colour of the block 210. In response to receiving the user input mentioned above, the input device 100 further transmits data to the computing device 200 via the communication modules 106, 204, indicating the projection 104a at which the user input was received and, optionally, indicating further characteristics of the user input such as pressure detected by the input elements and the duration for which the pressure is applied. The computing device 200 updates the GUI to show the block 210 connected to the studs 208 of the base board 206 corresponding to the projections 104a-d of the input device 100.
Although in the example described above, the orientation and dimensions of the block 210 are effectively selected using the GUI of the computing device, in other examples the dimensions and/or the orientation may be determined in accordance with the user input received at the input device 100. For example, a user may select a certain height and colour of block, but without specifying the in-plane dimensions or orientation of the block. The user may subsequently provide input at projections 104a and 104d of the input device 100, for example by touching or pressing projections 104a and 104d, from which the feedback circuitry determines that the indicator elements at projections 104a-104d should be activated in a colour state representing the selected block colour. The input device 100 then transmits data to the computing device 200 indicating the projections 104a and 104d (and, optionally, the intervening projections 104b, 104c), causing the computing device 200 to depict the block 210 connected to corresponding studs on the base board 206. In another example, a user may provide input at projections 104a and 104h shown in Figure 3, for example by touching or pressing projections 104a and 104h, causing the feedback circuitry to determine that the indicator elements at projections 104a-104h should be activated. The input device may then transmit data to the computing device 200 indicating the projections 104a and 104h (and, optionally, the intervening projections 104b-104g), causing the computing device 200 to depict a block 310 connected to corresponding studs on the base board 206.
As described above, the input device 100 is arranged to receive data from the computing device 200 indicating characteristics of an object that is selected using the GUI, following which the feedback circuitry of the input device 100 is configured to activate the indicator elements in dependence on the received user input and the characteristics of the selected object. For example, the feedback circuitry may store information relating to the characteristics of the object, allowing the feedback circuitry to activate or otherwise change the activation states of the indicator elements without receiving further signals from the computing device 200. In this way, the feedback circuitry can respond rapidly to user input, irrespective of any processing lag at the computing device 200 or signalling lag between the input device 100 and the computing device 200. For example, whilst the block 210 is selected, the user may move his or her finger over the projections 104, causing the corresponding indicator elements to be activated and deactivated in such a way to give the impression of an object being dragged around the input device 100. At the same time, signals may be sent to the computing device 200, causing the user interface to depict the block 210 being dragged around in the same way.
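The local-caching idea described above can be illustrated with a short sketch. The class and method names below are hypothetical: the point is only that, once configured, the feedback circuitry answers user input from its stored copy of the object's characteristics without a further round trip to the computing device.

```python
# Hypothetical sketch of "store characteristics locally": after the
# computing device sends the selected object's characteristics once, the
# feedback circuitry responds to touches with no host communication.

class FeedbackCircuitry:
    def __init__(self):
        self.selected = None   # cached characteristics of the selected object
        self.states = {}       # projection identifier -> colour state

    def configure(self, characteristics):
        # Called once, when data arrives from the computing device.
        self.selected = characteristics

    def on_touch(self, projections):
        # Called on user input; no host round trip is needed here, so the
        # response is unaffected by processing or signalling lag.
        colour = self.selected["colour"]
        for p in projections:
            self.states[p] = colour
        return [(p, colour) for p in projections]

fb = FeedbackCircuitry()
fb.configure({"colour": "red", "dims": (4, 1)})
updated = fb.on_touch(["104a", "104b", "104c", "104d"])
```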
In order for the input device 100 to be able to interpret signals received from the computing device 200, the input device 100 includes memory circuitry and processing circuitry for storing and executing software associated with the app on the computing device 200. By provisioning the input device 100 with different software, the input device 100 may be configured to behave differently in response to signals received from the computing device 200. In this way, the input device 100 is flexible and can be updated for example when the app on the computing device 200 is updated. Nevertheless, in other examples an input device may be hard wired to interpret signals in accordance with a predetermined specification, which developers of apps or software for computing devices must then adhere to.
In the examples described above, the feedback circuitry is configured to change the activation states of indicator elements directly in response to receiving user input. In an alternative operational mode, the feedback circuitry is configured to change the activation states of one or more indicator elements in response to data received from the computing device 200. For example, the input device 100 may receive user input at one or more of the projections 104, causing the input device 100 to transmit a signal to the computing device 200 indicating those projections 104. In response to receiving the signal, the computing device 200 may update the GUI to depict a new arrangement of objects. The computing device 200 may further transmit data to the input device 100 indicating a new configuration of activation states to be implemented by the feedback circuitry of the input device 100. In this way, the feedback circuitry changes the activation states of the indicator elements in dependence on the user input at the projections 104, but not directly in response to the user input. In this example, the feedback circuitry only needs to implement instructions received from the computing device 200, simplifying the function of the feedback circuitry. Advantageously, the input device 100 is able to operate in either of the operational modes described above, or in a combination of both operational modes, providing flexibility for different applications of the input device 100. In other examples, an input device may operate exclusively in either one of the operational modes.
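The two operational modes can be contrasted in a small sketch. The mode names, the `handle_input` interface and the fake host are all assumptions introduced for illustration; they are not part of the specification.

```python
# Sketch of the two operational modes: in DIRECT mode the device updates
# its indicator states itself as soon as input arrives (and notifies the
# host in parallel); in HOST_DRIVEN mode it forwards the input and applies
# only the configuration the host sends back.

DIRECT, HOST_DRIVEN = "direct", "host_driven"

class InputDevice:
    def __init__(self, mode, host):
        self.mode = mode
        self.host = host       # any object providing handle_input()
        self.states = {}

    def on_input(self, projections):
        if self.mode == DIRECT:
            for p in projections:
                self.states[p] = "on"            # immediate local response
            self.host.handle_input(projections)  # host notified in parallel
        else:
            new_states = self.host.handle_input(projections)
            self.states.update(new_states)       # apply host instructions only

class FakeHost:
    def handle_input(self, projections):
        return {p: "host-on" for p in projections}

dev = InputDevice(HOST_DRIVEN, FakeHost())
dev.on_input(["104a"])
```

A combined mode, as mentioned above, would simply perform both branches: update locally for responsiveness, then reconcile with whatever configuration the host returns.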
Configuring the feedback circuitry to change the activation states of the indicator elements in response to data received from the computing device 200 allows the computing device 200 to cause changes of the activation states irrespective of whether user input is received at the input device 100, allowing for further flexibility in the type of feedback provided by the input device 100. For example, the computing device 200 may send data to the input device 100 indicating one or more possible positions in which an object can or should be placed. The data may comprise a set of instructions for building a predetermined modular structure. The feedback circuitry may then activate one or more corresponding indicator elements to assist the user in selecting and/or positioning an appropriate object.
As mentioned above, the feedback circuitry may be configured such that each of the indicator elements of the input device 100 will remain in a given activation state until further user input is detected and/or until a signal is received from the computing device 200. In this way, the activation states of the indicator elements may reflect an arrangement of objects already positioned in the user interface. In the example of Figure 4, two blocks 410 and 412 are depicted with the larger block 410 connected to the base board 206 and the smaller block 412 connected to the top of the first block 410.
The blocks 410, 412 are different colours, represented in Figure 4 by different orientations of diagonal stripes. In this example, after both of the blocks 410, 412 have been positioned, the indicator elements of the projections 104a, 104b, 104e and 104f are in an activation state corresponding to the colour of the smaller block 412. By contrast, the indicator elements of the projections 104c, 104d, 104g, 104h are in an activation state corresponding to the larger block 410. In this way, the configuration of activation states is indicative of a top-down view of the modular structure depicted in the GUI. In this way, the activation states of the indicator elements represent the positions at which further objects can be connected to the modular structure. In this way, the indicator elements help a user, such as a child, to make a cognitive link between the modular structure depicted in the user interface, and locations of the projections 104 on the input device 100. This, in turn, may assist the user to build or modify the modular structure in an intuitive manner.
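The top-down-view behaviour of Figure 4 can be expressed as a small function. The data layout below (each placed block recording its studs, colour and top height) is an assumption made for the example.

```python
# Sketch of the "top-down view" of Figure 4: the indicator at each
# projection shows the colour of the topmost block occupying that stud.

def top_down_view(blocks):
    """blocks: list of dicts with 'studs' (set), 'colour' and 'top' (height)."""
    view = {}
    best = {}
    for block in blocks:
        for stud in block["studs"]:
            if block["top"] > best.get(stud, -1):
                best[stud] = block["top"]
                view[stud] = block["colour"]
    return view

# Larger block covers eight studs; smaller block sits on top of four of them,
# analogous to blocks 410 and 412.
blocks = [
    {"studs": {"104a", "104b", "104c", "104d",
               "104e", "104f", "104g", "104h"}, "colour": "blue", "top": 1},
    {"studs": {"104a", "104b", "104e", "104f"}, "colour": "yellow", "top": 2},
]
view = top_down_view(blocks)
```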
The input device 100 may include an orientation detector for determining an orientation of the input device 100. The orientation detector may, for example, include a microelectromechanical systems (MEMS) magnetic field sensor employed as an electronic compass and an accelerometer for detecting the orientation of the input device 100 in relation to the gravitational field of the Earth. The relative orientation of the input device 100 with respect to the display 202 can be determined from the outputs of the electronic compass and accelerometer on the basis of a calibration process. In other examples, the orientation detector may include one or more cameras for determining an orientation of the input device 100 with respect to a local environment using simultaneous location and mapping (SLAM), or an infrared transmitter/receiver for determining an orientation with respect to a remote receiver/transmitter module. In other examples, a computing device such as the computing device 200 may include or be connected to one or more further devices for determining an orientation of the input device 100, for example one or more cameras for capturing images of the input device 100, from which the orientation of the input device 100 can be determined using pose determination software.
In the example of Figure 5, the input device 100 is configured to transmit data to the computing device 200 indicating the orientation of the input device 100, and the computing device 200 is arranged to depict the base board 206 in the GUI at an orientation depending on the orientation of the input device 100. The input device 100 may be configured to transmit this orientation data periodically at a sufficiently high frequency that the orientation of the base board 206 appears to respond to any reorientation of the input device 100. Alternatively, the input device 100 may be configured to transmit orientation data only when a change of orientation is detected, resulting in less signalling and accordingly less power use. In general, reorientation of the base board 206 in the GUI is animated to appear as smooth and continuous motion.
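The "transmit only on change" option can be sketched as a simple filter over sensor readings. The threshold value and the heading-only representation are assumptions for illustration; a real implementation would compare full orientations.

```python
# Sketch of change-triggered orientation reporting: a reading is
# transmitted only when it differs from the last transmitted value by more
# than a threshold, reducing signalling and power use.

def changed_readings(readings, threshold=1.0):
    """Return only the heading values (degrees) worth transmitting."""
    sent = []
    last = None
    for heading in readings:
        if last is None or abs(heading - last) > threshold:
            sent.append(heading)
            last = heading
    return sent

# A near-stationary device produces one transmission per settled heading,
# rather than one per sensor poll.
tx = changed_readings([10.0, 10.2, 10.4, 45.0, 45.1, 90.0])
```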
Providing the computing device 200 with information indicative of the orientation of the input device 100 advantageously allows the user to manipulate the GUI by changing the orientation of the input device 100. In the example of building a modular structure, the user can thereby rotate the structure to view different parts of the structure during the building process. Furthermore, orienting the base board 206 in dependence on the orientation of the input device 100 may further assist the user to associate the projections 104 on the input device 100 with locations depicted in the GUI, particularly when combined with the use of indicator elements as described above.
In the examples described above, the GUI is configured such that a user can position objects either directly on the base board 206 or on top of other objects, in positions defined by the cylindrical studs 212 depicted in the GUI. In this way, the positioning of the objects reflects how a corresponding set of physical objects can be connected together, and the resulting modular structure therefore reflects a physical structure insofar as the objects are supported from below, as opposed to "floating" in an unsupported manner. Advantageously, when the input device 100 provides in-plane position components of objects, the out-of-plane component is uniquely determined by the configuration of objects already positioned in the GUI.
In an alternative configuration, a user interface such as the GUI described above may be configured such that a user can position an object in any of a discrete set of positions in three-dimensions, irrespective of whether such positioning results in "unphysical" behaviour such as floating objects or intersecting objects. Such a configuration may provide additional flexibility, and is particularly useful for learning how to build modular structures. More generally, the input device 100 may be used for other applications in which positions can be specified in three dimensions, for example within a voxel editor. In such cases, the input device 100 may be used to specify both in-plane components and out-of-plane components of the three-dimensional positions. In examples where the input device 100 is used to specify out-of-plane components of the position, the feedback circuitry may be configured to change the activation states of the indicator elements in dependence on a specified out-of-plane component. For example, as a user selects different out-of-plane components of the position (corresponding to different layers or slices of the three-dimensional space in which objects can be placed), the indicator elements may be used to represent objects positioned in the currently-selected layer, and/or those in the layer below the currently selected layer. In this way, the input device 100 assists the user to select positions relative to an existing modular structure within the GUI, whilst also providing information about internal portions of the modular structure which are otherwise hidden from view.
In the example of Figures 6 and 7, the computing device 200 generates a GUI in which an object can be positioned in any of a discrete set of three-dimensional positions, where the in-plane components of the positions correspond to the positions of the projections 104 as described with reference to the examples above. A set of projections can be reserved for specifying the out-of-plane component of the position. In Figure 6, this set is the projections 104 disposed in a column 602. The activation states of the indicator elements within the column 602 may optionally be changed to indicate that the column 602 is reserved for this purpose. In order to specify a position in three dimensions, in this example a user first provides input at one of the projections 104 in the column 602 to select the out-of-plane component of the position. The user then provides input at one or more further projections 104 to specify the in-plane components of the position (as described with reference to any of the examples described above). Figures 6 and 7 show an example of a block 610 being moved within the GUI of the computing device 200 when a user provides input at two different projections 104 in the column 602 whilst specifying the same in-plane components. It is observed that the block 610 moves between positions of a column extending perpendicular to the base board 206.
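The reserved-column scheme can be sketched as follows. The column index, the touch ordering and the coordinate encoding are all assumptions introduced for the example, not details taken from the specification.

```python
# Sketch of the Figure 6/7 scheme: a press in the reserved column selects
# the out-of-plane layer; a subsequent press elsewhere gives the in-plane
# components, yielding a full three-dimensional grid position.

LAYER_COLUMN = 0   # assumed index of the reserved column 602

def resolve_position(touches):
    """touches: ordered list of (row, col) presses; returns (row, col, layer)."""
    layer = 0
    for row, col in touches:
        if col == LAYER_COLUMN:
            layer = row        # a press in the reserved column sets the layer
        else:
            return (row, col, layer)
    return None

# Select layer 2 via the reserved column, then place at in-plane (3, 4).
pos = resolve_position([(2, 0), (3, 4)])
```

Pressing a different projection in the reserved column before the same in-plane press would yield the same in-plane components at a different layer, corresponding to the block 610 moving along a column perpendicular to the base board 206.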
As explained above, in the example of Figures 6 and 7, a set of projections, which is a subset of the projections 104, is used for specifying out-of-plane components of position. In other examples, an input device may include additional or alternative means for specifying an out-of-plane component of the position, for example a slider, a scroll wheel, or a set of additional touch sensors or buttons each corresponding to a selectable value of the out-of-plane component.
In the examples described above, objects are selectable using the GUI of the computing device 200. Whilst interacting with the GUI, a user may, for example, indicate that he or she wishes to select a new object, for example by providing input at the input device 100 or another input device connected to the computing device 200.
The computing device 200 may then present a selection of objects in the GUI. The selection of objects may be stored at the computing device 200 and updated via a network interface of the computing device 200, or alternatively may be stored remotely and downloaded to the computing device 200 in real-time as the user interacts with the GUI.
In some examples, a selection of objects may be presented in a menu, for example a menu with a hierarchical format in which different categories of objects are grouped together. Advantageously, the GUI may depict the selectable objects, and optionally one or more selectable characteristics for the selectable objects, on an interior surface of a three-dimensional structure. An example of a user interface in which objects are presented on a curved interior surface of a hollow three-dimensional structure is discussed in international patent publication WO 2011/151367 A1. By providing a GUI of this type, more objects of a given size can be displayed on the display 202 than is possible if the objects are presented in a conventional two-dimensional GUI, and a user may be able to locate objects in a more intuitive manner than when using a conventional two-dimensional GUI.
Optionally, the base board 206 and/or any objects already connected to the base board 206 may be displayed simultaneously with the interior surface of the three-dimensional structure, such that the three-dimensional structure appears to partially or completely surround the base board 206 as shown for example in Figure 8. In this example, four selectable blocks 810, 812, 814, 816 having different dimensions to one another are presented on a curved interior surface of a three-dimensional structure surrounding the base board 206. Advantageously, the user is able to rotate the base board 206 and the three-dimensional structure by rotating the input device 100, providing a convenient and intuitive way for the user to view different objects on the surface. The user is also provided with controls for selecting and navigating between the selectable objects. In this example, when the user indicates that he or she wishes to select a new object, a subset of the projections 104 is then used for selecting and navigating between objects. The feedback circuitry changes the activation states of the corresponding indicator elements to indicate that the projections 104 may be used for selecting and navigating. Once the user has selected an object, the three-dimensional structure and selectable objects may optionally be hidden until the next time the user wishes to select a new object. Although the input device 100 can be used both for selecting and positioning objects, a further input device such as a mouse or trackpad may alternatively be used for selecting objects.
Once a modular structure has been built in accordance with any of the examples described above, the app running on the computing device 200 may cause the computing device 200 to perform various actions. For example, if a user successfully builds a modular structure in accordance with a predetermined set of building instructions, the computing device 200 may animate the modular structure in the user interface, providing a rewarding experience for the user. The computing device 200 may further transmit data to the input device 100 including instructions to change the activation states of one or more of the indicator elements, so as to further enhance the interactive experience.
In another example, the app is configured to navigate the user to a webpage for buying a set of physical blocks corresponding to the object models used in building the modular structure. In this case, it is advantageous for the user to be able to test the modular structure before buying the corresponding physical blocks. The app may therefore include a gravity simulator which can be switched on to test the structural integrity of the modular structure in the GUI. In this way, a user is able to build a model of a modular structure conveniently in a gravity-free environment, then test how a corresponding real structure would behave when subjected to a gravitational field.

In the examples described above, the input sensors of the input device 100 are operable to detect tactile input at the projections 104. However, in other examples input sensors may additionally or alternatively be capable of detecting other types of input.
In particular, input sensors may be capable of determining when a detachable module is connected to one or more projections of the input device. In the example of Figure 9, an input device 900 includes features corresponding to those of the input device 100 described above, but in this example the input sensors of the projections 904 are arranged to detect when a detachable module such as the module 908 is attached to one or more of the projections 904. In this example, the module 908 is a block-shaped module including a set of recesses arranged to receive the projections 904 of the input device 900, and further including a set of projections 910 to which other detachable modules may be attached.
The input device 900 and the module 908 contain circuitry including inductive elements located within the projections 904 and portions of the module 908 arranged to connect to the projections 904. The circuitry of the detachable module 908 is logical circuitry which allows the input device 900 to determine characteristics of the module 908 when the module 908 is connected to the input device 900. The module 908 may, for example, include memory circuitry storing an identification code for uniquely identifying the module 908. The input device 900 may then transmit a signal to the computing device 200 indicating the characteristics of the module 908 in addition to the position and orientation of the module 908 on the input device 900. In the example of Figure 9, the computing device 200 is arranged to depict a block 912 with characteristics corresponding to those of the module 908 in a position and orientation on the base board 206 corresponding to the position and orientation of the module 908 on the input device 900.
In this example, the circuitry of the module 908 is passive, and receives power from the input device 900 via inductive coupling. Alternatively, the detachable module 908 may include active circuitry, in which case the module 908 would include a battery. In some cases, the battery of a detachable module may be charged using inductive charging when connected to the input device 900. The input device 900 may include one or more near-field communication (NFC) transceivers for communicating with a radio frequency identification (RFID) tag of a detachable module such as module 908. In another example, a detachable module such as module 908 and a corresponding set of projections of an input device such as input device 900 include conducting elements arranged to be in contact when the detachable module is connected to the input device, forming a circuit for communication between the detachable module and the input device. In further examples, the input device 900 may utilize other types of input sensor, for example light sensors based on light dependent resistors (LDRs) for determining when one or more projections are covered by a detachable module.
Referring now to Figure 10, the module 908 is connected to a further detachable module 1008. In this example, the modules 908 and 1008 each include circuitry to allow the input device 900 to determine an arrangement of the modules 908, 1008 attached to the input device 900. The input device 900 is arranged to send a signal to the computing device 200 indicating the determined arrangement of the modules 908, 1008 attached to the input device. In the example of Figure 10, the GUI depicts two blocks 912, 914 with characteristics corresponding to those of the modules 908, 1008 in an arrangement on the base board 206 corresponding to the arrangement of the modules 908, 1008 on the input device 900. Further modules may be added in the same way.
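One way the arrangement of stacked modules could be reconstructed is sketched below. The chaining protocol, in which each module reports the identifier of the module attached on top of it, is entirely an assumption introduced for illustration; the specification does not prescribe how the detection circuitry works.

```python
# Hypothetical sketch of arrangement detection for stacked detachable
# modules (Figures 9 and 10): each module reports the ID stacked on it
# (or None), and the input device walks the chain from the base upward.

def reconstruct_stack(base_id, links):
    """links: dict mapping a module ID to the ID stacked on it (or None)."""
    stack = []
    current = base_id
    while current is not None:
        stack.append(current)
        current = links.get(current)
    return stack

# Module 908 sits on the input device with module 1008 attached on top.
stack = reconstruct_stack("module-908", {"module-908": "module-1008",
                                         "module-1008": None})
```

The resulting ordered list is the kind of arrangement data the input device 900 could transmit to the computing device 200 for depiction in the GUI.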
The above embodiments are to be understood as illustrative examples of the disclosure. Further embodiments of the disclosure are envisaged. For example, although the computing device 200 was described above as being remote from the input devices 100, 900, in other examples, an input device and computing device can be components of a single device, for example a portable device which may then be connected to a display.
Although the housing 101 of the input device 100 is described as having a monolithic plastic upper surface, alternative constructions are possible. For example, the upper surface may comprise several portions formed of a common material or different materials. All or part of the upper surface may be flexible or otherwise moveable relative to other portions of the upper surface. For example, at least some of the projections 104 may be operable as push buttons for receiving user input. At least part of the housing 101 may be transparent or translucent in order to allow the user to see the activation states of LEDs disposed within the housing. The housing 101 may be formed of any suitable material or combination of materials such as plastics, polymers, metals or composite materials. Depending on the material used, the housing 101 may be moulded, 3D printed, or constructed using any other suitable manufacturing process.
Although embodiments have been described with reference to simple blocks, it is to be understood that input devices described herein may similarly be used to arrange or otherwise interact with various other types of object, for example irregularly-shaped objects and/or objects with various types of functionality. In some examples, an input device of the type described above may be used to interact with objects in two dimensions, for example where a GUI is used to depict a grid-based game such as chess, Scrabble® or Go. Similarly, physical modules attachable to an input device may be capable of performing various functions, for example receiving user input via switches, sliders, knobs, or other types of input sensor, providing additional functionality of the input device for controlling different aspects of a computing system. Furthermore, physical modules may be provided with indicator elements in the form of motors, lights, or audio signalling elements such as loudspeakers, all of which may be activated by feedback circuitry of the input device. In this way, an input device as described above in combination with one or more detachable modules can provide a host of ways for a user to interact with a computing device. For example, where the input device is used for controlling software-defined modules for music generation, functional modules may be provided for controlling different software-defined synthesiser components. By connecting different configurations of functional modules to the input device, different configurations of modular synthesisers may be created and controlled.
It is to be understood that any feature described in relation to any one embodiment may be used alone, or in combination with other features described, and may also be used in combination with one or more features of any other of the embodiments, or any combination of any other of the embodiments. Furthermore, equivalents and modifications not described above may also be employed without departing from the scope of the disclosure, which is defined in the accompanying claims.

Claims (23)

  1. An input device for a computing system, the input device comprising: a monolithic surface member comprising a substantially planar portion and an array of projections extending from the substantially planar portion in a direction perpendicular to the substantially planar portion; respective input sensors disposed on or in at least some of the projections and operable to detect user input; one or more indicator elements, the or each indicator element having a plurality of possible activation states; communication means arranged to transmit, in response to receiving user input at one or more of the input sensors, first data to the computing system, the first data indicating said one or more of the input sensors; and feedback circuitry arranged to change an activation state of at least one of the indicator elements in dependence on receiving second data from the computing system via the communication means.
  2. The input device of claim 1, wherein the projections are cylindrical studs with axes extending perpendicular to the substantially planar portion.
  3. The input device of claim 1 or 2, wherein the grid is a regular grid.
  4. The input device of any preceding claim, wherein at least some of the input sensors are touch sensors.
  5. The input device of any preceding claim, wherein the one or more indicator elements include one or more vibrational motors.
  6. The input device of any preceding claim, wherein the one or more indicator elements include one or more audio signalling elements.
  7. The input device of any preceding claim, wherein the one or more indicator elements comprise respective light sources disposed on or in at least some of the projections.
  8. The input device of claim 7, wherein the plurality of possible activation states of a given light source includes a plurality of colour states.
  9. The input device of claim 7 or 8, wherein the light source for a given one of the projections comprises one or more light emitting diodes (LEDs).
  10. The input device of any preceding claim, wherein the feedback circuitry is arranged to change the activation state of said at least one of the indicator elements further in dependence on receiving the user input at said one or more of the input sensors.
  11. 11. The input device of any preceding claim, wherein the communication means comprise a radio modem for wireless communication with the computing device.
  12. 12. The input device of claim 11, wherein: the input sensors of at least some of the projections are arranged to detect when a detachable module is attached to those projections; and said user input comprises a user attaching the detachable module to the projections on/in which said one or more input sensors are disposed.
  13. 13. The input device of claim 12, comprising identification means for determining one or more characteristics of said detachable module, wherein: the first data transmitted to the computing device conveys information indicating the determined one or more characteristics of said detachable module.
  14. 14. The input device of claim 13, wherein the identification means comprise a near-field communication, NEC, transceiver for communicating with a radio frequency identification, RFID, tag of said detachable module.
  15. 15. Apparatus for providing input to a computing device, the apparatus comprising: the input device of any of claims 12 to 14; and a plurality of modules removably attachable to the discrete input elements of the input device and removably attachable to one another, wherein: the input device and the plurality of modules comprise detection circuitry to allow the input device to determine an arrangement of the plurality of modules when the plurality of modules are attached to at least some of the projections of the input device; and the first data transmitted to the computing device conveys information indicating the determined arrangement of the plurality of modules attached to the input device.
  16. 16. A system comprising: a computing device arranged to generate a user interface configured to depict a plurality of positions in which to place a plurality of object models; and the input device of any of claims 1 to 14, wherein each of the projections corresponds to a respective one or more of said positions depicted in the user interface, wherein in response to receiving the first data from the input device indicating said one or more of the input sensors, the computing device is arranged to cause the user interface to depict a selected object model of the plurality of object models in a position corresponding to the projection on/in which said one or more input sensors are disposed.
  17. 17. The system of claim 16, comprising an orientation detector for determining an orientation of the input device, wherein the user interface is configured to depict the plurality of positions in dependence on the determined orientation of the input device.
  18. 18. The system of claim 16 or 17, wherein: each of the plurality of positions is a position in three dimensions hav ng in-plane components and an out-of-plane component; and the projections of the input device correspond to the in-plane components of the plurality of positions.
  19. 19. The system of claim 18, wherein: the input device comprises means for specifying the out-of-plane components of the plurality positions; and the first data transmitted to the computing device further indicates the specified out-of-plane component of said one of the plurality of positions.
20. The system of claim 19, wherein the means for specifying the out-of-plane component of said one of the plurality of positions comprises further input sensors each corresponding to a respective selectable value for the out-of-plane component.
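Claims 18 to 20 split a three-dimensional placement position into in-plane components (given by which projection is activated) and an out-of-plane component (selected via a further input sensor). As a sketch of that decomposition, assuming a flat projection index over a grid of invented width:

```python
# Hypothetical sketch for claims 18-20: a grid of projections supplies
# the in-plane (x, y) components of a placement position; a further input
# sensor supplies the out-of-plane (z) component. The grid width is an
# assumption for illustration.

GRID_COLUMNS = 8  # assumed width of the projection grid

def projection_to_xy(projection_index):
    """Map a flat projection index onto in-plane grid coordinates."""
    return (projection_index % GRID_COLUMNS, projection_index // GRID_COLUMNS)

def resolve_position(projection_index, z_sensor_value):
    """Combine the in-plane and out-of-plane components into a 3-D
    position, as conveyed by the first data of claim 19."""
    x, y = projection_to_xy(projection_index)
    return (x, y, z_sensor_value)

print(resolve_position(11, 2))  # (3, 1, 2)
```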
21. The system of any of claims 18 to 20, wherein: the object models of the plurality of object models are models of components of a modular structure; and the computing device comprises a gravity simulator operable to simulate a behaviour of the modular structure in a simulated gravitational field.
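The "gravity simulator" of claim 21 is left unspecified. One minimal illustrative behaviour, under the assumption (not from the patent) that the modular structure is built from unit blocks on integer coordinates, is a support check: a block is stable if it rests on the ground or directly on another block, and anything else would fall in the simulated field.

```python
# Hypothetical sketch of a gravity check over a modular structure of
# unit blocks at integer (x, y, z) coordinates. A block is supported if
# z == 0 (on the ground) or the cell directly beneath it is occupied.

def unsupported_blocks(blocks):
    """Return the blocks that have nothing directly beneath them and
    would therefore fall in the simulated gravitational field."""
    occupied = set(blocks)
    return [b for b in blocks
            if b[2] > 0 and (b[0], b[1], b[2] - 1) not in occupied]

structure = [(0, 0, 0), (0, 0, 1), (2, 2, 3)]
print(unsupported_blocks(structure))  # [(2, 2, 3)]
```

A full simulator would also consider overhangs, centre of mass and inter-module attachment; this sketch shows only the simplest stability criterion.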
22. The system of any of claims 16 to 21, wherein the user interface is arranged to depict a plurality of selectable objects, including said selected object, on an interior surface of a three-dimensional structure at least partially surrounding the plurality of positions.
23. The system of claim 22, wherein the user interface is further arranged to display one or more selectable characteristics for said selected element.
GB2011752.9A 2020-07-29 2020-07-29 Input device Active GB2597918B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB2011752.9A GB2597918B (en) 2020-07-29 2020-07-29 Input device
PCT/EP2021/071351 WO2022023506A1 (en) 2020-07-29 2021-07-29 Input device

Publications (3)

Publication Number Publication Date
GB202011752D0 GB202011752D0 (en) 2020-09-09
GB2597918A true GB2597918A (en) 2022-02-16
GB2597918B GB2597918B (en) 2023-08-09

Family

ID=72339320

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2011752.9A Active GB2597918B (en) 2020-07-29 2020-07-29 Input device

Country Status (2)

Country Link
GB (1) GB2597918B (en)
WO (1) WO2022023506A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1271415A1 (en) * 2001-06-20 2003-01-02 Gateway, Inc. Parts assembly virtual representation and content creation
WO2011151367A1 (en) 2010-06-01 2011-12-08 Sphere Technology Limited Method, apparatus and system for a graphical user interface
US20180280822A1 (en) * 2017-03-31 2018-10-04 Intel Corporation Building blocks with lights for guided assembly
US20190094841A1 (en) * 2017-09-26 2019-03-28 International Business Machines Corporation Assembly of a modular structure

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
US20160361662A1 (en) * 2012-02-17 2016-12-15 Technologyone, Inc. Interactive lcd display back light and triangulating toy brick baseplate
CN109491582A (en) * 2017-09-13 2019-03-19 施政 A kind of dynamic user interface system of elements

Also Published As

Publication number Publication date
GB202011752D0 (en) 2020-09-09
WO2022023506A1 (en) 2022-02-03
GB2597918B (en) 2023-08-09

Similar Documents

Publication Publication Date Title
Le Goc et al. Zooids: Building blocks for swarm user interfaces
US11181984B2 (en) Virtual reality input and haptic feedback system
CN107708820B (en) System, method and apparatus for foot-controlled motion and motion control in virtual reality and simulation environments
CN103092406B (en) The system and method for multiple pressure interaction on touch sensitive surface
US8325138B2 (en) Wireless hand-held electronic device for manipulating an object on a display
US7886621B2 (en) Digital foam
CN102830795B (en) Utilize the long-range control of motion sensor means
Avola et al. Design of an efficient framework for fast prototyping of customized human–computer interfaces and virtual environments for rehabilitation
CN107427719B (en) Toy system comprising a toy element detectable by a computing device
Lifton et al. Metaphor and manifestation cross-reality with ubiquitous sensor/actuator networks
EP3209401B1 (en) A toy construction system and a method for a spatial structure to be detected by an electronic device comprising a touch screen
CN108434726A (en) Automatic topognosis generates system
Villar et al. A malleable control structure for softwired user interfaces
WO2011112498A1 (en) Physical action languages for distributed tangible user interface systems
GB2533314A (en) Modular robotic system
EP3727623B1 (en) Play system and method for detecting toys
US20160313855A1 (en) Touch sensitive computing surface for interacting with physical surface devices
CN109697002A (en) A kind of method, relevant device and the system of the object editing in virtual reality
CN105992993A (en) A user interface
Parilusyan et al. Sensurfaces: A novel approach for embedded touch sensing on everyday surfaces
EP3191940A1 (en) A method for establishing a functional relationship between input and output functions
GB2597918A (en) Input device
CN102293059B (en) Apparatus and method for providing settings of a control system for implementing a spatial distribution of perceptible output
Jeong et al. iSIG-Blocks: interactive creation blocks for tangible geometric games
EP3073355A1 (en) System and kit for transmission of signal from an irregular surface of a 3-d structure to a detector registering electric impulse, method of signal transmission