WO2010000925A1 - Apparatus comprising a deformable structure - Google Patents
- Publication number
- WO2010000925A1 (PCT/FI2009/050523)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/06—Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
- G06N3/067—Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using optical means
- G06N3/0675—Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using optical means using electro-optical, acousto-optical or opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
Definitions
- the invention relates to an apparatus comprising an arrangement for sensing its deformation.
- Such devices may be capable of detecting the position and/or motion of part of the body of a user, thereby permitting the device to trace postures or movements, for example in a monitoring application or in a robotic application for replicating the user's movements.
- position and/or motion detection can be employed in a user interface, permitting a device and/or a second, remote, device to be controlled by the user's movements or gestures.
- the computing resources required for such detection depend upon the complexity of the system and the number of degrees of freedom of the variations monitored by the device.
- the processing of the sensor outputs, measurement, tracing and/or reconstruction of the postures or movements may require considerable computing resources and may result in a significant delay in providing appropriate control or feedback, for example, when generating a command signal based on the determined user posture.
- the electronic circuitry may become complex, with a relatively large number of interconnections between the sensors and the processor.
- An apparatus may comprise a deformable structure and a network including a plurality of sensors arranged to monitor deformation of the deformable structure and a plurality of processing circuits configured to generate a signal characterising features of said deformation, based on the output of two or more of said sensors.
- the processing circuits are distributed amongst said plurality of sensors.
- an apparatus can be provided with a network of deformation sensors and distributed in-situ processing circuits.
- Such circuits may be arranged to generate a signal characterising features of the local deformation of the structure and/or a command signal corresponding to the detected deformation. For instance, commands may be associated with one or more predetermined shapes of the structure.
- the structure may be a wearable sleeve that conforms to deformations of a user's skin, part of an electronic device, such as a touch sensitive screen, or an object in itself.
- the apparatus can provide a user interface, wherein a command corresponding to a current shape of the structure is generated and acted upon by an integrated or remote device, or a device for monitoring a user's position or movement, e.g. for replication by a robotic device.
- the apparatus may have machine learning capability to improve the matching of commands with determined shapes of the deformable structure.
- Such an arrangement permits computing resources required to process signals from the plurality of sensors to be integrated and distributed in the sensor network.
- Such "in-situ" processing permits the use of a relatively simple configuration for monitoring a deformation having several degrees of freedom and/or requiring a high level of precision, when compared with conventional mesh networks based around a centralized processor.
- the network may be a neural network.
- the distributed processing network may generate one or more output signals that correspond to one or more of the shape of the deformable structure, a user posture, changes in the shape of the deformable structure and a user movement. Alternatively, or additionally, a determined posture or movement can be mapped onto a command signal to be acted upon by the apparatus and/or transmitted to a remote electronic device.
- the apparatus may be used to provide a user interface permitting such an electronic device to be controlled using gestures made by a finger, hand or other body part of a user.
- a user interface may provide a highly intuitive control arrangement for applications such as controlling the movement of a cursor on a display, scrolling an image presented on a display and/or movement of a remote apparatus etc. If the network is trained so that the output signal corresponds to a particular posture or gesture, the required computing resources and the time required to respond to a user input may be reduced.
- any of the above mentioned embodiments may include an electronic device, the functionality of which can be adapted according to the detected shape of the deformable material.
- the apparatus may be transformable between first and second shapes and the electronic device may be configured to provide access to a first function of the electronic device when the deformable structure has a first one of said predetermined shapes and to provide access to a second function of the electronic device when the deformable structure is in a second one of said predetermined shapes.
- the first shape may be an "open" shape of the apparatus, in which a number of functions, including telecommunications functions, are made available to a user, while the apparatus, when in a second "closed" shape, permits a user to access only a limited set of non-telecommunications functions.
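As an illustrative sketch only (the mapping below is invented, not taken from the patent), gating the available functions on the detected shape might look like:

```python
# Hypothetical shape-to-function mapping: which functions a user may access
# in each predetermined shape of the apparatus. The names are illustrative
# assumptions, not part of the disclosed apparatus.
FUNCTIONS_BY_SHAPE = {
    "open": {"telephony", "messaging", "display", "clock"},
    "closed": {"clock", "lock"},  # limited, non-telecommunications set
}

def available_functions(shape):
    """Return the set of functions accessible in the given shape."""
    return FUNCTIONS_BY_SHAPE.get(shape, set())

def is_accessible(shape, function):
    """True if the named function may be used while in the given shape."""
    return function in available_functions(shape)
```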
- the deformable structure may be configured to be worn over a body part.
- the deformable structure may be a sleeve of stretchable material configured to fit over a joint, such as a finger or wrist, or other part of a body.
- the spatial distribution of the sensors may be concentrated in one or more areas of the deformable structure. Such a distribution may permit relatively high resolution monitoring of selected regions of the deformable structure. For instance, the regions may be those in which a relatively high degree of deformation is expected. In such a case, the concentration of the sensors in such a region permits the computational resources to be utilized more effectively.
- the sensors may include piezoelectric material.
- the piezoelectric material may be provided in the form of a plurality of nanowires. Such nanowires may be aligned along a common direction.
- the sensors may include a material that changes its resistivity when stretched. Where both types of sensors are provided, both stretching and flexure of the deformable structure can be detected.
- the sensors may have a plurality of sensing elements, wherein each sensing element is configured to monitor stretching and flexure of at least part of the deformable structure.
- first and second ones of said plurality of sensing elements may be arranged to monitor said stretching along first and second directions respectively, wherein the first direction is different from the second direction.
- One or more of the above apparatuses may be provided as part of a mobile telephone, for example, as part of its user interface.
- the deformable structure may form the body of the mobile telephone.
- One or more of the above apparatuses may be provided in a foldable electronic device.
- the deformable structure may be provided within, or even form, the hinge of such a foldable device.
- the foldable device may be an electronic telephone.
- One or more of the above apparatuses may be provided in a portable computing device.
- One or more of the above apparatuses may be provided in an electronic device where said deformable structure forms part of a touch sensitive display.
- a system may include one of the above apparatuses together with a second, remote apparatus configured to receive said signal from the apparatus.
- the second apparatus may be arranged to be controlled according to the signal received from said first apparatus and/or to replicate a posture or movement corresponding to the deformation of said at least part of the deformable structure.
- a method may include obtaining a plurality of measurements of deformations from a plurality of respective sensors distributed within a deformable structure and generating a signal identifying one or more features of said deformation based on two or more of said measurements, wherein said generation of the signal includes processing said two or more measurements using a plurality of processing circuits distributed amongst said sensors.
- the generated signal may be compared with previous signals to determine time-based changes in a shape of said deformable structure. Such time-based changes may then be compared with reference data to identify a corresponding movement or command. Alternatively, or additionally, the generated signal may be compared with reference data in order to identify said one or more features.
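A minimal sketch of this comparison step, assuming shape signals are numeric vectors and using an invented distance measure and threshold:

```python
# Hedged sketch: compare the recent sequence of shape signals against stored
# reference movement sequences. The tolerance and reference data are
# illustrative assumptions only.
import math

def distance(a, b):
    """Euclidean distance between two shape-signal vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_movement(signal_history, references, tolerance=0.5):
    """Return the name of the reference movement closest to the recent
    signals, if its mean distance is within tolerance; otherwise None."""
    best_name, best_score = None, float("inf")
    for name, ref_seq in references.items():
        if len(ref_seq) > len(signal_history):
            continue  # not enough history to compare against this movement
        recent = signal_history[-len(ref_seq):]
        score = sum(distance(a, b) for a, b in zip(recent, ref_seq)) / len(ref_seq)
        if score < best_score:
            best_name, best_score = name, score
    return best_name if best_score <= tolerance else None
```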
- Any of the above methods may include mapping a command onto said generated signal or onto the result of any comparison with reference data.
- the method can include compiling said reference data based on a plurality of shapes of the deformable structure or on a plurality of sets of time-based changes in the shape of the deformable structure.
- the method may also include updating the reference data according to measurements and/or comparisons made in operation.
- Any of the above methods may include adjusting a set of functions available on a device according to said generated signal or the result of any comparison with reference data. Any of the above methods may include transmitting to a remote device a signal corresponding to any one of a shape of the deformable structure, time-based changes in the shape of the deformable structure, a posture of a user manipulating the deformable structure, a movement of such a user and a command corresponding to a shape or to time-based changes.
- the remote device may execute a received command or, alternatively, map a command onto the received signal and then execute the mapped command.
- Such methods may include causing the remote device to replicate a shape of the deformable structure, time-based changes in the shape of the deformable structure, at least part of a posture of a user or a movement of a user.
- Figures 1a and 1b depict an apparatus according to a first embodiment of the invention, in first and second configurations respectively;
- Figure 2 is a plan view of a detection module of the apparatus of Figures 1a and 1b;
- Figure 3 is a cross-sectional view of a sensor of the detection module of Figure 2;
- Figure 4 is a cross-sectional view of an alternative sensor that could be used in the detection module of Figure 2;
- Figure 5 depicts a general configuration of part of an array of detection modules in the apparatus shown in Figures 1a and 1b;
- Figure 6 depicts an equivalent circuit of a detection module in the apparatus of Figures 1a and 1b;
- Figure 7 is a representation of a neural network of the apparatus of Figures 1a and 1b;
- Figure 8 is a flowchart of a method of operating the apparatus of Figures 1a and 1b;
- Figure 9 is a block diagram showing a configuration of part of the apparatus of Figures 1a and 1b;
- Figure 10 depicts a general configuration of part of an alternative array of detection modules that could be used in the apparatus of Figures 1a and 1b;
- Figure 11 depicts an arrangement comprising an apparatus according to a second embodiment of the invention;
- Figures 12a to 12d depict the apparatus shown in Figure 11 when conforming to a number of positions of a user's hand;
- Figure 13 is a block diagram of the apparatus shown in Figure 11;
- Figure 14 is a flowchart of a method performed by the apparatus shown in Figure 11;
- Figure 15 is a flowchart of a method performed by a device controlled by the apparatus shown in Figure 11;
- Figure 16 is a flowchart of an alternative method performed by a device controlled by the apparatus shown in Figure 11;
- Figure 17 depicts a system including an apparatus according to a third embodiment of the invention;
- Figure 18 depicts an apparatus according to a fourth embodiment of the invention;
- Figure 19 is a flowchart of a method performed by the apparatus of Figure 18;
- Figure 20 depicts a mobile telecommunications device including an apparatus according to a fifth embodiment of the invention; and
- Figure 21 depicts an apparatus according to a sixth embodiment of the invention.
- Figures 1a and 1b depict an apparatus 1 according to a first embodiment of the invention.
- the apparatus 1 is a transformable structure comprising an electronic device mounted on a flexible substrate that can be arranged in two or more configurations. Two example configurations are shown in Figures 1a and 1b.
- materials suitable for forming the transformable structure include silicone rubber, composite rubber and elastomeric materials, which can provide a substrate for the integration of soft electronic parts.
- the components of the electronic device may include stretchable thin film coatings of a material such as gold or another metal, deposited directly on the substrate, or nanowire coatings, such as carbon nanotubes, metallic flakes or metal-coated microfibers.
- the apparatus 1 includes at least one surface 2 arranged to receive user input.
- the surface 2 is a touch-sensitive display and the apparatus 1 provides different functionalities to a user according to its configuration.
- In the "open" configuration shown in Figure 1a, the apparatus 1 is arranged to provide the functions of a mobile telephone and the surface 2 is used to display touch keys 3 for receiving user input and a display region 4 for presenting information to a user.
- the mobile telephone functionality is not active.
- the apparatus 1 is arranged to be worn on a user's wrist and the surface 2 is used to provide other functions, such as a clock 5 and a lock 6.
- the apparatus 1 is arranged to determine and characterise its configuration using local measurements of strain and/or flexure from a plurality of detection modules. An example of a detection module 7a is shown in Figure 2.
- the detection module 7a includes a sensor 8 comprising two or more sensing elements 8a, 8b.
- the sensing elements 8a, 8b are arranged to monitor stretching along respective first and second directions, simultaneously.
- the first and second directions are perpendicular to one another.
- the sensor 8 may include a single sensing element or more than two sensing elements.
- the directions along which the sensing elements 8a, 8b detect elongation need not be perpendicular to each other.
- the output from the sensor 8 is transmitted to one or more in-situ processing circuits 9a.
- the output from the one or more processing circuits 9a may, if required, be directed to one or more further processing circuits 10a and/or an application processor (not shown) via a bus 11.
- part of the sensor 8 and the processing circuits 9a, 10a may be mounted on rigid substrates 12a, 12b, 12c.
- the sensing elements 8a, 8b are arranged so that they can stretch and bend to conform with deformations of the apparatus 1.
- Such a combination of rigid and soft structures, referred to hereinafter as a "semi-rigid" architecture, can be realised by laminating suitable materials for the components of the sensor 8 and processing circuits 9a, 10a onto the substrate of the transformable structure.
- Suitable materials for use in such an architecture can include thin film metal electrodes, piezoelectric ceramic materials (zinc oxide (ZnO), lead zirconate titanate (PZT)), polyvinylidene fluoride (PVDF), elastomeric composites with appropriate sensing properties, suitable nano-structured inorganic materials (highly aligned ZnO nanomaterials) and so on.
- the sensing element 8a comprises a layer of sensing material 13, with first and second electrodes 14, 15 positioned on either side thereof.
- the sensing layer 13 includes a material having an electrical property, such as charge generation, resistance or capacitance, that changes in response to in-plane and out-of-plane deformations, in other words, when stretched and/or flexed.
- the detection module 7a may detect flexure and/or two- dimensional stretching, or a combination of the two.
- the sensor 8 may monitor one or more of: localised charges, that is, voltages (V), generated by the sensing elements 8a, 8a', 8b in response to stretching and/or flexure; resistance (R) changes, where the sensing material comprises a sensitive piezoelectric, electrostrictive, ferroelectric or similar material, or comprises simple elongation-sensitive resistors; and localised capacitance changes, where the sensing material is configured to change its capacitance in response to stretching and/or flexure.
- the sensing layer 13 is a flexible and stretchable film, formed from a material such as silicone rubber and comprising an array of highly aligned piezoelectric nanowires.
- the nanowires are formed of a suitable material, such as piezoelectric (ZnO and/or BaTiO₃) or resistive (CNT) nanowires grown or dispersed on the second electrode 15.
- the first electrode 14 is laminated onto the sensing layer 13.
- the nanowires are aligned along a predetermined direction. In the examples shown in Figures 3 and 4, the nanowires are aligned perpendicularly and parallel to the electrodes 14, 15 respectively.
- the nanowires may be aligned along another direction, for example, so that they extend diagonally between the first and second electrodes 14, 15.
- the first and second electrodes 14, 15 are flexible and stretchable.
- the first electrode 14 may be formed of a material such as carbon (carbon black) or another material that changes its inherent resistance in accordance with, and/or proportionally to, its elongation. In this manner, the first electrode 14 can perform two functions, reading signals from the sensing layer 13 undergoing deformation and serving as a simple one- dimensional elongation sensor.
- the second electrode 15 is formed of a material such as gold, silver, copper or indium tin oxide (ITO), the resistance of which does not change significantly when flexed or elongated.
- the electrodes 14, 15 may be thin metallic films, that is, films having a sub-micrometre thickness, of gold, silver or copper. Such films may, if required, include micro-cracks in order to increase elasticity.
- the electrodes 14, 15 may be formed of a conductive composite elastic layer in which conductive particles are embedded in an elastic polymeric matrix.
- the embedded particles may be micro-sized metallic particles, nano-sized carbon fullerenes ("bucky balls"), carbon nano-fibres, or microscopic metallic fibres, metallic flakes or metal covered fibre networks.
- Such elastic composites or thin films on elastomers may be elongated, reversibly, by up to 50%, while remaining electrically conductive.
- Such a level of reversible stretchability can accommodate the deformation of the apparatus 1 required to move between the open and closed configurations of Figures 1a and 1b.
- the functional components such as the sensors 8, processing circuits 9a, 10a, bus 11 and the connections therebetween can be realised by using soft and/or rigid materials on elastic and stretchable substrates 12a-12c.
- Such functional components are directly deposited on a stretchable substrate forming the apparatus 1, in a so-called "semi-rigid" form of the overall system. Deposition of the functional components can be realised, for example, by using a printing technique.
- polymer sensors may be provided on the apparatus 1 using ink-jet printing to deposit the sensors 8 and circuitry 9, 10, 11.
- the sensors 8 and other components are prefabricated and embedded into the substrate.
- an insulating layer of deformable material may be deposited on the substrate, in order to encapsulate the sensors 8 and circuitry 9, 10, 11, to avoid interference.
- the example sensing elements 8a, 8a' can detect localised stretching and flexing of the apparatus 1 as follows.
- When the substrate of the apparatus 1 is stretched along direction A, shown in Figures 3 and 4, the first electrode 14 will lengthen, causing a change in its resistance that corresponds to the change in its length. Although the second electrode 15 also lengthens, its resistance is substantially unchanged.
- the first and second electrodes 14, 15 and the nanowires will bend.
- the nanowires will produce a voltage that is proportional to a bending angle/surface curvature of the sensing layer 13.
- the polarity of the voltage can be used to indicate whether the sensing layer 13 is being flexed in a concave or convex shape.
- the detection module 7a can be used to generate an output signal indicating elongation along direction A as well as local flexing/bending of the apparatus 1.
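The two readout mechanisms above can be sketched numerically as follows; the gauge factor and piezoelectric scale factor are illustrative assumptions, not values from the patent:

```python
# Hedged sketch of the detection module's two transduction paths:
# (1) the first electrode's resistance grows with elongation, and
# (2) the nanowire layer produces a voltage whose magnitude tracks surface
#     curvature and whose sign distinguishes concave from convex flexing.
def elongation_from_resistance(r_measured, r_rest, gauge_factor=2.0):
    """Estimate strain from the electrode's resistance change, assuming the
    common piezoresistive relation dR/R0 ~ gauge_factor * strain."""
    return (r_measured - r_rest) / (r_rest * gauge_factor)

def curvature_from_voltage(v_measured, k_piezo=0.05):
    """Estimate signed curvature from the nanowire voltage; a negative
    voltage is taken here to indicate concave flexing (illustrative sign
    convention)."""
    return v_measured / k_piezo
```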
- a network of such detection modules 7a-7l and in-situ processing circuits 9a-9d, 10a, 10b is distributed within the apparatus 1, as shown in the partial view of Figure 5.
- At least some of the processing circuits 9a-9d, 10a, 10b comprise computational capabilities, as will be explained later.
- at least some of the processing circuits 10a, 10b are connected to the application processor 16 via the bus 11.
- Figure 6 depicts an equivalent circuit for a part of a detection module 7a that produces and conditions a signal S_out having a frequency corresponding to the voltage across the sensing layer 13 of one or more sensing elements 8a, 8b.
- the detection modules 7a-7l are configured as "spiking neurons". In other words, the detection modules 7a-7l are configured to transmit an output to the processing circuits 9a-9d when the voltages generated by the nanowires of the sensing layer 13 exceed a predetermined threshold.
- the detection module 7a therefore includes an integrator 17, which integrates the voltage across the sensing layer 13 or layers. When the voltage across the capacitor 17 exceeds the predetermined threshold, the integrator 17 is discharged and a pulse is generated.
- the integrator 17 is the sensor 8 itself. In other words, the intrinsic capacitance of the sensor 8 is used and the piezoelectric voltage is used as a variable ground potential therefor.
- the integrator may be provided in the form of a separate capacitor.
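A minimal sketch of the integrate-and-fire behaviour described above, with an illustrative threshold and sampling interval:

```python
# Hedged sketch of a "spiking neuron" detection module: the sensor voltage
# is integrated sample by sample, and whenever the accumulated value crosses
# the threshold a pulse is emitted and the integrator is discharged.
# Threshold and dt are illustrative values only.
def integrate_and_fire(samples, threshold=1.0, dt=0.001):
    """Integrate a sequence of sensor voltages and return the indices at
    which output pulses are generated."""
    accumulated = 0.0
    spikes = []
    for i, v in enumerate(samples):
        accumulated += v * dt
        if accumulated >= threshold:
            spikes.append(i)   # emit a pulse ...
            accumulated = 0.0  # ... and discharge the integrator
    return spikes
```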
- the detection module 7a of Figure 6 also provides components for conditioning the signal S_out produced by the sensor 8.
- the detection module 7a comprises a comparator, in the form of a charge amplifier 18 with high input impedance, provided with a negative feedback path, and a Sigma-Delta modulator 19 to convert the signal S_out into a pulse-modulated signal.
- Signals between and from the detection modules 7a-7l can be transmitted within the apparatus 1 in the form of pulse or spike coded signals, to reduce or remove interference from external electromagnetic fields or any effects from environmental factors, such as temperature variations.
- the processing circuits 9a-9d could be configured to transmit signals using one of pulse width modulation (PWM), pulse density modulation (PDM), pulse train frequency modulation or spike code modulation.
- the coding scheme may be configured so that there are time-slots for sending data from the detection modules 7a-7l to processing circuits 9a-9d and, in turn, from the processing circuits 9a-9d to processing circuits 10a, 10b and so on. Such a scheme may avoid the need for an address-based x-y read-out structure, permitting the use of a simpler network configuration with fewer interconnections.
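A sketch of such a fixed time-slot scheme, in which the transmitting module is implied by the slot index rather than by an address (the module identifiers are illustrative):

```python
# Hedged sketch of a TDMA-like read-out: slots cycle through the modules in
# a fixed order, so the receiver knows which module is transmitting from the
# slot index alone, and no addresses need to travel over the interconnect.
def slot_owner(time_slot, module_ids):
    """Return the module permitted to transmit in the given slot."""
    return module_ids[time_slot % len(module_ids)]
```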
- Figure 6 shows the signal conditioning components 18, 19 as part of a detection module 7a
- such signal conditioning may instead be performed by at least some of the processing circuits 9a-9d.
- signal conditioning components 18, 19 may be provided in both the detection modules 7a and the processing circuits 9a.
- the detection modules 7a-7l and processing circuits 9a-9d, 10a, 10b provide a neural network that can classify and recognise the configuration of the apparatus 1.
- the neural network may be configured to identify a transformation of the apparatus 1 and, if required, to determine the dynamics of such a transformation.
- the local deformations measured by the sensors 8 of n detection modules 7 can be expressed as a vector x = (x₁, x₂, ..., xₙ).
- the processing circuits 9a-9d, 10a, 10b are configured to map the local measurements x₁, x₂, x₃ onto a higher dimensional feature space.
- the mapping is made using a nonlinear classifier, in the form of a Radial Basis Function (RBF).
- the described processing may be based on any number of available measurements and/or on another suitable classifier as required.
- the RBF classifier is based on a Gaussian kernel function zₘ(x) = exp(−‖x − xₘ‖² / (2σ²)), where xₘ represents a model vector.
- the value of σ determines the sharpness of the kernel function.
- the kernel function can be interpreted as a similarity operator, which indicates how closely the local measurements x₁, x₂, x₃ correspond to the model vector xₘ.
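The kernel can be sketched directly from its definition; the value of sigma and the vectors used are illustrative:

```python
# Sketch of the Gaussian RBF kernel used as a similarity operator: the
# output approaches 1 when the measurement vector x is close to the model
# vector x_m, and falls towards 0 as they diverge; sigma controls sharpness.
import math

def rbf_similarity(x, x_m, sigma=1.0):
    """z_m(x) = exp(-||x - x_m||^2 / (2 * sigma^2))"""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, x_m))
    return math.exp(-sq_dist / (2.0 * sigma ** 2))
```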
- the local measurements x₁, x₂, x₃ are mapped onto a five-dimensional feature space, using classifiers z₁ to z₅.
- the classifiers z₁ to z₅ are then mapped onto a lower dimensional space that has a sufficient number of dimensions to represent and classify at least the major features of a deformation, or non-deformation, of the apparatus 1.
- a two-dimensional state space {y₁, y₂} is used.
- the layer feeding the measurements x₁, x₂, x₃ is referred to as the input layer
- the layer calculating the classifiers is the middle layer
- the final layer, which calculates the outputs {y₁, y₂}, is referred to as the output layer.
- the outputs are weighted sums of the classifiers z₁ to z₅ plus an additional bias value z_bias.
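The output layer can be sketched as follows, with placeholder weights standing in for the learned parameters:

```python
# Sketch of the RBF network's output layer: each output y_j is a weighted
# sum of the five classifier values z_1..z_5 plus a bias term z_bias. The
# weight values in the test are illustrative, not learned parameters.
def output_layer(z, weights, z_bias):
    """Map classifier values z onto outputs {y_1, y_2}:
    y_j = sum_k weights[j][k] * z[k] + z_bias."""
    return [sum(w_k * z_k for w_k, z_k in zip(w_row, z)) + z_bias
            for w_row in weights]
```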
- a traditional artificial neural network could be employed, in which a hidden layer and an output layer form weighted sums of the input measurements and calculate a sigmoid function.
- a cellular neural/nonlinear network could be used.
- the parameters of the neural network may be reprogrammable, in order to allow the processing circuits 9a-9d, 10a, 10b to improve matches between desired and actual outputs. In this manner, the apparatus 1 can continue "learning" to recognise predetermined configurations, such as three-dimensional shapes, over its lifetime.
- the parameters could be derived during development and the learned values permanently configured in the processing circuits 9a-9d, 10a, 10b of the detection modules 7a-7l.
- Such learning may take place in a supervised manner, by performing a trial and adjusting the parameters to obtain suitable matches between desired and actual outputs.
- unsupervised learning techniques may be employed, for example, using self-organizing maps (SOM).
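The supervised adjustment described above might be sketched as a simple delta-rule update of the output weights; the rule and learning rate are illustrative choices, not specified by the patent:

```python
# Hedged sketch of supervised parameter adjustment: after each trial, the
# output weights are nudged towards the desired output. Repeated trials
# shrink the gap between desired and actual outputs.
def train_step(weights, z, y_desired, learning_rate=0.1):
    """One delta-rule update of the output weights for classifier values z."""
    y_actual = [sum(w * zk for w, zk in zip(row, z)) for row in weights]
    return [[w + learning_rate * (yd - ya) * zk for w, zk in zip(row, z)]
            for row, yd, ya in zip(weights, y_desired, y_actual)]
```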
- An example of a method for operating the device is depicted in Figure 8.
- After the procedure starts (step s8.0), localised measurements x₁, ..., xₙ of the stretching and/or flexure of the apparatus 1 are obtained by the sensors 8 of the detection modules 7a-7l (step s8.1).
- Classifier functions z are calculated based thereon (step s8.2), in a similar manner to that described above.
- the classifier functions are then used to calculate outputs that characterise at least the major features of the stretching and/or flexure of the apparatus 1 (step s8.3) and are used to determine the configuration of the apparatus 1 (step s8.4).
- the overall configuration of the apparatus 1 can be determined by a centralised processor, such as the application processor 16 shown in Figure 5. Alternatively, a local configuration may be determined, by the application processor 16 or by one or more of the processing circuits 9a-9d, 10a, 10b.
- In step s8.5, the application processor 16 determines whether or not the change in configuration requires a change in the functionality to be made available to a user (step s8.6).
- the application processor may activate mobile telephone functionality (step s8.7).
- the application processor may deactivate the mobile telephone functionality and activate other features, such as a clock (step s8.7).
- the application processor 16 may determine whether the configuration parameters satisfy one or more pre-determined conditions based on particular shapes/configurations, trigger/execute a protocol or function at the apparatus 1 and update displayed information accordingly (step s8.8), completing the procedure (step s8.9).
- the processor 16 may be arranged to transmit a wired or wireless signal to a remote device at step s8.7, in order to guide, steer and/or execute commands and functions at the remote device, such as pointer steering on a personal computer or a display device, browsing through menus and/or selecting the most appropriate icon and executing a protocol associated with a selected menu option, internet browsing applications, controlling home appliances, a TV or other home electronic devices, and so on.
- the application processor 16 may make changes to the information, images and/or content displayed on the apparatus 1, such as updating content or changing an area on the apparatus 1 on which it is displayed, in accordance with the determined configuration (step s8.8), before completing the procedure (step s8.9).
- Figure 9 is a block diagram of the apparatus 1.
- the detection modules 7a-71 and processing circuits 9, 10 provide a neural network 21.
- the outputs from the neural network are sent to the application processor 16, which runs software stored in memory 22 in order to perform the relevant determinations and updating functions discussed above in relation to Figure 8.
- the application processor 16 can control the presentation of information on a display 23, as well as mobile phone components in the form of a codec 24, transceiver 25, a microphone 26, a speaker 27 and respective amplifiers (not shown).
- An antenna 28 is provided for transmitting data to, and receiving data from, a base station (not shown).
- the apparatus 1 can be powered by a rechargeable battery 29, which can be recharged via charging circuit 30.
- the neural network 21 produces a signal based on the outputs of multiple ones of the detection modules 7.
- the processing circuits 9, 10 may be relatively simple, when compared with a conventional processor, potentially reducing both the power consumption and the cost of manufacturing the apparatus 1. Moreover, the use of distributed processing can allow the apparatus to react to user input relatively quickly.
- the processing circuits 9a-9d, 10a, 10b are provided separately from the detection modules 7a-7l. However, in other embodiments, some or all of the processing circuitry may be provided within one or more of the detection modules.
- Figure 10 depicts a detection module 7a' comprising a rigid substrate 12a upon which part of the sensor 8 and processing circuitry 9a', 10a' are provided, in a similar manner to the "semi-rigid" arrangement described above in relation to Figure 2.
- the sensors 8 can be realised in the form of soft and flexible structures that connect the rigid substrates 12a on which the processing circuitry 9a', 10a' is mounted.
- the processing circuitry 9a', 10a' comprises at least an amplifier and an ADC in order to condition the output of the sensor 8. The output can then be further processed by the processing circuitry 9a', 10a' and/or transmitted to another processing circuit or a processor 16 via bus 12.
- Figure 11 depicts a wearable apparatus 31.
- the apparatus 31 comprises a sleeve 32, arranged to be worn over the hand 33 of a user.
- the sleeve 32 is arranged to conform with stretching and relaxation of the skin of the user's wrist 34 and so is capable of undergoing resilient deformation along arbitrary directions.
- the sleeve 32 comprises a substrate formed from a soft, conformable material, such as natural rubber, silicone rubber, another elastomer or elastomer-like material, capable of tolerating a strain of around 20%.
- a network of detection modules 7 and processing circuits 9, 10 is embedded in the sleeve 32, in a similar manner to that described above in relation to the apparatus 1 of Figures 1a and 1b.
- the detection modules 7 comprise sensors 8 for monitoring stretching and/or flexing of the sleeve 32 as it conforms to movements of the user's wrist 34, for instance, when moving from a rest position, shown in Figure 11, to one of the other postures shown in Figures 12a to 12d.
- the apparatus 31 is arranged to act as a user interface, so that a user can control a remote device via a wireless communications link 35.
- the remote device is a head-mounted display 36 and the movements of the user's wrist 34 shown in Figures 12a to 12d cause an image output by the head-mounted display 36 to be scrolled to the left, right, up and down respectively.
- Figure 13 is a block diagram of the apparatus 31, in which components similar to those discussed in relation to the first embodiment are referred to by the same reference numerals.
- the apparatus 31 is equipped with a distributed neural network, which includes a plurality of detection modules 7 and processing circuits 9, 10, as described hereinabove in relation to the apparatus 1 of Figures 1a and 1b.
- the signals output and processed by the detection modules 7 are processed by the processing circuits 9, 10.
- a combination signal can then be generated and processed by a processor 37, based on the signals from two or more of the detection modules 7, for transmission to the display 36 via the transceiver 25 and antenna 28.
- the combination signal may identify at least the major features of the shape of the sleeve 32 and, therefore, the posture of all or part of the user's wrist 34.
- a processor 38 within the display 36 can then map a command signal onto the determined deformation, for example, by using a look-up table stored in a memory 39.
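The look-up-table mapping described above can be sketched as follows. This is an illustrative sketch only: the posture identifiers and command names are hypothetical, chosen to mirror the scrolling gestures of Figures 12a to 12d, and are not taken from the patent itself.

```python
# Hypothetical sketch of mapping a determined posture onto a command
# via a look-up table (playing the role of the table in memory 39).
# Posture IDs and command names are invented for illustration.

POSTURE_COMMANDS = {
    "wrist_left": "SCROLL_LEFT",
    "wrist_right": "SCROLL_RIGHT",
    "wrist_up": "SCROLL_UP",
    "wrist_down": "SCROLL_DOWN",
}

def map_posture_to_command(posture_id, default="NO_OP"):
    """Return the command associated with a determined posture."""
    return POSTURE_COMMANDS.get(posture_id, default)

print(map_posture_to_command("wrist_left"))   # SCROLL_LEFT
print(map_posture_to_command("rest"))         # NO_OP
```

A real implementation would key the table on the combination signal (or a posture classification derived from it) rather than on a string label.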
- the processor 37 of the apparatus 31 may be arranged to identify the posture of the user's wrist 34 using a look-up table stored in a memory 40.
- the combination signal transmitted by the apparatus 31 to the display 36 may be the command signal itself.
- the memory 40 of the apparatus 31 may store information about the shape of the sleeve 32 or determined postures so that the processor 37 can compare a current shape or posture with the stored information, identify changes in the shape of the sleeve 32 and determine a movement of the user's hand 33 or wrist 34.
- the signal transmitted to the display 36 may be a signal identifying that movement or, alternatively, a corresponding command signal based thereon.
- a remote device such as the display 36 can be controlled by the user's movements instead of, or as well as, the posture of a part of the user's body 33, 34.
- the neural network 21 can classify and recognise the configuration of the apparatus 31, identify any conformal transformation and, if required, determine the dynamics of such a transformation.
- the apparatus 31 is thus arranged to output a combination signal that relates to the output of multiple ones of the detection modules 7. Such an arrangement can reduce, or even minimise, the number of output signals to be conveyed to a remote device, thereby reducing the amount of data to be transmitted and, potentially, improving the speed with which a receiving device can respond to user input.
- the need for additional computing resources in the remote device can be reduced.
- Figures 14 and 15 are flowcharts of examples of methods that can be performed by the apparatus 31 and the remote device, display 36, respectively.
- starting at step s14.0, measurements of localised deformations are obtained by the sensors 8 of the detection modules 7 (step s14.1).
- the processing circuits 9, 10 of the neural network 21 calculate classifier functions based on the measurements (step s14.2), as described above in relation to the apparatus 1 of the first embodiment.
- a combination signal, including one or more outputs characterising the features of at least a local configuration of the apparatus 31, is generated by the processing circuits 9, 10 and/or the processor 37 (step s14.3) and transmitted to the display 36 (step s14.4), completing the process (step s14.5).
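As an illustration of steps s14.1 to s14.3, the sketch below combines several local measurements through simple threshold classifiers into a compact combination signal. The weights, thresholds and step activation are assumptions made for illustration, not details taken from the patent:

```python
# Illustrative sketch: each distributed processing circuit evaluates a
# simple classifier function over the local sensor measurements, and the
# classifier outputs are concatenated into one "combination signal".
# All numeric values below are invented for illustration.

def classifier(measurements, weights, bias):
    """One processing circuit: weighted sum followed by a step activation."""
    s = sum(w * m for w, m in zip(weights, measurements)) + bias
    return 1 if s > 0 else 0

def combination_signal(sensor_readings, circuits):
    """Combine several classifier outputs into one compact signal."""
    return tuple(classifier(sensor_readings, w, b) for w, b in circuits)

readings = [0.2, 0.9, 0.1, 0.7]            # localised strain measurements
circuits = [([1, 1, 0, 0], -0.5),          # responds to stretch in region A
            ([0, 0, 1, 1], -0.5)]          # responds to stretch in region B
print(combination_signal(readings, circuits))  # (1, 1)
```

This keeps the data sent to the remote device small: two bits of classifier output stand in for four raw measurements, in line with the data-reduction benefit the text attributes to the combination signal.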
- starting at step s15.0, the combination signal from the apparatus 31 is received by the display 36 (step s15.1) and a corresponding command is determined (step s15.2).
- the relevant command is then executed (step s15.3).
- if the deformation of the apparatus 31 indicates that a user wishes to scroll an image presented by the display 36, that image is updated accordingly.
- where the remote device is a different type of device, such as a home appliance, TV or computer, the execution of the relevant command may include changing the functionality of the remote device, moving a cursor on a display, calling up a menu and so on.
- the process is then complete (step s15.4).
- the apparatus 31 may be configured to determine the command corresponding to the configuration (step s14.2) and, in step s14.4, to transmit a corresponding command signal in place of the combination signal for reception by the display 36 in step s15.1.
- Figure 16 depicts a method performed by the display 36 in place of the method of Figure 15, when controlled by movement of the user's hand 33 or wrist 34.
- starting at step s16.0, the combination signal from the apparatus 31 is received (step s16.1) and compared with previously received signals stored in the memory 39 of the display 36 (step s16.2).
- the movement of the user's hand 33 or wrist 34 is determined, based on differences between the received signal and one or more stored signals (step s16.3).
- a command corresponding to that movement is identified (step s16.4) and executed by the display 36 (step s16.5), as described above in relation to step s15.3 of Figure 15.
- the most recently received signal is then stored in the memory 39 (step s16.6) for comparison with subsequent combination signals and the process ends (step s16.7).
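One plausible way to realise the compare-and-store cycle described above is sketched below; the signal format (a tuple of classifier outputs) and the element-wise difference measure are assumptions made for illustration:

```python
# Hypothetical sketch of the compare-and-store cycle: the incoming
# combination signal is compared element-wise with the previously
# stored signal (standing in for memory 39), the difference is used
# to infer a movement, and the new signal replaces the stored one.

class MovementDetector:
    def __init__(self):
        self.previous = None   # no signal stored yet

    def update(self, signal):
        """Return the per-element delta against the stored signal, then store."""
        if self.previous is None:
            delta = None       # first signal: nothing to compare against
        else:
            delta = tuple(c - p for c, p in zip(signal, self.previous))
        self.previous = signal
        return delta

det = MovementDetector()
print(det.update((0, 1)))   # None  (nothing stored yet)
print(det.update((1, 1)))   # (1, 0) -> region A newly deformed
```

A command could then be looked up from the delta rather than from the absolute shape, so that the remote device responds to movements as well as postures.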
- steps s16.2, s16.3 and, optionally, step s16.4 may be performed by the apparatus 31 and a signal identifying the movement or the corresponding command signal may be transmitted to the remote device in place of the combination signal.
- the sleeve 32 of the apparatus 31 of Figures 11 and 12 is configured to fit over a user's hand 33 and to detect the shape of at least part of the sleeve 32, to identify a posture and/or motion of the user's wrist 34.
- other embodiments of the invention may comprise one or more deformable sleeves, substrates or surfaces in other forms, depending on the particular application of the apparatus.
- a wearable apparatus that acts as a user interface or a detector for monitoring a body part of a user in a remote presence, or telepresence, arrangement may include a sleeve configured to fit over a particular part of a user's body.
- Figures 11 and 12 depict an apparatus 31 in which the detection modules 7 and processing circuits 9, 10 are arranged in a uniform distribution around the user's wrist 34.
- detection modules 7 and processing circuits 9, 10 may be provided, and arranged in other manners.
- Figure 17 depicts an apparatus 41 according to a third embodiment of the invention, in which the spatial distribution of detection modules 7 and processing circuits 9, 10 is concentrated in particular areas 42, 43 of the sleeve 32.
- the areas 42, 43 correspond to points where a relatively high level of local deformation may be experienced.
- Such an arrangement can permit the shape of the deformation to be determined with greater precision, by making a relatively high number of measurements in regions where most of the deformation is expected.
- the concentration of detection modules 7 in the sleeve 32 can be varied from area to area to allow for localised high resolution tracking capabilities.
- the apparatus 41 generates a combination signal based on the deformation of the sleeve 32 as it conforms to the user's hand 33 and wrist 34, as described previously with reference to Figure 14, steps s14.0 to s14.5.
- the combination signal is transmitted to a computer 44 via a wireless link 45.
- the computer 44 is arranged to reconstruct the movement of the user's wrist 34 based on the combination signal, as described previously with reference to Figure 15, steps s15.0 to s15.4.
- the computer 44 is arranged to steer a robotic device 46 that replicates the movements of the user's wrist 34.
- the command signal generated at step s15.2 may thus be a command signal that controls movement of a component of the robotic device 46.
- the command signal may be transmitted to the robotic device 46, via a network 47.
- the robotic device 46 then executes the received command (step s15.3).
- the system of Figure 17 may be particularly suitable for teleoperation of the robotic device 46 in applications requiring the replication of delicate movements.
- the robotic device 46 may be a remote device containing an array of actuators arranged to perform object movement in accordance with an initial transformation and the received combination signal, to provide remotely controlled object manipulation.
- such an arrangement can permit the measurements to be concentrated in such a region so that the deformation can be monitored using fewer detection modules 7 and processing circuits 9, 10, potentially reducing complexity, power consumption and manufacturing costs.
- while the apparatuses 31, 41 shown in Figures 11 and 17 are intended for controlling a display 36 and a robotic device 46 respectively, the apparatuses 31, 41, or apparatuses similar thereto, may be used to control other types of remote device.
- the apparatuses 31, 41 could be used to move a cursor on a display associated with a personal computer, to change channels and/or control volume, brightness and contrast on a television or to control a home appliance.
- the methods of Figures 15 and 16 may be performed by the remote device.
- the apparatus 31, 41 may be used to monitor the posture and/or movements of the user and to transmit the combination signal to a remote device, such as a computer 44, for storage and/or analysis.
- Figure 18 depicts an example of an apparatus 48 according to another embodiment of the invention, in the form of a personal digital assistant device.
- a network 21 of detection modules 7 and processing circuits 9, 10, as described above is utilised to provide a touch sensitive user interface 49, or touch screen, through which a user can control processes such as browsing of content and services, requesting and selecting items from pop-up menus displayed thereon.
- the touch screen 49 includes a resiliently deformable surface in which a distributed network of sensors 8 and processing circuits 9, 10 is provided, as described above in relation to the apparatus 1 of Figure 1.
- a user can input information using a stylus 50 or their fingers (not shown) to exert pressure on a particular area of the touch screen 49 to select an icon 51 associated with an application or a function button 52 or to operate a scroll bar 53.
- the configuration of the apparatus otherwise resembles that shown in the block diagram of Figure 9, with the detection modules 7 sensing localised deformation of the touch screen 49 in the area pressed by the user and producing one or more output signals for transmission to an application processor 16.
- the apparatus 48 may be configured as shown in Figure 13 and arranged to transmit a combination signal and/or a command signal to a remote device, such as a mobile terminal, personal computer, home appliance and so on.
- the apparatus 48 may be equipped with a handwriting recognition function, so that a symbol 54 input in at least a particular region 55 of the touch-screen 49 can be identified.
- Figure 19 depicts a method performed by the apparatus 48 of Figure 18.
- starting at step s19.0, measurements of localised deformation are obtained from some or all of the detection modules 7 (step s19.1).
- Classifier functions and outputs are determined by processing circuits 9, 10, as described above in relation to Figure 5 (steps s19.2, s19.3).
- the region of the touch-screen 49 in which the user input was received is determined (step s19.4) by the application processor 16. If the user input was made outside the handwriting recognition region 55, the application processor 16 determines whether a function has been selected (step s19.6), for example, by the user pressing on an icon 51, button 52 or bar 53. If so, the selected function is executed (step s19.7).
- if it is determined that the user input was made within the handwriting region 55 (step s19.5), successive measurements from the detection modules 7 are monitored and compared, for example, using a method similar to that described in relation to steps s16.2, s16.3 and s16.6 of Figure 16 (step s19.8), to build up a record of the user's movements and, therefore, the gesture made by the user. This record is then compared with reference data stored in a memory 18 in order to identify the symbol input by the user (step s19.9) and, where necessary, to store it for use in an appropriate application.
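The matching of the recorded gesture against stored reference data might be sketched as a nearest-template comparison. The trace representation, the distance measure and the reference symbols below are all invented for illustration; the patent does not specify how the comparison is performed:

```python
# Hypothetical sketch: the recorded trace of user movements is compared
# against stored reference traces, and the closest match (by summed
# squared distance) identifies the input symbol.

def distance(trace_a, trace_b):
    """Summed squared distance between two equal-length traces."""
    return sum((a - b) ** 2 for a, b in zip(trace_a, trace_b))

def identify_symbol(trace, references):
    """Return the reference symbol whose stored trace is nearest."""
    return min(references, key=lambda sym: distance(trace, references[sym]))

references = {
    "I": [0.0, 0.0, 0.0, 0.0],   # straight vertical stroke
    "/": [0.0, 0.3, 0.6, 0.9],   # rising diagonal stroke
}
print(identify_symbol([0.1, 0.2, 0.7, 0.8], references))  # /
```

In practice the reference data would be compiled from training input, and could be updated in operation to improve recognition, in line with the machine-learning capability mentioned elsewhere in the description.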
- the symbol may be an alphabetic character, numeral or punctuation mark.
- the display is then updated (step s19.10).
- the display is updated to show a corresponding user interface and/or to show that the function has been activated.
- the identified symbol 56 may be displayed on the touch screen 49 so that the user can verify it.
- the procedure is then complete (step s19.11).
- a distributed network of sensors 8 and processing circuits 9, 10 can be used to detect deformation.
- a network can be used in other apparatuses and/or systems.
- such a network can be used to monitor the shape of a deformable surface, such as the substrate 2 of the apparatus 1, in order to determine a command to be carried out by a remote device 36.
- a user can manipulate the deformable surface into a particular shape, or into one of a set of predetermined shapes, in order to control a function on a remote device.
- the deformable surface may be provided in close proximity to the user, for example, in a wearable device.
- such an apparatus could be a component of, or attached to, another object.
- such an apparatus 57 could be included within the hinge 58 of a foldable, or "clamshell", mobile telecommunications device 59, as shown in Figure 20.
- the apparatus 57 may itself form the hinge 58.
- the apparatus 57 includes a layer of deformable material in which a network of sensors 8 and processing circuits 9, 10 are provided. The configuration of the apparatus may depend on whether the mobile device 59 is in an "open" or "closed" configuration. The output from the sensors 8 of the apparatus 57 can then be used to provide different functionality depending on whether the mobile device 59 is open or closed, in a similar manner to that depicted in Figure 9.
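The open/closed behaviour described above might be sketched as follows. The strain threshold and the two function sets are assumptions chosen for illustration, not details from the patent:

```python
# Illustrative sketch: the hinge apparatus reports a normalised
# deformation value, and the device selects which functions to expose
# depending on whether that value indicates an open or closed state.
# Threshold and function names are hypothetical.

OPEN_FUNCTIONS = {"call", "message", "browse", "camera"}
CLOSED_FUNCTIONS = {"clock", "music"}

def available_functions(hinge_strain, open_threshold=0.5):
    """Choose the exposed function set from the hinge deformation value."""
    return OPEN_FUNCTIONS if hinge_strain > open_threshold else CLOSED_FUNCTIONS

print(sorted(available_functions(0.9)))  # ['browse', 'call', 'camera', 'message']
print(sorted(available_functions(0.1)))  # ['clock', 'music']
```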
- a further apparatus 60 could be provided on, or form part of, a part of the mobile telecommunications device 59 or its chassis.
- the further apparatus 60 is arranged to be manipulated by the user, for instance, to manipulate a cursor or focus region 61 on a display 62 of the mobile telecommunications device 59.
- although the device shown in Figure 20 is a mobile telecommunications device 59, a similar apparatus could be provided within, or even form, the hinge of another type of folding electronic device.
- the apparatus 57 could be provided in a laptop computer, in order to change properties such as brightness or contrast of a displayed image according to an angle of a screen relative to a keyboard.
- a deformable structure forms part of a larger device.
- a deformable structure may be provided as part of a surface or as a projection from a surface of a larger device in other embodiments of the invention.
- the primary deformable surface of the apparatus may be an independent object.
- Figure 21 shows an example apparatus 63, comprising a resiliently deformable substrate 64.
- the substrate 64 is in the form of a layer of material; however, such an apparatus can be provided in other forms.
- the apparatus 63 could take the form of another three-dimensional object, such as a strip, a ball, a cube, and so on.
- a network of detection modules 7 and processing circuits 9, 10 is distributed in the substrate 64, along with the other components shown in Figure 13.
- suitable materials for the substrate 64, sensors 8 and processing circuits 9, 10 include those discussed above in relation to the flexible substrate, sensors 8 and processing circuits 9, 10 of the apparatus 1, shown in Figures 1a and 1b.
- a user can manipulate the apparatus 63 to change its shape. The localised deformations resulting from such shape changes are detected by the sensors 8 and characterised by the processing circuits 9, 10, and a combination signal identifying the transformation and/or a command signal can be transmitted to a remote device 65 via a wireless link 66, as described above with reference to Figure 14.
- the apparatus 63 is used as a remote control for a television 65.
- such an apparatus could be used in a telepresence application, where the combination signal is copied to a remote device arranged to replicate or reconstruct the shape of the apparatus 63, in a similar manner to that described in relation to the robotic device 46 of Figure 17.
Abstract
An apparatus includes a deformable structure in which a neural network comprising a plurality of deformation sensors (8a, 8b), e.g. nanowire sensors, and distributed in-situ processing circuits (9a, 10a) is provided. The circuits generate a signal characterising features of the local deformation of the structure and/or a command signal corresponding to the detected deformation. The structure may be a wearable sleeve that conforms to deformations of a user's skin, part of an electronic device, such as a touch sensitive screen, or an object in itself. The apparatus can provide a user interface wherein a command corresponding to a current shape of the structure is generated and acted upon by an integrated or remote device, or a device for monitoring a user's position or movement, e.g. for replication by a robotic device. The apparatus may have machine learning capability to improve the matching of commands with determined shapes of the deformable structure.
Description
Apparatus
Technical Field
The invention relates to an apparatus comprising an arrangement for sensing its deformation.
Background of the Invention
Recently, there has been a great deal of interest in flexible electronic devices, in particular, wearable electronic devices. Such devices may be capable of detecting the position and/or motion of part of the body of a user, thereby permitting the device to trace postures or movements, for example in a monitoring application or in a robotic application for replicating the user's movements. Alternatively, or additionally, such position and/or motion detection can be employed in a user interface, permitting a device and/or a second, remote, device to be controlled by the user's movements or gestures. The computing resources required for such detection depend upon the complexity of the system and the number of degrees of freedom of the variations monitored by the device. Where a relatively large number of sensors is used, the processing of the sensor outputs, measurement, tracing and/or reconstruction of the postures or movements may require considerable computing resources and may result in a significant delay in providing appropriate control or feedback, for example, when generating a command signal based on the determined user posture. In addition, the electronic circuitry may become complex, with a relatively large number of interconnections between the sensors and the processor.
Similar considerations may apply to touch-sensitive devices, particularly where a high degree of precision is required.
Summary of the Invention
An apparatus according to an embodiment of the invention may comprise a deformable structure and a network including a plurality of sensors arranged to monitor deformation of the deformable structure and a plurality of processing circuits configured to generate a signal characterising features of said deformation, based on the output of two or more of said sensors. The processing circuits are distributed amongst said plurality of sensors.
In other words, an apparatus can be provided with a network of deformation sensors and distributed in-situ processing circuits. Such circuits may be arranged to generate a signal characterising features of the local deformation of the structure and/or a command signal corresponding to the detected deformation. For instance, commands may be associated with one or more predetermined shapes of the structure. The structure may be a wearable sleeve that conforms to deformations of a user's skin, part of an electronic device, such as a touch sensitive screen, or an object in itself. The apparatus can provide a user interface, wherein a command corresponding to a current shape of the structure is generated and acted upon by an integrated or remote device, or a device for monitoring a user's position or movement, e.g. for replication by a robotic device. The apparatus may have machine learning capability to improve the matching of commands with determined shapes of the deformable structure.
Such an arrangement permits computing resources required to process signals from the plurality of sensors to be integrated and distributed in the sensor network. Such "in-situ" processing permits the use of a relatively simple configuration for monitoring a deformation having several degrees of freedom and/or requiring a high level of precision, when compared with conventional mesh networks based around a centralized processor. The network may be a neural network. The distributed processing network may generate one or more output signals that correspond to one or more of: the shape of the deformable structure, a user posture, changes in the shape of the deformable structure, and a user movement. Alternatively, or additionally, a determined posture or movement can be mapped onto a command signal to be acted upon by the apparatus and/or transmitted to a remote electronic device. In another embodiment of the invention, the apparatus may be used to provide a user interface permitting such an electronic device to be controlled using gestures made by a finger, hand or other body part of a user. Such a user interface may provide a highly intuitive control arrangement for applications such as controlling the movement of a cursor on a display, scrolling an image presented on a display and/or movement of a remote apparatus, etc. If the network is trained so that the output signal corresponds to a particular posture or gesture, the required computing resources and the time required to respond to a user input may be reduced.
Any of the above mentioned embodiments may include an electronic device, the functionality of which can be adapted according to the detected shape of the deformable material. For instance, the apparatus may be transformable between first and second shapes and the electronic device may be configured to provide access to a first function of the electronic device when the deformable structure has a first one of said predetermined shapes and to provide access to a second function of the electronic device when the deformable structure is in a second one of said predetermined shapes. The first shape may be an "open" shape of the apparatus, in which a number of functions, including telecommunications functions, are made available to a user, while the apparatus, when in a second "closed" shape, permits a user to access only a limited set of non-telecommunications functions.
The deformable structure may be configured to be worn over a body part. For instance, the deformable structure may be a sleeve of stretchable material configured to fit over a joint, such as a finger or wrist, or other part of a body.
In any of the above embodiments, the spatial distribution of the sensors may be concentrated in one or more areas of the deformable structure. Such a distribution may permit relatively high resolution monitoring of selected regions of the deformable structure. For instance, the regions may be those in which a relatively high degree of deformation is expected. In such a case, the concentration of the sensors in such a region permits the computational resources to be utilized more effectively.
In any of the above embodiments, the sensors may include piezoelectric material. The piezoelectric material may be provided in the form of a plurality of nanowires. Such nanowires may be aligned along a common direction. Alternatively, or additionally, the sensors may include a material that changes its resistivity when stretched. Where both types of sensors are provided, both stretching and flexure of the deformable structure can be detected.
In any of the above embodiments, the sensors may have a plurality of sensing elements, wherein each sensing element is configured to monitor stretching and flexure of at least part of the deformable structure. For example, first and second ones of said plurality of sensing elements may be arranged to monitor said stretching along first and second directions respectively, wherein the first direction is different from the second direction. One or more of the above apparatuses may be provided as part of a mobile telephone, for example, as part of its user interface. The deformable structure may form the body of the mobile telephone.
One or more of the above apparatuses may be provided in a foldable electronic device. For example, the deformable structure may be provided within, or even form, the hinge of such a foldable device. The foldable device may be an electronic telephone.
One or more of the above apparatuses may be provided in a portable computing device.
One or more of the above apparatuses may be provided in an electronic device where said deformable structure forms part of a touch sensitive display.
A system may include one of the above apparatuses together with a second, remote apparatus configured to receive said signal from the apparatus. The second apparatus may be arranged to be controlled according to the signal received from said first apparatus and/or to replicate a posture or movement corresponding to the deformation of said at least part of the deformable structure.
A method according to an embodiment of the invention may include obtaining a plurality of measurements of deformations from a plurality of respective sensors distributed within a deformable structure and generating a signal identifying one or more features of said deformation based on two or more of said measurements, wherein said generation of the signal includes processing said two or more measurements using a plurality of processing circuits distributed amongst said sensors.
The generated signal may be compared with previous signals to determine time-based changes in a shape of said deformable structure. Such time-based changes may then be compared with reference data to identify a corresponding movement or command. Alternatively, or additionally, the generated signal may be compared with reference data in order to identify said one or more features.
Any of the above methods may include mapping a command onto said generated signal or onto the result of any comparison with reference data.
Where reference data is used, the method can include compiling said reference data based on a plurality of shapes of the deformable structure or on a plurality of sets of time-based changes in the shape of the deformable structure. Optionally, the method may also include updating the reference data according to measurements and/or comparisons made in operation.
Any of the above methods may include adjusting a set of functions available on a device according to said generated signal or the result of any comparison with reference data.
Any of the above methods may include transmitting to a remote device a signal corresponding to any one of a shape of the deformable structure, time-based changes in the shape of the deformable structure, a posture of a user manipulating the deformable structure, a movement of such a user and a command corresponding to a shape or to time-based changes. Optionally, the remote device may execute a received command or, alternatively, map a command onto the received signal and then execute the mapped command. Such methods may include causing the remote device to replicate a shape of the deformable structure, time-based changes in the shape of the deformable structure, at least part of a posture of a user or a movement of a user.
Brief Description of the Drawings
Figures 1a and 1b depict an apparatus according to a first embodiment of the invention, in first and second configurations respectively;
Figure 2 is a plan view of a detection module of the apparatus of Figures 1a and 1b; Figure 3 is a cross-sectional view of a sensor of the detection module of Figure 2;
Figure 4 is a cross-sectional view of an alternative sensor that could be used in the detection module of Figure 2;
Figure 5 depicts a general configuration of part of an array of detection modules in the apparatus shown in Figures 1a and 1b; Figure 6 depicts an equivalent circuit of a detection module in the apparatus of Figures 1a and 1b;
Figure 7 is a representation of a neural network of the apparatus of Figures 1a and 1b;
Figure 8 is a flowchart of a method of operating the apparatus of Figures 1a and 1b;
Figure 9 is a block diagram showing a configuration of part of the apparatus of Figures 1a and 1b;
Figure 10 depicts a general configuration of part of an alternative array of detection modules that could be used in the apparatus of Figures 1a and 1b;
Figure 11 depicts an arrangement comprising an apparatus according to a second embodiment of the invention;
Figures 12a to 12d depict the apparatus shown in Figure 11 when conforming to a number of positions of a user's hand;
Figure 13 is a block diagram of the apparatus shown in Figure 11;
Figure 14 is a flowchart of a method performed by the apparatus shown in Figure 11;
Figure 15 is a flowchart of a method performed by a device controlled by the apparatus shown in Figure 11;
Figure 16 is a flowchart of an alternative method performed by a device controlled by the apparatus shown in Figure 11;
Figure 17 depicts a system including an apparatus according to a third embodiment of the invention;
Figure 18 depicts an apparatus according to a fourth embodiment of the invention;
Figure 19 is a flowchart of a method performed by the apparatus of Figure 18;
Figure 20 depicts a mobile telecommunications device including an apparatus according to a fifth embodiment of the invention; and
Figure 21 depicts an apparatus according to a sixth embodiment of the invention.
Detailed Description
Figures 1a and 1b depict an apparatus 1 according to a first embodiment of the invention. The apparatus 1 is a transformable structure comprising an electronic device mounted on a flexible substrate that can be arranged in two or more configurations. Two example configurations are shown in Figures 1a and 1b. Examples of materials suitable for forming the transformable structure include silicone rubber, composite rubber and elastomeric materials, which can provide a substrate for the integration of soft electronic parts. The components of the electronic device may include stretchable thin film coatings of a material such as gold or another metal, deposited directly on the substrate, or nanowire coatings, such as carbon nanotubes, metallic flakes or metal coated microfibers.
The apparatus 1 includes at least one surface 2 arranged to receive user input. In this particular example, the surface 2 is a touch-sensitive display and the apparatus 1 provides different functionalities to a user according to its configuration. In the "open" configuration shown in Figure 1a, the apparatus 1 is arranged to provide the functions of a mobile telephone and the surface 2 is used to display touch keys 3 for receiving user input and a display region 4 for presenting information to a user. In the "closed" configuration of Figure 1b, the mobile telephone functionality is not active. The apparatus 1 is arranged to be worn on a user's wrist and the surface 2 is used to provide other functions, such as a clock 5 and a lock 6.
The apparatus 1 is arranged to determine and characterise its configuration using local measurements of strain and/or flexure from a plurality of detection modules. An example of a detection module 7a is shown in Figure 2.
The detection module 7a, in this particular example, includes a sensor 8 comprising two or more sensing elements 8a, 8b. The sensing elements 8a, 8b are arranged to monitor stretching along respective first and second directions, simultaneously. In this embodiment, the first and second directions are perpendicular to one another. In other embodiments of the invention, the sensor 8 may include a single sensing element or more than two sensing elements. Moreover, the directions along which the sensing elements 8a, 8b detect elongation need not be perpendicular to each other.
The output from the sensor 8 is transmitted to one or more in-situ processing circuits 9a. The output from the one or more processing circuits 9a may, if required, be directed to one or more further processing circuits 10a and/or an application processor (not shown) via a bus 11.
As shown in Figure 2, part of the sensor 8 and the processing circuits 9a, 10a may be mounted on rigid substrates 12a, 12b, 12c. However, the sensing elements 8a, 8b are arranged so that they can stretch and bend to conform with deformations of the apparatus 1. Such a combination of rigid and soft structures, referred to hereinafter as a "semi-rigid" architecture, can be realised by laminating suitable materials for components of the sensor 8 and processing circuits 9a, 10a onto the substrate of the transformable structure. Suitable materials for use in such an architecture can include thin film metal electrodes, piezoelectric ceramic materials (zinc oxide (ZnO), lead zirconate titanate (PZT)), polyvinylidene fluoride (PVDF), elastomeric composites with appropriate sensing properties, suitable nano-structured inorganic materials (highly aligned ZnO nanomaterials) and so on.
The configurations of two example sensing elements 8a, 8a' are shown in Figures 3 and 4. In both example configurations, the sensing element 8a comprises a layer of sensing material 13, with first and second electrodes 14, 15 positioned on either side thereof.
In the example of Figure 3, the sensing layer 13 includes a material having an electrical property, such as charge generation, resistance or capacitance, that changes in response to in-plane and out-of-plane deformations, in other words, when stretched and/or flexed. Depending on the configuration of the detection module 7a and the type of sensing material employed, the detection module 7a may detect flexure, two-dimensional stretching, or a combination of the two. For example, the sensor 8 may monitor one or more of: localised charges, that is, voltages (V), generated by the sensing elements 8a, 8a', 8b in response to stretching and/or flexure; resistance (R) changes, where the sensing material comprises a sensitive piezoelectric, electrostrictive, ferroelectric or similar material, or is a simple elongation-sensitive resistor; or localised capacitance changes, where the sensing material is configured to change its capacitance in response to stretching and/or flexure. These sensing techniques may be used and, if required, combined, in order to monitor complex deformations in three dimensions. In this particular embodiment, the sensing layer 13 is a flexible and stretchable film, formed from a material such as silicone rubber and comprising an array of highly aligned piezoelectric nanowires. The nanowires are formed of a suitable material, such as piezoelectric (ZnO and/or BaTiO3) or resistive (CNT) nanowires, grown or dispersed on the second electrode 15. The first electrode 14 is laminated onto the sensing layer 13. The nanowires are aligned along a predetermined direction. In the examples shown in Figures 3 and 4, the nanowires are aligned perpendicularly and parallel to the electrodes 14, 15 respectively. However, in other embodiments, the nanowires may be aligned along another direction, for example, so that they extend diagonally between the first and second electrodes 14, 15. The first and second electrodes 14, 15 are flexible and stretchable. The first electrode 14 may be formed of a material such as carbon (carbon black) or another material that changes its inherent resistance in accordance with, and/or proportionally to, its elongation.
In this manner, the first electrode 14 can perform two functions, reading signals from the sensing layer 13 undergoing deformation and serving as a simple one-dimensional elongation sensor.
Meanwhile, the second electrode 15 is formed of a material such as gold, silver, copper or indium tin oxide (ITO), the resistance of which does not change significantly when flexed or elongated. The electrodes 14, 15 may be thin metallic films, that is, films having a sub-micrometre thickness, of gold, silver or copper. Such films may, if required, include micro-cracks in order to increase elasticity. Alternatively, the electrodes 14, 15 may be formed of a conductive composite elastic layer in which conductive particles are embedded in an elastic polymeric matrix. The embedded particles may be micro-sized metallic particles, nano-sized carbon fullerenes ("bucky balls"), carbon nano-fibres, microscopic metallic fibres, metallic flakes or metal covered fibre networks. Such elastic composites or thin films on elastomers may be elongated, reversibly, by up to 50%, while remaining electrically conductive. Such a level of reversible stretchability can accommodate the deformation of the apparatus 1 required to move between the open and closed configurations of Figures 1a and 1b. In some embodiments of the invention, the functional components, such as the sensors 8, processing circuits 9a, 10a, bus 11 and the connections therebetween, can be realised by using soft and/or rigid materials on elastic and stretchable substrates 12a-12c. Such functional components are directly deposited on a stretchable substrate forming the apparatus 1, in a so-called "semi-rigid" form of the overall system. Deposition of the functional components can be realised, for example, by using a printing technique. For example, polymer sensors may be provided on the apparatus 1 using ink-jet printing to deposit the sensors 8 and circuitry 9, 10, 11. In other embodiments of the invention, the sensors 8 and other components are prefabricated and embedded into the substrate. In either case, an insulating layer of deformable material may be deposited on the substrate, in order to encapsulate the sensors 8 and circuitry 9, 10, 11, to avoid interference.
The example sensing elements 8a, 8a' can detect localised stretching and flexing of the apparatus 1 as follows. When the substrate of the apparatus 1 is stretched, along direction A, shown in Figures 3 and 4, the first electrode 14 will lengthen, causing a change in its resistance that corresponds to the change in its length. Although the second electrode 15 also lengthens, its resistance is substantially unchanged. If the structure is flexed along direction B in Figures 3 and 4, the first and second electrodes 14, 15 and the nanowires will bend. The nanowires will produce a voltage that is proportional to a bending angle/surface curvature of the sensing layer 13. The polarity of the voltage can be used to indicate whether the sensing layer 13 is being flexed in a concave or convex shape. In this manner, the detection module 7a can be used to generate an output signal indicating elongation along direction A as well as local flexing/bending of the apparatus 1.
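As an illustrative sketch only (not part of the disclosed implementation; the function name, gauge factor and voltage-to-angle constant are assumptions introduced here for clarity), the two readings of such a sensing element might be interpreted as follows:

```python
def interpret_sensing_element(r_measured, r_nominal, v_piezo,
                              gauge_factor=2.0, volts_per_degree=0.01):
    """Interpret one sensing element's outputs.

    r_measured / r_nominal: resistance of the first electrode, which
    changes with elongation along direction A.
    v_piezo: voltage generated by the nanowires when flexed along
    direction B; its polarity distinguishes concave from convex flexing.
    gauge_factor, volts_per_degree: assumed calibration constants.
    """
    # Elongation: resistance change of the first electrode corresponds
    # to the change in its length.
    strain = (r_measured - r_nominal) / (r_nominal * gauge_factor)

    # Flexure: the piezoelectric voltage is proportional to the bending
    # angle / surface curvature of the sensing layer.
    bend_angle = v_piezo / volts_per_degree
    curvature = "convex" if v_piezo > 0 else "concave" if v_piezo < 0 else "flat"

    return strain, bend_angle, curvature
```

A 2 Ω rise on a 100 Ω electrode with a gauge factor of 2 would thus be read as 1% strain, while the sign of the piezoelectric voltage indicates the sense of the flexure.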
In this particular embodiment of the invention, a network of such detection modules 7a-7l and in-situ processing circuits 9a-9d, 10a, 10b is distributed within the apparatus 1, as shown in the partial view of Figure 5. At least some of the processing circuits 9a-9d, 10a, 10b comprise computational capabilities, as will be explained later. In this particular embodiment, at least some of the processing circuits 10a, 10b are connected to the application processor 16 via the bus 11.
Figure 6 depicts an equivalent circuit for a part of a detection module 7a that produces and conditions a signal Sout having a frequency corresponding to the voltage across the sensing layer 13 of one or more sensor elements 8a, 8b.
In this particular example, the detection modules 7a-7l are configured as "spiking neurons". In other words, the detection modules 7a-7l are configured to transmit an output to the processing circuits 9a-9d when the voltages generated by the nanowires of the sensing layer 13 exceed a predetermined threshold. The detection module 7a therefore includes an integrator 17, which integrates the voltage across the sensing layer 13 or layers. When the integrated voltage exceeds the predetermined threshold, the integrator 17 is discharged and a pulse is generated. In this particular embodiment, the integrator 17 is the sensor 8 itself. In other words, the intrinsic capacitance of the sensor 8 is used, and the piezoelectric voltage is used as a variable ground potential therefor. However, in other embodiments, the integrator may be provided in the form of a separate capacitor. The detection module 7a of Figure 6 also provides components for conditioning the signal Sout produced by the sensor 8. The detection module 7a comprises a comparator, in the form of a charge amplifier 18 with high input impedance, provided with a negative feedback path, and a Sigma-Delta modulator 19 to convert the signal Sout into a pulse modulated signal. Signals between and from the detection modules 7a-7l can be transmitted within the apparatus 1 in the form of pulse or spike coded signals, to reduce or remove interference from external electromagnetic fields or any effects from environmental factors, such as temperature variations. For example, the processing circuits 9a-9d could be configured to transmit signals using one of pulse width modulation (PWM), pulse density modulation (PDM), pulse train frequency modulation or spike code modulation. However, various alternative signal coding methods could be used.
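The integrate-and-fire behaviour described above can be sketched as follows (a minimal model only; the threshold value and sample sequence are illustrative assumptions, not disclosed parameters):

```python
class SpikingDetectionModule:
    """Sketch of the 'spiking neuron' behaviour: sensor voltages are
    integrated, and a pulse is emitted (and the integrator discharged)
    once a predetermined threshold is exceeded."""

    def __init__(self, threshold=1.0):
        self.threshold = threshold
        self.charge = 0.0  # models the intrinsic capacitance of the sensor

    def sample(self, voltage):
        """Integrate one sensor voltage sample; return True if a spike fires."""
        self.charge += voltage
        if self.charge >= self.threshold:
            self.charge = 0.0  # discharge the integrator
            return True        # emit a pulse to the processing circuits
        return False

module = SpikingDetectionModule(threshold=1.0)
spikes = [module.sample(v) for v in [0.3, 0.3, 0.3, 0.3, 0.1]]
# the accumulated charge first crosses 1.0 on the fourth sample
```

Because only threshold crossings are transmitted, quiescent sensors generate no traffic, which is consistent with the pulse-coded signalling described above.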
The coding scheme may be configured so that there are time-slots for sending data from the detection modules 7a-7l to processing circuits 9a-9d and, in turn, from the processing circuits 9a-9d to processing circuits 10a, 10b and so on. Such a scheme may avoid the need for an address-based x-y read-out structure, permitting the use of a simpler network configuration with fewer interconnections.
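A time-slot scheme of this kind might be sketched as follows (the module identifiers, slot counts and data values are illustrative assumptions; the sketch assumes no more modules than slots per frame):

```python
def slot_schedule(module_ids, slots_per_frame):
    """Assign each detection module a repeating time slot within a frame,
    so that pulses can be attributed to their source purely from their
    position in time, without an address-based x-y read-out structure."""
    return {m: i % slots_per_frame for i, m in enumerate(module_ids)}

def build_frame(outputs, schedule, slots_per_frame):
    """Place each module's pulse-coded output into its assigned slot;
    empty slots (None) correspond to modules with nothing to send."""
    frame = [None] * slots_per_frame
    for module, value in outputs.items():
        frame[schedule[module]] = value
    return frame
```

A receiving processing circuit then needs only a slot counter, rather than per-module address lines, which matches the reduced-interconnect motivation above.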
Although Figure 6 shows the signal conditioning components 18, 19 as part of a detection module 7a, in other embodiments of the invention, such signal conditioning may instead be performed by at least some of the processing circuits 9a-9d.
Alternatively, signal conditioning components 18, 19 may be provided in both the detection modules 7a and the processing circuits 9a.
The detection modules 7a-7l and processing circuits 9a-9d, 10a, 10b provide a neural network that can classify and recognise the configuration of the apparatus 1. The neural network may be configured to identify a transformation of the apparatus 1 and, if required, to determine the dynamics of such a transformation.
An example of a deformation characterising procedure that could be performed by the processing circuits 9a-9d, 10a, 10b of such a neural network will now be described, with reference to Figure 7. The signals from the sensors 8 are collected, analysed and further forwarded in modified/simplified form by the in-situ processing circuits 9a-9d, 10a, 10b. The processing is implemented in close proximity to the sensors 8, providing analysis of local deformations. Each processing circuit 9a-9d receives input from at least two sensors.
The local deformations measured by the sensors 8 of n detection modules 7 can be expressed as a vector x as follows:

x = [x1, x2, ..., xn]^T    (1)
For the sake of simplicity, the example described below, with reference to Figure 7, will relate to three local measurements x1, x2, x3, which, in this particular example, are deformation parameters based on the voltages measured by the sensors 8 respectively.
The processing circuits 9a-9d, 10a, 10b are configured to map the local measurements x1, x2, x3 onto a higher dimensional feature space. In this particular example, the mapping is made using a nonlinear classifier, in the form of a Radial Basis Function (RBF). However, in the apparatus 1, the described processing may be based on any number of available measurements and/or on another suitable classifier, as required. In this example, the RBF classifier is based on a Gaussian kernel function,

z = exp(-||x - xm||^2 / (2σf^2))    (2)

where xm represents a model vector. The value of σf determines the sharpness of the kernel function. Here, the kernel function can be interpreted as a similarity operator, which indicates how closely the local measurements x1, x2, x3 correspond to the model vector xm. In this simplified example, the local measurements x1, x2, x3 are mapped onto a five dimensional feature space, using classifiers z1 to z5.
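The Gaussian kernel acting as a similarity operator can be sketched as follows (the measurement values, model vector and σf used in the example are illustrative assumptions):

```python
import math

def rbf_classifier(x, x_m, sigma_f):
    """Gaussian kernel as a similarity operator: returns a value close
    to 1 when the local measurements x lie near the model vector x_m,
    and close to 0 when they are far away. sigma_f sets the sharpness."""
    sq_dist = sum((xi - mi) ** 2 for xi, mi in zip(x, x_m))
    return math.exp(-sq_dist / (2.0 * sigma_f ** 2))
```

A larger σf gives a broader, more tolerant kernel; a smaller σf makes the classifier respond only to measurements very close to the model vector.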
The classifiers z1 to z5 are then mapped onto a lower dimensional space that has a sufficient number of dimensions to represent and classify at least the major features of a deformation, or non-deformation, of the apparatus 1. In this simplified example, a two-dimensional state space {y1, y2} is used.
The layer feeding the measurements x1, x2, x3 is referred to as the input layer, the layer calculating the classifiers is the middle layer, and the final layer, which calculates the outputs {y1, y2}, is referred to as the output layer. In this example case of an RBF network, the outputs are weighted sums of the classifiers z1 to z5 plus an additional bias value zbias. In this simplified example, there is one intermediate, or hidden, layer.
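A minimal sketch of one forward pass through such a three-layer RBF network follows; all model vectors, weights and the bias value are illustrative assumptions rather than disclosed parameters:

```python
import math

def rbf_forward(x, models, sigma_f, weights, z_bias):
    """Input layer (three measurements) -> middle layer of Gaussian
    classifiers, one per model vector -> output layer, where each
    output is a weighted sum of the classifiers plus a bias value."""
    z = [math.exp(-sum((xi - mi) ** 2 for xi, mi in zip(x, m))
                  / (2.0 * sigma_f ** 2))
         for m in models]
    return [sum(w_ji * z_i for w_ji, z_i in zip(w_j, z)) + z_bias
            for w_j in weights]

# three measurements -> five classifiers z1..z5 -> two outputs {y1, y2}
models = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0],
          [0.0, 0.0, 1.0], [1.0, 1.0, 1.0]]
weights = [[1.0, 0.0, 0.0, 0.0, 0.0],   # y1 keys on the first classifier
           [0.0, 0.0, 0.0, 0.0, 1.0]]   # y2 keys on the last classifier
y = rbf_forward([0.0, 0.0, 0.0], models, 1.0, weights, 0.0)
```

With the input equal to the first model vector, y1 is driven to its maximum while y2 stays small, illustrating how the output layer separates deformation classes.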
In another embodiment, a traditional artificial neural network could be employed, in which a hidden layer and an output layer form weighted sums of the input measurements and calculate a sigmoid function. Alternatively, a cellular neural/nonlinear network (CNN) could be used.
The parameters of the neural network may be reprogrammable, in order to allow the processing circuits 9a-9d, 10a, 10b to improve matches between desired and actual outputs. In this manner, the apparatus 1 can continue "learning" to recognise predetermined configurations, such as three-dimensional shapes, over its lifetime.
Alternatively, the parameters could be derived during development and the learned values permanently configured in the processing circuits 9a-9d, 10a, 10b of the detection modules 7a-7l. Such learning may take place in a supervised manner, by performing a trial and adjusting the parameters to obtain suitable matches between desired and actual outputs. On the other hand, unsupervised learning techniques may be employed, for example, using self-organizing maps (SOM).
An example of a method for operating the device is depicted in Figure 8. Starting at step s8.0, localised measurements x1, ..., xn of the stretching and/or flexure of the apparatus 1 are obtained by the sensors 8 of the detection modules 7a-7l (step s8.1). Classifier functions z are calculated based thereon (step s8.2), in a similar manner to that described above. The classifier functions are then used to calculate outputs that characterise at least the major features of the stretching and/or flexure of the apparatus 1 (step s8.3) and are used to determine the configuration of the apparatus 1 (step s8.4). The overall configuration of the apparatus 1 can be determined by a centralised processor, such as the application processor 16 shown in Figure 5. Alternatively, a local configuration may be determined, by the application processor 16 or by one or more of the processing circuits 9a-9d, 10a, 10b.
If it is found that the configuration of the apparatus 1 has changed (step s8.5), the application processor 16 determines whether or not the change in configuration requires a change in the functionality to be made available to a user (step s8.6).
For example, if it is determined that the apparatus 1 has been moved into its flat configuration, shown in Figure 1a, the application processor may activate mobile telephone functionality (step s8.7). Alternatively, if it is determined that a user moved the apparatus 1 into its bent configuration shown in Figure 1b, the application processor may deactivate the mobile telephone functionality and activate other features, such as a clock (step s8.7). In either case, the application processor 16 may determine whether the configuration parameters satisfy one or more pre-determined conditions based on particular shapes/configurations, trigger/execute a protocol or function at the apparatus 1 and update displayed information accordingly (step s8.8), completing the procedure (step s8.9).
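The switch of functionality at steps s8.6 and s8.7 can be sketched as follows; the configuration labels, feature names and callback-style interface are hypothetical names introduced here, not part of the disclosure:

```python
# hypothetical labels for the recognised configurations and the
# features associated with each
CONFIG_FEATURES = {
    "open":   ["mobile_telephone", "touch_keys"],
    "closed": ["clock", "lock"],
}

def update_functionality(current, detected, activate, deactivate):
    """If the recognised configuration has changed, deactivate the
    features of the old configuration and activate those of the new
    one; otherwise leave the functionality untouched."""
    if detected == current:
        return current  # step s8.6: no change in functionality required
    for feature in CONFIG_FEATURES.get(current, []):
        deactivate(feature)
    for feature in CONFIG_FEATURES[detected]:
        activate(feature)  # step s8.7
    return detected
```

The same structure extends naturally to further configurations by adding entries to the table rather than changing the control logic.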
Alternatively, or additionally, the processor 16 may be arranged to transmit a wired or wireless signal to a remote device at step s8.7, in order to guide, steer and/or execute commands and functions at the remote device, such as pointer steering on a personal computer or a display device, browsing through menus and/or selecting the most appropriate icon and executing a protocol associated with a selected menu option, internet browsing applications, controlling home appliances, a TV or other home electronic devices, and so on.
If it is found that a change in functionality is not required (step s8.6), the application processor 16 may make changes to the information, images and/or content displayed on the apparatus 1, such as updating content or changing the area of the apparatus 1 on which it is displayed, in accordance with the determined configuration (step s8.8), completing the procedure (step s8.9).
Figure 9 is a block diagram of the apparatus 1. The detection modules 7a-71 and processing circuits 9, 10 provide a neural network 21. In this particular example, the outputs from the neural network are sent to the application processor 16, which runs software stored in memory 22 in order to perform the relevant determinations and updating functions discussed above in relation to Figure 8. The application processor 16 can control the presentation of information on a display 23, as well as mobile phone components in the form of a codec 24, transceiver 25, a microphone 26, a speaker 27 and respective amplifiers (not shown). An antenna 28 is provided for transmitting data to, and receiving data from, a base station (not shown). The apparatus 1 can be powered by a rechargeable battery 29, which can be recharged via charging circuit 30.
In addition to reducing the number of interconnects, when compared with conventional x-y readout arrangements, such as cross-bar readout architectures, direct readout architectures and so on, the neural network 21 produces a signal based on the outputs of multiple ones of the detection modules 7. The processing circuits 9, 10 may be relatively simple, when compared with a conventional processor, potentially reducing both the power consumption and the cost of manufacturing the apparatus 1. Moreover, the use of distributed processing can allow the apparatus 1 to react to user input relatively quickly.
In the above described embodiment, the processing circuits 9a-9d, 10a, 10b are provided separately from the detection modules 7a-7l. However, in other embodiments, some or all of the processing circuitry may be provided within one or more of the detection modules. For example, Figure 10 depicts a detection module 7a' comprising a rigid substrate 12a upon which part of the sensor 8 and processing circuitry 9a', 10a' are provided, in a similar manner to the "semi-rigid" arrangement described above in relation to Figure 2. In particular, the sensors 8 can be realised in the form of soft and flexible structures that connect the rigid substrates 12a on which the processing circuitry 9a', 10a' is mounted.
The processing circuitry 9a', 10a' comprises at least an amplifier and an ADC in order to condition the output of the sensor 8. The output can then be further processed by the processing circuitry 9a', 10a' and/or transmitted to another processing circuit or to the processor 16 via the bus 11.
Distributed neural network arrangements such as that discussed hereinabove can be used to detect deformation in a number of other devices. For instance, Figure 11 depicts a wearable apparatus 31. The apparatus 31 comprises a sleeve 32, arranged to be worn over the hand 33 of a user.
The sleeve 32 is arranged to conform with stretching and relaxation of the skin of the user's wrist 34 and so is capable of resiliently undergoing deformations along arbitrary directions. In this particular example, the sleeve 32 comprises a substrate formed from a soft, conformable material, such as natural rubber, silicone rubber, another elastomer or elastomer-like material, capable of tolerating a strain of around 20%.
A network of detection modules 7 and processing circuits 9, 10 is embedded in the sleeve 32, in a similar manner to that described above in relation to the apparatus 1 of Figures 1a and 1b. The detection modules 7 comprise sensors 8 for monitoring stretching and/or flexing of the sleeve 32 as it conforms to movements of the user's wrist 34, for instance, when moving from a rest position, shown in Figure 11, to one of the other postures shown in Figures 12a to 12d.
In the arrangement shown in Figure 11, the apparatus 31 is arranged to act as a user interface, so that a user can control a remote device via a wireless communications link 35. In this particular example, the remote device is a head-mounted display 36 and the movements of the user's wrist 34 shown in Figures 12a to 12d cause an image output by the head-mounted display 36 to be scrolled to the left, right, up and down respectively.
Figure 13 is a block diagram of the apparatus 31, in which components similar to those discussed in relation to the first embodiment are referred to by the same reference numerals. The apparatus 31 is equipped with a distributed neural network, which includes a plurality of detection modules 7 and processing circuits 9, 10, as described hereinabove in relation to the apparatus 1 of Figures 1a and 1b. The signals output by the detection modules 7 are processed by the processing circuits 9, 10. A combination signal can then be generated and processed by a processor 37, based on the signals from two or more of the detection modules 7, for transmission to the display 36 via the transceiver 25 and antenna 28.
The combination signal may identify at least the major features of the shape of the sleeve 32 and, therefore, the posture of all or part of the user's wrist 34. A processor 38 within the display 36 can then map a command signal onto the determined deformation, for example, by using a look-up table stored in a memory 39.
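One way such a look-up might work is a nearest-reference match between the received combination signal and stored reference outputs; this is a hypothetical sketch, and the reference values, command names and distance metric are all assumptions introduced here:

```python
def map_command(signal, lookup_table):
    """Return the command whose stored reference output lies closest
    (in squared Euclidean distance) to the received combination
    signal (y1, y2), emulating a look-up table held in memory."""
    def sq_dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    reference, command = min(lookup_table,
                             key=lambda entry: sq_dist(signal, entry[0]))
    return command

# hypothetical reference outputs for the four wrist postures of
# Figures 12a to 12d
LOOKUP_TABLE = [
    ((-1.0, 0.0), "scroll_left"),
    ((1.0, 0.0), "scroll_right"),
    ((0.0, 1.0), "scroll_up"),
    ((0.0, -1.0), "scroll_down"),
]
```

Because the match is by proximity rather than exact equality, noisy or partial postures still resolve to the nearest stored command.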
Alternatively, the processor 37 of the apparatus 31 may be arranged to identify the posture of the user's wrist 34 using a look-up table stored in a memory 40. In such an arrangement, the combination signal transmitted by the apparatus 31 to the display 36 may be the command signal itself.
Alternatively, or additionally, the memory 40 of the apparatus 31 may store information about the shape of the sleeve 32 or determined postures, so that the processor 37 can compare a current shape or posture with stored information, identify changes in the shape of the sleeve 32 and determine a movement of the user's hand 33 or wrist 34. In such an embodiment, the signal transmitted to the display 36 may be a signal identifying that movement or, alternatively, a corresponding command signal based thereon.
In this manner, a remote device, such as the display 36, can be controlled by the user's movements instead of, or as well as, the posture of a part of the user's body 33, 34. In other words, the neural network 21 can classify and recognise the configuration of the apparatus 31, identify any conformal transformation and, if required, determine the dynamics of such a transformation. The apparatus 31 is thus arranged to output a combination signal that relates to the output of multiple ones of the detection modules 7. Such an arrangement can reduce, or even minimise, the number of output signals to be conveyed to a remote device, thereby reducing the amount of data to be transmitted and, potentially, improving the speed with which a receiving device can respond to user input. Depending on the configuration of the apparatus 31 and the remote device, the need for additional computing resources in the remote device can be reduced.
Figures 14 and 15 are flowcharts of examples of methods that can be performed by the apparatus 31 and the remote device, the display 36, respectively.
Starting at step s14.0, measurements of localised deformations are obtained by the sensors 8 of the detection modules 7 (step s14.1). The processing circuits 9, 10 of the neural network 21 then calculate classifier functions based on the measurements (step s14.2), as described above in relation to the apparatus 1 of the first embodiment. A combination signal, including one or more outputs characterising the features of at least a local configuration of the apparatus 31, is generated by the processing circuits 9, 10 and/or the processor 37 (step s14.3) and transmitted to the display 36 (step s14.4), completing the process (step s14.5).
Turning now to the display 36 and starting at step s15.0, the combination signal from the apparatus 31 is received by the display 36 (step s15.1) and a corresponding command is determined (step s15.2). The relevant command is then executed (step s15.3). For example, where the deformation of the apparatus 31 indicates that a user wishes to scroll an image presented by the display 36, that image is updated accordingly. Where the remote device is a different type of device, such as a home appliance, TV or computer, the execution of the relevant command may include changing the functionality of the remote device, moving a cursor on a display, calling up a menu and so on. The process is then complete (step s15.4). As noted above, the apparatus 31 may be configured to determine the command corresponding to the configuration (step s14.2) and, in step s14.4, to transmit a corresponding command signal in place of the combination signal, for reception by the display 36 in step s15.1.
Figure 16 depicts a method performed by the display 36 in place of the method of Figure 15, when controlled by movement of the user's hand 33 or wrist 34.
Starting at step s16.0, the combination signal from the apparatus 31 is received (step s16.1) and compared with previously received signals stored in the memory 39 of the display 36 (step s16.2). The movement of the user's hand 33 or wrist 34 is determined, based on differences between the received signal and one or more stored signals (step s16.3). A command corresponding to that movement is identified (step s16.4) and executed by the display 36 (step s16.5), as described above in relation to step s15.3 of Figure 15. The most recently received signal is then stored in the memory 39 (step s16.6) for comparison with subsequent combination signals, and the process ends (step s16.7). As noted above, in other embodiments of the invention, steps s16.2, s16.3 and, optionally, step s16.4 may be performed by the apparatus 31, and a signal identifying the movement or the corresponding command signal may be transmitted to the remote device in place of the combination signal.
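The compare-then-store loop of Figure 16 can be sketched as follows; the signal representation (a numeric vector), the history list and the command-mapping callback are hypothetical names introduced for illustration:

```python
def process_combination_signal(signal, history, command_for_movement):
    """Figure 16 flow, sketched: compare the received signal with the
    most recently stored one, derive a movement from the difference
    (step s16.3), map it onto a command (step s16.4) and store the new
    signal for the next comparison (step s16.6)."""
    # with no stored predecessor, treat the first signal as no movement
    previous = history[-1] if history else signal
    movement = [s - p for s, p in zip(signal, previous)]
    command = command_for_movement(movement)
    history.append(signal)
    return command
```

Keeping more than one past signal in the history would also allow velocity or gesture trajectories, consistent with determining "the dynamics of such a transformation" described earlier.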
While the sleeve 32 of the apparatus 31 of Figures 11 and 12 is configured to fit over a user's hand 33 and to detect the shape of at least part of the sleeve 32 to identify a posture and/or motion of the user's wrist 34, other embodiments of the invention may comprise one or more deformable sleeves, substrates or surfaces in other forms, depending on the particular application of the apparatus. For instance, a wearable apparatus that acts as a user interface or a detector for monitoring a body part of a user in
a remote presence, or telepresence, arrangement may include a sleeve configured to fit over a particular part of a user's body.
Furthermore, Figures 11 and 12 depict an apparatus 31 in which the detection modules 7 and processing circuits 9, 10 are arranged in a uniform distribution around the user's wrist 34. However, in other embodiments of the invention, more or fewer detection modules 7 and processing circuits 9, 10 may be provided, and arranged in other manners.
For instance, Figure 17 depicts an apparatus 41 according to a third embodiment of the invention, in which the spatial distribution of detection modules 7 and processing circuits 9, 10 is concentrated in particular areas 42, 43 of the sleeve 32. In this particular example, the areas 42, 43 correspond to points where a relatively high level of local deformation may be experienced. Such an arrangement can permit the shape of the deformation to be determined with greater precision, by making a relatively high number of measurements in regions where most of the deformation is expected. In other words, the concentration of detection modules 7 in the sleeve 32 can be varied from area to area to allow for localised high resolution tracking capabilities.
In the example shown in Figure 17, the apparatus 41 generates a combination signal based on the deformation of the sleeve 32 as it conforms to the user's hand 33 and wrist 34, as described previously with reference to Figure 14, steps s14.0 to s14.5. In this particular example, the combination signal is transmitted to a computer 44 via a wireless link 45. The computer 44 is arranged to reconstruct the movement of the user's wrist 34 from the combination signal, as described previously with reference to Figure 15, steps s15.0 to s15.4. In this particular example, the computer 44 is arranged to steer a robotic device 46 that replicates the movements of the user's wrist 34. The command signal generated at step s15.2 may thus be a command signal that controls movement of a component of the robotic device 46. The command signal may be transmitted to the robotic device 46 via a network 47. The robotic device 46 then executes the received command (step s15.3). As the deformation of the sleeve 32 and, thus, the posture and/or movement of the user's wrist 34 can be mapped and monitored with high precision by the apparatus 41, the system of Figure 17 may be particularly suitable for teleoperation of the robotic device 46 in applications requiring the replication of delicate movements. For instance, the robotic device 46 may be a remote device containing an array of actuators arranged to perform object movement in accordance with an initial transformation and the received combination signal, to provide remotely controlled object manipulation.
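As one hedged sketch of the teleoperation path, the computer 44 might convert reconstructed wrist angles into per-joint set-point increments, clamped so that delicate movements cannot produce large robot steps; the joint names, gain and 2-degree limit are illustrative assumptions, not part of the disclosure.

```python
def wrist_to_robot_command(wrist_angles, gain=1.0, max_step_deg=2.0):
    """Map monitored wrist angles (e.g. pitch, yaw, roll, in degrees)
    onto joint set-point increments for the robotic device, clamping
    each step so that delicate movements are replicated without
    overshoot. All parameter values here are hypothetical."""
    command = {}
    for joint, angle in wrist_angles.items():
        step = gain * angle
        # Clamp to the per-cycle limit before issuing the increment.
        step = max(-max_step_deg, min(max_step_deg, step))
        command[joint] = round(step, 3)
    return command
```

A fast wrist flick thus translates into several small clamped steps rather than one large jump, which suits the delicate-movement replication described above.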
Alternatively, or additionally, such an arrangement can permit the measurements to be concentrated in such a region so that the deformation can be monitored using fewer detection modules 7 and processing circuits 9, 10, potentially reducing complexity, power consumption and manufacturing costs. While the apparatuses 31, 41 shown in Figures 11 and 17 are intended for controlling a display 36 and a robotic device 46 respectively, the apparatuses 31, 41, or apparatuses similar thereto, may be used to control other types of remote device. For instance, the apparatuses 31, 41 could be used to move a cursor on a display associated with a personal computer, to change channels and/or control volume, brightness and contrast on a television or to control a home appliance. In such devices, the methods of Figures 15 and 16 may be performed by the remote device.
In yet another embodiment, the apparatus 31, 41 may be used to monitor the posture and/or movements of the user and to transmit the combination signal to a remote device, such as a computer 44, for storage and/or analysis. Figure 18 depicts an example of an apparatus 48 according to another embodiment of the invention, in the form of a personal digital assistant device. In the apparatus 48, a network 21 of detection modules 7 and processing circuits 9, 10, as described above, is utilised to provide a touch sensitive user interface 49, or touch screen, through which a user can control processes such as browsing content and services and requesting and selecting items from pop-up menus displayed thereon.
The touch screen 49 includes a resiliently deformable surface in which a distributed network of sensors 8 and processing circuits 9, 10 is provided, as described above in relation to the apparatus 1 of Figure 1.
In this particular example, a user can input information using a stylus 50 or their fingers (not shown) to exert pressure on a particular area of the touch screen 49 to select an icon 51 associated with an application or a function button 52 or to operate a scroll bar 53. The configuration of the apparatus otherwise resembles that shown in the block diagram of Figure 9, with the detection modules 7 sensing localised deformation of the touch screen 49 in the area pressed by the user and producing one or more output signals for transmission to an application processor 16.
Alternatively, the apparatus 48 may be configured as shown in Figure 13 and arranged to transmit a combination signal and/or a command signal to a remote device, such as a mobile terminal, personal computer, home appliance and so on.
The apparatus 48 may be equipped with a handwriting recognition function, so that a symbol 54 input in a particular region 55 of the touch screen 49 can be identified.
Figure 19 depicts a method performed by the apparatus 48 of Figure 18. Starting at step s19.0, measurements of localised deformation are obtained from some or all of the detection modules 7 (step s19.1). Classifier functions and outputs are determined by processing circuits 9, 10, as described above in relation to Figure 5 (steps s19.2, s19.3). The region of the touch screen 49 in which the user input was received is determined (step s19.4) by the application processor 16. If the user input was made outside the handwriting recognition region 55, the application processor 16 determines whether a function has been selected (step s19.6), for example, by the user pressing on an icon 51, button 52 or bar 53. If so, the selected function is executed (step s19.7).
If it is determined that the user input was made within the handwriting region 55 (step s19.5), successive measurements from the detection modules 7 are monitored and compared, for example, using a method similar to that described in relation to steps s16.2, s16.3 and s16.6 of Figure 16 (step s19.8), to build up a record of the user's movements and, therefore, the gesture made by the user. This record is then compared with reference data stored in a memory 18 in order to identify the symbol input by the user (step s19.9) and, where necessary, to store it for use in an appropriate application. The symbol may be an alphabetic character, numeral or punctuation mark.
The display is then updated (step s19.10). In the case where a selected function has been executed, the display is updated to show a corresponding user interface and/or to show that the function has been activated. Where a user has input a symbol, the identified symbol 56 may be displayed on the touch screen 49 so that the user can verify it. The procedure is then complete (step s19.11).
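The routing decision of steps s19.4 to s19.8 above, which sends a touch either to the handwriting recogniser (accumulating the gesture record) or to function selection, can be illustrated with the following sketch; the screen coordinates and icon table are invented purely for the example.

```python
# Hypothetical screen layout: a handwriting region and one icon,
# each given as an axis-aligned bounding box ((x0, y0), (x1, y1)).
HANDWRITING_REGION = ((0, 0), (100, 60))
ICONS = {"mail": ((110, 0), (140, 20))}

def in_region(point, region):
    (x0, y0), (x1, y1) = region
    x, y = point
    return x0 <= x <= x1 and y0 <= y <= y1

def route_touch(point, stroke_record):
    """Steps s19.4 to s19.8: route a touch either to the handwriting
    recogniser (accumulate the stroke) or to icon/function selection."""
    if in_region(point, HANDWRITING_REGION):
        stroke_record.append(point)  # build up the gesture record
        return ("handwriting", None)
    for name, box in ICONS.items():
        if in_region(point, box):
            return ("function", name)
    return ("none", None)
```

Once the stroke record is complete, it would be matched against reference data (step s19.9) by whatever recogniser the embodiment provides.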
The embodiments described above represent examples of apparatuses and systems in which a distributed network of sensors 8 and processing circuits 9, 10 can be used to detect deformation. Such a network can be used in other apparatuses and/or systems. For instance, such a network can be used to monitor the shape of a deformable surface, such as the substrate 2 of the apparatus 1, in order to determine a command to be carried out by a remote device 36. In this manner, a user can manipulate the deformable surface into a particular shape, or into one of a set of predetermined shapes, in order to control a
function on a remote device. In such an apparatus, the deformable surface may be provided in close proximity to the user, for example, in a wearable device.
In another arrangement, such an apparatus could be a component of, or attached to, another object. For instance, such an apparatus 57 could be included within the hinge 58 of a foldable, or "clamshell", mobile telecommunications device 59, as shown in Figure 20. In other embodiments, the apparatus 57 may itself form the hinge 58. In this particular example, the apparatus 57 includes a layer of deformable material in which a network of sensors 8 and processing circuits 9, 10 are provided. The shape of the apparatus 57 may depend on whether the mobile device 59 is in an "open" or "closed" configuration. The output from the sensors 8 of the apparatus 57 can then be used to provide different functionality depending on whether the mobile device 59 is open or closed, in a similar manner to that depicted in Figure 9. Alternatively, or additionally, a further apparatus 60 could be provided on, or form part of, the mobile telecommunications device 59 or its chassis. In this particular example, the further apparatus 60 is arranged to be manipulated by the user, for instance, to manipulate a cursor or focus region 61 on a display 62 of the mobile telecommunications device 59.
Although the device shown in Figure 20 is a mobile telecommunications device 59, a similar apparatus could be provided within, or even form, the hinge of another type of folding electronic device. For example, the apparatus 57 could be provided in a laptop computer, in order to change properties such as brightness or contrast of a displayed image according to an angle of a screen relative to a keyboard.
In the embodiment shown in Figure 20, a deformable structure forms part of a larger device. Such a deformable structure may be provided as part of a surface or as a projection from a surface of a larger device in other embodiments of the invention. Alternatively, the primary deformable surface of the apparatus may be an independent object. Figure 21 shows an example apparatus 63, comprising a resiliently deformable substrate 64. In this particular example, the substrate 64 is in the form of a layer of material; however, such an apparatus can be provided in other forms. For instance, the apparatus 63 could take the form of another three-dimensional object, such as a strip, a ball, a cube, and so on. A network of detection modules 7 and processing circuits 9, 10 is distributed in the substrate 64, along with the other components shown in Figure 13. Examples of suitable materials for the substrate 64, sensors 8 and processing circuits 9, 10 include those discussed above in relation to the flexible substrate, sensors 8 and processing circuits 9, 10 of the apparatus 1, shown in Figures 1a and 1b.
A user can manipulate the apparatus 63 to change its shape. The localised deformations resulting from such shape changes are detected by the sensors 8 and characterised by the processing circuits 9, 10, and a combination signal identifying the transformation and/or a command signal can be transmitted to a remote device 65 via a wireless link 66, as described above with reference to Figure 14. In this particular example, the apparatus 63 is used as a remote control for a television 65. However, in other embodiments, such an apparatus could be used in a telepresence application, where the combination signal is copied to a remote device arranged to replicate or reconstruct the shape of the apparatus 63, in a similar manner to that described in relation to the robotic device 46 of Figure 17.
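One non-limiting way in which such an apparatus could map a measured shape onto one of a set of predetermined shapes, and hence onto a command for the remote device 65, is a nearest-reference match over the localised-deformation readings; the reference vectors, tolerance and command names below are illustrative assumptions.

```python
import math

# Hypothetical reference shapes: each predetermined shape is stored as a
# vector of localised-deformation readings, paired with a TV command.
REFERENCE_SHAPES = {
    "folded_in_half": ([1.0, 1.0, 0.0, 0.0], "mute"),
    "rolled":         ([0.8, 0.8, 0.8, 0.8], "power"),
    "flat":           ([0.0, 0.0, 0.0, 0.0], None),
}

def classify_shape(readings, tolerance=0.5):
    """Return the command for the nearest predetermined shape, or None
    if no reference shape lies within the tolerance."""
    best, best_dist = None, float("inf")
    for name, (reference, command) in REFERENCE_SHAPES.items():
        dist = math.dist(readings, reference)  # Euclidean distance
        if dist < best_dist:
            best, best_dist = command, dist
    return best if best_dist <= tolerance else None
```

The tolerance gate means an ambiguous, partially-deformed shape produces no command at all, which is usually preferable to issuing the wrong one.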
It will be appreciated that many modifications can be made to the embodiments hereinbefore described without departing from the spirit and scope of the invention, including any novel features or any novel combination of features disclosed herein either explicitly or implicitly or any generalisation thereof.
Claims
1. An apparatus comprising: a deformable structure; and a network including a plurality of sensors arranged to monitor deformation of the deformable structure and a plurality of processing circuits configured to generate a signal characterising features of said deformation, based on the output of two or more of said sensors; wherein said processing circuits are distributed amongst said plurality of sensors.
2. The apparatus according to claim 1, wherein said network is a neural network.
3. The apparatus according to claim 1, wherein the deformable structure is arranged to be formed into two or more predetermined shapes, comprising an electronic device, configured to provide access to a first function of the electronic device when the deformable structure has a first one of said predetermined shapes and to provide access to a second function of the electronic device when the deformable structure is in a second one of said predetermined shapes.
4. The apparatus according to claim 1 , wherein the deformable structure is configured to be worn over a body part.
5. The apparatus according to claim 1, configured to output a command signal corresponding to a shape of at least part of the deformable structure.
6. The apparatus according to claim 5, arranged to transmit the command signal to a remote device.
7. The apparatus according to claim 1, configured to output a command signal corresponding to a change in a shape of at least part of the deformable structure.
8. The apparatus according to claim 7, arranged to transmit the command signal to a remote device.
9. The apparatus according to claim 1, wherein said sensors comprise piezoelectric material.
10. The apparatus according to claim 9, wherein said piezoelectric material is provided in the form of a plurality of nanowires.
11. The apparatus according to claim 1, wherein said sensors comprise a material that changes its resistivity when stretched.
12. The apparatus according to claim 1, wherein said sensors comprise a plurality of sensing elements, wherein each sensing element is configured to monitor stretching and flexure of at least part of the deformable structure.
13. The apparatus according to claim 12, wherein first and second ones of said plurality of sensing elements are arranged to monitor said stretching along first and second directions respectively, wherein the first direction is different from the second direction.
14. A mobile telecommunications device comprising an apparatus according to claim 1.
15. A foldable electronic device comprising an apparatus according to claim 1, comprising a hinge, wherein said deformable structure is located in the hinge.
16. A foldable electronic device comprising an apparatus according to claim 1, wherein said deformable structure is provided in the form of a hinge.
17. A portable computing device comprising an apparatus according to claim 1.
18. A system comprising: a first apparatus according to claim 1, arranged to transmit a signal corresponding to said identified features to a remote apparatus; and a second apparatus configured to receive said signal from said first apparatus.
19. The system according to claim 18, wherein said second apparatus is arranged to be controlled according to the signal received from said first apparatus.
20. The system according to claim 18, wherein said second apparatus is arranged to replicate a shape corresponding to the deformation of said at least part of the deformable structure.
21. The system according to claim 18, wherein said second apparatus is arranged to replicate a movement corresponding to successive deformations of said at least part of the deformable structure.
22. An apparatus comprising: a deformable structure; a plurality of sensing means for monitoring deformation of the deformable structure and a plurality of processing means for generating a signal characterising features of said deformation, based on the output of two or more of said sensing means; wherein said processing means are distributed amongst said plurality of sensing means.
23. A method comprising: obtaining a plurality of measurements of deformations from a plurality of respective sensors distributed within a deformable structure; and generating a signal identifying one or more features of said deformation based on two or more of said measurements; wherein said generation of the signal comprises processing said two or more measurements using a plurality of processing circuits distributed amongst said sensors.
24. The method according to claim 23, comprising: comparing said generated signal with previous signals to determine time-based changes in a shape of said deformable structure.
25. The method according to claim 23, wherein said generating comprises comparing said generated signal with reference data in order to identify said one or more features.
26. The method according to claim 23, comprising: mapping a command onto said generated signal.
27. The method according to claim 24, comprising: comparing said time-based changes with reference data to identify a movement corresponding to said time-based changes.
28. The method according to claim 27, comprising mapping a command onto said time-based changes.
29. The method according to claim 25, comprising: updating said reference data in accordance with said generated signal.
30. The method according to claim 27, comprising: updating said reference data in accordance with the comparison of the generated signal with previously generated signals.
31. The method according to claim 23, comprising: adjusting the availability of one or more functions available to a user of a device according to said generated signal.
32. The method according to claim 23, comprising: transmitting a signal corresponding to said features to a remote device.
33. The method according to claim 27, comprising: transmitting a signal corresponding to the identified movement to a remote device.
34. The method according to claim 32, comprising causing said remote device to replicate a shape in accordance with said features.
35. The method according to claim 33, comprising causing said remote device to replicate said movement.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP09772612.9A EP2294500B1 (en) | 2008-06-30 | 2009-06-16 | Apparatus comprising a deformable structure and method for generating a characterizing signal |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/164,587 | 2008-06-30 | ||
US12/164,587 US9519857B2 (en) | 2008-06-30 | 2008-06-30 | Apparatus and method for sensing characterizing features of a deformable structure |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010000925A1 true WO2010000925A1 (en) | 2010-01-07 |
Family
ID=41448454
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/FI2009/050523 WO2010000925A1 (en) | 2008-06-30 | 2009-06-16 | Apparatus comprising a deformable structure |
Country Status (3)
Country | Link |
---|---|
US (1) | US9519857B2 (en) |
EP (1) | EP2294500B1 (en) |
WO (1) | WO2010000925A1 (en) |
Families Citing this family (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101620454A (en) * | 2008-07-04 | 2010-01-06 | 清华大学 | Potable computer |
CN101676832B (en) * | 2008-09-19 | 2012-03-28 | 清华大学 | Desktop computer |
WO2009065436A1 (en) * | 2007-11-19 | 2009-05-28 | Nokia Corporation | Input device |
US20100073263A1 (en) * | 2008-09-22 | 2010-03-25 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware, | E-Paper application control based on conformation sequence status |
WO2011117681A1 (en) * | 2010-03-25 | 2011-09-29 | Nokia Corporation | Contortion of an electronic apparatus |
US9632575B2 (en) | 2010-05-21 | 2017-04-25 | Nokia Technologies Oy | Method, an apparatus and a computer program for controlling an output from a display of an apparatus |
US9823707B2 (en) | 2012-01-25 | 2017-11-21 | Nokia Technologies Oy | Contortion of an electronic apparatus |
US8972310B2 (en) * | 2012-03-12 | 2015-03-03 | The Boeing Company | Method for identifying structural deformation |
AU2014201918B2 (en) * | 2012-03-12 | 2015-06-11 | The Boeing Company | Method and apparatus for identifying structural deformation |
KR101661526B1 (en) * | 2012-04-08 | 2016-10-04 | 삼성전자주식회사 | Flexible display apparatus and user interface providing method thereof |
US9823696B2 (en) | 2012-04-27 | 2017-11-21 | Nokia Technologies Oy | Limiting movement |
US9158332B2 (en) | 2012-10-22 | 2015-10-13 | Nokia Technologies Oy | Limiting movement |
US9158334B2 (en) | 2012-10-22 | 2015-10-13 | Nokia Technologies Oy | Electronic device controlled by flexing |
US9377345B2 (en) | 2013-09-11 | 2016-06-28 | Illinois Tool Works Inc. | Food product scale |
US9545221B2 (en) | 2013-11-27 | 2017-01-17 | Samsung Electronics Co., Ltd. | Electronic system with dynamic localization mechanism and method of operation thereof |
KR102033334B1 (en) * | 2014-02-03 | 2019-10-17 | 한국전자통신연구원 | Wrist wearable apparatus with transformable material |
EP2916210B1 (en) * | 2014-03-05 | 2017-07-12 | Markantus AG | Finger-worn device for providing user input |
US9690408B1 (en) | 2014-09-26 | 2017-06-27 | Apple Inc. | Electronic device with an integrated touch sensing and force sensing device |
US9779676B2 (en) | 2014-09-30 | 2017-10-03 | Apple Inc. | Integrated touch sensor and force sensor for an electronic device |
US10488924B2 (en) * | 2014-12-17 | 2019-11-26 | Korea Electronics Technology Institute | Wearable device, and method of inputting information using the same |
US10101857B2 (en) | 2015-08-28 | 2018-10-16 | Apple Inc. | Methods for integrating a compliant material with a substrate |
US9891770B2 (en) | 2015-08-28 | 2018-02-13 | Apple Inc. | Methods for forming a patterned structure in a sensor |
CN105137743B (en) * | 2015-10-15 | 2018-05-15 | 京东方科技集团股份有限公司 | A kind of intelligent watch |
US10317997B2 (en) * | 2016-03-11 | 2019-06-11 | Sony Interactive Entertainment Inc. | Selection of optimally positioned sensors in a glove interface object |
US9965092B2 (en) | 2016-05-18 | 2018-05-08 | Apple Inc. | Managing power consumption of force sensors |
US9761806B1 (en) * | 2016-09-23 | 2017-09-12 | International Business Machines Corporation | Sensors with integrated data processing circuitry |
WO2018112833A1 (en) * | 2016-12-22 | 2018-06-28 | Intel Corporation | Efficient transferring of human experiences to robots and other autonomous machines |
US10353506B2 (en) | 2017-06-16 | 2019-07-16 | Apple Inc. | Dual resistive strain and pressure sensor for force touch |
US10871847B2 (en) | 2017-09-29 | 2020-12-22 | Apple Inc. | Sensing force and press location in absence of touch information |
CN109521784B (en) * | 2018-12-13 | 2021-05-11 | 华南农业大学 | Touch sensing type wearable upper limb exoskeleton unmanned aerial vehicle control system and method |
CN110207643B (en) * | 2019-05-31 | 2021-02-19 | 闻泰通讯股份有限公司 | Folding angle detection method and device, terminal and storage medium |
WO2021040617A1 (en) * | 2019-08-23 | 2021-03-04 | National University Of Singapore | Wearable body comprising capacitive sensor |
US11243614B2 (en) * | 2019-10-09 | 2022-02-08 | Bose Corporation | Modified force sensitive resistors |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE19836805A1 (en) * | 1998-08-14 | 2000-02-17 | Daum Gmbh | System in form of data glove for scanning human hand movement and entering this in computer and glove is made of stitched flexible material so that it matches human hand back |
US6399939B1 (en) * | 2000-06-13 | 2002-06-04 | North Carolina A&T State University | Sensor array system |
US6701296B1 (en) | 1988-10-14 | 2004-03-02 | James F. Kramer | Strain-sensing goniometers, systems, and recognition algorithms |
US20050013433A1 (en) * | 1997-06-02 | 2005-01-20 | Firooz Ghassabian | Wrist-mounted telephone device |
US20070270199A1 (en) * | 2006-05-16 | 2007-11-22 | Samsung Electronics Co., Ltd. | Watch type mobile phone |
WO2009065436A1 (en) * | 2007-11-19 | 2009-05-28 | Nokia Corporation | Input device |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6586859B2 (en) * | 2000-04-05 | 2003-07-01 | Sri International | Electroactive polymer animated devices |
US20020075232A1 (en) * | 1997-08-15 | 2002-06-20 | Wolfgang Daum | Data glove |
US6940493B2 (en) * | 2002-03-29 | 2005-09-06 | Massachusetts Institute Of Technology | Socializing remote communication |
US7981057B2 (en) * | 2002-10-11 | 2011-07-19 | Northrop Grumman Guidance And Electronics Company, Inc. | Joint motion sensing to make a determination of a positional change of an individual |
US7957554B1 (en) * | 2002-12-31 | 2011-06-07 | Cognex Technology And Investment Corporation | Method and apparatus for human interface to a machine vision system |
WO2007146769A2 (en) | 2006-06-13 | 2007-12-21 | Georgia Tech Research Corporation | Nano-piezoelectronics |
US8471822B2 (en) * | 2006-09-06 | 2013-06-25 | Apple Inc. | Dual-sided track pad |
US7867782B2 (en) * | 2006-10-19 | 2011-01-11 | Agilent Technologies, Inc. | Nanoscale moiety placement methods |
2008
- 2008-06-30 US US12/164,587 patent/US9519857B2/en active Active
2009
- 2009-06-16 WO PCT/FI2009/050523 patent/WO2010000925A1/en active Application Filing
- 2009-06-16 EP EP09772612.9A patent/EP2294500B1/en active Active
Non-Patent Citations (3)
Title |
---|
"Proceedings of the Eighth Annual IEEE International ASIC Conference and Exhibit, 1995., 18-22 Sept. 1995", article EKLUND, J. ET AL.: "Near-sensor image processing, a VLSI realization", pages: 83 - 86, XP010217977 * |
DOMINGUEZ-CASTRO, R. ET AL.: "A 0.8-um CMOS two-dimensional programmable mixed-signal focal-plane array processor with on-chip binary imaging and instructions storage", IEEE JOURNAL OF SOLID-STATE CIRCUITS, vol. 32, no. 7, July 1997 (1997-07-01), pages 1013 - 1026, XP008145159 * |
See also references of EP2294500A4 * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012135935A2 (en) * | 2011-04-06 | 2012-10-11 | Research In Motion Limited | Portable electronic device having gesture recognition and a method for controlling the same |
WO2012135935A3 (en) * | 2011-04-06 | 2012-11-29 | Research In Motion Limited | Gesture recognition on a portable device with force-sensitive housing |
GB2494482A (en) * | 2011-04-06 | 2013-03-13 | Research In Motion Ltd | Gesture recognition on a portable device with force-sensitive housing |
Also Published As
Publication number | Publication date |
---|---|
US9519857B2 (en) | 2016-12-13 |
US20090326833A1 (en) | 2009-12-31 |
EP2294500B1 (en) | 2017-03-29 |
EP2294500A4 (en) | 2014-04-02 |
EP2294500A1 (en) | 2011-03-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9519857B2 (en) | Apparatus and method for sensing characterizing features of a deformable structure | |
Xue et al. | Multimodal human hand motion sensing and analysis—A review | |
Araromi et al. | Ultra-sensitive and resilient compliant strain gauges for soft machines | |
JP7178434B2 (en) | Wearable wireless HMI device | |
Jin et al. | Triboelectric nanogenerator sensors for soft robotics aiming at digital twin applications | |
Kim et al. | A deep-learned skin sensor decoding the epicentral human motions | |
US8519950B2 (en) | Input device | |
Truong et al. | Capband: Battery-free successive capacitance sensing wristband for hand gesture recognition | |
Pan et al. | State-of-the-art in data gloves: A review of hardware, algorithms, and applications | |
Wang et al. | Hydrogel and machine learning for soft robots’ sensing and signal processing: a review | |
Kang et al. | Learning-based fingertip force estimation for soft wearable hand robot with tendon-sheath mechanism | |
JP2023542055A (en) | Interactive tactile perception method for object instance classification and recognition | |
Pyun et al. | Machine-learned wearable sensors for real-time hand-motion recognition: toward practical applications | |
Xue et al. | Human in-hand motion recognition based on multi-modal perception information fusion | |
Jin et al. | Progress on flexible tactile sensors in robotic applications on objects properties recognition, manipulation and human-machine interactions | |
Hu et al. | Machine learning-enabled intelligent gesture recognition and communication system using printed strain sensors | |
Ayodele et al. | A review of deep learning approaches in glove-based gesture classification | |
Duan et al. | Highly Sensitive and Mechanically Stable MXene Textile Sensors for Adaptive Smart Data Glove Embedded with Near-Sensor Edge Intelligence | |
Guo et al. | An on-skin triboelectric sensor system enabled by heterogeneous parallel channel fusion model for human-machine interactions | |
Lin et al. | Multimodal Surface Sensing based on Hybrid Flexible Triboelectric and Piezoresistive Sensor | |
Li et al. | k-Nearest-Neighbour based numerical hand posture recognition using a smart textile glove | |
WO2020246942A1 (en) | Interface device | |
WO2019193601A1 (en) | Piezoelectric sensors integrated hand gesture recognition device | |
KR102604259B1 (en) | Wearable glove type input apparatus | |
Kaewrakmuk et al. | Object Recognition Using Multi-Input Sensor with Convolutional Neural Network |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09772612 Country of ref document: EP Kind code of ref document: A1 |
REEP | Request for entry into the european phase |
Ref document number: 2009772612 Country of ref document: EP |
WWE | Wipo information: entry into national phase |
Ref document number: 2009772612 Country of ref document: EP |
NENP | Non-entry into the national phase |
Ref country code: DE |
WWE | Wipo information: entry into national phase |
Ref document number: 557/CHENP/2011 Country of ref document: IN |