GB2589700A - System for transferring haptic information - Google Patents

System for transferring haptic information

Info

Publication number
GB2589700A
GB2589700A
Authority
GB
United Kingdom
Prior art keywords
haptic
information
user
actuator
wearable device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB2013623.0A
Other versions
GB202013623D0 (en)
Inventor
Tada Ryo
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of GB202013623D0
Publication of GB2589700A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves

Abstract

System for transferring haptic information comprising a first wearable device 1803 and a second wearable device 1813 to be mounted on a human or animal body 1801, 1811 by an adhesive or mounting structure (15, Fig. 1); the first device comprising a haptic input sensor and a transmitter for the haptic information, the second device comprising a receiver and a haptic actuator for simulating the haptic sensation based on the received information. The device may be mounted on a fingernail. The haptic input sensor may comprise vibration or position sensors. The first device may further comprise sensors for visual input or sound input. The haptic actuator may be a piezoelectric, a shape memory alloy, an eccentric rotating mass, a linear resonator or other known actuators. The system may further comprise a processor and memory. The devices may have waterproof casing. User sensitivity to haptic sensation may be determined. The actuator may be placed with a soft adhesive.

Description

Title of the invention
System for transferring haptic information
Field of the invention
The present invention relates to the field of electronic haptic interface systems. The invention makes haptic devices mobile and wearable. The invention may be used for communication and for touch sensation, specifically for feeling digitally simulated touch textures.
Background of the invention
Currently, digital experiences are focused on the audio-visual whilst other human senses are fundamentally neglected. Of these other sensory systems, touch is important for creating trust and empathy, as is well exemplified by the bond between mother and child that results from simple touch in the form of skin-to-skin contact. Psychologists have shown that people can communicate six emotions via touch alone: anger, fear, disgust, love, gratitude and sympathy.
Despite the need for touch-based interfaces, there are very few haptic interfaces on the consumer market. Currently, this type of interface is mainly explored through gloves. Gloves are beneficial for immersive experiences such as virtual reality (VR); however, in this form the physical environment is shut out, and thus haptic gloves are hard to use in daily life. Also, current haptic interface devices are large/heavy and require complicated set-up, as exemplified by PHANToMTM or HTC VIVETM.
A need, therefore, exists to provide a system to address at least one of the above problems.
Summary of the invention
According to a first aspect of the present embodiment, there is a system for transferring haptic information from one wearable device to another wearable device. The system includes two or more wearable devices including a first wearable device and a second wearable device, each of the two or more wearable devices configured to be mounted on a part of a human body or an animal body by an adhesive material or mounting structure. The first wearable device includes a haptic input sensor to detect haptic information and a transmitter for transmitting the haptic information detected by the haptic input sensor. The second wearable device includes a receiver for receiving the haptic information; and a haptic actuator for simulating haptic sensation based on the haptic information.
According to a second aspect of the present embodiment, the part of the body includes one or more fingernails.
According to a third aspect of the present embodiment, the haptic input sensor includes one or more of the following sensors: a vibration input sensor for detecting vibration and a position input sensor or computer vision for detecting XYZ coordinate in physical space. The haptic information includes one or more of the following information: vibration and XYZ coordinate in physical space.
According to a fourth aspect of the present embodiment, the first wearable device further includes a sound input sensor for detecting sound information. The transmitter is further configured to transmit the sound information detected by the sound input sensor. The receiver is further configured to receive the sound information transmitted by the transmitter. The second wearable device further comprises a speaker for playing back the sound information.
According to a fifth aspect of the present embodiment, the first wearable device further includes a visual input sensor for detecting visual information. The transmitter is further configured to transmit the visual information detected by the visual input sensor. The receiver is further configured to receive the visual information transmitted by the transmitter. The second wearable device further includes a display for displaying the visual information.
According to a sixth aspect of the present embodiment, the haptic actuator comprises any one or more of the following actuators: a piezo-electric actuator, a shape-memory alloy actuator, an eccentric rotating mass actuator, a linear resonant actuator and any other actuator that reacts via electrical signals and generates vibrations and/or movements.
According to a seventh aspect of the present embodiment, the system further includes a processor for processing the haptic information based on a type of the haptic actuator so that the haptic information is optimized for the type of the haptic actuator.
According to an eighth aspect of the present embodiment, the system further includes a memory for storing the haptic information.
According to a ninth aspect of the present embodiment, each of the two or more wearable devices comprises a waterproof casing for covering components to protect the components from water and enable them to be washed.
According to a tenth aspect of the present embodiment, the haptic simulation is optimized according to the user's sensitivity towards the haptic sensation.
Unless the context dictates otherwise, the following terms will be given the meaning provided here: The term "mount" includes mounting by means of an adhesive or a mounting structure.
The term "a part of a body" includes one or more of the following parts of the human body or animal body: a fingernail, a finger pad, a palm of a hand, a toenail, a sole of a foot, a face, hair, a neck, a shoulder, a stomach, a back, a hip, a bottom, a sex organ, an arm and a leg.
The term "wireless communication" includes Bluetooth, Wi-Fi, radio wave or any other form of data communication.
Brief description of the Drawings
Embodiments of the present invention will be better understood and readily apparent to one of ordinary skill in the art from the following written description, which provides examples only, in conjunction with the drawings, in which:
Fig. 1 is a schematic representation of the invention's components and shows an example of how it might be mounted on a user's body, such as the fingernail area shown in the section drawing;
Fig. 2 is a schematic representation of the invention's system and illustrates how the invention communicates with digital devices;
Fig. 3 illustrates an embodiment in which the invention is mounted on a user's fingernail by adhesive material;
Fig. 4 illustrates an embodiment in which the invention is mounted on a user's fingernail by a mounting structure of clipping shape;
Fig. 5 illustrates an embodiment in which the invention is mounted on a user's fingernail by a mounting structure in the form of a belt, band or ring;
Fig. 6 illustrates an embodiment in which the invention is mounted on a user's fingernail by adhesive material, from viewpoint direction A as shown in Fig. 3;
Fig. 7 illustrates an embodiment in which the invention is mounted on a user's fingernail by a mounting structure of clipping shape, from viewpoint direction A as shown in Fig. 4;
Fig. 8 illustrates an embodiment in which the invention is mounted on a user's fingernail by a mounting structure of clipping shape, from viewpoint direction B as shown in Fig. 4;
Fig. 9 illustrates an embodiment in which the invention is mounted on a user's body surface by adhesive material;
Fig. 10 illustrates an embodiment in which the invention is mounted on a user's body surface by a mounting structure of clipping shape;
Fig. 11 illustrates an embodiment of the invention in which the device is mounted on a user's palm by adhesive material;
Fig. 12 illustrates an embodiment of the invention in which the device is mounted on a user's finger pad by adhesive material;
Fig. 13 illustrates an embodiment of the invention in which the device is mounted on a user's toe by adhesive material;
Fig. 14 illustrates an embodiment of the invention in which the device is mounted on a user's toenail by adhesive material;
Fig. 15 illustrates an embodiment of the invention in which the device is mounted on a user's outside wrist by a mounting structure in the form of a belt or elastic band;
Fig. 16 illustrates an embodiment of the invention in which the device is mounted on a user's inside wrist by a mounting structure in the form of a belt or elastic band;
Fig. 17 is a schematic representation of the invention's data processing;
Fig. 18 is a detailed schematic representation of the invention for the purpose of two-person interaction;
Fig. 19 is an abstract schematic representation of the invention illustrating the user experience of two-person interaction;
Fig. 20 is a schematic representation of the invention illustrating the structure of its components and the system connecting these with other processors; and
Fig. 21 is a schematic representation of the invention's system illustrating the process of optimising the simulated haptic frequency to the user's haptic sensitivity.
Detailed description
Embodiments of the present invention will be described by way of example only, with reference to the drawings. Like reference numerals and characters in the drawings refer to like elements and equivalents.
Wireless communication in some of the following embodiments is described for exemplary purposes only; the person skilled in the art will appreciate that other types of communication, such as wired communication, are equally applicable.
One of the objects of the present invention is to make a haptic interface mobile and wearable for everyday use. The invention can be used for daily communication, such as sending and receiving touch from one person to another. Also, the invention can be used for computer/video/mobile games and retail experiences, such as touching textures virtually when shopping online. The device aims to make the digital haptic experience accessible for everyone, including children and the elderly.
A haptic or tactile interface is a digital device that allows people to feel virtual touch. A virtual touch is a high-fidelity vibration produced by haptic motors whose movement is controlled by an electrical signal, generated and controlled by a computer processor and program.
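As a rough illustration of how such a program might generate the signal driving a haptic motor, the following Python sketch synthesises a sinusoidal vibration waveform. The function name, parameters and sample rate are illustrative assumptions, not details from the patent:

```python
import math

def vibration_waveform(freq_hz, amplitude, duration_s, sample_rate=1000):
    """Synthesise a sinusoidal vibration signal that a driver circuit
    could convert into the electrical signal for a haptic motor."""
    n = int(duration_s * sample_rate)
    return [amplitude * math.sin(2 * math.pi * freq_hz * t / sample_rate)
            for t in range(n)]

# A 250 Hz burst at half amplitude lasting 0.1 s (100 samples at 1 kHz)
wave = vibration_waveform(250, 0.5, 0.1)
```

In practice the waveform would be shaped per actuator type; a linear resonant actuator, for instance, responds best near its resonant frequency.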
The wearable device in the system of the present embodiment can be mounted on a fingernail so that the user experiences a virtual touch from the fingernail and the physical world from the finger pad. The device is for experiencing touch, the touch of both the physical world and the virtual world, like augmented reality (AR). In other words, the invention creates "Augmented Touch": augmenting virtual and physical touch. The form of a fingernail-mounted device can more effectively enhance the user's haptic experience than devices placed under the finger pad.
The wearable device can also be placed on multiple fingernails or other parts of the human body or an animal body. The invention may be worn by means of an adhesive or a mounting structure, both of which are shown in the drawings.
The wearable device consists of at least one haptic actuator which allows users to feel the virtual touch. The haptic actuator could comprise a piezo-electric actuator, a shape-memory alloy actuator, an eccentric rotating mass actuator, a linear resonant actuator or any other actuator that reacts via electrical signals and generates vibrations and/or movements.
The wearable device may include input sensors for capturing the surrounding environment and/or the position of the device. These input sensors may be sound or vibration sensors for detecting the user's touch strength. The input sensor may also be a magnetometer, GPS, RFID, computer vision or another position sensor to detect the XYZ coordinate of the device in physical space.
The touch strength detected by the sensor may be digitised and stored on a computer. The touch strength data may be sent to another person over the internet to "receive touch". The touch strength data may be stored in a memory on the device or an external processor and used for other purposes, such as instrument instructions.
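The digitisation and transmission step described above can be sketched as follows. The quantisation depth, device identifier and JSON packet layout are assumptions for illustration, not details from the patent:

```python
import json

def digitise_touch(samples, bits=12):
    """Quantise raw touch-strength readings in [0.0, 1.0] to integers."""
    levels = (1 << bits) - 1
    # Clamp out-of-range readings before scaling to the integer range.
    return [round(max(0.0, min(1.0, s)) * levels) for s in samples]

# Hypothetical vibration-sensor readings
raw = [0.0, 0.25, 0.5, 1.0]
packet = json.dumps({"device": "nail-01", "touch": digitise_touch(raw)})
# `packet` could then be stored in memory on the device, or sent to
# another person's device over the internet (transport layer not shown).
```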
The wearable device may be connected to computer processing systems such as smartphones and PCs which transmit electrical signals and data, via cable or wireless communication such as Bluetooth. The transmitted data may be optimised for the frequency response of the haptic actuator.
The wearable device may carry a battery or batteries, such as a lithium-polymer battery.
Fig. 1 is a schematic representation of an embodiment of the invention's components, including: a casing for the device 101; at least one haptic actuator 103; at least one printed circuit board (PCB) 105; one or more batteries 107; one or more sound sensors 109; one or more vibration sensors 111; one or more position input sensors 113 to detect the XYZ coordinate in physical space; and a soft adhesive 115 for fixing a haptic actuator 103 onto the casing 101. The device may be placed on a part of the user's body 119 with an adhesive or mounting structure shown in the later figures, Figs. 3-10. The part of the user's body may be their fingernail. A surface 121 which the part of the user's body faces may be a tablet, smartphone or computer screen that detects the XY position of the user's finger location. The surface may be a 2D image when using computer vision to detect the XY position of the user's finger location. The surface may be a physical surface when using the internal position sensor or when not using any XYZ coordinate information.
Fig. 1 illustrates an example of how to mount the invention on a user's body 119, such as a fingernail, with adhesive 117.
The casing 101 for the device can take the form of an artificial fingernail.
The haptic actuator 103 may be of the following types: a piezo-electric actuator, a shape-memory alloy actuator, an eccentric rotating mass actuator, a linear resonant actuator or any other actuator that reacts via electrical signals and generates vibrations and/or movements.
Position input sensors 113 may be a magnetometer, GPS, RFID or other position sensors to detect the XYZ coordinate of the device in physical space.
A soft adhesive 115 is for simulating organic haptic sensations; it is made of a soft material, e.g. soft silicone, jelly or cushion adhesive, and is used for fixing the haptic actuator 103.
The adhesive 117 may be soft silicone, double-sided tape, glue or any other type of adhesive that holds the invention in place on the user's body.
Fig. 2 is a schematic representation of the invention as a system and illustrates how the wearable device 201 in the system communicates with digital devices. The wearable device 201 may communicate with a laptop 205 and/or a smartphone 207 via wireless or cable communication (209, 211). The wearable device may allow the user 203 to place it on a body part, such as a fingernail.
Fig. 3 illustrates an embodiment in which the invention is used with an adhesive material 303 to place the device 301 onto a user's fingernail 305. The device 301 may have an artificial fingernail casing.
Fig. 4 illustrates an embodiment in which the device is mounted with a mounting structure 403 of clipping shape to place the device 401 onto a user's fingernail 405. The device 401 may have an artificial fingernail casing. The mounting structure 403 clips onto the user's fingernail 405 from the side.
Fig. 5 illustrates an embodiment in which the device is mounted with a mounting structure 503 to place the device 501 onto a user's fingernail 505. The device 501 may be formed as an artificial fingernail casing. The mounting structure 503 can be a belt or ring or elastic band or any kind of material to wrap around and fix the device 501 onto the user's fingernail 505.
Fig. 6 illustrates an embodiment in which the device 601 is mounted with an adhesive 603 to place the device onto a user's fingernail 605, from the direction of viewpoint A shown in Fig. 3. The device 601 may be in the form of an artificial fingernail casing.
Fig. 7 illustrates an embodiment in which the device 701 is mounted with a mounting structure 703 of clipping shape to place the device onto a user's fingernail 705, from the direction of viewpoint A shown in Fig. 4. The device 701 may be in the form of an artificial fingernail casing.
Fig. 8 illustrates an embodiment in which the device 801 is mounted with a mounting structure 803 of clipping shape to place the device onto a user's fingernail 805, from the direction of viewpoint B shown in Fig. 4. The device may be in the form of an artificial fingernail casing.
Fig.9 illustrates an embodiment when the device 901 is mounted with an adhesive 905 to place the device onto a user's body surface 903, such as a finger or a hand or any other part of a user's body.
Fig. 10 illustrates an embodiment in which the device 1001 is mounted with a mounting structure 1005 of clipping shape to place the device 1001 onto a user's body surface 1003, such as a finger or a hand or any other part of a user's body. The mounting structure 1005 may clip onto the body surface.
Fig.11 illustrates an embodiment of the invention when the device 1101 is mounted on a user's palm 1103 by an adhesive material.
Fig. 12 illustrates an embodiment of the invention when the device 1201 is mounted on a user's finger pad 1203 by an adhesive material.
Fig. 13 illustrates an embodiment of the invention when the device 1301 is mounted on a user's toe 1303 by an adhesive material.
Fig. 14 illustrates an embodiment of the invention when the device 1401 is mounted on a user's toenail 1403 by an adhesive material.
Fig.15 illustrates an embodiment of the invention when the device 1501 is mounted on a user's outside wrist 1505 by a mounting structure 1503. The mounting structure 1503 can be a form of belt or ring or elastic band or any kind of material to wrap around and fix the device 1501 onto the user's wrist 1505.
Fig. 16 illustrates an embodiment of the invention in which the device 1601 is mounted on a user's inside wrist 1605 by a mounting structure 1603. The mounting structure 1603 can be a belt or ring or elastic band or any kind of material to wrap around and fix the device 1601 onto the user's wrist 1605.
Fig. 17 is a schematic representation of the invention's data processing. The device 1701 includes a printed circuit board (PCB) 1703 to process the data. The device 1701 may connect with one or more other processors 1705, such as laptops or smartphones, by cable or wireless communication. The system may gather environment data 1707, sensing the user's surroundings, such as a surface texture near the user, as well as the user's own data, such as the user's motion. For gathering environment data 1707, the system may include a sound input sensor 1711 and/or a vibration sensor 1713. The system may also sense the device location XYZ 1709. For gathering location XYZ data 1709, the system may include an accelerometer 1715 and/or a magnetometer 1717 and/or an RFID 1719 and/or a camera 1721 on the device 1701. The system may further include computer vision 1733 and/or a screen location XY 1731 on laptops and/or smartphones 1705 to sense the location of the device 1701 in the XYZ coordinate 1709. Alternatively, the system may include the computer vision 1733 and/or the screen location XY 1731 on the device 1701. Data from the sensed environment 1707 may be sent to the PCB's processor 1723, and data from the sensed location 1709 may be sent to the PCB's processor 1725. The data collected on processors 1723, 1725 may be sent to the connected laptops and/or smartphones 1705 via wireless communication 1727, 1729. Data from the sensed environment 1707 may be filtered by a filtering process 1735 on the laptop and/or smartphone 1705. The filtering process 1735 may involve a Fourier transform 1737, equalization 1739, amplification 1741, a pitch shifter 1743 and a band pass filter 1745 to generate an electrical signal whose amplitude and frequencies are optimised for a haptic actuator 1755 on the device 1701.
The location sensing data 1709 may be analysed 1747 and integrated with the filtered data 1735 from the sensed environment 1707; the resulting electrical-signal information is then mapped in a mapping process 1749 to simulate a digital haptic texture according to the device's 1701 location. The haptic texture generated in the mapping process 1749 may be sent to a processor 1753 for the haptic actuator 1755 via wireless communication 1751. The haptic texture is generated by electrical signals that actuate at least one haptic actuator 1755 so that its vibration 1757 will be felt as haptic textures on the user's body 1759. Processors 1723, 1725, 1753 may be the same system on the PCB 1703.
One use of the system shown in Fig. 17 is mobile augmented reality (AR) gaming: the user 1759 uses a smartphone app 1705 and a smartphone camera with computer vision 1733 to detect the device's location 1709 and generate the haptic texture 1749, so that the user 1759 can feel a digital texture through vibration 1757 from the haptic actuator 1755.
The filtering process may instead be performed on the internal processor on the PCB 1703 of the device 1701, without using the external processors 1705.
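The band pass filter 1745 in the filtering chain above can be pictured as a filter that zeroes frequency components outside a pass band. The following pure-Python sketch uses a naive O(n²) discrete Fourier transform for clarity; the patent does not specify an implementation, so this is an illustration only:

```python
import cmath

def bandpass(signal, sample_rate, low_hz, high_hz):
    """Zero every DFT component outside [low_hz, high_hz], then invert."""
    n = len(signal)
    # Forward DFT (naive O(n^2) form; an FFT would be used in practice).
    spectrum = [sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)) for k in range(n)]
    for k in range(n):
        # Bins k and n-k mirror each other in the two-sided spectrum.
        freq = min(k, n - k) * sample_rate / n
        if not low_hz <= freq <= high_hz:
            spectrum[k] = 0
    # Inverse DFT back to the time domain.
    return [sum(spectrum[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]
```

A full chain for an actuator would combine such a filter with the equalization, amplification and pitch-shifting stages named above, tuned to the actuator's usable frequency band.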
Fig. 18 is a detailed schematic representation of the invention for use in two-person interaction. User 1 (1801) can send their touch environment 1821 and/or their motion 1829 to another user, User 2 (1811). User 1 may wear the device 1 (1803), and it may connect with one or more processors 1805, such as smartphones, via wireless communication 1807 for transmitting data related to the touch action to another processor 1815 via the internet 1809. The processor 2 (1815) may send the touch data via wireless communication 1817 to the device 2 (1813). The system illustrated in Fig. 18 allows User 1 (1801) to send their touch, for example a tap or finger stroke, to another person, User 2 (1811), who experiences it as receiving touch from User 1 (1801).
Embodiment 1 (1819) shows that User 1 (1801) can send the touch environment 1821 to User 2 (1811) as a touch simulation 1825. The touch environment 1821 is sensed via sound, texture and location data, which User 2 (1811) experiences by receiving the touch simulation 1825 via vibration or a motion simulation of the touch environment 1821. Embodiment 1 (1819) can be used for touching pets or other animals, and this touch sensation can be sent to other people, such as family members, who live far away. Another use case of embodiment 1 (1819) is that a user can gather touch textures from their daily life and use these data for gaming, such as mobile app games, or as texture samples for an online shopping catalogue.
Embodiment 2 (1827) shows that User 1 (1801) can send their finger or hand motion 1829, which User 2 (1811) experiences by receiving the touch simulation 1833, such as a tap or a stroke, via vibration or a motion simulating the touch motion. Embodiment 2 (1827) can be used to send and receive touches between people. For instance, a family member may send and receive a touch from a child in a hospital or a school dormitory. Touch exchange may also benefit the elderly in care homes, as they can send touch to their family members or doctors. Another use case of embodiment 2 (1827) is between couples in long-distance relationships, who can send and receive touch sensations.
Embodiments 1 (1819) and 2 (1827) may be applied at the same time.
Fig. 19 is an abstract schematic representation of the invention illustrating the experience of two-person interaction. As experienced through the device, User 1 (1901) can transmit haptic texture data and User 2 (1903) can receive the texture data. Also, User 2 (1903) can transmit the haptic texture data and User 1 (1901) can receive it. This can be used for 'touch calls': User 1 (1901) can send a touch motion such as tapping, and User 2 (1903) can receive it in a remote place, over the internet.
Fig. 20 is a schematic representation of the invention and illustrates the structure of its components. The device 2001 may include a vibration sensor 2005 and a sound input 2007, such as a microphone, for motion sensing. Also, for location sensing, the device may include a camera 2009, an RFID 2011, a magnetometer 2013 and an accelerometer 2015. The device 2001 may include at least one haptic actuator 2017. The components are connected to an internal processor 2021 via a bus system 2019. The internal processor 2021 may be connected via wireless communication 2023 with other processors 2003, such as laptops and smartphones. Data will be communicated between the device 2001 and the other processors 2003. Other processors 2003 may use a sound input 2029 for motion sensing of the device 2001. Other processors 2003 may use screen location XY 2031 and/or computer vision 2033 for location sensing of the device 2001. The data gathered by the sound input 2029, screen location XY 2031 and computer vision 2033 may be connected to a processing system 2025 via a bus 2027 and analysed on the processor 2025. These data may be stored on the processor 2025 or sent to other devices 2035 via the internet 2037.
Fig. 21 is a schematic representation of the invention's system illustrating the process of optimising the simulated haptic frequency to the user's haptic sensitivity.
As sensitivity to haptic simulation differs between individuals, the haptic simulation from the device needs to be optimised for each user's sensitivity. A User 2101 wears the device 2102, which has an internal processor or is connected to an external processor 2105, such as a laptop or a smartphone, via wired or wireless communication 2103. The device's internal processor 2102 or the external processor 2105 follows the data-filtering and optimisation steps of Process 1 (2107), Process 2 (2119), Process 3 (2129), Process 4 (2141) and Process 5 (2153). The processor then uses the Optimised Haptic Data 2155 for each user to simulate the haptic sensation.
Process 1 (2107): Model Simulation 2109 is the set of haptic sensations initially programmed in the processor, which consists of the amplitude range Model Range -A (2111) and the frequency range Model Range -F (2110). Model Range -A (2111) is the amplitude range from ao1 (2113) to ao2 (2114). Model Range -F (2110) is the frequency range from fo1 (2115) to fo2 (2117).
Process 2 (2119): Test 2121 is a testing process that presents various ranges of haptic simulation to the User 2101. This range is wider than the Model Simulation 2109, with the aim of finding the sensitivity range of the User 2101. Test Range -A (2123) is the amplitude range from at1 (2124) to at2 (2125). Test Range -F (2122) is the frequency range from ft1 (2126) to ft2 (2127). The starting points of amplitude at1 (2124) and frequency ft1 (2126) can be 0.
Process 3 (2129): User's Feedback 2131 is the process in which the User 2101 sends feedback to the device's internal processor 2102 or the external processor 2105, indicating which amplitudes and frequencies they could feel and which they could not. The User 2101 can also send feedback on which amplitudes and frequencies felt uncomfortable. The feedback information is stored on the processor 2102/2105 for the later optimisation step, Process 4 (2141). User's Sensitivity Range -A (2133) is the amplitude range from al1 (2134) to al2 (2135) that the User 2101 can feel. User's Sensitivity Range -F (2132) is the frequency range from fl1 (2137) to fl2 (2139) that the User 2101 can feel.
Process 4 (2141): Optimising 2143 is the process of mapping the Model Simulation's 2109 haptic simulation onto the User's Feedback range 2131. This process adjusts the model frequency to the User's 2101 haptic sensitivity and creates an ideal haptic experience for each individual. Filtering -A (2149) maps the amplitude data of Model Range -A (2111) onto the User's Sensitivity Range -A (2133), creating the Optimised Amplitude dataset 2151. Filtering -F (2145) maps the frequency data of Model Range -F (2110) onto the User's Sensitivity Range -F (2132), creating the Optimised Frequency dataset 2147.
Process 5 (2153): The Optimised Amplitude dataset 2151 and the Optimised Frequency dataset 2147 together form the Optimised Haptic Data 2155, which is sent back to the Device 2102 to simulate the haptic sensation for the User 2101.
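Processes 4 and 5 amount to re-mapping the model amplitude and frequency ranges onto the user's sensitivity ranges. The patent does not specify the mapping function, so the following is a minimal sketch assuming simple linear interpolation; the function and variable names are illustrative:

```python
def remap(value, src_lo, src_hi, dst_lo, dst_hi):
    """Linearly map `value` from [src_lo, src_hi] onto [dst_lo, dst_hi]."""
    ratio = (value - src_lo) / (src_hi - src_lo)
    return dst_lo + ratio * (dst_hi - dst_lo)

def optimise_haptic(freqs, amps, model_f, model_a, user_f, user_a):
    """Map each model frequency/amplitude into the user's felt ranges,
    producing the Optimised Frequency and Amplitude datasets."""
    return ([remap(f, *model_f, *user_f) for f in freqs],
            [remap(a, *model_a, *user_a) for a in amps])

# Example: model frequencies spanning 100-300 Hz mapped onto a user
# whose feedback says they feel 150-250 Hz at amplitudes 0.1-0.9.
opt_f, opt_a = optimise_haptic([100, 200, 300], [0.2, 0.8],
                               (100, 300), (0.0, 1.0),
                               (150, 250), (0.1, 0.9))
# opt_f == [150.0, 200.0, 250.0]
```

A nonlinear (e.g. perceptually weighted) mapping could replace `remap` without changing the overall structure.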

Claims (11)

  1. A system for transferring haptic information from one wearable device to another wearable device, comprising: two or more wearable devices including a first wearable device and a second wearable device, each of the two or more wearable devices configured to be mounted on a part of a human body or an animal body by an adhesive material or mounting structure; wherein the first wearable device comprises: a haptic input sensor to detect haptic information; and a transmitter for transmitting the haptic information detected by the haptic input sensor, wherein the second wearable device comprises: a receiver for receiving the haptic information; and a haptic actuator for simulating haptic sensation based on the haptic information.
  2. The system according to claim 1, wherein the part of the body includes one or more fingernails.
  3. The system according to claim 1 or 2, wherein the haptic input sensor comprises one or more of the following sensors: a vibration input sensor for detecting vibration; and a position input sensor or computer vision for detecting XYZ coordinates in physical space, and the haptic information comprises one or more of the following information: vibration; and XYZ coordinates in physical space.
  4. The system according to any one of claims 1 to 3, wherein the first wearable device further comprises a sound input sensor for detecting sound information, the transmitter is further configured to transmit the sound information detected by the sound input sensor, the receiver is further configured to receive the sound information transmitted by the transmitter, and the second wearable device further comprises a speaker for playing back the sound information.
  5. The system according to any one of claims 1 to 4, wherein the first wearable device further comprises a visual input sensor for detecting visual information, the transmitter is further configured to transmit the visual information detected by the visual input sensor, the receiver is further configured to receive the visual information transmitted by the transmitter, and the second wearable device further comprises a display for displaying the visual information.
  6. The system according to any one of claims 1 to 5, wherein the haptic actuator comprises any one or more of the following actuators: a piezo-electric actuator, a shape-memory alloy actuator, an eccentric rotating mass actuator, a linear resonant actuator and any other actuator that reacts via electrical signals and generates vibrations and/or movements.
  7. The system according to any one of claims 1 to 6, further comprising a processor for optimizing the haptic information in accordance with a type of the haptic actuator.
  8. The system according to any one of claims 1 to 7, further comprising a memory for storing the haptic information.
  9. The system according to any one of claims 1 to 8, wherein each of the two or more wearable devices comprises a waterproof casing for covering components to protect the components from water and enable the two or more wearable devices to be washed.
  10. The system according to any one of claims 1 to 9, wherein a user's sensitivity towards the haptic sensation is tested for each user and a haptic simulation generated by the haptic actuator is presented based on a result of the test of the user's sensitivity towards the haptic sensation.
  11. The system according to any one of claims 1 to 10, wherein the haptic actuator is placed with a soft adhesive to generate organic haptic sensation.
GB2013623.0A 2019-08-31 2020-09-01 System for transferring haptic information Withdrawn GB2589700A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GBGB1912546.7A GB201912546D0 (en) 2019-08-31 2019-08-31 Method and apparatus for wearable remote commuincation device

Publications (2)

Publication Number Publication Date
GB202013623D0 GB202013623D0 (en) 2020-10-14
GB2589700A true GB2589700A (en) 2021-06-09

Family

ID=68207191

Family Applications (2)

Application Number Title Priority Date Filing Date
GBGB1912546.7A Ceased GB201912546D0 (en) 2019-08-31 2019-08-31 Method and apparatus for wearable remote commuincation device
GB2013623.0A Withdrawn GB2589700A (en) 2019-08-31 2020-09-01 System for transferring haptic information

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GBGB1912546.7A Ceased GB201912546D0 (en) 2019-08-31 2019-08-31 Method and apparatus for wearable remote commuincation device

Country Status (1)

Country Link
GB (2) GB201912546D0 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4414984A (en) * 1977-12-19 1983-11-15 Alain Zarudiansky Methods and apparatus for recording and or reproducing tactile sensations
US20090278798A1 (en) * 2006-07-26 2009-11-12 The Research Foundation Of The State University Of New York Active Fingertip-Mounted Object Digitizer
US20190004604A1 (en) * 2017-06-29 2019-01-03 Apple Inc. Finger-Mounted Device With Sensors and Haptics

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Ryo Tada, 29 June 2019, Royal College of Art Graduate Exhibition, [online] Available from: https://www.rca.ac.uk/showcase/show-2019/schoolofdesign/innovationdesignengineering/ryo-tada/ *
Science Daily, 4 July 2003, "Engineers Develop Technology To Transmit Sensation Of Touch Over Internet", [online], Available from: https://www.sciencedaily.com/releases/2003/07/030701225412.htm *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210405756A1 (en) * 2021-09-03 2021-12-30 Google Llc Integrating Haptic Actuators into Mobile Computing Device Accessories
US11640207B2 (en) * 2021-09-03 2023-05-02 Google Llc Integrating haptic actuators into mobile computing device accessories

Also Published As

Publication number Publication date
GB202013623D0 (en) 2020-10-14
GB201912546D0 (en) 2019-10-16

Similar Documents

Publication Publication Date Title
US11360558B2 (en) Computer systems with finger devices
US10678335B2 (en) Methods, devices, and systems for creating haptic stimulations and tracking motion of a user
JP6669069B2 (en) Detection device, detection method, control device, and control method
US10317997B2 (en) Selection of optimally positioned sensors in a glove interface object
CN104423709A (en) Electrical Stimulation Haptic Feedback Interface
US20210089131A1 (en) Finger-Mounted Input Devices
US10845894B2 (en) Computer systems with finger devices for sampling object attributes
US11526133B2 (en) Electronic devices and systems
US11327566B2 (en) Methods and apparatuses for low latency body state prediction based on neuromuscular data
US11347312B1 (en) Ultrasonic haptic output devices
US11941174B1 (en) Finger pinch detection
GB2589700A (en) System for transferring haptic information
WO2019087502A1 (en) Information processing device, information processing method, and program
US11416075B1 (en) Wearable device and user input system for computing devices and artificial reality environments
Kim et al. Usability of foot-based interaction techniques for mobile solutions
CN107479687A (en) For waiting for an opportunity the touch feedback of display
US11714495B2 (en) Finger devices with adjustable housing structures
CN111610857A (en) Gloves with interactive installation is felt to VR body
Cheung et al. Exploring Acceptability and Utility of Deformable Interactive Garment Buttons
WO2022203697A1 (en) Split architecture for a wristband system and related devices and methods
WO2023167892A1 (en) A hardware-agnostic input framework for providing input capabilities at various fidelity levels, and systems and methods of use thereof
KR20230101498A (en) Method for outputting emotion based on virtual reality and electronic device thereof
WO2022132598A1 (en) Devices, systems, and methods for modifying features of applications based on predicted intentions of users
CN113749662A (en) Composite bioelectrode

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)