WO2014014408A1 - 3d tactile device - Google Patents
- Publication number: WO2014014408A1
- Application: PCT/SG2012/000258
- Authority: WIPO (PCT)
Classifications
- G06F3/0414 — Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means, using force sensing means to determine a position
- G06F3/04144 — using an array of force sensing means
- G06F3/0488 — Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F2203/04105 — Pressure sensors for measuring the pressure or force exerted on the touch surface without providing the touch position
- G06F2203/04808 — Several contacts: gestures triggering a specific function when the user establishes several contacts with the surface simultaneously
Abstract
A tactile device that can sense both touch forces and their positions on a flat or curved surface. The device consists of a touch position sensing layer (101), a touch force sensing layer (102), a base (103), and a control board (104) with an electronic circuit and a microcontroller. The touch position sensing layer (101) senses one or multiple touch positions. The touch force sensing layer (102) senses the forces applied near those positions. The microcontroller on the control board (104) controls and communicates with the electronic circuit and acquires the touch position and touch force signals for further processing.
Description
3D TACTILE DEVICE
TECHNICAL FIELD
The present disclosure describes a device, algorithms, and techniques to obtain touch force amplitudes and touch positions in 3-dimensional space.
BACKGROUND ART
Tactile sensing in 3-dimensional space is useful in a wide variety of applications, including robotics, gaming, haptics, surgery, health care, animation production, virtual or augmented reality, software and hardware development, tele-operation, and security systems. It is ideal for applications that immerse users in an experience and guide them in navigating real, virtual, augmented, or simulated environments.
Traditional touch screens can only sense touch positions; they are unable to sense the applied force amplitudes at those positions.
There are pressure sensor mats on the market that arrange multiple tactile sensing elements in an array to obtain touch position and amplitude information. However, such tactile sensing arrays have limitations. Firstly, the output signal of a sensing element can only be detected after the applied force exceeds a certain threshold, so a gentle touch of small amplitude cannot be detected. Secondly, a sensing element may be large; when a force is applied on it, the corresponding output signal represents the total force applied over the whole element area, so it is only possible to know that the touch is somewhere within the element, not its exact location. Thirdly, there are gaps between sensing elements. If a force is applied near a gap, the control board may receive output signals from multiple sensing elements and thus cannot determine the exact touch position. It is possible to make a pressure sensor mat with tiny tactile sensing elements and small gaps between adjacent elements, but this is costly.
In view of the drawbacks of the above devices and arrangements, it is the objective of the present invention to augment the functionality of traditional touch screens by adding a touch force sensing feature, and to increase the touch force position sensing resolution.
SUMMARY
According to the invention, a touch position sensing layer and a touch force sensing layer are stacked together to provide touch force sensing in 3D space.
The touch position sensing layer can be constructed with the technologies used to build infrared, resistive, and capacitive touch screens.
The touch force sensing layer consists of tactile sensing elements arranged in an array to form a sensing grid on a flat or curved surface. A tactel is a 'tactile sensing element'; each tactel can detect the force applied on it. Adjacent tactels can be combined to obtain the total touch force over an area. The sensing layer may be in thin-film or rigid form.
The electronic circuit on the control board obtains the touch position signals from the position sensing layer and a force amplitude signal from every tactel on the force sensing layer. For every touch position provided by the position sensing layer, there will be one or more tactels nearby. Based on the total tactel outputs around a touch position, an algorithm, running either in the microcontroller of the control board or on a PC, estimates the total touch force around that touch position.
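The position-force matching described above can be sketched as follows. The tactel pitch, the one-tactel neighbourhood radius, and the data layout are illustrative assumptions, not values from the disclosure:

```python
# Sketch: for each touch position reported by the position sensing layer,
# sum the outputs of the nearby tactels on the (coarser) force sensing layer.

TACTEL_PITCH_MM = 10.0  # assumed tactel size + gap

def total_force_at(touch_xy, tactel_forces, radius_tactels=1):
    """Estimate the total touch force around one touch position.

    touch_xy      -- (x, y) in mm from the position sensing layer
    tactel_forces -- dict mapping (row, col) -> force output of that tactel
    """
    col = int(touch_xy[0] // TACTEL_PITCH_MM)
    row = int(touch_xy[1] // TACTEL_PITCH_MM)
    total = 0.0
    for dr in range(-radius_tactels, radius_tactels + 1):
        for dc in range(-radius_tactels, radius_tactels + 1):
            total += tactel_forces.get((row + dr, col + dc), 0.0)
    return total

# Example: a touch at (14 mm, 7 mm) falls near the boundary of four tactels.
forces = {(0, 0): 0.2, (0, 1): 0.5, (1, 0): 0.1, (1, 1): 0.3}
print(total_force_at((14.0, 7.0), forces))  # ≈ 1.1, the sum of the four outputs
```

The touch keeps the position layer's fine resolution, while the force estimate pools the coarse tactel outputs around it.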
The details of the embodiments are described in the accompanying drawings, descriptions and claims below.
BRIEF DESCRIPTION OF DRAWINGS
FIG. 1 shows the conceptual sketch of a 3D tactile device.
FIG. 2 shows a typical tactel arrangement on a force sensing layer.
FIG. 3 shows a scenario where multiple tactel outputs are detected around one touch point.
FIG. 4 shows a scenario where a single tactel output is detected from one touch point.
FIG. 5 shows a scenario where one tactel output is detected around multiple touch points.
FIG. 6 shows measured touch forces and positions displayed in 3D space.
FIG. 7 shows measured left palm pressure distribution.
FIG. 8 shows finger touch profile on the 3D tactile device. Finger positions, touch forces, and moving paths can be interpreted and used for different applications.
FIG. 9 shows concept of robot force and motion control using the 3D tactile device.
FIG. 10 shows concept of control with one finger.
FIG. 11 shows concept of control with multiple fingers.
FIG. 12 shows concept of rotation control with two fingers.
FIG. 13 shows concept of additional functions of the 3D tactile device.
FIG. 14 shows concept of part polishing with control of the 3D tactile device.
FIG. 15 shows concept of mobile robot control with the 3D tactile device.
FIG. 16 shows concept of 3D tactile device for gaming.
FIG. 17 shows concept of rotation control for gaming.
FIG. 18 shows concept of 3D device for animation production.
DETAILED DESCRIPTION
As shown in FIG. 1, position sensing layer 101 and force sensing layer 102 are combined and complement one another to achieve force sensing in 3-dimensional space. The base 103 holds the two layers 101 and 102.
The position sensing layer 101 can be in the form of a flexible thin-film or a rigid touch screen, such as an infrared, resistive, or capacitive touch screen. In the case of an infrared touch screen, it can be just a frame without a thin-film layer, and the touch force will be applied directly on the force sensing layer.
On force sensing layer 102, the shape of a tactel can be square, rectangular, round, oval, or another shape. As shown in FIG. 2, multiple square tactels are arranged to form a tactel grid, with small gaps between adjacent tactels. The size of a square tactel can range from less than 1 mm × 1 mm to greater than 10 mm × 10 mm. The spatial resolution of the layer depends on the tactel size and the gap: the smaller the tactel size and gap, the higher the spatial resolution, but in turn, the higher the construction cost.
The spatial resolutions of the layers 101 and 102 can be different. For cost reasons, the quantity of tactels used in the force sensing layer may be limited, so the force sensing layer normally has a lower spatial resolution than the position sensing layer.
As shown in FIG. 3, four tactel outputs are obtained around each touch point. From the tactel outputs alone, it is only possible to know that the touch position is somewhere within the area of the four tactels. Such a low spatial resolution touch position estimate is not sufficient for many applications. By combining the position and force information from the electronic circuit on the control board 104, the touch position, obtained from the position sensing layer, and the total applied force around that position, computed from the four tactel outputs, can be determined with the spatial resolution of the position sensing layer. The matching algorithm for this purpose can be implemented in the microcontroller of board 104, or in a computer via the data sent out from the microcontroller.
FIG. 4 shows a scenario where one tactel output is detected around a single touch point. Because the tactel is relatively large, its output can only tell that the touch point is within that tactel. By relating the higher spatial resolution position information from the position sensing layer with the sensed force, both the touch position, at the position sensing layer's resolution, and the corresponding touch force amplitude can be obtained.
FIG. 5 shows a scenario where one tactel output is detected from two finger touches. In this scenario, the force output is the summation of the two finger touch forces.
FIG. 6 shows a measured finger or stylus touch force and position profile in X, Y, and Z coordinates. The touch positions are shown in the X and Y coordinates, and the force amplitudes in the Z coordinate. FIG. 7 shows a measured palm pressure distribution in X and Y coordinates. The number in each tactel represents that tactel's force output: the higher the number, the higher the applied force. The position and force information can be used for various applications, as depicted later in this disclosure.
FIG. 8 shows the concept of finger function assignment. For example, the combination of force and position information from different fingers can be used for motion and force control of the robot manipulator shown in FIG. 9.
The motion of the robot can be controlled by the motion of one finger, as shown in FIG. 10: the robot can be commanded to follow the finger's moving path. In the case of a polishing task, the finger touch force can be used to adjust the contact force between the polished part surface and the polishing tool. The motion of the robot can also be controlled by the motion of multiple fingers, as shown in FIG. 11.
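This one-finger control can be illustrated with a minimal sketch that maps each (position, force) touch sample to a robot waypoint and a contact-force setpoint. The scale factor, the force gain, and the function names are illustrative assumptions, not details from the disclosure:

```python
# Sketch: the robot end-effector follows the finger path, scaled from
# touchpad coordinates (mm) into the robot workspace, while the touch force
# sets the polishing contact-force setpoint.

PAD_TO_ROBOT_SCALE = 2.0   # assumed mm of robot travel per mm of finger travel
FORCE_GAIN = 1.5           # assumed N of contact force per unit of touch force

def finger_sample_to_command(finger_xy, touch_force, origin=(0.0, 0.0)):
    """Convert one (position, force) touch sample into a robot command."""
    x = origin[0] + PAD_TO_ROBOT_SCALE * finger_xy[0]
    y = origin[1] + PAD_TO_ROBOT_SCALE * finger_xy[1]
    return (x, y), FORCE_GAIN * touch_force

# Streaming a short finger stroke with increasing pressure:
stroke = [((0.0, 0.0), 0.5), ((5.0, 0.0), 1.0), ((10.0, 5.0), 2.0)]
for xy, f in stroke:
    print(finger_sample_to_command(xy, f))
```

In practice such commands would be streamed to the robot controller at the device's sampling rate.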
The combination of two fingers' motion information can be used for rotation control of an object, as shown in FIG. 12. In this case, one finger remains stationary while the other sweeps around it. The motion profile can be used to rotate a robot end-effector or another object.
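The two-finger rotation gesture can be sketched as follows: one finger is the pivot, the other sweeps around it, and the swept angle becomes a rotation command. The function and variable names are illustrative, not from the disclosure:

```python
# Sketch: accumulate the angle swept by the moving finger about the pivot,
# unwrapping each step so crossings of the ±pi boundary do not create jumps.
import math

def swept_angle(pivot, path):
    """Total angle (radians) swept by a finger path around a pivot point."""
    angles = [math.atan2(y - pivot[1], x - pivot[0]) for x, y in path]
    total = 0.0
    for a0, a1 in zip(angles, angles[1:]):
        d = a1 - a0
        # unwrap so a single step never jumps by more than pi
        while d > math.pi:
            d -= 2 * math.pi
        while d < -math.pi:
            d += 2 * math.pi
        total += d
    return total

# A quarter-circle counterclockwise sweep around a pivot at the origin:
path = [(math.cos(t), math.sin(t)) for t in (0.0, 0.5, 1.0, math.pi / 2)]
print(math.degrees(swept_angle((0.0, 0.0), path)))  # ≈ 90 degrees
```

The accumulated angle (and its sign) could then drive the rotation of an end-effector axis.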
Some tactels on the 3D tactile device can be defined as buttons that perform certain tasks. As exemplified in FIG. 13, three tactels can be defined as Buttons 1, 2, and 3. For example, in the case of robot force and motion control, Button 1 can be defined such that the robot moves forward with the commanded force when no force is applied on the button, and moves backward when a force is applied on it. The device can also replace a traditional teach-pendant to achieve intuitive and better control performance.
The touch force and position signals obtained from the 3D tactile device can find other applications, such as 3-dimensional irregular surface polishing as shown in FIG. 14. The robot motion in free space can be commanded to follow the finger's moving path, while the contact force follows the force applied on the tactile device. The device can also be used to command the mobile robot shown in FIG. 15 to turn left and right, move back and forth, and accelerate and decelerate. For gaming, a similar concept can be used to control the motion of a racing car or a combat character, as shown in FIGs. 16 and 17.
For animation production, the device's tactile features also facilitate faster editing of a feature. For example, the force amplitude can be used as an input to quickly change the color depth of the carrot shown in FIG. 18.
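A minimal sketch of this force-to-color-depth mapping follows. The full-scale force value and the 8-bit depth scale are illustrative assumptions:

```python
# Sketch: map touch force amplitude to an integer color depth, so pressing
# harder on the tactile device darkens (or saturates) the edited feature.

MAX_FORCE = 10.0  # assumed full-scale force output of a tactel

def force_to_color_depth(force, levels=256):
    """Map a force amplitude to an integer color depth in [0, levels - 1]."""
    clamped = max(0.0, min(force, MAX_FORCE))  # ignore out-of-range readings
    return round(clamped / MAX_FORCE * (levels - 1))

print(force_to_color_depth(0.0))   # 0   (no touch -> lightest)
print(force_to_color_depth(5.0))   # 128 (mid force -> mid depth)
print(force_to_color_depth(10.0))  # 255 (full force -> deepest color)
```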
Claims
1. A 3D tactile device as shown in FIG. 1, comprising a position sensing layer 101, a force sensing layer 102, a base 103 for holding the two layers, and a control board with an electronic circuit and a microcontroller. Both touch positions and the corresponding touch force amplitudes can be obtained from the device.
2. The device as claimed in Claim 1, wherein position sensing layer 101 and force sensing layer 102 are stacked together. The position sensing layer 101 can be in the form of a flexible thin-film or a rigid touch screen, such as an infrared, resistive, or capacitive touch screen. The touch force sensing layer illustrated in FIG. 2 consists of tactile sensing elements arranged side by side to form a tactel grid. The two layers complement one another to achieve force sensing in 3-dimensional space.
3. In the case that the force sensing layer of the device as claimed in Claim 1 has a relatively lower spatial resolution than the position sensing layer, as illustrated in FIG. 3, from the tactel outputs alone it is only possible to know that the touch position is somewhere within the area of the four tactels. By matching the touch position information from the position sensing layer with the total applied force around that position, computed from the four tactel outputs, both the touch force and its touch position can be obtained with the spatial resolution of the position sensing layer, which is higher than that of the force sensing layer. The matching algorithm for this purpose can be implemented in the microcontroller of board 104, or in a computer linked to the microcontroller.
Similarly, for the scenario illustrated in FIG. 4, where one tactel output is detected around a single touch point, the output of the single tactel can only tell that the touch point is within that tactel. By correlating the higher spatial resolution touch position information from the position sensing layer with the sensed force, both the touch position, at the position sensing layer's resolution, and the corresponding touch force amplitude can be obtained.
4. In the case that the force sensing layer of the device as claimed in Claim 1 has a relatively higher spatial resolution than the position sensing layer: if the touch force does not exceed the threshold that can be detected reliably by the force sensing layer, the touch position information from the position sensing layer will still be used; if the touch force exceeds the threshold and can be detected reliably, either the touch position information from the position sensing layer or the computed centroid of the surrounding touch forces can be used.
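The force-weighted centroid mentioned in this claim can be sketched as follows. The tactel centers and the 10 mm pitch are illustrative assumptions:

```python
# Sketch: when the force sensing layer is fine enough, estimate the touch
# position as the force-weighted centroid of the surrounding tactel outputs.

def force_centroid(tactels):
    """Force-weighted centroid of tactel readings.

    tactels -- list of ((x_mm, y_mm), force) tuples; forces must not all be 0
    """
    total = sum(f for _, f in tactels)
    x = sum(p[0] * f for p, f in tactels) / total
    y = sum(p[1] * f for p, f in tactels) / total
    return x, y

# Four tactel centers on a 10 mm pitch; the touch presses hardest on the right.
readings = [((5.0, 5.0), 1.0), ((15.0, 5.0), 3.0),
            ((5.0, 15.0), 1.0), ((15.0, 15.0), 3.0)]
print(force_centroid(readings))  # (12.5, 10.0) -- pulled toward the right pair
```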
5. For the device as claimed in Claim 1, in the scenario illustrated in FIG. 5, where one tactel output is detected from two finger touches, the total force output is the summation of the two finger touch forces.
6. For the device as claimed in Claim 1, FIG. 8 shows the concept of finger function assignment. The combination of force and position information from different fingers can be used for motion and force control of the robot manipulator shown in FIGs. 9 and 14. Some control examples are:
a. In the case of robot force control, the summation of the touch forces from Fingers 1, 2, 3, and 5 can be used as a reference to control the contact force between the contact surface and the polishing tool attached to the robot end-effector shown in FIGs. 9 and 14. The touch force from Finger 4 can be used for force direction control: when a force is detected from Finger 4, the applied force direction is forward; when no force is detected from Finger 4, the applied force direction is backward.
b. The force amplitude difference between Fingers 2 and 5 can be used for robot pitch control. In the case of the robot shown in FIG. 9, Axis 5 will rotate downwards if the force difference between Fingers 2 and 5 is positive, and upwards if it is negative.
c. The force amplitude difference between Fingers 1 and 3 can be used for robot yaw control. In the case of the robot shown in FIG. 9, Axis 1 will rotate in one direction if the force difference between Fingers 1 and 3 is positive, and in the opposite direction if it is negative.
d. The robot end-effector rotation can be controlled by a rotating path computed from Fingers 1-5, or from the motion of just one or two fingers. In the case of the robot shown in FIG. 9, Axis 6 will rotate in one direction if the finger rotating path is clockwise, and in the opposite direction if it is counterclockwise.
e. The motion of the robot can be controlled by the motion of just one finger, as shown in FIG. 10: the robot can be commanded to follow the finger's moving path. In the case of a polishing task, the finger touch force can be used as a reference command to adjust the contact force between the polished part surface and the polishing tool, as illustrated in FIG. 14. Similarly, the motion of the robot can also be controlled by the motion of multiple fingers, as shown in FIG. 11.
f. The combination of two fingers' motion information can be used for rotation control of a robot as well. As shown in FIG. 12, one finger remains stationary while the other sweeps around it. The motion profile can be used to rotate a robot end-effector or Axis 1 of the robot shown in FIG. 9.
This feature will facilitate robot control, teaching, and learning, with the capability to enhance the functions of a conventional teach-pendant.
7. For the device as claimed in Claim 1, the finger function assignment illustrated in FIG. 8 can be used for the mobile robot control shown in FIG. 15. Some control examples are:
a. The force amplitude difference between Fingers 2 and 5 can be used for robot speed or acceleration control. The robot will move forwards if the force difference between Fingers 2 and 5 is positive, and backwards if it is negative. The higher the amplitude difference between Fingers 2 and 5, the higher the speed, acceleration, or deceleration rate.
b. The force amplitude difference between Fingers 1 and 3 can be used for mobile robot turning control. The robot will turn left if the force difference between Fingers 1 and 3 is positive, and turn right if it is negative. The higher the amplitude difference between Fingers 1 and 3, the higher the turning speed, acceleration, or deceleration rate.
This feature will facilitate mobile robot control, teaching, and learning, with the capability to enhance or replace the functions of a conventional joystick for robot control.
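The finger force-difference mappings of Claims 6 and 7 can be sketched as a simple differential controller. The gains and the velocity-command interface are illustrative assumptions, not from the disclosure:

```python
# Sketch: the Finger 2/5 force difference drives forward speed and the
# Finger 1/3 difference drives turning, as in the mobile-robot examples.

SPEED_GAIN = 0.5   # assumed m/s per unit of force difference
TURN_GAIN = 0.2    # assumed rad/s per unit of force difference

def mobile_robot_command(f1, f2, f3, f5):
    """Map four finger forces to (linear_velocity, angular_velocity)."""
    forward = SPEED_GAIN * (f2 - f5)   # positive difference -> move forward
    turn = TURN_GAIN * (f1 - f3)       # positive difference -> turn left
    return forward, turn

# Finger 2 pressed harder than Finger 5, Fingers 1 and 3 balanced:
print(mobile_robot_command(f1=1.0, f2=4.0, f3=1.0, f5=2.0))  # (1.0, 0.0)
```

Larger force differences yield proportionally larger speed or turning commands, matching the "the higher the difference, the higher the rate" behaviour described above.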
8. The device as claimed in Claim 1 also finds application in gaming and virtual or augmented reality to enhance and facilitate user control and play experiences. The finger function assignment illustrated in FIG. 8 can be used for the racing car control shown in FIG. 16 or the character motion control shown in FIG. 17. Some control examples are:
a. The force amplitude difference between Fingers 2 and 5 can be used for speed or acceleration control of the racing car in FIG. 16 or the character in FIG. 17. The car or character will move forwards if the force difference between Fingers 2 and 5 is positive, and backwards if it is negative. The higher the amplitude difference between Fingers 2 and 5, the higher the speed, acceleration, or deceleration rate.
b. The force amplitude difference between Fingers 1 and 3 can be used for turning control. The car or character will turn left if the force difference between Fingers 1 and 3 is positive, and turn right if it is negative. The higher the amplitude difference between Fingers 1 and 3, the higher the turning speed, acceleration, or deceleration rate.
9. The device as claimed in Claim 1 also finds application in animation production to facilitate faster editing of a feature. For example, the force amplitudes can be used as inputs to quickly change the color depth of the carrot shown in FIG. 18.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201280055531.XA CN104641315B (en) | 2012-07-19 | 2012-07-19 | 3D tactile sensing apparatus |
PCT/SG2012/000258 WO2014014408A1 (en) | 2012-07-19 | 2012-07-19 | 3d tactile device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/SG2012/000258 WO2014014408A1 (en) | 2012-07-19 | 2012-07-19 | 3d tactile device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014014408A1 true WO2014014408A1 (en) | 2014-01-23 |
Family
ID=49949110
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/SG2012/000258 WO2014014408A1 (en) | 2012-07-19 | 2012-07-19 | 3d tactile device |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN104641315B (en) |
WO (1) | WO2014014408A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113156602A (en) * | 2021-04-28 | 2021-07-23 | 常州联影智融医疗科技有限公司 | Adaptive fastening device for fixing optical element and control method thereof |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190006962A1 (en) * | 2016-01-05 | 2019-01-03 | Interlink Electronics, Inc. | Multi-modal switch array |
CN107305464A (en) * | 2016-04-25 | 2017-10-31 | 西安中兴新软件有限责任公司 | A kind of control method and device based on pressure sensitive |
CN106598340B (en) * | 2016-12-22 | 2019-08-06 | 四川大学 | Pressure identification device |
CN107357431A (en) * | 2017-07-14 | 2017-11-17 | 信利光电股份有限公司 | A kind of touch control display apparatus and method for realizing three-dimensional touch function |
CN107576261A (en) * | 2017-08-31 | 2018-01-12 | 上海摩软通讯技术有限公司 | Texture acquirement method and mobile terminal |
CN112074706A (en) * | 2018-04-28 | 2020-12-11 | 优泰机电有限公司 | Accurate positioning system |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6492979B1 (en) * | 1999-09-07 | 2002-12-10 | Elo Touchsystems, Inc. | Dual sensor touchscreen utilizing projective-capacitive and force touch sensors |
US20080273013A1 (en) * | 2007-05-01 | 2008-11-06 | Levine James L | Infrared Touch Screen Gated By Touch Force |
US20100315373A1 (en) * | 2007-10-26 | 2010-12-16 | Andreas Steinhauser | Single or multitouch-capable touchscreens or touchpads comprising an array of pressure sensors and the production of such sensors |
US20120113054A1 (en) * | 2009-06-19 | 2012-05-10 | Takao Hashimoto | Resistive film type touch panel with pressing detection function |
Application Events (2)
- 2012-07-19 | WO | PCT/SG2012/000258 | WO2014014408A1 (en) | active, Application Filing
- 2012-07-19 | CN | CN201280055531.XA | CN104641315B (en) | not active, Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
CN104641315A (en) | 2015-05-20 |
CN104641315B (en) | 2017-06-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014014408A1 (en) | 3d tactile device | |
US9495035B2 (en) | Apparatus and method for user input | |
EP2754014B1 (en) | Spherical three-dimensional controller | |
KR101666096B1 (en) | System and method for enhanced gesture-based interaction | |
TW201220131A (en) | Contact-pressure detecting apparatus and input apparatus | |
CN110651238A (en) | Virtual reality/augmented reality handheld controller sensing | |
KR101318244B1 (en) | System and Method for Implemeting 3-Dimensional User Interface | |
EP3209401B1 (en) | A toy construction system and a method for a spatial structure to be detected by an electronic device comprising a touch screen | |
US20150261330A1 (en) | Method of using finger surface area change on touch-screen devices - simulating pressure | |
US11640198B2 (en) | System and method for human interaction with virtual objects | |
US10268282B2 (en) | Foot-operated touchpad system and operation method thereof | |
JP3421167B2 (en) | Input device for contact control | |
JP6735282B2 (en) | Controlling the movement of objects shown in a multi-dimensional environment on a display using vertical bisectors in multi-finger gestures | |
US20120194454A1 (en) | Finger tilt detection in touch surface-based input devices | |
KR101406855B1 (en) | Computer system using Multi-dimensional input device | |
KR101598807B1 (en) | Method and digitizer for measuring slope of a pen | |
Caruso et al. | AR-Mote: A wireless device for Augmented Reality environment | |
KR20180033396A (en) | A ring mouse for a computer | |
KR101596153B1 (en) | Elastomer touchpad for detecting touch force and method for modeling object of virtual space usign elastomer touchpad | |
JP5342354B2 (en) | Haptic presentation device and haptic presentation program | |
JP2018032123A (en) | Operation input device | |
KR101996232B1 (en) | Apparatus and method for user input | |
RU132582U1 (en) | 3D MANIPULATOR - MOUSE | |
RU132581U1 (en) | 3D MANIPULATOR - MOUSE | |
JP2000029620A (en) | Position information input device |
Legal Events
Code | Description |
---|---|
121 | EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 12881223; Country of ref document: EP; Kind code of ref document: A1) |
NENP | Non-entry into the national phase (Ref country code: DE) |
32PN | EP: public notification in the EP bulletin as the address of the addressee cannot be established (Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205N DATED 27/03/2015)) |
122 | EP: PCT application non-entry in European phase (Ref document number: 12881223; Country of ref document: EP; Kind code of ref document: A1) |