US20230142242A1 - Device for Intuitive Dexterous Touch and Feel Interaction in Virtual Worlds - Google Patents

Device for Intuitive Dexterous Touch and Feel Interaction in Virtual Worlds

Info

Publication number
US20230142242A1
US20230142242A1 (application US 17/844,409)
Authority
US
United States
Prior art keywords
virtual
data
housing
user
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/844,409
Inventor
Vivien Cambridge
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
THIKA HOLDINGS LLC
Original Assignee
THIKA HOLDINGS LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by THIKA HOLDINGS LLC filed Critical THIKA HOLDINGS LLC
Priority to US17/844,409 priority Critical patent/US20230142242A1/en
Publication of US20230142242A1 publication Critical patent/US20230142242A1/en
Assigned to THIKA HOLDINGS LLC. Assignment of assignors interest (see document for details). Assignors: CAMBRIDGE, Vivien Johan

Classifications

    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06F ELECTRIC DIGITAL DATA PROCESSING
          • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
                • G06F 3/014 Hand-worn input/output arrangements, e.g. data gloves
              • G06F 3/016 Input arrangements with force or tactile feedback as computer generated output to the user
              • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
                • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
                  • G06F 3/0346 Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
                  • G06F 3/0362 Pointing devices with detection of 1D translations or rotations of an operating part of the device, e.g. scroll wheels, sliders, knobs, rollers or belts


Abstract

A device for dexterous interaction in a virtual world is disclosed. The device includes a housing including a plurality of buttons and a plurality of vibration elements each associated with at least one of the plurality of buttons. An orientation sensor detects orientation of the housing, and a bearing is configured to allow the housing to freely rotate in a plurality of directions. A processor is in communication with the plurality of buttons, the plurality of vibration elements, and the orientation sensor. A transmitter/receiver unit is configured to receive data from the processor and configured to send and receive data from a central processing unit.

Description

    INCORPORATION BY REFERENCE
  • The following documents are incorporated herein by reference as if fully set forth herein: U.S. Provisional Application 62/172,061, filed Jun. 6, 2015; U.S. Provisional Application 62/080,759, filed Nov. 17, 2014; and U.S. Non-Provisional application Ser. No. 14/943,421, filed Nov. 17, 2015.
  • FIELD OF INVENTION
  • The present application relates to a controller, and more specifically relates to a feedback controller for a virtual reality environment.
  • BACKGROUND
  • Humans use their hands, and most particularly their fingers and palms, to physically manipulate and sense objects in and around their environment without exerting much effort. The two primary functions of hands can be broken down into gross motor skills (such as grasping a large object) and fine motor skills (such as picking up a small pebble). Hands also have a tactile sensing ability, which allows a human to detect the surface features of an object. One of the newest developments in 3-D technology is the emergence of low-cost, high-definition stereo headsets. These headsets present the left and right eyes with stereo images and thereby can create the illusion of being inside a computer-generated 3-D world. Head-mounted displays for immersive software have been available for some time, but the newer headsets improve the quality of the 3-D image presented to the user and lower the cost, making the technology available to most users.
  • It is expected that the emergence of low-cost headsets and similar new immersive technologies will sharply increase the number of gamers entering the marketplace for 3-D environment games and applications. However, the technology that allows these users to interact with the 3-D world is lagging behind. The most direct, instinctive, and effective way of interacting is through the use of hands. Currently, one type of interactive gaming technology includes data gloves that can be used by gamers to interact with objects. Data gloves are equipped with sensors that detect the movements of the hand and fingers and interface those movements with a 3-D system running on a computer. When data gloves are used in a virtual reality environment, the user sees an image of a corresponding hand and can manipulate objects in the virtual environment using the glove. Unfortunately, these existing data gloves are expensive, cumbersome, and often provide an inaccurate replication of a user's hand movements. The existing data gloves require that the user physically wear a glove, sometimes for extended periods of time, which can cause discomfort as the user raises their hands and arms to gesture and to manipulate objects. Current data gloves also include complex sensor systems to collect data related to the positions of the fingers and the location and orientation of the hand, which further increases the cost of these devices.
  • Various systems have been proposed that include cameras to capture image data that includes the hands of the users. This image data is converted into parametric data for use in software such as games or 3-D virtual world visualization systems. Systems of the current art, however, require that the hand always be in the field of view of the cameras; the cameras are either stationary, in a location independent of the user's body, or worn on the user's head, in which case their field of view does not necessarily include the user's hands.
  • Neither the data gloves nor the vision-based technologies of the present art allow systems to give force feedback to the user's hand. For example, if the user uses a data glove to reach out and touch a tree in virtual space, the data glove system will not physically stop the hand when it contacts the tree. If users move their virtual hands up and down against the tree, current data glove systems do not provide physical feedback to the user's hand that conveys the texture or roughness of the tree's bark.
  • SUMMARY
  • The present invention is directed, generally, to an inexpensive system that allows a user to comfortably and intuitively control articulation, location, and orientation of a virtual hand in a virtual world and provides physical force feedback to the user's hand related to interactions between the virtual hand and virtual objects in the virtual world. The system of the present application comprises a handheld housing which fits in the palm of a user's hand and comprises a plurality of pressure-sensitive buttons which are positioned to rest under the user's respective fingers as the user grips the handheld housing.
  • As the user presses a specific finger or fingers onto the associated buttons, the corresponding finger or fingers of the virtual hand in the virtual environment are manipulated to reflect the user's movements. For example, if the user presses a specific finger on a respective physical button, then the virtual hand's associated finger will also grip and/or bend. As the user releases the button, the finger of the associated virtual hand in virtual space opens and/or unbends. The button may be equipped with a sensor configured to detect the degree (e.g., depth or strength of pressing) with which the button is pressed, and the associated virtual finger will bend to reflect the finger's movement against the button as detected by the sensor. A minimal sketch of this mapping appears below.
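  • As a concrete illustration, the short Python sketch below clamps a normalized button reading and scales it linearly to a bend angle. All names and the linear mapping are assumptions for illustration only; the specification requires only that a deeper or harder press bends the corresponding virtual finger further.

    # Hedged sketch of the pressure-to-flexion mapping described above.
    # The linear scaling and all names are illustrative assumptions.
    MAX_FLEXION_DEG = 90.0  # assumed full curl of a virtual finger

    def flexion_from_pressure(pressure: float) -> float:
        """Map a normalized button reading (0.0-1.0) to a bend angle in degrees."""
        pressure = max(0.0, min(1.0, pressure))  # clamp sensor noise
        return pressure * MAX_FLEXION_DEG

    FINGERS = ("thumb", "index", "middle", "ring", "little")

    def virtual_hand_pose(button_pressures: dict) -> dict:
        """Return a bend angle for each virtual finger from the button readings."""
        return {f: flexion_from_pressure(button_pressures.get(f, 0.0))
                for f in FINGERS}

    # A light touch on the index finger and a firm press of the thumb:
    print(virtual_hand_pose({"index": 0.2, "thumb": 0.9}))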
  • In one embodiment, the handheld housing of the device of the present invention furthermore includes a gyroscope for detecting multiple degrees of motion, with as full a range of motion as possible, and for detecting the orientation of the housing of the handheld device. The housing of the handheld device is attached to the grip of a 3-D computer mouse and force feedback controller, which controller is fixed in space. A bearing that attaches the housing to the 3-D computer mouse and force feedback controller allows the housing to freely move in three dimensions and through a range of motion. Therefore, the user can change the orientation of the virtual hand in the virtual space by intuitively changing the orientation of the housing of the handheld device. Since the housing of the handheld device is physically attached, through the bearing, to the grip on the 3-D computer mouse and force feedback controller, the user can move the handset in three dimensions and through a range of motion to change the associated location of a virtual hand in a virtual space. The buttons of the housing of the handheld device include an actuator and vibrator for imparting onto the fingers of the user tactile feedback related to data generated by the 3-D visualization software of the virtual world.
  • The buttons and the gyroscope are connected to a processor in the housing of the handheld device. The processor creates, collects, processes, and/or transmits data related to the articulation of the virtual hand to a processor for use in software that uses such data. That software can be a 3-D visualization system, such as a 3-D world viewed through a 3-D headset. The software could be implemented in known 3-D headsets, such as Oculus Rift and Samsung Gear VR. The software also generates tactile data for actuating and vibrating the buttons in relation with and/or in response to the user's experience in the virtual world. The tactile data is transmitted to the processor for controlling said buttons. Thus, the system translates the movement of a user's hand so that it is reflected in a virtual world, and also translates actions in the virtual world back to a user's hand through feedback.
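  • The round trip just described (articulation data out, tactile data back) can be sketched as two simple records plus a stand-in for the visualization software. The record layout and the 45-degree contact threshold are assumptions for illustration; the patent does not prescribe a data format.

    from dataclasses import dataclass

    @dataclass
    class ArticulationUpdate:  # device processor -> visualization software
        finger_flexion: dict   # bend angle per finger, in degrees
        orientation: tuple     # e.g. (roll, pitch, yaw) from the gyroscope

    @dataclass
    class TactileUpdate:       # visualization software -> device processor
        vibration: dict        # vibration intensity per button, 0.0 to 1.0

    def visualization_software(update: ArticulationUpdate) -> TactileUpdate:
        """Stand-in world model: vibrate any finger bent far enough to touch."""
        return TactileUpdate(vibration={f: 1.0 if bend > 45.0 else 0.0
                                        for f, bend in update.finger_flexion.items()})

    sensed = ArticulationUpdate(finger_flexion={"index": 70.0, "thumb": 10.0},
                                orientation=(0.0, 15.0, 0.0))
    print(visualization_software(sensed))  # only the index finger gets feedback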
  • The present device is configured to be operated by a human hand for controlling the placement, movement, articulation, and gestures of a corresponding virtual hand in a virtual world, and it provides tactile feedback to the user from the associated virtual world. According to an aspect of the invention, the articulation of a hand may be considered the positioning of the palm and the fingers in space through a range of motion. The device is preferably compact, such that the device fits in the palm of the hand and, when the hand naturally and comfortably grasps the device, the fingers of the hand rest on buttons. An ergonomic design is preferred. The device preferably has at least five primary buttons, one corresponding to each finger. However, one of ordinary skill in the art would recognize from this disclosure that any number of buttons could be provided. For example, one embodiment can include three or four buttons per finger, to accommodate each segment of a user's respective finger and the related movement of the phalanges. The buttons are preferably movement and/or pressure sensitive so that the user can flex individual fingers by slightly pressing and releasing the buttons associated with each finger. Depending on the degree of pressure applied, the corresponding motion of the associated finger will vary. The buttons of the device also preferably vibrate and move to provide tactile feedback to the user which is associated with touch experienced in the virtual world by the fingers associated with each button. In other words, the device allows the virtual hand to send sensory output to the user's actual hand.
  • The device of the present invention senses changes in the orientation of the handheld device so that the user can naturally and intuitively control the orientation of the hand in the virtual world. To accommodate any variations in the orientation and manipulation of the device by the user's hand, the device preferably includes an accelerometer or gyroscope for detecting rotation of the device. In addition, the device attaches to a 3-D computer mouse and force feedback controller. A 3-D mouse and feedback controller is a combined device that has a grip which can be held and moved in three dimensions by a user to enter data related to 3-D position into a computer. The 3-D mouse and feedback controller include motors that can impart forces onto the grip so that the user receives force feedback from software running in the computer. Accordingly, the location of the device of the present invention in 3-D space can be detected and transmitted to virtual world software which allows the user to control the location of a virtual hand and allows force feedback to be applied to the user's hand through the device.
  • The device of the present invention is designed to limit the actual amount of movement of the user's hand, while software implementing relative positioning algorithms moves the user's virtual hand in the virtual world through its entire range of motion. This promotes ease of use of the system, particularly by limiting use of the user's arm and shoulder.
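  • A minimal sketch of one such relative positioning algorithm, assuming a simple constant gain (the application names the idea but not a particular method): small physical offsets of the housing from a rest origin are scaled so that the virtual hand sweeps its full range.

    GAIN = 8.0  # assumed scale: virtual units of travel per physical unit

    def virtual_position(origin, device_offset):
        """Scale a small physical offset of the housing into virtual-world travel."""
        return tuple(o + GAIN * d for o, d in zip(origin, device_offset))

    # A 5 mm push forward moves the virtual hand 40 mm forward in the world:
    print(virtual_position((0.0, 0.0, 0.0), (5.0, -2.0, 1.0)))  # (40.0, -16.0, 8.0)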
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 a shows a side view of the device according to one embodiment.
  • FIG. 1 b shows a bottom view of the device according to one embodiment.
  • FIG. 1 c shows an alternative embodiment of a housing of the device.
  • FIG. 1 d shows a schematic diagram of an embodiment of the housing of the device.
  • FIG. 2 shows an internal view of components within the device of FIGS. 1 a -1 d.
  • FIG. 3 shows a perspective view of the device including degrees of movement axes.
  • FIG. 4 shows a diagram of the device of FIGS. 1 a, 1 b , and 2 with a computer system according to one embodiment.
  • FIG. 5 shows a flow chart illustrating the steps for interaction between the device and the computer according to one embodiment.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • As shown in FIG. 2 , a first embodiment of the device 100 includes a housing 1 which is configured to fit comfortably in the palm of the user's hand 200. The housing 1 preferably has a generally ellipsoid shape, which corresponds to a user's palm and closed fist. Preferably, the housing 1 has an ergonomic shape providing for fit and comfort. The housing may comprise or be otherwise covered by a flexible or elastic material, such as a rubberized material, or NEOPRENE, for comfort. The housing 1 preferably includes a plurality of buttons 2, 4, 6, 8, 10. In one embodiment, the housing 1 includes five buttons 2, 4, 6, 8, 10 which are positioned on the housing 1 such that when the user grasps the housing 1 with their palm positioned comfortably, the buttons 2, 4, 6, 8, 10 rest under the tips of the fingers of the user's hand. As shown in FIG. 1 c , in a multiple-button embodiment, a button may rest under each section or portion of a user's finger. As shown in FIG. 1 c , the housing 1′ includes a plurality of finger buttons 2 a, 2 b, 4 a, 4 b, 6 a, 6 b, 8 a, 8 b, 10 a, 10 b, as well as a button 23 for the palm.
  • Those of skill in the art will recognize from this disclosure that a different number and configuration of buttons could be used within the scope of the invention. For example, in one embodiment, the housing 1 can include, by way of illustration, fifteen buttons, with five sets of three buttons, each of the five sets configured for engagement with a respective finger. In another embodiment, a track-ball or scroll-wheel can be provided on the housing 1 for manipulation by a user's hand. Each of the plurality of buttons 2, 4, 6, 8, 10 is associated with a corresponding one of a plurality of vibration elements 3, 5, 7, 9, 11. In one embodiment, each of the plurality of buttons 2, 4, 6, 8, 10 is arranged directly in physical engagement with a respective one of the plurality of vibration elements 3, 5, 7, 9, 11.
  • Each of the plurality of vibration elements 3, 5, 7, 9, 11 causes its associated button 2, 4, 6, 8, 10 to vibrate, which imparts tactile feedback to the finger associated with said button 2, 4, 6, 8, 10. In one embodiment, the housing 1 furthermore comprises an orientation sensor 30, preferably a gyroscope 30 a in communication with or otherwise comprising a sensor 29, for detecting and measuring the orientation of the housing 1. In one embodiment, the orientation sensor 30 includes a gyroscope 30 a and an accelerometer 30 b. In another embodiment, the orientation sensor 30 can also include a heartrate sensor. The gyroscope 30 a is configured to detect a plurality of types of motion, including but not limited to displacement, velocity, acceleration, tilting, rotation, and inversion. The plurality of buttons 2, 4, 6, 8, 10, the plurality of vibration elements 3, 5, 7, 9, 11, and the gyroscope 30 a are preferably associated with a processor 12. The plurality of buttons 2, 4, 6, 8, 10, the plurality of vibration elements 3, 5, 7, 9, 11, and the gyroscope 30 a each provide input signals to the processor 12, and the processor 12 sends output signals to each of the plurality of buttons 2, 4, 6, 8, 10, the plurality of vibration elements 3, 5, 7, 9, 11, and the gyroscope 30 a.
  • The processor 12 is preferably in electrical communication with the plurality of buttons 2, 4, 6, 8, 10, the plurality of vibration elements 3, 5, 7, 9, 11, and the gyroscope 30 a such that data can be transmitted to and received from the processor 12 by the components in the housing 1. The processor 12 accepts data from the plurality of buttons 2, 4, 6, 8, 10 for processing and transmission to a transmitter/receiver unit 13. The transmitter/receiver unit 13 transmits data to a central processing unit (CPU) or computer 40 for use in visualization software such as immersive 3-D virtual reality software. For example, if the user presses a combination of the buttons 2, 4, 6, 8, 10, then the processor 12 transmits data corresponding to that respective combination of button pressures to the transmitter/receiver unit 13, which then transmits the data to the computer 40. The visualization software associated with the computer 40 then also generates tactile feedback data, which can be transmitted by the computer 40 to the transmitter/receiver unit 13. The transmitter/receiver unit 13 sends this data to the processor 12, which causes the vibration elements 3, 5, 7, 9, 11 to vibrate. The transmitter/receiver unit 13 serves as an uplink/downlink or transmitter/receiver between the processor 12 and the computer 40.
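  • One way the transmitter/receiver unit 13 might frame the five button readings on the wire is a fixed binary layout, as in the sketch below. The byte layout is purely an assumption for illustration; the specification requires only that button data travel to the computer 40 and tactile data travel back.

    import struct

    BUTTON_FRAME = struct.Struct("<5f")  # five little-endian float pressures

    def encode_button_frame(pressures):
        """Pack the readings of buttons 2, 4, 6, 8, 10 for transmission."""
        return BUTTON_FRAME.pack(*pressures)

    def decode_button_frame(frame):
        """Unpack a received frame back into five pressure readings."""
        return list(BUTTON_FRAME.unpack(frame))

    frame = encode_button_frame([0.1, 0.8, 0.0, 0.0, 0.4])
    print(decode_button_frame(frame))  # five floats, up to float32 rounding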
  • As shown in FIG. 3 , a bearing 14 allows a user to freely rotate the housing 1 through a range of motion when the user grips and operates the housing 1. The bearing 14 could include any suitable joint-type arrangement, such as a ball and socket bearing, to allow the housing 1 to move in multiple dimensions and through a range of motion. For example, the bearing 14 allows the housing 1 to rotate; move forward, backward, left, and right; tilt; and any combination thereof. As the user rotates the housing 1, the gyroscope 30 a generates data related to the three-dimensional orientation of the housing 1 so that the data can be used in visualization software in the computer 40 to rotate an associated virtual hand in a virtual world. For example, a user can bend and twist the user's wrist while grasping the housing 1 and intuitively cause similar movements of a virtual hand in a virtual world. The housing 1 preferably attaches to a grip 15 of a 3-D mouse 17 and a feedback controller 22 so that as the user moves the housing 1, the grip 15 of the 3-D mouse 17 moves accordingly and the transmitter/receiver unit 13 sends data to the visualization software. The grip 15 is designed to comfortably fit in the user's hand and can include buttons 15 a, 15 b, 15 c that are actuated to provide further feedback to the user. In one embodiment, the grip 15 includes analogue buttons. In one embodiment, the buttons are placed so that each of the user's fingers rests comfortably on a corresponding button. The feedback controller 22 holds and moves the grip 15 in accordance with physical sensations that the user would feel and which are commensurate with the actions and experiences of the user's representation in the virtual world. The grip 15 and feedback controller 22 each provide an additional point of movement and/or pivoting for the housing 1, such that the grip 15 and feedback controller 22 act as joints for the device 100, i.e., each of the components provides additional degrees of freedom for the device.
  • FIG. 1 d shows a schematic diagram of the housing 1, the 3-D mouse 17, the grip 15, and the feedback controller 22. As shown in FIG. 1 d , the grip 15 is connected to the housing 1 via a grip shaft 15′. In one embodiment, the grip shaft 15′ is flexible. In one embodiment, the grip shaft 15′ is connected at a first end by a resilient ball-socket joint to the housing 1 and at a second end by a resilient ball-socket joint to the grip 15. Based on this connection to the housing 1 and the grip 15, the grip shaft 15′ provides enough support to keep the housing 1 erect in a resting position, but provides additional degrees of motion for the device 100 when a user manipulates the housing 1. In one embodiment, the grip shaft 15′ includes a grip sensor 15 a′ that detects deformations and movement of the grip shaft 15′. The grip sensor 15 a′ collects data related to deformation and movement of the grip shaft 15′, and this data is used by the computer 40 to map the physical movement of the housing 1 onto the virtual reality element. The grip sensor 15 a′ includes a plurality of sensors, including an accelerometer, gyroscopic sensor, torque sensor, strain gauge, or any combination of known sensors. Similarly, feedback controller shafts 22 a, 22 b, 22 c, 22 d are provided between the grip 15 and the feedback controller 22. Although four feedback controller shafts 22 a, 22 b, 22 c, 22 d are shown in FIG. 1 d , one of ordinary skill in the art will recognize from the present disclosure that any number of shafts could be used. Similar to the grip shaft 15′, each of the feedback controller shafts 22 a, 22 b, 22 c, 22 d is preferably connected at each end to a respective component via a resilient ball-socket joint. This arrangement provides additional degrees of freedom for movement of the housing 1. The feedback controller shafts 22 a, 22 b, 22 c, 22 d each include feedback controller shaft sensors 22 a′, 22 b′, 22 c′, 22 d′, respectively. These feedback controller shaft sensors 22 a′, 22 b′, 22 c′, 22 d′ each include a plurality of types of sensors, including but not limited to an accelerometer, a gyroscopic sensor, a torque sensor, and a strain gauge. One of ordinary skill in the art recognizes that other types of sensors could be integrated into the feedback controller shafts 22 a, 22 b, 22 c, 22 d to detect additional data points regarding the feedback controller shafts 22 a, 22 b, 22 c, 22 d. The data detected by the grip sensor 15 a′ and the feedback controller shaft sensors 22 a′, 22 b′, 22 c′, 22 d′ is sent to the computer 40 for plotting and mapping the detected motions with respect to a virtual reality element. The data from the grip sensor 15 a′ and the feedback controller shaft sensors 22 a′, 22 b′, 22 c′, 22 d′ can be processed by the device processor 12 and sent to the computer 40 via the transmitter/receiver unit 13.
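  • One conventional way to combine the accelerometer and gyroscope readings listed above into a stable tilt estimate is a complementary filter. The sketch below is an assumption for illustration, since the application names the sensor types but not a fusion method.

    import math

    ALPHA = 0.98  # trust the gyro short-term, the accelerometer long-term

    def fuse_tilt(prev_angle_deg, gyro_rate_dps, accel_x, accel_z, dt):
        """Blend the integrated gyro rate with the accelerometer gravity angle."""
        gyro_angle = prev_angle_deg + gyro_rate_dps * dt          # integrate rotation
        accel_angle = math.degrees(math.atan2(accel_x, accel_z))  # tilt from gravity
        return ALPHA * gyro_angle + (1.0 - ALPHA) * accel_angle

    angle = 45.0  # stale estimate; the shaft is actually held at about 10 degrees
    for _ in range(200):  # 2 seconds of samples at 100 Hz, shaft held stationary
        angle = fuse_tilt(angle, gyro_rate_dps=0.0, accel_x=0.17, accel_z=0.98, dt=0.01)
    print(round(angle, 1))  # decays toward the ~9.8 degree gravity angle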
  • In one embodiment, the visualization software includes immersive 3-D virtual reality software. In addition, if the grip 15 of the feedback controller 22 receives forces related to feedback generated by the visualization software, then those forces are imparted to the housing 1 of the device and thereby felt by the hand of the user.
  • FIG. 4 shows one embodiment of the device used as intended with the computer 40, the 3-D mouse 17, and the feedback controller 22. The transmitter/receiver unit 13 transmits data generated by the processor 12. The transmitter/receiver unit 13 is preferably wireless, but a wired or other communication link could be used to transmit data and signals between the components. A central processor 18 accepts data from the device processor 12, which is used to compute refined data related to the articulation and orientation of the virtual hand, and the refined data is stored or used in software such as immersive 3-D virtual reality software. The central processor 18 receives data from the device's transmitter/receiver unit 13 through a second transmitter/receiver unit 19 in the central processor 18. The central processor 18 connects to a storage device 20 and a display device 21, which provides visual feedback to the user. The storage device 20 includes a memory unit, and the memory unit stores multiple virtual reality scenarios, which can be loaded by the computer 40. The computer 40 is preferably connected to the internet. The computer 40 can download or stream multiple virtual reality scenarios, which are displayed on the display device 21. The user manipulates the housing 1 to interact with a selected one of the virtual reality scenarios. As used in this application, a virtual reality scenario can include, for example, an artificially created landscape that the user's point of view moves through as if the user is present in that particular landscape. In one embodiment, the user approaches a virtual reality tree in a virtual reality landscape and manipulates the tree via movement of the housing. In this system, the user would receive physical feedback through the device that indicates when the user's artificially created hand contacts the artificial tree and that reflects the texture of the tree's surface.
  • FIG. 5 illustrates the steps for interaction between the housing 1 and the computer 40. As shown in FIG. 5 , step 500 includes a user gripping the housing 1. Next, step 510 includes the user manipulating the housing 1, which includes multiple degrees of motion, as well as displacement of the housing 1. Step 520 includes detecting the manipulation of the housing 1, and converting data regarding the manipulation via the processor 12. Step 530 includes the first transmitter/receiver unit 13 of the housing 1 transmitting data to the second transmitter/receiver unit 19 of the computer 40. Step 540 includes the central processor 18 of the computer 40 analyzing the data transmitted from the housing 1 to manipulate a virtual reality element such that the manipulation of the virtual reality element corresponds with the manipulation of the housing 1. Step 550 includes the display device 21 simultaneously displaying a virtual reality element in a virtual reality scenario as being manipulated in real-time based on the movement detected during step 510. Step 560 includes feedback data being sent by the second transmitter/receiver unit 19 to the first transmitter/receiver unit 13, and the housing 1 being moved and manipulated based on the feedback data. Step 570 includes the repetition of steps 510, 520, 530, 540, 550, and 560, in any order or combination of steps. One of ordinary skill in the art recognizes that other steps may be included to provide for interaction between the housing 1 and the computer 40. The steps are provided in a continuous feedback loop, such that the user continues manipulating the housing 1 and the computer 40 continuously provides visual feedback via the display device 21 and physical feedback via impulses sent to the housing 1 based on the user's manipulation of the housing 1 and interaction of the virtual reality element in the virtual reality scenario.
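  • Expressed as code, the FIG. 5 loop reduces to a pipeline of stages. Every function below is a placeholder standing in for the hardware or software step it is named after, not an implementation taken from the specification.

    def detect_manipulation():                 # steps 500-510: grip and move housing 1
        return {"tilt_deg": 12.0, "pressures": [0.0, 0.6, 0.0, 0.0, 0.0]}

    def convert(manipulation):                 # step 520: device processor 12
        return manipulation                    # would digitize and package in reality

    def analyze_and_render(data):              # steps 530-550: transmit, analyze, display
        return {"vibrate": [p > 0.5 for p in data["pressures"]]}

    def apply_feedback(feedback):              # step 560: move/vibrate the housing
        print("vibrating buttons:", feedback["vibrate"])

    for _ in range(3):                         # step 570: repeat as a continuous loop
        apply_feedback(analyze_and_render(convert(detect_manipulation())))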
  • Other embodiments will be obvious to one skilled in the art. For example, the buttons may be actuated as well as or instead of vibrating. Also, buttons may be replaced by touch sensors. Also, for example, direction changes can be detected by an accelerometer rather than a gyroscope. I intend to include all such embodiments that are obvious to one with ordinary skill in the art.

Claims (20)

What is claimed is:
1. A device for dexterous interaction in a virtual world, the device comprising:
a housing including at least one vibration element;
an orientation sensor configured for detecting data based on orientation of the housing; and
a processor configured to receive the data from the orientation sensor, and the processor is configured to transmit the data;
wherein the device is configured to virtually move at least one virtual element in a virtual environment such that the at least one virtual element virtually engages with a secondary virtual element, and the device is configured to receive tactile feedback based on at least a surface characteristic of the secondary virtual element.
2. The device of claim 1, wherein the device is configured to be connected to a computer comprising a display, and
the virtual environment, the at least one virtual element, and the secondary virtual element are visible on the display.
3. The device of claim 2, wherein the computer further comprises a storage device that includes a memory unit, the memory unit includes a plurality of virtual environments that are configured to be displayed on the display, and user interaction occurs with each one of the plurality of virtual environments based on movement of the device.
4. The device of claim 2, wherein the processor is configured to wirelessly transmit and receive the data, and the computer is configured to wirelessly transmit and receive the data.
5. The device of claim 1, wherein the at least one vibration element provides sensory output associated with a virtual hand in visualization software.
6. The device of claim 1, wherein the housing is configured to move with three degrees of freedom via a bearing.
7. The device of claim 1, wherein the orientation sensor comprises an accelerometer and a gyroscope.
8. The device according to claim 1, further comprising at least one additional sensor configured to detect engagement with a user's hand, and the at least one virtual element is virtually articulated based on output from the at least one additional sensor, wherein the virtual articulation is related to articulation of the user's hand.
9. A method of providing interaction in a virtual world, the method comprising:
virtually engaging a secondary virtual element with at least one virtual element in a virtual environment via use of a device configured to control virtual movement of the at least one virtual element; and
providing tactile feedback to the device based on at least a surface characteristic of the secondary virtual element.
10. The method of claim 9, wherein the device comprises:
a housing including at least one vibration element;
an orientation sensor configured for detecting data based on orientation of the housing; and
a processor configured to receive the data from the orientation sensor, and the processor is configured to transmit the data.
11. The method of claim 10, wherein the housing has an ellipsoid shape and is connected to a flexible shaft.
12. The method of claim 10, wherein the orientation sensor comprises an accelerometer and a gyroscope.
13. The method of claim 10, wherein the housing further comprises at least one button that is pressure sensitive, and the at least one button is configured to provide tactile feedback based on a degree of pressure applied to the at least one button.
14. The method of claim 9, wherein the device is configured to be wirelessly connected to a computer.
15. The method of claim 9, wherein the at least one virtual element is configured to be manipulated in real-time and in a continuous feedback loop.
16. The method of claim 15, wherein the continuous feedback loop provides physical feedback to the device based on manipulation of the at least one virtual element.
17. A system for dexterous interaction in a virtual environment, the system comprising:
a device comprising:
at least one vibration element;
a sensor configured to detect an orientation of the device; and
a processor configured to receive data from the sensor, and the processor is configured to transmit the data; and
a computer comprising a headset display configured to display a virtual environment, the computer is configured to receive the data from the processor, and the computer is configured to analyze the data to virtually manipulate at least one virtual element in the virtual environment, and
virtual manipulation of a secondary virtual element by the at least one virtual element provides tactile feedback to the device, and the tactile feedback is representative of at least a surface characteristic of the secondary virtual element.
18. The system according to claim 17, wherein the at least one virtual element is a virtual hand, and the at least one vibration element provides sensory output associated with the virtual hand virtually engaging other virtual elements in the virtual environment.
19. The system according to claim 17, wherein the at least one virtual element is configured to be manipulated in real-time and in a continuous feedback loop, and the continuous feedback loop provides physical feedback to the device based on virtual manipulation of the at least one virtual element.
20. The system according to claim 17, wherein the device is configured to move with three degrees of freedom via a bearing.
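Taken together, claims 1, 9, and 15-16 describe a continuous loop: orientation data from the device moves a virtual element, engagement with a secondary virtual element is detected, and vibration scaled by a surface characteristic of that element is returned to the housing. The sketch below illustrates one such loop. Every class and method name is a hypothetical stand-in, since the patent defines no software interface; the stub classes exist only so the sketch runs end to end.

    import time

    class StubDevice:
        # Stands in for the handheld housing: one orientation sensor
        # and one vibration element.
        def __init__(self):
            self.angle = 0.0

        def read_orientation(self):
            self.angle += 0.05  # pretend the user steadily tilts the housing
            return self.angle

        def set_vibration(self, intensity):
            print(f"vibration intensity: {intensity:.2f}")

    class StubScene:
        # Stands in for the virtual environment maintained on the host.
        def move_virtual_hand(self, orientation):
            return orientation  # one-dimensional hand position, for brevity

        def find_surface_roughness(self, hand_pos):
            # The virtual hand "touches" a rough secondary element past
            # position 1.0; roughness in [0, 1] is the surface
            # characteristic fed back as tactile output.
            return 0.8 if hand_pos > 1.0 else None

    def feedback_loop(device, scene, steps=40, rate_hz=100):
        for _ in range(steps):
            orientation = device.read_orientation()      # sensor -> processor
            hand = scene.move_virtual_hand(orientation)  # data sent to host
            roughness = scene.find_surface_roughness(hand)
            device.set_vibration(roughness if roughness is not None else 0.0)
            time.sleep(1.0 / rate_hz)

    feedback_loop(StubDevice(), StubScene())

The loop runs at a fixed rate so that manipulation and feedback stay real-time and continuous, matching the feedback loop of claims 15-16.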
US17/844,409 2014-11-17 2022-06-20 Device for Intuitive Dexterous Touch and Feel Interaction in Virtual Worlds Pending US20230142242A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/844,409 US20230142242A1 (en) 2014-11-17 2022-06-20 Device for Intuitive Dexterous Touch and Feel Interaction in Virtual Worlds

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201462080759P 2014-11-17 2014-11-17
US201562172061P 2015-06-06 2015-06-06
US14/943,421 US11366521B2 (en) 2014-11-17 2015-11-17 Device for intuitive dexterous touch and feel interaction in virtual worlds
US17/844,409 US20230142242A1 (en) 2014-11-17 2022-06-20 Device for Intuitive Dexterous Touch and Feel Interaction in Virtual Worlds

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/943,421 Continuation US11366521B2 (en) 2014-11-17 2015-11-17 Device for intuitive dexterous touch and feel interaction in virtual worlds

Publications (1)

Publication Number Publication Date
US20230142242A1 true US20230142242A1 (en) 2023-05-11

Family

ID=55961639

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/943,421 Active 2036-12-16 US11366521B2 (en) 2014-11-17 2015-11-17 Device for intuitive dexterous touch and feel interaction in virtual worlds
US17/844,409 Pending US20230142242A1 (en) 2014-11-17 2022-06-20 Device for Intuitive Dexterous Touch and Feel Interaction in Virtual Worlds

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/943,421 Active 2036-12-16 US11366521B2 (en) 2014-11-17 2015-11-17 Device for intuitive dexterous touch and feel interaction in virtual worlds

Country Status (2)

Country Link
US (2) US11366521B2 (en)
WO (1) WO2016081425A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108527320B (en) * 2018-03-30 2021-08-13 天津大学 Three-dimensional mouse-based collaborative robot guiding teaching method
US11911651B1 (en) * 2023-08-10 2024-02-27 Barron Associates, Inc. System, device and method for electronically mediated upper extremity therapy

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6184868B1 (en) 1998-09-17 2001-02-06 Immersion Corp. Haptic feedback control devices
US6924787B2 (en) 2000-04-17 2005-08-02 Immersion Corporation Interface for controlling a graphical image
CN101124534A (en) 2005-02-24 2008-02-13 诺基亚公司 Motion input device for computing terminal and its operation method
US8497767B2 (en) * 2009-03-02 2013-07-30 Butterfly Haptics, LLC Magnetic levitation haptic interface system
US9423894B2 (en) * 2010-12-02 2016-08-23 Seesaw, Inc. Magnetically sensed user interface devices
US10795448B2 (en) 2011-09-29 2020-10-06 Magic Leap, Inc. Tactile glove for human-computer interaction
US20150097937A1 (en) 2013-10-08 2015-04-09 Ali Kord Single-camera motion capture system
WO2015061750A1 (en) 2013-10-24 2015-04-30 Ali Kord Motion capture system
US10379614B2 (en) * 2014-05-19 2019-08-13 Immersion Corporation Non-collocated haptic cues in immersive environments
US20150358543A1 (en) 2014-06-05 2015-12-10 Ali Kord Modular motion capture system
FR3025902B1 (en) * 2014-09-16 2017-12-08 E-Concept MULTI-DIMENSIONAL MOUSE
EP3356877A4 (en) 2015-10-04 2019-06-05 Thika Holdings LLC Eye gaze responsive virtual reality headset

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160129346A1 (en) * 2013-06-09 2016-05-12 Sony Computer Entertainment Inc. Head Mounted Display
US20140362014A1 (en) * 2013-06-11 2014-12-11 Immersion Corporation Systems and Methods for Pressure-Based Haptic Effects

Also Published As

Publication number Publication date
WO2016081425A1 (en) 2016-05-26
US11366521B2 (en) 2022-06-21
US20160139669A1 (en) 2016-05-19

Similar Documents

Publication Publication Date Title
US10838495B2 (en) Devices for controlling computers based on motions and positions of hands
KR101666096B1 (en) System and method for enhanced gesture-based interaction
US10534431B2 (en) Tracking finger movements to generate inputs for computer systems
US20230142242A1 (en) Device for Intuitive Dexterous Touch and Feel Interaction in Virtual Worlds
JP2023052259A (en) Finger-mounted device with sensors and haptics
CN112424730A (en) Computer system with finger device
US20100090949A1 (en) Method and Apparatus for Input Device
KR101548156B1 (en) A wireless exoskeleton haptic interface device for simultaneously delivering tactile and joint resistance and the method for comprising the same
CN111638801A (en) Controller for gesture recognition and gesture recognition method thereof
JP2010108500A (en) User interface device for wearable computing environmental base, and method therefor
RU179301U1 (en) VIRTUAL REALITY GLOVE
RU187548U1 (en) VIRTUAL REALITY GLOVE
KR20200110502A (en) Haptic controller, and System and Method for providing haptic feedback using the haptic controller
US20150009145A1 (en) Interaction peripheral device capable of controlling an element for touching and grasping multidimensional virtual objects
RU2662399C1 (en) System and method for capturing movements and positions of human body and parts of human body
JP2002304246A (en) Tactile presenting device, and imaginary space system
RU2670649C9 (en) Method of manufacturing virtual reality gloves (options)
RU176318U1 (en) VIRTUAL REALITY GLOVE
CN113508355A (en) Virtual reality controller
RU2673406C1 (en) Method of manufacturing virtual reality glove
RU186397U1 (en) VIRTUAL REALITY GLOVE
CN110658925A (en) Handheld wireless mouse positioned by gyroscope
WO2023170843A1 (en) Controller device, method for controlling controller device, and program
CN117251058B (en) Control method of multi-information somatosensory interaction system
RU176660U1 (en) VIRTUAL REALITY GLOVE

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: THIKA HOLDINGS LLC, FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CAMBRIDGE, VIVIEN JOHAN;REEL/FRAME:064211/0566

Effective date: 20160629

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED