WO2024030656A3 - Haptic platform and ecosystem for immersive computer mediated environments - Google Patents
- Publication number
- WO2024030656A3 (PCT application PCT/US2023/029559; also published as US2023029559W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- haptic interface
- location data
- locations
- series
- frames
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/211—Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/212—Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/214—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/218—Input arrangements for video game devices characterised by their sensors, purposes or types using pressure sensors, e.g. generating a signal proportional to the pressure applied by the player
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/24—Constructional details thereof, e.g. game controllers with detachable joystick handles
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/25—Output arrangements for video game devices
- A63F13/28—Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/90—Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/012—Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Heart & Thoracic Surgery (AREA)
- General Health & Medical Sciences (AREA)
- Cardiology (AREA)
- Biophysics (AREA)
- Life Sciences & Earth Sciences (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A device may receive, by a haptic interface module that is configured to interact with one or more haptic interface devices and an application that generates a computer-mediated environment comprising an avatar corresponding to a user wearing the one or more haptic interface devices, respective sensor data for each respective haptic interface device, wherein the respective sensor data for a respective haptic interface device indicates respective positioning of respective sensors of the respective wearable haptic interface.
A device may process, by the haptic interface module, the respective sensor data to generate respective relative location data for each respective haptic interface device, wherein the relative location data is relative to a reference location defined with respect to the corresponding wearable haptic interface.
A device may receive, by the haptic interface module, tracked location data from one or more motion tracking sensors, wherein the tracked location data indicates respective locations of the one or more haptic interface devices relative to a spatial environment of the user.
A device may generate, by the haptic interface module, a series of motion capture frames based on the tracked location data and the respective relative location data for each respective haptic interface device, wherein each respective motion capture frame indicates a set of locations and orientations for each respective haptic interface device at a given time.
A device may generate, by the haptic interface module, a series of kinematic frames based on the series of motion capture frames and one or more mediation processes that collectively convert, for each of the motion capture frames, the set of locations and orientations of the one or more respective haptic interface devices into a set of intended locations and intended orientations for configuring the avatar in the computer-mediated environment.
A device may output the series of kinematic frames to the application, wherein the kinematic frames are provided to the application as user input.
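The abstract describes a data pipeline: per-device sensor readings are converted to locations relative to a reference point on the wearable, fused with world-space tracker data into motion-capture frames, and then passed through a mediation process that yields kinematic frames of intended avatar poses. The publication discloses no API, so the sketch below is purely illustrative; every name (`MotionCaptureFrame`, `build_mocap_frame`, `mediate`) and the toy scaling mediation are assumptions, not the claimed implementation.

```python
from dataclasses import dataclass

@dataclass
class MotionCaptureFrame:
    """One frame: device id -> (location xyz, orientation ypr). Illustrative only."""
    time: float
    poses: dict

def to_relative(sensor_xyz, reference_xyz):
    # Sensor position expressed relative to a reference location on the wearable.
    return tuple(s - r for s, r in zip(sensor_xyz, reference_xyz))

def build_mocap_frame(t, tracked, relative):
    # Fuse tracked world-space device locations with per-device relative offsets.
    poses = {}
    for device, world_xyz in tracked.items():
        offset = relative.get(device, (0.0, 0.0, 0.0))
        loc = tuple(w + o for w, o in zip(world_xyz, offset))
        poses[device] = (loc, (0.0, 0.0, 0.0))  # orientation omitted in this sketch
    return MotionCaptureFrame(time=t, poses=poses)

def mediate(frame, scale=1.0):
    # Toy mediation process mapping captured poses to intended avatar poses;
    # a real system might retarget limb lengths, filter jitter, or clamp reach.
    intended = {d: (tuple(c * scale for c in loc), orient)
                for d, (loc, orient) in frame.poses.items()}
    return MotionCaptureFrame(time=frame.time, poses=intended)

# Usage: a glove tracked at (1, 1, 0) with a fingertip sensor offset of 0.1 m.
tracked = {"right_glove": (1.0, 1.0, 0.0)}
relative = {"right_glove": to_relative((0.1, 0.0, 0.0), (0.0, 0.0, 0.0))}
mocap = build_mocap_frame(0.0, tracked, relative)
kinematic = mediate(mocap, scale=2.0)
print(kinematic.poses["right_glove"][0])  # → (2.2, 2.0, 0.0)
```

The kinematic frames, like those the abstract describes, would then be handed to the application as user input for configuring the avatar.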
Applications Claiming Priority (10)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202263395747P | 2022-08-05 | 2022-08-05 | |
US63/395,747 | 2022-08-05 | | |
USPCT/US2023/019494 | | 2023-04-21 | |
PCT/US2023/019494 (WO2023205479A1) | 2022-04-22 | 2023-04-21 | Whole-body haptic system, device, and method |
US202363464118P | 2023-05-04 | 2023-05-04 | |
PCT/US2023/021015 (WO2023215485A1) | 2022-05-04 | 2023-05-04 | Haptic glove system and manufacture of haptic glove systems |
USPCT/US2023/021015 | | 2023-05-04 | |
US63/464,118 | 2023-05-04 | | |
US202363467560P | 2023-05-18 | 2023-05-18 | |
US63/467,560 | 2023-05-18 | | |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2024030656A2 (en) | 2024-02-08 |
WO2024030656A3 (en) | 2024-03-14 |
Family
ID=89849837
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2023/029559 (WO2024030656A2) | Haptic platform and ecosystem for immersive computer mediated environments | 2022-08-05 | 2023-08-04 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2024030656A2 (en) |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140081612A1 (en) * | 2012-09-17 | 2014-03-20 | Daniel Jonathan Ignatoff | Adaptive Physics Engine for Rendering Rigid Body and or Soft Body Physics for Virtual Objects in Contact with Voxelized Fluid |
KR101626375B1 (en) * | 2015-06-30 | 2016-06-01 | 한양대학교 산학협력단 | Glove type haptic transmission apparatus for detecting objects in augmented reality and its method |
US20160306431A1 (en) * | 2015-04-15 | 2016-10-20 | Sony Computer Entertainment Inc. | Pinch And Hold Gesture Navigation On A Head-Mounted Display |
US20170003738A1 (en) * | 2015-06-15 | 2017-01-05 | Survios, Inc. | Systems and methods for immersive physical interaction with a virtual environment |
US20170371416A1 (en) * | 2014-12-30 | 2017-12-28 | Philip Zeitler | Haptic devices and methods |
US20180081436A1 (en) * | 2016-09-20 | 2018-03-22 | Oculus Vr, Llc | Composite ribbon in a virtual reality device |
US20190101981A1 (en) * | 2017-09-29 | 2019-04-04 | Apple Inc. | Imu-based glove |
US20190283248A1 (en) * | 2015-03-04 | 2019-09-19 | The Johns Hopkins University | Robot control, training and collaboration in an immersive virtual reality environment |
US20200174569A1 (en) * | 2017-06-30 | 2020-06-04 | Microsoft Technology Licensing, Llc | Haptic feedback system |
US20200183488A1 (en) * | 2018-12-10 | 2020-06-11 | Samsung Electronics Co., Ltd. | Compensating For A Movement of A Sensor Attached To A Body Of A User |
US20220020198A1 (en) * | 2018-04-23 | 2022-01-20 | Magic Leap, Inc. | Avatar facial expression representation in multidimensional space |
2023
- 2023-08-04: WO application PCT/US2023/029559 filed (published as WO2024030656A2); status unknown
Non-Patent Citations (3)
Title |
---|
INTELLIGENT SYSTEMS LAB NTRC: "HEXON - a two-finger haptic exoskeleton", 22 November 2020 (2020-11-22), pages 1 - 2, XP093149030, Retrieved from the Internet <URL:https://www.youtube.com/watch?app=desktop&v=vwxGTvljvsQ> [retrieved on 20240408] * |
MANIKANTA DAVULURI, K. HARITH, S. SAMEER BASHA, M. SUMANTH REDDY: "Cursor Movement on Object Motion", INTERNATIONAL JOURNAL OF ADVANCE RESEARCH, vol. 8, no. 3, 16 May 2022 (2022-05-16), pages 178 - 182, XP093149036 * |
TSUKASA KIKUCHI: "Visual Simulation of Hand Touching Introducing Elastic Deformation and Color Change Vol.1", 19 April 2011 (2011-04-19), pages 1 - 2, XP093149038, Retrieved from the Internet <URL:https://www.youtube.com/watch?v=wUl5aZleZtI> [retrieved on 20231228] * |
Also Published As
Publication number | Publication date |
---|---|
WO2024030656A2 (en) | 2024-02-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210255763A1 (en) | Neural network system for gesture, wear, activity, or carry detection on a wearable or mobile device | |
US11315287B2 (en) | Generating pose information for a person in a physical environment | |
US9747717B2 (en) | Iterative closest point technique based on a solution of inverse kinematics problem | |
Fisher et al. | Virtual interface environment workstations | |
JP5951695B2 (en) | Method, apparatus and computer program for action recognition | |
SG126058A1 (en) | System and method for applying development patterns for component based applications | |
EP3014583A1 (en) | Reprojection oled display for augmented reality experiences | |
US10045001B2 (en) | Powering unpowered objects for tracking, augmented reality, and other experiences | |
US20220075633A1 (en) | Method and Device for Process Data Sharing | |
CN111783596B (en) | Training method and device of face recognition model, electronic equipment and storage medium | |
CN117280711A (en) | Head related transfer function | |
JP2016517505A (en) | Adaptive depth detection | |
WO2024030656A3 (en) | Haptic platform and ecosystem for immersive computer mediated environments | |
Lin et al. | The implementation of augmented reality in a robotic teleoperation system | |
US20180004290A1 (en) | Apparatus and method for providing feedback at a predetermined distance | |
KR20180052156A (en) | Mobile terminal and method for controlling the same | |
KR20230079152A (en) | Pose Tracking for Rolling-Shutter Cameras | |
CN102955563A (en) | Robot control system and method | |
Akman | Robust augmented reality | |
US11262854B2 (en) | Sensing movement of a hand-held controller | |
Taqvi | Reality and perception: Utilization of many facets of augmented reality | |
NL2029549B1 (en) | Incremental 2d-to-3d pose lifting for fast and accurate human pose estimation | |
CN104516660A (en) | Information processing method and system and electronic device | |
KR100825859B1 (en) | Indirect object pose estimation method with respect to user-wearable camera using multi-view camera system | |
CN114207669A (en) | Human face illumination image generation device and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23850801 Country of ref document: EP Kind code of ref document: A2 |