WO2024081783A1 - Virtual and augmented interactive and tactile projections of sound and light - Google Patents


Info

Publication number
WO2024081783A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
camera
haptic
user
computer
Application number
PCT/US2023/076683
Other languages
French (fr)
Inventor
David CHARLOT
Zavosh ZABOLIYAN
Original Assignee
Aurasense Tech Corporation
Application filed by Aurasense Tech Corporation
Publication of WO2024081783A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16 Sound input; Sound output
    • G06F3/165 Management of the audio stream, e.g. setting of volume, audio stream path
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25 Output arrangements for video game devices
    • A63F13/28 Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • A63F13/285 Generating tactile feedback signals via the game input device, e.g. force feedback
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1615 Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function
    • G06F1/1616 Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function with folding flat displays, e.g. laptop computers or notebooks having a clamshell configuration, with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1626 Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/163 Wearable computers, e.g. on a belt
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16 Sound input; Sound output
    • G06F3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25 Output arrangements for video game devices
    • A63F13/26 Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene

Definitions

  • AR: augmented reality
  • VR: virtual reality
  • the current systems lack interactive feedback with the virtual environments, and typically incorporate vibration as the only mode of haptic feedback.
  • systems and methods for providing interactive and tactile projections of light and sound for enhancement of virtual and augmented realities are provided herein.
  • devices, systems, and methods for emission of electromagnetic waves and/or mechanical waves incorporate mechanical elements to provide non-laser focused mechanical pressure waves in the human audible spectrum (i.e., about 20 Hz to 20 kHz) and/or human non-audible spectrum.
  • Mechanical elements may include parametric speaker arrays such as ultrasonic speaker arrays, piezo speakers, or electromagnetic speakers, and the like.
  • beam forming and/or beam shaping methods are utilized to focus, direct, or otherwise manipulate waves propagated from the systems and devices disclosed herein.
  • the devices and systems incorporate optical elements to provide laser focused mechanical pressure waves in the human audible spectrum and/or human non-audible spectrum.
  • Optical elements may also be utilized to provide optical signals in the infrared, near infrared, or visible light spectrum.
  • Optical elements may include lasers, light emitting diodes, lenses, mirrors, or a combination thereof.
  • devices and systems incorporate thermal elements to alter an ambient temperature.
  • thermal elements are utilized to lower an ambient temperature.
  • thermal elements are utilized to adjust an ambient temperature between about 0 °C to about 100 °C.
  • temperature sensors are incorporated to measure temperatures of surfaces or areas which may interact with the thermal elements.
  • temperature sensors allow for dynamic adjustment of the thermal elements, as disclosed herein.
  • devices and systems include interferometric elements to measure mechanical pressure waves or optical waves.
  • interferometric elements are utilized for dynamic adjustment of optical elements, emission of electromagnetic waves, and/or emission of mechanical waves.
  • devices and system include optical sensors.
  • optical sensors are utilized to dynamically measure mechanical waves, optical waves, and motion/position of objects (e.g., animate and inanimate objects such as people, cars, rocks, etc.).
  • an optical sensor is provided to capture images at a rate of 10 Hz to 10,000 Hz. Said captured images may be combined into a video format.
  • an optical sensor comprises a camera.
  • optical sensors include infrared, near infrared, visible light, ultra-violet spectrum sensors.
  • optical sensors comprise three-dimensional (3D) spectroscopic cameras capable of sensing in infrared (IR), near infrared, visible light, and/or ultra-violet spectrum.
  • systems utilize multiple stereo infrared (IR) imaging devices.
  • systems and devices incorporate one or more computational elements (e.g., a microcontroller, application specific integrated circuit, single board computer, edge computing device, quantum computing device, etc.) to perform data processing and real-time data processing for dynamic output signal conditioning and adjustment based on desired output and measured signal inputs, as disclosed herein.
  • systems include closed mesh network elements for self-recognizing interact-ability with like devices to allow constructive or destructive distributed signal modification.
  • systems include open network elements (e.g., 3G, 4G, 5G, long range (LoRa), and the like) to enable connection to the internet, an intranet, or a distributed computing network (cloud computing).
  • systems include electrical elements to generate, consume, receive, and transmit power (e.g., solar panels, rechargeable battery, battery, wireless energy transmission / reception components, and the like) to provide power to the system and similar devices within a known proximity.
  • communication between devices utilizes free space optics communication and has the ability to adjust data transmission bandwidth based on power consumption restrictions.
  • a system for haptic interactive gaming comprising: a haptic array comprising a plurality of ultrasonic devices; a camera; a light source; a non-transitory computer-readable storage media encoded with instructions executable by at least one processor to cause the at least one processor to perform operations comprising: receiving a two-dimensional (2D) data, a three-dimensional (3D) data, or both from a gaming console; directing at least a portion of the plurality of ultrasonic devices in the haptic array to emit a first acoustic field based on the 2D data, the 3D data, or both; directing the light source to emit light based on the 2D data, the 3D data, or both; determining a user motion based on data received by the camera; and providing the motion data to the user console based on the user motion.
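For illustration only, the sequence of operations recited above can be sketched as a simple control loop. The component interfaces below (SceneData, drive_haptic_array, and the other stubs) are hypothetical stand-ins, not the claimed implementation.

```python
# Hedged sketch of the claimed control flow: receive scene data from a console,
# drive the haptic array and light source, estimate user motion from camera
# frames, and report the motion back. All interfaces here are illustrative stubs.

from dataclasses import dataclass

@dataclass
class SceneData:
    focal_points: list          # (x, y, z) targets for the acoustic field
    light_pattern: list         # per-emitter intensities

def receive_scene_data() -> SceneData:
    # Placeholder for 2D and/or 3D data arriving from a gaming console.
    return SceneData(focal_points=[(0.0, 0.0, 0.15)], light_pattern=[0.8])

def drive_haptic_array(points):      # stub: would compute per-transducer phases
    print(f"emitting acoustic field focused at {points}")

def drive_light_source(pattern):     # stub: would set laser/LED output
    print(f"light output set to {pattern}")

def estimate_user_motion(prev_frame, frame):
    # Placeholder for camera-based motion estimation (e.g., optical flow).
    return {"dx": 0.0, "dy": 0.0}

def send_motion_to_console(motion):  # stub: would return motion data as input
    print(f"motion reported: {motion}")

def control_step(prev_frame, frame):
    scene = receive_scene_data()
    drive_haptic_array(scene.focal_points)
    drive_light_source(scene.light_pattern)
    motion = estimate_user_motion(prev_frame, frame)
    send_motion_to_console(motion)

control_step(prev_frame=None, frame=None)
```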
  • the haptic array is a planar array. In some embodiments, the haptic array is a non-planar array. In some embodiments, the plurality of ultrasonic devices comprises an ultrasonic speaker, a piezoelectric speaker, an electromagnetic speaker, or any combination thereof. In some embodiments, at least a portion of the plurality of ultrasonic devices have a frequency of less than about 20 MHz. In some embodiments, the camera comprises a two-dimensional camera. In some embodiments, the camera comprises a three-dimensional camera. In some embodiments, the camera is an infrared camera, a near infrared camera, a visible light camera, an ultra-violet spectrum camera, or any combination thereof.
  • the camera captures data at a rate of about 10 Hz to 10,000 Hz.
  • the user motion is a motion of an appendage of a user.
  • the light source comprises a laser, a light emitting diode, a light bulb, or any combination thereof.
  • the emitted light has a wavelength of about 10 nm to about 10,000 nm.
  • the emitted light has a frequency of about 0.3 THz to about 300 THz.
  • the system further comprises an interferometric device, wherein the operations further comprise calibrating the haptic array based on data received from the interferometric device.
  • the interferometric device comprises a laser doppler vibrometer, a laser interferometer, an acoustic interferometer, or any combination thereof.
  • the operations further comprise directing the thermal element to emit heat based on the 2D data, the 3D data, the user motion, or any combination thereof.
  • the system further comprises an energy storage device providing power to the haptic array, the camera, the non- transitory computer-readable storage media, or any combination thereof.
  • the energy storage device comprises a battery, a supercapacitor, or any combination thereof.
  • a computer-implemented method for haptic interactive gaming comprising: receiving, by a computer, a two-dimensional (2D) data, a three-dimensional (3D) data, or both from a gaming console; directing, by the computer, at least a portion of a plurality of ultrasonic devices in a haptic array to emit a first acoustic field based on the 2D data, the 3D data, or both; directing, by the computer, a light source to emit light based on the 2D data, the 3D data, or both; determining, by the computer, a user motion based on data received by a camera; and providing, by the computer, the motion data to the user console based on the user motion.
  • the plurality of ultrasonic devices comprises an ultrasonic speaker, a piezoelectric speaker, an electromagnetic speaker, or any combination thereof.
  • the camera comprises a two-dimensional camera. In some embodiments, the camera comprises a three-dimensional camera. In some embodiments, the camera is an infrared camera, a near infrared camera, a visible light camera, an ultra-violet spectrum camera, or any combination thereof.
  • the data is received by the camera at a rate of about 10 Hz to 10,000 Hz.
  • the method further comprises calibrating, by the computer, the haptic array based on data received from an interferometric device. In some embodiments, the method further comprises directing, by the computer, a thermal element to emit heat based on the 2D data, the 3D data, the user motion, or any combination thereof.
  • FIG. 1 shows a non-limiting example of a computing device; in this case, a device with one or more processors, memory, storage, and a network interface;
  • FIG. 2 shows a non-limiting example of a web/mobile application provision system; in this case, a system providing browser-based and/or native mobile user interfaces;
  • FIG. 3 shows a non-limiting example of a cloud-based web/mobile application provision system; in this case, a system comprising an elastically load balanced, auto-scaling web server and application server resources as well as synchronously replicated databases;
  • FIG. 4 shows a non-limiting example of a bio-haptic security system, per one or more embodiments herein;
  • FIG. 5 shows a non-limiting example of a haptic array device, per one or more embodiments herein;
  • FIG. 6A shows an image of an exemplary system for haptic interactive gaming, per one or more embodiments herein;
  • FIG. 6B shows an image of a user manipulating a virtual cube with an exemplary system for haptic interactive gaming, per one or more embodiments herein;
  • FIG. 7A shows an image of a user throwing a virtual ball with an exemplary system for haptic interactive gaming, per one or more embodiments herein;
  • FIG. 7B shows an image of a user manipulating one of three balls with an exemplary system for haptic interactive gaming, per one or more embodiments herein;
  • FIG. 8A shows a first image of an exemplary system for haptic interactive gaming, per one or more embodiments herein;
  • FIG. 8B shows a second image of an exemplary system for haptic interactive gaming, per one or more embodiments herein;
  • FIG. 9A shows a third image of an exemplary system for haptic interactive gaming, per one or more embodiments herein.
  • FIG. 9B shows an image of an exemplary ancillary device, per one or more embodiments herein.
  • the haptic feedback system utilizes a combination of optic and acoustic fields simultaneously.
  • generated optic and acoustic fields do not directly interfere; however, combining them provides benefits such as multi-resolution haptic images and a synergistic effect on haptic perception.
  • the fields are applied simultaneously as elastic waves to stimulate nerve signals.
  • the optic field is utilized to simulate or produce a “skin feeling,” or feeling of touch.
  • the acoustic field is utilized to apply pressure. Combining two fields of different physical quantities provides not only the superposition effect described above but also synergistic effects, such as modification of the perceived sensation.
  • FIG. 4 shows a diagram of the components of haptic array device, according to some embodiments.
  • FIG. 5 depicts a haptic array device, according to some embodiments.
  • the system is parametric.
  • the non-linearity of the frequency response produced by multiple ultrasonic frequencies in air is modeled utilizing parametric equations.
  • the parametric equations may be utilized in computer and/or machine learning systems to model this non-linear response (and, as a result, the effect is best modeled with parametric equations).
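One widely used parametric model for this non-linearity is Berktay's far-field approximation, in which the self-demodulated audible pressure is proportional to the second time derivative of the squared modulation envelope. The sketch below illustrates that generic textbook model numerically; it is not necessarily the specific parametric equations contemplated herein.

```python
# Hedged numerical sketch of parametric self-demodulation in air: audible pressure
# is modeled as proportional to the second time derivative of the squared envelope.

import numpy as np

fs = 400_000                 # sample rate high enough to resolve a 40 kHz carrier
t = np.arange(0, 0.02, 1 / fs)
f_audio = 200.0              # desired audible tone (Hz)

envelope = 0.5 * (1 + np.sin(2 * np.pi * f_audio * t))   # amplitude envelope
demodulated = np.gradient(np.gradient(envelope ** 2, t), t)

# The demodulated waveform contains the intended audio frequency plus harmonics
# introduced by the nonlinearity, which is why envelope pre-distortion is often
# applied in practice.
print(demodulated[:5])
```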
  • the system includes Field Programmable Gate Arrays (FPGAs), machine learning, autonomous control systems, fast-networking, fast-self healing, interferometer sensors, ultrasonic speaker arrays, and the like.
  • FPGAs Field Programmable Gate Arrays
  • the system utilizes laser interferometer technology to measure the response of an environment, one or more objects, or a combination thereof to dynamically change parameters and achieve desired effects.
  • a laser interferometer system sends out a two-beam laser to measure vibration of a surface.
  • a laser interferometer is used to receive vibration signals that calibrate the output of the ultrasonic transducer array, effectively beamforming the audio waves to focus on one or more points on a subject or object.
  • a parametric speaker array is a highly directive speaker that consists of an array of ultrasonic transducers that exploit the nonlinear properties of air to self-demodulate modulated ultrasonic signals with the aim of creating narrow, focused sound waves (audible and inaudible).
  • the ultrasonic transducers are piezoelectrically driven.
  • the system utilizes one or more parametric speaker/transducer arrays.
  • each transducer array comprises multiple transducers.
  • the multiple transducers of each array output the same signal which is amplified by constructive interference.
  • two or more arrays are configured to further amplify a signal via constructive interference.
  • a plurality of speaker arrays may be utilized to precisely direct sound or amplify sound at a precise location.
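A minimal sketch of how per-transducer delays might be computed to focus such an array at a single point follows; the carrier frequency, element pitch, and grid size below are assumptions for illustration only, not values taken from this disclosure.

```python
# Illustrative delay-and-sum focusing sketch (assumptions: 40 kHz carrier, 343 m/s
# speed of sound, 10 mm element pitch, 18 x 18 planar grid). Each element is
# delayed so that all emissions arrive at the focal point in phase.

import numpy as np

C = 343.0          # speed of sound in air (m/s)
F = 40_000.0       # carrier frequency (Hz)
PITCH = 0.010      # element spacing (m), assumed
N = 18             # elements per side

# Element positions on a planar array centered at the origin (z = 0).
idx = (np.arange(N) - (N - 1) / 2) * PITCH
xx, yy = np.meshgrid(idx, idx)
elements = np.stack([xx.ravel(), yy.ravel(), np.zeros(N * N)], axis=1)

def focus_delays(focal_point):
    """Per-element delays (s) so all wavefronts arrive at focal_point together."""
    dist = np.linalg.norm(elements - np.asarray(focal_point), axis=1)
    return (dist.max() - dist) / C     # farthest element fires first (zero delay)

delays = focus_delays([0.0, 0.0, 0.20])          # focus 20 cm above array center
phases = (2 * np.pi * F * delays) % (2 * np.pi)  # equivalent carrier phase offsets
print(phases.reshape(N, N)[0, :5])
```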
  • Use of a parametric speaker array may replace the traditional use of broadcasting audio through distributed and coherent beamforming functionality. This approach offers the capability for numerous smaller devices to output the same audio volume as a single large device.
  • current acoustic hailing or loudspeaker systems focus on high energy output over focused energy output, requiring large and powerful emitters that are difficult to move and/or emplace.
  • the system and methods herein allow for high powered acoustic energy signals to be achieved with a system which is relatively compact and has low power requirements.
  • the system combines the laser interferometer and parametric speaker array technologies with the distributed coherent beamforming technique through a network-capable control system that uses algorithms and/or machine learning (ML) to rapidly tune the audio effect to mitigate destructive environmental noise and to enable effective beam coherence. Therefore, in some embodiments, the system provides autonomous environmental adjustments and distributed coherence beam forming.
  • the inventive device combines four fundamental technologies: (1) a small, ultrasonic parametric speaker array for broadcasting focused acoustic waveforms, (2) one or more lasers for generating laser haptics, (3) one or more video capture devices for monitoring at least a portion of a subject, and (4) a network-connected system controller to manage data from both the network and the individual components.
  • an individual system functions on its own.
  • individual systems are combined in a network that provides a distributed coherent beamforming function.
  • the system utilizes digital signal processing, embedded systems, information technology for distributed networking (i.e., Internet of Things (IoT)), and machine learning/artificial intelligence (ML/AI) for device self-calibration.
  • a system 400 for providing haptic feedback or stimulation is depicted, according to some embodiments.
  • the system 400 is utilized to stimulate or provide haptic feedback to a subject or a portion of a subject (e.g., a hand of a subject 490).
  • the system 400 includes network module 405, system controller 410, acoustic payload controller 420, a monitoring controller 425, monitoring sensors 430, acoustic haptic array controller 435, acoustic haptic array 450, optical emission controller 460, optical emitter 465, and recorder 440.
  • the functions of the system 400 are controlled by system controller 410.
  • the system controller 410 comprises a computer processing unit (CPU), as described herein.
  • the CPU may comprise one or more programs loaded onto a memory for sending instructions for operating the various components of the system, as described herein.
  • the system controller 410 may further comprise a field programmable gate array (FPGA) configurable to provide a logic circuit for specified functions of the system.
  • the system controller 410 is in operative communication with a network module 405.
  • the network module 405 may be configured to receive instructions, such as programming instructions, parameter inputs, or the like, and transmit said instructions to the system controller 410.
  • the network module 405 may communicate with an external network, remote device, user interface, or the like, as disclosed herein.
  • mesh networking is utilized.
  • mesh networking allows the system to provide distributed coherence.
  • mesh networking may allow many small systems to achieve the performance of a much larger system.
  • Mesh networking may also allow the system to provide unique and complicated acoustic algorithms (e.g., machine learning) to enable precise spatial audio or ultrasonic feedback.
  • the system controller 410 is operatively coupled to the acoustic payload controller 420 and the optical emission controller 460. In some embodiments, the acoustic payload controller and the optical emission controller are integrated into a single haptic array controller. In some embodiments, the system controller 410 is operatively coupled to the acoustic payload controller 420 and the optical emission controller 460 via one or more control buses 415.
  • the acoustic payload controller 420 comprises an application specific integrated circuit (ASIC) that processes one or more signals and provides an output signal to the acoustic haptic array controller 435.
  • the acoustic haptic array controller 435 provides an output signal to the acoustic haptic array 450, where the output signal is transformed into a mechanical waveform (e.g., an acoustic, sound, or ultrasonic waveform) by one or more transducers of the acoustic haptic array.
  • the haptic array controller comprises an amplifier to amplify the signal prior to output to the haptic array(s).
  • the system is connected to a plurality of haptic arrays and the output to each haptic array is varied to produce a desired output.
  • the constructive interference of the sonic waves produced by the transducers is utilized to produce one or more focal points.
  • focal points of sonic energy are produced with a resolution of 1/16 of the wavelength (e.g., approximately 0.5 mm for the 40-kHz ultrasound).
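As a quick arithmetic check of the stated figures, assuming a speed of sound in air of roughly 343 m/s:

```latex
\lambda = \frac{c}{f} = \frac{343\ \text{m/s}}{40\ \text{kHz}} \approx 8.6\ \text{mm},
\qquad
\frac{\lambda}{16} \approx 0.54\ \text{mm} \approx 0.5\ \text{mm}
```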
  • the optical emission controller 460 comprises an application specific integrated circuit (ASIC) that processes one or more received signals.
  • the optical emission controller 460 receives signals from the system controller 410.
  • the optical emission controller 460 receives signals from the system controller 410, the acoustic payload controller 420, the monitoring controller 425, or a combination thereof.
  • the optical emission controller 460 directs and controls one or more optical emitters 465.
  • the one or more optical emitters 465 comprise at least one light source. In some embodiments, the one or more optical emitters 465 comprise at least one light source coupled to one or more optical elements.
  • the optical elements may comprise lenses, mirrors, polarizers, filters, and the like to manipulate light from a light source and direct it to a target location.
  • the system is connected to a plurality of optical emitters and the output to each optical emitter is varied to produce a desired output.
  • the light source of the optical emitter is a laser, as described herein.
  • the optical emitter produces electromagnetic energy outside of the visible light spectrum.
  • the optical emitter may produce electromagnetic waves within the ultraviolet or infrared spectrum.
  • the optical emitter is replaced or used in combination with an emitter which generates another type of electromagnetic energy, such as radio emissions.
  • the optical emitter is replaced or used in combination with a thermal emitter which generates and transmits heat toward a target location or focal point.
  • the system 400 comprises a monitoring controller 425.
  • the monitoring controller operates and receives data from one or more monitoring sensors.
  • Monitoring sensors may include optical sensors (e.g., cameras), thermal sensors, audio sensors, vibrational/mechanical sensors, and the like which are directed toward a target (e.g., a target area, volume, or a portion of a subject 490).
  • an interferometer is utilized as a monitoring sensor, as disclosed herein.
  • the monitoring controller 425 transmits data from the monitoring sensors 430 to be processed by the acoustic payload controller 420.
  • the acoustic payload controller 420 comprises a digitally-programmable potentiometer (DPP) which receives the interferometer data.
  • the monitoring controller 425 transmits data from the monitoring sensors 430 to be processed by the optical emission controller 460.
  • the optical emission controller 460 comprises a digitally-programmable potentiometer (DPP) which receives the data generated by the monitoring sensors.
  • the monitoring data is sent back to system controller 410.
  • the acoustic payload controller 420 may adjust the output signal to the acoustic haptic array controller 435 based on data received from the monitoring controller 425 and/or instructions provided by the system controller 410 in response to data received from the monitoring controller 425.
  • the optical emission controller 460 may adjust the output signal to the optical emitter based on data received from the monitoring controller 425 and/or instructions provided by the system controller 410 in response to data received from the monitoring controller 425.
  • the system is configured such that feedback received from the monitoring sensors 430 is utilized to adjust the system, output of the haptic arrays 450, and output of the optical emitters 465. In some embodiments, adjustments are made in real-time to provide a self-calibrating system.
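A minimal sketch of such a feedback adjustment, assuming a single scalar drive level corrected proportionally against a sensor reading, might look like the following; the gain and signal values are illustrative only.

```python
# Hedged sketch of a self-calibrating loop: measured feedback (e.g., from an
# interferometer or camera) is compared against the target output and the drive
# amplitude is corrected proportionally.

def calibrate_step(target_amplitude, measured_amplitude, drive, gain=0.2):
    """One proportional correction of the array drive level."""
    error = target_amplitude - measured_amplitude
    return max(0.0, drive + gain * error)

drive = 0.5
for measured in [0.30, 0.38, 0.45, 0.49]:   # simulated sensor readings
    drive = calibrate_step(target_amplitude=0.5, measured_amplitude=measured, drive=drive)
    print(round(drive, 3))
```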
  • the system further comprises a recorder 440.
  • Recorder 440 may receive and store monitoring data via an input/output (I/O) integrated circuit coupled to the monitoring controller.
  • the stored data may be utilized by the system to improve outputs.
  • the stored monitoring data is input into a machine learning module to improve the system.
  • the system is used for audio recording using an interferometer (i.e., ISR).
  • the monitoring data is used to track a target 490.
  • the monitoring data is used to monitor the response of a target to the haptic output of the system.
  • the system is modular, such that multiple systems can be networked to provide different levels of performance based on user needs.
  • An individual system may operate independently with reduced function based on user needs.
  • Combined systems may operate together to produce a higher output signal or provide haptic feedback to a larger volume of space.
  • As disclosed herein, sonic haptics may be provided to a target (e.g., one or more focal points, a portion of a subject, etc.). In some embodiments, sonic haptic feedback is provided to a target via an array of ultrasonic transducers. In some embodiments, an array of ultrasonic transducers comprises 324 transducers arranged in an 18 x 18 square grid. However, multiple arrangements of the transducers may be provided to better suit various applications. In some embodiments, the transducers are arranged as a planar array.
  • the transducers are arranged in a non-planar array. In some embodiments, the transducers are arranged in two or more planar arrays which are provided at an angle to each other. In some embodiments, the transducers are arranged in two or more planar arrays which are orthogonal to each other. In some embodiments, the transducers are open aperture ultrasonic transducers. In some embodiments, the transducers are ceramic transducers (e.g., Nippon Ceramic T4010A1 transducers).
  • an array of ultrasonic transducers comprises about 4 transducers to about 1,025 transducers. In some embodiments, an array of ultrasonic transducers comprises about 4 transducers to about 25 transducers, about 4 transducers to about 64 transducers, about 4 transducers to about 256 transducers, about 4 transducers to about 324 transducers, about 4 transducers to about 576 transducers, about 4 transducers to about 1,025 transducers, about 25 transducers to about 64 transducers, about 25 transducers to about 256 transducers, about 25 transducers to about 324 transducers, about 25 transducers to about 576 transducers, about 25 transducers to about 1,025 transducers, about 64 transducers to about 256 transducers, about 64 transducers to about 324 transducers, about 64 transducers to about 576 transducers, about 64 transducers to about 1,025 transducers, about 256 transducers to about 324 transducers, about 256 transducers to about 576 transducers, about 256 transducers to about 1,025 transducers, about 324 transducers to about 576 transducers, about 324 transducers to about 1,025 transducers, or about 576 transducers to about 1,025 transducers.
  • an array of ultrasonic transducers comprises at least about 4 transducers, about 25 transducers, about 64 transducers, about 256 transducers, about 324 transducers, or about 576 transducers, including increments therebetween. In some embodiments, a plurality of transducer arrays is provided.
  • the transducers are capable of producing an ultrasonic focal point having a diameter of about 20 millimeters (mm). In some embodiments, the transducers are capable of producing an ultrasonic focal point having a diameter of about 1 mm to about 100 mm.
  • the transducers are capable of producing an ultrasonic focal point having a diameter of about 1 mm to about 5 mm, about 1 mm to about 10 mm, about 1 mm to about 20 mm, about 1 mm to about 40 mm, about 1 mm to about 50 mm, about 1 mm to about 100 mm, about 5 mm to about 10 mm, about 5 mm to about 20 mm, about 5 mm to about 40 mm, about 5 mm to about 50 mm, about 5 mm to about 100 mm, about 10 mm to about 20 mm, about 10 mm to about 40 mm, about 10 mm to about 50 mm, about 10 mm to about 100 mm, about 20 mm to about 40 mm, about 20 mm to about 50 mm, about 20 mm to about 100 mm, about 40 mm to about 50 mm, about 40 mm to about 100 mm, or about 50 mm to about 100 mm.
  • the transducers are capable of producing an ultrasonic focal point having a diameter of at least about 1 mm, about 5 mm, about 10 mm, about 20 mm, about 40 mm, or about 50 mm, including increments therebetween. In some embodiments, the transducers are capable of producing an ultrasonic focal point having a diameter of at most about 5 mm, about 10 mm, about 20 mm, about 40 mm, about 50 mm, or about 100 mm, including increments therebetween.
  • the transducer array is capable of providing pressure forces of about 10 millinewtons (mN) to about 20 mN. In some embodiments, the transducer array is capable of providing pressure forces of about 1 mN to about 100 mN.
  • the transducer array is capable of providing pressure forces of about 1 mN to about 2 mN, about 1 mN to about 5 mN, about 1 mN to about 10 mN, about 1 mN to about 20 mN, about 1 mN to about 50 mN, about 1 mN to about 100 mN, about 2 mN to about 5 mN, about 2 mN to about 10 mN, about 2 mN to about 20 mN, about 2 mN to about 50 mN, about 2 mN to about 100 mN, about 5 mN to about 10 mN, about 5 mN to about 20 mN, about 5 mN to about 50 mN, about 5 mN to about 100 mN, about 10 mN to about 20 mN, about 10 mN to about 50 mN, about 10 mN to about 100 mN, about 20 mN to about 50 mN, about 20 mN to about 100 mN, or about 50 mN to about 100 mN.
  • the ultrasonic haptics are based on acoustic radiation pressure, which is not vibrational and presses on the skin surface. This can be applied to the skin for a long time, but it is relatively weak. The sensation may be similar to a laminar air flow within a narrow area.
  • vibrotactile stimulations are produced by modulation of ultrasonic emission as waveforms.
  • vibrotactile stimulations are produced by modulating the ultrasonic emission with 200 Hz and 50 Hz waves.
  • the waveforms for producing ultrasonic haptic feedback are sinewaves, rectangular waves, triangular waves, or a combination thereof.
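For illustration, an amplitude-modulated drive signal of the kind described above can be generated as follows; the sample rate is an assumption, and only the 40 kHz carrier, 200 Hz, and 50 Hz figures come from the text.

```python
# Illustrative generation of a vibrotactile drive signal: a 40 kHz carrier
# amplitude-modulated by a 200 Hz sine envelope, plus a 50 Hz rectangular
# (on/off) envelope as an alternative. Triangular envelopes are analogous.

import numpy as np

fs = 1_000_000
t = np.arange(0, 0.05, 1 / fs)
carrier = np.sin(2 * np.pi * 40_000 * t)

envelope_sine = 0.5 * (1 + np.sin(2 * np.pi * 200 * t))      # 200 Hz modulation
drive_sine = envelope_sine * carrier

envelope_rect = (np.sin(2 * np.pi * 50 * t) > 0).astype(float)  # 50 Hz on/off
drive_rect = envelope_rect * carrier

print(drive_sine[:3], drive_rect[:3])
```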
  • the spatial resolution produced by the transducer array is about 8.5 mm when the array is operating at 40 kilohertz (kHz).
  • the haptic array device comprises one or more lasers for providing haptic feedback.
  • a laser emits energy at a wavelength of about 10 nm to about 10,000 nm.
  • a laser has a frequency of about 0.3 THz to about 300 THz.
  • a power output of the laser is about 0.16 watts (W).
  • a power output of the laser is about 0.01 W to about 0.5 W.
  • a power output of the laser is about 0.01 W to about 0.05 W, about 0.01 W to about 0.1 W, about 0.01 W to about 0.13 W, about 0.01 W to about 0.16 W, about 0.01 W to about 0.2 W, about 0.01 W to about 0.3 W, about 0.01 W to about 0.5 W, about 0.05 W to about 0.1 W, about 0.05 W to about 0.13 W, about 0.05 W to about 0.16 W, about 0.05 W to about 0.2 W, about 0.05 W to about 0.3 W, about 0.05 W to about 0.5 W, about 0.1 W to about 0.13 W, about 0.1 W to about 0.16 W, about 0.1 W to about 0.2 W, about 0.1 W to about 0.3 W, about 0.1 W to about 0.5 W, about 0.13 W to about 0.16 W, about 0.13 W to about 0.2 W, about 0.13 W to about 0.3 W, about 0.13 W to about 0.5 W, about 0.16 W to about 0.2 W, about 0.16 W to about 0.3 W, about 0.16 W to about 0.5 W, about 0.2 W to about 0.3 W, about 0.2 W to about 0.5 W, or about 0.3 W to about 0.5 W.
  • a power output of the laser is about 0.01 W, about 0.05 W, about 0.1 W, about 0.13 W, about 0.16 W, about 0.2 W, about 0.3 W, or about 0.5 W. In some embodiments, a power output of the laser is at least about 0.01 W, about 0.05 W, about 0.1 W, about 0.13 W, about 0.16 W, about 0.2 W, or about 0.3 W. In some embodiments, a power output of the laser is at most about 0.05 W, about 0.1 W, about 0.13 W, about 0.16 W, about 0.2 W, about 0.3 W, or about 0.5 W, including increments therebetween.
  • low laser power levels prevent damage to the skin of a user.
  • the sensation produced by the laser system may be similar to an electric sensation.
  • the haptic feedback from the laser arises from evaporation that produces a non-thermal shockwave on the skin.
  • duration of laser exposure is limited to prevent damage to the skin.
  • a haptic laser system comprises at least one laser light source.
  • the haptic laser system comprises optical elements such as lenses, mirrors, polarizers, filters, and the like to manipulate light from a light source and direct it to a target location.
  • a haptic laser system comprises galvo-mirrors for precise positioning of the laser energy.
  • a laser system comprises a computer-controlled optical phased array comprising pixels that modulate a laser beam’s intensity, phase, or both.
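A hedged sketch of one standard way to program such a pixelated phased array for focusing, using a lens-like phase profile, is shown below; the wavelength, pixel pitch, pixel count, and focal distance are illustrative assumptions, not values from this disclosure.

```python
# Hedged sketch of a focusing phase profile for a pixelated optical phased array:
# each pixel's phase is set so light converges at focal length f (a standard
# lens phase function, not necessarily the modulation scheme used herein).

import numpy as np

wavelength = 650e-9          # assumed visible laser wavelength (m)
f = 0.30                     # assumed focal distance (m)
pitch = 8e-6                 # assumed pixel pitch (m)
n = 256                      # pixels per side

coords = (np.arange(n) - n / 2) * pitch
x, y = np.meshgrid(coords, coords)
phase = (-2 * np.pi / wavelength) * (np.sqrt(x**2 + y**2 + f**2) - f)
phase_wrapped = np.mod(phase, 2 * np.pi)     # value written to each pixel
print(phase_wrapped.shape)
```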
  • the haptic array device utilizes a combination of electromagnetic energy and pressure from mechanical waves to produce unique sensations for a user.
  • the ultrasonic transducers can produce pressure in larger areas (e.g., about 30 cm areas).
  • the laser haptics systems produce sensations in more focused areas (e.g., down to 1 micron). Therefore, a combination of laser and ultrasonic transducer systems may produce focused haptics at different scales simultaneously. For example, if a target is a hand of a user, the ultrasonic haptic system may produce a pressure sensation on the palm of the hand, while the laser haptic system focuses a sensation on a fingertip of the user. Such a configuration may be useful in confirming registration or detection of various parts of the hand when being used in combination with a gesture registration system.
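The division of labor described above could be coordinated, for example, by assigning each haptic modality to a detected hand region; the landmark names and coordinates below are hypothetical.

```python
# Hedged sketch of assigning haptic modalities to detected hand regions: broad
# pressure from the ultrasonic array at the palm, a fine laser focal point at a
# fingertip. Landmark names and coordinates are placeholders.

hand_landmarks = {
    "palm_center": (0.00, 0.00, 0.20),       # meters, in array coordinates
    "index_fingertip": (0.03, 0.05, 0.22),
}

def plan_haptics(landmarks):
    return {
        "ultrasonic_focus": landmarks["palm_center"],      # cm-scale pressure patch
        "laser_focus": landmarks["index_fingertip"],       # sub-mm stimulation point
    }

print(plan_haptics(hand_landmarks))
```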
  • lasers of the haptic array device are utilized to produce visualizations.
  • constructive interference produced by a laser emission system is utilized to generate 3D images or holograms.
  • a 3D image or hologram is utilized to help guide a user when the haptic array device is being used as a controller or for gesture recognition.
  • a 3D image or hologram is utilized to help guide a user when an external device is being used as a controller or for gesture recognition.
  • a 3D image is produced to guide a user’s hand to the center of an image captured by a camera (either incorporated or external to the haptic array device) being utilized for gesture recognition.
  • a haptic array device utilizes a laser system to produce both haptic and visual effects.
  • the haptic feedback is provided as the user interacts with a 3D image or hologram.
  • a 3D image or hologram is utilized to help guide a user through a series of movements as part of a rehabilitation or training program.
  • one or more sensors are provided to monitor interaction with the haptic array device.
  • a monitoring system comprising one or more sensors is provided to monitor whether a user position, a user motion, or both are outside a security threshold from a set user position, a set user motion, or both.
  • Monitoring sensors may include optical sensors (e.g., cameras), thermal sensors, audio sensors, vibrational/mechanical sensors, and the like which are directed toward a target (e.g., a target area, volume, or a portion of a subject 490).
  • an interferometer is utilized as a monitoring sensor, as disclosed herein.
  • a monitoring system comprises a camera.
  • the camera captures data at a rate of about 10 Hz to 10,000 Hz.
  • the camera comprises a two-dimensional camera.
  • the camera comprises a three-dimensional camera.
  • the camera is an infrared camera, a near infrared camera, a visible light camera, an ultra-violet spectrum camera, or any combination thereof.
  • the camera is coupled to a computer processing unit (CPU) of the system, as disclosed herein.
  • the camera may be utilized for gesture recognition.
  • haptic feedback is provided by the haptic array device in response to position or movement of a target within the field of view of the camera.
  • feature detection and extraction methods are utilized to identify a region of interest on the target.
  • regions of interest may include a finger, palm, thumb, fingertip, etc. of a user.
  • feature detection and extraction methods comprise computational processing of images to analyze contrasts in pixel brightness to recognize features.
  • Feature detection and extraction methods may include edge detection, corner detection, blob detection, ridge detection, and combinations thereof.
  • an edge detection algorithm is utilized to identify an outline or border of a target.
  • a nearest neighbor, thresholding, clustering, partial differential equation, and/or other digital image processing methods are utilized to identify an outline or border of a target.
  • Canny, Deriche, differential, Sobel, Prewitt, and Roberts cross edge detection techniques may be utilized to identify a target or a portion thereof.
  • Gaussian or Laplacian techniques are utilized to smooth or improve the accuracy of the identified target or portion thereof.
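As an illustration of this processing chain, the sketch below applies Gaussian smoothing, Canny edge detection, and contour extraction with standard OpenCV primitives to a synthetic frame; a real system would operate on camera frames of the target.

```python
# Illustrative target-outline extraction: Gaussian blur + Canny edges + contours.
# A synthetic single-channel frame stands in for a camera image.

import cv2
import numpy as np

frame = np.zeros((240, 320), dtype=np.uint8)
cv2.circle(frame, (160, 120), 60, 255, -1)          # stand-in for a hand blob

blurred = cv2.GaussianBlur(frame, (5, 5), 0)        # Gaussian smoothing
edges = cv2.Canny(blurred, 50, 150)                 # Canny edge detection
contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

largest = max(contours, key=cv2.contourArea)        # outline of the target
print(len(largest), "contour points")
```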
  • additional sensors are utilized to enhance or supplement the performance of the haptic array device or monitoring system thereof.
  • ancillary sensors comprise wearable sensors which are attached to a user to receive additional data generated by movements or electrical signals (e.g., electromyographic (EMG), electroencephalographic (EEG), etc.) produced by a user.
  • EMG electromyographic
  • EEG electroencephalographic
  • a wearable ancillary sensor comprises one or more motion sensors.
  • the motion sensors comprise an accelerometer, a gyroscope, or a combination thereof.
  • a wearable ancillary sensor array is configured to couple to an appendage, limb, or extremity of a user.
  • an existing device comprising one or more motion sensors (e.g., a smart watch) is coupled to the haptic array device to act as an ancillary sensor device.
  • additional bioinformatics are acquired by the ancillary sensors such as heart rate, body temperature, blood pressure, or a combination thereof.
  • a wearable ancillary sensor array is configured to be worn on a head of user.
  • a wearable ancillary sensor array comprising one or more EEG sensors is configured to place the EEG sensors in proximity to the scalp of a user and receive electric signals produced by the brain of the user.
  • the EEG sensors do not require direct contact to the skin (e.g., no need for shaving of the head) or a gel to be applied to the scalp.
  • the ancillary sensors are used to confirm or verify actions or gestures made by a user.
  • bioinformatic information obtained by the ancillary sensors is recorded and stored in a memory of the system.
  • the ancillary sensor is head wearable and comprises a helmet, a visor, glasses, a headband, earbuds, earphones, or any combination thereof.
  • the ancillary sensor is head wearable and comprises an inertial motion unit (IMU) sensor, a facial micro-motion sensor, or any combination thereof.
  • the ancillary sensor is wrist and/or hand wearable and comprises a photoacoustic sensor, an ultrasound sensor, an electrodermal activity (EDA) sensor, an inertial motion unit (IMU) sensor, a temperature sensor, a vision shadow capture sensor, an altimeter, a barometer, a humidity sensor, a sweat rate sensor, a hydration sensor, a bioacoustics sensor, a dynamometer, an electrodermal sensor, or any combination thereof.
  • the wrist ancillary sensor is reconfigurable based on a user’s handedness.
  • the ancillary sensor is hand graspable.
  • the wrist ancillary sensor comprises a wireless communication device, a wired communication device, or both. In some embodiments, the wrist ancillary sensor comprises an energy storage device, a wired charge connector, a wireless charge connector, or any combination thereof. In some embodiments, the wrist ancillary sensor comprises a finger interface, a haptic feedback, a joystick, a trackpad, a trackball, or any combination thereof. In some embodiments, the haptic feedback comprises a finger haptic, a magneto haptic, an opto-haptic, or any combination thereof.
  • the ancillary sensor is foot-wearable and comprises a bioimpedance (BIA) sensor, an electrodermal activity (EDA) sensor, an inertial motion unit (IMU) sensor, a thermometer, an altimeter, a barometer, a humidity sensor, a sweat rate generation sensor, a hydration sensor, a bioacoustics sensor, or any combination thereof.
  • the ancillary sensor comprises a facial micro-motion sensor, a photoplethysmography (PPG) sensor, a photoacoustic sensor, an ultrasound sensor, a bioimpedance (BIA) sensor, an electrodermal activity (EDA) sensor, an inertial motion unit (IMU) sensor, a temperature sensor, a vision shadow capture sensor, an altimeter, a barometer, a humidity sensor, a sweat rate sensor, a hydration sensor, a bioacoustics sensor, a dynamometer, an electrodermal sensor, or any combination thereof.
  • an interactive haptic feedback system is provided by the devices and methods disclosed herein.
  • haptic feedback is provided to help guide the user.
  • haptic feedback may be utilized to confirm that the monitoring system has identified a target portion of a user or confirm that movement of the user has been properly registered.
  • haptic feedback is utilized to confirm a portion of the user is properly in view of the monitoring system.
  • movement of a hand of a user is monitored by a haptic array device, as the user interacts with a hologram or visualization produced by the haptic array device.
  • the movements of the user’s hand are translated into instructions so as to provide a controller.
  • haptic feedback is provided in response to the user’s movements to simulate interaction with an object represented by the visualization or hologram.
  • computer vision systems may be utilized to identify portions of a user hand and monitor movement of the user’s hand and fingers. Information from monitoring systems may be provided to render or position a visualization in response to the movement by the user. While the description herein provides methods and systems for monitoring a hand of a user for interacting with a hologram or visualization, it should be appreciated that other portions of the body, or the entire body, of a user may be monitored to manipulate a visualization as the user interacts with the visualization.
  • a haptic array device provides haptic feedback to a user as they manipulate virtual items in an augmented reality.
  • the haptic array device provides interactive feedback responsive to a user’s movements to supplement an augmented reality.
  • an external device produces an image in augmented reality projected proximal to the haptic array device.
  • haptic feedback from the haptic array device is synchronized with a user’s interaction with the augmented reality visualization.
  • Coupling of a haptic array device to a computing system, to provide interactive haptic feedback may be carried out through wired or wireless communication.
  • a computing system may include a personal computer, mobile computing device, gaming system, or the like.
  • the haptic array device monitors movement of a user, such that the haptic array device provides a controller for operation of the computing system.
  • the haptic array device provides haptic feedback in response to a user’s movements and/or in response to situations provided in a virtual environment.
  • movement of a hand of a user is monitored by a haptic array device, as described herein, and translated into instructions so as to provide a controller.
  • haptic feedback is provided in response to the user’s movements. Haptic feedback may also be provided in response to conditions of a virtual or real-world environment.
  • a haptic controller provided by the haptic array device monitors movements made by the hand and fingers of a user to provide a controller.
  • each finger of a hand of a user is monitored to provide controlling instructions for a system or device.
  • computer vision systems may be utilized to identify portions of a user hand and monitor movement of the user’s hand and fingers. While the description herein provides methods and systems for monitoring a hand of a user for controlling systems and devices, it should be appreciated that other portions of the body, or the entire body, of a user may be monitored to provide a controller device.
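A minimal sketch of mapping monitored finger positions to a controller command is shown below; the landmark points and pinch threshold are hypothetical placeholders that would, in practice, come from the camera pipeline described above.

```python
# Hedged sketch: detect a thumb-index pinch from hypothetical 3D landmarks and
# translate it into a simple controller command.

import math

def to_command(landmarks, pinch_threshold=0.02):
    """Return 'select' when thumb and index tips are closer than the threshold."""
    gap = math.dist(landmarks["thumb_tip"], landmarks["index_tip"])
    return "select" if gap < pinch_threshold else "idle"

landmarks = {"thumb_tip": (0.00, 0.00, 0.20), "index_tip": (0.015, 0.00, 0.20)}
print(to_command(landmarks))
```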
  • the haptic array device is configured as a controller for a computing system. In some embodiments, the haptic array device is configured as a gaming controller. In some embodiments, the haptic array device provides a controller by monitoring movement of the hand of a user. Use of the haptic array device may provide an intuitive controller. Further, the haptic array device may provide an adaptive controller to facilitate use by individuals with disabilities.
  • the haptic array device, when used as a controller, provides a “deadzone” wherein smaller movements are not registered as inputs, so as to prevent unintentional movement from being registered as controller input.
  • the controller deadzone is configurable to a user’s preference.
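Such a deadzone might be implemented, for illustration, as a simple radial threshold on the detected displacement; the radius shown is an arbitrary example value.

```python
# Minimal deadzone sketch: displacements smaller than a configurable radius are
# ignored so small tremors do not register as controller input.

def apply_deadzone(dx, dy, radius=0.01):
    """Return (0, 0) for movements inside the deadzone, else pass them through."""
    return (dx, dy) if (dx * dx + dy * dy) ** 0.5 >= radius else (0.0, 0.0)

print(apply_deadzone(0.004, 0.003))   # suppressed
print(apply_deadzone(0.030, 0.000))   # registered
```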
  • a controller provided by the haptic array device allows for user guided navigation through virtual environments.
  • a controller provided by the haptic array device allows intuitive interaction with virtual objects in virtual or augmented realities.
  • the haptic array device replaces a traditional controller system (e.g., a gaming controller, mouse, keyboard, etc.) typically utilized for a computer system or device.
  • the haptic array device supplements a traditional controller system utilized for a computer system or device.
  • the various modes of haptic feedback may provide a more immersive environment.
  • the various modes of haptic feedback capable of being emitted by the system are capable of producing interactive feedback well beyond the vibration feedback produced by traditional gaming controllers.
  • the haptic array device may be capable of producing oscillating sonic haptics which will create vibrational feedback similar to traditional gaming vibration feedback.
  • laser or sonic haptics may be utilized to create tactile stimulation as the user operates the haptic array device as a controller.
  • haptic feedback may be utilized to simulate the pressing of buttons or pulling of triggers.
  • Haptic feedback may also be utilized to simulate limitations of the haptic array device as a controller. For example, haptic feedback intensity may increase as a user’s hand is moved to toward the outer periphery of the monitoring system. This may allow haptic stimulation to indicate that the user is approaching the boundary of the controller and should not move any further.
  • haptic feedback may increase as the user pushes the controller towards the limit of the movement. For example, if a haptic array device is being utilized to control acceleration of a simulated car in a virtual environment, increasing haptic feedback may indicate to a user that a top speed of a car has been reached, or that the acceleration of the car is at a maximum.
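For illustration, the increasing feedback near a tracking boundary or control limit could be realized as a simple distance-based ramp; the ramp start distance below is an arbitrary example.

```python
# Hedged sketch of ramping haptic intensity as the hand nears the edge of the
# tracked volume (or as a virtual control nears its limit).

def boundary_intensity(distance_to_edge, ramp_start=0.10):
    """0 when far from the boundary, rising linearly to 1 at the boundary."""
    if distance_to_edge >= ramp_start:
        return 0.0
    return 1.0 - distance_to_edge / ramp_start

for d in (0.15, 0.08, 0.02, 0.0):
    print(d, round(boundary_intensity(d), 2))
```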
  • the haptic array device is utilized as a controller for real-world systems or devices.
  • the haptic array device may be utilized as a controller for unmanned vehicles, such as unmanned aerial vehicles (UAVs) or drones.
  • environmental conditions surrounding real world devices are recorded and utilized for providing haptic feedback to a user.
  • a UAV may be equipped with an anemometer to measure wind experienced by the UAV. Data obtained by the anemometer may be used to provide haptic feedback to a user such that the effects of the wind on the UAV are simulated and felt by the user. Further environmental conditions may be simulated by the sonic, laser, and thermal haptics provided by the system.
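A minimal sketch of mapping anemometer telemetry to a haptic cue, with illustrative names and a hypothetical saturation wind speed:

```python
import math

def wind_to_haptic(wind_speed_mps, wind_dir_deg, max_wind_mps=15.0):
    """Map UAV anemometer telemetry to a haptic cue for the operator's hand.

    Intensity scales with wind speed (saturating at max_wind_mps), and the
    haptic focal point is offset in the wind direction so the gust is "felt"
    on the corresponding side of the hand.
    """
    intensity = min(wind_speed_mps / max_wind_mps, 1.0)
    offset_x = math.cos(math.radians(wind_dir_deg)) * intensity
    offset_y = math.sin(math.radians(wind_dir_deg)) * intensity
    return {"intensity": intensity, "focus_offset": (offset_x, offset_y)}

# Example: a 7.5 m/s wind from 90 degrees yields a half-strength cue offset along +y.
print(wind_to_haptic(7.5, 90.0))
```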
  • a haptic array device provides haptic feedback to a user as they manipulate objects in a virtual reality. In some embodiments, a haptic array device provides haptic feedback to a user as they manipulate virtual items in an augmented reality environment. In some embodiments, the haptic array device provides interactive feedback responsive to a user’s movements to supplement virtual items in a virtual or augmented reality.
  • the haptic array device provides a three-dimensional visualization or hologram.
  • the three-dimensional visualization or hologram may represent a virtual item.
  • haptic feedback is provided to the user as they interact with the three-dimensional visualization or hologram.
  • the haptic feedback produced as the user interacts with the visualization may simulate interaction with a physical object.
  • ancillary sensors as described herein may be used to track a user’s response to stimulation, haptic or other (e.g., reactions to music or video).
  • haptic feedback is dynamically varied based on bioinformatic data obtained by ancillary sensors. For example, a user’s response may produce changes in holographic or haptic projections by the device. This may also be utilized to reduce stimuli for users who may be sensitive to overstimulation.
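A minimal sketch of such an adjustment, assuming a single hypothetical biometric signal (heart rate) and illustrative parameter names:

```python
def adjust_stimulus(base_intensity, heart_rate_bpm, resting_bpm=65.0, sensitivity=0.5):
    """Attenuate haptic intensity as ancillary-sensor data indicates rising arousal.

    Single-signal illustration only: if heart rate climbs above the user's
    resting rate, the stimulus is scaled down to reduce overstimulation.
    """
    elevation = max(heart_rate_bpm - resting_bpm, 0.0) / resting_bpm
    return base_intensity / (1.0 + sensitivity * elevation)

# Example: unchanged at the resting rate, reduced to ~0.67 at double the resting rate.
print(adjust_stimulus(1.0, 65.0))   # 1.0
print(adjust_stimulus(1.0, 130.0))  # ~0.667
```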
  • the system provides spatial audio for static or dynamic exhibits (e.g., for entertainments, sports events, music events, museum exhibits, art exhibits, theme parks, etc.).
  • the haptic array device provides a controller for interacting with a virtual reality or environment. In some embodiments, the haptic array device provides a controller for interacting with virtual items. In some embodiments, haptic feedback is provided as the user manipulates virtual items using the haptic array device as a controller. Haptic feedback may include sonic, laser, and thermal feedback, as disclosed herein. In some embodiments, the haptic feedback simulates how the virtual item would respond as it is being manipulated, as if it were a real, physical object.
  • a mobile phone may be simulated in a virtual world.
  • sonic haptic feedback may be provided to simulate the weight of the virtual phone as it is manipulated by the user.
  • haptic feedback from the haptic array device may be utilized to create pressure against a user’s fingertips, as if they were interacting with the screen of the phone.
  • Similar haptic feedback may be provided to a variety of virtual items, such as doors, levers, knobs, living creatures simulated in virtual reality, etc.
  • a computing system which provides the virtual environment sends instructions to the haptic array device to coordinate the interaction of virtual items with the haptic feedback provided by the haptic array device.
  • the user’s movements are sent to the computing systems as inputs to manipulate the items in virtual reality, such that the graphics of a virtual environment are manipulated according to the user’s movements detected by the haptic array device.
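One way to picture this exchange is a single tracking-to-feedback cycle, sketched below with hypothetical interfaces (tracker, game_engine, and haptic_array and their methods are illustrative, not defined by the disclosure):

```python
# Illustrative only: 'tracker', 'game_engine', and 'haptic_array' stand in for
# device and engine interfaces that the disclosure does not define at this level.
def interaction_step(tracker, game_engine, haptic_array):
    """One cycle of the tracking -> virtual-input -> haptic-feedback loop."""
    hand_pose = tracker.read_hand_pose()   # palm/fingertip positions from the monitoring system
    game_engine.apply_input(hand_pose)     # manipulate the virtual item and update its graphics
    cue = game_engine.get_haptic_cue()     # e.g., contact point and pressure for the grasped item
    if cue is not None:
        haptic_array.emit(focus=cue["position"], intensity=cue["pressure"])
```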
  • the haptic array device provides a virtual console or controller for training purposes.
  • the device may provide a virtual airplane console as part of a virtual simulator.
  • Such virtual consoles may be used for training or entertainment purposes.
  • Virtual consoles may be utilized in additional applications such as driver training, heavy machinery training, etc.
  • a haptic array device produces interactive augmented reality items.
  • augmented reality produces a visualization within proximity of the haptic array device.
  • augmented reality glasses may produce a visualization located proximal to the haptic array device, such that the haptic array device can provide interactive feedback as the user manipulates the virtual item in augmented reality.
  • the haptic array device produces the visualization to provide an augmented reality.
  • the haptic array device provides interactive feedback as the user manipulates or interacts with the visualization.
  • the device provides a heads-up display for a user. The heads-up display may provide supplementary information as the user interacts with a virtual environment.
  • interactive augmented reality items supplement a virtual environment.
  • the haptic array device may produce a visualization of a map of a virtual environment.
  • a user may interact with the augmented reality item to manipulate the item.
  • a user may manipulate a visualization of a map of a virtual environment to adjust the view or zoom in or out to locations on the map.
  • Further examples may include simulated locks, puzzles, or other items to supplement a virtual environment.
  • the haptic array device provides a visualization of a game or a portion thereof.
  • the visualization may comprise one or more components of a game.
  • haptic feedback is provided as the user interacts with the visualization.
  • Haptic feedback may include sonic, laser, and thermal feedback, as disclosed herein.
  • the haptic feedback simulates how the visualization would respond as it is being manipulated, as if it were a real, physical object.
  • a visualization may simulate a chess board and chess pieces.
  • a user may interact with the visualization to pick and move their chess pieces.
  • sonic haptic feedback may be provided to produce tactile stimulation as the chess piece holograms are manipulated by the user.
  • the visualization rendered by the haptic array device may project the chess piece moving as the player interacts with the hologram and places it in a desired location.
  • Similar haptic feedback and rendering of three-dimensional visualizations may be provided to a variety of game pieces, game boards, and puzzles. Such arrangements may facilitate playing of games by those who are unfamiliar with controllers and other computer inputs. For example, an older chess player may find a tactile, interactive visualization to be more intuitive than using a mouse to move chess pieces, preventing errors in the placement of their chess pieces.
  • a computing system which provides the virtual environment sends instructions to the haptic array device to coordinate the interaction of virtual items with the haptic feedback provided by the haptic array device. This may be useful in cases wherein only one player has access to a haptic array device and the players are competing from remote locations.
  • the haptic array device produces visualizations of products.
  • a rendering of a product is provided by the haptic array device.
  • the rendering of the product may be provided by a manufacturer and downloaded or loaded to the haptic array device via a product website.
  • a visualization of a product may be useful for purposes of testing or demonstration, wherein a user wants to examine characteristics of a product without having access to a physical version of the product. Such an embodiment may be useful if a user is a consumer who is interested in purchasing a product.
  • the visualization may comprise a scaled or actual size model of the product.
  • the visualization is an interactive visualization.
  • a user interaction with the visualization allows for simulated manipulation.
  • a user may interact with the visualization to rotate it or zoom in on aspects of the visualization.
  • a user may interact with simulated components of the product, such as buttons, switches, knobs, etc.
  • haptic feedback is provided to the user such that tactile stimulation simulates interaction with the product.
  • a user’s interaction with the product manipulates the visualization of the product. For example, a user may be interested in purchasing a new vehicle.
  • the system may allow a scaled rendering of the vehicle to be displayed to a user by the haptic array device.
  • the model is manipulated accordingly. For example, a user manipulating a model of a car could interact with the model to open the doors or hood of the model car visualization.
  • a haptic array device projects interactive three-dimensional images as digital art pieces.
  • a digital art piece is provided as a hologram or three-dimensional rendering.
  • a user is able to interact with the hologram to manipulate (e.g., rotate, move, zoom in on, etc.) the hologram.
  • haptic feedback is provided to the user as they manipulate the hologram, as disclosed herein.
  • Haptic feedback may comprise sonic haptics, laser haptics, thermal haptics, or a combination thereof.
  • Digital art pieces may comprise additional characteristics, such as projected audio, changing colors, etc.
  • a digital art piece comprises a hologram with associated haptic feedback provided during interaction with the hologram.
  • the associated haptic feedback corresponds to the physical attributes of the hologram.
  • interaction with a hologram comprising ribs or texturing may cause haptic feedback provided to a user to simulate a tactile feeling of the ribs or texturing to the user.
  • a digital art piece comprises a hologram or three-dimensional image and haptic feedback which seemingly does not correspond with the physical appearance of the hologram.
  • a digital art piece comprising a hologram and associated haptic feedback is connected to a block-chain authentication system.
  • the digital art piece comprises a non-fungible token (NFT) to authenticate the digital art piece and provide a record of ownership.
  • a digital art piece associated with an NFT allows for limited release of copies of the digital art piece.
  • Digital art pieces may be downloaded, bought, sold, or traded on online marketplaces. Digital art pieces may be downloaded to a coupled computing device, external to the haptic array device, and then displayed and/or interacted with using the haptic array device.
  • a range should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range.
  • description of a range such as from 1 to 6 should be considered to have specifically disclosed subranges such as from 1 to 3, from 1 to 4, from 1 to 5, from 2 to 4, from 2 to 6, from 3 to 6 etc., as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.
  • the term “rehabilitation” is used herein to refer to methods of recovery from, or prevention of, injuries, pain, and/or medical procedures. Methods may be guided or automated using the systems and devices disclosed herein.
  • the terms “acoustic,” “sound,” or “sonic” are often used interchangeably herein to refer to mechanical pressure waves. Unless specified, the terms “acoustic” and “sonic” should broadly read on waveforms ranging through all sonic frequency ranges, including audible, inaudible, and ultrasonic frequencies.
  • the term “about” a number refers to that number plus or minus 10% of that number.
  • the term “about” a range refers to that range minus 10% of its lowest value and plus 10% of its greatest value.
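As a worked illustration of these two definitions (the helper names below are illustrative, not part of the disclosure), “about 50” spans 45 to 55, and “about 1 to 6” spans 0.9 to 6.6:

```python
def about_number(x):
    """'About' a number: that number plus or minus 10% of that number."""
    return x - 0.1 * x, x + 0.1 * x

def about_range(low, high):
    """'About' a range: the range minus 10% of its lowest value, plus 10% of its greatest value."""
    return low - 0.1 * low, high + 0.1 * high

print(about_number(50))   # (45.0, 55.0)
print(about_range(1, 6))  # (0.9, 6.6)
```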
  • the haptic array device 500 comprises an array of transducers 550 for producing sonic haptics, as described.
  • the array 550 is an ultrasonic transducer array, as disclosed herein.
  • the haptic array device 500 further comprises laser systems 511, 512, 513.
  • the haptic array device 500 further comprises an integrated monitoring system 520.
  • the haptic array device 500 is configured to provide haptic feedback or sensations to an object or focal point 505.
  • the object 505 is a portion of a user, such as a hand.
  • the laser systems may be configured to produce haptics, three-dimensional visualizations (i.e., holograms), or both.
  • a hologram is produced by two of the laser systems functioning as optical emitters and using constructive interference to produce a 3D rendering.
  • a third laser system produces haptic feedback while the other two laser systems produce the hologram.
  • laser systems 511 and 512 may produce a hologram while laser system 513 provides haptic feedback to a target area 505.
  • monitoring system 520 comprises one or more sensors for monitoring an object or an object’s response to the provided haptics, as disclosed herein.
  • the one or more sensors of the monitoring system may comprise optical sensors (e.g., cameras), thermal sensors, audio sensors, vibrational/mechanical sensors, and the like which are directed toward a target 505.
  • the monitoring system is coupled to a computer system which identifies and tracks the target 505 and/or portions thereof, as disclosed herein.
  • While the haptic array device 500 depicted in FIG. 5 is shown with fully integrated components, it should be appreciated that the components may not be integrated or may be separate from the device. Further, it should be appreciated that the device may be supplemented with further components (such as additional ultrasound transducer arrays) or additional haptic array devices of the same or a similar type.
  • the system can be used as an entertainment performance enhancer for shows in different venues including stage or large outdoor events.
  • the system can mitigate the degradation associated with the natural or ambient noise of performances.
  • the system can mitigate excess noise from an audience or crowd.
  • the system can be used as a distributed network that provides several uses.
  • the system 700 comprises a monitoring device 710 and an ancillary device 720.
  • the monitoring device 710 comprises a display 711, a haptic array 712, a camera 713, and a non-transitory computer-readable storage media.
  • the monitoring device 710 comprises the display 711, the haptic array 712, a time-of-flight sensor 714, and the non-transitory computer-readable storage media.
  • the monitoring device 710 comprises the display 711, the haptic array 712, the camera 713, the time-of-flight sensor 714, and the non-transitory computer-readable storage media.
  • the monitoring device 710 further comprises a speaker, a joystick, a trackpad, a trackball, or any combination thereof.
  • the display 711 is configured to show a display image.
  • FIG. 6B shows an exemplary display image of the user’s hand manipulating a virtual cube. In some embodiments, this display image is shown while the user experiences a sensation of manipulating the virtual cube by pressure waves emitted from the haptic array 712.
  • FIG. 7A shows an exemplary display image of the user’s hand throwing a ball.
  • FIG. 7B shows an exemplary display image of the user’s hand manipulating one of three displayed balls.
  • the ancillary device 720 comprises a biometric sensor.
  • the biometric sensor is configured to measure a biometric data.
  • the ancillary device 720 is configured to couple to an appendage of a user.
  • the biometric sensor comprises an inertial motion unit, a photoplethysmography sensor, a photoacoustic sensor, an ultrasound sensor, a glucose sensor, a bioimpedance sensor, an electrodermal activity sensor, a temperature sensor, a vision shadow capture sensor, an altimeter, a barometer, a humidity sensor, a sweat rate sensor, a hydration sensor, a bioacoustics sensor, a dynamometer, an electrodermal sensor, or any combination thereof.
  • the ancillary device 720 further comprises a speaker, a joystick, a trackpad, a trackball, or any combination thereof.
  • the monitoring device 710 or the ancillary device 720 comprise the non-transitory computer-readable storage media.
  • the ancillary device 720 further comprises an ancillary communication device and wherein the monitoring device 710 further comprises a monitoring communication device communicably coupled to the ancillary device 720.
  • the ancillary communication device and the monitoring communication device are wireless communication devices.
  • the camera 713 is configured to capture a plurality of pose images of the user. In some embodiments, the plurality of pose images of the user form a video of the motion of the user. In some embodiments, the camera 713 comprises a two-dimensional camera 713. In some embodiments, the camera 713 comprises a three-dimensional camera 713. In some embodiments, the camera 713 is an infrared camera 713, a near infrared camera 713, a visible light camera 713, an ultra-violet spectrum camera 713, a thermographic camera 713, or any combination thereof.
  • the camera 713, the time-of-flight sensor 714, or both capture data at a rate of about 10 Hz to 10,000 Hz.
  • the monitoring device 710 comprises two or more cameras 713, two or more time-of-flight sensors 714, or both. In one embodiment the two or more cameras 713, the two or more time-of-flight sensors 714, or both are arrayed to capture the user from two or more directions. In one embodiment the two or more cameras 713, the two or more time-of-flight sensors 714, or both are arrayed about the haptic array 712.
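A minimal sketch of polling such sensors at a configurable rate within the disclosed 10 Hz to 10,000 Hz range; the sensor objects and their read() method are hypothetical stand-ins for device drivers:

```python
import time

def capture_frames(sensors, rate_hz=120.0, duration_s=1.0):
    """Poll each monitoring sensor (camera, time-of-flight, ...) at a fixed rate.

    rate_hz may fall anywhere in the disclosed 10 Hz to 10,000 Hz range; the
    sensor objects and their read() method are hypothetical driver stand-ins.
    """
    frames = []
    period = 1.0 / rate_hz
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        frames.append([sensor.read() for sensor in sensors])  # one sample per sensor per tick
        time.sleep(period)
    return frames
```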
  • the haptic array 712 comprises a plurality of ultrasonic devices.
  • the haptic array 712 is a planar array.
  • the haptic array 712 is a non-planar array.
  • the plurality of ultrasonic devices comprise an ultrasonic speaker, a piezoelectric speaker, an electromagnetic speaker, or any combination thereof. In some embodiments, at least a portion of the plurality of ultrasonic devices have a frequency of less than about 20 MHz.
V. Computing Systems
  • Referring to FIG. 1, a block diagram is shown depicting an exemplary machine that includes a computer system 100 (e.g., a processing or computing system) within which a set of instructions can execute for causing a device to perform or execute any one or more of the aspects and/or methodologies for static code scheduling of the present disclosure.
  • the components in FIG. 1 are examples only and do not limit the scope of use or functionality of any hardware, software, embedded logic component, or a combination of two or more such components implementing particular embodiments.
  • Computer system 100 may include one or more processors 101, a memory 103, and a storage 108 that communicate with each other, and with other components, via a bus 140.
  • the bus 140 may also link a display 132, one or more input devices 133 (which may, for example, include a keypad, a keyboard, a mouse, a stylus, etc.), one or more output devices 134, one or more storage devices 135, and various tangible storage media 136. All of these elements may interface directly or via one or more interfaces or adaptors to the bus 140.
  • the various tangible storage media 136 can interface with the bus 140 via storage medium interface 126.
  • Computer system 100 may have any suitable physical form, including but not limited to one or more integrated circuits (ICs), printed circuit boards (PCBs), mobile handheld devices (such as mobile telephones or PDAs), laptop or notebook computers, distributed computer systems, computing grids, or servers.
  • Computer system 100 includes one or more processor(s) 101 (e.g., central processing units (CPUs), general purpose graphics processing units (GPGPUs), or quantum processing units (QPUs)) that carry out functions.
  • processor(s) 101 optionally contains a cache memory unit 102 for temporary local storage of instructions, data, or computer addresses.
  • Processor(s) 101 are configured to assist in execution of computer readable instructions.
  • Computer system 100 may provide functionality for the components depicted in FIG. 1 as a result of the processor(s) 101 executing non-transitory, processor-executable instructions embodied in one or more tangible computer-readable storage media, such as memory 103, storage 108, storage devices 135, and/or storage medium 136.
  • the computer-readable media may store software that implements particular embodiments, and processor(s) 101 may execute the software.
  • Memory 103 may read the software from one or more other computer-readable media (such as mass storage device(s) 135, 136) or from one or more other sources through a suitable interface, such as network interface 120.
  • the software may cause processor(s) 101 to carry out one or more processes or one or more steps of one or more processes described or illustrated herein. Carrying out such processes or steps may include defining data structures stored in memory 103 and modifying the data structures as directed by the software.
  • the memory 103 may include various components (e.g., machine readable media) including, but not limited to, a random access memory component (e.g., RAM 104) (e.g., static RAM (SRAM), dynamic RAM (DRAM), ferroelectric random access memory (FRAM), phase-change random access memory (PRAM), etc.), a read-only memory component (e.g., ROM 105), and any combinations thereof.
  • ROM 105 may act to communicate data and instructions unidirectionally to processor(s) 101
  • RAM 104 may act to communicate data and instructions bidirectionally with processor(s) 101.
  • ROM 105 and RAM 104 may include any suitable tangible computer-readable media described below.
  • a basic input/output system 106 (BIOS) including basic routines that help to transfer information between elements within computer system 100, such as during start-up, may be stored in the memory 103.
  • Fixed storage 108 is connected bidirectionally to processor(s) 101, optionally through storage control unit 107.
  • Fixed storage 108 provides additional data storage capacity and may also include any suitable tangible computer-readable media described herein.
  • Storage 108 may be used to store operating system 109, executable(s) 110, data 111, applications 112 (application programs), and the like.
  • Storage 108 can also include an optical disk drive, a solid-state memory device (e.g., flash-based systems), or a combination of any of the above.
  • Information in storage 108 may, in appropriate cases, be incorporated as virtual memory in memory 103.
  • storage device(s) 135 may be removably interfaced with computer system 100 (e.g., via an external port connector (not shown)) via a storage device interface 125.
  • storage device(s) 135 and an associated machine-readable medium may provide non-volatile and/or volatile storage of machine-readable instructions, data structures, program modules, and/or other data for the computer system 100.
  • software may reside, completely or partially, within a machine-readable medium on storage device(s) 135.
  • software may reside, completely or partially, within processor(s) 101.
  • Bus 140 connects a wide variety of subsystems.
  • reference to a bus may encompass one or more digital signal lines serving a common function, where appropriate.
  • Bus 140 may be any of several types of bus structures including, but not limited to, a memory bus, a memory controller, a peripheral bus, a local bus, and any combinations thereof, using any of a variety of bus architectures.
  • such architectures include an Industry Standard Architecture (ISA) bus, an Enhanced ISA (EISA) bus, a Micro Channel Architecture (MCA) bus, a Video Electronics Standards Association local bus (VLB), a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCI-X) bus, an Accelerated Graphics Port (AGP) bus, HyperTransport (HTX) bus, serial advanced technology attachment (SATA) bus, and any combinations thereof.
  • Computer system 100 may also include an input device 133.
  • a user of computer system 100 may enter commands and/or other information into computer system 100 via input device(s) 133.
  • Examples of an input device(s) 133 include, but are not limited to, an alpha-numeric input device (e.g., a keyboard), a pointing device (e.g., a mouse or touchpad), a touchpad, a touch screen, a multi-touch screen, a joystick, a stylus, a gamepad, an audio input device (e.g., a microphone, a voice response system, etc.), an optical scanner, a video or still image capture device (e.g., a camera), and any combinations thereof.
  • the input device is a Kinect, Leap Motion, or the like.
  • Input device(s) 133 may be interfaced to bus 140 via any of a variety of input interfaces 123 (e.g., input interface 123) including, but not limited to, serial, parallel, game port, USB, FIREWIRE, THUNDERBOLT, or any combination of the above.
  • computer system 100 when computer system 100 is connected to network 130, computer system 100 may communicate with other devices, specifically mobile devices and enterprise systems, distributed computing systems, cloud storage systems, cloud computing systems, and the like, connected to network 130. Communications to and from computer system 100 may be sent through network interface 120.
  • network interface 120 may receive incoming communications (such as requests or responses from other devices) in the form of one or more packets (such as Internet Protocol (IP) packets) from network 130, and computer system 100 may store the incoming communications in memory 103 for processing.
  • Computer system 100 may similarly store outgoing communications (such as requests or responses to other devices) in the form of one or more packets in memory 103, which may be communicated to network 130 from network interface 120.
  • Processor(s) 101 may access these communication packets stored in memory 103 for processing.
  • Examples of the network interface 120 include, but are not limited to, a network interface card, a modem, and any combination thereof.
  • Examples of a network 130 or network segment 130 include, but are not limited to, a distributed computing system, a cloud computing system, a wide area network (WAN) (e.g., the Internet, an enterprise network), a local area network (LAN) (e.g., a network associated with an office, a building, a campus or other relatively small geographic space), a telephone network, a direct connection between two computing devices, a peer-to-peer network, and any combinations thereof.
  • a network, such as network 130 may employ a wired and/or a wireless mode of communication. In general, any network topology may be used.
  • Information and data can be displayed through a display 132.
  • Examples of a display 132 include, but are not limited to, a cathode ray tube (CRT), a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display such as a passive-matrix OLED (PMOLED) or active-matrix OLED (AMOLED) display, a plasma display, and any combinations thereof.
  • the display 132 can interface to the processor(s) 101, memory 103, and fixed storage 108, as well as other devices, such as input device(s) 133, via the bus 140.
  • the display 132 is linked to the bus 140 via a video interface 122, and transport of data between the display 132 and the bus 140 can be controlled via the graphics control 121.
  • the display is a video projector.
  • the display is a head-mounted display (HMD) such as a VR headset.
  • suitable VR headsets include, by way of non-limiting examples, HTC Vive, Oculus Rift, Samsung Gear VR, Microsoft HoloLens, Razer OSVR, FOVE VR, Zeiss VR One, Avegant Glyph, Freefly VR headset, and the like.
  • the display is a combination of devices such as those disclosed herein.
  • computer system 100 may include one or more other peripheral output devices 134 including, but not limited to, an audio speaker, a printer, a storage device, and any combinations thereof.
  • peripheral output devices may be connected to the bus 140 via an output interface 124.
  • Examples of an output interface 124 include, but are not limited to, a serial port, a parallel connection, a USB port, a FIREWIRE port, a THUNDERBOLT port, and any combinations thereof.
  • computer system 100 may provide functionality as a result of logic hardwired or otherwise embodied in a circuit, which may operate in place of or together with software to execute one or more processes or one or more steps of one or more processes described or illustrated herein.
  • Reference to software in this disclosure may encompass logic, and reference to logic may encompass software.
  • reference to a computer-readable medium may encompass a circuit (such as an IC) storing software for execution, a circuit embodying logic for execution, or both, where appropriate.
  • the present disclosure encompasses any suitable combination of hardware, software, or both.
  • a digital signal processor (DSP), an application specific integrated circuit (ASIC), or a field programmable gate array (FPGA) may be used to implement or perform the functions described herein.
  • a general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
  • a processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
  • a software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
  • An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium.
  • the storage medium may be integral to the processor.
  • the processor and the storage medium may reside in an ASIC.
  • the ASIC may reside in a user terminal.
  • the processor and the storage medium may reside as discrete components in a user terminal.
  • suitable computing devices include, by way of non-limiting examples, server computers, desktop computers, laptop computers, notebook computers, sub-notebook computers, netbook computers, netpad computers, set-top computers, media streaming devices, handheld computers, Internet appliances, mobile smartphones, tablet computers, personal digital assistants, video game consoles, and vehicles.
  • Suitable tablet computers include those with booklet, slate, and convertible configurations, known to those of skill in the art.
  • the computing device includes an operating system configured to perform executable instructions.
  • the operating system is, for example, software, including programs and data, which manages the device’s hardware and provides services for execution of applications.
  • server operating systems include, by way of non-limiting examples, FreeBSD, OpenBSD, NetBSD®, Linux, Apple® Mac OS X Server®, Oracle® Solaris®, Windows Server®, and Novell® NetWare®.
  • suitable personal computer operating systems include, by way of non-limiting examples, Microsoft® Windows®, Apple® Mac OS X®, UNIX®, and UNIX-like operating systems such as GNU/Linux®.
  • the operating system is provided by cloud computing.
  • suitable mobile smartphone operating systems include, by way of non-limiting examples, Nokia® Symbian® OS, Apple® iOS®, Research In Motion® BlackBerry OS®, Google® Android®, Microsoft® Windows Phone® OS, Microsoft® Windows Mobile® OS, Linux®, and Palm® WebOS®.
  • suitable media streaming device operating systems include, by way of non-limiting examples, Apple TV®, Roku®, Boxee®, Google TV®, Google Chromecast®, Amazon Fire®, and Samsung® HomeSync®.
  • video game console operating systems include, by way of non-limiting examples, Sony® PS3®, Sony® PS4®, Microsoft® Xbox 360®, Microsoft Xbox One, Nintendo® Wii®, Nintendo® Wii U®, and Ouya®.
  • the platforms, systems, media, and methods disclosed herein include one or more non-transitory computer readable storage media encoded with a program including instructions executable by the operating system of an optionally networked computing device.
  • a computer readable storage medium is a tangible component of a computing device.
  • a computer readable storage medium is optionally removable from a computing device.
  • a computer readable storage medium includes, by way of non-limiting examples, CD-ROMs, DVDs, flash memory devices, solid state memory, magnetic disk drives, magnetic tape drives, optical disk drives, distributed computing systems including cloud computing systems and services, and the like.
  • the program and instructions are permanently, substantially permanently, semi-permanently, or non-transitorily encoded on the media.
  • the platforms, systems, media, and methods disclosed herein include at least one computer program, or use of the same.
  • a computer program includes a sequence of instructions, executable by one or more processor(s) of the computing device’s CPU, written to perform a specified task.
  • Computer readable instructions may be implemented as program modules, such as functions, objects, Application Programming Interfaces (APIs), computing data structures, and the like, which perform particular tasks or implement particular abstract data types.
  • a computer program comprises one sequence of instructions. In some embodiments, a computer program comprises a plurality of sequences of instructions. In some embodiments, a computer program is provided from one location. In other embodiments, a computer program is provided from a plurality of locations. In various embodiments, a computer program includes one or more software modules. In various embodiments, a computer program includes, in part or in whole, one or more web applications, one or more mobile applications, one or more standalone applications, one or more web browser plug-ins, extensions, add-ins, or add-ons, or combinations thereof.
  • a computer program includes a web application.
  • a web application in various embodiments, utilizes one or more software frameworks and one or more database systems.
  • a web application is created upon a software framework such as Microsoft® .NET or Ruby on Rails (RoR).
  • a web application utilizes one or more database systems including, by way of non-limiting examples, relational, nonrelational, object oriented, associative, XML, and document oriented database systems.
  • suitable relational database systems include, by way of non-limiting examples, Microsoft® SQL Server, mySQLTM, and Oracle®.
  • a web application in various embodiments, is written in one or more versions of one or more languages.
  • a web application may be written in one or more markup languages, presentation definition languages, client-side scripting languages, server-side coding languages, database query languages, or combinations thereof.
  • a web application is written to some extent in a markup language such as Hypertext Markup Language (HTML), Extensible Hypertext Markup Language (XHTML), or extensible Markup Language (XML).
  • a web application is written to some extent in a presentation definition language such as Cascading Style Sheets (CSS).
  • a web application is written to some extent in a client-side scripting language such as Asynchronous JavaScript and XML (AJAX), Flash® ActionScript, JavaScript, or Silverlight®.
  • a web application is written to some extent in a server-side coding language such as Active Server Pages (ASP), ColdFusion®, Perl, JavaTM, JavaServer Pages (JSP), Hypertext Preprocessor (PHP), PythonTM, Ruby, Tcl, Smalltalk, WebDNA®, or Groovy.
  • a web application is written to some extent in a database query language such as Structured Query Language (SQL).
  • a web application integrates enterprise server products such as IBM® Lotus Domino®.
  • a web application includes a media player element.
  • a media player element utilizes one or more of many suitable multimedia technologies including, by way of non-limiting examples, Adobe® Flash®, HTML 5, Apple® QuickTime®, Microsoft® Silverlight®, JavaTM, and Unity®.
  • an application provision system comprises one or more databases 200 accessed by a relational database management system (RDBMS) 210.
  • RDBMSs include Firebird, MySQL, PostgreSQL, SQLite, Oracle Database, Microsoft SQL Server, IBM DB2, IBM Informix, SAP Sybase, Teradata, and the like.
  • the application provision system further comprises one or more application servers 220 (such as Java servers, .NET servers, PHP servers, and the like) and one or more web servers 230 (such as Apache, IIS, GWS and the like).
  • the web server(s) optionally expose one or more web services via application programming interfaces (APIs) 240.
  • an application provision system alternatively has a distributed, cloud-based architecture 300 and comprises elastically load balanced, auto-scaling web server resources 310 and application server resources 320 as well as synchronously replicated databases 330.
  • a computer program includes a mobile application provided to a mobile computing device.
  • the mobile application is provided to a mobile computing device at the time it is manufactured.
  • the mobile application is provided to a mobile computing device via the computer network described herein.
  • a mobile application is created by techniques known to those of skill in the art using hardware, languages, and development environments known to the art. Those of skill in the art will recognize that mobile applications are written in several languages. Suitable programming languages include, by way of non-limiting examples, C, C++, C#, Objective-C, JavaTM, JavaScript, Pascal, Object Pascal, PythonTM, Ruby, VB.NET, WML, and XHTML/HTML with or without CSS, or combinations thereof.
  • Suitable mobile application development environments are available from several sources. Commercially available development environments include, by way of non-limiting examples, AirplaySDK, alcheMo, Appcelerator®, Celsius, Bedrock, Flash Lite, .NET Compact Framework, Rhomobile, and WorkLight Mobile Platform. Other development environments are available without cost including, by way of non-limiting examples, Lazarus, MobiFlex, MoSync, and Phonegap. Also, mobile device manufacturers distribute software developer kits including, by way of non-limiting examples, iPhone and iPad (iOS) SDK, AndroidTM SDK, BlackBerry® SDK, BREW SDK, Palm® OS SDK, Symbian SDK, webOS SDK, and Windows® Mobile SDK.
  • a computer program includes a standalone application, which is a program that is run as an independent computer process, not an add-on to an existing process, e.g., not a plug-in.
  • standalone applications are often compiled.
  • a compiler is a computer program(s) that transforms source code written in a programming language into binary object code such as assembly language or machine code. Suitable compiled programming languages include, by way of non-limiting examples, C, C++, Objective-C, COBOL, Delphi, Eiffel, JavaTM, Lisp, PythonTM, Visual Basic, and VB.NET, or combinations thereof. Compilation is often performed, at least in part, to create an executable program.
  • a computer program includes one or more executable compiled applications.
  • the computer program includes a web browser plug-in (e.g., extension, etc.).
  • a plug-in is one or more software components that add specific functionality to a larger software application.
  • Makers of software applications support plugins to enable third-party developers to create abilities which extend an application, to support easily adding new features, and to reduce the size of an application.
  • plugins enable customizing the functionality of a software application.
  • plug-ins are commonly used in web browsers to play video, generate interactivity, scan for viruses, and display particular file types.
  • the toolbar comprises one or more web browser extensions, add-ins, or add-ons.
  • the toolbar comprises one or more explorer bars, tool bands, or desk bands.
  • plug-in frameworks are available that enable development of plug-ins in various programming languages, including, by way of non-limiting examples, C++, Delphi, JavaTM, PHP, PythonTM, and VB.NET, or combinations thereof.
  • Web browsers are software applications, designed for use with network-connected computing devices, for retrieving, presenting, and traversing information resources on the World Wide Web. Suitable web browsers include, by way of non-limiting examples, Microsoft® Internet Explorer®, Mozilla® Firefox®, Google® Chrome, Apple® Safari®, Opera Software® Opera®, and KDE Konqueror. In some embodiments, the web browser is a mobile web browser. Mobile web browsers (also called microbrowsers, mini-browsers, and wireless browsers) are designed for use on mobile computing devices including, by way of non-limiting examples, handheld computers, tablet computers, netbook computers, subnotebook computers, smartphones, music players, personal digital assistants (PDAs), and handheld video game systems.
  • Suitable mobile web browsers include, by way of non-limiting examples, Google® Android® browser, RIM BlackBerry® Browser, Apple® Safari®, Palm® Blazer, Palm® WebOS® Browser, Mozilla® Firefox® for mobile, Microsoft® Internet Explorer® Mobile, Amazon® Kindle® Basic Web, Nokia® Browser, Opera Software® Opera® Mobile, and Sony® PSPTM browser.
  • the platforms, systems, media, and methods disclosed herein include software, server, and/or database modules, or use of the same.
  • software modules are created by techniques known to those of skill in the art using machines, software, and languages known to the art.
  • the software modules disclosed herein are implemented in a multitude of ways.
  • a software module comprises a file, a section of code, a programming object, a programming structure, or combinations thereof.
  • a software module comprises a plurality of files, a plurality of sections of code, a plurality of programming objects, a plurality of programming structures, or combinations thereof.
  • the one or more software modules comprise, by way of non-limiting examples, a web application, a mobile application, and a standalone application.
  • software modules are in one computer program or application. In other embodiments, software modules are in more than one computer program or application. In some embodiments, software modules are hosted on one machine. In other embodiments, software modules are hosted on more than one machine. In further embodiments, software modules are hosted on a distributed computing platform such as a cloud computing platform. In some embodiments, software modules are hosted on one or more machines in one location. In other embodiments, software modules are hosted on one or more machines in more than one location.
  • the platforms, systems, media, and methods disclosed herein include one or more databases, or use of the same.
  • suitable databases include, by way of non-limiting examples, relational databases, non-relational databases, object oriented databases, object databases, entity-relationship model databases, associative databases, XML databases, and document oriented databases. Further non-limiting examples include SQL, PostgreSQL, MySQL, Oracle, DB2, Sybase, and MongoDB.
  • a database is Internet-based.
  • a database is web-based.
  • a database is cloud computing-based.
  • a database is a distributed database.
  • a database is based on one or more local computer storage devices.


Abstract

Described are systems, devices, and methods for providing interactive and tactile projections of sound and light for virtual environments.

Description

VIRTUAL AND AUGMENTED INTERACTIVE AND TACTILE
PROJECTIONS OF SOUND AND LIGHT
CROSS-REFERENCE
[0001] This application claims the benefit of U.S. Provisional Application No. 63/379,383, filed October 13, 2022, which is hereby incorporated by reference in its entirety herein.
BACKGROUND
[0002] Virtual and augmented realities appear to be the future of gaming. Augmented reality (AR) and virtual reality (VR) aim to create a more immersive environment than is possible with flatscreen projections alone. However, the current systems lack interactive feedback with the virtual environments, and typically incorporate vibration as the only mode of haptic feedback. Provided herein are systems and methods for providing interactive and tactile projections of light and sound for enhancement of virtual and augmented realities.
SUMMARY
[0003] Provided herein are devices, systems, and methods for emission of electromagnetic waves and/or mechanical waves. In some embodiments, devices and systems incorporate mechanical elements to provide non-laser focused mechanical pressure waves in the human audible spectrum (i.e., about 20 Hz to 20 kHz) and/or human non-audible spectrum. Mechanical elements may include parametric speaker arrays such as ultrasonic speaker arrays, piezo speakers, or electromagnetic speakers, and the like. In some embodiments, beam forming and/or beam shaping methods are utilized to focus, direct, or otherwise manipulate waves propagated from the systems and devices disclosed herein.
[0004] In some embodiments, the devices and systems incorporate optical elements to provide laser focused mechanical pressure waves in the human audible spectrum and/or human non-audible spectrum. Optical elements may also be utilized to provide optical signals in the infrared, near infrared, or visible light spectrum. Optical elements may include lasers, light emitting diodes, lenses, mirrors, or a combination thereof.
[0005] In some embodiments, devices and systems incorporate thermal elements to alter an ambient temperature. In some embodiments, thermal elements are utilized to lower an ambient temperature. In some embodiments, thermal elements are utilized to adjust an ambient temperature between about 0 °C and about 100 °C. In some embodiments, temperature sensors are incorporated to measure temperatures of surfaces or areas which may interact with the thermal elements. In some embodiments, temperature sensors allow for dynamic adjustment of the thermal elements, as disclosed herein.
[0006] In some embodiments, devices and systems include interferometric elements to measure mechanical pressure waves or optical waves. In some embodiments, interferometric elements are utilized for dynamic adjustment of optical elements, emission of electromagnetic waves, and/or emission of mechanical waves.
[0007] In some embodiments, devices and system include optical sensors. In some embodiments, optical sensors are utilized to dynamically measure mechanical waves, optical waves, and motion/position of objects (e.g., animate and inanimate objects such as people, cars, rocks, etc.). In some embodiments, an optical sensor is provided to capture images at a rate of 10Hz to 10,000Hz. Said captured images may be combined into a video format. In some embodiments, an optical sensor comprises a camera. In some embodiments, optical sensors include infrared, near infrared, visible light, ultra-violet spectrum sensors. In some embodiments, optical sensors comprise three-dimensional (3D) spectroscopic cameras capable of sensing in infrared (IR), near infrared, visible light, and/or ultra-violet spectrum. In some embodiments, systems utilize multiple stereo infrared (IR) imaging devices.
[0008] In some embodiments, systems and devices incorporate one or more computational elements (e.g., a microcontroller, application specific integrated circuit, single board computer, edge computing device, quantum computing device, etc.) to perform data processing and real-time data processing for dynamic output signal conditioning and adjustment based on desired output and measured signal inputs, as disclosed herein.
[0009] In some embodiments, systems include closed mesh network elements for selfrecognizing interact-ability with like devices to allow constructive or destructive distributed signal modification. In some embodiments, systems include open network elements (e.g., 3G, 4G, 5G, long range (LoRa), and the like) to enable connection to internet, intranet, distributed computing network (cloud computing). In some embodiments, systems include electrical elements to generate, consume, receive, and transmit power (e.g., solar panels, rechargeable battery, battery, wireless energy transmission / reception components, and the like) to provide power to the system and similar devices within a known proximity. In some embodiments, communication between devices utilizes free space optics communication and has the ability to adjust data transmission bandwidth based on power consumption restrictions.
[0010] According to some embodiments, provided herein is a system for haptic interactive gaming, the system comprising: a haptic array comprising a plurality of ultrasonic devices; a camera; a light source; a non-transitory computer-readable storage media encoded with instructions executable by at least one processor to cause the at least one processor to perform operations comprising: receiving a two-dimensional (2D) data, a three-dimensional (3D) data, or both from a gaming console; directing at least a portion of the plurality of ultrasonic devices in the haptic array to emit a first acoustic field based on the 2D data, the 3D data, or both; directing the light source to emit light based on the 2D data, the 3D data, or both; determining a user motion based on data received by the camera; and providing the motion data to the user console based on the user motion.
[0011] In some embodiments, the haptic array is a planar array. In some embodiments, the haptic array is a non-planar array. In some embodiments, the plurality ultrasonic devices comprise, an ultrasonic speaker, a piezoelectric speaker, an electromagnetic speaker, or any combination thereof. In some embodiments, at least a portion of the plurality ultrasonic devices have a frequency of less than about 20 MHz. In some embodiments, the camera comprises a two-dimensional camera. In some embodiments, the camera comprises a three- dimensional camera. In some embodiments, the camera is an infrared camera, a near infrared camera, a visible light camera, an ultra-violet spectrum camera, or any combination thereof. In some embodiments, the camera captures data at a rate of about 10 Hz to 10,000 Hz. In some embodiments, the user motion is a motion of an appendage of a user. In some embodiments, the light source comprises a laser, a light emitting diode, a light bulb, or any combination thereof. In some embodiments, the emitted light has a wavelength of about 10 nm to about 10,000 nm. In some embodiments, the emitted light has a frequency of about 0.3 THz to about 300 THz. In some embodiments, the system further comprises an interferometric device, wherein the operations further comprise calibrating the haptic array based on data received from the interferometric device. In some embodiments, the interferometric device comprises a laser doppler vibrometer, a laser interferometer, an acoustic interferometer, or any combination thereof. In some embodiments, the operations further comprise directing the thermal element to emit heat based on the 2D data, the 3D data, the user motion, or any combination thereof. In some embodiments, the system further comprises an energy storage device providing power to the haptic array, the camera, the non- transitory computer-readable storage media, or any combination thereof. In some embodiments, the energy storage device comprises a battery, a supercapacitor, or any combination thereof.
[0012] According to some embodiments, provided herein is a computer-implemented method for haptic interactive gaming, the method comprising: receiving, by a computer, a two- dimensional (2D) data, a three-dimensional (3D) data, or both from a gaming console; directing, by the computer, at least a portion of a plurality of ultrasonic devices in a haptic array to emit a first acoustic field based on the 2D data, the 3D data, or both; directing, by the computer, a light source to emit light based on the 2D data, the 3D data, or both; determining, by the computer, a user motion based on data received by a camera; and providing, by the computer, the motion data to the user console based on the user motion.
[0013] In some embodiments, the plurality of ultrasonic devices comprise an ultrasonic speaker, a piezoelectric speaker, an electromagnetic speaker, or any combination thereof. In some embodiments, the camera comprises a two-dimensional camera. In some embodiments, the camera comprises a three-dimensional camera. In some embodiments, the camera is an infrared camera, a near infrared camera, a visible light camera, an ultra-violet spectrum camera, or any combination thereof. In some embodiments, the data is received by the camera at a rate of about 10 Hz to 10,000 Hz. In some embodiments, the method further comprises calibrating, by the computer, the haptic array based on data received from an interferometric device. In some embodiments, the method further comprises directing, by the computer, a thermal element to emit heat based on the 2D data, the 3D data, the user motion, or any combination thereof.
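The summarized operations can be pictured as a single control-loop step. This is a minimal sketch only; every object and method name below is a hypothetical stand-in for console, haptic array, light source, and camera interfaces that the disclosure does not define at the code level:

```python
# A minimal sketch of the summarized operations; all names here are hypothetical
# stand-ins for hardware and software interfaces not defined in this disclosure.
def haptic_gaming_step(console, haptic_array, light_source, camera, estimate_user_motion):
    scene = console.receive_scene()          # receive 2D data, 3D data, or both from the gaming console
    haptic_array.emit_acoustic_field(scene)  # direct a portion of the ultrasonic devices to emit an acoustic field
    light_source.emit(scene)                 # direct the light source to emit light based on the same data
    frames = camera.capture()                # capture pose images of the user
    motion = estimate_user_motion(frames)    # determine the user motion from the camera data
    console.send_motion(motion)              # provide the motion data back to the console
```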
INCORPORATION BY REFERENCE
[0014] All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] The novel features of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings of which:
[0016] FIG. 1 shows a non-limiting example of a computing device; in this case, a device with one or more processors, memory, storage, and a network interface;
[0017] FIG. 2 shows a non-limiting example of a web/mobile application provision system; in this case, a system providing browser-based and/or native mobile user interfaces;
[0018] FIG. 3 shows a non-limiting example of a cloud-based web/mobile application provision system; in this case, a system comprising an elastically load balanced, auto-scaling web server and application server resources as well as synchronously replicated databases;
[0019] FIG. 4 shows a non-limiting example of a bio-haptic security system, per one or more embodiments herein;
[0020] FIG. 5 shows a non-limiting example of a haptic array device, per one or more embodiments herein;
[0021] FIG. 6A shows an image of an exemplary system for haptic interactive gaming, per one or more embodiments herein;
[0022] FIG. 6B shows an image of a user manipulating a virtual cube with an exemplary system for haptic interactive gaming, per one or more embodiments herein;
[0023] FIG. 7A shows an image of a user throwing a virtual ball with an exemplary system for haptic interactive gaming, per one or more embodiments herein;
[0024] FIG. 7B shows an image of a user manipulating one of three balls with an exemplary system for haptic interactive gaming, per one or more embodiments herein;
[0025] FIG. 8A shows a first image of an exemplary system for haptic interactive gaming, per one or more embodiments herein;
[0026] FIG. 8B shows a second image of an exemplary system for haptic interactive gaming, per one or more embodiments herein;
[0027] FIG. 9A shows a third image of an exemplary system for haptic interactive gaming, per one or more embodiments herein; and
[0028] FIG. 9B shows an image of an exemplary ancillary device, per one or more embodiments herein.

DETAILED DESCRIPTION
[0029] Provided herein are embodiments of a system for providing haptic feedback comprising a haptic array. In some embodiments, the haptic feedback system utilizes a combination of optic and acoustic fields simultaneously. In some embodiments, the generated optic and acoustic fields have no direct interference; however, combining them provides benefits such as multi-resolution haptic images and a synergistic effect on haptic perception. In some embodiments, the fields are applied simultaneously as elastic waves to stimulate nerve signals. In some embodiments, the optic field is utilized to simulate or produce a "skin feeling," or feeling of touch. In some embodiments, the acoustic field is utilized to apply pressure. Combining two fields of different physical quantities provides not only the superposition effect described above but also synergistic effects such as modification of the perceived sensation.
[0030] FIG. 4 shows a diagram of the components of a haptic array device, according to some embodiments. FIG. 5 depicts a haptic array device, according to some embodiments. In some embodiments, the system is parametric. In some embodiments, the non-linearity of the frequency response produced by multiple ultrasonic frequencies in air is modeled utilizing parametric equations. The parametric equations may be utilized in computer and/or machine learning systems to model the resulting effect.

[0031] In some embodiments, the system includes Field Programmable Gate Arrays (FPGAs), machine learning, autonomous control systems, fast networking, fast self-healing, interferometer sensors, ultrasonic speaker arrays, and the like. In some embodiments, the system utilizes laser interferometer technology to measure the response of an environment, one or more objects, or a combination thereof to dynamically change parameters and achieve desired effects. In some embodiments, a laser interferometer system sends out a two-beam laser to measure vibration of a surface. In some embodiments, the laser interferometer is used to receive vibration signals to calibrate the output of the ultrasonic transducer array to effectively beamform the audio waves to focus on one or more points on a subject or object.

[0032] In some embodiments, a parametric speaker array is a highly directive speaker that consists of an array of ultrasonic transducers that exploit the nonlinear properties of air to self-demodulate modulated ultrasonic signals with the aim of creating narrow, focused sound waves (audible and inaudible). In some embodiments, the ultrasonic transducers are piezoelectrically driven.

[0033] In some embodiments, the system utilizes one or more parametric speaker/transducer arrays. In some embodiments, each transducer array comprises multiple transducers. In some embodiments, the multiple transducers of each array output the same signal, which is amplified by constructive interference. In some embodiments, two or more arrays are configured to further amplify a signal via constructive interference. Further, a plurality of speaker arrays may be utilized to precisely direct sound or amplify sound at a precise location. Use of a parametric speaker array may replace the traditional approach of broadcasting audio with distributed and coherent beamforming functionality. This approach offers the capability of numerous smaller devices to output the same audio volume as a single large device. In contrast, current acoustic hailing or loudspeaker systems focus on high energy output over focused energy output, requiring large and powerful emitters that are difficult to move and/or emplace. In some embodiments, the systems and methods herein allow for high-powered acoustic energy signals to be achieved with a system which is relatively compact and has low power requirements.
[0034] In some embodiments, the system combines the laser interferometer and parametric speaker array technologies with the distributed coherent beamforming technique through a network capable control system that uses algorithms and/or machine learning (ML) to rapidly tune the audio effect to mitigate destructive environmental noise and to enable effective beam coherence. Therefore, in some embodiments, the system provides autonomous environmental adjustments and distributed coherence beam forming.
[0035] In some embodiments, the inventive device combines three fundamental technologies:
(1) a small, ultrasonic parametric speaker array for broadcasting focused acoustic waveforms,
(2) a laser interferometer to measure environmental noise data (e.g., ambient noise, wind spikes, etc.) and record audio, and (3) a network-connected system controller to manage data from both the network and the individual components. In some embodiments, the inventive device combines four fundamental technologies: (1) a small, ultrasonic parametric speaker array for broadcasting focused acoustic waveforms, (2) one or more lasers for generating laser haptics, (3) one or more video capture devices for monitoring at least a portion of a subject, and (4) a network-connected system controller to manage data from both the network and the individual components. In some embodiments, an individual system functions on its own.
[0036] In some embodiments, individual systems are combined in a network that provides a distributed coherent beamforming function. In some embodiments, the system utilizes digital signal processing, embedded systems, information technology for distributed networking (i.e., Internet of Things (IoT)), and machine learning/artificial intelligence (ML/AI) for device self-calibration.
I. HAPTIC ARRAY DEVICE
[0037] With reference to FIG. 4, a system 400 for providing haptic feedback or stimulation is depicted, according to some embodiments. In some embodiments, the system 400 is utilized to stimulate or provide haptic feedback to a subject or a portion of a subject (e.g., a hand of a subject 490). In some embodiments, the system 400 includes a network module 405, a system controller 410, an acoustic payload controller 420, a monitoring controller 425, monitoring sensors 430, an acoustic haptic array controller 435, an acoustic haptic array 450, an optical emission controller 460, an optical emitter 465, and a recorder 440.
[0038] In some embodiments, the functions of the system 400 are controlled by the system controller 410. In some embodiments, the system controller 410 comprises a computer processing unit (CPU), as described herein. The CPU may comprise one or more programs loaded onto a memory for sending instructions for operating the various components of the system, as described herein. The system controller 410 may further comprise a field programmable gate array (FPGA) configurable to provide a logic circuit for specified functions of the system. In some embodiments, the system controller 410 is in operative communication with a network module 405. The network module 405 may be configured to receive instructions, such as programming instructions, parameter inputs, or the like, and transmit said instructions to the system controller 410. The network module 405 may communicate with an external network, remote device, user interface, or the like, as disclosed herein. In some embodiments, mesh networking is utilized. In some embodiments, mesh networking allows the system to provide distributed coherence. In turn, mesh networking may allow many small systems to achieve the performance of a much larger system. Mesh networking may also allow the system to provide unique and complicated acoustic algorithms (e.g., machine learning) to enable precise spatial audio or ultrasonic feedback.
[0039] In some embodiments, the system controller 410 is operatively coupled to the acoustic payload controller 420 and the optical emission controller 460. In some embodiments, the acoustic payload controller and the optical emission controller are integrated into a single haptic array controller. In some embodiments, the system controller 410 is operatively coupled to the acoustic payload controller 420 and the optical emission controller 460 via one or more control buses 415.

[0040] In some embodiments, the acoustic payload controller 420 comprises an application specific integrated circuit (ASIC) that processes one or more signals and provides an output signal to the acoustic haptic array controller 435. In some embodiments, the acoustic haptic array controller 435 provides an output signal to the acoustic haptic array 450, where the output signal is transformed into a mechanical waveform (e.g., an acoustic, sound, or ultrasonic waveform) by one or more transducers of the acoustic haptic array. In some embodiments, the haptic array controller comprises an amplifier to amplify the signal prior to output to the haptic array(s). In some embodiments, the system is connected to a plurality of haptic arrays and the output to each haptic array is varied to produce a desired output. In some embodiments, the constructive interference of the sonic waves produced by the transducers is utilized to produce one or more focal points. In some embodiments, production of the focal point is digitally controlled by the acoustic payload controller. In some embodiments, focal points of sonic energy are produced with a resolution of 1/16 of the wavelength (e.g., approximately 0.5 mm for 40-kHz ultrasound).
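The focal-point control described above may be illustrated with a short sketch. The following is a minimal example, assuming an 18 x 18 planar grid with a 10 mm element pitch, a 40 kHz carrier, and a focal point 150 mm above the array center; these values and the function name are illustrative assumptions rather than parameters specified in this disclosure. Each transducer is delayed so that all wavefronts arrive at the focal point in phase and interfere constructively.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C (assumed)
CARRIER_HZ = 40_000.0    # ultrasonic carrier frequency (assumed)
WAVELENGTH = SPEED_OF_SOUND / CARRIER_HZ  # ~8.6 mm

def focal_point_phases(transducer_xy, focal_point):
    """Per-transducer phase delays (radians) so that all emissions arrive
    in phase at focal_point and interfere constructively there."""
    positions = np.column_stack([transducer_xy, np.zeros(len(transducer_xy))])
    distances = np.linalg.norm(positions - np.asarray(focal_point), axis=1)
    # Delay each element relative to the farthest one so wavefronts coincide.
    path_difference = distances.max() - distances
    return (2.0 * np.pi * path_difference / WAVELENGTH) % (2.0 * np.pi)

# Example: an 18 x 18 grid at 10 mm pitch, focusing 150 mm above the array center.
grid = np.array([(i * 0.010, j * 0.010) for i in range(18) for j in range(18)])
grid -= grid.mean(axis=0)
phases = focal_point_phases(grid, focal_point=(0.0, 0.0, 0.150))
```

In practice the computed phases would be quantized to whatever timing resolution the array controller or FPGA supports, which is one source of the approximately 1/16-wavelength focal resolution noted above.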
[0041] In some embodiments, the optical emission controller 460 comprises an application specific integrated circuit (ASIC) that processes one or more received signals. In some embodiments, the optical emission controller 460 receives signals from the system controller 410. In some embodiments, the optical emission controller 460 receives signals from the system controller 410, the acoustic payload controller 420, the monitoring controller 425, or a combination thereof. In some embodiments, the optical emission controller 460 directs and controls one or more optical emitters 465.
[0042] In some embodiments, the one or more optical emitters 465 comprise at least one light source. In some embodiments, the one or more optical emitters 465 comprise at least one light source coupled to one or more optical elements. The optical elements may comprise lenses, mirrors, polarizers, filters, and the like to manipulate light from a light source and direct it to a target location. In some embodiments, the system is connected to a plurality of optical emitters and the output to each optical emitter is varied to produce a desired output. In some embodiments, the light source of the optical emitter is a laser, as described herein.
[0043] In some embodiments, the optical emitter produces electromagnetic energy outside of the visible light spectrum. For example, the optical emitter may produce electromagnetic waves within the ultraviolet or infrared spectrum. In some embodiments, the optical emitter is replaced or used in combination with an emitter which generates another type of electromagnetic energy, such as radio emissions. In some embodiments, the optical emitter is replaced or used in combination with a thermal emitter which generates and transmits heat toward a target location or focal point.
[0044] In some embodiments, the system 400 comprises a monitoring controller 425. In some embodiments, the monitoring controller operates and receives data from one or more monitoring sensors. Monitoring sensors may include optical sensors (e.g., cameras), thermal sensors, audio sensors, vibrational/mechanical sensors, and the like which are directed toward a target (e.g., a target area, volume, or a portion of a subject 490). In some embodiments, an interferometer is utilized as a monitoring sensor, as disclosed herein.
[0045] In some embodiments, the monitoring controller 425 transmits data from the monitoring sensors 430 to be processed by the acoustic payload controller 420. In some embodiments, the acoustic payload controller 420 comprises a digitally-programmable potentiometer (DPP) which receives the interferometer data. In some embodiments, the monitoring controller 425 transmits data from the monitoring sensors 430 to be processed by the optical emission controller 460. In some embodiments, the optical emission controller 460 comprises a digitally-programmable potentiometer (DPP) which receives the data generated by the monitoring sensors. In some embodiments, the monitoring data is sent back to the system controller 410. In some embodiments, the acoustic payload controller 420 may adjust the output signal to the acoustic haptic array controller 435 based on data received from the monitoring controller 425 and/or instructions provided by the system controller 410 in response to data received from the monitoring controller 425. In some embodiments, the optical emission controller 460 may adjust the output signal to the optical emitter based on data received from the monitoring controller 425 and/or instructions provided by the system controller 410 in response to data received from the monitoring controller 425. In some embodiments, the system is configured such that feedback received from the monitoring sensors 430 is utilized to adjust the system, the output of the haptic arrays 450, and the output of the optical emitters 465. In some embodiments, adjustments are made in real-time to provide a self-calibrating system.
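As a simplified illustration of the self-calibrating feedback loop described above, the sketch below applies a proportional correction to the drive gain based on the vibration amplitude reported by an interferometric monitoring sensor. The function name, step size, and gain limits are assumptions for illustration only; an actual implementation may instead use the ML-driven tuning described elsewhere in this disclosure.

```python
def calibrate_output(target_amplitude, measured_amplitude, current_gain,
                     step=0.1, gain_limits=(0.0, 1.0)):
    """One iteration of a simple proportional correction: nudge the drive
    gain toward the level that produces the desired vibration amplitude
    as measured by the interferometer."""
    error = target_amplitude - measured_amplitude
    new_gain = current_gain + step * error
    low, high = gain_limits
    return max(low, min(high, new_gain))

# Example: each monitoring cycle, update the gain before the next emission.
gain = 0.5
gain = calibrate_output(target_amplitude=1.0, measured_amplitude=0.8,
                        current_gain=gain)
```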
[0046] In some embodiments, the system further comprises a recorder 440. Recorder 440 may receive and store monitoring data via an input/output (I/O) integrated circuit coupled to the monitoring controller. The stored data may be utilized by the system to improve outputs. In some embodiments, the stored monitoring data is input into a machine learning module to improve the system. In some embodiments, the system is used for audio recording using an interferometer (i.e., ISR). In some embodiments, the monitoring data is used to track a target 490. In some embodiments, the monitoring data is used to monitor the response of a target to the haptic output of the system.
[0047] In some embodiments, the system is modular, such that multiple systems can be networked to provide different levels of performance based on user needs. An individual system may operate independently, with reduced function, based on user needs. Combined systems may operate together to produce a higher output signal or to provide haptic feedback to a larger volume of space.
A. Ultrasonic Haptics
[0048] As disclosed herein, sonic haptics may be provided to a target (e.g., one or more focal points, a portion of a subject, etc.). In some embodiments, sonic haptic feedback is provided to a target via an array of ultrasonic transducers. In some embodiments, an array of ultrasonic transducers comprises 324 transducers arranged in an 18 x 18 square grid. However, multiple arrangements of the transducers may be provided to better suit various applications. In some embodiments, the transducers are arranged as a planar array. In some embodiments, the transducers are arranged in a non-planar array. In some embodiments, the transducers are arranged in two or more planar arrays which are provided at an angle to each other. In some embodiments, the transducers are arranged in two or more planar arrays which are orthogonal to each other. In some embodiments, the transducers are open aperture ultrasonic transducers. In some embodiments, the transducers are ceramic transducers (e.g., Nippon Ceramic T4010A1 transducers).
[0049] In some embodiments, an array of ultrasonic transducers comprises about 4 transducers to about 1,025 transducers. In some embodiments, an array of ultrasonic transducers comprises about 4 transducers to about 25 transducers, about 4 transducers to about 64 transducers, about 4 transducers to about 256 transducers, about 4 transducers to about 324 transducers, about 4 transducers to about 576 transducers, about 4 transducers to about 1,025 transducers, about 25 transducers to about 64 transducers, about 25 transducers to about 256 transducers, about 25 transducers to about 324 transducers, about 25 transducers to about 576 transducers, about 25 transducers to about 1,025 transducers, about 64 transducers to about 256 transducers, about 64 transducers to about 324 transducers, about 64 transducers to about 576 transducers, about 64 transducers to about 1,025 transducers, about 256 transducers to about 324 transducers, about 256 transducers to about 576 transducers, about 256 transducers to about 1,025 transducers, about 324 transducers to about 576 transducers, about 324 transducers to about 1,025 transducers, or about 576 transducers to about 1,025 transducers. In some embodiments, an array of ultrasonic transducers comprises at least about 4 transducers, about 25 transducers, about 64 transducers, about 256 transducers, about 324 transducers, or about 576 transducers, including increments therebetween. In some embodiments, a plurality of transducer arrays is provided.
[0050] In some embodiments, the transducers are capable of producing an ultrasonic focal point having a diameter of about 20 millimeters (mm). In some embodiments, the transducers are capable of producing an ultrasonic focal point having a diameter of about 1 mm to about 100 mm. In some embodiments, the transducers are capable of producing an ultrasonic focal point having a diameter of about 1 mm to about 5 mm, about 1 mm to about 10 mm, about 1 mm to about 20 mm, about 1 mm to about 40 mm, about 1 mm to about 50 mm, about 1 mm to about 100 mm, about 5 mm to about 10 mm, about 5 mm to about 20 mm, about 5 mm to about 40 mm, about 5 mm to about 50 mm, about 5 mm to about 100 mm, about 10 mm to about 20 mm, about 10 mm to about 40 mm, about 10 mm to about 50 mm, about 10 mm to about 100 mm, about 20 mm to about 40 mm, about 20 mm to about 50 mm, about 20 mm to about 100 mm, about 40 mm to about 50 mm, about 40 mm to about 100 mm, or about 50 mm to about 100 mm. In some embodiments, the transducers are capable of producing an ultrasonic focal point having a diameter of at least about 1 mm, about 5 mm, about 10 mm, about 20 mm, about 40 mm, or about 50 mm, including increments therebetween. In some embodiments, the transducers are capable of producing an ultrasonic focal point having a diameter of at most about 5 mm, about 10 mm, about 20 mm, about 40 mm, about 50 mm, or about 100 mm, including increments therebetween.
[0051] In some embodiments, the transducer array is capable of providing pressure forces of about 10 millinewtons (mN) to about 20 mN. In some embodiments, the transducer array is capable of providing pressure forces of about 1 mN to about 100 mN. In some embodiments, the transducer array is capable of providing pressure forces of about 1 mN to about 2 mN, about 1 mN to about 5 mN, about 1 mN to about 10 mN, about 1 mN to about 20 mN, about 1 mN to about 50 mN, about 1 mN to about 100 mN, about 2 mN to about 5 mN, about 2 mN to about 10 mN, about 2 mN to about 20 mN, about 2 mN to about 50 mN, about 2 mN to about 100 mN, about 5 mN to about 10 mN, about 5 mN to about 20 mN, about 5 mN to about 50 mN, about 5 mN to about 100 mN, about 10 mN to about 20 mN, about 10 mN to about 50 mN, about 10 mN to about 100 mN, about 20 mN to about 50 mN, about 20 mN to about 100 mN, or about 50 mN to about 100 mN. In some embodiments, the transducer array is capable of providing pressure forces of at least about 1 mN, about 2 mN, about 5 mN, about 10 mN, about 20 mN, or about 50 mN, including increments therebetween.
[0052] The ultrasonic haptics are based on acoustic radiation pressure, which is not vibrational and presses the skin surface. This can be applied to the skin for a long time, but the sensation is relatively weak. The sensation may be similar to a laminar air flow within a narrow area.
[0053] A direct current output of ultrasound may be too weak to be perceivable at low levels. Therefore, in some embodiments, vibrotactile stimulations are produced by modulation of the ultrasonic emission as waveforms. In some embodiments, vibrotactile stimulations are produced by modulating the ultrasonic emission with 200 Hz and 50 Hz waves. In some embodiments, the waveforms for producing ultrasonic haptic feedback are sinewaves, rectangular waves, triangular waves, or a combination thereof. In some embodiments, the spatial resolution produced by the transducer array is about 8.5 mm when the array is operating at 40 kilohertz (kHz).
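A minimal sketch of such modulation is shown below: a 40 kHz ultrasonic carrier is amplitude-modulated with a 200 Hz sinusoidal envelope so that the skin perceives the low-frequency envelope as vibration. The sample rate, modulation depth, and function name are illustrative assumptions, not values defined in this disclosure.

```python
import numpy as np

def modulated_drive_signal(duration_s, carrier_hz=40_000.0, envelope_hz=200.0,
                           sample_rate=1_000_000, depth=1.0):
    """Amplitude-modulate an ultrasonic carrier with a low-frequency envelope.
    The demodulated envelope is what the skin perceives as vibration."""
    t = np.arange(0, duration_s, 1.0 / sample_rate)
    envelope = 0.5 * (1.0 + depth * np.sin(2.0 * np.pi * envelope_hz * t))
    return envelope * np.sin(2.0 * np.pi * carrier_hz * t)

# Example: a 50 ms burst with a 200 Hz vibrotactile envelope.
signal = modulated_drive_signal(0.05)
```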
B. Laser Haptics
[0054] In some embodiments, the haptic array device comprises one or more lasers for providing haptic feedback. In some embodiments, a laser emits energy at a wavelength of about 10 nm to about 10,000 nm. In some embodiments, a laser has a frequency of about 0.3 THz to about 300 THz. In some embodiments, a power output of the laser is about 0.16 watts (W). In some embodiments, a power output of the laser is about 0.01 W to about 0.5 W. In some embodiments, a power output of the laser is about 0.01 W to about 0.05 W, about 0.01 W to about 0.1 W, about 0.01 W to about 0.13 W, about 0.01 W to about 0.16 W, about 0.01 W to about 0.2 W, about 0.01 W to about 0.3 W, about 0.01 W to about 0.5 W, about 0.05 W to about 0.1 W, about 0.05 W to about 0.13 W, about 0.05 W to about 0.16 W, about 0.05 W to about 0.2 W, about 0.05 W to about 0.3 W, about 0.05 W to about 0.5 W, about 0.1 W to about 0.13 W, about 0.1 W to about 0.16 W, about 0.1 W to about 0.2 W, about 0.1 W to about 0.3 W, about 0.1 W to about 0.5 W, about 0.13 W to about 0.16 W, about 0.13 W to about 0.2 W, about 0.13 W to about 0.3 W, about 0.13 W to about 0.5 W, about 0.16 W to about 0.2 W, about 0.16 W to about 0.3 W, about 0.16 W to about 0.5 W, about 0.2 W to about 0.3 W, about 0.2 W to about 0.5 W, or about 0.3 W to about 0.5 W. In some embodiments, a power output of the laser is about 0.01 W, about 0.05 W, about 0.1 W, about 0.13 W, about 0.16 W, about 0.2 W, about 0.3 W, or about 0.5 W. In some embodiments, a power output of the laser is at least about 0.01 W, about 0.05 W, about 0.1 W, about 0.13 W, about 0.16 W, about 0.2 W, or about 0.3 W. In some embodiments, a power output of the laser is at most about 0.05 W, about 0.1 W, about 0.13 W, about 0.16 W, about 0.2 W, about 0.3 W, or about 0.5 W, including increments therebetween.
[0055] In some embodiments, low laser power levels prevent damage to the skin of a user. The sensation produced by the laser system may be similar to an electric sensation. In some embodiments, the haptic feedback from the laser results from a nonthermal shockwave produced on the skin by evaporation. In some embodiments, the duration of laser exposure is limited to prevent damage to the skin.
[0056] In some embodiments, a haptic laser system comprises at least one laser light source. In some embodiments, the haptic laser system comprises optical elements such as lenses, mirrors, polarizers, filters, and the like to manipulate light from a light source and direct it to a target location. In some embodiments, a haptic laser system comprises galvo-mirrors for precise positioning of the laser energy. In some embodiments, a laser system comprises a computer-controlled optical phased array comprising pixels that modulate a laser beam’s intensity, phase, or both.
C. Cross-Field Haptics
[0057] In some embodiments, the haptic array device utilizes a combination of electromagnetic energy and pressure from mechanical waves to produce unique sensations for a user. In some embodiments, the ultrasonic transducers can produce pressure in larger areas (e.g., about 30 cm areas). In some embodiments, the laser haptics systems produce sensations in more focused areas (e.g., down to 1 micron). Therefore, a combination of laser and ultrasonic transducer systems may produce focused haptics at different scales simultaneously. For example, if a target is a hand of a user, the ultrasonic haptic system may produce a pressure sensation on the palm of the hand, while the laser haptic system focuses a sensation on a fingertip of the user. Such a configuration may be useful in confirming registration or detection of various parts of the hand when being used in combination with a gesture registration system.
[0058] Simultaneous application of pressure from the ultrasound transducers and application of a laser affects the perceived sensation from each haptic system. In some embodiments, application of pressure from the ultrasound transducers reduces the sensitivity of a user to the effects of a laser. This may allow for higher intensity laser application before a user perceives pain.

D. Optical Simulation
[0059] In some embodiments, lasers of the haptic array device are utilized to produce visualizations. In some embodiments, constructive interference produced by a laser emission system is utilized to generate 3D images or holograms. In some embodiments, a 3D image or hologram is utilized to help guide a user when the haptic array device is being used as a controller or for gesture recognition. In some embodiments, a 3D image or hologram is utilized to help guide a user when an external device is being used as a controller or for gesture recognition. In some embodiments, a 3D image is produced to guide a user’s hand to the center of an image captured by a camera (either incorporated into or external to the haptic array device) being utilized for gesture recognition.
[0060] In some embodiments, a haptic array device utilizes a laser system to produce both haptic and visual effects. In some embodiments, the haptic feedback is provided as the user interacts with a 3D image or hologram. In some embodiments, a 3D image or hologram is utilized to help guide a user through a series of movements as part of a rehabilitation or training program.
E. Monitoring Systems
[0061] In some embodiments, one or more sensors are provided to monitor interaction with the haptic array device. In some embodiments, a monitoring system comprising one or more sensors is provided to monitor whether a user position, a user motion, or both is outside a security threshold from a set user position, a set user motion, or both. Monitoring sensors may include optical sensors (e.g., cameras), thermal sensors, audio sensors, vibrational/mechanical sensors, and the like which are directed toward a target (e.g., a target area, volume, or a portion of a subject 490). In some embodiments, an interferometer is utilized as a monitoring sensor, as disclosed herein.
[0062] In some embodiments, a monitoring system comprises a camera. In some embodiments, the camera captures data at a rate of about 10 Hz to 10,000 Hz. In some embodiments, the camera comprises a two-dimensional camera. In some embodiments, the camera comprises a three-dimensional camera. In some embodiments, the camera is an infrared camera, a near infrared camera, a visible light camera, an ultra-violet spectrum camera, or any combination thereof.
[0063] In some embodiments, the camera is coupled to a computer processing unit (CPU) of the system, as disclosed herein. The camera may be utilized for gesture recognition. In some embodiments, haptic feedback is provided by the haptic array device in response to position or movement of a target within the field of view of the camera.
[0064] In some embodiments, feature detection and extraction methods are utilized to identify a region of interest on the target. In embodiments wherein the system is used for gesture recognition, regions of interest may include a finger, palm, thumb, fingertip, etc. of a user. In some embodiments, feature detection and extraction methods comprise computational processing of images to analyze contrasts in pixel brightness to recognize features. Feature detection and extraction methods may include edge detection, corner detection, blob detection, ridge detection, and combinations thereof.
[0065] In some embodiments, an edge detection algorithm is utilized to identify an outline or border of a target. In some embodiments, a nearest neighbor, thresholding, clustering, partial differential equation, and/or other digital image processing methods are utilized to identify an outline or border of a target. Canny, Deriche, differential, Sobel, Prewitt, and Roberts cross edge detection techniques may be utilized to identify a target or a portion thereof. In some embodiments, Gaussian or Laplacian techniques are utilized to smooth or improve the accuracy of the identified target or portion thereof.
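A minimal sketch of one such pipeline, assuming OpenCV is available, is shown below: a camera frame is converted to grayscale, smoothed with a Gaussian filter, and passed to the Canny detector to obtain an approximate outline of the hand in view. The function name and threshold values are illustrative assumptions; other detectors listed above could be substituted.

```python
import cv2

def hand_outline(frame_bgr, low_threshold=50, high_threshold=150):
    """Return an edge map approximating the outline of a hand in the frame.
    Gaussian blur suppresses sensor noise before Canny edge detection."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    return cv2.Canny(blurred, low_threshold, high_threshold)
```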
F. Ancillary Sensors
[0066] In some embodiments, additional sensors are utilized to enhance or supplement the performance of the haptic array device or monitoring system thereof. In some embodiments, ancillary sensors comprise wearable sensors which are attached to a user to receive additional data generated by movements or electrical signals (e.g., electromyographic (EMG), electroencephalographic (EEG), etc.) produced by a user. In some embodiments, a wearable ancillary sensor comprises one or more motion sensors. In some embodiments, the motion sensors comprise an accelerometer, a gyroscope, or a combination thereof.
[0067] In some embodiments, a wearable ancillary sensor array is configured to couple to an appendage, limb, or extremity of a user. In some embodiments, an existing device comprising one or more motion sensors (e.g., a smart watch) is coupled to the haptic array device to act as an ancillary sensor device. In some embodiments, additional bioinformatics are acquired by the ancillary sensors such as heart rate, body temperature, blood pressure, or a combination thereof.
[0068] In some embodiments, a wearable ancillary sensor array is configured to be worn on the head of a user. In some embodiments, a wearable ancillary sensor array comprising one or more EEG sensors is configured to place the EEG sensors in proximity to the scalp of a user and receive electric signals produced by the brain of the user. In some embodiments, the EEG sensors do not require direct contact with the skin (e.g., no need for shaving of the head) or a gel to be applied to the scalp.
[0069] In some embodiments, the ancillary sensors are used to confirm or verify actions or gestures made by a user. In some embodiments, bioinformatic information obtained by the ancillary sensors is recorded and stored in a memory of the system.
[0070] In some embodiments, the ancillary sensor is head wearable and comprises a helmet, a visor, glasses, a headband, earbuds, earphones, or any combination thereof. In some embodiments, the ancillary sensor is head wearable and comprises an inertial motion unit (IMU) sensor, a facial micro-motion sensor, or any combination thereof. In some embodiments, the ancillary sensor is wrist and/or hand wearable and comprises a photoacoustic sensor, an ultrasound sensor, an electrodermal activity (EDA) sensor, an inertial motion unit (IMU) sensor, a temperature sensor, a vision shadow capture sensor, an altimeter, a barometer, a humidity sensor, a sweat rate sensor, a hydration sensor, a bioacoustics sensor, a dynamometer, an electrodermal sensor, or any combination thereof.

[0071] In some embodiments, the wrist ancillary sensor is reconfigurable based on a user’s handedness. In some embodiments, the ancillary sensor is hand graspable. In some embodiments, the wrist ancillary sensor comprises a wireless communication device, a wired communication device, or both. In some embodiments, the wrist ancillary sensor comprises an energy storage device, a wired charge connector, a wireless charge connector, or any combination thereof. In some embodiments, the wrist ancillary sensor comprises a finger interface, a haptic feedback, a joystick, a trackpad, a trackball, or any combination thereof. In some embodiments, the haptic feedback comprises a finger haptic, a magneto haptic, an opto-haptic, or any combination thereof.
[0072] In some embodiments, the ancillary sensor is foot-wearable and comprises a bioimpedance (BIA) sensor, an electrodermal activity (EDA) sensor, an inertial motion unit (IMU) sensor, a thermometer, an altimeter, a barometer, a humidity sensor, a sweat rate generation sensor, a hydration sensor, a bioacoustics sensor, or any combination thereof.

[0073] In some embodiments, the ancillary sensor comprises a facial micro-motion sensor, a photoplethysmography (PPG) sensor, a photoacoustic sensor, an ultrasound sensor, a bioimpedance (BIA) sensor, an electrodermal activity (EDA) sensor, an inertial motion unit (IMU) sensor, a temperature sensor, a vision shadow capture sensor, an altimeter, a barometer, a humidity sensor, a sweat rate sensor, a hydration sensor, a bioacoustics sensor, a dynamometer, an electrodermal sensor, or any combination thereof.
II. INTERACTIVE HAPTIC FEEDBACK SYSTEM
[0074] In some embodiments, an interactive haptic feedback system is provided by the devices and methods disclosed herein. In some embodiments, haptic feedback is provided to help guide the user. For example, haptic feedback may be utilized to confirm that the monitoring system has identified a target portion of a user or confirm that movement of the user has been properly registered. In some embodiments, haptic feedback is utilized to confirm a portion of the user is properly in view of the monitoring system.
[0075] In some embodiments, movement of a hand of a user is monitored by a haptic array device as the user interacts with a hologram or visualization produced by the haptic array device. In some embodiments, the movements of the user’s hand are translated into instructions so as to provide a controller. In some embodiments, as the user utilizes the haptic array device as a controller, haptic feedback is provided in response to the user’s movements to simulate interaction with an object represented by the visualization or hologram.
[0076] As described herein, computer vision systems may be utilized to identify portions of a user’s hand and monitor movement of the user’s hand and fingers. Information from monitoring systems may be provided to render or position a visualization in response to the movement by the user. While the description herein provides methods and systems for monitoring a hand of a user for interacting with a hologram or visualization, it should be appreciated that other portions of the body, or the entire body, of a user may be monitored to manipulate a visualization as the user interacts with the visualization.
[0077] In some embodiments, a haptic array device provides haptic feedback to a user as they manipulate virtual items in an augmented reality. In some embodiments, the haptic array device provides interactive feedback responsive to a user’s movements to supplement an augmented reality. In some embodiments, an external device produces an image in augmented reality projected proximal to the haptic array device. In some embodiments, haptic feedback from the haptic array device is synchronized with a user’s interaction with the augmented reality visualization.
[0078] Coupling of a haptic array device to a computing system, to provide interactive haptic feedback, may be carried out through wired or wireless communication. A computing system may include a personal computer, mobile computing device, gaming system, or the like. In some embodiments, the haptic array device monitors movement of a user, such that the haptic array device provides a controller for operation of the computing system. In some embodiments, the haptic array device provides haptic feedback in response to a user’s movements and/or in response to situations provided in a virtual environment.
A. HAPTIC CONTROLLER
[0079] In some embodiments, movement of a hand of a user is monitored by a haptic array device, as described herein, and translated into instructions so as to provide a controller. In some embodiments, as the user utilizes the haptic array device as a controller, haptic feedback is provided in response to the user’s movements. Haptic feedback may also be provided in response to conditions of a virtual or real-world environment.
[0080] In some embodiments, a haptic controller provided by the haptic array device monitors movements made by the hand and fingers of a user to provide a controller. In some embodiments, each finger of a hand of a user is monitored to provide controlling instructions for a system or device. As described herein, computer vision systems may be utilized to identify portions of a user’s hand and monitor movement of the user’s hand and fingers. While the description herein provides methods and systems for monitoring a hand of a user for controlling systems and devices, it should be appreciated that other portions of the body, or the entire body, of a user may be monitored to provide a controller device.
[0081] In some embodiments, the haptic array device is configured as a controller for a computing system. In some embodiments, the haptic array device is configured as a gaming controller. In some embodiments, the haptic array device provides a controller by monitoring movement of the hand of a user. Use of the haptic array device may provide an intuitive controller. Further, the haptic array device may provide an adaptive controller to facilitate use by individuals with disabilities.
[0082] In some embodiments, small movements of the hand and/or fingers are to be tolerated without registering as input instructions for a system or device. In this manner, the haptic array device, being used as a controller, provides a “deadzone” wherein smaller movements are not registered as inputs, so as to prevent unintentional movement from being registered as controller inputs. In some embodiments, the controller deadzone is configurable to a user’s preference.
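A minimal sketch of such a deadzone filter is shown below; the threshold value and function name are illustrative assumptions. Displacements smaller than the deadzone return zero, and larger displacements are rescaled so that the output ramps smoothly from the deadzone boundary rather than jumping.

```python
def apply_deadzone(displacement_mm, deadzone_mm=3.0):
    """Ignore hand displacements smaller than the configured deadzone so that
    tremor or drift is not registered as a controller input."""
    x, y, z = displacement_mm
    magnitude = (x * x + y * y + z * z) ** 0.5
    if magnitude < deadzone_mm:
        return (0.0, 0.0, 0.0)
    # Rescale so the registered input grows smoothly from zero at the boundary.
    scale = (magnitude - deadzone_mm) / magnitude
    return (x * scale, y * scale, z * scale)
```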
[0083] In some embodiments, a controller provided by the haptic array device allows for user guided navigation through virtual environments. In some embodiments, a controller provided by the haptic array device allows intuitive interaction with virtual objects in virtual or augmented realities. In some embodiments, the haptic array device replaces a traditional controller system (e.g., a gaming controller, mouse, keyboard, etc.) typically utilized for a computer system or device. In some embodiments, the haptic array device supplements a traditional controller system utilized for a computer system or device.
[0084] The various modes of haptic feedback may provide a more immersive environment. The various modes of haptic feedback capable of being emitted by the system are capable of producing interactive feedback well beyond the vibration feedback produced by traditional gaming controllers. However, in some embodiments, the haptic array device may be capable of producing oscillating sonic haptics which will create vibrational feedback similar to traditional gaming vibration feedback. In some embodiments, laser or sonic haptics may be utilized to create tactile stimulation as the user operates the haptic array device as a controller. For example, haptic feedback may be utilized to simulate the pressing of buttons or pulling of triggers.
[0085] Haptic feedback may also be utilized to simulate limitations of the haptic array device as a controller. For example, haptic feedback intensity may increase as a user’s hand is moved toward the outer periphery of the monitoring system. This may allow haptic stimulation to indicate that the user is approaching the boundary of the controller and should not move any further. When the haptic array device is utilized to control movement in a virtual environment, haptic feedback may increase as the user pushes the controller towards the limit of the movement. For example, if a haptic array device is being utilized to control acceleration of a simulated car in a virtual environment, increasing haptic feedback may indicate to a user that a top speed of a car has been reached, or that the acceleration of the car is at a maximum.
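One simple way to realize the boundary-warning behavior described above is to map the hand’s distance from the edge of the tracked volume to a feedback intensity, as in the sketch below. The ramp distance, intensity scale, and function name are illustrative assumptions rather than values specified in this disclosure.

```python
def boundary_feedback_intensity(distance_to_edge_mm, ramp_start_mm=50.0,
                                max_intensity=1.0):
    """Increase haptic intensity as the hand nears the edge of the tracked
    volume; no feedback while the hand is well inside the boundary."""
    if distance_to_edge_mm >= ramp_start_mm:
        return 0.0
    fraction = 1.0 - distance_to_edge_mm / ramp_start_mm
    return max_intensity * fraction

# Example: 10 mm from the edge yields 80% of the maximum intensity.
intensity = boundary_feedback_intensity(10.0)
```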
[0086] In some embodiments, the haptic array device is utilized as a controller for real-world systems or devices. For example, the haptic array device may be utilized as a controller for unmanned vehicles, such as unmanned aerial vehicles (UAVs) or drones. In some embodiments, environmental conditions surrounding real-world devices are recorded and utilized for providing haptic feedback to a user. For example, a UAV may be equipped with an anemometer to measure wind experienced by the UAV. Data obtained by the anemometer may be used to provide haptic feedback to a user such that the effects of the wind on the UAV are simulated and felt by the user. Further environmental conditions may be simulated by the sonic, laser, and thermal haptics provided by the system.

B. INTERACTIVE ITEMS
[0087] In some embodiments, a haptic array device provides haptic feedback to a user as they manipulate objects in a virtual reality. In some embodiments, a haptic array device provides haptic feedback to a user as they manipulate virtual items in an augmented reality environment. In some embodiments, the haptic array device provides interactive feedback responsive to a user’s movements to supplement virtual items in a virtual or augmented reality.
[0088] In some embodiments, the haptic array device provides a three-dimensional visualization or hologram. The three-dimensional visualization or hologram may represent a virtual item. In some embodiments, haptic feedback is provided to the user as they interact with the three-dimensional visualization or hologram. The haptic feedback produced as the user interacts with the visualization may simulate interaction with a physical object. In some embodiments, ancillary sensors, as described herein, may be used to track a user’s response to stimulation, haptic or other (e.g., reactions to music or video). In some embodiments, haptic feedback is dynamically varied based on bioinformatic data obtained by ancillary sensors. For example, a user’s response may produce changes in holographic or haptic projections by the device. This may also be utilized to reduce stimuli for users who may be sensitive to overstimulation.
[0089] In some embodiments, the system provides spatial audio for static or dynamic exhibits (e.g., for entertainments, sports events, music events, museum exhibits, art exhibits, theme parks, etc.).
1. Interactive Virtual Reality Items
[0090] In some embodiments, the haptic array device provides a controller for interacting with a virtual reality or environment. In some embodiments, the haptic array device provides a controller for interacting with virtual items. In some embodiments, haptic feedback is provided as the user manipulates virtual items using the haptic array device as a controller. Haptic feedback may include sonic, laser, and thermal feedback, as disclosed herein. In some embodiments, the haptic feedback simulates how the virtual item would respond as it is being manipulated, as if it were a real, physical object.
[0091] For example, a mobile phone may be simulated in a virtual world. As a user picks up and rotates the phone, sonic haptic feedback may be provided to simulate the weight of the virtual phone as it is manipulated by the user. As the user interacts with the screen of the phone, haptic feedback from the haptic array device may be utilized to create pressure against a user’s fingertips, as if they were interacting with the screen of the phone.
[0092] Similar haptic feedback may be provided for a variety of virtual items, such as doors, levers, knobs, living creatures simulated in virtual reality, etc. In some embodiments, a computing system which provides the virtual environment sends instructions to the haptic array device to coordinate the interaction of virtual items with the haptic feedback provided by the haptic array device. In some embodiments, the user’s movements are sent to the computing system as inputs to manipulate the items in virtual reality, such that the graphics of a virtual environment are manipulated according to the user’s movements detected by the haptic array device.
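As an illustration of how such coordination messages might be structured, the sketch below encodes one instruction from the virtual-environment host to the haptic array device. This disclosure does not define a wire protocol; the field names, units, and JSON encoding are hypothetical and shown only to make the host-to-device coordination concrete.

```python
import json

def haptic_instruction(item_id, contact_point_mm, force_mn, modulation_hz=200):
    """Encode one coordination message from the virtual-environment host to the
    haptic array device (field names are illustrative, not a defined protocol)."""
    return json.dumps({
        "item": item_id,
        "focal_point_mm": list(contact_point_mm),
        "force_mN": force_mn,
        "modulation_hz": modulation_hz,
    })

# Example: press against the virtual phone screen at 10 mN of radiation pressure.
message = haptic_instruction("phone_screen", (12.0, -4.5, 150.0), 10.0)
```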
[0093] In some embodiments, the haptic array device provides a virtual console or controller for training purposes. For example, the device may provide a virtual airplane console as part of a virtual simulator. Such virtual consoles may be used for training or entertainment purposes. Virtual consoles may be utilized in additional applications such as driver training, heavy machinery training, etc.
2. Interactive Augmented Reality Items
[0094] In some embodiments, a haptic array device produces interactive augmented reality items. In some embodiments, augmented reality produces a visualization within proximity of the haptic array device. For example, augmented reality glasses may produce a visualization located proximal to the haptic array device, such that the haptic array device can provide interactive feedback as the user manipulates the virtual item in augmented reality. In some embodiments, the haptic array device produces the visualization to provide an augmented reality. In some embodiments, the haptic array device provides interactive feedback as the user interacts with the visualization. In some embodiments, the device provides a heads-up display for a user. The heads-up display may provide supplementary information as the user interacts with a virtual environment.
[0095] In some embodiments, interactive augmented reality items supplement a virtual environment. For example, the haptic array device may produce a visualization of a map of a virtual environment. A user may interact with the augmented reality item to manipulate the item. For example, a user may manipulate a visualization of a map of a virtual environment to adjust the view or zoom in or out to locations on the map. Further examples may include simulated locks, puzzles, or other items to supplement a virtual environment.

3. Interactive Games
[0096] In some embodiments, the haptic array device provides a visualization of a game or a portion thereof. The visualization may comprise one or more components of a game. In some embodiments, haptic feedback is provided as the user interacts with the visualization. Haptic feedback may include sonic, laser, and thermal feedback, as disclosed herein. In some embodiments, the haptic feedback simulates how the visualization would respond as it is being manipulated, as if it were a real, physical object.
[0097] For example, a visualization may simulate a chess board and chess pieces. A user may interact with the visualization to pick up and move their chess pieces. As the player interacts with the holographic chess pieces, sonic haptic feedback may be provided to produce tactile stimulation as the chess piece holograms are manipulated by the user. The visualization rendered by the haptic array device may project the chess piece moving as the player interacts with the hologram and places it in a desired location.
[0098] Similar haptic feedback and rendering of three-dimensional visualizations may be provided for a variety of game pieces, game boards, and puzzles. Such arrangements may facilitate playing of games by those who are unfamiliar with controllers and other computer inputs. For example, an older chess player may find a tactile, interactive visualization to be more intuitive than using a mouse to move chess pieces, preventing errors in placement of their chess pieces. In some embodiments, a computing system which provides the virtual environment sends instructions to the haptic array device to coordinate the interaction of virtual items with the haptic feedback provided by the haptic array device. This may be useful in cases wherein only one player has access to a haptic array device and the players are competing from remote locations.
4. Product Demonstration
[0099] In some embodiments, the haptic array device produces visualizations of products. In some embodiments, a rendering of a product is provided to the haptic array device. The rendering may be provided by a manufacturer and downloaded or loaded to the haptic array device via a product website.
[0100] A visualization of a product may be useful for purposes of testing or demonstration, wherein a user wants to examine characteristics of a product without having access to a physical version of the product. Such an embodiment may be useful if a user is a consumer who is interested in purchasing a product. The visualization may comprise a scaled or actual size model of the product.
[0101] In some embodiments, the visualization is an interactive visualization. In some embodiments, a user interaction with the visualization allows for simulated manipulation. In some embodiments, a user may interact with the visualization to rotate it or zoom in on aspects of the visualization. In some embodiments, a user may interact with simulated components of the product, such as buttons, switches, knobs, etc. In some embodiments, as the user interacts with simulated components of a product being rendered as a hologram by the haptic array device, haptic feedback is provided to the user such that tactile stimulation simulates interaction with the product. In some embodiments, interaction with the product by a user manipulates the visualization of the product. For example, a user may be interested in purchasing a new vehicle. The system may allow a scaled rendering of the vehicle to be displayed to a user by the haptic array device. In some embodiments, as the user interacts with the model, the model is manipulated accordingly. For example, a user could interact with a model of a car to open the doors or hood of the model car visualization.
[0102] While a few examples are provided herein, one would appreciate that numerous items/products could be visualized and interacted with using the systems and methods disclosed herein.
5. Holographic Art
[0103] In some embodiments, a haptic array device projects interactive three-dimensional images as digital art pieces. In some embodiments, a digital art piece is provided as a hologram or three-dimensional rendering. In some embodiments, a user is able to interact with the hologram to manipulate (e.g., rotate, move, zoom in on, etc.) the hologram. In some embodiments, haptic feedback is provided to the user as they manipulate the hologram, as disclosed herein. Haptic feedback may comprise sonic haptics, laser haptics, thermal haptics, or a combination thereof. Digital art pieces may comprise additional characteristics, such as projected audio, changing colors, etc.
[0104] In some embodiments, a digital art piece comprises a hologram with associated haptic feedback provided during interaction with the hologram. In some embodiments, the associated haptic feedback corresponds to the physical attributes of the hologram. For example, interaction with a hologram comprising ribs or texturing may cause haptic feedback provided to a user to simulate a tactile feeling of the ribs or texturing to the user. In some embodiments, a digital art piece comprises a hologram or three-dimensional image and haptic feedback which seemingly does not correspond with the physical appearance of the hologram.
[0105] In some embodiments, a digital art piece comprising a hologram and associated haptic feedback is connected to a block-chain authentication system. In some embodiments, the digital art piece comprises a non-fungible token (NFT) to authenticate the digital art piece and provide a record of ownership. In some embodiments, a digital art piece associated with an NFT allows for limited release of copies of the digital art piece. Digital art pieces may be downloaded, bought, sold, or traded on online marketplaces. Digital art pieces may be downloaded to a coupled computing device, external to the haptic array device, and then displayed and/or interacted with using the haptic array device.
III. DEFINITIONS
[0107] Unless defined otherwise, all terms of art, notations and other technical and scientific terms or terminology used herein are intended to have the same meaning as is commonly understood by one of ordinary skill in the art to which the claimed subject matter pertains. In some cases, terms with commonly understood meanings are defined herein for clarity and/or for ready reference, and the inclusion of such definitions herein should not necessarily be construed to represent a substantial difference over what is generally understood in the art.

[0108] Throughout this application, various embodiments may be presented in a range format. It should be understood that the description in range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the disclosure. Accordingly, the description of a range should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range. For example, description of a range such as from 1 to 6 should be considered to have specifically disclosed subranges such as from 1 to 3, from 1 to 4, from 1 to 5, from 2 to 4, from 2 to 6, from 3 to 6 etc., as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.
[0109] As used in the specification and claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise. For example, the term “a sample” includes a plurality of samples, including mixtures thereof.
[0110] The terms “rehabilitation,” “training,” or “treatment” are often used interchangeably herein to refer to methods of recovery or prevention from injuries, pain, and/or medical procedures. Methods may be guided or automated using the systems and devices disclosed herein.
[0111] The terms “acoustic,” “sound,” or “sonic” are often used interchangeably herein to refer to mechanical pressure waves. Unless specified, the terms “acoustic” and “sonic” should broadly read on waveforms ranging through all sonic frequency ranges, including audible, inaudible, and ultrasonic frequencies.
[0112] As used herein, the term “about” a number refers to that number plus or minus 10% of that number. The term “about” a range refers to that range minus 10% of its lowest value and plus 10% of its greatest value.
[0113] The section headings used herein are for organizational purposes only and are not to be construed as limiting the subject matter described.
IV. EXEMPLARY EMBODIMENTS
[0114] With reference to FIG. 5, an exemplary haptic array device 500 is depicted, according to some embodiments. In some embodiments, the haptic array device 500 comprises an array of transducers 550 for producing sonic haptics, as described. In some embodiments, the array 550 is an ultrasonic transducer array, as disclosed herein. In some embodiments, the haptic array device 500 further comprises laser systems 511, 512, 513. In some embodiments, the haptic array device 500 further comprises an integrated monitoring system 520. In some embodiments, the haptic array device 500 is configured to provide haptic feedback or sensations to an object or focal point 505. In some embodiments, the object 505 is a portion of a user, such as a hand.
[0115] The laser systems may be configured to produce haptics, 3-dimensional visualizations (i.e., holograms), or both. In some embodiments, a hologram is produced by two of the laser systems functioning as optical emitters and using constructive interference to produce a 3D rendering. In some embodiments, a third laser system produces haptic feedback while the other two laser systems produce the hologram. For example, laser systems 511 and 512 may produce a hologram while laser system 513 provides haptic feedback to a target area 505.

[0116] In some embodiments, monitoring system 520 comprises one or more sensors for monitoring an object or an object’s response to the provided haptics, as disclosed herein. The one or more sensors of the monitoring system may comprise optical sensors (e.g., cameras), thermal sensors, audio sensors, vibrational/mechanical sensors, and the like which are directed toward a target 505. In some embodiments, the monitoring system is coupled to a computer system which identifies and tracks the target 505 and/or portions thereof, as disclosed herein.
[0117] While FIG. 5 depicts a haptic array device 500 with fully integrated components, it should be appreciated that the components may not be integrated or may be separate from the device. Further, it should be appreciated that the device may be supplemented with further components (such as additional ultrasound transducer arrays) or additional haptic array devices of the same or a similar type.
[0118] In an exemplary embodiment, the system can be used as an entertainment performance enhancer for shows in different venues, including stage or large outdoor events. In some embodiments, for acoustic event environments where natural or ambient noise distorts or degrades the performance, the system can mitigate that degradation. In some embodiments, the system can mitigate excess noise from an audience or crowd. In some embodiments, the system can be used as a distributed network that provides several uses.
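As one non-limiting way to picture the noise mitigation mentioned above (an assumption for illustration, not the disclosed method), a feed-forward scheme emits a phase-inverted copy of the ambient noise picked up by a reference microphone so that it destructively interferes near the audience. The Python sketch below shows only the core idea; a practical system would rely on adaptive filtering and careful latency control.

    import numpy as np

    def anti_noise(reference_samples, gain=1.0):
        """Return a phase-inverted copy of the ambient-noise samples picked up
        by a reference microphone; emitting this signal attenuates the noise
        where the two waves overlap. Illustrative only."""
        return -gain * np.asarray(reference_samples, dtype=float)

    # Example: cancel a 100 Hz hum sampled at 8 kHz for one second.
    t = np.arange(8000) / 8000.0
    hum = 0.2 * np.sin(2 * np.pi * 100 * t)
    residual = hum + anti_noise(hum)   # ~0 everywhere in this idealized case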
[0119] Provided herein, per FIGS. 6A-9B, are systems 700 for haptic interactive gaming. As shown, in some embodiments, the system 700 comprises a monitoring device 710 and an ancillary device 720.
[0120] In one embodiment the monitoring device 710 comprises a display 711, a haptic array 712, a camera 713, and a non-transitory computer-readable storage media. In another embodiment the monitoring device 710 comprises the display 711, the haptic array 712, a time-of-flight sensor 714, and the non-transitory computer-readable storage media. In another embodiment the monitoring device 710 comprises the display 711, the haptic array 712, the camera 713, the time-of-flight sensor 714, and the non-transitory computer-readable storage media. In some embodiments, the monitoring device 710 further comprises a speaker, a joystick, a trackpad, a trackball, or any combination thereof.
[0121] In some embodiments, the display 711 is configured to show a display image. FIG. 6B shows an exemplary display image of the user’s hand manipulating a virtual cube. In some embodiments, this display image is shown while the user experiences a sensation of manipulating the virtual cube by pressure waves emitted from the haptic array 712. FIG. 7A shows an exemplary display image of the user’s hand throwing a ball. FIG. 7B shows an exemplary display image of the user’s hand manipulating one of three displayed balls.
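To make the coupling between the displayed image and the mid-air sensation concrete, the following sketch pairs a tracked fingertip with a virtual cube of the kind shown in FIG. 6B. This is an illustrative assumption only: the display and array driver objects and their methods (draw_cube, set_focus, set_modulation_hz, enable, disable) are hypothetical stand-ins introduced for this example and are not components disclosed herein.

    import numpy as np

    def inside_cube(point, center, half_extent):
        """True if a tracked fingertip lies within an axis-aligned virtual cube."""
        return bool(np.all(np.abs(np.asarray(point) - np.asarray(center)) <= half_extent))

    def update_frame(fingertip_xyz, cube_center, half_extent, array, display):
        """One display/haptics frame: render the cube and, while the fingertip
        is 'touching' it, steer the ultrasonic focus onto the fingertip and
        amplitude-modulate the carrier at a tactile rate."""
        touching = inside_cube(fingertip_xyz, cube_center, half_extent)
        display.draw_cube(cube_center, half_extent, highlight=touching)
        if touching:
            array.set_focus(fingertip_xyz)       # hypothetical driver call
            array.set_modulation_hz(200.0)       # low-frequency AM for skin sensitivity
            array.enable()
        else:
            array.disable()

A modulation rate on the order of a few hundred hertz is commonly used for mid-air ultrasonic haptics because skin mechanoreceptors are most sensitive in that band; the specific value above is an assumption, not a parameter of the disclosure.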
[0122] In some embodiments, the ancillary device 720 comprises a biometric sensor. In some embodiments, the biometric sensor is configured to measure a biometric data. In some
embodiments, the ancillary device 720 is configured to couple to an appendage of a user. In some embodiments, the biometric sensor comprises an inertial motion unit, a photoplethysmography sensor, a photoacoustic sensor, an ultrasound sensor, a glucose sensor, a bioimpedance sensor, an electrodermal activity sensor, a temperature sensor, a vision shadow capture sensor, an altimeter, a barometer, a humidity sensor, a sweat rate sensor, a hydration sensor, a bioacoustics sensor, a dynamometer, an electrodermal sensor, or any combination thereof. In some embodiments, the ancillary device 720 further comprises a speaker, a joystick, a trackpad, a trackball, or any combination thereof.
[0123] In some embodiments, the monitoring device 710 or the ancillary device 720 comprise the non-transitory computer-readable storage media. In some embodiments, the ancillary device 720 further comprises an ancillary communication device, and the monitoring device 710 further comprises a monitoring communication device communicably coupled to the ancillary device 720. In some embodiments, the ancillary communication device and the monitoring communication device are wireless communication devices.
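For illustration only, a wireless link between the ancillary device and the monitoring device might carry timestamped biometric samples in a compact record such as the following; the field names and values are assumptions introduced for this sketch, not a disclosed data format.

    import json
    import time
    from dataclasses import dataclass, asdict

    @dataclass
    class BiometricSample:
        device_id: str        # ancillary device identifier (assumed field)
        timestamp_s: float    # sample time in seconds
        heart_rate_bpm: float # e.g., from a photoplethysmography sensor
        skin_temp_c: float    # e.g., from a temperature sensor
        accel_g: tuple        # (x, y, z) from an inertial motion unit

    def encode_sample(sample: BiometricSample) -> bytes:
        """Serialize one sample for transmission over the wireless link."""
        return json.dumps(asdict(sample)).encode("utf-8")

    sample = BiometricSample("wristband-01", time.time(), 72.0, 33.1, (0.0, 0.0, 1.0))
    payload = encode_sample(sample)   # bytes handed to the radio layer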
[0124] In some embodiments, the camera 713 is configured to capture a plurality of pose images of the user. In some embodiments, the plurality of pose images of the user form a video of the motion of the user. In some embodiments, the camera 713 comprises a two-dimensional camera 713. In some embodiments, the camera 713 comprises a three-dimensional camera 713. In some embodiments, the camera 713 is an infrared camera 713, a near infrared camera 713, a visible light camera 713, an ultra-violet spectrum camera 713, a thermographic camera 713, or any combination thereof. In some embodiments, the camera 713, the time-of-flight sensor 714, or both, captures data at a rate of about 10 Hz to 10,000 Hz. In one embodiment the monitoring device 710 comprises two or more cameras 713, two or more time-of-flight sensors 714, or both. In one embodiment the two or more cameras 713, the two or more time-of-flight sensors 714, or both are arrayed to capture the user from two or more directions. In one embodiment the two or more cameras 713, the two or more time-of-flight sensors 714, or both are arrayed about the haptic array 712.
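As a simple illustration of how pose images or time-of-flight samples captured at a known rate can be turned into motion estimates, consecutive 3D positions can be finite-differenced. The positions and the 120 Hz rate below are assumed values used only for the example.

    import numpy as np

    def estimate_velocity(positions_m, capture_rate_hz):
        """Finite-difference velocity (m/s) from a sequence of 3D positions
        sampled at a fixed rate, e.g., fingertip positions derived from the
        camera and/or time-of-flight sensor."""
        positions_m = np.asarray(positions_m, dtype=float)
        dt = 1.0 / capture_rate_hz
        return np.diff(positions_m, axis=0) / dt

    # Example: three samples at 120 Hz showing a hand moving along +x.
    velocities = estimate_velocity([[0.00, 0.0, 0.3],
                                    [0.01, 0.0, 0.3],
                                    [0.02, 0.0, 0.3]], capture_rate_hz=120.0)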
[0125] In some embodiments, the haptic array 712 comprises a plurality of ultrasonic devices. In some embodiments, the haptic array 712 is a planar array. In some embodiments, the haptic array 712 is a non-planar array. In some embodiments, the plurality of ultrasonic devices comprise an ultrasonic speaker, a piezoelectric speaker, an electromagnetic speaker, or any combination thereof. In some embodiments, at least a portion of the plurality of ultrasonic devices have a frequency of less than about 20 MHz.
V. Computing Systems
[0126] Referring to FIG. 1, a block diagram is shown depicting an exemplary machine that includes a computer system 100 (e.g., a processing or computing system) within which a set of instructions can execute for causing a device to perform or execute any one or more of the aspects and/or methodologies of the present disclosure. The components in FIG. 1 are examples only and do not limit the scope of use or functionality of any hardware, software, embedded logic component, or a combination of two or more such components implementing particular embodiments.
[0127] Computer system 100 may include one or more processors 101, a memory 103, and a storage 108 that communicate with each other, and with other components, via a bus 140. The bus 140 may also link a display 132, one or more input devices 133 (which may, for example, include a keypad, a keyboard, a mouse, a stylus, etc.), one or more output devices 134, one or more storage devices 135, and various tangible storage media 136. All of these elements may interface directly or via one or more interfaces or adaptors to the bus 140. For instance, the various tangible storage media 136 can interface with the bus 140 via storage medium interface 126. Computer system 100 may have any suitable physical form, including but not limited to one or more integrated circuits (ICs), printed circuit boards (PCBs), mobile handheld devices (such as mobile telephones or PDAs), laptop or notebook computers, distributed computer systems, computing grids, or servers.
[0128] Computer system 100 includes one or more processor(s) 101 (e.g., central processing units (CPUs), general purpose graphics processing units (GPGPUs), or quantum processing units (QPUs)) that carry out functions. Processor(s) 101 optionally contains a cache memory unit 102 for temporary local storage of instructions, data, or computer addresses. Processor(s) 101 are configured to assist in execution of computer readable instructions. Computer system 100 may provide functionality for the components depicted in FIG. 1 as a result of the processor(s) 101 executing non-transitory, processor-executable instructions embodied in one or more tangible computer-readable storage media, such as memory 103, storage 108, storage devices 135, and/or storage medium 136. The computer-readable media may store software that implements particular embodiments, and processor(s) 101 may execute the software. Memory 103 may read the software from one or more other computer-readable media (such as mass storage device(s) 135, 136) or from one or more other sources through a suitable interface, such as network interface 120. The software may cause processor(s) 101 to carry out one or more processes or one or more steps of one or more processes described or illustrated herein. Carrying out such processes or steps may include defining data structures stored in memory 103 and modifying the data structures as directed by the software.
[0129] The memory 103 may include various components (e.g., machine readable media) including, but not limited to, a random access memory component (e.g., RAM 104) (e.g., static RAM (SRAM), dynamic RAM (DRAM), ferroelectric random access memory (FRAM), phase-change random access memory (PRAM), etc.), a read-only memory component (e.g., ROM 105), and any combinations thereof. ROM 105 may act to communicate data and instructions unidirectionally to processor(s) 101, and RAM 104 may act to communicate data and instructions bidirectionally with processor(s) 101. ROM 105 and RAM 104 may include any suitable tangible computer-readable media described below. In one example, a basic input/output system 106 (BIOS), including basic routines that help to transfer information between elements within computer system 100, such as during start-up, may be stored in the memory 103.
[0130] Fixed storage 108 is connected bidirectionally to processor(s) 101, optionally through storage control unit 107. Fixed storage 108 provides additional data storage capacity and may also include any suitable tangible computer-readable media described herein. Storage 108 may be used to store operating system 109, executable(s) 110, data 111, applications 112 (application programs), and the like. Storage 108 can also include an optical disk drive, a solid-state memory device (e.g., flash-based systems), or a combination of any of the above. Information in storage 108 may, in appropriate cases, be incorporated as virtual memory in memory 103.
[0131] In one example, storage device(s) 135 may be removably interfaced with computer system 100 (e.g., via an external port connector (not shown)) via a storage device interface 125. Particularly, storage device(s) 135 and an associated machine-readable medium may provide non-volatile and/or volatile storage of machine-readable instructions, data structures, program modules, and/or other data for the computer system 100. In one example, software may reside, completely or partially, within a machine-readable medium on storage device(s) 135. In another example, software may reside, completely or partially, within processor(s) 101.
[0132] Bus 140 connects a wide variety of subsystems. Herein, reference to a bus may encompass one or more digital signal lines serving a common function, where appropriate. Bus 140 may be any of several types of bus structures including, but not limited to, a memory bus, a memory controller, a peripheral bus, a local bus, and any combinations thereof, using any of a variety of bus architectures. As an example and not by way of limitation, such architectures include an Industry Standard Architecture (ISA) bus, an Enhanced ISA (EISA) bus, a Micro Channel Architecture (MCA) bus, a Video Electronics Standards Association local bus (VLB), a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, an Accelerated Graphics Port (AGP) bus, a HyperTransport (HTX) bus, a serial advanced technology attachment (SATA) bus, and any combinations thereof.
[0133] Computer system 100 may also include an input device 133. In one example, a user of computer system 100 may enter commands and/or other information into computer system 100 via input device(s) 133. Examples of an input device(s) 133 include, but are not limited to, an alpha-numeric input device (e.g., a keyboard), a pointing device (e.g., a mouse or touchpad), a touchpad, a touch screen, a multi-touch screen, a joystick, a stylus, a gamepad, an audio input device (e.g., a microphone, a voice response system, etc.), an optical scanner, a video or still image capture device (e.g., a camera), and any combinations thereof. In some embodiments, the input device is a Kinect, Leap Motion, or the like. Input device(s) 133 may be interfaced to bus 140 via any of a variety of input interfaces 123 (e.g., input interface 123) including, but not limited to, serial, parallel, game port, USB, FIREWIRE, THUNDERBOLT, or any combination of the above.
[0134] In particular embodiments, when computer system 100 is connected to network 130, computer system 100 may communicate with other devices, specifically mobile devices and enterprise systems, distributed computing systems, cloud storage systems, cloud computing systems, and the like, connected to network 130. Communications to and from computer system 100 may be sent through network interface 120. For example, network interface 120 may receive incoming communications (such as requests or responses from other devices) in the form of one or more packets (such as Internet Protocol (IP) packets) from network 130, and computer system 100 may store the incoming communications in memory 103 for processing. Computer system 100 may similarly store outgoing communications (such as requests or responses to other devices) in the form of one or more packets in memory 103, to be communicated to network 130 through network interface 120. Processor(s) 101 may access these communication packets stored in memory 103 for processing.
[0135] Examples of the network interface 120 include, but are not limited to, a network interface card, a modem, and any combination thereof. Examples of a network 130 or network segment 130 include, but are not limited to, a distributed computing system, a cloud computing system, a wide area network (WAN) (e.g., the Internet, an enterprise network), a local area network (LAN) (e.g., a network associated with an office, a building, a campus or other relatively small geographic space), a telephone network, a direct connection between two computing devices, a peer-to-peer network, and any combinations thereof. A network, such as network 130, may employ a wired and/or a wireless mode of communication. In general, any network topology may be used.
[0136] Information and data can be displayed through a display 132. Examples of a display 132 include, but are not limited to, a cathode ray tube (CRT), a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display such as a passive-matrix OLED (PMOLED) or active-matrix OLED (AMOLED) display, a plasma display, and any combinations thereof. The display 132 can interface to the processor(s) 101, memory 103, and fixed storage 108, as well as other devices, such as input device(s) 133, via the bus 140. The display 132 is linked to the bus 140 via a video interface 122, and transport of data between the display 132 and the bus 140 can be controlled via the graphics control 121. In some embodiments, the display is a video projector. In some embodiments, the display is a head-mounted display (HMD) such as a VR headset. In further embodiments, suitable VR headsets include, by way of non-limiting examples, HTC Vive, Oculus Rift, Samsung Gear VR, Microsoft HoloLens, Razer OSVR, FOVE VR, Zeiss VR One, Avegant Glyph, Freefly VR headset, and the like. In still further embodiments, the display is a combination of devices such as those disclosed herein.
[0137] In addition to a display 132, computer system 100 may include one or more other peripheral output devices 134 including, but not limited to, an audio speaker, a printer, a storage device, and any combinations thereof. Such peripheral output devices may be connected to the bus 140 via an output interface 124. Examples of an output interface 124 include, but are not limited to, a serial port, a parallel connection, a USB port, a FIREWIRE port, a THUNDERBOLT port, and any combinations thereof.
[0138] In addition or as an alternative, computer system 100 may provide functionality as a result of logic hardwired or otherwise embodied in a circuit, which may operate in place of or together with software to execute one or more processes or one or more steps of one or more processes described or illustrated herein. Reference to software in this disclosure may encompass logic, and reference to logic may encompass software. Moreover, reference to a computer-readable medium may encompass a circuit (such as an IC) storing software for execution, a circuit embodying logic for execution, or both, where appropriate. The present disclosure encompasses any suitable combination of hardware, software, or both. [0139] Those of skill in the art will appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality.
[0140] The various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
[0141] The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by one or more processor(s), or in a combination of the two. A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
[0142] In accordance with the description herein, suitable computing devices include, by way of non-limiting examples, server computers, desktop computers, laptop computers, notebook computers, sub-notebook computers, netbook computers, netpad computers, set-top computers, media streaming devices, handheld computers, Internet appliances, mobile smartphones, tablet computers, personal digital assistants, video game consoles, and vehicles. Those of skill in the art will also recognize that select televisions, video players, and digital music players with optional computer network connectivity are suitable for use in the system described herein. Suitable tablet computers, in various embodiments, include those with booklet, slate, and convertible configurations, known to those of skill in the art.
[0143] In some embodiments, the computing device includes an operating system configured to perform executable instructions. The operating system is, for example, software, including programs and data, which manages the device’s hardware and provides services for execution of applications. Those of skill in the art will recognize that suitable server operating systems include, by way of non-limiting examples, FreeBSD, OpenBSD, NetBSD®, Linux, Apple® Mac OS X Server®, Oracle® Solaris®, Windows Server®, and Novell® NetWare®. Those of skill in the art will recognize that suitable personal computer operating systems include, by way of non-limiting examples, Microsoft® Windows®, Apple® Mac OS X®, UNIX®, and UNIX-like operating systems such as GNU/Linux®. In some embodiments, the operating system is provided by cloud computing. Those of skill in the art will also recognize that suitable mobile smartphone operating systems include, by way of non-limiting examples, Nokia® Symbian® OS, Apple® iOS®, Research In Motion® BlackBerry OS®, Google® Android®, Microsoft® Windows Phone® OS, Microsoft® Windows Mobile® OS, Linux®, and Palm® WebOS®. Those of skill in the art will also recognize that suitable media streaming device operating systems include, by way of non-limiting examples, Apple TV®, Roku®, Boxee®, Google TV®, Google Chromecast®, Amazon Fire®, and Samsung® HomeSync®. Those of skill in the art will also recognize that suitable video game console operating systems include, by way of non-limiting examples, Sony® PS3®, Sony® PS4®, Microsoft® Xbox 360®, Microsoft Xbox One, Nintendo® Wii®, Nintendo® Wii U®, and Ouya®.
VI. Non-transitory computer readable storage medium
[0144] In some embodiments, the platforms, systems, media, and methods disclosed herein include one or more non-transitory computer readable storage media encoded with a program including instructions executable by the operating system of an optionally networked computing device. In further embodiments, a computer readable storage medium is a tangible component of a computing device. In still further embodiments, a computer readable storage medium is optionally removable from a computing device. In some embodiments, a computer readable storage medium includes, by way of non-limiting examples, CD-ROMs, DVDs, flash memory devices, solid state memory, magnetic disk drives, magnetic tape drives, optical disk drives, distributed computing systems including cloud computing systems and services, and the like. In some cases, the program and instructions are permanently, substantially permanently, semi-permanently, or non-transitorily encoded on the media.
VII. Computer Programs
[0145] In some embodiments, the platforms, systems, media, and methods disclosed herein include at least one computer program, or use of the same. A computer program includes a sequence of instructions, executable by one or more processor(s) of the computing device’s CPU, written to perform a specified task. Computer readable instructions may be implemented as program modules, such as functions, objects, Application Programming Interfaces (APIs), computing data structures, and the like, which perform particular tasks or implement particular abstract data types. In light of the disclosure provided herein, those of skill in the art will recognize that a computer program may be written in various versions of various languages.
[0146] The functionality of the computer readable instructions may be combined or distributed as desired in various environments. In some embodiments, a computer program comprises one sequence of instructions. In some embodiments, a computer program comprises a plurality of sequences of instructions. In some embodiments, a computer program is provided from one location. In other embodiments, a computer program is provided from a plurality of locations. In various embodiments, a computer program includes one or more software modules. In various embodiments, a computer program includes, in part or in whole, one or more web applications, one or more mobile applications, one or more standalone applications, one or more web browser plug-ins, extensions, add-ins, or add-ons, or combinations thereof.
VIII. Web Applications
[0147] In some embodiments, a computer program includes a web application. In light of the disclosure provided herein, those of skill in the art will recognize that a web application, in various embodiments, utilizes one or more software frameworks and one or more database systems. In some embodiments, a web application is created upon a software framework such as Microsoft® .NET or Ruby on Rails (RoR). In some embodiments, a web application utilizes one or more database systems including, by way of non-limiting examples, relational, non-relational, object oriented, associative, XML, and document oriented database systems. In further embodiments, suitable relational database systems include, by way of non-limiting examples, Microsoft® SQL Server, mySQL™, and Oracle®. Those of skill in the art will also recognize that a web application, in various embodiments, is written in one or more versions of one or more languages. A web application may be written in one or more markup languages, presentation definition languages, client-side scripting languages, server-side coding languages, database query languages, or combinations thereof. In some embodiments, a web application is written to some extent in a markup language such as Hypertext Markup Language (HTML), Extensible Hypertext Markup Language (XHTML), or Extensible Markup Language (XML). In some embodiments, a web application is written to some extent in a presentation definition language such as Cascading Style Sheets (CSS). In some embodiments, a web application is written to some extent in a client-side scripting language such as Asynchronous JavaScript and XML (AJAX), Flash® ActionScript, JavaScript, or Silverlight®. In some embodiments, a web application is written to some extent in a server-side coding language such as Active Server Pages (ASP), ColdFusion®, Perl, Java™, JavaServer Pages (JSP), Hypertext Preprocessor (PHP), Python™, Ruby, Tcl, Smalltalk, WebDNA®, or Groovy. In some embodiments, a web application is written to some extent in a database query language such as Structured Query Language (SQL). In some embodiments, a web application integrates enterprise server products such as IBM® Lotus Domino®. In some embodiments, a web application includes a media player element. In various further embodiments, a media player element utilizes one or more of many suitable multimedia technologies including, by way of non-limiting examples, Adobe® Flash®, HTML 5, Apple® QuickTime®, Microsoft® Silverlight®, Java™, and Unity®.
[0148] Referring to FIG. 2, in a particular embodiment, an application provision system comprises one or more databases 200 accessed by a relational database management system (RDBMS) 210. Suitable RDBMSs include Firebird, MySQL, PostgreSQL, SQLite, Oracle Database, Microsoft SQL Server, IBM DB2, IBM Informix, SAP Sybase, Teradata, and the like. In this embodiment, the application provision system further comprises one or more application servers 220 (such as Java servers, .NET servers, PHP servers, and the like) and one or more web servers 230 (such as Apache, IIS, GWS, and the like). The web server(s) optionally expose one or more web services via application programming interfaces (APIs) 240. Via a network, such as the Internet, the system provides browser-based and/or mobile native user interfaces.
[0149] Referring to FIG. 3, in a particular embodiment, an application provision system alternatively has a distributed, cloud-based architecture 300 and comprises elastically load balanced, auto-scaling web server resources 310 and application server resources 320 as well as synchronously replicated databases 330.
IX. Mobile Applications
[0150] In some embodiments, a computer program includes a mobile application provided to a mobile computing device. In some embodiments, the mobile application is provided to a mobile computing device at the time it is manufactured. In other embodiments, the mobile application is provided to a mobile computing device via the computer network described herein.
[0151] In view of the disclosure provided herein, a mobile application is created by techniques known to those of skill in the art using hardware, languages, and development environments known to the art. Those of skill in the art will recognize that mobile applications are written in several languages. Suitable programming languages include, by way of non-limiting examples, C, C++, C#, Objective-C, Java™, JavaScript, Pascal, Object Pascal, Python™, Ruby, VB.NET, WML, and XHTML/HTML with or without CSS, or combinations thereof.
[0152] Suitable mobile application development environments are available from several sources. Commercially available development environments include, by way of non-limiting examples, AirplaySDK, alcheMo, Appcelerator®, Celsius, Bedrock, Flash Lite, .NET Compact Framework, Rhomobile, and WorkLight Mobile Platform. Other development environments are available without cost including, by way of non-limiting examples, Lazarus, MobiFlex, MoSync, and Phonegap. Also, mobile device manufacturers distribute software developer kits including, by way of non-limiting examples, iPhone and iPad (iOS) SDK, Android™ SDK, BlackBerry® SDK, BREW SDK, Palm® OS SDK, Symbian SDK, webOS SDK, and Windows® Mobile SDK.
[0153] Those of skill in the art will recognize that several commercial forums are available for distribution of mobile applications including, by way of non-limiting examples, Apple® App Store, Google® Play, Chrome WebStore, BlackBerry® App World, App Store for Palm devices, App Catalog for webOS, Windows® Marketplace for Mobile, Ovi Store for Nokia® devices, Samsung® Apps, and Nintendo® DSi Shop.
X. Standalone Applications
[0154] In some embodiments, a computer program includes a standalone application, which is a program that is run as an independent computer process, not an add-on to an existing process, e.g., not a plug-in. Those of skill in the art will recognize that standalone applications are often compiled. A compiler is a computer program(s) that transforms source code written in a programming language into binary object code such as assembly language or machine code. Suitable compiled programming languages include, by way of non-limiting examples, C, C++, Objective-C, COBOL, Delphi, Eiffel, Java™, Lisp, Python™, Visual Basic, and VB.NET, or combinations thereof. Compilation is often performed, at least in part, to create an executable program. In some embodiments, a computer program includes one or more executable compiled applications.
XI. Web browser Plug-ins
[0155] In some embodiments, the computer program includes a web browser plug-in (e.g., extension, etc.). In computing, a plug-in is one or more software components that add specific functionality to a larger software application. Makers of software applications support plug-ins to enable third-party developers to create abilities which extend an application, to support easily adding new features, and to reduce the size of an application. When supported, plug-ins enable customizing the functionality of a software application. For example, plug-ins are commonly used in web browsers to play video, generate interactivity, scan for viruses, and display particular file types. Those of skill in the art will be familiar with several web browser plug-ins, including Adobe® Flash® Player, Microsoft® Silverlight®, and Apple® QuickTime®. In some embodiments, the toolbar comprises one or more web browser extensions, add-ins, or add-ons. In some embodiments, the toolbar comprises one or more explorer bars, tool bands, or desk bands.
[0156] In view of the disclosure provided herein, those of skill in the art will recognize that several plug-in frameworks are available that enable development of plug-ins in various programming languages, including, by way of non-limiting examples, C++, Delphi, Java™, PHP, Python™, and VB.NET, or combinations thereof.
[0157] Web browsers (also called Internet browsers) are software applications, designed for use with network-connected computing devices, for retrieving, presenting, and traversing information resources on the World Wide Web. Suitable web browsers include, by way of non-limiting examples, Microsoft® Internet Explorer®, Mozilla® Firefox®, Google® Chrome, Apple® Safari®, Opera Software® Opera®, and KDE Konqueror. In some embodiments, the web browser is a mobile web browser. Mobile web browsers (also called microbrowsers, mini-browsers, and wireless browsers) are designed for use on mobile computing devices including, by way of non-limiting examples, handheld computers, tablet computers, netbook computers, subnotebook computers, smartphones, music players, personal digital assistants (PDAs), and handheld video game systems. Suitable mobile web browsers include, by way of non-limiting examples, Google® Android® browser, RIM BlackBerry® Browser, Apple® Safari®, Palm® Blazer, Palm® WebOS® Browser, Mozilla® Firefox® for mobile, Microsoft® Internet Explorer® Mobile, Amazon® Kindle® Basic Web, Nokia® Browser, Opera Software® Opera® Mobile, and Sony® PSP™ browser.
XII. Software Modules
[0158] In some embodiments, the platforms, systems, media, and methods disclosed herein include software, server, and/or database modules, or use of the same. In view of the disclosure provided herein, software modules are created by techniques known to those of skill in the art using machines, software, and languages known to the art. The software modules disclosed herein are implemented in a multitude of ways. In various embodiments, a software module comprises a file, a section of code, a programming object, a programming structure, or combinations thereof. In further various embodiments, a software module comprises a plurality of files, a plurality of sections of code, a plurality of programming objects, a plurality of programming structures, or combinations thereof. In various embodiments, the one or more software modules comprise, by way of non-limiting examples, a web application, a mobile application, and a standalone application. In some embodiments, software modules are in one computer program or application. In other embodiments, software modules are in more than one computer program or application. In some embodiments, software modules are hosted on one machine. In other embodiments, software modules are hosted on more than one machine. In further embodiments, software modules are hosted on a distributed computing platform such as a cloud computing platform. In some embodiments, software modules are hosted on one or more machines in one location. In other embodiments, software modules are hosted on one or more machines in more than one location.
XIII. Databases
[0159] In some embodiments, the platforms, systems, media, and methods disclosed herein include one or more databases, or use of the same. In view of the disclosure provided herein, those of skill in the art will recognize that many databases are suitable for storage and retrieval of information associated with registered movements and haptic feedback design. In various embodiments, suitable databases include, by way of non-limiting examples, relational databases, non-relational databases, object oriented databases, object databases, entity-relationship model databases, associative databases, XML databases, and document oriented databases. Further non-limiting examples include SQL, PostgreSQL, MySQL, Oracle, DB2, Sybase, and MongoDB. In some embodiments, a database is Internet-based. In further embodiments, a database is web-based. In still further embodiments, a database is cloud computing-based. In a particular embodiment, a database is a distributed database. In other embodiments, a database is based on one or more local computer storage devices.
[0160] While preferred embodiments of the present invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions will now occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.

Claims

CLAIMS
What is claimed is:
1. A system for haptic interactive gaming, the system comprising:
(a) a haptic array comprising a plurality of ultrasonic devices;
(b) a camera;
(c) a light source;
(d) a non-transitory computer-readable storage media encoded with instructions executable by at least one processor to cause the at least one processor to perform operations comprising:
(i) receiving a two-dimensional (2D) data, a three-dimensional (3D) data, or both from a gaming console;
(ii) directing at least a portion of the plurality of ultrasonic devices in the haptic array to emit a first acoustic field based on the 2D data, the 3D data, or both;
(iii) directing the light source to emit light based on the 2D data, the 3D data, or both;
(iv) determining a user motion based on data received by the camera; and
(v) providing the motion data to the user console based on the user motion.
2. The system of claim 1, wherein the haptic array is a planar array.
3. The system of claim 1, wherein the haptic array is a non-planar array.
4. The system of claim 1, wherein the plurality of ultrasonic devices comprise an ultrasonic speaker, a piezoelectric speaker, an electromagnetic speaker, or any combination thereof.
5. The system of claim 1, wherein at least a portion of the plurality of ultrasonic devices have a frequency of less than about 20 MHz.
6. The system of claim 1, wherein the camera comprises a two-dimensional camera.
7. The system of claim 1, wherein the camera comprises a three-dimensional camera.
8. The system of claim 1, wherein the camera is an infrared camera, a near infrared camera, a visible light camera, an ultra-violet spectrum camera, or any combination thereof.
9. The system of claim 1, wherein the camera captures data at a rate of about 10 Hz to 10,000 Hz.
10. The system of claim 1, wherein the user motion is a motion of an appendage of a user.
11. The system of claim 1, wherein the light source comprises a laser, a light emitting diode, a light bulb, or any combination thereof.
12. The system of claim 1, wherein the emitted light has a wavelength of about 10 nm to about 10,000 nm.
13. The system of claim 1, wherein the emitted light has a frequency of about 0.3 THz to about 300 THz.
14. The system of claim 1, further comprising an interferometric device, wherein the operations further comprise calibrating the haptic array based on data received from the interferometric device.
15. The system of claim 14, wherein the interferometric device comprises a laser doppler vibrometer, a laser interferometer, an acoustic interferometer, or any combination thereof.
16. The system of claim 1, further comprising a thermal element, wherein the operations further comprise directing the thermal element to emit heat based on the 2D data, the 3D data, the user motion, or any combination thereof.
17. The system of claim 1, further comprising an energy storage device providing power to the haptic array, the camera, the non-transitory computer-readable storage media, or any combination thereof.
18. The system of claim 17, wherein the energy storage device comprises a battery, a supercapacitor, or any combination thereof.
19. A computer-implemented method for haptic interactive gaming, the method comprising:
(a) receiving, by a computer, a two-dimensional (2D) data, a three-dimensional (3D) data, or both from a gaming console;
(b) directing, by the computer, at least a portion of a plurality of ultrasonic devices in a haptic array to emit a first acoustic field based on the 2D data, the 3D data, or both;
(c) directing, by the computer, a light source to emit light based on the 2D data, the 3D data, or both;
(d) determining, by the computer, a user motion based on data received by a camera; and
(e) providing, by the computer, the motion data to the user console based on the user motion.
20. The method of claim 19, wherein the plurality of ultrasonic devices comprise an ultrasonic speaker, a piezoelectric speaker, an electromagnetic speaker, or any combination thereof.
21. The method of claim 19, wherein the camera comprises a two-dimensional camera.
22. The method of claim 19, wherein the camera comprises a three-dimensional camera.
23. The method of claim 19, wherein the camera is an infrared camera, a near infrared camera, a visible light camera, an ultra-violet spectrum camera, or any combination thereof.
24. The method of claim 19, wherein the data is received by the camera at a rate of about 10 Hz to 10,000 Hz.
25. The method of claim 19, further comprising calibrating, by the computer, the haptic array based on data received from an interferometric device.
26. The method of claim 19, further comprising directing, by the computer, a thermal element to emit heat based on the 2D data, the 3D data, the user motion, or any combination thereof.
PCT/US2023/076683 2022-10-13 2023-10-12 Virtual and augmented interactive and tactile projections of sound and light WO2024081783A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263379383P 2022-10-13 2022-10-13
US63/379,383 2022-10-13

Publications (1)

Publication Number Publication Date
WO2024081783A1 true WO2024081783A1 (en) 2024-04-18

Family

ID=90670363

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2023/076683 WO2024081783A1 (en) 2022-10-13 2023-10-12 Virtual and augmented interactive and tactile projections of sound and light

Country Status (1)

Country Link
WO (1) WO2024081783A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170164876A1 (en) * 2014-07-17 2017-06-15 Elwha Llc Monitoring body movement or condition according to motion regimen with conformal electronics
US20180224926A1 (en) * 2015-08-06 2018-08-09 Pcms Holdings, Inc. Methods and systems for providing haptic feedback for virtual 3d objects
US20210112647A1 (en) * 2018-05-07 2021-04-15 Zane Coleman Angularly varying light emitting device with an imager


Similar Documents

Publication Publication Date Title
US10902034B2 (en) Method for populating a map with a plurality of avatars through the use of a mobile technology platform
Rizzo et al. Is clinical virtual reality ready for primetime?
CN106104423B (en) Pose parameter is adjusted
CN108475120B (en) Method for tracking object motion by using remote equipment of mixed reality system and mixed reality system
CN110456626B (en) Holographic keyboard display
US10317997B2 (en) Selection of optimally positioned sensors in a glove interface object
JP2020004395A (en) Real-world haptic interaction for virtual reality user
KR102162373B1 (en) Associating an object with a subject
CN105283824A (en) Virtual interaction with image projection
US20170363867A1 (en) Control device with holographic element
US10845894B2 (en) Computer systems with finger devices for sampling object attributes
US11157084B2 (en) Touch enabling process, haptic accessory, and core haptic engine to enable creation and delivery of tactile-enabled experiences with virtual objects
JP2022509769A (en) Systems and methods for transitioning between modes of tracking real-world objects for artificial reality interfaces
CN114630738A (en) System and method for simulating sensing data and creating perception
Lee et al. Rich pinch: Perception of object movement with tactile illusion
WO2024081783A1 (en) Virtual and augmented interactive and tactile projections of sound and light
WO2024081803A2 (en) Bio-haptic interactive and tactile projections of sound and light
WO2024081786A2 (en) Skin treatment interactive and tactile projections of sound and light
Wu et al. Launching your VR neuroscience laboratory
KR102177734B1 (en) Stabilization of held objects in virtual reality
Bethel et al. From Components to Caring: The Development Trajectory of a Socially Therapeutic Assistive Robot (STAR) Named Therabot™
WO2024081781A1 (en) Rehab and training interactive and tactile projections of sound and light
Lanier et al. The RealityMashers: Augmented Reality Wide Field-of-View Optical See-Through Head Mounted Displays
US11430170B1 (en) Controlling joints using learned torques
US11149243B2 (en) Electronic device, wearable device, and method of providing content-based somatic senses using ultrasound

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23878225

Country of ref document: EP

Kind code of ref document: A1