WO2024081803A2 - Bio-haptic tactile and interactive sound and light projections - Google Patents

Bio-haptic tactile and interactive sound and light projections

Info

Publication number
WO2024081803A2
Authority
WO
WIPO (PCT)
Prior art keywords
camera
user
haptic
computer
array
Prior art date
Application number
PCT/US2023/076714
Other languages
English (en)
Other versions
WO2024081803A3 (fr)
Inventor
David CHARLOT
Scott J. Henderson
Ryan J. Dowd
Corey A.M. BERGSRUD
Original Assignee
C3I Tech Llc
The United States Of America, As Represented By The Secretary Of The Navy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by C3I Tech Llc, The United States Of America, As Represented By The Secretary Of The Navy filed Critical C3I Tech Llc
Publication of WO2024081803A2
Publication of WO2024081803A3

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means

Definitions

  • devices and systems incorporate mechanical elements to provide non-laser focused mechanical pressure waves in the human audible spectrum (i.e., about 20 Hz to 20 kHz) and/or human non-audible spectrum.
  • Mechanical elements may include parametric speaker arrays such as ultrasonic speaker arrays, piezo speakers, or electromagnetic speakers, and the like.
  • beam forming and/or beam shaping methods are utilized to focus, direct, or otherwise manipulate waves propagated from the systems and devices disclosed herein.
  • the devices and systems incorporate optical elements to provide laser focused mechanical pressure waves in the human audible spectrum and/or human non-audible spectrum.
  • Optical elements may also be utilized to provide optical signals in the infrared, near infrared, or visible light spectrum.
  • Optical elements may include lasers, light emitting diodes, lenses, mirrors, or a combination thereof.
  • devices and systems incorporate thermal elements to alter an ambient temperature.
  • thermal elements are utilized to lower an ambient temperature.
  • thermal elements are utilized to raise an ambient temperature.
  • thermal elements are utilized to adjust an ambient temperature between about 0° C and about 100° C.
  • temperature sensors are incorporated to measure temperatures of surfaces or areas which may interact with the thermal elements.
  • temperature sensors allow for dynamic adjustment of the thermal elements, as disclosed herein.
  • devices and systems include interferometric elements to measure mechanical pressure waves or optical waves.
  • interferometric elements are utilized for dynamic adjustment of optical elements, emission of electromagnetic waves, and/or emission of mechanical waves.
  • devices and system include optical sensors.
  • optical sensors are utilized to dynamically measure mechanical waves, optical waves, and motion/position of objects (e.g., animate and inanimate objects such as people, cars, rocks, etc.).
  • an optical sensor is provided to capture images at a rate of 10 Hz to 10,000 Hz. Said captured images may be combined into a video format.
  • an optical sensor comprises a camera.
  • optical sensors include infrared, near infrared, visible light, ultra-violet spectrum sensors.
  • optical sensors comprise three-dimensional (3D) spectroscopic cameras capable of sensing in infrared (IR), near infrared, visible light, and/or ultra-violet spectrum.
  • systems utilize multiple stereo infrared (IR) imaging devices.
  • systems and devices incorporate one or more computational elements (e.g., a microcontroller, application specific integrated circuit, single board computer, edge computing device, quantum computing device, etc.) to perform data processing and real-time data processing for dynamic output signal conditioning and adjustment based on desired output and measured signal inputs, as disclosed herein.
  • systems include closed mesh network elements for self-recognizing interactability with like devices to allow constructive or destructive distributed signal modification.
  • systems include open network elements (e.g., 3G, 4G, 5G, long range (LoRa), and the like) to enable connection to the internet, an intranet, or a distributed computing network (cloud computing).
  • systems include electrical elements to generate, consume, receive, and transmit power (e.g., solar panels, rechargeable battery, battery, wireless energy transmission / reception components, and the like) to provide power to the system and similar devices within a known proximity.
  • communication between devices utilizes free space optics communication and has the ability to adjust data transmission bandwidth based on power consumption restrictions.
  • a system for bio-haptic security comprising: a haptic array comprising a plurality of ultrasonic devices; a camera; and a non-transitory computer-readable storage media encoded with instructions executable by at least one processor to cause the at least one processor to perform operations comprising: directing at least a portion of the plurality of ultrasonic devices in the haptic array to emit an acoustic field having a focal point; determining a user position, a user motion, or both of a user based on data received by the camera; and preventing the user from accessing a physical or virtual object if the user position, the user motion, or both is outside a security threshold from a set user position, a set user motion, or both.
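  • As a minimal illustration only, the access-control logic above can be read as a threshold comparison between measured and enrolled values. The Python sketch below assumes Euclidean distance and placeholder thresholds; none of the names or values come from the application.

```python
import numpy as np

def within_threshold(measured: np.ndarray, expected: np.ndarray, threshold: float) -> bool:
    """True if the measured vector is within `threshold` (Euclidean) of the expected one."""
    return float(np.linalg.norm(measured - expected)) <= threshold

def grant_access(user_position, user_motion, set_position, set_motion,
                 pos_threshold=0.05, motion_threshold=0.10) -> bool:
    """Grant access only if both position and motion match the enrolled values.

    Positions are 3-D coordinates (metres) derived from camera data; motions are
    velocity vectors. The threshold values are illustrative placeholders.
    """
    return (within_threshold(user_position, set_position, pos_threshold)
            and within_threshold(user_motion, set_motion, motion_threshold))

# Example: measured hand pose from the camera vs. the enrolled pose.
measured_pos = np.array([0.10, 0.02, 0.30])
measured_motion = np.array([0.0, 0.0, 0.1])
enrolled_pos = np.array([0.10, 0.00, 0.30])
enrolled_motion = np.array([0.0, 0.0, 0.1])
print("access granted" if grant_access(measured_pos, measured_motion,
                                       enrolled_pos, enrolled_motion)
      else "access denied")
```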
  • the haptic array is a planar array. In some embodiments, the haptic array is a non-planar array. In some embodiments, the plurality of ultrasonic devices comprises an ultrasonic speaker, a piezoelectric speaker, an electromagnetic speaker, or any combination thereof. In some embodiments, at least a portion of the plurality of ultrasonic devices have a frequency of less than about 20 MHz. In some embodiments, the camera comprises a two-dimensional camera. In some embodiments, the camera comprises a three-dimensional camera. In some embodiments, the camera is an infrared camera, a near infrared camera, a visible light camera, an ultra-violet spectrum camera, or any combination thereof.
  • the camera captures data at a rate of about 10 Hz to 10,000 Hz.
  • the user position is a position of an appendage of a user, wherein the user motion is a motion of an appendage of a user, or both.
  • the operations further comprise determining a measured position of the focal point based on data received by the camera.
  • the operations further comprise directing at least a portion of the plurality of ultrasonic devices based on the measured position.
  • the system further comprises a light source emitting light at or near the focal point.
  • the light source comprises a laser, a light emitting diode, a light bulb, or any combination thereof.
  • the emitted light has a wavelength of about 10 nm to about 10,000 nm. In some embodiments, the emitted light has a frequency of about 0.3 THz to about 300 THz.
  • the system further comprises an interferometric device, wherein the operations further comprise calibrating the haptic array based on data received from the interferometric device.
  • the interferometric device comprises a laser Doppler vibrometer, a laser interferometer, an acoustic interferometer, or any combination thereof.
  • the system further comprises a thermal element emitting heat at or near the focal point, wherein the operations further comprise directing the thermal element to emit heat at or near the focal point.
  • the system further comprises a communication device, wherein the operations further comprise transmitting the user position, the user motion, the data received by the camera, or any combination thereof, via the communication device.
  • the communication device comprises a cellular device, a Wi-Fi device, a mesh network device, a satellite device, a Bluetooth device, or any combination thereof.
  • the system further comprises an energy storage device providing power to the haptic array, the camera, the non-transitory computer-readable storage media, or any combination thereof.
  • the energy storage device comprises a battery, a supercapacitor, or any combination thereof.
  • a computer-implemented method of providing bio-haptic security comprising: directing, by a computer, one or more ultrasonic devices in a haptic array to emit an acoustic field having a focal point; determining, by the computer, a user position, a user motion, or both based on at least one of a position, an orientation, a translation, and a rotation of at least a portion of a user based on data received by a camera; and preventing, by the computer, the user from accessing a physical or virtual object if the user position, the user motion, or both is outside a security threshold from a set user position, a set user motion, or both.
  • the plurality of ultrasonic devices comprises an ultrasonic speaker, a piezoelectric speaker, an electromagnetic speaker, or any combination thereof.
  • the camera comprises a two-dimensional camera. In some embodiments, the camera comprises a three-dimensional camera. In some embodiments, the camera is an infrared camera, a near infrared camera, a visible light camera, an ultra-violet spectrum camera, or any combination thereof. In some embodiments, the data is received by the camera at a rate of about 10 Hz to 10,000 Hz. In some embodiments, the method further comprises calibrating, by the computer, the haptic array based on data received from an interferometric device.
  • the method further comprises determining, by the computer, a measured position of the focal point based on data received by the camera. In some embodiments, the method further comprises directing, by the computer, at least a portion of the plurality of ultrasonic devices based on the measured position. In some embodiments, the method further comprises directing, by the computer, a light source to emit light at or near the focal point. In some embodiments, the method further comprises directing, by the computer, a thermal element to emit heat at or near the focal point. In some embodiments, the method further comprises transmitting, by the computer, the user position, the user motion, the data received by the camera or any combination thereof, via a communication device.
  • FIG. 1 shows a non-limiting example of a computing device; in this case, a device with one or more processors, memory, storage, and a network interface;
  • FIG. 2 shows a non-limiting example of a web/mobile application provision system; in this case, a system providing browser-based and/or native mobile user interfaces;
  • FIG. 3 shows a non-limiting example of a cloud-based web/mobile application provision system; in this case, a system comprising an elastically load balanced, auto-scaling web server and application server resources as well as synchronously replicated databases;
  • FIG. 4 shows a non-limiting example of a bio-haptic security system;
  • FIG. 5 shows a non-limiting example of a haptic array device;
  • FIG. 6 depicts a non-limiting example of a method of using a haptic array device in a bio-haptic security system; and
  • FIG. 7 shows a non-limiting example of a bio-haptic security system.
  • the haptic feedback system utilizes a combination of optic and acoustic fields simultaneously.
  • generated optic and acoustic fields have no direct interference; however, combining them provides benefits such as multi-resolution haptic images and a synergistic effect on haptic perception.
  • the fields are applied simultaneously as elastic waves to stimulate nerve signals.
  • the optic field is utilized to simulate or produce a “skin feeling,” or feeling of touch.
  • the acoustic field is utilized to apply pressure. Combining two fields of different physical quantities would provide not only the superposition effect proposed above but also synergistic effects such as modification of the feeling.
  • FIG. 4 shows a diagram of the components of haptic array device, according to some embodiments.
  • FIG. 5 depicts a haptic array device, according to some embodiments.
  • the system is parametric.
  • the non-linearity of the frequency response produced by multiple ultrasonic frequencies in air is modeled utilizing parametric equations.
  • the parametric equations may be utilized in computer and/or machine learning systems to model the non-linear frequency response (and resultingly, the effect is best modeled with parametric equations).
  • the system includes Field Programmable Gate Arrays (FPGAs), machine learning, autonomous control systems, fast networking, fast self-healing, interferometer sensors, ultrasonic speaker arrays, and the like.
  • the system utilizes laser interferometer technology to measure the response of an environment, one or more objects, or a combination thereof to dynamically change parameters and achieve desired effects.
  • a laser interferometer system sends out a two-beam laser to measure vibration of a surface.
  • a laser interferometer is used to receive vibration signals to calibrate the output of the ultrasonic transducer array to effectively beamform the audio waves to focus on one or more points on a subject or object.
  • a parametric speaker array is a highly directive speaker that consists of an array of ultrasonic transducers that exploit the nonlinear properties of air to self-demodulate modulated ultrasonic signals with the aim of creating narrow, focused sound waves (audible and inaudible).
  • the ultrasonic transducers are piezoelectrically driven.
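  • For intuition about self-demodulation, a numerical sketch using Berktay's far-field approximation, a standard parametric-array model (the application itself does not specify a model): the demodulated audible pressure is proportional to the second time derivative of the squared envelope of the emitted ultrasonic wave.

```python
import numpy as np

fs = 1_000_000                  # sample rate (Hz), high enough to resolve a 40 kHz carrier
t = np.arange(0, 0.01, 1 / fs)

f_carrier = 40_000              # ultrasonic carrier (Hz)
f_audio = 1_000                 # audible tone to be reproduced (Hz)
m = 0.8                         # modulation index

envelope = 1 + m * np.sin(2 * np.pi * f_audio * t)       # AM envelope E(t)
primary = envelope * np.sin(2 * np.pi * f_carrier * t)   # emitted ultrasonic wave

# Berktay far-field approximation: demodulated pressure ∝ d²/dt² [E(t)²].
# The result contains the audible tone plus a 2*f_audio harmonic, the
# distortion that real parametric systems pre-process their audio to reduce.
demodulated = np.gradient(np.gradient(envelope**2, t), t)
```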
  • the system utilizes one or more parametric speaker/transducer arrays.
  • each transducer array comprises multiple transducers.
  • the multiple transducers of each array output the same signal which is amplified by constructive interference.
  • two or more arrays are configured to further amplify a signal via constructive interference.
  • a plurality of speaker arrays may be utilized to precisely direct sound or amplify sound at a precise location.
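  • A toy numerical check of the constructive-interference claim (illustrative only, not from the application): N transducers emitting the same signal in phase at a point sum to N times the single-transducer amplitude, a 20·log10(N) dB gain, whereas random phases grow only as roughly sqrt(N) on average.

```python
import numpy as np

n_transducers = 16
t = np.linspace(0, 1e-3, 10_000)
single = np.sin(2 * np.pi * 40_000 * t)          # one 40 kHz transducer

# In-phase (coherent) sum: amplitudes add linearly.
coherent = sum(single for _ in range(n_transducers))
print(coherent.max() / single.max())             # 16.0

# Random phases (incoherent): amplitude grows only ~sqrt(N) on average.
rng = np.random.default_rng(0)
incoherent = sum(np.sin(2 * np.pi * 40_000 * t + rng.uniform(0, 2 * np.pi))
                 for _ in range(n_transducers))
print(incoherent.max() / single.max())           # well below 16
```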
  • Use of a parametric speaker array may improve upon the traditional use of broadcasting audio through distributed and coherent beamforming functionality. This approach offers the capability of numerous smaller devices to output the same audio volume as a single large device.
  • the system and methods herein allow for high powered acoustic energy signals to be achieved with a system which is relatively compact and has low power requirements.
  • the system combines the laser interferometer and parametric speaker array technologies with the distributed coherent beamforming technique through a network capable control system that uses algorithms and/or machine learning (ML) to rapidly tune the audio effect to mitigate destructive environmental noise and to enable effective beam coherence. Therefore, in some embodiments, the system provides autonomous environmental adjustments and distributed coherence beam forming.
  • the inventive device combines three fundamental technologies:
  • the inventive device combines four fundamental technologies: (1) a small, ultrasonic parametric speaker array for broadcasting focused acoustic waveforms, (2) one or more lasers for generating laser haptics, (3) one or more video capture devices for monitoring at least a portion of a subject, and (4) a network-connected system controller to manage data from both the network and the individual components.
  • an individual system functions on its own.
  • individual systems are combined in a network that provides a distributed coherent beamforming function.
  • the system utilizes digital signal processing, embedded systems, information technology for distributed networking (i.e., Internet of Things (IoT)), and machine learning/artificial intelligence (ML/AI) for device self-calibration.
  • a system 400 for providing haptic feedback or stimulation is depicted, according to some embodiments.
  • the system 400 is utilized to stimulate or provide haptic feedback to a subject or a portion of a subject (e.g., a hand of a subject 490).
  • the system 400 includes network module 405, system controller 410, acoustic payload controller 420, a monitoring controller 425, monitoring sensors 430, acoustic haptic array controller 435, acoustic haptic array 450, optical emission controller 460, optical emitter 465, and recorder 440.
  • the functions of the system 400 are controlled by system controller 410.
  • the system controller 410 comprises a central processing unit (CPU), as described herein.
  • the CPU may comprise one or more programs loaded onto a memory for sending instructions for operating the various components of the system, as described herein.
  • the system controller 410 may further comprise a field programmable gate array (FPGA) configurable to provide a logic circuit for specified functions of the system.
  • the system controller 410 is in operative communication with a network module 405.
  • the network module 405 may be configured to receive instructions, such as programming instructions, parameter inputs, or the like, and transmit said instructions to the system controller 410.
  • the network module 405 may communicate with an external network, remote device, user interface, or the like, as disclosed herein.
  • mesh networking is utilized.
  • mesh networking allows the system to provide distributed coherence.
  • mesh networking may allow many small systems to achieve the performance of a much larger system.
  • Mesh networking may also allow the system to provide unique and complicated acoustic algorithms (e.g., machine learning) to enable precise spatial audio or ultrasonic feedback.
  • the system controller 410 is operatively coupled to the acoustic payload controller 420 and the optical emission controller 460.
  • the acoustic payload controller and the optical emission controller are integrated into a single haptic array controller.
  • the system controller 410 is operatively coupled to the acoustic payload controller 420 and the optical emission controller 460 via one or more control buses 415.
  • the acoustic payload controller 420 comprises an application specific integrated circuit (ASIC) that processes one or more signals and provides an output signal to the acoustic haptic array controller 435.
  • the acoustic haptic array controller 435 provides an output signal to the acoustic haptic array 450, where the output signal is transformed into a mechanical waveform (e.g., an acoustic, sound, or ultrasonic waveform) by one or more transducers of the acoustic haptic array.
  • the haptic array controller comprises an amplifier to amplify the signal prior to output to the haptic array(s).
  • the system is connected to a plurality of haptic arrays and the output to each haptic array is varied to produce a desired output.
  • the constructive interference of the sonic waves produced by the transducers is utilized to produce one or more focal points.
  • production of the focal point is digitally controlled by the haptic payload controller.
  • focal points of sonic energy are produced with a resolution of 1/16 of the wavelength (e.g., approximately 0.5 mm for the 40-kHz ultrasound).
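  • A sketch of how such a focal point might be steered digitally, under assumed geometry (the application does not disclose controller code): each transducer fires with a delay that equalizes travel time to the focal point, and the quoted ~0.5 mm figure follows from λ/16 at 40 kHz.

```python
import numpy as np

SPEED_OF_SOUND = 343.0                     # m/s in air
FREQ = 40_000.0                            # 40 kHz ultrasound
wavelength = SPEED_OF_SOUND / FREQ         # ≈ 8.6 mm
print(wavelength / 16)                     # ≈ 0.54 mm, the ~0.5 mm resolution cited

def focusing_delays(positions: np.ndarray, focal_point: np.ndarray) -> np.ndarray:
    """Per-transducer emission delays (s) so all waves arrive at the focal point in phase.

    `positions` is an (N, 3) array of transducer coordinates in metres.
    """
    distances = np.linalg.norm(positions - focal_point, axis=1)
    times = distances / SPEED_OF_SOUND
    return times.max() - times             # farthest transducer fires first (delay 0)

# Assumed 18 x 18 planar grid with 10 mm pitch, focusing 20 cm above the array centre.
xs, ys = np.meshgrid(np.arange(18), np.arange(18))
grid = np.stack([xs.ravel() * 0.01, ys.ravel() * 0.01, np.zeros(324)], axis=1)
delays = focusing_delays(grid, np.array([0.085, 0.085, 0.20]))
```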
  • the optical emission controller 460 comprises an application specific integrated circuit (ASIC) that processes one or more received signals.
  • the optical emission controller 460 receives signals from the system controller 410.
  • the optical emission controller 460 receives signals from the system controller 410, the acoustic payload controller 420, the monitoring controller 425, or a combination thereof.
  • the optical emission controller 460 directs and controls one or more optical emitters 465.
  • the one or more optical emitters 465 comprise at least one light source. In some embodiments, the one or more optical emitters 465 comprise at least one light source coupled to one or more optical elements.
  • the optical elements may comprise lenses, mirrors, polarizers, filters, and the like to manipulate light from a light source and direct it to a target location.
  • the system is connected to a plurality of optical emitters and the output to each optical emitter is varied to produce a desired output.
  • the light source of the optical emitter is a laser, as described herein.
  • the optical emitter produces electromagnetic energy outside of the visible light spectrum.
  • the optical emitter may produce electromagnetic waves within the ultraviolet or infrared spectrum.
  • the optical emitter is replaced or used in combination with an emitter which generates another type of electromagnetic energy, such as radio emissions.
  • the optical emitter is replaced or used in combination with a thermal emitter which generates and transmits heat toward a target location or focal point.
  • the system 400 comprises a monitoring controller 425.
  • the monitoring controller operates and receives data from one or more monitoring sensors.
  • Monitoring sensors may include optical sensors (e.g., cameras), thermal sensors, audio sensors, vibrational/mechanical sensors, and the like which are directed toward a target (e.g., a target area, volume, or a portion of a subject 490).
  • an interferometer is utilized as a monitoring sensor, as disclosed herein.
  • the monitoring controller 425 transmits data from the monitoring sensors 430 to be processed by the acoustic payload controller 420.
  • the acoustic payload controller 420 comprises a digitally-programmable potentiometer (DPP) which receives the interferometer data.
  • the monitoring controller 425 transmits data from the monitoring sensors 430 to be processed by the optical emission controller 460.
  • the optical emission controller 460 comprises a digitally-programmable potentiometer (DPP) which receives the data generated by the monitoring sensors.
  • the monitoring data is sent back to system controller 410.
  • the acoustic payload controller 420 may adjust the output signal to the acoustic haptic array controller 435 based on data received from the monitoring controller 425 and/or instructions provided by the system controller 410 in response to data received from the monitoring controller 425.
  • the optical emission controller 460 may adjust the output signal to the optical emitter based on data received from the monitoring controller 425 and/or instructions provided by the system controller 410 in response to data received from the monitoring controller 425.
  • the system is configured such that feedback received from the monitoring sensors 430 is utilized to adjust the system, output of the haptic arrays 450, and output of the optical emitters 465. In some embodiments, adjustments are made in real-time to provide a self-calibrating system.
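  • One plausible reading of this self-calibrating loop, sketched under assumptions (a proportional correction of the commanded focal point toward the camera-measured one; all function names are hypothetical stand-ins for the controllers of system 400):

```python
import numpy as np

def calibrate_focal_point(commanded: np.ndarray,
                          measure_focal_point,      # stand-in for monitoring controller 425
                          set_focal_point,          # stand-in for acoustic payload controller 420
                          gain: float = 0.5,
                          tolerance: float = 0.001,
                          max_iters: int = 50) -> np.ndarray:
    """Iteratively nudge the commanded focal point until the measured one matches."""
    target = commanded.copy()
    command = commanded.copy()
    for _ in range(max_iters):
        set_focal_point(command)
        measured = measure_focal_point()
        error = target - measured
        if np.linalg.norm(error) < tolerance:      # within 1 mm of target
            break
        command = command + gain * error           # proportional correction
    return command

# Simulated hardware for illustration: the array consistently lands 3 mm off in x.
state = {"p": None}
set_focal_point = lambda p: state.update(p=p)
measure_focal_point = lambda: state["p"] + np.array([0.003, 0.0, 0.0])
print(calibrate_focal_point(np.array([0.0, 0.0, 0.2]),
                            measure_focal_point, set_focal_point))
```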
  • the system further comprises a recorder 440.
  • Recorder 440 may receive and store monitoring data via an input/output (I/O) integrated circuit coupled to the monitoring controller.
  • the stored data may be utilized by the system to improve outputs.
  • the stored monitoring data is input into a machine learning module to improve the system.
  • the system is used for audio recording using an interferometer (i.e., ISR).
  • the monitoring data is used to track a target 490.
  • the monitoring data is used to monitor the response of a target to the haptic output of the system.
  • the system is modular, such that multiple systems can be networked to provide different levels of performance based on user needs.
  • An individual system may operate independently for reduced function based on user needs.
  • Combined systems may operate together to produce a higher output signal or provide haptic feedback to a larger volume of space.
  • sonic haptics may be provided to a target (e.g., one or more focal points, a portion of a subject, etc.), as disclosed herein. In some embodiments, sonic haptic feedback is provided to a target via an array of ultrasonic transducers. In some embodiments, an array of ultrasonic transducers comprises 324 transducers arranged in an 18 x 18 square grid. However, multiple arrangements of the transducers may be provided to better suit various applications. In some embodiments, the transducers are arranged as a planar array.
  • the transducers are arranged in a non-planar array. In some embodiments, the transducers are arranged in two or more planar arrays which are provided at an angle to each other. In some embodiments, the transducers are arranged in two or more planar arrays which are orthogonal to each other. In some embodiments, the transducers are open aperture ultrasonic transducers. In some embodiments, the transducers are ceramic transducers (e.g., Nippon Ceramic T4010A1 transducers).
  • an array of ultrasonic transducers comprises about 4 transducers to about 1,025 transducers. In some embodiments, an array of ultrasonic transducers comprises about 4 transducers to about 25 transducers, about 4 transducers to about 64 transducers, about 4 transducers to about 256 transducers, about 4 transducers to about 324 transducers, about 4 transducers to about 576 transducers, about 4 transducers to about 1,025 transducers, about 25 transducers to about 64 transducers, about 25 transducers to about 256 transducers, about 25 transducers to about 324 transducers, about 25 transducers to about 576 transducers, about 25 transducers to about 1,025 transducers, about 64 transducers to about 256 transducers, about 64 transducers to about 324 transducers, about 64 transducers to about 576 transducers, about 64 transducers to about 1,025 transducers, about 256 transducers to about 324 transducers, about 256 transducers to about 576 transducers, about 256 transducers to about 1,025 transducers, about 324 transducers to about 576 transducers, about 324 transducers to about 1,025 transducers, or about 576 transducers to about 1,025 transducers.
  • an array of ultrasonic transducers comprises at least about 4 transducers, about 25 transducers, about 64 transducers, about 256 transducers, about 324 transducers, or about 576 transducers, including increments therebetween. In some embodiments, a plurality of transducer arrays is provided.
  • the transducers are capable of producing an ultrasonic focal point having a diameter of about 20 millimeters (mm). In some embodiments, the transducers are capable of producing an ultrasonic focal point having a diameter of about 1 mm to about 100 mm.
  • the transducers are capable of producing an ultrasonic focal point having a diameter of about 1 mm to about 5 mm, about 1 mm to about 10 mm, about 1 mm to about 20 mm, about 1 mm to about 40 mm, about 1 mm to about 50 mm, about 1 mm to about 100 mm, about 5 mm to about 10 mm, about 5 mm to about 20 mm, about 5 mm to about 40 mm, about 5 mm to about 50 mm, about 5 mm to about 100 mm, about 10 mm to about 20 mm, about 10 mm to about 40 mm, about 10 mm to about 50 mm, about 10 mm to about 100 mm, about 20 mm to about 40 mm, about 20 mm to about 50 mm, about 20 mm to about 100 mm, about 40 mm to about 50 mm, about 40 mm to about 100 mm, or about 50 mm to about 100 mm.
  • the transducers are capable of producing an ultrasonic focal point having a diameter of at least about 1 mm, about 5 mm, about 10 mm, about 20 mm, about 40 mm, or about 50 mm, including increments therebetween. In some embodiments, the transducers are capable of producing an ultrasonic focal point having a diameter of at most about 5 mm, about 10 mm, about 20 mm, about 40 mm, about 50 mm, or about 100 mm, including increments therebetween.
  • the transducer array is capable of providing pressure forces of about 10 millinewtons (mN) to about 20 mN. In some embodiments, the transducer array is capable of providing pressure forces of about 1 mN to about 100 mN.
  • the transducer array is capable of providing pressure forces of about 1 mN to about 2 mN, about 1 mN to about 5 mN, about 1 mN to about 10 mN, about 1 mN to about 20 mN, about 1 mN to about 50 mN, about 1 mN to about 100 mN, about 2 mN to about 5 mN, about 2 mN to about 10 mN, about 2 mN to about 20 mN, about 2 mN to about 50 mN, about 2 mN to about 100 mN, about 5 mN to about 10 mN, about 5 mN to about 20 mN, about 5 mN to about 50 mN, about 5 mN to about 100 mN, about 10 mN to about 20 mN, about 10 mN to about 50 mN, about 10 mN to about 100 mN, about 20 mN to about 50 mN, about 20 mN to about 100 mN, or about 50 mN to about 100 mN.
  • the ultrasonic haptics are based on acoustic radiation pressure, which is not vibrational and presses on the skin surface. This pressure can be applied to the skin for a long time, but it is relatively weak. The sensation may be similar to a laminar air flow within a narrow area.
  • vibrotactile stimulations are produced by modulation of ultrasonic emission as waveforms.
  • vibrotactile stimulations are produced by modulating the ultrasonic emission with 200 Hz and 50 Hz waves.
  • the waveforms for producing ultrasonic haptic feedback are sinewaves, rectangular waves, triangular waves, or a combination thereof.
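  • A minimal sketch of those modulation options with assumed parameters: a 40 kHz carrier amplitude-modulated by 200 Hz and 50 Hz envelopes using sine, rectangular, or triangular waveforms.

```python
import numpy as np

fs = 1_000_000
t = np.arange(0, 0.1, 1 / fs)
carrier = np.sin(2 * np.pi * 40_000 * t)

def envelope(f_mod: float, shape: str) -> np.ndarray:
    """Unit-amplitude modulation envelope at f_mod Hz."""
    phase = 2 * np.pi * f_mod * t
    if shape == "sine":
        return 0.5 * (1 + np.sin(phase))
    if shape == "rectangular":
        return (np.sin(phase) > 0).astype(float)
    if shape == "triangular":
        return np.abs((phase / np.pi) % 2 - 1)
    raise ValueError(shape)

# 200 Hz and 50 Hz vibrotactile modulations of the ultrasonic carrier.
drive_200 = envelope(200, "sine") * carrier
drive_50 = envelope(50, "rectangular") * carrier
```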
  • the spatial resolution produced by the transducer array is about 8.5 mm when the array is operating at 40 kilohertz (kHz).
  • the haptic array device comprises one or more lasers for providing haptic feedback.
  • a laser emits energy at a wavelength of about 10 nm to about 10,000 nm.
  • a laser has a frequency of about 0.3 THz to about 300 THz.
  • a power output of the laser is about 0.16 watts (W).
  • a power output of the laser is about 0.01 W to about 0.5 W.
  • a power output of the laser is about 0.01 W to about 0.05 W, about 0.01 W to about 0.1 W, about 0.01 W to about 0.13 W, about 0.01 W to about 0.16 W, about 0.01 W to about 0.2 W, about 0.01 W to about 0.3 W, about 0.01 W to about 0.5 W, about 0.05 W to about 0.1 W, about 0.05 W to about 0.13 W, about 0.05 W to about 0.16 W, about 0.05 W to about 0.2 W, about 0.05 W to about 0.3 W, about 0.05 W to about 0.5 W, about 0.1 W to about 0.13 W, about 0.1 W to about 0.16 W, about 0.1 W to about 0.2 W, about 0.1 W to about 0.3 W, about 0.1 W to about 0.5 W, about 0.13 W to about 0.16 W, about 0.13 W to about 0.2 W, about 0.13 W to about 0.3 W, about 0.13 W to about 0.5 W, about 0.16 W to about 0.2 W, about 0.16 W to about 0.3 W, about 0.16 W to about 0.5 W, about 0.2 W to about 0.3 W, about 0.2 W to about 0.5 W, or about 0.3 W to about 0.5 W.
  • a power output of the laser is about 0.01 W, about 0.05 W, about 0.1 W, about 0.13 W, about 0.16 W, about 0.2 W, about 0.3 W, or about 0.5 W, including increments therebetween. In some embodiments, a power output of the laser is at least about 0.01 W, about 0.05 W, about 0.1 W, about 0.13 W, about 0.16 W, about 0.2 W, or about 0.3 W, including increments therebetween. In some embodiments, a power output of the laser is at most about 0.05 W, about 0.1 W, about 0.13 W, about 0.16 W, about 0.2 W, about 0.3 W, or about 0.5 W, including increments therebetween.
  • low laser power levels prevent damage to the skin of a user.
  • the sensation produced by the laser system may be similar to an electric sensation.
  • the haptic feedback from the laser is caused by evaporation from a non-thermal shockwave produced on the skin.
  • duration of laser exposure is limited to prevent damage to the skin.
  • a haptic laser system comprises at least one laser light source.
  • the haptic laser system comprises optical elements such as lenses, mirrors, polarizers, filters, and the like to manipulate light from a light source and direct it to a target location.
  • a haptic laser system comprises galvo-mirrors for precise positioning of the laser energy.
  • a laser system comprises a computer-controlled optical phased array comprising pixels that modulate a laser beam’s intensity, phase, or both.
  • the haptic array device utilizes a combination of electromagnetic energy and pressure from mechanical waves to produce unique sensations for a user.
  • the ultrasonic transducers can produce pressure in larger areas (e.g., about 30 cm areas).
  • the laser haptics systems produce sensations in more focused areas (e.g., down to 1 micron). Therefore, a combination of laser and ultrasonic transducer systems may produce focused haptics at different scales simultaneously. For example, if a target is a hand of a user, the ultrasonic haptic system may produce a pressure sensation on the palm of the hand, while the laser haptic system focuses a sensation on a fingertip of the user. Such a configuration may be useful in confirming registration or detection of various parts of the hand when being used in combination with a gesture registration system.
  • lasers of the haptic array device are utilized to produce visualizations.
  • constructive interference produced by a laser emission system is utilized to generate 3D images or holograms.
  • a 3D image or hologram is utilized to help guide a user when the haptic array device is being used as a controller or for gesture recognition.
  • a 3D image or hologram is utilized to help guide a user when an external device is being used as a controller or for gesture recognition.
  • a 3D image may be produced to guide a user’s hand to the center of an image when a camera (either incorporated or external to the haptic array device) is being utilized for gesture recognition.
  • a haptic array device utilizes a laser system to produce both haptic and visual effects.
  • the haptic feedback is provided as the user interacts with a 3D image or hologram.
  • one or more sensors are provided to monitor interaction with the haptic array device.
  • a monitoring system comprising one or more sensors is provided to monitor whether a user position, a user motion, or both is outside a security threshold from a set user position, a set user motion, or both.
  • Monitoring sensors may include optical sensors (e.g., cameras), thermal sensors, audio sensors, vibrational/mechanical sensors, and the like which are directed toward a target (e.g., a target area, volume, or a portion of a subject 490).
  • an interferometer is utilized as a monitoring sensor, as disclosed herein.
  • a monitoring system comprises a camera.
  • the camera captures data at a rate of about 10 Hz to 10,000 Hz.
  • the camera comprises a two-dimensional camera.
  • the camera comprises a three-dimensional camera.
  • the camera is an infrared camera, a near infrared camera, a visible light camera, an ultra-violet spectrum camera, or any combination thereof.
  • the camera is coupled to a central processing unit (CPU) of the system, as disclosed herein.
  • the camera may be utilized for gesture recognition.
  • haptic feedback is provided by the haptic array device in response to position or movement of a target within the field of view of the camera.
  • feature detection and extraction methods are utilized to identify a region of interest on the target.
  • regions of interest may include a finger, palm, thumb, fingertip, etc. of a user.
  • feature detection and extraction methods comprise computational processing of images to analyze contrasts in pixel brightness to recognize features.
  • Feature detection and extraction methods may include edge detection, corner detection, blob detection, ridge detection, and combinations thereof.
  • an edge detection algorithm is utilized to identify an outline or border of a target.
  • a nearest neighbor, thresholding, clustering, partial differential equation, and/or other digital image processing methods are utilized to identify an outline or border of a target.
  • Canny, Deriche, differential, Sobel, Prewitt, and Roberts cross edge detection techniques may be utilized to identify target or a portion thereof.
  • Gaussian or Laplacian techniques are utilized to smooth or improve the accuracy of the identified target or portion thereof.
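  • For instance, a minimal OpenCV (version 4) sketch combining Gaussian smoothing with Canny edge detection to outline a target in a camera frame; the file name and threshold values are illustrative placeholders, not from the application.

```python
import cv2

frame = cv2.imread("hand.png", cv2.IMREAD_GRAYSCALE)   # one camera frame (placeholder file)

# Gaussian smoothing suppresses sensor noise before edge detection.
blurred = cv2.GaussianBlur(frame, (5, 5), sigmaX=1.5)

# Canny detects edges from contrasts in pixel brightness.
edges = cv2.Canny(blurred, threshold1=50, threshold2=150)

# Take the largest contour as the outline/border of the target.
contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
outline = max(contours, key=cv2.contourArea)
```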
  • additional sensors are utilized to enhance or supplement the performance of the haptic array device or monitoring system thereof.
  • ancillary sensors comprise wearable sensors which are attached to a user to receive additional data generated by movements or electrical signals (e.g., electromyographic (EMG), electroencephalographic (EEG), etc.) produced by a user.
  • a wearable ancillary sensor comprises one or more motion sensors.
  • the motion sensors comprise an accelerometer, a gyroscope, or a combination thereof.
  • a wearable ancillary sensor array is configured to couple to an appendage, limb, or extremity of a user.
  • an existing device comprising one or more motion sensors (e.g., a smart watch) is coupled to the haptic array device to act as an ancillary sensor device.
  • additional bioinformatics are acquired by the ancillary sensors such as heart rate, body temperature, blood pressure, or a combination thereof.
  • a wearable ancillary sensor array is configured to be worn on a head of a user.
  • a wearable ancillary sensor array comprising one or more EEG sensors is configured to place the EEG sensors in proximity to the scalp of a user and receive electric signals produced by the brain of the user.
  • the EEG sensors do not require direct contact with the skin (e.g., no need for shaving of the head) or a gel to be applied to the scalp.
  • the ancillary sensors are used to confirm or verify actions or gestures made by a user.
  • bioinformatic information obtained by the ancillary sensors is recorded and stored in a memory of the system.
  • a bio-haptic security system is provided by the devices and methods disclosed herein.
  • a haptic array device is utilized in a bio-haptic security system.
  • the haptic array device comprises a monitoring system and is utilized for gesture recognition as part of bio-haptic security methods and systems.
  • the monitoring system further identifies features of a portion of a user for authentication. Authentication of a user may allow for access to secured information, access to secured accounts, or unlocking/locking of an associated device. Authentication or verification of a user may further comprise recognition of a series of movements performed by the user and detected by a monitoring system.
  • haptic feedback is provided to help guide the user.
  • haptic feedback may be utilized to confirm that the monitoring system has identified a target portion of a user or confirm that a step of an unlocking process has been completed and detected.
  • haptic feedback is utilized to confirm a portion of the user is properly in view of the monitoring system.
  • a laser system of the haptic array provides a visualization (e.g., a hologram or 3D image).
  • a user may interact with the visualization produced by the laser system to unlock or access a secure account, device, or information.
  • haptic feedback is provided as a user interacts with the visualization to confirm that the user’s actions are properly registered.
  • unlocking gestures or movements are timed with haptic feedback to provide a unique series of movements as an unlocking feature.
  • an authentication may be provided as one or more gestures made by a user when the user receives a specific type of haptic feedback or haptic feedback at a specific location on a portion of their body.
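  • A sketch of such timing-coupled verification under assumptions: each enrolled step pairs a haptic cue time with an expected gesture, and verification succeeds only if every gesture is observed within a tolerance window of its cue. All names and values are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Step:
    cue_time: float      # seconds after sequence start when haptic feedback fires
    gesture: str         # gesture expected in response (e.g., "pinch", "swipe")

def verify_sequence(enrolled: list[Step],
                    observed: list[tuple[float, str]],
                    tolerance: float = 0.5) -> bool:
    """True if every observed (time, gesture) matches the enrolled steps in order."""
    if len(observed) != len(enrolled):
        return False
    return all(
        gesture == step.gesture and abs(time - step.cue_time) <= tolerance
        for step, (time, gesture) in zip(enrolled, observed)
    )

# Example: pinch at the 1 s cue, swipe at the 3 s cue.
enrolled = [Step(1.0, "pinch"), Step(3.0, "swipe")]
print(verify_sequence(enrolled, [(1.2, "pinch"), (2.9, "swipe")]))   # True
```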
  • an ancillary sensor comprising one or more EEG sensors is utilized to supplement a bio-haptic security system.
  • EEG sensors monitor brain activity of a user while cues are provided or while the user engages with the haptic array. The unique brain activity of an individual user may be recorded and utilized as an additional security measure to correctly identify the user.
  • acoustic signals emitted from the haptic array device are unique to an individual’s anatomical transfer function (ATF) or head-related transfer function (HRTF).
  • the acoustic signals may be emitted based on an individual’s head and ear shape, such that only the specific individual may be able to hear cues or commands emitted from the device.
  • a frequency of the emitted acoustic signals is based on the anatomical transfer function of the individual.
  • cues are emitted to produce a three-dimensional sound construction based on a user’s ATF.
  • a user is instructed to perform specified gestures or movements to be detected by the bio-haptic security system by cues customized to the user’s ATF.
  • Coupling of a haptic array device to a secure device or account may be carried out through wired or wireless communication. Unlocking of a device may be carried out through a combination of physical hardware and electronic verification using the haptic array device.
  • a secure device may include a mobile device, computer, safe, door, safety deposit box, or any device for which a secure digital locking system may be desired.
  • a haptic array device is utilized to grant access to a secured area, container, or the like.
  • the bio-haptic security system may be implemented to grant a user access to a storage unit, locker, secured file library, etc.
  • a range should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range.
  • description of a range such as from 1 to 6 should be considered to have specifically disclosed subranges such as from 1 to 3, from 1 to 4, from 1 to 5, from 2 to 4, from 2 to 6, from 3 to 6 etc., as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.
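  • This convention can be expressed mechanically; a short sketch enumerating all pairwise subranges implied by a set of disclosed endpoints (the same pattern the transducer-count and laser-power lists above follow):

```python
from itertools import combinations

def subranges(endpoints):
    """All (low, high) subranges implied by a list of disclosed endpoints."""
    return [(lo, hi) for lo, hi in combinations(sorted(endpoints), 2)]

print(subranges([1, 2, 3, 4, 5, 6]))
# [(1, 2), (1, 3), (1, 4), (1, 5), (1, 6), (2, 3), ..., (5, 6)]
```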
  • a sample includes a plurality of samples, including mixtures thereof.
  • “acoustic,” “sound,” or “sonic” are often used interchangeably herein to refer to mechanical pressure waves. Unless specified, the terms “acoustic” and “sonic” should broadly read on waveforms ranging through all sonic frequency ranges, including audible, inaudible, and ultrasonic frequencies.
  • the term “about” a number refers to that number plus or minus 10% of that number.
  • the term “about” a range refers to that range minus 10% of its lowest value and plus 10% of its greatest value.
  • the haptic array device 500 comprises an array of transducers 550 for producing sonic haptics, as described.
  • the array 550 is an ultrasonic transducer array, as disclosed herein.
  • the haptic array device 500 further comprises laser systems 511, 512, 513.
  • the haptic array device 500 further comprises an integrated monitoring system 520.
  • the haptic array device 500 is configured to provide haptic feedback or sensations to an object or focal point 505.
  • the object 505 is a portion of a user, such as a hand.
  • the laser systems may be configured to produce haptics, three-dimensional visualizations (i.e., holograms), or both.
  • a hologram is produced by two of the laser systems functioning as optical emitters and using constructive interference to produce a 3D rendering.
  • a third laser system produces haptic feedback while the other two laser systems produce the hologram.
  • laser systems 511 and 512 may produce a hologram while laser system 513 provides haptic feedback to a target area 505.
  • monitoring system 520 comprises one or more sensors for monitoring an object or an object’s response to the provided haptics, as disclosed herein.
  • the one or more sensors of the monitoring system may comprise optical sensors (e.g., cameras), thermal sensors, audio sensors, vibrational/mechanical sensors, and the like which are directed toward a target 505.
  • the monitoring system is coupled to a computer system which identifies and tracks the target 505 and/or portions thereof, as disclosed herein.
  • While the haptic array device 500 depicted in FIG. 5 has fully integrated components, it should be appreciated that the components may not be integrated or may be separate from the device. Further, it should be appreciated that the device may be supplemented with further components (such as additional ultrasound transducer arrays) or additional haptic array devices of the same or a similar type.
  • the bio-haptic security system comprises a screen 710 to provide visual instructions or cues.
  • screen 710 is a touchscreen, enabling a user to interact with the touchscreen, navigate menus, confirm instructions, provide information, etc.
  • the bio-haptic security system comprises an ultrasonic haptic transducer array 750, as disclosed herein.
  • the bio-haptic security system 700 comprises one or more monitoring sensors 720.
  • the monitoring sensors may comprise optical sensors such as an interferometer, camera, infrared camera, etc., as disclosed herein.
  • the monitoring sensors 720 comprise at least two optical sensors (e.g., cameras) to provide detection of an appendage (e.g., a hand) of a user in a three-dimensional space, such that the user’s movements and positioning of their appendage can be accurately detected and/or tracked.
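  • A sketch of why two cameras suffice for three-dimensional tracking, using the standard stereo disparity relation Z = f·B/d (the application does not specify its camera geometry; the focal length and baseline below are placeholders):

```python
def stereo_depth(x_left: float, x_right: float,
                 focal_length_px: float = 800.0,
                 baseline_m: float = 0.06) -> float:
    """Depth (m) of a feature seen at pixel columns x_left/x_right in rectified views.

    Standard pinhole stereo relation: Z = f * B / disparity.
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("feature must have positive disparity")
    return focal_length_px * baseline_m / disparity

# A fingertip at columns 420 (left) and 380 (right) -> 40 px disparity.
print(stereo_depth(420, 380))   # 1.2 m
```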
  • the bio-haptic security system 700 comprises one or more speakers 730 for providing audio instructions or cues to a user.
  • instructions are provided by the screen 710 and/or speakers 730 instructing a user to position an appendage, perform a movement/gesture, interact with ultrasonic haptics, etc. as part of a user verification, as disclosed herein.
  • the speakers 730 are directional speakers.
  • the speakers utilize constructive interference to direct instructions to the user without disturbing nearby individuals or allowing them or audio recording devices to detect the audio instructions.
  • an audio output is customized based on a user’s anatomical transfer function.
  • Referring to FIG. 1, a block diagram is shown depicting an exemplary machine that includes a computer system 100 (e.g., a processing or computing system) within which a set of instructions can execute for causing a device to perform or execute any one or more of the aspects and/or methodologies of the present disclosure.
  • the components in FIG. 1 are examples only and do not limit the scope of use or functionality of any hardware, software, embedded logic component, or a combination of two or more such components implementing particular embodiments.
  • Computer system 100 may include one or more processors 101, a memory 103, and a storage 108 that communicate with each other, and with other components, via a bus 140.
  • the bus 140 may also link a display 132, one or more input devices 133 (which may, for example, include a keypad, a keyboard, a mouse, a stylus, etc.), one or more output devices 134, one or more storage devices 135, and various tangible storage media 136. All of these elements may interface directly or via one or more interfaces or adaptors to the bus 140.
  • the various tangible storage media 136 can interface with the bus 140 via storage medium interface 126.
  • Computer system 100 may have any suitable physical form, including but not limited to one or more integrated circuits (ICs), printed circuit boards (PCBs), mobile handheld devices (such as mobile telephones or PDAs), laptop or notebook computers, distributed computer systems, computing grids, or servers.
  • Computer system 100 includes one or more processor(s) 101 (e.g., central processing units (CPUs), general purpose graphics processing units (GPGPUs), or quantum processing units (QPUs)) that carry out functions.
  • processor(s) 101 optionally contains a cache memory unit 102 for temporary local storage of instructions, data, or computer addresses.
  • Processor(s) 101 are configured to assist in execution of computer readable instructions.
  • Computer system 100 may provide functionality for the components depicted in FIG. 1 as a result of the processor(s) 101 executing non-transitory, processor-executable instructions embodied in one or more tangible computer-readable storage media, such as memory 103, storage 108, storage devices 135, and/or storage medium 136.
  • the computer-readable media may store software that implements particular embodiments, and processor(s) 101 may execute the software.
  • Memory 103 may read the software from one or more other computer-readable media (such as mass storage device(s) 135, 136) or from one or more other sources through a suitable interface, such as network interface 120.
  • the software may cause processor(s) 101 to carry out one or more processes or one or more steps of one or more processes described or illustrated herein. Carrying out such processes or steps may include defining data structures stored in memory 103 and modifying the data structures as directed by the software.
  • the memory 103 may include various components (e.g., machine readable media) including, but not limited to, a random-access memory component (e.g., RAM 104) (e.g., static RAM (SRAM), dynamic RAM (DRAM), ferroelectric random access memory (FRAM), phase-change random access memory (PRAM), etc.), a read-only memory component (e.g., ROM 105), and any combinations thereof.
  • ROM 105 may act to communicate data and instructions unidirectionally to processor(s) 101.
  • RAM 104 may act to communicate data and instructions bidirectionally with processor(s) 101.
  • ROM 105 and RAM 104 may include any suitable tangible computer-readable media described below.
  • a basic input/output system 106 (BIOS) including basic routines that help to transfer information between elements within computer system 100, such as during start-up, may be stored in the memory 103.
  • Fixed storage 108 is connected bidirectionally to processor(s) 101, optionally through storage control unit 107.
  • Fixed storage 108 provides additional data storage capacity and may also include any suitable tangible computer-readable media described herein.
  • Storage 108 may be used to store operating system 109, executable(s) 110, data 111, applications 112 (application programs), and the like.
  • Storage 108 can also include an optical disk drive, a solid-state memory device (e.g., flash-based systems), or a combination of any of the above.
  • Information in storage 108 may, in appropriate cases, be incorporated as virtual memory in memory 103.
  • storage device(s) 135 may be removably interfaced with computer system 100 (e.g., via an external port connector (not shown)) via a storage device interface 125.
  • storage device(s) 135 and an associated machine-readable medium may provide non-volatile and/or volatile storage of machine-readable instructions, data structures, program modules, and/or other data for the computer system 100.
  • software may reside, completely or partially, within a machine-readable medium on storage device(s) 135.
  • software may reside, completely or partially, within processor(s) 101.
  • Bus 140 connects a wide variety of subsystems.
  • reference to a bus may encompass one or more digital signal lines serving a common function, where appropriate.
  • Bus 140 may be any of several types of bus structures including, but not limited to, a memory bus, a memory controller, a peripheral bus, a local bus, and any combinations thereof, using any of a variety of bus architectures.
  • such architectures include an Industry Standard Architecture (ISA) bus, an Enhanced ISA (EISA) bus, a Micro Channel Architecture (MCA) bus, a Video Electronics Standards Association local bus (VLB), a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, an Accelerated Graphics Port (AGP) bus, a HyperTransport (HTX) bus, a serial advanced technology attachment (SATA) bus, and any combinations thereof.
  • Computer system 100 may also include an input device 133.
  • a user of computer system 100 may enter commands and/or other information into computer system 100 via input device(s) 133.
  • Examples of an input device(s) 133 include, but are not limited to, an alpha-numeric input device (e.g., a keyboard), a pointing device (e.g., a mouse or touchpad), a touchpad, a touch screen, a multi-touch screen, a joystick, a stylus, a gamepad, an audio input device (e.g., a microphone, a voice response system, etc.), an optical scanner, a video or still image capture device (e.g., a camera), and any combinations thereof.
  • the input device is a Kinect, Leap Motion, or the like.
  • Input device(s) 133 may be interfaced to bus 140 via any of a variety of input interfaces 123 (e.g., input interface 123) including, but not limited to, serial, parallel, game port, USB, FIREWIRE, THUNDERBOLT, or any combination of the above.
  • when computer system 100 is connected to network 130, computer system 100 may communicate with other devices, specifically mobile devices and enterprise systems, distributed computing systems, cloud storage systems, cloud computing systems, and the like, connected to network 130. Communications to and from computer system 100 may be sent through network interface 120.
  • network interface 120 may receive incoming communications (such as requests or responses from other devices) in the form of one or more packets (such as Internet Protocol (IP) packets) from network 130, and computer system 100 may store the incoming communications in memory 103 for processing.
  • Computer system 100 may similarly store outgoing communications (such as requests or responses to other devices) in the form of one or more packets in memory 103 and communicate them to network 130 through network interface 120.
  • Processor(s) 101 may access these communication packets stored in memory 103 for processing.
  • Examples of the network interface 120 include, but are not limited to, a network interface card, a modem, and any combination thereof.
  • Examples of a network 130 or network segment 130 include, but are not limited to, a distributed computing system, a cloud computing system, a wide area network (WAN) (e.g., the Internet, an enterprise network), a local area network (LAN) (e.g., a network associated with an office, a building, a campus or other relatively small geographic space), a telephone network, a direct connection between two computing devices, a peer-to-peer network, and any combinations thereof.
  • a network, such as network 130 may employ a wired and/or a wireless mode of communication. In general, any network topology may be used.
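To make the packet flow described above concrete, the following is a minimal Python sketch of a server that receives a request from the network into memory, processes it, and sends a response; the host, port, buffer size, and echo-style "processing" step are illustrative assumptions, not anything specified in this disclosure.

```python
import socket

# Minimal sketch: buffer an incoming communication in memory, process it,
# and send the outgoing communication back through the network interface.
# HOST, PORT, and the uppercase "processing" step are hypothetical choices.
HOST, PORT = "127.0.0.1", 8080

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind((HOST, PORT))
    server.listen()
    conn, addr = server.accept()       # wait for an incoming connection
    with conn:
        incoming = conn.recv(4096)     # incoming packets stored in memory
        response = incoming.upper()    # placeholder processing step
        conn.sendall(response)         # outgoing communication to the network
```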
  • Information and data can be displayed through a display 132.
  • Examples of a display 132 include, but are not limited to, a cathode ray tube (CRT), a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display such as a passive-matrix OLED (PMOLED) or active-matrix OLED (AMOLED) display, a plasma display, and any combinations thereof.
  • the display 132 can interface to the processor(s) 101, memory 103, and fixed storage 108, as well as other devices, such as input device(s) 133, via the bus 140.
  • the display 132 is linked to the bus 140 via a video interface 122, and transport of data between the display 132 and the bus 140 can be controlled via the graphics control 121.
  • the display is a video projector.
  • the display is a head-mounted display (HMD) such as a VR headset.
  • suitable VR headsets include, by way of non-limiting examples, HTC Vive, Oculus Rift, Samsung Gear VR, Microsoft HoloLens, Razer OSVR, FOVE VR, Zeiss VR One, Avegant Glyph, Freefly VR headset, and the like.
  • the display is a combination of devices such as those disclosed herein.
  • computer system 100 may include one or more other peripheral output devices 134 including, but not limited to, an audio speaker, a printer, a storage device, and any combinations thereof. Such peripheral output devices may be connected to the bus 140 via an output interface 124. Examples of an output interface 124 include, but are not limited to, a serial port, a parallel connection, a USB port, a FIREWIRE port, a THUNDERBOLT port, and any combinations thereof.
  • In addition or as an alternative, computer system 100 may provide functionality as a result of logic hardwired or otherwise embodied in a circuit, which may operate in place of or together with software to execute one or more processes or one or more steps of one or more processes described or illustrated herein.
  • references to software in this disclosure may encompass logic, and reference to logic may encompass software.
  • reference to a computer-readable medium may encompass a circuit (such as an IC) storing software for execution, a circuit embodying logic for execution, or both, where appropriate.
  • the present disclosure encompasses any suitable combination of hardware, software, or both.
  • the logic described herein may be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), or a field programmable gate array (FPGA).
  • a general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
  • a processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
  • a software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
  • An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium.
  • the storage medium may be integral to the processor.
  • the processor and the storage medium may reside in an ASIC.
  • the ASIC may reside in a user terminal.
  • the processor and the storage medium may reside as discrete components in a user terminal.
  • suitable computing devices include, by way of non-limiting examples, server computers, desktop computers, laptop computers, notebook computers, sub-notebook computers, netbook computers, netpad computers, set-top computers, media streaming devices, handheld computers, Internet appliances, mobile smartphones, tablet computers, personal digital assistants, video game consoles, and vehicles.
  • Suitable tablet computers include those with booklet, slate, and convertible configurations, known to those of skill in the art.
  • the computing device includes an operating system configured to perform executable instructions.
  • the operating system is, for example, software, including programs and data, which manages the device’s hardware and provides services for execution of applications.
  • server operating systems include, by way of non-limiting examples, FreeBSD, OpenBSD, NetBSD®, Linux, Apple® Mac OS X Server®, Oracle® Solaris®, Windows Server®, and Novell® NetWare®.
  • suitable personal computer operating systems include, by way of non-limiting examples, Microsoft® Windows®, Apple® Mac OS X®, UNIX®, and UNIX-like operating systems such as GNU/Linux®.
  • the operating system is provided by cloud computing.
  • suitable mobile smartphone operating systems include, by way of non-limiting examples, Nokia® Symbian® OS, Apple® iOS®, Research In Motion® BlackBerry OS®, Google® Android®, Microsoft® Windows Phone® OS, Microsoft® Windows Mobile® OS, Linux®, and Palm® WebOS®.
  • suitable media streaming device operating systems include, by way of non-limiting examples, Apple TV®, Roku®, Boxee®, Google TV®, Google Chromecast®, Amazon Fire®, and Samsung® HomeSync®.
  • suitable video game console operating systems include, by way of non-limiting examples, Sony® PS3®, Sony® PS4®, Microsoft® Xbox 360®, Microsoft® Xbox One®, Nintendo® Wii®, Nintendo® Wii U®, and Ouya®.
  • the platforms, systems, media, and methods disclosed herein include one or more non-transitory computer readable storage media encoded with a program including instructions executable by the operating system of an optionally networked computing device.
  • a computer readable storage medium is a tangible component of a computing device.
  • a computer readable storage medium is optionally removable from a computing device.
  • a computer readable storage medium includes, by way of non-limiting examples, CD-ROMs, DVDs, flash memory devices, solid state memory, magnetic disk drives, magnetic tape drives, optical disk drives, distributed computing systems including cloud computing systems and services, and the like.
  • the program and instructions are permanently, substantially permanently, semi-permanently, or non-transitorily encoded on the media.
  • the platforms, systems, media, and methods disclosed herein include at least one computer program, or use of the same.
  • a computer program includes a sequence of instructions, executable by one or more processor(s) of the computing device’s CPU, written to perform a specified task.
  • Computer readable instructions may be implemented as program modules, such as functions, objects, Application Programming Interfaces (APIs), computing data structures, and the like, that perform particular tasks or implement particular abstract data types.
  • a computer program comprises one sequence of instructions. In some embodiments, a computer program comprises a plurality of sequences of instructions. In some embodiments, a computer program is provided from one location. In other embodiments, a computer program is provided from a plurality of locations. In various embodiments, a computer program includes one or more software modules. In various embodiments, a computer program includes, in part or in whole, one or more web applications, one or more mobile applications, one or more standalone applications, one or more web browser plug-ins, extensions, add-ins, or add-ons, or combinations thereof.

Web application
  • a computer program includes a web application.
  • a web application, in various embodiments, utilizes one or more software frameworks and one or more database systems.
  • a web application is created upon a software framework such as Microsoft® .NET or Ruby on Rails (RoR).
  • a web application utilizes one or more database systems including, by way of non-limiting examples, relational, non-relational, object oriented, associative, XML, and document oriented database systems.
  • suitable relational database systems include, by way of non-limiting examples, Microsoft® SQL Server, MySQL™, and Oracle®.
  • a web application, in various embodiments, is written in one or more versions of one or more languages.
  • a web application may be written in one or more markup languages, presentation definition languages, client-side scripting languages, server-side coding languages, database query languages, or combinations thereof.
  • a web application is written to some extent in a markup language such as Hypertext Markup Language (HTML), Extensible Hypertext Markup Language (XHTML), or Extensible Markup Language (XML).
  • a web application is written to some extent in a presentation definition language such as Cascading Style Sheets (CSS).
  • a web application is written to some extent in a client-side scripting language such as Asynchronous JavaScript and XML (AJAX), Flash® ActionScript, JavaScript, or Silverlight®.
  • a web application is written to some extent in a server-side coding language such as Active Server Pages (ASP), ColdFusion®, Perl, Java™, JavaServer Pages (JSP), Hypertext Preprocessor (PHP), Python™, Ruby, Tcl, Smalltalk, WebDNA®, or Groovy.
  • a web application is written to some extent in a database query language such as Structured Query Language (SQL).
  • a web application integrates enterprise server products such as IBM® Lotus Domino®.
  • a web application includes a media player element.
  • a media player element utilizes one or more of many suitable multimedia technologies including, by way of non-limiting examples, Adobe® Flash®, HTML 5, Apple® QuickTime®, Microsoft® Silverlight®, Java™, and Unity®.
  • an application provision system comprises one or more databases 200 accessed by a relational database management system (RDBMS) 210.
  • RDBMSs include Firebird, MySQL, PostgreSQL, SQLite, Oracle Database, Microsoft SQL Server, IBM DB2, IBM Informix, SAP Sybase, Teradata, and the like.
  • the application provision system further comprises one or more application servers 220 (such as Java servers, .NET servers, PHP servers, and the like) and one or more web servers 230 (such as Apache, IIS, GWS and the like).
  • the web server(s) optionally expose one or more web services via application programming interfaces (APIs) 240.
  • an application provision system alternatively has a distributed, cloud-based architecture 300 and comprises elastically load balanced, auto-scaling web server resources 310 and application server resources 320 as well as synchronously replicated databases 330.
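As a toy illustration of such a provision stack, the sketch below stands in for an application server exposing one web API route backed by a relational database; the route, table schema, port, and use of SQLite are assumptions made for a self-contained example, and a deployed system would instead sit behind the load-balanced web and application tiers described above.

```python
import json
import sqlite3
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical single-file stand-in for an application server (220) backed
# by a relational database (200); table name and route are assumptions.
db = sqlite3.connect(":memory:", check_same_thread=False)
db.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
db.execute("INSERT INTO items (name) VALUES ('example')")
db.commit()

class ApiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/api/items":  # the exposed web service API (240)
            rows = db.execute("SELECT id, name FROM items").fetchall()
            body = json.dumps([{"id": i, "name": n} for i, n in rows]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), ApiHandler).serve_forever()
```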
  • a computer program includes a mobile application provided to a mobile computing device.
  • the mobile application is provided to a mobile computing device at the time it is manufactured.
  • the mobile application is provided to a mobile computing device via the computer network described herein.
  • a mobile application is created by techniques known to those of skill in the art using hardware, languages, and development environments known to the art. Those of skill in the art will recognize that mobile applications are written in several languages. Suitable programming languages include, by way of non-limiting examples, C, C++, C#, Objective-C, Java™, JavaScript, Pascal, Object Pascal, Python™, Ruby, VB.NET, WML, and XHTML/HTML with or without CSS, or combinations thereof.
  • Suitable mobile application development environments are available from several sources. Commercially available development environments include, by way of non-limiting examples, AirplaySDK, alcheMo, Appcelerator®, Celsius, Bedrock, Flash Lite, .NET Compact Framework, Rhomobile, and WorkLight Mobile Platform. Other development environments are available without cost including, by way of non-limiting examples, Lazarus, MobiFlex, MoSync, and Phonegap. Also, mobile device manufacturers distribute software developer kits including, by way of non-limiting examples, iPhone and iPad (iOS) SDK, Android™ SDK, BlackBerry® SDK, BREW SDK, Palm® OS SDK, Symbian SDK, webOS SDK, and Windows® Mobile SDK.
  • a computer program includes a standalone application, which is a program that is run as an independent computer process, not an add-on to an existing process, e.g., not a plug-in.
  • standalone applications are often compiled.
  • a compiler is a computer program that transforms source code written in a programming language into object code such as assembly language or machine code. Suitable compiled programming languages include, by way of non-limiting examples, C, C++, Objective-C, COBOL, Delphi, Eiffel, Java™, Lisp, Python™, Visual Basic, and VB.NET, or combinations thereof. Compilation is often performed, at least in part, to create an executable program.
  • a computer program includes one or more executable compiled applications.
  • the computer program includes a web browser plug-in (e.g., extension, etc.).
  • a plug-in is one or more software components that add specific functionality to a larger software application.
  • Makers of software applications support plug-ins to enable third-party developers to create capabilities that extend an application, to support easily adding new features, and to reduce the size of an application.
  • plugins enable customizing the functionality of a software application.
  • plug-ins are commonly used in web browsers to play video, generate interactivity, scan for viruses, and display particular file types.
  • the toolbar comprises one or more web browser extensions, add-ins, or add-ons.
  • the toolbar comprises one or more explorer bars, tool bands, or desk bands.
  • Web browsers are software applications, designed for use with network-connected computing devices, for retrieving, presenting, and traversing information resources on the World Wide Web. Suitable web browsers include, by way of non-limiting examples, Microsoft® Internet Explorer®, Mozilla® Firefox®, Google® Chrome, Apple® Safari®, Opera Software® Opera®, and KDE Konqueror. In some embodiments, the web browser is a mobile web browser.
  • Mobile web browsers are designed for use on mobile computing devices including, by way of non-limiting examples, handheld computers, tablet computers, netbook computers, subnotebook computers, smartphones, music players, personal digital assistants (PDAs), and handheld video game systems.
  • Suitable mobile web browsers include, by way of non-limiting examples, Google® Android® browser, RIM BlackBerry® Browser, Apple® Safari®, Palm® Blazer, Palm® WebOS® Browser, Mozilla® Firefox® for mobile, Microsoft® Internet Explorer® Mobile, Amazon® Kindle® Basic Web, Nokia® Browser, Opera Software® Opera® Mobile, and Sony® PSP™ browser.
  • the platforms, systems, media, and methods disclosed herein include software, server, and/or database modules, or use of the same.
  • software modules are created by techniques known to those of skill in the art using machines, software, and languages known to the art.
  • the software modules disclosed herein are implemented in a multitude of ways.
  • a software module comprises a file, a section of code, a programming object, a programming structure, or combinations thereof.
  • a software module comprises a plurality of files, a plurality of sections of code, a plurality of programming objects, a plurality of programming structures, or combinations thereof.
  • the one or more software modules comprise, by way of non-limiting examples, a web application, a mobile application, and a standalone application.
  • software modules are in one computer program or application. In other embodiments, software modules are in more than one computer program or application. In some embodiments, software modules are hosted on one machine. In other embodiments, software modules are hosted on more than one machine. In further embodiments, software modules are hosted on a distributed computing platform such as a cloud computing platform. In some embodiments, software modules are hosted on one or more machines in one location. In other embodiments, software modules are hosted on one or more machines in more than one location.
  • the platforms, systems, media, and methods disclosed herein include one or more databases, or use of the same.
  • suitable databases include, by way of non-limiting examples, relational databases, non-relational databases, object oriented databases, object databases, entity-relationship model databases, associative databases, XML databases, and document oriented databases. Further non-limiting examples include SQL, PostgreSQL, MySQL, Oracle, DB2, Sybase, and MongoDB.
  • a database is Internet-based.
  • a database is web-based.
  • a database is cloud computing-based.
  • a database is a distributed database.
  • a database is based on one or more local computer storage devices.
  • a user interacts with a haptic array device, as disclosed herein, to perform a series of gestures to unlock a secured system or device.
  • a secured system is an account, database, or other system containing valuables, classified information, personal information, etc.
  • a secured device may be a personal device, such as a phone, computer, or a device comprising a physical lock.
  • the process begins by associating a secured system (e.g., a device or account) with the haptic array device, at step 610. Registration may comprise pairing or coupling of the secured system to the haptic array. Pairing or coupling may be accomplished via a network module of the haptic array device.
  • After the system is associated with the haptic array device, the device is ready to monitor gestures made by a user and register the gestures to create a unique and secure method of unlocking the associated system.
  • the haptic array device is configured to monitor and store a series of gestures which a user makes with one hand.
  • the haptic array device produces a visualization of where the user should place their hand.
  • the visualization may be a marker to indicate where the user should place the center of their palm or hand.
  • the haptic array device provides feedback to confirm the user's hand is in a proper position.
  • the feedback comprises haptic feedback from the transducer array and/or a laser system of the device.
  • the sonic haptics may be applied to the palm to indicate proper positioning and the laser haptics may be applied to each fingertip to indicate that the hand and fingers are in proper position and that the device is ready to monitor the user’s gestures.
  • the user completes a series of gestures or movements which are captured by the monitoring system of the haptic array device, according to some embodiments.
  • a series of gestures might include at least two gestures made by the hand, or at least one movement from a first gesture to a second gesture.
  • the number of gestures is limited by the device.
  • the number of gestures is chosen by the user.
  • completion of each gesture in the series is confirmed by haptic feedback provided by the device.
  • the user repeats the same gestures to confirm the series of gestures they wish to use to unlock the associated system.
  • When the user wishes to unlock the system, they are guided by the device to place their hand, at step 640, such that the device can monitor their gestures properly.
  • at step 650, proper placement of the hand, or an initial hand position, is confirmed by the system.
  • confirmation comprises haptic feedback provided by the device.
  • haptic feedback confirms that the hand and fingers have been detected by the device.
  • at step 660, the user performs their unique, registered series of gestures, which unlocks the associated system at step 670.
  • in some embodiments, the device registers biometric properties of the user's hand along with the series of gestures. For example, the size of the user's hand, the palm lines of a user, and/or fingerprints of a user may also be registered.
  • the biometric properties of the user’s hand are detected and registered by the monitoring system of the device.
  • the biometric information of the user is confirmed during initial placement/positioning of the hand.
  • the bio-haptic security system could be supplemented with additional security measures, such as facial recognition, voice recognition, one- or two-step authentication, etc.; a minimal code sketch of the overall unlock flow is given below.
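The following is a minimal Python sketch of the unlock flow at steps 610 through 670, assuming gestures arrive as already-recognized string labels; the class name, pairing call, salted-hash storage, and two-gesture minimum are illustrative assumptions, and real gesture capture, haptic feedback, and biometric checks are outside its scope.

```python
import hashlib
import hmac

class GestureLock:
    """Illustrative sketch of the gesture-unlock flow (steps 610-670).
    Gesture labels and the hashing scheme are assumptions, not the
    device's actual implementation."""

    def __init__(self):
        self.secured_system_id = None
        self.registered_digest = None  # digest of the enrolled gesture series

    def pair(self, secured_system_id):
        # Step 610: associate a secured system (device or account) with the
        # haptic array device, e.g., via its network module.
        self.secured_system_id = secured_system_id

    @staticmethod
    def _digest(gestures):
        # Assumption: store only a hash of the gesture series, not the raw
        # sequence; the source does not specify how the series is stored.
        return hashlib.sha256("|".join(gestures).encode()).digest()

    def enroll(self, first_pass, second_pass):
        # Steps 620-630: capture the series, then require the user to repeat
        # the same gestures before registering them.
        if len(first_pass) < 2 or first_pass != second_pass:
            return False
        self.registered_digest = self._digest(first_pass)
        return True

    def unlock(self, observed):
        # Steps 640-670: after hand placement is confirmed, compare the
        # observed series against the enrolled digest in constant time.
        if self.registered_digest is None:
            return False
        return hmac.compare_digest(self._digest(observed), self.registered_digest)

# Example: enroll a two-gesture series, then unlock with the same series.
lock = GestureLock()
lock.pair("front-door-account")
assert lock.enroll(["fist", "open-palm"], ["fist", "open-palm"])
assert lock.unlock(["fist", "open-palm"])
```

A fuller design would also bind the digest to the paired system and to the registered biometric properties (hand size, palm lines, fingerprints) rather than to gesture labels alone.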

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed herein are systems, devices, and methods for providing interactive and tactile projections of sound and light in a bio-haptic security system.
PCT/US2023/076714 2022-10-13 2023-10-12 Bio-haptic tactile and interactive projections of sound and light WO2024081803A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263379378P 2022-10-13 2022-10-13
US63/379,378 2022-10-13

Publications (2)

Publication Number Publication Date
WO2024081803A2 true WO2024081803A2 (fr) 2024-04-18
WO2024081803A3 WO2024081803A3 (fr) 2024-05-16

Family

ID=90670359

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2023/076714 WO2024081803A2 (fr) 2022-10-13 2023-10-12 Bio-haptic tactile and interactive projections of sound and light

Country Status (1)

Country Link
WO (1) WO2024081803A2 (fr)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10123753B2 (en) * 2017-03-28 2018-11-13 Coleridge Design Associates Llc Haptic feedback and interface systems for reproducing internal body sounds
EP3665617A4 (fr) * 2017-08-09 2021-07-21 The Board of Trustees of the Leland Stanford Junior University Dispositif de détection biométrique à ultrasons intégré à un système optique
US11740071B2 (en) * 2018-12-21 2023-08-29 Apple Inc. Optical interferometry proximity sensor with temperature variation compensation
US10936073B1 (en) * 2020-01-31 2021-03-02 Dell Products, Lp System and method for generating high-frequency and mid-frequency audible sound via piezoelectric actuators of a haptic keyboard

Also Published As

Publication number Publication date
WO2024081803A3 (fr) 2024-05-16

Similar Documents

Publication Publication Date Title
US10902034B2 (en) Method for populating a map with a plurality of avatars through the use of a mobile technology platform
US10044712B2 (en) Authentication based on gaze and physiological response to stimuli
JP6566906B2 Haptic CAPTCHA
RU2684189C2 Adaptive event recognition
TWI781226B Method, system, and media for detecting spoofing in mobile authentication
CA3133229C Detecting spoofing of facial recognition with mobile devices
US11507248B2 (en) Methods, systems, and media for anti-spoofing using eye-tracking
KR20170052976A Electronic device for performing motion and control method thereof
US20160364321A1 (en) Emulating a user performing spatial gestures
WO2021061275A1 Smart ring
WO2022265750A1 Surface detection via a resonant sensor
US20210209840A1 (en) Generating a 3D Model of a Fingertip for Visual Touch Detection
WO2024081803A2 Bio-haptic tactile and interactive projections of sound and light
WO2021142138A1 Laser speckle force feedback estimation
WO2024081786A2 Interactive and tactile sound and light projections for skin treatment
WO2024081783A1 Virtual and augmented interactive and tactile sound and light projections
CN110928472B Article processing method and apparatus, and electronic device
WO2024081781A1 Interactive and tactile sound and light projections for rehabilitation and training
US11149243B2 (en) Electronic device, wearable device, and method of providing content-based somatic senses using ultrasound
Abate et al. Integrating Gaze Tracking with Augmented Reality on Mobile Devices: A Framework for Enhanced User Interaction
KR20180052330A Method for providing a haptic effect and electronic device therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23878237

Country of ref document: EP

Kind code of ref document: A2