WO2022261031A3 - Dynamic visual optimization - Google Patents

Dynamic visual optimization

Info

Publication number
WO2022261031A3
WO2022261031A3 (PCT/US2022/032407)
Authority
WO
WIPO (PCT)
Prior art keywords
sensory inputs
sensory
user
inputs
those
Prior art date
Application number
PCT/US2022/032407
Other languages
French (fr)
Other versions
WO2022261031A2 (en)
WO2022261031A9 (en)
Inventor
Scott W. Lewis
Original Assignee
Percept Technologies, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Percept Technologies, Inc. filed Critical Percept Technologies, Inc.
Publication of WO2022261031A2 publication Critical patent/WO2022261031A2/en
Publication of WO2022261031A3 publication Critical patent/WO2022261031A3/en
Publication of WO2022261031A9 publication Critical patent/WO2022261031A9/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G02OPTICS
    • G02CSPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C11/00Non-optical adjuncts; Attachment thereof
    • G02C11/10Electronic devices other than hearing aids

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Processing (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

Devices and methods for optimizing sensory inputs so as to allow observation of those inputs while ameliorating constraints generally imposed by sensory-processing or cognitive limits. Devices can include digital eyewear that detects problematic sensory inputs and adjusts one or more of: (A) the sensory inputs themselves, (B) the user's receipt of those sensory inputs, or (C) the user's sensory or cognitive reaction to those sensory inputs. Detecting problematic sensory inputs can include detecting warning signals. Adjusting sensory inputs, or the user's receipt thereof, can include audio/video shading or inverse-shading (for luminance/loudness and for particular frequencies), intermittent strobe presentation of objects, and audio/video object recognition.
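The detect-and-adjust loop the abstract describes — detect a problematic sensory input, then adjust the input itself — can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation; all names, thresholds, and the blending rule are assumptions introduced here.

```python
# Illustrative sketch of shading/inverse-shading for luminance,
# as described in the abstract. All names and thresholds below
# are hypothetical, not taken from the patent.
from dataclasses import dataclass


@dataclass
class ShadingPolicy:
    low: float = 0.2     # below this, boost (inverse-shading)
    high: float = 0.8    # above this, attenuate (shading)
    target: float = 0.5  # comfortable mid-level luminance


def adjust_luminance(sample: float, policy: ShadingPolicy = ShadingPolicy()) -> float:
    """Return an adjusted luminance in [0, 1].

    Samples outside the comfortable band [low, high] are treated as
    "problematic sensory inputs" and blended halfway toward the target
    level; samples inside the band pass through unchanged.
    """
    if sample > policy.high or sample < policy.low:
        return 0.5 * (sample + policy.target)
    return sample
```

The same detect/adjust structure would apply per frequency band for the audio case (loudness in place of luminance), with the policy thresholds supplied by the eyewear's configuration.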

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202117340087A 2021-06-07 2021-06-07
US17/340,087 2021-06-07
US202217833382A 2022-06-06 2022-06-06
US17/833,382 2022-06-06

Publications (3)

Publication Number Publication Date
WO2022261031A2 (en) 2022-12-15
WO2022261031A3 (en) 2023-02-23 (this publication)
WO2022261031A9 (en) 2023-10-19

Family

ID=82932400

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/032407 WO2022261031A2 (en) 2021-06-07 2022-06-06 Dynamic visual optimization

Country Status (1)

Country Link
WO (1) WO2022261031A2 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130278631A1 (en) * 2010-02-28 2013-10-24 Osterhout Group, Inc. 3d positioning of augmented reality information
EP2967324A2 (en) * 2013-03-15 2016-01-20 Percept Technologies, Inc. Enhanced optical and perceptual digital eyewear
WO2016089972A1 (en) * 2014-12-02 2016-06-09 Instinct Performance Llc Wearable sensors with heads up display

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8696113B2 (en) 2005-10-07 2014-04-15 Percept Technologies Inc. Enhanced optical and perceptual digital eyewear

Also Published As

Publication number Publication date
WO2022261031A2 (en) 2022-12-15
WO2022261031A9 (en) 2023-10-19

Similar Documents

Publication Publication Date Title
WO2018094230A3 (en) Methods and systems for neural stimulation via auditory stimulation
WO2019139857A3 (en) Sensor device and method for outputing data indicative of hemodynamics of a user
MX2023006478A (en) Apparatus and method for providing individual sound zones.
MY188901A (en) Method and device for displaying control
NZ595980A (en) A method and system for controlling a device
GB201207903D0 (en) Headset and head mountable display
EP2627097A3 (en) Video/audio switching in a computing device
WO2015126814A3 (en) Content-aware audio modes
WO2020201999A3 (en) Pupil tracking system and method, and digital display device and digital image rendering system and method using same
MY188581A (en) Headtracking for parametric binaural output system and method
MX371222B (en) Apparatus and method for volume control.
EP4250089A3 (en) Systems, methods, apparatus, and articles of manufacture to control audio playback devices
EP3496408A3 (en) Apparatus and method for providing various audio environments in multimedia content playback system
MY201634A (en) Voice signal detection method and apparatus
WO2015153553A3 (en) Situation dependent transient suppression
EP2485218A3 (en) Graphical audio signal control
EP3923784A4 (en) Systems and methods for generating synthetic cardio-respiratory signals
HUP0204070A2 (en) Contour correction device
EP2924970A3 (en) Medical image processing device and method for operating the same
EP3024220A3 (en) Display apparatus and display method
KR20160125840A (en) Smart pillow based on snoring sound
EP4184335A4 (en) Processor, signal adjustment method, and computer system
WO2020123967A3 (en) Gaze-driven recording of video
WO2020144536A3 (en) Eyewear systems, apparatuses, and methods for providing assistance to user
MX2022006246A (en) Systems and methods for noise control.

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE