JP7356596B2 - Combined light intensity based CMOS and event detection sensor for high speed predictive tracking and latency compensation in virtual and augmented reality HMD systems - Google Patents
- Publication number
- JP7356596B2 (Application number: JP2022542462A)
- Authority
- JP
- Japan
- Prior art keywords
- image
- isp
- rgb
- eds
- generate
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/11—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/20—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Optics & Photonics (AREA)
- Computing Systems (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
- Image Analysis (AREA)
Description
Claims (20)
- 1. A system comprising at least one virtual reality or augmented reality (AR) head-mounted display (HMD), the at least one virtual reality or AR HMD comprising:
at least one camera unit having one or more camera pixels that sense illumination intensity, the camera unit being configured to generate red-green-blue (RGB) images;
at least one image signal processor (ISP); and
at least one event detection sensor (EDS),
wherein the at least one EDS is configured to use changes in illumination intensity sensed by the one or more camera pixels as an indication of motion and to output to the ISP a signal representing the change in sensed illumination intensity, thereby enabling the ISP to control the camera unit to generate a first RGB image at current time = t and to generate, by extrapolation based on the first RGB image and the signal, a predicted image for future time = t + D, D being a time value.
- 2. The system of claim 1, wherein the camera unit is configured to generate infrared (IR) images.
- 3. The system of claim 1, wherein the camera unit, the ISP, and the EDS are disposed on a single chip.
- 4. The system of claim 1, wherein processing of the camera unit, the ISP, and the EDS is performed by a digital signal processor (DSP).
- 5. The system of claim 1, wherein the ISP is configured to receive the time value D from at least one application associated with the HMD.
- 6. The system of claim 5, wherein the ISP is configured with instructions executable by the ISP to return the first RGB image and the predicted image to the application.
- 7. The system of claim 1, wherein the ISP is configured with instructions executable by the ISP to generate a second RGB image at the future time t + D.
- 8. The system of claim 7, wherein the ISP is configured with instructions executable by the ISP to:
execute at least one neural network (NN) to generate the predicted image; and
feed back to the NN the difference between the first RGB image and the second RGB image.
- 9. A system comprising:
at least one camera unit having one or more camera pixels that sense illumination intensity, the camera unit being configured to generate red-green-blue (RGB) images and/or infrared (IR) images;
at least one image signal processor (ISP); and
at least one event detection sensor (EDS),
wherein the at least one EDS is configured to use changes in illumination intensity sensed by the one or more camera pixels as an indication of motion and to output a signal enabling the ISP to control the camera unit to generate a first RGB image and/or a first IR image at current time = t and to generate, by extrapolation based on the first RGB image and/or the first IR image and the change in illumination intensity, a predicted image for future time = t + D.
- 10. The system of claim 9, wherein the camera unit is configured to generate RGB images.
- 11. The system of claim 9, wherein the camera unit, the ISP, and the EDS are disposed on a single chip.
- 12. The system of claim 9, wherein the camera unit, the ISP, and the EDS are implemented in a digital signal processor (DSP).
- 13. The system of claim 9, wherein the ISP is configured to receive D from at least one application associated with a head-mounted display (HMD).
- 14. The system of claim 13, wherein the ISP is configured with instructions executable by the ISP to return the first RGB image and the predicted image to the application.
- 15. The system of claim 9, wherein the ISP is configured with instructions executable by the ISP to generate a second image at the future time t + D.
- 16. The system of claim 15, wherein the ISP is configured with instructions executable by the ISP to:
execute at least one neural network (NN) to generate the predicted image; and
feed back the second image to the NN to train the NN.
- 17. The system of claim 9, wherein the signal output by the EDS represents a change in illumination intensity.
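Claims 8 and 16 describe running a neural network to produce the predicted image and feeding a later captured frame (or its difference from the prediction) back to train the network. A minimal toy sketch of that feedback loop follows; the single learned gain, class name, learning rate, and update rule are illustrative assumptions, since the patent leaves the NN architecture unspecified:

```python
import numpy as np

class ToyPredictor:
    """Toy stand-in for the neural network of claims 8/16.

    A single learned gain `w` scales the event-driven extrapolation;
    a real implementation would use an actual NN here.
    """

    def __init__(self, lr=0.01):
        self.w = 1.0   # learned scale applied to the EDS signal
        self.lr = lr

    def predict(self, first_image, eds_signal, d):
        # Extrapolate the frame captured at t to a predicted frame at t + d.
        return first_image + self.w * d * eds_signal

    def feed_back(self, predicted, second_image, eds_signal, d):
        # Claims 8/16: the second image actually captured at t + d is fed
        # back; here, one gradient step on the mean squared prediction error.
        error = second_image - predicted
        self.w += self.lr * np.mean(error * d * eds_signal)
```

For example, if every pixel brightened more than predicted, `feed_back` raises `w`, so the next extrapolation moves further per unit of event signal.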
- 18. A method comprising:
receiving a time D;
generating a first image at a current time;
receiving a signal representing a change in light intensity sensed by one or more camera pixels, the signal comprising zero to indicate no change in the light intensity, a negative value to indicate that the light intensity is decreasing, and a positive value to indicate that the light intensity is increasing; and
using the first image and the signal representing the change in light intensity to generate a predicted image for a future time equal to the current time plus D.
- 19. The method of claim 18, comprising returning the first image and the predicted image to at least one application associated with a head-mounted display (HMD).
- 20. The method of claim 18, comprising:
generating the predicted image using at least one neural network (NN);
generating a second image at a time equal to the current time at which the first image was generated plus D; and
providing the second image to the NN to train the NN.
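The method of claim 18 can be sketched in a few lines: receive D, capture a first frame at the current time, receive the per-pixel intensity-change signal (zero for no change, negative for decreasing, positive for increasing), and extrapolate a predicted frame for time t + D. The `rate` scaling factor and array shapes below are illustrative assumptions, not from the patent:

```python
import numpy as np

def predict_frame(first_image, eds_signal, d, rate=1.0):
    """Sketch of the claim-18 method: extrapolate a predicted image for
    future time t + d from the frame captured at current time t.

    eds_signal holds per-pixel change indications: 0 for no change,
    negative for decreasing intensity, positive for increasing intensity.
    rate is a hypothetical intensity-change-per-unit-time scale.
    """
    predicted = first_image.astype(float) + rate * d * eds_signal
    return np.clip(predicted, 0, 255).astype(np.uint8)

# Example: a flat gray frame with one brightening and one darkening pixel.
frame = np.full((2, 2), 100, dtype=np.uint8)
signal = np.array([[0, 1], [-1, 0]])
print(predict_frame(frame, signal, d=10))
```

With `d = 10` the brightening pixel is pushed up by 10 and the darkening pixel down by 10, while the unchanged pixels keep their value; the `np.clip` keeps the extrapolation inside the 8-bit intensity range.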
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/741,285 US11635802B2 (en) | 2020-01-13 | 2020-01-13 | Combined light intensity based CMOS and event detection sensor for high speed predictive tracking and latency compensation in virtual and augmented reality HMD systems |
US16/741,285 | 2020-01-13 | ||
PCT/US2021/012693 WO2021146113A1 (en) | 2020-01-13 | 2021-01-08 | Combined light intensity based cmos and event detection sensor for high speed predictive tracking and latency compensation in virtual and augmented reality hmd systems |
Publications (2)
Publication Number | Publication Date |
---|---|
JP2023512166A (ja) | 2023-03-24 |
JP7356596B2 true JP7356596B2 (ja) | 2023-10-04 |
Family
ID=76763632
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2022542462A Active JP7356596B2 (ja) Combined light intensity based CMOS and event detection sensor for high speed predictive tracking and latency compensation in virtual and augmented reality HMD systems
Country Status (5)
Country | Link |
---|---|
US (1) | US11635802B2 (ja) |
EP (1) | EP4091015A4 (ja) |
JP (1) | JP7356596B2 (ja) |
CN (1) | CN114981706A (ja) |
WO (1) | WO2021146113A1 (ja) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US12029971B2 (en) | 2022-06-10 | 2024-07-09 | Sony Interactive Entertainment LLC | Hybrid pixel dynamic vision sensor tracking using IR and ambient light (or depth sensor) |
US11995226B2 (en) | 2022-06-10 | 2024-05-28 | Sony Interactive Entertainment Inc. | Dynamic vision sensor tracking based on light source occlusion |
US12064682B2 (en) * | 2022-06-10 | 2024-08-20 | Sony Interactive Entertainment Inc. | Deployment of dynamic vision sensor hybrid element in method for tracking a controller and simultaneous body tracking, slam or safety shutter |
US12059609B2 (en) | 2022-06-10 | 2024-08-13 | Sony Interactive Entertainment Inc. | Asynchronous dynamic vision sensor LED AI tracking system and method |
JP2024147359A (ja) * | 2023-04-03 | 2024-10-16 | Canon Inc. | Image processing apparatus, system, image processing method, and device |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140010305A1 (en) | 2012-07-03 | 2014-01-09 | Samsung Electronics Co., Ltd. | Method of multi-view video sequence coding/decoding based on adaptive local correction of illumination of reference frames without transmission of additional parameters (variants) |
US20160026253A1 (en) | 2014-03-11 | 2016-01-28 | Magic Leap, Inc. | Methods and systems for creating virtual and augmented reality |
US9595083B1 (en) | 2013-04-16 | 2017-03-14 | Lockheed Martin Corporation | Method and apparatus for image producing with predictions of future positions |
US20170213388A1 (en) | 2016-01-25 | 2017-07-27 | Jeffrey Neil Margolis | Frame Projection For Augmented Reality Environments |
US20170334066A1 (en) | 2016-05-20 | 2017-11-23 | Google Inc. | Machine learning methods and apparatus related to predicting motion(s) of object(s) in a robot's environment based on image(s) capturing the object(s) and based on parameter(s) for future robot movement in the environment |
US20180137389A1 (en) | 2016-11-16 | 2018-05-17 | Facebook, Inc. | Deep Multi-Scale Video Prediction |
WO2018176015A1 (en) | 2017-03-24 | 2018-09-27 | Magic Leap, Inc. | Accumulation and confidence assignment of iris codes |
WO2019226374A1 (en) | 2018-05-24 | 2019-11-28 | Microsoft Technology Licensing, Llc | Dead reckoning and latency improvement in 3d game streaming scenario |
Family Cites Families (44)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5837995A (en) * | 1996-11-25 | 1998-11-17 | Alan Y. Chow | Wavelength-controllable voltage-phase photodiode optoelectronic switch ("opsistor") |
US20020097322A1 (en) * | 2000-11-29 | 2002-07-25 | Monroe David A. | Multiple video display configurations and remote control of multiple video signals transmitted to a monitoring station over a network |
US7403646B2 (en) * | 2002-10-24 | 2008-07-22 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, program, and recording medium for generating a difference image from a first radiographic image and second radiographic image |
US7774075B2 (en) * | 2002-11-06 | 2010-08-10 | Lin Julius J Y | Audio-visual three-dimensional input/output |
JP2010256242A (ja) * | 2009-04-27 | 2010-11-11 | Hitachi High-Technologies Corp | Defect inspection apparatus and defect inspection method |
GB2476258A (en) * | 2009-12-16 | 2011-06-22 | Thales Holdings Uk Plc | Motion detection using histogram analysis |
US9041298B2 (en) * | 2011-08-10 | 2015-05-26 | Brian R. Andy | Motion activated toilet bowl lighting device |
US10585472B2 (en) * | 2011-08-12 | 2020-03-10 | Sony Interactive Entertainment Inc. | Wireless head mounted display with differential rendering and sound localization |
US9268406B2 (en) * | 2011-09-30 | 2016-02-23 | Microsoft Technology Licensing, Llc | Virtual spectator experience with a personal audio/visual apparatus |
US9087471B2 (en) * | 2011-11-04 | 2015-07-21 | Google Inc. | Adaptive brightness control of head mounted display |
US20130248691A1 (en) * | 2012-03-23 | 2013-09-26 | Google Inc. | Methods and Systems for Sensing Ambient Light |
US9702977B2 (en) * | 2013-03-15 | 2017-07-11 | Leap Motion, Inc. | Determining positional information of an object in space |
US9063330B2 (en) * | 2013-05-30 | 2015-06-23 | Oculus Vr, Llc | Perception based predictive tracking for head mounted displays |
US10091419B2 (en) * | 2013-06-14 | 2018-10-02 | Qualcomm Incorporated | Computer vision application processing |
KR20150034558A (ko) * | 2013-09-26 | 2015-04-03 | LG Electronics Inc. | Head mounted display and control method |
US9630105B2 (en) * | 2013-09-30 | 2017-04-25 | Sony Interactive Entertainment Inc. | Camera based safety mechanisms for users of head mounted displays |
US9645654B2 (en) * | 2013-12-04 | 2017-05-09 | Leap Motion, Inc. | Initializing predictive information for free space gesture control and communication |
US9659403B1 (en) * | 2014-01-06 | 2017-05-23 | Leap Motion, Inc. | Initializing orientation in space for predictive information for free space gesture control and communication |
CN110308561A (zh) * | 2014-03-14 | 2019-10-08 | Sony Interactive Entertainment Inc. | Method and system for a head mounted display (HMD) |
KR20150116260A (ko) * | 2014-04-07 | 2015-10-15 | Samsung Electronics Co., Ltd. | Marker tracking method and electronic device therefor |
EP2933707B1 (en) * | 2014-04-14 | 2017-12-06 | iOnRoad Technologies Ltd. | Head mounted display presentation adjustment |
US9767613B1 (en) * | 2015-01-23 | 2017-09-19 | Leap Motion, Inc. | Systems and method of interacting with a virtual object |
US10684485B2 (en) * | 2015-03-06 | 2020-06-16 | Sony Interactive Entertainment Inc. | Tracking system for head mounted display |
US9910275B2 (en) * | 2015-05-18 | 2018-03-06 | Samsung Electronics Co., Ltd. | Image processing for head mounted display devices |
US9898091B2 (en) * | 2015-06-03 | 2018-02-20 | Oculus Vr, Llc | Virtual reality system with head-mounted display, camera and hand-held controllers |
EP3347810A1 (en) * | 2015-09-10 | 2018-07-18 | Google LLC | Playing spherical video on a limited bandwidth connection |
KR102501752B1 (ko) * | 2015-09-21 | 2023-02-20 | Samsung Electronics Co., Ltd. | Method for compensating for movement of a head mounted display and apparatus therefor |
US10359806B2 (en) * | 2016-03-28 | 2019-07-23 | Sony Interactive Entertainment Inc. | Pressure sensing to identify fitness and comfort of virtual reality headset |
US10110913B2 (en) * | 2016-09-30 | 2018-10-23 | Intel Corporation | Motion estimation using hybrid video imaging system |
KR20180065756A (ko) * | 2016-12-08 | 2018-06-18 | Samsung Electronics Co., Ltd. | Electronic device for controlling an unmanned aerial vehicle and control method thereof |
US10237481B2 (en) * | 2017-04-18 | 2019-03-19 | Facebook Technologies, Llc | Event camera for generation of event-based images |
US10466779B1 (en) * | 2017-07-24 | 2019-11-05 | Facebook Technologies, Llc | Event camera for eye tracking |
US11190753B1 (en) * | 2017-09-28 | 2021-11-30 | Apple Inc. | Head-mountable device with object movement detection |
US10839547B2 (en) * | 2017-09-28 | 2020-11-17 | Samsung Electronics Co., Ltd. | Camera pose determination and tracking |
CN111868737A (zh) * | 2018-01-24 | 2020-10-30 | Apple Inc. | Event camera-based gaze tracking using neural networks |
US10845601B1 (en) * | 2018-02-07 | 2020-11-24 | Apple Inc. | AR/VR controller with event camera |
US10509467B1 (en) * | 2018-06-01 | 2019-12-17 | Facebook Technologies, Llc | Determining fixation of a user's eyes from images of portions of the user's face enclosed by a head mounted display |
US10824864B2 (en) * | 2018-06-25 | 2020-11-03 | Apple Inc. | Plane detection using semantic segmentation |
US10782779B1 (en) * | 2018-09-27 | 2020-09-22 | Apple Inc. | Feedback coordination for a virtual interaction |
US11170521B1 (en) * | 2018-09-27 | 2021-11-09 | Apple Inc. | Position estimation based on eye gaze |
US11288818B2 (en) * | 2019-02-19 | 2022-03-29 | The Trustees Of The University Of Pennsylvania | Methods, systems, and computer readable media for estimation of optical flow, depth, and egomotion using neural network trained using event-based learning |
US10944912B2 (en) * | 2019-06-04 | 2021-03-09 | Ford Global Technologies, Llc | Systems and methods for reducing flicker artifacts in imaged light sources |
US11367416B1 (en) * | 2019-06-27 | 2022-06-21 | Apple Inc. | Presenting computer-generated content associated with reading content based on user interactions |
US11503202B1 (en) * | 2022-01-03 | 2022-11-15 | Varjo Technologies Oy | Optical focus adjustment |
-
2020
- 2020-01-13 US US16/741,285 patent/US11635802B2/en active Active
-
2021
- 2021-01-08 WO PCT/US2021/012693 patent/WO2021146113A1/en unknown
- 2021-01-08 EP EP21741451.5A patent/EP4091015A4/en active Pending
- 2021-01-08 CN CN202180008969.1A patent/CN114981706A/zh active Pending
- 2021-01-08 JP JP2022542462A patent/JP7356596B2/ja active Active
Also Published As
Publication number | Publication date |
---|---|
JP2023512166A (ja) | 2023-03-24 |
CN114981706A (zh) | 2022-08-30 |
WO2021146113A1 (en) | 2021-07-22 |
EP4091015A4 (en) | 2024-01-24 |
US20210216133A1 (en) | 2021-07-15 |
US11635802B2 (en) | 2023-04-25 |
EP4091015A1 (en) | 2022-11-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- JP7356596B2 (ja) | Combined light intensity based CMOS and event detection sensor for high speed predictive tracking and latency compensation in virtual and augmented reality HMD systems | |
US11967087B2 (en) | Dynamic vision sensor for visual audio processing | |
WO2020005563A1 (en) | Privacy chat trigger using mutual eye contact | |
US11590416B2 (en) | Multipoint SLAM capture | |
US11445269B2 (en) | Context sensitive ads | |
US20150262613A1 (en) | Providing audio video content during playback pause | |
- JP2024531113A (ja) | Motion blur compensation using eye tracking | |
- JP2020532352A (ja) | Attention-based AI determination of player choices | |
WO2021091799A1 (en) | Adaptive time dilation based on player performance | |
- JP7462069B2 (ja) | User selection of virtual camera positions to produce video using composited inputs from multiple cameras | |
US20210291037A1 (en) | Using camera on computer simulation controller | |
US11373342B2 (en) | Social and scene target awareness and adaptation of an occlusion system for increased social and scene interaction in an optical see-through augmented reality head mounted display | |
US20240119921A1 (en) | Gradual noise canceling in computer game | |
US20220180854A1 (en) | Sound effects based on footfall | |
US20210350246A1 (en) | Altering motion of computer simulation characters to account for simulation forces imposed on the characters | |
US20240335942A1 (en) | Reproducing fast eye movement using imaging of robot with limited actuator speed | |
US20240001239A1 (en) | Use of machine learning to transform screen renders from the player viewpoint | |
- JP2023509454A (ja) | Event driven sensor (EDS) tracking of light emitting diode (LED) arrays | |
US20200278759A1 (en) | Controller inversion detection for context switching | |
KR20240038125A (ko) | 장치의 기능에 따른 게임의 적응형 렌더링 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
2022-07-11 | A621 | Written request for application examination | Free format text: JAPANESE INTERMEDIATE CODE: A621 |
2023-06-13 | A131 | Notification of reasons for refusal | Free format text: JAPANESE INTERMEDIATE CODE: A131 |
2023-08-10 | A521 | Request for written amendment filed | Free format text: JAPANESE INTERMEDIATE CODE: A523 |
| TRDD | Decision of grant or rejection written | |
2023-09-19 | A01 | Written decision to grant a patent or to grant a registration (utility model) | Free format text: JAPANESE INTERMEDIATE CODE: A01 |
2023-09-22 | A61 | First payment of annual fees (during grant procedure) | Free format text: JAPANESE INTERMEDIATE CODE: A61 |
| R150 | Certificate of patent or registration of utility model | Ref document number: 7356596; Country of ref document: JP; Free format text: JAPANESE INTERMEDIATE CODE: R150 |