JPWO2020240470A5 - - Google Patents
- Publication number
- JPWO2020240470A5
- Authority
- JP
- Japan
- Prior art keywords
- user
- housing
- head
- data
- motion
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Claims (27)
A method of helping a user overcome disorientation, comprising:
determining a user confusion index; and
in response to the determined user confusion index meeting a predetermined threshold, causing a projector to emit electromagnetic radiation that is viewed by the user as an augmented reality disorientation graphic, wherein the electromagnetic radiation is at least partially reflected by an AR lens and directed through a corrective lens toward the user's eyeball,
wherein the AR lens is coupled to a frame configured to be worn on the user's head.
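The threshold-triggered control flow of claim 1 could be sketched as follows. This is a minimal illustration only: the `Projector` class, method names, and the `>=` comparison are hypothetical choices, since the claim does not specify an algorithm or hardware interface.

```python
from dataclasses import dataclass


@dataclass
class Projector:
    """Hypothetical stand-in for the head-mounted projector hardware."""
    emitting: bool = False

    def emit_disorientation_graphic(self) -> None:
        # Emit the electromagnetic radiation that the AR lens partially
        # reflects through the corrective lens toward the user's eyeball.
        self.emitting = True


def maybe_reorient(confusion_index: float,
                   threshold: float,
                   projector: Projector) -> bool:
    """Project the AR reorientation graphic only when the determined
    confusion index meets the predetermined threshold."""
    if confusion_index >= threshold:
        projector.emit_disorientation_graphic()
        return True
    return False
```

The claim leaves "meeting" the threshold open; this sketch reads it as greater-than-or-equal.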
2. The method of claim 1, further comprising:
generating, via a camera, image data reproducible as one or more images generally corresponding to the user's field of view; and
estimating a motion component of the user's head based at least in part on the generated image data.
A system comprising:
a control system including one or more processors; and
a memory having machine-readable instructions stored thereon,
wherein the control system is coupled to the memory, and the method of any one of claims 1 to 5 is carried out when the machine-executable instructions in the memory are executed by at least one of the one or more processors of the control system.
A system for assisting a user to overcome disorientation, comprising:
a housing configured to be coupled to a frame configured to be worn on a user's head;
an AR lens attached to the housing;
a projector coupled to the housing and configured to emit electromagnetic radiation such that the electromagnetic radiation, viewed by the user as an augmented reality disorientation graphic, is at least partially reflected off the AR lens and directed toward the user's eyeball;
a memory storing machine-readable instructions; and
a control system including one or more processors configured to execute the machine-readable instructions to:
determine a user confusion index; and
cause the projector to emit the electromagnetic radiation such that the augmented reality disorientation graphic is visible to the user when the determined user confusion index meets a predetermined threshold.
8. The system of claim 7, further comprising:
a camera coupled to the housing and configured to generate image data; and
a motion sensor connected to the housing and configured to generate motion data,
wherein the control system is further configured to estimate a motion component of the user's head based at least in part on the generated image data and at least in part on the generated motion data.
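One common way to combine an image-derived motion estimate with motion-sensor (gyroscope) data, as claim 8 requires, is a complementary filter. The function below is a hypothetical sketch: the claim does not name a fusion method, and the `alpha` weighting, 2-axis representation, and per-frame displacement inputs are all assumptions.

```python
def estimate_head_motion(camera_shift: tuple[float, float],
                         gyro_rate: tuple[float, float],
                         dt: float,
                         alpha: float = 0.8) -> tuple[float, float]:
    """Estimate the head's per-frame displacement (dx, dy) by blending
    an image-derived shift with integrated gyroscope rates.

    camera_shift: displacement estimated from consecutive camera images.
    gyro_rate:    angular rates from the housing-mounted motion sensor.
    dt:           time elapsed between samples, in seconds.
    alpha:        weight on the motion-sensor estimate (0..1).
    """
    # Integrate the gyro rates over the sample interval.
    imu_dx = gyro_rate[0] * dt
    imu_dy = gyro_rate[1] * dt
    # Complementary filter: IMU handles fast motion, camera corrects drift.
    dx = alpha * imu_dx + (1.0 - alpha) * camera_shift[0]
    dy = alpha * imu_dy + (1.0 - alpha) * camera_shift[1]
    return dx, dy
```

With `alpha = 1.0` the estimate relies on the motion sensor alone; with `alpha = 0.0` it relies on the camera alone, matching the claim's "based at least in part on" each source for intermediate weights.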
The system of any one of claims 7 to 25, further comprising a skin conductivity sensor coupled to the housing and configured to generate skin conductivity data,
wherein the determination of the user confusion index is based at least in part on the generated skin conductivity data.
The system of claim 7, further comprising:
a camera connected to the housing and configured to generate image data;
a motion sensor connected to the housing and configured to generate motion data;
a heart rate sensor coupled to the housing and configured to generate heart rate data; and
a skin conductivity sensor connected to the housing and configured to generate skin conductivity data,
wherein the determination of the user confusion index is based at least in part on (i) the generated image data, (ii) the generated motion data, (iii) the generated heart rate data, (iv) the generated skin conductivity data, or (v) any combination of (i)-(iv).
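The final claim allows the confusion index to be determined from any combination of the four sensor streams. A simple way to honor that "any combination" language is to average whichever normalized scores are present. This is a hypothetical sketch: the score names, the 0-to-1 normalization, and the equal-weight average are assumptions, not the patent's method.

```python
from typing import Optional


def user_confusion_index(image_score: Optional[float] = None,
                         motion_score: Optional[float] = None,
                         heart_rate_score: Optional[float] = None,
                         skin_conductivity_score: Optional[float] = None) -> float:
    """Combine whichever normalized (0..1) per-sensor scores are
    available into a single confusion index, mirroring the claim's
    'any combination of (i)-(iv)' language."""
    scores = [s for s in (image_score,
                          motion_score,
                          heart_rate_score,
                          skin_conductivity_score)
              if s is not None]
    if not scores:
        raise ValueError("at least one sensor score is required")
    # Equal-weight average; a real system might learn per-sensor weights.
    return sum(scores) / len(scores)
```

The resulting index would then be compared against the predetermined threshold of claim 7 to decide whether to trigger the projector.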
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201962855457P | 2019-05-31 | 2019-05-31 | |
US62/855,457 | 2019-05-31 | ||
PCT/IB2020/055081 WO2020240470A1 (en) | 2019-05-31 | 2020-05-28 | Systems and methods for minimizing cognitive decline using augmented reality |
Publications (2)
Publication Number | Publication Date |
---|---|
JP2022535032A JP2022535032A (en) | 2022-08-04 |
JPWO2020240470A5 true JPWO2020240470A5 (en) | 2023-05-24 |
Family
ID=73553609
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2021571484A Pending JP2022535032A (en) | 2019-05-31 | 2020-05-28 | Systems and methods for minimizing cognitive decline using augmented reality |
Country Status (7)
Country | Link |
---|---|
US (1) | US20220226600A1 (en) |
EP (1) | EP3975845A4 (en) |
JP (1) | JP2022535032A (en) |
KR (1) | KR20220068198A (en) |
CN (1) | CN114173657A (en) |
SG (1) | SG11202113275SA (en) |
WO (1) | WO2020240470A1 (en) |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7733224B2 (en) * | 2006-06-30 | 2010-06-08 | Bao Tran | Mesh network personal emergency response appliance |
US8831278B2 (en) * | 2010-11-30 | 2014-09-09 | Eastman Kodak Company | Method of identifying motion sickness |
US9256711B2 (en) * | 2011-07-05 | 2016-02-09 | Saudi Arabian Oil Company | Systems, computer medium and computer-implemented methods for providing health information to employees via augmented reality display |
US9030495B2 (en) * | 2012-11-21 | 2015-05-12 | Microsoft Technology Licensing, Llc | Augmented reality help |
WO2017004695A1 (en) * | 2015-07-06 | 2017-01-12 | Frank Jones | Methods and devices for demountable head mounted displays |
CA3020390A1 (en) * | 2016-04-08 | 2017-10-12 | Vizzario, Inc. | Methods and systems for obtaining, aggregating, and analyzing vision data to assess a person's vision performance |
US10191541B2 (en) * | 2016-06-30 | 2019-01-29 | Sony Interactive Entertainment Inc. | Augmenting virtual reality content with real world content |
CN108592937A (en) * | 2018-05-09 | 2018-09-28 | 何辉 | A kind of night flight or navigation road conditions for field exploration identify system |
2020
- 2020-05-28 CN CN202080053118.4A patent/CN114173657A/en active Pending
- 2020-05-28 US US17/614,719 patent/US20220226600A1/en active Pending
- 2020-05-28 WO PCT/IB2020/055081 patent/WO2020240470A1/en unknown
- 2020-05-28 JP JP2021571484A patent/JP2022535032A/en active Pending
- 2020-05-28 EP EP20815464.1A patent/EP3975845A4/en active Pending
- 2020-05-28 KR KR1020217043207A patent/KR20220068198A/en unknown
- 2020-05-28 SG SG11202113275SA patent/SG11202113275SA/en unknown
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6657289B2 (en) | Systems and methods for augmented and virtual reality | |
US20240004458A1 (en) | Massive simultaneous remote digital presence world | |
US10311640B2 (en) | Method and apparatus for providing virtual space, and non-transitory computer readable data storage medium storing program causing computer to perform method | |
CN104866105B (en) | The eye of aobvious equipment is dynamic and head moves exchange method | |
WO2017047178A1 (en) | Information processing device, information processing method, and program | |
JP4083684B2 (en) | Image processing system and image processing apparatus | |
WO2020031767A1 (en) | Information processing device, information processing method, and program | |
US11380072B2 (en) | Neutral avatars | |
JPWO2020090477A5 (en) | ||
WO2020050186A1 (en) | Information processing apparatus, information processing method, and recording medium | |
JPWO2020240470A5 (en) | ||
TW200940116A (en) | A motion sickness prevention device | |
KR20210137831A (en) | Electronic apparatus and operaintg method thereof | |
JP2005279895A (en) | Robot | |
WO2022146858A1 (en) | Controller position tracking using inertial measurement units and machine learning | |
TWI674518B (en) | Calibration method of eye-tracking and device thereof | |
JP6874207B2 (en) | Estimator, estimation method and program | |
JP2022535032A (en) | Systems and methods for minimizing cognitive decline using augmented reality | |
JP2022175890A (en) | Advice presentation system, information terminal, instrument terminal, program and advice presentation method | |
WO2023078677A1 (en) | Assisting a person to perform a personal care activity | |
TWM555004U (en) | Visual displaying device |