JP7353982B2 - Virtual cover for user interaction in augmented reality - Google Patents
Virtual cover for user interaction in augmented reality
- Publication number
- JP7353982B2 (application JP2019563626A)
- Authority
- JP
- Japan
- Prior art keywords
- virtual
- space
- user
- augmented reality
- control device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06F3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012: Head tracking input arrangements
- G06F3/013: Eye tracking input arrangements
- G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/167: Audio in a user interface, e.g. using voice commands for navigating, audio feedback
- G02B27/017: Head-up displays; Head mounted
- G02B2027/014: Head-up displays characterised by optical features comprising information/image processing systems
- G02B2027/0178: Head mounted; Eyeglass type
- G02B2027/0187: Display position adjusting means not related to the information to be displayed, slaved to motion of at least a part of the body of the user, e.g. head, eye
- G06T19/006: Mixed reality
- G16H20/40: ICT specially adapted for therapies or health-improving plans relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
- G16H40/63: ICT specially adapted for the management or operation of medical equipment or devices, for local operation
Description
X-ray machine:
- Physical positioning of the C-arm and table
- X-ray configuration, such as exposure level, wedges, and collimation
- Image display, such as providing various images for review, masking images, and annotating them

Ultrasound machine:
- Image acquisition parameters such as focus, depth, and gain
- Display parameters such as image rotation

Robots, such as catheter robots and surgical robots

Electromechanical environment or room controls:
- Lights on/off
- Sound, such as music or voice communication to the control room
- Timers on/off
- Annotations on/off
- Audio or video recording on/off

Other equipment, such as IVUS, FFR, patient vitals, and anesthesia equipment

Virtual information:
- Control of what is displayed, such as which screens to show and where to show them
- Pointers or selectors for virtual information, such as placing markers, selecting particular objects, and annotating images/models
- Interaction with models, such as rotating or zooming in/out

- The anesthesiologist, who administers anesthesia and monitors the patient
- The echocardiographer, who positions the TEE probe and controls ultrasound image acquisition
- Interventionalist #1, who manipulates catheters, guidewires, and other devices to deliver therapy to the patient
- Interventionalists #2-3, who assist interventionalist #1
- The nurse, who hands appropriate tools and devices to the interventionalists
- The X-ray technician, who assists with operation of the interventional X-ray system

- Intraoperative X-ray (live images, roadmaps, reference images)
- Intraoperative ultrasound (TEE, ICE, IVUS, etc.)
- Preoperative imaging (ultrasound, CT, MRI)
- Patient medical history
- Patient vitals and hemodynamic status
- Dose information (for staff: DoseAware, or for the patient)
- Live views of what various people are seeing
- Overlays on live imaging
- Targets/markers
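The user actions and machine controls listed above suggest a simple dispatch pattern: a control device maps each detected action (voice command, gesture, gaze) to a command for a device or for the virtual scene. The following is a minimal, hypothetical Python sketch of that pattern; all names (`ControlDevice`, the action strings, the command strings) are illustrative and do not come from the patent.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class ControlDevice:
    """Hypothetical dispatcher mapping detected user actions to commands."""
    handlers: Dict[str, Callable[[], str]] = field(default_factory=dict)

    def register(self, action: str, handler: Callable[[], str]) -> None:
        # Associate a detected action (e.g. a recognized voice command
        # or gesture) with the command it should trigger.
        self.handlers[action] = handler

    def dispatch(self, action: str) -> str:
        # Unrecognized actions are ignored rather than raising, since a
        # medical-site controller should fail safe on unknown input.
        handler = self.handlers.get(action)
        return handler() if handler else "ignored"

controller = ControlDevice()
controller.register("voice:lights_on", lambda: "room lights switched on")
controller.register("gesture:rotate_model", lambda: "3D model rotated")

print(controller.dispatch("voice:lights_on"))   # room lights switched on
print(controller.dispatch("gesture:wave"))      # ignored
```

In a real system the handlers would issue commands to the X-ray, ultrasound, robot, or room-control interfaces named above; here they return strings only to keep the sketch self-contained.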
Claims (18)
- A control device for augmented reality in a three-dimensional (3D) space of a medical site, the control device comprising:
a memory that stores instructions; and
a processor that executes the instructions,
wherein the control device controls a display system to present a virtual object and a virtual operating member in the 3D space to a user of the display system so as to realize augmented reality,
wherein the instructions, when executed by the processor, cause the control device to carry out a process comprising:
detecting a first action by the user with respect to a first physical object or the virtual object in the 3D space; and
selectively enabling or disabling the virtual operating member based on detection of the first action between the user and the first physical object or the virtual object in the 3D space,
wherein the virtual operating member, when operated, controls operation of a medical machine of the medical site in the 3D space such that enabling or disabling of the operation of the medical machine is based on detection of the first action, and
wherein the virtual operating member and the virtual object are made visible to a user selected from among a plurality of users of the display system.
- The control device of claim 1, wherein the virtual object in the 3D space comprises a virtual cover for the virtual operating member, and the first action comprises a first interaction in which the user closes the virtual cover over the virtual operating member.
- The control device of claim 1, wherein the first action comprises a first interaction in which the user presses the first physical object in the 3D space.
- The control device of claim 1, wherein the first action comprises a voice command from the user.
- The control device of claim 1, wherein the first action comprises a first interaction in which the user makes a gesture in the 3D space.
- The control device of claim 1, wherein the first action comprises positioning of the user's head in the 3D space.
- The control device of claim 1, wherein the first action comprises positioning of the user's eyes in the 3D space.
- The control device of claim 1, wherein the virtual operating member is selectively placed at a position in the 3D space.
- The control device of claim 1, wherein the virtual operating member comprises a virtual button.
- The control device of claim 1, wherein the virtual operating member comprises a directional controller that controls a direction of movement of a second physical object in the 3D space.
- The control device of claim 1, wherein the virtual operating member is projected onto a second physical object in the 3D space.
- The control device of claim 1, wherein, when the virtual operating member is selectively disabled by the first action, the virtual operating member is selectively enabled based on the control device detecting a second interaction between the user and the first physical object or the virtual object in the 3D space, and, when the virtual operating member is selectively enabled by the first action, the virtual operating member is selectively disabled based on the control device detecting a second interaction between the user and the first physical object or the virtual object in the 3D space.
- The control device of claim 1, wherein the display system projects a virtual cursor into the 3D space via the control device, and the first action comprises the user moving the virtual cursor onto a target.
- The control device of claim 1, wherein the 3D space is based on a predefined physical environment onto which the virtual object is superimposed by the display system.
- The control device of claim 1, wherein the control device comprises a head-mountable display, and the 3D space is the space in which the user wears the head-mountable display.
- A method for controlling functions in a three-dimensional (3D) space of a medical site using augmented reality, the method comprising:
controlling a display system to present a virtual object and a virtual operating member in the 3D space to a user of the display system so as to realize augmented reality;
detecting a first action by the user with respect to a physical object or the virtual object in the 3D space; and
selectively enabling or disabling the virtual operating member based on detection of the first action between the user and the physical object or the virtual object in the 3D space,
wherein the virtual operating member, when operated, controls a medical machine of the medical site in the 3D space such that enabling or disabling of operation of the medical machine is based on detection of the first action, and
wherein the virtual operating member and the virtual object are made visible to a user selected from among a plurality of users of the display system.
- The method of claim 16, further comprising: detecting a second action by the user; and moving the virtual operating member based on the second action by the user.
- The method of claim 16, further comprising: covering the virtual operating member with the virtual object in the 3D space based on detection of the first action by the user.
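The claims describe a gating mechanism: an interaction with the virtual cover (or another first/second action) toggles the virtual operating member between enabled and disabled, and the member only drives the medical machine while enabled. Here is a minimal Python sketch of that state machine, under the assumption that state is a simple boolean toggle; the class and method names are illustrative, not taken from the patent.

```python
class VirtualOperatingMember:
    """Hypothetical model of the claimed enable/disable gating."""

    def __init__(self) -> None:
        # The virtual cover starts closed, so the member starts disabled.
        self.enabled = False

    def on_cover_interaction(self) -> None:
        # A further interaction toggles the current state:
        # disabled -> enabled, enabled -> disabled.
        self.enabled = not self.enabled

    def operate(self) -> str:
        # Only an enabled member may send commands to the medical machine.
        return "machine command sent" if self.enabled else "blocked"

member = VirtualOperatingMember()
assert member.operate() == "blocked"               # cover closed: no control
member.on_cover_interaction()                      # user opens the cover
assert member.operate() == "machine command sent"
member.on_cover_interaction()                      # user closes the cover
assert member.operate() == "blocked"
```

The point of the cover metaphor is that an extra deliberate action guards against accidental activation of the machine, analogous to a hinged cover over a physical switch.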
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762506895P | 2017-05-16 | 2017-05-16 | |
US62/506,895 | 2017-05-16 | ||
PCT/EP2018/061947 WO2018210645A1 (en) | 2017-05-16 | 2018-05-09 | Virtual cover for user interaction in augmented reality |
Publications (3)
Publication Number | Publication Date |
---|---|
JP2020520522A (ja) | 2020-07-09 |
JP2020520522A5 (ja) | 2021-07-26 |
JP7353982B2 (ja) | 2023-10-02 |
Family
ID=62244455
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2019563626A Active JP7353982B2 (ja) Virtual cover for user interaction in augmented reality
Country Status (5)
Country | Link |
---|---|
US (2) | US11334213B2 (ja) |
EP (1) | EP3635512A1 (ja) |
JP (1) | JP7353982B2 (ja) |
CN (1) | CN110914788A (ja) |
WO (1) | WO2018210645A1 (ja) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11571107B2 (en) * | 2019-03-25 | 2023-02-07 | Karl Storz Imaging, Inc. | Automated endoscopic device control systems |
US11612316B2 (en) * | 2019-06-20 | 2023-03-28 | Awss Zidan | Medical system and method operable to control sensor-based wearable devices for examining eyes |
EP4030219A1 (en) * | 2021-01-13 | 2022-07-20 | BHS Technologies GmbH | Medical imaging system and method of controlling such imaging system |
US11334215B1 (en) | 2021-04-09 | 2022-05-17 | Htc Corporation | Method of generating user-interactive object, multimedia system, and non-transitory computer-readable medium |
CN113343303A (zh) * | 2021-06-29 | 2021-09-03 | 视伴科技(北京)有限公司 | Method and device for masking a target room |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013232062A (ja) | 2012-04-27 | 2013-11-14 | Mitsubishi Electric Corp | Input/output device for preventing erroneous touch panel operation |
WO2014045683A1 (ja) | 2012-09-21 | 2014-03-27 | Sony Corp | Control device and storage medium |
US20160054791A1 (en) | 2014-08-25 | 2016-02-25 | Daqri, Llc | Navigating augmented reality content with a watch |
US20160191887A1 (en) | 2014-12-30 | 2016-06-30 | Carlos Quiles Casas | Image-guided surgery with surface reconstruction and augmented reality visualization |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9152309B1 (en) * | 2008-03-28 | 2015-10-06 | Google Inc. | Touch screen locking and unlocking |
CA2727474A1 (en) * | 2010-07-16 | 2012-01-16 | Exopc | Method for controlling interactions of a user with a given zone of a touch screen panel |
KR20160084502A (ko) * | 2011-03-29 | 2016-07-13 | Qualcomm Incorporated | Modular mobile connected pico projectors for a local multi-user collaboration |
US20130050069A1 (en) * | 2011-08-23 | 2013-02-28 | Sony Corporation, A Japanese Corporation | Method and system for use in providing three dimensional user interface |
US9791921B2 (en) * | 2013-02-19 | 2017-10-17 | Microsoft Technology Licensing, Llc | Context-aware augmented reality object commands |
US10288881B2 (en) * | 2013-03-14 | 2019-05-14 | Fresenius Medical Care Holdings, Inc. | Wearable interface for remote monitoring and control of a medical device |
US10416760B2 (en) * | 2014-07-25 | 2019-09-17 | Microsoft Technology Licensing, Llc | Gaze-based object placement within a virtual reality environment |
US9852546B2 (en) * | 2015-01-28 | 2017-12-26 | CCP hf. | Method and system for receiving gesture input via virtual control objects |
US10007413B2 (en) * | 2015-04-27 | 2018-06-26 | Microsoft Technology Licensing, Llc | Mixed environment display of attached control elements |
JP6684559B2 (ja) * | 2015-09-16 | 2020-04-22 | Bandai Namco Entertainment Inc. | Program and image generation device |
US10546109B2 (en) * | 2017-02-14 | 2020-01-28 | Qualcomm Incorporated | Smart touchscreen display |
-
2018
- 2018-05-09 WO PCT/EP2018/061947 patent/WO2018210645A1/en unknown
- 2018-05-09 US US16/613,499 patent/US11334213B2/en active Active
- 2018-05-09 CN CN201880047219.3A patent/CN110914788A/zh active Pending
- 2018-05-09 JP JP2019563626A patent/JP7353982B2/ja active Active
- 2018-05-09 EP EP18726933.7A patent/EP3635512A1/en active Pending
-
2022
- 2022-05-13 US US17/743,629 patent/US11740757B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US20210165556A1 (en) | 2021-06-03 |
US20220276764A1 (en) | 2022-09-01 |
JP2020520522A (ja) | 2020-07-09 |
WO2018210645A1 (en) | 2018-11-22 |
US11334213B2 (en) | 2022-05-17 |
CN110914788A (zh) | 2020-03-24 |
US11740757B2 (en) | 2023-08-29 |
EP3635512A1 (en) | 2020-04-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7353982B2 (ja) | Virtual cover for user interaction in augmented reality | |
US11069146B2 (en) | Augmented reality for collaborative interventions | |
JP7478795B2 (ja) | Medical assistant | |
US11446098B2 (en) | Surgical system with augmented reality display | |
CN109478346B (zh) | 使用虚拟现实头戴式显示器增强的眼科手术体验 | |
CN110099649B (zh) | 具有用于工具致动的虚拟控制面板的机器人外科系统 | |
JP2023500053A (ja) | 外科手術仮想現実ユーザインターフェース | |
US9480539B2 (en) | Viewing system and viewing method for assisting user in carrying out surgery by identifying a target image | |
JP7448551B2 (ja) | Camera control system and method for computer-assisted surgery systems | |
US20230186574A1 (en) | Systems and methods for region-based presentation of augmented content | |
US20210286701A1 (en) | View-Based Breakpoints For A Display System | |
US11413111B2 (en) | Augmented reality system for medical procedures | |
Yu et al. | EyeRobot: enabling telemedicine using a robot arm and a head-mounted display | |
US20240104860A1 (en) | Devices, Methods, and Graphical User Interfaces for Interacting with Three-Dimensional Environments | |
EP4345838A1 (en) | Visualizing an indication of a location in a medical facility | |
US20240103677A1 (en) | User interfaces for managing sharing of content in three-dimensional environments | |
US20240104871A1 (en) | User interfaces for capturing media and manipulating virtual objects | |
Hatscher | Touchless, direct input methods for human-computer interaction to support image-guided interventions | |
EP3825816A1 (en) | Rendering a virtual object on a virtual user interface | |
WO2023014732A1 (en) | Techniques for adjusting a field of view of an imaging device based on head motion of an operator | |
Baldari et al. | New Robotic Platforms | |
WO2024064036A1 (en) | User interfaces for managing sharing of content in three-dimensional environments | |
AU2022221706A1 (en) | User interfaces and device settings based on user identification | |
Elle et al. | Head tracking of a surgical robotic scopeholder-a user involvement test of the system | |
JPWO2021059357A1 (ja) | Animation production system |
Legal Events
Date | Code | Title | Description
---|---|---|---
2021-05-07 | A521 | Request for written amendment filed | JAPANESE INTERMEDIATE CODE: A523
2021-05-07 | A621 | Written request for application examination | JAPANESE INTERMEDIATE CODE: A621
2022-04-27 | A131 | Notification of reasons for refusal | JAPANESE INTERMEDIATE CODE: A131
2022-04-28 | A977 | Report on retrieval | JAPANESE INTERMEDIATE CODE: A971007
2022-07-20 | A601 | Written request for extension of time | JAPANESE INTERMEDIATE CODE: A601
2022-10-20 | A521 | Request for written amendment filed | JAPANESE INTERMEDIATE CODE: A523
2023-02-01 | A02 | Decision of refusal | JAPANESE INTERMEDIATE CODE: A02
2023-05-25 | A521 | Request for written amendment filed | JAPANESE INTERMEDIATE CODE: A523
2023-06-02 | A911 | Transfer to examiner for re-examination before appeal (zenchi) | JAPANESE INTERMEDIATE CODE: A911
| TRDD | Decision of grant or rejection written | |
2023-08-22 | A01 | Written decision to grant a patent or to grant a registration (utility model) | JAPANESE INTERMEDIATE CODE: A01
2023-09-20 | A61 | First payment of annual fees (during grant procedure) | JAPANESE INTERMEDIATE CODE: A61
| R150 | Certificate of patent or registration of utility model | Ref document number: 7353982; Country of ref document: JP; JAPANESE INTERMEDIATE CODE: R150