JP7033218B2 - Displaying Physical Input Devices as Virtual Objects - Google Patents
- Publication number
- JP7033218B2 (application number JP2020565377A)
- Authority
- JP
- Japan
- Prior art keywords
- input
- physical
- representation
- displayed
- input device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/023—Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Description
This application claims the benefit of U.S. Provisional Application No. 62/680,819, entitled "Displaying Physical Input Devices as Augmented-Reality Objects in a Mixed Reality Environment," filed June 5, 2018, and U.S. Patent Application No. 16/410,547, entitled "Displaying Physical Input Devices as Virtual Objects," filed May 13, 2019, the contents of which are hereby incorporated by reference in their entirety.
In the following description, reference is made to the accompanying drawings, which form a part hereof and which illustrate several embodiments. It is to be understood that other embodiments may be utilized, and structural and operational changes may be made, without departing from the scope of the present disclosure. The use of the same reference symbols in different drawings indicates similar or identical items.
Claims (20)
- A method comprising, at an electronic device having a display:
in response to detecting an input field within a displayed application of a computer-generated reality (CGR) environment, displaying, in the CGR environment, a representation of at least a portion of the displayed application on a representation of a physical input device, wherein the representation of the at least a portion of the displayed application includes a representation of the detected input field; and
in response to detecting an input received at the physical input device, updating the input field based on the input, including displaying the representation of the input field having an updated appearance.
- The method of claim 1, wherein the input includes a touch input at a location on the physical input device that corresponds to a location of the input field displayed on the representation of the physical input device.
- The method of claim 1 or 2, wherein displaying the representation of the input field having the updated appearance includes displaying the input field having the updated appearance in the displayed application.
- The method of any one of claims 1 to 3, wherein displaying the representation of the input field having the updated appearance includes displaying the representation of the input field having the updated appearance on the representation of the physical input device.
- The method of any one of claims 1 to 4, further comprising generating haptic feedback in response to detecting the input received at the physical input device.
- The method of any one of claims 1 to 5, further comprising displaying keyboard keys on the representation of the physical input device in response to detecting the input field in the displayed application.
- The method of claim 6, wherein the input includes a touch input at a location on the physical input device that corresponds to a displayed location of a keyboard key.
- The method of claim 6 or 7, wherein the physical input device is a physical keyboard having physical keyboard keys, the keyboard keys are displayed in accordance with a determination that the detected input field is a text entry field, and each displayed keyboard key has a value different from that of the physical keyboard key on which it is displayed.
- The method of any one of claims 1 to 8, wherein the physical input device is a touch-sensitive surface.
- The method of claim 9, wherein the touch-sensitive surface does not include a display component.
- The method of any one of claims 1 to 10, wherein the physical input device is external to the electronic device.
- The method of any one of claims 1 to 11, wherein updating the input field based on the input includes: receiving input data indicative of the input at the physical input device; and updating the input field in accordance with the received input data.
- The method of any one of claims 1 to 12, wherein displaying the representation of the at least a portion of the displayed application on the representation of the physical input device includes displaying, in the CGR environment, the representation of the detected input field positioned on the representation of the physical input device.
- The method of any one of claims 1 to 13, wherein, in accordance with a determination that the detected input field is a text entry field, displaying the representation of the at least a portion of the displayed application on the representation of the physical input device includes displaying, in the CGR environment, a text box positioned on the representation of the physical input device.
- The method of any one of claims 1 to 14, wherein, in accordance with a determination that the detected input field is a digital-signature field, displaying the representation of the at least a portion of the displayed application on the representation of the physical input device includes displaying, in the CGR environment, a digital-signature box positioned on the representation of the physical input device.
- The method of any one of claims 1 to 15, wherein, in accordance with a determination that the detected input field includes one or more radio buttons, displaying the representation of the at least a portion of the displayed application on the representation of the physical input device includes displaying, in the CGR environment, one or more radio buttons positioned on the representation of the physical input device.
- The method of claim 16, wherein displaying the representation of the at least a portion of the displayed application on the representation of the physical input device further includes displaying, in the CGR environment, text associated with the one or more radio buttons of the detected input field.
- A computer program that causes a computer to perform the method of any one of claims 1 to 17.
- An electronic device comprising: a memory storing the computer program of claim 18; and one or more processors capable of executing the computer program stored in the memory.
- An electronic device comprising means for performing the method of any one of claims 1 to 17.
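The flow recited in claim 1 (detect an input field in a displayed application, display its representation on the representation of a physical input device, then update the field from a touch received at that device) can be sketched in miniature. This is an illustration only: the classes and functions below (`InputField`, `PhysicalInputDevice`, `project_input_field`, `handle_touch`) are hypothetical names invented for the sketch, not anything disclosed in the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class InputField:
    kind: str           # e.g. "text", "signature", "radio"
    value: str = ""     # contents reflected in the field's representation

@dataclass
class PhysicalInputDevice:
    """A touch-sensitive surface with no display of its own; the CGR
    environment overlays virtual content on its representation."""
    width: int
    height: int
    overlay: Optional[InputField] = None

def project_input_field(device: PhysicalInputDevice,
                        detected: InputField) -> InputField:
    """On detecting an input field in the displayed application, display a
    representation of that field on the representation of the device."""
    device.overlay = detected
    return detected

def handle_touch(device: PhysicalInputDevice, x: int, y: int,
                 text: str) -> Optional[InputField]:
    """A touch at a device location corresponding to the displayed field
    updates the field, which is then redrawn with its updated appearance
    (here, modeled as the appended text)."""
    if device.overlay is not None and 0 <= x < device.width and 0 <= y < device.height:
        device.overlay.value += text
    return device.overlay

# Detect a text-entry field, mirror it onto the trackpad, then "type" on it.
pad = PhysicalInputDevice(width=100, height=60)
field = project_input_field(pad, InputField(kind="text"))
handle_touch(pad, 10, 20, "hello")
```

Here the whole surface stands in for the displayed field's region; the per-key remapping of claim 8 and the haptic feedback of claim 5 are omitted for brevity.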
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201862680819P | 2018-06-05 | 2018-06-05 | |
US62/680,819 | 2018-06-05 | ||
US16/410,547 US11500452B2 (en) | 2018-06-05 | 2019-05-13 | Displaying physical input devices as virtual objects |
US16/410,547 | 2019-05-13 | ||
PCT/US2019/033253 WO2019236280A1 (en) | 2018-06-05 | 2019-05-21 | Displaying physical input devices as virtual objects |
Publications (2)
Publication Number | Publication Date |
---|---|
JP2021524971A (ja) | 2021-09-16 |
JP7033218B2 (ja) | 2022-03-09 |
Family
ID=68693931
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2020565377A Active JP7033218B2 (ja) Displaying physical input devices as virtual objects
Country Status (6)
Country | Link |
---|---|
US (2) | US11500452B2 (ja) |
EP (1) | EP3782014A1 (ja) |
JP (1) | JP7033218B2 (ja) |
KR (1) | KR102459238B1 (ja) |
CN (1) | CN112136096B (ja) |
WO (1) | WO2019236280A1 (ja) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11500452B2 (en) | 2018-06-05 | 2022-11-15 | Apple Inc. | Displaying physical input devices as virtual objects |
US20190385372A1 (en) * | 2018-06-15 | 2019-12-19 | Microsoft Technology Licensing, Llc | Positioning a virtual reality passthrough region at a known distance |
US11733824B2 (en) * | 2018-06-22 | 2023-08-22 | Apple Inc. | User interaction interpreter |
US10809910B2 (en) | 2018-09-28 | 2020-10-20 | Apple Inc. | Remote touch detection enabled by peripheral device |
US11599717B2 (en) | 2020-03-20 | 2023-03-07 | Capital One Services, Llc | Separately collecting and storing form contents |
US11308266B1 (en) * | 2020-10-20 | 2022-04-19 | Google Llc | Augmented reality assisted physical form completion |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110063224A1 (en) | 2009-07-22 | 2011-03-17 | Frederic Vexo | System and method for remote, virtual on screen input |
US20130194188A1 (en) | 2012-01-31 | 2013-08-01 | Research In Motion Limited | Apparatus and method of facilitating input at a second electronic device |
US20140191977A1 (en) | 2013-01-09 | 2014-07-10 | Lenovo (Singapore) Pte. Ltd. | Touchpad operational mode |
US20170262045A1 (en) | 2016-03-13 | 2017-09-14 | Logitech Europe S.A. | Transition between virtual and augmented reality |
US20170300116A1 (en) | 2016-04-15 | 2017-10-19 | Bally Gaming, Inc. | System and method for providing tactile feedback for users of virtual reality content viewers |
US20170308258A1 (en) | 2014-09-26 | 2017-10-26 | Lg Electronics Inc. | Mobile device, hmd and system |
US20180004312A1 (en) | 2016-06-29 | 2018-01-04 | Lg Electronics Inc. | Terminal and controlling method thereof |
US11045725B1 (en) | 2014-11-10 | 2021-06-29 | Valve Corporation | Controller visualization in virtual and augmented reality environments |
Family Cites Families (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101650520A (zh) * | 2008-08-15 | 2010-02-17 | Sony Ericsson Mobile Communications | Visible laser touchpad for a mobile telephone and method |
US20100177035A1 (en) * | 2008-10-10 | 2010-07-15 | Schowengerdt Brian T | Mobile Computing Device With A Virtual Keyboard |
JP4702441B2 (ja) | 2008-12-05 | 2011-06-15 | Sony Corporation | Imaging apparatus and imaging method |
US8564621B2 (en) * | 2010-08-11 | 2013-10-22 | International Business Machines Corporation | Replicating changes between corresponding objects |
US9052804B1 (en) | 2012-01-06 | 2015-06-09 | Google Inc. | Object occlusion to initiate a visual search |
EP2624653A1 (en) * | 2012-01-31 | 2013-08-07 | Research In Motion Limited | Mobile wireless communications device with wireless local area network and cellular scheduling and related methods |
US9041622B2 (en) * | 2012-06-12 | 2015-05-26 | Microsoft Technology Licensing, Llc | Controlling a virtual object with a real controller device |
IN2015DN02046A (ja) | 2012-09-21 | 2015-08-14 | Sony Corp | |
CN103019377A (zh) * | 2012-12-04 | 2013-04-03 | Tianjin University | Input method and apparatus based on a head-mounted visual display device |
US9253375B2 (en) | 2013-04-02 | 2016-02-02 | Google Inc. | Camera obstruction detection |
US20140362110A1 (en) | 2013-06-08 | 2014-12-11 | Sony Computer Entertainment Inc. | Systems and methods for customizing optical representation of views provided by a head mounted display based on optical prescription of a user |
EP3049899A4 (en) | 2013-09-24 | 2017-07-05 | Hewlett-Packard Development Company, L.P. | Identifying a target touch region of a touch-sensitive surface based on an image |
US9649558B2 (en) | 2014-03-14 | 2017-05-16 | Sony Interactive Entertainment Inc. | Gaming device with rotatably placed cameras |
KR102219464B1 (ko) * | 2014-05-23 | 2021-02-25 | Samsung Electronics Co., Ltd. | Security operating method and electronic device supporting the same |
CN106062862B (zh) | 2014-10-24 | 2020-04-21 | Hangzhou Linggan Technology Co., Ltd. | System and method for immersive and interactive multimedia generation |
KR20160063812A (ko) * | 2014-11-27 | 2016-06-07 | Samsung Electronics Co., Ltd. | Method of configuring a screen, electronic device, and storage medium |
US10073516B2 (en) * | 2014-12-29 | 2018-09-11 | Sony Interactive Entertainment Inc. | Methods and systems for user interaction within virtual reality scene using head mounted display |
US10338673B2 (en) * | 2015-09-16 | 2019-07-02 | Google Llc | Touchscreen hover detection in an augmented and/or virtual reality environment |
KR101870245B1 (ko) | 2016-08-30 | 2018-06-25 | 악어스캔 Co., Ltd. | Offline document tracking method and document tracking system |
CN109952552A (zh) * | 2016-10-11 | 2019-06-28 | Hewlett-Packard Development Company, L.P. | Visual cue system |
WO2018071190A1 (en) * | 2016-10-14 | 2018-04-19 | Google Llc | Virtual reality privacy settings |
WO2018090060A1 (en) | 2016-11-14 | 2018-05-17 | Logitech Europe S.A. | A system for importing user interface devices into virtual/augmented reality |
EP3324270A1 (en) | 2016-11-16 | 2018-05-23 | Thomson Licensing | Selection of an object in an augmented reality environment |
US10169973B2 (en) | 2017-03-08 | 2019-01-01 | International Business Machines Corporation | Discontinuing display of virtual content and providing alerts based on hazardous physical obstructions |
US10514801B2 (en) | 2017-06-15 | 2019-12-24 | Microsoft Technology Licensing, Llc | Hover-based user-interactions with virtual objects within immersive environments |
US10691945B2 (en) | 2017-07-14 | 2020-06-23 | International Business Machines Corporation | Altering virtual content based on the presence of hazardous physical obstructions |
US10484530B2 (en) | 2017-11-07 | 2019-11-19 | Google Llc | Sensor based component activation |
US11500452B2 (en) | 2018-06-05 | 2022-11-15 | Apple Inc. | Displaying physical input devices as virtual objects |
US10809910B2 (en) | 2018-09-28 | 2020-10-20 | Apple Inc. | Remote touch detection enabled by peripheral device |
-
2019
- 2019-05-13 US US16/410,547 patent/US11500452B2/en active Active
- 2019-05-21 WO PCT/US2019/033253 patent/WO2019236280A1/en unknown
- 2019-05-21 EP EP19728844.2A patent/EP3782014A1/en active Pending
- 2019-05-21 JP JP2020565377A patent/JP7033218B2/ja active Active
- 2019-05-21 CN CN201980032853.4A patent/CN112136096B/zh active Active
- 2019-05-21 KR KR1020207033785A patent/KR102459238B1/ko active IP Right Grant
-
2022
- 2022-11-10 US US17/985,040 patent/US11954245B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US20230273674A1 (en) | 2023-08-31 |
CN112136096A (zh) | 2020-12-25 |
KR102459238B1 (ko) | 2022-10-27 |
KR20210005913A (ko) | 2021-01-15 |
US20190369714A1 (en) | 2019-12-05 |
CN112136096B (zh) | 2024-05-14 |
WO2019236280A1 (en) | 2019-12-12 |
JP2021524971A (ja) | 2021-09-16 |
US11954245B2 (en) | 2024-04-09 |
US11500452B2 (en) | 2022-11-15 |
EP3782014A1 (en) | 2021-02-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7033218B2 (ja) | Displaying physical input devices as virtual objects | |
CN111448542B (zh) | Displaying applications | |
US11520456B2 (en) | Methods for adjusting and/or controlling immersion associated with user interfaces | |
US11782571B2 (en) | Device, method, and graphical user interface for manipulating 3D objects on a 2D screen | |
US11367416B1 (en) | Presenting computer-generated content associated with reading content based on user interactions | |
US20240045501A1 (en) | Directing a Virtual Agent Based on Eye Behavior of a User | |
US20230221830A1 (en) | User interface modes for three-dimensional display | |
US20220413691A1 (en) | Techniques for manipulating computer graphical objects | |
US11393164B2 (en) | Device, method, and graphical user interface for generating CGR objects | |
US11995285B2 (en) | Methods for adjusting and/or controlling immersion associated with user interfaces | |
US20230376110A1 (en) | Mapping a Computer-Generated Trackpad to a Content Manipulation Region | |
JP7384951B2 (ja) | Indicating a location of an occluded physical object | |
US20240103687A1 (en) | Methods for interacting with user interfaces based on attention | |
CN116888562A (zh) | Mapping a computer-generated trackpad to a content manipulation region | |
CN116802589A (zh) | Object engagement based on finger-manipulation data and untethered inputs | |
CN116529713A (zh) | Applications in multi-user environments | |
CN117980870A (zh) | Content manipulation via a computer-generated representation of a trackpad | |
CN117882034A (zh) | Displaying and manipulating user-interface elements | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
2020-11-20 | A521 | Request for written amendment filed | JAPANESE INTERMEDIATE CODE: A523
2020-11-20 | A621 | Written request for application examination | JAPANESE INTERMEDIATE CODE: A621
| TRDD | Decision of grant or rejection written |
2022-01-26 | A01 | Written decision to grant a patent or to grant a registration (utility model) | JAPANESE INTERMEDIATE CODE: A01
2022-02-25 | A61 | First payment of annual fees (during grant procedure) | JAPANESE INTERMEDIATE CODE: A61
| R150 | Certificate of patent or registration of utility model | Ref document number: 7033218; Country of ref document: JP; JAPANESE INTERMEDIATE CODE: R150