WO2017122270A1 - Image display device - Google Patents
Image display device
- Publication number
- WO2017122270A1 (PCT/JP2016/050670)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- display
- display device
- user
- camera
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/593—Depth or shape recovery from multiple images from stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/344—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/361—Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
Definitions
- Patent Document 1 (International Publication No. 2012/033578) discloses a camera system that measures the physical distance to an object using light.
- The camera system uses an illumination device to project patterned light (for example, a lattice pattern or a dot pattern) onto the target object.
- When the patterned light strikes the object, the pattern is deformed according to the shape of the object.
- The physical distance between the camera system and the object is calculated by photographing the deformed pattern with a visible-light camera and analyzing the deformation.
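The geometry behind such a system is plain triangulation. The sketch below is a minimal illustration of that idea, assuming a single projected dot, a known projector-camera baseline, and an illustrative focal length; none of these names or values come from Patent Document 1.

```python
# Minimal sketch of patterned-light ranging (illustrative only, not the
# actual system of Patent Document 1). A projector and a camera separated
# by a known baseline observe the same projected dot; the dot's horizontal
# shift (disparity) in the camera image encodes the distance.

def distance_from_dot_shift(focal_px: float, baseline_m: float,
                            disparity_px: float) -> float:
    """Triangulated distance (m) from the observed pixel shift of one dot."""
    if disparity_px <= 0:
        raise ValueError("expected a positive disparity")
    return focal_px * baseline_m / disparity_px

# Example: 700 px focal length, 8 cm baseline, 14 px shift -> 4.0 m.
print(distance_from_dot_shift(700.0, 0.08, 14.0))
```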
- This specification discloses a technique that can appropriately specify the position and orientation of an image display device in the surrounding space.
- The control unit specifies spatial information for identifying features of the space around the image display device, based on a first calibration image acquired from the first camera and a second calibration image acquired from the second camera.
- It specifies the position and orientation of the image display device in the space based on the spatial information, a first captured image acquired from the first camera, a second captured image acquired from the second camera, and the attitude of the image display device detected by the sensor.
- It generates an object image representing an object associated with a predetermined position in the space and, in a first case where the specific range includes the predetermined position, causes the display unit to display a first display screen showing a state in which the object image is arranged at the predetermined position in the space.
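As a rough illustration of this first case and of the second case introduced later (a guide image shown when the predetermined position is out of range), the following sketch reduces the decision to yaw only; the function and threshold are hypothetical, and the actual device also uses pitch and position.

```python
def choose_screen(device_yaw_deg: float, target_yaw_deg: float,
                  half_fov_deg: float = 30.0) -> str:
    """Pick the first display screen (object in view) or the second display
    screen (guide image toward the object). Yaw-only simplification."""
    # Signed angular offset of the target from the view center, in (-180, 180].
    offset = (target_yaw_deg - device_yaw_deg + 180.0) % 360.0 - 180.0
    if abs(offset) <= half_fov_deg:  # first case: predetermined position in range
        return "first screen: draw the object image at the predetermined position"
    direction = "right" if offset > 0 else "left"
    return f"second screen: draw a guide image pointing {direction}"

print(choose_screen(device_yaw_deg=0.0, target_yaw_deg=90.0))   # guide image
print(choose_screen(device_yaw_deg=80.0, target_yaw_deg=90.0))  # object image
```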
- The control unit may further change the display mode of the object image in the first display screen according to an operation performed by the user while the first display screen is displayed on the display unit.
- With this configuration, the object image in the first display screen can be displayed in a mode appropriate to the operation performed by the user.
- The display unit may be a translucent display through which the user can visually recognize the surroundings when wearing the image display device.
- In that case, the control unit may cause the display unit to display the first display screen by displaying the object image on the display unit in the first case.
- Alternatively, the display unit may be a light-shielding display that blocks the user's field of view when the user wears the image display device.
- In that case, the control unit displays at least one of the first captured image and the second captured image on the display unit and, in the first case, displays the first display screen by displaying the object image together with at least one of the first captured image and the second captured image.
- The display units 10a and 10b are each a translucent display member.
- When the user wears the image display device 2 on the head, the display unit 10a is disposed at a position facing the user's right eye, and the display unit 10b at a position facing the left eye.
- Hereinafter, the left and right display units 10a and 10b may be collectively referred to as the display unit 10.
- The user can visually recognize the surroundings through the display unit 10.
- The control unit 30 displays a desired screen on the display unit 10 by instructing the projection unit 11 to project an image.
- In the following description, the operation of the projection unit 11 may be omitted and expressed simply as "the control unit 30 causes the display unit 10 to display a desired image".
- The sensor 20 is a three-axis acceleration sensor that detects acceleration along the X, Y, and Z axes.
- Using the detection values of the sensor 20, the control unit 30 can specify the posture and motion state of the image display device 2.
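For example, while the device is otherwise at rest, pitch and roll can be read off the gravity vector measured by such a sensor. The sketch below assumes conventional axis assignments, which the specification does not spell out; yaw cannot be obtained from gravity alone.

```python
import math

def attitude_from_accel(ax: float, ay: float, az: float) -> tuple[float, float]:
    """Estimate pitch and roll (degrees) from a static 3-axis accelerometer
    reading, i.e. from the direction of gravity. Only valid while the
    device is not otherwise accelerating."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Device lying flat: gravity is entirely on the Z axis -> pitch 0, roll 0.
print(attitude_from_accel(0.0, 0.0, 9.81))
```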
- The communication I/F 22 is an interface for executing wireless communication with an external device (for example, a terminal device having a communication function).
- (Display device processing; FIG. 3)
- The display device processing executed by the control unit 30 of the image display device 2 of the present embodiment will be described with reference to FIG. 3.
- When the user wears the image display device 2 on the head and turns on its power, the control unit 30 starts the display device processing of FIG. 3.
- In S14, the control unit 30 starts real-time processing (see FIG. 5).
- The control unit 30 repeatedly executes the real-time processing until the power of the image display device 2 is turned off.
- The control unit 30 then proceeds to S16.
- In S30, the control unit 30 acquires, from the first camera 12, a first captured image, that is, an image of the specific range being captured by the first camera 12 at the time of S30, and acquires, from the second camera 14, a second captured image, that is, an image of the specific range being captured by the second camera 14 at the time of S30. In other words, the first and second captured images acquired in S30 are both real-time captured images corresponding to the user's current field of view.
- Next, the control unit 30 calculates the distance between the image display device 2 and a specific feature point common to the first captured image and the second captured image.
- The "feature point" referred to here is, for example, one of the plurality of feature points included in the spatial information (in the case of YES in S12 of FIG. 3).
- The distance between the feature point and the image display device 2 is calculated by performing triangulation using the first captured image and the second captured image, as in the case described above.
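A minimal sketch of that triangulation, assuming rectified cameras with a known baseline (the specification gives no concrete parameters; the values below are illustrative):

```python
def stereo_distance(focal_px: float, baseline_m: float,
                    x_first_px: float, x_second_px: float) -> float:
    """Distance (m) to a feature point seen in both captured images of a
    rectified stereo pair. Disparity is the horizontal offset of the same
    feature between the first and second captured images."""
    disparity = x_first_px - x_second_px
    if disparity <= 0:
        raise ValueError("expected a positive disparity")
    return focal_px * baseline_m / disparity

# Feature at column 660 in the first image and 640 in the second,
# 800 px focal length, 6.5 cm camera baseline -> 2.6 m.
print(stereo_distance(800.0, 0.065, 660.0, 640.0))
```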
- In S36, the control unit 30 specifies the position and orientation of the image display device 2 in the space in which it exists, using the spatial information specified in the case of YES in S12 of FIG. 3, the distance to the feature point calculated in S32, and the attitude of the image display device 2 calculated in S34.
- The control unit 30 generates a globe object image 80 representing a virtual globe and virtually arranges it at the predetermined position (that is, on the right side as viewed from the user) (see FIGS. 7 to 9).
- The display unit 10 further displays a guide image 90 indicating that the globe object image 80 is virtually arranged on the right side as viewed from the user.
- When the control unit 30 finishes S22, it returns to S18 and monitors whether a user operation is performed again. Thus, whenever the user performs a gesture within the specific range or changes the direction of the line of sight, the control unit 30 changes the display position and display mode of the object image and the guide image displayed on the display unit 10 according to the operation. The control unit 30 repeatedly executes the processing of S18 to S22 until the end gesture is performed (YES in S20).
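In code form, S18 to S22 amount to a simple polling loop. The sketch below uses hypothetical callables for gesture detection and rendering; nothing here is an API from the specification.

```python
def run_interaction_loop(detect_operation, is_end_gesture, apply_operation):
    """S18-S22 as an event loop: poll for user operations and update the
    displayed object/guide images until the end gesture is detected."""
    while True:
        op = detect_operation()      # S18: gesture or gaze-direction change?
        if op is None:
            continue                 # no operation yet; keep monitoring
        if is_end_gesture(op):       # S20: end gesture terminates the loop
            break
        apply_operation(op)          # S22: update display position and mode

# Toy run: two dummy operations followed by an end gesture.
ops = iter(["rotate", "move", "end"])
run_interaction_loop(lambda: next(ops), lambda op: op == "end", print)
```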
- The display unit 10 is a translucent display, and the user can visually recognize the surroundings through the display unit 10 while wearing the image display device 2.
- The control unit 30 causes the display unit 10 to display a screen in which the menu object image 60 is combined with the real scene visible through the display unit 10. The user therefore sees the menu object image 60 superimposed on the actual field of view visible through the display unit 10.
- The control unit 30 displays the image captured by the first camera 12 in the region facing the user's right eye, and the image captured by the second camera 14 in the region facing the left eye.
- In a modification, the control unit 30 may display only one of the image captured by the first camera 12 and the image captured by the second camera 14 on the display unit 10.
- Alternatively, the control unit 30 may display an image obtained by combining the image captured by the first camera 12 and the image captured by the second camera 14 on the display unit 10.
Abstract
Description
(Configuration of the image display device 2; FIGS. 1 and 2)
The image display device 2 shown in FIG. 1 is an image display device used by being worn on a user's head (a so-called head-mounted display). The image display device 2 comprises a support 4, display units 10a and 10b, projection units 11a and 11b, a first camera 12, a second camera 14, and a control box 16.
The display device processing executed by the control unit 30 of the image display device 2 of the present embodiment will be described with reference to FIG. 3. When the user wears the image display device 2 on the head and turns on its power, the control unit 30 starts the display device processing of FIG. 3.
In S30 of FIG. 5, the control unit 30 acquires, from the first camera 12, a first captured image, that is, an image of the specific range being captured by the first camera 12 at the time of S30, and acquires, from the second camera 14, a second captured image, that is, an image of the specific range being captured by the second camera 14 at the time of S30. In other words, the first and second captured images acquired in S30 are both real-time captured images corresponding to the user's current field of view.
As described above, when the control unit 30 starts the real-time processing (see FIG. 5) in S14 of FIG. 3, it proceeds to S16. In S16, the control unit 30 generates a menu object image representing a main menu object and associates it with a predetermined position. In other words, the control unit 30 generates the menu object image and virtually arranges it at the predetermined position. Here, "virtually arranging the menu object image at the predetermined position" means associating the menu object image with the predetermined position so that, when the specific range (that is, the imaging range of the first camera 12 and the second camera 14) includes the predetermined position, the display unit 10 displays a screen in which the menu object image is arranged at the predetermined position in the space. In S16, the control unit 30 designates the position in front of the user's field of view as the predetermined position and virtually arranges the menu object image there. Accordingly, at the time of S16, the specific range (that is, the user's field of view) includes the predetermined position, and, as shown in FIG. 6, the display unit 10 displays a screen in which the menu object image 60 representing the menu object is arranged in the space. In the present embodiment, the display unit 10 is a translucent display, so the user sees the menu object image 60 combined with the real objects visible through the display unit 10 (that is, the scene in the room).
The screen shown in FIG. 6, in which the menu object image 60 is combined with the real scene visible through the display unit 10, is an example of the "first display screen". The menu object image 60 is an example of the "object image". The position at which the menu object image 60 is virtually arranged is an example of the "predetermined position". The screen including the guide image 92 shown in FIG. 9 is an example of the "second display screen".
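A minimal sketch of what "virtually arranging" an object image amounts to: project the anchored point into the current frame with a pinhole model and draw it only if it lands inside the frame (the first case); otherwise fall back to a guide image (the second case). The model and every value below are illustrative assumptions, not math taken from the specification.

```python
def project_anchored_point(point_cam, focal_px, cx, cy, width, height):
    """Pinhole projection of an anchored point given in camera coordinates.
    Returns pixel coordinates when the point is in front of the camera and
    inside the frame (first case), or None (second case: show a guide)."""
    x, y, z = point_cam
    if z <= 0:
        return None                      # behind the user's viewpoint
    u = focal_px * x / z + cx
    v = focal_px * y / z + cy
    if 0 <= u < width and 0 <= v < height:
        return (u, v)
    return None

# Menu object anchored 2 m ahead and slightly to the right -> in frame.
print(project_anchored_point((0.3, 0.0, 2.0), 700.0, 640.0, 360.0, 1280, 720))
```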
The image display device 102 of the second embodiment will be described with reference to FIG. 10, focusing on the differences from the first embodiment. The present embodiment differs from the first embodiment in that the display unit 110 is a light-shielding display that blocks the user's field of view when the user wears the image display device. The other components are substantially the same as in the first embodiment.
Claims (6)
- An image display device used by being worn on a user's head, the image display device comprising:
a display unit;
a first camera that captures a specific range corresponding to the user's field of view;
a second camera that is provided at a position different from the first camera and captures the specific range;
a sensor capable of detecting the attitude of the image display device; and
a control unit,
wherein the control unit:
specifies, based on a first calibration image acquired from the first camera and a second calibration image acquired from the second camera, spatial information for identifying features of the space around the image display device;
specifies the position and attitude of the image display device in the space based on the spatial information, a first captured image acquired from the first camera, a second captured image acquired from the second camera, and the attitude of the image display device detected by the sensor;
generates an object image representing an object associated with a predetermined position in the space; and
in a first case where the specific range includes the predetermined position, causes the display unit to display a first display screen showing a state in which the object image is arranged at the predetermined position in the space.
- The image display device according to claim 1, wherein the control unit further changes the display mode of the object image in the first display screen according to an operation performed by the user while the first display screen is displayed on the display unit.
- The image display device according to claim 2, wherein the operation includes a gesture performed by the user within the specific range.
- The image display device according to any one of claims 1 to 3, wherein the display unit is a translucent display through which the user can visually recognize the surroundings when wearing the image display device, and wherein, in the first case, the control unit causes the display unit to display the first display screen by displaying the object image on the display unit.
- The image display device according to any one of claims 1 to 3, wherein the display unit is a light-shielding display that blocks the user's field of view when the user wears the image display device, and wherein the control unit causes the display unit to display at least one of the first captured image and the second captured image and, in the first case, causes the display unit to display the first display screen by displaying the object image together with at least one of the first captured image and the second captured image.
- The image display device according to any one of claims 1 to 5, wherein, in a second case where the specific range does not include the predetermined position, the control unit causes the display unit to display a second display screen including a guide image indicating the direction of the predetermined position.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
- PCT/JP2016/050670 WO2017122270A1 (ja) | 2016-01-12 | 2016-01-12 | Image display device |
- US16/069,382 US20190019308A1 (en) | 2016-01-12 | 2016-01-12 | Image display device |
- JP2017561085A JPWO2017122270A1 (ja) | 2016-01-12 | 2016-01-12 | Image display device |
- EP16884873.7A EP3404623A4 (en) | 2016-01-12 | 2016-01-12 | IMAGE DISPLAY DEVICE |
- CN201680078579.0A CN108541323A (zh) | 2016-01-12 | 2016-01-12 | Image display device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
- PCT/JP2016/050670 WO2017122270A1 (ja) | 2016-01-12 | 2016-01-12 | Image display device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017122270A1 true WO2017122270A1 (ja) | 2017-07-20 |
Family
ID=59312007
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/050670 WO2017122270A1 (ja) | 2016-01-12 | 2016-01-12 | 画像表示装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20190019308A1 (ja) |
EP (1) | EP3404623A4 (ja) |
JP (1) | JPWO2017122270A1 (ja) |
CN (1) | CN108541323A (ja) |
WO (1) | WO2017122270A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3627208A1 (en) * | 2018-09-18 | 2020-03-25 | Samsung Electronics Co., Ltd. | Electronic device having optical member for adjusting permeation rate of light and method for operating thereof |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- KR20180037887A (ko) * | 2016-10-05 | 2018-04-13 | 엠티스코퍼레이션(주) | Smart glasses |
US11749142B2 (en) | 2018-12-04 | 2023-09-05 | Telefonaktiebolaget Lm Ericsson (Publ) | Optical see-through viewing device and method for providing virtual content overlapping visual objects |
- KR102232045B1 (ko) | 2019-01-08 | 2021-03-25 | 삼성전자주식회사 | Electronic device, control method of electronic device, and computer-readable medium |
- KR20210014813A (ko) | 2019-07-30 | 2021-02-10 | 삼성디스플레이 주식회사 | Display device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2009212582A (ja) * | 2008-02-29 | 2009-09-17 | Nippon Hoso Kyokai <Nhk> | Feedback system for virtual studio |
- JP2015114905A (ja) * | 2013-12-12 | 2015-06-22 | Sony Corp | Information processing apparatus, information processing method, and program |
- JP2015114757A (ja) * | 2013-12-10 | 2015-06-22 | Sony Corp | Information processing apparatus, information processing method, and program |
- JP2015192436A (ja) * | 2014-03-28 | 2015-11-02 | Canon Inc | Transmission terminal, reception terminal, transmission/reception system, and program therefor |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9111394B1 (en) * | 2011-08-03 | 2015-08-18 | Zynga Inc. | Rendering based on multiple projections |
US9147111B2 (en) * | 2012-02-10 | 2015-09-29 | Microsoft Technology Licensing, Llc | Display with blocking image generation |
US9389420B2 (en) * | 2012-06-14 | 2016-07-12 | Qualcomm Incorporated | User interface interaction for transparent head-mounted displays |
US9411160B2 (en) * | 2013-02-12 | 2016-08-09 | Seiko Epson Corporation | Head mounted display, control method for head mounted display, and image display system |
WO2014162852A1 (ja) * | 2013-04-04 | 2014-10-09 | ソニー株式会社 | 画像処理装置、画像処理方法およびプログラム |
US9430038B2 (en) * | 2014-05-01 | 2016-08-30 | Microsoft Technology Licensing, Llc | World-locked display quality feedback |
2016
- 2016-01-12 JP JP2017561085A patent/JPWO2017122270A1/ja active Pending
- 2016-01-12 US US16/069,382 patent/US20190019308A1/en not_active Abandoned
- 2016-01-12 CN CN201680078579.0A patent/CN108541323A/zh active Pending
- 2016-01-12 WO PCT/JP2016/050670 patent/WO2017122270A1/ja active Application Filing
- 2016-01-12 EP EP16884873.7A patent/EP3404623A4/en not_active Withdrawn
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2009212582A (ja) * | 2008-02-29 | 2009-09-17 | Nippon Hoso Kyokai <Nhk> | Feedback system for virtual studio |
- JP2015114757A (ja) * | 2013-12-10 | 2015-06-22 | Sony Corp | Information processing apparatus, information processing method, and program |
- JP2015114905A (ja) * | 2013-12-12 | 2015-06-22 | Sony Corp | Information processing apparatus, information processing method, and program |
- JP2015192436A (ja) * | 2014-03-28 | 2015-11-02 | Canon Inc | Transmission terminal, reception terminal, transmission/reception system, and program therefor |
Non-Patent Citations (2)
Title |
---|
See also references of EP3404623A4 * |
YOSHIHIRO BAN ET AL.: "A Study of Visual Navigating for Wearable Augmented Reality System : An Astronomical Observation Supporting System based on Wearable Augmented Reality Technology", TRANSACTIONS OF THE VIRTUAL REALITY SOCIETY OF JAPAN, vol. 6, no. 2, 30 June 2001 (2001-06-30), pages 89 - 98, XP055400239, ISSN: 1344-011X * |
Also Published As
Publication number | Publication date |
---|---|
US20190019308A1 (en) | 2019-01-17 |
CN108541323A (zh) | 2018-09-14 |
JPWO2017122270A1 (ja) | 2018-10-11 |
EP3404623A4 (en) | 2019-08-07 |
EP3404623A1 (en) | 2018-11-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- CN112926428B (zh) Method and system for training an object detection algorithm using synthetic images, and storage medium | |
- CN107209386B (zh) Augmented reality field-of-view object follower | |
- KR20230074780A (ko) Touchless photo capture in response to detected hand gestures | |
- CN116719413A (zh) Methods for manipulating objects in an environment | |
- WO2017122270A1 (ja) Image display device | |
- KR20230016209A (ko) Interactive augmented reality experiences using position tracking | |
- US20210405772A1 (en) Augmented reality eyewear 3d painting | |
- JP6399692B2 (ja) Head-mounted display, image display method, and program | |
- KR20230017849A (ko) Augmented reality guidance | |
- WO2017122274A1 (ja) Image display device | |
- KR20230026502A (ko) Augmented reality eyewear with 3D garments | |
- KR20230022239A (ko) Augmented reality environment enhancement | |
- CN110895433A (zh) Method and apparatus for user interaction in augmented reality | |
- US20240045494A1 (en) Augmented reality with eyewear triggered iot | |
- US20210406542A1 (en) Augmented reality eyewear with mood sharing | |
- US20230007227A1 (en) Augmented reality eyewear with x-ray effect | |
- WO2021059369A1 (ja) Animation production system | |
- WO2021059360A1 (ja) Animation production system | |
- CN112262373A (zh) View-based breakpoints | |
- JP6941130B2 (ja) Information processing method, information processing program, and information processing apparatus | |
- KR20230112729A (ko) Augmented reality spatial audio experience | |
- WO2021059370A1 (ja) Animation production system | |
- KR20240050437A (ko) Augmented reality prop interaction | |
- KR20240056558A (ko) Handcrafted augmented reality experiences | |
- WO2024064932A1 (en) Methods for controlling and interacting with a three-dimensional environment | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16884873; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2017561085; Country of ref document: JP; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWE | Wipo information: entry into national phase | Ref document number: 2016884873; Country of ref document: EP |
| ENP | Entry into the national phase | Ref document number: 2016884873; Country of ref document: EP; Effective date: 20180813 |