WO2017163517A1 - Eyeglasses-type wearable information terminal, control method thereof, and control program - Google Patents
Eyeglasses-type wearable information terminal, control method thereof, and control program
- Publication number
- WO2017163517A1 (PCT/JP2016/088308; JP2016088308W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- map image
- type wearable
- information terminal
- display control
- glasses
- Prior art date
Classifications
- G01C21/26: Navigation; navigational instruments specially adapted for navigation in a road network
- G01C21/34: Route searching; route guidance
- G01C21/36: Input/output arrangements for on-board computers
- G01C21/3626: Details of the output of route guidance instructions
- G01C21/3667: Display of a road map
- G01C21/367: Details, e.g. road map scale, orientation, zooming, illumination, level of detail, scrolling of road map or positioning of current position marker
- G01C21/3676: Overview of the route on the road map
- G02B27/0172: Head-up displays, head mounted, characterised by optical features
- G02B2027/0138: Head-up displays comprising image capture systems, e.g. camera
- G02B2027/014: Head-up displays comprising information/image processing systems
- G02B2027/0141: Head-up displays characterised by the informative content of the display
- G02B2027/0143: Head-up displays where the two eyes are not equipped with identical or symmetrical optical devices
- G02B2027/0178: Head mounted displays of eyeglass type
- G06F3/0481: Interaction techniques based on graphical user interfaces [GUI], based on specific properties of the displayed interaction object or a metaphor-based environment
- G08G1/005: Traffic control systems for road vehicles including pedestrian guidance indicator
- G09B29/00: Maps; Plans; Charts; Diagrams, e.g. route diagram
- G09B29/10: Map spot or coordinate position indicators; map reading aids
- H04N5/64: Constructional details of television receivers, e.g. cabinets or dust covers
Definitions
- the present invention relates to a glasses-type wearable information terminal, a control method thereof, and a control program.
- Patent Documents 1 to 4 disclose techniques for controlling display images on wearable terminals and other terminals.
- JP 2015-213226 A; JP 2015-125464 A; JP 2012-079138 A; JP 2011-030116 A
- However, none of these documents describes a technique for producing a display suited to the user's current situation while guiding the user to a destination.
- An object of the present invention is to provide a technique for solving the above-described problem.
- According to one aspect, a glasses-type wearable information terminal comprises: position detection means for detecting a position; and display control means for displaying a map image for guiding the user to a destination; wherein the display control means changes the transparency of the map image according to the distance between the current location and the point at which the traveling direction should be changed.
- A control method according to the present invention for a glasses-type wearable information terminal comprises: a position detection step of detecting a position; and a display control step of displaying a map image for guiding the user to a destination; wherein the display control step changes the transparency of the map image according to the distance between the current location and the point at which the traveling direction should be changed.
- An information processing program according to the present invention causes a glasses-type wearable information terminal to execute: a position detection step of detecting a position; and a display control step of displaying a map image for guiding the user to a destination; wherein the display control step changes the transparency of the map image according to the distance between the current location and the point at which the traveling direction should be changed.
- a glasses-type wearable information terminal 100 as a first embodiment of the present invention will be described with reference to FIG.
- the glasses-type wearable information terminal 100 is a device having a function of guiding from the current location to the destination.
- the glasses-type wearable information terminal 100 includes a position detection unit 101 and a display control unit 102.
- the position detection unit 101 detects the current position.
- the display control unit 102 displays a map image for guiding from the position acquired by the position detection unit 101 to the destination.
- the display control unit 102 changes the transparency of the map image according to the distance between the current location detected by the position detection unit 101 and the point at which the traveling direction should be changed. For example, when the current position is on a road to be followed straight ahead and is still far from the corner at which to turn, the map image is displayed with high transparency so that the road remains easy to see.
- Conversely, as the current location approaches the corner at which to turn, the display control unit 102 displays the map image with low transparency so that the map image is easy to see.
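This two-level behavior can be sketched as a simple threshold function. The 50 m threshold and the 0.8/0.0 transparency values below are illustrative assumptions, not values taken from the patent:

```python
def map_transparency(distance_to_turn_m, threshold_m=50.0, high=0.8, low=0.0):
    """Two-level transparency control for the guidance map image.

    0.0 means fully opaque (map clearly visible), 1.0 fully transparent.
    Far from the next turn the map is faint so the road ahead stays visible;
    near the turn it becomes opaque and easy to read.
    threshold_m, high and low are illustrative values, not from the patent.
    """
    return high if distance_to_turn_m >= threshold_m else low
```

A renderer would then apply the returned value as the alpha-style transparency of the map layer.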
- FIG. 2 is a diagram for explaining a functional configuration of the glasses-type wearable information terminal 200 according to the present embodiment.
- the glasses-type wearable information terminal 200 includes left and right display units 201 and 202, a camera 203, a position detection unit 204, a destination guidance unit 205, a distance determination unit 206, a roadway determination unit 207, and a display control unit 208.
- the glasses-type wearable information terminal 200 further includes an image composition unit 209, a crossing determination unit 210, an obstacle determination unit 211, and a speed determination unit 212.
- the camera 203 is provided in the glasses-type wearable information terminal and acquires an image including the user's field of view.
- a plurality of cameras 203 may be provided in the glasses-type wearable information terminal 200.
- For example, the first camera is installed at a position from which it captures the range visible to the user's right eye, and the second camera is installed at a position from which it captures the range visible to the user's left eye.
- the first camera can be installed in the right temple portion of the wearable terminal 200
- the second camera can be installed in the left temple portion of the wearable terminal 200.
- the position detection unit 204 can be configured to acquire current position information from GPS (Global Positioning System).
- the destination guidance unit 205 acquires the position information acquired by the position detection unit 204, the information of the destination that has received input from the user, and the map information.
- the destination guidance unit 205 generates a route for guiding to the destination based on the position information, the destination information received from the user, and the map information.
- a route generation method a known method can be used.
- the distance determination unit 206 determines the distance between the current location and the point at which the traveling direction should be changed. Specifically, it acquires the turn points on the route generated by the destination guidance unit 205 and, from these and the current position information, calculates the distance from the current position to the nearest upcoming turn point on the route.
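The distance calculation itself is conventional; a minimal sketch using the haversine great-circle formula might look like the following (the function names and the flat, route-ordered list of turn points are assumptions for illustration):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 coordinates."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2.0 * r * math.asin(math.sqrt(a))

def distance_to_next_turn(current, turn_points):
    """Distance from the current (lat, lon) fix to the nearest upcoming
    turn point on the route; turn_points is assumed to be in route order."""
    lat, lon = current
    next_lat, next_lon = turn_points[0]
    return haversine_m(lat, lon, next_lat, next_lon)
```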
- the roadway determination unit 207 acquires information on whether sidewalks run along roadways and on where such sidewalks are located.
- Based on the current position information and this sidewalk information, the roadway determination unit 207 detects whether a roadway lies to the left or right of the current position.
- the display control unit 208 controls the left and right display units 201 and 202. Specifically, it displays the map image generated by the image composition unit 209 on either or both of the left and right display units 201 and 202, and it controls the transparency of what is displayed on them.
- the image composition unit 209 composes the images acquired by the plurality of cameras 203 into one image.
- the crossing determination unit 210 determines whether the user wearing the glasses-type wearable information terminal 200 is crossing a road. Specifically, it can be configured to acquire pedestrian-crossing positions from the map information and to determine, from these and the current position information, whether the user is crossing.
- the obstacle determination unit 211 may be configured to determine whether there is an obstacle on the route generated by the destination guidance unit 205, or to detect obstacles by analyzing the image acquired by the camera 203.
- the speed determination unit 212 can be configured to determine the speed based on the output value of the acceleration sensor, and can also be configured to detect the speed using GPS reception information.
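The GPS-based variant of this speed estimate can be sketched from two timestamped position fixes; the equirectangular approximation and the function signature here are assumptions for illustration:

```python
import math

def speed_from_fixes(lat1, lon1, t1, lat2, lon2, t2):
    """Rough walking speed (m/s) from two timestamped GPS fixes, using an
    equirectangular approximation (adequate over a few seconds of walking)."""
    r = 6371000.0  # mean Earth radius in metres
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2.0))
    y = math.radians(lat2 - lat1)
    dist = r * math.hypot(x, y)
    dt = t2 - t1
    return dist / dt if dt > 0 else 0.0
```

In practice the accelerometer-based path would be fused with this estimate to smooth out GPS jitter.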
- the glasses-type wearable information terminal 200 further includes an operation unit that receives a user operation. The operation unit does not accept a user operation when there is an obstacle ahead, when moving at an intersection, or when moving at a predetermined speed or higher.
- FIG. 3 is a flowchart showing the flow of processing of the glasses-type wearable information terminal 200.
- In step S301, the destination guidance unit 205 calculates a route to the destination input by the user.
- Next, the destination guidance unit 205 determines whether the user is being guided along the calculated route. If guidance is in progress, the process proceeds to step S303.
- In step S303, the roadway determination unit 207 determines whether the current location detected by the position detection unit 204 is on a sidewalk along a roadway. If the user is walking on such a sidewalk, the process proceeds to step S305.
- the display control unit 208 then displays the map image on whichever of the left and right display units 201 and 202 is on the side away from the roadway. For example, in the case of FIG. 4, the map image can be displayed on the left display unit 201 to keep the view on the roadway side unobstructed.
- the display control unit 208 can also prohibit display on the display unit on the roadway side among the left and right display units 201 and 202.
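The side-selection logic above can be sketched as follows; the `roadway_side` encoding and the returned dictionary format are illustrative assumptions:

```python
def choose_map_display(roadway_side):
    """Pick which eye's display carries the map while walking on a sidewalk.

    roadway_side is 'left', 'right', or None (no adjacent roadway).
    The map goes on the display opposite the roadway, and display on the
    roadway side may be suppressed entirely to keep that view clear.
    """
    if roadway_side == "right":
        return {"map_display": "left", "suppressed": "right"}
    if roadway_side == "left":
        return {"map_display": "right", "suppressed": "left"}
    return {"map_display": "either", "suppressed": None}
```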
- In step S307, the distance determination unit 206 determines the distance between the current location and the point at which the traveling direction should be changed. If the determined distance is greater than or equal to a predetermined distance, the process proceeds to step S319.
- In step S319, the display control unit 208 displays the map image with high transparency so that the area ahead is easy to see.
- Here, high transparency may be, for example, 80%, or may be 100%.
- When the transparency is 100%, the map image is not displayed at all.
- The value used as high transparency can also be set by the user.
- On the other hand, when the current location is less than the predetermined distance from the turn point, the process proceeds to step S309.
- In step S309, the display control unit 208 causes the display unit showing the map image to display it with low transparency so that the map image is easy to see.
- When the transparency is 0%, the forward view is blocked and the map image is displayed clearly.
- FIG. 5 shows the state of the display screen of the display unit when viewed from the user.
- The map image may also be displayed progressively darker by gradually lowering the transparency as the user approaches the corner. This makes it possible to determine at a glance where the traveling direction should be changed, and helps prevent the user from missing the turn. Gradually changing the transparency also prevents the user's field of view from being blocked suddenly.
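This graded control can be sketched as a linear interpolation between the far and near states; the 50 m/5 m bounds and the transparency endpoints are assumed values:

```python
def graded_transparency(distance_m, far_m=50.0, near_m=5.0, high=0.8, low=0.0):
    """Lower the map transparency linearly as the corner approaches, instead
    of switching abruptly, so the user's view is never blocked suddenly."""
    if distance_m >= far_m:
        return high
    if distance_m <= near_m:
        return low
    frac = (distance_m - near_m) / (far_m - near_m)  # 0.0 at near_m, 1.0 at far_m
    return low + frac * (high - low)
```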
- the display control unit 208 displays images covering both fields of view on whichever of the right display unit 201 and the left display unit 202 is not showing the map image.
- This step is an optional process used when cameras 203 are provided on both the left and right sides as shown in FIG.
- Images captured by the right and left cameras can be combined by the image composition unit 209 and displayed on the one display unit that is not showing the map image, as shown in FIG. 6. This makes it possible to keep the entire forward field of view while viewing the map.
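A minimal sketch of such composition, here naively concatenating two equal-height frames side by side (a real implementation would need to rectify, blend, and stitch the overlapping views):

```python
def compose_side_by_side(left_img, right_img):
    """Concatenate the left-camera and right-camera frames row by row into
    one wide frame, so the single display that is not showing the map can
    carry both fields of view."""
    assert len(left_img) == len(right_img), "frames must have the same height"
    return [l_row + r_row for l_row, r_row in zip(left_img, right_img)]
```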
- Next, the crossing determination unit 210 determines whether the user is crossing a road, approaching an intersection, and so on. In situations where the forward view should be kept clear, such as when crossing a road or passing through an intersection, the process proceeds to step S319, and the display control unit 208 displays the map image with high transparency on either the right display unit 201 or the left display unit 202 so that the area ahead is easy to see.
- Otherwise, the process proceeds to step S315, where the obstacle determination unit 211 determines whether there is an obstacle ahead. If there is an obstacle and the forward view should be kept clear, the process advances to step S319, and the display control unit 208 displays the map image with high transparency on either the right display unit 201 or the left display unit 202. For example, a person approaching from the front may be judged an obstacle, causing the map to be displayed faintly, while a person heading in the same direction may not be judged an obstacle, leaving the map displayed clearly.
- If there is no obstacle, the process proceeds to step S317, where the speed determination unit 212 determines the user's traveling speed. When the user is moving faster than a predetermined speed X, the process proceeds to step S319, and the display control unit 208 displays the map image on either the right display unit 201 or the left display unit 202 with high transparency (for example, 80 to 100%).
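The branch structure of steps S307 through S319 described above can be summarized in one decision function; the context field names, thresholds, and transparency values are illustrative assumptions:

```python
def decide_transparency(ctx):
    """One pass of the display-control branches (steps S307 to S319).

    ctx holds: distance_to_turn_m, crossing, near_intersection,
    obstacle_ahead, speed_mps. Any condition that calls for a clear
    forward view forces the faint (high-transparency) map.
    """
    HIGH, LOW = 0.8, 0.0              # assumed; the text cites 80-100% as "high"
    THRESHOLD_M, SPEED_X = 50.0, 2.0  # assumed distance threshold and speed X

    if ctx["distance_to_turn_m"] >= THRESHOLD_M:
        return HIGH                 # S307 -> S319: still far from the turn
    if ctx["crossing"] or ctx["near_intersection"]:
        return HIGH                 # crossing or intersection -> S319
    if ctx["obstacle_ahead"]:
        return HIGH                 # S315 -> S319: obstacle ahead
    if ctx["speed_mps"] >= SPEED_X:
        return HIGH                 # S317 -> S319: moving fast
    return LOW                      # S309: show the map clearly
```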
- the present invention may be applied to a system composed of a plurality of devices or to a single apparatus. It can also be applied where an information processing program that implements the functions of the embodiments is supplied to a system or apparatus directly or remotely. Accordingly, a program installed on a computer to realize the functions of the present invention, a medium storing that program, and a WWW (World Wide Web) server from which the program is downloaded are also within the scope of the present invention. In particular, at least a non-transitory computer-readable medium storing a program that causes a computer to execute the processing steps included in the above-described embodiments is within the scope of the present invention.
Landscapes
- Engineering & Computer Science (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Automation & Control Theory (AREA)
- Business, Economics & Management (AREA)
- Educational Technology (AREA)
- Educational Administration (AREA)
- Mathematical Physics (AREA)
- General Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Optics & Photonics (AREA)
- Navigation (AREA)
- Instructional Devices (AREA)
- Traffic Control Systems (AREA)
Description
Position detection means for detecting a position; and
display control means for displaying a map image for guiding the user to a destination;
a glasses-type wearable information terminal comprising these,
wherein the display control means changes the transparency of the map image according to the distance between the current location and the point at which the traveling direction should be changed.
A position detection step of detecting a position; and
a display control step of displaying a map image for guiding the user to a destination;
a method for controlling a glasses-type wearable information terminal comprising these,
wherein the display control step changes the transparency of the map image according to the distance between the current location and the point at which the traveling direction should be changed.
A position detection step of detecting a position; and
a display control step of displaying a map image for guiding the user to a destination;
an information processing program for causing a glasses-type wearable information terminal to execute these,
wherein the display control step changes the transparency of the map image according to the distance between the current location and the point at which the traveling direction should be changed.
A glasses-type wearable information terminal 100 as a first embodiment of the present invention will be described with reference to FIG. 1. The glasses-type wearable information terminal 100 is a device having a function of guiding the user from the current location to a destination.
Next, a glasses-type wearable information terminal 200 according to a second embodiment of the present invention will be described with reference to FIG. 2. FIG. 2 is a diagram for explaining the functional configuration of the glasses-type wearable information terminal 200 according to this embodiment.
Next, when the process proceeds to step S319, the display control unit 208 displays the map image with high transparency so that the area ahead is easy to see. Here, high transparency may be, for example, 80%, or may be 100%; when the transparency is 100%, the map image is not displayed. The value used as high transparency can also be set by the user. On the other hand, when the current location is less than the predetermined distance from the point at which the traveling direction should be changed, the process proceeds to step S309.
Although the present invention has been described above with reference to embodiments, the present invention is not limited to those embodiments. Various changes that those skilled in the art can understand may be made to the configuration and details of the present invention within its scope. Systems or apparatuses that combine the separate features included in the respective embodiments in any way also fall within the scope of the present invention.
Claims (8)
1. A glasses-type wearable information terminal comprising: position detection means for detecting a position; and display control means for displaying a map image for guiding a user to a destination, wherein the display control means changes the transparency of the map image according to the distance between the current location and a point at which the traveling direction should be changed.
2. The glasses-type wearable information terminal according to claim 1, wherein the display control means further displays the map image with a first transparency so that the area ahead is easy to see when the current location detected by the position detection means is at least a predetermined distance from the point at which the traveling direction should be changed, and displays the map image with a second transparency lower than the first transparency so that the map image is easy to see when the current location is less than the predetermined distance from that point.
3. The glasses-type wearable information terminal according to claim 2, wherein, even when the current location is less than the predetermined distance from the point at which the traveling direction should be changed, the display control means displays the map image with the first transparency when there is an obstacle ahead, when the user is moving through an intersection, or when the user is moving at or above a predetermined speed.
4. The glasses-type wearable information terminal according to claim 1, 2 or 3, further comprising right and left display means, right imaging means for imaging the area ahead on the right, and left imaging means for imaging the area ahead on the left, wherein, when the map image is displayed on one of the right and left display means, the display control means further displays on the other display means the images of both fields of view captured by the right and left imaging means.
5. The glasses-type wearable information terminal according to claim 1 or 2, further comprising right and left display means, wherein the display control means further does not display the map image on whichever of the right and left display means is on the roadway side when the current location detected by the position detection means is on a sidewalk along a roadway.
6. The glasses-type wearable information terminal according to claim 5, further comprising operation means for accepting a user operation, wherein the operation means does not accept a user operation when there is an obstacle ahead, when the user is moving through an intersection, or when the user is moving at or above a predetermined speed.
7. A method for controlling a glasses-type wearable information terminal, comprising: a position detection step of detecting a position; and a display control step of displaying a map image for guiding a user to a destination, wherein the display control step changes the transparency of the map image according to the distance between the current location and a point at which the traveling direction should be changed.
8. An information processing program for causing a glasses-type wearable information terminal to execute: a position detection step of detecting a position; and a display control step of displaying a map image for guiding a user to a destination, wherein the display control step changes the transparency of the map image according to the distance between the current location and a point at which the traveling direction should be changed.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/086,639 US20190041231A1 (en) | 2016-03-23 | 2016-12-22 | Eyeglasses-type wearable information terminal, control method thereof, and control program |
EP16895540.9A EP3435036A4 (en) | 2016-03-23 | 2016-12-22 | Eyeglasses-type wearable information terminal, control method thereof, and control program |
JP2018506779A JP6501035B2 (ja) | 2016-03-23 | 2016-12-22 | 眼鏡型ウェアラブル情報端末、その制御方法および制御プログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016-058018 | 2016-03-23 | ||
JP2016058018 | 2016-03-23 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017163517A1 true WO2017163517A1 (ja) | 2017-09-28 |
Family
ID=59901086
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/088308 WO2017163517A1 (ja) | 2016-03-23 | 2016-12-22 | 眼鏡型ウェアラブル情報端末、その制御方法および制御プログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US20190041231A1 (ja) |
EP (1) | EP3435036A4 (ja) |
JP (1) | JP6501035B2 (ja) |
WO (1) | WO2017163517A1 (ja) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11120593B2 (en) | 2019-05-24 | 2021-09-14 | Rovi Guides, Inc. | Systems and methods for dynamic visual adjustments for a map overlay |
US11674818B2 (en) * | 2019-06-20 | 2023-06-13 | Rovi Guides, Inc. | Systems and methods for dynamic transparency adjustments for a map overlay |
JP7384014B2 (ja) | 2019-12-06 | 2023-11-21 | トヨタ自動車株式会社 | 表示システム |
US11914835B2 (en) | 2020-11-16 | 2024-02-27 | Samsung Electronics Co., Ltd. | Method for displaying user interface and electronic device therefor |
KR20220066578A (ko) * | 2020-11-16 | 2022-05-24 | 삼성전자주식회사 | 유저 인터페이스를 표시하는 방법 및 이를 지원하는 전자 장치 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014202490A (ja) * | 2013-04-01 | 2014-10-27 | パイオニア株式会社 | 端末装置、制御方法、プログラム、及び記憶媒体 |
JP2015215619A (ja) * | 2015-06-10 | 2015-12-03 | ソニー株式会社 | 表示装置、表示方法、プログラム |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10036891B2 (en) * | 2010-10-12 | 2018-07-31 | DISH Technologies L.L.C. | Variable transparency heads up displays |
JP5927966B2 (ja) * | 2012-02-14 | 2016-06-01 | ソニー株式会社 | 表示制御装置、表示制御方法、及びプログラム |
US9129430B2 (en) * | 2013-06-25 | 2015-09-08 | Microsoft Technology Licensing, Llc | Indicating out-of-view augmented reality images |
US20180088323A1 (en) * | 2016-09-23 | 2018-03-29 | Sheng Bao | Selectably opaque displays |
- 2016-12-22: JP JP2018506779 patent/JP6501035B2 (not_active: Expired - Fee Related)
- 2016-12-22: US US16/086,639 patent/US20190041231A1 (not_active: Abandoned)
- 2016-12-22: EP EP16895540.9 patent/EP3435036A4 (not_active: Withdrawn)
- 2016-12-22: WO PCT/JP2016/088308 patent/WO2017163517A1 (active: Application Filing)
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014202490A (ja) * | 2013-04-01 | 2014-10-27 | パイオニア株式会社 | 端末装置、制御方法、プログラム、及び記憶媒体 |
JP2015215619A (ja) * | 2015-06-10 | 2015-12-03 | ソニー株式会社 | 表示装置、表示方法、プログラム |
Non-Patent Citations (1)
Title |
---|
See also references of EP3435036A4 * |
Also Published As
Publication number | Publication date |
---|---|
JP6501035B2 (ja) | 2019-04-17 |
US20190041231A1 (en) | 2019-02-07 |
EP3435036A4 (en) | 2019-04-03 |
EP3435036A1 (en) | 2019-01-30 |
JPWO2017163517A1 (ja) | 2018-09-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017163517A1 (ja) | 眼鏡型ウェアラブル情報端末、その制御方法および制御プログラム | |
JP6176541B2 (ja) | 情報表示装置、情報表示方法及びプログラム | |
US9589194B2 (en) | Driving assistance device and image processing program | |
US20160185219A1 (en) | Vehicle-mounted display control device | |
KR102580476B1 (ko) | 차량의 차량 주변환경 내 가림 영역 산출 방법 및 장치 | |
CA3000110C (en) | Vehicular display device | |
EP2891953A1 (en) | Eye vergence detection on a display | |
JP6695049B2 (ja) | 表示装置及び表示制御方法 | |
JP2015114757A (ja) | 情報処理装置、情報処理方法及びプログラム | |
JP2008015759A (ja) | 運転支援装置 | |
US9418548B2 (en) | Display control device, display control method, non-transitory computer-readable recording medium, and projecting device | |
US20210110791A1 (en) | Method, device and computer-readable storage medium with instructions for controllling a display of an augmented-reality head-up display device for a transportation vehicle | |
CN111095363B (zh) | 显示系统和显示方法 | |
JP2005346177A (ja) | 車両用情報提示装置 | |
JP2019116229A (ja) | 表示システム | |
JP2012153256A (ja) | 画像処理装置 | |
WO2020105685A1 (ja) | 表示制御装置、方法、及びコンピュータ・プログラム | |
JP2006337441A (ja) | ナビゲーション装置、画像表示方法および画像表示プログラム | |
JP2007280203A (ja) | 情報提示装置、自動車、及び情報提示方法 | |
JP2019119357A (ja) | 表示システム | |
JP6365409B2 (ja) | 車両の運転者のための画像表示装置 | |
JP5935858B2 (ja) | 表示装置 | |
JP2011191264A (ja) | 表示制御装置、方法およびプログラム | |
JP2019064422A (ja) | ヘッドアップディスプレイ装置 | |
JP7378892B2 (ja) | 一時停止線表示装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2018506779 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2016895540 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2016895540 Country of ref document: EP Effective date: 20181023 |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16895540 Country of ref document: EP Kind code of ref document: A1 |