WO2010073616A1 - Information display device and information display method - Google Patents
Information display device and information display method
- Publication number
- WO2010073616A1 (PCT/JP2009/007113)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- unit
- video
- real world
- display device
- Prior art date
Classifications
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06T19/006—Mixed reality
- H04N5/445—Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
- H04N21/41407—Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
- H04N21/42202—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] environmental sensors, e.g. for detecting temperature, luminosity, pressure, earthquakes
- H04N21/4223—Cameras
- H04N21/4316—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
- H04N21/44008—Processing of video elementary streams involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
- H04N21/4622—Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet
- H04N21/4725—End-user interface for requesting content, additional data or services for requesting additional data associated with the content using interactive regions of the image, e.g. hot spots
- H04N23/62—Control of camera parameters via user interfaces
- H04N23/632—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N23/67—Focus control based on electronic image sensor signals
Definitions
- The present invention relates to an information display device and an information display method for superimposing and displaying a real world video and information about an object shown in the real world video.
- Objects such as various facilities are projected on real-world images such as landscape images.
- Image display devices that display a real world video and information (hereinafter referred to as “object information”) related to an object displayed on the real world video on a single screen are used in various fields.
- Such a technique is called “augmented reality” (AR) and is also referred to as “enhanced reality” or the like.
- An augmented reality device to which this technology is applied is convenient for acquiring information on the spot while the user is moving.
- Patent Document 1 and Patent Document 2 describe techniques for controlling the object information to be displayed in an apparatus that superimposes and displays real world video and object information (hereinafter simply referred to as an “information display apparatus”).
- FIG. 1 is a diagram showing a configuration of an information display device described in Patent Document 1.
- FIG. 2 is a diagram illustrating an example of a real-world image input by the information display device described in Patent Document 1.
- FIG. 3 is a diagram showing an example of a video in which the real world video and object information are superimposed (hereinafter referred to as an “information superimposed video”), displayed by the information display device shown in FIG. 1 in response to the input video shown in FIG. 2.
- In this device, an independent, known marker 12 for identifying the object is arranged in advance near the object 11.
- The information display device 20 includes an image input unit 21 that inputs an image of the real world, a marker extraction unit 22 that extracts the marker 12 from the input image, and a marker recognition unit 23 that calculates a position and orientation using the extracted marker 12. The information display device 20 further includes a related information acquisition unit 24 that acquires object information associated in advance with the marker 12, a model data creation unit 25 that processes the object information, and a display unit 26 that superimposes and displays the input image and the processed information. As a result, as shown in FIG. 3, an image 33 is displayed in which the real-world image 31 including the image of the object 11 and the object information 32 related to the object 11 are superimposed.
- The model data creation unit 25 of the information display device described in Patent Document 1 changes the drawing position of the displayed information in synchronization with changes in the input image, so that the information follows the display position of the object. Therefore, the information display device described in Patent Document 1 can display information accurately in synchronization with movement of the viewpoint of the input image (movement of the user carrying the camera), for example, even when shooting while rotating around a specimen.
- FIG. 4 is a diagram showing the configuration of the information display device described in Patent Document 2.
- The information display device 40 includes a viewpoint position detection unit 41 that detects the position of the user's viewpoint, a display position detection unit 42 that detects the position of the display 47, and a target position detection unit 43 that detects the position of the object to be displayed.
- In addition, the information display device 40 includes a target specifying unit 44 that specifies the object located on the line connecting the user viewpoint and the center position of the display 47, and a target information search unit 45 that acquires object information from a storage device.
- Further, the information display device 40 includes a video generation unit 46 that selects the level of detail of the reference information to be displayed according to the distance between the user viewpoint and the display 47. As a result, more detailed information is displayed as the user viewpoint approaches the display.
- However, the techniques described in Patent Document 1 and Patent Document 2 have a problem in that the information that the user wants to know cannot be displayed easily.
- An object of the present invention is to provide an information display device and an information display method capable of displaying, in an easy-to-see manner, the information that a user wants to know among the information about objects shown in a real-world video.
- The information display device of the present invention includes a video input unit that inputs a real world video; a distance acquisition unit that acquires the distance between an object shown in the input real world video and the viewpoint of the real world video; an object superimposing unit that outputs a video in which information on the object is superimposed on the real world video; and an information output control unit that controls the information displayed by the object superimposing unit, wherein the information output control unit determines the manner of presenting the information output by the object superimposing unit according to the acquired distance.
- The information display method of the present invention includes: inputting a real world video; acquiring the distance between an object shown in the input real world video and the viewpoint of the real world video; determining a manner of presenting information on the object according to the acquired distance; and superimposing the information on the real world video and outputting the result so that the information is presented in the determined manner.
- According to the present invention, since information related to an object is presented in a manner according to the distance between the viewpoint of the real world video and the object, the information that the user wants to know can be presented in an easy-to-read manner.
- FIG. 1 is a first diagram showing the structure of a conventional information display apparatus.
- FIG. 2 is a diagram showing an example of an input image from a conventional information display apparatus.
- FIG. 3 is a diagram showing an example of an information superimposed image produced by a conventional information display apparatus.
- FIG. 4 is a second diagram showing the structure of a conventional information display apparatus.
- An external view of the information display device according to Embodiment 1 of the present invention.
- A block diagram showing the configuration of the information display device according to Embodiment 1.
- A diagram showing an example of the contents of the object information table in Embodiment 1.
- A diagram illustrating an example of an input video in Embodiment 1.
- A diagram illustrating an example of an information superimposed video in Embodiment 1.
- A block diagram showing the configuration of the information display apparatus according to Embodiment 2 of the present invention.
- A flowchart showing an operation of the information display apparatus according to Embodiment 2.
- A diagram showing an example of the change in the displayed image when a zoom operation is performed in Info mode in Embodiment 2.
- A diagram illustrating an example of the change in the displayed image when a zoom operation is performed in Photo + Info mode in Embodiment 2.
- A diagram illustrating an example of the change in the displayed image when an aperture operation is performed in Photo + Info mode in Embodiment 2.
- A diagram illustrating an example of the change in the displayed video when a touch operation is performed in Photo + Info mode in Embodiment 4.
- A diagram showing an example of the contents of the object information table in Embodiment 5.
- A diagram illustrating an example of the change in the displayed video when a touch operation is performed in Photo + Info mode in Embodiment 5.
- Embodiment 1 of the present invention is an example in which the present invention is applied to a digital still camera that waits for a shooting operation while displaying a real-world image. First, an outline of the operation interface of the information display apparatus according to Embodiment 1 of the present invention will be described.
- FIG. 5 is an external view of the information display apparatus according to the present embodiment.
- FIG. 5 is a plan view of the information display device in the operation state when viewed from above.
- the information display device 100 includes a thin rectangular parallelepiped main body 200, a video input unit 300, a video display unit 400, and an operation interface unit 500 arranged outside the main body 200.
- the operation interface unit 500 includes an operation input unit 510 and a mode switching input unit 520.
- the video input unit 300 has a lens, a shutter, and an image sensor (all not shown), and inputs a real world video.
- the video input unit 300 is configured to be able to operate focus, aperture, and zoom.
- the video display unit 400 has a liquid crystal display (not shown), for example, and displays images and videos.
- The main body 200 displays the real world video input by the video input unit 300 on the video display unit 400 and shoots it.
- The main body 200 also displays, on the video display unit 400, an information superimposed video in which the object information of objects shown in the real world video is superimposed on that video.
- Further, the main body 200 acquires the distance between the viewpoint of the real world video (the position of the information display device 100) and the object, and determines how to present the object information to be displayed according to the acquired distance.
- the operation input unit 510 receives a user operation, and reflects the received user operation on the real world video input by the main body unit 200 and the displayed object information.
- the operation input unit 510 includes a lever 511 and a button 512.
- the lever 511 is a knob that can be rotated in an arc around the button 512.
- the button 512 can be pressed in two stages, half-press and full-press.
- the mode switching input unit 520 includes a switch 521 and a slide unit 522 that slides the switch 521 between three locking positions, and accepts a switching operation from the user for three modes in which the operation of the main body unit 200 is different.
- the first mode is a mode for displaying only real world video (hereinafter referred to as “Photo mode”).
- the second mode is a mode that displays only object information (hereinafter referred to as “Info mode”).
- the third mode is a mode for displaying a video in which the real world video and the object information are superimposed (hereinafter referred to as “Photo + Info mode”).
- FIG. 6 is a block diagram showing the configuration of the information display device 100. The same parts as those in FIG. 5 are denoted by the same reference numerals, and descriptions thereof are omitted as appropriate.
- The information display apparatus 100 includes an operation input unit 510, a mode switching input unit 520, an operation control unit 610, a lens control unit 620, a video input unit 300, a video recognition unit 630, a self-position/orientation calculation unit 640, a distance acquisition unit 650, an object positioning unit 660, a terminal communication unit 670, an object information search unit 680, an information output control unit 690, an information position determination unit 700, an object superimposing unit 710, and a video display unit 400.
- the operation input unit 510 outputs information indicating the content of the user operation (hereinafter referred to as “operation information”) to the operation control unit 610.
- the mode switching input unit 520 outputs information indicating the set mode (hereinafter referred to as “mode information”) to the operation control unit 610.
- The operation control unit 610 controls the operations of the lens control unit 620, the information output control unit 690, and the object superimposing unit 710 according to the input mode information.
- In Photo mode, the operation control unit 610 outputs operation information to the lens control unit 620 while stopping the operation of the information output control unit 690, and causes the object superimposing unit 710 to display only the real world video.
- In Photo + Info mode, the operation control unit 610 outputs operation information to the lens control unit 620 and operates the information output control unit 690, causing the object superimposing unit 710 to display the information superimposed video.
- In Info mode, the operation control unit 610 stops the operation of both the lens control unit 620 and the information output control unit 690, and causes the object superimposing unit 710 to keep displaying the current information superimposed video in a fixed state.
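The per-mode control described above can be sketched as a simple dispatch table. This is a minimal illustration under assumed names (`Mode`, `active_pipelines`), not the patent's implementation:

```python
from enum import Enum

class Mode(Enum):
    PHOTO = "Photo"            # real world video only
    INFO = "Info"              # display frozen, lens and info control stopped
    PHOTO_INFO = "Photo+Info"  # information superimposed video

def active_pipelines(mode: Mode) -> dict:
    """Return which control paths run in each mode (illustrative sketch).

    Mirrors the description: Photo mode drives the lens control while the
    information output control is stopped; Photo+Info drives both; Info mode
    stops both and keeps the current superimposed video fixed on screen.
    """
    return {
        "lens_control": mode in (Mode.PHOTO, Mode.PHOTO_INFO),
        "info_output_control": mode is Mode.PHOTO_INFO,
        "display_frozen": mode is Mode.INFO,
    }
```

A mode switch from the slide switch 521 would then simply select a new `Mode` value and re-evaluate this table.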
- the lens control unit 620 controls the lens of the video input unit 300. Specifically, the lens control unit 620 controls the zoom magnification, the focus position, and the aperture value of the real world video input by the video input unit 300 according to the input operation information.
- the video input unit 300 outputs the input real world video (hereinafter referred to as “input video”) to the video recognition unit 630, the information position determination unit 700, and the object superimposition unit 710. In addition, the video input unit 300 outputs information indicating the focal point of the lens (hereinafter referred to as “focus point”) to the video recognition unit 630 and the object superimposing unit 710.
- The video recognition unit 630 recognizes objects shown in the input video. Specifically, the video recognition unit 630 performs image recognition processing such as pattern matching on the input video, and extracts the areas of the input video in which objects such as buildings or signs are shown (hereinafter referred to as “object areas”). The video recognition unit 630 then outputs information indicating the position of each extracted object area on the input video (hereinafter referred to as “area information”) to the distance acquisition unit 650, the object positioning unit 660, and the information position determination unit 700.
- To allow subsequent processing to be performed for each object, the video recognition unit 630 preferably attaches identification information to each piece of area information before outputting it.
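Pattern matching of the kind the video recognition unit 630 performs can be illustrated with a toy example. The sketch below slides a template over a grayscale frame and returns the best match by sum of squared differences; the function name and the SSD criterion are illustrative assumptions, not the patent's algorithm:

```python
def match_template(frame, template):
    """Find the (row, col) where template best matches frame (toy SSD matcher).

    frame and template are 2-D lists of grayscale values. Returns the
    top-left corner of the best match and its SSD score (0 = exact match).
    """
    fh, fw = len(frame), len(frame[0])
    th, tw = len(template), len(template[0])
    best = (None, float("inf"))
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            ssd = sum(
                (frame[r + i][c + j] - template[i][j]) ** 2
                for i in range(th)
                for j in range(tw)
            )
            if ssd < best[1]:
                best = ((r, c), ssd)
    return best

# Example: a bright 2x2 patch embedded in a dark 4x4 frame.
frame = [
    [0, 0, 0, 0],
    [0, 9, 9, 0],
    [0, 9, 9, 0],
    [0, 0, 0, 0],
]
template = [[9, 9], [9, 9]]
pos, score = match_template(frame, template)  # pos == (1, 1), score == 0
```

The matched region's top-left corner and extent would correspond to one piece of area information, to which an identifier is then attached.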
- The self-position/orientation calculation unit 640 calculates the position and orientation of the information display device 100 at the time the real world video is input. Specifically, the self-position/orientation calculation unit 640 calculates the viewpoint position and the optical axis direction of the video input unit 300 using an angular velocity sensor, a GPS (Global Positioning System) sensor, and the like. It then outputs the calculation result (hereinafter referred to as “position and orientation information”) to the distance acquisition unit 650 and the object positioning unit 660.
- The distance acquisition unit 650 acquires the horizontal distance from the viewpoint of the input video to each object recognized by the video recognition unit 630 (hereinafter referred to as the “object distance”).
- Specifically, the distance acquisition unit 650 measures the actual distance to each object corresponding to the area information from the video recognition unit 630, using a laser distance measuring sensor or the like.
- The distance acquisition unit 650 then converts the measured actual distance into a horizontal distance based on the elevation angle toward the object calculated from the position and orientation information, and outputs the result as the object distance to the object positioning unit 660 and the information output control unit 690.
- Note that depending on the application, such as use in an area where the ground is inclined, the actual distance may be used as the object distance instead of the horizontal distance.
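The conversion above is simple trigonometry: the horizontal distance is the measured line-of-sight distance multiplied by the cosine of the elevation angle. A minimal sketch (function name assumed):

```python
import math

def to_horizontal_distance(actual_distance_m: float, elevation_rad: float) -> float:
    """Project the measured line-of-sight distance onto the horizontal plane.

    elevation_rad is the elevation angle toward the object, derived from the
    position and orientation information.
    """
    return actual_distance_m * math.cos(elevation_rad)

# An object measured at 100 m along a 60-degree elevation is about 50 m away
# horizontally.
d = to_horizontal_distance(100.0, math.radians(60.0))
```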
- First, the object positioning unit 660 acquires the position and orientation information from the self-position/orientation calculation unit 640, the area information from the video recognition unit 630, and the object distance from the distance acquisition unit 650. Next, the object positioning unit 660 measures, from the acquired information, the real-world position of each object shown in the input video (hereinafter simply referred to as the “object position”). The object positioning unit 660 then outputs information indicating the measurement result as latitude and longitude (hereinafter referred to as “latitude/longitude information”) to the object information search unit 680 and the information output control unit 690.
- the terminal communication unit 670 communicates with an information server 740 including an object information database (DB) 730 via a communication network 720 such as the Internet.
- The object information database 730 stores in advance an object information table in which various object information is accumulated for each object in association with the object's position.
- FIG. 7 is a diagram showing an example of the contents of the object information table stored in the object information database 730.
- the object information table 810 describes level 1 to level 4 object information 813 to 816 having different levels of detail in association with the object ID 811 and the object position 812.
- the information server 740 described above can acquire the corresponding ID 811 and level 1 to level 4 object information 813 to 816 from the object information table 810 using the position 812 as an index.
- Level 1 object information 813 is a symbol indicating a facility type.
- Level 2 object information 814 is the name of the facility.
- Level 3 object information 815 is facility topic information.
- Level 4 object information 816 is detailed information of the facility. That is, the level of detail of the object information 813 to 816 of level 1 to level 4 increases in this order.
- an object existing at a position of “latitude: xxx1, longitude: xxx2” is associated with “B” as the facility type and “X tower” as the facility name.
- topics information and detailed information are further associated with the object existing at the position “latitude: xxx1, longitude: xxx2,” respectively.
- “currently holding an event where snow falls in the atrium” is associated as topic information, and “snow is falling at 13:00, 15:00, 17:00” as detailed information. Accordingly, when the latitude / longitude information “latitude: xxx1, longitude: xxx2” is used as a search key, these object information are searched.
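The lookup described above can be modeled as a mapping keyed by position. In the sketch below the table layout, the function name, and the ID value are assumptions for illustration; the coordinate placeholders and the per-level contents come from the example in the text:

```python
# Object information table keyed by position (latitude, longitude), after
# FIG. 7. "xxx1"/"xxx2" are the placeholder coordinates used in the text;
# the ID value is hypothetical since the text does not give one.
OBJECT_INFO_TABLE = {
    ("xxx1", "xxx2"): {
        "id": "obj-001",  # hypothetical value for object ID 811
        "level1": "B",        # symbol indicating the facility type
        "level2": "X tower",  # facility name
        "level3": "Currently holding an event where snow falls in the atrium",
        "level4": "Snow is falling at 13:00, 15:00, 17:00",
    },
}

def search_object_info(latitude, longitude):
    """Look up object information using latitude/longitude as the search key,
    as the object information search unit 680 does via the information server.
    Returns None when no object is registered at that position."""
    return OBJECT_INFO_TABLE.get((latitude, longitude))

info = search_object_info("xxx1", "xxx2")  # info["level2"] == "X tower"
```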
- the object information search unit 680 transmits the input latitude / longitude information to the information server 740 as a search query for the object information database 730 via the terminal communication unit 670. Then, the object information search unit 680 outputs the object information returned from the information server 740 to the information output control unit 690.
- the object information returned here is all of the ID 811 and the level 1 to level 4 object information 813 to 816.
- alternatively, the object information search unit 680 may search for and acquire only the object information at the level to be displayed.
- the information output control unit 690 controls how to present the object information according to the object distance.
- the information output control unit 690 stores display information determination rules in advance and, according to these rules, determines which objects' information to display and how much information to display for each. That is, the information output control unit 690 determines the display-target objects and the amount of information. Then, the information output control unit 690 outputs the object information determined to be a display target (hereinafter referred to as “display object information”) to the information position determination unit 700 and the object superimposition unit 710.
- the display information determination rule prescribes that object information with a higher level of detail is determined as the display target for an object with a shorter object distance.
- FIG. 8 is a diagram showing an example of the contents of the display information determination rule.
- the display information determination rule 820 defines the display / non-display of the object information and the level of the object information in the case of display as the determination content 822 for each range of the object distance 821.
- a determination content 822 of “level 1” is defined for the object distance 821 of “300 m or more and less than 400 m”. This means that when the object distance is 300 m or more and less than 400 m, the level 1 object information (here, the symbol indicating the facility type) is determined as the display object information.
- the display information determination rule 820 associates object information with a higher level of detail as the object distance 821 is shorter within a certain range. Therefore, according to the display information determination rule 820, the displayed object information has more detailed contents as the object distance is shorter.
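The distance-to-level mapping of the display information determination rule 820 might be sketched as follows. Only the “300 m or more and less than 400 m → level 1” range is stated explicitly in the text; the remaining ranges are assumptions chosen to be consistent with the worked example given later (150 m → level 3, 250 m → level 2).

```python
# Sketch of the display information determination rule 820 (FIG. 8).
# Only the 300-400 m range is given in the text; the others are assumed.
DISPLAY_RULE = [
    # (min_distance_m, max_distance_m, level; None means non-display)
    (0,   100, 4),              # assumed: nearest objects get full detail
    (100, 200, 3),              # assumed
    (200, 300, 2),              # assumed
    (300, 400, 1),              # from the text: level 1 (facility-type symbol)
    (400, float("inf"), None),  # assumed: too distant -> not displayed
]

def decide_level(object_distance_m):
    """Return the level of detail to display for a given object distance."""
    for lo, hi, level in DISPLAY_RULE:
        if lo <= object_distance_m < hi:
            return level
    return None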
- the information position determination unit 700 determines the display position of each piece of display object information and outputs the determination result (hereinafter referred to as “display layout information”) to the object superimposing unit 710. Specifically, the information position determination unit 700 detects a vanishing point from linear components included in the input video. Then, based on the vanishing point, the region information, and the display object information, the information position determination unit 700 determines the display position of each piece of display object information so that it follows a line segment extending radially from the vanishing point and does not overlap the object region.
- the information position determination unit 700 may instead determine the display position of each piece of display object information along a line connecting the edges of a plurality of object areas. As a result, the information position determination unit 700 can easily determine display positions that do not overlap the object areas and that correspond to the arrangement of the objects in the real world.
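The radial-layout idea can be sketched as follows. The geometry here (axis-aligned object boxes, fixed spacing along the ray from the vanishing point) is an illustrative assumption, not the patented method itself.

```python
# Sketch: place labels along a ray from the vanishing point, skipping
# candidate positions that would fall inside an object region.
def layout_along_ray(vanishing_point, direction, object_boxes, n_labels, step=40):
    """Return n_labels positions spaced along a ray from the vanishing point.

    object_boxes: list of (x0, y0, x1, y1) axis-aligned object regions.
    """
    vx, vy = vanishing_point
    dx, dy = direction
    positions, t = [], step
    while len(positions) < n_labels:
        x, y = vx + dx * t, vy + dy * t
        # skip positions that overlap any object region
        if not any(x0 <= x <= x1 and y0 <= y <= y1
                   for x0, y0, x1, y1 in object_boxes):
            positions.append((x, y))
        t += step
    return positions
```

Labels placed this way follow the depth direction of the scene while staying clear of the object regions themselves.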
- the object superimposing unit 710 generates video data for displaying the information superimposed video, in which the display object information is arranged at the positions indicated by the display layout information on the input video, and outputs the generated video data to the video display unit 400. Specifically, the object superimposing unit 710 generates video data that displays only the input video in the Photo mode, and video data that displays the information superimposed video in the Info mode or the Photo + Info mode.
- the video display unit 400 converts the video data generated by the object superimposing unit 710 into a video and displays it on the liquid crystal display.
- the information display device 100 includes, for example, a CPU (central processing unit), a storage medium such as a ROM (read only memory) storing a control program, a working memory such as a RAM (random access memory), and the like.
- the function of each unit is realized by the CPU executing the control program.
- with the information display device 100 described above, it is possible to superimpose and display information about an object on the input video, presenting the information in a manner according to the distance between the viewpoint of the input video and the object.
- FIG. 9 is a flowchart showing the operation of the information display apparatus 100.
- in step S1010, the video input unit 300 inputs a real world video and sends the input video to the video recognition unit 630, the information position determination unit 700, and the object superimposition unit 710.
- FIG. 10 is a diagram illustrating an example of an input video input by the video input unit 300.
- first to third objects 832-1 to 832-3 positioned in order from the near side along the road 831 are projected on the input image 830.
- the first to third objects 832-1 to 832-3 are buildings of different facilities.
- the video recognition unit 630 extracts and identifies one or more object areas from the input video. For example, the video recognition unit 630 extracts and identifies first to third object areas 833-1 to 833-3 as the object areas of the first to third objects 832-1 to 832-3 from the input video 830 shown in FIG. 10. Then, the video recognition unit 630 selects all or arbitrary object areas from the identified object areas as processing targets.
- for example, the video recognition unit 630 preferentially selects object areas with a high recognition rate, up to a preset threshold number, or selects only the object area that is in focus and the object areas included in a range before and after it. For example, when 100 object areas are extracted from one video, they are not handled equally but are processed according to a priority order. The recognition rate is an index that indicates the certainty of recognition, calculated from features of the object region.
- the features of the object region are, for example, the steepness of the pixel-value gradient at the boundary between the background and the object, or the size of the area of the region enclosed by ridge portions (edges) having high pixel values within the object region. By restricting the object areas that are actually processed in this way, the processing load on the subsequent stages can be reduced.
- in step S1030, the video recognition unit 630 selects one object area from the object areas selected as processing targets. Then, the video recognition unit 630 sends the region information of the selected object region to the distance acquisition unit 650, the object positioning unit 660, and the information position determination unit 700.
- in step S1040, the distance acquisition unit 650 calculates the object distance of the selected object based on information input from the own position / posture calculation unit 640 and the video recognition unit 630. Then, the distance acquisition unit 650 sends the calculated object distance to the object positioning unit 660 and the information output control unit 690.
- the own position / orientation calculation unit 640 generates, as position / orientation information, the position and optical axis direction of the viewpoint of the video input unit 300 calculated using an angular velocity sensor, a GPS sensor, and the like.
- the distance acquisition unit 650 measures the actual distance to each object based on the area information from the video recognition unit 630 using a laser-type distance measuring sensor or the like. Then, the distance acquisition unit 650 converts the measured actual distance into a horizontal distance based on the elevation angle in the direction of the object calculated from the position and orientation information, and outputs it as the object distance.
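The slant-to-horizontal conversion is simple trigonometry: the measured line-of-sight distance is projected onto the horizontal plane using the elevation angle toward the object. A minimal sketch, with illustrative names:

```python
import math

# Sketch of the conversion performed by the distance acquisition unit 650:
# project the measured line-of-sight distance onto the horizontal plane.
def to_horizontal_distance(measured_distance_m, elevation_angle_deg):
    """Convert a measured slant distance to a horizontal object distance."""
    return measured_distance_m * math.cos(math.radians(elevation_angle_deg))
```

For an object measured at 100 m along a 60-degree elevation, the horizontal distance is 50 m; at zero elevation the two distances coincide.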
- in step S1050, the object positioning unit 660 calculates the position of the object corresponding to the object region using the position / orientation information, the azimuth toward the object calculated from the position / orientation information, and the object distance.
- the object positioning unit 660 sends the calculated latitude / longitude information of the object position to the object information search unit 680 and the information output control unit 690.
- in step S1060, the object information search unit 680 sends a search query using the latitude / longitude information as a search key to the information server 740, and acquires the corresponding object information. Then, the object information search unit 680 sends the acquired object information to the information output control unit 690.
- in step S1070, the information output control unit 690 determines the objects to be displayed (hereinafter referred to as “display objects”).
- for example, the information output control unit 690 determines as display objects all objects for which object information was acquired, the object corresponding to the in-focus object area, or the objects corresponding to the object areas included in its surrounding range. Alternatively, when an attribute (hotel, department store, etc.) of the objects to be displayed is specified, the information output control unit 690 may determine the objects corresponding to the specified attribute as the display objects.
- in step S1080, the information output control unit 690 determines the amount of information using the above-described display information determination rule so that the level of detail becomes higher as the object distance becomes shorter.
- the amount of information determined here is the amount of information when presenting the object information of the display object.
- the information output control unit 690 sends the determined display object and object information corresponding to the determined amount of information to the information position determining unit 700 and the object superimposing unit 710 as display object information.
- in step S1090, the information position determination unit 700 calculates a line segment representing the depth of the input video from the arrangement pattern of the plurality of object areas and the linear components included in the input video.
- in step S1100, the information position determination unit 700 determines the display layout based on the line segment calculated in step S1090, and sends the display layout information to the object superimposing unit 710.
- in step S1110, the object superimposing unit 710 generates video data of the information superimposed video in which the display object information is arranged according to the display layout information.
- in step S1120, the video recognition unit 630 determines whether or not an unprocessed object area remains among the object areas selected as processing targets in step S1020. If an unprocessed object area remains (S1120: YES), the video recognition unit 630 returns to step S1030, selects an unprocessed object area, and executes the processes of steps S1030 to S1110. As a result, when there are a plurality of pieces of display object information, video data on which the plurality of pieces of object information are superimposed is generated in step S1110. If no unprocessed object area remains (S1120: NO), the video recognition unit 630 proceeds to step S1130.
- in step S1130, the object superimposing unit 710 outputs the generated video data to the video display unit 400 to display the information superimposed video.
- in step S1140, the information display apparatus 100 determines whether or not an instruction to stop displaying the information superimposed video has been given by a user operation or the like (for example, an operation to be switched to the Photo mode). If not instructed to stop displaying the information superimposed video (S1140: NO), the information display apparatus 100 returns to step S1010; if instructed to stop (S1140: YES), the series of processing ends.
- FIG. 11 is a diagram illustrating an example of the information superimposed video displayed by the video display unit 400.
- here, an information superimposed video displayed corresponding to the input video shown in FIG. 10 is shown, using the object information table 810 shown in FIG. 7 and the display information determination rule 820 shown in FIG. 8.
- it is assumed that the first to third objects 832-1 to 832-3 shown in FIG. 10 correspond to the objects with ID “001”, ID “002”, and ID “003” shown in FIG. 7, respectively.
- it is also assumed that the object distances of the first to third objects 832-1 to 832-3 are 150 m, 250 m, and 350 m, respectively.
- in this case, level 3 object information (topic information) is determined as the display object information for the first object 832-1, level 2 object information (the facility name) for the second object 832-2, and level 1 object information (the symbol indicating the facility type) for the third object 832-3.
- as shown in FIG. 11, the first to third object information 841-1 to 841-3 corresponding to the first to third object areas 833-1 to 833-3 are drawn superimposed on the input video 830.
- the first to third object information 841-1 to 841-3 are arranged along a line segment 842 (not actually displayed) indicating the depth of the input video, so that they do not overlap one another.
- the contents of the first to third object information 841-1 to 841-3 are more detailed as they are closer to the viewpoint of the input image 830, that is, closer to the front.
- for example, the information amount of the foremost first object information 841-1 is as large as 20 characters, while the information amount of the distant third object information 841-3 is as small as 1 character.
- information that the user wants to know can be displayed to the user in an easy-to-view manner, and the user's movement and behavior can be more appropriately supported.
- when the image of an object projected on the input video moves continuously, the information display device 100 may track the image of the object area and move the displayed object information on the screen accordingly. By doing so, the information display apparatus 100 can omit the processing of steps S1020 to S1070 and reduce the processing load.
- the information display apparatus 100 can recalculate line segments indicating the object area, the object distance, and the depth based on the change in the position and orientation information. As a result, the information display apparatus 100 can simplify the processing of steps S1080 to S1100 and reduce the processing load.
- the information display apparatus 100 may process steps S1030 to S1110 collectively.
- the information display apparatus 100 may perform a plurality of processes that can be performed simultaneously in parallel, such as the processes in steps S1030 to S1080 and the process in step S1090.
- as described above, the information display apparatus 100 according to the present embodiment can display information about an object by superimposing it on the real world video, with an information amount corresponding to the distance between the viewpoint of the real world video and the object. That is, the information display device 100 according to the present embodiment performs superimposed display on the real world video while changing the method of presenting the object information according to the distance from the user carrying the information display device 100 to the object.
- in this way, the information display apparatus 100 according to the present embodiment presents a large amount of information for near objects and a small amount for distant objects, producing a display that corresponds to the depth of the scene. The information display apparatus 100 according to the present embodiment can therefore avoid cluttering the display with excessive information, and can present the information that the user wants to know in an easy-to-view manner.
- the information display apparatus 100 may acquire the latitude / longitude information, specify an object, or specify object information using another method, such as the method described in Patent Document 1.
- Embodiment 2 of the present invention is an example in which displayed object information is changed based on a user operation.
- specifically, the information display device according to the present embodiment reflects user operations on the operation input unit 510 in the input real world video (such as zooming and focusing of the video), and changes the object information in conjunction with the real world video.
- FIG. 12 is a block diagram showing the configuration of the information display apparatus according to the present embodiment, and corresponds to FIG. 6 of the first embodiment.
- the same parts as those in FIG. 6 are denoted by the same reference numerals, and description thereof will be omitted.
- the information display apparatus 100a differs from the configuration of FIG. 6 in that it includes an operation control unit 610a, a video recognition unit 630a, an information output control unit 690a, and an object superimposing unit 710a in place of the operation control unit 610, the video recognition unit 630, the information output control unit 690, and the object superimposing unit 710.
- the operation control unit 610a determines how the user operation in the operation input unit 510 is reflected in the input real world video and the displayed object information according to the mode set in the mode switching input unit 520. Control.
- specifically, the operation control unit 610a reflects user operations in the input video in the Photo mode and, in the Photo + Info mode, further changes the displayed object information in conjunction with the input video. In the Info mode, the operation control unit 610a reflects user operations only in the displayed object information and not in the input video (that is, it performs no lens control).
- the video recognizing unit 630a recognizes an object displayed in the input video, selects an object area of the object where the focus point is located, or an object area including surrounding objects, and outputs corresponding area information. Specifically, the video recognition unit 630a selects an object region of an object that is located within the range of the depth of field and is in focus. That is, the video recognition unit 630a can accept a user's selection of an object for which object information is to be displayed by a zoom operation, a focus operation, and an aperture operation.
- the information output control unit 690a controls the object information to be displayed and its information amount according to the object distance. In addition, the information output control unit 690a determines that objects located within the depth of field range are display targets when an aperture operation is performed, and controls the object information to be displayed according to the zoom magnification when a zoom operation is performed.
- the object superimposing unit 710a generates video data of an information superimposed video in which display object information is arranged according to display layout information and an icon indicating a focus position is superimposed. Further, the object superimposing unit 710a switches the display mode of the object information being displayed when a focus lock operation is performed.
- the operation control unit 610a detects when an aperture operation of the video input unit 300 is performed by an aperture adjustment unit provided in the information display device 100a (for example, using the lever 511 of the operation input unit 510), and generates operation information indicating the contents of the aperture operation.
- FIG. 13 is a flowchart showing the operation of the information display device 100a, and corresponds to FIG. 9 of the first embodiment.
- the same parts as those in FIG. 9 are denoted by the same step numbers, and description thereof will be omitted.
- in step S2010, the operation control unit 610a determines whether or not operation information has been input from the operation input unit 510.
- the operation control unit 610a proceeds to step S2020 when the operation information is input (S2010: YES), and proceeds to step S1020 of FIG. 9 when the operation information is not input (S2010: NO).
- the operation information is information indicating that the lever 511 of the operation input unit 510 has been moved 30 degrees to the right.
- in step S2020, the operation control unit 610a determines whether the mode is the Photo + Info mode or the Photo mode, that is, whether the mode involves lens control.
- the operation control unit 610a proceeds to step S2030 when it is in the Photo + Info mode or the Photo mode (S2020: YES), and proceeds to step S2040 when it is in the Info mode (S2020: NO).
- in step S2040, the operation control unit 610a determines whether the mode is the Photo + Info mode or the Info mode, that is, whether the mode includes the display of object information. If the mode is the Photo + Info mode or the Info mode (S2040: YES), the operation control unit 610a proceeds to step S2050. If the mode is the Photo mode (S2040: NO), the operation control unit 610a proceeds to step S1020 in FIG. 9.
- in step S2050, the operation control unit 610a determines whether or not a focus operation has been performed based on the input operation information.
- the operation control unit 610a proceeds to step S2060 when the focus operation is performed (S2050: YES), and proceeds to step S2070 when the focus operation is not performed (S2050: NO).
- in step S2060, the operation control unit 610a causes the video recognition unit 630a to acquire in-focus distance information indicating the in-focus distance at the focus point, and the process proceeds to step S2070.
- specifically, the video recognition unit 630a acquires the zoom position and the focus value of the lens, and acquires the in-focus distance information by referring to a distance table in which these values are described in advance in association with the in-focus distance.
- in step S2070, the operation control unit 610a determines whether or not an aperture operation has been performed based on the input operation information.
- the operation control unit 610a proceeds to step S2080 when the diaphragm operation is performed (S2070: YES), and proceeds to step S2090 when the diaphragm operation is not performed (S2070: NO).
- in step S2080, the operation control unit 610a causes the information output control unit 690a to acquire depth of field information indicating the depth of field of the lens, and the process proceeds to step S2090.
- specifically, the information output control unit 690a acquires the zoom value, the focus value, and the aperture value of the lens, and acquires the depth of field information by referring to a depth-of-field table in which these values are described in advance in association with the depth of field.
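The table-based depth-of-field lookup, and the subsequent selection of the objects that fall inside that range (used later in step S1070), might be sketched as follows. All table entries, names, and values here are illustrative assumptions.

```python
# Sketch of the depth-of-field lookup for the information output control
# unit 690a. A smaller aperture value (wider opening) yields a narrower
# depth of field; all entries are assumed for illustration.
DOF_TABLE = {
    # (zoom_value, focus_value, aperture_value) -> (near_m, far_m)
    (1, 150, 2.8): (120.0, 200.0),  # open aperture: narrow depth of field
    (1, 150, 8.0): (80.0, 400.0),   # stopped down: wide depth of field
}

def get_depth_of_field(zoom_value, focus_value, aperture_value):
    """Look up the depth-of-field range for the current lens settings."""
    return DOF_TABLE.get((zoom_value, focus_value, aperture_value))

def objects_in_depth_of_field(object_distances, dof_range):
    """Select the objects located within the depth-of-field range."""
    near, far = dof_range
    return [d for d in object_distances if near <= d <= far]
```

With the example distances of 150 m, 250 m, and 350 m, opening the aperture to the narrow range leaves only the nearest object as a display target, which matches the behavior described for the aperture operation.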
- in step S2090, the operation control unit 610a determines whether or not a zoom operation has been performed based on the input operation information.
- the operation control unit 610a proceeds to step S2100 when the zoom operation is performed (S2090: YES), and proceeds to step S2110 when the zoom operation is not performed (S2090: NO).
- in step S2100, the operation control unit 610a causes the information output control unit 690a to acquire magnification information indicating the zoom magnification, and the process proceeds to step S2110.
- specifically, the information output control unit 690a acquires the zoom value of the lens, and acquires the magnification information by referring to a magnification table in which this value is described in advance in association with the zoom magnification.
- in step S2110, the operation control unit 610a determines whether or not a focus lock operation has been performed based on the input operation information. If a focus lock operation has been performed (S2110: YES), the operation control unit 610a proceeds to step S2120. If no focus lock operation has been performed (S2110: NO), the operation control unit 610a proceeds to step S1020 in FIG. 9.
- in step S2120, the operation control unit 610a causes the object superimposing unit 710a to acquire in-focus object information indicating the object that is the target of the focus lock, and the process proceeds to step S1020 in FIG. 9.
- the object superimposing unit 710a acquires the focused object information based on the area information of the object where the focus point is located.
- thereafter, the information display device 100a executes steps S1020 to S1140 in FIG. 9 to display the information superimposed video.
- the processing in each step is different from that in the first embodiment in accordance with the information acquisition in steps S2060, S2080, S2100, and S2120. Processing contents different from the first embodiment will be described.
- in step S1030, the video recognition unit 630a determines candidate points or candidate areas representing the focus point or the object areas around the focus point, based on the in-focus distance information acquired in step S2060. Then, the video recognition unit 630a causes the object superimposing unit 710a to display a mark indicating the determined candidate points or candidate areas.
- in this mark display, for example, the video recognition unit 630a selects only the object regions around the focus point, the information position determination unit 700 determines the position of the mark based on the region information, and the object superimposing unit 710a superimposes the mark at the determined position.
- further, the video recognition unit 630a determines the amount of information that can be displayed and the way of presenting it from the candidate points or candidate areas. For example, the video recognition unit 630a decides how to present the information, including the displayable amount of information, based on the number of objects, the ratio of the object areas to the entire input video, the degree of change in luminance and color of the areas other than the object areas, and the like. Then, the video recognition unit 630a compares the number of objects with the information amount of each piece of object information, and selects candidate points or candidate areas so that the amount of information presented does not become excessive.
- the area where the object information can be displayed is an area other than the object area, or an area obtained by adding a part of the object area to the area other than the object area.
- the video recognition unit 630a may specify candidate points or candidate areas in response to a user operation.
- the video display unit 400 also serves as a touch panel type input unit, and accepts a selection according to an area touched by the user.
- the video recognition unit 630a may receive a user operation on another operation unit such as a cross key and move a candidate point or a candidate region to be operated and determine whether to select or not select.
- the video recognition unit 630a sends the extracted region information of the object region to the distance acquisition unit 650, the object positioning unit 660, and the information position determination unit 700, and proceeds to step S1040.
- the video recognition unit 630a sends area information of an object area corresponding to the selected point or area.
- in step S1070, the information output control unit 690a determines, as the display object information, the object information corresponding to the objects located within the depth of field, based on the depth of field information acquired in step S2080.
- in step S1080, the information output control unit 690a determines the display object information based on the magnification information acquired in step S2100, so that the higher the zoom magnification, the higher the level of detail. For example, the information output control unit 690a determines the display object information by changing the level of detail according to the zoom magnification; specifically, the level of detail defined in the display information determination rule is increased by 1 when the zoom magnification is small and by 2 when it is large.
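The zoom-linked level adjustment described above can be sketched as follows. The thresholds distinguishing "small" and "large" zoom magnifications are assumptions, and the maximum level of 4 follows the four-level object information table of FIG. 7.

```python
MAX_LEVEL = 4  # levels 1 to 4, per the object information table of FIG. 7

# Sketch of the zoom-linked adjustment of step S1080: raise the level of
# detail chosen by the distance-based rule by 1 for a small zoom
# magnification and by 2 for a large one. Thresholds are assumed.
def adjust_level_for_zoom(base_level, zoom_magnification):
    """Return the detail level after accounting for the zoom magnification."""
    if zoom_magnification >= 4.0:    # assumed threshold for a "large" zoom
        increment = 2
    elif zoom_magnification > 1.0:   # assumed threshold for a "small" zoom
        increment = 1
    else:
        increment = 0
    return min(base_level + increment, MAX_LEVEL)
```

For instance, an object whose distance alone would yield level 2 is shown at level 4 under a large zoom magnification, mirroring the behavior in which zooming in reveals more detailed object information.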
- in step S1110, the object superimposing unit 710a changes the display mode of the object information being displayed based on the in-focus object information acquired in step S2120, and superimposes it on the input video.
- the object superimposing unit 710a increases the transparency of the object information being displayed so that the input image can be seen through the object information.
- the display form of the object information thus changes in conjunction (synchronization) with the focus lock operation on the input video.
- in the Info mode, only the display form of the object information changes.
- FIG. 14 is a diagram illustrating an example of a change in the display image when the zoom operation is performed in the Info mode, and corresponds to FIG. 11 of the first embodiment.
- in FIG. 14A, it is assumed that the focus point 851 is located in the first object area 833-1 and only the first object information 841-1 is displayed.
- when a zoom-up operation is performed in the Info mode as shown in FIG. 14B, the level of detail of the first object information 841-1 increases while the input video itself remains not zoomed up, as shown in FIG. 14C.
- the user can obtain more detailed information about a desired object without changing the state of the input image.
- FIG. 15 is a diagram illustrating an example of a change in the display image when the zoom operation is performed in the Photo + Info mode, and corresponds to FIG.
- in FIG. 15A, it is assumed that the focus point 851 is located in the first object area 833-1 and only the first object information 841-1 is displayed.
- when a zoom-up operation is performed in the Photo + Info mode, the input video is zoomed up as shown in FIG. 15C, and the level of detail of the first object information 841-1 increases in conjunction with the zooming in of the input video.
- as a result, the user can obtain more detailed information about the desired object through the natural operation of zooming in on the object of interest.
- FIG. 16 is a diagram showing an example of a change in the display image when the diaphragm operation is performed in the Photo + Info mode, and corresponds to FIG.
- first to third object information 841-1 to 841-3 are displayed as shown in FIG. 16A.
- in FIG. 16B, it is assumed that an operation of opening the aperture is performed in the Photo + Info mode.
- in FIG. 16C, for example, the first and second candidate points 852-1 and 852-2 are displayed as a guide near the focus point 851.
- the final display is then limited to only the object information 841-2 corresponding to the second candidate point 852-2, that is, to the object within the range of the depth of field narrowed by the operation of opening the aperture.
- the detail level of the object information can be changed in conjunction (synchronization) with the normal aperture operation.
- the user can obtain more detailed information regarding the desired object by a natural operation of opening the aperture when the object of interest is limited.
- the user can adjust the number of displayed object information by adjusting the aperture.
- FIG. 17 is a diagram showing an example of a change in the display image when the focus lock operation is performed in the Info mode, and corresponds to FIG.
- here, it is assumed that the first object information 841-1 is displayed. When a focus lock operation is performed, the first object information 841-1 is displayed with a transparency of 50%, as shown in FIG. 17C.
- in this embodiment, it is thus possible to change the display mode of the object information in conjunction with the normal focus lock operation. As a result, the user can know more reliably that the focus is locked on the object. Further, when the object information is made transparent, the user can view the entire input video while still confirming the object information.
- the information display device 100a according to the present embodiment allows the user to easily select the object information to be displayed and the level of detail of the displayed object information.
- the information display device 100a according to the present embodiment allows the user to arbitrarily choose whether the displayed object information changes in conjunction with operations on the input video, that is, with normal camera operations, or is operated independently of them. Therefore, the information display device 100a according to the present embodiment can display the information the user wants to know, among the object information of the real world video, more easily and more accurately. Further, the information display device 100a according to the present embodiment can easily display object information for a desired plurality of objects. In other words, the information display device 100a according to the present embodiment realizes intuitive control over the target the user wants to know about and the amount of information the user wants, and can provide a large amount of information while showing, through simple operations, only the information the user wants to see.
- the correspondence between the specific operation content in the operation input unit 510 of the information display apparatus 100a according to the present embodiment and the internal adjustment content such as zoom is not limited to a specific one.
- the detail level of the object information may be continuously variable.
- when the operation input unit 510 can arbitrarily set continuous variables such as a duration or a rotation angle, the information display device 100a may link the set variables to the level of detail of the object information and display it changing continuously.
- the zoom operation can continuously change the zoom rate. Therefore, it is conceivable that the information display device 100a changes the level of detail of the object information in conjunction with the zoom rate.
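The zoom-linked detail level described above can be sketched as follows. This is a hypothetical illustration only: the patent does not specify a mapping, so the linear mapping, the lens range, and the four detail levels (matching the level 1 to level 4 object information used elsewhere in the document) are assumptions.

```python
# Hypothetical sketch: mapping a continuous zoom magnification onto a
# discrete detail level of object information, so that zooming in
# gradually reveals more detailed information.

def detail_level_for_zoom(zoom, min_zoom=1.0, max_zoom=10.0, levels=4):
    """Linearly map a zoom magnification onto detail levels 1..levels."""
    zoom = max(min_zoom, min(zoom, max_zoom))       # clamp to lens range
    fraction = (zoom - min_zoom) / (max_zoom - min_zoom)
    return 1 + int(fraction * (levels - 1) + 0.5)   # round to nearest level

print(detail_level_for_zoom(1.0))   # 1 (wide angle: labels only)
print(detail_level_for_zoom(5.5))   # 3 (mid zoom)
print(detail_level_for_zoom(10.0))  # 4 (full zoom: most detail)
```

Any monotonic mapping would serve; the point is only that a continuous camera variable (zoom rate) can drive the discrete information granularity in lockstep.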
- the object information becomes more detailed when the lever 511 of the operation input unit 510 is turned to the right.
- the present invention is not limited to this.
- the information display device 100a according to the present embodiment may set a more detailed level when the lever 511 is rotated counterclockwise, or may set the level of detail at finer intervals, and may instead assign the opposite function to each rotation direction.
- the form of the operation input unit 510 is not limited to the above-described contents, and may be any form that can adjust the zoom magnification and the like.
- the operation input unit 510 may take a form that allows an operation independent of the zoom operation, such as a form using a lens aperture ring of the video input unit 300 or a form using a cross key.
- the user can operate a cross key or the like while viewing the aperture value displayed on the video display unit 400.
- the mode switching input unit 520 is not limited to the above content; other notations, such as “Info, Video + Info, Video” or “navigation, movement, camera”, may be adopted.
- the change in the display mode of the object information when the focus lock operation is performed is not limited to the above content.
- the change of the display form may be a change to an arbitrary transparency, an information highlighting display, or the like.
- the highlighting includes, for example, changes in the color, darkness, brightness, thickness, and font of the characters, blinking of the characters, display of a frame surrounding the characters, and emphasis of the edge of the object information display area.
- the information display device 100a may change the level of detail of the object information according to the number of objects to be displayed. Specifically, for example, when the number of objects to be displayed is small, the information display apparatus 100a increases the level of detail of the object information above the level specified by the display information determination rule.
- the information display device 100a may control the effective time zone of the object information according to the shutter speed. Specifically, for example, when a fast shutter speed is selected, the information display device 100a presents information associated with a narrow time span before and/or after the current time or a specified time. Alternatively, when a slow shutter speed is selected, the information display device 100a continuously displays information associated with a wide time span before and/or after the current time or a specified time.
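The shutter-speed control just described amounts to deriving a time window from a camera setting. The sketch below is a hypothetical illustration: the patent gives no formula, so the linear scale factor relating shutter seconds to window width is purely an assumption.

```python
# Hypothetical sketch: deriving the effective time window of object
# information from the shutter speed. A fast shutter speed selects a
# narrow time span around the reference time; a slow one, a wide span.

from datetime import datetime, timedelta

def time_window(reference, shutter_seconds, scale=3600.0):
    """Return a (start, end) window centered on `reference`.

    shutter_seconds: e.g. 1/1000 for a fast shutter, 1.0 for a slow one.
    scale: assumed seconds-of-window per shutter-second (1 s -> +-30 min).
    """
    half_span = timedelta(seconds=shutter_seconds * scale / 2)
    return reference - half_span, reference + half_span

ref = datetime(2009, 11, 4, 8, 30)
fast = time_window(ref, 1 / 1000)   # ~ +-1.8 s: essentially "now" only
slow = time_window(ref, 1.0)        # +-30 min: a broader slice of the day
print(fast[1] - fast[0], slow[1] - slow[0])
```

Object information whose effective time falls inside the window would then be displayed, tying the familiar exposure-time metaphor to the temporal reach of the information.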
- the information display device 100a may change the degree of detail and the display form of the object information in accordance with other camera operations and camera settings (for example, the degree of exposure compensation (sensitivity adjustment)). Specifically, for example, the information display device 100a controls whether to present only information on facilities visible from the viewpoint or to also include information on facilities hidden behind buildings and not visible from the viewpoint.
- the information display device 100a may be able to select a genre of object information to be displayed, such as restaurant information and amusement information, by a dial type or equivalent menu similar to the shooting menu in the camera.
- the information display device 100a may change the color of the object area and the object information on which the pointer is placed. Furthermore, the information display device 100a may include an object information search interface, and may display the object information found as a search result superimposed on the video by changing its color or blinking it while pointing to it with a pointer such as an arrow.
- Embodiment 3 of the present invention is an example in which information superimposed video is recorded on a removable storage medium.
- the generated information superimposed video is not only displayed but also recorded on a removable storage medium.
- FIG. 18 is a block diagram showing a configuration of the information display apparatus according to the present embodiment, and corresponds to FIG. 6 of the first embodiment.
- the same parts as those in FIG. 6 are denoted by the same reference numerals, and description thereof will be omitted.
- the information display device 100b includes a storage medium 750b and a recording / reading unit 760b.
- the storage medium 750b is connected to the recording / reading unit 760b.
- the recording / reading unit 760 b is disposed between the object superimposing unit 710 and the video display unit 400.
- the storage medium 750b is a removable storage medium, for example, an SD memory card.
- in the information superimposed video recording mode, the recording / reading unit 760b transfers the video data of the information superimposed video to the video display unit 400 and records it in the storage medium 750b. Further, in the information superimposed video reproduction mode, the recording / reading unit 760b reads the video data of the information superimposed video recorded on the storage medium 750b and sends it to the video display unit 400.
- the setting of the information superimposed video recording mode and the information superimposed video playback mode may be performed by the operation input unit 510 and the operation control unit 610, for example.
- the video data of the information superimposed video may be video data obtained by combining the input video with the image of the object information, or may be data in which the input video and the data related to the object information are kept separate in a form from which the information superimposed video can be reproduced.
- the data related to the object information is, for example, object information recorded in association with coordinates (display layout information) representing its spatial position in the video and a time code representing the temporal position in the video at which it is superimposed.
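A record in the separated format described above might look like the following. This is a hypothetical sketch only: the patent specifies the content (object information, display-layout coordinates, time code) but not a concrete encoding, so the JSON representation and all field names are assumptions.

```python
# Hypothetical sketch of one record of "data related to object information":
# the object information together with display-layout coordinates and a
# video time code, so the superimposed video can be re-created on playback.

import json

record = {
    "object_id": "obj-001",
    "info": "stop by transfer",          # the object information itself
    "layout": {"x": 120, "y": 80},       # on-screen position (pixels)
    "time_code": "00:01:23:10",          # video frame where it is shown
}

serialized = json.dumps(record)          # what would go on the SD card
restored = json.loads(serialized)
print(restored["info"], restored["time_code"])
```

Keeping the input video and such records separate, rather than burning the text into the frames, is what lets another information display device re-render or re-style the superimposed information on playback.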
- Such an information display device can simultaneously record a photographed real world video and object information at the time of photographing, and can read and display the recorded information on another information display device or the like.
- As the storage medium, various storage media can be applied, such as semiconductor memories, optical discs such as CDs (compact discs), DVDs (digital versatile discs), and BDs (Blu-ray discs), hard disks, optical memories, and arbitrary storage on a network.
- the fourth embodiment of the present invention is an example in which a user operation is received using a touch panel.
- the information display device is equipped with an operation input unit using a touch panel to realize the above-described operation.
- FIG. 19 is a block diagram showing a configuration of the information display apparatus according to the present embodiment, and corresponds to FIG. 12 of the second embodiment.
- the same parts as those in FIG. 12 are denoted by the same reference numerals, and description thereof will be omitted.
- the information display device 100c has a video display / operation input unit 400c instead of the video display unit 400 of FIG.
- the video display / operation input unit 400c has a liquid crystal display (corresponding to the video display unit 400 of the second embodiment) and an integrated touch panel (not shown).
- the video display / operation input unit 400c accepts at least a focus operation for setting a focus point at an arbitrary position on the screen and a focus lock operation for maintaining the current focal length by a user's touch operation.
- the video display / operation input unit 400c outputs operation information indicating the content of the accepted touch operation to the operation control unit 610a. More specifically, when an operation for designating a focus point is performed, the video display / operation input unit 400c outputs, to the operation control unit 610a, operation information instructing that the lens be focused at the designated position. In addition, when the focus lock operation is performed, the video display / operation input unit 400c outputs operation information indicating that the focus lock operation has been performed to the operation control unit 610a.
- the video recognition unit 630a recognizes the objects shown in the input video. Then, the video recognition unit 630a selects the object area of the object at which the focus point is located, or an object area including surrounding objects, and outputs the corresponding area information. As a result, the video display / operation input unit 400c displays a video focused near the position the user touches with a fingertip, together with the object information of nearby objects.
- the information display device 100c can accept a user's selection of an object for displaying object information by a touch operation.
- FIG. 20 is a diagram illustrating an example of a change in the display image when a touch operation is performed in the Photo + Info mode, and corresponds to FIG.
- first to third object areas 833-1 to 833-3 exist.
- it is assumed that a touch operation is performed with the finger 861 on the video display / operation input unit 400c and a position within the first object region 833-1 is designated.
- the designated position 862 then becomes the focus point and is displayed as a guide, and the first object information 841-1 is displayed.
- the information display device 100c accepts a focus operation, a focus lock operation, and the like for an object by a direct designation operation on the display. Accordingly, the user can obtain more detailed information regarding the desired object by a natural operation of touching the object of interest on the display screen.
- the information display device 100c may input the operation information of the touch panel operation to the video recognition unit 630.
- the video recognition unit 630 may extract the object area at the touched position and select only the extracted object area as a processing target.
- the video recognition unit 630 may similarly process the entire video or a part of the video as an object area even in a situation where there is no clear object such as a building. For example, in a video showing the sea and sky or a video showing only the sky, the sky part may be displayed as an object, and information associated with the spatial coordinates may be displayed as object information.
- the fifth embodiment of the present invention is an example in which a more detailed user operation is accepted using a touch panel.
- the information display device narrows down display object information by accepting a narrowing operation for a condition to be displayed from the user on the touch panel.
- FIG. 21 is a block diagram showing a configuration of the information display apparatus according to the present embodiment, and corresponds to FIG. 19 of the fourth embodiment.
- the same parts as those in FIG. 19 are denoted by the same reference numerals, and description thereof will be omitted.
- the information display device 100d includes a video display / operation input unit 400d and an information output control unit 690d in place of the video display / operation input unit 400c and the information output control unit 690a of FIG. 19.
- the video display / operation input unit 400d accepts a condition specifying operation for specifying a display target condition.
- the condition designating operation includes at least an operation for designating the object distance and an operation for designating the valid time of the object information.
- the effective time of object information (hereinafter simply referred to as “effective time”) is a date and time representative of a period during which object information is valid.
- the information output control unit 690d controls the object information to be displayed according to the condition specifying operation indicated by the operation information input from the video display / operation input unit 400d. More specifically, when an operation for specifying the object distance is performed, the information output control unit 690d determines the object information of the object corresponding to the specified object distance as the display object information. Further, when an operation for designating an effective time is performed, the information output control unit 690d determines object information corresponding to the specified effective time as display object information.
- an object information table having contents different from those in the first to fourth embodiments is stored in the object information database 730.
- FIG. 22 is a diagram showing an example of the contents of the object information table in the present embodiment.
- the object information table 810d describes object information 813 to 816 in association with not only the object ID 811 and the object position 812 but also the valid time 817d.
- the attributes of the object information 813 to 816 are different from those of the first to fourth embodiments.
- Level 1 object information 813 is label information of a post (word-of-mouth information) by a general user regarding a facility.
- Level 2 object information 814 is detailed information of word-of-mouth information.
- Level 3 object information 815 is comments and topic information about the facility.
- the level 4 object information 816 is reference information indicating a destination URL (uniform resource locator) or the like for referring to external information regarding the facility. That is, for the level 1 to level 4 object information 813 to 816, the number of levels and the granularity of the information described at each level can be set freely.
- a combination of the position “latitude: xx1, longitude: yy2” and the effective time “2009/11/4, 8:30” is associated with “stop by transfer” as a label of word-of-mouth information.
- the combination of the position “latitude: xx1, longitude: yy2” and the effective time “2009/11/4, 20:30” is associated with “live held” as a label of word-of-mouth information. That is, even for the same object, different object information is associated with it when the effective time differs.
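The table entries above can be mirrored directly in code. This hypothetical sketch (structure and names are assumptions, values taken from the example above) shows how the same position yields different word-of-mouth labels depending on the effective time used as a key.

```python
# Hypothetical sketch of looking up object information by position and
# effective time, mirroring the object information table: the same
# position carries different word-of-mouth labels at 8:30 and 20:30.

object_info_table = [
    {"position": ("xx1", "yy2"), "valid_time": "2009/11/4 08:30",
     "label": "stop by transfer"},
    {"position": ("xx1", "yy2"), "valid_time": "2009/11/4 20:30",
     "label": "live held"},
]

def lookup(position, valid_time):
    """Return labels matching both the position and the effective time."""
    return [row["label"] for row in object_info_table
            if row["position"] == position and row["valid_time"] == valid_time]

print(lookup(("xx1", "yy2"), "2009/11/4 08:30"))  # ['stop by transfer']
print(lookup(("xx1", "yy2"), "2009/11/4 20:30"))  # ['live held']
```

This is why the effective time indicator in FIG. 23 is needed: without the time key, the position alone is ambiguous.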
- the user performs a condition specifying operation by touching an indicator described later on the touch panel, for example. Then, the video display / operation input unit 400d outputs operation information indicating the contents of the condition specifying operation (object distance and effective time) to the information output control unit 690d.
- the information output control unit 690d determines that the object information corresponding to the specified object distance is the display object information when the operation information of the condition specifying operation for specifying the object distance is input. In addition, when the operation information of the condition designating operation for designating the valid time is input, the information output control unit 690d determines the object information corresponding to the designated valid time as the display object information.
- the operation after the display object is determined is the same as that in the fourth embodiment.
- the object information displayed on the video display / operation input unit 400d is narrowed down to object information corresponding to the conditions (object distance and effective time) specified by the user.
- the object information search unit 680 may narrow down the object information.
- the object information search unit 680 inputs operation information indicating the contents of the condition specifying operation, and performs the following operation.
- the object information search unit 680 calculates, as latitude and longitude information, a range having the specified object distance as a radius centered on its own position. Then, the object information search unit 680 searches for object information using the latitude / longitude information as a search key, as in the second embodiment.
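The radius-to-latitude/longitude conversion just described can be sketched as follows. This is a hypothetical illustration: the patent does not give the computation, so the common flat-earth approximation (one degree of latitude is about 111 km, with longitude scaled by the cosine of the latitude) and the example coordinates are assumptions.

```python
# Hypothetical sketch: turning "own position + object distance" into a
# latitude/longitude search range, as the object information search unit
# is described as doing before querying the object information database.

import math

def latlon_range(lat, lon, radius_m):
    """Bounding box (lat_min, lat_max, lon_min, lon_max) around a point."""
    dlat = radius_m / 111_000.0                              # deg latitude
    dlon = radius_m / (111_000.0 * math.cos(math.radians(lat)))
    return lat - dlat, lat + dlat, lon - dlon, lon + dlon

# 150 m search radius around illustrative coordinates.
print(latlon_range(35.681, 139.767, 150))
```

The resulting box would then be sent as the search query to the information server, with the returned object information filtered further by effective time.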
- the object information search unit 680 transmits the latitude / longitude range information to the information server 740 via the terminal communication unit 670 as a search query for the object information database 730d.
- the object information search unit 680 outputs the object information returned from the information server 740 to the information output control unit 690.
- the object information search unit 680 outputs the valid time to the information output control unit 690 in the same manner.
- FIG. 23 is a diagram illustrating an example of a change in the display image when a touch operation is performed in the Photo + Info mode, and corresponds to FIG.
- the same parts as those in FIG. 16 are denoted by the same reference numerals, and description thereof will be omitted.
- the information superimposed video 840d displayed by the video display / operation input unit 400d includes a distance indicator 871 and an effective time indicator 872.
- the distance indicator 871 displays the object distance of each currently focused object area by a distance mark 873.
- the effective time indicator 872 displays the effective time of the object information of the currently focused object area by a time mark 874.
- the focused object area is a focus point or an object area around the focus point.
- the video display / operation input unit 400d acquires in advance the object information of each focused object area from the information output control unit 690d in order to display the distance marks 873 and the time marks 874.
- first and second object regions 833-1 and 833-2 are focused object regions.
- the distance indicator 871 displays a first distance mark 873-1 indicating 150 m and a second distance mark 873-2 indicating 250 m, corresponding to the first and second object regions 833-1 and 833-2, respectively.
- the valid time indicator 872 displays first and second time marks 874-1 and 874-2 indicating the valid time of the object information corresponding to the first and second object areas 833-1 and 833-2, respectively.
- the video display / operation input unit 400d keeps one of the distance marks 873 and the time mark 874 selected by default.
- it is assumed that the level of the display object information is level 1,
- that the first distance mark 873-1 is selected by default,
- and that the first and second time marks 874-1 and 874-2 corresponding to the first object region 833-1 indicate “8:30” and “20:30”, respectively, with the first time mark 874-1 selected by default.
- the displayed object information is narrowed down to “stop by transfer”, the information corresponding to the valid time “8:30” of the object information in the first object area 833-1.
- the information superimposed video 840d desirably indicates which object distance and which effective time the currently displayed object information corresponds to.
- here, this indication is given by the colors of the distance indicator 871 and the effective time indicator 872, but a color change or blinking of the distance mark 873 and the time mark 874 may be used instead.
- the information display device can narrow down the displayed object information by a touch operation on the display.
- the information display apparatus can specify the narrowing-down condition by selecting from the object distance and effective time displayed on the display.
- the information display device makes browsing easy even when a plurality of pieces of object information are attached to the same object, or to the same coordinates or a narrow range on the screen. Furthermore, the information display device according to the present embodiment enables search, selection, and browsing of the object information of an arbitrary object around the focus position even when no object is displayed on the screen. That is, the user can obtain more detailed information regarding the desired object by the natural operation of touching the object of interest on the display screen.
- the conditions that can be specified are not limited to the object distance and the valid time, and various conditions relating to information included in the object information table can be adopted.
- as the conditions that can be specified, the registration time and set time of the object information may also be adopted.
- the object information is acquired from a predetermined information server.
- a search engine may be used to search for appropriate object information on the communication network.
- the object information database is described as being arranged in an external device and acquired via a communication network.
- the information display device may be provided with an object database in advance.
- the information display device may acquire the object information from the external device only when the necessary object information is not held.
- the information display device may be configured so that the object information once acquired is cached and is not acquired again from the outside.
- the information display device may edit the object information using a text summarizing function or the like so as to match the specified level of detail.
- the information simplification (or detailing) ratio may be symmetric or asymmetric with respect to the distance from the focus position. Specifically, for example, in accordance with the depth of field, the simplification ratio may be made smaller for objects in front of the focus position and larger for objects behind the focus position.
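The asymmetric ratio described above can be sketched as a simple function of signed distance from the focus position. This is a hypothetical illustration: the patent gives no numbers, so the linear growth and the front/back rates are assumptions chosen only to show the asymmetry.

```python
# Hypothetical sketch of an asymmetric simplification ratio around the
# focus position: objects in front of the focus keep more of their text,
# objects behind it are summarized more aggressively.

def simplification_ratio(object_dist, focus_dist,
                         front_rate=0.002, back_rate=0.005):
    """Return a 0..1 fraction of text to drop, growing with distance
    from the focus position, and faster behind it than in front of it."""
    delta = object_dist - focus_dist
    rate = back_rate if delta > 0 else front_rate
    return min(1.0, abs(delta) * rate)

print(simplification_ratio(100, 150))  # 50 m in front: mild simplification
print(simplification_ratio(200, 150))  # 50 m behind: stronger simplification
```

A text-summarizing function would then trim each piece of object information by the returned fraction, so information density falls off with distance the way sharpness does in a photograph.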
- the information display device and the information display method according to the present invention are useful as an information display device and an information display method that can display information that the user wants to know among the information about the object displayed in the real world video in an easy-to-see manner.
- the present invention is useful as an augmented reality system that superimposes and displays real-world video and information.
- the present invention can be applied to training, educational, space development, wearable system, and entertainment system applications.
- the present invention is useful as a real-world information search system that performs intuitive control of the object about which the user wants information and of the amount of that information, and can be applied to digital still cameras, digital video cameras, car navigation systems, mobile phone navigation systems, and measurement / surveying systems.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Databases & Information Systems (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Business, Economics & Management (AREA)
- Computer Hardware Design (AREA)
- Environmental Sciences (AREA)
- Marketing (AREA)
- Software Systems (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biodiversity & Conservation Biology (AREA)
- Ecology (AREA)
- Emergency Management (AREA)
- Environmental & Geological Engineering (AREA)
- Computer Graphics (AREA)
- Remote Sensing (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
- Controls And Circuits For Display Device (AREA)
- Studio Devices (AREA)
- Position Input By Displaying (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
Description
Embodiment 1 of the present invention is an example in which the present invention is applied to a digital still camera that waits for a shooting operation while displaying a real world video. First, an overview of the operation interface of the information display device according to Embodiment 1 of the present invention will be described.
Embodiment 2 of the present invention is an example in which the displayed object information is changed based on user operations.
Embodiment 3 of the present invention is an example in which an information superimposed video is recorded on a removable storage medium.
Embodiment 4 of the present invention is an example in which user operations are accepted using a touch panel.
Embodiment 5 of the present invention is an example in which more detailed user operations are accepted using a touch panel.
200 Main body unit
300 Video input unit
400 Video display unit
400c, 400d Video display / operation input unit
500 Operation interface unit
510 Operation input unit
511 Lever
512 Button
520 Mode switching input unit
521 Switch
522 Slide unit
610, 610a Operation control unit
620 Lens control unit
630, 630a Video recognition unit
640 Own position / attitude calculation unit
650 Distance acquisition unit
660 Object positioning unit
670 Terminal communication unit
680 Object information search unit
690, 690a, 690d Information output control unit
700 Information position determination unit
710, 710a Object superimposing unit
720 Communication network
730 Object information database
740 Information server
750b Storage medium
760b Recording / reading unit
Claims (18)
- A video input unit that inputs a real world video;
a distance acquisition unit that acquires the distance between an object shown in the input real world video and the viewpoint of the real world video;
an object superimposing unit that outputs a video in which information about the object is superimposed on the real world video; and
an information output control unit that controls the information output by the object superimposing unit,
wherein the information output control unit
determines, according to the acquired distance, how to present the information output by the object superimposing unit.
An information display device. - The information output control unit
determines, for the information output by the object superimposing unit, at least one of a larger amount of information and a more emphasized presentation form as the distance is shorter.
The information display device according to claim 1. - An own position / attitude acquisition unit that acquires the position of the viewpoint of the real world video; and
an object positioning unit that acquires the real-world position of an object shown in the input real world video,
wherein the object positioning unit
acquires the position of the object based on the acquired position of the viewpoint and on the distance, azimuth, and elevation angle of the object with respect to the viewpoint.
The information display device according to claim 1. - A video recognition unit that recognizes, from the input real world video, an object shown in the real world video,
wherein the information output control unit
causes the object superimposing unit to output, among the information about the recognized object, the information corresponding to the determined way of presenting the information.
The information display device according to claim 3. - The object positioning unit
acquires the position of the object based on the region of the recognized object in the real world video and on the viewpoint and field of view of the real world video.
The information display device according to claim 3. - A lens control unit that changes the zoom magnification of the real world video input by the video input unit,
wherein the information output control unit
determines a larger amount of information as the zoom magnification of the real world video is larger.
The information display device according to claim 2. - The information output control unit
determines a larger amount of information for the information of each object as the number of objects subject to information display is smaller.
The information display device according to claim 2. - The information output control unit
determines, from among the objects shown in the input real world video, the objects subject to information display.
The information display device according to claim 1. - The information output control unit
determines the objects subject to information display within a predetermined range of the number of objects.
The information display device according to claim 8. - A lens control unit that changes the in-focus position of the real world video input by the video input unit,
wherein the information output control unit
determines an object close to the in-focus position as a target of information display.
The information display device according to claim 8. - The lens control unit
fixes the in-focus position in response to a predetermined operation, and
the object superimposing unit
changes the way of presenting the information being presented when the in-focus position is fixed.
The information display device according to claim 10. - A lens control unit that changes the depth of field of the real world video input by the video input unit,
wherein the information output control unit
determines an object located within the range of the depth of field as a target of information display.
The information display device according to claim 8. - The object superimposing unit
displays information about the recognized object in association with the region of that object in the real world video.
The information display device according to claim 3. - An information position determination unit that, when information of a plurality of objects is displayed, determines the display positions of the information of the plurality of objects at positions corresponding to the real-world arrangement of the plurality of objects.
The information display device according to claim 4. - The information position determination unit
detects the depth direction of the input real world video and determines the display positions of the information of the plurality of objects so that, when displayed, the information is aligned in the depth direction.
The information display device according to claim 14. - An operation input unit that accepts a user operation; and
an operation control unit that switches among, and sets, a mode in which the user operation is reflected in the input real world video, a mode in which the user operation is reflected in the presented information of the object, and a mode in which the user operation is reflected in the input real world video and the information of the object is changed in conjunction with the input real world video.
The information display device according to claim 1. - An operation input unit that accepts a user operation through a touch panel operation on the real world video displayed by the object superimposing unit.
The information display device according to claim 1. - A step of inputting a real world video;
a step of acquiring the distance between an object shown in the input real world video and the viewpoint of the real world video;
a step of determining, according to the acquired distance, how to present information about the object; and
a step of superimposing the information on the real world video and outputting the result so that the information is presented in the determined way.
An information display method comprising the above steps.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP09834424.5A EP2378392B1 (en) | 2008-12-25 | 2009-12-22 | Information displaying apparatus and information displaying method |
US13/140,889 US9204050B2 (en) | 2008-12-25 | 2009-12-22 | Information displaying apparatus and information displaying method |
CN200980150013.4A CN102246121B (zh) | 2008-12-25 | 2009-12-22 | 信息显示装置和信息显示方法 |
JP2010543851A JP5328810B2 (ja) | 2008-12-25 | 2009-12-22 | 情報表示装置および情報表示方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008-331451 | 2008-12-25 | ||
JP2008331451 | 2008-12-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010073616A1 true WO2010073616A1 (ja) | 2010-07-01 |
Family
ID=42287263
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/007113 WO2010073616A1 (ja) | 2008-12-25 | 2009-12-22 | 情報表示装置および情報表示方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9204050B2 (ja) |
EP (1) | EP2378392B1 (ja) |
JP (3) | JP5328810B2 (ja) |
CN (1) | CN102246121B (ja) |
WO (1) | WO2010073616A1 (ja) |
Cited By (42)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011233005A (ja) * | 2010-04-28 | 2011-11-17 | Ntt Docomo Inc | オブジェクト表示装置、オブジェクト表示システム及びオブジェクト表示方法 |
Families Citing this family (69)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8730156B2 (en) * | 2010-03-05 | 2014-05-20 | Sony Computer Entertainment America Llc | Maintaining multiple views on a shared stable virtual space |
US9204050B2 (en) * | 2008-12-25 | 2015-12-01 | Panasonic Intellectual Property Management Co., Ltd. | Information displaying apparatus and information displaying method |
JP5525201B2 (ja) | 2009-07-28 | 2014-06-18 | Panasonic Corporation | Image composition device, image coding device, computer program, and recording medium |
US8611592B2 (en) * | 2009-08-26 | 2013-12-17 | Apple Inc. | Landmark identification using metadata |
US9420251B2 (en) | 2010-02-08 | 2016-08-16 | Nikon Corporation | Imaging device and information acquisition system in which an acquired image and associated information are held on a display |
KR101606727B1 (ko) * | 2010-06-25 | 2016-03-28 | LG Electronics Inc. | Mobile terminal and operating method thereof |
JP2012029164A (ja) * | 2010-07-26 | 2012-02-09 | Konica Minolta Business Technologies Inc | Portable terminal and device management method |
KR101357262B1 (ko) * | 2010-08-13 | 2014-01-29 | Pantech Co., Ltd. | Apparatus and method for recognizing objects using filter information |
WO2012023253A1 (ja) * | 2010-08-20 | 2012-02-23 | Panasonic Corporation | Reception display device, information transmission device, optical wireless communication system, reception display integrated circuit, information transmission integrated circuit, reception display program, information transmission program, and optical wireless communication method |
US8666978B2 (en) | 2010-09-16 | 2014-03-04 | Alcatel Lucent | Method and apparatus for managing content tagging and tagged content |
US8655881B2 (en) | 2010-09-16 | 2014-02-18 | Alcatel Lucent | Method and apparatus for automatically tagging content |
US20120067954A1 (en) * | 2010-09-16 | 2012-03-22 | Madhav Moganti | Sensors, scanners, and methods for automatically tagging content |
US8533192B2 (en) | 2010-09-16 | 2013-09-10 | Alcatel Lucent | Content capture device and methods for automatically tagging content |
KR101669119B1 (ko) * | 2010-12-14 | 2016-10-25 | Samsung Electronics Co., Ltd. | Multi-layer augmented reality system and method |
US9736524B2 (en) * | 2011-01-06 | 2017-08-15 | Veveo, Inc. | Methods of and systems for content search based on environment sampling |
US8576223B1 (en) * | 2011-03-29 | 2013-11-05 | Google Inc. | Multiple label display for 3D objects |
US8860717B1 (en) | 2011-03-29 | 2014-10-14 | Google Inc. | Web browser for viewing a three-dimensional object responsive to a search query |
WO2013046596A1 (ja) * | 2011-09-26 | 2013-04-04 | NEC Casio Mobile Communications, Ltd. | Portable information processing terminal |
US9606992B2 (en) * | 2011-09-30 | 2017-03-28 | Microsoft Technology Licensing, Llc | Personal audio/visual apparatus providing resource management |
JP5783885B2 (ja) * | 2011-11-11 | 2015-09-24 | Toshiba Corporation | Information presentation device, method, and program thereof |
TWI430012B (zh) * | 2011-12-05 | 2014-03-11 | Hannspree Inc | Image-capturing touch display |
KR101303166B1 (ko) * | 2012-01-26 | 2013-09-09 | LG Electronics Inc. | Mobile terminal and photo searching method thereof |
US9773345B2 (en) * | 2012-02-15 | 2017-09-26 | Nokia Technologies Oy | Method and apparatus for generating a virtual environment for controlling one or more electronic devices |
JP5921320B2 (ja) * | 2012-04-27 | 2016-05-24 | Fujitsu Ten Limited | Display system, portable device, in-vehicle device, and program |
GB201208088D0 (en) * | 2012-05-09 | 2012-06-20 | Ncam Sollutions Ltd | Ncam |
US9147221B2 (en) * | 2012-05-23 | 2015-09-29 | Qualcomm Incorporated | Image-driven view management for annotations |
KR101899977B1 (ko) * | 2012-07-10 | 2018-09-19 | LG Electronics Inc. | Mobile terminal and control method thereof |
CN103577788A (zh) | 2012-07-19 | 2014-02-12 | Huawei Device Co., Ltd. | Augmented reality implementation method and device |
TWI475474B (zh) * | 2012-07-30 | 2015-03-01 | Mitac Int Corp | Gesture combined with the implementation of the icon control method |
TWI451344B (zh) * | 2012-08-27 | 2014-09-01 | Pixart Imaging Inc | Gesture recognition system and gesture recognition method |
JP6157094B2 (ja) * | 2012-11-21 | 2017-07-05 | Canon Inc. | Communication device, setting device, communication method, setting method, and program |
US9298970B2 (en) | 2012-11-27 | 2016-03-29 | Nokia Technologies Oy | Method and apparatus for facilitating interaction with an object viewable via a display |
US9613461B2 (en) | 2012-12-10 | 2017-04-04 | Sony Corporation | Display control apparatus, display control method, and program |
US9996150B2 (en) * | 2012-12-19 | 2018-06-12 | Qualcomm Incorporated | Enabling augmented reality using eye gaze tracking |
EP2757772A3 (en) * | 2013-01-17 | 2017-08-16 | Canon Kabushiki Kaisha | Image pickup apparatus, remote control apparatus, and methods of controlling image pickup apparatus and remote control apparatus |
US9946963B2 (en) | 2013-03-01 | 2018-04-17 | Layar B.V. | Barcode visualization in augmented reality |
JP6315895B2 (ja) * | 2013-05-31 | 2018-04-25 | Canon Inc. | Imaging device, image processing device, method for controlling imaging device, method for controlling image processing device, and program |
KR20150008733A (ko) * | 2013-07-15 | 2015-01-23 | LG Electronics Inc. | Glasses-type portable device and method for searching for an information projection surface thereof |
CN103942049B (zh) * | 2014-04-14 | 2018-09-07 | Baidu Online Network Technology (Beijing) Co., Ltd. | Augmented reality implementation method, client device, and server |
CN106233716B (zh) * | 2014-04-22 | 2019-12-24 | Nippon Telegraph and Telephone Corporation | Motion illusion presentation device, motion illusion presentation method, and program |
DE102014208663A1 (de) * | 2014-05-08 | 2015-11-12 | Conti Temic Microelectronic Gmbh | Device and method for providing information data on an object of a vehicle environment contained in a video image stream |
JP2016004340A (ja) * | 2014-06-16 | 2016-01-12 | Seiko Epson Corporation | Information distribution system, head-mounted display device, method for controlling head-mounted display device, and computer program |
JP2016110245A (ja) * | 2014-12-03 | 2016-06-20 | 株式会社T.J.Promotion | Display system, display method, computer program, and computer-readable storage medium |
JP6424601B2 (ja) * | 2014-12-10 | 2018-11-21 | Fujitsu Limited | Display control method, information processing program, and information processing device |
US9836814B2 (en) * | 2015-01-09 | 2017-12-05 | Panasonic Intellectual Property Management Co., Ltd. | Display control apparatus and method for stepwise deforming of presentation image radially by increasing display ratio |
US9918008B2 (en) * | 2015-02-06 | 2018-03-13 | Wipro Limited | Method and device for assisting a user to capture images |
CN106406507B (zh) * | 2015-07-30 | 2020-03-27 | Ricoh Company, Ltd. | Image processing method and electronic device |
JP6153984B2 (ja) * | 2015-10-14 | 2017-06-28 | The Japan Research Institute, Limited | Advertisement information providing system, advertisement information providing method, and advertisement information providing program |
JP6333871B2 (ja) * | 2016-02-25 | 2018-05-30 | Fanuc Corporation | Image processing device for displaying an object detected from an input image |
US20190043235A1 (en) * | 2016-03-24 | 2019-02-07 | Mitsubishi Electric Corporation | Support image display apparatus, support image display method, and computer readable medium |
US10025376B2 (en) | 2016-04-27 | 2018-07-17 | Rovi Guides, Inc. | Methods and systems for displaying additional content on a heads up display displaying a virtual reality environment |
KR20190002416A (ko) * | 2016-04-27 | 2019-01-08 | Rovi Guides, Inc. | Method and system for displaying additional content on a heads-up display displaying a virtual reality environment |
US20190221184A1 (en) * | 2016-07-29 | 2019-07-18 | Mitsubishi Electric Corporation | Display device, display control device, and display control method |
WO2018034171A1 (ja) * | 2016-08-19 | 2018-02-22 | Sony Corporation | Image processing device and image processing method |
US10559087B2 (en) * | 2016-10-14 | 2020-02-11 | Canon Kabushiki Kaisha | Information processing apparatus and method of controlling the same |
JP7009057B2 (ja) * | 2016-11-15 | 2022-01-25 | Ricoh Company, Ltd. | Display device, display system, and program |
JP2018088118A (ja) * | 2016-11-29 | 2018-06-07 | Pioneer Corporation | Display control device, control method, program, and storage medium |
JP6987208B2 (ja) * | 2016-11-29 | 2021-12-22 | Pioneer Corporation | Display control device, control method, program, and storage medium |
KR20180131856A (ko) * | 2017-06-01 | 2018-12-11 | SK Planet Co., Ltd. | Method for providing delivery item information and device therefor |
US10607082B2 (en) | 2017-09-09 | 2020-03-31 | Google Llc | Systems, methods, and apparatus for image-responsive automated assistants |
JP7013786B2 (ja) * | 2017-10-16 | 2022-02-01 | FUJIFILM Business Innovation Corp. | Information processing device, program, and control method |
US11126848B2 (en) * | 2017-11-20 | 2021-09-21 | Rakuten Group, Inc. | Information processing device, information processing method, and information processing program |
US11164380B2 (en) * | 2017-12-05 | 2021-11-02 | Samsung Electronics Co., Ltd. | System and method for transition boundaries and distance responsive interfaces in augmented and virtual reality |
JP7110738B2 (ja) * | 2018-06-05 | 2022-08-02 | Dai Nippon Printing Co., Ltd. | Information processing device, program, and information processing system |
US10719995B2 (en) * | 2018-10-23 | 2020-07-21 | Disney Enterprises, Inc. | Distorted view augmented reality |
US11321411B1 (en) * | 2018-12-28 | 2022-05-03 | Meta Platforms, Inc. | Systems and methods for providing content |
US10979672B1 (en) | 2020-10-20 | 2021-04-13 | Katmai Tech Holdings LLC | Web-based videoconference virtual environment with navigable avatars, and applications thereof |
JP7453193B2 (ja) | 2021-09-27 | 2024-03-19 | KDDI Corporation | Portable device, program, and method for controlling speech-synthesis-based audio in accordance with the user's surroundings |
CN115170629A (zh) * | 2022-09-08 | 2022-10-11 | 杭州海康慧影科技有限公司 | Wound information acquisition method, apparatus, device, and storage medium |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4486175B2 (ja) * | 1999-01-29 | 2010-06-23 | Hitachi, Ltd. | Three-dimensional map display device and method |
CA2400037A1 (en) * | 2000-02-14 | 2001-08-23 | Adriana Guzman | System and method for graphical programming |
JP3967866B2 (ja) * | 2000-04-28 | 2007-08-29 | Pioneer Corporation | Navigation device, and information recording medium on which a navigation program is recorded in computer-readable form |
JP3406965B2 (ja) | 2000-11-24 | 2003-05-19 | Canon Inc. | Mixed reality presentation device and control method thereof |
JP2004151085A (ja) * | 2002-09-27 | 2004-05-27 | Canon Inc. | Information processing method and information processing device |
JP4298407B2 (ja) | 2002-09-30 | 2009-07-22 | Canon Inc. | Video composition device and video composition method |
JP4026146B2 (ja) | 2004-01-20 | 2007-12-26 | Mazda Motor Corporation | Vehicle image display device, vehicle image display method, and vehicle image display program |
US7720436B2 (en) * | 2006-01-09 | 2010-05-18 | Nokia Corporation | Displaying network objects in mobile devices based on geolocation |
JP4246195B2 (ja) * | 2005-11-01 | 2009-04-02 | Panasonic Corporation | Car navigation system |
JP4793685B2 (ja) * | 2006-03-31 | 2011-10-12 | Casio Computer Co., Ltd. | Information transmission system, imaging device, information output method, and information output program |
US20080136958A1 (en) * | 2006-12-11 | 2008-06-12 | Pentax Corporation | Camera having a focus adjusting system and a face recognition function |
BRPI0812782B1 (pt) * | 2007-05-31 | 2019-01-22 | Panasonic Corp | Image capture apparatus, additional information provision apparatus, and method for use in an additional information provision apparatus |
JP4600515B2 (ja) * | 2008-05-07 | 2010-12-15 | Sony Corporation | Information presentation device, information presentation method, imaging device, and computer program |
JP2010118019A (ja) | 2008-11-14 | 2010-05-27 | Sharp Corp | Terminal device, distribution device, method for controlling terminal device, method for controlling distribution device, control program, and recording medium |
US8397181B2 (en) * | 2008-11-17 | 2013-03-12 | Honeywell International Inc. | Method and apparatus for marking a position of a real world object in a see-through display |
US9204050B2 (en) * | 2008-12-25 | 2015-12-01 | Panasonic Intellectual Property Management Co., Ltd. | Information displaying apparatus and information displaying method |
2009
- 2009-12-22 US US13/140,889 patent/US9204050B2/en not_active Expired - Fee Related
- 2009-12-22 CN CN200980150013.4A patent/CN102246121B/zh not_active Expired - Fee Related
- 2009-12-22 EP EP09834424.5A patent/EP2378392B1/en not_active Not-in-force
- 2009-12-22 JP JP2010543851A patent/JP5328810B2/ja not_active Expired - Fee Related
- 2009-12-22 WO PCT/JP2009/007113 patent/WO2010073616A1/ja active Application Filing

2013
- 2013-05-10 JP JP2013100308A patent/JP5588542B2/ja not_active Expired - Fee Related
- 2013-05-10 JP JP2013100306A patent/JP5611412B2/ja not_active Expired - Fee Related
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09505138A (ja) * | 1993-09-10 | 1997-05-20 | Criticom Corporation | Electro-optic vision system using position and orientation |
JPH08190640A (ja) | 1995-01-12 | 1996-07-23 | Hitachi Ltd | Information display method and information providing system |
JP2000512781A (ja) * | 1996-06-12 | 2000-09-26 | Geo Vector Corporation | Graphical user interface for computer vision systems |
JP2000194467A (ja) * | 1998-12-24 | 2000-07-14 | Matsushita Electric Ind Co Ltd | Information display device, information processing device, device control device, and range-finding device |
JP2003167659A (ja) * | 2001-11-28 | 2003-06-13 | Fujitsu Ltd | Information processing device and method for displaying information objects |
JP2004048674A (ja) | 2002-05-24 | 2004-02-12 | Olympus Corp | View-matched information presentation system, and portable information terminal and server used therein |
JP2006155238A (ja) * | 2004-11-29 | 2006-06-15 | Hiroshima Univ | Information processing device, portable terminal, information processing method, information processing program, and computer-readable recording medium |
JP2008193640A (ja) * | 2007-02-08 | 2008-08-21 | Kddi Corp | Terminal and program for displaying a captured image with an additional image superimposed |
Non-Patent Citations (1)
Title |
---|
See also references of EP2378392A4 |
Cited By (61)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011233005A (ja) * | 2010-04-28 | 2011-11-17 | Ntt Docomo Inc | Object display device, object display system, and object display method |
US20120038669A1 (en) * | 2010-08-12 | 2012-02-16 | Pantech Co., Ltd. | User equipment, server, and method for selectively filtering augmented reality |
US20120044263A1 (en) * | 2010-08-20 | 2012-02-23 | Pantech Co., Ltd. | Terminal device and method for augmented reality |
KR101429250B1 (ko) | 2010-08-20 | 2014-09-25 | Pantech Co., Ltd. | Terminal device and method capable of providing object information in stages |
CN102402790A (zh) * | 2010-08-20 | 2012-04-04 | Pantech Co., Ltd. | Terminal device and method for augmented reality |
EP2420955A3 (en) * | 2010-08-20 | 2014-10-22 | Pantech Co., Ltd. | Terminal device and method for augmented reality |
US9438806B2 (en) | 2010-09-17 | 2016-09-06 | Olympus Corporation | Photographing apparatus and photographing method for displaying combined avatar and map information related to a subject |
JP2012064071A (ja) * | 2010-09-17 | 2012-03-29 | Mic Ware:Kk | Information system, terminal device, advertisement output method, and program |
JP2012123546A (ja) * | 2010-12-07 | 2012-06-28 | Casio Comput Co Ltd | Information display system, information display device, information providing device, and program |
JP2014501405A (ja) * | 2010-12-17 | 2014-01-20 | Qualcomm Incorporated | Augmented reality processing based on eye capture in a handheld device |
JP2012155362A (ja) * | 2011-01-21 | 2012-08-16 | Panasonic Corp | Information processing device, augmented reality system, information processing method, and information processing program |
WO2012098869A1 (ja) * | 2011-01-21 | 2012-07-26 | Panasonic Corporation | Information processing device, augmented reality system, information processing method, and information processing program |
WO2012108180A1 (ja) | 2011-02-08 | 2012-08-16 | Panasonic Corporation | Communication device, communication system, communication method, and communication program |
US20130321662A1 (en) * | 2011-02-08 | 2013-12-05 | Furukawa Electric Co., Ltd. | Optical module |
WO2013069189A1 (en) * | 2011-11-11 | 2013-05-16 | Sony Corporation | Information processing apparatus, information processing method, and program |
US10614605B2 (en) | 2011-11-11 | 2020-04-07 | Sony Corporation | Information processing apparatus, information processing method, and program for displaying a virtual object on a display |
US9928626B2 (en) | 2011-11-11 | 2018-03-27 | Sony Corporation | Apparatus, method, and program for changing augmented-reality display in accordance with changed positional relationship between apparatus and object |
JP2013105253A (ja) * | 2011-11-11 | 2013-05-30 | Sony Corp | Information processing device, information processing method, and program |
JP2013109768A (ja) * | 2011-11-22 | 2013-06-06 | Samsung Electronics Co Ltd | Method for providing augmented reality service in a portable terminal, and portable terminal device |
WO2013099473A1 (ja) * | 2011-12-27 | 2013-07-04 | Sony Corporation | Server |
JP2013196157A (ja) * | 2012-03-16 | 2013-09-30 | Sony Corp | Control device, electronic apparatus, control method, and program |
US9823821B2 (en) | 2012-04-11 | 2017-11-21 | Sony Corporation | Information processing apparatus, display control method, and program for superimposing virtual objects on input image and selecting an interested object |
JP2013218597A (ja) * | 2012-04-11 | 2013-10-24 | Sony Corp | Information processing device, display control method, and program |
JPWO2014002259A1 (ja) * | 2012-06-29 | 2016-05-30 | Toyota Motor Corporation | Image information providing device, image information providing system, and image information providing method |
JPWO2014017392A1 (ja) * | 2012-07-24 | 2016-07-11 | NEC Corporation | Information processing device, data processing method thereof, and program |
US9489702B2 (en) | 2012-07-24 | 2016-11-08 | Nec Corporation | Information processing device, data processing method thereof, and program |
US9628698B2 (en) * | 2012-09-07 | 2017-04-18 | Pixart Imaging Inc. | Gesture recognition system and gesture recognition method based on sharpness values |
JP2014071663A (ja) * | 2012-09-28 | 2014-04-21 | Brother Ind Ltd | Head-mounted display, method for operating the same, and program |
US10297084B2 (en) | 2012-10-02 | 2019-05-21 | Google Llc | Identification of relative distance of objects in images |
US9501831B2 (en) * | 2012-10-02 | 2016-11-22 | Google Inc. | Identification of relative distance of objects in images |
US20150170367A1 (en) * | 2012-10-02 | 2015-06-18 | Google Inc. | Identification of relative distance of objects in images |
JP2014085723A (ja) * | 2012-10-19 | 2014-05-12 | Toyota Motor Corp | Information providing device and information providing method |
JP2014120132A (ja) * | 2012-12-19 | 2014-06-30 | Konica Minolta Inc | Image processing terminal, image processing system, and control program for image processing terminal |
GB2511386A (en) * | 2012-12-21 | 2014-09-03 | Syngenta Ltd | Herbicidal compositions |
JP2013077314A (ja) * | 2012-12-25 | 2013-04-25 | Casio Comput Co Ltd | Information display system, information display device, information providing device, and program |
JP2014131094A (ja) * | 2012-12-28 | 2014-07-10 | Seiko Epson Corp | Display device and method for controlling display device |
JP2021092802A (ja) * | 2013-02-22 | 2021-06-17 | Sony Group Corporation | Information processing device, control method, and program |
US11513353B2 (en) | 2013-02-22 | 2022-11-29 | Sony Corporation | Information processing device that displays a virtual object relative to real space |
JP7268692B2 (ja) | 2013-02-22 | 2023-05-08 | Sony Group Corporation | Information processing device, control method, and program |
US11885971B2 (en) | 2013-02-22 | 2024-01-30 | Sony Corporation | Information processing device that displays a virtual object relative to real space |
JP2014186434A (ja) * | 2013-03-22 | 2014-10-02 | Seiko Epson Corp | Information display system using a head-mounted display device, information display method using a head-mounted display device, and head-mounted display device |
JP2015022737A (ja) * | 2013-07-24 | 2015-02-02 | Fujitsu Limited | Information processing device, information providing method, and information providing program |
JP2015089021A (ja) * | 2013-10-31 | 2015-05-07 | Canon Marketing Japan Inc. | Imaging device, imaging control method, and program |
JP2015164001A (ja) * | 2014-02-28 | 2015-09-10 | Maruyama Mfg. Co., Inc. | Guide device |
JP2016004292A (ja) * | 2014-06-13 | 2016-01-12 | Fujitsu Limited | Terminal device, information processing system, and display control program |
JP2016018367A (ja) * | 2014-07-08 | 2016-02-01 | Oki Electric Industry Co., Ltd. | Information processing device, information processing method, and program |
JP2015092780A (ja) * | 2015-02-12 | 2015-05-14 | Olympus Imaging Corp. | Photographing apparatus and photographing method |
US10235776B2 (en) | 2015-09-07 | 2019-03-19 | Kabushiki Kaisha Toshiba | Information processing device, information processing method, and information processing program |
JP2016029591A (ja) * | 2015-10-30 | 2016-03-03 | Sony Corporation | Information processing device, information processing method, and program |
JP2017096635A (ja) * | 2015-11-18 | 2017-06-01 | Aisin AW Co., Ltd. | Destination setting system, method, and program |
JP6193466B1 (ja) * | 2016-12-09 | 2017-09-06 | Dwango Co., Ltd. | Image display device, image processing device, image processing system, image processing method, and image processing program |
JP2018097453A (ja) * | 2016-12-09 | 2018-06-21 | Dwango Co., Ltd. | Image display device, image processing device, image processing system, image processing method, and image processing program |
JPWO2018146979A1 (ja) * | 2017-02-10 | 2019-11-14 | Sharp Corporation | Image processing device and image processing program |
WO2018146979A1 (ja) * | 2017-02-10 | 2018-08-16 | Sharp Corporation | Image processing device and image processing program |
JP7065383B2 (ja) | 2017-06-30 | 2022-05-12 | Panasonic Intellectual Property Management Co., Ltd. | Display system, information presentation system, method for controlling display system, program, and moving body |
JP2019012237A (ja) * | 2017-06-30 | 2019-01-24 | Panasonic Intellectual Property Management Co., Ltd. | Display system, information presentation system, method for controlling display system, program, and moving body |
CN111538405A (zh) * | 2019-02-07 | 2020-08-14 | 株式会社美凯利 | Information processing method and terminal, non-transitory computer-readable storage medium |
JP2020129356A (ja) * | 2019-02-07 | 2020-08-27 | Mercari, Inc. | Program, information processing method, and information processing terminal |
CN112995507A (zh) * | 2021-02-08 | 2021-06-18 | 北京蜂巢世纪科技有限公司 | Method and device for indicating an object's position |
WO2023119527A1 (ja) * | 2021-12-22 | 2023-06-29 | Maxell, Ltd. | Portable information terminal and information processing method |
WO2024075817A1 (ja) * | 2022-10-07 | 2024-04-11 | Hitachi, Ltd. | Display method and display system |
Also Published As
Publication number | Publication date |
---|---|
JP5611412B2 (ja) | 2014-10-22 |
JP5588542B2 (ja) | 2014-09-10 |
US20110254861A1 (en) | 2011-10-20 |
JP2013225311A (ja) | 2013-10-31 |
CN102246121A (zh) | 2011-11-16 |
EP2378392B1 (en) | 2016-04-13 |
EP2378392A4 (en) | 2012-09-05 |
CN102246121B (zh) | 2016-01-13 |
JP5328810B2 (ja) | 2013-10-30 |
JPWO2010073616A1 (ja) | 2012-06-07 |
JP2013211027A (ja) | 2013-10-10 |
EP2378392A1 (en) | 2011-10-19 |
US9204050B2 (en) | 2015-12-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5611412B2 (ja) | Information display device | |
KR102618495B1 (ko) | Image processing apparatus and method | |
JP4752827B2 (ja) | Map information display device, map information display method, and program | |
JP5980222B2 (ja) | Content processing device, content processing method, and program | |
CN103369234B (zh) | Server, client terminal, and system | |
US20160344783A1 (en) | Content provision system, information processing apparatus and content reproduction method | |
EP2252044A2 (en) | Electronic apparatus, display controlling method and program | |
JP5398970B2 (ja) | Mobile communication device and control method | |
CN103002208A (zh) | Electronic device and image pickup apparatus | |
CN102263896A (zh) | Image processing unit, image processing method, and program | |
CN104255022B (zh) | Server, client terminal, system, and readable medium for adding virtual zoom capability to a camera | |
US20120002094A1 (en) | Image pickup apparatus for providing reference image and method for providing reference image thereof | |
CN111680238B (zh) | Information sharing method, device, and storage medium | |
KR20190107012A (ko) | Information processing apparatus, information processing method, and program | |
JP2013183218A (ja) | Video playback device and video playback program | |
CN113905175A (zh) | Video generation method, apparatus, electronic device, and readable storage medium | |
KR20190120106A (ko) | Method for determining a representative image of a video, and electronic device for processing the method | |
JP5007631B2 (ja) | Electronic camera | |
JP2010009192A (ja) | Information display system and portable information terminal using the same | |
JP4870503B2 (ja) | Camera and blog management system | |
JP2008219390A (ja) | Image browsing device | |
KR101436325B1 (ko) | Method and device for setting a representative image of a video | |
KR20110052247A (ko) | Photographing device providing a captured image, display device displaying the captured image and related images, and methods thereof | |
JP5325012B2 (ja) | Karaoke system with singer image capturing function | |
US10026198B2 (en) | Method, system and electronic device for at least one of efficient graphic processing and salient based learning |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase | Ref document number: 200980150013.4; Country of ref document: CN |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09834424; Country of ref document: EP; Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 2010543851; Country of ref document: JP |
WWE | Wipo information: entry into national phase | Ref document number: 13140889; Country of ref document: US; Ref document number: 2009834424; Country of ref document: EP |
NENP | Non-entry into the national phase | Ref country code: DE |