WO2011096561A1 - Imaging device, information acquisition system, and program - Google Patents
Imaging device, information acquisition system, and program
- Publication number
- WO2011096561A1 (PCT/JP2011/052498)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- image data
- building
- control unit
- latitude
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N1/32101—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N1/32144—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title embedded in the image data, i.e. enclosed or integrated in the image, e.g. watermark, super-imposed logo or stamp
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/77—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
- H04N5/772—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera the recording apparatus and the television camera being placed in the same enclosure
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0207—Discounts or incentives, e.g. coupons or rebates
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
- G06T7/344—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42202—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] environmental sensors, e.g. for detecting temperature, luminosity, pressure, earthquakes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/4223—Cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/433—Content storage operation, e.g. storage operation in response to a pause request, caching operations
- H04N21/4334—Recording operations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/8126—Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts
- H04N21/8133—Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts specifically related to the content, e.g. biography of the actors in a movie, detailed information about an article seen in a video program
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/8146—Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics
- H04N21/8153—Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics comprising still images, e.g. texture, background image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00127—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
- H04N1/00323—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a measuring, monitoring or signaling apparatus, e.g. for transmitting measured information to a central location
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2101/00—Still video cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N2201/3201—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N2201/3225—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document
- H04N2201/3253—Position information, e.g. geographical position at time of capture, GPS data
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N2201/3201—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N2201/3273—Display
Definitions
- The present invention relates to an imaging apparatus having a function of acquiring information on the position where a subject is imaged and associating the subject's image data with that position information, and to an information acquisition system and a program that acquire information about the subject using that position information.
- Some portable information devices with an imaging function, such as imaging devices (for example, digital cameras), have a function of detecting the position of the device and processing captured image data in accordance with the detected position information (see, for example, Patent Document 1).
- With such conventional devices, however, the user cannot easily obtain information about what kind of subject (for example, a building) he or she is photographing.
- An imaging apparatus according to an aspect of the present invention includes: an image capturing unit that captures an image of a subject; a position information acquisition unit that acquires position information of the imaging position; a control unit that acquires information about the subject based on the position information and displays the image data of the subject and the information about the subject on a display unit; and a hold control unit that outputs to the control unit a hold control signal for holding the image data of the subject and the information about the subject.
- An imaging apparatus according to another aspect of the present invention includes: a latitude/longitude detection unit that detects latitude/longitude information of its own position; an azimuth angle detection unit that detects the azimuth angle at which image data is captured; a control unit that acquires, from the latitude/longitude information and the azimuth angle, the AR information attached to buildings located in the latitude/longitude range in the azimuth direction and displays it on a display unit; and a hold control unit that, when an action for storing the AR information in a storage unit is detected, outputs a hold control signal that causes the storage unit to store the AR information together with the image data.
- An information acquisition system according to an aspect of the present invention includes an imaging device and an information search system, the imaging device being any one of the imaging devices described above.
- An information acquisition system according to another aspect of the present invention is composed of an imaging device and an information search system.
- The imaging device includes: a latitude/longitude detection unit that detects latitude/longitude information of its own position; an azimuth angle detection unit that detects the azimuth angle at which image data is captured; a control unit that acquires, from the latitude/longitude information and the azimuth angle, the AR information attached to buildings located in the latitude/longitude range in the azimuth direction and displays the AR information on a display unit; and a hold control unit that, when an action for storing the AR information in a storage unit is detected, outputs to the control unit a hold control signal for storing the AR information and the image data in the storage unit.
- The information search system includes: a database storing map data that associates each building's identification number with its latitude/longitude, and a building table that associates each building identification number with the AR information of the building indicated by that number; and an information search server that, on receiving the latitude/longitude information and the azimuth angle from the imaging device, searches the map data for the building identification numbers of buildings located in the latitude/longitude range in the azimuth direction, reads from the building table the AR information attached to the buildings indicated by the retrieved identification numbers, and sends the AR information to the imaging device.
- a program according to an aspect of the present invention is a program for causing a computer to execute any of the functions of the imaging apparatus described above.
- A program according to another aspect of the present invention causes a computer to execute the following: the control unit acquires, from the latitude/longitude information of its own position detected by the latitude/longitude detection unit and the detected azimuth angle, the AR information attached to buildings in that direction and displays it on the display unit; and when the hold control unit detects an action for storing the AR information in the storage unit, the AR information and the image data are stored in the storage unit.
- According to aspects of the present invention, it is possible to provide an imaging device, an information acquisition system, and a program that can enhance user activities.
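As a rough sketch of how the elements summarized above fit together on the client side, the flow below detects position and azimuth, fetches AR information, displays it, and returns a "hold" operation that stores the AR information together with the image data. All names here (`imaging_flow`, `fetch_ar_info`, etc.) are hypothetical illustrations, not taken from the embodiment.

```python
def imaging_flow(detect_latlon, detect_azimuth, fetch_ar_info, display, storage):
    """Detect position and azimuth, fetch the AR info for buildings in
    that direction, display it, and return a hold function that stores
    the AR info together with the image data (the 'hold control')."""
    lat, lon = detect_latlon()          # latitude/longitude detection unit
    azimuth = detect_azimuth()          # azimuth angle detection unit
    ar_info = fetch_ar_info(lat, lon, azimuth)
    display(ar_info)                    # show AR info on the display unit
    def hold(image_data):               # triggered by a user action
        storage.append({"image": image_data, "ar_info": ar_info})
    return hold
```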
- FIG. 1 is a block diagram illustrating a configuration example of an imaging apparatus and an information acquisition system in the present embodiment.
- The information acquisition system includes a digital camera 1, which is an example of an imaging device, and an information search system 2. The digital camera 1 communicates with a wireless base station 3 by wireless communication, and transmits data to and receives data from the information search system 2 (or an information search server 21 described later) via the base station 3 and an information communication network 4 such as the Internet.
- The imaging device in the present embodiment is a portable terminal with an imaging function, such as the digital camera 1 described above or a camera-equipped mobile phone.
- The digital camera 1 has a browser function and includes a control unit 11, a transmission/reception unit 12, an imaging unit 13, a GPS (Global Positioning System) unit 14, a storage unit 15, an orientation sensor 16 (azimuth angle detection unit), a display unit 17, and a timer 18.
- the wireless communication in the present embodiment includes communication using radio waves, light, sound waves, or the like without using a line as a transmission path.
- the imaging unit 13 includes a CCD (Charge-Coupled Device) or CMOS (Complementary-Metal-Oxide-Semiconductor) image sensor, a lens, and the like.
- Image data obtained by imaging a subject (an object to be imaged) is output to the control unit 11.
- The subject in the present embodiment is, for example, a building or other artificially constructed object (a commercial building including a store, a public building such as a school or hospital, a factory, housing including a condominium, a tower, a bridge, a dam, an amusement park, a bronze statue, and the like).
- When a control signal instructing imaging (for example, a control signal output by a peripheral circuit (not shown) that detects the pressing of the imaging button B1) or a control signal instructing an information search (for example, a control signal output by a peripheral circuit (not shown) that detects that an icon indicating search or the like has been selected by the user touching it) is input, the GPS 14 acquires latitude/longitude information (position information) of its own position and outputs it to the control unit 11.
- The position information acquisition unit (latitude/longitude detection unit) in the present embodiment may have another configuration as long as the position of the imaging device can be acquired; for example, the position information may be acquired using the wireless base station 3.
- The azimuth sensor 16 (azimuth angle detection unit, azimuth information acquisition unit) is configured by an electronic compass or the like. When a control signal instructing imaging (for example, a control signal output by a peripheral circuit (not shown) that detects the pressing of the imaging button B1) or a control signal instructing an information search (for example, a control signal output by a peripheral circuit (not shown) that detects that an icon indicating search or the like has been selected by the user touching it) is input, the azimuth sensor 16 detects the azimuth of the direction in which, for example, the CCD and the lens of the imaging unit 13 are arranged in series, and outputs it to the control unit 11 as azimuth information.
- the azimuth sensor 16 may detect the azimuth in the direction of the subject as viewed from the user (that is, the imaging direction) and output it to the control unit 11 as azimuth information.
- Here, for example, the azimuth angle is adopted as the azimuth information: with the latitude/longitude of the position information as the center point, north is the reference (0 degrees), and the azimuth is expressed clockwise as east (90 degrees), south (180 degrees), and west (270 degrees), in units of degrees, minutes, and seconds.
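The clockwise-from-north bearing convention above can be sketched as follows. The bearing computation between two latitude/longitude points is an illustrative assumption (the standard great-circle initial-bearing formula), not a formula given in the embodiment.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Clockwise-from-north bearing (degrees) from point 1 to point 2:
    north = 0, east = 90, south = 180, west = 270."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(p2)
    y = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

def to_dms(angle):
    """Express an angle in degrees, minutes, and seconds, as in the text."""
    deg = int(angle)
    rem = (angle - deg) * 60
    minutes = int(rem)
    seconds = (rem - minutes) * 60
    return deg, minutes, seconds
```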
- When a control signal instructing imaging (for example, generated by the user pressing the imaging button B1) is input from a peripheral circuit (not shown), the control unit 11 adds image identification information to the captured image data, and writes the captured image data to the storage unit 15 in the order of imaging, associating each piece of image identification information with the time data acquired from the timer 18, the position information, and the azimuth information.
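The association described above can be sketched as a simple record store. The class and field names below are hypothetical; the point is only that each captured image gets an identifier and is kept in imaging order together with its time, position, and azimuth.

```python
from dataclasses import dataclass
from itertools import count

@dataclass
class CapturedImage:
    image_id: int       # image identification information
    image_data: bytes
    time: str           # time data from the timer 18
    position: tuple     # (latitude, longitude) from the GPS 14
    azimuth_deg: float  # azimuth information from the sensor 16

class Storage:
    """Keeps captured records in imaging order, keyed by image id."""
    def __init__(self):
        self._next_id = count(1)
        self.records = []
    def write(self, image_data, time, position, azimuth_deg):
        rec = CapturedImage(next(self._next_id), image_data,
                            time, position, azimuth_deg)
        self.records.append(rec)
        return rec.image_id
```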
- the display unit 17 displays image data picked up by the image pickup unit 13 or image data selected by the user as an image to be displayed and read out from the storage unit 15 by the control unit 11. Further, as will be described later, the display unit 17 displays the image data received from the information search system 2 or the character data input by the user under the control of the control unit 11.
- The transmission/reception unit 12 performs transmission and reception with the wireless base station 3 by wireless communication, and transmits and receives data such as image data, character data, and control signals to and from the information search system 2 via the information communication network 4.
- the information search system 2 has an information search server 21 and a database 22 as shown in FIG.
- the database 22 may be provided in a storage medium (for example, a memory or an HDD) in the information search server 21, or may be provided in an external storage medium or a storage medium of another terminal.
- In the building table of the database 22, for each building, there are stored: building identification information for identifying the building; the building name; building information (address, telephone number, type, surrounding image data centered on the building, and the like); position information based on the latitude and longitude of the building; a description of the building (information written by the store, if it is a store); and posted information (comments from users who visited, image data posted by users, and the like).
- the database 22 stores map data in which each registered building is arranged with latitude and longitude as coordinate axes of a two-dimensional plane. In this map data, each building is shown at the position of the latitude and longitude corresponding to the building identification information.
- Using the input position information and azimuth information, the information search server 21 searches the map data for the building at the closest distance in the direction indicated by the azimuth information from the latitude/longitude position indicated by the position information, and obtains the building identification information of that building.
- The information search server 21 then looks up, in the building table, the entry corresponding to the acquired building identification information, and transmits each piece of information for that building (building name, building information, and the like) to the digital camera 1.
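One way the two-step server-side search above could be sketched is shown below. The function names, the flat-earth distance approximation, and the angular tolerance `tol_deg` are all assumptions for illustration; the patent specifies only "the closest building in the direction indicated by the azimuth information".

```python
import math

def find_building(map_data, building_table, lat, lon, azimuth_deg, tol_deg=15.0):
    """Return the building-table entry for the closest building lying
    within tol_deg of the given azimuth from (lat, lon), or None."""
    best_id, best_dist = None, float("inf")
    for bid, (blat, blon) in map_data.items():
        dy, dx = blat - lat, blon - lon            # flat-earth approximation
        bearing = math.degrees(math.atan2(dx, dy)) % 360.0
        diff = abs(bearing - azimuth_deg)
        diff = min(diff, 360.0 - diff)             # wrap-around difference
        dist = math.hypot(dx, dy)
        if diff <= tol_deg and dist < best_dist:   # in direction, and closest
            best_id, best_dist = bid, dist
    return building_table.get(best_id) if best_id is not None else None
```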
- The information search server 21 may selectively transmit the building information to the digital camera 1 according to constraints such as transmission/reception data capacity. In that case, the information search server 21 may transmit the remaining building information to the digital camera 1 in response to a predetermined operation (for example, a further request from the digital camera 1).
- FIG. 3 is a flowchart showing an operation example of the information search process in the present embodiment when it is desired to perform an information search in a state where the digital camera 1 is imaged toward the building.
- FIGS. 4 to 6 are diagrams showing the screen displayed on the display unit 17 of the digital camera 1 and the arrangement of the imaging button B1 of the digital camera 1.
- In the following description, as an example, a user is searching for a restaurant in which to eat dinner while walking in a downtown area.
- When the user, looking around the downtown area, finds a restaurant whose appearance he or she likes and wants information about, the user directs the lens of the digital camera 1 toward that restaurant (step S1) and, as shown in FIG. 4, selects the browse icon I1 provided in the image display field MP of the digital camera 1 by touching (pressing) it (step S2).
- At this time, the control unit 11 displays the image data captured by the imaging unit 13 in the image display field MP via the display unit 17, as illustrated in FIG. 4.
- The image display field MP (image display part) is composed of a touch screen (including, for example, pressure-sensitive, electrostatic, and acoustic-wave types).
- In step S2 described above, the browse icon I1 may instead be touched and selected after the imaging button B1 of the digital camera 1 is pressed, while viewing the image display field MP in which the captured image of the restaurant is displayed.
- a signal indicating that the browse icon I1 has been selected is output to the control unit 11 from a peripheral circuit (not shown).
- The control unit 11 detects from this input signal that the browse icon I1 has been selected, assigns image identification information to the image data of the restaurant captured by the imaging unit 13, and writes the image data and the time of imaging to the storage unit 15 in association with the image identification information (step S3).
- The control unit 11 also writes the position information acquired by the GPS 14 and the azimuth information acquired by the azimuth sensor 16 to the storage unit 15 as related information associated with the image identification information (step S4).
- After writing the image data and the related information to the storage unit 15, the control unit 11 transmits a search request signal including camera identification information for identifying the digital camera 1 (user identification information given to the user, or identification information given to the digital camera 1), the position information, and the azimuth information to the information search system 2 via the transmission/reception unit 12 (step S5).
- Alternatively, the control unit 11 may transmit the search request signal to the information search system 2 via the transmission/reception unit 12 without writing the image data and the related information to the storage unit 15.
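The search request signal described above carries the camera identification information, position information, and azimuth information. A minimal sketch of such a payload, assuming a JSON encoding and field names that are purely illustrative (the patent does not specify a wire format):

```python
import json

def build_search_request(camera_id, lat, lon, azimuth_deg):
    """Assemble a search request carrying the camera identification
    information, position information, and azimuth information."""
    return json.dumps({
        "camera_id": camera_id,                    # identifies the camera/user
        "position": {"lat": lat, "lon": lon},      # from the GPS 14
        "azimuth_deg": azimuth_deg,                # from the azimuth sensor 16
    })
```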
- When the information search server 21 receives the search request signal from the digital camera 1, it extracts the building to be searched from the map data in the database 22 based on the position information and the azimuth information included in the search request signal. The server 21 then reads each piece of information for that building from the building table of FIG. 2 using the building identification information, and transmits it to the digital camera 1 as search result information including the building identification information (step S7). At this time, as one example, the server 21 reads the network address of the digital camera 1 that was added when the search request signal was transmitted via the transmission/reception unit 12, and sends the search result information to this address.
- In step S7 described above, the server 21 may extract the building to be searched from the map data in the database 22 based on the position information and the azimuth information included in the search request signal only after detecting that the camera identification information included in the search request signal is registered in the user registration table stored in the database 22.
- When the server 21 detects that the camera identification information included in the search request signal is not registered in the user registration table, the server 21 may, for example, transmit to the digital camera 1 information indicating that user registration based on the camera identification information is necessary.
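The registration check described above can be sketched as a simple gate in front of the search. The function and field names are hypothetical illustrations of the behavior, not part of the embodiment.

```python
def handle_search_request(request, user_registration_table, do_search):
    """Serve the search only for registered cameras; otherwise reply
    that user registration based on the camera identification
    information is necessary."""
    if request["camera_id"] not in user_registration_table:
        return {"error": "user registration required",
                "camera_id": request["camera_id"]}
    return do_search(request["position"], request["azimuth_deg"])
```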
- When the digital camera 1 receives the search result information, the building information is displayed in the information display field SP via the display unit 17, as shown in FIG. 5 (step S8).
- When the building is a restaurant, information such as the type of restaurant (Chinese, Japanese, French, Italian, etc.), the restaurant's telephone number, and its address is displayed.
- the information display field SP shown in FIG. 5 is an example in the present embodiment, and the information display field SP may be displayed so as to partially overlap the restaurant image.
- the information display field SP is a part of the image display field MP, and is displayed when, for example, the above-described browse icon I1 or a CM icon I2 described later is touched and selected.
- When the user touches the CM icon I2, the peripheral circuit outputs a signal indicating that the CM icon I2 has been selected to the control unit 11.
- The control unit 11 detects that the CM icon I2 has been selected by receiving this signal, and displays the commercial information of the restaurant included in the search result information (menu, number of seats, in-store images, chef's recommended dish of the day, etc.) in the information display field SP via the display unit 17.
- When the commercial information includes image data such as an in-store image, the control unit 11 displays the image data in the information display field SP.
- When the user touches the posting browsing icon I3, the peripheral circuit outputs a signal indicating that the posting browsing icon I3 has been selected to the control unit 11.
- The control unit 11 detects that the posting browsing icon I3 has been selected by receiving this signal, and displays the posted information (comments) written by other users included in the search result information in the information display field SP via the display unit 17.
- The control unit 11 displays a plurality of pieces of posted information in the information display column SP, for example, in order of posting date.
- The posted information may include image data. If the posted information includes photographs of dishes taken by other users or image data of the inside of the store, the control unit 11 displays the image data in the information display column SP via the display unit 17.
- For example, when the control unit 11 detects a signal indicating that the information display field SP has been touched so as to slide rightward from its left end in FIG. 5, the control unit 11 ends the display of the information display field SP and, as shown in FIG. 4, displays the image of the restaurant in the image display field MP via the display unit 17.
- For example, when a signal indicating that the posting browsing icon I3 has been selected is received while the posted information is displayed in the information display field SP, the control unit 11 may end the display of the information display field SP or the display of the posted information.
- When displaying the commercial information (or posted information) after displaying the building information in the information display field SP, the control unit 11 may display the commercial information together with the building information, or may display the commercial information after deleting the building information from the information display field SP.
- When the control unit 11 detects a signal indicating that the in-store image displayed in the information display field SP has been touched, the control unit 11 displays an enlarged or reduced image of the in-store image in the information display field SP (or the image display field MP).
- When the user touches the mail icon I4, the peripheral circuit outputs a signal indicating that the mail icon I4 has been selected to the control unit 11.
- The control unit 11 detects that the mail icon I4 has been selected by receiving this signal, and displays an input field IP as shown in FIG. on a part of the image display field MP via the display unit 17.
- The input field IP is composed of a text field T in which the user inputs information and a touch-screen keyboard unit K. The input field IP is used when the user inputs posted information, a store name, or a year, as described later.
- The user enters posted information, such as comments on the food eaten, the atmosphere in the store, or the quality of service of the store staff, in the text field T of the input field IP using the touch-screen keyboard unit K. Then, when the user touches the mail icon I4 again, the control unit 11 detects that the mail icon I4 has been selected, treats the character information entered in the text field T and the image data displayed in the image display field MP as posted information, and transmits a search request signal including the posted information, together with the camera identification information and the building identification information, to the information search system 2 via the transmission / reception unit 12.
- When the server 21 receives the search request signal from the digital camera 1, the server 21 writes the newly input posted information into the posting field of the building table in the database 22 corresponding to the building identification information.
- Before writing the posted information entered in the posting entry field (text field T) into the posting field of the building table in the database 22, the server 21 may check whether the camera identification information included in the search request signal is present in the user registration table in the database 22, and write the posted information into the posting field of the building table only when it detects that the camera identification information is registered.
- the control unit 11 detects whether or not the end icon E has been selected (step S9).
- When the user touches the end icon E, the peripheral circuit outputs a signal indicating that the end icon E has been selected to the control unit 11.
- The control unit 11 detects that the end icon E has been selected by receiving this signal, and ends the information search process.
- On the other hand, when the signal indicating that the end icon E has been selected is not input, the control unit 11 returns the process to step S1 and continues the imaging process and the information search process of the digital camera 1.
- In the present embodiment, the control unit 11 may include the captured image data in the search request signal transmitted to the information search system 2 when the browse icon I1 is selected. The server 21 then compares, by image recognition, the peripheral image data in the building information retrieved from the building table (based on the building identification information of the building extracted from the map data) with the transmitted image data: a feature quantity is extracted from the image data and compared with the feature quantity of the stored peripheral image data, and the server 21 determines whether the similarity between the two is equal to or greater than a set value. If the comparison determines that the building in the peripheral image data is not similar to the building in the image data, the server 21 may extract the next closest building from the map data based on the position information and the direction information, compare feature quantities again, and evaluate the similarity between the newly extracted peripheral image data of that building and the image data being captured.
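- As a rough sketch of the server-side similarity check described above, a feature quantity can be extracted from each image and compared against a set value. A real system would use robust image features; the tiny intensity histogram, the function names, and the threshold below are illustrative assumptions only.

```python
def extract_features(pixels):
    """Build a small intensity histogram as a stand-in feature vector."""
    hist = [0] * 8
    for p in pixels:
        hist[min(p // 32, 7)] += 1
    total = float(len(pixels)) or 1.0
    return [h / total for h in hist]

def similarity(f1, f2):
    """Histogram intersection: 1.0 means identical distributions."""
    return sum(min(a, b) for a, b in zip(f1, f2))

def is_same_building(captured_pixels, stored_pixels, threshold=0.8):
    """True when similarity meets or exceeds the set value."""
    return similarity(extract_features(captured_pixels),
                      extract_features(stored_pixels)) >= threshold
```

When `is_same_building` returns False, the server would extract the next closest building from the map data and repeat the comparison, as described above.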
- the browse icon I1 in the present embodiment may be provided in the imaging apparatus 1 as a browse button (not shown), similarly to the imaging button B1 described above.
- the function of the browse button is the same as the function of the browse icon I1 described above.
- the imaging button B1 in the present embodiment may function as a browse button by operating a switch unit (not shown). In this case, the browse icon I1 or the browse button is not necessary.
- FIG. 7 is a flowchart showing an operation example of the information search process in the present embodiment in which the digital camera 1 captures an image of a building, reads the image data stored in the storage unit 15, and performs an information search on that image data. In the following description, an example is shown in which a user, while wandering the downtown area of Shinjuku looking for a restaurant for dinner, captures image data of several restaurants and then searches for a restaurant to eat at from among them.
- The user points the digital camera 1 toward a restaurant with an appealing appearance while wandering the downtown area (step S11).
- Then, the user presses the imaging button B1 (shutter) provided on the digital camera 1 (step S12).
- Here, the control unit 11 displays the image data of the restaurant being captured by the imaging unit 13 in the image display field MP of the display unit 17, as shown in FIG.
- the peripheral circuit outputs a signal indicating that the imaging button has been pressed to the control unit 11.
- The control unit 11 detects that the imaging button B1 has been pressed by receiving the signal indicating that the imaging button B1 has been pressed, assigns image identification information to the image data of the restaurant being captured by the imaging unit 13, and writes the image data and the imaging time into the storage unit 15 in association with the image identification information (step S13).
- The control unit 11 also writes the position information acquired by the GPS 14 and the azimuth information acquired by the azimuth sensor 16 at the time the imaging button B1 was pressed into the storage unit 15 in association with the image identification information (step S14).
- Next, the control unit 11 determines whether or not image data is stored in the storage unit 15, based on whether or not a signal indicating that the read icon R has been selected is input (step S15). When a signal indicating that the read icon R has been selected is input, the control unit 11 detects that the read icon R has been selected and proceeds to step S16 to start the information search.
- the read icon R may be a read button (not shown).
- the readout button may be provided on the main body of the digital camera 1 like the imaging button B1.
- On the other hand, when a signal indicating that the read icon R has been selected is not input, the control unit 11 returns the process to step S11 in order to perform a new imaging process.
- When the read icon R is selected, the control unit 11 sequentially reads the image data from the storage unit 15 in time-series order of imaging (step S16), and displays the image data in the image display field MP via the display unit 17 as thumbnail images, a preset number at a time (or one image at a time). When the thumbnail images do not fit on one page (or when displayed one by one) and the user touches the image displayed in the image display field MP so as to slide it in a predetermined direction, the thumbnail images of the previous page or the next page are displayed in the image display field MP.
- That is, the control unit 11 displays the thumbnail images of the previous or next page in the image display field MP via the display unit 17 upon detecting a signal indicating that the image displayed in the image display field MP has been touched so as to slide in a predetermined direction.
- When the user touches and selects a desired image and then touches the browse icon I1, the peripheral circuit outputs a signal indicating that the browse icon I1 has been selected, together with the selected image identification information, to the control unit 11.
- The control unit 11 reads the position information and the azimuth information corresponding to the image identification information of the selected image data from the storage unit 15, and transmits a search request signal including the camera identification information, the position information, and the azimuth information to the information search system 2 via the transmission / reception unit 12 (step S17).
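- The search request of step S17 can be sketched as follows: the position and azimuth stored at steps S13 and S14 are looked up by image identification information and packaged together with the camera identification information. The dictionary layout and field names are illustrative assumptions, not part of the original.

```python
def build_search_request(storage, camera_id, image_id):
    """Assemble a search request from metadata stored per image."""
    record = storage[image_id]  # position/azimuth written at steps S13-S14
    return {
        "camera_id": camera_id,        # camera identification information
        "latitude": record["latitude"],
        "longitude": record["longitude"],
        "azimuth": record["azimuth"],  # azimuth information
    }
```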
- Next, in step S18, the control unit 11 detects whether or not the end icon E has been selected. When the user touches the end icon E and a signal indicating that the end icon E has been selected is received from the peripheral circuit, the control unit 11 ends the browsing process. On the other hand, when the signal indicating that the end icon E has been selected is not input, the control unit 11 returns the process to step S16 and continues the process of selecting an image from the thumbnail images.
- FIG. 8 is a flowchart showing an operation example of the information search process in the present embodiment in which, when the user inputs a store name to the digital camera 1, information on the restaurant with that store name is searched from the building table of FIG. 2 stored in the database 22. In the following description, an example is shown in which a user having dinner in Harajuku confirms restaurant information heard from an acquaintance.
- When the user selects the search icon I5 by touching it, the peripheral circuit outputs a signal indicating that the search icon I5 has been selected to the control unit 11.
- The control unit 11 detects that the search icon I5 has been selected by receiving this signal, and displays an input field IP as shown in FIG. on a part of the image display field MP via the display unit 17.
- The user writes the store name of the restaurant to be searched in the text field T of the input field IP using the touch-screen keyboard unit K (step S21), and touches the search icon I5 again (step S22).
- the peripheral circuit outputs the character data of the store name input in the text field T to the control unit 11 together with a signal indicating that the search icon I5 has been selected.
- The control unit 11 detects that the search icon I5 has been selected by receiving this signal, reads the character data of the store name input in the text field T, and transmits it, together with its own camera identification information, as a search request signal to the information search system 2 via the transmission / reception unit 12 (step S23).
- When the server 21 receives the search request signal, the server 21 reads the information on the building (store) corresponding to the store name (the store information, i.e., each item in the table of FIG. 2) from the building table in the database 22 (step S24).
- After reading the store information, the server 21 transmits the acquired store information to the digital camera 1 (step S25).
- When receiving the store information, the control unit 11 displays the image data of the surroundings of the store included in the building information in the image display field MP, and displays the building information in the information display field SP (step S26). For example, the type of restaurant (Chinese, Japanese, French, Italian, etc.) and the telephone number are displayed.
- When the user selects the CM icon I2 by touching it, the control unit 11 detects that the CM icon I2 has been selected, and displays the commercial information of the restaurant included in the search result information (the menu, the chef's recommended dish of the day, etc.) in the information display field SP via the display unit 17.
- When the user selects the posting browsing icon I3 by touching it, the peripheral circuit outputs a signal indicating that the posting browsing icon I3 has been selected to the control unit 11.
- The control unit 11 detects that the posting browsing icon I3 has been selected by receiving this signal, and displays the information posted by other users included in the search result information in the information display field SP via the display unit 17.
- control unit 11 when there are a plurality of pieces of post information, sequentially displays the plurality of pieces of post information via the display unit 17 in the information display column SP. In addition, when the posting information includes image data, the control unit 11 displays the image data in the information display field SP via the display unit 17.
- Then, the control unit 11 detects whether or not the end icon E has been selected (step S27).
- The detection of the selection of the end icon E is the same as the process in step S9 (in FIG. 3) in <information retrieval of an image captured by the digital camera 1>, and thus the description thereof is omitted.
- the control unit 11 ends the browsing process.
- the control unit 11 returns the process to step S21 and continues to search for store information.
- The building table of the database 22 may be configured to store discount coupon data for each store. In that case, the server 21 adds the discount coupon information to the search result information and transmits it to the digital camera 1. When the user eats or shops using the discount coupon information, the server 21 detects whether or not the user has eaten or shopped based on whether or not the discount coupon information has been used. For example, when the discount coupon information is a barcode (including a two-dimensional barcode) displayed in the information display field SP of the digital camera 1, the reader of the store reads the barcode, and usage information indicating that the discount coupon information transmitted from the digital camera 1 has been used is transmitted to the information search system 2 together with the camera identification information and the shopping (or meal) charge.
- When the server 21 receives the usage information, the server 21 reads from the database 22 the history information corresponding to the camera identification information added to the usage information, and charges the store differently as an information provision fee according to the stage of information use at which the store was used: whether the store was used after an information search, after viewing the provided store information, or after viewing the posted information, that is, according to how much of the information in the database 22 was used.
- the server 21 writes, for each camera identification information, the stage of information search and the history of using discount coupon information in the history table stored in the database 22.
- This bar code includes building identification information indicating a store.
- The database 22 is provided with a billing table that stores, corresponding to the building identification information, a billing history and an accumulated billing value for each store. The server 21 writes the amount used and the charge corresponding to the stage of information use as history into the billing table corresponding to the building identification information, adds the new charge to the accumulated value, and updates the accumulated value.
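- The billing-table update described above can be sketched as follows: a history entry is appended and the new charge is added to the accumulated value for the building. The dictionary layout is an assumption for illustration.

```python
def record_charge(billing_table, building_id, amount, usage_stage):
    """Append a billing history entry and update the accumulated value."""
    entry = billing_table.setdefault(
        building_id, {"history": [], "total": 0})
    entry["history"].append({"amount": amount, "stage": usage_stage})
    entry["total"] += amount           # add the new charge to the sum
    return entry["total"]
```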
- The server 21 obtains points corresponding to the usage amount when the discount coupon information is used (for example, points calculated by multiplying the usage amount by a point coefficient), and accumulates them in the user registration table in the database 22 in correspondence with the camera identification information.
- The server 21 also accumulates points of a preset value in the user registration table, in correspondence with the camera identification information, for a user who has transmitted posted information. These points can be used in place of money, together with the discount coupon information, when paying a charge.
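- The point accrual described above can be sketched as follows; the point coefficient and the preset posting bonus are illustrative assumptions, not values from the original.

```python
POINT_COEFFICIENT = 0.05   # hypothetical: 5 points per 100 units spent
POSTING_BONUS = 10         # hypothetical preset bonus per posted comment

def add_coupon_points(user_table, camera_id, usage_amount):
    """Accrue points proportional to the amount spent with a coupon."""
    points = int(usage_amount * POINT_COEFFICIENT)
    user_table[camera_id] = user_table.get(camera_id, 0) + points
    return user_table[camera_id]

def add_posting_points(user_table, camera_id):
    """Accrue a fixed bonus for transmitting posted information."""
    user_table[camera_id] = user_table.get(camera_id, 0) + POSTING_BONUS
    return user_table[camera_id]
```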
- In the first stage, the server 21 may transmit to the digital camera 1, as the posted information, a nickname and a title together with the posting identification information. The control unit 11 then initially displays only the plurality of nicknames and titles in the information display field SP. The user touches and selects the nickname and title of the posted information of interest from among the plurality of displayed items. Thereby, the peripheral circuit transmits a signal indicating that the nickname and title have been selected, together with the posting identification information of the selected posted information, to the control unit 11. The control unit 11 then transmits a posted-information transmission request together with the posting identification information to the information search system 2.
- the server 21 transmits the character data and the image data (the entire posted information) corresponding to the transmitted posted identification information to the digital camera 1.
- the server 21 assigns post identification information for identification to each post information, and stores it in the user registration table in association with the camera identification information.
- When the control unit 11 receives the character data and image data from the information search system 2 via the transmission / reception unit 12, the control unit 11 may display the character data in the information display field SP and the image data in the image display field MP.
- The server 21 searches the user registration table for the posting identification information corresponding to the referred posted information, and increments the reference count of the camera identification information corresponding to that posting identification information. Then, in the first nickname-and-title display stage in which the user selects posted information, the server 21 transmits to the digital camera 1 a display list in which the nicknames and titles of users with higher reference counts are displayed higher in the information display column SP, that is, ordered by how often the posted information has been selected. The control unit 11 displays the nicknames and titles in order in the information display column SP according to this display list.
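- The reference-count ordering described above can be sketched as follows: each selection of a posting increments its reference count, and the nickname / title list is returned in descending order of that count. The record structure is an assumption for illustration.

```python
def reference_posting(postings, post_id):
    """Count one more selection (reference) of the given posting."""
    postings[post_id]["ref_count"] += 1

def display_list(postings):
    """Return (nickname, title) pairs, most-referenced first."""
    ranked = sorted(postings.values(),
                    key=lambda p: p["ref_count"], reverse=True)
    return [(p["nickname"], p["title"]) for p in ranked]
```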
- The database 22 may have a past image table in which image data of the buildings and landscape at each latitude and longitude, captured every year, is stored in correspondence with each latitude and longitude.
- When the user touches the past image icon I6, the peripheral circuit outputs a signal indicating that the past image icon I6 has been selected to the control unit 11.
- The control unit 11 detects that the past image icon I6 has been selected by receiving this signal, and displays the input field IP as shown in FIG. on a part of the image display field MP via the display unit 17. Then, after entering the year (for example, a calendar year) in the text field T using the touch-screen keyboard unit K, the user touches the past image icon I6 (or the browse icon I1).
- the peripheral circuit transmits the character data for the year to the control unit 11 together with a signal indicating that the past image icon I6 has been selected.
- the control unit 11 reads character data indicating the year entered in the text column T. After reading the year data, the control unit 11 transmits a past image search request to the information search system 2 together with the read year, the position information, the direction information, and the camera identification information.
- The server 21 selects the past image table corresponding to the latitude and longitude based on the position information, received from the digital camera 1, of the building displayed in the image display field MP. Then, the server 21 reads the image data corresponding to the azimuth information and the year from the selected past image table, and transmits the read image data to the digital camera 1.
- the digital camera 1 displays the image data corresponding to the year received from the server 21 in the image display field MP via the display unit 17.
- When information on a building corresponding to the year is stored, the server 21 transmits that information to the digital camera 1 together with the read image data. As a result, the user can obtain information such as what kind of building existed in each past year, or that nothing existed and there was only landscape.
- When there is no past image table for the corresponding latitude and longitude, the server 21 searches the past image table for the latitude and longitude closest to that latitude and longitude in the direction of the azimuth information, and extracts the image data from it. When the year to be searched is an era for which no photographs exist, landscape image data created in that era, or CG (Computer Graphics) image data created from imagination, may be stored in the past image table, in correspondence with the year, in place of photographic image data.
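- The fallback lookup described above can be sketched as follows: choose the stored past image table whose latitude / longitude is closest to the requested position, then fetch the image for the requested year. The table layout and the flat squared-distance metric are simplifying assumptions.

```python
def nearest_past_images(past_tables, lat, lon):
    """past_tables maps (lat, lon) -> {year: image_id}; pick the closest."""
    def sq_dist(key):
        return (key[0] - lat) ** 2 + (key[1] - lon) ** 2
    return past_tables[min(past_tables, key=sq_dist)]

def past_image(past_tables, lat, lon, year):
    """Return the image for the given year, or None if none is stored."""
    return nearest_past_images(past_tables, lat, lon).get(year)
```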
- As described above, the control unit 11 adds image identification information to the image data obtained by the imaging unit 13 capturing the subject, together with the latitude / longitude information (position information) indicating the location of the digital camera 1 itself obtained from the GPS 14 and the azimuth information indicating the direction of the optical axis of the digital camera 1 (the imaging direction) obtained from the azimuth sensor 16 (azimuth angle detection unit), and writes them into the storage unit 15 in imaging order.
- FIG. 9 is a block diagram illustrating a configuration example of an information acquisition system having a function of acquiring and storing AR information.
- In FIG. 9, a hold control unit 31 and an AR information storage unit 30 are newly added to the configuration of FIG. 1 as functions for acquiring and storing the AR information.
- the AR information in the present embodiment indicates various types of information (information on the subject) shown in the building table shown in FIG.
- The AR information includes, for example, building identification information for identifying a building, the building name, building information (address, telephone number, type, information on peripheral image data centered on the building), position information based on the latitude and longitude of the building, a description of the building (information written by the store, if it is a store), posted information (comments such as evaluations by visiting users, and user comments and image data submitted by users), and the like.
- the building identification information may be a URL indicating a place where the building information is stored.
- In addition, the control unit 11 may be configured so that the user can set which of the AR information is to be displayed as a tag (augmented reality object) superimposed on the image data of the through image (the image obtained by capturing the subject).
- The through image is the image data that the imaging unit 13 continuously outputs to the control unit 11 from the images formed on the imaging element, and that the control unit 11 sequentially displays on the display unit 17.
- The hold control unit 31 detects an action requesting that the AR information be held, that is, the hold timing for storing, in the AR information storage unit 30, the image data that the imaging unit 13 forms on an element such as a CCD and that is displayed on the display unit 17, together with the AR information of the buildings in the image data (within the angle of view of the image).
- the AR information storage unit 30 may be provided in the information search system 2.
- The control unit 11 stores in advance in the storage unit 15 a table indicating the correspondence between the focal length, the magnification, and the like and the angle of view, and obtains the angle-of-view information from the focal length, the magnification, and the like at the time the image data was acquired.
- FIG. 10 is a diagram showing a composite image (AR image) on the display unit 17 in which real images (through images) of buildings T1 to T4 are combined with images of tags TB1, TB2, TB3, and TB4, which are augmented reality objects describing the AR information of each of the buildings T1 to T4 (for example, building information as augmented reality).
- The digital camera 1 in the present embodiment has an AR information acquisition mode for acquiring AR information and a normal imaging mode.
- When the user presses an AR information acquisition mode button (AR information acquisition button ARB1) provided on the housing of the digital camera 1, or touches an AR information acquisition mode icon (AR information acquisition icon ARI1) displayed on the display unit 17, the control unit 11 detects the signal from the switch generated when the button is pressed, or detects the touch of the icon on the display unit 17 by the touch sensor, and switches between the AR information acquisition mode and the normal imaging mode.
- The digital camera 1 in the present embodiment also has a mode for displaying the AR information as tags (combining the image data with the augmented reality objects of the AR information) (AR display mode) and a mode for hiding them (AR non-display mode).
- The control unit 11 detects the signal from the switch generated when a display / non-display button (AR switching button ARB2) provided on the housing of the digital camera 1 is pressed, or detects by the touch sensor the touch of a display / non-display icon (AR switching icon ARI2) displayed on the display unit 17, and switches between display and non-display.
- The display unit 17 is provided with a transparent touch sensor on the display element. When the touch sensor is touched, the coordinate values of the touched area are output to the control unit 11 as a detection signal.
- The control unit 11 matches the coordinates of the touch sensor with those of the display element, and determines from the coordinate values of the detection signal the displayed image that was touched (for example, which icon displayed on the display unit 17 indicates activation of an application or the like, or the image identification information of image data displayed as a thumbnail image).
- Whenever the azimuth angle (azimuth information) of the digital camera 1 changes, the control unit 11 transmits the latitude / longitude information (position information), azimuth information, and angle-of-view information at the time of capturing the image data displayed on the display unit 17 when the digital camera 1 was held, to the information search system 2 via the transmission / reception unit 12, together with an AR information acquisition request.
- the control unit 11 may acquire the AR information by transmitting the position information and the AR information acquisition request to the information search system 2 via the transmission / reception unit 12 periodically or according to a user instruction.
- The control unit 11 may also acquire the AR information by transmitting the image data held by the hold control unit 31 together with an AR information acquisition request to the information search system 2 via the transmission / reception unit 12, periodically or according to a user instruction.
- The control unit 11 detects the azimuth angle at every predetermined period, and when the azimuth angle changes from the azimuth angle already stored in the azimuth angle storage unit of the storage unit 15 by more than a preset angle, the control unit 11 detects this as a change in azimuth angle and newly stores the current azimuth angle in the azimuth angle storage unit.
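- The azimuth-change check described above can be sketched as follows; the wrap-around handling and the threshold value are illustrative assumptions.

```python
ANGLE_THRESHOLD = 15.0  # degrees; hypothetical preset angle

def azimuth_changed(stored, current, threshold=ANGLE_THRESHOLD):
    """True when the azimuth moved beyond the preset angle."""
    diff = abs(current - stored) % 360.0
    diff = min(diff, 360.0 - diff)   # shortest angular distance
    return diff > threshold
```

When this returns True, the new azimuth would replace the stored one and an AR information acquisition request would be issued.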
- When the information search server 21 receives the AR information acquisition request from the digital camera 1, the information search server 21 obtains, based on the latitude / longitude information, the azimuth information, and the angle-of-view information added to the AR information acquisition request, the search distance from the coordinate position of the latitude and longitude indicated by the latitude / longitude information to the coordinate position of the latitude and longitude up to which the AR information is searched. That is, the information search server 21 obtains the search distance by reading the search distance corresponding to the angle-of-view information from the search distance table of the database 22, in which angle-of-view information and the search distance corresponding to each angle of view are described.
- FIG. 12 is a diagram illustrating a search range for searching for the AR information of buildings. The information search server 21 obtains the search range shown in FIG. 12 from the coordinate position based on the latitude/longitude information, the azimuth angle of the optical axis of the lens of the digital camera 1, and the search distance, and reads the building identification information of the buildings within this search range from the map data in the database 22. In addition, the information search server 21 reads the AR information of each building from the building table in the database 22 based on the read building identification information, and transmits the read AR information to the digital camera 1 together with the corresponding building identification information.
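- The search range of FIG. 12 (a fan centered on the optical-axis azimuth and bounded by the search distance) can be approximated as follows; the flat-earth (equirectangular) approximation and all numeric values are assumptions for illustration, not the server's actual geometry.

```python
import math

def in_search_range(cam_lat, cam_lon, az_deg, view_angle_deg, dist_m,
                    bldg_lat, bldg_lon):
    """Roughly test whether a building lies inside the fan-shaped search
    range: within dist_m of the camera, and within half the angle of view
    on either side of the optical-axis azimuth (0 degrees = north)."""
    m_per_deg = 111_320.0  # meters per degree of latitude (approximate)
    dx = (bldg_lon - cam_lon) * m_per_deg * math.cos(math.radians(cam_lat))
    dy = (bldg_lat - cam_lat) * m_per_deg
    if math.hypot(dx, dy) > dist_m:
        return False
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0
    diff = abs((bearing - az_deg + 180.0) % 360.0 - 180.0)
    return diff <= view_angle_deg / 2.0
```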
- The control unit 11 matches the coordinate values based on the latitude/longitude information of the digital camera 1 with the coordinate values indicated by the latitude/longitude information included in the AR information, superimposes and combines images of AR information tags on the image data (through image), and displays the result on the display unit 17 in the state shown in FIG.
- When the control unit 11 receives the hold control signal from the hold control unit 31, it transmits the latitude/longitude information, the azimuth information, and the angle-of-view information at the time of holding, together with an AR information acquisition request, to the information search system 2. The AR information obtained in response and the building identification information for identifying the buildings of that AR information are recorded in the AR information storage unit 30 for each piece of image identification information of the image data held.
- FIG. 11 is a diagram showing a configuration of an AR information table stored in the AR information storage unit 30 in FIG.
- the AR information table is stored in the AR information storage unit 30 for each piece of image identification information.
- Upon receiving the AR information, the control unit 11 writes the building identification information (a URL indicating the location where the AR information of each building is stored) and the AR information (including the latitude/longitude information of each building), together with the latitude/longitude information, the azimuth information, and the angle-of-view information of the digital camera 1 itself at the time the hold control signal was input, into the AR information storage unit 30 as an AR information table. Further, when the hold control signal is input, the control unit 11 writes the image data formed on the image sensor of the imaging unit 13 at that time into the storage unit 15 in association with the image identification information assigned to that image data, and stores it.
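- The AR information table described above can be sketched as a data structure as follows; the field names are hypothetical, and the actual table layout is the one shown in FIG. 11.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ARInfoEntry:
    """One building's entry in the AR information table (hypothetical names)."""
    building_id_url: str   # URL where the building's AR information is stored
    building_lat: float
    building_lon: float
    ar_info: dict          # building name, description, posted information, etc.

@dataclass
class ARInfoTable:
    """Per-image table written when the hold control signal is input."""
    image_id: str          # image identification information
    camera_lat: float      # camera position at the time of holding
    camera_lon: float
    azimuth_deg: float
    view_angle_deg: float
    entries: List[ARInfoEntry] = field(default_factory=list)
```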
- The hold control unit 31 has detection means, such as a switch, that detects when the AR information acquisition button ARB1 provided on the housing of the digital camera 1 is pressed, or touch-sensor detection means that detects when the icon indicating AR information acquisition (AR information acquisition icon ARI1) displayed on the display unit 17 is touched. When the AR information acquisition button is pressed, or when the icon indicating AR information acquisition is touched, the hold control unit 31 detects that a signal for recording the image data and the AR information has been input, and outputs a hold control signal for performing hold control to the control unit 11. As described above, the hold control unit 31 detects that a signal for recording the image data and the AR information has been input by a motion (action, operation) of the user on the digital camera 1, and outputs the hold control signal for performing hold control.
- Alternatively, the hold control unit 31 may detect a sudden movement of the digital camera 1 (for example, an operation of turning the digital camera 1 downward or swinging it so that the screen is easier to see) as the action for holding the AR information, and set the detection timing as the hold timing. In this case as well, the hold control unit 31 outputs a hold control signal to the control unit 11 as a control signal instructing hold control.
- That is, the hold control unit 31 detects that the signal for recording the image data and the AR information has been input by a motion of the terminal (for example, the digital camera 1), and outputs the hold control signal for performing hold control to the control unit 11.
- Alternatively, as the action for holding the AR information, the hold control unit 31 may detect a change in the image pattern within the image data, for example by detecting image feature points and determining that a feature point has moved beyond a preset distance within a preset detection range, or by detecting that the contrast of the image data output from the image sensor has changed beyond a preset difference, and may set this detection timing as the hold timing. In detecting the change in the image pattern, the hold control unit 31 outputs a hold control signal to the control unit 11 as a control signal for performing hold control when it detects a change in the image pattern equal to or greater than a preset threshold value.
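- The contrast-based variant of this trigger can be sketched as follows; the max-min contrast measure and the threshold value are assumptions for illustration (a real implementation would operate on image-sensor output).

```python
def contrast(frame):
    """Contrast as the max-min pixel difference of a grayscale frame,
    represented here as a list of rows of pixel values."""
    pixels = [p for row in frame for p in row]
    return max(pixels) - min(pixels)

def should_hold(prev_frame, cur_frame, threshold=40):
    """Emit a hold trigger when contrast changes beyond the preset difference."""
    return abs(contrast(cur_frame) - contrast(prev_frame)) >= threshold
```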
- Alternatively, the hold control unit 31 may use an elevation angle sensor to detect a change in the angle of the digital camera 1 (moved by an operation such as turning it downward or swinging it so that the screen is easier to see) as the action for holding the AR information, and set the detection timing as the hold timing. In this case, the hold control unit 31 outputs the hold control signal to the control unit 11 as a control signal for performing hold control.
- That is, the hold control unit 31 detects that the signal for recording the image data and the AR information has been input by a motion of the terminal (for example, the digital camera 1), and outputs the hold control signal for performing hold control to the control unit 11.
- the control unit 11 acquires AR information corresponding to the image data when the hold control signal is received.
- When the hold control signal is input, the control unit 11 writes into the AR information storage unit 30, as an AR information table with time information added, the data stored in the buffer unit that was obtained from the information search system 2 immediately before the hold control signal was received, or at the time of an AR information acquisition request a preset time earlier: namely, the image data obtained for each AR information acquisition request corresponding to the image identification information, the building identification information corresponding to that image data, the AR information of each building, the latitude/longitude information, the azimuth information, and the angle-of-view information. In addition, the control unit 11 writes and stores the image data in the storage unit 15 together with the image identification information. Further, the control unit 11 and the hold control unit 31 may be configured to perform AR information acquisition or AR information hold processing even when a non-display mode, in which the AR display is not shown, is set.
- When displaying the AR information of a held image, in response to an instruction signal generated by pressing the AR display confirmation button (AR switching button ARB2) provided on the housing of the digital camera 1, or by touching the AR display confirmation icon (AR switching icon ARI2) displayed on the display unit 17, the control unit 11 compares the time information of the AR information tables stored in the AR information storage unit 30 with the time of its internal clock, selects and reads the AR information table with the latest time from the AR information storage unit 30, and displays the image data with AR information tags on the display unit 17, in the same way as when the information is retrieved from the information search system 2.
- The control unit 11 may periodically acquire the AR information table at the latest time from the information search server 21, or may acquire the AR information table at the latest time from the information search server 21 based on the instruction signal. Further, in the AR information acquisition mode, the control unit 11 may display the AR information of the held image upon receiving the hold control signal at the time of holding.
- control unit 11 is configured to read image data corresponding to the image identification information stored in the AR information storage unit 30 from the storage unit 15 and display a thumbnail image of the read image data on the display unit 17. May be.
- the control unit 11 may be configured to select image data when the user touches the thumbnail image displayed on the display unit 17.
- the control unit 11 reads the AR information table of the image identification information from the AR information storage unit 30 based on the image identification information associated with the selected thumbnail image, and displays it on the display unit 17. At this time, as described above, the user can select display / non-display of the AR information.
- If the control unit 11 stores in the storage unit 15 or the like, as the AR information, all of the items described in the building table (the building identification information for identifying the building, the building name that is the name of the building, the building information, the position information based on the latitude and longitude of the building, the description of the building, and the posted information), then even in a place where the digital camera cannot communicate with the radio base station 3, that is, offline, the control unit 11 can display the image data stored in the AR information storage unit 30 on the display unit 17 with AR information tags added as shown in FIG.
- Moreover, the control unit 11 may store the AR information and, when necessary, make an AR information acquisition request to the information search system 2 again.
- When the control unit 11, in the AR information acquisition mode, receives a hold control signal from the hold control unit 31, the control unit 11 displays an icon prompting input of a user ID, icons showing alphabetic and numeric keys, and a user ID entry field.
- By detecting that the icons showing the alphabetic and numeric keys have been touched, the control unit 11 detects the corresponding character data, writes it into an internal buffer, and displays the character data in the internal buffer in the user ID entry field.
- Then, by detecting that the enter key among the icons showing the alphabetic and numeric keys has been touched, the control unit 11 fixes the character string in the internal buffer as the user ID. The control unit 11 adds the user ID stored in the internal buffer to the AR information table and stores it.
- When the user wants to refer to the AR information and selects image data, the control unit 11 displays a prompt for inputting a user ID, icons showing alphabetic and numeric keys, and a user ID entry field. By detecting that the icons showing the alphabetic and numeric keys have been touched, the control unit 11 detects the corresponding character data, writes it into the internal buffer, and displays the character data in the internal buffer in the user ID entry field. Then, on detecting that the enter key among the icons showing the alphabetic and numeric keys has been touched, the control unit 11 compares the user ID corresponding to the character string in the internal buffer with the user ID added to the AR information table corresponding to the image identification information of the selected image data.
- When the user IDs match, the control unit 11 displays an image (AR image) obtained by combining the image data and the AR information on the display unit 17. When the user ID added to the AR information table is different, only the image data is displayed.
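- The user ID check that gates the AR display can be sketched as follows; the dictionary keys are hypothetical names for the AR information table fields.

```python
def display_for(image, ar_table, entered_user_id):
    """Return what the display unit would show: the AR composite only when
    the entered user ID matches the one stored in the AR information table;
    otherwise the image data alone, without AR tags."""
    if ar_table.get("user_id") == entered_user_id:
        return {"image": image, "tags": ar_table["ar_info"]}  # AR image
    return {"image": image, "tags": []}                       # image only
```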
- the digital camera 1 can display AR information by specifying a user (or user ID) for each image.
- When the user does not want to display the AR information registered or acquired by other users in the information search system 2, but only the AR information registered or acquired by the user himself or herself, the names of the users who use the digital camera 1 are stored in advance in the storage unit 15 or the database 22 as a user table, like an address book. When the icon for referring to AR information (AR switching icon ARI2) is touched, the control unit 11 displays a list of user names (or a list of user IDs) acquired from the user table on the display unit 17.
- When the user touches his or her own user ID in the list, the control unit 11 may read, from the AR information storage unit 30, the AR information tables to which the same user ID as the touched user ID has been added, read from the storage unit 15 the image data having the same image identification information as the image identification information in those AR information tables, and display the image data as thumbnail images on the display unit 17.
- Similarly, based on a signal generated by selecting another user while the list is displayed, the control unit 11 may read, from the AR information storage unit 30, the AR information tables to which the same user ID as that other user's user ID has been added, read from the storage unit 15 the image data having the same image identification information as the image identification information in those AR information tables, and display the image data as thumbnail images on the display unit 17.
- When storing the AR information table for each piece of image identification information in the AR information storage unit 30, the control unit 11 may store not only the AR information received from the information search system 2 but also, as AR related information, parameters or the shooting date/time (time information) associated with the AR information, added to the AR information table in association with each piece of building identification information.
- When the AR date search icon displayed on the display unit 17 is pressed, the control unit 11 detects the press with the sensor and displays, on the display unit 17, an input field for entering a date together with icons showing alphabetic and numeric keys. When a date is entered, the control unit 11 searches the AR information storage unit 30 and extracts the AR information tables to which the same date has been added. Then, the control unit 11 reads the image identification information corresponding to the extracted AR information tables, reads the image data of this image identification information from the storage unit 15, and displays it as thumbnail images on the display unit 17.
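- The date search over the stored AR information tables can be sketched as follows; the field names are hypothetical.

```python
def tables_for_date(ar_tables, date):
    """Extract the AR information tables whose shooting date matches the
    entered date, and return the image IDs to fetch for thumbnail display."""
    matches = [t for t in ar_tables if t.get("date") == date]
    return [t["image_id"] for t in matches]
```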
- When the user touches a thumbnail image, the control unit 11 detects the touch with the sensor of the display unit 17 and detects the image identification information of the image data selected by the touch.
- Then, the control unit 11 reads the image data corresponding to the image identification information of the selected image data from the storage unit 15 and displays it on the display unit 17. Further, the control unit 11 reads the AR information table corresponding to the image identification information, and displays on the display unit 17 a composite image (AR image) in which the AR information and the AR related information are superimposed on the displayed image data.
- The control unit 11 then determines whether the current AR information display icon has been touched. Upon detection, based on the detection signal, the control unit 11 extracts the AR information table of the image identification information of that image data from the AR information storage unit 30. Then, the control unit 11 adds the latitude/longitude information, the azimuth information, and the angle-of-view information recorded when the AR information was acquired, taken from an AR information table between the present and the past (in this embodiment, 10 years), and transmits an AR information search request to the information search system 2.
- Alternatively, based on the AR information table, the control unit 11 may acquire from the information search system 2 all the AR information between the present and the past (in this embodiment, 10 years).
- When the information search server 21 receives the AR information search request from the digital camera 1, it extracts, from the map data of the database 22, the building identification information of the buildings within the search range obtained from the latitude/longitude information, the azimuth information, and the angle-of-view information added to the AR information search request. Then, the information search server 21 reads the AR information of the buildings corresponding to the building identification information from the building table of the database 22 and transmits it to the digital camera 1. The control unit 11 thereby displays the current AR information of the buildings superimposed on the past image data. With this configuration, the transition between past and current buildings can easily be confirmed.
- The control unit 11 may also acquire the AR information for current or past time information (for example, 10 years before the current time) and display it on the display unit 17. The user can thus view the AR information corresponding to the image data in time series from the present to the past.
- By adopting a configuration in which a plurality of user IDs can be added, the disclosure of the AR information can be limited to a plurality of friends (users).
- Further, a user ID may be set for each piece of building identification information, so that the AR information is managed for each building.
- In this case, the control unit 11 is configured to prompt for a user ID for each building. If no user ID is set, the display is open; that is, the persons (users) who can view it are not limited, and anyone can see it. Only when a user ID has been added to the AR information table corresponding to the selected image data does the control unit 11 display a field for inputting the user ID and icons showing alphabetic and numeric keys on the display unit 17.
- The AR information display combinations using the user ID are the following five types.
- (1) A composite image of the current (or past) image data and the AR information, to which a user ID is added: a limited display viewable only by the specified users
- (2) A composite image of the current (or past) image data and the AR information, with no user ID added: an open display viewable by any user
- (3) A composite image of the current (or past) image data and the current AR information of the buildings in the search range, to which a user ID is added: a limited display viewable only by the specified users
- (4) A composite image of the current (or past) image data and the current AR information of the buildings in the search range, with no user ID added: an open display viewable by any user
- (5) Display of the current (or past) image data only (AR information not displayed)
- Moreover, the control unit 11 can switch the display of the AR information using the user ID by a signal based on selection of the AR user switching icon ARI4.
- FIG. 13 is a flowchart showing an operation example of an information search system according to another embodiment of the present invention.
- When the AR information acquisition mode icon is touched, the control unit 11 detects the touch and sets the digital camera 1 to the AR information acquisition mode.
- the control unit 11 causes the display unit 17 to display the image data continuously supplied from the imaging unit 13 (step S31).
- Next, the control unit 11 reads the latitude/longitude information (position information) indicating the coordinate values of latitude and longitude and the azimuth information indicating the azimuth angle of the optical axis of the digital camera 1 from the GPS 14 and the azimuth sensor 16, respectively (step S32).
- At this time, the control unit 11 reads the angle-of-view information corresponding to the current focal length and magnification from the table stored in the storage unit 15, which associates angle-of-view information with each combination of focal length and magnification. The angle-of-view information of the digital camera 1 at the time the information acquisition request is made to the information search system 2 is thereby obtained.
- Then, the control unit 11 adds the camera identification information, the latitude/longitude information, the azimuth information, and the angle-of-view information of the digital camera 1 to an AR information acquisition request and transmits it to the information search system 2 (step S33).
- Next, the information search server 21 obtains a search range for searching for buildings from the latitude/longitude information, the azimuth information, and the angle-of-view information supplied from the digital camera 1, searches the map data in the database 22 for buildings within that search range, and reads the AR information from the building table using the building identification information of the buildings found within the search range (step S34). Then, the information search server 21 transmits the read AR information, together with the building identification information of that AR information, to the digital camera 1 corresponding to the camera identification information.
- The control unit 11 calculates the coordinate position of each building in the display coordinates of the image data on the display unit 17 from its own latitude/longitude information, azimuth information, and angle-of-view information, the search distance (added to the AR information transmitted from the information search system 2), and the latitude/longitude information of each building included in the supplied AR information. After displaying the image data, the control unit 11 displays the AR information of each building in the image data as tags (step S35).
- Next, the control unit 11 determines whether the azimuth angle supplied from the azimuth sensor 16 has changed by a predetermined angle (step S36). That is, the control unit 11 obtains the difference between the azimuth angle supplied from the azimuth sensor 16 and the azimuth angle stored in the azimuth angle storage unit, and compares this difference with a preset angle (angle threshold). When the difference exceeds (or is equal to or greater than) the angle threshold, the control unit 11 determines that the azimuth angle has changed and returns the process to step S31. When the difference does not exceed (or is less than) the threshold, the control unit 11 determines that the azimuth angle has not changed and advances the process to step S37.
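- Step S36 reduces to a wrap-around-safe comparison of azimuth angles against the angle threshold; the 10-degree default below is an assumed value, not one specified in the embodiment.

```python
def azimuth_changed(current_deg, stored_deg, threshold_deg=10.0):
    """Step S36 sketch: compare the azimuth difference (handling the
    359 -> 0 degree wrap-around) with the preset angle threshold."""
    diff = abs((current_deg - stored_deg + 180.0) % 360.0 - 180.0)
    return diff > threshold_deg
```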
- the control unit 11 determines whether or not a hold control signal is supplied from the hold control unit 31 (step S37). At this time, when the hold control signal is input, the control unit 11 advances the process to step S38. On the other hand, when the hold control signal is not input, the control unit 11 returns the process to step S36.
- Here, the hold control unit 31 compares the acceleration data from the acceleration sensor with a preset acceleration threshold. When the acceleration data supplied from the acceleration sensor exceeds (or is equal to or greater than) the threshold, the hold control unit 31 generates a hold control signal and transmits it to the control unit 11. On the other hand, when the acceleration data supplied from the acceleration sensor is equal to or less than (or less than) the threshold, the hold control unit 31 does not transmit a hold control signal to the control unit 11.
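- The acceleration test performed by the hold control unit 31 can be sketched as follows; the three-axis magnitude measure and the threshold value are assumptions for illustration.

```python
def hold_signal(accel_data, threshold=15.0):
    """Emit a hold control signal when the magnitude of the three-axis
    acceleration exceeds the preset threshold (units and value assumed)."""
    ax, ay, az = accel_data
    return (ax * ax + ay * ay + az * az) ** 0.5 > threshold
```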
- When the hold control signal is input, the control unit 11 writes the AR information table into the AR information storage unit 30, and writes and stores the image data in the storage unit 15 (step S38).
- the control unit 11 advances (returns) the process to step S36.
- The flowchart of FIG. 13 is repeated regardless of whether the AR information is displayed.
- With this configuration, even after the user changes the imaging direction of the digital camera 1, that is, turns it in a direction different from the direction in which the AR information was acquired, and moves the display unit 17 to the position where it is easiest to see, the AR information can be confirmed in the image data from the time the AR information was acquired. For example, the user can hold the AR information of the buildings in the north direction from the current position and, after turning the digital camera 1 in a direction different from north (for example, toward the ground or toward the east), confirm the AR information of the buildings in the held image data.
- the digital camera 1 according to the present embodiment can perform the above-described processing even when it cannot communicate with the wireless base station 3 and is offline.
- the navigation function of the digital camera 1 will be described.
- The following description assumes a state in which the image data and the AR information tags are displayed; the same processing can be performed in the AR display confirmation mode after the AR information has been held.
- the control unit 11 detects that the navigation icon is touched by a touch sensor, and starts a navigation application (navigation unit).
- the control unit 11 displays on the display unit 17 a display that prompts the user to touch the tag of the building for which guidance is requested.
- Then, the control unit 11 detects the building identification information corresponding to the touched building tag, and reads the AR information table corresponding to the image identification information of the displayed image data from the AR information storage unit 30.
- Then, the control unit 11 reads the latitude/longitude information of the building from the AR information corresponding to the detected building identification information in the AR information table, and transmits the latitude/longitude information of the current position of the digital camera 1 and the latitude/longitude information of the building, added to a navigation request, to the information search system 2.
- Upon receiving the navigation request, the information search server 21 searches the map data in the database 22 for the shortest route (or a plurality of routes) between the added latitude/longitude information of the digital camera 1 and the latitude/longitude information of the building. That is, the information search server 21 extracts from the map data the intersection closest to the coordinate values of the position of the digital camera 1 indicated by its latitude/longitude information, and sets it as the start intersection. Similarly, the information search server 21 extracts from the map data the intersection closest to the coordinate values of the position of the building indicated by its latitude/longitude information, and sets it as the end intersection.
- Then, the information search server 21 generates combinations of road routes connecting the start intersection to the end intersection from the road network of the map data, and selects the route whose total distance is minimal among the combinations, extracting the combination of road routes with the minimum total distance by a graph-theoretic algorithm that efficiently solves the well-known shortest path problem, such as Dijkstra's method.
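- Dijkstra's method mentioned above can be sketched over such a road network as follows; this is a standard textbook implementation for illustration, not the server's actual code.

```python
import heapq

def shortest_route(road_network, start, end):
    """Dijkstra's method: road_network maps an intersection to a list of
    (neighbor, distance_m) pairs. Returns (total distance, route)."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == end:                      # end intersection reached
            path = [end]
            while path[-1] != start:
                path.append(prev[path[-1]])
            return d, path[::-1]
        if d > dist.get(node, float("inf")):  # stale heap entry
            continue
        for nbr, w in road_network.get(node, []):
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    return float("inf"), []                   # no route found
```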
- the information retrieval server 21 transmits image data of a map indicating a road route from the start intersection to the end intersection to the digital camera 1 that has made the navigation request.
- At this time, the information search server 21 generates the map image data with the display color of the guide route (the road route connecting the start intersection and the end intersection) different from that of the other roads.
- The control unit 11 changes the received map image data to a size that can be displayed on its own display unit 17, and displays the changed map image data on the display unit 17. The control unit 11 then displays a mark indicating its own current position, based on its latitude/longitude information, on the displayed map image data.
- The image data and the AR information of the buildings in the image data may be transmitted from the digital camera 1 to another user's digital camera so that the information is shared with the other user. Further, the image data and the AR information of the buildings in the image data may be uploaded from the digital camera 1 to a website such as an SNS (Social Networking Service) so that the information is shared with many other digital cameras. In this case, by adding a user ID to the AR information table, only users who know the user ID can view or add the AR information corresponding to the image data on this SNS.
- the SNS server has an AR information sharing database that stores image data and AR information.
- The user can register, from the digital camera 1, the image data that the user wants to register and the AR information corresponding to that image data, with user information added.
- the control unit 11 detects that the icon has been touched by a touch sensor and displays a registration screen on the display unit 17.
- This registration screen also includes a thumbnail image of image data corresponding to the AR information acquired by the user, and an icon indicating alphabetic and numeric keys.
- Then, when the user selects the image data to be registered from the thumbnail images, inputs a user ID for the image data and the user's own name (or nickname), and touches the enter icon, the control unit 11 accesses the SNS server and transmits this image data and AR information together with a registration request.
- Upon receiving this registration request, the SNS server generates a file under the user's own name (or nickname) and writes and stores the image data and AR information requested for registration in the AR information sharing database.
- This file has the same configuration as the AR information table of FIG.
- The control unit 11 detects that the icon has been touched by the touch sensor and displays a reference screen on the display unit 17.
- This reference screen also includes an icon showing alphabetic and numeric keys.
- When the user inputs the name (or nickname) and user ID of the user whose information is to be referenced by touching the icons of the alphabetic and numeric keys, and then touches the enter icon, the control unit 11 accesses the SNS server and transmits this name (or nickname) and user ID together with a reference request for the image data and AR information.
- When the SNS server receives the reference request, it searches the AR information sharing database based on the name (or nickname) added to the reference request and extracts the file having the same name (or nickname). Then, the SNS server reads, from the AR information of that file, the image identification information to which the same user ID as the received user ID has been added and the image data corresponding to that image identification information, and transmits the read AR information and image data to the digital camera 1.
- the control unit 11 displays the input image data and AR information on the display unit 17 as described above.
- The control unit 11 detects that the icon has been touched by the touch sensor and displays a change screen on the display unit 17.
- This change screen includes thumbnail images of the image data corresponding to the read AR information, an information entry field, and icons indicating alphabetic and numeric keys.
- The control unit 11 detects the selected image data with the touch sensor, displays the selected image data on the display unit 17, and displays the AR information tags superimposed on the displayed image data so as to correspond to the position of each building.
- When the user touches and selects the AR information tag to be changed, enters the character string to be added or edited and the user ID in the information entry field by touching the alphabetic and numeric keys, and touches the enter key (enter icon), the control unit 11 transmits the image identification information, the building identification information, and the changed AR information to the SNS server as a change request including the user name and user ID, with the character string input in the information entry field serving as the new AR information of the building.
- the SNS server searches the AR information sharing database based on the name (or nickname) added to the change request, and extracts a file having the same name (or nickname).
- the SNS server determines whether or not the user ID added to the image identification information matches the user ID added to the received change request, and if it does not match, nothing can be changed without doing anything. It notifies the digital camera 1 that there is. On the other hand, when the user ID added to the AR information matches the user ID added to the received change request, the SNS server changes the AR information of the building identification information in the image identification information.
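The permission check the SNS server performs on a change request can be sketched as follows. This is a minimal illustration, assuming a simple in-memory record layout; the function and field names (`apply_change_request`, `user_id`, `buildings`) are hypothetical, not taken from the patent.

```python
# Sketch of the SNS server's change-request check: a building's AR
# information is changed only when the requesting user ID matches the
# user ID stored with the image; otherwise the change is refused.

def apply_change_request(ar_file, image_id, building_id, request_user_id, new_ar_info):
    """Return True and apply the change when the user IDs match;
    return False (camera is notified nothing can be changed) otherwise."""
    entry = ar_file.get(image_id)
    if entry is None or entry["user_id"] != request_user_id:
        return False  # mismatch: make no change at all
    entry["buildings"][building_id] = new_ar_info
    return True

ar_file = {"img001": {"user_id": "alice", "buildings": {"b1": "old info"}}}
ok_mallory = apply_change_request(ar_file, "img001", "b1", "mallory", "hacked")
ok_alice = apply_change_request(ar_file, "img001", "b1", "alice", "new info")
```

A refused request leaves the stored AR information untouched, mirroring the "nothing can be changed" notification above.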
- The AR information hold function of the control unit 11 and the hold control unit 31 of the digital camera 1 in FIG. 9 may be controlled by recording a program for realizing that function on a computer-readable recording medium, and loading the program recorded on the recording medium into a computer system and executing it.
- Likewise, the AR information search may be controlled by recording a program for realizing the AR information search function of the information search server 21 in FIG. 9 on a computer-readable recording medium, and loading the program recorded on the recording medium into a computer system and executing it.
- The "computer system" here includes an OS and hardware such as peripheral devices.
- The "computer system" also includes a homepage providing environment (or display environment) when a WWW system is used.
- The "computer-readable recording medium" refers to portable media such as flexible disks, magneto-optical disks, ROMs, and CD-ROMs, and to storage devices such as hard disks built into computer systems.
- The "computer-readable recording medium" also includes media that hold a program dynamically for a short time, such as a communication line when a program is transmitted via a network such as the Internet or a communication line such as a telephone line, and media that hold a program for a fixed time, such as volatile memory inside a computer system serving as a server or client in that case.
- The program may be one for realizing part of the functions described above, or one that can realize those functions in combination with a program already recorded in a computer system.
Abstract
Description
This application claims priority based on Japanese Patent Application No. 2010-025998, filed in Japan on February 8, 2010, the contents of which are incorporated herein by reference.
In FIG. 1, the information acquisition system includes a digital camera 1, which is an example of an imaging device, and an information search system 2. The digital camera 1 performs data communication with a wireless base station 3 by wireless communication, and thereby exchanges data with the information search system 2 (or an information search server 21 described later) via the wireless base station 3 and an information communication network 4 such as the Internet. The imaging device in this embodiment is a portable terminal with an imaging function, such as the digital camera 1 described above or a camera-equipped mobile phone. The digital camera 1, for example, has a browser function and includes a control unit 11, a transmitting/receiving unit 12, an imaging unit 13, a GPS (Global Positioning System) 14, a storage unit 15, an azimuth sensor 16 (azimuth angle detection unit), a display unit 17, and a timer 18. Wireless communication in this embodiment includes communication by radio waves, light, sound waves, and the like, without using a wire as the transmission path.
The display unit 17 displays the image data being captured by the imaging unit 13, or image data that the user has selected for display and that the control unit 11 has read from the storage unit 15.
As described later, the display unit 17 also displays, under the control of the control unit 11, image data received from the information search system 2 or character data input by the user.
The transmitting/receiving unit 12 communicates with the wireless base station 3 by wireless communication, and exchanges data such as image data, character data, and control signals with the information search system 2 via the information communication network 4.
As in the building table shown in FIG. 2, the database 22 stores building identification information for identifying each building; the building name; building information (address, telephone number, type, image data of the area around the building, and so on); position information such as the latitude and longitude of the building; a description of the building (for a store, information written by the store); and posted information (comments such as evaluations by visiting users, image data posted by users, and the like).
Using the input position information and azimuth information, the information search server 21 searches for the building nearest to the latitude/longitude position indicated by the position information in the direction indicated by the azimuth information, and acquires the building identification information of the found building.
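The nearest-building search in the shooting direction can be sketched as below. This is an illustrative flat-earth approximation under assumed data structures (a dict of building positions and a bearing tolerance `tol_deg`), not the server's actual algorithm.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    # Forward azimuth from point 1 to point 2, in degrees clockwise from north.
    dlon = math.radians(lon2 - lon1)
    lat1, lat2 = math.radians(lat1), math.radians(lat2)
    y = math.sin(dlon) * math.cos(lat2)
    x = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def nearest_in_direction(cam_lat, cam_lon, azimuth_deg, buildings, tol_deg=15.0):
    """Among buildings whose bearing from the camera lies within tol_deg of
    the camera azimuth, return the id of the nearest one (flat approximation)."""
    best_id, best_d2 = None, float("inf")
    for bid, (blat, blon) in buildings.items():
        diff = abs((bearing_deg(cam_lat, cam_lon, blat, blon) - azimuth_deg + 180) % 360 - 180)
        if diff > tol_deg:
            continue  # outside the viewing direction
        d2 = (blat - cam_lat) ** 2 + ((blon - cam_lon) * math.cos(math.radians(cam_lat))) ** 2
        if d2 < best_d2:
            best_id, best_d2 = bid, d2
    return best_id

buildings = {
    "north_near": (35.001, 139.000),
    "north_far": (35.010, 139.000),
    "east": (35.000, 139.001),
}
found = nearest_in_direction(35.0, 139.0, 0.0, buildings)  # facing north
```

In the example, pointing the camera north selects the near northern building rather than the eastern one or the farther northern one.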
The information search server 21 then looks up the building corresponding to the acquired building identification information in the building table, and transmits each piece of information on the found building (building name, building information, and so on) to the digital camera 1. The information search server 21 may transmit this building information selectively, for example according to the required transmission/reception data volume. In that case, the information search server 21 may transmit the remaining information to the digital camera 1 again in response to a predetermined operation (for example, a request from the digital camera 1).
The following description assumes, as an example, that the user is walking through a downtown area looking for a restaurant for dinner.
While wandering the downtown area, the user points the lens of the digital camera 1 at a restaurant whose appearance he or she likes (step S1). On finding a restaurant of interest, the user touches (presses) the browse icon I1 provided in the image display field MP of the digital camera 1, as shown in FIG. 4, whereby the browse icon I1 is selected (step S2).
In step S2 described above, the user may instead press the imaging button B1 of the digital camera 1 and then touch and select the browse icon I1 while viewing the image display field MP in which the captured image of the restaurant is displayed.
After writing the image data and the related information into the storage unit 15, the control unit 11 transmits a search request signal containing camera identification information for identifying the digital camera 1 (identification information given to the user or to the digital camera 1), the position information, and the azimuth information to the information search system 2 via the transmitting/receiving unit 12 (step S5). In steps S3 to S5 described above, the control unit 11 may transmit the search request signal to the information search system 2 via the transmitting/receiving unit 12 without writing the image data and the related information into the storage unit 15.
The control unit 11 thereby detects that the CM icon I2 has been selected, by receiving a signal indicating the selection, and displays commercial information on the restaurant contained in the search result information (menu, number of seats, in-store images, the chef's recommendation of the day, and so on) in the information display field SP via the display unit 17. When this commercial information contains image data such as in-store images, the control unit 11 displays that image data in the information display field SP.
The control unit 11 likewise detects that the posting browse icon I3 has been selected, by receiving a signal indicating the selection, and displays posted information (comments) written by other users contained in the search result information in the information display field SP via the display unit 17. When there are multiple pieces of posted information, the control unit 11 displays them in the information display field SP, for example, in order of posting date.
When, for example, the control unit 11 detects a signal indicating that the information display field SP in FIG. 5 has been touched so as to slide from its left edge to the right, it closes the information display field SP and displays the image of the restaurant in the image display field MP via the display unit 17, as in FIG. 4. Alternatively, the control unit 11 may close the information display field SP or the display of posted information when it detects a signal indicating that the posting browse icon I3 has been selected while posted information is being displayed in the information display field SP.
When the control unit 11 detects a signal indicating that an in-store image or the like displayed in the information display field SP has been touched, it displays an enlarged or reduced version of that image in the information display field SP (or the image display field MP).
The control unit 11 thereby detects that the mail icon I4 has been selected, by receiving a signal indicating the selection, and displays an input field IP as shown in FIG. 6 in part of the image display field MP via the display unit 17. Here, the input field IP consists of a text field T in which the user enters information and a touch-screen keyboard section K. The input field IP is used when the user enters posted information, a store name, a year, and the like, as described later.
When the control unit 11 detects that the end icon E has been selected, by receiving a signal indicating the selection, it terminates the information search processing.
On the other hand, when no signal indicating that the end icon E has been selected is input and the control unit 11 does not detect its selection, the processing returns to step S1, and the imaging and information search processing of the digital camera 1 continues.
If the comparison determines that the building in the surrounding image data and the building in the captured image data are not similar, the server 21 may extract the next nearest building from the map data based on the position information and azimuth information, compare feature quantities again, and evaluate the similarity between the surrounding image data of the newly extracted building and the captured image data.
The following description uses an example in which the user wanders the downtown area of Shinjuku looking for a restaurant for dinner, captures image data of several restaurants, and then chooses from among them a restaurant in which to dine.
When a signal indicating that the imaging button B1 has been pressed is input, the control unit 11 detects the press, assigns image identification information to the image data of the restaurant being captured by the imaging unit 13, and writes the image data and the time of capture into the storage unit 15 in association with this image identification information (step S13).
The control unit 11 then writes into the storage unit 15 the position information acquired by the GPS 14 and the azimuth information acquired by the azimuth sensor 16 at the time the imaging button B1 was pressed, in association with the image identification information (step S14).
At this time, when a signal indicating that the read icon R has been selected is input, the control unit 11 detects the selection and advances the processing to step S16 to begin searching for information. The read icon R may instead be a read button (not shown); in that case, the read button may be provided on the body of the digital camera 1 like the imaging button B1. On the other hand, when no signal indicating that the read icon R has been selected is input, or when no image is stored in the storage unit 15, the control unit 11 returns the processing to step S11 to perform new imaging processing.
When the thumbnail images do not fit on one page (or when they are displayed one at a time), touching the image displayed in the image display field MP so as to slide it in a predetermined direction displays the thumbnail images of the previous or next page in the image display field MP. At this time, the control unit 11 detects a signal indicating that the image displayed in the image display field MP has been touched so as to slide in a predetermined direction, and displays the thumbnail images of the previous or next page in the image display field MP via the display unit 17.
Thus, when a signal indicating that the browse icon I1 has been pressed is input, the control unit 11 reads from the storage unit 15 the position information and azimuth information corresponding to the image identification information of the selected image data, and transmits a search request signal containing the camera identification information, position information, and azimuth information to the information search system 2 via the transmitting/receiving unit 12 (step S17).
The following description shows, as an example, a state in which the user, dining in Harajuku, confirms information on a restaurant heard from an acquaintance. When the user selects the search icon I5 by touching it, the peripheral circuit outputs a signal indicating the selection of the search icon I5 to the control unit 11.
The control unit 11 thereby detects that the search icon I5 has been selected, by receiving the signal indicating the selection, and displays an input field IP as shown in FIG. 6 in part of the image display field MP via the display unit 17.
The user then writes the store name of the restaurant to be searched into the text field T of the input field IP using the touch-screen keyboard section K (step S21), and touches the search icon I5 again (step S22).
Next, on receiving the search request signal, the server 21 reads the information on the building (store) corresponding to the store name (the store information, i.e., each item in the table of FIG. 2) from the building table in the database 22 (step S24).
When the store information is received via the transmitting/receiving unit 12, the control unit 11 displays, via the display unit 17, the image data of the store's surroundings contained in the building information in the image display field MP of FIG. 5, and displays the building information in the information display field SP (step S26). For example, the type of restaurant (Chinese, Japanese, French, Italian, and so on), its telephone number, and the like are displayed.
When the user selects the CM icon I2 by touching it, the control unit 11 detects the selection and displays commercial information on the restaurant contained in the search result information (menu, the chef's recommendation of the day, and so on) in the information display field SP via the display unit 17.
When a signal indicating that the posting browse icon I3 has been selected is input, the control unit 11 detects the selection and displays posted information written by other users contained in the search result information in the information display field SP via the display unit 17.
At this time, when a signal indicating that the user has selected the end icon E by touching it is input from the peripheral circuit, the control unit 11 terminates the browse processing. On the other hand, when no such signal is input, the control unit 11 returns the processing to step S21 and continues searching for store information.
When the buildings described above are stores (including restaurants and companies), the building table of the database 22 may also store discount coupon data for each store.
The server 21 then adds the discount coupon information to the search result information and transmits it to the digital camera 1.
When the user dines or shops using the discount coupon information, the server 21 detects whether the user has dined or shopped from whether the discount coupon information has been used. For example, when the discount coupon information is a barcode (including a two-dimensional barcode) displayed in the information display field SP of the digital camera 1, a reader at the store reads the barcode and transmits to the information search system 2 usage information indicating that the discount coupon information has been used, together with the camera identification information transmitted from the digital camera 1 and the price of the purchase (or meal).
The server 21 then writes, as history in the billing table corresponding to the building identification information, the amount spent and a charge corresponding to the usage stage of the information, and adds the new charge to the accumulated value to update it.
The server 21 also accrues a preset number of points for the user who transmitted the posted information, in the user registration table, in association with the camera identification information. These points can be used in place of money, together with discount coupon information, when paying.
The control unit 11 thereby transmits a posted-information transmission request together with the posting identification information to the information search system 2.
Here, the server 21 assigns posting identification information to each piece of posted information for identification, and writes and stores it in the user registration table in association with the camera identification information.
In this embodiment, when the control unit 11 receives character data and image data from the information search system 2 via the transmitting/receiving unit 12, it may display the character data in the information display field SP and the image data in the image display field MP.
The server 21 then transmits to the digital camera 1 a display list for the initial nickname-and-title display stage, in which the user selects posted information, ranked so that the nicknames and titles of users whose posted information has been selected most often, that is, referenced most often, appear higher in the information display field SP.
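Building the ranked display list amounts to sorting posts by how often they have been referenced. The sketch below is illustrative only; the record fields (`nickname`, `title`, `ref_count`) are assumed, not the server's actual schema.

```python
# Build the display list described above: posted information is ordered so
# that the most-referenced users' nicknames and titles appear first.
def build_display_list(posts):
    """posts: list of dicts with 'nickname', 'title', 'ref_count'.
    Returns (nickname, title) pairs, most-referenced first."""
    ranked = sorted(posts, key=lambda p: p["ref_count"], reverse=True)
    return [(p["nickname"], p["title"]) for p in ranked]

posts = [
    {"nickname": "taro", "title": "Great ramen", "ref_count": 3},
    {"nickname": "hana", "title": "Nice view", "ref_count": 12},
    {"nickname": "ken", "title": "Cheap lunch", "ref_count": 7},
]
display_list = build_display_list(posts)
```

The camera would then show the pairs in list order, top entries first.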
The control unit 11 thereby displays the nicknames and titles sequentially in the information display field SP according to the display list.
In this embodiment, the database 22 may also have a past image table in which image data of the buildings and scenery at each latitude/longitude, photographed year by year, is stored in association with each latitude/longitude.
When the user selects the past image icon I6 by touching it, the peripheral circuit outputs a signal indicating the selection of the past image icon I6 to the control unit 11.
The control unit 11 thereby detects that the past image icon I6 has been selected, by receiving the signal indicating the selection, and displays an input field IP as shown in FIG. 6 in part of the image display field MP via the display unit 17.
The user then enters a year (for example, a calendar year) in the text field T using the touch-screen keyboard section K, and touches the past image icon I6 (or the browse icon I1).
After reading the year data, the control unit 11 transmits a past image search request to the information search system 2, together with the read year, the position information, the azimuth information, and the camera identification information.
The user can thereby obtain information such as what buildings existed in each past year, or whether there was nothing there but scenery.
When the year being searched is from an era with no photographs, landscape paintings made in that era, or CG (Computer Graphics) image data created from imagination, may be stored in the past image table in association with the year, in place of photographic image data.
As already described, in the configuration of FIG. 1, the control unit 11 adds image identification information to the image data of the subject captured by the imaging unit 13, and writes the image data into the storage unit 15 in order of capture, together with the latitude/longitude information (position information) indicating the position of the digital camera 1 itself obtained from the GPS 14 and the azimuth information consisting of the azimuth angle, obtained by the azimuth sensor 16 (azimuth angle detection unit), indicating the direction of the optical axis of the digital camera 1 (the shooting direction).
To the configuration of FIG. 1, as shown in FIG. 9, a hold control unit 31 and an AR information storage unit 30 are newly added as functions for acquiring and storing AR information. FIG. 9 is a block diagram showing a configuration example of an information acquisition system having the function of acquiring and storing AR information.
The control unit 11 may be configured so that the user can arbitrarily set which pieces of the AR information are used as tags (augmented-reality objects) displayed superimposed on the image data of the through image (the image capturing the subject).
Here, the through image is image data that the imaging unit 13 continuously outputs to the control unit 11 as the image formed on the imaging element, and that the control unit 11 sequentially displays on the display unit 17.
Here, the control unit 11 stores in advance in the storage unit 15 a table showing the correspondence between focal length, magnification, and the like and the angle of view, and obtains the angle-of-view information of an image from the focal length, magnification, and the like at the time the image data was acquired.
The digital camera 1 in this embodiment has an AR information acquisition mode for acquiring AR information and a normal imaging mode. When the AR information acquisition mode button (AR information acquisition button ARB1) provided on the housing of the digital camera 1 is pressed, or the AR information acquisition mode icon (AR information acquisition icon ARI1) displayed on the display unit 17 is touched, the control unit 11 detects the button press from a switch signal or the icon touch with a touch sensor, and switches to either the AR information acquisition mode or the normal imaging mode.
Here, the control unit 11 detects the azimuth angle at predetermined intervals; when the azimuth angle changes beyond a preset angle from the azimuth angle already stored in the azimuth angle storage of the storage unit 15, it detects this as a change in azimuth angle and newly stores the azimuth angle at that point in the azimuth angle storage.
The information search server 21 also obtains the search distance for AR information by reading the search distance corresponding to the angle-of-view information from the search distance table of the database 22, in which angle-of-view information and the search distance corresponding to each angle of view are recorded.
The information search server 21 then obtains the search range shown in FIG. 12 from the coordinate position based on the latitude/longitude information, the azimuth angle of the optical axis of the lens of the digital camera 1, and the search distance, and reads the building identification information within this search range from the map data of the database 22.
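A fan-shaped search range determined by position, lens azimuth, angle of view, and search distance can be tested per building as sketched below. This is a flat-earth approximation for illustration; the constant and function names are assumptions, and the patent's FIG. 12 geometry may differ in detail.

```python
import math

EARTH_M_PER_DEG_LAT = 111_320.0  # rough meters per degree of latitude

def in_search_range(cam_lat, cam_lon, azimuth_deg, view_angle_deg,
                    search_dist_m, blat, blon):
    """True when the building lies inside the fan-shaped search range:
    within search_dist_m of the camera and within half the angle of view
    on either side of the lens azimuth (small-area flat approximation)."""
    dy = (blat - cam_lat) * EARTH_M_PER_DEG_LAT
    dx = (blon - cam_lon) * EARTH_M_PER_DEG_LAT * math.cos(math.radians(cam_lat))
    if math.hypot(dx, dy) > search_dist_m:
        return False  # beyond the search distance
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0
    diff = abs((bearing - azimuth_deg + 180) % 360 - 180)
    return diff <= view_angle_deg / 2.0

# Facing north with a 60-degree angle of view and a 500 m search distance:
north_300m = in_search_range(35.0, 139.0, 0.0, 60.0, 500.0,
                             35.0 + 300.0 / EARTH_M_PER_DEG_LAT, 139.0)
north_1km = in_search_range(35.0, 139.0, 0.0, 60.0, 500.0,
                            35.0 + 1000.0 / EARTH_M_PER_DEG_LAT, 139.0)
```

The server would keep only buildings for which this test is true, then look up their AR information.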
The information search server 21 also reads the AR information of the buildings from the building table in the database 22, using the read building identification information.
The information search server 21 transmits the read AR information, together with the corresponding building identification information, to the digital camera 1.
When the control unit 11 receives a hold control signal from the hold control unit 31, it records in the AR information storage unit 30, for each piece of image identification information of the image data at the time of holding, the latitude/longitude information, azimuth information, and angle-of-view information at the time of holding, together with the AR information transmitted from the information search system 2 in response to the AR information acquisition request and the building identification information identifying the building of this AR information.
As described above, the control unit 11 writes and stores in the AR information storage unit 30, as an AR information table generated for each piece of image identification information, the building identification information (a URL indicating the location where the building's AR information is stored), the AR information of the building (including latitude/longitude information), and the latitude/longitude information, azimuth information, and angle-of-view information of the digital camera 1 itself at the time the hold control signal was input and the hold took place.
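The per-image AR information table described above can be modeled as a small record. The field names below are illustrative stand-ins for the patent's items (image identification information, camera position, azimuth, angle of view, optional user ID, and per-building AR information keyed by building identification information).

```python
# Minimal sketch of the AR information table held per image in the
# AR information storage unit; field names are assumptions.
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class BuildingAR:
    ar_url: str    # building identification info: URL where AR info is stored
    ar_info: dict  # the AR information itself, including the building's lat/lon

@dataclass
class ARInfoTable:
    image_id: str          # image identification information
    camera_lat: float      # camera latitude at the moment of hold
    camera_lon: float      # camera longitude at the moment of hold
    azimuth_deg: float     # azimuth information
    view_angle_deg: float  # angle-of-view information
    user_id: Optional[str] = None  # user ID, when one has been added
    buildings: Dict[str, BuildingAR] = field(default_factory=dict)

table = ARInfoTable("img001", 35.681, 139.767, 12.0, 60.0)
table.buildings["b1"] = BuildingAR(
    "http://example.com/ar/b1",
    {"name": "Store A", "lat": 35.682, "lon": 139.768})
```

One such table per image identification information makes the later lookups (by user ID, by date, by image) straightforward.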
When the hold control signal is input, the control unit 11 also writes into the storage unit 15 the image data formed on the imaging element of the imaging unit 13 at that moment, in association with the image identification information assigned to that image data.
When the AR information acquisition button is pressed, or when the icon indicating AR information acquisition is touched, the hold control unit 31 detects that a signal for recording the image data and AR information has been input, and outputs a hold control signal for performing hold control to the control unit 11. In this way, the hold control unit 31 detects, from the user's motion (action or operation) on the digital camera 1, that a signal for recording the image data and AR information has been input, and outputs a hold control signal for performing hold control to the control unit 11.
In acceleration detection using the acceleration sensor, when acceleration data at or above a preset threshold is supplied from the acceleration sensor, the hold control unit 31 outputs a hold control signal to the control unit 11 as the control signal for the hold instruction. In this way, the hold control unit 31 detects, from the motion of the terminal (for example, the digital camera 1), that a signal for recording the image data and AR information has been input, and outputs a hold control signal for performing hold control to the control unit 11.
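The acceleration-triggered hold can be sketched as a simple threshold test on the sensed acceleration magnitude. This is an illustrative sketch; the class, callback, and threshold value are assumptions, not the device's actual firmware.

```python
# Hold control by acceleration: when the magnitude of the sensed
# acceleration meets or exceeds a preset threshold, a hold control
# signal is emitted toward the control unit.
import math

class HoldController:
    def __init__(self, threshold, emit_hold):
        self.threshold = threshold  # preset acceleration threshold
        self.emit_hold = emit_hold  # callback standing in for the control unit

    def on_acceleration(self, ax, ay, az):
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude >= self.threshold:
            self.emit_hold()  # output the hold control signal
            return True
        return False          # below threshold: no signal is sent

signals = []
hc = HoldController(threshold=15.0, emit_hold=lambda: signals.append("HOLD"))
hc.on_acceleration(0.0, 0.0, 9.8)   # ordinary gravity: no hold
hc.on_acceleration(12.0, 9.0, 6.0)  # sharp shake: hold signal emitted
```

The same shape applies to the image-pattern and elevation-angle triggers described next: compare a sensed change against a preset threshold and emit the signal only when it is exceeded.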
In detecting the change in the image pattern, when the hold control unit 31 detects a change in the image pattern at or above a preset threshold, it outputs a hold control signal to the control unit 11 as the control signal for performing hold control.
In detecting elevation angle information using the elevation angle sensor, when a change in elevation angle at or above a preset threshold is supplied from the elevation angle sensor, the hold control unit 31 outputs a hold control signal to the control unit 11 as the control signal for performing hold control. In this way, the hold control unit 31 detects, from the motion of the terminal (for example, the digital camera 1), that a signal for recording the image data and AR information has been input, and outputs a hold control signal for performing hold control to the control unit 11.
For this purpose, the control unit 11 writes and stores (caches) in the buffer section of the AR information storage unit 30, for a predetermined time, for example one second, the image data corresponding to the AR information acquisition request, the building identification information corresponding to this image data, and the AR information, latitude/longitude information, azimuth information, and angle-of-view information of the building.
The control unit 11 also writes and stores the image data in the storage unit 15 together with its image identification information.
The control unit 11 and hold control unit 31 may also be configured to acquire AR information or perform AR information hold processing even when set to the non-display mode in which the AR display is hidden.
At this time, as described above, the user can select whether to display or hide the AR information.
With the configuration described above, the user can check AR information while viewing image data of the direction of interest, even without pointing the imaging direction of the digital camera 1 in that direction; for example, even with the digital camera 1 placed on a table while the user views the screen of the display unit 17.
When the user touches the icon showing alphabetic and numeric keys, the control unit 11 detects the corresponding character data, writes it into an internal buffer, and displays the character data in the internal buffer in the user ID entry field.
Then, by detecting that the enter key on the icon showing alphabetic and numeric keys has been touched, the control unit 11 fixes the character string in the internal buffer as the user ID.
The control unit 11 adds the user ID stored in the internal buffer to the AR information table and stores it.
By detecting that the icon showing alphabetic and numeric keys has been touched, the control unit 11 detects the corresponding character data, writes it into the internal buffer, and displays the character data in the internal buffer in the user ID entry field.
Then, by detecting that the enter key on the icon showing alphabetic and numeric keys has been touched, the control unit 11 compares the user ID corresponding to the character string in the internal buffer with the user ID added to the AR information table corresponding to the image identification information of the selected image data.
When the user ID entered by the user matches the user ID added to the AR information table, the control unit 11 displays on the display unit 17 an image (AR image) combining the image data and the AR information; when it differs from the user ID added to the AR information table, only the image data is displayed.
With this configuration, the digital camera 1 can display AR information while specifying the user (or user ID) for each image.
When the user touches his or her own user ID in this list, the control unit 11 may be configured to read from the AR information storage unit 30 the AR information tables to which the same user ID as the touched one has been added, read from the storage unit 15 the image data having the same image identification information as in those AR information tables, and display it on the display unit 17 as thumbnail images. Further, while the list is displayed, the control unit 11 may be configured, based on a signal selecting another user, to read from the AR information storage unit 30 the AR information tables to which that other user's user ID has been added, read from the storage unit 15 the image data having the same image identification information as in those AR information tables, and display it on the display unit 17 as thumbnail images.
In this way, when the user wants to display, for example, image data, AR information, and AR-related information from ten years ago on the display unit 17, pressing the AR date search icon (AR date search icon ARI3) displayed on the display unit 17 causes the control unit 11 to detect the press with a sensor and display an input field for entering a date on the display unit 17, together with the icon showing alphabetic and numeric keys.
The control unit 11 then reads the image identification information corresponding to the extracted AR information tables, reads the image data of this image identification information from the storage unit 15, and displays it on the display unit 17 as thumbnail images.
When the user touches a thumbnail image displayed on the display unit 17, the control unit 11 detects the touch with the sensor of the display unit 17 and identifies the image identification information of the touched and selected image data.
Furthermore, the control unit 11 reads the AR information table corresponding to the image identification information, and displays on the display unit 17 a composite image (AR image) in which the AR information and AR-related information are superimposed on the previously displayed image data.
With the configuration described above, the user can view, via the display unit 17, the image data from ten years ago and the AR information from ten years ago of the buildings in that image data.
The control unit 11 then transmits an AR information search request to the information search system 2, appending from the AR information table the latitude/longitude information, azimuth information, and angle-of-view information from when the AR information was acquired during the period from the present into the past (ten years, in this embodiment). Alternatively, the control unit 11 may acquire from the information search system 2 all the AR information for the period from the present into the past (ten years, in this embodiment) in the AR information table.
The information search server 21 then reads the AR information of the buildings corresponding to the building identification information from the building table of the database 22 and transmits it to the digital camera 1.
The control unit 11 thereby displays the AR information of the buildings currently at that location superimposed on the past image data.
With this configuration, the transition of buildings between past and present can easily be confirmed.
When AR information from the time the subject was photographed (for example, ten years ago) is stored in the database 22 of the information search server 21 in order of date and time or the like, the control unit 11 may acquire the AR information for a given time based on current or past time information (for example, the present, ten years ago, and so on) and display it on the display unit 17. The user can view the AR information corresponding to the image data in time series, going back from the present into the past.
When no user ID has been set, the display is open: viewers (users) are not restricted, and anyone (any user) can view it.
Only when a user ID has been added to the AR information table corresponding to the selected image data does the control unit 11 display, on the display unit 17, a field for entering a user ID and the icon showing alphabetic and numeric keys.
(1) A composite image of current (or past) image data and the AR information of buildings in that current (or past) image data, with a user ID added, so that display is limited to permitted users.
(2) A composite image of current (or past) image data and the AR information of buildings in that current (or past) image data, with no user ID added, so that the display is open to all users.
(3) A composite image of current (or past) image data and the AR information of current buildings within the search range of that current (or past) image data, with a user ID added, so that display is limited to permitted users.
(4) A composite image of current (or past) image data and the AR information of current buildings within the search range of that current (or past) image data, with no user ID added, so that the display is open to all users.
(5) Display of current (or past) image data only (AR information hidden).
The control unit 11 can also switch the AR information display using the above user IDs, based on a signal from selection of the AR user switching icon ARI4.
When the user touches the AR information acquisition mode icon (AR information acquisition icon ARI1) displayed on the display unit 17, the control unit 11 detects the touch and puts the digital camera 1 into the AR information acquisition mode.
The control unit 11 causes the display unit 17 to display the image data continuously supplied from the imaging unit 13 (step S31).
At this time, the control unit 11 obtains the angle-of-view information of the digital camera 1 for issuing an AR information acquisition request to the information search system 2 by reading the angle-of-view information corresponding to the focal length and magnification from the table stored in the storage unit 15, in which angle-of-view information is associated with each combination of focal length and magnification.
The information search server 21 then transmits the read AR information, together with the building identification information of this AR information, to the digital camera 1 corresponding to the camera identification information.
At this time, when the difference changes beyond (or at or above) a predetermined angle (angle threshold), the control unit 11 determines that the azimuth angle has changed and returns the processing to step S31; when the difference has not changed beyond (or at or above) the predetermined angle (threshold), it determines that the azimuth angle has not changed and advances the processing to step S37.
At this time, when the hold control signal is input, the control unit 11 advances the processing to step S38; when no hold control signal is input, it returns the processing to step S36.
Here, the hold control unit 31 compares, for example, the acceleration data from the acceleration sensor with a preset acceleration data threshold; when the acceleration data supplied from the acceleration sensor exceeds (or is at or above) the threshold, it generates a hold control signal and transmits it to the control unit 11. On the other hand, when the acceleration data supplied from the acceleration sensor is at or below (or below) the threshold, the hold control unit 31 does not transmit a hold control signal to the control unit 11.
In the AR information acquisition mode, the flowchart of FIG. 13 is repeated whether the AR information is displayed or hidden.
For example, the user can hold the AR information of buildings in image data captured from the north exit of Tokyo Station, and then check the AR information of the buildings in that image data at a restaurant table. The user can also hold the AR information of buildings to the north of the current position, and then check the AR information of the buildings in the held image data with the digital camera 1 pointed in a different direction (for example, toward the ground or to the east). Furthermore, the digital camera 1 in this embodiment can perform the processing described above even offline, when it cannot communicate with the wireless base station 3.
When the navigation icon displayed on the display unit 17 is touched, the control unit 11 detects the touch with a touch sensor and starts the navigation application (navigation unit).
When the user touches the tag of the building for which guidance is requested, the control unit 11 detects the building identification information corresponding to the touched building tag and reads from the AR information storage unit 30 the AR information table corresponding to the image identification information of the displayed image data.
The control unit 11 then reads the latitude/longitude information of the building from the AR information corresponding to the detected building identification information in this AR information table, appends the latitude/longitude information of the current position of the digital camera 1 and the latitude/longitude information of the building, and transmits these to the information search system 2 as a navigation request.
That is, the information search server 21 extracts from the map data the intersection nearest to the coordinates of the position of the digital camera 1 indicated by the latitude/longitude information, and sets it as the start intersection.
Similarly, the information search server 21 extracts from the map data the intersection nearest to the coordinates of the building's position indicated by its latitude/longitude information, and sets it as the end intersection.
The information search server 21 then transmits map image data showing the road route from the start intersection to the end intersection to the digital camera 1 that made the navigation request. Here, the information search server 21 generates the map image data with the guidance route (the road route connecting the start and end intersections) displayed in a color different from other roads.
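Choosing the start and end intersections amounts to snapping each endpoint of the route to the nearest intersection in the map data. The sketch below uses a flat-earth distance and an assumed `{id: (lat, lon)}` layout for the map data; it is illustrative only.

```python
# Snap a position to the nearest intersection in the map data
# (flat-earth distance with a longitude correction for latitude).
import math

def nearest_intersection(lat, lon, intersections):
    """intersections: {id: (lat, lon)}. Return the id of the nearest one."""
    return min(
        intersections,
        key=lambda i: math.hypot(
            intersections[i][0] - lat,
            (intersections[i][1] - lon) * math.cos(math.radians(lat))),
    )

intersections = {"A": (35.000, 139.000), "B": (35.010, 139.000), "C": (35.000, 139.012)}
start = nearest_intersection(35.001, 139.001, intersections)  # camera position
end = nearest_intersection(35.009, 139.001, intersections)    # building position
```

The server would then compute and draw the road route between `start` and `end`.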
The control unit 11 then places a mark indicating its own position superimposed on the map image data displayed on the display unit 17, at the position corresponding to the latitude/longitude acquired by the GPS 14. The user can thereby receive the guidance service from the start intersection to the end intersection while confirming the movement of his or her own mark.
Furthermore, the digital camera 1 may upload image data and the AR information of buildings in that image data to a website such as an SNS (Social Networking Service), so that the information can be shared with many other digital cameras. In this case, by adding a user ID to the AR information table, only users who know the user ID can view or add AR information corresponding to the image data on this SNS.
From the digital camera 1, the user can add user information to the image data he or she wants to register and to the AR information corresponding to that image data, and register them.
Here, when the user touches the user registration icon displayed on the display unit 17, the control unit 11 detects the touch with the touch sensor and displays a registration screen on the display unit 17. This registration screen includes thumbnail images of the image data corresponding to the AR information acquired by the user, and the icon showing alphabetic and numeric keys.
On receiving this registration request, the SNS server generates a file under the user's own name (or nickname) and writes and stores the image data and AR information for which registration was requested in the AR information sharing database. This file has the same structure as the AR information table of FIG. 11.
Next, on this registration screen, the user enters the name (or nickname) and user ID of the user to be referenced by touching the icon showing alphabetic and numeric keys, and touches the enter icon, whereupon the control unit 11 accesses the SNS server and transmits this name (or nickname) and user ID together with a reference request for image data and AR information.
The SNS server then reads, from the AR information of this file, the image identification information to which the same user ID as the received user ID has been added, and the image data corresponding to this image identification information, and transmits the read AR information and image data to the digital camera 1.
As already described, the control unit 11 displays the received image data and AR information on the display unit 17.
Then, when the user selects from the thumbnail images the image data of the AR information to be changed on this change screen, the control unit 11 detects the selected image data with the touch sensor, displays the selected image data on the display unit 17, and displays the AR information tags superimposed on the displayed image data, corresponding to the position of each building.
On receiving the change request, the SNS server searches the AR information sharing database using the name (or nickname) appended to the change request and extracts the file with the same name (or nickname).
On the other hand, when the user ID added to the AR information matches the user ID appended to the received change request, the SNS server changes the AR information of the building identification information in that image identification information.
The AR information search may also be controlled by recording a program for realizing the AR information search function of the information search server 21 in FIG. 9 on a computer-readable recording medium, and loading the program recorded on this recording medium into a computer system and executing it.
The "computer system" here includes an OS and hardware such as peripheral devices.
"Computer-readable recording medium" refers to portable media such as flexible disks, magneto-optical disks, ROMs, and CD-ROMs, and to storage devices such as hard disks built into computer systems. Furthermore, "computer-readable recording medium" includes media that hold a program dynamically for a short time, such as a communication line when a program is transmitted via a network such as the Internet or a communication line such as a telephone line, and media that hold a program for a fixed time, such as volatile memory inside a computer system serving as a server or client in that case. The program may be one for realizing part of the functions described above, or one that can realize those functions in combination with a program already recorded in the computer system.
Claims (16)
- 1. An imaging device comprising: an imaging unit that images a subject; a position information acquisition unit that acquires position information of the imaging position; a control unit that acquires information on the subject based on the position information and displays the image data of the subject and the information on the subject on a display unit; and a hold control unit that outputs to the control unit a hold control signal for holding the image data of the subject and the information on the subject.
- 2. The imaging device according to claim 1, wherein the control unit displays the image data of the subject and the information on the subject on the display unit based on the hold control signal received from the hold control unit.
- 3. The imaging device according to claim 1 or 2, wherein the control unit can switch between displaying and hiding the information on the subject on the display unit.
- 4. The imaging device according to any one of claims 1 to 3, wherein, when the hold control signal is received while the subject is being imaged, the control unit displays the held image data of the subject and the held information on the subject on the display unit.
- 5. The imaging device according to any one of claims 1 to 4, wherein the control unit transmits the information on the subject and the image data of the subject held based on the hold control signal to another terminal.
- 6. The imaging device according to any one of claims 1 to 5, wherein the information on the subject is AR information.
- 7. An information acquisition system comprising an imaging device and an information search system, wherein the imaging device is the imaging device according to any one of claims 1 to 6.
- 8. An imaging device comprising: a latitude/longitude detection unit that detects latitude/longitude information of its own position; an azimuth angle detection unit that detects the azimuth angle at which image data is being captured; a control unit that acquires, from the latitude/longitude information and the azimuth angle, AR information added to buildings located in the latitude/longitude range in the direction of the azimuth angle from the latitude/longitude information, and displays it on a display unit; and a hold control unit that, on detecting an action for storing the AR information in a storage unit, outputs to the control unit a hold control signal for storing the AR information and the image data in the storage unit.
- 9. The imaging device according to claim 8, wherein, when the hold control signal is input from the hold control unit, the control unit stores in the storage unit the latitude/longitude information and azimuth angle of its own position at which the image data was being captured when the hold control signal was input, with image identification information identifying the image data added.
- 10. The imaging device according to claim 8, wherein, when the hold control signal is input from the hold control unit, the control unit stores in the storage unit the image data displayed on the display unit and the AR information corresponding to that image data at the time the hold control signal was input, with image identification information identifying the image data added.
- 11. The imaging device according to any one of claims 8 to 10, wherein the storage unit stores, for each piece of image data: image identification information identifying the image data, building identification information identifying the buildings in the image data, building information of the buildings indicated by the building identification information, and a user ID added to the image identification information.
- 12. The imaging device according to any one of claims 8 to 11, wherein the control unit reads the AR information and the image data from the storage unit and displays them on the display unit.
- 13. The imaging device according to any one of claims 8 to 12, wherein the control unit transmits the AR information added in the imaging state to an information search system.
- 14. An information acquisition system comprising an imaging device and an information search system, wherein the information search system extracts, from latitude/longitude information and an azimuth angle transmitted from the imaging device, buildings located in the latitude/longitude range in the direction of the azimuth angle from the latitude/longitude information, and transmits information added to the extracted buildings to the imaging device, the imaging device comprising: a latitude/longitude detection unit that detects latitude/longitude information of its own position; an azimuth angle detection unit that detects the azimuth angle at which image data is being captured; a control unit that acquires, from the latitude/longitude information and the azimuth angle, AR information added to buildings located in the latitude/longitude range in the direction of the azimuth angle from the latitude/longitude information, and displays it on a display unit; and a hold control unit that, on detecting an action for storing the AR information in a storage unit, outputs to the control unit a hold control signal for storing the AR information and the image data in the storage unit; and the information search system comprising: a database storing map data in which the building identification numbers of buildings are associated with their latitude/longitude information, and a building table in which the building identification numbers are associated with the AR information of the buildings indicated by those numbers; and an information search server that searches the map data, from the latitude/longitude information and azimuth angle transmitted from the imaging device, for the building identification numbers of buildings located in the latitude/longitude range in the direction of the azimuth angle from the latitude/longitude information, reads from the building table the AR information added to the buildings indicated by the found building identification numbers, and transmits the read AR information of the buildings to the imaging device.
- 15. A program for causing a computer to execute the functions of the imaging device according to any one of claims 1 to 6, the program causing the computer to execute: a step of inputting position information of the imaging position at which a subject was imaged; a step of acquiring information on the subject based on the position information; a step of displaying the image data of the subject and the information on the subject on a display unit; and a step of outputting to a control unit a hold control signal for holding the image data of the subject and the information on the subject.
- 16. A program for causing a computer to execute the functions of the imaging device according to any one of claims 8 to 13, the program causing the computer to execute: a step of inputting latitude/longitude information of its own position detected by a latitude/longitude detection unit; a step of inputting the azimuth angle, detected by an azimuth angle detection unit, at which image data is being captured; a step of acquiring, from the latitude/longitude information and the azimuth angle, AR information added to buildings located in the latitude/longitude range in the direction of the azimuth angle from the latitude/longitude information, and displaying it on a display unit; and a step of outputting to the control unit, on detecting an action for storing the AR information in a storage unit, a hold control signal for storing the AR information and the image data in the storage unit.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201180008654.3A CN102763404B (zh) | 2010-02-08 | 2011-02-07 | 摄影装置、信息取得系统及程序 |
EP11739908.9A EP2536124B1 (en) | 2010-02-08 | 2011-02-07 | Imaging device, information acquisition system, and program |
JP2011552853A JPWO2011096561A1 (ja) | 2010-02-08 | 2011-02-07 | 撮像装置、情報取得システム及びプログラム |
CN202011217327.6A CN112565552B (zh) | 2010-02-08 | 2011-02-07 | 摄影装置、信息取得系统、记录媒体及服务器 |
BR112012019407-0A BR112012019407A2 (ja) | 2010-02-08 | 2011-02-07 | An imaging device, an information acquisition system, and a program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010025998 | 2010-02-08 | ||
JP2010-025998 | 2010-02-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011096561A1 true WO2011096561A1 (ja) | 2011-08-11 |
Family
ID=44353435
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/052498 WO2011096561A1 (ja) | 2010-02-08 | 2011-02-07 | 撮像装置、情報取得システム及びプログラム |
Country Status (6)
Country | Link |
---|---|
US (7) | US9420251B2 (ja) |
EP (1) | EP2536124B1 (ja) |
JP (3) | JPWO2011096561A1 (ja) |
CN (3) | CN112565552B (ja) |
BR (1) | BR112012019407A2 (ja) |
WO (1) | WO2011096561A1 (ja) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014006147A (ja) * | 2012-06-25 | 2014-01-16 | Alpine Electronics Inc | 拡張現実システム |
JP2014131207A (ja) * | 2012-12-28 | 2014-07-10 | Sharp Corp | 携帯情報端末 |
JP2015534364A (ja) * | 2012-09-21 | 2015-11-26 | 杭州美盛紅外光電技術有限公司 | 熱画像情報記録装置および熱画像情報記録方法 |
JP2016507833A (ja) * | 2013-01-22 | 2016-03-10 | マイクロソフト テクノロジー ライセンシング,エルエルシー | 複合現実フィルタリング |
JP2016071720A (ja) * | 2014-09-30 | 2016-05-09 | キヤノンマーケティングジャパン株式会社 | 情報処理装置、情報処理システム、その制御方法及びプログラム |
US9509941B2 (en) | 2012-07-13 | 2016-11-29 | Intel Corporation | Context based management for secure augmented reality applications |
JP2017017716A (ja) * | 2016-07-28 | 2017-01-19 | 株式会社Jvcケンウッド | 情報生成装置、情報付加装置及び情報表示装置 |
WO2018116538A1 (ja) * | 2016-12-23 | 2018-06-28 | 株式会社tiwaki | 仮想内容記憶方法 |
JP2019164001A (ja) * | 2018-03-19 | 2019-09-26 | 株式会社Lifull | 情報提供システム、サーバ装置、端末用プログラム、及び情報提供方法 |
JP2022000795A (ja) * | 2016-12-13 | 2022-01-04 | 富士フイルムビジネスイノベーション株式会社 | 情報管理装置 |
WO2023095770A1 (ja) * | 2021-11-24 | 2023-06-01 | 日本電気株式会社 | 拡張現実表示装置、サーバ装置、拡張現実表示システム、拡張現実表示方法、及びプログラム |
Families Citing this family (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102009037835B4 (de) | 2009-08-18 | 2012-12-06 | Metaio Gmbh | Verfahren zur Darstellung von virtueller Information in einer realen Umgebung |
US9420251B2 (en) * | 2010-02-08 | 2016-08-16 | Nikon Corporation | Imaging device and information acquisition system in which an acquired image and associated information are held on a display |
WO2012120799A1 (ja) * | 2011-03-04 | 2012-09-13 | パナソニック株式会社 | 表示装置及び表示方向切替方法 |
CN102355550A (zh) * | 2011-11-16 | 2012-02-15 | 天津三星光电子有限公司 | 一种具有gps定位功能的数码相机 |
JP6071903B2 (ja) * | 2011-12-08 | 2017-02-01 | キヤノン株式会社 | 撮像装置、およびその制御方法、プログラム |
US20130257900A1 (en) * | 2012-03-30 | 2013-10-03 | Nokia Corporation | Method and apparatus for storing augmented reality point-of-interest information |
JP6004807B2 (ja) * | 2012-07-24 | 2016-10-12 | キヤノン株式会社 | 画像処理装置、その制御方法、プログラム |
US20140067869A1 (en) | 2012-08-30 | 2014-03-06 | Atheer, Inc. | Method and apparatus for content association and history tracking in virtual and augmented reality |
CN104541300B (zh) * | 2012-09-28 | 2019-01-22 | 英特尔公司 | 增强现实信息的确定 |
KR101962134B1 (ko) * | 2012-10-24 | 2019-07-17 | 엘지전자 주식회사 | 컨텐츠 제공 방법 및 이를 위한 디지털 디바이스 |
TWI571827B (zh) * | 2012-11-13 | 2017-02-21 | 財團法人資訊工業策進會 | 決定3d物件影像在3d環境影像中深度的電子裝置及其方法 |
CN103104792B (zh) * | 2013-01-22 | 2016-01-27 | Tcl集团股份有限公司 | 仿生摄像头、驱动仿生摄像头的偏置方法、及媒体终端 |
JP6265621B2 (ja) * | 2013-04-18 | 2018-01-24 | キヤノン株式会社 | 表示制御装置およびプログラム |
KR102249886B1 (ko) * | 2014-02-14 | 2021-05-10 | 삼성전자 주식회사 | 웹 페이지 전환 방법 및 이를 이용하는 휴대 단말기, 이 방법을 실행하기 위한 프로그램이 기록되어 있는 비휘발성 기록매체 |
WO2016162952A1 (ja) * | 2015-04-07 | 2016-10-13 | 三菱電機株式会社 | 空気調和機のメンテナンスサポートシステム |
US9943262B2 (en) * | 2015-10-26 | 2018-04-17 | Sk Planet Co., Ltd. | System and method for providing walking path guidance service based on measurement of motion of user |
JP2017134467A (ja) * | 2016-01-25 | 2017-08-03 | 富士通株式会社 | 情報処理装置、情報処理方法および情報処理プログラム |
US10154080B2 (en) * | 2016-05-23 | 2018-12-11 | Accenture Global Solutions Limited | Enhancing digital content provided from devices |
US11785161B1 (en) | 2016-06-20 | 2023-10-10 | Pipbin, Inc. | System for user accessibility of tagged curated augmented reality content |
US11876941B1 (en) | 2016-06-20 | 2024-01-16 | Pipbin, Inc. | Clickable augmented reality content manager, system, and network |
US10334160B2 (en) * | 2016-09-28 | 2019-06-25 | Casio Computer Co., Ltd. | Imaging apparatus capable of automatic photography, image playback apparatus which plays back automatically photographed image, image playback method, and imaging method |
US10380544B2 (en) * | 2016-12-24 | 2019-08-13 | Motorola Solutions, Inc. | Method and apparatus for avoiding evidence contamination at an incident scene |
CN107194370B (zh) * | 2017-06-10 | 2023-12-19 | 国网浙江省电力公司台州供电公司 | 基于地理位置信息的影像智能分析方法 |
US10338768B1 (en) | 2017-12-10 | 2019-07-02 | International Business Machines Corporation | Graphical user interface for finding and depicting individuals |
US10521961B2 (en) * | 2017-12-10 | 2019-12-31 | International Business Machines Corporation | Establishing a region of interest for a graphical user interface for finding and depicting individuals |
US10565764B2 (en) | 2018-04-09 | 2020-02-18 | At&T Intellectual Property I, L.P. | Collaborative augmented reality system |
US10536809B1 (en) * | 2018-06-28 | 2020-01-14 | Homerun Labs, Inc. | System and method for detection of a structure |
US11227007B2 (en) * | 2019-07-23 | 2022-01-18 | Obayashi Corporation | System, method, and computer-readable medium for managing image |
JP6852202B1 (ja) * | 2020-01-14 | 2021-03-31 | 株式会社梓設計 | 情報表示装置、情報表示方法及びプログラム |
TWI737371B (zh) * | 2020-06-30 | 2021-08-21 | 國立成功大學 | 虛實整合系統與方法 |
KR20220114336A (ko) * | 2021-02-08 | 2022-08-17 | 현대자동차주식회사 | 사용자 단말 및 그 제어 방법 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006295827A (ja) * | 2005-04-14 | 2006-10-26 | Sony Ericsson Mobilecommunications Japan Inc | 携帯端末装置 |
JP2008076734A (ja) * | 2006-09-21 | 2008-04-03 | Seiko Epson Corp | 眼鏡装置及び画像生成方法 |
JP2009017540A (ja) * | 2007-05-31 | 2009-01-22 | Panasonic Corp | 画像撮影装置、付加情報提供サーバ及び付加情報フィルタリングシステム |
JP2010025998A (ja) | 2008-07-15 | 2010-02-04 | Ricoh Co Ltd | クリーニング装置、定着装置及び画像形成装置 |
Family Cites Families (78)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0438573A (ja) * | 1990-06-04 | 1992-02-07 | Toshiba Corp | 画像表示装置 |
US6282362B1 (en) | 1995-11-07 | 2001-08-28 | Trimble Navigation Limited | Geographical position/image digital recording and display system |
US6222583B1 (en) | 1997-03-27 | 2001-04-24 | Nippon Telegraph And Telephone Corporation | Device and system for labeling sight images |
JP3738310B2 (ja) * | 1997-08-04 | 2006-01-25 | カシオ計算機株式会社 | カメラ |
JP4009887B2 (ja) * | 1998-10-20 | 2007-11-21 | 富士フイルム株式会社 | カメラ |
JP4274656B2 (ja) * | 1999-12-08 | 2009-06-10 | カシオ計算機株式会社 | カメラ装置及び被写体名称表示方法 |
US7028263B2 (en) * | 2000-07-19 | 2006-04-11 | Research In Motion Limited | User interface and method for viewing short messages on a wireless device |
JP3925057B2 (ja) | 2000-09-12 | 2007-06-06 | カシオ計算機株式会社 | カメラ装置、撮影範囲表示システム、及び撮影範囲表示方法 |
US7016899B1 (en) | 2000-09-21 | 2006-03-21 | Lenovo (Singapore) Pte. Ltd. | Camera device, methods and program products for location and environmental stamping of images, communications and other applications |
JP2002344792A (ja) * | 2001-02-16 | 2002-11-29 | Fuji Photo Film Co Ltd | 情報伝送システム |
WO2003049424A1 (fr) | 2001-12-03 | 2003-06-12 | Nikon Corporation | Appareil electronique, camera electronique, dispositif electronique, appareil d'affichage d'image et systeme de transmission d'image |
JP4158376B2 (ja) | 2001-12-07 | 2008-10-01 | 株式会社ニコン | 電子カメラおよび画像表示装置および画像表示方法 |
JP3893983B2 (ja) * | 2002-01-17 | 2007-03-14 | ソニー株式会社 | 情報提供装置及び情報提供方法、記憶媒体、並びにコンピュータ・プログラム |
JP2004015187A (ja) | 2002-06-04 | 2004-01-15 | Fuji Photo Film Co Ltd | 撮影補助システム、デジタルカメラ、及びサーバ |
JP2004297478A (ja) | 2003-03-27 | 2004-10-21 | Fuji Photo Film Co Ltd | デジタルカメラ |
JP2004341960A (ja) | 2003-05-16 | 2004-12-02 | Sony Corp | 位置依存情報検索方法、位置依存情報検索装置、位置依存情報提供装置、および位置依存情報検索プログラム |
US20050046706A1 (en) | 2003-08-28 | 2005-03-03 | Robert Sesek | Image data capture method and apparatus |
JP4561084B2 (ja) * | 2003-11-25 | 2010-10-13 | ソニー株式会社 | サービス管理装置及びサービス管理方法、並びにサービス提供システム及びサービス提供方法 |
JP4419621B2 (ja) | 2004-03-17 | 2010-02-24 | アイシン・エィ・ダブリュ株式会社 | 旅行計画作成管理システム |
JP2006003963A (ja) | 2004-06-15 | 2006-01-05 | Hitachi Omron Terminal Solutions Corp | 地域案内システム |
JP4363642B2 (ja) * | 2004-07-02 | 2009-11-11 | 富士フイルム株式会社 | 地図表示システム及びデジタルカメラ |
JP4054018B2 (ja) | 2004-12-14 | 2008-02-27 | 株式会社ナビタイムジャパン | 空席経路探索システム、空席経路探索装置および端末装置 |
US7720436B2 (en) | 2006-01-09 | 2010-05-18 | Nokia Corporation | Displaying network objects in mobile devices based on geolocation |
JP4632793B2 (ja) | 2005-01-12 | 2011-02-16 | Kyocera Corporation | Portable terminal with navigation function |
US20060190812A1 (en) * | 2005-02-22 | 2006-08-24 | Geovector Corporation | Imaging systems including hyperlink associations |
JP2006235979A (ja) | 2005-02-24 | 2006-09-07 | Toshiba Corp | Information providing system and information providing method |
JP2006238220A (ja) | 2005-02-25 | 2006-09-07 | Fuji Photo Film Co Ltd | Imaging apparatus, imaging method, and program |
JP4366601B2 (ja) | 2005-03-18 | 2009-11-18 | Sony Corporation | Time-shift image distribution system, time-shift image distribution method, time-shift image requesting device, and image server |
JP2006350879A (ja) * | 2005-06-20 | 2006-12-28 | Nikon Corp | Information providing system |
JP2008001383A (ja) * | 2006-06-21 | 2008-01-10 | Mitsubishi Gas Chem Co Inc | Multilayer bottle |
JP4863274B2 (ja) | 2006-09-20 | 2012-01-25 | Sony Ericsson Mobile Communications AB | Action guidance device, action schedule processing method, action guidance system, and action schedule processing program |
JP2008111693A (ja) | 2006-10-30 | 2008-05-15 | Sony Ericsson Mobilecommunications Japan Inc | Mobile device and target object information retrieval method |
US20080147730A1 (en) * | 2006-12-18 | 2008-06-19 | Motorola, Inc. | Method and system for providing location-specific image information |
JP2008160631A (ja) | 2006-12-26 | 2008-07-10 | Funai Electric Co Ltd | Portable device |
JP4479929B2 (ja) | 2007-02-01 | 2010-06-09 | NEC Corporation | Captured image utilization system and captured image utilization method |
JP4914268B2 (ja) | 2007-03-29 | 2012-04-11 | Hitachi, Ltd. | Information retrieval method for a search service server |
JP5194650B2 (ja) | 2007-08-31 | 2013-05-08 | Nikon Corporation | Electronic camera |
US8180396B2 (en) * | 2007-10-18 | 2012-05-15 | Yahoo! Inc. | User augmented reality for camera-enabled mobile devices |
KR100914848B1 (ko) * | 2007-12-15 | 2009-09-02 | Electronics and Telecommunications Research Institute | Mixed reality system and implementation method thereof |
US9191238B2 (en) * | 2008-07-23 | 2015-11-17 | Yahoo! Inc. | Virtual notes in a reality overlay |
TWI403156B (zh) * | 2008-08-08 | 2013-07-21 | Nikon Corporation | Portable information device, imaging device, and information acquisition system |
EP2325768A4 (en) * | 2008-08-08 | 2016-08-17 | Nikon Corp | RESEARCH ASSISTANCE SYSTEM, RESEARCH ASSISTANCE METHOD, AND RESEARCH ASSISTANCE PROGRAM |
WO2010032079A2 (en) * | 2008-09-17 | 2010-03-25 | Nokia Corp. | User interface for augmented reality |
WO2010073616A1 (ja) * | 2008-12-25 | 2010-07-01 | Panasonic Corporation | Information display device and information display method |
TWI514337B (zh) * | 2009-02-20 | 2015-12-21 | Nikon Corporation | Portable information device, imaging device, and information acquisition system |
US8301202B2 (en) * | 2009-08-27 | 2012-10-30 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
KR101648339B1 (ko) * | 2009-09-24 | 2016-08-17 | Samsung Electronics Co., Ltd. | Method and apparatus for providing a service using image recognition and sensors in a portable terminal |
US20110093890A1 (en) * | 2009-10-21 | 2011-04-21 | John Araki | User control interface for interactive digital television |
US9766089B2 (en) * | 2009-12-14 | 2017-09-19 | Nokia Technologies Oy | Method and apparatus for correlating and navigating between a live image and a prerecorded panoramic image |
US20110161875A1 (en) * | 2009-12-29 | 2011-06-30 | Nokia Corporation | Method and apparatus for decluttering a mapping display |
US9420251B2 (en) * | 2010-02-08 | 2016-08-16 | Nikon Corporation | Imaging device and information acquisition system in which an acquired image and associated information are held on a display |
KR101691034B1 (ko) * | 2010-08-03 | 2016-12-29 | Samsung Electronics Co., Ltd. | Apparatus and method for compositing additional information when rendering an object in a 3D graphics-based terminal |
US20120038668A1 (en) * | 2010-08-16 | 2012-02-16 | Lg Electronics Inc. | Method for display information and mobile terminal using the same |
KR101260576B1 (ko) * | 2010-10-13 | 2013-05-06 | Pantech Co., Ltd. | User terminal for providing an AR service and method thereof |
KR101688155B1 (ko) * | 2010-10-25 | 2016-12-20 | LG Electronics Inc. | Information processing apparatus of a mobile terminal and method thereof |
WO2012166135A1 (en) * | 2011-06-01 | 2012-12-06 | Empire Technology Development, LLC | Structured light projection for motion detection in augmented reality |
US8749396B2 (en) * | 2011-08-25 | 2014-06-10 | Sartorius Stedim Biotech GmbH | Assembling method, monitoring method, communication method, augmented reality system and computer program product |
US20140351709A1 (en) * | 2011-09-16 | 2014-11-27 | Nec Casio Mobile Communications, Ltd. | Information processing device having information management editing function |
US8902344B2 (en) * | 2011-12-28 | 2014-12-02 | Canon Kabushiki Kaisha | Display control apparatus, image capture apparatus, display control method, and image capture apparatus control method |
US8610663B2 (en) * | 2012-02-06 | 2013-12-17 | Lg Electronics Inc. | Portable device and method for controlling the same |
JP5966510B2 (ja) * | 2012-03-29 | 2016-08-10 | Sony Corporation | Information processing system |
KR101923929B1 (ko) * | 2012-06-06 | 2018-11-30 | Samsung Electronics Co., Ltd. | Mobile communication terminal providing an augmented reality service and method for switching to an augmented reality service screen |
KR20140038577A (ko) * | 2012-09-11 | 2014-03-31 | Korea Institute of Science and Technology | Content recommendation using metadata |
KR101962134B1 (ko) * | 2012-10-24 | 2019-07-17 | LG Electronics Inc. | Content providing method and digital device therefor |
US9547697B2 (en) * | 2012-12-21 | 2017-01-17 | Google Inc. | Aggregating interactions for content items |
JP2015005026A (ja) * | 2013-06-19 | 2015-01-08 | Kyocera Document Solutions Inc. | Device management terminal, device management system, and device management program |
US9230473B2 (en) * | 2013-06-24 | 2016-01-05 | Microsoft Technology Licensing, Llc | Dual duty cycle OLED to enable dynamic control for reduced motion blur control with constant brightness in augmented reality experiences |
WO2014208689A1 (ja) * | 2013-06-26 | 2014-12-31 | Sony Computer Entertainment Inc. | Information processing device, control method for information processing device, program, and information storage medium |
WO2015001611A1 (ja) * | 2013-07-02 | 2015-01-08 | Hitachi, Ltd. | Network construction support system and method |
JP6132719B2 (ja) * | 2013-09-18 | 2017-05-24 | Sony Interactive Entertainment Inc. | Information processing device |
KR102114618B1 (ko) * | 2014-01-16 | 2020-05-25 | LG Electronics Inc. | Portable device and control method thereof |
JP2015141259A (ja) * | 2014-01-28 | 2015-08-03 | Alpine Electronics, Inc. | Map information display device and map information display method |
KR102182162B1 (ко) * | 2014-02-20 | 2020-11-24 | LG Electronics Inc. | HMD and control method thereof |
WO2015159561A1 (ja) * | 2014-04-16 | 2015-10-22 | Sony Computer Entertainment Inc. | Information processing device, information processing system, and information processing method |
US9526983B2 (en) * | 2014-06-04 | 2016-12-27 | Chih-Feng Lin | Virtual reality avatar traveling control system and virtual reality avatar traveling control method |
US9632664B2 (en) * | 2015-03-08 | 2017-04-25 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US9678566B2 (en) * | 2015-06-03 | 2017-06-13 | Oculus Vr, Llc | Hand-held controllers for virtual reality system |
CN108228120A (zh) * | 2016-12-13 | 2018-06-29 | Tencent Technology (Shenzhen) Co., Ltd. | Multi-screen linkage method and system in an AR scene |
2011
- 2011-02-04 US US13/021,198 patent/US9420251B2/en active Active
- 2011-02-07 BR BR112012019407-0A patent/BR112012019407A2/ja not_active Application Discontinuation
- 2011-02-07 WO PCT/JP2011/052498 patent/WO2011096561A1/ja active Application Filing
- 2011-02-07 EP EP11739908.9A patent/EP2536124B1/en active Active
- 2011-02-07 CN CN202011217327.6A patent/CN112565552B/zh active Active
- 2011-02-07 JP JP2011552853A patent/JPWO2011096561A1/ja active Pending
- 2011-02-07 CN CN201710339570.7A patent/CN107404598B/zh active Active
- 2011-02-07 CN CN201180008654.3A patent/CN102763404B/zh active Active

2016
- 2016-05-12 JP JP2016095985A patent/JP6156549B2/ja active Active
- 2016-07-11 US US15/207,228 patent/US9756253B2/en active Active

2017
- 2017-06-08 JP JP2017113602A patent/JP2017200208A/ja active Pending
- 2017-08-03 US US15/668,032 patent/US10452914B2/en active Active

2019
- 2019-10-17 US US16/656,062 patent/US11048941B2/en active Active

2021
- 2021-06-23 US US17/356,004 patent/US11455798B2/en active Active

2022
- 2022-08-16 US US17/889,037 patent/US11741706B2/en active Active

2023
- 2023-07-06 US US18/218,749 patent/US20230351746A1/en active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006295827A (ja) * | 2005-04-14 | 2006-10-26 | Sony Ericsson Mobilecommunications Japan Inc | Portable terminal device |
JP2008076734A (ja) * | 2006-09-21 | 2008-04-03 | Seiko Epson Corp | Eyeglass device and image generation method |
JP2009017540A (ja) * | 2007-05-31 | 2009-01-22 | Panasonic Corp | Image capturing device, additional information providing server, and additional information filtering system |
JP2010025998A (ja) | 2008-07-15 | 2010-02-04 | Ricoh Co Ltd | Cleaning device, fixing device, and image forming apparatus |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014006147A (ja) * | 2012-06-25 | 2014-01-16 | Alpine Electronics Inc | Augmented reality system |
US9948659B2 (en) | 2012-07-13 | 2018-04-17 | Intel Corporation | Context based management for secure augmented reality applications |
US9509941B2 (en) | 2012-07-13 | 2016-11-29 | Intel Corporation | Context based management for secure augmented reality applications |
JP2015534364A (ja) * | 2012-09-21 | 2015-11-26 | Hangzhou Meisheng Infrared Optoelectronic Technology Co., Ltd. | Thermal image information recording device and thermal image information recording method |
JP2014131207A (ja) * | 2012-12-28 | 2014-07-10 | Sharp Corp | Portable information terminal |
JP2016507833A (ja) * | 2013-01-22 | 2016-03-10 | Microsoft Technology Licensing, LLC | Mixed reality filtering |
JP2016071720A (ja) * | 2014-09-30 | 2016-05-09 | Canon Marketing Japan Inc. | Information processing device, information processing system, control method thereof, and program |
JP2017017716A (ja) * | 2016-07-28 | 2017-01-19 | JVCKenwood Corporation | Information generation device, information addition device, and information display device |
JP2022000795A (ja) * | 2016-12-13 | 2022-01-04 | Fujifilm Business Innovation Corp. | Information management device |
WO2018116538A1 (ja) * | 2016-12-23 | 2018-06-28 | tiwaki Co., Ltd. | Virtual content storage method |
JP2018106307A (ja) * | 2016-12-23 | 2018-07-05 | tiwaki Co., Ltd. | Virtual content storage method |
JP2019164001A (ja) * | 2018-03-19 | 2019-09-26 | Lifull Co., Ltd. | Information providing system, server device, terminal program, and information providing method |
JP7144164B2 (ja) | 2018-03-19 | 2022-09-29 | Lifull Co., Ltd. | Information providing system, server device, and terminal program |
WO2023095770A1 (ja) * | 2021-11-24 | 2023-06-01 | NEC Corporation | Augmented reality display device, server device, augmented reality display system, augmented reality display method, and program |
Also Published As
Publication number | Publication date |
---|---|
CN112565552B (zh) | 2023-04-18 |
EP2536124B1 (en) | 2019-03-27 |
US10452914B2 (en) | 2019-10-22 |
EP2536124A4 (en) | 2015-01-14 |
US9420251B2 (en) | 2016-08-16 |
JP2016192767A (ja) | 2016-11-10 |
EP2536124A1 (en) | 2012-12-19 |
CN102763404A (zh) | 2012-10-31 |
US20210319222A1 (en) | 2021-10-14 |
US9756253B2 (en) | 2017-09-05 |
US20110193985A1 (en) | 2011-08-11 |
BR112012019407A2 (ja) | 2018-03-20 |
CN107404598A (zh) | 2017-11-28 |
US11455798B2 (en) | 2022-09-27 |
US11048941B2 (en) | 2021-06-29 |
US20170330037A1 (en) | 2017-11-16 |
US20160323515A1 (en) | 2016-11-03 |
US20220392217A1 (en) | 2022-12-08 |
US11741706B2 (en) | 2023-08-29 |
US20230351746A1 (en) | 2023-11-02 |
CN112565552A (zh) | 2021-03-26 |
JP6156549B2 (ja) | 2017-07-05 |
CN102763404B (zh) | 2017-06-06 |
JP2017200208A (ja) | 2017-11-02 |
JPWO2011096561A1 (ja) | 2013-06-13 |
CN107404598B (zh) | 2020-11-17 |
US20200050859A1 (en) | 2020-02-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6156549B2 (ja) | Imaging device, information acquisition system, and program | |
JP6958673B2 (ja) | Portable information device, information acquisition system, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180008654.3 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11739908 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011552853 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2131/KOLNP/2012 Country of ref document: IN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011739908 Country of ref document: EP |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112012019407 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112012019407 Country of ref document: BR Kind code of ref document: A2 Effective date: 20120802 |