WO2015125210A1 - Information Display Device and Information Display Program - Google Patents
Information Display Device and Information Display Program
- Publication number
- WO2015125210A1 (PCT/JP2014/053765)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- display device
- information display
- target feature
- unit
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/953—Querying, e.g. by the use of web search engines
- G06F16/9537—Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04803—Split screen, i.e. subdividing the display area or the window area into separate subareas
Definitions
- the present invention relates to an information display apparatus and an information display program, and more particularly to an apparatus for displaying related information on a “target feature”.
- Portable information terminals such as smartphones and tablet terminals are equipped with a GPS (Global Positioning System) reception function and can use various navigation services.
- Patent Document 1 discloses a portable map display device that includes a position information acquisition device, a position measurement unit, an orientation information acquisition device, an orientation measurement unit, a distance information acquisition device, a distance measurement unit, and a map information storage unit, and that further includes a target feature specifying means which uses the information obtained by each unit to specify the position of an actual target feature.
- Based on the position of the actual target feature specified by the user with the target feature specifying means, the map display device described in Patent Document 1 can display, on its display device, the attribute information of the feature on the map corresponding to that actual target feature.
- Patent Document 2 discloses a pointing system that processes information related to an object addressed by a user.
- In this system, the user points a handheld device (portable terminal) at an object to be addressed; the device measures the position and orientation of the portable terminal, searches a database on the network, and presents information about the object on the user interface.
- However, the map display device described in Patent Document 1 must itself be provided with position measurement means, orientation measurement means, and distance measurement means. The map display device is pointed at a destination feature; the position information and orientation information of the display device itself are acquired; the distance between the display device and the destination feature is then acquired by the distance measurement means; and the position information of the destination feature is calculated from the acquired information. By referring to the map information based on the calculated position information, the map display device finally acquires the attribute information of the destination feature. The device therefore needs to measure the distance between itself and the destination feature.
- Patent Document 2 discloses a pointing system in which an object is addressed with a mobile terminal or the like and information related to the object is manipulated.
- the invention of Patent Document 2 discloses an example in which components such as a position determination unit and a posture determination unit are not physically confined to a mobile terminal, but are distributed on a wireless network together with a database.
- Each record in the database includes a geometric descriptor that defines a discontinuous spatial range, and the search means searches the database by determining whether the address state, defined by the instantaneous position and instantaneous posture measured by the mobile terminal, crosses that spatial range.
- In other words, the address state of the mobile terminal is compared with the geometric descriptor of each database record, and when the two are determined to intersect, the multimedia information of that record is retrieved; a geometric intersection judgment is thus performed.
- Since this geometric intersection determination is performed in a space expressed by three-dimensional coordinates, not only the position but also the attitude of the mobile terminal is necessary to specify the object, and complicated geometric calculation processing based on this information is required.
- In view of these problems of the prior art, an object of the present invention is to provide, with a simpler configuration, an information display device that displays information on a target feature when the user points the device at a nearby target feature.
- The present invention provides an information display device capable of displaying related information of a feature, comprising: a position information acquisition unit that acquires the current position information of the information display device; a direction information acquisition unit that acquires the direction information of the device when it is pointed at a feature; a target feature specifying execution unit that specifies that feature as the target feature by referring to map information using the current position information and the direction information; and a specific information acquisition unit that acquires specific information about the target feature.
- The target feature specifying execution unit specifies, as the target feature, the feature on the map that the direction in which the information display device is pointed, from its current position on the map obtained from the map information, intersects at the position closest to the information display device.
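The specification rule above amounts to a two-dimensional ray cast on the map plane: cast a ray from the current position along the pointing direction and take the feature whose footprint it crosses first. The following is a minimal illustrative sketch, not part of the patent disclosure; feature footprints are assumed to be axis-aligned rectangles, the azimuth is measured clockwise from north, and all names are hypothetical.

```python
import math

def ray_rect_distance(origin, direction, rect):
    """Distance along the ray to the nearest intersection with an
    axis-aligned rectangle (xmin, ymin, xmax, ymax), or None if missed."""
    ox, oy = origin
    dx, dy = direction
    tmin, tmax = 0.0, float("inf")
    for o, d, lo, hi in ((ox, dx, rect[0], rect[2]), (oy, dy, rect[1], rect[3])):
        if abs(d) < 1e-12:
            if o < lo or o > hi:          # ray parallel to this slab and outside it
                return None
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            if t1 > t2:
                t1, t2 = t2, t1
            tmin, tmax = max(tmin, t1), min(tmax, t2)
            if tmin > tmax:
                return None
    return tmin

def identify_target_feature(position, azimuth_deg, features):
    """Return the feature whose footprint the pointing ray crosses
    closest to the device, mirroring the specification rule above."""
    # Map convention: azimuth 0 deg = north (+y), 90 deg = east (+x).
    rad = math.radians(azimuth_deg)
    direction = (math.sin(rad), math.cos(rad))
    best, best_dist = None, float("inf")
    for name, rect in features.items():
        d = ray_rect_distance(position, direction, rect)
        if d is not None and d < best_dist:
            best, best_dist = name, d
    return best
```

With two stores east of the user, pointing the device east (azimuth 90) selects the nearer one, as required by the "position closest to the information display device" condition.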
- FIG. 1 is a block diagram of an information display apparatus according to Embodiment 1 of the present invention.
- FIG. 2 is a software configuration diagram of the information display apparatus according to the first embodiment.
- FIG. 3 is a front external view and a back external view of the information display device according to Embodiment 1.
- FIG. 4 is a configuration diagram of an information display system including the information display device according to Embodiment 1.
- FIG. 5 is a screen display diagram of the basic screen of the information display device according to the first embodiment.
- FIG. 6 is a flowchart of the information display operation of the information display apparatus according to the first embodiment.
- A screen display diagram of the feature information display screen (during information acquisition) of the information display device.
- A screen display diagram of the basic screen of the information display apparatus according to the second embodiment.
- A flowchart of the information display operation of the information display apparatus according to the second embodiment.
- A screen display diagram of the live view display screen of the information display apparatus according to Embodiment 2.
- FIG. 10 is a conceptual diagram illustrating the format of an image data file in the information display apparatus according to the second embodiment.
- FIG. 1 is a block diagram of the information display apparatus according to the first embodiment.
- As shown in FIG. 1, the information display apparatus 100 is configured as a computer that includes a main control unit 101, a system bus 102, a ROM 103, a RAM 104, a storage unit 110, an image processing unit 120, an audio processing unit 130, an operation unit 140, a communication processing unit 150, a sensor unit 160, an expansion interface unit 170, and the like.
- The information display apparatus 100 may be configured based on a terminal having a communication function, for example a mobile terminal such as a mobile phone, a smartphone, or a tablet terminal, or based on a PDA (Personal Digital Assistant) or a notebook PC (Personal Computer). A digital still camera, a video camera capable of shooting movies, a portable game machine, or another portable digital device may also be used as a base.
- the main control unit 101 is a microprocessor unit that controls the entire information display device 100 according to a predetermined program.
- a system bus 102 is a data communication path for transmitting and receiving data between the main control unit 101 and each unit in the information display apparatus 100.
- A ROM (Read Only Memory) 103 is a memory that stores a basic operation program such as an operating system and other application programs; for example, a rewritable ROM such as an EEPROM (Electrically Erasable Programmable ROM) or a flash ROM is used.
- a RAM (Random Access Memory) 104 serves as a work area for executing a basic operation program and other application programs.
- The ROM 103 and the RAM 104 may be integrated with the main control unit 101. Further, instead of the independent configuration shown in FIG. 1, the ROM 103 may use a partial storage area within the storage unit 110.
- the storage unit 110 stores each operation setting value of the information display device 100, information of the user of the information display device 100, and the like in the various information / data storage areas.
- the various information / data storage areas also function as a map information storage unit that holds a map information group downloaded from the network. Still image data and moving image data captured by the information display device 100 can also be stored.
- the storage unit 110 can also store new application programs downloaded from the network. As one of the application programs, there is an “information display program” that realizes main functions of the information display apparatus of the present embodiment. The configuration and function of the “information display program” will be described in detail with reference to FIG.
- the whole or a part of the functions of the ROM 103 may be replaced by a partial area of the storage unit 110.
- The storage unit 110 needs to retain stored information even when power is not supplied to the information display apparatus 100; therefore, a device such as a flash ROM, an SSD (Solid State Drive), or an HDD (Hard Disk Drive) is used.
- the image processing unit 120 includes a display unit 121, an image signal processing unit 122, a first image input unit 123, and a second image input unit 124.
- the display unit 121 is a display device such as a liquid crystal panel, for example, and provides the image data processed by the image signal processing unit 122 to the user of the information display apparatus 100.
- the image signal processing unit 122 includes a video RAM (not shown), and the display unit 121 is driven based on the image data input to the video RAM.
- The image signal processing unit 122 also performs format conversion and superimposition of menu and other OSD (On Screen Display) signals as necessary.
- The first image input unit 123 and the second image input unit 124 are camera units that convert light input through a lens into an electrical signal using an electronic device such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor, and that input image data of the surroundings and of objects.
- the audio processing unit 130 includes an audio output unit 131, an audio signal processing unit 132, and an audio input unit 133.
- the audio output unit 131 is a speaker, and provides the audio signal processed by the audio signal processing unit 132 to the user of the information display apparatus 100.
- The voice input unit 133 is a microphone that converts the user's voice and other sounds into voice data and inputs them. The voice input unit 133 may be separate from the information display device 100 and connected to it by wired or wireless communication.
- the operation unit 140 is an instruction input unit that inputs an operation instruction to the information display apparatus 100.
- The operation unit 140 includes a touch panel 140t arranged over the display unit 121 and operation keys 140k in which button switches are arranged; either one alone may be sufficient. The information display apparatus 100 may also be operated using a keyboard or the like connected to the expansion interface unit 170 described later, or using a separate information terminal device connected by wired or wireless communication. The touch panel function may also be built into the display unit 121.
- the communication processing unit 150 includes a LAN (Local Area Network) communication unit 151, a mobile telephone network communication unit 152, and a proximity wireless communication unit 153.
- the LAN communication unit 151 transmits and receives data by connecting to the wireless communication access point 202 of the Internet 201 by wireless communication.
- the mobile telephone network communication unit 152 performs telephone communication (call) and data transmission / reception by wireless communication with the base station 203 of the mobile telephone communication network.
- the close proximity wireless communication unit 153 performs wireless communication when close to the corresponding reader / writer.
- The LAN communication unit 151, the mobile telephone network communication unit 152, and the close proximity wireless communication unit 153 are each provided with a coding circuit, a decoding circuit, an antenna, and the like. Other communication units, such as an infrared communication unit, may be further provided.
- the sensor unit 160 is a sensor group for detecting the state of the information display device 100.
- This sensor group includes a position information acquisition unit that acquires the current position information of the information display device 100, and a direction information acquisition unit that acquires the direction information of the device when the information display device 100 is pointed at a target feature.
- The information display device 100 may further include other sensors such as an atmospheric pressure sensor.
- the extended interface unit 170 is an interface group for extending the functions of the information display device 100.
- the extended interface unit 170 is configured by an image / audio interface, a USB (Universal Serial Bus) interface, a memory interface, and the like.
- the image / audio interface performs input of image signals / audio signals from external image / audio output devices, output of image signals / audio signals to external image / audio input devices, and the like.
- the USB interface is connected to a PC to transmit and receive data, and to connect a keyboard and other USB devices.
- the memory interface transmits and receives data by connecting a memory card and other memory media.
- The configuration example of the information display apparatus 100 shown in FIG. 1 includes many components that are not essential to the present embodiment, such as the audio processing unit 130, but the effects of the present embodiment are not impaired even if these components are not provided.
- Functions not shown in the figure, such as a digital television broadcast receiving function and an electronic money settlement function, may be further added.
- FIG. 2 is a software configuration diagram of the information display apparatus 100 according to the present embodiment, and shows a software configuration in the ROM 103, the RAM 104, and the storage unit 110.
- the basic operation program 103 a and other programs are stored in the ROM 103
- the “information display program” 110 b and other programs are stored in the storage unit 110.
- the basic operation program 103a stored in the ROM 103 is expanded in the RAM 104, and the main control unit 101 executes the expanded basic operation program to constitute the basic operation execution unit 104a.
- Similarly, the “information display program” 110b stored in the storage unit 110 is expanded in the RAM 104, and the main control unit 101 executes the expanded program to constitute the information display execution unit 104b, which comprises the position/orientation acquisition execution unit 104b1, the target feature identification execution unit 104b2, and the related information acquisition execution unit 104b3.
- the RAM 104 includes a temporary storage area that temporarily holds data as necessary when various application programs are executed.
- The position/orientation acquisition execution unit 104b1 provides the function of a position information acquisition unit that acquires the current position information of the information display device 100 from the GPS information (latitude, longitude, etc.) received by the GPS reception unit 161, and the function of an azimuth information acquisition unit that acquires the azimuth information of the device, when it is pointed at a target feature, from the outputs of the gyro sensor 162, the geomagnetic sensor 163, and the like.
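As one hedged illustration of how an azimuth information acquisition unit might derive a heading from a geomagnetic sensor, the sketch below computes a compass bearing from the horizontal field components; it assumes the device is held level and ignores tilt compensation (for which the gyro output would be used) and magnetic declination. The function name and sign conventions are illustrative, not taken from the disclosure.

```python
import math

def heading_from_magnetometer(mx, my):
    """Compass heading in degrees (0 = magnetic north, 90 = east)
    from horizontal magnetic-field components mx (east) and my (north),
    assuming the device is held level."""
    # atan2 of the east/north components gives the clockwise bearing.
    deg = math.degrees(math.atan2(mx, my))
    return deg % 360.0
```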
- The target feature specifying execution unit 104b2 has the function of specifying the feature pointed at as the “target feature” by referring to the map information downloaded from the network, using the position information and direction information calculated by the position/orientation acquisition execution unit 104b1.
- When the “target feature” is, for example, a high-rise building or a building complex, there may be one or more tenants on each floor.
- The related information acquisition execution unit 104b3 refers to the downloaded map information and acquires, from the additional data accompanying the map information, the specific information (address information, store name information, building name information, etc.) of the target feature.
- The communication processing unit 150 in FIG. 1 functions as the communication unit that transmits the specific information acquired by the related information acquisition execution unit 104b3 to a search server on the network and receives the related information of the target feature from that search server.
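The exchange with the search server can be illustrated by assembling a query from the specific information before transmission. The field names and the single-parameter query format below are hypothetical assumptions for illustration only; the patent does not define a concrete protocol or schema.

```python
def build_search_query(specific_info):
    """Assemble hypothetical search-server query parameters from the
    specific information of the target feature.  The keys 'store_name',
    'building_name', and 'address' are illustrative, not from the patent."""
    parts = [specific_info.get(k) for k in ("store_name", "building_name", "address")]
    return {"q": " ".join(p for p in parts if p)}
```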
- FIG. 3 is an external view of the information display apparatus 100 of the present embodiment. This external view is an example when the information display device 100 is an information terminal device such as a smart phone.
- FIG. 3A is a front view of the information display device 100, and FIG. 3B is a rear (back) view. Illustration of the left and right side surfaces and the top surface is omitted.
- As shown in FIG. 3, the first image input unit 123 is located on the same surface (front surface) as the display unit 121, and the second image input unit 124 is located on the opposite surface (back surface).
- the first image input unit 123 positioned on the same plane as the display unit 121 may be referred to as “in camera”, and the second image input unit 124 positioned on the opposite side of the display unit 121 may be referred to as “out camera”.
- The position of the second image input unit 124 need not be the back surface, as long as it is not on the same surface as the display unit 121.
- The second image input unit 124 may be separate from the information display device 100 and connected to it by wired or wireless communication. Only one of the camera units may be provided. When the information display device 100 is a digital still camera, it may have a shape different from that shown in FIG. 3.
- FIG. 4 is a configuration diagram of an information display system including the information display device 100 of the present embodiment.
- The information display system includes the information display device 100, a wide-area public network 201 such as the Internet with its wireless communication access point 202, a base station 203 of the mobile telephone communication network, an application server 211, a map data server 212, and a mobile telephone communication server 213. Needless to say, a large number of other server devices and terminal devices, not shown, are connected to the Internet 201.
- The map data server 212 stores commercially available map information, for example Google Maps (registered trademark), in which longitude and latitude on the ground are mapped to XY coordinate values on a plane.
- The information display apparatus 100 can expand its functions by downloading a new application program from the application server 211 via the Internet 201 and the wireless communication access point 202 or the base station 203 of the mobile telephone communication network. The downloaded new application program is stored in the storage unit 110. When the main control unit 101 expands the new application program stored in the storage unit 110 into the RAM 104 and executes it, the information display apparatus 100 can realize a variety of new functions.
- Since the information display apparatus 100 is configured on the premise of so-called cloud computing, which uses computing resources on the network (software and hardware, in other words their processing functions, storage areas, and data), it can provide an information display device capable of displaying information on a destination feature with a simple configuration.
- The information display operation in the information display apparatus 100 is executed mainly by the main control unit 101 expanding the information display program 110b stored in the storage unit 110 into the RAM 104 and running the information display execution unit 104b, the position/orientation acquisition execution unit 104b1, the target feature identification execution unit 104b2, the related information acquisition execution unit 104b3, and the basic operation execution unit 104a thus configured. Alternatively, the same operation may be realized by hardware blocks that implement the functions of these execution units in hardware and control the operation of the information display device 100.
- The position/orientation acquisition execution unit 104b1 of the information display device 100 acquires map information around the current position from the map data server 212 using the GPS information (latitude, longitude, etc.) received by the GPS reception unit 161, and displays the current position and its surroundings on the map on the display unit 121.
- FIG. 5 is a screen display diagram illustrating a basic screen 121a displayed on the display unit 121 of the information display apparatus 100.
- the basic screen 121a is displayed when the power of the information display device 100 is turned on by pressing the power key 140k1, or when the home key 140k2 is pressed during execution of an arbitrary application program.
- the icon group (APP-A to N, “feature information”) displayed in the area 121a1 of the basic screen 121a is a collection of icons associated with each application program that can be executed by the information display apparatus 100.
- the “feature information” icon 121a2 is an icon associated with an “information display program” that executes an information display process that characterizes the information display apparatus 100 of the present embodiment. By selecting any icon APP, a predetermined application program associated with the selected icon is executed.
- the icon may be selected by performing a tap operation on a predetermined area on the touch panel 140t corresponding to the position on the display unit 121 where the target icon is displayed.
- the operation may be performed by operating operation keys such as a cross cursor key and an enter key (not shown).
- the user's line of sight of the information display device 100 may be detected using the first image input unit 123, and the icon may be selected based on the detected line-of-sight information.
- In the information display device 100 operating under the control of the basic operation execution unit 104a, when the user selects the “feature information” icon 121a2 on the basic screen 121a by a tap operation or the like, the “information display program” is executed: the basic operation execution unit 104a activates the information display execution unit 104b, and control is transferred to the information display execution unit 104b.
- Having received control from the basic operation execution unit 104a, the information display execution unit 104b first displays on the display unit 121 a feature information display screen (initial state) 121b as shown in FIG. (S101).
- The feature information display screen (initial state) 121b includes a navigation mark 121b1 such as an “arrow”, an information display area 121b2, and an “end” icon 121b3. A guidance display 121b4 appears in the information display area 121b2 and presents, for example, guidance such as “Point the arrow above in the direction of the object whose information you want to display, and hold it there for a moment”.
- Although not shown in the flowchart of FIG. 6, when the “end” icon 121b3 is selected, the information display execution unit 104b returns control to the basic operation execution unit 104a and the operation ends.
- Following the guidance display 121b4, the user adjusts the housing posture of the information display device 100 so that the tip of the arrow of the navigation mark 121b1 points toward the feature whose information is to be acquired (hereinafter, the destination feature). For example, when the user finds a store whose detailed information is wanted while walking along a shopping street, the user holds the information display device 100 so that the arrow of the navigation mark 121b1 points toward that store. It is then determined whether the user has held the housing posture of the information display terminal 100, with the arrow pointing at the target store, for a predetermined time or longer (S102: Yes).
- The predetermined time may be any length that allows the information display apparatus 100 to determine that the user is intentionally holding the posture; for example, 0.5 seconds or 1 second is preset as the predetermined time.
- When the hold is detected, the processing from S103 onward is started; conversely, when the housing posture is not determined to have been held for the predetermined time or longer, such as while the user keeps moving the housing (S102: No), the processing from S103 onward is not started.
- The state in which the housing posture is maintained refers to a state in which the spatial position of the housing is substantially fixed. The spatial position need not be completely fixed; a slight change in position due to camera shake or the like is tolerated when determining that the posture is maintained.
- Alternatively, the housing posture determination in S102 may be omitted, and the processing may be started with the selection of an “information acquisition” icon (not shown) separately provided on the feature information display screen (initial state) 121b as the trigger.
- Next, the information display execution unit 104b switches the display on the display unit 121 to a feature information display screen (information acquiring) 121c.
- a message 121c5 is displayed in the information display area 121c2.
- Based on the control of the information display execution unit 104b, the position/orientation acquisition execution unit 104b1 calculates the position information of the information display device 100 from the signal received by the GPS reception unit 161, and calculates the direction information of the device from the outputs of the gyro sensor 162, the geomagnetic sensor 163, and the like (S103).
- The position information and azimuth information may also be calculated by means other than the GPS reception unit 161, the gyro sensor 162, the geomagnetic sensor 163, and the like.
- Based on the position information calculated by the position/orientation acquisition execution unit 104b1 in S103, the information display execution unit 104b downloads map information of the current position of the information display device 100 and its surroundings from the map data server 212 on the Internet 201 via the LAN communication unit 151 or the mobile telephone network communication unit 152, and stores it in the temporary storage area of the RAM 104 (S104).
- alternatively, a map data group may be downloaded in advance from the map data server 212 into the various information/data storage areas of the storage unit 110, and the map data (map information) around the current position of the information display device 100 may be loaded from the downloaded map data group into the temporary storage area of the RAM 104.
- the target feature identification execution unit 104b2, based on the control of the information display execution unit 104b, performs a target feature identification process (S105) that uses the position information and the azimuth information calculated by the position/orientation acquisition execution unit 104b1 in the process of S103 to identify the destination feature, that is, the feature at which the user points the tip of the arrow of the navigation mark 121b1.
- the target feature specifying process in S105 will be described with reference to FIGS. 9A to 9C.
- assume that the user who owns the information display device 100 is located near the intersection where the main road 301 and the side road 302 cross, in a shopping street lined with the stores 311 to 315 and the like.
- first, the current position 321 of the information display device 100, based on the position information calculated in the process of S103, is determined on the map data 300 downloaded in the process of S104 (FIG. 9A).
- FIG. 9A superimposes, on a common two-dimensional coordinate plane, the current position 321 of the user based on the position information calculated by the position/orientation acquisition execution unit 104b1 in the target feature identification process of S105, and the map data 300 downloaded from the map data server 212 and stored in the temporary storage area of the RAM 104.
- the displayed two-dimensional map data 300 includes the current position 321 of the user, the destination feature at which the user is pointing the information display device 100, the surrounding buildings (stores 311 to 315, etc.), and the surrounding roads (main road 301, side road 302).
- the destination feature and the surrounding buildings displayed on the two-dimensional coordinate plane are uniformly displayed, regardless of their height, number of floors, and internal structure, as plane figures showing their outer contours (positions) as seen from above. Similarly, the roads are displayed as plane figures viewed from above.
- the map data may also be three-dimensional data usable with GPS, as long as information on the outer contour (position) of each feature can be obtained.
- a straight line 323 is drawn from the current position 321 of the information display device 100 on the map data 300 in the direction of the angle (azimuth angle) 322 indicated by the azimuth information calculated in S103 (FIG. 9B).
- the north is used as the angle reference, but another direction may be used as the angle reference.
- among the features whose contours intersect the straight line 323, the feature (store 313) located at the position (intersection 324) closest to the current position 321 of the information display device 100 is identified as the target feature (FIG. 9C). Since the straight line 323 and the features at which the user points the information display device are on the same two-dimensional coordinate plane, the intersecting feature is easy to identify once the current position 321 and the azimuth angle 322 are known.
- the algorithm of the target feature identification process of the present embodiment has been described graphically using FIGS. 9A to 9C, but all of it may be performed by computation on the RAM 104.
- alternatively, the same display as in FIGS. 9A to 9C may be presented on the display unit 121.
- as described above, distance information between the information display device 100 and the destination feature is not necessary for identifying the destination feature, and therefore no hardware and/or software for acquiring distance information is required.
- in addition, the intersecting feature is determined simply by finding the point 324 at which the contour line of the feature's exterior intersects the straight line 323 drawn at the azimuth angle 322 on the map data, so no complicated geometric processing is required.
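The contour-intersection step above can be sketched with elementary planar geometry: cast a ray from the current position 321 along the azimuth angle 322 and keep the closest contour edge it crosses. The coordinate values, the feature names, and the convention that the azimuth is measured clockwise from north (the +y axis) are illustrative assumptions, not values from the specification.

```python
from math import sin, cos, radians

def ray_segment_hit(px, py, dx, dy, ax, ay, bx, by):
    """Distance along the ray p + t*d (t > 0) to segment a-b, or None."""
    rx, ry = bx - ax, by - ay
    denom = dx * ry - dy * rx
    if abs(denom) < 1e-12:
        return None                        # ray parallel to this edge
    qx, qy = ax - px, ay - py
    t = (qx * ry - qy * rx) / denom        # position along the ray
    u = (qx * dy - qy * dx) / denom        # position along the edge
    return t if t > 0.0 and 0.0 <= u <= 1.0 else None

def nearest_feature(pos, azimuth_deg, features):
    """Identify the target feature as in S105: the feature whose contour
    the azimuth ray crosses closest to the current position.

    features maps a name to its closed outer contour [(x, y), ...];
    the azimuth is measured clockwise from north (the +y axis).
    """
    px, py = pos
    dx, dy = sin(radians(azimuth_deg)), cos(radians(azimuth_deg))
    best = None
    for name, poly in features.items():
        for i in range(len(poly)):
            (ax, ay), (bx, by) = poly[i], poly[(i + 1) % len(poly)]
            t = ray_segment_hit(px, py, dx, dy, ax, ay, bx, by)
            if t is not None and (best is None or t < best[0]):
                best = (t, name)
    return best  # (distance, feature name) or None
```

One ray-segment test per contour edge suffices, so the whole search is linear in the number of edges in the surrounding map data.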
- the target feature that is the target of information display by the information display device 100 is a feature in the vicinity of the user, as is apparent from FIGS. 9A to 9C.
- in this example, the user is located on the main road 301 immediately in front of the stores 313, 314, etc., but the current position of the user may be anywhere as long as the stores 313, 314, etc. can be seen.
- for example, the user may be on the sidewalk or in a store on the opposite side, across the traffic lane of the main road 301, as long as the stores 313, 314, etc. can be seen from there.
- conversely, if a feature is not directly visible from the user's current position 321, for example when the user wants to know information about another store behind the store 313, the user may simply move to the side road 302 and point the information display device 100 at that store.
- next, the target feature identification execution unit 104b2, based on the control of the information display execution unit 104b, acquires the unique information (address information, store name information, building name information, etc.) of the target feature from the map data downloaded from the map data server 212 in the process of S104 and temporarily stored in the RAM 104 (S106).
- the information display execution unit 104b transfers the acquired unique information of the destination feature to the related information acquisition execution unit 104b3.
- the related information acquisition execution unit 104b3, based on the control of the information display execution unit 104b, performs a network search using the unique information of the target feature as a keyword and acquires related information related to the target feature (S107).
- a known technique may be used for performing a network search using a specific keyword.
- that is, the unique information of the destination feature acquired in the process of S106 is transmitted to a search server (not shown) via the LAN communication unit 151 or the mobile telephone network communication unit 152, and the related information related to the destination feature is received as the search result by the LAN communication unit 151 or the mobile telephone network communication unit 152.
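As a sketch of this step, the unique information could be concatenated into a keyword string and sent as a query to a search server. The endpoint URL and the parameter name `q` are hypothetical; the specification only requires that the unique information be transmitted as search keywords via the LAN communication unit 151 or the mobile telephone network communication unit 152.

```python
from urllib.parse import urlencode

def build_search_request(unique_info, endpoint="https://search.example.com/api"):
    """Build a keyword-search URL from the feature's unique information.

    unique_info: dict of fields such as address, store name, and building
    name (the unique information acquired in S106). The endpoint and the
    parameter name are assumptions; any search server would do.
    """
    keywords = " ".join(v for v in unique_info.values() if v)
    return endpoint + "?" + urlencode({"q": keywords})
```

The returned URL would then be fetched over the LAN or mobile telephone network, and the response parsed into the list or homepage formats described below.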
- when the related information related to the destination feature cannot be acquired, the information display execution unit 104b displays an error message to that effect on the display unit 121 (S108).
- when the related information related to the destination feature has been acquired, the information display execution unit 104b displays the acquired related information on the display unit 121 (S109).
- FIGS. 10A and 10B show examples of the screen display of the feature information display screen (result display) 121d displayed on the display unit 121 of the information display device 100. In the feature information display screen (result display) 121d, the related information related to the destination feature acquired by the keyword search performed in the process of S107 is displayed in the information display area 121d2, either in the format of the search result list display 121d6 as shown in FIG. 10A or in the format of the homepage display 121d7 as shown in FIG. 10B.
- the search result list display 121d6 is a format that displays a list of link information to a plurality of homepages and the like that the search engine of the related information acquisition execution unit 104b3 determines to match the keyword condition in the keyword search performed in the process of S107.
- in this case, by simply selecting an entry from the list, the user can display information such as the homepage serving as the related information of the destination feature on the display unit 121.
- the homepage display 121d7 is a format that directly displays one of the pieces of information, such as a homepage, that the search engine of the related information acquisition execution unit 104b3 determines to meet the keyword condition in the keyword search performed in the process of S107. In this case, the user can immediately confirm information such as a homepage as the related information of the destination feature.
- the feature information display screen (result display) 121d of the information display device 100 may allow the user to set in which of the above formats the related information of the target feature is displayed.
- alternatively, when there is a single search result whose degree of match with the keyword is equal to or greater than a predetermined value, it may be displayed in the homepage display format, and when there are a plurality of such search results, the related information of the destination feature may be displayed in the list format. The related information of the destination feature may also be displayed on the display unit 121 in a format different from the above.
- in the process of S106, the unique information of the adjacent features adjacent to the destination feature (store 312 and store 314 in the example shown in FIG. 9C) may also be acquired, and related information regarding the adjacent features may be acquired in the process of S107.
- furthermore, as much related information as possible regarding each feature located around the target feature may be acquired, as long as the processing capability of the information display terminal 100 allows.
- in this way, even when the arrow of the navigation mark 121b1 does not point correctly at the destination feature (store 313) because of a shift in the holding angle of the information display device 100, and information different from the related information of the intended feature is displayed in the process of S109, the related information of an adjacent feature acquired in advance can be displayed immediately by, for example, a left-right flick on the feature information display screen (result display) 121d shown in FIGS. 10A and 10B.
- in the above description, the current position 321 of the user has been described as a fixed point, but under certain conditions it may be a moving point including at least two different points. Since the display screen of FIG. 7 does not require the user to stop, the user may operate the device while moving. Even when the user is walking or riding a low-speed moving body and the current position 321 changes with time, the information display device of this embodiment can be used. While the user moves, the azimuth angle between the user and the destination feature 313 changes slightly but continuously; however, the information required within the predetermined time is only the current position and azimuth of the information display terminal 100 itself.
- in this case, if the point (intersection 324) closest to the straight line at each time point within the predetermined time remains on the contour line of the destination feature 313, it can be determined that the spatial position (relative position) of the housing is fixed.
- as described above, according to the information display device 100 of the present embodiment, it is possible to provide an information display device and method that can display information on a nearby destination feature to the user with a simpler configuration. That is, since the information display device 100 makes effective use of computing resources on the cloud, it can acquire and display the related information of the destination feature with a simpler configuration that requires no hardware and/or software for acquiring distance information between the information display device 100 and the target feature in order to identify the target feature.
- in addition, since the related information of the destination feature is acquired from a public network such as the Internet through a network search using the unique information (address information, store name information, building name information, etc.) of the destination feature as a keyword, the latest information can be collected efficiently.
- furthermore, the information necessary for determining the housing posture is only the current position and azimuth angle of the information display terminal itself in real space, so there is no problem even if people and cars pass between the user and the destination feature. Even when the information display device of the present embodiment is used in an environment such as a downtown area where many buildings and stores are lined up and many people and cars frequently move between the user and the destination feature, information on the nearby destination feature can be properly provided and displayed.
- the information display device of the present embodiment can be realized easily, simply by downloading an "information display program" as an application program to a commercially available mobile terminal having a communication function.
- FIG. 11 is a software configuration diagram of the information display apparatus 100 of the present embodiment.
- the information display program 110b, the camera function program 110c, and other programs are stored in the storage unit 110. That is, in the second embodiment, a digital camera is assumed as a specific example of the mobile terminal, and in addition to the configuration of the first embodiment, a camera function program 110c is provided.
- the information display program 110b stored in the storage unit 110 is expanded into the RAM 104 in the same manner as in the first embodiment, and the main control unit 101 executes the expanded information display program, thereby configuring the information display execution unit 104b, the position/orientation acquisition execution unit 104b1, the target feature identification execution unit 104b2, and the related information acquisition execution unit 104b3.
- the camera function program 110c is expanded in the RAM 104, and the main control unit 101 executes the expanded camera function program 110c, thereby configuring the camera function execution unit 104c and the target feature extraction execution unit 104c1.
- control of the information display operation of the information display device 100 of the present embodiment is mainly performed by the information display execution unit 104b, the position/orientation acquisition execution unit 104b1, the target feature identification execution unit 104b2, the related information acquisition execution unit 104b3, the basic operation execution unit 104a, the camera function execution unit 104c, and the target feature extraction execution unit 104c1.
- alternatively, the information display device 100 may further include hardware blocks that realize, in hardware, operations equivalent to those of the information display execution unit 104b, the position/orientation acquisition execution unit 104b1, the target feature identification execution unit 104b2, the related information acquisition execution unit 104b3, the camera function execution unit 104c, and the target feature extraction execution unit 104c1, and these hardware blocks may control the operation of the information display device 100 in place of the above execution units.
- FIG. 12 is a screen display diagram for explaining the basic screen 121a of the information display apparatus 100 of the present embodiment.
- the icon group (APP-A to N) displayed in the area 121a1 of the basic screen 121a is a collection of icons associated with each application program that can be executed by the information display apparatus 100.
- the “information camera” icon 121a3 is an icon associated with an “information display program” that executes an information display process that characterizes the information display apparatus 100 of the present embodiment.
- when the user selects the "information camera" icon 121a3 by a tap operation or the like, the "information display program" is executed; the basic operation execution unit 104a activates the information display execution unit 104b and transfers control to the information display execution unit 104b.
- the information display execution unit 104b, to which control has been transferred from the basic operation execution unit 104a, first activates the camera function execution unit 104c and enables the second image input unit 124 (out-camera) (S201). Next, the camera function execution unit 104c, based on the control of the information display execution unit 104b, starts input of image data from the second image input unit 124, and the input image data is displayed on the live view display screen 121e (S202).
- the live view display screen 121e includes a live view window 121e1, a “shutter” icon 121e2, a “flash” icon 121e3, a “function setting” icon 121e4, and an “end” icon 121e5.
- the live view window 121e1 displays the image data input by the second image input unit 124.
- the user of the information display device 100 can adjust the composition of the subject to be photographed while checking the display of the live view window 121e1. Note that zoom in/out of the second image input unit 124 can be controlled by performing an operation such as pinch out/in on the touch panel 140t (see FIG. 12) at the position on the display unit 121 where the live view window 121e1 is displayed.
- when it is detected that the user has selected the "shutter" icon 121e2, the camera function execution unit 104c starts a recording sequence. In this recording sequence, in addition to controlling focusing, exposure, and the like, the camera function execution unit 104c executes a process of converting the output of an electronic device such as a CCD/CMOS sensor into digital image data, and inputs the image data from the second image input unit 124. Further, the camera function execution unit 104c performs signal processing such as gamma correction and noise removal on the input image data, and stores the processed image data in the various information/data storage areas of the storage unit 110.
- selecting the "flash" icon 121e3 toggles the flash function between enabled and disabled.
- by selecting the "function setting" icon 121e4, various settings of the camera function of the information display device 100 of the present embodiment can be changed.
- the signal processing such as focusing, exposure, gamma correction, and noise removal, the flash function, and the function of changing various settings are not configurations that characterize the present invention, and therefore detailed description thereof is omitted.
- when the user selects the "end" icon 121e5 or presses the home key 140k2 (although not shown in the flowchart of FIG. 13), the information display execution unit 104b stops the operation of the camera function execution unit 104c, disables the second image input unit 124, returns control to the basic operation execution unit 104a, and ends its own operation. The basic operation execution unit 104a then displays the basic screen 121a.
- next, the user adjusts the housing posture of the information display device 100 so that the second image input unit 124 can capture the feature (destination feature) for which information is to be acquired. For example, when the user finds a store for which detailed information is desired while walking through a shopping street, the user holds the information display device 100 so that the second image input unit 124 faces the target store and the target store is displayed in the live view window 121e1 of the live view display screen 121e. The user then holds the housing posture of the information display terminal 100 for a predetermined time or longer with the target store displayed in the live view window 121e1 (S203: Yes).
- the processes from S204 onward are then started. Conversely, when it is not determined that the housing posture has been held for the predetermined time or longer, such as when the image displayed in the live view window 121e1 keeps changing (S203: No), the processing from S204 onward is not started.
- the state in which the housing posture is maintained refers to a state in which the spatial position of the housing is substantially fixed. The spatial position of the housing need not be completely fixed; a slight change in position due to camera shake or the like is tolerated, and the housing posture is still determined to be maintained.
- alternatively, the processing from S204 onward may be started with the selection of an "information acquisition" icon (not shown) prepared separately on the live view display screen 121e as a trigger.
- next, the information display execution unit 104b performs the processes of S204 to S208. Since these are the same as the processes of S103 to S107 of the first embodiment, description thereof is omitted. Needless to say, however, compared with the first embodiment, the orientation information acquired in the process of S204 needs to be corrected appropriately to account for the fact that it is the second image input unit 124 that is directed at the destination feature.
- next, the camera function execution unit 104c, based on the control of the information display execution unit 104b, superimposes and displays a gaze mark (see FIG. 16: 121e6), indicating that there is displayable related information related to the destination feature, at a position in the live view window 121e1 where the relationship with the destination feature becomes clear (S209).
- note that when the unique information of the destination feature cannot be acquired in the process of S207, or when the related information regarding the destination feature cannot be acquired in the process of S208, the superimposing process is not performed.
- FIG. 15 is an enlarged view of the live view window 121e1 on the live view display screen 121e, showing an example in which the user holds the information display device 100 at the current position 321 on the map described above with the second image input unit 124 directed at the store 313.
- in the live view window 121e1, the store 313 is imaged and displayed at the center, with the adjacent store 312 and store 314 on either side.
- as shown in FIG. 16, the gaze mark 121e6 is superimposed and displayed at a position where its relationship with the store 313 in the live view window 121e1 is clear.
- the vicinity of the center of the store 313 in the live view window is selected as the display position of the gaze mark 121e6 that clearly shows the relationship with the store 313.
- the display position of the gaze mark 121e6 that makes the relationship with the store 313 clear is not limited to the vicinity of the center of the store 313 in the live view window, but may be an arbitrary position overlapping the store 313, for example.
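One plausible way to compute the superimposed position of the gaze mark is to map the difference between the camera's azimuth and the bearing of the store onto the horizontal pixel axis of the live view window with a pinhole-camera model. The field of view, window width, and axis conventions below are assumptions for illustration; the specification does not prescribe how the position where the relationship with the destination feature becomes clear is computed.

```python
from math import atan2, degrees, tan, radians

def gaze_mark_x(cam_pos, cam_azimuth_deg, feature_pos,
                view_width_px=1080, hfov_deg=60.0):
    """Horizontal pixel position of the gaze mark in the live view window.

    Returns None when the feature lies outside the horizontal field of
    view. Bearings are clockwise from north (+y), matching the map plane.
    """
    dx = feature_pos[0] - cam_pos[0]
    dy = feature_pos[1] - cam_pos[1]
    bearing = degrees(atan2(dx, dy))             # clockwise from north
    off = (bearing - cam_azimuth_deg + 180.0) % 360.0 - 180.0
    if abs(off) >= hfov_deg / 2.0:
        return None                              # not in the live view
    # pinhole projection: pixel offset proportional to tan(angle)
    half = view_width_px / 2.0
    return half + half * tan(radians(off)) / tan(radians(hfov_deg / 2.0))
```

A feature straight ahead lands at the window center, and one outside the assumed 60-degree field of view yields None, so no mark is drawn for it.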
- when the user selects the gaze mark 121e6 by a tap operation or the like, the related information related to the store 313 acquired in the process of S208 may be displayed on the display unit 121 in the format of the search result list display 121d6 shown in FIG. 10A or the format of the homepage display 121d7 shown in FIG. 10B.
- alternatively, when a tap operation or the like is performed on the region 121e8 indicating the store 313, which the target feature extraction execution unit 104c1 has extracted from the image data input from the second image input unit 124 and displayed in the live view window 121e1, the related information regarding the store 313 acquired in the process of S208 may be displayed.
- in addition, the display shape (color, shape, size, presence/absence of blinking, etc.) of the gaze mark may be changed according to the display format of the related information related to the store 313 acquired in the process of S208. For example, when the gaze mark is a triangle, it indicates that the related information has been acquired in the search result list display format, and when the gaze mark is a star shape, it indicates that the related information has been acquired in the homepage display format.
- when no gaze mark is displayed for the store 314, related information regarding the store 314 is not displayed even if a tap operation or the like is performed on the area indicating the store 314. That is, whether or not related information regarding each store can be displayed can be confirmed by the presence or absence of a gaze mark in the live view window 121e1. When the related information cannot be acquired, the gaze mark may simply not be displayed as described above; however, as shown in FIG. 17C, a gaze mark indicating that there is no related information may be displayed instead.
- further, the related information display window 121e9 for displaying the related information regarding the destination feature may be superimposed and displayed in the live view window 121e1 in PinP (Picture in Picture) format. From this window, the display may also be switched to the format of the search result list display 121d6 shown in FIG. 10A or the format of the homepage display 121d7 shown in FIG. 10B.
- a reference marker 121e10 as shown in FIG. 19 may be displayed inside the live view window 121e1.
- the reference marker 121e10 serves both as a reference position for the focusing process in the recording sequence when the "shutter" icon 121e2 is pressed by the user and as an aiming position for pointing at the target feature in the information display process of the present embodiment. Displaying the reference marker 121e10 inside the live view window 121e1 in this way makes it easier to direct the information display device 100 at the target feature.
- when the "shutter" icon 121e2 is selected in a state where the related information regarding the destination feature has been acquired by the processing of the flowchart shown in FIG. 13, the unique information and the related information may be recorded in the recording sequence in an image data file as extended data.
- the recording destination may be the various information/data recording areas of the storage unit 110, various storage media connected to the expansion interface unit 170, or a network storage connected via the communication processing unit 150.
- FIG. 20 shows an example of the file structure of the image data file 300 recorded in various information / data recording areas of the storage unit 110.
- the image data file 300 of the present embodiment includes image data 310 and extended data 330.
- the extended data 330 includes shooting condition information 331 indicating conditions such as the shooting date/time, shutter speed, and aperture of the image data 310 and the GPS information of the shooting location, the unique information 332 of the target feature acquired in the process of S207, and the URL (Uniform Resource Locator) 333 of the related information related to the target feature acquired in the process of S208.
- also in the present embodiment, as in the first embodiment, the user's current position may be a moving point: if the relationship between the user's current position and the specific destination feature 313 is substantially constant within the predetermined time, it may be determined that the housing posture has been held for the predetermined time or longer.
- as described above, also with the information display device 100 of the present embodiment, as in the first embodiment, it is possible to acquire and display the related information of the destination feature with a simpler configuration that requires no hardware and/or software for acquiring distance information between the information display device 100 and the destination feature in order to identify the destination feature.
- further, in the present embodiment, the destination feature can be displayed and confirmed on the display unit 121, and whether or not there is related information regarding the destination feature can easily be confirmed. Since the related information of the destination feature is acquired from a public network such as the Internet by a network search using the unique information of the destination feature (address information, store name information, building name information, etc.) as a keyword, the latest information can be collected efficiently, as in the first embodiment. In addition, the image of the destination feature and the related information related to the destination feature can be stored in the storage as an image data file, so that the destination feature and the related information can be reviewed at a later date.
- the embodiments of the present invention have been described above using the first and second embodiments; however, it goes without saying that the configuration for realizing the technology of the present invention is not limited to these embodiments, and various modifications are conceivable. For example, part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment. All of these belong to the scope of the present invention.
- numerical values, messages, and the like appearing in sentences and drawings are merely examples, and the use of different ones does not impair the effects of the present invention.
- the functions and the like of the present invention described above may be realized by hardware by designing a part or all of them with, for example, an integrated circuit.
- alternatively, they may be realized in software by having a microprocessor unit or the like interpret and execute programs that realize the respective functions. Hardware and software may also be used together.
- the software may be stored in the ROM 103 or the storage unit 110 of the information display device 100 in advance at the time of product shipment, or may be acquired after product shipment from the application server 211 on the Internet 201 via the LAN communication unit 151 or the mobile telephone network communication unit 152. Software stored in a memory card, an optical disk, or the like may also be acquired via the expansion interface unit 170 or the like.
- the control lines and information lines shown in the drawings are those considered necessary for the description, and not all the control lines and information lines of a product are necessarily shown. In practice, almost all the components may be considered to be connected to each other.
- 100: Information display device, 101: Main control unit, 102: System bus, 103: ROM, 104: RAM, 104a: Basic operation execution unit, 104b: Information display execution unit, 104b1: Position/orientation acquisition execution unit, 104b2: Target feature identification execution unit, 104b3: Related information acquisition execution unit, 104c: Camera function execution unit, 110: Storage unit, 110b: Information display program, 110c: Camera function program, 120: Image processing unit, 121a: Basic screen, 121a1: Application program icon group, 121a2: Feature information icon, 130: Audio processing unit, 140: Operation unit, 150: Communication processing unit, 160: Sensor unit, 161: GPS reception unit, 162: Gyro sensor, 163: Geomagnetic sensor, 170: Expansion interface unit, 300: Map data, 301-302: Roads, 311-315: Stores, 321: Current position of the user, 322: Azimuth angle of the straight line, 323: Straight line, 324: Intersection.
Abstract
Description
FIG. 1 is a block diagram of the information display device of the first embodiment. The information display device 100 is configured as a computer whose components include a main control unit 101, a system bus 102, a ROM 103, a RAM 104, a storage unit 110, an image processing unit 120, an audio processing unit 130, an operation unit 140, a communication processing unit 150, a sensor unit 160, an expansion interface unit 170, and the like.
As shown in FIG. 2, the information display operation of the information display device 100 of this embodiment is mainly controlled by the information display execution unit 104b, the position/orientation acquisition execution unit 104b1, the target feature identification execution unit 104b2, and the related information acquisition execution unit 104b3, which are configured by expanding the information display program 110b stored in the storage unit 110 into the RAM 104 and executing it by the main control unit 101, as well as by the basic operation execution unit 104a. Alternatively, the information display device 100 of this embodiment may further include hardware blocks that realize, in hardware, operations equivalent to those of the information display execution unit 104b, the position/orientation acquisition execution unit 104b1, the target feature identification execution unit 104b2, and the related information acquisition execution unit 104b3, and these hardware blocks may control the operation of the device in place of those execution units. The position/orientation acquisition execution unit 104b1 of the information display device 100 uses the GPS information (latitude, longitude, etc.) received by the GPS reception unit 161 to acquire map information around the current position from the map data server 212, and displays the current position and its surroundings on the map on the display unit 121.
In the information display device 100 operating under the control of the basic operation execution unit 104a, when the user selects the icon 121a2 on the basic screen 121a by a tap operation or the like, the "information display program" is executed; the basic operation execution unit 104a activates the information display execution unit 104b and transfers control to the information display execution unit 104b.
In this embodiment, it is assumed that, in real space, the user who owns the information display device 100 is located near the intersection where the main road 301 and the side road 302 cross in a T shape, in a shopping street lined with the stores 311 to 315 and the like.
Claims (15)
- An information display device capable of displaying related information of a feature, comprising:
a position information acquisition unit that acquires current position information of the information display device;
an orientation information acquisition unit that acquires orientation information of the information display device when the information display device is pointed at an intended feature;
a map information storage unit that stores map information;
a target feature identification execution unit that identifies the intended feature as a target feature by referring to the map information, using the current position information and the orientation information;
a unique information acquisition unit that acquires unique information on the target feature;
a related information acquisition unit that acquires the related information of the intended feature by a search process based on the unique information; and
a display unit that displays the related information of the intended feature,
wherein the target feature identification execution unit identifies, as the target feature, based on the current position information and the orientation information of the information display device, the feature on the map obtained from the map information that the direction in which the information display device is pointed, extending from the current position of the information display device on that map, intersects at the position closest to the information display device.
- An information display device capable of displaying related information of a feature, comprising:
a position information acquisition unit that acquires current position information of the information display device;
an orientation information acquisition unit that acquires orientation information of the information display device when the information display device is pointed at an intended feature;
a map information storage unit that stores map information;
a target feature identification execution unit that identifies the intended feature as a target feature by referring to the map information stored in the map information storage unit, using the current position information and the orientation information;
a unique information acquisition unit that acquires unique information on the target feature;
a communication unit that transmits the unique information to a search server and receives the related information of the intended feature from the search server; and
a display unit that displays the related information of the intended feature received by the communication unit,
wherein the target feature identification execution unit identifies, as the target feature, based on the current position information and the orientation information of the information display device, the feature on the map obtained from the map information that the direction in which the information display device is pointed, extending from the current position of the information display device on that map, intersects at the position closest to the information display device.
- The information display device according to claim 1 or claim 2, wherein
the map information is two-dimensional map information, and
the target feature identification execution unit determines, based on the current position information and the orientation information, the point on the two-dimensional coordinate plane at which the direction in which the information display device is pointed intersects the outline of the feature at the closest position, and identifies that feature as the target feature.
- The information display device according to claim 1 or claim 2, further comprising
an image input unit that inputs image information of the intended feature,
wherein the display unit displays a gaze mark, indicating the presence or absence of the related information of the intended feature, superimposed on the image information of the intended feature input by the image input unit.
- The information display device according to claim 4, further comprising
an instruction input unit that inputs an operation instruction from the user,
wherein the display unit displays the related information of the intended feature when an instruction to select the gaze mark is input via the instruction input unit.
- The information display device according to claim 5, wherein
the display shape of the gaze mark differs according to the display format of the related information of the intended feature that is displayed when an instruction to select the gaze mark is input via the instruction input unit.
- The information display device according to claim 4, further comprising
a recording processing unit that records an image data file on a storage medium,
wherein the image data file includes at least the image information input from the image input unit and the related information of the intended feature.
- The information display device according to claim 1 or claim 2, wherein,
in order to acquire the unique information of the intended feature, guidance prompting the user to hold the information display device pointed toward the intended feature is presented on a feature information display screen displayed on the display unit in an initial state.
- The information display device according to claim 8, wherein
the process by which the unique information acquisition unit acquires the unique information of the intended feature is started when the state in which the information display device is pointed at the intended feature has continued for a predetermined time or longer.
- The information display device according to claim 8, wherein
the process by which the unique information acquisition unit acquires the unique information of the intended feature is started when an information acquisition icon provided on the initial-state display screen is selected.
- The information display device according to claim 1 or claim 2, wherein
the map information is two-dimensional map information,
the process by which the unique information acquisition unit acquires the unique information of the intended feature is started when the state in which the information display device is pointed at the intended feature has continued for a predetermined time or longer, or when an information acquisition icon provided on the initial-state display screen is selected, and
the target feature identification execution unit identifies, based on the current position information and the orientation information, the feature on the two-dimensional map that the direction in which the information display device is pointed intersects at the closest position as the target feature.
- An information display program for an information display device capable of displaying related information of a feature, the information display device comprising a computer having a main control unit and a storage area, the program causing the computer to execute:
a position information acquisition step of acquiring current position information of the information display device;
an orientation information acquisition step of acquiring orientation information of the information display device when the information display device is pointed at an intended feature;
a map information storage step of storing map information in the storage area;
a target feature identification step of identifying the intended feature as a target feature by referring to the map information, using the current position information and the orientation information;
a unique information acquisition step of acquiring unique information on the target feature by referring to the map information;
a related information acquisition step of acquiring the related information of the intended feature by a search process based on the unique information; and
a display step of displaying the related information of the intended feature,
wherein, in the target feature identification step, the feature on the map obtained from the map information that the direction in which the information display device is pointed, extending from the current position of the information display device on that map, intersects at the position closest to the information display device is identified as the target feature, based on the current position information and the orientation information of the information display device.
- The information display program according to claim 12, wherein,
in the related information acquisition step, a network search using the unique information of the intended feature as a keyword is performed to acquire the related information on the intended feature.
- The information display program according to claim 12, wherein the related information acquisition step causes the computer to execute:
a communication step of transmitting the unique information acquired in the unique information acquisition step to a search server and receiving the related information of the intended feature from the search server; and
a display step of displaying the related information of the intended feature.
- The information display program according to claim 12, further comprising
an image input step of inputting image information of the intended feature,
wherein the display step displays a gaze mark, indicating the presence or absence of the related information of the intended feature, superimposed on the image information of the intended feature input in the image input step.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016503805A JP6145563B2 (ja) | 2014-02-18 | 2014-02-18 | 情報表示装置 |
PCT/JP2014/053765 WO2015125210A1 (ja) | 2014-02-18 | 2014-02-18 | 情報表示装置及び情報表示プログラム |
US15/114,992 US20160343156A1 (en) | 2014-02-18 | 2014-02-18 | Information display device and information display program |
CN201480073172.XA CN105917329B (zh) | 2014-02-18 | 2014-02-18 | 信息显示装置和信息显示程序 |
US17/529,638 US20220076469A1 (en) | 2014-02-18 | 2021-11-18 | Information display device and information display program |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2014/053765 WO2015125210A1 (ja) | 2014-02-18 | 2014-02-18 | 情報表示装置及び情報表示プログラム |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/114,992 A-371-Of-International US20160343156A1 (en) | 2014-02-18 | 2014-02-18 | Information display device and information display program |
US17/529,638 Continuation US20220076469A1 (en) | 2014-02-18 | 2021-11-18 | Information display device and information display program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015125210A1 true WO2015125210A1 (ja) | 2015-08-27 |
Family
ID=53877751
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/053765 WO2015125210A1 (ja) | 2014-02-18 | 2014-02-18 | 情報表示装置及び情報表示プログラム |
Country Status (4)
Country | Link |
---|---|
US (2) | US20160343156A1 (ja) |
JP (1) | JP6145563B2 (ja) |
CN (1) | CN105917329B (ja) |
WO (1) | WO2015125210A1 (ja) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170046891A1 (en) * | 2015-08-12 | 2017-02-16 | Tyco Fire & Security Gmbh | Systems and methods for location identification and tracking using a camera |
USD806743S1 (en) * | 2016-08-01 | 2018-01-02 | Facebook, Inc. | Display screen with animated graphical user interface |
CN108255046B (zh) | 2016-12-28 | 2020-06-09 | 卡西欧计算机株式会社 | 电子设备、显示控制方法以及记录介质 |
CN109974733A (zh) * | 2019-04-02 | 2019-07-05 | 百度在线网络技术(北京)有限公司 | 用于ar导航的poi显示方法、装置、终端和介质 |
WO2023091506A1 (en) * | 2021-11-16 | 2023-05-25 | Figma, Inc. | Commenting feature for graphic design systems |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004531791A (ja) * | 2001-01-24 | 2004-10-14 | ジオベクター コーポレーション | 物をアドレスするポインティング・システム |
WO2010150643A1 (ja) * | 2009-06-22 | 2010-12-29 | 兵庫県 | 情報システム、サーバ装置、端末装置、情報処理方法、およびプログラム |
JP2012141768A (ja) * | 2010-12-28 | 2012-07-26 | Dainippon Printing Co Ltd | 携帯用端末装置、情報閲覧用プログラム、サーバ装置及び、閲覧情報提供用プログラム |
JP2013080326A (ja) * | 2011-10-03 | 2013-05-02 | Sony Corp | 画像処理装置、画像処理方法及びプログラム |
JP2013142956A (ja) * | 2012-01-10 | 2013-07-22 | Pasuko:Kk | 撮影対象検索システム |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3918813B2 (ja) * | 2001-10-23 | 2007-05-23 | ソニー株式会社 | データ通信システム、データ送信装置、並びにデータ受信装置 |
US8275394B2 (en) * | 2008-03-20 | 2012-09-25 | Nokia Corporation | Nokia places floating profile |
US9736368B2 (en) * | 2013-03-15 | 2017-08-15 | Spatial Cam Llc | Camera in a headframe for object tracking |
EP2500814B1 (en) * | 2011-03-13 | 2019-05-08 | LG Electronics Inc. | Transparent display apparatus and method for operating the same |
US9996150B2 (en) * | 2012-12-19 | 2018-06-12 | Qualcomm Incorporated | Enabling augmented reality using eye gaze tracking |
- 2014
- 2014-02-18 CN CN201480073172.XA patent/CN105917329B/zh active Active
- 2014-02-18 WO PCT/JP2014/053765 patent/WO2015125210A1/ja active Application Filing
- 2014-02-18 JP JP2016503805A patent/JP6145563B2/ja active Active
- 2014-02-18 US US15/114,992 patent/US20160343156A1/en not_active Abandoned
- 2021
- 2021-11-18 US US17/529,638 patent/US20220076469A1/en active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004531791A (ja) * | 2001-01-24 | 2004-10-14 | ジオベクター コーポレーション | 物をアドレスするポインティング・システム |
WO2010150643A1 (ja) * | 2009-06-22 | 2010-12-29 | 兵庫県 | 情報システム、サーバ装置、端末装置、情報処理方法、およびプログラム |
JP2012141768A (ja) * | 2010-12-28 | 2012-07-26 | Dainippon Printing Co Ltd | 携帯用端末装置、情報閲覧用プログラム、サーバ装置及び、閲覧情報提供用プログラム |
JP2013080326A (ja) * | 2011-10-03 | 2013-05-02 | Sony Corp | 画像処理装置、画像処理方法及びプログラム |
JP2013142956A (ja) * | 2012-01-10 | 2013-07-22 | Pasuko:Kk | 撮影対象検索システム |
Also Published As
Publication number | Publication date |
---|---|
JP6145563B2 (ja) | 2017-06-14 |
US20160343156A1 (en) | 2016-11-24 |
CN105917329B (zh) | 2019-08-30 |
US20220076469A1 (en) | 2022-03-10 |
CN105917329A (zh) | 2016-08-31 |
JPWO2015125210A1 (ja) | 2017-03-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102021050B1 (ko) | 내비게이션 정보를 제공하는 방법, 기계로 읽을 수 있는 저장 매체, 이동 단말 및 서버 | |
US20220076469A1 (en) | Information display device and information display program | |
KR102627612B1 (ko) | 증강현실을 이용한 주변 정보 표시 방법 및 그 전자 장치 | |
US10194273B2 (en) | Positioning information processing method and apparatus | |
EP3748533B1 (en) | Method, apparatus, and storage medium for obtaining object information | |
KR101769852B1 (ko) | 드론을 이용한 부동산 거래 중개 시스템 | |
JP2021520540A (ja) | カメラの位置決め方法および装置、端末並びにコンピュータプログラム | |
CN110457571B (zh) | 获取兴趣点信息的方法、装置、设备及存储介质 | |
US20230284000A1 (en) | Mobile information terminal, information presentation system and information presentation method | |
CN110865756A (zh) | 图像标注方法、装置、设备及存储介质 | |
US20140292636A1 (en) | Head-Worn Infrared-Based Mobile User-Interface | |
CN112149659B (zh) | 定位方法及装置、电子设备和存储介质 | |
CN112818240A (zh) | 评论信息的展示方法、装置、设备及计算机可读存储介质 | |
CN110532474B (zh) | 信息推荐方法、服务器、系统以及计算机可读存储介质 | |
CN111754564B (zh) | 视频展示方法、装置、设备及存储介质 | |
CN110990728A (zh) | 兴趣点信息的管理方法、装置、设备及存储介质 | |
CN111008083A (zh) | 页面通信方法、装置、电子设备及存储介质 | |
JP2008111693A (ja) | 移動体装置および目標物情報検索方法 | |
JP2016133701A (ja) | 情報提供システム、及び情報提供方法 | |
JP2006047147A (ja) | 情報提供装置 | |
CN109582200B (zh) | 一种导航信息显示方法及移动终端 | |
CN111984755A (zh) | 确定目标停车点的方法、装置、电子设备及存储介质 | |
CN112804481B (zh) | 监控点位置的确定方法、装置及计算机存储介质 | |
JP2019174548A (ja) | 制御装置、電子機器、制御方法、及び制御プログラム | |
WO2021200187A1 (ja) | 携帯端末および情報処理方法、並びに記憶媒体 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14883271 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2016503805 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 15114992 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 14883271 Country of ref document: EP Kind code of ref document: A1 |