WO2018150515A1 - Image database construction device, position and inclination estimation device, and image database construction method - Google Patents
- Publication number
- WO2018150515A1 (PCT application PCT/JP2017/005709)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- unit
- inclination
- image database
- database
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/025—Services making use of location information using location based information parameters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/33—Services specially adapted for particular environments, situations or purposes for indoor environments, e.g. buildings
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W64/00—Locating users or terminals or network equipment for network management purposes, e.g. mobility management
- H04W64/003—Locating users or terminals or network equipment for network management purposes, e.g. mobility management locating network equipment
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S707/00—Data processing: database and file management or data structures
- Y10S707/912—Applications of a database
- Y10S707/913—Multimedia
- Y10S707/915—Image
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S707/00—Data processing: database and file management or data structures
- Y10S707/99941—Database schema or data structure
- Y10S707/99948—Application of database or data structure, e.g. distributed, multimedia, or image
Definitions
- The present invention relates to an image database construction device that generates an image database for estimating the position of a terminal indoors, a position and inclination estimation device that estimates the position of the terminal using the image database, and an image database construction method.
- As a method for obtaining an accurate position and inclination of a terminal, there has been a method that uses images registered in advance (see, for example, Non-Patent Document 1). In this method, all indoor images are linked with position and inclination information, and the resulting image data are accumulated and prepared as a database. When estimating the position of the terminal, the image taken at the terminal position is collated with all the images in the database, the most similar image is retrieved, and the position and inclination of that image are output.
- Because it collates the photographed image against all indoor images, the conventional position and inclination estimation method has the following problems. The first problem is degraded real-time performance: as the number of image data stored in the database increases, the calculation cost increases and real-time performance deteriorates. The second problem is that it is difficult to distinguish the same scene: if the database stores similar data at different positions, i.e., image data whose positions differ but whose subject shapes are similar, an inaccurate matching result may be obtained.
- a satellite positioning system such as GPS (Global Positioning System) may be used as a means for obtaining the position of the terminal.
- However, indoor positioning with such a satellite positioning system is difficult, and its positioning accuracy is insufficient to obtain an accurate position of the terminal. It is therefore difficult to apply to an apparatus that estimates position and inclination indoors.
- Moreover, the method using a satellite positioning system does not provide accurate matching results because its positioning accuracy in the height direction is insufficient. From these points, it is difficult to estimate the position and inclination of an indoor terminal using a satellite positioning system.
- The position and inclination estimation apparatus includes: an image database storing image information in which images having shooting position and inclination information are grouped into a plurality of areas; an area detection unit that detects which of the plurality of areas the terminal belongs to; an image acquisition unit that acquires an image of the imaging target; a database collation unit that collates the image acquired by the image acquisition unit against only those images in the image database that belong to the same area as the area detected by the area detection unit; and a position and inclination estimation unit that outputs the position and inclination of the image matched by the database collation unit as the estimation result.
- The image database construction device associates a captured image, the acquisition position and inclination of the captured image, and region information, and generates the associated image information as an image database.
- The position and inclination estimation device collates only those images, among the images stored in the image database, that belong to the same region as the region detected by the region detection unit. Real-time performance can thereby be improved and an accurate position and inclination estimation result obtained.
- FIG. 1 is a configuration diagram of an image database construction apparatus according to the present embodiment.
- the illustrated image database construction apparatus includes an area detection unit 1, a sensor information acquisition unit 2, a positional relationship calculation unit 3, and an image database generation unit 4.
- The area detection unit 1 is a processing unit that detects, as area information, in which of a plurality of indoor areas the device is located.
- the sensor information acquisition unit 2 is an information acquisition unit provided with an image acquisition unit 2a and a distance measurement unit 2b.
- The image acquisition unit 2a is a device that acquires indoor images.
- the distance measurement unit 2b is a processing unit that measures distance information indicating the distance between the shooting target of the shot image and the acquisition position of the shot image.
- the positional relationship calculation unit 3 is a calculation unit that calculates the acquisition position and inclination of each captured image based on the captured image and distance information acquired by the sensor information acquisition unit 2.
- The image database generation unit 4 is a processing unit that associates, as image information, the acquisition position and inclination of each captured image calculated by the positional relationship calculation unit 3, the captured image acquired by the sensor information acquisition unit 2, and the region information detected by the region detection unit 1, and generates this image information as the image database 5.
- FIG. 2 is a hardware configuration diagram of the image database construction apparatus shown in FIG.
- the image database construction device 100 includes an RFID (Radio Frequency IDentification) receiver 101, a camera 102, a distance sensor 103, and a calculator 104.
- the RFID receiver 101 is a receiver that communicates with an RFID tag 105 provided for each of a plurality of indoor areas and detects which RFID tag 105 is recognized.
- the camera 102 is a device that acquires a color image.
- The distance sensor 103 is a device that acquires a distance image (an image holding the distance corresponding to each pixel).
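A distance image of this kind can be turned into 3D geometry by back-projecting each pixel with a pinhole camera model. The following is a minimal sketch of that conversion; the intrinsic parameters (fx, fy, cx, cy) are illustrative values, not taken from the patent.

```python
# Back-project a depth-image pixel to a 3D point with a pinhole camera model.
# The intrinsics below are illustrative placeholder values.

def backproject(u, v, depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5):
    """Convert pixel (u, v) with depth in metres to a camera-frame (X, Y, Z)."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# A pixel at the principal point lies on the optical axis: X = Y = 0.
print(backproject(319.5, 239.5, 2.0))  # (0.0, 0.0, 2.0)
```

Applying this to every pixel of one distance image yields the point cloud that the positional relationship calculation unit joins across frames.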
- The computer 104 is a device that acquires data from the RFID receiver 101, the camera 102, and the distance sensor 103 and generates the image database 5 of FIG. 1 based on these data; it is implemented by, for example, a personal computer or a smartphone.
- the RFID receiver 101 in FIG. 2 constitutes the area detection unit 1 in FIG. 1, the camera 102 constitutes the image acquisition unit 2a, and the distance sensor 103 constitutes the distance measurement unit 2b.
- The computer 104 realizes the functions of the positional relationship calculation unit 3 and the image database generation unit 4; these functions are realized by the processor of the computer 104 executing software corresponding to each unit.
- the area detection unit 1 to image database generation unit 4 of FIG. 1 have the following functions.
- The area detection unit 1 communicates with the RFID tag 105 and recognizes its attribute (which area it indicates).
- The sensor information acquisition unit 2 acquires a plurality of color images and the distance information paired with each of these color images.
- the positional relationship calculation unit 3 connects the information acquired by the sensor information acquisition unit 2 to create three-dimensional point cloud data, and estimates the acquisition position and inclination of each color image and distance information pair.
- the image database generation unit 4 associates the acquired position and inclination of the captured image estimated by the positional relationship calculation unit 3, the captured image, and the region information as image information, and generates this image information as the image database 5.
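The association described above can be sketched as a simple in-memory structure: one record per captured image, grouped under the region (RFID) that was active when it was taken. This is an illustrative sketch only; the field and class names are not from the patent.

```python
# Minimal sketch of the image-database record and its grouping by region.
# Class and field names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class ImageRecord:
    image: bytes          # captured colour image
    distance: bytes       # paired distance image
    position: tuple       # acquisition position, e.g. (x, y, z)
    inclination: tuple    # acquisition inclination, e.g. (roll, pitch, yaw)

@dataclass
class ImageDatabase:
    groups: dict = field(default_factory=dict)  # region id -> list of records

    def register(self, region_id, record):
        """File a record under the region detected at acquisition time."""
        self.groups.setdefault(region_id, []).append(record)

db = ImageDatabase()
db.register("RFID(1)", ImageRecord(b"...", b"...", (0.0, 1.0, 0.5), (0.0, 0.0, 0.0)))
db.register("RFID(2)", ImageRecord(b"...", b"...", (3.2, 1.0, 0.5), (0.0, 0.0, 90.0)))
print(sorted(db.groups))  # ['RFID(1)', 'RFID(2)']
```

The grouping key is what later lets the collation step touch only one region's records instead of the whole database.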
- FIG. 3 shows a state in which two rooms are viewed from above.
- Each room (room 1, room 2) corresponds to each area, and a first RFID tag 105a and a second RFID tag 105b are provided near the entrance of each room.
- the operation of constructing such an image database for two rooms will be described with reference to the flowchart of FIG.
- First, the region detection unit 1 recognizes the first RFID tag 105a (region detection step: step ST1). Thereafter, a color image of room 1 is acquired by the image acquisition unit 2a of the sensor information acquisition unit 2 (information acquisition step: step ST2), and a distance image is acquired by the distance measurement unit 2b (information acquisition step: step ST3).
- the positional relationship calculation unit 3 joins the data acquired by the sensor information acquisition unit 2, and calculates the position and inclination of the device that performs color image acquisition and distance image measurement (calculation step: step ST4).
- The position and inclination obtained here are, for example, values expressed relative to the position and inclination at the start of measurement.
- As a method for connecting each pair of color image and distance information, there is a method that uses points that are features of the image (see, for example, Taguchi, Y., Jian, Y.-D., Ramalingam, S.: Point-Plane SLAM for Hand-Held 3D Sensors, Robotics and Automation (ICRA), 2013 IEEE International Conference on, 6-10 May 2013).
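Whatever registration method is used, the result is a chain of per-frame relative motions that are composed into poses expressed in the frame of the first measurement. A minimal sketch of that composition with 4x4 homogeneous transforms (illustrative only, not the cited method):

```python
# Accumulate per-frame relative transforms into poses relative to frame 0.
import numpy as np

def translation(tx, ty, tz):
    """Build a 4x4 homogeneous transform that only translates."""
    t = np.eye(4)
    t[:3, 3] = (tx, ty, tz)
    return t

pose = np.eye(4)  # frame 0 defines the origin of the map
# Suppose registration found a 1 m forward motion between consecutive frames.
for delta in [translation(1, 0, 0), translation(1, 0, 0)]:
    pose = pose @ delta  # compose: world_from_frame_k = world_from_frame_{k-1} @ delta

print(pose[:3, 3])  # [2. 0. 0.]
```

The rotation part would be filled in the same way; only the translation is shown here to keep the sketch short.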
- The measurement is then ended (step ST5: YES).
- Next, the positional relationship calculation unit 3 increases the accuracy of the sensor position and inclination using Bundle Adjustment or the like (step ST6). It is then determined whether the measurement of all rooms has been completed (step ST7); if not, the process returns to step ST1 and the above processing is repeated, and if so, the construction operation ends. In this case, since room 2 has not yet been measured, "NO" is determined in step ST7, a color image and distance information are acquired at each position as indicated by the dotted line 111 in FIG. 3, and the processes up to step ST6 are performed.
- The image database generation unit 4 then creates the database shown in FIG. 5 (image database generation step). As shown in FIG. 5, a set of image, distance information, position, and inclination is registered for each RFID. For comparison, a conventional database is shown in FIG. 6. Compared with the database of FIG. 6, the database produced by the image database construction apparatus of this embodiment has its images grouped by RFID.
- FIG. 7 is a configuration diagram of the position and inclination estimation apparatus.
- the position and inclination estimation apparatus according to the present embodiment includes an area detection unit 1, an image acquisition unit 2 a, a database collation unit 11, and a position and inclination estimation unit 12.
- the region detection unit 1 and the image acquisition unit 2a are the same as the region detection unit 1 and the image acquisition unit 2a in the image database construction device, but are provided in a terminal that is an estimation target of the position and inclination estimation device.
- The database collation unit 11 collates the image acquired by the image acquisition unit 2a against only those images stored in the image database 5 that have the same region information as the region information detected by the region detection unit 1.
- the position and inclination estimation unit 12 is a processing unit that outputs the position and inclination of the image specified by the database collation unit 11 as an estimation result.
- the image database 5 is a database generated by the image database construction device.
- FIG. 8 is a hardware configuration diagram of the position and inclination estimation apparatus shown in FIG.
- the position and inclination estimation apparatus 200 includes an RFID receiver 101, a camera 102, and a computer 201.
- the RFID receiver 101 and the camera 102 are the same as the RFID receiver 101 and the camera 102 in the image database construction apparatus 100.
- The computer 201 is a device that acquires data from the RFID receiver 101 and the camera 102 and estimates the position and inclination of the terminal based on these data; it is implemented by, for example, a personal computer or a smartphone.
- the RFID receiver 101 in FIG. 8 constitutes the area detection unit 1 in FIG. 7, and the camera 102 constitutes the image acquisition unit 2a.
- The computer 201 realizes the functions of the database collation unit 11 and the position and inclination estimation unit 12; these functions are realized by the processor of the computer 201 executing software corresponding to each unit.
- the area detection unit 1 to the position and inclination estimation unit 12 of FIG. 7 have the following functions.
- The area detection unit 1 communicates with the RFID tag 105 and recognizes its attribute (which area it indicates).
- the image acquisition unit 2a acquires a color image taken from the terminal position.
- The database collation unit 11 collates against the image database 5 based on the color image acquired by the image acquisition unit 2a and the region information detected by the region detection unit 1, and the position and inclination estimation unit 12 outputs the position and inclination of the image matched by the database collation unit 11 as the estimation result.
- First, the user brings the RFID receiver 101 into contact with the RFID tag 105 to read it (step ST11), thereby inputting information about the room where the terminal is located.
- This input may be entered manually by the user into the database collation unit 11 via the RFID receiver 101, or the information of the RFID tag 105 may be given directly from the RFID receiver 101 to the database collation unit 11 as room information.
- Here, it is assumed that the terminal has entered room 1, which has the first RFID tag 105a shown in FIG. 3.
- the database collation unit 11 collates with the data in the image database 5 (step ST13).
- At this time, only the images of RFID (1) in the image database shown in FIG. 5 are collated.
- From these, the image most similar to the image acquired in step ST12 is extracted.
- Next, the position and inclination estimation unit 12 estimates the position and inclination of the terminal based on the distance information attached to the image extracted in step ST13 together with the position t and inclination R of the terminal (step ST14).
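Steps ST11 to ST14 can be sketched as a lookup that is restricted to the detected region before any image comparison is done. This is an illustrative sketch only: the similarity measure below (absolute difference of a scalar feature) is a placeholder, not the patent's image matcher, and the data layout is assumed.

```python
# Region-restricted matching: only the detected region's images are compared.

def estimate_pose(db, region_id, query_feature):
    """Return (position, inclination) of the most similar stored image,
    searching only records filed under region_id."""
    candidates = db.get(region_id, [])
    if not candidates:
        return None
    best = min(candidates, key=lambda rec: abs(rec["feature"] - query_feature))
    return best["position"], best["inclination"]

db = {
    "RFID(1)": [
        {"feature": 0.2, "position": (1.0, 2.0, 0.0), "inclination": (0, 0, 0)},
        {"feature": 0.9, "position": (4.0, 2.0, 0.0), "inclination": (0, 0, 90)},
    ],
    "RFID(2)": [
        {"feature": 0.21, "position": (9.0, 9.0, 0.0), "inclination": (0, 0, 0)},
    ],
}
# RFID(2) holds an even closer feature value, but it is never searched,
# which is exactly how same-looking scenes in other rooms are excluded.
print(estimate_pose(db, "RFID(1)", 0.22))  # ((1.0, 2.0, 0.0), (0, 0, 0))
```

The cost of one query scales with the size of one group rather than the whole database, which is the first effect claimed below.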
- the first effect is a reduction in calculation time.
- Collation against the image database 5 involves only the image data corresponding to the specified area, so the number of collated images can be reduced compared with the conventional method.
- In the above example, collation is performed only against the images of RFID (1) and not against the images of RFID (2), so the calculation time can be reduced.
- The second effect is improved distinguishability of similar scenes. For example, even when room 1 and room 2 shown in FIG. 3 have the same interior pattern, the terminal position can be estimated accurately because room 1 and room 2 are identified before collation.
- In the above example, RFID is used as the region specifying means. However, any device that can uniquely specify the region, such as a beacon, can be used in the same manner.
- As described above, the image database construction device of this embodiment includes: an area detection unit that detects which of a plurality of areas a captured image belongs to; an information acquisition unit that acquires distance information indicating the distance between the imaging target of the captured image and the acquisition position of the captured image; a calculation unit that calculates the shooting position and inclination of the captured image based on the distance information; and an image database generation unit that generates an image database in which the shooting position and inclination and the captured image are associated as image information, with the image information of images belonging to the same region grouped together. Therefore, when estimating the position and inclination from an image, real-time performance can be improved and an accurate position and inclination estimation result obtained, and such an image database can be provided.
- Moreover, since the plurality of regions are associated one-to-one with the regions detected by the region detection unit, the regions can be identified easily and reliably.
- Moreover, since the area detection unit detects the area using RFID, the area can be identified easily and reliably.
- The position and inclination estimation device of this embodiment includes: an image database that stores image information in which images having shooting position and inclination information are grouped into a plurality of areas; an area detection unit that detects which of the plurality of areas the device belongs to; an image acquisition unit that acquires an image of the object to be photographed; a database collation unit that collates the image acquired by the image acquisition unit against only those images stored in the image database that belong to the same region; and a position and inclination estimation unit that outputs the position and inclination of the matched image as the estimation result.
- Therefore, real-time performance can be improved and accurate position and inclination estimation results can be obtained.
- Moreover, since the plurality of areas are associated one-to-one with the areas detected by the area detection unit, the areas can be identified easily and reliably.
- Moreover, since the area detection unit detects the area using RFID, the area can be identified easily and reliably.
- The same applies to the image database construction method that uses the image database construction device described in Embodiment 1, in which it is detected which of a plurality of regions a captured image belongs to.
- Embodiment 2 uses an image database construction device that performs grouping based on signal values obtained at a plurality of indoor positions and generates an image database, and a position and inclination estimation device that estimates position and inclination using the image database constructed by that device.
- FIG. 10 is a configuration diagram of the image database construction apparatus according to the second embodiment.
- the illustrated image database construction apparatus includes an area detection unit 1a, a sensor information acquisition unit 2, a positional relationship calculation unit 3, and an image database generation unit 4a.
- The area detection unit 1a is a processing unit that measures, at a plurality of indoor positions, the values of preset signals as values corresponding to areas.
- The image database generation unit 4a is a processing unit that groups the position and inclination information of each captured image calculated by the positional relationship calculation unit 3 using the signal values measured by the area detection unit 1a, associates each grouped captured image with its acquisition position, inclination, and signal value, and generates the associated image information as the image database 5a.
- FIG. 11 is a hardware configuration diagram of the image database construction apparatus shown in FIG.
- the image database construction device 100a includes a wireless LAN receiver 106, a camera 102, a distance sensor 103, and a computer 104a.
- the wireless LAN receiver 106 is a receiver for communicating with a wireless LAN access point 107 provided for each of a plurality of indoor areas and measuring the signal strength (RSSI: Received Signal Strength Indicator).
- the camera 102 and the distance sensor 103 have the same configuration as the camera 102 and the distance sensor 103 in the image database construction device 100 of the first embodiment.
- the wireless LAN receiver 106 in FIG. 11 constitutes the area detection unit 1a in FIG. 10, the camera 102 constitutes the image acquisition unit 2a, and the distance sensor 103 constitutes the distance measurement unit 2b.
- The computer 104a realizes the functions of the positional relationship calculation unit 3 and the image database generation unit 4a; these functions are realized by the processor of the computer 104a executing software corresponding to each unit.
- FIG. 12 shows a state where two rooms are viewed from above. An access point 107a and an access point 107b are installed in the room 1, and an access point 107c and an access point 107d are installed in the room 2. The operation of constructing such an image database for two rooms will be described with reference to the flowchart of FIG.
- the positional relationship calculation unit 3 increases the accuracy of the position and inclination of the sensor using Bundle Adjustment or the like (step ST26).
- The image database generation unit 4a groups the color images and distance information based on the signal strength obtained in step ST24 (image database generation step: step ST27).
- As the grouping method, an unsupervised learning method such as k-means or spectral clustering can be used.
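As a concrete illustration of such grouping, the RSSI values observed at each measurement position can be treated as a vector (one component per access point) and clustered with k-means. The sketch below hand-rolls a tiny k-means for self-containment; in practice a library implementation such as scikit-learn's KMeans would be used, and the RSSI numbers are made up.

```python
# Unsupervised grouping of measurement positions by their RSSI vectors.
import math

def kmeans(points, centroids, iters=10):
    """Plain Lloyd's algorithm: assign each point to its nearest centroid,
    then move each centroid to the mean of its assigned points."""
    for _ in range(iters):
        groups = [[] for _ in centroids]
        for p in points:
            i = min(range(len(centroids)),
                    key=lambda c: math.dist(p, centroids[c]))
            groups[i].append(p)
        centroids = [tuple(sum(x) / len(g) for x in zip(*g)) if g else c
                     for g, c in zip(groups, centroids)]
    return centroids, groups

# Illustrative RSSI (dBm) from two access points at four measurement positions:
# the first two readings were taken in one room, the last two in another.
rssi = [(-40, -80), (-42, -78), (-81, -39), (-79, -41)]
centroids, groups = kmeans(rssi, centroids=[rssi[0], rssi[2]])
print([len(g) for g in groups])  # [2, 2]
```

Each resulting group plays the role of one "signal" row in the database of FIG. 14, and the images acquired at the grouped positions are filed under it.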
- In step ST28, it is determined whether the measurement of all rooms has been completed; if not, the process from step ST21 is repeated, and if so, the construction operation ends. In this case, since room 2 has not yet been measured, "NO" is determined in step ST28, a color image and distance information are acquired at each position as indicated by the dotted line 113 in FIG. 12, and the processes of steps ST21 to ST27 are performed.
- Room 1 and room 2 are measured in this way, and when the measurement results are converted into a database by the image database generation unit 4a, the image database shown in FIG. 14 is created (image database generation step). For each signal, a set of image, distance information, position, and inclination is registered.
- Signal 1 and signal 2 here correspond to room 1 and room 2 shown in FIG. 12, but the groups do not necessarily correspond to rooms, depending on the size of the room and the number of access points 107. For example, when a room is large and contains many access points 107, a plurality of groups may be formed within one room.
- FIG. 15 is a configuration diagram of the position and inclination estimation apparatus.
- the position and inclination estimation apparatus according to the present embodiment includes an area detection unit 1a, an image acquisition unit 2a, a database collation unit 11a, a position and inclination estimation unit 12a, and an image database 5a.
- the image acquisition unit 2a and the region detection unit 1a are the same as the image acquisition unit 2a and the region detection unit 1a in the image database construction device.
- The database collation unit 11a collates the image acquired by the image acquisition unit 2a against only those images stored in the image database 5a that have the same signal value as the signal value detected by the area detection unit 1a.
- the position and inclination estimation unit 12a is a processing unit that outputs the position and inclination of the image specified by the database collation unit 11a as an estimation result.
- the image database 5a is a database generated by the image database construction device 100a.
- FIG. 16 is a hardware configuration diagram of the position and inclination estimation apparatus shown in FIG.
- the position and inclination estimation apparatus 200a includes a wireless LAN receiver 106, a camera 102, and a computer 201a.
- the wireless LAN receiver 106 and the camera 102 are the same as the wireless LAN receiver 106 and the camera 102 of the image database construction apparatus 100a shown in FIG.
- The computer 201a is a device that acquires data from the wireless LAN receiver 106 and the camera 102 and estimates the position and inclination of the terminal based on these data; it is implemented by, for example, a personal computer or a smartphone.
- the wireless LAN receiver 106 in FIG. 16 constitutes the area detection unit 1a in FIG. 15, and the camera 102 constitutes the image acquisition unit 2a.
- The computer 201a realizes the functions of the database collation unit 11a and the position and inclination estimation unit 12a; these functions are realized by the processor of the computer 201a executing software corresponding to each unit.
- the image acquisition unit 2a to the position and inclination estimation unit 12a in FIG. 15 have the following functions.
- the image acquisition unit 2a acquires a color image taken from the terminal position.
- the area detection unit 1a obtains the signal strength with each access point 107.
- The database collation unit 11a collates against the image database 5a based on the color image acquired by the image acquisition unit 2a and the signal strength measured by the area detection unit 1a, and the position and inclination estimation unit 12a outputs the position and inclination of the image matched by the database collation unit 11a as the estimation result.
- In operation, the database collation unit 11a first collates against the image database 5a based on the signal strength (step ST32). Then, when an image is acquired by the image acquisition unit 2a (step ST33), the database collation unit 11a collates images among the collation results of step ST32 (step ST34). That is, the database collation unit 11a first roughly estimates the terminal position based on the signal strength from the region detection unit 1a, and then collates against the image data corresponding to that rough estimate.
- In step ST32, collation is performed only against the images of signal 1 in the image database 5a shown in FIG. 14. Then, from the color images of signal 1, the one most similar to the image acquired in step ST33 is extracted. Next, the position and inclination estimation unit 12a estimates the terminal position and inclination based on the distance information attached to the image extracted in step ST34 together with the position t and inclination R of the terminal (step ST35).
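The two-stage collation of steps ST32 to ST35 can be sketched as follows: the measured signal strength selects one group (coarse position), and only that group's images are then compared. Both similarity measures below (Euclidean distance in RSSI space, absolute difference of a scalar image feature) are placeholders for illustration, not the patent's actual matchers.

```python
# Two-stage collation: coarse match by signal strength, fine match by image.
import math

def collate(db, rssi, image_feature):
    # Coarse stage (cf. ST32): nearest group centroid in RSSI space.
    group = min(db, key=lambda g: math.dist(db[g]["rssi"], rssi))
    # Fine stage (cf. ST34): most similar image inside the selected group only.
    best = min(db[group]["records"],
               key=lambda r: abs(r["feature"] - image_feature))
    # Output (cf. ST35): the matched image's position and inclination.
    return best["position"], best["inclination"]

db = {
    "signal 1": {"rssi": (-41, -79),
                 "records": [{"feature": 0.3, "position": (1, 1, 0),
                              "inclination": (0, 0, 0)}]},
    "signal 2": {"rssi": (-80, -40),
                 "records": [{"feature": 0.3, "position": (8, 1, 0),
                              "inclination": (0, 0, 180)}]},
}
# Identical image features in both groups: the signal strength disambiguates.
print(collate(db, (-43, -77), 0.3))  # ((1, 1, 0), (0, 0, 0))
```

Note that the two groups hold indistinguishable image features here; it is the coarse RSSI stage that resolves the ambiguity, which is the second effect claimed below.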
- the first effect is a reduction in calculation time.
- Collation against the image database 5a involves only the image data corresponding to the specified signal, so the number of collated images can be reduced compared with the conventional method. For example, in the above case, the plurality of color images corresponding to signal 1 shown in FIG. 14 are collated against the image acquired by the image acquisition unit 2a; since no image matching is performed against the color images corresponding to signal 2, the calculation time can be reduced.
- The second effect is improved distinguishability of similar scenes. For example, even if room 1 and room 2 shown in FIG. 12 have the same interior pattern, the position can be estimated accurately because pre-processing that roughly determines the user's position from the signal strength of each access point is performed.
- In the above example, RSSI is used as the signal value detected by the region detection unit 1a. However, the present invention is not limited to this; any one of the signal strength from a beacon, the signal strength of UWB (Ultra Wide Band), the distance to the access point (ToF: Time of Flight), or the measured value of a geomagnetic sensor may be used.
- the region detection unit measures the set signal value at a plurality of positions as a value corresponding to each region, and the image database generation unit groups the image acquisition positions and inclinations calculated by the calculation unit using the signal values measured by the region detection unit, associates the signal values, the grouped acquisition positions and inclinations, and the images as image information, and generates that image information as an image database. Therefore, a database can be provided that improves real-time performance and yields an accurate position and inclination estimation result when estimating the position and inclination from an image.
- since the region detection unit measures any one of RSSI, beacon signal strength, UWB signal strength, distance to the access point, or a geomagnetic sensor measurement value as the set signal value, no special device is required for signal measurement, and the region can be identified easily and reliably.
- the region detection unit measures the set signal value at a plurality of positions as a value corresponding to each region, the image database is grouped using those signal values, and the database collation unit collates only those images stored in the image database whose signal value is identical to the value detected by the region detection unit. As a result, when estimating the position and inclination, real-time performance is improved and an accurate position and inclination estimation result can be obtained.
- since the region detection unit measures any one of RSSI, beacon signal strength, UWB signal strength, distance to the access point, or a geomagnetic sensor measurement value as the set signal value, no special device is required for signal measurement, and the region can be identified easily and reliably.
- the image database construction device, the position and inclination estimation device, and the image database construction method according to the present invention relate to a configuration that obtains the position and inclination of a terminal using previously registered images, and are suitable for use in applications that superimpose and display additional information.
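The interchangeability of RSSI, beacon strength, UWB strength, access-point distance, and geomagnetic readings follows from treating each measurement as a fingerprint vector: only a distance in signal space is needed to identify a region. A hedged sketch of that idea, where the fingerprint table, its values, and the Euclidean metric are all assumptions for illustration:

```python
# Sketch: region detection as nearest-fingerprint lookup. Any of the
# listed signal types can fill the vector; the lookup only needs a
# distance in signal space. The dBm values below are invented.

import math

fingerprints = {
    "room1": [-40.0, -70.0],  # e.g. RSSI from APs 107a and 107b, seen from room 1
    "room2": [-75.0, -45.0],  # room 2 is closer to APs 107c and 107d
}

def detect_region(measured, table=fingerprints):
    """Return the region whose stored fingerprint is nearest."""
    return min(table, key=lambda name: math.dist(measured, table[name]))

print(detect_region([-42.0, -68.0]))  # room1
```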
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Processing Or Creating Images (AREA)
- Position Fixing By Use Of Radio Waves (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
Description
Embodiment 1.
FIG. 1 is a configuration diagram of the image database construction device according to this embodiment.
The illustrated image database construction device includes a region detection unit 1, a sensor information acquisition unit 2, a positional relationship calculation unit 3, and an image database generation unit 4. The region detection unit 1 is a processing unit that detects, as region information, in which of a plurality of indoor regions the device is located. The sensor information acquisition unit 2 is an information acquisition unit comprising an image acquisition unit 2a and a distance measurement unit 2b. The image acquisition unit 2a is a device that acquires indoor images, and the distance measurement unit 2b is a processing unit that measures distance information indicating the distance between the subject of a captured image and the acquisition position of that image. The positional relationship calculation unit 3 is a calculation unit that calculates the acquisition position and inclination of each captured image based on the captured images and distance information acquired by the sensor information acquisition unit 2. The image database generation unit 4 is a processing unit that associates, as image information, the acquisition position and inclination calculated by the positional relationship calculation unit 3, the captured image acquired by the sensor information acquisition unit 2, and the region information detected by the region detection unit 1, and generates this image information as the image database 5.
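The association performed by the image database generation unit 4 can be pictured as one record per captured image, grouped by region. The field names and types below are illustrative assumptions, not the patent's actual storage format:

```python
# Illustrative record layout for the image database 5: the region
# information from unit 1, the image from unit 2a, and the acquisition
# position t and inclination R computed by unit 3 are stored together,
# then grouped by region as the generation unit 4 does.

from dataclasses import dataclass
from collections import defaultdict

@dataclass
class ImageRecord:
    region: str          # region information from region detection unit 1
    image: bytes         # captured image from image acquisition unit 2a
    position: tuple      # acquisition position t from calculation unit 3
    inclination: tuple   # inclination R from calculation unit 3

def build_database(records):
    """Group records by region information."""
    db = defaultdict(list)
    for rec in records:
        db[rec.region].append(rec)
    return dict(db)

db = build_database([
    ImageRecord("room1", b"...", (0.0, 1.0, 0.0), (0.0, 0.0, 0.0)),
    ImageRecord("room2", b"...", (5.0, 1.0, 0.0), (0.0, 90.0, 0.0)),
])
print(sorted(db))  # ['room1', 'room2']
```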
FIG. 3 shows two rooms viewed from above. Each room (room 1, room 2) corresponds to a region, and a first RFID tag 105a and a second RFID tag 105b are provided near the entrance of each room. The operation of constructing the image database for these two rooms will be described with reference to the flowchart of FIG. 4.
FIG. 7 is a configuration diagram of the position and inclination estimation device. The position and inclination estimation device of this embodiment includes a region detection unit 1, an image acquisition unit 2a, a database collation unit 11, and a position and inclination estimation unit 12. The region detection unit 1 and the image acquisition unit 2a are the same as those in the image database construction device, but are provided in the terminal whose position and inclination are to be estimated. The database collation unit 11 is a processing unit that collates the image acquired by the image acquisition unit 2a only against those images stored in the image database 5 whose region information is identical to the region information detected by the region detection unit 1, and identifies one of those images. The position and inclination estimation unit 12 is a processing unit that outputs the position and inclination of the image identified by the database collation unit 11 as the estimation result. The image database 5 is the database generated by the image database construction device.
The user brings the RFID tag 105 into contact with the RFID receiver 101 to read the RFID tag 105 (step ST11), and the information on the room in which the terminal is located is input. This input may be entered manually by the user into the RFID receiver 101 (database collation unit 11), or the information of the RFID tag 105 may be given from the RFID receiver 101 directly to the RFID receiver 101 (database collation unit 11) as the room information. Here, it is assumed that the terminal has entered room 1, which holds the first RFID tag 105a shown in FIG. 3. In this state, when image data of room 1 is acquired by the image acquisition unit 2a (step ST12), the database collation unit 11 collates it against the data in the image database 5 (step ST13). Here, collation is performed only against the images of RFID(1) in the image database shown in FIG. 5. The image most similar to the image acquired in step ST12 is then extracted from among the color images of RFID(1). Next, the position and inclination estimation unit 12 estimates the position and inclination of the terminal based on the terminal distance information attached to the image extracted in step ST13 together with the terminal position t and inclination R (step ST14).
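Steps ST11 through ST14 amount to: the RFID tag fixes one group, and only that group is searched for the most similar image, whose stored pose becomes the output. A sketch under assumed data, with a placeholder similarity function standing in for real image matching:

```python
# Sketch of the Embodiment 1 flow: the database is keyed by RFID tag
# (one group per room), so the collation in ST13 touches one group only.

def estimate_pose(db_by_tag, tag_id, query, similarity):
    group = db_by_tag[tag_id]                      # ST11: tag read fixes the group
    best = max(group, key=lambda rec: similarity(rec["image"], query))  # ST13
    return best["t"], best["R"]                    # ST14: pose of the matched image

db_by_tag = {
    "RFID(1)": [{"image": "door", "t": (0, 0), "R": 0},
                {"image": "window", "t": (3, 0), "R": 90}],
    "RFID(2)": [{"image": "door", "t": (10, 0), "R": 0}],
}
# Toy similarity: 1 if the images are identical, else 0.
sim = lambda a, b: 1.0 if a == b else 0.0
print(estimate_pose(db_by_tag, "RFID(1)", "window", sim))  # ((3, 0), 90)
```

Note that both rooms contain a "door" image; because the tag restricts the search to one group, the identical scenes in different rooms are never confused.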
Embodiment 2 comprises an image database construction device that generates an image database by grouping based on signal values obtained at a plurality of indoor positions, and a position and inclination estimation device that estimates position and inclination using the image database constructed by that image database construction device.
FIG. 12 shows two rooms viewed from above. Access points 107a and 107b are installed in room 1, and access points 107c and 107d are installed in room 2. The operation of constructing the image database for these two rooms will be described with reference to the flowchart of FIG. 13.
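During construction in Embodiment 2, images captured at many positions are grouped by the signal-strength vector measured at each capture position. One way to sketch this grouping is to quantize the RSSI vector into a discrete key; the quantization rule and the numeric values are assumptions for illustration, standing in for the "signal 1" / "signal 2" labels of FIG. 14:

```python
# Sketch: group capture positions by their access-point signal vector.
# Quantizing the RSSI vector yields a discrete group key, so nearby
# capture positions fall into the same group.

from collections import defaultdict

def signal_key(rssi_vector, step=10):
    """Quantize each RSSI reading to `step` dBm (assumed rule)."""
    return tuple(round(v / step) * step for v in rssi_vector)

def group_captures(captures):
    groups = defaultdict(list)
    for rssi, image_id in captures:
        groups[signal_key(rssi)].append(image_id)
    return dict(groups)

captures = [
    ([-41.0, -72.0], "img_a"),   # near APs 107a/107b (room 1)
    ([-43.0, -69.0], "img_b"),   # same neighborhood, same group
    ([-76.0, -44.0], "img_c"),   # near APs 107c/107d (room 2)
]
groups = group_captures(captures)
print(len(groups))  # 2
```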
FIG. 15 is a configuration diagram of the position and inclination estimation device. The position and inclination estimation device of this embodiment includes a region detection unit 1a, an image acquisition unit 2a, a database collation unit 11a, a position and inclination estimation unit 12a, and an image database 5a. The image acquisition unit 2a and the region detection unit 1a are the same as those in the image database construction device. The database collation unit 11a is a processing unit that collates the image acquired by the image acquisition unit 2a only against those images stored in the image database 5a whose signal value is identical to the signal value detected by the region detection unit 1a, and identifies one of those images. The position and inclination estimation unit 12a is a processing unit that outputs the position and inclination of the image identified by the database collation unit 11a as the estimation result. The image database 5a is the database generated by the image database construction device 100a.
When the region detection unit 1a detects the signal strength from each access point at its current position (step ST31), the database collation unit 13a first collates the image database 5a based on that signal strength (step ST32). When an image is acquired by the image acquisition unit 2a (step ST33), the database collation unit 13a collates images from among the collation results of step ST32 (step ST34). That is, the database collation unit 13a first roughly estimates the position of the terminal based on the signal strength from the region detection unit 1a, and then collates against the image data corresponding to that rough estimate. For example, if the result of the collation in step ST32 is signal 1, collation is performed only against the images of signal 1 in the image database 5a shown in FIG. 14. The image most similar to the image acquired in step ST33 is then extracted from among the color images of signal 1. Next, the position and inclination estimation unit 14 estimates the position and inclination of the terminal based on the terminal distance information attached to the image extracted in step ST34 together with the terminal position t and inclination R (step ST35).
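Step ST35 combines the stored pose (t, R) of the matched database image with the relative pose of the terminal with respect to that image. One common reading of this composition, sketched in 2D for brevity; the relative offset is an assumed input (e.g. obtained by registering the query image against the matched one), and this is an interpretation, not the patent's stated formula:

```python
# Sketch: composing poses in 2D. The database stores the matched
# image's position t_ref and inclination (heading) theta_ref; image
# registration supplies the terminal's offset in the image frame.
# The terminal pose is the composition of the two.

import math

def compose(t_ref, theta_ref, t_rel, theta_rel):
    """Return the terminal position and heading in the map frame."""
    c, s = math.cos(theta_ref), math.sin(theta_ref)
    x = t_ref[0] + c * t_rel[0] - s * t_rel[1]
    y = t_ref[1] + s * t_rel[0] + c * t_rel[1]
    return (x, y), theta_ref + theta_rel

# Matched image taken at (2, 0) facing +y (90 degrees); the terminal
# sits 1 m behind that viewpoint in the image frame.
pos, heading = compose((2.0, 0.0), math.pi / 2, (0.0, -1.0), 0.0)
# The terminal ends up at x = 3, y ~ 0, still heading +y.
```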
Claims (11)
- An image database construction device comprising:
a region detection unit that detects to which of a plurality of regions a captured image belongs;
an information acquisition unit that acquires distance information indicating the distance between the subject of the captured image and the acquisition position of the captured image;
a calculation unit that calculates the capture position and inclination of the captured image based on the distance information; and
an image database generation unit that generates an image database in which the capture position and inclination of the captured image are associated with the captured image as image information, wherein the image database generation unit groups the image information of images belonging to the same region to generate the image database. - The image database construction device according to claim 1, wherein the plurality of regions are associated one-to-one with the regions detected by the region detection unit.
- The image database construction device according to claim 2, wherein the region detection unit detects the region using RFID (Radio Frequency IDentification).
- The image database construction device according to claim 1, wherein the region detection unit measures, at a plurality of positions, a set signal value as a value corresponding to the region, and
the image database generation unit groups the image acquisition positions and inclinations calculated by the calculation unit using the signal values measured by the region detection unit, associates the signal values, the grouped acquisition positions and inclinations of the images, and the images as image information, and generates the image information as an image database. - The image database construction device according to claim 4, wherein the region detection unit measures any one of RSSI (Received Signal Strength Indicator), beacon signal strength, UWB (Ultra Wide Band) signal strength, distance to an access point, and a geomagnetic sensor measurement value as the set signal value.
- A position and inclination estimation device comprising:
an image database that stores image information in which images having capture position and inclination information are grouped into a plurality of regions;
a region detection unit that detects to which of the plurality of regions the device belongs;
an image acquisition unit that acquires an image of a subject;
a database collation unit that collates the image acquired by the image acquisition unit only against those images stored in the image database that belong to the same region as the region detected by the region detection unit; and
a position and inclination estimation unit that outputs, as an estimation result, the position and inclination of the image identified by the database collation unit. - The position and inclination estimation device according to claim 6, wherein the plurality of regions are associated one-to-one with the regions detected by the region detection unit.
- The position and inclination estimation device according to claim 6, wherein the region detection unit detects the region using RFID.
- The position and inclination estimation device according to claim 6, wherein the region detection unit measures, at a plurality of positions, a set signal value as a value corresponding to the region,
the image database is grouped using the signal values, and
the database collation unit collates only those images stored in the image database whose signal value is identical to the signal value detected by the region detection unit. - The position and inclination estimation device according to claim 9, wherein the region detection unit measures any one of RSSI, beacon signal strength, UWB signal strength, distance to an access point, and a geomagnetic sensor measurement value as the set signal value.
- An image database construction method using the image database construction device according to claim 1, the method comprising:
a region detection step of detecting to which of a plurality of regions a captured image belongs;
an information acquisition step of acquiring distance information indicating the distance between the subject of the captured image and the acquisition position of the captured image;
a calculation step of calculating the capture position and inclination of the captured image based on the distance information; and
an image database generation step of generating an image database in which the capture position and inclination of the captured image are associated with the captured image as image information, wherein the image database generation step groups the image information of images belonging to the same region to generate the image database.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201780086070.5A CN110268438B (zh) | 2017-02-16 | 2017-02-16 | 图像数据库构建装置、位置和倾斜估计装置以及图像数据库构建方法 |
JP2019500108A JP6580286B2 (ja) | 2017-02-16 | 2017-02-16 | 画像データベース構築装置、位置及び傾き推定装置並びに画像データベース構築方法 |
DE112017006794.4T DE112017006794B4 (de) | 2017-02-16 | 2017-02-16 | Verfahren zur Orts- und Neigungsschätzung |
US16/474,243 US20190362517A1 (en) | 2017-02-16 | 2017-02-16 | Image database creation device, location and inclination estimation device, and image database creation method |
PCT/JP2017/005709 WO2018150515A1 (ja) | 2017-02-16 | 2017-02-16 | 画像データベース構築装置、位置及び傾き推定装置並びに画像データベース構築方法 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2017/005709 WO2018150515A1 (ja) | 2017-02-16 | 2017-02-16 | 画像データベース構築装置、位置及び傾き推定装置並びに画像データベース構築方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018150515A1 true WO2018150515A1 (ja) | 2018-08-23 |
Family
ID=63169178
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/005709 WO2018150515A1 (ja) | 2017-02-16 | 2017-02-16 | 画像データベース構築装置、位置及び傾き推定装置並びに画像データベース構築方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20190362517A1 (ja) |
JP (1) | JP6580286B2 (ja) |
CN (1) | CN110268438B (ja) |
DE (1) | DE112017006794B4 (ja) |
WO (1) | WO2018150515A1 (ja) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11408966B2 (en) * | 2018-11-14 | 2022-08-09 | International Business Machines Corporation | Location detection using a single beacon |
US11810323B2 (en) * | 2018-12-28 | 2023-11-07 | Ntt Docomo, Inc. | Position estimation system |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003111128A (ja) * | 2001-09-28 | 2003-04-11 | J-Phone East Co Ltd | 現在位置特定方法、現在位置情報提供方法、移動経路案内方法、位置情報管理システム及び情報通信端末 |
JP2004191339A (ja) * | 2002-12-13 | 2004-07-08 | Sharp Corp | 位置情報検索方法、位置情報検索装置、位置情報検索端末、及び、位置情報検索システム |
JP2004260304A (ja) * | 2003-02-24 | 2004-09-16 | Fuji Photo Film Co Ltd | 画像管理システム |
JP2007172197A (ja) * | 2005-12-21 | 2007-07-05 | Hitachi Ltd | 画像入力装置、画像出力装置及び画像入出力システム |
WO2013146582A1 (ja) * | 2012-03-29 | 2013-10-03 | 日本電気株式会社 | 位置判定システム、位置判定方法、コンピュータプログラム及び位置判定装置 |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010105633A1 (en) * | 2009-03-16 | 2010-09-23 | Nokia Corporation | Data processing apparatus and associated user interfaces and methods |
US8933993B1 (en) * | 2012-01-16 | 2015-01-13 | Google Inc. | Hybrid local and cloud based method for pose determination of a mobile device |
CN103150330B (zh) * | 2013-01-16 | 2015-12-23 | 中南大学 | 一种基于局部对象和块匹配的扫描证书图像检索方法 |
CN103149222A (zh) * | 2013-02-28 | 2013-06-12 | 重庆大学 | 射线实时成像中缺陷检测方法 |
CN104936283B (zh) | 2014-03-21 | 2018-12-25 | 中国电信股份有限公司 | 室内定位方法、服务器和系统 |
EP3026596A1 (en) * | 2014-11-26 | 2016-06-01 | Thomson Licensing | System for identifying a location of a mobile tag reader |
- 2017
- 2017-02-16 CN CN201780086070.5A patent/CN110268438B/zh active Active
- 2017-02-16 DE DE112017006794.4T patent/DE112017006794B4/de active Active
- 2017-02-16 US US16/474,243 patent/US20190362517A1/en not_active Abandoned
- 2017-02-16 JP JP2019500108A patent/JP6580286B2/ja active Active
- 2017-02-16 WO PCT/JP2017/005709 patent/WO2018150515A1/ja active IP Right Grant
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003111128A (ja) * | 2001-09-28 | 2003-04-11 | J-Phone East Co Ltd | 現在位置特定方法、現在位置情報提供方法、移動経路案内方法、位置情報管理システム及び情報通信端末 |
JP2004191339A (ja) * | 2002-12-13 | 2004-07-08 | Sharp Corp | 位置情報検索方法、位置情報検索装置、位置情報検索端末、及び、位置情報検索システム |
JP2004260304A (ja) * | 2003-02-24 | 2004-09-16 | Fuji Photo Film Co Ltd | 画像管理システム |
JP2007172197A (ja) * | 2005-12-21 | 2007-07-05 | Hitachi Ltd | 画像入力装置、画像出力装置及び画像入出力システム |
WO2013146582A1 (ja) * | 2012-03-29 | 2013-10-03 | 日本電気株式会社 | 位置判定システム、位置判定方法、コンピュータプログラム及び位置判定装置 |
Non-Patent Citations (1)
Title |
---|
YUKO UEMATSU: "Augmented Reality: Foundation 2: Geometrical Registration Technique", IPSJ MAGAZINE, vol. 51, 15 April 2010 (2010-04-15), pages 373 - 378, ISSN: 0447-8053 * |
Also Published As
Publication number | Publication date |
---|---|
CN110268438A (zh) | 2019-09-20 |
JP6580286B2 (ja) | 2019-09-25 |
JPWO2018150515A1 (ja) | 2019-11-07 |
DE112017006794B4 (de) | 2025-02-06 |
CN110268438B (zh) | 2023-08-25 |
US20190362517A1 (en) | 2019-11-28 |
DE112017006794T5 (de) | 2019-10-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101686171B1 (ko) | 영상 및 거리 데이터를 이용한 위치 인식 장치 및 방법 | |
US9470511B2 (en) | Point-to-point measurements using a handheld device | |
CN100369487C (zh) | 物体检测装置、物体检测服务器以及物体检测方法 | |
CN108921894B (zh) | 对象定位方法、装置、设备和计算机可读存储介质 | |
CN106291517A (zh) | 基于位置与视觉信息优化的室内云机器人角度定位方法 | |
TW201715476A (zh) | 運用擴增實境技術之導航系統 | |
KR101885961B1 (ko) | 이미지를 기반으로 한 객체 위치 추정 방법 및 장치 | |
Ruotsalainen et al. | Visual-aided two-dimensional pedestrian indoor navigation with a smartphone | |
Heya et al. | Image processing based indoor localization system for assisting visually impaired people | |
EP3005238B1 (en) | Method and system for coordinating between image sensors | |
Liu et al. | A low-cost and scalable framework to build large-scale localization benchmark for augmented reality | |
KR101573289B1 (ko) | 위치 측정 방법 및 이를 이용한 휴대 단말기 | |
Li et al. | Joint intrinsic and extrinsic lidar-camera calibration in targetless environments using plane-constrained bundle adjustment | |
JP6580286B2 (ja) | 画像データベース構築装置、位置及び傾き推定装置並びに画像データベース構築方法 | |
KR20170058612A (ko) | 영상 기반 실내측위 방법 및 그의 시스템 | |
EP3726253A1 (en) | Video enhanced location estimates | |
CN109612455A (zh) | 一种室内定位方法及系统 | |
JP7677160B2 (ja) | 情報処理装置、情報処理システム、および情報処理方法、並びにプログラム | |
TW201621273A (zh) | 行動定位裝置及其定位方法 | |
JP7110727B2 (ja) | ビーコン発信機位置抽出システム及びビーコン発信機位置抽出方法 | |
US9245343B1 (en) | Real-time image geo-registration processing | |
KR102407802B1 (ko) | 인공신경망 학습 기반의 실내외 3차원 좌표 및 방위 추정 장치 | |
US20220018950A1 (en) | Indoor device localization | |
US10965876B2 (en) | Imaging system, imaging method, and not-transitory recording medium | |
US10735902B1 (en) | Method and computer program for taking action based on determined movement path of mobile devices |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17896865 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2019500108 Country of ref document: JP Kind code of ref document: A |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 17896865 Country of ref document: EP Kind code of ref document: A1 |
|
WWG | Wipo information: grant in national office |
Ref document number: 112017006794 Country of ref document: DE |