WO2020115945A1 - Positioning system - Google Patents

Positioning system

Info

Publication number
WO2020115945A1
Authority
WO
WIPO (PCT)
Prior art keywords
position information
terminal
image data
global position
unit
Prior art date
Application number
PCT/JP2019/030360
Other languages
French (fr)
Japanese (ja)
Inventor
和斗 大森
林 宏樹
Original Assignee
株式会社Nttドコモ
Priority date
Filing date
Publication date
Application filed by 株式会社Nttドコモ
Publication of WO2020115945A1


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 specially adapted for navigation in a road network
    • G01C 21/28: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C 21/30: Map- or contour-matching
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 29/00: Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B 29/10: Map spot or coordinate position indicators; Map reading aids

Definitions

  • One aspect of the present invention relates to a positioning system.
  • Reference 1 (JP 2017-151148 A) discloses a technique for estimating the current position of a mobile body (terminal) based on map data, in which reference images containing features captured in advance are associated with the image-capturing position information of those reference images, and on a current image captured by a camera of the terminal.
  • Here, if, for example, the user carrying the terminal moves and the position of the terminal changes after the image is captured, the actual position of the terminal may deviate from the positioning estimation result even when the position of the terminal has been estimated with high accuracy from the captured image and the map data. In that case, sufficient positioning accuracy cannot be ensured.
  • One aspect of the present invention has been made in view of the above circumstances, and aims to improve positioning accuracy.
  • A positioning system according to one aspect of the present invention performs positioning based on image data captured by a terminal, and includes: a local position acquisition unit that continuously acquires local position information, i.e., position information of the terminal relative to a certain point; a storage unit that stores map data in which feature amounts of feature points included in previously acquired image data are associated with global position information, i.e., absolute position information, of those feature points; a global position estimation unit that estimates the global position information of the terminal at the time of image capture, based on the image data captured by the terminal and the map data stored in the storage unit; and a position derivation unit that estimates the amount of movement of the terminal since the time of image capture from changes in the local position information acquired by the local position acquisition unit, and corrects the global position information estimated by the global position estimation unit based on that movement amount, thereby deriving position information of the terminal that takes the movement amount into account.
  • In the positioning system according to one aspect of the present invention, the global position information of the terminal at the time of image capture is estimated based on the image data captured by the terminal and the map data (data in which the feature amounts of feature points included in previously acquired image data are associated with the global position information of those feature points).
  • Here, if, for example, the user of the terminal moves and the position of the terminal changes after the image data is captured, the actual position of the terminal may deviate from the positioning result even if the global position information has been estimated with high accuracy from the captured image data and the map data.
  • In this positioning system, however, the amount of movement of the terminal since the time of image capture is estimated from the continuously acquired local position information of the terminal, the global position information is corrected based on that movement amount, and position information of the terminal that takes the movement amount into account is derived. Because the movement amount is estimated from the continuously acquired local position information of the terminal and the global position information is corrected accordingly before the final position information of the terminal is derived, the actual position of the terminal can be identified with high accuracy even if the position of the terminal changes after the image data is captured. The positioning system according to one aspect of the present invention can therefore improve positioning accuracy.
  • The global position estimation unit may match the feature points of the map data against the feature points of the image data captured by the terminal and estimate the global position information of the terminal at the time of image capture based on the global position information associated with the feature points of the map data. By matching feature points between the map data and the captured image data in this way, the correspondence between the map data and the image data is appropriately identified, and the global position information of the terminal at the time of image capture can be appropriately estimated from the global position information associated with the feature points of the map data.
  • The storage unit may store a plurality of pieces of divided map data obtained by dividing the map data into fixed regions according to the global position information. The global position estimation unit may then acquire, together with the image data captured by the terminal, terminal acquisition position information, which is global position information acquired by the terminal itself, select one or more pieces of divided map data from the plurality of divided map data according to that terminal acquisition position information, and estimate the global position information of the terminal at the time of image capture based on the result of matching the feature points of the selected divided map data against the feature points of the image data captured by the terminal.
  • Because the map data is divided into a plurality of pieces according to the global position information and the divided map data to be matched is selected from the plurality of pieces of divided map data according to the terminal acquisition position information, the matching range (search range) can be narrowed down and the matching process can be made more efficient.
  • The positioning system may further include an imaging instruction unit that determines, based on the content of the image data captured by the terminal, whether the image data can be used by the global position estimation unit to estimate global position information and, if it cannot be used, instructs the capture of new image data. In this way, when the captured image data is not suitable for the estimation, the capture of new image data is prompted before the position estimation by the global position estimation unit, so that unnecessary processing by the global position estimation unit can be avoided and the total positioning time can be shortened.
  • The imaging instruction unit may determine that the image data cannot be used by the global position estimation unit to estimate global position information when the image data captured by the terminal contains dynamic objects at or above a predetermined amount. Dynamic objects do not exist continuously and are considered not to be recorded in the map data; they therefore become noise in the estimation of global position information and can cause a decrease in positioning accuracy. By determining that image data containing dynamic objects at or above a predetermined amount cannot be used for the estimation of global position information, a decrease in positioning accuracy caused by such dynamic objects can be suppressed.
  • The imaging instruction unit may determine that the image data cannot be used by the global position estimation unit to estimate global position information when the image data captured by the terminal contains motion blur at or above a predetermined amount. If the image data contains a large amount of motion blur, feature points cannot be matched properly, which can cause a decrease in positioning accuracy. By determining that image data containing motion blur at or above a predetermined amount cannot be used for the estimation of global position information, a decrease in positioning accuracy caused by motion blur can be suppressed.
  • The storage unit may store three-dimensional position information as the global position information of the map data, the global position estimation unit may estimate three-dimensional position information as the global position information of the terminal, and the local position acquisition unit may acquire three-dimensional position information as the local position information of the terminal. This allows the position derivation unit to derive three-dimensional position information.
  • According to one aspect of the present invention, positioning accuracy can be improved.
  • FIG. 1 is a block diagram showing the functional configuration of the positioning system according to the present embodiment. FIG. 2 is a sequence diagram showing the processing performed by the positioning system. FIG. 3 is a diagram showing the hardware configurations of the positioning server and the communication terminal included in the positioning system.
  • FIG. 1 is a block diagram showing the functional configuration of the positioning system 1 according to this embodiment.
  • The positioning system 1 illustrated in FIG. 1 performs positioning based on image data captured by the communication terminal 50 (terminal).
  • The positioning system 1 is, for example, a system that measures the position of the communication terminal 50 in a service that provides AR (Augmented Reality) content according to the position of the communication terminal 50.
  • In the following, the positioning system 1 is described as a system related to a service that provides AR content, but the positioning system 1 may be a system related to another application.
  • The positioning system 1 includes a positioning server 10 and a communication terminal 50. Although only one communication terminal 50 is shown in FIG. 1, a plurality of communication terminals 50 may actually be included.
  • The positioning server 10 estimates global position information, which is absolute position information of the communication terminal 50 at the time of image capture, based on the image data captured by the communication terminal 50.
  • The positioning server 10 has a storage unit 11 and a positioning unit 12 (global position estimation unit).
  • The storage unit 11 stores map data 100 in which feature amounts (for example, luminance direction vectors) of feature points included in previously acquired image data are associated with global position information, i.e., absolute position information, of those feature points.
  • The map data 100 is generated, for example, from a large amount of image data captured in advance by a stereo camera (not shown) or the like capable of imaging an object from a plurality of different directions simultaneously.
  • A feature point is a point that is prominently detected in an image, for example a point whose luminance (intensity) is higher (or lower) than that of surrounding areas.
  • The global position information of a feature point is global position information set in association with that feature point, namely the real-world global position of the region that the feature point represents in the image.
  • The global position information can be associated with each feature point by a conventionally known method.
  • The storage unit 11 stores three-dimensional position information as the global position information of the feature points of the map data 100.
  • For example, the storage unit 11 stores the latitude/longitude/height of a feature point as its three-dimensional global position information.
  • The storage unit 11 stores a plurality of pieces of divided map data obtained by dividing the map data 100 into fixed areas according to the global position information. The areas (global positions) near the boundaries of adjacent divided map data may or may not overlap each other. By letting the areas near the boundaries of the divided map data overlap, positioning can be performed appropriately even when the terminal acquisition position information (described later in detail), which is the global position information of the communication terminal acquired by the communication terminal 50, indicates a position near a boundary between divided map data.
  • In this embodiment the storage unit 11 stores a plurality of divided map data, but the storage unit 11 may instead store a single piece of undivided map data. A data-layout sketch of this storage is given below.
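  • The following is a minimal data-layout sketch of the storage just described. It is not taken from the patent itself; the class and field names (MapFeature, DividedMap, and so on) are hypothetical, and latitude/longitude/height are stored per feature point as in the example above.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class MapFeature:
    """One feature point of the map data 100 (hypothetical layout)."""
    descriptor: bytes      # feature amount, e.g. a binary keypoint descriptor
    latitude: float        # three-dimensional global position information
    longitude: float
    height_m: float

@dataclass
class DividedMap:
    """One piece of divided map data covering a fixed region; neighbouring
    regions may overlap slightly near their boundaries."""
    min_lat: float
    max_lat: float
    min_lon: float
    max_lon: float
    features: List[MapFeature]

# The storage unit 11 would then hold a collection of such tiles.
map_data_100: List[DividedMap] = []
```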
  • Based on the image data captured by the communication terminal 50 and the map data 100 stored in the storage unit 11, the positioning unit 12 estimates the global position information (position information in three-dimensional space) of the communication terminal 50 at the time the communication terminal 50 captured the image. Specifically, the positioning unit 12 matches the feature points of the map data 100 against the feature points of the image data captured by the communication terminal 50 and identifies the area of the map data 100 that corresponds to the captured image data. The positioning unit 12 then estimates the image-capturing position of the image data (that is, the global position information of the communication terminal 50 at the time of image capture) based on the global position information associated with the feature points of the map data 100 in the identified area. A sketch of such a matching pipeline is given below.
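  • The patent does not prescribe a particular matching or pose-estimation algorithm. As one possible sketch, the following uses OpenCV ORB descriptors and a RANSAC PnP solve as stand-ins; all function and variable names are illustrative, and the 3D map points are assumed to have been converted from latitude/longitude/height into a metric Cartesian frame beforehand.

```python
import numpy as np
import cv2

def estimate_global_pose(image_bgr, map_descriptors, map_points_3d, camera_matrix):
    """Match image feature points against map feature points and estimate the
    camera pose, i.e. the terminal's global position at capture time."""
    orb = cv2.ORB_create(nfeatures=2000)
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    keypoints, descriptors = orb.detectAndCompute(gray, None)

    # Brute-force Hamming matching between the captured image and the map data
    # (map_descriptors is assumed to be a uint8 array of binary descriptors).
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(descriptors, map_descriptors),
                     key=lambda m: m.distance)[:200]

    # 2D image points and the 3D positions associated with the matched map points.
    image_pts = np.float32([keypoints[m.queryIdx].pt for m in matches])
    object_pts = np.float32([map_points_3d[m.trainIdx] for m in matches])

    # PnP with RANSAC gives the camera pose in the map's frame, including the
    # orientation (roll/pitch/yaw) mentioned in the text.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        object_pts, image_pts, camera_matrix, None)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    camera_position = (-R.T @ tvec).ravel()   # camera centre in map coordinates
    return camera_position, rvec
```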
  • The positioning unit 12 estimates the global position information of the communication terminal 50 using one or more pieces of divided map data selected from the plurality of divided map data. That is, the positioning unit 12 acquires, together with the image data captured by the communication terminal 50, the terminal acquisition position information, which is the global position information acquired by the communication terminal 50 itself (described later in detail), selects one or more pieces of divided map data from the plurality of divided map data according to that terminal acquisition position information, and estimates the global position information of the communication terminal 50 at the time of image capture based on the result of matching the feature points of the selected divided map data against the feature points of the image data captured by the communication terminal 50.
  • For example, the positioning unit 12 selects the divided map data that includes the position indicated by the terminal acquisition position information, as in the selection sketch below.
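  • A sketch of this divided-map selection, assuming each tile carries a latitude/longitude bounding box as in the DividedMap layout sketched earlier (the margin parameter and all names are illustrative):

```python
def select_divided_maps(divided_maps, terminal_lat, terminal_lon, margin_deg=0.0005):
    """Select the divided map data whose region contains (or nearly contains)
    the terminal acquisition position, so that feature matching is restricted
    to a narrow search range."""
    selected = []
    for tile in divided_maps:
        if (tile.min_lat - margin_deg <= terminal_lat <= tile.max_lat + margin_deg
                and tile.min_lon - margin_deg <= terminal_lon <= tile.max_lon + margin_deg):
            selected.append(tile)
    return selected
```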
  • The positioning server 10 transmits the positioning result obtained by the positioning unit 12 to the communication terminal 50.
  • The positioning result may include, in addition to the position, information about the orientation (roll, pitch, and yaw in three-dimensional coordinates) estimated from the image data.
  • The communication terminal 50 is a terminal capable of wireless communication, for example a smartphone, a tablet terminal, or a PC.
  • The communication terminal 50 receives AR content from a content server (not shown) when, for example, the user uses an installed application.
  • When the application is executed, for example, image capture (continuous image capture) by the built-in camera is started.
  • The communication terminal 50 is provided with content including AR content from the content server (not shown) based on the positioning result estimated by the positioning server 10 from the captured image data, the captured image data itself, information about the camera of the communication terminal 50 (such as its angle of view), and so on.
  • The communication terminal 50 includes an imaging unit 51, a positioning unit 52, a local position acquisition unit 53, a position derivation unit 54, and an imaging instruction unit 55.
  • The imaging unit 51 has a function of controlling the capture of image data by the camera.
  • The imaging unit 51 starts capturing image data with the camera, for example, at the timing when execution of the application receiving the AR content is started.
  • Once image capture has started, the imaging unit 51 continues to capture images with the camera until execution of the application ends.
  • The captured image data is used by the positioning server 10 to estimate global position information.
  • Based on the content of the image data captured by the imaging unit 51, the imaging instruction unit 55 determines whether the image data can be used by the positioning server 10 to estimate global position information. If it determines that the image data cannot be used, the imaging instruction unit 55 instructs the imaging unit 51 to capture new image data without transmitting the image data to the positioning server 10.
  • If it determines that the image data can be used, the communication terminal 50 transmits the image data and the positioning result (terminal acquisition position information) obtained by the positioning unit 52 (described later) to the positioning server 10.
  • For example, when the image data captured by the communication terminal 50 contains dynamic objects at or above a predetermined amount, the imaging instruction unit 55 determines that the image data cannot be used by the positioning server 10 to estimate global position information.
  • A dynamic object is a moving object such as a person or a car, and is normally an object that is not included in the map data 100.
  • The imaging instruction unit 55 may identify dynamic objects by, for example, performing semantic segmentation, which classifies the image data on a per-pixel basis.
  • Alternatively, the imaging instruction unit 55 may identify dynamic objects using an image recognition technique; for example, based on differences between continuously captured frames, an object that appears in only one frame, or whose position changes significantly, may be identified as a dynamic object.
  • Similarly, when the image data captured by the communication terminal 50 contains motion blur at or above a predetermined amount, the imaging instruction unit 55 determines that the image data cannot be used by the positioning server 10 to estimate global position information. Motion blur is the blurring that occurs when a moving object is captured by the camera. To determine how much motion blur the image data contains, the imaging instruction unit 55 may analyze the image data itself, or it may use, for example, the detection result of the acceleration sensor mounted on the communication terminal 50. A sketch of such a pre-check is given below.
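  • A minimal sketch of the pre-check performed by the imaging instruction unit 55, assuming a Laplacian-variance blur score and an externally supplied per-pixel segmentation mask; the thresholds and all names are hypothetical, and the segmentation model itself is not shown.

```python
import cv2
import numpy as np

DYNAMIC_AREA_RATIO_MAX = 0.30   # hypothetical "predetermined value" for dynamic objects
BLUR_VARIANCE_MIN = 100.0       # hypothetical sharpness threshold (low variance = blurry)

def is_usable_for_global_positioning(image_bgr, dynamic_mask):
    """Return True if the frame may be sent to the positioning server.

    dynamic_mask: boolean array (H, W), True where a per-pixel semantic
    segmentation labelled the pixel as a dynamic object (person, car, ...).
    """
    # Reject frames dominated by dynamic objects: they are not in the map data
    # and would only add noise to the feature matching.
    dynamic_ratio = float(np.count_nonzero(dynamic_mask)) / dynamic_mask.size
    if dynamic_ratio >= DYNAMIC_AREA_RATIO_MAX:
        return False

    # Reject frames with heavy motion blur: the variance of the Laplacian is a
    # common sharpness measure (an accelerometer reading could be used instead).
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    if cv2.Laplacian(gray, cv2.CV_64F).var() < BLUR_VARIANCE_MIN:
        return False

    return True
```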
  • The positioning unit 52 acquires the terminal acquisition position information, which is global position information of the communication terminal 50, for example by performing GPS (Global Positioning System) positioning.
  • The positioning unit 52 may perform positioning periodically while image data is being captured under the control of the imaging unit 51.
  • The positioning unit 52 may also acquire global position information by a positioning method other than GPS positioning (for example, base station positioning).
  • The local position acquisition unit 53 continuously acquires local position information, which is position information of the communication terminal 50 relative to a certain point.
  • The certain point is, for example, the point at which execution of the application (the application receiving the AR content) was started on the communication terminal 50.
  • The local position acquisition unit 53 continuously acquires three-dimensional local position information from, for example, a sensor mounted on the communication terminal 50.
  • The sensor mounted on the communication terminal 50 is not limited; for example, visual SLAM (Simultaneous Localization and Mapping) using an inertial measurement unit (IMU) and a camera, a general acceleration sensor, a gyro sensor, or the like can be used.
  • The three-dimensional position of the point at which execution of the application was started is used as the reference (its coordinates (x, y, z) are set to (0, 0, 0)), and the local position information, i.e., the position of the communication terminal 50 relative to that point, is tracked in three dimensions from there, as in the tracking sketch below.
  • In addition to the local position information, the result may include information about the orientation (roll, pitch, and yaw in three-dimensional coordinates) estimated from the detection results of the acceleration sensor, the gyro sensor, or the like.
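  • As a sketch of the continuous local-position tracking described above: the SLAM/IMU pose source is abstracted into a callable, only the bookkeeping relative to the application-start origin is shown, and all names are illustrative.

```python
import time

class LocalPositionTracker:
    """Tracks the terminal's position relative to the point where the
    application was started, i.e. local position information with the
    origin fixed at (0, 0, 0)."""

    def __init__(self, pose_source):
        # pose_source() is assumed to return the current (x, y, z) pose of the
        # device in the SLAM/IMU odometry frame.
        self._pose_source = pose_source
        self._origin = pose_source()          # application-start point -> (0, 0, 0)
        self._history = []                    # (timestamp, relative xyz)

    def update(self):
        x, y, z = self._pose_source()
        ox, oy, oz = self._origin
        rel = (x - ox, y - oy, z - oz)
        self._history.append((time.time(), rel))
        return rel

    def position_at(self, timestamp):
        """Return the local position closest to a given time (e.g. the capture time)."""
        return min(self._history, key=lambda entry: abs(entry[0] - timestamp))[1]
```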
  • The position derivation unit 54 estimates the amount of movement of the communication terminal 50 since the time of image capture based on changes in the local position information acquired by the local position acquisition unit 53, and corrects the global position information estimated by the positioning server 10 based on that movement amount, thereby deriving position information of the communication terminal 50 that takes the movement amount into account.
  • For example, the difference between the local position information (three-dimensional coordinates) acquired at the time of image capture and the local position information acquired at the present time is used as the movement amount.
  • The position derivation unit 54 only needs to correct the global position information based on the movement amount (taking the movement amount into account); it does not necessarily have to shift the global position information by exactly the movement amount. That is, taking into account errors in calculation timing, the communication time between the communication terminal 50 and the positioning server 10, and so on, the position derivation unit 54 may correct the global position information by a value obtained by multiplying the movement amount by a predetermined adjustment value. A sketch of this correction is given below.
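  • In formula form: if the server estimates global position G for an image captured at time t0, and the local position information changes from L(t0) to L(t1) between capture and now, the corrected position is roughly G + a * (L(t1) - L(t0)), where a is the adjustment value mentioned above. A small sketch follows; the names are illustrative, and it assumes the local displacement is already expressed in the same metric frame as the global position.

```python
def derive_corrected_position(global_at_capture, local_at_capture, local_now,
                              adjustment=1.0):
    """Correct the server's global estimate by the movement since capture time.

    global_at_capture: (x, y, z) global position estimated from the image
    local_at_capture:  (x, y, z) local position when the image was captured
    local_now:         (x, y, z) current local position
    adjustment:        predetermined adjustment value (1.0 applies the full
                       movement amount; other values absorb timing/latency error)
    """
    movement = tuple(n - c for n, c in zip(local_now, local_at_capture))
    return tuple(g + adjustment * m for g, m in zip(global_at_capture, movement))

# Example: the terminal moved 2 m along x and 1 m along y after the image was captured.
corrected = derive_corrected_position(
    global_at_capture=(100.0, 50.0, 0.0),
    local_at_capture=(0.0, 0.0, 0.0),
    local_now=(2.0, 1.0, 0.0),
)
# corrected == (102.0, 51.0, 0.0)
```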
  • The communication terminal 50 transmits the corrected position information of the communication terminal 50 to the positioning server 10.
  • FIG. 2 is a sequence diagram showing processing performed by the positioning system 1.
  • The communication terminal 50 starts capturing image data at the timing when execution of the application receiving the AR content is started (step S1).
  • In step S2, the communication terminal 50 determines, based on the content of the captured image data, whether the image data can be used by the positioning server 10 to estimate global position information.
  • In step S2, for example, when the captured image data contains dynamic objects at or above a predetermined amount, it is determined that the image data cannot be used by the positioning server 10 to estimate global position information. Likewise, when the captured image data contains motion blur at or above a predetermined amount, it is determined in step S2 that the image data cannot be used by the positioning server 10 to estimate global position information.
  • When it is determined in step S2 that the captured image data cannot be used by the positioning server 10 to estimate global position information, the communication terminal 50 does not transmit the image data to the positioning server 10 and captures new image data again (that is, the process of step S1 is performed again).
  • When it is determined in step S2 that the captured image data can be used by the positioning server 10 to estimate global position information, the communication terminal 50 transmits the image data and the GPS information (terminal acquisition position information) to the positioning server 10 (step S3) and starts tracking the local position information (step S4).
  • The positioning server 10 selects one or more pieces of divided map data to be used from the plurality of divided map data based on the GPS information (terminal acquisition position information) received from the communication terminal 50 (step S5). The positioning server 10 then estimates the global position information of the communication terminal 50 at the time of image capture based on the result of matching the feature points of the selected divided map data against the feature points of the image data captured by the communication terminal 50 (step S6). The positioning server 10 transmits the estimated global position information to the communication terminal 50 (step S7).
  • The communication terminal 50 corrects the global position information transmitted from the positioning server 10 based on the local position information (step S8). Specifically, the communication terminal 50 estimates the amount of movement of the communication terminal 50 since the time of image capture based on changes in the local position information acquired by the local position acquisition unit 53, and corrects the global position information estimated by the positioning server 10 based on that movement amount, thereby deriving position information of the communication terminal 50 that takes the movement amount into account. The communication terminal 50 transmits the derived corrected position information to the positioning server 10 (step S9). The AR content is then provided using this position information. The whole client-side sequence is sketched below.
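  • The client-side sequence of steps S1 to S9 could be sketched as follows, reusing the is_usable_for_global_positioning and derive_corrected_position helpers sketched earlier; the camera, gps, tracker, and server objects and their methods are hypothetical placeholders for the components described above.

```python
def positioning_loop(camera, gps, tracker, server):
    """One pass of the S1-S9 sequence from the terminal's point of view."""
    while True:
        frame = camera.capture()                                # S1: capture image data
        if not is_usable_for_global_positioning(frame.image,    # S2: pre-check the frame
                                                frame.dynamic_mask):
            continue                                            # re-capture, nothing is sent

        terminal_position = gps.read()                          # terminal acquisition position info
        capture_time = frame.timestamp
        server.send_image(frame.image, terminal_position)       # S3: image + GPS to the server
        # S4: local position tracking is already running in `tracker`

        global_at_capture = server.receive_global_position()    # S5-S7: server-side matching result

        local_at_capture = tracker.position_at(capture_time)    # S8: correct by the movement amount
        local_now = tracker.update()
        corrected = derive_corrected_position(global_at_capture,
                                              local_at_capture, local_now)

        server.send_corrected_position(corrected)                # S9: report the corrected position
        return corrected
```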
  • As described above, the positioning system 1 performs positioning based on image data captured by the communication terminal 50 and includes: the local position acquisition unit 53, which continuously acquires local position information, i.e., position information of the communication terminal 50 relative to a certain point; the storage unit 11, which stores the map data 100 in which the feature amounts of feature points included in previously acquired image data are associated with global position information, i.e., absolute position information, of those feature points; the positioning unit 12, which estimates the global position information of the communication terminal 50 at the time of image capture based on the image data captured by the communication terminal 50 and the map data 100 stored in the storage unit 11; and the position derivation unit 54, which estimates the amount of movement of the communication terminal 50 since the time of image capture from changes in the local position information, corrects the estimated global position information based on that movement amount, and thereby derives position information of the communication terminal 50 that takes the movement amount into account.
  • In the positioning system 1, the global position information of the communication terminal 50 at the time of image capture is estimated based on the image data captured by the communication terminal 50 and the map data 100 (data in which feature amounts of feature points included in previously acquired image data are associated with the global position information of those feature points).
  • Here, if, for example, the user of the communication terminal 50 moves and the position of the communication terminal 50 changes after the image data is captured, the actual position of the communication terminal 50 may deviate from the positioning result even if the global position information has been estimated with high accuracy from the captured image data and the map data 100.
  • In the positioning system 1, the amount of movement of the communication terminal 50 since the time of image capture is estimated from the continuously acquired local position information of the communication terminal 50, the global position information is corrected based on that movement amount, and position information of the communication terminal 50 that takes the movement amount into account is derived. Because the movement amount is estimated from the continuously acquired local position information and the global position information is corrected accordingly before the final position information of the communication terminal 50 is derived, the actual position of the communication terminal 50 can be identified with high accuracy even if the communication terminal 50 moves after the image data is captured. According to the positioning system 1, positioning accuracy can therefore be improved.
  • If positioning accuracy were insufficient, the processing load could increase because the user would repeatedly request positioning results. In the positioning system 1, positioning accuracy is improved, so this problem is less likely to occur, and the technical effect of reducing the load on processing units such as the CPU is also obtained.
  • If the map data 100 were held on the communication terminal 50, the amount of map data that could be stored would be limited by the performance of the communication terminal 50, and positioning accuracy might not be improved sufficiently. In the positioning system 1, the large-scale map data 100 used for positioning is stored in the positioning server 10 while the image data used for positioning is acquired by the communication terminal 50, so that improved positioning accuracy and quick positioning are both realized.
  • Furthermore, by limiting the processing on the communication terminal 50 side to tasks that are unlikely to depend on the performance or OS of the communication terminal 50, such as capturing image data and acquiring local position information, an architecture that does not strongly depend on the performance or OS of the communication terminal 50 can be realized.
  • The positioning unit 12 matches the feature points of the map data 100 against the feature points of the image data captured by the communication terminal 50 and estimates the global position information of the communication terminal 50 at the time of image capture based on the global position information associated with the feature points of the map data 100. By matching feature points between the map data 100 and the captured image data in this way, the correspondence between the map data 100 and the image data is appropriately identified, and the global position information of the communication terminal 50 at the time of image capture can be appropriately estimated from the global position information associated with the feature points of the map data 100.
  • The storage unit 11 stores a plurality of pieces of divided map data obtained by dividing the map data 100 into fixed regions according to the global position information, and the positioning unit 12 acquires, together with the image data captured by the communication terminal 50, the terminal acquisition position information, which is the global position information acquired by the communication terminal 50, selects one or more pieces of divided map data from the plurality of divided map data according to that terminal acquisition position information, and estimates the global position information of the communication terminal 50 at the time of image capture based on the result of matching the feature points of the selected divided map data against the feature points of the image data captured by the communication terminal 50.
  • Because the map data 100 is divided into a plurality of pieces according to the global position information and the divided map data to be matched is selected from the plurality of pieces of divided map data according to the terminal acquisition position information, the matching range (search range) can be narrowed down and the matching process can be made more efficient. That is, if the stored map data were a single vast map, the search range would be too wide and the search efficiency would deteriorate when image data is transmitted from the communication terminal to the positioning server for self-position recognition. In this embodiment, the stored map data 100 is divided into fixed units and the divided map data to be matched is selected from the plurality of divided map data according to the terminal acquisition position information, which limits the search range and improves processing efficiency.
  • The positioning system 1 further includes the imaging instruction unit 55, which determines, based on the content of the image data captured by the communication terminal 50, whether the image data can be used by the positioning unit 12 to estimate global position information and, if it cannot be used, instructs the capture of new image data. Because the capture of new image data is prompted before the position estimation by the positioning unit 12 whenever the captured image data is not suitable for the estimation, unnecessary processing by the positioning unit 12 can be avoided and the total positioning time can be shortened.
  • When the image data captured by the communication terminal 50 contains dynamic objects at or above a predetermined amount, the imaging instruction unit 55 determines that the image data cannot be used by the positioning unit 12 to estimate global position information. Dynamic objects do not exist continuously and are considered not to be recorded in the map data 100; they therefore become noise in the estimation of global position information and can cause a decrease in positioning accuracy. By determining that such image data cannot be used for the estimation of global position information, a decrease in positioning accuracy caused by dynamic objects can be suppressed.
  • When the image data captured by the communication terminal 50 contains motion blur at or above a predetermined amount, the imaging instruction unit 55 determines that the image data cannot be used by the positioning unit 12 to estimate global position information. If the image data contains a large amount of motion blur, feature points cannot be matched properly, which can cause a decrease in positioning accuracy. By determining that such image data cannot be used for the estimation of global position information, a decrease in positioning accuracy caused by motion blur can be suppressed.
  • The storage unit 11 stores three-dimensional position information as the global position information of the map data 100, the positioning unit 12 estimates three-dimensional position information as the global position information of the communication terminal 50, and the local position acquisition unit 53 acquires three-dimensional position information as the local position information of the communication terminal 50. As a result, the position derivation unit 54 can derive three-dimensional position information.
  • The positioning server 10 and the communication terminal 50 described above may each be physically configured as a computer device including a processor 1001, a memory 1002, a storage 1003, a communication device 1004, an input device 1005, an output device 1006, a bus 1007, and the like.
  • In the following description, the word "device" can be read as a circuit, a device, a unit, or the like.
  • The hardware configuration of the positioning server 10 and the communication terminal 50 may include one or more of each device illustrated in the figure, or may omit some of the devices.
  • Each function of the positioning server 10 and the communication terminal 50 is realized by loading predetermined software (a program) onto hardware such as the processor 1001 and the memory 1002, whereby the processor 1001 performs computation and controls communication by the communication device 1004 and the reading and/or writing of data in the memory 1002 and the storage 1003.
  • The processor 1001 controls the entire computer by, for example, operating an operating system.
  • The processor 1001 may be composed of a central processing unit (CPU) including an interface with peripheral devices, a control device, an arithmetic device, registers, and the like.
  • For example, the control functions of the positioning unit 12 and the like of the positioning server 10 may be realized by the processor 1001.
  • The processor 1001 reads a program (program code), software modules, and data from the storage 1003 and/or the communication device 1004 into the memory 1002, and executes various processes according to them.
  • As the program, a program that causes a computer to execute at least part of the operations described in the above embodiment is used.
  • For example, the control functions of the positioning unit 12 and the like of the positioning server 10 may be realized by a control program that is stored in the memory 1002 and operated by the processor 1001, and the other functional blocks may be realized in the same way.
  • Although the various processes described above have been explained as being executed by a single processor 1001, they may be executed simultaneously or sequentially by two or more processors 1001.
  • The processor 1001 may be implemented by one or more chips.
  • The program may be transmitted from a network via an electric communication line.
  • The memory 1002 is a computer-readable recording medium and may be composed of, for example, at least one of ROM (Read Only Memory), EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), RAM (Random Access Memory), and the like.
  • The memory 1002 may be called a register, a cache, a main memory (main storage device), or the like.
  • The memory 1002 can store a program (program code), software modules, and the like that are executable to implement the wireless communication method according to the embodiment of the present invention.
  • The storage 1003 is a computer-readable recording medium and may be composed of, for example, at least one of an optical disc such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory (for example, a card, a stick, or a key drive), a floppy (registered trademark) disk, a magnetic strip, and the like.
  • The storage 1003 may be called an auxiliary storage device.
  • The storage medium described above may be, for example, a database including the memory 1002 and/or the storage 1003, a server, or another appropriate medium.
  • The communication device 1004 is hardware (a transmitting/receiving device) for performing communication between computers via a wired and/or wireless network, and is also referred to as, for example, a network device, a network controller, a network card, or a communication module.
  • The input device 1005 is an input device (for example, a keyboard, a mouse, a microphone, a switch, a button, or a sensor) that receives input from the outside.
  • The output device 1006 is an output device (for example, a display, a speaker, or an LED lamp) that produces output to the outside.
  • The input device 1005 and the output device 1006 may be integrated (for example, as a touch panel).
  • The devices such as the processor 1001 and the memory 1002 are connected by a bus 1007 for communicating information.
  • The bus 1007 may be composed of a single bus, or of different buses between devices.
  • The positioning server 10 and the communication terminal 50 may be configured to include hardware such as a microprocessor, a digital signal processor (DSP), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), and an FPGA (Field Programmable Gate Array), and part or all of each functional block may be realized by that hardware. For example, the processor 1001 may be implemented with at least one of these pieces of hardware.
  • The positioning system 1 has been described as including the communication terminal 50 and the positioning server 10, but the present invention is not limited to this, and each function of the positioning system 1 may be realized by the communication terminal alone.
  • The global position information may be corrected using other information in addition to the local position information. For example, if, for a spot identified by the global position information, the spot that will be visited next can be estimated (that is, if the user's flow line can be estimated from the characteristics of the spot), the global position information may be corrected based on the estimated flow line. In that case, the accuracy of estimating the flow line, and hence the accuracy of correcting the global position information, may be improved by using information such as the sex and age of the user.
  • If past information (flow lines the user has traced in the past) is available, the global position information may be corrected based on that information.
  • The global position information may also be corrected by improving the estimation accuracy of the flow line according to the user's search history on the communication terminal 50 and the content of the information the user is browsing (content of interest to the user).
  • Each aspect/embodiment described in this specification may be applied to LTE (Long Term Evolution), LTE-A (LTE-Advanced), SUPER 3G, IMT-Advanced, 4G, 5G, FRA, W-CDMA (Wideband Code Division Multiple Access), GSM (Global System for Mobile Communications), CDMA2000 (Code Division Multiple Access 2000), UMB, IEEE 802.11 (Wi-Fi), IEEE 802.16 (WiMAX), IEEE 802.20, UWB (Ultra-Wide Band), Bluetooth (registered trademark), and/or other appropriate systems, and/or next-generation systems extended based on them.
  • Information that has been input and output may be stored in a specific location (for example, memory), or may be managed in a management table. Information that is input/output can be overwritten, updated, or added. The output information and the like may be deleted. The input information and the like may be transmitted to another device.
  • The determination may be performed by a value represented by one bit (0 or 1), by a Boolean value (true or false), or by a comparison of numerical values (for example, a comparison with a predetermined value).
  • Notification of predetermined information (for example, notification of "being X") is not limited to explicit notification, and may be performed implicitly (for example, by not notifying the predetermined information).
  • Software, instructions, and the like may be transmitted and received via a transmission medium. For example, when software is transmitted from a website, a server, or another remote source using wired technologies such as coaxial cable, fiber optic cable, twisted pair, and digital subscriber line (DSL) and/or wireless technologies such as infrared, radio, and microwave, these wired and/or wireless technologies are included within the definition of a transmission medium.
  • The information, signals, and the like described herein may be represented using any of a variety of different technologies.
  • The data, instructions, commands, information, signals, bits, symbols, chips, and the like that may be referred to throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, optical fields or photons, or any combination of these.
  • The information, parameters, and the like described in this specification may be represented by absolute values, by values relative to predetermined values, or by other corresponding information.
  • A user terminal may also be referred to by those skilled in the art as a mobile communication terminal, subscriber station, mobile unit, subscriber unit, wireless unit, remote unit, mobile device, wireless device, wireless communication device, remote device, mobile subscriber station, access terminal, mobile terminal, wireless terminal, remote terminal, handset, user agent, mobile client, client, or some other suitable term.
  • The terms "determining" and "deciding" as used herein may encompass a wide variety of actions.
  • "Determining" and "deciding" may include, for example, regarding calculating, computing, processing, deriving, investigating, looking up (for example, looking up in a table, a database, or another data structure), or ascertaining as "determining" or "deciding".
  • "Determining" and "deciding" may also include regarding receiving (for example, receiving information), transmitting (for example, transmitting information), input, output, or accessing (for example, accessing data in a memory) as "determining" or "deciding".
  • "Determining" and "deciding" may further include regarding resolving, selecting, choosing, establishing, comparing, and the like as "determining" or "deciding". That is, "determining" and "deciding" may include regarding some action as "determining" or "deciding".
  • As used herein, the phrase "based on" does not mean "based only on" unless expressly specified otherwise. In other words, the phrase "based on" means both "based only on" and "based at least on".
  • Any reference to elements using designations such as "first" and "second" as used herein does not generally limit the quantity or order of those elements. These designations may be used herein as a convenient way of distinguishing between two or more elements. Thus, references to first and second elements do not mean that only two elements may be employed, or that the first element must precede the second element in some way.
  • Where a device is referred to, it also includes a plurality of devices, unless it is clear from the context or the technology that only one such device exists.
  • Reference signs: 1... positioning system; 11... storage unit; 12... positioning unit (global position estimation unit); 50... communication terminal (terminal); 53... local position acquisition unit; 54... position derivation unit; 55... imaging instruction unit; 100... map data.

Abstract

This positioning system is provided with: a local position acquisition unit that continuously acquires local position information, which is relative position information of a communication terminal; a storage unit that stores map data; a positioning unit that estimates global position information of the communication terminal at the time of imaging in the communication terminal on the basis of image data captured in the communication terminal and the map data stored in the storage unit; and a position derivation unit that estimates a movement amount from the time of imaging in the communication terminal on the basis of a change in the local position information acquired by the local position acquisition unit and corrects, on the basis of the movement amount, the global position information estimated by the positioning unit, thereby deriving position information of the communication terminal with consideration given to the movement amount.

Description

測位システムPositioning system
 本発明の一態様は、測位システムに関する。 One aspect of the present invention relates to a positioning system.
 文献1には、予め撮影することによって得られた地物の画像を含む参照画像及び該参照画像の撮像位置情報が関連付けられたマップデータと、移動体(端末)のカメラによって撮像された現在画像とに基づき、端末の現在位置を推定する技術が開示されている。 Reference 1 describes map data in which a reference image including an image of a feature obtained by capturing an image in advance and image capturing position information of the reference image are associated with each other, and a current image captured by a camera of a mobile body (terminal). A technique for estimating the current position of the terminal based on the above is disclosed.
特開2017-151148号公報JP, 2017-151148, A
 ここで、例えば端末において画像が撮像された後に端末を利用するユーザが移動し端末の位置が変化したような場合においては、撮像された画像とマップデータとに基づき端末の位置が高精度に推定されていても、実際の端末の位置と測位推定結果とが乖離することが考えられる。この場合、測位精度を十分に担保することができない。 Here, for example, when a user who uses the terminal moves and the position of the terminal changes after the image is captured by the terminal, the position of the terminal is highly accurately estimated based on the captured image and map data. Even if it is done, the actual position of the terminal and the positioning estimation result may deviate from each other. In this case, the positioning accuracy cannot be sufficiently ensured.
 本発明の一態様は上記実情に鑑みてなされたものであり、測位精度の向上を目的とする。 One aspect of the present invention has been made in view of the above circumstances, and aims to improve positioning accuracy.
 本発明の一態様に係る測位システムは、端末において撮像された画像データに基づき測位を行う測位システムであって、ある地点を基準とした端末の相対的な位置情報であるローカル位置情報を継続的に取得するローカル位置取得部と、予め取得された画像データに含まれる特徴点の特徴量と該特徴点に関連付けられた絶対的な位置情報であるグローバル位置情報とが対応付けられたマップデータを記憶する記憶部と、端末において撮像された画像データと、記憶部に記憶されているマップデータとに基づいて、端末における撮像時の端末のグローバル位置情報を推定するグローバル位置推定部と、ローカル位置取得部によって取得されるローカル位置情報の変化に基づいて、端末における撮像時からの移動量を推定し、該移動量に基づいて、グローバル位置推定部によって推定されたグローバル位置情報を補正することにより、移動量を考慮した端末の位置情報を導出する位置導出部と、を備える。 A positioning system according to an aspect of the present invention is a positioning system that performs positioning based on image data captured by a terminal, and continuously stores local position information that is relative position information of the terminal with respect to a certain point. Map data in which the local position acquisition unit to be acquired in step S1 is associated with the feature amount of the feature points included in the image data acquired in advance and the global position information that is absolute position information associated with the feature points. A storage unit to store, a global position estimation unit that estimates global position information of the terminal at the time of image capturing in the terminal based on image data captured in the terminal, and map data stored in the storage unit, and a local position By estimating the amount of movement from the time of imaging at the terminal based on the change in the local position information acquired by the obtaining unit, and correcting the global position information estimated by the global position estimating unit based on the moving amount. And a position derivation unit that derives position information of the terminal in consideration of the movement amount.
 本発明の一態様に係る測位システムでは、端末において撮像された画像データと、マップデータ(予め取得された画像データに含まれる特徴点の特徴量と該特徴点のグローバル位置情報とが関連付けられたデータ)とに基づき、端末における撮像時の端末のグローバル位置情報が推定される。ここで、例えば端末において画像データが撮像された後に、端末を利用するユーザが移動し端末の位置が変化したような場合においては、撮像された画像データとマップデータとに基づきグローバル位置情報が高精度に推定されていても、実際の端末の位置と測位結果とが乖離することが考えられる。この点、本発明の一態様に係る測位システムでは、継続的に取得される端末のローカル位置情報から、端末における撮像時からの端末の移動量が推定され、該移動量に基づいてグローバル位置情報が補正されて、移動量を考慮した端末の位置情報が導出される。このように、継続的に取得される端末のローカル位置情報に基づいて移動量が推定され該移動量に応じてグローバル位置情報が補正されて最終的な端末の位置情報が導出されることにより、画像データが撮像された後に端末の位置が変化した場合においても、実際の端末の位置を高精度に特定することができる。以上より、本発明の一態様に係る測位システムによれば、測位精度を向上させることができる。 In the positioning system according to one aspect of the present invention, the image data captured by the terminal, the map data (the feature amount of the feature point included in the image data acquired in advance, and the global position information of the feature point are associated with each other. Data) and the global position information of the terminal at the time of imaging at the terminal is estimated. Here, for example, when the user who uses the terminal moves and the position of the terminal changes after the image data is captured by the terminal, the global position information is high based on the captured image data and the map data. Even if estimated accurately, it is possible that the actual position of the terminal deviates from the positioning result. In this regard, in the positioning system according to one aspect of the present invention, the amount of movement of the terminal from the time of image capturing in the terminal is estimated from the continuously acquired local position information of the terminal, and the global position information is calculated based on the movement amount. Is corrected, and the position information of the terminal considering the amount of movement is derived. In this way, the movement amount is estimated based on the continuously acquired local position information of the terminal, the global position information is corrected according to the movement amount, and the final position information of the terminal is derived, Even if the position of the terminal changes after the image data is captured, the actual position of the terminal can be specified with high accuracy. As described above, the positioning system according to one aspect of the present invention can improve positioning accuracy.
 グローバル位置推定部は、マップデータの特徴点と、端末において撮像された画像データの特徴点とのマッチングを行い、マップデータの特徴点に関連付けられたグローバル位置情報に基づいて、端末における撮像時の端末のグローバル位置情報を推定してもよい。このように、マップデータ及び撮像された画像データ間で特徴点のマッチングが行われることによって、マップデータ及び画像データの対応関係が適切に特定され、マップデータの特徴点に関連付けられたグローバル位置情報に基づき、撮像時の端末のグローバル位置情報を適切に推定することができる。 The global position estimation unit performs matching between the feature points of the map data and the feature points of the image data captured by the terminal, and based on the global position information associated with the feature points of the map data, the Global location information of the terminal may be estimated. In this way, by matching the feature points between the map data and the imaged image data, the correspondence relationship between the map data and the image data is appropriately specified, and the global position information associated with the feature points of the map data is obtained. Based on, it is possible to properly estimate the global position information of the terminal at the time of image capturing.
 記憶部は、マップデータについてグローバル位置情報に応じて一定の領域毎に分割した複数の分割マップデータを記憶し、グローバル位置推定部は、端末において撮像された画像データと共に端末において取得された端末のグローバル位置情報である端末取得位置情報を取得し、該端末取得位置情報に応じて、複数の分割マップデータから一又は複数の分割マップデータを選択し、選択した分割マップデータの特徴点と、端末において撮像された画像データの特徴点とのマッチングの結果に基づき、端末における撮像時の端末のグローバル位置情報を推定してもよい。このように、マップデータがグローバル位置情報に応じて複数に分割され、端末取得位置情報に応じて複数の分割マップデータからマッチング対象の分割マップデータが選択されることにより、適切にマッチング範囲(探索範囲)を絞りこむことができ、マッチング処理の効率化を図ることができる。 The storage unit stores a plurality of divided map data obtained by dividing the map data into fixed regions according to the global position information, and the global position estimation unit stores the image data captured by the terminal and the terminal acquired by the terminal. Terminal acquisition position information that is global position information is acquired, one or a plurality of division map data is selected from a plurality of division map data according to the terminal acquisition position information, and the feature points of the selected division map data and the terminal The global position information of the terminal at the time of imaging at the terminal may be estimated based on the result of matching with the feature point of the image data captured at. In this way, the map data is divided into a plurality of pieces according to the global position information, and the division map data to be matched is selected from the plurality of pieces of division map data according to the terminal acquisition position information, so that the matching range (search The range) can be narrowed down, and the efficiency of matching processing can be improved.
 The positioning system 1 may further include an imaging instruction unit that determines, based on the content of the image data captured by the terminal, whether the image data can be used for estimation of the global position information by the global position estimation unit, and instructs the capture of new image data when it cannot be used. By prompting the capture of new image data when the captured image data is unsuitable for estimation, before the position estimation by the global position estimation unit, wasted processing in the global position estimation unit can be avoided and the total positioning time can be shortened.
 The imaging instruction unit may determine that the image data cannot be used for estimation of the global position information by the global position estimation unit when the image data captured by the terminal contains dynamic objects at or above a predetermined level. Dynamic objects do not exist persistently and are considered not to be recorded in the map data. Such objects therefore become noise in the estimation of the global position information and can cause a decrease in positioning accuracy. By determining that image data containing dynamic objects at or above a predetermined level cannot be used for estimating the global position information, the decrease in positioning accuracy caused by dynamic objects can be suppressed.
 The imaging instruction unit may determine that the image data cannot be used for estimation of the global position information by the global position estimation unit when the image data captured by the terminal contains motion blur at or above a predetermined level. If the image data contains a large amount of motion blur, feature points cannot be matched properly, which can cause a decrease in positioning accuracy. By determining that image data containing motion blur at or above a predetermined level cannot be used for estimating the global position information, the decrease in positioning accuracy caused by motion blur can be suppressed.
 The storage unit may store three-dimensional position information as the global position information of the map data, the global position estimation unit may estimate three-dimensional position information as the global position information of the terminal, and the local position acquisition unit may acquire three-dimensional position information as the local position information of the terminal. This allows the position derivation unit to derive three-dimensional position information.
 According to one aspect of the present invention, positioning accuracy can be improved.
FIG. 1 is a block diagram showing the functional configuration of the positioning system according to the present embodiment. FIG. 2 is a sequence diagram showing the processing performed by the positioning system. FIG. 3 is a diagram showing the hardware configurations of the positioning server and the communication terminal included in the positioning system.
 Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the description of the drawings, the same reference numerals are used for identical or equivalent elements, and duplicate descriptions are omitted.
 FIG. 1 is a block diagram showing the functional configuration of the positioning system 1 according to the present embodiment. The positioning system 1 shown in FIG. 1 performs positioning based on image data captured by the communication terminal 50 (terminal). The positioning system 1 is, for example, a system that measures the position of the communication terminal 50 in a service that provides AR (Augmented Reality) content according to the position of the communication terminal 50. In the following, the positioning system 1 is described as a system related to a service that provides AR content, but the positioning system 1 may be a system for other applications. The positioning system 1 includes a positioning server 10 and a communication terminal 50. Although only one communication terminal 50 is shown in FIG. 1, a plurality of communication terminals 50 may actually be included.
 The positioning server 10 estimates global position information, that is, the absolute position information of the communication terminal 50 at the time of imaging, based on image data captured by the communication terminal 50. The positioning server 10 has a storage unit 11 and a positioning unit 12 (global position estimation unit).
 The storage unit 11 stores map data 100 in which feature amounts of feature points (for example, luminance direction vectors) included in previously acquired image data are associated with global position information, that is, absolute position information associated with the feature points. Such map data 100 is generated, for example, from a large amount of image data captured in advance by a stereo camera (not shown) or the like that can capture an object from a plurality of different directions simultaneously. A feature point is a point that is prominently detected in an image, for example a point whose luminance (intensity) is higher (or lower) than that of surrounding areas. The global position information of a feature point is global position information set in association with the feature point, and represents the real-world global position of the region indicated by that feature point in the image. The association of global position information with each feature point can be performed by conventionally known methods.
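 By way of illustration only, the map data 100 could be represented as in the following sketch, which pairs each feature descriptor with a three-dimensional global position; the class and field names are assumptions introduced here and are not part of the embodiment.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class MapFeature:
    # Feature amount of a feature point, e.g. a descriptor vector
    # derived from the luminance gradients around the point.
    descriptor: np.ndarray
    # Absolute (global) 3D position of the feature point,
    # e.g. (latitude, longitude, height).
    global_position: np.ndarray  # shape (3,)

@dataclass
class MapData:
    # All feature points that make up the map (map data 100).
    features: list[MapFeature]
```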
 The storage unit 11 stores three-dimensional position information as the global position information of the feature points of the map data 100, for example the latitude, longitude, and height of each feature point. The storage unit 11 also stores a plurality of divided map data obtained by dividing the map data 100 into fixed regions according to the global position information. The regions (global position information) near the boundaries of the divided map data may or may not overlap with each other. By overlapping the regions near the boundaries of the divided map data, even when the terminal-acquired position information (described in detail later), which is the global position information of the communication terminal acquired by the communication terminal 50, is near a boundary between divided map data, a situation in which no divided map data corresponds to the terminal-acquired position information (that is, the map data cannot be referenced) can be avoided, and the corresponding divided map data can be reliably selected (described in detail later). In the following, the storage unit 11 is described as storing a plurality of divided map data, but the storage unit 11 may store a single undivided map data.
 Based on the image data captured by the communication terminal 50 and the map data 100 stored in the storage unit 11, the positioning unit 12 estimates the global position information (three-dimensional position information) of the communication terminal 50 at the time of imaging. Specifically, the positioning unit 12 matches the feature points of the map data 100 against the feature points of the image data captured by the communication terminal 50 and identifies the region of the map data 100 that corresponds to the captured image data. The positioning unit 12 then estimates the imaging position of the image data (that is, the global position information of the communication terminal 50 at the time of imaging) based on the global position information associated with the feature points of the map data 100 in the identified region.
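 As one possible, non-limiting realization of this matching step, the sketch below uses ORB descriptors and a PnP solver from OpenCV as stand-ins (the embodiment does not prescribe a particular feature type or solver); map_descriptors and map_points_3d are assumed arrays holding the stored feature amounts and their associated 3D global positions.

```python
import cv2
import numpy as np

def estimate_global_pose(image, map_descriptors, map_points_3d, camera_matrix):
    """Match image features against map features and estimate the camera pose."""
    orb = cv2.ORB_create()
    keypoints, descriptors = orb.detectAndCompute(image, None)
    if descriptors is None:
        return None

    # Match descriptors of the captured image against the stored map descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(descriptors, map_descriptors)
    if len(matches) < 4:
        return None  # not enough correspondences for pose estimation

    # 2D points in the image and the corresponding 3D global points in the map.
    image_pts = np.float32([keypoints[m.queryIdx].pt for m in matches])
    world_pts = np.float32([map_points_3d[m.trainIdx] for m in matches])

    # Solve the perspective-n-point problem for the pose at imaging time.
    ok, rvec, tvec = cv2.solvePnP(world_pts, image_pts, camera_matrix, None)
    if not ok:
        return None

    # Camera position in the global (map) frame: -R^T * t.
    rotation, _ = cv2.Rodrigues(rvec)
    return (-rotation.T @ tvec).ravel()
```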
 More specifically, the positioning unit 12 estimates the global position information of the communication terminal 50 using one or more divided map data selected from the plurality of divided map data. That is, the positioning unit 12 acquires, together with the image data captured by the communication terminal 50, the terminal-acquired position information, which is global position information obtained by the communication terminal 50 itself (described in detail later), selects one or more divided map data from the plurality of divided map data according to the terminal-acquired position information, and estimates the global position information of the communication terminal 50 at the time of imaging based on the result of matching between the feature points of the selected divided map data and the feature points of the image data captured by the communication terminal 50. The positioning unit 12 selects, for example, the divided map data that contains the position indicated by the terminal-acquired position information. The positioning server 10 transmits the positioning result of the positioning unit 12 to the communication terminal 50. In addition to the global position information, the positioning result may include information about the orientation estimated from the image data (roll, pitch, and yaw in three-dimensional coordinates).
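 The selection of divided map data could, for example, be realized as in the following sketch, where the map is pre-split into rectangular tiles keyed by latitude/longitude ranges and a small margin emulates the overlapping boundary regions described above; the tile structure and the margin value are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class MapTile:
    lat_min: float
    lat_max: float
    lon_min: float
    lon_max: float
    features: list  # feature points belonging to this divided map data

def select_tiles(tiles, gps_lat, gps_lon, margin=0.0005):
    """Return the divided map data whose region contains the GPS fix.

    The margin emulates overlapping boundary regions, so a fix that lies
    exactly on a border still selects at least one tile.
    """
    selected = []
    for tile in tiles:
        if (tile.lat_min - margin <= gps_lat <= tile.lat_max + margin and
                tile.lon_min - margin <= gps_lon <= tile.lon_max + margin):
            selected.append(tile)
    return selected
```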
 The communication terminal 50 is, for example, a terminal capable of wireless communication, such as a smartphone, a tablet terminal, or a PC. The communication terminal 50 receives AR content from a content server (not shown), for example when the user uses an installed application. In the communication terminal 50, when the application is executed, imaging (continuous imaging) by the built-in camera is started. The communication terminal 50 then receives content including AR content from the content server (not shown) based on the positioning result estimated by the positioning server 10 from the captured image data, the captured image data itself, information about the camera of the communication terminal 50 (such as the angle of view), and the like. The communication terminal 50 includes an imaging unit 51, a positioning unit 52, a local position acquisition unit 53, a position derivation unit 54, and an imaging instruction unit 55.
 The imaging unit 51 controls the capture of image data by the camera. The imaging unit 51 starts capturing image data, for example, at the timing when execution of the application that receives AR content is started. Once imaging has started, the imaging unit 51 continues to capture images with the camera until execution of the application ends. The captured image data is used by the positioning server 10 to estimate global position information.
 The imaging instruction unit 55 determines, based on the content of the image data captured by the camera under the control of the imaging unit 51, whether the image data can be used for estimation of the global position information by the positioning server 10, and when it cannot be used, instructs the imaging unit 51 to capture new image data without transmitting the image data to the positioning server 10. When the imaging instruction unit 55 determines that the image data can be used for estimating the global position information, the communication terminal 50 transmits the image data and the positioning result of the positioning unit 52 described later (the terminal-acquired position information) to the positioning server 10.
 The imaging instruction unit 55 determines that the image data cannot be used for estimation of the global position information by the positioning server 10 when the image data captured by the communication terminal 50 contains dynamic objects at or above a predetermined level. A dynamic object is a moving object such as a person or a car, and is normally an object not included in the map data 100. The imaging instruction unit 55 may identify dynamic objects, for example, by performing semantic segmentation and classifying the image data pixel by pixel. The imaging instruction unit 55 may identify dynamic objects using image recognition techniques, or may, for example, identify as dynamic an object that appears in only one of two consecutively captured frames or whose position changes significantly between them, based on the difference between those frames.
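 As a sketch of how this criterion might be applied, assuming that a semantic segmentation step has already produced a per-pixel class label map, the frame could be rejected when the fraction of pixels belonging to dynamic classes exceeds a threshold; the class set and the threshold below are illustrative assumptions.

```python
import numpy as np

# Classes treated as dynamic objects (illustrative labels).
DYNAMIC_CLASSES = {"person", "car", "bicycle"}

def usable_for_global_estimation(label_map, class_names, max_dynamic_ratio=0.3):
    """Reject a frame in which dynamic objects cover too large a fraction of pixels.

    label_map: 2D array of class indices produced by a semantic segmentation model.
    class_names: mapping from class index to class name.
    """
    dynamic_pixels = sum(
        np.count_nonzero(label_map == index)
        for index, name in class_names.items()
        if name in DYNAMIC_CLASSES
    )
    return dynamic_pixels / label_map.size < max_dynamic_ratio
```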
 The imaging instruction unit 55 determines that the image data cannot be used for estimation of the global position information by the positioning server 10 when the image data captured by the communication terminal 50 contains motion blur at or above a predetermined level. Motion blur is the blurring that occurs when a moving subject is captured by a camera. When judging how much motion blur is contained in the image data, the imaging instruction unit 55 may actually analyze the image data, or may, for example, use the detection result of an acceleration sensor mounted on the communication terminal 50.
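 One common heuristic for this check is the variance of the Laplacian, which drops when a frame is smeared by motion; the sketch below uses it as one possible realization of the motion-blur criterion, with an assumed threshold (the embodiment leaves the detection method open).

```python
import cv2

def has_excessive_motion_blur(image_bgr, sharpness_threshold=100.0):
    """Return True when the frame looks too blurred for feature matching."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
    return sharpness < sharpness_threshold
```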
 The positioning unit 52 acquires the terminal-acquired position information, which is the global position information of the communication terminal 50, for example by performing GPS (Global Positioning System) positioning. The positioning unit 52 may perform positioning periodically, for example while image data is being captured under the control of the imaging unit 51. The positioning unit 52 may also acquire global position information by a positioning method other than GPS positioning (for example, base station positioning).
 The local position acquisition unit 53 continuously acquires local position information, which is relative position information of the communication terminal 50 with respect to a certain point. The certain point is, for example, the point at which execution of the application (the application that receives AR content) was started on the communication terminal 50. The local position acquisition unit 53 continuously acquires three-dimensional local position information, for example from sensors mounted on the communication terminal 50. The sensors mounted on the communication terminal 50 are not limited; for example, Visual SLAM (Simultaneous Localization and Mapping) using an inertial measurement unit (IMU), a camera sensor, a general acceleration sensor, or a gyro sensor can be used. For example, the three-dimensional latitude, longitude, and height of the point where execution of the application was started is taken as the reference (coordinates (x, y, z) = (0, 0, 0)), and the three-dimensional changes in latitude, longitude, and height from that point are acquired continuously, so that the local position information, that is, the relative position information of the communication terminal 50 with respect to that point, is tracked. In addition to the local position information, the result of this positioning may include information about the orientation (roll, pitch, and yaw in three-dimensional coordinates) estimated from the detection results of the acceleration sensor, the gyro sensor, or the like.
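 Conceptually, the local position acquisition unit 53 maintains a three-dimensional offset from the point where the application was started; a minimal sketch, which ignores how the underlying Visual SLAM / IMU pipeline produces each pose sample and uses assumed names, is shown below.

```python
import time

class LocalPositionTracker:
    """Tracks relative 3D position with the application start point as origin."""

    def __init__(self):
        self.origin = None   # pose at application start, mapped to (0, 0, 0)
        self.history = []    # (timestamp, local_xyz) samples

    def on_pose_sample(self, xyz):
        """Called continuously with pose samples from VSLAM / IMU integration."""
        if self.origin is None:
            self.origin = xyz
        local = tuple(c - o for c, o in zip(xyz, self.origin))
        self.history.append((time.time(), local))
        return local

    def local_position_at(self, timestamp):
        """Return the sample closest to the given time (e.g. the imaging time)."""
        return min(self.history, key=lambda s: abs(s[0] - timestamp))[1]
```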
 Based on the change in the local position information acquired by the local position acquisition unit 53, the position derivation unit 54 estimates the amount of movement of the communication terminal 50 since the time of imaging, and corrects the global position information estimated by the positioning server 10 based on that movement amount, thereby deriving the position information of the communication terminal 50 taking the movement amount into account. For example, suppose that the local position information (three-dimensional coordinates) at the time the communication terminal 50 captured certain image data was (3, 5, 3). The numbers used for the three-dimensional coordinates here are chosen for ease of explanation. Suppose further that the local position information (three-dimensional coordinates) is (1, 8, 2) when the position derivation unit 54 derives the position after acquiring the positioning result of the positioning server 10 for that image data. In this case, the position derivation unit 54 derives, from the coordinates at the time of imaging (3, 5, 3) and the current coordinates (1, 8, 2), the values (3-1=2, 5-8=-3, 3-2=1), that is, (2, -3, 1), as the amount of movement of the communication terminal 50 from the time of imaging to the present (the time of position derivation). The position derivation unit 54 then corrects the global position information by this movement amount. That is, by performing a correction that adds the movement amount (2, -3, 1) to the global position information (the global position information at the time of imaging), the position information (global position information) of the communication terminal 50 after it has moved since imaging can be derived appropriately. Note that the position derivation unit 54 only needs to correct the global position information based on the movement amount (taking the movement amount into account) and does not necessarily have to correct it by exactly the movement amount. That is, the position derivation unit 54 may, for example, correct the global position information by a value obtained by multiplying the movement amount by a predetermined adjustment value, taking into account errors in calculation timing, the communication time between the communication terminal 50 and the positioning server 10, and the like. The communication terminal 50 transmits the corrected position information of the communication terminal 50 to the positioning server 10.
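 Restated in code, the worked example above could look like the following sketch; the function and variable names are illustrative, and the movement amount follows the convention of the example in the text (the imaging-time local coordinates minus the current local coordinates, optionally scaled by an adjustment value, added to the global estimate for the imaging time).

```python
import numpy as np

def correct_global_position(global_at_imaging, local_at_imaging, local_now,
                            adjustment=1.0):
    """Correct the server's global estimate by the movement since imaging.

    The movement amount is taken as the imaging-time local coordinates minus
    the current local coordinates, as in the worked example, and is added to
    the global position estimated for the imaging time.
    """
    movement = np.asarray(local_at_imaging) - np.asarray(local_now)
    return np.asarray(global_at_imaging) + adjustment * movement

# The example from the text: local (3, 5, 3) at imaging time, (1, 8, 2) now,
# giving a movement amount of (2, -3, 1) that is added to the global estimate.
print(correct_global_position((0, 0, 0), (3, 5, 3), (1, 8, 2)))  # [ 2. -3.  1.]
```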
 Next, the processing performed by the positioning system 1 will be described with reference to FIG. 2. FIG. 2 is a sequence diagram showing the processing performed by the positioning system 1.
 As shown in FIG. 2, the communication terminal 50 starts capturing image data, for example, at the timing when execution of the application that receives AR content is started (step S1).
 Subsequently, the communication terminal 50 determines, based on the content of the captured image data, whether the image data can be used for estimation of the global position information by the positioning server 10 (step S2). In step S2, for example, when the captured image data contains dynamic objects at or above a predetermined level, it is determined that the image data cannot be used for estimating the global position information by the positioning server 10. Also in step S2, for example, when the captured image data contains motion blur at or above a predetermined level, it is determined that the image data cannot be used for estimating the global position information by the positioning server 10.
 When it is determined in step S2 that the captured image data cannot be used for estimation of the global position information by the positioning server 10, the communication terminal 50 captures new image data without transmitting the image data to the positioning server 10 (that is, the processing of step S1 is performed again). On the other hand, when it is determined in step S2 that the captured image data can be used for estimation of the global position information by the positioning server 10, the communication terminal 50 transmits the image data and the GPS information (terminal-acquired position information) to the positioning server 10 (step S3) and starts tracking the local position information (step S4).
 The positioning server 10 selects one or more divided map data to be used from the plurality of divided map data based on the GPS information (terminal-acquired position information) acquired from the communication terminal 50 (step S5). The positioning server 10 then estimates the global position information of the communication terminal 50 at the time of imaging based on the result of matching between the feature points of the selected divided map data and the feature points of the image data captured by the communication terminal 50 (step S6). The positioning server 10 transmits the estimated global position information to the communication terminal 50 (step S7).
 The communication terminal 50 corrects the global position information transmitted from the positioning server 10 based on the local position information (step S8). Specifically, the communication terminal 50 estimates the amount of movement since the time of imaging based on the change in the local position information acquired by the local position acquisition unit 53, and corrects the global position information estimated by the positioning server 10 based on that movement amount, thereby deriving the position information of the communication terminal 50 taking the movement amount into account. The communication terminal 50 transmits the derived corrected position information to the positioning server 10 (step S9). AR content is then provided using this position information.
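 The client-side flow of steps S1 to S9 could be summarized as in the following sketch, in which every dependency is passed in as a callable and all names are placeholders rather than an API of the embodiment.

```python
def positioning_round(capture, is_usable, get_gps_fix, estimate_on_server,
                      local_position_now, report):
    """One pass through steps S1-S9 of FIG. 2, from the terminal's point of view.

    Every argument is a callable supplied by the surrounding application; the
    names are placeholders and not part of the embodiment.
    """
    # S1/S2: capture until a frame passes the suitability check
    # (dynamic-object and motion-blur criteria).
    image = capture()
    while not is_usable(image):
        image = capture()

    # S3/S4: send the frame with the GPS fix and remember the local position
    # at (roughly) imaging time while local tracking continues.
    gps_fix = get_gps_fix()
    local_at_imaging = local_position_now()
    global_at_imaging = estimate_on_server(image, gps_fix)   # S5-S7 on the server

    # S8: correct the server estimate by the movement since imaging,
    # following the convention of the worked example above.
    movement = tuple(a - b for a, b in zip(local_at_imaging, local_position_now()))
    corrected = tuple(g + m for g, m in zip(global_at_imaging, movement))

    # S9: report the corrected position (used for AR content delivery).
    report(corrected)
    return corrected
```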
 Next, the operation and effects of the positioning system 1 according to the present embodiment will be described.
 The positioning system 1 performs positioning based on image data captured by the communication terminal 50 and includes: a local position acquisition unit 53 that continuously acquires local position information, which is relative position information of the communication terminal 50 with respect to a certain point; a storage unit 11 that stores the map data 100, in which the feature amounts of feature points included in previously acquired image data are associated with global position information, which is the absolute position information of those feature points; a positioning unit 12 that estimates the global position information of the communication terminal 50 at the time of imaging based on the image data captured by the communication terminal 50 and the map data 100 stored in the storage unit 11; and a position derivation unit 54 that estimates the amount of movement of the communication terminal 50 since the time of imaging based on changes in the local position information acquired by the local position acquisition unit 53 and corrects the global position information estimated by the positioning unit 12 based on that movement amount, thereby deriving the position information of the communication terminal 50 taking the movement amount into account.
 In the positioning system 1 according to the present embodiment, the global position information of the communication terminal 50 at the time of imaging is estimated based on the image data captured by the communication terminal 50 and the map data 100 (data in which the feature amounts of feature points included in previously acquired image data are associated with the global position information of those feature points). Here, for example, when the user carrying the communication terminal 50 moves and the position of the communication terminal 50 changes after the image data is captured, the actual position of the communication terminal 50 and the positioning result may deviate from each other even if the global position information has been estimated with high accuracy from the captured image data and the map data 100. In this regard, in the positioning system 1, the amount of movement of the communication terminal 50 since the time of imaging is estimated from the continuously acquired local position information of the communication terminal 50, the global position information is corrected based on that movement amount, and the position information of the communication terminal 50 taking the movement amount into account is derived. By estimating the movement amount from the continuously acquired local position information, correcting the global position information accordingly, and deriving the final position information of the communication terminal 50 in this way, the actual position of the communication terminal 50 can be identified with high accuracy even when the communication terminal 50 has moved after the image data was captured. Accordingly, the positioning system 1 can improve positioning accuracy. When positioning accuracy is low, the processing load may increase, for example because the user repeatedly requests positioning results; by improving positioning accuracy, such problems become less likely, which also has the technical effect of reducing the processing load on processing units such as the CPU.
 For example, if global positioning using the map data 100 were performed by the communication terminal 50 alone, the capacity of the map data 100 that can be stored would be limited depending on the performance of the communication terminal 50, and positioning accuracy might not be improved sufficiently. In this respect, as in the present embodiment, the large-scale map data 100 used for positioning is stored in the positioning server 10 and the image data used for positioning is acquired by the communication terminal 50, which improves positioning accuracy and realizes quick positioning. In addition, the processing on the communication terminal 50 side, such as capturing image data and acquiring local position information, depends little on the performance or OS of the communication terminal 50, so that an architecture largely independent of the performance and OS of the communication terminal 50 can be realized.
 The positioning unit 12 matches the feature points of the map data 100 against the feature points of the image data captured by the communication terminal 50 and estimates the global position information of the communication terminal 50 at the time of imaging based on the global position information associated with the feature points of the map data 100. By matching feature points between the map data 100 and the captured image data in this way, the correspondence between the map data 100 and the image data is identified appropriately, and the global position information of the communication terminal 50 at the time of imaging can be estimated appropriately based on the global position information associated with the feature points of the map data 100.
 The storage unit 11 stores a plurality of divided map data obtained by dividing the map data 100 into fixed regions according to the global position information. The positioning unit 12 acquires, together with the image data captured by the communication terminal 50, the terminal-acquired position information, which is the global position information of the communication terminal 50 obtained by the communication terminal 50 itself, selects one or more divided map data from the plurality of divided map data according to the terminal-acquired position information, and estimates the global position information of the communication terminal 50 at the time of imaging based on the result of matching between the feature points of the selected divided map data and the feature points of the image data captured by the communication terminal 50. By dividing the map data 100 into a plurality of pieces according to the global position information and selecting the divided map data to be matched according to the terminal-acquired position information, the matching range (search range) can be narrowed down appropriately and the matching process can be made more efficient. That is, if the stored map data were a single vast map, the search range would be too wide when image data is transmitted from the communication terminal to the positioning server for self-position recognition, and search efficiency would deteriorate. By dividing the stored map data 100 into fixed units and selecting the divided map data to be matched from the plurality of divided map data according to the terminal-acquired position information, the search range can be limited and the processing can be made more efficient.
 The positioning system 1 further includes the imaging instruction unit 55, which determines, based on the content of the image data captured by the communication terminal 50, whether the image data can be used for estimation of the global position information by the positioning unit 12, and instructs the capture of new image data when it cannot be used. By prompting the capture of new image data when the captured image data is unsuitable for estimation, before the position estimation by the positioning unit 12, wasted processing in the positioning unit 12 can be avoided and the total positioning time can be shortened.
 The imaging instruction unit 55 determines that the image data cannot be used for estimation of the global position information by the positioning unit 12 when the image data captured by the communication terminal 50 contains dynamic objects at or above a predetermined level. Dynamic objects do not exist persistently and are considered not to be recorded in the map data 100. Such objects therefore become noise in the estimation of the global position information and can cause a decrease in positioning accuracy. By determining that image data containing dynamic objects at or above a predetermined level cannot be used for estimating the global position information, the decrease in positioning accuracy caused by dynamic objects can be suppressed.
 The imaging instruction unit 55 determines that the image data cannot be used for estimation of the global position information by the positioning unit 12 when the image data captured by the communication terminal 50 contains motion blur at or above a predetermined level. If the image data contains a large amount of motion blur, feature points cannot be matched properly, which can cause a decrease in positioning accuracy. By determining that image data containing motion blur at or above a predetermined level cannot be used for estimating the global position information, the decrease in positioning accuracy caused by motion blur can be suppressed.
 The storage unit 11 stores three-dimensional position information as the global position information of the map data 100, the positioning unit 12 estimates three-dimensional position information as the global position information of the communication terminal 50, and the local position acquisition unit 53 acquires three-dimensional position information as the local position information of the communication terminal 50. This allows the position derivation unit 54 to derive three-dimensional position information.
 Finally, the hardware configurations of the positioning server 10 and the communication terminal 50 included in the positioning system 1 will be described with reference to FIG. 3. The positioning server 10 and the communication terminal 50 described above may each be physically configured as a computer device including a processor 1001, a memory 1002, a storage 1003, a communication device 1004, an input device 1005, an output device 1006, a bus 1007, and the like.
 In the following description, the word "device" can be read as a circuit, a device, a unit, or the like. The hardware configurations of the positioning server 10 and the communication terminal 50 may include one or more of each of the devices shown in the figure, or may omit some of the devices.
 Each function of the positioning server 10 and the communication terminal 50 is realized by loading predetermined software (a program) onto hardware such as the processor 1001 and the memory 1002, so that the processor 1001 performs computations and controls communication by the communication device 1004 and the reading and/or writing of data in the memory 1002 and the storage 1003.
 The processor 1001 operates, for example, an operating system to control the entire computer. The processor 1001 may be configured as a central processing unit (CPU) including interfaces with peripheral devices, a control device, an arithmetic device, registers, and the like. For example, the control functions of the positioning unit 12 and other units of the positioning server 10 may be realized by the processor 1001.
 The processor 1001 also reads programs (program code), software modules, and data from the storage 1003 and/or the communication device 1004 into the memory 1002 and executes various processes according to them. As the program, a program that causes a computer to execute at least part of the operations described in the above embodiment is used. For example, the control functions of the positioning unit 12 and other units of the positioning server 10 may be realized by a control program stored in the memory 1002 and run on the processor 1001, and the other functional blocks may be realized in the same way. Although the various processes described above have been explained as being executed by a single processor 1001, they may be executed simultaneously or sequentially by two or more processors 1001. The processor 1001 may be implemented with one or more chips. The program may be transmitted from a network via a telecommunication line.
 The memory 1002 is a computer-readable recording medium and may be composed of, for example, at least one of ROM (Read Only Memory), EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), RAM (Random Access Memory), and the like. The memory 1002 may be called a register, a cache, a main memory (main storage device), or the like. The memory 1002 can store programs (program code), software modules, and the like that can be executed to implement the wireless communication method according to an embodiment of the present invention.
 The storage 1003 is a computer-readable recording medium and may be composed of, for example, at least one of an optical disc such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory (for example, a card, a stick, or a key drive), a floppy (registered trademark) disk, a magnetic strip, and the like. The storage 1003 may be called an auxiliary storage device. The above-mentioned storage medium may be, for example, a database, a server, or another suitable medium including the memory 1002 and/or the storage 1003.
 The communication device 1004 is hardware (a transmitting/receiving device) for performing communication between computers via a wired and/or wireless network, and is also referred to as, for example, a network device, a network controller, a network card, or a communication module.
 The input device 1005 is an input device (for example, a keyboard, a mouse, a microphone, a switch, a button, or a sensor) that receives input from the outside. The output device 1006 is an output device (for example, a display, a speaker, or an LED lamp) that performs output to the outside. The input device 1005 and the output device 1006 may be integrated (for example, as a touch panel).
 The devices such as the processor 1001 and the memory 1002 are connected by a bus 1007 for communicating information. The bus 1007 may be composed of a single bus or of different buses between devices.
 The positioning server 10 and the communication terminal 50 may also be configured to include hardware such as a microprocessor, a digital signal processor (DSP), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array), and part or all of each functional block may be realized by such hardware. For example, the processor 1001 may be implemented with at least one of these pieces of hardware.
 Although the present embodiment has been described in detail above, it is obvious to those skilled in the art that the present embodiment is not limited to the embodiment described in this specification. The present embodiment can be implemented in modified and altered forms without departing from the spirit and scope of the present invention defined by the claims. Therefore, the description in this specification is for illustrative purposes only and is not intended to limit the present embodiment in any way.
 For example, the positioning system 1 has been described as including the communication terminal 50 and the positioning server 10, but the system is not limited to this configuration, and each function of the positioning system 1 may be realized by the communication terminal alone. The global position information may also be corrected using information other than the local position information. For example, when the spot that the user will visit next can be estimated from the spot identified by the global position information (that is, when the user's movement path can be estimated from the characteristics of the spot), the global position information may be corrected based on the estimated movement path. In that case, the accuracy of estimating the movement path may be improved using information such as the user's sex and age, thereby improving the accuracy of correcting the global position information. When the global position information corresponds to a spot that the user has visited before, the global position information may be corrected in consideration of past information (the movement paths the user has followed in the past). In addition, for example, when a plurality of users use the same application and run the application cooperatively, they may share their local position information with one another and correct the global position information based on one another's local position information. Furthermore, the accuracy of estimating the movement path may be improved using the user's search history on the communication terminal 50 or the content of the information the user is browsing (the user's interests), and the global position information may be corrected accordingly.
 Each aspect/embodiment described in this specification may be applied to systems that use LTE (Long Term Evolution), LTE-A (LTE-Advanced), SUPER 3G, IMT-Advanced, 4G, 5G, FRA (Future Radio Access), W-CDMA (registered trademark), GSM (registered trademark), CDMA2000, UMB (Ultra Mobile Broadband), IEEE 802.11 (Wi-Fi), IEEE 802.16 (WiMAX), IEEE 802.20, UWB (Ultra-Wide Band), Bluetooth (registered trademark), or other suitable systems, and/or to next-generation systems extended based on these.
 The order of the processing procedures, sequences, flowcharts, and the like of each aspect/embodiment described in this specification may be changed as long as no contradiction arises. For example, the methods described in this specification present the elements of the various steps in an exemplary order and are not limited to the specific order presented.
 Input and output information and the like may be stored in a specific location (for example, a memory) or managed in a management table. Input and output information and the like can be overwritten, updated, or appended. Output information and the like may be deleted. Input information and the like may be transmitted to another device.
 A determination may be made by a value represented by one bit (0 or 1), by a Boolean value (true or false), or by a comparison of numerical values (for example, a comparison with a predetermined value).
 Each aspect/embodiment described in this specification may be used alone, in combination, or switched according to execution. Notification of predetermined information (for example, notification of "being X") is not limited to explicit notification and may be performed implicitly (for example, by not notifying the predetermined information).
 Software, whether referred to as software, firmware, middleware, microcode, a hardware description language, or any other name, should be interpreted broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executable files, threads of execution, procedures, functions, and the like.
 Software, instructions, and the like may also be transmitted and received via a transmission medium. For example, when software is transmitted from a website, a server, or another remote source using wired technologies such as coaxial cable, optical fiber cable, twisted pair, and digital subscriber line (DSL) and/or wireless technologies such as infrared, radio, and microwave, these wired and/or wireless technologies are included within the definition of a transmission medium.
 The information, signals, and the like described in this specification may be represented using any of a variety of different technologies. For example, data, instructions, commands, information, signals, bits, symbols, chips, and the like that may be referred to throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, optical fields or photons, or any combination thereof.
 The terms described in this specification and/or the terms necessary for understanding this specification may be replaced with terms having the same or similar meanings.
 The information, parameters, and the like described in this specification may be represented by absolute values, by values relative to a predetermined value, or by other corresponding information.
 A user terminal may also be referred to by those skilled in the art as a mobile communication terminal, a subscriber station, a mobile unit, a subscriber unit, a wireless unit, a remote unit, a mobile device, a wireless device, a wireless communication device, a remote device, a mobile subscriber station, an access terminal, a mobile terminal, a wireless terminal, a remote terminal, a handset, a user agent, a mobile client, a client, or some other suitable term.
 As used in this specification, the terms "determining" and "deciding" may encompass a wide variety of actions. "Determining" and "deciding" may include, for example, regarding calculating, computing, processing, deriving, investigating, looking up (for example, looking up in a table, a database, or another data structure), or ascertaining as "determining" or "deciding". "Determining" and "deciding" may also include regarding receiving (for example, receiving information), transmitting (for example, transmitting information), input, output, or accessing (for example, accessing data in a memory) as "determining" or "deciding". Furthermore, "determining" and "deciding" may include regarding resolving, selecting, choosing, establishing, comparing, and the like as "determining" or "deciding". In other words, "determining" and "deciding" may include regarding some action as "determining" or "deciding".
 As used in this specification, the phrase "based on" does not mean "based only on" unless expressly specified otherwise. In other words, the phrase "based on" means both "based only on" and "based at least on".
 Where designations such as "first" and "second" are used in this specification, any reference to those elements does not generally limit the quantity or order of the elements. These designations may be used in this specification as a convenient way of distinguishing between two or more elements. Accordingly, a reference to first and second elements does not mean that only two elements may be employed or that the first element must precede the second element in some way.
 To the extent that the terms "include" and "including" and variations thereof are used in this specification or in the claims, these terms are intended to be inclusive in the same manner as the term "comprising". Furthermore, the term "or" as used in this specification or in the claims is not intended to be an exclusive OR.
 In this specification, a plurality of devices is also included unless it is clear from the context or from the technology that only one device exists.
 Throughout this disclosure, the plural is included unless the context clearly indicates the singular.
 DESCRIPTION OF REFERENCE SIGNS: 1... positioning system; 11... storage unit; 12... positioning unit (global position estimation unit); 50... communication terminal (terminal); 53... local position acquisition unit; 54... position derivation unit; 55... imaging instruction unit; 100... map data.
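 As an illustrative assumption only, the reference signs above can be mapped onto the following skeleton, which reflects one possible split of the described units between the positioning server 10 and the communication terminal 50. The class and method names mirror the named units; the method bodies are placeholders rather than the disclosed implementation.

```python
# Illustrative skeleton only: one possible organization of the named units.
from typing import Optional, Tuple

Position = Tuple[float, float, float]  # three-dimensional positions are also contemplated

class PositioningServer:                       # positioning server 10
    def __init__(self, map_data):
        self.storage_unit = map_data           # storage unit 11: feature points + global positions

    def estimate_global_position(self, image_data) -> Optional[Position]:
        """Positioning unit 12: match image features against the stored map data."""
        raise NotImplementedError              # matching/pose-estimation details omitted here

class CommunicationTerminal:                   # communication terminal 50
    def __init__(self, server: PositioningServer):
        self.server = server
        self.local_track = []                  # local position acquisition unit 53

    def record_local_position(self, p: Position) -> None:
        self.local_track.append(p)             # continuously acquired relative positions

    def request_new_image(self) -> None:
        """Imaging instruction unit 55: request capture of a usable image."""
        print("please capture a new image")

    def derive_position(self, image_data, local_at_capture: Position) -> Optional[Position]:
        """Position derivation unit 54: global fix corrected by movement since capture."""
        fix = self.server.estimate_global_position(image_data)
        if fix is None or not self.local_track:
            return None
        now = self.local_track[-1]
        dx, dy, dz = (n - c for n, c in zip(now, local_at_capture))
        return (fix[0] + dx, fix[1] + dy, fix[2] + dz)
```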

Claims (7)

  1.  A positioning system that performs positioning based on image data captured by a terminal, the positioning system comprising:
     a local position acquisition unit configured to continuously acquire local position information, which is relative position information of the terminal with respect to a certain point;
     a storage unit configured to store map data in which feature amounts of feature points included in previously acquired image data are associated with global position information, which is absolute position information associated with the feature points;
     a global position estimation unit configured to estimate global position information of the terminal at a time of imaging by the terminal, based on image data captured by the terminal and the map data stored in the storage unit; and
     a position derivation unit configured to estimate an amount of movement of the terminal since the time of imaging based on a change in the local position information acquired by the local position acquisition unit, and to derive, by correcting the global position information estimated by the global position estimation unit based on the amount of movement, position information of the terminal in which the amount of movement is taken into account.
  2.  The positioning system according to claim 1, wherein the global position estimation unit performs matching between feature points of the map data and feature points of the image data captured by the terminal, and estimates the global position information of the terminal at the time of imaging based on the global position information associated with the feature points of the map data.
  3.  The positioning system according to claim 2, wherein
     the storage unit stores a plurality of pieces of divided map data obtained by dividing the map data into fixed areas according to the global position information, and
     the global position estimation unit acquires, together with the image data captured by the terminal, terminal-acquired position information that is global position information of the terminal acquired by the terminal, selects one or more pieces of divided map data from the plurality of pieces of divided map data according to the terminal-acquired position information, and estimates the global position information of the terminal at the time of imaging based on a result of matching between feature points of the selected divided map data and feature points of the image data captured by the terminal.
  4.  The positioning system according to any one of claims 1 to 3, further comprising an imaging instruction unit configured to determine, based on content of the image data captured by the terminal, whether the image data can be used by the global position estimation unit to estimate the global position information, and to instruct capture of new image data when the image data cannot be used.
  5.  The positioning system according to claim 4, wherein the imaging instruction unit determines that the image data captured by the terminal cannot be used by the global position estimation unit to estimate the global position information when the image data contains dynamic objects at or above a predetermined value.
  6.  The positioning system according to claim 4 or 5, wherein the imaging instruction unit determines that the image data captured by the terminal cannot be used by the global position estimation unit to estimate the global position information when the image data contains motion blur at or above a predetermined value.
  7.  The positioning system according to any one of claims 1 to 6, wherein
     the storage unit stores three-dimensional position information as the global position information of the map data,
     the global position estimation unit estimates three-dimensional position information as the global position information of the terminal, and
     the local position acquisition unit acquires three-dimensional position information as the local position information of the terminal.
PCT/JP2019/030360 2018-12-03 2019-08-01 Positioning system WO2020115945A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018226346A JP2022028981A (en) 2018-12-03 2018-12-03 Positioning system
JP2018-226346 2018-12-03

Publications (1)

Publication Number Publication Date
WO2020115945A1 true WO2020115945A1 (en) 2020-06-11

Family

ID=70974184

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/030360 WO2020115945A1 (en) 2018-12-03 2019-08-01 Positioning system

Country Status (2)

Country Link
JP (1) JP2022028981A (en)
WO (1) WO2020115945A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010029482A (en) * 2008-07-29 2010-02-12 Univ Of Tsukuba Diagnostic supporting system for automatically creating follow-up observation report on vertebra-vertebral body lesion using mri
US20150241232A1 (en) * 2012-08-16 2015-08-27 Plk Technologies Navigation system for determining route change of vehicle
JP2016125913A (en) * 2015-01-05 2016-07-11 キヤノン株式会社 Image acquisition device and control method of image acquisition device
JP2016170060A (en) * 2015-03-13 2016-09-23 三菱電機株式会社 Facility information display system, mobile terminal, server and facility information display method

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022181138A1 (en) * 2021-02-25 2022-09-01 日野自動車株式会社 Self-position estimation device

Also Published As

Publication number Publication date
JP2022028981A (en) 2022-02-17

Similar Documents

Publication Publication Date Title
AU2015402322B2 (en) System and method for virtual clothes fitting based on video augmented reality in mobile phone
JP6339200B2 (en) Method and apparatus for position estimation using trajectories
CN107123142B (en) Pose estimation method and device
US20140233800A1 (en) Method of tracking object and electronic device supporting the same
KR20160003066A (en) Monocular visual slam with general and panorama camera movements
US20150213325A1 (en) Incremental learning for dynamic feature database management in an object recognition system
JP6296454B2 (en) Positioning method and apparatus by base station
JP6462528B2 (en) MOBILE BODY TRACKING DEVICE, MOBILE BODY TRACKING METHOD, AND MOBILE BODY TRACKING PROGRAM
JP7015017B2 (en) Object segmentation of a series of color image frames based on adaptive foreground mask upsampling
CN113766636A (en) Apparatus and method for estimating position in wireless communication system
WO2020115945A1 (en) Positioning system
CN108010052A (en) Method for tracking target and system, storage medium and electric terminal in complex scene
KR102612792B1 (en) Electronic device and method for determining entry in region of interest thereof
WO2020115944A1 (en) Map data generation device
JP6945004B2 (en) Information processing device
JP7355840B2 (en) AR system and terminal
WO2021075318A1 (en) Positioning system and terminal
CN113160258A (en) Method, system, server and storage medium for extracting building vector polygon
JP7198966B2 (en) Positioning system
CN114630102A (en) Method and device for detecting angle change of data acquisition equipment and computer equipment
WO2021166954A1 (en) Map data generation device and positioning device
JP6957651B2 (en) Information processing device
JP2019159787A (en) Person detection method and person detection program
JP2022083707A (en) Information processing system
JP6813463B2 (en) Information processing equipment and information processing system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19892809

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19892809

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP