WO2006035476A1 - Position specifying server and mobile terminal - Google Patents
- Publication number
- WO2006035476A1 (PCT/JP2004/014061)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- reference data
- information
- unit
- mobile terminal
- candidate
- Prior art date
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
- G09B29/10—Map spot or coordinate position indicators; Map reading aids
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/02—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
- G01S5/0252—Radio frequency fingerprinting
- G01S5/02521—Radio frequency fingerprinting using a radio-map
- G01S5/02522—The radio-map containing measured values of non-radio values
Definitions
- the present invention relates to a position information service system that provides information for specifying a position from an image photographed by a terminal.
- GPS: Global Positioning System
- GPS, gyroscopes, and vehicle speed pulses are used to measure and track the position of a car in car navigation. GPS positioning is further corrected by map matching against roads, since the direction of travel obtained from the gyro includes errors. However, even if car navigation map matching is applied as is, it cannot follow the movement of pedestrians.
- Japanese Patent Laid-Open No. 9-190413 (information processing device; Ricoh) discloses a recognition device that recognizes the user's location, such as "office A meeting room" or "office B corridor", by means of a neural network, and uses the recognized location name for PDA user information management and the like.
- Such location recognition is realized by an information processing device that holds the location information of offices A and B and has learned the characteristics of the conference rooms and corridors.
- Patent Document 1 Japanese Patent Laid-Open No. 9-190413
- the present invention has been made in order to eliminate the above-described drawbacks of the prior art.
- an object of the present invention is to pinpoint the user's position more precisely than the rough position obtained by GPS, PHS (registered trademark), or the like.
- a further object is to realize position specification by having the user choose a specific target and photograph it.
- This makes it possible to provide a service that identifies the user's location without preparing a huge amount of reference data, to provide information according to that location, and to offer navigation from there to a destination.
- the location server according to the present invention includes:
- a communication unit that receives, from a mobile terminal, approximate position information that can generally specify a position where the mobile terminal exists and photo information taken by a shooting unit of the mobile terminal.
- Candidate selection unit that selects reference data to be compared based on the relationship between the approximate position information and the position information.
- Collation evaluation unit that evaluates collation results and determines acceptance / rejection of reference data
- a position specifying unit for specifying position information stored in association with the reference data in the reference data database when it is determined that the reference data is adopted.
- a rough position of the user can be narrowed down by having the user photograph objects in the town, such as signboards or marks that share the same color, logo, or shape.
- Maintenance, such as when a signboard design changes, can be handled for all stores by simply updating the single original piece of reference data.
- When new stores open, only their location information needs to be added, so per-store maintenance is unnecessary.
- FIG. 1 is a configuration diagram of a position specifying system according to the first embodiment.
- a terminal (mobile terminal) 100 includes a rough position acquisition unit 1, a photographing unit 2, a display unit 3, and a communication unit 4a.
- the approximate position acquisition unit 1 is a functional module that obtains the approximate position A with the terminal 100
- the photographing unit 2 is a functional module that captures the photograph B with the terminal 100
- the display unit 3 displays the map received from the server 200.
- the communication unit 4a is a communication function module with the server 200.
- the server (location specifying server) 200 includes a candidate selection unit 10, a map selection unit 15, a reference data database 40, a map database 41, a communication unit 4b, and a matching processing device 150.
- the candidate selection unit 10 selects a candidate set C as a target at the position based on the received information on the approximate position A and the information in the reference data database 40.
- the reference data selected in candidate set C is called candidate data.
- the map selection unit 15 selects a map D corresponding to the position from the map database 41.
- the communication unit 4b is a communication function module with the terminal.
- the collation processing device 150 receives the selected candidate set C and the map D as input, and consists of a collation unit 20 that collates the candidate set C with the photo B, a collation evaluation unit 21 that evaluates the collation result, a position specifying unit 22 that specifies the shooting position based on the evaluation result, and a map display unit 23 that places a mark for the specified position on the map.
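The server-side flow formed by these units can be sketched as follows. This is a minimal illustrative sketch only; the class, function, and parameter names (`ReferenceData`, `locate`, `threshold`, and so on) are assumptions for illustration and do not appear in the patent.

```python
from dataclasses import dataclass

@dataclass
class ReferenceData:
    name: str      # e.g. a store chain whose signboard this image shows
    image: object  # the signboard image used for collation
    lat: float     # position stored in association with the image
    lon: float

def locate(approx_pos, photo, database, select, collate, threshold):
    """Select candidates near the approximate position, collate each
    against the photo, and adopt the best candidate whose evaluation
    score clears the threshold (returns None if none is adopted)."""
    candidates = select(approx_pos, database)                    # candidate selection unit
    scored = [(collate(c.image, photo), c) for c in candidates]  # collation unit
    scored = [(s, c) for s, c in scored if s >= threshold]       # collation evaluation unit
    if not scored:
        return None
    _, best = max(scored, key=lambda sc: sc[0])
    return (best.lat, best.lon)                                  # position specifying unit
```

The `select` and `collate` callables stand in for the candidate selection and matching steps described in the following sections.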
- FIG. 2 is a diagram showing a processing flow of the server according to the first embodiment.
- the server 200 receives the approximate position information A from the terminal 100 when there is a position specifying request from the terminal 100. Then, a reference image candidate set C corresponding to the position information indicating the range of the approximate position is searched from the reference data database 40 (step ST101).
- the terminal 100 is configured to acquire the approximate position information A using the approximate position acquisition unit 1, such as GPS, either in response to a request from the server or voluntarily, and to transmit the acquired approximate position information A to the server 200.
- FIG. 3 is a diagram showing an example of the reference data database according to the first embodiment.
- a signboard of a convenience store chain with thousands of stores in various places is photographed, and the cut-out signboard portion is stored as reference data. Since a chain uses the same signboard at every branch, a single piece of reference data suffices; there is no need to prepare reference data for that chain's signboard for each branch.
- using the photograph of the front of the convenience store's signboard as reference data, the location of the store at each site is stored in association with it as longitude and latitude.
- the reference data is not limited to photographs, but may be 2D images, 3D Computer Graphics (CG), logos, character strings, sounds, and the like.
- the location information may also be other information, such as the base station area of a mobile phone. For example, the following explanation assumes that the candidate data selected in ST101 are "Psuper" and "SMART".
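As a concrete illustration of narrowing candidates by approximate position (step ST101), the following sketch uses a simple latitude/longitude box around the approximate position. The records and the `select_candidates` helper are hypothetical; as the text notes, a real system might instead key candidates on base-station areas.

```python
# Illustrative reference-data records: chain name plus the stored position
# (a real database would also hold the signboard image itself).
DB = [
    {"name": "Psuper", "lat": 35.658, "lon": 139.702},
    {"name": "SMART",  "lat": 35.660, "lon": 139.700},
    {"name": "Psuper", "lat": 35.900, "lon": 139.900},  # a distant branch
]

def select_candidates(approx_lat, approx_lon, radius_deg, database=DB):
    """Keep reference data whose stored position lies within a simple
    lat/lon box around the approximate position."""
    return [r for r in database
            if abs(r["lat"] - approx_lat) <= radius_deg
            and abs(r["lon"] - approx_lon) <= radius_deg]
```

Selecting around a point near the first two records returns only those two; the distant branch is excluded.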
- the collation unit 20 performs matching (collation) between the photograph B taken by the user and each reference data of the candidate set C selected as a candidate, and the collation evaluation unit 21 evaluates the matching (collation) result. (Step ST103).
- as shown in FIG. 4, each candidate data item (a piece of reference data) is shifted little by little from the upper left of the photograph B while being compared.
- the part of photograph B with the highest matching result is regarded as the matching location, and the score at that part becomes the evaluation for that candidate data.
- the evaluation score is computed by comparing the color histograms of the reference data and the photograph. A reduced number of colors is used, because with many colors the measured color may vary with shooting conditions even for the same signboard.
- It is then checked whether any candidate data has an evaluation score above the set threshold (step ST104), and among those, the candidate data with the highest evaluation score is selected (step ST105). The higher the score, the more likely it is that the signboard or mark is the same. The photo taken by the user does not necessarily contain a specific target; in such a case a match against some other signboard would lead to a wrong decision, so the threshold is set to reject it.
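The coarse color-histogram comparison described above might look like the following sketch. The quantization to a few levels per channel and the overlap score are illustrative choices, not values given in the patent.

```python
def coarse_histogram(pixels, levels=4):
    """Quantize each RGB channel to a few levels so that lighting
    variation between shots of the same signboard matters less."""
    hist = {}
    for r, g, b in pixels:
        key = (r * levels // 256, g * levels // 256, b * levels // 256)
        hist[key] = hist.get(key, 0) + 1
    return hist

def histogram_score(a, b):
    """Histogram overlap normalized to [0, 1]; 1.0 means identical
    coarse color distributions."""
    overlap = sum(min(a.get(k, 0), b.get(k, 0)) for k in set(a) | set(b))
    return overlap / max(sum(a.values()), sum(b.values()))
```

Two shots of the same red signboard under slightly different lighting still score 1.0 after quantization, while a blue signboard scores 0.0.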
- FIG. 5 is a diagram showing an example in which the position is narrowed down on the map.
- the approximate position A of the user is represented by a circle, and the narrowed position is represented by an ellipse.
- GPS and base station information alone provide only rough area information. In this system, however, the user's position can be narrowed down further by collating the photo with the reference data.
- the reference data database 40 stores the reference data, which are images of target objects that the holder of a mobile terminal may photograph in order to specify a position during movement, in association with position information.
- the communication unit 4b receives from the mobile terminal the approximate position information that can roughly specify the position of the mobile terminal and the photo information captured by the imaging unit 2 of the mobile terminal.
- the candidate selection unit 10 selects reference data to be collated from the reference data database 40 based on the relationship between the approximate position information and the position information; the collation unit 20 collates the selected reference data with the photo information; and the collation evaluation unit 21 evaluates the collation result and determines whether the reference data is adopted. When the reference data is adopted, the position specifying unit 22 specifies the position information stored in the reference data database 40 in association with that reference data.
- the map selection unit 15 selects map information based on the position information specified by the position specifying unit 22.
- the map display unit 23 adds, to the selected map information, information for visually identifying the position based on the position information, and the resulting map information is provided to the mobile terminal via the communication unit 4b.
- Embodiment 1 showed a configuration in which the collation processing device 150 is placed on the server, but it can also be mounted on the terminal.
- the approximate position acquisition unit 1 acquires approximate position information that can generally specify the position of the moving object
- the photographing unit 2 obtains photographic information by taking a picture
- the communication unit 4a transmits the approximate position information to the server 200 and receives from the server 200 the reference data to be collated and the position information corresponding to that reference data.
- the collation unit 20 provided in the terminal 100 collates the received reference data with the photo information, and the collation evaluation unit 21 provided in the terminal 100 evaluates the collation result and determines whether the reference data is adopted. When the reference data is adopted, the position specifying unit 22 provided in the terminal 100 specifies the position of the mobile terminal based on the position information corresponding to the reference data.
- In Embodiment 1, the candidate data is compared against every part of the photograph B for matching.
- In this embodiment, it is assumed that the shooting target is placed in the center of the photograph.
- Then, only the candidate data and the central portion of the shot photograph are collated.
- the photograph B is divided into a number of sections, such as 9 or 12, the central area is designated, and only that central portion is compared and collated with the candidate data; a candidate with a high evaluation score is then selected.
- the central part may be specified by specifying coordinates.
- the collation unit collates only a part of the photographic information.
- Embodiment 4: In Embodiment 1, the position specifying unit specifies a shooting position that narrows down the approximate position based on position information associated with reference data captured from one direction (usually the front).
- In this embodiment, reference data captured from various distances and angles is prepared for each target, and information such as the distance and angle associated with that reference data is used to narrow the position down further.
- FIG. 6 is a diagram illustrating a processing flow of the server according to the fourth embodiment.
- FIG. 7 is a diagram showing an example of a reference data database according to the fourth embodiment. As shown in the figure, for each target, photographs taken from various distances and angles are prepared in addition to the front image of the signboard and logo for each store. Considering that shooting characteristics differ between terminals, it is also effective to prepare such photographs for each type of terminal.
- FIG. 8 is a diagram showing the relationship between the distance and angle during shooting. With this information, the direction of the signboard can be determined.
- the angle of the reference data is written as 0 for the front of the signboard, +30 for 30 degrees to the right of center, and -20 for 20 degrees to the left.
- the writing method is not limited to this.
- the corresponding reference data is narrowed down to specify the position.
- the priority and threshold values of the conditions for narrowing down candidates can be determined according to the situation: the accuracy of the position acquisition unit and the direction information acquisition unit, the type and amount of data prepared, and so on.
- Candidates are narrowed down based on the approximate position information (ST101), matching is performed (ST103), and if a high collation evaluation result is obtained, the position is specified using the distance and angle information. If evaluation scores above the threshold exist (ST104), the candidate with the highest evaluation is selected (ST105).
- the position is specified by the distance and angle information of the reference data (ST160).
- the position specifying unit 22 collates the photo against reference data corresponding to the type of terminal. Because the distance and angle to the target can be identified from the size of the signboard or logo mark and from the distortion of the signboard caused by the shooting angle, the position can be narrowed down further to a pinpoint.
- the reference data database 40 stores information indicating the positional relationship with the target object in association with the reference data, and the position specifying unit 22 specifies the position using that information.
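Assuming the database stores, for the matched reference data, the signboard's position and the azimuth its front faces, the user's position could be placed from the estimated distance and relative angle as in this sketch. The flat-earth approximation, the facing-azimuth convention, and all names here are illustrative assumptions, not details from the patent.

```python
import math

def user_position(sign_lat, sign_lon, sign_facing_deg, distance_m, angle_deg):
    """Place the user distance_m away from the signboard, offset by
    angle_deg from the signboard's front direction (0 = front,
    +30 = 30 degrees to the right, as in the convention above).
    sign_facing_deg is the azimuth the signboard front faces,
    clockwise from north; small-area flat-earth approximation."""
    bearing = math.radians(sign_facing_deg + angle_deg)
    dlat = distance_m * math.cos(bearing) / 111_320  # metres per degree of latitude
    dlon = distance_m * math.sin(bearing) / (111_320 * math.cos(math.radians(sign_lat)))
    return sign_lat + dlat, sign_lon + dlon
```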
- This embodiment is effective when data from various positions and angles are sufficiently prepared.
- FIG. 9 is a diagram showing a processing flow of the server according to the fifth embodiment.
- the distance from the signboard is estimated from the internal parameters of the camera and the length of the signboard, and the angle to the signboard is estimated by transforming the photographed image.
- this requires the camera parameter values of the camera serving as the photographing unit. It is therefore necessary to receive from the terminal the parameters of the photographing unit at the time of shooting (zoom, focus, etc.) and to transform the photographed photo B based on them. If the photographing unit is a mobile phone camera whose parameters are determined by the terminal model, the model of the terminal must be known.
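The distance estimate from the signboard's known physical length and its apparent size in the image follows the pinhole-camera relation. This one-line sketch assumes the focal length is expressed in pixels; the function name and parameters are illustrative.

```python
def distance_from_sign(focal_px, sign_length_m, sign_length_px):
    """Pinhole model: an object of known length sign_length_m that
    spans sign_length_px pixels in an image taken with focal length
    focal_px (in pixels) lies at distance
    focal_px * sign_length_m / sign_length_px metres."""
    return focal_px * sign_length_m / sign_length_px
```

For example, a 2 m signboard spanning 100 px under a 1000 px focal length is about 20 m away; halving the apparent size doubles the distance.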
- FIG. 10 is a diagram illustrating a processing flow of the server according to the sixth embodiment.
- candidates are narrowed down based on the approximate position information, the distance from the signboard is estimated based on the internal parameters of the camera and the length of the signboard, and the angle with respect to the signboard is estimated by transforming the photographed image (ST150).
- the position is specified by the distance and angle information of the matched reference data (ST160).
- the narrowing down by the approximate position may be performed after estimating the distance and angle.
- In the embodiments above, it is assumed that the photographing unit 2 of the terminal 100 takes the photo B at the user's own position.
- This embodiment describes a form that also handles the case where the user requests the location without taking a photograph.
- FIG. 11 is a configuration diagram of the position specifying system according to the seventh embodiment.
- a candidate presentation unit 30 is added.
- the candidate presentation unit 30 is a functional module that presents the candidate set C selected by the candidate selection unit 10 to the terminal.
- FIG. 12 is a diagram illustrating a processing flow of the server according to the seventh embodiment.
- candidate data is searched by the approximate position A, and a candidate set C is generated (step ST101).
- When a photo is provided, the operation is the same as in the first embodiment. If there is no photo, the candidate set C is presented to the user, and the user is asked to select a nearby store from among the candidates (step ST111).
- the collation processing device 150 receives the selected candidate set C and the map D as input; the collation unit 20 collates the candidate set C with the photo B, and the collation evaluation unit 21 evaluates the result. Based on the result, the position specifying unit 22 specifies the photographed position.
- the map display unit 23 arranges a mark at the specified position on the map data selected by the map selection unit 15, and communicates the map D on which the position mark is arranged. Transmit from the unit 4b to the communication unit 4a.
- the terminal 100 displays the map D received by the communication unit 4a on the display unit 3.
- the candidate presenting unit 30 presents the identification information of the reference data selected by the candidate selecting unit 10 to the mobile terminal 100 via the communication unit 4b as a photographing target candidate.
- The present embodiment is based on the seventh embodiment. It checks whether each type of store in the candidate set C has two or more pieces of location information and, if a type has multiple stores in the area, excludes that type from the candidates.
- FIG. 13 is a diagram illustrating the processing flow of the server according to the eighth embodiment. If a store type has multiple locations, there is more than one store of that type in the area, so it cannot be determined which one the user photographed. Therefore, the reference data of that store type is excluded from the candidates, and the remaining types form a candidate set C′ (step ST110). The candidate set C′ is presented to the user, who is asked to select a nearby store from the candidates and take a picture (step ST111).
- the candidate presenting unit excludes the candidate data from the target candidates for imaging.
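The exclusion rule of step ST110 can be sketched as follows; the records are hypothetical dicts whose "name" field stands for the store type, as nothing in the patent prescribes a concrete layout.

```python
from collections import Counter

def unambiguous_candidates(candidates):
    """Keep only store types that occur exactly once in the
    narrowed-down area: a chain with two nearby branches cannot tell
    the server which branch the user photographed, so it is excluded
    from the photographing-target candidates (step ST110)."""
    counts = Counter(c["name"] for c in candidates)
    return [c for c in candidates if counts[c["name"]] == 1]
```

With two "Psuper" branches and one "SMART" store in the area, only "SMART" remains as a candidate to present.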
- In the above embodiments, a configuration is shown in which the collation processing device 150 is placed on the server 200.
- the present invention can be used not only outdoors but also indoors. Even indoors, it is possible to obtain an approximate position and further shoot the signboard of the store to narrow down the position information.
- Since the data stored in the database is limited to specific targets, the amount of data is not necessarily enormous.
- the approximate position acquisition unit 1 has been shown to obtain the approximate position from GPS and base station information.
- An approximate position such as “near” or “near the station” may be designated.
- The present invention is effective not only for specifying a position when a person on foot no longer knows where he or she is, but also when moving by other means such as a car.
- It is also effective to extract a specific target from images continuously captured by a camera mounted on a moving object such as a car.
- FIG. 14 is a configuration diagram of the position specifying system according to the fourteenth embodiment.
- An azimuth information acquisition unit 5 is provided so that the terminal 100 can acquire the absolute azimuth at the time of shooting.
- FIG. 15 is a diagram showing an example of a reference data database according to the fourteenth embodiment.
- the reference data database contains absolute bearing information when a user on the road looks at the front of the signboard. As the direction information, more accurate direction information may be held.
- FIG. 17 is a diagram showing a processing flow of the server according to the fourteenth embodiment. ST102 and ST111 may be omitted.
- reference data candidates having absolute azimuth information corresponding to the absolute azimuth information of the mobile terminal are searched and narrowed down. For example, suppose Psuper and SMART are within the approximate position.
- the reference data database includes the absolute orientation information of the signboard.
- FIG. 18 is a diagram showing an example of a reference data database according to the fifteenth embodiment.
- the reference data database contains absolute heading information for users on the road looking at the front of the sign and length information for the sign.
- FIG. 19 is a diagram illustrating the relationship between the distance and the angle during shooting.
- FIG. 20 is a diagram showing a process flow of the server according to the sixteenth embodiment.
- candidates are narrowed down based on the approximate position information (ST101), and matching is performed (ST103). If evaluation scores above the threshold exist (ST104), several highly evaluated candidates are selected (ST120).
- the distance from the signboard and the relative angle to the signboard are estimated by transforming the photographed image (ST150).
- the candidate is selected whose stored distance is closest to the estimated distance and whose "(absolute azimuth) + (relative angle)" is closest to the absolute azimuth of the mobile terminal. Then, the position is specified by the distance and angle of that reference data (ST155).
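The final selection among several high-scoring candidates, using the terminal's absolute azimuth together with the estimated distance and relative angle, could be sketched like this. The cost function weighting angle and distance equally, and the names used, are purely illustrative assumptions.

```python
def angle_diff(a, b):
    """Smallest absolute difference between two azimuths in degrees."""
    return abs((a - b + 180) % 360 - 180)

def pick_by_orientation(terminal_azimuth, est_distance, est_angle, candidates):
    """Choose the candidate whose stored distance and orientation best
    agree with the estimates: a terminal looking at a signboard faces
    roughly opposite the signboard's facing direction, offset by the
    estimated relative angle."""
    def cost(c):
        expected = (c["facing_deg"] + 180 + est_angle) % 360
        return angle_diff(terminal_azimuth, expected) + abs(c["distance_m"] - est_distance)
    return min(candidates, key=cost)
```

A terminal heading south (azimuth 180) with a relative angle of 0 is most consistent with a signboard facing north (0), so that candidate wins over one facing east.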
- FIG. 21 is a diagram showing a processing flow of the server according to the seventeenth embodiment.
- the narrowing down by the approximate position may be performed after estimating the distance and angle.
- candidates for reference data having direction information corresponding to the direction information of the mobile terminal are searched, collated, and evaluated.
- the server 200 and the terminal 100 are a kind of computer, and each element can execute processing by a program. It is also possible to store the program in a storage medium so that the computer can read the program from the storage medium.
- FIG. 22 is a diagram illustrating a hardware configuration example of the server 200 and the terminal 100.
- An arithmetic device 2201, a data storage device 2202, a memory 2203, and a communication interface 2204 are connected to the bus.
- the data storage device 2202 is, for example, a ROM (Read Only Memory) or a hard disk.
- the memory 2203 is usually a RAM (Random Access Memory).
- the communication interface 2204 is used for communication between the server 200 and the terminal 100 via the network.
- the program is normally stored in the data storage device 2202, and is loaded into the memory 2203 and sequentially read into the arithmetic device 2201 for processing.
- FIG. 1 is a configuration diagram of a position specifying system according to a first embodiment.
- FIG. 2 is a diagram showing a processing flow of the server according to the first embodiment.
- FIG. 3 is a diagram showing an example of a reference data database according to the first embodiment.
- FIG. 4 is a diagram showing a concept of matching.
- FIG. 5 is a diagram showing an example in which positions are narrowed down on a map.
- FIG. 6 is a diagram showing a processing flow of the server according to the fourth embodiment.
- FIG. 7 is a diagram showing an example of a reference data database according to the fourth embodiment.
- FIG. 8 is a diagram showing the relationship between distance and angle during shooting.
- FIG. 9 shows a process flow of the server according to the fifth embodiment.
- FIG. 10 is a diagram showing a process flow of a server according to the sixth embodiment.
- FIG. 11 is a configuration diagram of a position specifying system according to a seventh embodiment.
- FIG. 12 shows a processing flow of the server according to the seventh embodiment.
- FIG. 13 is a diagram showing a processing flow of the server according to the eighth embodiment.
- FIG. 14 is a configuration diagram of a position specifying system according to a fourteenth embodiment.
- FIG. 15 is a diagram showing an example of a reference data database according to the fourteenth embodiment.
- FIG. 16 is a diagram showing an example in which positions are narrowed down on a map.
- FIG. 17 shows a process flow of the server according to the fourteenth embodiment.
- FIG. 18 shows an example of a reference data database according to the fifteenth embodiment.
- FIG. 19 is a diagram showing the relationship between distance and angle during shooting.
- FIG. 20 shows a processing flow of the server according to the sixteenth embodiment.
- FIG. 21 is a diagram showing a process flow of the server according to the seventeenth embodiment.
- FIG. 22 is a diagram illustrating a hardware configuration example.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2004/014061 WO2006035476A1 (ja) | 2004-09-27 | 2004-09-27 | 位置特定サーバ及び移動体端末 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006035476A1 true WO2006035476A1 (ja) | 2006-04-06 |
Family
ID=36118628
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/014061 WO2006035476A1 (ja) | 2004-09-27 | 2004-09-27 | 位置特定サーバ及び移動体端末 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2006035476A1 (ja) |
- 2004-09-27: PCT/JP2004/014061 filed (WO, active Application Filing)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001235534A (ja) * | 2000-02-25 | 2001-08-31 | Nippon Telegr & Teleph Corp <Ntt> | 位置情報補正装置と方法及び位置情報補正プログラムを記録した記録媒体 |
JP2003240593A (ja) * | 2002-02-20 | 2003-08-27 | National Institute Of Advanced Industrial & Technology | 携帯者の現在位置および方位推定方法 |
JP2003263104A (ja) * | 2002-03-11 | 2003-09-19 | Mitsubishi Electric Corp | 撮像情報認識システム |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009003415A (ja) * | 2007-03-26 | 2009-01-08 | Nec (China) Co Ltd | 地図データ更新方法および装置 |
WO2011145482A1 (ja) * | 2010-05-17 | 2011-11-24 | 株式会社エヌ・ティ・ティ・ドコモ | 端末位置特定システム、移動端末及び端末位置特定方法 |
CN102893129A (zh) * | 2010-05-17 | 2013-01-23 | 株式会社Ntt都科摩 | 终端位置确定系统、移动终端以及终端位置确定方法 |
JP2014524577A (ja) * | 2011-08-19 | 2014-09-22 | クゥアルコム・インコーポレイテッド | 屋内測位のためのロゴ検出 |
KR101570195B1 (ko) * | 2011-08-19 | 2015-11-18 | 퀄컴 인코포레이티드 | 실내 포지셔닝을 위한 로고 검출 |
WO2013112461A1 (en) * | 2012-01-27 | 2013-08-01 | Qualcomm Incorporated | System and method for determining location of a device using opposing cameras |
US9986208B2 (en) | 2012-01-27 | 2018-05-29 | Qualcomm Incorporated | System and method for determining location of a device using opposing cameras |
JP2013222335A (ja) * | 2012-04-17 | 2013-10-28 | Hitachi Ltd | 対象物特定システム、対象物特定サーバ及び対象物特定端末 |
JP2019139664A (ja) * | 2018-02-14 | 2019-08-22 | 清水建設株式会社 | 位置検出装置、位置検出システム、及び位置検出方法 |
CN110501012A (zh) * | 2019-07-23 | 2019-11-26 | 恒大智慧科技有限公司 | 一种商场店铺导航方法、计算机设备及可读存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase | ||
NENP | Non-entry into the national phase |
Ref country code: JP |