CN104936283A - Indoor positioning method, server and system - Google Patents
- Publication number: CN104936283A
- Application number: CN201410106331.3A
- Authority
- CN
- China
- Prior art keywords
- image
- positioning
- user terminal
- information
- matching
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W64/00—Locating users or terminals or network equipment for network management purposes, e.g. mobility management
- H04W64/003—Locating users or terminals or network equipment for network management purposes, e.g. mobility management locating network equipment
Landscapes
- Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Navigation (AREA)
Abstract
The invention discloses an indoor positioning method, a server, and a system. When the server receives a positioning request sent by a user terminal, it extracts the positioning information contained in the request, uses that information to perform coarse positioning of the user terminal, and retrieves from a database first target images associated with the roughly estimated position, where the distance between the coordinate position of each first target image and the roughly estimated position is less than a preset distance. When a positioning image sent by the user terminal is received, the server determines whether the first target images contain a matching image that matches the positioning image. If a matching image exists, the coordinate information and direction information of the matching image are sent to the user terminal, which displays them on an indoor map, so that the displayed coordinates and direction serve as the user's current position coordinates and direction. Accurate indoor positioning can thus be effectively achieved.
Description
Technical field
The present invention relates to the field of communications, and in particular to an indoor positioning method, server and system.
Background technology
Location-based applications are developing rapidly, and the demand for positioning and navigation is growing daily; in particular, determining which floor of a building a user is on has become a much-discussed problem. In complex indoor environments such as airport halls, exhibition rooms, warehouses, supermarkets, libraries and underground parking garages, such location information makes it possible to offer the user services such as advertising, wayfinding, and search and rescue. At present, however, constrained by positioning time, positioning accuracy and the complexity of indoor conditions, no indoor positioning technology is yet fully satisfactory.
GPS (Global Positioning System), based on satellite communication, is currently the most widely used positioning technology. Its good positioning accuracy has solved many practical military and civilian problems, but satellite signals cannot penetrate buildings, so it is not suitable for indoor positioning; in addition, the cost of its positioning terminals is relatively high.
Base-station-based positioning lacks sufficient accuracy: it cannot determine which floor a user is on, and therefore cannot be used for indoor positioning.
Existing indoor wireless positioning systems mainly employ short-range wireless technologies such as ultrasound, infrared, optics (vision) and RF (Radio Frequency). RF can be further divided into WiFi (Wireless Fidelity), Bluetooth and RFID (Radio Frequency Identification), among which WiFi-based wireless positioning is widely deployed and relatively low in cost.
Most current WiFi positioning schemes use RSS (Received Signal Strength) and fall into two main algorithmic categories: triangulation and location fingerprinting. Triangulation estimates the target's position from its distances to at least three known reference points, while location fingerprinting obtains the target position by comparing the signal-characteristic fingerprint collected at the location against a fingerprint database. The latter is more accurate, with an error within 30 m, but even that accuracy cannot fully meet the demands of indoor positioning.
A WiFi router can only cover a region within a radius of about 90 meters, and weak-signal and coverage dead zones exist, so a system relying on WiFi alone cannot provide robust navigation at all times and in all situations.
Optics (vision)-based methods use image processing to find, in a database, an image that matches the image captured by the user, thereby obtaining the positioning result. Their accuracy is very high and meets the demands of indoor positioning, but they require high computing power and large storage space, making it difficult to satisfy real-time requirements.
Summary of the invention
Embodiments of the present invention provide an indoor positioning method, server and system. By combining wireless positioning technology with image matching technology, accurate indoor positioning can be effectively achieved.
According to one aspect of the present invention, an indoor positioning method is provided, comprising:
when a positioning request sent by a user terminal is received, extracting the positioning information contained in the positioning request;
performing coarse positioning of the user terminal using the positioning information to obtain a roughly estimated position;
obtaining, from a database, first target images associated with the roughly estimated position, wherein the distance between the coordinate position of each first target image and the roughly estimated position is less than a preset distance, and wherein each image stored in the database has coordinate information and direction information, the coordinate information indicating the position at which the user captured the image and the direction information indicating the shooting direction at capture time;
when a positioning image sent by the user terminal is received, determining whether the first target images contain a matching image that matches the positioning image, wherein, when positioning, the user terminal first sends the positioning request and subsequently sends the captured positioning image;
if a matching image exists, sending the coordinate information and direction information of the matching image to the user terminal, so that the user terminal displays them on an indoor map and thereby uses the displayed coordinates and direction as the user's current position coordinates and direction.
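Purely as an illustration of the claimed flow (not part of the disclosed embodiments), the steps above can be sketched as a minimal server-side routine. Every name here, the 20-meter preset, and the stub fingerprint and feature-matching logic are assumptions made for the sketch.

```python
import math

PRESET_DISTANCE = 20.0  # meters; one embodiment recites "no more than 20 meters"

def coarse_position(positioning_info, fingerprint_db):
    """Stub coarse positioning: return the position of the radio-map entry whose
    RSS readings are closest to the observed ones (squared-error distance)."""
    return min(fingerprint_db, key=lambda fp: sum(
        (fp["rss"][ap] - positioning_info["rss"].get(ap, -100)) ** 2
        for ap in fp["rss"]))["pos"]

def first_target_images(rough_pos, image_db):
    """Keep images whose stored coordinates lie within the preset distance."""
    return [img for img in image_db
            if math.dist(img["pos"], rough_pos) < PRESET_DISTANCE]

def locate(positioning_info, positioning_image, fingerprint_db, image_db, match):
    """Coarse positioning, candidate retrieval, then image matching; fall back
    to the rough estimate when no candidate matches."""
    rough = coarse_position(positioning_info, fingerprint_db)
    for img in first_target_images(rough, image_db):
        if match(positioning_image, img["features"]):
            return {"pos": img["pos"], "dir": img["dir"]}  # matching image found
    return {"pos": rough, "dir": None}                     # no match: rough estimate
```

A caller would supply its own `match` predicate (here a trivial feature-overlap test can stand in for real descriptor matching).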
In one embodiment, if no matching image exists, the coordinate information of the roughly estimated position is sent to the user terminal, so that the user terminal displays it on an indoor map and thereby uses the displayed coordinates as the user's current position coordinates.
In one embodiment, the step of determining, when the positioning image sent by the user terminal is received, whether the first target images contain a matching image that matches the positioning image comprises:
when the positioning image sent by the user terminal is received, determining whether the positioning request also contains an angle change value, the angle change value being the change of the shooting direction of the user terminal when capturing the positioning image relative to the direction determined by the previous positioning;
if the positioning request contains no angle change value, taking the first target images as candidate images and determining whether the candidate images contain a matching image that matches the positioning image.
In one embodiment, if the positioning request contains an angle change value, the shooting direction of the user terminal when capturing the positioning image is determined using the direction determined by the previous positioning and the angle change value;
second target images are obtained from the first target images using the shooting direction, wherein the angle difference between the direction of each second target image and the shooting direction does not exceed a predetermined angle threshold;
the second target images are taken as candidate images, and it is determined whether the candidate images contain a matching image that matches the positioning image.
In one embodiment, the angle threshold is 45 degrees.
In one embodiment, the step of determining whether the candidate images contain a matching image that matches the positioning image comprises:
preprocessing the positioning image so that the positioning image and the candidate images have a unified resolution;
performing image processing on the preprocessed positioning image and on the candidate images to extract feature information S and Fi, respectively, each invariant to illumination, scale and rotation, where S is the feature information obtained from the preprocessed positioning image, Fi is the feature information obtained from the i-th candidate image, 1≤i≤N, and N is the number of candidate images;
determining whether an Fi matching S exists;
if an Fi matching S exists, taking the candidate image corresponding to that Fi as the matching image that matches the positioning image;
if no Fi matching S exists, determining that no matching image exists.
In one embodiment, the preset distance is no more than 20 meters.
In one embodiment, the positioning information comprises the wireless signal strengths sent by the wireless network access points collected by the user terminal and the identification information of those access points.
According to a further aspect of the present invention, an indoor positioning server is provided, comprising a receiving unit, an extraction unit, a coarse positioning unit, a first target image acquisition unit, a database, a recognition unit and a transmitting unit, wherein:
the receiving unit is configured to receive the positioning request sent by a user terminal, wherein, when positioning, the user terminal first sends the positioning request and subsequently sends the captured positioning image;
the extraction unit is configured to extract, when the receiving unit receives the positioning request sent by the user terminal, the positioning information contained in the positioning request;
the coarse positioning unit is configured to perform coarse positioning of the user terminal using the positioning information to obtain a roughly estimated position;
the first target image acquisition unit is configured to obtain, from the database, the first target images associated with the roughly estimated position, wherein the distance between the coordinate position of each first target image and the roughly estimated position is less than a preset distance;
the database is configured to store images, wherein each stored image has coordinate information and direction information, the coordinate information indicating the position at which the user captured the image and the direction information indicating the shooting direction at capture time;
the recognition unit is configured to determine, when the receiving unit receives the positioning image sent by the user terminal, whether the first target images contain a matching image that matches the positioning image;
the transmitting unit is configured to, according to the determination result of the recognition unit, send the coordinate information and direction information of the matching image to the user terminal if a matching image exists, so that the user terminal displays them on an indoor map and thereby uses the displayed coordinates and direction as the user's current position coordinates and direction.
In one embodiment, the transmitting unit is further configured to, according to the determination result of the recognition unit, send the coordinate information of the roughly estimated position to the user terminal if no matching image exists, so that the user terminal displays it on an indoor map and thereby uses the displayed coordinates as the user's current position coordinates.
In one embodiment, the recognition unit specifically comprises a judging module and a matching module, wherein:
the judging module is configured to determine, when the receiving unit receives the positioning image sent by the user terminal, whether the positioning request also contains an angle change value, the angle change value being the change of the shooting direction of the user terminal when capturing the positioning image relative to the direction determined by the previous positioning;
the matching module is configured to, according to the determination result of the judging module, take the first target images as candidate images if the positioning request contains no angle change value, and determine whether the candidate images contain a matching image that matches the positioning image.
In one embodiment, the recognition unit further comprises a direction recognition module and a target recognition module, wherein:
the direction recognition module is configured to, according to the determination result of the judging module, determine the shooting direction of the user terminal when capturing the positioning image, using the direction determined by the previous positioning and the angle change value, if the positioning request contains an angle change value;
the target recognition module is configured to obtain second target images from the first target images using the shooting direction, wherein the angle difference between the direction of each second target image and the shooting direction does not exceed a predetermined angle threshold;
the matching module is further configured to take the second target images as candidate images and determine whether the candidate images contain a matching image that matches the positioning image.
In one embodiment, the angle threshold is 45 degrees.
In one embodiment, the matching module specifically preprocesses the positioning image so that the positioning image and the candidate images have a unified resolution; performs image processing on the preprocessed positioning image and on the candidate images to extract feature information S and Fi, respectively, each invariant to illumination, scale and rotation, where S is the feature information obtained from the preprocessed positioning image, Fi is the feature information obtained from the i-th candidate image, 1≤i≤N, and N is the number of candidate images; determines whether an Fi matching S exists; if so, takes the candidate image corresponding to that Fi as the matching image that matches the positioning image; otherwise, determines that no matching image exists.
In one embodiment, the preset distance is no more than 20 meters.
In one embodiment, the positioning information comprises the wireless signal strengths sent by the wireless network access points collected by the user terminal and the identification information of those access points.
According to a further aspect of the present invention, an indoor positioning system is provided, comprising a user terminal and an indoor positioning server, wherein:
the user terminal is configured to capture a positioning image when positioning, to send the positioning request containing the positioning information and the captured positioning image to the indoor positioning server in sequence, and, when receiving the coordinate information and direction information of the matching image sent by the indoor positioning server, to display them on an indoor map, thereby using the displayed coordinates and direction as the user's current position coordinates and direction;
the indoor positioning server is the indoor positioning server according to any of the above embodiments.
According to the present invention, when a positioning request sent by a user terminal is received, the positioning information contained in the positioning request is extracted; coarse positioning of the user terminal is performed using the positioning information to obtain a roughly estimated position; the first target images associated with the roughly estimated position are obtained from a database, wherein the distance between the coordinate position of each first target image and the roughly estimated position is less than a preset distance; when the positioning image sent by the user terminal is received, it is determined whether the first target images contain a matching image that matches the positioning image; if a matching image exists, the coordinate information and direction information of the matching image are sent to the user terminal, so that the user terminal displays them on an indoor map and thereby uses the displayed coordinates and direction as the user's current position coordinates and direction. Accurate indoor positioning can thus be effectively achieved.
The description of the invention is provided for the purposes of illustration and description and is not intended to be exhaustive or to limit the invention to the forms disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art. The embodiments were chosen and described in order to better explain the principles and practical applications of the invention, and to enable those of ordinary skill in the art to understand the invention and thereby design various embodiments, with various modifications, suited to particular uses.
Accompanying drawing explanation
To explain the embodiments of the present invention or the technical solutions of the prior art more clearly, the accompanying drawings required by the embodiments or by the description of the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and a person of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of an embodiment of the indoor positioning method of the present invention.
Fig. 2 is a schematic diagram of another embodiment of the indoor positioning method of the present invention.
Fig. 3 is a schematic diagram of an embodiment of query-image matching of the present invention.
Fig. 4 is a schematic diagram of an embodiment of the indoor positioning server of the present invention.
Fig. 5 is a schematic diagram of an embodiment of the recognition unit of the present invention.
Fig. 6 is a schematic diagram of an embodiment of the indoor positioning system of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. The following description of at least one exemplary embodiment is in fact merely illustrative and in no way limits the invention or its application or use. All other embodiments obtained by a person of ordinary skill in the art, based on the embodiments of the present invention and without creative effort, fall within the scope of protection of the present invention.
Unless specifically stated otherwise, the relative arrangement of the components and steps, the numerical expressions and the numerical values set forth in these embodiments do not limit the scope of the invention.
Meanwhile, it should be understood that, for convenience of description, the sizes of the various parts shown in the drawings are not drawn to actual proportions.
Techniques, methods and apparatus known to a person of ordinary skill in the relevant art may not be discussed in detail, but, where appropriate, such techniques, methods and apparatus should be regarded as part of the specification.
In all examples shown and discussed here, any specific value should be interpreted as merely exemplary rather than limiting; other examples of the exemplary embodiments may therefore have different values.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it need not be discussed further in subsequent drawings.
Fig. 1 is a schematic diagram of an embodiment of the indoor positioning method of the present invention. Preferably, the method steps of this embodiment may be performed by an indoor positioning server.
Step 101: when a positioning request sent by a user terminal is received, the positioning information contained in the positioning request is extracted.
Preferably, the positioning information comprises the wireless signal strengths (RSS) sent by the wireless network access points collected by the user terminal and the identification information of those access points, for example MAC (Media Access Control) address information.
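The patent does not specify a wire format for the request, only that it carries per-access-point signal strengths plus identifiers such as MAC addresses. As a hedged illustration, such a request might be serialized as JSON and parsed as below; the layout and field names are invented for this sketch.

```python
import json

# Hypothetical serialized positioning request: one entry per observed access point.
raw = json.dumps({
    "scan": [
        {"mac": "00:11:22:33:44:55", "rss": -47},
        {"mac": "66:77:88:99:aa:bb", "rss": -63},
    ],
    "angle_change": None,  # absent/None on the first positioning (see Step 204)
})

def extract_positioning_info(request_json):
    """Step 101 sketch: pull the RSS readings, keyed by AP MAC, out of the request."""
    req = json.loads(request_json)
    rss_by_ap = {entry["mac"]: entry["rss"] for entry in req["scan"]}
    return rss_by_ap, req.get("angle_change")

rss, angle = extract_positioning_info(raw)
```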
Step 102: coarse positioning of the user terminal is performed using the positioning information to obtain a roughly estimated position.
For example, coarse positioning may be performed by WiFi fingerprint positioning. Since how to perform coarse positioning is understood by those skilled in the art, it is not elaborated here.
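Since the disclosure leaves the fingerprinting algorithm open, the sketch below assumes one common approach: weighted k-nearest-neighbor matching of the observed RSS vector against an offline radio map. The survey points, RSS values and the default floor for unseen APs are all illustrative assumptions.

```python
import math

# Offline radio map: survey points with the RSS measured per access point there.
RADIO_MAP = [
    {"pos": (0.0, 0.0),  "rss": {"ap1": -40, "ap2": -70}},
    {"pos": (10.0, 0.0), "rss": {"ap1": -55, "ap2": -55}},
    {"pos": (20.0, 0.0), "rss": {"ap1": -70, "ap2": -40}},
]

def rss_distance(observed, reference, missing=-100):
    """Euclidean distance in signal space; unseen APs default to a floor value."""
    aps = set(observed) | set(reference)
    return math.sqrt(sum((observed.get(ap, missing) - reference.get(ap, missing)) ** 2
                         for ap in aps))

def coarse_position(observed, radio_map, k=2):
    """Weighted k-nearest-neighbor fingerprint estimate of the rough position."""
    nearest = sorted(radio_map, key=lambda fp: rss_distance(observed, fp["rss"]))[:k]
    weights = [1.0 / (rss_distance(observed, fp["rss"]) + 1e-6) for fp in nearest]
    total = sum(weights)
    x = sum(w * fp["pos"][0] for w, fp in zip(weights, nearest)) / total
    y = sum(w * fp["pos"][1] for w, fp in zip(weights, nearest)) / total
    return (x, y)
```

An exact fingerprint hit dominates the weighted average, so a terminal observing a surveyed signature lands essentially on that survey point.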
Step 103: the first target images associated with the roughly estimated position are obtained from the database, wherein the distance between the coordinate position of each first target image and the roughly estimated position is less than a preset distance.
Preferably, the preset distance is no more than 20 meters.
Each image stored in the database has coordinate information and direction information: the coordinate information indicates the position at which the user captured the image, and the direction information indicates the shooting direction at capture time. The information in this database can be generated in an offline training stage.
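The offline database record just described (an image keyed with its capture coordinate and shooting direction) could be modeled as below; the record layout, field names and feature placeholder are assumptions for illustration only.

```python
from dataclasses import dataclass, field

@dataclass
class ImageRecord:
    """Hypothetical record for one survey photo in the offline-built database."""
    image_id: str
    coord: tuple                 # (x, y) position in the indoor map's frame
    direction: float             # shooting direction in degrees, 0-360
    features: frozenset = field(default_factory=frozenset)  # descriptor stand-in

def build_database(survey):
    """Offline training stage sketch: index survey photos by id."""
    return {rec.image_id: rec for rec in survey}

db = build_database([
    ImageRecord("img-001", (5.0, 12.0), 90.0, frozenset({"f1", "f2"})),
    ImageRecord("img-002", (7.5, 12.0), 270.0, frozenset({"f3"})),
])
```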
Step 104: when the positioning image sent by the user terminal is received, it is determined whether the first target images contain a matching image that matches the positioning image, wherein, when positioning, the user terminal first sends the positioning request and subsequently sends the captured positioning image.
Step 105: if a matching image exists, the coordinate information and direction information of the matching image are sent to the user terminal, so that the user terminal displays them on an indoor map and thereby uses the displayed coordinates and direction as the user's current position coordinates and direction.
Based on the indoor positioning method provided by the above embodiment of the present invention, by combining wireless positioning technology with image matching technology, accurate indoor positioning can be effectively achieved.
Fig. 2 is a schematic diagram of another embodiment of the indoor positioning method of the present invention. Preferably, the method steps of this embodiment may be performed by an indoor positioning server. Steps 201-203 are identical to steps 101-103 above.
Step 201: when a positioning request sent by a user terminal is received, the positioning information contained in the positioning request is extracted.
Step 202: coarse positioning of the user terminal is performed using the positioning information to obtain a roughly estimated position.
Step 203: the first target images associated with the roughly estimated position are obtained from the database, wherein the distance between the coordinate position of each first target image and the roughly estimated position is less than a preset distance.
Step 204: when the positioning image sent by the user terminal is received, it is determined whether the positioning request also contains an angle change value. If it does not, step 205 is performed; if it does, step 206 is performed.
The angle change value is the change of the shooting direction of the user terminal when capturing the positioning image relative to the direction determined by the previous positioning.
For the first positioning, since there is no direction determined by a previous positioning, no angle change value exists either.
For subsequent positioning while moving, the change of the shooting direction when capturing the positioning image relative to the direction determined by the previous positioning can be sent to the indoor positioning server for use in the positioning process.
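The heading bookkeeping implied here — the current shooting direction is the direction fixed by the previous positioning plus the reported angle change — can be written as a one-line helper. Degrees wrapped to [0, 360) is an assumed convention, not one recited in the disclosure.

```python
def shooting_direction(previous_direction, angle_change):
    """Step 206 sketch: current shooting direction from the previously determined
    direction plus the reported angle change, wrapped to [0, 360) degrees."""
    return (previous_direction + angle_change) % 360.0
```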
Step 205: the first target images are taken as candidate images, and it is determined whether the candidate images contain a matching image that matches the positioning image. If a matching image exists, step 209 is performed; otherwise, step 210 is performed.
Step 206: the shooting direction of the user terminal when capturing the positioning image is determined using the direction determined by the previous positioning and the angle change value.
Step 207: second target images are obtained from the first target images using the shooting direction.
The angle difference between the direction of each second target image and the shooting direction does not exceed a predetermined angle threshold.
Preferably, the angle threshold is 45 degrees. Beyond 45 degrees, the image content differs too much to be used for matching.
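The 45-degree filter of step 207 needs an angle difference that respects wraparound (for example, 350° and 10° are only 20° apart). A minimal sketch follows; the dictionary-based image records are assumed for illustration.

```python
ANGLE_THRESHOLD = 45.0  # degrees, per this embodiment

def angle_difference(a, b):
    """Smallest absolute difference between two headings, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def second_target_images(first_targets, shooting_dir, threshold=ANGLE_THRESHOLD):
    """Step 207 sketch: keep only images whose stored shooting direction lies
    within the threshold of the terminal's current shooting direction."""
    return [img for img in first_targets
            if angle_difference(img["dir"], shooting_dir) <= threshold]
```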
Step 208: the second target images are taken as candidate images, and it is determined whether the candidate images contain a matching image that matches the positioning image. If a matching image exists, step 209 is performed; otherwise, step 210 is performed.
Step 209: the coordinate information and direction information of the matching image are sent to the user terminal, so that the user terminal displays them on an indoor map and thereby uses the displayed coordinates and direction as the user's current position coordinates and direction. The remaining steps of this embodiment are then no longer performed.
Step 210: the coordinate information of the roughly estimated position is sent to the user terminal, so that the user terminal displays it on an indoor map and thereby uses the displayed coordinates as the user's current position coordinates.
Fig. 3 is a schematic diagram of an embodiment of query-image matching of the present invention. The above step of determining whether the candidate images contain a matching image that matches the positioning image comprises:
Step 301: the positioning image is preprocessed so that the positioning image and the candidate images have a unified resolution.
Step 302: image processing is performed on the preprocessed positioning image and on the candidate images to extract feature information S and Fi, respectively, each invariant to illumination, scale and rotation, where S is the feature information obtained from the preprocessed positioning image, Fi is the feature information obtained from the i-th candidate image, 1≤i≤N, and N is the number of candidate images.
For example, feature extraction may be performed on the images by SURF (Speeded Up Robust Features); the resulting feature points are invariant to illumination, scale and rotation. SURF extracts features by constructing a Hessian matrix and a scale space, determines stable extreme points as key points, and then generates feature descriptors from the gradient characteristics of the points around each key point; the resulting feature vectors are used for the matching in the next step.
Step 303: it is determined whether an Fi matching S exists. If such an Fi exists, step 304 is performed; otherwise, step 305 is performed.
For example, the multi-dimensional feature vectors of the images can be matched according to a k-d tree algorithm; if the matching conditions are satisfied, the closest matching image is obtained.
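The disclosure names a k-d tree as the matching index but leaves the matching criterion open. A common concrete choice is nearest-neighbor descriptor matching with a ratio test, sketched below by brute force over toy descriptor vectors (a k-d tree would only speed up the same nearest-neighbor queries). The ratio value and the `min_good` acceptance threshold are assumptions for the sketch.

```python
import math

def nn_ratio_match(query_desc, candidate_desc, ratio=0.7):
    """Count query descriptors whose nearest candidate descriptor passes the
    ratio test (nearest distance < ratio * second-nearest distance)."""
    good = 0
    for q in query_desc:
        dists = sorted(math.dist(q, c) for c in candidate_desc)
        if len(dists) >= 2 and dists[0] < ratio * dists[1]:
            good += 1
    return good

def best_matching_image(query_desc, candidates, min_good=2):
    """Pick the candidate image with the most ratio-test matches, or None when
    no candidate reaches the acceptance threshold (Step 305)."""
    scored = [(nn_ratio_match(query_desc, c["desc"]), c) for c in candidates]
    score, best = max(scored, key=lambda t: t[0])
    return best if score >= min_good else None
```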
Step 304: the candidate image corresponding to the Fi matching S is taken as the matching image that matches the positioning image.
Step 305: it is determined that no matching image exists.
Fig. 4 is a schematic diagram of an embodiment of the indoor positioning server of the present invention. As shown in Fig. 4, the server comprises a receiving unit 401, an extraction unit 402, a coarse positioning unit 403, a first target image acquisition unit 404, a database 405, a recognition unit 406 and a transmitting unit 407, wherein:
The receiving unit 401 is configured to receive the positioning request sent by a user terminal.
When positioning, the user terminal first sends the positioning request and subsequently sends the captured positioning image.
The extraction unit 402 is configured to extract, when the receiving unit 401 receives the positioning request sent by the user terminal, the positioning information contained in the positioning request.
Preferably, the positioning information comprises the wireless signal strengths sent by the wireless network access points collected by the user terminal and the identification information of those access points, for example MAC address information.
The coarse positioning unit 403 is configured to perform coarse positioning of the user terminal using the positioning information to obtain a roughly estimated position.
The first target image acquisition unit 404 is configured to obtain, from the database, the first target images associated with the roughly estimated position, wherein the distance between the coordinate position of each first target image and the roughly estimated position is less than a preset distance.
Preferably, the preset distance is no more than 20 meters.
The database 405 is configured to store images, wherein each stored image has coordinate information and direction information: the coordinate information indicates the position at which the user captured the image, and the direction information indicates the shooting direction at capture time.
The recognition unit 406 is configured to determine, when the receiving unit 401 receives the positioning image sent by the user terminal, whether the first target images contain a matching image that matches the positioning image.
The transmitting unit 407 is configured to, according to the determination result of the recognition unit 406, send the coordinate information and direction information of the matching image to the user terminal if a matching image exists, so that the user terminal displays them on an indoor map and thereby uses the displayed coordinates and direction as the user's current position coordinates and direction.
Based on the indoor positioning server that the above embodiment of the present invention provides, by wireless location technology and image matching technology being combined, effectively indoor accurate position can be realized.
Preferably, transmitting element 407 is also for the judged result according to recognition unit 406, if there is not matching image, then the coordinate information of rough estimate position is sent to described user terminal, so that described user terminal shows the coordinate information of described rough estimate position in indoor map, thus using the coordinate information of display as the current position coordinates of user.
Fig. 5 is a schematic diagram of one embodiment of a recognition unit of the present invention. As shown in Fig. 5, the recognition unit 406 comprises a judgment module 501 and a matching module 502. Wherein:
The judgment module 501 is configured to, when the receiving unit receives the positioning image sent by the user terminal, judge whether the positioning request also contains an angle change value, wherein the angle change value is the change in angle of the user terminal's shooting direction when capturing the positioning image relative to the direction determined in the previous positioning.
The matching module 502 is configured to, according to the judgment result of the judgment module 501, if the positioning request contains no angle change value, take the first target images as candidate images and judge whether a matching image that matches the positioning image exists among the candidate images.
Preferably, the recognition unit further comprises a direction recognition module 503 and a target recognition module 504, wherein:
The direction recognition module 503 is configured to, according to the judgment result of the judgment module 501, if the positioning request contains an angle change value, determine the shooting direction of the user terminal when capturing the positioning image from the direction determined in the previous positioning and the angle change value.
The target recognition module 504 is configured to obtain second target images from the first target images using the shooting direction, wherein the angle difference between the direction of each second target image and the shooting direction does not exceed a predetermined angle threshold.
Preferably, the angle threshold is 45 degrees.
The matching module 502 is further configured to take the second target images as candidate images and judge whether a matching image that matches the positioning image exists among the candidate images.
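The direction recognition and second-target selection steps can be sketched as follows; the headings are in degrees, the difference is taken on the circle, and the record schema is a hypothetical illustration:

```python
def shooting_direction(previous_direction, angle_change):
    """Shooting direction at capture time: the direction determined in the
    previous positioning plus the gyroscope-reported angle change value,
    wrapped into [0, 360) degrees."""
    return (previous_direction + angle_change) % 360.0

def angle_difference(a, b):
    """Smallest absolute difference between two headings, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def second_target_images(first_targets, shoot_dir, threshold=45.0):
    """Second target images: the first target images whose stored direction
    is within the angle threshold (preferably 45 degrees) of the estimated
    shooting direction."""
    return [img for img in first_targets
            if angle_difference(img["direction"], shoot_dir) <= threshold]
```

The circular difference matters near the 0/360 wrap-around, e.g. headings of 350 and 10 degrees differ by 20 degrees, not 340.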
Preferably, the matching module 502 first pre-processes the positioning image so that the positioning image and the candidate images have a unified resolution; performs image processing on the pre-processed positioning image and on the candidate images to extract feature information S and Fi, each invariant to illumination, scale and rotation, where S is the feature information obtained from the pre-processed positioning image, Fi is the feature information obtained from the i-th candidate image, 1 ≤ i ≤ N, and N is the number of candidate images; judges whether an Fi matching S exists; if an Fi matching S exists, takes the candidate image corresponding to that Fi as the matching image that matches the positioning image; and if no Fi matching S exists, judges that no matching image exists.
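The matching condition itself is not fixed by the embodiment. As an illustrative, non-limiting sketch, the following operates on already-extracted descriptor vectors (the resolution normalization and SURF extraction of the embodiment are outside this sketch) and uses nearest-neighbour matching with a ratio test, an assumed concretization of "judging whether an Fi matching S exists"; the `min_good` threshold is likewise an assumption:

```python
import math

def _dist(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def match_score(s_descriptors, fi_descriptors, ratio=0.75):
    """Count good correspondences between the positioning image's feature
    set S and one candidate's feature set Fi: a feature counts only if its
    best match is clearly closer than its second-best match (ratio test)."""
    good = 0
    for s in s_descriptors:
        dists = sorted(_dist(s, f) for f in fi_descriptors)
        if len(dists) >= 2 and dists[0] < ratio * dists[1]:
            good += 1
    return good

def find_matching_image(s_descriptors, candidate_descriptor_sets, min_good=2):
    """Return the index of the candidate whose Fi matches S best, or None
    when no candidate satisfies the matching condition (the server then
    falls back to the rough position estimate)."""
    best_i, best_score = None, 0
    for i, fi in enumerate(candidate_descriptor_sets):
        score = match_score(s_descriptors, fi)
        if score > best_score:
            best_i, best_score = i, score
    return best_i if best_score >= min_good else None
```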
Fig. 6 is a schematic diagram of one embodiment of an indoor positioning system of the present invention. As shown in Fig. 6, the system comprises a user terminal 601 and an indoor positioning server 602. For convenience, only one user terminal is shown in Fig. 6; those skilled in the art will appreciate that multiple user terminals may communicate and interact with the indoor positioning server. Wherein:
The user terminal 601 is configured to capture a positioning image when positioning, send a positioning request containing positioning information to the indoor positioning server 602, and then send the captured positioning image to the indoor positioning server 602; and, upon receiving the coordinate information and direction information of the matching image sent by the indoor positioning server 602, display that coordinate information and direction information on an indoor map, taking the displayed coordinate information and direction information as the user's current position coordinates and direction.
The indoor positioning server 602 is the indoor positioning server of any of the embodiments of Fig. 4 and Fig. 5.
The indoor positioning system provided by the above embodiment of the present invention combines wireless positioning technology with image matching technology, and can thereby effectively achieve accurate indoor positioning.
In one embodiment, the user terminal may comprise a WiFi scanning module, a gyroscope sensor module, a camera module and a display module. The WiFi scanning module scans for and receives the signal strengths and MAC addresses of different routers; the gyroscope sensor module records the angle relative to a reference point; the camera module captures the positioning image; and the display module displays the final positioning result.
The present invention is described below by way of a concrete example. A user carrying a user terminal, such as a mobile phone, moves through a building. At the previous positioning the user captured a positioning image in one region while facing east; the user then moves to another region and positions again:
In the first step, the user captures a positioning image with the user terminal while facing south; at the same time the user terminal collects indoor positioning information and the change in angle relative to the terminal's previous orientation. The user terminal transmits a positioning request containing the positioning information and the angle change to the indoor positioning server in string form, and afterwards sends the captured positioning image to the indoor positioning server.
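The example only states that the request is transmitted in string form; as a hedged illustration, the terminal-side assembly of that string might look as follows, with JSON as an assumed encoding and all field names hypothetical:

```python
import json

def build_positioning_request(ap_scan, angle_change=None):
    """Assemble the positioning request the terminal transmits in string
    form before sending the positioning image.

    ap_scan: dict of scanned AP MAC address -> signal strength (dBm)
    angle_change: gyroscope-reported change relative to the direction
    determined in the previous positioning, or None if unavailable.
    """
    request = {"aps": [{"mac": mac, "rssi": rssi}
                       for mac, rssi in sorted(ap_scan.items())]}
    if angle_change is not None:
        # included only when the terminal has a reading relative to the
        # previous positioning's direction
        request["angle_change"] = angle_change
    return json.dumps(request)
```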
In the second step, the indoor positioning server performs coarse positioning using the positioning information. It searches the database for all images within 20 m of the rough position estimate, obtaining n images. Using the angle change information it then determines that the user captured the image facing south, and filters the n images down to the m images whose shooting direction is south. This effectively narrows the range of images to be matched. Since an angular deviation of more than 45 degrees makes the image content too different to be usable for matching, the m selected images all have an angular deviation within 45 degrees.
In the third step, the indoor positioning server pre-processes the image captured by the user so that its resolution is consistent with that of the m images to be matched. SURF is then applied to the images for feature extraction, obtaining feature points invariant to illumination, scale and rotation.
In the fourth step, the feature points obtained from the user's image are matched as multi-dimensional vectors against the feature points obtained from each of the m images; if the matching condition is satisfied, the closest image to be matched is obtained. This image is used to look up the image database, obtaining more accurate position coordinates and a direction. If no image satisfies the matching condition, the coarse positioning coordinates are fed back instead.
In the fifth step, the positioning result obtained in the previous step is transferred to the user terminal and displayed on the indoor map, thereby completing accurate positioning.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above embodiments may be implemented in hardware, or by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium, and the storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like.
Claims (17)
1. An indoor positioning method, characterized by comprising:
when a positioning request sent by a user terminal is received, extracting positioning information contained in the positioning request;
performing coarse positioning on the user terminal using the positioning information, so as to obtain a rough position estimate;
obtaining, from a database, first target images associated with the rough position estimate, wherein the distance between the coordinate position of each first target image and the rough position estimate is smaller than a preset distance, each image stored in the database has coordinate information and direction information, the coordinate information represents the position at which the image was captured, and the direction information represents the shooting direction at the time of capture;
when a positioning image sent by the user terminal is received, judging whether a matching image that matches the positioning image exists among the first target images, wherein the user terminal first sends the positioning request when positioning and subsequently sends the captured positioning image;
if a matching image exists, sending the coordinate information and direction information of the matching image to the user terminal, so that the user terminal displays the coordinate information and direction information of the matching image on an indoor map, taking the displayed coordinate information and direction information as the user's current position coordinates and direction.
2. The method according to claim 1, characterized in that,
if no matching image exists, the coordinate information of the rough position estimate is sent to the user terminal, so that the user terminal displays the coordinate information of the rough position estimate on the indoor map, taking the displayed coordinate information as the user's current position coordinates.
3. The method according to claim 1, characterized in that,
the step of, when the positioning image sent by the user terminal is received, judging whether a matching image that matches the positioning image exists among the first target images comprises:
when the positioning image sent by the user terminal is received, judging whether the positioning request also contains an angle change value, wherein the angle change value is the change in angle of the user terminal's shooting direction when capturing the positioning image relative to the direction determined in the previous positioning;
if the positioning request contains no angle change value, taking the first target images as candidate images and judging whether a matching image that matches the positioning image exists among the candidate images.
4. The method according to claim 3, characterized in that,
if the positioning request contains an angle change value, the shooting direction of the user terminal when capturing the positioning image is determined from the direction determined in the previous positioning and the angle change value;
second target images are obtained from the first target images using the shooting direction, wherein the angle difference between the direction of each second target image and the shooting direction does not exceed a predetermined angle threshold;
the second target images are taken as candidate images, and whether a matching image that matches the positioning image exists among the candidate images is judged.
5. The method according to claim 4, characterized in that,
the angle threshold is 45 degrees.
6. The method according to any one of claims 3-5, characterized in that,
the step of judging whether a matching image that matches the positioning image exists among the candidate images comprises:
pre-processing the positioning image so that the positioning image and the candidate images have a unified resolution;
performing image processing on the pre-processed positioning image and on the candidate images to extract feature information S and Fi, each invariant to illumination, scale and rotation, where S is the feature information obtained from the pre-processed positioning image, Fi is the feature information obtained from the i-th candidate image, 1 ≤ i ≤ N, and N is the number of candidate images;
judging whether an Fi matching S exists;
if an Fi matching S exists, taking the candidate image corresponding to the Fi matching S as the matching image that matches the positioning image;
if no Fi matching S exists, judging that no matching image exists.
7. The method according to claim 1 or 2, characterized in that,
the preset distance is not more than 20 meters.
8. The method according to claim 1 or 2, characterized in that,
the positioning information comprises the wireless signal strengths of the wireless network access points detected by the user terminal and the identification information of those access points.
9. An indoor positioning server, characterized by comprising a receiving unit, an extraction unit, a coarse positioning unit, a first target image acquisition unit, a database, a recognition unit and a sending unit, wherein:
the receiving unit is configured to receive a positioning request sent by a user terminal, wherein the user terminal first sends the positioning request when positioning and subsequently sends a captured positioning image;
the extraction unit is configured to, when the receiving unit receives the positioning request sent by the user terminal, extract positioning information contained in the positioning request;
the coarse positioning unit is configured to perform coarse positioning on the user terminal using the positioning information, so as to obtain a rough position estimate;
the first target image acquisition unit is configured to obtain, from the database, first target images associated with the rough position estimate, wherein the distance between the coordinate position of each first target image and the rough position estimate is smaller than a preset distance;
the database is configured to store images, each stored image having coordinate information and direction information, wherein the coordinate information represents the position at which the image was captured and the direction information represents the shooting direction at the time of capture;
the recognition unit is configured to, when the receiving unit receives the positioning image sent by the user terminal, judge whether a matching image that matches the positioning image exists among the first target images;
the sending unit is configured to, according to the judgment result of the recognition unit, if a matching image exists, send the coordinate information and direction information of the matching image to the user terminal, so that the user terminal displays the coordinate information and direction information of the matching image on an indoor map, taking the displayed coordinate information and direction information as the user's current position coordinates and direction.
10. The server according to claim 9, characterized in that,
the sending unit is further configured to, according to the judgment result of the recognition unit, if no matching image exists, send the coordinate information of the rough position estimate to the user terminal, so that the user terminal displays the coordinate information of the rough position estimate on the indoor map, taking the displayed coordinate information as the user's current position coordinates.
11. The server according to claim 9, characterized in that,
the recognition unit comprises a judgment module and a matching module, wherein:
the judgment module is configured to, when the receiving unit receives the positioning image sent by the user terminal, judge whether the positioning request also contains an angle change value, wherein the angle change value is the change in angle of the user terminal's shooting direction when capturing the positioning image relative to the direction determined in the previous positioning;
the matching module is configured to, according to the judgment result of the judgment module, if the positioning request contains no angle change value, take the first target images as candidate images and judge whether a matching image that matches the positioning image exists among the candidate images.
12. The server according to claim 11, characterized in that the recognition unit further comprises a direction recognition module and a target recognition module, wherein:
the direction recognition module is configured to, according to the judgment result of the judgment module, if the positioning request contains an angle change value, determine the shooting direction of the user terminal when capturing the positioning image from the direction determined in the previous positioning and the angle change value;
the target recognition module is configured to obtain second target images from the first target images using the shooting direction, wherein the angle difference between the direction of each second target image and the shooting direction does not exceed a predetermined angle threshold;
the matching module is further configured to take the second target images as candidate images and judge whether a matching image that matches the positioning image exists among the candidate images.
13. The server according to claim 12, characterized in that,
the angle threshold is 45 degrees.
14. The server according to any one of claims 11-13, characterized in that,
the matching module pre-processes the positioning image so that the positioning image and the candidate images have a unified resolution; performs image processing on the pre-processed positioning image and on the candidate images to extract feature information S and Fi, each invariant to illumination, scale and rotation, where S is the feature information obtained from the pre-processed positioning image, Fi is the feature information obtained from the i-th candidate image, 1 ≤ i ≤ N, and N is the number of candidate images; judges whether an Fi matching S exists; if an Fi matching S exists, takes the candidate image corresponding to the Fi matching S as the matching image that matches the positioning image; and if no Fi matching S exists, judges that no matching image exists.
15. The server according to claim 9 or 10, characterized in that,
the preset distance is not more than 20 meters.
16. The server according to claim 9 or 10, characterized in that,
the positioning information comprises the wireless signal strengths of the wireless network access points detected by the user terminal and the identification information of those access points.
17. An indoor positioning system, characterized by comprising a user terminal and an indoor positioning server, wherein:
the user terminal is configured to capture a positioning image when positioning, and to send the positioning request containing positioning information and then the captured positioning image to the indoor positioning server in sequence; and, upon receiving the coordinate information and direction information of the matching image sent by the indoor positioning server, display the coordinate information and direction information of the matching image on an indoor map, taking the displayed coordinate information and direction information as the user's current position coordinates and direction;
the indoor positioning server is the indoor positioning server according to any one of claims 9-16.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410106331.3A CN104936283B (en) | 2014-03-21 | 2014-03-21 | Indoor orientation method, server and system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104936283A true CN104936283A (en) | 2015-09-23 |
CN104936283B CN104936283B (en) | 2018-12-25 |
Family
ID=54123175
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410106331.3A Active CN104936283B (en) | 2014-03-21 | 2014-03-21 | Indoor orientation method, server and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104936283B (en) |
Cited By (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105225240A (en) * | 2015-09-25 | 2016-01-06 | 哈尔滨工业大学 | The indoor orientation method that a kind of view-based access control model characteristic matching and shooting angle are estimated |
CN105246039A (en) * | 2015-10-20 | 2016-01-13 | 深圳大学 | Image processing-based indoor positioning method and system |
CN105354296A (en) * | 2015-10-31 | 2016-02-24 | 广东欧珀移动通信有限公司 | Terminal positioning method and user terminal |
CN105652237A (en) * | 2016-01-13 | 2016-06-08 | 广东欧珀移动通信有限公司 | Positioning method and device of mobile terminal, and mobile terminal |
CN105792131A (en) * | 2016-04-21 | 2016-07-20 | 北京邮电大学 | Positioning method and system |
CN105930478A (en) * | 2016-05-03 | 2016-09-07 | 福州市勘测院 | Element object spatial information fingerprint-based spatial data change capture method |
CN105959919A (en) * | 2016-06-29 | 2016-09-21 | 宁波市由乐讯通讯科技有限公司 | Position acquisition method and system for use in wireless communication |
CN105975967A (en) * | 2016-04-29 | 2016-09-28 | 殳南 | Target positioning method and system |
CN106131783A (en) * | 2016-06-29 | 2016-11-16 | 宁波市由乐讯通讯科技有限公司 | Location acquiring method and system in a kind of modified model radio communication |
CN106153047A (en) * | 2016-08-15 | 2016-11-23 | 广东欧珀移动通信有限公司 | A kind of indoor orientation method, device and terminal |
CN106170124A (en) * | 2016-06-29 | 2016-11-30 | 宁波市由乐讯通讯科技有限公司 | Location acquiring method and system in a kind of radio communication based on multiple location technology |
CN106291517A (en) * | 2016-08-12 | 2017-01-04 | 苏州大学 | Indoor cloud robot angle positioning method based on position and visual information optimization |
CN106530008A (en) * | 2016-11-10 | 2017-03-22 | 广州市沃希信息科技有限公司 | Scene-picture-based advertising method and system |
CN106767810A (en) * | 2016-11-23 | 2017-05-31 | 武汉理工大学 | The indoor orientation method and system of a kind of WIFI and visual information based on mobile terminal |
CN106767850A (en) * | 2016-11-10 | 2017-05-31 | 广州市沃希信息科技有限公司 | A kind of passenger's localization method and system based on scene picture |
CN107356242A (en) * | 2017-06-26 | 2017-11-17 | 广州贰拾肆机器人科技有限公司 | The indoor orientation method and system of a kind of Intelligent mobile equipment |
WO2017219679A1 (en) * | 2016-06-20 | 2017-12-28 | 杭州海康威视数字技术股份有限公司 | Method and device for establishing correspondence between rfid tags and persons, and method and device for trajectory tracking |
CN107545006A (en) * | 2016-06-28 | 2018-01-05 | 百度在线网络技术(北京)有限公司 | A kind of method, equipment and system for being used to establishing or updating image positional data storehouse |
CN107566500A (en) * | 2017-09-11 | 2018-01-09 | 珠海市魅族科技有限公司 | The sending method and relevant apparatus of a kind of position location |
CN108256543A (en) * | 2016-12-29 | 2018-07-06 | 纳恩博(北京)科技有限公司 | A kind of localization method and electronic equipment |
CN108692720A (en) * | 2018-04-09 | 2018-10-23 | 京东方科技集团股份有限公司 | Localization method, location-server and positioning system |
CN109171789A (en) * | 2018-09-18 | 2019-01-11 | 上海联影医疗科技有限公司 | A kind of calibration method and calibration system for diagnostic imaging equipment |
WO2019104665A1 (en) * | 2017-11-30 | 2019-06-06 | 深圳市沃特沃德股份有限公司 | Robot cleaner and repositioning method therefor |
CN109919600A (en) * | 2019-03-04 | 2019-06-21 | 出门问问信息科技有限公司 | A kind of virtual card call method, device, equipment and storage medium |
CN110243366A (en) * | 2018-03-09 | 2019-09-17 | 中国移动通信有限公司研究院 | A kind of vision positioning method and device, equipment, storage medium |
CN110360999A (en) * | 2018-03-26 | 2019-10-22 | 京东方科技集团股份有限公司 | Indoor orientation method, indoor locating system and computer-readable medium |
CN111178366A (en) * | 2018-11-12 | 2020-05-19 | 杭州萤石软件有限公司 | Mobile robot positioning method and mobile robot |
CN111935641A (en) * | 2020-08-14 | 2020-11-13 | 上海木木聚枞机器人科技有限公司 | Indoor self-positioning realization method, intelligent mobile device and storage medium |
WO2021057797A1 (en) * | 2019-09-27 | 2021-04-01 | Oppo广东移动通信有限公司 | Positioning method and apparatus, terminal and storage medium |
CN113038367A (en) * | 2021-02-26 | 2021-06-25 | 山东鹰格信息工程有限公司 | Non-exposed space rapid positioning method, device and equipment based on 5G technology |
CN113656629A (en) * | 2021-07-29 | 2021-11-16 | 北京百度网讯科技有限公司 | Visual positioning method and device, electronic equipment and storage medium |
US11182927B2 (en) | 2018-09-18 | 2021-11-23 | Shanghai United Imaging Healthcare Co., Ltd. | Systems and methods for positioning an object |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013028359A1 (en) * | 2011-08-19 | 2013-02-28 | Qualcomm Incorporated | Logo detection for indoor positioning |
WO2013112461A1 (en) * | 2012-01-27 | 2013-08-01 | Qualcomm Incorporated | System and method for determining location of a device using opposing cameras |
CN103249142A (en) * | 2013-04-26 | 2013-08-14 | 东莞宇龙通信科技有限公司 | Locating method, locating system and mobile terminal |
CN103424113A (en) * | 2013-08-01 | 2013-12-04 | 毛蔚青 | Indoor positioning and navigating method of mobile terminal based on image recognition technology |
CN103473257A (en) * | 2012-06-06 | 2013-12-25 | 三星电子株式会社 | Apparatus and method of tracking location of wireless terminal based on image |
Non-Patent Citations (1)
Title |
---|
Zhang Fan, Chen Diancheng, Yang Jie: "A Comparative Study of Indoor Positioning Technologies and Systems" (室内定位技术及系统比较研究), Guangzhou Communication Technology (广州通信技术) *
Cited By (52)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105225240B (en) * | 2015-09-25 | 2017-10-03 | 哈尔滨工业大学 | The indoor orientation method that a kind of view-based access control model characteristic matching is estimated with shooting angle |
CN105225240A (en) * | 2015-09-25 | 2016-01-06 | 哈尔滨工业大学 | The indoor orientation method that a kind of view-based access control model characteristic matching and shooting angle are estimated |
CN105246039A (en) * | 2015-10-20 | 2016-01-13 | 深圳大学 | Image processing-based indoor positioning method and system |
CN105246039B (en) * | 2015-10-20 | 2018-05-29 | 深圳大学 | A kind of indoor orientation method and system based on image procossing |
CN105354296A (en) * | 2015-10-31 | 2016-02-24 | 广东欧珀移动通信有限公司 | Terminal positioning method and user terminal |
CN105354296B (en) * | 2015-10-31 | 2018-06-29 | 广东欧珀移动通信有限公司 | A kind of method of locating terminal and user terminal |
CN105652237A (en) * | 2016-01-13 | 2016-06-08 | 广东欧珀移动通信有限公司 | Positioning method and device of mobile terminal, and mobile terminal |
CN105792131A (en) * | 2016-04-21 | 2016-07-20 | 北京邮电大学 | Positioning method and system |
CN105792131B (en) * | 2016-04-21 | 2018-11-23 | 北京邮电大学 | A kind of localization method and system |
CN105975967A (en) * | 2016-04-29 | 2016-09-28 | 殳南 | Target positioning method and system |
CN105930478A (en) * | 2016-05-03 | 2016-09-07 | 福州市勘测院 | Element object spatial information fingerprint-based spatial data change capture method |
CN105930478B (en) * | 2016-05-03 | 2019-04-19 | 福州市勘测院 | Spatial data based on feature object spatial information fingerprint changes catching method |
WO2017219679A1 (en) * | 2016-06-20 | 2017-12-28 | 杭州海康威视数字技术股份有限公司 | Method and device for establishing correspondence between rfid tags and persons, and method and device for trajectory tracking |
CN107545006A (en) * | 2016-06-28 | 2018-01-05 | 百度在线网络技术(北京)有限公司 | A kind of method, equipment and system for being used to establishing or updating image positional data storehouse |
CN106131783A (en) * | 2016-06-29 | 2016-11-16 | 宁波市由乐讯通讯科技有限公司 | Location acquiring method and system in a kind of modified model radio communication |
CN106170124A (en) * | 2016-06-29 | 2016-11-30 | 宁波市由乐讯通讯科技有限公司 | Location acquiring method and system in a kind of radio communication based on multiple location technology |
CN105959919A (en) * | 2016-06-29 | 2016-09-21 | 宁波市由乐讯通讯科技有限公司 | Position acquisition method and system for use in wireless communication |
CN106291517A (en) * | 2016-08-12 | 2017-01-04 | 苏州大学 | Indoor cloud robot angle positioning method based on position and visual information optimization |
CN106153047A (en) * | 2016-08-15 | 2016-11-23 | 广东欧珀移动通信有限公司 | A kind of indoor orientation method, device and terminal |
CN106767850A (en) * | 2016-11-10 | 2017-05-31 | 广州市沃希信息科技有限公司 | A kind of passenger's localization method and system based on scene picture |
CN106767850B (en) * | 2016-11-10 | 2022-12-30 | 广州市沃希信息科技有限公司 | Passenger positioning method and system based on scene picture |
CN106530008B (en) * | 2016-11-10 | 2022-01-07 | 广州市沃希信息科技有限公司 | Advertisement method and system based on scene picture |
CN106530008A (en) * | 2016-11-10 | 2017-03-22 | 广州市沃希信息科技有限公司 | Scene-picture-based advertising method and system |
CN106767810B (en) * | 2016-11-23 | 2020-04-21 | 武汉理工大学 | Indoor positioning method and system based on WIFI and visual information of mobile terminal |
CN106767810A (en) * | 2016-11-23 | 2017-05-31 | 武汉理工大学 | The indoor orientation method and system of a kind of WIFI and visual information based on mobile terminal |
CN108256543A (en) * | 2016-12-29 | 2018-07-06 | 纳恩博(北京)科技有限公司 | A kind of localization method and electronic equipment |
CN107356242A (en) * | 2017-06-26 | 2017-11-17 | 广州贰拾肆机器人科技有限公司 | The indoor orientation method and system of a kind of Intelligent mobile equipment |
CN107356242B (en) * | 2017-06-26 | 2020-09-29 | 广州贰拾肆机器人科技有限公司 | Indoor positioning method and system for intelligent mobile equipment |
CN107566500A (en) * | 2017-09-11 | 2018-01-09 | 珠海市魅族科技有限公司 | The sending method and relevant apparatus of a kind of position location |
WO2019104665A1 (en) * | 2017-11-30 | 2019-06-06 | 深圳市沃特沃德股份有限公司 | Robot cleaner and repositioning method therefor |
CN110243366A (en) * | 2018-03-09 | 2019-09-17 | 中国移动通信有限公司研究院 | Visual positioning method and device, equipment and storage medium |
CN110243366B (en) * | 2018-03-09 | 2021-06-08 | 中国移动通信有限公司研究院 | Visual positioning method and device, equipment and storage medium |
US11395100B2 (en) | 2018-03-26 | 2022-07-19 | Boe Technology Group Co., Ltd. | Indoor positioning method, indoor positioning system, indoor positioning apparatus and computer readable medium |
CN110360999A (en) * | 2018-03-26 | 2019-10-22 | 京东方科技集团股份有限公司 | Indoor positioning method, indoor positioning system and computer-readable medium |
CN110360999B (en) * | 2018-03-26 | 2021-08-27 | 京东方科技集团股份有限公司 | Indoor positioning method, indoor positioning system, and computer readable medium |
WO2019196403A1 (en) * | 2018-04-09 | 2019-10-17 | 京东方科技集团股份有限公司 | Positioning method, positioning server and positioning system |
US11933614B2 (en) | 2018-04-09 | 2024-03-19 | Boe Technology Group Co., Ltd. | Positioning method, positioning server and positioning system |
CN108692720B (en) * | 2018-04-09 | 2021-01-22 | 京东方科技集团股份有限公司 | Positioning method, positioning server and positioning system |
CN108692720A (en) * | 2018-04-09 | 2018-10-23 | 京东方科技集团股份有限公司 | Positioning method, positioning server and positioning system |
US11727598B2 (en) | 2018-09-18 | 2023-08-15 | Shanghai United Imaging Healthcare Co., Ltd. | Systems and methods for positioning an object |
US11182927B2 (en) | 2018-09-18 | 2021-11-23 | Shanghai United Imaging Healthcare Co., Ltd. | Systems and methods for positioning an object |
CN109171789A (en) * | 2018-09-18 | 2019-01-11 | 上海联影医疗科技有限公司 | Calibration method and calibration system for diagnostic imaging equipment |
CN111178366A (en) * | 2018-11-12 | 2020-05-19 | 杭州萤石软件有限公司 | Mobile robot positioning method and mobile robot |
CN111178366B (en) * | 2018-11-12 | 2023-07-25 | 杭州萤石软件有限公司 | Mobile robot positioning method and mobile robot |
CN109919600A (en) * | 2019-03-04 | 2019-06-21 | 出门问问信息科技有限公司 | Virtual card calling method, device, equipment and storage medium |
WO2021057797A1 (en) * | 2019-09-27 | 2021-04-01 | Oppo广东移动通信有限公司 | Positioning method and apparatus, terminal and storage medium |
CN111935641B (en) * | 2020-08-14 | 2022-08-19 | 上海木木聚枞机器人科技有限公司 | Indoor self-positioning implementation method, intelligent mobile device and storage medium |
CN111935641A (en) * | 2020-08-14 | 2020-11-13 | 上海木木聚枞机器人科技有限公司 | Indoor self-positioning implementation method, intelligent mobile device and storage medium |
CN113038367B (en) * | 2021-02-26 | 2022-07-12 | 山东鹰格信息工程有限公司 | Non-exposed space rapid positioning method, device and equipment based on 5G technology |
CN113038367A (en) * | 2021-02-26 | 2021-06-25 | 山东鹰格信息工程有限公司 | Non-exposed space rapid positioning method, device and equipment based on 5G technology |
CN113656629B (en) * | 2021-07-29 | 2022-09-23 | 北京百度网讯科技有限公司 | Visual positioning method and device, electronic equipment and storage medium |
CN113656629A (en) * | 2021-07-29 | 2021-11-16 | 北京百度网讯科技有限公司 | Visual positioning method and device, electronic equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN104936283B (en) | 2018-12-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104936283A (en) | | Indoor positioning method, server and system |
Huang et al. | | WiFi and vision-integrated fingerprint for smartphone-based self-localization in public indoor scenes |
US11395100B2 (en) | | Indoor positioning method, indoor positioning system, indoor positioning apparatus and computer readable medium |
CN103827634B (en) | | Logo detection for indoor positioning |
Niu et al. | | Resource-efficient and automated image-based indoor localization |
CN106767810B (en) | | Indoor positioning method and system based on WIFI and visual information of mobile terminal |
US20130243250A1 (en) | | Location of image capture device and object features in a captured image |
CN111028358B (en) | | Indoor environment augmented reality display method and device and terminal equipment |
CN103761539B (en) | | Indoor positioning method based on environment characteristic objects |
CN106153047A (en) | | Indoor positioning method, device and terminal |
CN104378735A (en) | | Indoor positioning method, client and server |
JP5274192B2 (en) | | Position estimation system, position estimation server, and position estimation method |
TWM580186U (en) | | 360 degree surround orientation and position sensing object information acquisition system |
CN111832579B (en) | | Map interest point data processing method and device, electronic equipment and readable medium |
Zhang et al. | | Seeing Eye Phone: a smart phone-based indoor localization and guidance system for the visually impaired |
CN112422653A (en) | | Scene information pushing method, system, storage medium and equipment based on location service |
JP7001711B2 (en) | | Position information system using images taken by a camera, and camera-equipped information device using the same |
CN103557834A (en) | | Dual-camera-based solid positioning method |
CN108512888A (en) | | Information labeling method, cloud server, system, electronic equipment and computer program product |
Sui et al. | | An accurate indoor localization approach using cellphone camera |
Jiao et al. | | A hybrid of smartphone camera and basestation wide-area indoor positioning method |
Chi et al. | | Locate, Tell, and Guide: Enabling public cameras to navigate the public |
JP6281947B2 (en) | | Information presentation system, method and program |
US10108882B1 (en) | | Method to post and access information onto a map through pictures |
He et al. | | Portable 3D visual sensor based indoor localization on mobile device |
Legal Events
Date | Code | Title | Description
---|---|---|---|
| C06 | Publication | |
| PB01 | Publication | |
| C10 | Entry into substantive examination | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |