CN107036593A - Indoor positioning method combining feature matching with human-computer interaction - Google Patents
Indoor positioning method combining feature matching with human-computer interaction
- Publication number
- CN107036593A CN107036593A CN201611020040.8A CN201611020040A CN107036593A CN 107036593 A CN107036593 A CN 107036593A CN 201611020040 A CN201611020040 A CN 201611020040A CN 107036593 A CN107036593 A CN 107036593A
- Authority
- CN
- China
- Prior art keywords
- indoor
- human-computer interaction
- user
- cloud platform
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/206—Instruments for performing navigational calculations specially adapted for indoor navigation
Abstract
The present invention provides an indoor positioning method that combines feature matching with human-computer interaction, belonging to the technical field of indoor navigation and positioning. The method combines continuous images captured by a mobile terminal camera with a human-computer interaction mode and completes the indoor positioning work on a cloud platform. The position information obtained by feature matching is fed back to the user, who corrects it by means such as visual recognition; the human-computer interaction mode effectively exploits the information gathered by the human eye. By offering the user a set of options during the interaction, the method improves the efficiency and ease of the interaction, thereby refining the uncertain information produced by feature matching. Combining feature matching with human-computer interaction transforms indoor positioning from a coordinate-calculation mode into a scene-search mode. The positioning process requires no external signal input, which improves reliability, and the positioning computation is completed on the cloud platform, so the navigation process is not limited by the performance of the mobile terminal.
Description
Technical field
The present invention relates to the technical field of indoor positioning methods, and in particular to an indoor positioning method that combines feature matching with human-computer interaction.
Background technology
With the rapid development of mobile communication technology, location-based services (Location Based Service, LBS) have become increasingly popular. Location-based service technologies can be divided into outdoor positioning technologies and indoor positioning technologies. In outdoor environments, Global Navigation Satellite Systems (Global Navigation Satellite System, GNSS) are widely used and can achieve high-precision positioning and navigation. In indoor environments, where position cannot be obtained from satellite navigation systems, electromagnetic reference beacons are typically installed indoors and positioning is performed by inverting distance from signal strength or by matching against a fingerprint database, which can reach meter-level accuracy.
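The background above mentions positioning by inverting distance from signal strength. A common way to do this (not part of the patent; the reference power and path-loss exponent below are illustrative) is the log-distance path-loss model, RSSI = P0 - 10 n log10(d), solved for d:

```python
def rssi_to_distance(rssi_dbm, ref_power_dbm=-59.0, path_loss_exp=2.0):
    """Invert the log-distance path-loss model RSSI = P0 - 10*n*log10(d).

    ref_power_dbm: RSSI expected at 1 m from the beacon (illustrative value).
    path_loss_exp: environment-dependent exponent n (about 2 in free space,
    larger indoors where walls and furniture attenuate the signal).
    Returns the estimated beacon distance in metres.
    """
    return 10 ** ((ref_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))

# At the 1 m reference power the inversion returns exactly 1 m;
# a signal 20 dB weaker corresponds to 10x the distance when n = 2.
print(rssi_to_distance(-59.0))  # 1.0
print(rssi_to_distance(-79.0))  # 10.0
```

Distances estimated this way from several beacons can then be combined (for example by trilateration) into the meter-level position fix the background describes.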
According to statistics, 70%-80% of human activity takes place indoors, so research on indoor positioning problems is of great significance. When a fire breaks out in a high-rise building, rapid evacuation is difficult; indoor navigation and positioning services are then needed to guide the endangered crowd quickly and accurately to a safe area, preventing heavy casualties and property losses. Outdoors, position information of relatively high precision can be obtained through GNSS. However, GNSS requires direct communication between the satellites and the mobile target; in signal blind zones such as dense urban districts or complex and changeable indoor environments, the signals are easily blocked by building walls, glass, indoor equipment, and other objects, making it difficult for GNSS to perform its ranging and positioning function effectively. Therefore, other sensors are needed to achieve high-precision indoor positioning.
At present, many related indoor positioning technologies have emerged, such as infrared, radio frequency identification, WiFi, ZigBee, and visual positioning, all of which have achieved good results. Although these technologies can produce good positioning results under certain environmental conditions, each has its own defects: either the positioning accuracy is low, or the environmental requirements are harsh, so they cannot meet people's demand for an indoor positioning system with both high accuracy and good environmental adaptability. In indoor positioning with these technologies and methods, the user is treated merely as a passive receiver of positioning results, and the user's subjective role in indoor navigation and positioning is ignored. During indoor navigation and positioning, the information acquired by the human eye exceeds the observations of any sensor, yet this information plays no role in the navigation and positioning; in other words, indoor positioning lacks human-computer interaction.
Summary of the invention
In view of the above deficiencies of the prior art, the present invention provides an indoor positioning method that combines feature matching with human-computer interaction, which can accurately obtain the indoor position information of a mobile phone user.
To achieve the above object, the present invention adopts the following technical scheme:
An indoor positioning method combining feature matching with human-computer interaction, comprising the following steps:
Step a. The user downloads an indoor map and the indoor positioning system through a mobile terminal, opens the camera of the mobile terminal, and captures continuous images with the camera;
Step b. The continuous images obtained in step a are uploaded to a cloud platform, where feature matching is performed between the three-dimensional map information on the cloud platform and the continuous images, so as to carry out scene recognition and preliminarily obtain the indoor position information of the user;
Step c. The indoor position information preliminarily obtained by the cloud platform is sent to the user's mobile terminal, and the fuzzy information in it is fed back to the user together with options for the possible positions corresponding to each ambiguous location; the user identifies and judges the fuzzy information against the surrounding scenes and objects observed by eye, and selects the option corresponding to the correct position information, completing the human-computer interaction;
Step d. The correct position information obtained in step c is uploaded to the cloud platform; the cloud platform replaces the fuzzy information in the indoor position information preliminarily obtained in step b with the correct position information, obtains refined user position information, and feeds it back to the user;
Step e. The refined position information of the mobile phone user is displayed on the indoor map that the mobile terminal has downloaded, completing the navigation and positioning work.
Further, the continuous images are photos or video.
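As an illustration of steps b and c, the sketch below (hypothetical, not from the patent; the scene names and descriptor sets are invented) scores a query image's feature set against a database of scenes and flags the result as fuzzy when the top two scores are too close, which is exactly the case the human-computer interaction step is meant to resolve.

```python
def match_scene(query_features, scene_db, ambiguity_margin=2):
    """Score each candidate scene by the number of shared features.

    query_features: set of feature descriptors extracted from the image
    scene_db: dict mapping scene name -> set of reference descriptors
    (assumes at least two candidate scenes in the database)
    Returns (best_scene, options): options is empty when the match is
    unambiguous, otherwise best_scene is None and options lists the
    near-tied scenes to present to the user (step c).
    """
    scores = sorted(
        ((len(query_features & feats), name) for name, feats in scene_db.items()),
        reverse=True,
    )
    (best_score, best_name), (second_score, _) = scores[0], scores[1]
    if best_score - second_score < ambiguity_margin:
        # Fuzzy result: return the near-tied candidates as options.
        options = [name for score, name in scores
                   if best_score - score < ambiguity_margin]
        return None, options
    return best_name, []

# Invented descriptor sets standing in for image features.
db = {"corridor-3F": {1, 2, 3, 4}, "lobby": {1, 2, 5, 6}, "stairwell": {7, 8}}
print(match_scene({1, 2, 3, 4}, db))  # ('corridor-3F', [])
print(match_scene({1, 2}, db))        # (None, ['lobby', 'corridor-3F'])
```

In the second call the two look-alike scenes tie, so the function returns them as options rather than guessing; in the method above, the user's eye-based judgement then picks the correct one.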
Beneficial effects: 1. Feature matching can be performed quickly on the cloud platform against the images uploaded by the user, and the position result is corrected using the user's visual observations; 2. During the interaction, offering the user a set of options improves the efficiency and ease of the human-computer interaction, thereby refining the uncertain information produced by feature matching; 3. Combining feature matching with human-computer interaction transforms indoor positioning from a coordinate-calculation mode into a scene-search mode; 4. No external signal input such as WiFi is needed, improving reliability and availability under harsh conditions; the positioning computation is completed on the cloud platform, so the navigation process is not limited by the performance of the mobile terminal.
Brief description of the drawings
Fig. 1 is a flow chart of the method of the present invention.
Embodiment
In the present invention, the positioning computation is completed on a cloud platform. The position information obtained by feature matching is fed back to the user and corrected by means such as the user's visual recognition; the human-computer interaction mode effectively exploits the information gathered by the human eye. Feature matching is combined with human-computer interaction: ambiguous positions are obtained by feature matching and then refined through human-computer interaction, thereby realizing indoor positioning.
The specific steps of the present invention are as follows:
Step a. A QR code for downloading the indoor navigation map and the indoor positioning system is placed in a prominent position indoors. The user downloads the indoor navigation map by scanning the QR code, which can be done with software such as WeChat or Tencent QQ, or with the QR-code scanner built into the mobile terminal. The user then opens the camera of the mobile terminal and captures continuous images, which can be photos or video;
Step b. The indoor positioning system connects to the cloud platform via a WLAN or similar network. The acquired images are not stored on the mobile terminal but uploaded directly to the cloud platform, which screens the image data and rejects invalid and redundant data. Feature matching is performed between the continuous images and the three-dimensional map information of the building held on the cloud platform, so as to carry out scene recognition: the continuous image information is compared with the three-dimensional map information, the user's position is found from the degree of matching, and the indoor coordinate information is preliminarily obtained;
Step c. The indoor position information preliminarily obtained by the cloud platform is sent to the user's mobile terminal. According to the accuracy of the feature matching, the information is divided into two classes, precise information and fuzzy information; the fuzzy information is fed back to the user together with options for the possible positions corresponding to each ambiguous location. The user identifies and judges the fuzzy information against the surrounding scenes and objects observed by eye, and selects the option corresponding to the correct position information, completing the human-computer interaction;
Step d. The correct position information obtained in step c is uploaded to the cloud platform; the cloud platform replaces the fuzzy information in the indoor position information preliminarily obtained in step b with the correct position information, obtains refined user position information, and feeds it back to the user;
Step e. The refined user position obtained in step d is displayed on the indoor navigation map previously downloaded to the mobile terminal, completing the navigation and positioning work.
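Steps c and d above amount to a small correction step: each fuzzy field of the preliminary estimate is collapsed to the value the user confirmed. A minimal sketch (the field names and data layout are invented for illustration, not specified by the patent):

```python
def refine_position(preliminary, user_choice):
    """Step d: merge the user's selected options into the preliminary result.

    preliminary: dict of position fields; a fuzzy field holds a list of
    candidate values (the options shown to the user in step c).
    user_choice: dict mapping each fuzzy field to the candidate the user picked.
    Returns the refined position; raises if a choice was not an offered option.
    """
    refined = dict(preliminary)
    for field, chosen in user_choice.items():
        candidates = preliminary[field]
        if chosen not in candidates:
            raise ValueError(f"{chosen!r} was not among the offered options")
        refined[field] = chosen  # fuzzy list collapses to the confirmed value
    return refined

# Preliminary result: the floor is precise, the room is fuzzy
# (two look-alike rooms survived the feature matching of step b).
prelim = {"floor": 3, "room": ["301", "305"]}
print(refine_position(prelim, {"room": "305"}))  # {'floor': 3, 'room': '305'}
```

Restricting the user to the offered options, rather than free-form input, is what keeps the interaction efficient and easy to operate, as claimed in the beneficial effects.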
There are many methods and approaches for implementing this technical scheme, and the above is only a preferred embodiment of the present invention. It should be noted that those skilled in the art may make several improvements and modifications without departing from the principles of the present invention, and these improvements and modifications should also be regarded as falling within the scope of protection of the present invention. Any component not specified in this embodiment can be realized with the prior art.
Claims (2)
1. An indoor positioning method combining feature matching with human-computer interaction, characterized in that it comprises the following steps:
Step a. The user downloads an indoor map and the indoor positioning system through a mobile terminal, opens the camera of the mobile terminal, and captures continuous images with the camera;
Step b. The continuous images obtained in step a are uploaded to a cloud platform, where feature matching is performed between the three-dimensional map information on the cloud platform and the continuous images, so as to carry out scene recognition and preliminarily obtain the indoor position information of the user;
Step c. The indoor position information preliminarily obtained by the cloud platform is sent to the user's mobile terminal, and the fuzzy information in it is fed back to the user together with options for the possible positions corresponding to each ambiguous location; the user identifies and judges the fuzzy information against the surrounding scenes and objects observed by eye, and selects the option corresponding to the correct position information, completing the human-computer interaction;
Step d. The correct position information obtained in step c is uploaded to the cloud platform; the cloud platform replaces the fuzzy information in the indoor position information preliminarily obtained in step b with the correct position information, obtains refined user position information, and feeds it back to the user;
Step e. The refined position information of the mobile phone user is displayed on the indoor map that the mobile terminal has downloaded, completing the navigation and positioning work.
2. The indoor positioning method combining feature matching with human-computer interaction according to claim 1, characterized in that the continuous images are photos or video.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611020040.8A CN107036593A (en) | 2016-11-18 | 2016-11-18 | Indoor positioning method combining feature matching with human-computer interaction |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107036593A true CN107036593A (en) | 2017-08-11 |
Family
ID=59531080
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611020040.8A Pending CN107036593A (en) | 2016-11-18 | 2016-11-18 | Indoor positioning method combining feature matching with human-computer interaction
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107036593A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI735876B (en) * | 2019-05-10 | 2021-08-11 | 宏碁股份有限公司 | Indoor positioning method, indoor positioning training system and mobile device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103424113A (en) * | 2013-08-01 | 2013-12-04 | 毛蔚青 | Indoor positioning and navigating method of mobile terminal based on image recognition technology |
WO2014073841A1 (en) * | 2012-11-07 | 2014-05-15 | 한국과학기술연구원 | Method for detecting image-based indoor position, and mobile terminal using same |
CN105225240A (en) * | 2015-09-25 | 2016-01-06 | 哈尔滨工业大学 | Indoor positioning method based on visual feature matching and shooting-angle estimation |
CN105698761A (en) * | 2014-11-28 | 2016-06-22 | 英业达科技有限公司 | Cloud image positioning and navigation method and system |
- 2016-11-18 CN CN201611020040.8A patent/CN107036593A/en active Pending
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20170811 |