CN117308966B - Indoor positioning and navigation method, system and computer equipment - Google Patents


Info

Publication number
CN117308966B
CN117308966B (application CN202311608430.7A)
Authority
CN
China
Prior art keywords
information
image information
acquiring
destination
key area
Prior art date
Legal status
Active
Application number
CN202311608430.7A
Other languages
Chinese (zh)
Other versions
CN117308966A (en)
Inventor
马志鹏
陈国伟
张强强
唐锋
郭传刚
林松斌
Current Assignee
Zhuhai Taichuan Cloud Technology Co ltd
Original Assignee
Zhuhai Taichuan Cloud Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Zhuhai Taichuan Cloud Technology Co ltd
Priority to CN202311608430.7A
Publication of CN117308966A
Application granted
Publication of CN117308966B
Active legal status
Anticipated expiration


Classifications

    • G01C21/206: Instruments for performing navigational calculations specially adapted for indoor navigation
    • G01C21/005: Navigation; navigational instruments not provided for in groups G01C1/00-G01C19/00, with correlation of navigation data from several sources, e.g. map or contour matching
    • G01S5/20: Position of source determined by a plurality of spaced direction-finders
    • G01S5/24: Position of single direction-finder fixed by determining direction of a plurality of spaced sources of known location
    • Y02D30/70: Reducing energy consumption in wireless communication networks

Abstract

The present disclosure relates to the field of indoor measurement and positioning technologies, and in particular, to an indoor positioning and navigation method, system, and computer device. The method comprises the following steps: acquiring starting point position information and destination position information; acquiring path information according to the starting point position information, the destination position information and a preset path planning algorithm; acquiring current position information, and matching the current position information with the path information to judge whether the current position is located in a key area within a preset range of the destination position; if yes, acquiring current image information and matching it with preset key area image information; and if the matching is successful, outputting path prompt information. By adopting the method, both the convenience and the accuracy of indoor positioning and navigation can be improved.

Description

Indoor positioning and navigation method, system and computer equipment
Technical Field
The present disclosure relates to the field of indoor measurement and positioning technologies, and in particular, to an indoor positioning and navigation method, system, and computer device.
Background
In recent years, outdoor navigation has brought unparalleled convenience to society and to individuals; it has changed the way people find their way. There is no longer any need to pore over a densely annotated map: a simple search is enough, and the destination can be reached by following the route planned by the system. However, outdoor mapping and navigation technology relies mainly on satellites, and satellite signals cannot reach indoor areas because of the limitations of the communication mode. Because every large building is designed and laid out according to the wishes of its developer and its particular geographic location, indoor layouts vary widely. The interior design and layout of such buildings are highly complex and their floor areas are large, so indoor navigation has become something that every large building complex looks forward to.
In addition, major cities across the country are vigorously pursuing future-oriented smart city construction, and projects such as smart parks, smart buildings and smart communities are emerging one after another. For an intelligent building, developing a digital twin is an inevitable trend; only by establishing an indoor positioning system can the progress of digital twinning be advanced and the connection between the real world and the digital world be built quickly.
Therefore, an indoor positioning and navigation method and system are needed to address the problems caused by the complex interior design and layout of buildings.
Disclosure of Invention
In view of the foregoing, it is desirable to provide an indoor positioning and navigation method, system, and computer device that can improve indoor navigation accuracy.
In a first aspect, the present application provides an indoor positioning and navigation method, the method comprising:
acquiring starting point position information and destination position information;
acquiring path information according to the starting point position information, the destination position information and a preset path planning algorithm;
acquiring current position information, and matching the current position information with path information to judge whether the current position is positioned in a key area within a preset range of a destination position;
if yes, acquiring current image information, and matching with preset key area image information;
and if the matching is successful, outputting path prompt information.
In one embodiment, the specific manner of determining whether the current location is located in the critical area within the preset range of the destination location includes:
acquiring beacon mark information of a positioning beacon which is closest to the position of the destination in the path information;
acquiring the distance between the current position and the positioning beacon according to the current position information and the beacon mark information, and judging whether the distance is smaller than a preset distance;
if yes, it indicates that the current position is located in a key area within the preset range of the destination position;
if not, it indicates that the current position is located in a non-critical area outside the preset range of the destination position.
In one embodiment, the specific way of obtaining the current image information and matching the current image information with the preset key area image information includes:
acquiring at least one key region image information corresponding to destination information, which is pre-stored in a storage space, according to the beacon mark information and the destination information;
acquiring characteristic information of the image information of the key area, and matching the acquired image information of at least one key area with the current image information one by one;
and acquiring the key area image information successfully matched.
In one embodiment, the specific manner of obtaining the current image information and matching the current image information with the preset key area image information further includes:
judging whether the key area image information successfully matched is unique;
if not, acquiring image acquisition point position information corresponding to the key area image information;
judging whether the acquisition point corresponding to the position information of the acquisition point is positioned on a path corresponding to the path information;
if not, discarding the key region image information corresponding to the acquisition point position information;
if yes, the key area image information corresponding to the position information of the acquisition point is used as the key area image information successfully matched with the current image information.
In one embodiment, the specific manner of outputting the path prompt information includes:
acquiring a corresponding region of a destination in the key region image information according to the destination information;
matching the region corresponding to the destination in the current image information according to the characteristics of the region corresponding to the destination in the key region image information;
and adding a preset indication mark into the current image information and outputting the current image information as path prompt information.
In a second aspect, the present application also provides an indoor positioning and navigation system, including:
the positioning beacon module is used for receiving the electromagnetic wave signals and feeding back distance information;
the mobile navigation module is used for sending out electromagnetic wave signals, receiving distance information fed back by the positioning beacon module and acquiring current position information according to the plurality of distance information;
the path planning module is used for acquiring the starting point position information and the destination position information and acquiring path information according to the acquired starting point position information, the destination position information and a preset path planning algorithm;
the storage module is used for storing a preset path planning algorithm and key area image information corresponding to the key area information.
In a third aspect, the present application also provides a computer device comprising a memory storing a computer program and a processor implementing the following steps when executing the computer program:
acquiring starting point position information and destination position information;
acquiring path information according to the starting point position information, the destination position information and a preset path planning algorithm;
acquiring current position information, and matching the current position information with path information to judge whether the current position is positioned in a key area within a preset range of a destination position;
if yes, acquiring current image information, and matching with preset key area image information;
and if the matching is successful, outputting path prompt information.
In a fourth aspect, the present application also provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of:
acquiring starting point position information and destination position information;
acquiring path information according to the starting point position information, the destination position information and a preset path planning algorithm;
acquiring current position information, and matching the current position information with path information to judge whether the current position is positioned in a key area within a preset range of a destination position;
if yes, acquiring current image information, and matching with preset key area image information;
and if the matching is successful, outputting path prompt information.
In a fifth aspect, the present application also provides a computer program product comprising a computer program which, when executed by a processor, performs the steps of:
acquiring starting point position information and destination position information;
acquiring path information according to the starting point position information, the destination position information and a preset path planning algorithm;
acquiring current position information, and matching the current position information with path information to judge whether the current position is positioned in a key area within a preset range of a destination position;
if yes, acquiring current image information, and matching with preset key area image information;
and if the matching is successful, outputting path prompt information.
According to the indoor positioning and navigation method, system, computer device, storage medium and computer program product described above, indoor positioning and navigation are realized based on a Bluetooth AOA algorithm. During movement, a map and path display provide whole-course path guidance, and after the user moves near the destination, path guidance based on the real-time scene is started automatically. By combining whole-course guidance with real-time-scene path guidance, the probability that a user still takes a wrong turn while using navigation is reduced, thereby improving the convenience and accuracy of navigation guidance.
Drawings
FIG. 1 is an application environment diagram of an indoor positioning and navigation method in one embodiment;
FIG. 2 is a flow chart of a method for indoor positioning and navigation according to one embodiment;
FIG. 3 is a flow chart of step 300 in one embodiment;
FIG. 4 is a flow chart of step 400 in one embodiment;
FIG. 5 is an internal structural diagram of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
The indoor positioning and navigation method provided by the embodiments of the present application can be applied to the application environment shown in FIG. 1, in which the terminal 102 communicates with the server 104 via a network. The data storage system may store data that the server 104 needs to process; it may be integrated on the server 104 or placed on a cloud or other network server. The terminal 102 may be, but is not limited to, a personal computer, a notebook computer, a smart phone, a tablet computer, an internet of things device or a portable wearable device, where the internet of things device may be a smart speaker, a smart television, a smart air conditioner, a smart vehicle-mounted device, and the like, and the portable wearable device may be a smart watch, a smart bracelet, a headset, or the like. The server 104 may be implemented as a stand-alone server or as a server cluster composed of multiple servers.
In one embodiment, as shown in FIG. 2, an indoor positioning and navigation method is provided. The method is described here as applied to the terminal 102 in FIG. 1 by way of illustration, and includes the following steps:
step 100: and acquiring the starting point position information and the destination position information.
The starting point position information is the user's starting position in the building. This position can be entered by the user through an input device, including but not limited to a keyboard, a touch screen, a microphone or other device with an information acquisition function; alternatively, the current position of the user can be obtained and used as the starting point position information. The destination position information is the position information of the place to which the user intends to travel.
Step 200: and acquiring path information according to the starting point position information, the destination position information and a preset path planning algorithm.
The path information describes the movement path from the position corresponding to the starting point position information to the position corresponding to the destination position information. In this embodiment, the path information is presented on a pre-stored indoor two-dimensional and/or three-dimensional map, where the complete path for the user to move from the starting point to the destination is displayed.
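The patent does not name the preset path planning algorithm. As an illustrative assumption only, the following minimal Python sketch plans a route with Dijkstra's algorithm over a hypothetical indoor graph whose nodes are a lobby, a corridor junction and rooms; all identifiers and distances are invented for the example.

```python
import heapq

def plan_path(graph, start, destination):
    """Dijkstra shortest path; `graph` maps node -> [(neighbour, metres), ...]."""
    dist, prev = {start: 0.0}, {}
    heap, done = [(0.0, start)], set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in done:
            continue
        done.add(node)
        if node == destination:
            break
        for nxt, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt], prev[nxt] = nd, node
                heapq.heappush(heap, (nd, nxt))
    if destination not in dist:
        return None                      # no route found
    path = [destination]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1]

# Hypothetical floor graph: lobby -> corridor junction -> two rooms.
indoor_graph = {
    "lobby":    [("junction", 18.0)],
    "junction": [("room-3012", 9.0), ("room-3015", 14.0), ("lobby", 18.0)],
}
print(plan_path(indoor_graph, "lobby", "room-3012"))  # ['lobby', 'junction', 'room-3012']
```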
Step 300: and acquiring current position information, and matching the current position information with the path information to judge whether the current position is positioned in a key area within a preset range of the destination position.
The current position information is acquired by determining the current position of the device held by the user through a Bluetooth AOA (angle of arrival) positioning algorithm. The key area is a circular area whose centre is the destination or another important node in the path (such as a corner, a building junction or an elevator entrance) and whose radius is a preset length. Specifically, the step of judging whether the current position is located in the key area within the preset range of the destination position includes:
step 310: beacon mark information of a positioning beacon closest to the destination location in the path information is acquired.
Positioning beacons are devices installed indoors that receive positioning signals and are used to determine the position of a signal source. A plurality of positioning beacons is usually installed indoors, and each positioning beacon carries beacon mark information identifying its identity and position.
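The description names a Bluetooth AOA positioning algorithm, while the system section describes beacons that feed back distance information. Purely as an illustrative assumption, the sketch below shows the distance-based variant: a least-squares multilateration fix from several beacon ranges; the beacon coordinates and measured ranges are invented for the example.

```python
import numpy as np

def estimate_position(beacon_xy, ranges):
    """Least-squares position fix from >= 3 beacon coordinates and measured ranges."""
    (x0, y0), r0 = beacon_xy[0], ranges[0]
    A, b = [], []
    for (xi, yi), ri in zip(beacon_xy[1:], ranges[1:]):
        # linearised range equations relative to the first beacon
        A.append([2.0 * (xi - x0), 2.0 * (yi - y0)])
        b.append(r0**2 - ri**2 + xi**2 - x0**2 + yi**2 - y0**2)
    xy, *_ = np.linalg.lstsq(np.asarray(A), np.asarray(b), rcond=None)
    return float(xy[0]), float(xy[1])

# Hypothetical beacons at known indoor coordinates (metres) and measured ranges.
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 8.0)]
print(estimate_position(beacons, [5.0, 8.06, 5.0]))  # roughly (3.0, 4.0)
```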
Step 320: and acquiring the distance between the current position and the positioning beacon according to the current position information and the beacon mark information, and judging whether the distance between the current position information and the positioning beacon information is smaller than a preset distance.
In this embodiment, the preset distance is the radius of the critical area, and its size is the same as the preset length used in determining the critical area.
Step 331: if yes, the key area of the front position in the preset range of the destination position is indicated.
Step 332: if not, the non-critical area of which the front position is outside the preset range of the destination position is indicated.
In the above steps, since the positions of the key area and of the positioning beacon are fixed relative to each other, the horizontal distance between the user and the key area can be obtained by calculating the distance between the positioning beacon and the user; this distance is then compared with the preset length to judge whether the user has entered the key area.
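A minimal sketch of the distance test just described; the preset radius value is an assumption for illustration, not a value taken from the patent.

```python
import math

def in_key_area(current_xy, beacon_xy, preset_radius_m=5.0):
    """True if the distance to the beacon nearest the destination is below the preset length."""
    return math.hypot(current_xy[0] - beacon_xy[0],
                      current_xy[1] - beacon_xy[1]) < preset_radius_m

print(in_key_area((3.0, 4.0), (6.0, 4.0)))  # True: 3 m away, inside the 5 m key area
```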
Step 400: if yes, the current image information is obtained and matched with the preset key area image information.
The current image information is image information acquired through an image acquisition module, and the preset key area image information is image information pre-stored in the storage space that records the environment inside the key area and serves as a reference for image matching. Specifically, the manner of acquiring the current image information and matching it with the preset key area image information includes the following steps:
step 410: and acquiring at least one key region image information corresponding to the destination information, which is pre-stored in the storage space, according to the beacon mark information and the destination information.
The key area image information is obtained by photographing and sampling a key area from one or more viewing angles. For example, for a destination that can be approached from only one direction along the actual path, such as a room at the end of a corridor, key image information from only one direction needs to be acquired; for a destination that can be approached from two directions along the actual path, such as a room in the middle of a corridor, key image information from the two opposite directions needs to be acquired; and for key areas such as corridor branches, the number of pieces of key image information to be acquired matches the number of branch paths.
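The following sketch shows one possible way to organise the pre-stored key area image information, with one reference image per approach direction and the acquisition point recorded alongside it for the screening step described later. The record layout, identifiers and coordinates are assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class KeyAreaImage:
    image_path: str                          # pre-captured reference photo
    acquisition_point: Tuple[float, float]   # where the photo was taken (metres)
    heading_deg: float                       # shooting direction when captured

@dataclass
class KeyAreaRecord:
    beacon_id: str                           # beacon nearest the destination
    destination_id: str
    images: List[KeyAreaImage] = field(default_factory=list)

# A room in the middle of a corridor is reachable from two directions,
# so two reference images are stored, one per approach direction.
record = KeyAreaRecord(
    beacon_id="B-07",
    destination_id="room-3012",
    images=[
        KeyAreaImage("b07_room3012_from_east.jpg", (18.0, 4.5), 270.0),
        KeyAreaImage("b07_room3012_from_west.jpg", (12.0, 4.5), 90.0),
    ],
)
```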
Step 420: and acquiring characteristic information of the image information of the key area, and matching the acquired image information of at least one key area with the current image information one by one.
In the above step, a key area may correspond to several pieces of key image information, so the current image information needs to be matched with each of them one by one. Key image information captured from different viewing angles contains different image features, and the current image is identified by matching these image features against it.
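The patent does not specify which image features are used. As an assumption for illustration, the sketch below matches ORB descriptors with a ratio test using OpenCV, keeping any reference image that accumulates enough good matches; the thresholds are illustrative values.

```python
import cv2

def match_key_images(current_path, reference_paths, min_good=30, ratio=0.75):
    """Return (reference_path, good_match_count) pairs, best match first."""
    orb = cv2.ORB_create(nfeatures=1000)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    cur = cv2.imread(current_path, cv2.IMREAD_GRAYSCALE)
    _, des_cur = orb.detectAndCompute(cur, None)
    results = []
    for ref_path in reference_paths:
        ref = cv2.imread(ref_path, cv2.IMREAD_GRAYSCALE)
        _, des_ref = orb.detectAndCompute(ref, None)
        if des_cur is None or des_ref is None:
            continue                         # no usable features in one of the images
        pairs = matcher.knnMatch(des_cur, des_ref, k=2)
        good = [p for p in pairs
                if len(p) == 2 and p[0].distance < ratio * p[1].distance]
        if len(good) >= min_good:
            results.append((ref_path, len(good)))
    return sorted(results, key=lambda t: t[1], reverse=True)
```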
Step 430: and acquiring the key area image information successfully matched.
Step 440: judging whether the key area image information successfully matched is unique or not.
Step 441: if not, acquiring the position information of the image acquisition point corresponding to the image information of the key area.
Through the above steps, one or more pieces of successfully matched key area image information can be obtained. Multiple successful matches can occur because key area images captured from several viewing angles may have similar image features, so several pieces of key image information may all match the current image information. When this happens, the successfully matched key image information needs to be screened further to obtain the key image information whose viewing angle is the same as that of the current image information acquired from the user's current viewing angle.
Step 450: and judging whether the acquisition point position corresponding to the acquisition point position information is positioned on a path corresponding to the path information.
Step 461: if not, discarding the key region image information corresponding to the acquisition point position information.
Step 462: if yes, the key area image information corresponding to the position information of the acquisition point is used as the key area image information successfully matched with the current image information.
When the key image information is acquired, acquisition point information recording the position of the acquisition point is stored in one-to-one correspondence with the key image information. After image matching is completed, if several pieces of key image information have been matched successfully, it is judged for each piece whether its acquisition point lies on the path corresponding to the path information. If it does, and the angle between the direction from the acquisition point to the centre of the key area and the shooting direction of the current image information is smaller than a preset shooting-angle threshold, the key image information corresponding to that acquisition point position information is taken as the successfully matched key image information; otherwise, that key image information is discarded.
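A minimal sketch of this screening step, assuming the planned path is available as a list of sampled points and the current shooting direction is known (for example from the phone's compass); the on-path tolerance and angle threshold are illustrative values, not taken from the patent.

```python
import math

def screen_matches(candidates, path_points, key_area_centre, shooting_dir_deg,
                   on_path_tol_m=1.0, angle_thresh_deg=30.0):
    """Keep candidates whose acquisition point lies on the planned path and whose
    view towards the key-area centre is close to the current shooting direction."""
    kept = []
    for cand in candidates:                  # e.g. KeyAreaImage instances from the sketch above
        px, py = cand.acquisition_point
        if not any(math.hypot(px - x, py - y) <= on_path_tol_m for x, y in path_points):
            continue                         # acquisition point not on the path: discard
        # direction from the acquisition point towards the key-area centre
        to_centre = math.degrees(math.atan2(key_area_centre[1] - py,
                                            key_area_centre[0] - px))
        diff = abs((to_centre - shooting_dir_deg + 180.0) % 360.0 - 180.0)
        if diff < angle_thresh_deg:
            kept.append(cand)
    return kept
```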
Step 500: and if the matching is successful, outputting path prompt information.
The path prompt information is combined with the current image information to provide the user with image or video information indicating the specific path from the current position to the destination. The specific manner of outputting the path prompt information includes the following steps:
step 510: and acquiring the corresponding region of the destination in the key region image information according to the destination information.
Step 520: and matching the region corresponding to the destination in the current image information according to the characteristics of the region corresponding to the destination in the key region image information.
Step 530: and adding a preset indication mark into the current image information and outputting the current image information as path prompt information.
When the indoor positioning and navigation method is used, the user runs the method on an electronic device indoors; this embodiment takes a mobile phone as an example. The user establishes a link with the indoor Bluetooth positioning system through the Bluetooth communication function of the mobile phone and then inputs an origin and a destination on the phone. When the origin is entered, the user's current position can be determined automatically through the Bluetooth AOA positioning algorithm, so the origin information is filled in automatically. After the origin information and the destination information have been filled in, path planning is completed automatically by the preset path planning algorithm, and the path information is displayed by the image display device of the mobile phone, namely as a path image shown on a two-dimensional and/or three-dimensional map; the user can then walk according to the path image prompted on the phone. While the user is moving, the phone, in combination with the plurality of positioning beacon modules installed indoors, determines the current position of the user holding the phone and keeps the current position information updated, so that whether a key area has been entered is judged in real time. Taking as an example a circular key area centred on the place corresponding to the destination information with the preset length as its radius: once the positioning algorithm confirms that the user has entered the key area, the image directly in front of the user's direction of travel is captured by the phone camera as the current image information, and the current image information is matched with the one or more pieces of key image information corresponding to that key area. The region corresponding to the destination information in the successfully matched key image information is identified and its image features are obtained; according to these image features, the region corresponding to the destination information is located in the current image information, and a preset indication mark, for example an arrow pointing at the destination, is added to the current image information and displayed to the user, which indicates the destination more intuitively and accurately than the map alone. In summary, during movement, the map and path display provide whole-course path guidance; real-time-scene path guidance is started automatically after the user moves near an intersection or near the destination; and by combining whole-course guidance with real-time-scene path guidance, the probability that a user still takes a wrong turn while using navigation is reduced, thereby improving the convenience and accuracy of navigation guidance.
It should be understood that, although the steps in the flowcharts of the above embodiments are shown in sequence as indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the order of execution of these steps is not strictly limited, and the steps may be performed in other orders. Moreover, at least some of the steps in the flowcharts of the above embodiments may comprise a plurality of sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times; their order of execution is not necessarily sequential, and they may be performed in turn or alternately with at least some of the other steps or with sub-steps or stages of other steps.
Based on the same inventive concept, an embodiment of the present application further provides an indoor positioning and navigation system for implementing the above indoor positioning and navigation method. The implementation of the solution provided by the system is similar to that described in the above method, so for the specific limitations of one or more embodiments of the indoor positioning and navigation system provided below, reference may be made to the limitations of the indoor positioning and navigation method above, which are not repeated here.
In one embodiment, an indoor positioning and navigation system is provided, comprising: positioning beacon module, mobile navigation module, route planning module and storage module, wherein:
the positioning beacon module is used for receiving the electromagnetic wave signals and feeding back distance information;
the mobile navigation module is used for sending out electromagnetic wave signals, receiving distance information fed back by the positioning beacon module and acquiring current position information according to the plurality of distance information;
the path planning module is used for acquiring the starting point position information and the destination position information and acquiring path information according to the acquired starting point position information, the destination position information and a preset path planning algorithm;
the storage module is used for storing a preset path planning algorithm and key area image information corresponding to the key area information.
Each module in the above indoor positioning and navigation system may be implemented in whole or in part by software, hardware or a combination thereof. The above modules may be embedded in, or independent of, a processor of the computer device in hardware form, or may be stored in a memory of the computer device in software form, so that the processor can call and execute the operations corresponding to the above modules.
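As a rough sketch only, the four modules listed above could map onto classes like the following; the interfaces and type choices are assumptions, and the method bodies would reuse the positioning, planning and matching sketches shown earlier.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class PositioningBeaconModule:
    """Installed beacon: answers ranging requests with a distance."""
    beacon_id: str
    position: Tuple[float, float]

@dataclass
class StorageModule:
    """Pre-stored planning graph and key area reference images."""
    graph: Dict[str, List[Tuple[str, float]]] = field(default_factory=dict)
    key_area_records: Dict[str, object] = field(default_factory=dict)  # e.g. KeyAreaRecord

class MobileNavigationModule:
    """Sends ranging signals and turns the returned distances into a position fix."""
    def __init__(self, beacons: List[PositioningBeaconModule]):
        self.beacons = beacons

    def current_position(self, ranges: List[float]) -> Tuple[float, float]:
        # could call the estimate_position() sketch shown earlier
        raise NotImplementedError

class PathPlanningModule:
    """Plans a route with a preset algorithm over the stored graph."""
    def __init__(self, storage: StorageModule):
        self.storage = storage

    def plan(self, start: str, destination: str) -> List[str]:
        # could call the plan_path() sketch shown earlier
        raise NotImplementedError
```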
In one embodiment, a computer device is provided, which may be a terminal whose internal structure may be as shown in FIG. 5. The computer device includes a processor, a memory, a communication interface, a display screen and an input device connected by a system bus. The processor of the computer device provides computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The communication interface of the computer device is used for wired or wireless communication with an external terminal; wireless communication can be realized through WIFI, a mobile cellular network, NFC (near field communication) or other technologies. The computer program, when executed by the processor, implements an indoor positioning and navigation method. The display screen of the computer device may be a liquid crystal display or an electronic ink display, and the input device of the computer device may be a touch layer covering the display screen, keys, a trackball or a touchpad on the housing of the computer device, or an external keyboard, touchpad or mouse.
It will be appreciated by those skilled in the art that the structure shown in FIG. 5 is merely a block diagram of part of the structure related to the solution of the present application and does not limit the computer device to which the solution is applied; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided comprising a memory and a processor, the memory having stored therein a computer program, the processor when executing the computer program performing the steps of:
step 100: and acquiring the starting point position information and the destination position information.
Step 200: and acquiring path information according to the starting point position information, the destination position information and a preset path planning algorithm.
Step 300: and acquiring current position information, and matching the current position information with the path information to judge whether the current position is positioned in a key area within a preset range of the destination position.
The specific step of judging whether the current position is located in the key area within the preset range of the destination position comprises the following steps:
step 310: acquiring beacon mark information of a positioning beacon closest to the position of the destination from the path information;
step 320: acquiring the distance between the current position and the positioning beacon according to the current position information and the beacon mark information, and judging whether the distance between the current position information and the positioning beacon information is smaller than a preset distance;
step 331: if yes, a key area of which the front position is positioned in a preset range of the destination position is indicated;
step 332: if not, the non-critical area of which the front position is outside the preset range of the destination position is indicated.
Step 400: if yes, acquiring current image information, and matching with preset key area image information;
the specific method for acquiring the current image information and matching with the preset key area image information comprises the following steps:
step 410: acquiring at least one key region image information corresponding to destination information, which is pre-stored in a storage space, according to the beacon mark information and the destination information;
step 420: acquiring characteristic information of the image information of the key area, and matching the acquired image information of at least one key area with the current image information one by one;
step 430: and acquiring the key area image information successfully matched.
Step 440: judging whether the key area image information successfully matched is unique;
step 441: if not, acquiring image acquisition point position information corresponding to the key area image information;
step 450: judging whether the acquisition point corresponding to the position information of the acquisition point is positioned on a path corresponding to the path information;
step 461: if not, discarding the key region image information corresponding to the acquisition point position information;
step 462: if yes, the key area image information corresponding to the position information of the acquisition point is used as the key area image information successfully matched with the current image information.
Step 500: and if the matching is successful, outputting path prompt information.
The specific mode of outputting the path prompt information comprises the following steps:
step 510: acquiring a corresponding region of a destination in the key region image information according to the destination information;
step 520: matching the region corresponding to the destination in the current image information according to the characteristics of the region corresponding to the destination in the key region image information;
step 530: and adding a preset indication mark into the current image information and outputting the current image information as path prompt information.
In one embodiment, a computer program product is provided comprising a computer program which, when executed by a processor, performs the steps of:
step 100: and acquiring the starting point position information and the destination position information.
Step 200: and acquiring path information according to the starting point position information, the destination position information and a preset path planning algorithm.
Step 300: and acquiring current position information, and matching the current position information with the path information to judge whether the current position is positioned in a key area within a preset range of the destination position.
The specific step of judging whether the current position is located in the key area within the preset range of the destination position comprises the following steps:
step 310: acquiring beacon mark information of a positioning beacon closest to the position of the destination from the path information;
step 320: acquiring the distance between the current position and the positioning beacon according to the current position information and the beacon mark information, and judging whether the distance between the current position information and the positioning beacon information is smaller than a preset distance;
step 331: if yes, a key area of which the front position is positioned in a preset range of the destination position is indicated;
step 332: if not, the non-critical area of which the front position is outside the preset range of the destination position is indicated.
Step 400: if yes, acquiring current image information, and matching with preset key area image information;
the specific method for acquiring the current image information and matching with the preset key area image information comprises the following steps:
step 410: acquiring at least one key region image information corresponding to destination information, which is pre-stored in a storage space, according to the beacon mark information and the destination information;
step 420: acquiring characteristic information of the image information of the key area, and matching the acquired image information of at least one key area with the current image information one by one;
step 430: and acquiring the key area image information successfully matched.
Step 440: judging whether the key area image information successfully matched is unique;
step 441: if not, acquiring image acquisition point position information corresponding to the key area image information;
step 450: judging whether the acquisition point corresponding to the position information of the acquisition point is positioned on a path corresponding to the path information;
step 461: if not, discarding the key region image information corresponding to the acquisition point position information;
step 462: if yes, the key area image information corresponding to the position information of the acquisition point is used as the key area image information successfully matched with the current image information.
Step 500: and if the matching is successful, outputting path prompt information.
The specific mode of outputting the path prompt information comprises the following steps:
step 510: acquiring a corresponding region of a destination in the key region image information according to the destination information;
step 520: matching the region corresponding to the destination in the current image information according to the characteristics of the region corresponding to the destination in the key region image information;
step 530: and adding a preset indication mark into the current image information and outputting the current image information as path prompt information.
It should be noted that, user information (including but not limited to user equipment information, user personal information, etc.) and data (including but not limited to data for analysis, stored data, presented data, etc.) referred to in the present application are information and data authorized by the user or sufficiently authorized by each party.
Those skilled in the art will appreciate that implementing all or part of the above-described methods may be accomplished by way of a computer program stored on a non-transitory computer-readable storage medium, which, when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, database or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, resistive random access memory (ReRAM), magnetoresistive random access memory (MRAM), ferroelectric random access memory (FRAM), phase change memory (PCM), graphene memory, and the like. Volatile memory can include random access memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM is available in a variety of forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM). The databases referred to in the embodiments provided herein may include at least one of relational databases and non-relational databases; the non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the embodiments provided herein may be general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic units, quantum-computing-based data processing logic units, and the like, without being limited thereto.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in a combination of these technical features, it should be considered to be within the scope of this specification.
The above embodiments represent only a few implementations of the present application, and although they are described in some detail, they are not to be construed as limiting the scope of the application. It should be noted that those of ordinary skill in the art may make various modifications and improvements without departing from the concept of the present application, all of which fall within its protection scope. Accordingly, the protection scope of the present application shall be subject to the appended claims.

Claims (4)

1. An indoor positioning and navigation method, the method comprising:
acquiring starting point position information and destination position information;
acquiring path information according to the starting point position information, the destination position information and a preset path planning algorithm;
acquiring current position information, and matching the current position information with path information to judge whether the current position is positioned in a key area within a preset range of a destination position;
if yes, acquiring current image information, and matching with preset key area image information;
if the matching is successful, outputting path prompt information, wherein,
the specific way for judging whether the current position is located in the key area within the preset range of the destination position comprises the following steps:
acquiring beacon mark information of a positioning beacon closest to the position of the destination from the path information;
acquiring the distance between the current position and the positioning beacon according to the current position information and the beacon mark information, and judging whether the distance between the current position information and the positioning beacon information is smaller than a preset distance;
if yes, a key area of which the current position is located in a preset range of the destination position is indicated;
if not, the non-key area of which the current position is outside the preset range of the destination position is indicated;
the specific ways for acquiring the current image information and matching with the preset key area image information comprise the following steps:
acquiring at least one key region image information corresponding to the destination information pre-stored in a storage space according to the beacon mark information and the destination information;
acquiring characteristic information of the image information of the key area, and matching the acquired image information of at least one key area with the current image information one by one;
acquiring key region image information successfully matched;
the specific way for obtaining the current image information and matching the current image information with the preset key area image information further comprises the following steps:
judging whether the key area image information successfully matched is unique;
if not, acquiring image acquisition point position information corresponding to the key area image information;
judging whether the acquisition point corresponding to the position information of the acquisition point is positioned on a path corresponding to the path information;
if not, discarding the key region image information corresponding to the acquisition point position information;
if yes, the key area image information corresponding to the position information of the acquisition point is used as the key area image information successfully matched with the current image information.
2. The method according to claim 1, wherein the specific manner of outputting the path prompt information includes:
acquiring a corresponding region of a destination in the key region image information according to the destination information;
matching the region corresponding to the destination in the current image information according to the characteristics of the region corresponding to the destination in the key region image information;
and adding a preset indication mark into the current image information and outputting the current image information as path prompt information.
3. An indoor positioning and navigation system employing the method of claim 1 or 2, characterized in that the indoor positioning and navigation system comprises:
the positioning beacon module is used for receiving the electromagnetic wave signals and feeding back distance information;
the mobile navigation module is used for sending out electromagnetic wave signals, receiving distance information fed back by the positioning beacon module and acquiring current position information according to the plurality of distance information;
the path planning module is used for acquiring the starting point position information and the destination position information and acquiring path information according to the acquired starting point position information, the destination position information and a preset path planning algorithm;
the storage module is used for storing a preset path planning algorithm and key area image information corresponding to the key area information.
4. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of claim 1 or 2 when executing the computer program.
CN202311608430.7A 2023-11-29 2023-11-29 Indoor positioning and navigation method, system and computer equipment Active CN117308966B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311608430.7A 2023-11-29 2023-11-29 Indoor positioning and navigation method, system and computer equipment


Publications (2)

Publication Number Publication Date
CN117308966A 2023-12-29
CN117308966B 2024-02-09

Family

ID=89281472


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104850563A (en) * 2014-02-18 2015-08-19 歌乐株式会社 Destination image comparison retrieval device, destination image comparison retrieval system and destination image comparison retrieval method
CN106097443A (en) * 2016-05-30 2016-11-09 南京林业大学 City indoor and outdoor integrated three-dimensional scenario building and spatially adaptive air navigation aid
CN107094319A (en) * 2016-02-17 2017-08-25 王庆文 A kind of high-precision indoor and outdoor fusion alignment system and method
CN111615056A (en) * 2020-04-08 2020-09-01 广州中海达卫星导航技术股份有限公司 Indoor and outdoor seamless switching positioning method and device, computer equipment and storage medium
CN112729313A (en) * 2020-12-28 2021-04-30 大连伟岸纵横科技股份有限公司 Indoor navigation positioning method, terminal and computer storage medium
CN112985419A (en) * 2021-05-12 2021-06-18 中航信移动科技有限公司 Indoor navigation method and device, computer equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107478237A (en) * 2017-06-29 2017-12-15 百度在线网络技术(北京)有限公司 Real scene navigation method, device, equipment and computer-readable recording medium




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant