CN116358523A - Indoor navigation method, device, electronic equipment and storage medium

Indoor navigation method, device, electronic equipment and storage medium

Info

Publication number
CN116358523A
Application number
CN202310325374.XA
Authority
CN (China)
Prior art keywords
navigation; destination; displaying; map; attribute
Priority / filing date
2023-03-29
Legal status
Pending
Other languages
Chinese (zh)
Inventors
陈浩然; 王晓彤; 林博文
Original and current assignee
Beijing Chengshi Wanglin Information Technology Co Ltd

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005: Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/20: Instruments for performing navigational calculations
    • G01C21/206: Instruments for performing navigational calculations specially adapted for indoor navigation
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The embodiment of the invention provides an indoor navigation method, an indoor navigation device, an electronic device and a storage medium, applied to the technical field of map navigation. The method comprises: in response to an identification instruction for an entity map identifier, displaying a three-dimensional map of the target floor associated with the entity map identifier, and displaying the current position of a navigation object in the three-dimensional map; in response to acquiring a destination to be reached, displaying a destination attribute for the destination, and displaying the target position corresponding to the destination in the three-dimensional map; and in response to a navigation instruction for the destination attribute, displaying a live-action image and at least one navigation guide for the destination, and displaying a navigation route from the current position to the target position in the three-dimensional map.

Description

Indoor navigation method, device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of map navigation, and more particularly, to an indoor navigation method, an indoor navigation apparatus, an electronic device, and a computer-readable storage medium.
Background
With the development of navigation technology, people routinely rely on navigation to choose routes when traveling, which helps them avoid congestion and wrong turns. However, indoor navigation still has significant shortcomings due to the limitations of venues and devices. When people enter a multi-storey building such as an office building or a shopping center, navigation software or in-building floor plans only show the layout of each floor; the user can only learn a rough position, is not guided to the corresponding location, and has to search on their own. Navigating from a single-dimensional viewing angle cannot provide good route guidance in indoor spaces with many rooms and complicated routes, so navigation efficiency is low, and because direction recognition cannot be performed from more viewing angles, the user experience is reduced.
Disclosure of Invention
The embodiment of the invention provides an indoor navigation method, an indoor navigation device, an electronic device and a computer-readable storage medium, which are used to solve, or partially solve, the problems of low navigation efficiency and single-dimensional navigation in indoor navigation.
The embodiment of the invention discloses an indoor navigation method, which comprises the following steps:
in response to an identification instruction for an entity map identifier, displaying a three-dimensional map of the target floor associated with the entity map identifier, and displaying the current position of a navigation object in the three-dimensional map;
in response to acquiring a destination to be reached, displaying a destination attribute for the destination, and displaying the target position corresponding to the destination in the three-dimensional map;
in response to a navigation instruction for the destination attribute, displaying a live-action image and at least one navigation guide for the destination, and displaying a navigation route from the current position to the target position in the three-dimensional map.
Optionally, the displaying a live-action image and at least one navigation guide for the destination includes:
displaying navigation prompt information for the destination and a live-action image corresponding to the target floor, and displaying a moving route in the live-action image.
Optionally, the displaying navigation prompt information for the destination includes:
displaying a direction guide identifier for turning from the moving path to which the navigation object currently belongs to the next moving path, and movement navigation prompt information for the destination.
Optionally, the displaying a moving route in the live-action image includes:
displaying, in the live-action image, a moving route that points from the moving path to which the navigation object currently belongs to the next moving path.
Optionally, the displaying a destination attribute for the destination includes:
displaying an attribute floating window corresponding to the destination, and displaying the destination attribute corresponding to the destination in the attribute floating window.
Optionally, the destination is a conference room located on the target floor, and the displaying, in the attribute floating window, a destination attribute corresponding to the destination includes:
displaying the conference attribute corresponding to the conference room in the attribute floating window;
wherein the conference attribute comprises at least one of position information, a conference theme, a conference time, a conference user identifier, a sign-in code and a conference room image.
Optionally, the attribute floating window further includes a navigation control, and the displaying a live-action image and at least one navigation guide for the destination in response to a navigation instruction for the destination attribute, and displaying a navigation route from the current position to the target position in the three-dimensional map, includes:
in response to a navigation instruction for the navigation control, switching the attribute floating window to a navigation floating window, wherein the navigation floating window at least comprises steering navigation prompt information, navigation information and an AR navigation control;
in response to a live-action navigation instruction for the AR navigation control, displaying a live-action image and at least one navigation guide for the destination, and displaying a navigation route from the current position to the target position in the three-dimensional map.
The embodiment of the invention also discloses an indoor navigation device, which comprises:
the map display module is used for displaying, in response to an identification instruction for an entity map identifier, a three-dimensional map of the target floor associated with the entity map identifier, and displaying the current position of a navigation object in the three-dimensional map;
the position display module is used for displaying, in response to acquiring a destination to be reached, a destination attribute for the destination, and displaying the target position corresponding to the destination in the three-dimensional map;
the navigation module is used for displaying, in response to a navigation instruction for the destination attribute, a live-action image and at least one navigation guide for the destination, and displaying a navigation route from the current position to the target position in the three-dimensional map.
Optionally, the navigation module is specifically configured to:
display navigation prompt information for the destination and a live-action image corresponding to the target floor, and display a moving route in the live-action image.
Optionally, the navigation module is specifically configured to:
display a direction guide identifier for turning from the moving path to which the navigation object currently belongs to the next moving path, and movement navigation prompt information for the destination.
Optionally, the navigation module is specifically configured to:
display, in the live-action image, a moving route that points from the moving path to which the navigation object currently belongs to the next moving path.
Optionally, the position display module is specifically configured to:
display an attribute floating window corresponding to the destination, and display the destination attribute corresponding to the destination in the attribute floating window.
Optionally, the destination is a conference room located on the target floor, and the position display module is specifically configured to:
display the conference attribute corresponding to the conference room in the attribute floating window;
wherein the conference attribute comprises at least one of position information, a conference theme, a conference time, a conference user identifier, a sign-in code and a conference room image.
Optionally, the attribute floating window further includes a navigation control, and the navigation module is specifically configured to:
in response to a navigation instruction for the navigation control, switch the attribute floating window to a navigation floating window, wherein the navigation floating window at least comprises steering navigation prompt information, navigation information and an AR navigation control;
in response to a live-action navigation instruction for the AR navigation control, display a live-action image and at least one navigation guide for the destination, and display a navigation route from the current position to the target position in the three-dimensional map.
The embodiment of the invention also discloses an electronic device, comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with one another through the communication bus;
the memory is used for storing a computer program;
the processor is configured to implement the method according to the embodiment of the present invention when executing the program stored in the memory.
Embodiments of the present invention also disclose a computer-readable storage medium having instructions stored thereon, which when executed by one or more processors, cause the processors to perform the method according to the embodiments of the present invention.
The embodiment of the invention has the following advantages:
in the embodiment of the invention, during indoor navigation, the terminal may, in response to an identification instruction for an entity map identifier, display a three-dimensional map of the target floor associated with the entity map identifier and display the current position of the navigation object in the three-dimensional map; and, in response to acquiring a destination to be reached, display a destination attribute for the destination and display the target position corresponding to the destination in the three-dimensional map. Because the three-dimensional map is obtained by identifying the entity map identifier, the navigation object can view the map information intuitively and comprehensively. When the navigation object inputs a navigation instruction for the destination, the terminal may, in response to the navigation instruction for the destination attribute, display a live-action image and at least one navigation guide for the destination, and display a navigation route from the current position to the target position in the three-dimensional map. By displaying the live-action image, at least one navigation guide and the navigation route in the three-dimensional map during indoor navigation, navigation is performed from multiple dimensions, providing good direction guidance and spatial-structure recognition that effectively assist the navigation object in moving to the indoor destination.
Drawings
FIG. 1 is a flow chart of steps of an indoor navigation method provided in an embodiment of the present invention;
FIG. 2 is a schematic illustration of an application interface provided in an embodiment of the present invention;
FIG. 3 is a schematic illustration of an application interface provided in an embodiment of the present invention;
FIG. 4 is a schematic illustration of an application interface provided in an embodiment of the present invention;
FIG. 5 is a schematic illustration of an application interface provided in an embodiment of the present invention;
FIG. 6 is a schematic diagram of an application interface provided in an embodiment of the present invention;
FIG. 7 is a schematic illustration of an application interface provided in an embodiment of the present invention;
FIG. 8 is a block diagram of an indoor navigation device provided in an embodiment of the present invention;
FIG. 9 is a block diagram of an electronic device provided in an embodiment of the present invention.
Detailed Description
In order that the above-recited objects, features and advantages of the present invention will become more readily apparent, a more particular description of the invention will be rendered by reference to the appended drawings and appended detailed description.
As an example, in a multi-storey building such as an office building or a shopping mall, navigation software or in-building floor plans only show the layout of each floor. The user can only learn a rough position, is not guided to the corresponding location, and has to search on their own; navigating from a single-dimensional viewing angle cannot provide good route guidance in indoor spaces with many rooms and complicated routes, so navigation efficiency is low, and because direction recognition cannot be performed from more viewing angles, the user experience is reduced. For example, an enterprise located on a campus often has multiple different meeting rooms in one building, and for objective reasons some users in the enterprise may not know where a meeting room is located, so indoor navigation is required. Likewise, the composition of shops in a shopping mall is complex, and a user who wants to reach a target shop faces the same problem; the related art cannot meet these indoor navigation requirements.
In this regard, one of the core ideas of the present invention is that, during indoor navigation, the terminal may, in response to an identification instruction for an entity map identifier, display a three-dimensional map of the target floor associated with the entity map identifier and display the current position of the navigation object in the three-dimensional map; and, in response to acquiring a destination to be reached, display a destination attribute for the destination and display the target position corresponding to the destination in the three-dimensional map. The corresponding three-dimensional map is obtained by identifying the entity map identifier, so that the navigation object can view the map information intuitively and comprehensively.
In order to enable those skilled in the art to better understand the technical solutions in the embodiments of the present invention, the following explains and describes some technical features related to the embodiments of the present invention:
The entity map identifier may be an identifier presented physically in the real world, for example a two-dimensional code or a bar code; by identifying the entity map identifier, the corresponding map data can be obtained and presented in the terminal.
A three-dimensional map may be a map displayed as a three-dimensional image.
A live-action image may be the real-scene content presented after the terminal's image sensor captures the surrounding environment; through the live-action image, the current environment can be presented intuitively, making navigation more intuitive.
A navigation guide may be navigation-related content displayed in the graphical user interface of the terminal; different navigation guides provide the user with navigation in different dimensions, giving the user good direction guidance and spatial-structure recognition.
A navigation object may be a user, i.e. the user holding the terminal that provides the navigation function.
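To make these defined terms concrete before the step-by-step description, the following is a minimal Kotlin sketch of how a terminal application might model them. Every name here (EntityMapIdentifier, FloorMap3D, NavigationObject, NavigationGuide) is an illustrative assumption for this sketch, not terminology or an API defined by the disclosure:

```kotlin
// Illustrative data model for the terms above; all names are assumptions.

// Entity map identifier: a physically presented code (e.g. a QR code) whose
// payload resolves to the map data of exactly one floor.
data class EntityMapIdentifier(val payload: String)

// Three-dimensional map data for one target floor, displayed as a 3D image.
data class FloorMap3D(
    val buildingId: String,
    val floor: Int,
    val modelUrl: String // assumed: server-hosted 3D model of the floor
)

// Navigation object: the user holding the terminal, positioned in the
// floor's local coordinate system.
data class NavigationObject(val x: Double, val y: Double, val floor: Int)

// Navigation guide: one piece of guidance content shown in the graphical
// user interface, e.g. a direction guide identifier or a movement prompt.
sealed interface NavigationGuide {
    data class DirectionGuide(val turn: String) : NavigationGuide // e.g. "right"
    data class MovementPrompt(val text: String) : NavigationGuide // e.g. "18 m to reference point xx"
}
```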
Referring to FIG. 1, a flowchart of the steps of an indoor navigation method provided in an embodiment of the present invention is shown; the method may specifically include the following steps:
step 101, responding to an identification instruction aiming at an entity map identifier, displaying a three-dimensional map corresponding to a target floor corresponding to the entity map identifier and displaying the current position of a navigation object in the three-dimensional map;
optionally, the embodiment of the invention can be applied to a mobile terminal, corresponding application programs can be run in the mobile terminal, and the application programs can provide corresponding navigation functions. For example, the application program may be a conventional navigation application, an office application, a corresponding navigation kinetic energy may be provided in the office application, etc., which is not limited by the present invention.
In a specific implementation, the terminal may, in response to an identification instruction for an entity map identifier, display a three-dimensional map of the target floor associated with the entity map identifier and display the current position of the navigation object in the three-dimensional map. For a building, corresponding building data can be constructed; according to the floor distribution, corresponding floor data can be built for each floor and associated with a corresponding entity map identifier, one entity map identifier being associated with one floor's data. The terminal can then, according to the corresponding entity map identifier, display the three-dimensional map of the associated target floor in the graphical user interface and display the current position of the navigation object in the three-dimensional map, that is, display in the three-dimensional map where the user currently is on the floor where the user is located.
As an example, the entity map identifier may be an identifier placed in the real world, for example a two-dimensional code. For an office building or a shopping center, a corresponding planar map can be placed on each floor, with a corresponding two-dimensional code on the planar map that is associated with that floor. When the user scans the two-dimensional code with the terminal, the terminal can obtain the map data associated with the scanned two-dimensional code from the corresponding server and render the corresponding three-dimensional map. Through the three-dimensional map, the spatial structure of the floor can be displayed more intuitively and comprehensively, making it easy for the user to distinguish directions and spaces.
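As a sketch of the scan-to-display flow just described (scan a two-dimensional code, resolve it to one floor's map data on a server, render the three-dimensional map with the current position), the following hypothetical handler builds on the data model above. The collaborating functions are injected as parameters because the disclosure does not specify any concrete networking or rendering API:

```kotlin
// Hypothetical scan-to-display flow for step 101; none of the collaborator
// names come from the patent text.
suspend fun onIdentifierScanned(
    identifier: EntityMapIdentifier,
    fetchFloorMap: suspend (payload: String) -> FloorMap3D, // assumed server lookup
    locate: (FloorMap3D) -> NavigationObject,               // assumed indoor positioning
    render: (FloorMap3D, NavigationObject) -> Unit          // assumed 3D map renderer
) {
    // One entity map identifier is associated with exactly one floor's data,
    // so the scanned payload acts as the lookup key for the target floor.
    val floorMap = fetchFloorMap(identifier.payload)

    // Display the three-dimensional map of the target floor and the current
    // position of the navigation object inside it.
    val position = locate(floorMap)
    render(floorMap, position)
}
```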
Step 102, in response to acquiring a destination to be reached, displaying a destination attribute for the destination and displaying the target position corresponding to the destination in the three-dimensional map;
in the embodiment of the invention, the destination may be a destination that already exists before the navigation object scans the entity map identifier, or a destination selected by the navigation object in the three-dimensional map. In the former case, while displaying the three-dimensional map, the terminal can simultaneously display the target position corresponding to the destination in the three-dimensional map and display the destination attribute corresponding to the destination in the graphical user interface. In the latter case, after displaying the three-dimensional map, the terminal can select the corresponding destination according to the operation of the navigation object in the three-dimensional map, display the target position corresponding to the destination in the three-dimensional map, and simultaneously display the destination attribute corresponding to the destination in the graphical user interface.
For displaying the destination attribute, the terminal may display an attribute floating window corresponding to the destination and display the destination attribute corresponding to the destination in the attribute floating window. In one example, assuming the building is an office building, the destination can be a conference room located on a corresponding floor, and the terminal can display the conference attribute corresponding to the conference room in the attribute floating window, where the conference attribute comprises at least one of position information, a conference theme, a conference time, a conference user identifier, a sign-in code and a conference room image. Assuming the building is a shopping center, the destination may be a shop located on a corresponding floor, and the terminal may display the shop attribute corresponding to the shop in the attribute floating window, where the shop attribute may comprise at least one of position information, a shop name, business hours, a shop type and a shop image; the present invention is not limited in this respect.
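The two attribute examples above (conference room and shop) suggest a small tagged data model for what the attribute floating window displays. The sketch below is one possible arrangement; the field names are assumptions, since the patent only enumerates the attribute kinds:

```kotlin
// Hypothetical destination attributes for the two examples in the text;
// field names are assumptions.
sealed interface DestinationAttribute

data class ConferenceAttribute(
    val location: String,           // position information
    val topic: String,              // conference theme
    val time: String,               // conference time
    val participants: List<String>, // conference user identifiers
    val signInCode: String,         // sign-in code
    val roomImageUrl: String        // conference room image
) : DestinationAttribute

data class ShopAttribute(
    val location: String,      // position information
    val name: String,          // shop name
    val businessHours: String, // business hours
    val shopType: String,      // shop type
    val imageUrl: String       // shop image
) : DestinationAttribute

// Display the attribute floating window for a destination, populated with
// whichever attribute kind matches the destination type.
fun showAttributeFloatingWindow(attribute: DestinationAttribute) {
    val rows = when (attribute) {
        is ConferenceAttribute -> listOf(attribute.topic, attribute.time, attribute.location)
        is ShopAttribute -> listOf(attribute.name, attribute.businessHours, attribute.location)
    }
    rows.forEach(::println) // stand-in for rendering rows in the floating window
}
```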
Step 103, in response to a navigation instruction for the destination attribute, displaying a live-action image and at least one navigation guide for the destination, and displaying a navigation route from the current position to the target position in the three-dimensional map.
In the embodiment of the invention, after the destination attribute corresponding to the destination is displayed and the navigation object inputs the corresponding navigation instruction, the terminal can respond to the navigation instruction for the destination attribute by displaying a live-action image and at least one navigation guide for the destination, and displaying a navigation route from the current position to the target position in the three-dimensional map. By displaying the live-action image, at least one navigation guide and the navigation route in the three-dimensional map during indoor navigation, navigation is performed from multiple dimensions, which provides the navigation object with good direction guidance and spatial-structure recognition, effectively assists the navigation object in moving to the destination, and greatly improves indoor navigation efficiency.
In a specific implementation, the attribute floating window corresponding to the destination displayed by the terminal can further include a navigation control, through which navigation to the destination can be triggered. Specifically, in response to a navigation instruction for the navigation control, the terminal can switch the attribute floating window to a navigation floating window that at least comprises steering navigation prompt information, navigation information and an AR navigation control. The navigation floating window can provide the navigation object with brief navigation information in text and image form: when the user is already familiar with the destination's location, the navigation prompt information and navigation information are enough to tell the user which destination to move to; when the user is not familiar with the destination's location, the user can, while learning which destination to move to, also trigger live-action navigation through the AR navigation control. The terminal can respond to the live-action navigation instruction for the AR navigation control by displaying a live-action image and at least one navigation guide for the destination, and displaying a navigation route from the current position to the target position in the three-dimensional map, so that multi-dimensional navigation is effectively realized during indoor navigation, navigation efficiency is improved, and the navigation object is given good direction guidance and spatial-structure recognition on the way to the destination.
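The window flow described here (attribute floating window, then navigation floating window, then AR live-action navigation) is essentially a three-state UI transition. A minimal sketch, building on the DestinationAttribute type above and using assumed state and control names, might look like this:

```kotlin
// Hypothetical three-state model of the floating-window flow.
sealed interface NavWindowState {
    // Attribute floating window: destination attributes plus a navigation control.
    data class AttributeWindow(val attribute: DestinationAttribute) : NavWindowState

    // Navigation floating window: brief text/image guidance plus an AR control.
    data class NavigationWindow(
        val steeringPrompt: String, // steering navigation prompt, e.g. "turn right"
        val navigationInfo: String  // e.g. "18 m to reference point xx"
    ) : NavWindowState

    // AR live-action navigation: live-action image, navigation guides, and the
    // navigation route in the three-dimensional map.
    object ArLiveActionNavigation : NavWindowState
}

// Tapping the navigation control switches the attribute window to the
// navigation window; tapping the AR navigation control then starts
// live-action navigation. Other taps leave the state unchanged.
fun onControlTapped(state: NavWindowState, control: String): NavWindowState = when {
    state is NavWindowState.AttributeWindow && control == "navigate" ->
        NavWindowState.NavigationWindow(
            steeringPrompt = "turn right", // in practice, supplied by route planning
            navigationInfo = "18 m to reference point xx"
        )
    state is NavWindowState.NavigationWindow && control == "ar" ->
        NavWindowState.ArLiveActionNavigation
    else -> state
}
```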
Optionally, for the display of the different navigation guides, the terminal may display navigation prompt information for the destination and a live-action image corresponding to the target floor, and display a moving route in the live-action image. The navigation prompt information may be used to prompt the moving direction, the remaining distance, etc. of the current position relative to the destination; for example, the terminal may display, in the graphical user interface, a direction guide identifier for turning from the moving path to which the navigation object currently belongs to the next moving path, together with movement navigation prompt information for the destination. In addition, the moving route displayed in the live-action image may be a moving route that points from the moving path to which the navigation object currently belongs to the next moving path. In this way, the turn between the current moving path and the next moving path can be prompted through the direction guide identifier, the positional relationship of the current position relative to the destination can be prompted through the navigation prompt information, and the user can be assisted in moving along the moving route. Navigation is thus realized in multiple dimensions, the navigation object is provided with good direction guidance and spatial-structure recognition, movement of the navigation object to the destination is effectively assisted, and indoor navigation efficiency is greatly improved.
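The disclosure does not specify how the direction guide identifier is derived from the current moving path and the next moving path; one plausible approach is to compare the headings of the two path segments. A minimal sketch under that assumption (planar floor coordinates with the y axis pointing up, and an assumed turn threshold):

```kotlin
import kotlin.math.PI
import kotlin.math.atan2

// A moving path segment in assumed planar floor coordinates (y axis up).
data class PathSegment(val x1: Double, val y1: Double, val x2: Double, val y2: Double)

// Derive a direction guide identifier ("left" / "right" / "straight") for the
// turn from the moving path the navigation object currently belongs to onto
// the next moving path, by comparing the two segment headings.
fun directionGuide(current: PathSegment, next: PathSegment): String {
    val headingCurrent = atan2(current.y2 - current.y1, current.x2 - current.x1)
    val headingNext = atan2(next.y2 - next.y1, next.x2 - next.x1)

    // Normalize the heading change to (-pi, pi].
    var delta = headingNext - headingCurrent
    while (delta <= -PI) delta += 2 * PI
    while (delta > PI) delta -= 2 * PI

    return when {
        delta > 0.3 -> "left"   // assumed ~17 degree threshold for announcing a turn
        delta < -0.3 -> "right"
        else -> "straight"
    }
}
```

With the y-up convention, a positive heading change selects a left turn; the threshold keeps small heading drift from being announced as a turn.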
In summary, in the embodiment of the invention, the corresponding three-dimensional map is obtained by identifying the entity map identifier, so that the navigation object can view the map information intuitively and comprehensively; and by displaying the live-action image, at least one navigation guide and the navigation route in the three-dimensional map during indoor navigation, navigation is performed from multiple dimensions, effectively assisting the navigation object in moving to the indoor destination with good direction guidance and spatial-structure recognition.
In order to enable those skilled in the art to better understand the technical solutions according to the embodiments of the present invention, the following description is given by way of example:
referring to fig. 2, which shows a schematic diagram of an application interface provided in an embodiment of the present invention, a user may scan an entity map identifier at a corresponding scan interface 20 of an application program, may acquire map data corresponding to the entity map identifier when the terminal identifies the corresponding entity map identifier in a scan window, may display the map interface in a graphical user interface based on the acquired map data in case that a corresponding destination already exists, may display a three-dimensional map 310 corresponding to the entity map identifier and an attribute floating window 320 corresponding to the destination in the map interface, and may display a current location and a location corresponding to the destination in the three-dimensional map 310 at the same time, for example, assuming that the user previously reserved a certain conference room for a corresponding conference, the terminal may output corresponding prompt information in the application 15 minutes before the meeting starts, and when the user inputs a navigation instruction for the prompt information, the terminal may present the interface shown in fig. 2 so that the user scans the entity map identifier through the terminal, after the scanning is completed, the application interface shown in fig. 3 may be displayed, the three-dimensional map corresponding to the floor where the terminal is located and the attribute floating window may be displayed in the interface, and the information such as the meeting theme, the location, the meeting time, the meeting participant, the sign-in code image, the navigation control 330 may be displayed in the attribute floating window, and meanwhile, the current location and the target location of the meeting room may be displayed in the three-dimensional map. It should be noted that, when no destination exists, that is, the user only wants to obtain the corresponding guide information through the entity map identifier, after identifying the corresponding entity map identifier, the terminal may display the corresponding guide prompt information in the graphical user interface, as shown in fig. 4, may display the corresponding guide floating window in the application interface, prompt the user about the floor where the user is currently located, and provide the corresponding guide control 410, so that the user triggers the corresponding guide function.
Taking FIG. 3 as an example, when the user triggers navigation to the conference room through the navigation control 330, referring to FIG. 5, the terminal may switch the attribute floating window to the navigation floating window 510. The navigation floating window 510 may provide corresponding navigation information and an AR navigation control 520, where the navigation information may include brief prompts such as "turn right", "18 m", "to reference point xx", as well as detailed text navigation information such as "start navigation; turn right; go straight through the right-side doorway after reaching xx; turn left; go straight after reaching xx; arrive at destination xx"; meanwhile, an image corresponding to the conference room may be displayed in the navigation floating window. When the user is already familiar with the conference room location, the navigation prompt information and the navigation information are enough to tell the user which conference room to move to; when the user is not familiar with the conference room location, the user can, while learning which destination to move to, also trigger live-action navigation through the AR navigation control. As shown in FIG. 6, when AR live-action navigation is triggered, the terminal may display in the graphical user interface the direction guide identifier 610 for turning from the user's current moving path to the next moving path, the movement navigation prompt floating window 620 (in which a movement prompt and location reference information, such as "go right" or "18 m to reference point xx", may be displayed), and the moving route 630 that points from the current moving path to the next moving path. By displaying the live-action image, at least one navigation guide and the navigation route in the three-dimensional map, navigation is performed from multiple dimensions, effectively assisting the navigation object in moving to the indoor destination.
In addition, after the user reaches the corresponding destination, the terminal may output corresponding prompt information in the application interface. For example, referring to FIG. 7, after the user reaches the corresponding conference room, the terminal may display a corresponding prompt floating window 710 in the application interface, in which corresponding prompt information such as "destination reached, remember to sign in" may be displayed, along with the corresponding sign-in code, etc.; the present invention is not limited in this respect.
It should be noted that, for simplicity of description, the method embodiments are shown as a series of acts, but it should be understood by those skilled in the art that the embodiments are not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred embodiments, and that the acts are not necessarily required by the embodiments of the invention.
Referring to FIG. 8, a block diagram of an indoor navigation device according to an embodiment of the present invention is shown; the device may specifically include the following modules:
a map display module 801, configured to display, in response to an identification instruction for an entity map identifier, a three-dimensional map of the target floor associated with the entity map identifier, and to display the current position of a navigation object in the three-dimensional map;
a position display module 802, configured to display, in response to acquiring a destination to be reached, a destination attribute for the destination, and to display the target position corresponding to the destination in the three-dimensional map;
a navigation module 803, configured to display, in response to a navigation instruction for the destination attribute, a live-action image and at least one navigation guide for the destination, and to display a navigation route from the current position to the target position in the three-dimensional map.
In an alternative embodiment, the navigation module 803 is specifically configured to:
display navigation prompt information for the destination and a live-action image corresponding to the target floor, and display a moving route in the live-action image.
In an alternative embodiment, the navigation module 803 is specifically configured to:
display a direction guide identifier for turning from the moving path to which the navigation object currently belongs to the next moving path, and movement navigation prompt information for the destination.
In an alternative embodiment, the navigation module 803 is specifically configured to:
display, in the live-action image, a moving route that points from the moving path to which the navigation object currently belongs to the next moving path.
In an alternative embodiment, the location display module 802 is specifically configured to:
display an attribute floating window corresponding to the destination, and display the destination attribute corresponding to the destination in the attribute floating window.
In an alternative embodiment, the destination is a conference room located on the target floor, and the position display module 802 is specifically configured to:
display the conference attribute corresponding to the conference room in the attribute floating window;
wherein the conference attribute comprises at least one of position information, a conference theme, a conference time, a conference user identifier, a sign-in code and a conference room image.
In an optional embodiment, the attribute floating window further includes a navigation control, and the navigation module 803 is specifically configured to:
in response to a navigation instruction for the navigation control, switch the attribute floating window to a navigation floating window, wherein the navigation floating window at least comprises steering navigation prompt information, navigation information and an AR navigation control;
in response to a live-action navigation instruction for the AR navigation control, display a live-action image and at least one navigation guide for the destination, and display a navigation route from the current position to the target position in the three-dimensional map.
For the device embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference is made to the description of the method embodiments for relevant points.
In addition, the embodiment of the invention also provides an electronic device, comprising a processor, a memory, and a computer program stored in the memory and executable on the processor. When executed by the processor, the computer program implements each process of the indoor navigation method embodiment described above and can achieve the same technical effect; to avoid repetition, the details are not repeated here.
The embodiment of the invention also provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements each process of the indoor navigation method embodiment described above and can achieve the same technical effect; to avoid repetition, the details are not repeated here. The computer-readable storage medium may be, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
FIG. 9 is a schematic diagram of the hardware structure of an electronic device implementing various embodiments of the present invention.
The electronic device 900 includes, but is not limited to: a radio frequency unit 901, a network module 902, an audio output unit 903, an input unit 904, a sensor 905, a display unit 906, a user input unit 907, an interface unit 908, a memory 909, a processor 910, and a power supply 911. It will be appreciated by those skilled in the art that the structure shown in FIG. 9 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than illustrated, combine some components, or arrange the components differently. In the embodiment of the invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer and the like.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 901 may be used for receiving and transmitting signals during information transmission and reception or during a call; specifically, it receives downlink data from a base station and passes the data to the processor 910 for processing, and it transmits uplink data to the base station. Typically, the radio frequency unit 901 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 901 may also communicate with networks and other devices via a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 902, such as helping the user to send and receive e-mail, browse web pages, and access streaming media, etc.
The audio output unit 903 may convert audio data received by the radio frequency unit 901 or the network module 902 or stored in the memory 909 into an audio signal and output as sound. Also, the audio output unit 903 may also provide audio output (e.g., a call signal reception sound, a message reception sound, etc.) related to a specific function performed by the electronic device 900. The audio output unit 903 includes a speaker, a buzzer, a receiver, and the like.
The input unit 904 is used to receive audio or video signals. The input unit 904 may include a graphics processor (Graphics Processing Unit, GPU) 9041 and a microphone 9042; the graphics processor 9041 processes image data of still pictures or video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 906. The image frames processed by the graphics processor 9041 may be stored in the memory 909 (or other storage medium) or transmitted via the radio frequency unit 901 or the network module 902. The microphone 9042 may receive sound and may process the sound into audio data. In a telephone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 901 and output.
The electronic device 900 also includes at least one sensor 905, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 9061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 9061 and/or the backlight when the electronic device 900 is moved to the ear. As one kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in all directions (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used for recognizing the attitude of the electronic device (such as landscape/portrait switching, related games, and magnetometer attitude calibration) and for vibration-recognition related functions (such as a pedometer and tapping). The sensor 905 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which are not described herein.
The display unit 906 is used to display information input by a user or information provided to the user. The display unit 906 may include a display panel 9061, and the display panel 9061 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 907 is operable to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 907 includes a touch panel 9071 and other input devices 9072. The touch panel 9071, also referred to as a touch screen, may collect touch operations by a user on or near it (such as operations of the user on or near the touch panel 9071 using any suitable object or accessory such as a finger or a stylus). The touch panel 9071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 910, and receives and executes commands sent by the processor 910. In addition, the touch panel 9071 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. The user input unit 907 may also include other input devices 9072 in addition to the touch panel 9071. Specifically, the other input devices 9072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 9071 may be overlaid on the display panel 9061, and when the touch panel 9071 detects a touch operation thereon or thereabout, the touch operation is transmitted to the processor 910 to determine a type of touch event, and then the processor 910 provides a corresponding visual output on the display panel 9061 according to the type of touch event. It will be appreciated that in one embodiment, the touch panel 9071 and the display panel 9061 are two independent components for implementing the input and output functions of the electronic device, but in some embodiments, the touch panel 9071 and the display panel 9061 may be integrated to implement the input and output functions of the electronic device, which is not limited herein.
The interface unit 908 is an interface to which an external device is connected to the electronic apparatus 900. For example, the external devices may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 908 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the electronic apparatus 900 or may be used to transmit data between the electronic apparatus 900 and an external device.
The memory 909 may be used to store software programs as well as various data. The memory 909 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, application programs (such as a sound playing function, an image playing function, etc.) required for at least one function, and the like; the storage data area may store data (such as audio data, phonebook, etc.) created according to the use of the handset, etc. In addition, the memory 909 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The processor 910 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 909, and calling data stored in the memory 909, thereby performing overall monitoring of the electronic device. Processor 910 may include one or more processing units; preferably, the processor 910 may integrate an application processor that primarily handles operating systems, user interfaces, applications, etc., with a modem processor that primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 910.
The electronic device 900 may also include a power supply 911 (e.g., a battery) for powering the various components, and the power supply 911 may preferably be logically coupled to the processor 910 by a power management system, such as to perform charge, discharge, and power consumption management functions.
In addition, the electronic device 900 includes some functional modules that are not shown, and will not be described herein.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) comprising instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) to perform the method according to the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the above-described embodiments, which are merely illustrative and not restrictive. In light of the present invention, those of ordinary skill in the art may devise many further forms without departing from the spirit of the present invention and the scope of the claims, and all of these fall within the protection of the present invention.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a usb disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk, etc.
The foregoing is merely illustrative of the present invention, and the present invention is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present invention. Therefore, the protection scope of the invention is subject to the protection scope of the claims.

Claims (10)

1. An indoor navigation method, comprising:
in response to an identification instruction for an entity map identifier, displaying a three-dimensional map of the target floor associated with the entity map identifier, and displaying the current position of a navigation object in the three-dimensional map;
in response to acquiring a destination to be reached, displaying a destination attribute for the destination, and displaying the target position corresponding to the destination in the three-dimensional map;
in response to a navigation instruction for the destination attribute, displaying a live-action image and at least one navigation guide for the destination, and displaying a navigation route from the current position to the target position in the three-dimensional map.
2. The method of claim 1, wherein the displaying a live-action image and at least one navigation guide for the destination comprises:
displaying navigation prompt information for the destination and a live-action image corresponding to the target floor, and displaying a moving route in the live-action image.
3. The method of claim 2, wherein the displaying navigation prompt information for the destination comprises:
displaying a direction guide identifier for turning from the moving path to which the navigation object currently belongs to the next moving path, and movement navigation prompt information for the destination.
4. The method of claim 2, wherein the displaying a moving route in the live-action image comprises:
displaying, in the live-action image, a moving route that points from the moving path to which the navigation object currently belongs to the next moving path.
5. The method of claim 1, wherein the displaying a destination attribute for the destination comprises:
displaying an attribute floating window corresponding to the destination, and displaying the destination attribute corresponding to the destination in the attribute floating window.
6. The method of claim 5, wherein the destination is a conference room located on the target floor, and the displaying a destination attribute corresponding to the destination in the attribute floating window comprises:
displaying the conference attribute corresponding to the conference room in the attribute floating window;
wherein the conference attribute comprises at least one of position information, a conference theme, a conference time, a conference user identifier, a sign-in code and a conference room image.
7. The method of claim 5, wherein the attribute floating window further comprises a navigation control, and the displaying a live-action image and at least one navigation guide for the destination in response to a navigation instruction for the destination attribute, and displaying a navigation route from the current position to the target position in the three-dimensional map, comprises:
in response to a navigation instruction for the navigation control, switching the attribute floating window to a navigation floating window, wherein the navigation floating window at least comprises steering navigation prompt information, navigation information and an AR navigation control;
in response to a live-action navigation instruction for the AR navigation control, displaying a live-action image and at least one navigation guide for the destination, and displaying a navigation route from the current position to the target position in the three-dimensional map.
8. An indoor navigation device, comprising:
a map display module, configured to display, in response to an identification instruction for an entity map identifier, a three-dimensional map of the target floor associated with the entity map identifier, and to display the current position of a navigation object in the three-dimensional map;
a position display module, configured to display, in response to acquiring a destination to be reached, a destination attribute for the destination, and to display the target position corresponding to the destination in the three-dimensional map;
a navigation module, configured to display, in response to a navigation instruction for the destination attribute, a live-action image and at least one navigation guide for the destination, and to display a navigation route from the current position to the target position in the three-dimensional map.
9. An electronic device comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with each other via the communication bus;
the memory is used for storing a computer program;
the processor is configured to implement the method according to any one of claims 1-7 when executing a program stored on the memory.
10. A computer-readable storage medium having instructions stored thereon, which when executed by one or more processors, cause the processors to perform the method of any of claims 1-7.
Application CN202310325374.XA, filed 2023-03-29 (priority date 2023-03-29): Indoor navigation method, device, electronic equipment and storage medium. Status: Pending. Published as CN116358523A.

Priority Applications (1)

CN202310325374.XA, priority date 2023-03-29, filing date 2023-03-29: Indoor navigation method, device, electronic equipment and storage medium

Publications (1)

CN116358523A (en)

Family

ID=86914775

Family Applications (1)

CN202310325374.XA (pending): Indoor navigation method, device, electronic equipment and storage medium

Country Status (1)

CN: CN116358523A (en)


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination