CN111758016A - Image navigation method, device, equipment and readable storage medium

Image navigation method, device, equipment and readable storage medium

Info

Publication number
CN111758016A
CN111758016A (application number CN202080001068.5A)
Authority
CN
China
Prior art keywords
image
navigation
guide
live
path
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202080001068.5A
Other languages
Chinese (zh)
Inventor
陈尊裕
吴沛谦
张仲文
吴珏其
胡斯洋
陈欣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fengtuzhi Technology Holding Co.,Ltd.
Original Assignee
Fengtu Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fengtu Technology Co ltd filed Critical Fengtu Technology Co ltd
Publication of CN111758016A publication Critical patent/CN111758016A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/953Querying, e.g. by the use of web search engines
    • G06F16/9537Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/953Querying, e.g. by the use of web search engines
    • G06F16/9538Presentation of query results

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)

Abstract

The invention discloses an image navigation method, which includes: acquiring a navigation path and determining a plurality of guide nodes in the navigation path; acquiring a live-action guide image corresponding to each guide node according to the navigation path; and forming a live-action guide image set from the live-action guide images and outputting the set. The method first determines the navigation path, acquires the live-action guide images according to the guide nodes in the path, generates the live-action guide image set, and outputs it, so that the user can determine the direction of travel from the set and navigation is achieved. Because no real-time position needs to be acquired and no planned route needs to be determined from a real-time position, interference caused by an inaccurately acquired real-time position is avoided, which improves navigation accuracy and the navigation effect. The invention further provides an image navigation apparatus, an image navigation device, and a computer-readable storage medium, which have the same beneficial effects.

Description

Image navigation method, device, equipment and readable storage medium
Technical Field
The present invention relates to the field of navigation technologies, and in particular, to an image navigation method, an image navigation apparatus, an image navigation device, and a computer-readable storage medium.
Background
Navigation is a function frequently used in daily life. In the related art, the real-time position of a user is continuously acquired by means of a positioning function and displayed on a map, and the user learns the route and direction of travel from a planned route to the destination displayed on the map, thereby achieving navigation. However, in areas with dense buildings, satellite signals are subject to interference, and the limited signal strength means that neither the positioning accuracy nor the heading of the user can be guaranteed; inaccurate positioning causes the planned navigation route to deviate and misleads the user. In indoor environments, indoor positioning technology is not yet widespread, satellite signal strength is further weakened, and the problem of inaccurate navigation is even more prominent. The related art therefore suffers from poor navigation accuracy and a poor navigation effect.
Therefore, how to solve the problem of poor navigation accuracy and navigation effect in the related art is a technical problem to be solved by those skilled in the art.
Disclosure of Invention
In view of the above, the present invention provides an image navigation method, an image navigation apparatus, an image navigation device, and a computer readable storage medium, which solve the problem of poor navigation accuracy and poor navigation effect in the related art.
In order to solve the above technical problem, the present invention provides an image navigation method, including:
acquiring a navigation path and determining a plurality of guide nodes in the navigation path;
acquiring a live-action guidance image corresponding to each guidance node according to the navigation path;
and forming a live-action guide image set by using the live-action guide images, and outputting the live-action guide image set.
Optionally, the obtaining of the live-action guidance image corresponding to each guidance node according to the navigation path includes:
acquiring a panoramic image corresponding to the guide node;
extracting guide information corresponding to the navigation path;
and marking the panoramic image by using the guide information to obtain the live-action guide image.
Optionally, the obtaining of the live-action guidance image corresponding to each guidance node according to the navigation path includes:
extracting guide information corresponding to the navigation path;
and screening the pre-marked images in the database according to the guide information to obtain the live-action guide image.
Optionally, the obtaining of the live-action guidance image corresponding to each guidance node according to the navigation path includes:
extracting guide information corresponding to the navigation path;
sending the guide information and the node information corresponding to the guide node to a server;
and receiving the live-action guide image sent by the server.
Optionally, the forming a set of live-action guidance images by using the live-action guidance images and outputting the set of live-action guidance images includes:
sequencing all the live-action guide images according to the guide information to obtain a live-action guide image set;
and outputting the live-action guide image set.
Optionally, the obtaining the navigation path includes:
determining a path starting point and a path end point, and generating a preselected path according to the path starting point and the path end point;
and determining the navigation path from the pre-selected paths according to a path selection rule.
Optionally, the determining a starting point of the path includes:
acquiring target position information, and determining the starting point of the path by using the target position information;
and/or,
acquiring and analyzing a starting point selection instruction to obtain an interest point;
and determining the starting point of the path by using the interest point information corresponding to the interest point.
Optionally, the determining a plurality of guide nodes in the navigation path includes:
determining a guidance level;
and filtering preset guide nodes corresponding to the navigation path according to the guidance level to obtain a plurality of guide nodes.
Optionally, after the outputting the set of live-action guidance images, further comprising:
acquiring and analyzing an editing instruction to obtain editing information;
and editing the live-action guide image set according to the editing information.
The present invention also provides an image navigation apparatus, comprising:
the navigation system comprises a path acquisition module, a navigation module and a navigation module, wherein the path acquisition module is used for acquiring a navigation path and determining a plurality of guide nodes in the navigation path;
the image acquisition module is used for acquiring the live-action guidance images corresponding to the guidance nodes according to the navigation path;
and the output module is used for forming a real-scene guide image set by using the real-scene guide images and outputting the real-scene guide image set.
The present invention also provides an image navigation device comprising a memory and a processor, wherein:
the memory is used for storing a computer program;
the processor is configured to execute the computer program to implement the image navigation method.
The invention also provides a computer-readable storage medium for storing a computer program, wherein the computer program, when executed by a processor, implements the image navigation method described above.
The image navigation method provided by the invention obtains a navigation path and determines a plurality of guide nodes in the navigation path; acquiring a live-action guidance image corresponding to each guidance node according to the navigation path; and forming a live-action guide image set by using the live-action guide images, and outputting the live-action guide image set.
In the method, the navigation path is determined first, and the live-action guide images are acquired according to the guide nodes in the navigation path; each live-action guide image carries guide information and is used to generate a live-action guide image set. After the live-action guide image set is generated, it is output, so that the user can determine the direction of travel from the set and the navigation function is realized. Because the method neither acquires a real-time position during navigation nor determines a planned route from a real-time position, interference caused by an inaccurately acquired real-time position is avoided, navigation accuracy and the navigation effect are improved, the range of scenarios in which navigation can be applied is expanded, and the problems of poor navigation accuracy and poor navigation effect in the related art are solved. Moreover, because the live-action guide image set replaces a flat abstract map and a planned route, the user can match the surrounding environment to the live-action guide images, which further improves the navigation effect.
In addition, the invention also provides an image navigation device, an image navigation device and a computer readable storage medium, which also have the beneficial effects.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the provided drawings without creative efforts.
FIG. 1 is a flowchart of an image navigation method according to an embodiment of the present invention;
FIG. 2 is a flowchart of a method for acquiring a live-action guide image according to an embodiment of the present invention;
FIG. 3 is a flowchart of another method for acquiring a live-action guide image according to an embodiment of the present invention;
FIG. 4 is a flowchart of yet another method for acquiring a live-action guide image according to an embodiment of the present invention;
FIG. 5 is a flowchart of a specific image navigation method according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of an image navigation apparatus according to an embodiment of the present invention;
FIG. 7 is a schematic structural diagram of an image navigation device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The navigation function helps a user determine a route to a target location and the direction of travel, and is therefore frequently used in daily life. To guide the user in real time, the related art generally acquires the user's position in real time and displays the heading and a planned route on a map so as to guide the user forward. However, in areas with dense buildings or indoors, positioning signals such as satellite signals may be attenuated, reducing positioning accuracy. Inaccurate positioning causes the planned navigation route to deviate and misleads the user. The related art therefore suffers from poor navigation accuracy and a poor navigation effect.
To solve the above problems, the present invention provides an image navigation method and a corresponding apparatus, device, and readable storage medium. After the navigation route is determined, the corresponding live-action guide images are acquired, a live-action guide image set is constructed from them, and the user is guided by outputting this set. Because the user's position does not need to be acquired in real time during navigation, and navigation does not depend on a real-time position, the loss of navigation accuracy caused by inaccurate positioning is avoided and the navigation effect is improved.
In particular, in one possible implementation, please refer to fig. 1. Fig. 1 is a flowchart of an image navigation method according to an embodiment of the present invention, where the method includes:
S101: And acquiring the navigation path, and determining a plurality of guide nodes in the navigation path.
It should be noted that, some or all of the steps of the image navigation method provided by the present invention may be executed by a designated device or terminal, and the device or terminal may be referred to as the present device, and a specific form of the present device is not limited in this embodiment, and may be, for example, a mobile phone, a tablet computer, or a computer.
The navigation path is the path from a starting point to an end point that the user wants to travel along; its specific content depends on the path starting point and the path end point. The manner of acquiring the navigation path is not limited in this embodiment; for example, it may be generated according to path information input by the user, or, when a path message is received, the message may be parsed to obtain the navigation path directly, where the path message may be sent by another device or terminal.
Guide nodes exist on the navigation path; a guide node is a position node representing a position at which a live-action guide image is acquired. The specific content of a guide node is not limited in this embodiment, and it may be, for example, the starting point of the navigation path, the end point of the navigation path, a turning point of the navigation path, or an intersection of the navigation path with another route. A guide node corresponds to the navigation path and necessarily lies on it. There are a plurality of guide nodes; the specific number is not limited in this embodiment and may be set according to the actual situation.
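For illustration only (this sketch is not part of the original disclosure), the following Python code shows one way guide nodes could be selected from a navigation path given as a list of latitude/longitude waypoints: the start point, the end point, and every waypoint where the heading changes noticeably are kept. The turning-angle threshold, the coordinates, and all names are assumptions made for the example.

```python
import math

def bearing(p, q):
    """Approximate compass bearing in degrees from point p to point q, each (lat, lon)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    dlon = lon2 - lon1
    y = math.sin(dlon) * math.cos(lat2)
    x = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def select_guide_nodes(path, turn_threshold_deg=30.0):
    """Pick guide nodes from a navigation path given as a list of (lat, lon) waypoints.

    The path starting point, the path end point, and every waypoint where the heading
    changes by more than turn_threshold_deg (a turning point) become guide nodes.
    """
    if len(path) < 2:
        return list(path)
    nodes = [path[0]]                      # path starting point
    for i in range(1, len(path) - 1):
        before = bearing(path[i - 1], path[i])
        after = bearing(path[i], path[i + 1])
        turn = abs((after - before + 180.0) % 360.0 - 180.0)
        if turn >= turn_threshold_deg:     # significant direction change -> guide node
            nodes.append(path[i])
    nodes.append(path[-1])                 # path end point
    return nodes

if __name__ == "__main__":
    route = [(22.3363, 114.1478), (22.3370, 114.1478), (22.3370, 114.1490), (22.3380, 114.1490)]
    print(select_guide_nodes(route))
```

In practice the guide nodes would more likely come from preset, pre-annotated nodes on the map, as the level-based filtering in step S504 below suggests; the geometric selection here only illustrates the idea.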
S102: and acquiring the live-action guide image corresponding to each guide node according to the navigation path.
After the guide nodes are determined, the live-action guide image corresponding to each guide node is obtained so as to guide the user. The live-action guide image is a live-action image carrying guide information; the guide information may include the user's forward direction, forward path, or manner of advancing, and may take the form of a text description, a static or dynamic graphic indication, and the like, which is not limited in this embodiment. It should be noted that, since the navigation path has directionality, that is, it runs from the path starting point to the path end point, the live-action guide image corresponding to each guide node is acquired according to the navigation path, so that the direction of the obtained live-action guide image is consistent with the direction of the navigation path.
The live-action guide image is based on a live-action image, i.e., an image that records the real environment. In this embodiment, since the user's real-time position is not obtained, a live-action guide image based on a live-action image is needed so that the user can recognize his or her current position and determine the heading direction. Furthermore, because the guide image shows the real scene, users with a poor sense of direction or a limited ability to read abstract maps can locate themselves more easily and match the current environment to the live-action guide image, which further improves the navigation effect.
S103: and forming a live-action guide image set by using the live-action guide images, and outputting the live-action guide image set.
And after obtaining a plurality of live-action guide images, forming a live-action guide image set by using the live-action guide images, and navigating the user by outputting the live-action guide image set. Specifically, the embodiment does not limit the specific way of outputting the live-action guide image set, and for example, the live-action guide image set may be output through a preset port and output to a display screen for display; or the data can be output to a position corresponding to a preset path for storage according to the preset path; or may be sent to other devices or terminals, i.e., the set of live-action guide images is output to the other devices or terminals.
By applying the image navigation method provided by the embodiment of the invention, the navigation path is determined first, and the live-action guide images are acquired according to the guide nodes in the navigation path; each live-action guide image carries guide information and is used to generate the live-action guide image set. After the set is generated, it is output, so that the user can determine the direction of travel from it and the navigation function is realized. Because the method neither acquires a real-time position during navigation nor determines a planned route from a real-time position, interference caused by an inaccurately acquired real-time position is avoided, navigation accuracy and the navigation effect are improved, the range of scenarios in which navigation can be applied is expanded, and the problems of poor navigation accuracy and poor navigation effect in the related art are solved. Moreover, because the live-action guide image set replaces a flat abstract map and a planned route, the user can match the surrounding environment to the live-action guide images, which further improves the navigation effect.
Based on the above embodiment, in one possible implementation, the live-action image may be marked according to the guide information of the navigation path to obtain the live-action guide image. Referring to fig. 2, fig. 2 is a flowchart of a method for acquiring a live-action guide image according to an embodiment of the present invention, including:
S201: And acquiring a panoramic image corresponding to the guide node.
A panoramic image is an image that captures as much of the surrounding environment as possible by means of wide-angle imaging. In the present embodiment, the live-action guide image is generated using the panoramic image as the live-action image. The panoramic image allows the user to observe the 360-degree real environment around the guide node, which achieves a better navigation effect. After the guide nodes are determined, the panoramic image corresponding to each guide node needs to be acquired. The panoramic images may be stored locally or in the cloud, for example on a server; a correspondence may be established between each guide node and its panoramic image, and the panoramic image corresponding to a guide node is then acquired according to that correspondence.
S202: and extracting guide information corresponding to the navigation path.
After the panoramic image is acquired, the guidance information corresponding to the navigation path needs to be extracted, and specifically, the guidance information may be extracted from the navigation information corresponding to the navigation path. The guide information includes direction information and the like, and a live-action guide image matched with the navigation path can be obtained according to the guide information.
S203: and marking the panoramic image by using the guide information to obtain a live-action guide image.
After the panoramic image is obtained, it is taken as the live-action image and is marked with the guide information to obtain the live-action guide image. The specific marking method is not limited in this embodiment; for example, the live-action image may be marked with an arrow, text, or an animation.
In one possible embodiment, in order to reduce the file size of the live-action guide image, the panoramic image may be sliced. Specifically, since the panoramic image records the entire 360-degree real scene around the guide node, it can be segmented according to the guide information to obtain a live-action image matched with the navigation path, thereby reducing the file size of the finally generated live-action guide image. For example, when the guide information indicates advancing southward from guide node A, the panoramic image corresponding to guide node A may be segmented; the specific segmentation method is not limited in this embodiment and may be set according to actual needs. For example, the southward portion of the panoramic image may be cut out and determined as the live-action image; or the direction from which the user reaches guide node A may be combined with the southward direction, for example, if the user reaches guide node A from the east, the image covering the southeast of guide node A may be cut out and determined as the live-action image.
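As an illustrative sketch only (not taken from the disclosure), the following Python code cuts a directional view out of an equirectangular panorama using Pillow. The assumption that the panorama's left edge faces north, the 90-degree field of view, and the file name in the example are all made up.

```python
from PIL import Image  # Pillow; assumed available for this sketch

def crop_panorama(pano: Image.Image, heading_deg: float, fov_deg: float = 90.0) -> Image.Image:
    """Cut the part of an equirectangular 360-degree panorama facing heading_deg.

    Assumes the panorama's horizontal axis maps linearly to compass heading,
    with 0 degrees (north) at the left edge. Handles the wrap-around at 360.
    """
    width, height = pano.size
    px_per_deg = width / 360.0
    left = int(((heading_deg - fov_deg / 2.0) % 360.0) * px_per_deg)
    crop_w = int(fov_deg * px_per_deg)
    if left + crop_w <= width:
        return pano.crop((left, 0, left + crop_w, height))
    # The requested window wraps past the right edge: stitch two pieces together.
    right_part = pano.crop((left, 0, width, height))
    left_part = pano.crop((0, 0, crop_w - (width - left), height))
    stitched = Image.new(pano.mode, (crop_w, height))
    stitched.paste(right_part, (0, 0))
    stitched.paste(left_part, (width - left, 0))
    return stitched

# Example: a south-facing (180 degrees) view for guide node A (hypothetical file name)
# view = crop_panorama(Image.open("node_a_panorama.jpg"), heading_deg=180.0)
```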
In another possible implementation, in order to obtain the live-action guide image more quickly, pre-marked images may be screened using the guide information. Referring to fig. 3, fig. 3 is a flowchart of another method for acquiring a live-action guide image according to an embodiment of the present invention, including:
S301: And extracting guide information corresponding to the navigation path.
In this embodiment, in order to acquire the live-action guidance image matching the navigation path, it is also necessary to extract guidance information corresponding to the navigation path.
S302: and screening the pre-marked images in the database according to the guide information to obtain the live-action guide image.
The database is preset in the embodiment, and can be a local database or a cloud database. The database stores pre-marked images, and each guide node corresponds to a plurality of pre-marked images. And after the guide information is obtained, screening the pre-marked image of the guide node by using the guide information, and determining the screened pre-marked image as the live-action guide image. Specifically, when only one pre-marked image passes the screening, the pre-marked image is directly determined as the live-action guide image; when there are a plurality of screened images, the live-action guidance image may be selected according to a preset selection rule, and the specific content of the preset selection rule is not limited in this embodiment, and may be, for example, a random selection rule or a polling selection rule.
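A minimal sketch of this screening step follows, assuming a local SQLite table of pre-marked images keyed by guide node and marked heading; the schema, column names, and tolerance are illustrative, not from the patent.

```python
import random
import sqlite3

def pick_premarked_image(db_path: str, node_id: str, required_heading: float, tolerance: float = 20.0):
    """Screen the pre-marked images of one guide node by the required heading.

    Assumes a table premarked_images(node_id TEXT, heading REAL, image_path TEXT);
    the schema and column names are invented for this example.
    """
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(
            "SELECT image_path, heading FROM premarked_images WHERE node_id = ?",
            (node_id,),
        ).fetchall()
    finally:
        conn.close()
    # Keep images whose marked heading matches the guide information within a tolerance.
    matches = [path for path, heading in rows
               if abs((heading - required_heading + 180.0) % 360.0 - 180.0) <= tolerance]
    if not matches:
        return None
    # Several candidates may pass the screening; here a random selection rule is used.
    return random.choice(matches)
```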
In another possible implementation, in order to acquire the real-scene guide image more quickly and reduce the computing power requirement of the device, a server with stronger computing power may be used to acquire the real-scene guide image. Referring to fig. 4, fig. 4 is a flowchart of another real-scene guidance image method according to an embodiment of the present invention, including:
S401: And extracting guide information corresponding to the navigation path.
In this embodiment, in order to acquire the live-action guidance image matching the navigation path, it is also necessary to extract guidance information corresponding to the navigation path.
S402: and sending the guide information and the node information corresponding to the guide node to a server.
And after the guide information is acquired, the guide information and the node information corresponding to the guide node are sent to the server side together, so that the server side determines the live-action guide image according to the guide information and the node information. The embodiment does not limit the specific way of determining the live-action guide image by the server, and for example, the way described in fig. 2 or fig. 3 may be adopted. Because the server side has stronger computing power, the live-action guide image can be determined more quickly, and meanwhile, the computing power requirement on the equipment is reduced.
S403: and receiving the live-action guide image sent by the server.
After the live-action guide image is determined, the server sends the live-action guide image, and the equipment or the terminal can receive the live-action guide image sent by the server and perform subsequent operation by using the live-action guide image.
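The exchange with the server might look like the sketch below, which assumes an HTTP interface; the endpoint path, request fields, and use of the requests library are hypothetical and only stand in for whatever protocol an implementation actually uses.

```python
import requests  # assumed available; the endpoint and field names below are hypothetical

def fetch_guide_image(server_url: str, node_id: str, heading: float, out_path: str) -> str:
    """Send the guide information and node information to the server and
    receive the live-action guide image it returns."""
    payload = {"node_id": node_id, "heading": heading}
    resp = requests.post(f"{server_url}/guide-images", json=payload, timeout=10)
    resp.raise_for_status()
    with open(out_path, "wb") as f:
        f.write(resp.content)  # the server is assumed to return raw image bytes
    return out_path

# Example (hypothetical server address and node):
# fetch_guide_image("https://example.com/api", node_id="A", heading=180.0, out_path="node_a.jpg")
```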
Based on the above embodiment, in one possible implementation, different guidance levels may be set, and the strength of the navigation guidance is determined according to the guidance levels. Referring to fig. 5, fig. 5 is a flowchart of a specific image navigation method according to an embodiment of the present invention, including:
S501: And determining a path starting point and a path end point, and generating a preselected path according to the path starting point and the path end point.
Before determining the navigation path, a path starting point and a path end point need to be determined in order to generate a preselected path from the path starting point to the path end point. A preselected path is a feasible path between the path starting point and the path end point; the specific number of preselected paths is not limited in this embodiment.
The present embodiment does not limit the determination method of the route starting point and the route ending point, and specifically, the following method may be adopted to determine the route starting point:
S5011: And acquiring target position information, and determining a path starting point by using the target position information.
In this embodiment, the target position information may be directly acquired and taken as the starting point of the path. The specific content of the target location information may be latitude and longitude information, and the obtaining method is not limited in this embodiment, and for example, the target location information may be directly input by a user, or the target location information may be extracted by parsing a location instruction, and the location instruction may be sent by other devices or terminals.
Correspondingly, the method in the step S5011 may also be used to determine the path end point, and this embodiment of the specific process is not described again.
In another possible implementation, the path starting point may be determined by using an interest point on the map, specifically:
S5012: Acquiring and analyzing a starting point selection instruction to obtain an interest point.
The starting point selection instruction is used for selecting an interest point, i.e., a POI (Point of Interest), which may be any point on the map, for example a location around a bus station or around a store. The generation manner of the starting point selection instruction is not limited in this embodiment; for example, the user may click on the map and the instruction is generated from the clicked position, or the user may enter the content to be searched in a search box and the instruction is generated from that content. After the starting point selection instruction is obtained, it is parsed to obtain the interest point.
S5013: and determining a path starting point by using the interest point information corresponding to the interest point.
After the interest point is determined, the start point of the path is determined by using the corresponding interest point information, and the interest point information can be latitude and longitude information. The method can enable the user to determine the starting point of the path more conveniently.
Correspondingly, the path end point may also be determined by using the methods in the step S5012 and the step S5013, and details of this embodiment of the process are not described again.
It should be noted that this embodiment does not limit which of the two methods is adopted to determine the route starting point; only one of them may be used, or both may be used at the same time, and the two methods may also be applied respectively to the route starting point and the route end point. For example, the method in step S5011 may be used to determine the route starting point while the method in steps S5012 and S5013 is used to determine the route end point; alternatively, the route starting point may be determined by the method in steps S5012 and S5013 and the route end point by the method in step S5011.
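As a toy illustration (not part of the disclosure), the sketch below resolves a path endpoint either from explicit position information or from an interest point looked up in a small POI table; the table contents and function names are invented for the example.

```python
# A toy POI table standing in for a map/POI database; contents are made up for illustration.
POI_TABLE = {
    "bus station": (22.3350, 114.1460),
    "bookstore": (22.3391, 114.1502),
}

def resolve_endpoint(target_position=None, poi_name=None):
    """Determine a path starting point (or end point).

    Either the target position information (latitude, longitude) is used directly,
    or a starting point selection instruction naming an interest point is resolved
    through the POI table. Exactly one of the two arguments should be given.
    """
    if target_position is not None:
        return tuple(target_position)       # use the position information directly
    if poi_name is not None:
        return POI_TABLE[poi_name.lower()]  # use the interest point information
    raise ValueError("either target_position or poi_name must be provided")

# The start and end points may be resolved by different means, e.g.:
start = resolve_endpoint(target_position=(22.3363, 114.1478))
end = resolve_endpoint(poi_name="bookstore")
```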
S502: and determining a navigation path from the pre-selected paths according to the path selection rule.
After the preselected paths are obtained, they are evaluated according to a path selection rule and the navigation path is finally determined. The specific content of the path selection rule is not limited in this embodiment and may be set according to the actual situation. For example, when there is only one preselected path, it is directly determined as the navigation path; when there are a plurality of preselected paths, the shortest preselected path may be determined as the navigation path, or the path with the fewest turns, or the path with the best road conditions.
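For example, a path selection rule could be applied as in the following sketch, which assumes each preselected path is a list of coordinate waypoints; the planar length approximation and the use of interior waypoints as a proxy for turns are simplifications made only for illustration (a road-condition rule is omitted because it would need external data).

```python
import math

def path_length(path):
    """Rough length of a path given as (lat, lon) waypoints (planar approximation)."""
    return sum(math.dist(path[i], path[i + 1]) for i in range(len(path) - 1))

def count_turns(path):
    """Number of interior waypoints, used here as a stand-in for the number of turns."""
    return max(len(path) - 2, 0)

def choose_navigation_path(preselected, rule="shortest"):
    """Apply a path selection rule to the preselected paths."""
    if len(preselected) == 1:          # only one candidate: take it directly
        return preselected[0]
    if rule == "shortest":
        return min(preselected, key=path_length)
    if rule == "fewest_turns":
        return min(preselected, key=count_turns)
    raise ValueError(f"unknown path selection rule: {rule}")
```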
S503: A guidance level is determined.
In this embodiment, a guidance level is set to meet the requirements of different users: more detailed guidance can be provided for users with a poor sense of direction, while rougher guidance can be provided for users with a good sense of direction, reducing the number of live-action guide images. The specific form of the guidance level is not limited in this embodiment and may be, for example, a number or a letter. Different guidance levels correspond to different guidance strengths, and the guidance strength is reflected in the number of guide nodes: the more guide nodes, the stronger the guidance; the fewer guide nodes, the weaker the guidance.
S504: And filtering preset guide nodes corresponding to the navigation path according to the guidance level to obtain a plurality of guide nodes.
After the guidance level is determined, the preset guide nodes corresponding to the navigation path are filtered according to it; the preset guide nodes are all of the guide nodes on the navigation path. The specific filtering method is not limited in this embodiment. For example, the preset guide nodes may be filtered according to the node types associated with each guidance level, where the node types may include intersection nodes, turning points, fork points, roadside nodes, and the like, or each guide node may be classified according to a classification rule into several types, such as first-type nodes or class-A nodes; alternatively, the preset guide nodes may be filtered according to the number of guide nodes corresponding to each guidance level. After the preset guide nodes are filtered, the guide nodes are obtained.
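The level-based filtering could, for instance, be realized as below; the mapping from guidance levels to node types and the node dictionaries are assumptions made for the sketch.

```python
# Node types kept at each guidance level; the mapping is illustrative only.
LEVEL_NODE_TYPES = {
    1: {"start", "end"},                                   # weakest guidance
    2: {"start", "end", "turn"},
    3: {"start", "end", "turn", "fork", "intersection"},   # strongest guidance
}

def filter_guide_nodes(preset_nodes, guidance_level):
    """Filter the preset guide nodes of a navigation path by guidance level.

    preset_nodes is a list of dicts like {"id": "A", "type": "turn", ...};
    nodes whose type is not kept at the chosen level are dropped.
    """
    kept_types = LEVEL_NODE_TYPES[guidance_level]
    return [node for node in preset_nodes if node["type"] in kept_types]

nodes = [
    {"id": "A", "type": "start"},
    {"id": "B", "type": "intersection"},
    {"id": "C", "type": "turn"},
    {"id": "D", "type": "end"},
]
print([n["id"] for n in filter_guide_nodes(nodes, guidance_level=2)])  # ['A', 'C', 'D']
```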
S505: and acquiring the live-action guide image corresponding to each guide node according to the navigation path.
For details of this step, reference may be made to the step S102 and the descriptions in the foregoing embodiments, which are not described herein again.
S506: and sequencing all the live-action guide images according to the guide information to obtain a live-action guide image set.
And after all the live-action guide images are obtained, sorting the live-action guide images according to the guide information. Because the navigation path has directionality, the process of the user passing through each guide node is ordered, and the order is matched with the guide information, so that the live-action guide images are sorted according to the guide information, and a live-action guide image set can be obtained.
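A minimal sketch of this sorting step, assuming each live-action guide image record carries the sequence number of its guide node in its guide information; the field names are illustrative.

```python
def build_guide_image_set(guide_images):
    """Sort live-action guide images into a live-action guide image set.

    Each entry is assumed to carry, in its guide information, the sequence number
    of its guide node along the navigation path, so sorting by that number
    reproduces the order in which the user will pass the nodes.
    """
    return sorted(guide_images, key=lambda img: img["guide_info"]["sequence"])

image_set = build_guide_image_set([
    {"path": "node_c.jpg", "guide_info": {"sequence": 2, "instruction": "turn left"}},
    {"path": "node_a.jpg", "guide_info": {"sequence": 0, "instruction": "head south"}},
    {"path": "node_b.jpg", "guide_info": {"sequence": 1, "instruction": "go straight"}},
])
print([img["path"] for img in image_set])  # ['node_a.jpg', 'node_b.jpg', 'node_c.jpg']
```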
S507: and outputting the live-action guide image set.
The embodiment does not limit the output manner of the real-scene guidance image set, and for example, the real-scene guidance image set may be output to a display screen, and displayed by the display screen, or may be output to a target path for storage, so that a user may view the image.
In a possible implementation manner, after the live-action guidance image set is obtained, the live-action guidance image set may be edited according to an editing instruction, specifically:
S508: And acquiring and analyzing the editing instruction to obtain editing information.
The editing instruction may be input by the user or generated according to some operation of the user, or may be transmitted by other devices or terminals. After the editing instruction is obtained, the editing instruction is analyzed, so that the editing information can be obtained, and the specific content of the editing information is not limited in this embodiment.
S509: and editing the live-action guide image set according to the editing information.
The embodiment does not limit the specific editing object of the editing information, which may be a real-scene guide image set, or may be one or more real-scene guide images in the image set, and the edited content may include deletion or position transformation of the real-scene guide images, or some information may be added, deleted, or modified on one or more real-scene guide images.
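As an illustration, editing information could be applied to the image set as in the sketch below; the supported actions (delete, move, annotate) and field names are assumptions, since the patent leaves the specific content of the editing information open.

```python
def apply_edit(image_set, edit_info):
    """Apply one piece of editing information to a live-action guide image set.

    image_set is a list of dicts; the actions handled here (deleting an image,
    moving it to a new position, attaching a note) are examples only.
    """
    action = edit_info["action"]
    index = edit_info["index"]
    if action == "delete":
        del image_set[index]
    elif action == "move":
        image_set.insert(edit_info["to"], image_set.pop(index))
    elif action == "annotate":
        image_set[index].setdefault("notes", []).append(edit_info["text"])
    else:
        raise ValueError(f"unsupported edit action: {action}")
    return image_set

images = [{"path": "node_a.jpg"}, {"path": "node_b.jpg"}, {"path": "node_c.jpg"}]
apply_edit(images, {"action": "move", "index": 2, "to": 1})                          # reorder
apply_edit(images, {"action": "annotate", "index": 0, "text": "landmark: red mailbox"})
```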
It should be noted that the steps S508 and S509 are not essential steps, and are not necessarily executed.
In the following, the image navigation apparatus provided by the embodiment of the present invention is introduced, and the image navigation apparatus described below and the image navigation method described above may be referred to correspondingly.
Referring to fig. 6, fig. 6 is a schematic structural diagram of an image navigation apparatus according to an embodiment of the present invention, including:
a path obtaining module 610, configured to obtain a navigation path and determine a plurality of guidance nodes in the navigation path;
an image obtaining module 620, configured to obtain a live-action guidance image corresponding to each guidance node according to the navigation path;
the output module 630 is configured to utilize the live-action guide images to form a live-action guide image set, and output the live-action guide image set.
Optionally, the image acquiring module 620 includes:
the panoramic image acquisition unit is used for acquiring a panoramic image corresponding to the guide node;
the first extraction unit is used for extracting the guide information corresponding to the navigation path;
and the marking unit is used for marking the panoramic image by using the guide information to obtain the live-action guide image.
Optionally, the image acquiring module 620 includes:
the second extraction unit is used for extracting the guide information corresponding to the navigation path;
and the screening unit is used for screening the pre-marked images in the database according to the guide information to obtain the live-action guide image.
Optionally, the image acquiring module 620 includes:
the third extraction unit is used for extracting the guide information corresponding to the navigation path;
the sending unit is used for sending the guide information and the node information corresponding to the guide node to the server;
and the receiving unit is used for receiving the live-action guide image sent by the server.
Optionally, the output module 630 includes:
the sorting unit is used for sorting all the live-action guide images according to the guide information to obtain a live-action guide image set;
and the output unit is used for outputting the live-action guide image set.
Optionally, the path obtaining module 610 includes:
the pre-selection path acquisition unit is used for determining a path starting point and a path end point and generating a pre-selection path according to the path starting point and the path end point;
and the navigation path determining unit is used for determining the navigation path from the pre-selected paths according to the path selection rule.
Optionally, the preselected path acquiring unit includes:
the first determining subunit is used for acquiring target position information and determining a path starting point by using the target position information;
and/or,
the interest point acquisition subunit is used for acquiring and analyzing the starting point selection instruction to obtain an interest point;
and the second determining subunit is used for determining the starting point of the path by using the interest point information corresponding to the interest point.
Optionally, the path obtaining module 610 includes:
a level determination unit for determining a guidance level;
and the guide node filtering unit is used for filtering the preset guide nodes corresponding to the navigation path according to the guidance level to obtain a plurality of guide nodes.
Optionally, the method further comprises:
the editing information acquisition module is used for acquiring and analyzing the editing instruction to obtain editing information;
and the editing module is used for editing the live-action guide image set according to the editing information.
By applying the image navigation apparatus provided by the embodiment of the invention, the navigation path is determined first, and the live-action guide images are acquired according to the guide nodes in the navigation path; each live-action guide image carries guide information and is used to generate the live-action guide image set. After the set is generated, it is output, so that the user can determine the direction of travel from it and the navigation function is realized. Because the apparatus neither acquires a real-time position during navigation nor determines a planned route from a real-time position, interference caused by an inaccurately acquired real-time position is avoided, navigation accuracy and the navigation effect are improved, the range of scenarios in which navigation can be applied is expanded, and the problems of poor navigation accuracy and poor navigation effect in the related art are solved. Moreover, because the live-action guide image set replaces a flat abstract map and a planned route, the user can match the surrounding environment to the live-action guide images, which further improves the navigation effect.
In the following, the image navigation device provided by the embodiment of the present invention is introduced; the image navigation device described below and the image navigation method described above may be referred to correspondingly.
Referring to fig. 7, fig. 7 is a schematic structural diagram of an image navigation device according to an embodiment of the present invention. The image navigation device 700 may include a processor 701 and a memory 702, and may further include one or more of a multimedia component 703, an information input/information output (I/O) interface 704, and a communication component 705.
The processor 701 is configured to control the overall operation of the image navigation device 700 so as to complete all or part of the steps of the image navigation method. The memory 702 is used to store various types of data to support operation of the image navigation device 700; such data can include, for example, instructions for any application or method operating on the image navigation device 700, as well as application-related data. The memory 702 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as one or more of Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
The multimedia component 703 may include a screen and an audio component. The screen may be, for example, a touch screen, and the audio component is used for outputting and/or inputting audio signals. For example, the audio component may include a microphone for receiving external audio signals; the received audio signal may further be stored in the memory 702 or transmitted through the communication component 705. The audio component also includes at least one speaker for outputting audio signals. The I/O interface 704 provides an interface between the processor 701 and other interface modules, such as a keyboard, a mouse, or buttons, where the buttons may be virtual or physical. The communication component 705 is used for wired or wireless communication between the image navigation device 700 and other devices. The wireless communication may be, for example, Wi-Fi, Bluetooth, Near Field Communication (NFC), 2G, 3G, or 4G, or a combination of one or more of them; accordingly, the communication component 705 may include a Wi-Fi module, a Bluetooth module, and an NFC module.
The image navigation Device 700 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors or other electronic components, and is configured to perform the image navigation method according to the above embodiments.
In the following, the computer-readable storage medium provided by the embodiment of the present invention is introduced, and the computer-readable storage medium described below and the image navigation method described above may be referred to correspondingly.
The present invention also provides a computer-readable storage medium having a computer program stored thereon, which, when executed by a processor, implements the steps of the image navigation method described above.
The computer-readable storage medium may include various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same or similar parts among the embodiments are referred to each other. The device disclosed by the embodiment corresponds to the method disclosed by the embodiment, so that the description is simple, and the relevant points can be referred to the method part for description.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), memory, Read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
Finally, it should also be noted that, herein, relational terms such as first and second are used only to distinguish one entity or operation from another and do not necessarily require or imply any actual relationship or order between such entities or operations. Moreover, the terms "comprise", "include", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that includes a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus.
The image navigation method, the image navigation apparatus, the image navigation device and the computer readable storage medium provided by the present invention are described in detail above, and a specific example is applied in the present document to explain the principle and the implementation of the present invention, and the description of the above embodiment is only used to help understanding the method and the core idea of the present invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (12)

1. An image navigation method, comprising:
acquiring a navigation path and determining a plurality of guide nodes in the navigation path;
acquiring a live-action guidance image corresponding to each guidance node according to the navigation path;
and forming a live-action guide image set by using the live-action guide images, and outputting the live-action guide image set.
2. The image navigation method according to claim 1, wherein the obtaining of the live-action guidance image corresponding to each guidance node according to the navigation path includes:
acquiring a panoramic image corresponding to the guide node;
extracting guide information corresponding to the navigation path;
and marking the panoramic image by using the guide information to obtain the live-action guide image.
3. The image navigation method according to claim 1, wherein the obtaining of the live-action guidance image corresponding to each guidance node according to the navigation path includes:
extracting guide information corresponding to the navigation path;
and screening the pre-marked images in the database according to the guide information to obtain the live-action guide image.
4. The image navigation method according to claim 1, wherein the obtaining of the live-action guidance image corresponding to each guidance node according to the navigation path includes:
extracting guide information corresponding to the navigation path;
sending the guide information and the node information corresponding to the guide node to a server;
and receiving the live-action guide image sent by the server.
5. The image navigation method according to any one of claims 2 to 4, wherein the forming a set of live-action guidance images from the live-action guidance images and outputting the set of live-action guidance images comprises:
sequencing all the live-action guide images according to the guide information to obtain a live-action guide image set;
and outputting the live-action guide image set.
6. The image navigation method according to claim 1, wherein the obtaining of the navigation path includes:
determining a path starting point and a path end point, and generating a preselected path according to the path starting point and the path end point;
and determining the navigation path from the pre-selected paths according to a path selection rule.
7. The image navigation method according to claim 6, wherein the determining a path start point comprises:
acquiring target position information, and determining the starting point of the path by using the target position information;
and/or,
acquiring and analyzing a starting point selection instruction to obtain an interest point;
and determining the starting point of the path by using the interest point information corresponding to the interest point.
8. The image navigation method according to claim 1, wherein the determining a plurality of guide nodes in the navigation path includes:
determining a guidance level;
and filtering preset guide nodes corresponding to the navigation path according to the guidance level to obtain a plurality of guide nodes.
9. The image navigation method according to claim 1, further comprising, after the outputting the set of live-action guidance images:
acquiring and analyzing an editing instruction to obtain editing information;
and editing the live-action guide image set according to the editing information.
10. An image navigation apparatus, comprising:
the navigation system comprises a path acquisition module, a navigation module and a navigation module, wherein the path acquisition module is used for acquiring a navigation path and determining a plurality of guide nodes in the navigation path;
the image acquisition module is used for acquiring the live-action guidance images corresponding to the guidance nodes according to the navigation path;
and the output module is used for forming a real-scene guide image set by using the real-scene guide images and outputting the real-scene guide image set.
11. An image navigation device, comprising a memory and a processor, wherein:
the memory is used for storing a computer program;
the processor for executing the computer program to implement the image navigation method according to any one of claims 1 to 9.
12. A computer-readable storage medium for storing a computer program, wherein the computer program, when executed by a processor, implements the image navigation method according to any one of claims 1 to 9.
CN202080001068.5A 2020-05-11 2020-05-11 Image navigation method, device, equipment and readable storage medium Pending CN111758016A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/089552 WO2021226779A1 (en) 2020-05-11 2020-05-11 Method, device, and equipment for image navigation, and readable storage medium

Publications (1)

Publication Number Publication Date
CN111758016A true CN111758016A (en) 2020-10-09

Family

ID=72713471

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080001068.5A Pending CN111758016A (en) 2020-05-11 2020-05-11 Image navigation method, device, equipment and readable storage medium

Country Status (2)

Country Link
CN (1) CN111758016A (en)
WO (1) WO2021226779A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114646320B (en) * 2022-02-09 2023-04-28 江苏泽景汽车电子股份有限公司 Path guiding method and device, electronic equipment and readable storage medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101097144B (en) * 2006-06-30 2011-10-19 佛山市顺德区顺达电脑厂有限公司 Navigation system having realistic display and method thereof
CN101451852B (en) * 2008-12-19 2012-01-04 华为终端有限公司 Navigation equipment and navigation method
US8639440B2 (en) * 2010-03-31 2014-01-28 International Business Machines Corporation Augmented reality shopper routing
JP2018054456A (en) * 2016-09-29 2018-04-05 アイシン・エィ・ダブリュ株式会社 Route guidance system, and route guidance program
CN106595641A (en) * 2016-12-29 2017-04-26 深圳前海弘稼科技有限公司 Travelling navigation method and device
CN107730970B (en) * 2017-02-14 2021-03-02 西安艾润物联网技术服务有限责任公司 Parking lot entrance and exit live-action display method and device
CN110617832A (en) * 2019-10-15 2019-12-27 天津津航计算技术研究所 Enhanced live-action aided navigation method

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101246018A (en) * 2008-03-14 2008-08-20 凯立德欣技术(深圳)有限公司 Road indication method, device and navigator supporting image
US20170089714A1 (en) * 2015-09-29 2017-03-30 Xiaomi Inc. Navigation method and device
CN105371847A (en) * 2015-10-27 2016-03-02 深圳大学 Indoor live-action navigation method and system
CN107631726A (en) * 2017-09-05 2018-01-26 上海博泰悦臻网络技术服务有限公司 Information processing/indoor navigation method, medium, terminal, server and communication network
CN110702138A (en) * 2018-07-10 2020-01-17 上海擎感智能科技有限公司 Navigation path live-action preview method and system, storage medium and vehicle-mounted terminal
CN111044061A (en) * 2018-10-12 2020-04-21 腾讯大地通途(北京)科技有限公司 Navigation method, device, equipment and computer readable storage medium
CN109612484A (en) * 2018-12-13 2019-04-12 睿驰达新能源汽车科技(北京)有限公司 A kind of path guide method and device based on real scene image
CN110553651A (en) * 2019-09-26 2019-12-10 众虎物联网(广州)有限公司 Indoor navigation method and device, terminal equipment and storage medium

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113465601A (en) * 2021-05-13 2021-10-01 上海师范大学 Indoor navigation based on visual path

Also Published As

Publication number Publication date
WO2021226779A1 (en) 2021-11-18

Similar Documents

Publication Publication Date Title
US20210409747A1 (en) Geospatial Media Recording System
JP5315424B2 (en) Image display control apparatus and control method thereof
CN102227611B (en) For providing the method and apparatus of the cursor of context data in instruction drawing application
CN106679665B (en) Route planning method and device
US20040162669A1 (en) Method of controlling display of point information on map
CN112665601A (en) Path planning method and device, electronic equipment and readable storage medium
US10846804B2 (en) Electronic business card exchange system and method using mobile terminal
CN111758016A (en) Image navigation method, device, equipment and readable storage medium
EP3244166B1 (en) System and method for identifying socially relevant landmarks
KR101615504B1 (en) Apparatus and method for serching and storing contents in portable terminal
CN103547887A (en) Navigation system with assistance for making multiple turns in a short distance
JP5615858B2 (en) Route search system and route search method
JP2007058088A (en) Information symbol mapping device
CN118392152A (en) Image generation method, device, equipment and storage medium
US20150178361A1 (en) Information processing apparatus, information processing terminal, computer program product, and information processing method
KR20150037104A (en) Point of interest update method, apparatus and system based crowd sourcing
CN108763414B (en) Live-action display method and device, terminal equipment and storage medium
AU2017277679B2 (en) Location integration into electronic mail system
JP5746911B2 (en) Facility search system along route and facility search method along route
KR100637739B1 (en) Geography information guidance system and the method which use the quality and a data of the electronic map , and the store device which records a method
KR102143239B1 (en) Apparatus for prviding online course curating service
JP5980172B2 (en) Display control apparatus and control method thereof
WO2008015803A1 (en) Map information processing device, navigation system, and program
JP2007271404A (en) Terminal, server, system and program for destination predicting information
JP2007271357A (en) Positional information terminal, positional information server, positional information system, and positional information program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210929

Address after: P.O. Box 4519, 30 de Castro street, 1 Wickham Island, road town of Tortola, British Virgin Islands

Applicant after: Fengtuzhi Technology Holding Co.,Ltd.

Address before: Room 901, Cheung Sha Wan building, 909 Cheung Sha Wan Road, Lai Chi Kok

Applicant before: Fengtu Technology Co.,Ltd.