CN117109603B - POI updating method and navigation server - Google Patents

POI updating method and navigation server

Info

Publication number
CN117109603B
CN117109603B (application CN202310185184.2A)
Authority
CN
China
Prior art keywords
navigation server, image, pose, navigation, POI
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310185184.2A
Other languages
Chinese (zh)
Other versions
CN117109603A (en)
Inventor
刘小伟
宋肖肖
曹鹏蕊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202310185184.2A priority Critical patent/CN117109603B/en
Publication of CN117109603A publication Critical patent/CN117109603A/en
Application granted granted Critical
Publication of CN117109603B publication Critical patent/CN117109603B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The application provides a POI updating method and a navigation server, relates to the field of data processing, and can improve POI updating efficiency. The method comprises the following steps: the navigation server receives a first image from an electronic device, the first image being an image of a first location captured by the electronic device during AR navigation; the navigation server estimates a first pose of the electronic device from the first image and a VPS map; when the navigation server determines that the first pose is abnormal, it recognizes the first image to obtain a first attribute feature of the first location; the navigation server acquires the movement track and shooting direction of the electronic device during AR navigation and determines historical POI information of the first location from them; and when the navigation server determines that the content of a first historical attribute feature in the historical POI information differs from the content of the first attribute feature, it updates the historical POI information with the first attribute feature.

Description

POI updating method and navigation server
Technical Field
The application relates to the field of data processing, in particular to a POI updating method and a navigation server.
Background
A point of interest (Point of Interest, POI) is a useful record of a place on a map, consisting of its address, geographical coordinates and some additional attributes such as name, category and rating. POIs serve as the basis on which an electronic device provides location search, navigation and similar functions to the user. Through a POI service, a search for a specific location can be provided to the user, who can then be directed to that destination. For example, a POI service may search for and screen restaurants based on preference or satisfaction information and then navigate the user to a restaurant for a meal. Therefore, the accuracy and timeliness of POIs are of great significance.
With rapid economic development, information about certain places changes quickly, such as the specific information of shops in a mall. To give users a better experience, the POIs corresponding to such shop scenes need to be updated continuously. However, POI data is currently updated manually, which is error-prone, labor-intensive, time-consuming and inefficient, and this inevitably degrades the experience of using the POI service.
Disclosure of Invention
The embodiment of the application provides a POI updating method and a navigation server, which can improve POI updating efficiency.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical scheme:
In a first aspect, the present application provides a POI update method applied to a navigation server. The method comprises the following steps: the navigation server receives a first image from an electronic device, the first image being an image of a first location captured by the electronic device during Augmented Reality (AR) navigation; the navigation server estimates a first pose of the electronic device from the first image and a Visual Positioning Service (VPS) map, the VPS map including images captured at different shooting angles at each of a plurality of different locations; when the navigation server determines that the first pose is abnormal, it recognizes the first image to obtain a first attribute feature of the first location; the navigation server acquires the movement track and shooting direction of the electronic device during AR navigation and determines historical POI information of the first location from them, the historical POI information including all historical attribute features of the first location; and when the navigation server determines that the content of a first historical attribute feature in the historical POI information differs from the content of the first attribute feature, it updates the historical POI information with the first attribute feature, the first historical attribute feature being of the same type as the first attribute feature.
The technical solution provided by the embodiment of the application is particularly suitable for AR navigation on a mobile phone. When a user requests VPS positioning during AR navigation with a mobile phone, the navigation server can estimate the first pose from the first image of the first location captured by the phone's camera. If the first pose is determined to be abnormal, it is considered not to be the phone's current actual pose. Since the first pose is derived by the navigation server from the first image and the VPS map, this also indicates that the first image is abnormal: the first image differs from the image of the first location in the VPS map, which suggests that the POI information of the first location may have changed. In this case, the navigation server can obtain the first attribute feature by recognizing the first image and compare it with the historical attribute features in the historical POI information of the first location, so as to determine whether the historical POI information has actually changed. If the comparison shows that the POI information of the first location has truly changed, the historical POI information can be updated according to the first attribute feature. With this solution, whether the POI of the location captured by the phone has changed can be judged in real time during AR navigation, and if so, the POI is updated promptly. The entire process requires no dedicated operation to be performed merely because a POI needs updating.
Compared with the existing manual POI updating approach, this POI updating method is more accurate and more efficient, and it guarantees the authenticity, accuracy and timeliness of POI information.
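As a non-authoritative illustration of the comparison-and-update step described in the first aspect, the logic can be sketched in Python. All names here (`PoiRecord`, `update_poi`, the sample attribute values) are hypothetical and not part of the patent text:

```python
from dataclasses import dataclass, field


@dataclass
class PoiRecord:
    # Historical POI information: maps an attribute type (e.g. "name",
    # "category") to its stored content for the first location.
    attributes: dict = field(default_factory=dict)


def update_poi(history: PoiRecord, recognized: dict) -> bool:
    """Compare each recognized attribute against the historical attribute of
    the same type; update the history on mismatch. Returns True if any
    attribute was updated."""
    changed = False
    for attr_type, content in recognized.items():
        if history.attributes.get(attr_type) != content:
            history.attributes[attr_type] = content
            changed = True
    return changed


# Example: the shop at the first location was renamed, so the recognized
# name differs from the historical one while the category is unchanged.
history = PoiRecord({"name": "Cafe A", "category": "restaurant"})
changed = update_poi(history, {"name": "Cafe B", "category": "restaurant"})
```

Only the attribute of the same type is compared, mirroring the claim that the first historical attribute feature and the first attribute feature are of the same type.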
In one possible design of the first aspect, the navigation server estimating the first pose of the electronic device from the first image and the Visual Positioning Service (VPS) map includes: the navigation server acquires the movement track of the electronic device during AR navigation and determines the current first position of the electronic device from it; the navigation server determines a first sub-VPS map corresponding to the first position in the VPS map; the navigation server retrieves from the first sub-VPS map a target place image matching the first image; the navigation server extracts the feature points of the target place image and matches them against the first image to obtain the target feature points successfully matched with the first image; and the navigation server determines the first pose from the target feature points and the pose information of the target place image.
Based on this technical solution, the navigation server can smoothly estimate the first pose of the mobile phone from the first image and its stored VPS map, providing data support for the subsequent steps of the POI updating method.
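The image-retrieval step above (finding the target place image in the first sub-VPS map) can be sketched as a nearest-neighbor search over global image descriptors. This is a minimal illustration under stated assumptions; the descriptor values, map entries and function name are invented for the example, and a real VPS would use learned descriptors and a full feature-point/PnP stage afterwards:

```python
import math


def retrieve_target_image(query_descriptor, sub_vps_map):
    """Return the place image in the sub-VPS map whose global descriptor is
    closest (Euclidean distance) to the query image's descriptor."""
    return min(
        sub_vps_map,
        key=lambda img: math.dist(img["descriptor"], query_descriptor),
    )


# Hypothetical sub-VPS map: each entry holds a global descriptor and the
# pose (x, y, heading) at which the place image was captured.
sub_vps_map = [
    {"descriptor": (0.9, 0.1), "pose": (10.0, 5.0, 90.0)},
    {"descriptor": (0.2, 0.8), "pose": (12.0, 7.0, 180.0)},
]
target = retrieve_target_image((0.85, 0.15), sub_vps_map)
```

The pose stored with the retrieved image is what the later matching step refines into the first pose.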
In one possible design manner of the first aspect, the navigation server determines that the first pose is abnormal, including: the navigation server acquires a moving track of the electronic equipment in the AR navigation process; the navigation server determines that the first pose is abnormal based on the movement track.
The moving track can reflect the moving trend of the mobile phone, so that the actual pose of the mobile phone can be reflected to a certain extent, and the moving track can be used for determining whether the first pose is abnormal or not. Based on the above, the above technical solution can smoothly determine whether the first pose has abnormal judgment results, thereby providing support for the execution of the subsequent steps.
In one possible design of the first aspect, the navigation server determining that the first pose is abnormal based on the movement track includes: the navigation server determines the current first position of the electronic device from the movement track; the navigation server acquires all poses corresponding to the first position from a first historical pose record, the first historical pose record including all shooting positions, and the poses corresponding to those shooting positions, recorded when electronic devices performed AR navigation in the area corresponding to the first position before the current moment; the navigation server averages all poses corresponding to the first position to obtain an average pose; and the navigation server determines that the first pose is abnormal when the distance between the first pose and the average pose is greater than a preset threshold.
Based on this technical solution, the navigation server can accurately judge whether the first pose is abnormal, so as to decide whether to proceed with the remaining steps of the POI updating method, ensuring its smooth implementation.
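The averaging-and-threshold check above can be sketched as follows. The sketch assumes, purely for illustration, 2-D poses and an invented threshold; the real historical pose record and distance metric are not specified in this form by the patent:

```python
import math


def is_pose_abnormal(first_pose, poses_at_position, threshold):
    """Average all historical poses recorded at the first position and flag
    the estimated pose as abnormal when its distance to that average pose
    exceeds the preset threshold."""
    n = len(poses_at_position)
    dims = len(first_pose)
    average_pose = tuple(
        sum(p[i] for p in poses_at_position) / n for i in range(dims)
    )
    return math.dist(first_pose, average_pose) > threshold


# Hypothetical poses previously recorded at the first position.
history = [(10.0, 5.0), (10.2, 5.1), (9.9, 4.9)]
```

An estimate near the historical average is accepted; one far from it (e.g. because the storefront in the image has changed and retrieval matched the wrong place) is flagged as abnormal.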
In one possible design of the first aspect, the navigation server determining that the first pose is abnormal based on the movement track includes: the navigation server acquires the shooting direction of the electronic device during AR navigation; the navigation server determines the current actual pose of the electronic device from the movement track and the shooting direction; and the navigation server determines that the first pose is abnormal when the first pose differs from the actual pose.
Based on this technical solution, the navigation server can likewise accurately judge whether the first pose is abnormal, so as to decide whether to proceed with the remaining steps of the POI updating method, ensuring its smooth implementation.
In one possible design manner of the first aspect, in the case that the first attribute feature includes a name, the navigation server identifies the first image to obtain the first attribute feature of the first location, including: the navigation server recognizes the first image using OCR technology to obtain a first attribute feature.
Based on this technical solution, the navigation server can accurately recognize the name in the first image using OCR technology, providing data support for judging whether the POI of the first location needs updating and ensuring the smooth operation of the POI updating method.
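OCR itself requires a trained model, so only the downstream step is sketched here: normalizing the recognized name string before comparing it with the historical name, so that OCR spacing artifacts do not cause spurious "POI changed" decisions. The function names and normalization rule are assumptions, not part of the patent:

```python
def normalize(text: str) -> str:
    # Collapse whitespace and case so that OCR spacing artifacts
    # (e.g. "UNI TED" vs "UNITED") do not register as a name change.
    return "".join(text.split()).lower()


def name_changed(recognized: str, historical: str) -> bool:
    """Return True only if the recognized name genuinely differs from the
    historical name after normalization."""
    return normalize(recognized) != normalize(historical)
```

A genuine rename still triggers an update, while a pure recognition artifact does not.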
In one possible implementation of the first aspect, after the navigation server retrieves from the first sub-VPS map the target place image matching the first image, the method further includes: the navigation server determines a retrieval score according to the degree of matching between the target place image and the first image; when the retrieval score is greater than a preset score, the navigation server extracts the feature points of the target place image and matches them against the first image to obtain the target feature points successfully matched with the first image; and when the retrieval score is less than the preset score, the navigation server determines that the first pose is abnormal and recognizes the first image to obtain the first attribute feature of the first location.
Based on the technical scheme, the navigation server can more quickly determine that the first pose is abnormal, and further more quickly determine whether to execute a process of updating the historical POI information of the first place. Therefore, the POI updating efficiency can be improved when the POI information of a certain place is changed, and the authenticity, accuracy and instantaneity of the POI information are ensured.
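The score gate described above amounts to a simple branch; the sketch below shows that control flow. The function name and the string labels for the two branches are invented for illustration:

```python
def next_step(retrieval_score: float, preset_score: float) -> str:
    """Gate the pipeline on the retrieval score: a score above the preset
    proceeds to feature-point matching; a lower score short-circuits to the
    POI-change branch (first pose deemed abnormal, recognize the first
    image) without running feature matching at all."""
    if retrieval_score > preset_score:
        return "match_feature_points"
    return "recognize_first_image"
```

Skipping feature matching on a low score is what makes this variant faster at detecting a changed POI.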
In a possible implementation manner of the first aspect, in a case that the navigation server updates the historical POI information using the first attribute feature, the method further includes: the navigation server updates a location image of a first location in the VPS map with the first image.
Based on the technical scheme, the location image of the first location in the VPS map can be updated in time, and then the pose of the mobile phone can be estimated and obtained when the VPS request carrying the image of the first location is received, so that the accuracy of AR navigation is further ensured, and the use experience of a user is improved.
In a possible design manner of the first aspect, in a case that the navigation server determines that the content of the first historical attribute feature in the historical POI information is different from the content of the first attribute feature, the method further includes: the navigation server sends first indication information to the electronic equipment so that the electronic equipment displays the first indication information on an AR navigation interface; the first indication information is used for at least indicating that the first attribute feature of the first place is changed.
Based on the technical scheme, the user can know which places in the AR navigation route are changed in time, so that more navigation possibilities are provided for the user, and the use experience of the user is improved.
In a second aspect, the present application provides a navigation server comprising: a memory and one or more processors; the memory is coupled with the processor; wherein the memory has stored therein computer program code comprising computer instructions which, when executed by the processor, cause the navigation server to perform the POI update method as provided in the first aspect.
In a third aspect, the present application provides a computer readable storage medium comprising computer instructions which, when run on a navigation server, cause the navigation server to perform a POI update method as provided in the first aspect.
In a fourth aspect, there is provided a computer program product comprising instructions which, when run on a navigation server, enable the navigation server to perform the POI update method provided in the first aspect above.
The advantages achieved by the second aspect to the fourth aspect may refer to the advantages of the first aspect and any of the possible designs thereof, and are not described herein.
Drawings
FIG. 1 is a schematic diagram of an OCR technology according to an embodiment of the present application;
Fig. 2 is a schematic diagram of a POI updating method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an AR navigation system according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 5 is a schematic software architecture diagram of an electronic device according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a navigation server according to an embodiment of the present application;
fig. 7 is a flowchart of a POI updating method according to an embodiment of the present application;
FIG. 8 is a schematic view of an AR navigation scenario according to an embodiment of the present application;
Fig. 9 is a second flowchart of a POI updating method according to an embodiment of the present application;
Fig. 10 is a third flowchart of a POI updating method according to an embodiment of the present application;
fig. 11 is a flowchart of a POI updating method according to an embodiment of the present application;
FIG. 12 is a schematic view of a first image according to an embodiment of the present application;
Fig. 13 is a schematic flow chart of updating historical POI information according to an embodiment of the present application;
fig. 14 is a flow chart diagram of a POI updating method according to an embodiment of the present application;
fig. 15 is a flowchart fifth of a POI updating method according to an embodiment of the present application;
fig. 16 is a flowchart sixth of a POI updating method according to an embodiment of the present application;
fig. 17 is a schematic diagram of a scenario in which a mobile phone displays first indication information according to an embodiment of the present application;
Fig. 18 is a schematic structural diagram of a chip system according to an embodiment of the present application.
Detailed Description
The terminology used in the following embodiments of the application is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of the present application and the appended claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that "/" means "or"; for example, A/B may represent A or B. The term "and/or" merely describes an association relation between associated objects and indicates that three relations may exist; for example, "A and/or B" may indicate: A exists alone, A and B exist together, or B exists alone.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the described embodiments of the application may be combined with other embodiments.
The terms "first", "second" in the following embodiments of the present application are used for descriptive purposes only and are not to be construed as implying or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature, and in the description of embodiments of the application, unless otherwise indicated, the meaning of "a plurality" is two or more.
For a better understanding of the embodiments of the present application, related terms and concepts that may be involved are described below. It should be understood that these explanations are given in the context of specific embodiments, may differ between embodiments, and are not intended to limit the present application. The concrete explanations are as follows:
(1) Augmented Reality (AR) navigation: AR navigation is a navigation mode that combines a map, a mobile phone camera or AR glasses camera, AR technology and spatial map depth. The camera presents the real world on the phone screen, while virtual models such as cartoon characters and direction arrows can be superimposed on the live image to guide pedestrians along the navigation route.
AR navigation may employ three-dimensional spatial reconstruction equipment, cloud algorithm engines and spatial three-dimensional map building. In practice, a user terminal (such as a smartphone) or AR glasses camera equipped with an inertial measurement unit (IMU) (or a six-axis sensor) recognizes the real environment to complete high-precision three-dimensional positioning in real time, enabling precise visual navigation. Virtual navigation prompts are superimposed on reality to help the user quickly find a target parking space, track, meeting room, sightseeing seat, shop, service desk, elevator, toilet, scenic spot and the like, and the capability can be combined with navigation, interactive entertainment or marketing. Application scenarios include training venues, exhibitions, launch events, malls, scenic spots, campuses, exhibition halls and museums.
(2) Point of interest (POI): POIs are important components of a navigation electronic map and generally refer to landmarks, buildings, scenic spots and the like on the electronic map, marking government departments, commercial institutions (gas stations, department stores, supermarkets, restaurants, hotels, convenience stores, hospitals, etc.), tourist attractions (parks, public toilets, etc.), historical sites, transportation facilities (stations, parking lots, speed cameras, speed-limit signs) and so on. POI information generally includes the POI's name, category, longitude, latitude, contact information, building details, specific address and evaluation content. POI information provides location search and navigation functions to the user: through a POI service, a search for a specific location can be provided and the user directed to that destination. Comprehensive POI information enriches the navigation map, can promptly show the user detailed information about road branches and surrounding buildings, makes it easy to find any place the user needs during navigation, and allows the most convenient and unobstructed road to be selected in path planning. For example, a POI service may search for and screen restaurants based on preference or satisfaction information and then navigate the user to a restaurant for a meal. Therefore, the quantity and accuracy of POI information in the navigation map directly affect the convenience of navigation.
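The fields listed above can be illustrated with a hypothetical POI record and a simple screening function of the kind a POI service might apply. All values and names are invented for the example:

```python
# Hypothetical POI record containing the fields listed above.
poi = {
    "name": "Example Restaurant",
    "category": "restaurant",
    "longitude": 116.40,
    "latitude": 39.90,
    "contact": "+86-10-0000-0000",
    "address": "1 Example Road, Example Mall, Floor 3",
    "rating": 4.5,
}


def poi_matches_query(record: dict, category: str, min_rating: float) -> bool:
    """Screen POIs the way a POI service might: by category and by a
    minimum satisfaction/rating value."""
    return record["category"] == category and record["rating"] >= min_rating
```

A restaurant search with a rating floor would keep this record; a hotel search would not.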
(3) Visual Positioning Service (VPS): a system or service that performs positioning using image information. A VPS deployed in the cloud (or on a server or in a data center) can provide large-space three-dimensional maps, map POI identification, map semantic services, global positioning services and the like.
(4) Optical character recognition (OCR): OCR is a technique that uses optical and computer technology to read text printed or written on paper from images and convert it into a format that computers can accept and humans can understand; it now also covers text recognition in natural scenes. For example, text information in a picture or PDF can be captured using optical technology and converted into editable text such as Word or TXT.
Illustratively, the architecture of OCR technology may include a detection part and a recognition part. The detection part may include a feature extraction network, and the recognition part may include a codec based on an attention mechanism.
For example, referring to fig. 1, the feature extraction network may be a convolutional neural network (CNN). The feature extraction network extracts features from the input image to obtain a two-dimensional (2D) feature map. The feature map serves, on the one hand, as the basis for extracting the holistic feature of the entire input image and, on the other hand, as the context needed by the attention mechanism in the recognition part.
Illustratively, referring to FIG. 1, the recognition part may specifically include an encoder, a decoder and an attention network. The encoder may be formed from a long short-term memory (LSTM) model, i.e., an LSTM encoder; likewise, the decoder may be an LSTM decoder. The attention network may be a 2D attention network.
The LSTM encoder processes and encodes the feature map output by the CNN to obtain the holistic feature of the input image, and inputs the holistic feature into the LSTM decoder for processing.
After processing the holistic feature, the LSTM decoder obtains hidden states and sends them to the 2D attention network, which returns glimpses. The LSTM decoder then obtains the current output from the previous output and the glimpses. By repeating this process, the LSTM decoder finally obtains the decoding result for the holistic feature, i.e., the final recognition result for the input image.
The 2D attention network combines the feature map output by the CNN with the hidden states provided by the LSTM decoder to generate glimpses, and provides the generated glimpses to the LSTM decoder so that it can decode successfully and obtain the recognition result of the input image, for example "UNITED" as shown in fig. 1.
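One step of the attention mechanism described above can be sketched in plain Python: score each feature-map cell against the decoder hidden state, softmax the scores, and return the attention-weighted sum of cell features as the glimpse. The toy feature map and vector sizes are illustrative assumptions, not values from the patent:

```python
import math


def glimpse(feature_map, hidden_state):
    """One attention step: dot-product scores between each feature-map cell
    and the hidden state, softmax over cells, then the weighted sum of cell
    features (the 'glimpse') that is fed back to the decoder."""
    scores = [sum(f * h for f, h in zip(cell, hidden_state)) for cell in feature_map]
    m = max(scores)                       # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    dim = len(feature_map[0])
    return [sum(w * cell[i] for w, cell in zip(weights, feature_map)) for i in range(dim)]


# Toy 2x2 feature map flattened to 4 cells, each with a 2-D feature vector.
fmap = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.0, 0.0]]
g = glimpse(fmap, [1.0, 0.0])
```

Cells whose features align with the hidden state receive larger weights, so the glimpse is biased toward the image region the decoder is currently reading.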
POIs, as data support for providing location search, navigation and similar functions to users, are of great significance in accuracy and timeliness. With rapid economic development, information about certain places changes quickly, such as the specific information of shops in a mall. To give users a better experience, the POIs corresponding to such shop scenes need to be updated continuously. However, POIs are currently updated manually, which is error-prone, labor-intensive, time-consuming and inefficient, and this inevitably degrades the user's experience of the POI service.
In view of the above technical problems, referring to fig. 2, an embodiment of the present application provides a POI updating method. The method can be applied to a scenario in which a user performs AR navigation through an electronic device. In this technical solution, while the electronic device runs AR navigation, it can be determined from the image of the first location captured by the device's camera whether the current positioning of the electronic device is abnormal, i.e., whether the position corresponding to the captured image of the target location differs greatly from the device's actual position. If so, it is highly likely that the target location in the captured image has changed substantially (for example, the shop at the target location has changed), causing the positioning determined from that image to be abnormal; this in turn indicates that the POI of the target location needs updating. The information in the captured image of the target location can then be recognized with image recognition technology, and the POI of the first location updated accordingly. In this way, POIs can be updated automatically and in real time while the user performs AR navigation with the electronic device, without manual maintenance, improving POI updating efficiency as well as the authenticity, accuracy and timeliness of POIs, and improving the user's experience of the POI service.
The technical scheme provided by the embodiment of the application is described in detail below with reference to the accompanying drawings.
The technical scheme provided by the application can be applied to an AR navigation system (or POI update system) shown in figure 3. Referring to fig. 3, the AR navigation system includes an electronic device 01 and a navigation server 02. The electronic device 01 and the navigation server 02 can be connected by wired communication or wireless communication.
The electronic device 01 is mainly configured to respond to a user's AR navigation trigger operation, open the camera, display the AR navigation interface, and send the image currently captured by the camera, the navigation destination, and the current position (e.g., longitude and latitude) of the electronic device to the navigation server 02 in real time. In some embodiments, the electronic device 01 may be provided with an electronic compass, through which it can detect the direction the camera faces and obtain direction data. The electronic device 01 may store the direction data as extension information of the currently captured image; that is, the captured image sent to the navigation server 02 carries the corresponding direction data.
In the embodiment of the application, the position of the electronic device, the attitude of the electronic device and the direction the camera faces may be collectively referred to as a pose.
The navigation server 02 may include a VPS service and a VPS map corresponding to the VPS service. The VPS map may include sub-VPS maps corresponding to different areas or places (for example, mall a, food city B, etc.), and each sub-VPS map may include all images (or may be referred to as all images of 360 °) captured at different viewing angles at each position in the corresponding area.
Based on the image captured by the camera of the electronic device 01, the navigation server 02 can use the VPS service to estimate the current pose of the electronic device 01 and the sub-VPS map corresponding to the region where the device currently is. From this pose, the navigation destination received from the electronic device 01 and the sub-VPS map, AR navigation information, i.e., information indicating how the user should proceed, can be determined and sent to the electronic device 01 so that AR navigation can proceed smoothly. Of course, in actual AR navigation, the pose estimated by the navigation server 02 from the captured image may contain errors. To obtain more accurate navigation information, the navigation server can determine the current actual pose of the electronic device 01 from the device's movement track during the current AR navigation together with the currently estimated pose. The movement track of the electronic device 01 can be obtained by combining all the poses the navigation server 02 has determined during the AR navigation. Because an erroneous pose estimated from a captured image is an occasional event with very small probability, combining the historical pose record of the AR navigation with the currently estimated pose yields the current actual pose of the electronic device 01 more accurately. Of course, in practice the navigation server 02 may also determine the current actual pose of the electronic device 01 in any other feasible manner, which is not particularly limited by the present application.
In the embodiment of the present application, the navigation server 02 may further determine, according to the captured image from the electronic device, whether the target location in that image has changed compared with the image of the target location in the VPS map, that is, whether the POI of the target location has changed. If so, the navigation server may obtain some current POI information of the target location by using image recognition, so as to update the POI of the target location in the VPS map.
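As a toy sketch of the update step (the dictionary representation and the attribute names are assumptions made for illustration; the patent does not prescribe a data format), updating could amount to overwriting only those historical attribute features whose content differs from the features recognized in the new image:

```python
def update_poi(historical: dict, recognized: dict) -> dict:
    """Overwrite only those historical POI attribute features whose content
    differs from the attribute features recognized in the captured image."""
    updated = dict(historical)  # leave the stored record untouched
    for key, value in recognized.items():
        if updated.get(key) != value:
            updated[key] = value
    return updated
```

For example, if image recognition of a storefront yields a new shop name but the same category, only the name field of the historical POI record is replaced.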
For example, the electronic device in the embodiments of the present application may be an electronic device with an AR navigation function, such as a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device; the application is not particularly limited as to the specific type of the electronic device.
Taking an electronic device as an example of a mobile phone, fig. 4 shows a schematic structural diagram of the electronic device according to an embodiment of the present application.
As shown in fig. 4, the electronic device may have a plurality of cameras 293, such as a front-mounted normal camera, a front-mounted low-power-consumption camera, a rear-mounted normal camera, a rear-mounted wide-angle camera, and the like. In addition, the electronic device may include a processor 210, an external memory interface 220, an internal memory 221, a universal serial bus (USB) interface 230, a charge management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an earphone interface 270D, a sensor module 280, keys 290, a motor 291, an indicator 292, a display 294, a subscriber identity module (SIM) card interface 295, and the like. Among other things, the sensor module 280 may include a gyroscope sensor 280A, a magnetic sensor 280B, an acceleration sensor 280C, a proximity light sensor 280D, a fingerprint sensor 280E, a temperature sensor 280F, a touch sensor 280G, an ambient light sensor 280H, and the like.
Processor 210 may include one or more processing units. For example, processor 210 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), among others. The different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and command center of the electronic device. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 210 for storing instructions and data. In some embodiments, the memory in the processor 210 is a cache memory, which may hold instructions or data that the processor 210 has just used or reused. If the processor 210 needs the instructions or data again, it can call them directly from the cache. This avoids repeated accesses and reduces the waiting time of the processor 210, thereby improving the efficiency of the system.
In some embodiments, processor 210 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, among others.
The external memory interface 220 may be used to connect an external non-volatile memory, so as to expand the storage capability of the electronic device. The external non-volatile memory communicates with the processor 210 through the external memory interface 220 to implement a data storage function. For example, files such as music and videos are stored in the external non-volatile memory.
The internal memory 221 may include one or more random access memories (RAM) and one or more non-volatile memories (NVM). The random access memory may be read and written directly by the processor 210, may be used to store executable programs (e.g., machine instructions) of an operating system or other running programs, and may also be used to store data of users and applications, and the like. The non-volatile memory may likewise store executable programs, data of users and applications, and the like, which may be loaded into the random access memory in advance for the processor 210 to read and write directly. In the embodiment of the present application, the internal memory 221 may store picture files or recorded video files shot by the electronic device in a single-mirror shooting mode, a multi-mirror shooting mode, or the like.
The touch sensor 280G is also referred to as a "touch device". The touch sensor 280G may be disposed on the display 294, and the touch sensor 280G and the display 294 form a touch screen, also called a "touch screen". The touch sensor 280G is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 294. In other embodiments, the touch sensor 280G may also be disposed on a surface of the electronic device at a location different from that of the display 294.
In some embodiments, the electronic device may include 1 or N cameras 293, N being a positive integer greater than 1. In the embodiment of the present application, the type of the camera 293 may be distinguished according to hardware configuration and physical location. For example, the plurality of cameras included in the camera 293 may be disposed on the front and back of the electronic device: a camera disposed on the side of the display screen 294 of the electronic device may be referred to as a front camera, and a camera disposed on the side of the rear cover of the electronic device may be referred to as a rear camera. As another example, the cameras included in the camera 293 may have different focal lengths and viewing angles: a camera with a short focal length and a large viewing angle may be referred to as a wide-angle camera, and a camera with a long focal length and a small viewing angle may be referred to as a normal camera.
The electronic device implements display functions through the GPU, the display screen 294, the application processor, and the like. The GPU is a microprocessor for image processing, and connects the display screen 294 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 210 may include one or more GPUs that execute program instructions to generate or change display information.
The electronic device may implement shooting functions through an ISP, a camera 293, a video codec, a GPU, a display 294, an application processor, and the like.
The display 294 is used to display images, videos, and the like. The display 294 includes a display panel. In some embodiments, the electronic device may include 1 or N displays 294, N being a positive integer greater than 1.
In embodiments of the application, the display 294 may be used to display an interface of an electronic device (e.g., a camera preview interface, an AR navigation interface, etc.) and display images captured from any one or more cameras 293 in the interface, or may also be used to display virtual images for AR navigation.
The charge management module 240 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger.
The power management module 241 is used for connecting the battery 242 and the charge management module 240 to the processor 210. The power management module 241 receives input from the battery 242 and/or the charge management module 240 and supplies power to the processor 210, the internal memory 221, the display 294, the camera 293, the wireless communication module 260, and the like.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, the modem, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas.
The mobile communication module 250 may provide a solution for wireless communication including 2G/3G/4G/5G, etc. applied on an electronic device.
The wireless communication module 260 may provide solutions for wireless communication applied on the electronic device, including wireless local area network (WLAN) (e.g., wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 260 may be one or more devices integrating at least one communication processing module. The wireless communication module 260 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signals, and sends the processed signals to the processor 210. The wireless communication module 260 may also receive a signal to be transmitted from the processor 210, perform frequency modulation and amplification on it, and convert it into electromagnetic waves for radiation via the antenna 2.
The SIM card interface 295 is for interfacing with a SIM card. The SIM card may be inserted into the SIM card interface 295 or removed from the SIM card interface 295 to enable contact and separation from the electronic device. The electronic device may support one or more SIM card interfaces. The SIM card interface 295 may support Nano SIM cards, micro SIM cards, and the like. The same SIM card interface 295 may be used to insert multiple cards simultaneously. The SIM card interface 295 may also be compatible with external memory cards. The electronic equipment interacts with the network through the SIM card, so that the functions of communication, data communication and the like are realized.
It will be understood, of course, that fig. 4 above merely illustrates the case where the electronic device takes the form of a mobile phone. If the electronic device is a tablet computer, a handheld computer, a PC, a PDA, a wearable device (e.g., a smart watch or a smart bracelet), or the like, the electronic device may include fewer or more structures than those shown in fig. 4, which is not limited herein.
It will be appreciated that, in general, the implementation of electronic device functions requires the coordination of software in addition to hardware support. The software system of the electronic device may employ a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. Embodiments of the application take a layered-architecture Android system as an example to illustrate the software architecture of the electronic device.
Fig. 5 is a schematic diagram of a layered architecture of a software system of an electronic device according to an embodiment of the present application. The layered architecture divides the software into several layers, each with distinct roles and branches. The layers communicate with each other through a software interface.
As shown in fig. 5, taking the Android system used by the electronic device as an example, in the embodiment of the present application the software of the electronic device is divided, from top to bottom, into an application layer, a framework layer, a system library and Android runtime, a hardware abstraction layer (HAL layer), and a driver layer (or kernel layer).
The application layer may include a series of applications, among others. As shown in fig. 5, the application layer may include applications for cameras, navigation, gallery, calendar, map, WLAN, bluetooth, music, video, short messages, talk, etc.
The navigation application can be provided with an AR navigation function and, when that function is started, can call the camera application to complete the AR navigation function. The framework layer may provide an application programming interface (API) and a programming framework for the application programs of the application layer. The framework layer includes some predefined functions, for example an activity manager, a window manager, a view system, a resource manager, a notification manager, an audio service, a camera service, and the like, to which embodiments of the application are not limited.
The system library may include a plurality of functional modules, for example: a surface manager, media libraries (Media Libraries), OpenGL ES, SGL, and the like. The surface manager is used to manage the display subsystem and provides fusion of 2D and 3D layers for multiple applications. The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media libraries may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG. OpenGL ES is used to implement three-dimensional graphics drawing, image rendering, compositing, layer processing, and the like. SGL is a drawing engine for 2D drawing.
The Android runtime includes a core library and a virtual machine, and is responsible for scheduling and management of the Android system. The core library consists of two parts: one part is the functions that the java language needs to call, and the other part is the core library of Android. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The HAL layer is an interface layer between the operating system kernel and the hardware circuitry, and aims to abstract the hardware. It hides the hardware interface details of a specific platform and provides a virtual hardware platform for the operating system, so that the operating system is hardware-independent and can be ported across various platforms. The HAL layer provides a standard interface that exposes device hardware functionality to the higher-level Java API framework (i.e., the framework layer). The HAL layer contains a plurality of library modules, each of which implements an interface for a particular type of hardware component, for example: an Audio HAL audio module, a Bluetooth HAL bluetooth module, a Camera HAL camera module, and a Sensors HAL sensor module (or sensor service).
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, a sensor driver, and the like, to which the application is not limited.
The navigation server provided by the application can be a server, a server cluster formed by a plurality of servers or a cloud computing service center, and the application is not particularly limited to the above.
By way of example, taking the navigation server as a server, fig. 6 shows a schematic diagram of the navigation server. Referring to fig. 6, the navigation server includes one or more processors 601, communication lines 602, and at least one communication interface (fig. 6 shows, by way of example only, one communication interface 603 and one processor 601), and optionally a memory 604.
The processor 601 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the program of the present application.
The communication lines 602 may include a communication bus for communication between the different components.
The communication interface 603 may be a transceiver module for communicating with other devices or communication networks, such as an Ethernet, a RAN, or a wireless local area network (WLAN). For example, the transceiver module may be a device such as a transceiver. Optionally, the communication interface 603 may also be a transceiver circuit located in the processor 601, for implementing signal input and signal output of the processor.
The memory 604 may be a device having a storage function, for example, but not limited to, a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a random access memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be self-contained and coupled to the processor via the communication line 602. The memory may also be integrated with the processor.
The memory 604 is used for storing computer-executable instructions for implementing the present application, and is controlled by the processor 601 for execution. The processor 601 is configured to execute computer-executable instructions stored in the memory 604, thereby implementing the POI updating method provided in the embodiment of the present application.
Alternatively, in the embodiment of the present application, the processor 601 may perform the functions related to the processing in the POI updating method provided in the embodiment of the present application, and the communication interface 603 is responsible for communicating with other devices (such as electronic devices) or a communication network, which is not limited in particular.
Alternatively, the computer-executable instructions in the embodiments of the present application may be referred to as application program codes, which are not particularly limited in the embodiments of the present application.
In a particular implementation, the processor 601 may include one or more CPUs, such as CPU0 and CPU1 of FIG. 6, as an embodiment.
In a particular implementation, the server may include multiple processors, such as the processor 601 and the processor 607 in fig. 6, as one embodiment. Each of these processors may be a single-core processor or a multi-core processor. A processor herein may include, but is not limited to, at least one of: a central processing unit (CPU), a microprocessor, a digital signal processor (DSP), a microcontroller unit (MCU), or an artificial intelligence processor, and each processor may include one or more cores for executing software instructions to perform operations or processing.
In a specific implementation, the server may also include an output device 605 and an input device 606, as one embodiment. The output device 605 communicates with the processor 601 and may display information in a variety of ways. For example, the output device 605 may be a liquid crystal display (LCD), a light-emitting diode (LED) display device, a cathode ray tube (CRT) display device, or a projector. The input device 606 communicates with the processor 601 and may receive user input in a variety of ways. For example, the input device 606 may be a mouse, a keyboard, a touch screen device, a sensing device, or the like.
The server may be a general purpose device or a special purpose device. For example, the server may be a desktop, a portable computer, a web server, a Personal Digital Assistant (PDA), a mobile handset, a tablet, a wireless terminal device, an embedded device, a terminal device as described above, a network device as described above, or a device having a similar structure as in fig. 6. The embodiment of the application is not limited to the type of the server.
The methods in the following embodiments may be implemented in an electronic device or a navigation server having the above-described hardware structure.
The POI updating method provided by the embodiment of the application is described below with reference to the accompanying drawings.
The present application provides a POI updating method, which can be applied to an AR navigation system as shown in fig. 3, and takes an electronic device as a mobile phone as an example, referring to fig. 7, the flow of the POI updating method may include steps S701-S711:
S701, in response to an operation of starting AR navigation by a user, the mobile phone starts an AR navigation function and displays an AR navigation interface.
In order to facilitate use by users, an AR navigation function may be provided in an application having a navigation function in the mobile phone. When the user needs to use the AR navigation function, the user can start the AR navigation function of the application by performing a preset starting operation (preset by the application, or preset by the user). The preset starting operation may be, for example, an operation on an AR navigation control of the application.
Under the condition that the AR navigation function is started, the application can start the camera of the mobile phone and display the image shot by the camera as the background in the AR navigation interface in real time. Under the condition that the AR navigation function is started, the mobile phone sends the position of the mobile phone, the image shot by the camera and the destination of the user to a navigation server corresponding to the application. The navigation server can generate navigation information based on information sent by the mobile phone and send the navigation information to the mobile phone. After receiving the navigation information from the navigation server, the mobile phone can display the navigation information in the form of a virtual image in an AR navigation interface based on the pose of the mobile phone, for example, a road pointing arrow, a road pointing prompt and the like are displayed, and further, the application can call an audio playing module (for example, a loudspeaker) of the mobile phone to play the road pointing voice corresponding to the navigation information.
For example, in response to the user's operation of starting AR navigation, the AR navigation interface displayed by the mobile phone after the AR navigation function is started may be as shown in fig. 8 (a). As shown in fig. 8 (a), in the AR navigation interface 801 displayed on the mobile phone, the background is the actual image shot by the camera of the mobile phone. In addition, the AR navigation interface 801 may further include a virtual image for navigation generated by the mobile phone according to the navigation information sent by the navigation server, such as the routing arrow 802 shown in fig. 8 (a). Meanwhile, in order to help the user know how far to advance along the virtual image, navigation prompt information, such as "advance 44 meters", may be displayed near the virtual image.
In some possible scenarios, when the AR navigation function is turned on, if the orientation of the mobile phone camera is not the direction in which the user currently needs to advance as indicated by the navigation information provided by the navigation server, the virtual image is displayed only in a specific azimuth: only when the shooting direction of the camera of the mobile phone faces that azimuth is the virtual image displayed in the image shot by the camera. When the shooting direction of the mobile phone camera is not the azimuth in which the virtual image is displayed, corresponding prompt information is displayed in the AR navigation interface to instruct the user to change the shooting direction of the camera. As shown in fig. 8 (b), the mobile phone may display a turn prompt 803. The prompt 803 is used to instruct the user to adjust the shooting direction of the mobile phone camera and, as shown in fig. 8 (b), may include an indication icon 8031 and indication text 8032, where the indication text 8032 may be a text message such as "see here". Then, if the user adjusts the shooting direction of the mobile phone camera to the azimuth in which the virtual image is displayed, the AR navigation interface shown in fig. 8 (a) may be displayed.
S702, the mobile phone acquires a first image of a first location in the AR navigation process.
The first place is a place which can be shot by the shooting direction of the camera of the mobile phone, such as a shop of a mall, an elevator of the mall, a restaurant beside a highway and the like. The first image may specifically be a first image obtained by shooting the first location with a camera of the mobile phone.
In the embodiment of the application, after the mobile phone acquires the first image, the mobile phone needs to determine the pose of the mobile phone in time so as to display the virtual image corresponding to the navigation information on the AR navigation interface in a proper mode based on the current pose of the mobile phone under the condition of obtaining the navigation information. Based on this, the handset needs to send a VPS request to the navigation server after S702 (i.e. S703) to request to get its own pose.
S703, the mobile phone sends a VPS request to the navigation server.
Wherein the VPS request carries the first image and is used to request the pose of the mobile phone. In practical application, when starting AR navigation, the mobile phone may send to the navigation server, alongside the VPS request, a navigation request requesting the navigation server to combine the mobile phone position, the navigation destination, and the first image currently sent by the mobile phone to obtain navigation information and send it to the mobile phone, so that the mobile phone displays the corresponding virtual icons and navigation prompt content in the AR navigation interface according to the navigation information. Of course, the VPS request may also serve as the navigation request, in which case it also plays the role of the navigation request and carries the content the navigation request would carry.
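A VPS request that doubles as the navigation request might be serialized roughly as follows. The JSON field names and the base64 image encoding are assumptions made for illustration; the patent does not specify a wire format.

```python
import base64
import json

def build_vps_request(image_bytes: bytes, phone_position: tuple,
                      destination: str, navigate: bool = True) -> str:
    """Bundle the first image with the fields a navigation request would carry,
    so a single VPS request can also serve as the navigation request."""
    payload = {
        # Raw camera frame, base64-encoded for transport in JSON.
        "image": base64.b64encode(image_bytes).decode("ascii"),
        "position": {"lat": phone_position[0], "lon": phone_position[1]},
        "destination": destination,
        # When true, the server also returns navigation info, not just the pose.
        "navigate": navigate,
    }
    return json.dumps(payload)
```

The server would decode the image, estimate the pose against the VPS map, and, because `navigate` is set, return AR navigation information in the same round trip.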
S704, the navigation server receives the VPS request from the mobile phone.
Specifically, after receiving the VPS request, the navigation server may consider that the first image from the mobile phone is received, and estimate the pose of the mobile phone according to the first image in the VPS request, that is, S705 is executed.
And S705, the navigation server estimates the first pose of the mobile phone according to the first image and the VPS map carried in the VPS request.
After the navigation server estimates the first pose of the mobile phone according to the first image and the VPS map, it may be determined whether there is an abnormality according to the first pose (which may be understood as determining whether the first pose is an actual pose of the mobile phone or has little difference from the actual pose of the mobile phone), that is, S705 is followed by executing S706.
In some embodiments, the navigation server may estimate the first pose based on the first image and a sub-VPS map of an area of the VPS map corresponding to the first location. Based on this, referring to fig. 9 in conjunction with fig. 7, S705 may specifically include S7051-S7056:
S7051, the navigation server acquires the moving track of the mobile phone in the AR navigation process, and determines the current first position of the electronic device according to the moving track.
The movement track can be obtained by combining the positions in the poses that the navigation server estimated each time it received a VPS request from the mobile phone during the AR navigation process. In addition, the movement track here refers to the track before the current moment in the mobile phone's AR navigation process.
It will be appreciated that a change in the POI information of a location causing a change in its image is not a common event, so during AR navigation by the user using the mobile phone, the pose estimated by the navigation server from the captured image in the VPS request is correct in most cases. In addition, during AR navigation, in order to ensure the look and feel of the AR navigation interface, the camera captures images for the background of the AR navigation interface at a very high rate, so the positions corresponding to two consecutive images captured by the camera do not differ greatly. Based on this, the navigation server can predict the specific current position of the mobile phone (i.e., the position where the first image was shot) according to the movement track before the current moment. The current position may be determined from the previous movement track in any feasible manner, to which the present application is not particularly limited.
S7052, the navigation server determines a first sub-VPS map corresponding to the first location in the VPS map.
In particular, in practice the VPS map may be composed of sub-VPS maps of different areas. A sub-VPS map includes all the relevant map information of the area to which it belongs, for example, all the images photographed at different shooting angles at each position in the area and their pose information. Here, the pose information of an image refers to information that reflects the pose of the camera when the image was captured.
For example, taking mall A as the area, the sub-VPS map corresponding to mall A includes all 360° images around each position on each floor of mall A (for example, the position one meter in front of the door of the second shop, counted clockwise from elevator A1) and the corresponding pose information. The 360° images of each position may have been taken at different shooting angles by a person at that position when the sub-VPS map was generated.
In the present application, the first location may be one location in the first sub-VPS map. For example, if the first sub-VPS map is the sub-VPS map corresponding to mall A, the first location may be a location in front of the door of a certain shop in mall A (for example, the second shop on the second floor, counted clockwise from elevator A1).
S7053, the navigation server searches the first sub-VPS map for a target location image matching the first image.
In one implementation, the navigation server may look up the target location image matching the first image from all location images in the first sub-VPS map one by one. The method used for specific matching may be any image matching method, and the present application is not particularly limited thereto.
In another implementation, the navigation server may first extract all features of the first image. Then, a target spot image whose feature matches the feature of the first image is found from all spot images in the first sub-VPS map based on all the features of the first image.
Of course, any feasible searching method is also possible in practice, and the present application is not particularly limited thereto.
The target location image that the navigation server searches for in the first sub-VPS map, i.e. the image matching the first image, refers specifically to the one or more location images with the highest matching degree with the first image in the first sub-VPS map. For example, if three location images A, B and C match the first image, with the matching degree of A and the first image being 80%, that of B being 90%, and that of C also being 90%, then B and C are the target location images matching the first image.
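The tie-keeping selection of target location images described above can be sketched as below; the function name and the dictionary of precomputed matching degrees are hypothetical, and the image-matching method that produces the degrees is left open, as in the text.

```python
from typing import Dict, List

def select_target_images(match_degrees: Dict[str, float]) -> List[str]:
    """Return the location image(s) with the highest matching degree.

    Ties are all kept, so several target location images may be returned
    (e.g. both B and C when each matches at 90%).
    """
    if not match_degrees:
        return []
    best = max(match_degrees.values())
    return sorted(name for name, d in match_degrees.items() if d == best)
```

For the example in the text, `select_target_images({"A": 0.8, "B": 0.9, "C": 0.9})` keeps both B and C.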
S7054, the navigation server extracts feature points of the target location image.
Specifically, the feature points extracted by the navigation server may be feature points that can fully reflect the spatial relationship features of the target location image, such as its structure and shape, for example contour feature points.
In the present application, the manner in which the navigation server extracts the feature points of the target location image may be any feasible manner, for example, a histogram of oriented gradients (HOG) feature extraction algorithm, a local binary patterns (LBP) feature extraction algorithm, a scale-invariant feature transform (SIFT) feature extraction algorithm, and the like. The present application is not particularly limited thereto.
In addition, depending on the information carried by the target location image itself, the feature points extracted here may be 2D feature points or 3D feature points, which is not particularly limited in the present application.
S7055, the navigation server matches the feature points of the target location image with the first image, so as to obtain the target feature points successfully matched with the first image.
In the embodiment of the present application, the feature matching manner adopted by the navigation server to match the feature points of the target location image with the first image may be any feasible manner, which is not particularly limited in the present application.
S7056, the navigation server determines the first pose of the mobile phone according to the target feature points and the pose information of the target location image.
In the embodiment of the present application, the pose information of the target location image may be obtained when the navigation server performs S7052, or before the navigation server executes S7055; the present application is not particularly limited thereto. In addition, the pose information of the target location image may be stored separately or carried in the extension information of the target location image, which is not particularly limited in the present application.
In the embodiment of the present application, the pose estimation method adopted by the navigation server to determine the first pose from the target feature points and the pose information of the target location image may be any feasible method, for example, a model-based pose estimation method or a machine-learning-based pose estimation method. The present application is not particularly limited thereto.
Based on the technical schemes corresponding to the above-mentioned S7051-S7056, the navigation server can smoothly utilize the first image and the VPS map stored by itself to estimate the first pose of the mobile phone, and provide data support for the subsequent flow of the POI updating method.
Of course, the implementation procedure of determining the first pose of the mobile phone by the navigation server disclosed in the above S7051-S7056 according to the first image is only one possible implementation manner, and may be any other practical implementation manner, which is not particularly limited in the present application.
S706, the navigation server judges whether the first pose is abnormal or not.
If the navigation server determines that there is no abnormality in the first pose, this indicates that the first pose is the actual pose of the mobile phone, or that it differs only slightly from the actual pose. At this time, the navigation server can send the first pose to the mobile phone so that the mobile phone performs AR navigation based on the first pose, i.e. S707 is performed. It should be noted that, when sending the first pose to the mobile phone, the navigation server also sends the navigation information corresponding to the navigation request, so that the mobile phone can perform AR navigation.
Further, since the pose is determined by the navigation server from the first image, if it is determined that the pose has no positioning abnormality, it is highly probable that the image of the first location does not differ greatly from the location image of the first location stored in the navigation server. That is, the POI of the first location has a high probability of being unchanged. In this case, the navigation server does not need to update the POI of the first location. That is, the flow ends here, and the navigation server does not execute other flows until it again receives a photographed image from the mobile phone in the current AR navigation process.
If the navigation server determines that the first pose is abnormal, this indicates that the first pose is not the actual pose of the mobile phone, or that it differs greatly from the actual pose. Because the pose is determined by the navigation server from the first image, a positioning abnormality in that pose indicates that the first image of the first location has changed greatly compared with the location image of the first location stored in the navigation server. That is, the POI of the first location has a high probability of having changed. In this case, the navigation server may proceed to the POI update flow, i.e. perform the subsequent S708-S711.
In practice, the navigation server may not execute the judgment step S706, but may execute S707 directly when it is determined that there is no abnormality in the first pose, and execute S708 when it is determined that there is an abnormality in the first pose.
In the embodiment of the application, the navigation server can specifically determine whether the first pose is abnormal based on the mobile phone's movement track before the current moment in the current AR navigation process. Because the movement track reflects the movement trend of the mobile phone, it reflects the actual pose of the mobile phone to a certain extent, and can therefore be used to determine whether the first pose is abnormal. This may be realized in the following ways:
In one possible implementation, whenever a user performs AR navigation near the first location with an electronic device (for example, a mobile phone), the navigation server may store, in association, the accurate poses determined from the images captured by those electronic devices and the locations where they were captured. The navigation server can then decide whether the estimated first pose is correct based on the previously stored historical poses. Based on this, referring to fig. 10 in conjunction with fig. 7, S706 may specifically include S7061A-S7066A:
S7061A, the navigation server acquires a moving track of the mobile phone in the AR navigation process, and determines the current first position of the mobile phone according to the moving track.
The movement track can be obtained by the navigation server by combining the positions in the poses it estimated each time it received a VPS request from the mobile phone during AR navigation. In addition, the movement track refers to the portion of the track before the current moment in the mobile phone's AR navigation process.
It will be appreciated that a change in POI information at a location causing a change in its image is not a common event, so during AR navigation the pose estimated by the navigation server from the captured image in a VPS request from the mobile phone is correct in most cases. In addition, during AR navigation, to ensure the look and feel of the AR navigation interface, the camera captures images for the interface background at a very high rate, so the positions corresponding to two consecutive captured images do not change greatly. Based on this, the navigation server can predict the specific position of the current mobile phone (i.e. the position where the first image was shot) from the movement track before the current moment. The current position may be determined from the previous movement track in any feasible manner, and the present application is not particularly limited thereto.
S7062A, the navigation server obtains all the poses corresponding to the first position from the first historical pose record.
The first historical pose record includes all shooting positions, and the poses corresponding to those shooting positions, recorded when electronic devices performed AR navigation in the area corresponding to the first position before the current moment.
S7063A, the navigation server calculates an average value of all the poses corresponding to the first position, to obtain an average pose.
Because there are many poses corresponding to the first position, some of them are necessarily close to the actual pose of the current mobile phone; therefore, the average of all poses corresponding to the first position can be approximately regarded as the actual pose of the mobile phone at the first position.
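The averaging in S7063A can be sketched as an element-wise mean over pose vectors. This is an illustrative simplification with hypothetical names: it treats a pose as a plain numeric tuple and ignores the wrap-around problem of averaging angular components, which a real implementation would have to handle.

```python
from typing import Sequence, Tuple

def average_pose(poses: Sequence[Tuple[float, ...]]) -> Tuple[float, ...]:
    """Element-wise mean of pose vectors, e.g. (x, y, z, ...).

    Simplification: angular components are averaged naively; a real
    system would average orientations on the circle/quaternion space.
    """
    if not poses:
        raise ValueError("no historical poses for this position")
    n = len(poses)
    dim = len(poses[0])
    return tuple(sum(p[i] for p in poses) / n for i in range(dim))
```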
S7064A, the navigation server determines whether the distance between the first pose and the average pose is greater than a preset threshold.
The preset threshold may be, for example, 10 m. The preset threshold may be the maximum possible distance between two poses that can still be regarded as approximately the same, specifically poses from images taken by the mobile phone. The preset threshold may differ between shooting scenes; for example, the preset threshold for AR navigation in a shopping mall may differ from that for AR navigation on an urban road.
The specific value of the preset threshold may be obtained according to practical experience, and the specific acquisition mode may be any feasible mode, which is not particularly limited by the present application.
If it is determined that the distance between the first pose and the average pose is greater than the preset threshold, it can be determined with high probability that the first pose is not the actual pose of the mobile phone; at this time, the navigation server may determine that the first pose is abnormal, that is, execute S7066A.
If it is determined that the distance between the first pose and the average pose is smaller than the preset threshold, it can be determined with high probability that the first pose is very close to the actual pose of the mobile phone; at this time, the navigation server may determine that there is no abnormality in the first pose, that is, execute S7065A.
It should be noted that, the case that the distance between the first pose and the average pose is equal to the preset threshold may be categorized as the case that the distance between the first pose and the average pose is greater than the preset threshold, or may be categorized as the case that the distance between the first pose and the average pose is less than the preset threshold.
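The decision in S7064A can be sketched as a distance test; the name `is_pose_abnormal`, 2D positions, and the 10 m default are illustrative assumptions, and the equal-to-threshold case is grouped with "not abnormal" here, though, as the text notes, either grouping is acceptable.

```python
import math
from typing import Tuple

def is_pose_abnormal(first_pose: Tuple[float, ...],
                     avg_pose: Tuple[float, ...],
                     threshold: float = 10.0) -> bool:
    """True when the estimated first pose is too far from the average
    historical pose to be the phone's actual pose (distance > threshold,
    in metres, over the 2D position components)."""
    dist = math.dist(first_pose[:2], avg_pose[:2])
    return dist > threshold
```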
In addition, in practice, the navigation server may not perform the judgment step S7064A, but may directly perform S7066A when determining that the distance between the first pose and the average pose is greater than the preset threshold, and perform S7065A when determining that the distance is not greater than the preset threshold.
S7065A, the navigation server determines that there is no abnormality in the first pose.
S707 is performed after S7065A.
Note that, in practice, the navigation server may not execute S7065A, and may directly execute S707 in the case where it is determined that the distance between the first pose and the average pose is not greater than the preset threshold.
S7066A, the navigation server determines that the first pose is abnormal.
S708 is performed after S7066A.
Note that, in practice, the navigation server may not execute S7066A, and may directly execute S708 in the case where it is determined that the distance between the first pose and the average pose is greater than the preset threshold.
Based on the technical solutions corresponding to S7061A-S7066A, the navigation server can accurately judge whether the first pose is abnormal, so as to determine whether to perform the remaining flows of the POI updating method, ensuring its smooth implementation.
In another possible implementation, the navigation server can obtain the approximate actual pose of the mobile phone from the movement track and shooting direction during the mobile phone's AR navigation. The navigation server can then directly compare the first pose with the actual pose to determine whether the first pose is abnormal. Based on this, referring to fig. 11 in conjunction with fig. 7, S706 may alternatively include S7061B-S7064B:
S7061B, the navigation server acquires a moving track and a shooting direction of the mobile phone in the AR navigation process, and determines the current actual pose of the mobile phone according to the moving track and the shooting direction in the AR navigation process.
The movement track and the shooting direction can be obtained by the navigation server by combining the positions and orientations in the poses it estimated each time it received a VPS request from the mobile phone during AR navigation.
It will be appreciated that a change in POI information at a location causing a change in its image is not a common event, so during AR navigation the pose estimated by the navigation server from the captured image in a VPS request from the mobile phone is correct in most cases. In addition, during AR navigation, to ensure the look and feel of the AR navigation interface, the camera captures images for the interface background at a very high rate, so the positions and shooting directions of two consecutive captured images do not change greatly. Based on this, the navigation server can predict the specific pose of the current mobile phone (i.e. the actual pose when shooting the first image) from the movement track and shooting direction before the current moment. The current pose may be determined from the previous movement track and shooting direction in any feasible manner, and the present application is not particularly limited thereto.
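A minimal sketch of this pose prediction, with hypothetical names: position is linearly extrapolated from the last two track points, and the latest shooting direction is reused, on the stated assumption that consecutive AR frames change very little.

```python
from typing import List, Tuple

def predict_current_pose(track: List[Tuple[float, float]],
                         headings: List[float]) -> Tuple[float, float, float]:
    """Predict (x, y, heading) for the frame being localized.

    Assumes at least two prior track points; the heading of the most
    recent frame is carried over unchanged.
    """
    (x1, y1), (x2, y2) = track[-2], track[-1]
    return (2 * x2 - x1, 2 * y2 - y1, headings[-1])
```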
In addition, in the POI updating method provided by the present application, if the navigation server determines that the first pose is abnormal, it may perform a step similar to S7061B to obtain the actual pose and send it to the mobile phone for use. The same applies hereinafter. In this way, the mobile phone is guaranteed to obtain in time the pose requested by the VPS request, ensuring that AR navigation runs normally.
S7062B, the navigation server determines whether the first pose is the same as the actual pose.
If it is determined that the first pose is the same as the actual pose, it may be determined that there is no abnormality in the first pose, that is, S7063B is performed.
If it is determined that the first pose is different from the actual pose, it may be determined that there is an abnormality in the first pose, that is, S7064B is executed.
In practice, the navigation server may not perform the judgment step S7062B, but may directly perform S7063B when determining that the first pose is the same as the actual pose, or perform S7064B when determining that the first pose is different from the actual pose.
Because the first pose is estimated, and the actual pose is likewise predicted from the movement track and shooting direction, the two are identical only with very small probability; even when they differ, they may still be very close, and a very close first pose can also be considered free of abnormality. Based on this, in some embodiments, after S7061B the navigation server may determine whether the first pose is abnormal based on whether the distance between the first pose and the actual pose is greater than a preset threshold; the specific implementation may be as in S7064A-S7066A, and is not repeated here.
S7063B, the navigation server determines that there is no abnormality in the first pose.
S707 is performed after S7063B.
In practice, the navigation server may not execute S7063B, and the navigation server may directly execute S707 when it is determined that the first pose is the same as the actual pose.
S7064B, the navigation server determines that the first pose is abnormal.
S708 is performed after S7064B.
In practice, the navigation server may not execute S7064B, and the navigation server may directly execute S708 if it is determined that the first pose is different from the actual pose.
Based on the technical solutions corresponding to S7061B-S7064B, the navigation server can accurately judge whether the first pose is abnormal, so as to determine whether to perform the remaining flows of the POI updating method, ensuring its smooth implementation.
And S707, the navigation server sends the first pose to the mobile phone.
After receiving the first pose from the navigation server, the mobile phone can navigate with the first pose and the navigation information from the navigation server.
S708, the navigation server identifies the first image to obtain a first attribute feature of the first place.
Wherein the first attribute feature may comprise any one of: name, structure, type. The navigation server can specifically identify and obtain one or more different first attribute features according to requirements. For example, the navigation server may use the attribute features that may exist in the image in the POI information as the first attribute features that need to be identified.
Illustratively, if the first attribute feature includes a name, the navigation server may recognize the text information in the first image by OCR and take the text of a specific portion (for example, the uppermost sign portion) as the name. For example, if a certain shop exists at the first location, the first image may be as shown in fig. 12. The first image may include the name of the shop, and the navigation server may obtain the shop's name using OCR.
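Picking the sign text could be sketched as below. This assumes, purely for illustration, that the OCR step yields (text, top-coordinate) pairs; the OCR engine itself and the pair format are hypothetical, not specified by the application.

```python
from typing import List, Optional, Tuple

def extract_name(ocr_results: List[Tuple[str, float]]) -> Optional[str]:
    """Take the OCR text nearest the top of the image as the shop name,
    on the assumption that the sign is the uppermost text region.

    Each result is (text, top_y) with top_y in image pixels.
    """
    if not ocr_results:
        return None
    return min(ocr_results, key=lambda r: r[1])[0]
```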
Of course, if the first attribute feature includes other content, the navigation server may be derived using any other feasible means, which the present application is not limited to.
Specific implementations of OCR technology may refer to relevant expressions of OCR technology in the description of terms in the foregoing implementations, which are not repeated here.
S709, the navigation server acquires a moving track and a shooting direction in the AR navigation process of the mobile phone, and determines historical POI information of the first place according to the moving track and the shooting direction.
Wherein, the historical POI information comprises all the historical attribute characteristics of the first place.
Specifically, the navigation server may first determine the current actual pose of the mobile phone according to the movement track and shooting direction during AR navigation, determine the corresponding location from the VPS map according to that pose, and take that location as the first location. The historical POI information of that location can then be obtained from the navigation server's own memory as the historical POI information of the first location. For the implementation of determining the current actual pose of the mobile phone, reference may be made to the related description of S7061B in the foregoing embodiment, which is not repeated here.
S710, the navigation server judges whether the content of the first historical attribute characteristic in the historical POI information is the same as the content of the first attribute characteristic.
Wherein the first historical attribute feature is the historical attribute feature of the same category as the first attribute feature. Having the same category means that the features indicate the same kind of information; for example, if the category of the first attribute feature is "name", the historical attribute feature of the same category is the "name" historical attribute feature in the historical POI information.
For example, if the historical attribute features included in the historical POI information are: (name, BHS), (large-scale type, shopping service), (medium-scale type, clothing footwear cap); the first attribute feature is (name, JYY), then S710 specifically needs to compare whether "BHS" and "JYY" are identical.
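The same-category comparison in S710 can be sketched as a dictionary lookup; `feature_changed` and the (category, content) pair representation are hypothetical names for illustration.

```python
from typing import Dict, Tuple

def feature_changed(historical: Dict[str, str],
                    first_feature: Tuple[str, str]) -> bool:
    """Compare the first attribute feature with the historical attribute
    feature of the same category; True means the content differs and the
    historical POI information needs updating."""
    category, content = first_feature
    return historical.get(category) != content
```

For the example above, comparing ("name", "JYY") against a stored name "BHS" reports a change.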
If the navigation server determines that the content of the first historical attribute feature is the same as the content of the first attribute feature, this indicates that the current POI information of the first location has not changed greatly from the previous historical POI information, and the historical POI information of the first location does not need to be updated. At this time, the navigation server does not execute other flows until it again receives a photographed image from the mobile phone in the current AR navigation process.
If the navigation server determines that the content of the first historical attribute feature is different from the content of the first attribute feature, it indicates that the current POI of the first location has changed greatly from the previous historical POI information, and the historical POI information of the first location needs to be updated, that is, S711 is executed.
S711, the navigation server updates the historical POI information of the first location according to the first attribute characteristics.
Specifically, the navigation server may update the content of the first historical attribute feature, which is the same as the first attribute feature type, in the historical POI information to the content of the first attribute feature. For example, if the type of the first attribute is a name, the content is JYY; the content of the historical attribute feature with the category of name in the historical POI information is BHS, and the navigation server may update the BHS in the historical POI information to JYY.
In addition, the navigation server may derive, from the first attribute feature, a second attribute feature of another category present in the historical POI information, and then update the same-category historical attribute feature of the first location in the historical POI information to the content of the second attribute feature.
For example, suppose the category of the first attribute feature is name, its content is JYY, and the historical POI information of the first location includes a medium type. If the navigation server searches the Internet according to "JYY" and finds that its medium type is "milk tea drink", while the medium type in the historical POI information of the first location is "clothing, shoes and hats", the navigation server may update the medium type in the historical POI information of the first location to "milk tea drink".
Of course, the specific implementation of updating the historical POI information of the first location by the navigation server according to the first attribute feature in the implementation may be any other feasible implementation, which is not particularly limited by the present application.
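A minimal sketch of the update in S711, with hypothetical names: the same-category content is overwritten, and any second attribute features derived from the new content (e.g. a type looked up on the Internet from the new name) are passed in as a plain dict, since the lookup itself is outside this sketch.

```python
from typing import Dict, Optional, Tuple

def update_poi(historical: Dict[str, str],
               first_feature: Tuple[str, str],
               derived: Optional[Dict[str, str]] = None) -> Dict[str, str]:
    """Return updated historical POI information.

    Overwrites the feature of the same category as `first_feature`, then
    applies any derived second attribute features that already have a
    category slot in the historical record.
    """
    updated = dict(historical)
    category, content = first_feature
    updated[category] = content
    for cat, val in (derived or {}).items():
        if cat in updated:  # only update categories the record carries
            updated[cat] = val
    return updated
```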
Illustratively, taking the example in which the first attribute feature includes a name and the first location is a certain shop in a certain mall, a specific implementation of S710 and S711 may be as shown in fig. 13. Referring to fig. 13: the first image of the first location may specifically be an image of shop B, while the shop previously at the first location may be shop A, named "BHS".
The navigation server may recognize the name of shop B in the first image, i.e. "JYY", using OCR technology. Then, the navigation server may obtain the historical POI information of the first location from the first sub-VPS map corresponding to the first location, and obtain the historical name (i.e. the original name) of the first location from that historical POI information, i.e. the name "BHS". Specifically, the navigation server may store the POI information of all locations in the first sub-VPS map in the same data table, which may be as shown in table 1 below.
TABLE 1 first sub-VPS map POI information
The third row is the historical POI information of the first place.
Then, the navigation server can judge whether the historical name of the first location and the content of the first attribute feature are the same; if they are different, the historical POI information of the first location is updated using the first attribute feature. If they are the same, the flow ends or no processing is performed. The data table of the POI information of all locations in the first sub-VPS map obtained after updating the historical POI information of the first location may be as shown in table 2 below.
TABLE 2 First sub-VPS map POI information (updated)
Thus, the navigation server can smoothly complete the updating of the POI information of a certain place under the condition that the POI information of the place is changed.
The technical solution provided by the embodiment of the present application is particularly suitable for the AR navigation process of a mobile phone. In this solution, when the user requests VPS positioning during AR navigation with the mobile phone, the navigation server can estimate the first pose from the first image of the first location captured by the mobile phone camera. When the first pose is determined to be abnormal, the first pose is considered not to be the current actual pose of the mobile phone. Since the first pose is obtained by the navigation server from the first image and the VPS map, this also indicates that the first image is abnormal. That is, the first image differs from the image of the first location in the VPS map, which indicates that the POI information of the first location may have changed. In this case, the navigation server can obtain the first attribute feature by identifying the first image and compare it with the historical attribute features in the historical POI information of the first location, so as to determine whether the historical POI information of the first location has actually changed. If the comparison determines that the POI information of the first location has truly changed, the historical POI information of the first location can be updated according to the first attribute feature. With the technical solution provided by the present application, whether the POI of the location photographed by the mobile phone has changed can be judged in real time during AR navigation, and if it has changed, the POI is updated in time. The entire process requires no dedicated action to be performed merely in order to update the POI.
Compared with the existing manual POI updating approach, this POI updating method is more accurate and more efficient, and guarantees the authenticity, accuracy and timeliness of POI information.
In some embodiments, in determining the first pose, if no location image matching the first image is retrieved, the navigation server may not perform subsequent estimation of the first pose. At this time, it may be considered that the first pose is necessarily abnormal, and then the update flow of the POI information may be directly performed, that is, the subsequent S708-S711 are performed. Based on this, referring to fig. 14 in combination with fig. 9, S7052 further includes S1401 and S1402:
S1401, the navigation server determines a search score according to the matching degree between the target location image and the first image.
For example, if there is one target location image, the search score is the matching degree of that target location image with the first image.
If there are multiple target location images, the search score may be a weighted average of the matching degrees of all the target location images with the first image.
S1402, the navigation server judges whether the search score is larger than a preset score.
Illustratively, the preset score may be 60%. The preset score can be obtained from practical experience, and the present application does not particularly limit how it is obtained.
If the navigation server determines that the search score is greater than the preset score, it can be considered that an image similar to the first image exists in the VPS map. In this case, the estimation flow of the first pose may continue, that is, S7053 and its subsequent steps may be executed.
If the navigation server determines that the search score is smaller than the preset score, it can be considered that no image similar to the first image exists in the VPS map. In this case, the POI of the first location may have changed, causing the first image not to match the location images of the first location stored in advance in the VPS map. The navigation server may then execute the POI update flow, that is, perform the subsequent S708-S711.
It should be noted that, the case where the search score is equal to the preset score may be attributed to the case where the search score is greater than the preset score, or may be attributed to the case where the search score is less than the preset score.
In practice, the navigation server may not execute S1402 as a separate judgment step; instead, it may directly execute S7053 when it determines that the search score is greater than the preset score, or execute S708 when it determines that the search score is not greater than the preset score.
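S1401 and S1402 can be sketched as follows. The equal-weight averaging and the 60% threshold come from the examples in the text; the function names and the choice to let equal scores fall into the "not greater" branch are assumptions, since the text leaves the boundary case open.

```python
def retrieval_score(match_degrees, weights=None):
    """S1401: one target place image -> its matching degree;
    several -> a weighted average of their matching degrees."""
    if weights is None:
        weights = [1.0] * len(match_degrees)  # equal weights by default
    return sum(d * w for d, w in zip(match_degrees, weights)) / sum(weights)

def should_continue_pose_estimation(match_degrees, preset_score=0.60):
    """S1402: score > preset -> continue with S7053;
    otherwise go straight to the POI update flow (S708)."""
    return retrieval_score(match_degrees) > preset_score
```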
Based on the technical schemes corresponding to S1401 and S1402, the navigation server can determine more quickly that the first pose is abnormal, and thus decide more quickly whether to execute the process of updating the historical POI information of the first location. Therefore, when the POI information of a location changes, the POI updating efficiency can be improved, ensuring the authenticity, accuracy and timeliness of POI information.
In some embodiments, in addition to updating the POI information of the first location, the navigation server may use the first image to update the location image corresponding to the first location in the VPS map, so that the VPS map is updated in time and ready for subsequent normal use.
Based on this, referring to fig. 15 in conjunction with fig. 7, the method further includes S712:
S712, the navigation server updates the location image of the first location in the VPS map with the first image.
Specifically, after determining the actual pose of the mobile phone, the navigation server may update the location image of the first location corresponding to the pose in the VPS map to the first image.
In addition, in practice, the location image of the first location in the VPS map may include location images corresponding to a plurality of poses. Since the POI information of the first location is changed, all the location images need to be updated.
In one implementation, each time the mobile phone uploads an image corresponding to one of the poses of the first location, the navigation server may use that image to update the corresponding location image in the VPS map, thereby updating all the location images of the first location step by step.
In another implementation manner, the navigation server may utilize a specific image transformation means to transform the first image into images corresponding to multiple poses, and then update the location image of the first location in the VPS map using the first image and the transformed image.
Of course, updating all location images of the first location may be implemented in any feasible manner, which is not particularly limited by the present application.
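Both update strategies described above can be sketched as follows. The `(location_id, pose_key)` keying of the VPS map, the function names, and the `transform` callable standing in for the "specific image transformation means" are all illustrative assumptions.

```python
def update_incrementally(vps_map, location_id, pose_key, uploaded_image):
    """Strategy 1: replace one pose's location image each time the
    mobile phone uploads an image for that pose."""
    vps_map[(location_id, pose_key)] = uploaded_image

def update_by_transform(vps_map, location_id, first_image, pose_keys, transform):
    """Strategy 2: synthesize images for the other poses from the first
    image via an image transformation, then update them all at once."""
    for pose_key in pose_keys:
        vps_map[(location_id, pose_key)] = transform(first_image, pose_key)
```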
S712 may be executed between S710 and S711, or after S711, as long as it is ensured that S712 is executed when the navigation server determines that the content of the historical attribute feature of the same category as the first attribute feature is different from the content of the first attribute feature. The present application does not specifically limit the execution timing of S712.
Based on the technical scheme corresponding to S712, the location image of the first location in the VPS map can be updated in time, so that when a subsequent VPS request carrying an image of the first location is received, the pose of the mobile phone can be estimated more accurately, further ensuring the accuracy of AR navigation and improving the user experience.
In some embodiments, to enhance the user experience, the navigation server may further send, to the mobile phone, indication information indicating that the POI information of the first location has changed when it determines that the historical POI information of the first location needs to be updated (that is, when the navigation server determines that the content of the historical attribute feature of the same category as the first attribute feature is different from the content of the first attribute feature). Based on this, referring to fig. 16 in conjunction with fig. 7, after S710 is performed and the navigation server determines that the content of the historical attribute feature of the same category as the first attribute feature is different from the content of the first attribute feature, the method further includes S713 and S714:
S713, the navigation server sends the first indication information to the mobile phone.
Wherein the first indication information may be used to indicate that there is a change in the first attribute characteristic of the first location. For example, if the first attribute feature is a name, the first indication information may be specifically used to indicate that there is a change in the name of the first location.
Further, the first indication information may also be used to indicate the specific change of the first attribute feature of the first location. For example, if the first attribute feature is the name with specific content X, and the name of the first location in the historical POI information is Y, the first indication information may be used to indicate that the name of the first location is changed from Y to X.
Of course, if there are a plurality of first attribute features that differ from the corresponding historical attribute features in the historical POI information of the first location, there may be a plurality of pieces of first indication information, each corresponding to one of the plurality of first attribute features. Alternatively, a single piece of first indication information may be used to indicate that the plurality of first attribute features have changed, or even to indicate the specific changes of those features.
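The construction of the first indication information can be sketched as one message per changed attribute. The message wording and the per-attribute message granularity (rather than one combined message) are illustrative assumptions.

```python
def build_indications(historical, recognized):
    """Build one indication message per attribute whose recognized content
    differs from the historical content of the same category.

    historical and recognized map attribute name -> content, e.g.
    {"name": "Y"} vs {"name": "X"}.
    """
    messages = []
    for key, new_value in recognized.items():
        old_value = historical.get(key)
        if old_value is not None and old_value != new_value:
            messages.append(
                f"The {key} of the location is changed from {old_value} to {new_value}")
    return messages
```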
S714, the mobile phone receives the first indication information from the navigation server and displays the first indication information in the AR navigation interface.
For example, referring to fig. 17, after receiving the first indication information, the mobile phone may display the first indication information near the first location in the AR navigation interface. Taking the first location being a shop as an example, where the first indication information indicates that the name of the first location is changed from X to Y, the first indication information may specifically be: the name of the shop is changed from X to Y.
Of course, if there are multiple first indication information, the mobile phone will display the multiple first indication information on the AR navigation interface in a similar manner to that shown in fig. 17.
Based on the technical schemes corresponding to S713 and S714, the user can know which places in the AR navigation route are changed in time, so that more possibilities of navigation are provided for the user, and the use experience of the user is improved.
It will be appreciated that, in order to achieve the above-mentioned functions, the electronic device includes corresponding hardware structures and/or software modules for performing the respective functions. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
The embodiment of the application can divide the functional modules of the electronic device according to the method example, for example, each functional module can be divided corresponding to each function, or two or more functions can be integrated in one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
The embodiment of the application also provides an electronic device, which comprises: a memory and one or more processors; the memory is coupled with the processor; wherein the memory has stored therein computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to perform the steps performed by the electronic device in the POI update method as provided in the foregoing embodiments. The specific structure of the electronic device may refer to the structure of the electronic device shown in fig. 4.
The embodiment of the application also provides a navigation server, which comprises a processor and a memory; the memory is configured to store executable instructions, and the processor is configured to execute the executable instructions stored in the memory, so that the navigation server performs the steps performed by the navigation server in the POI update method as provided in the above embodiments. The specific structure of the navigation server may refer to the structure shown in fig. 5.
The present application also provides a chip system. As shown in fig. 18, the chip system 1100 includes at least one processor 1101 and at least one interface circuit 1102. The processor 1101 and interface circuit 1102 may be interconnected by wires. For example, the interface circuit 1102 may be used to receive signals from other devices (e.g., a memory of an electronic device). For another example, the interface circuit 1102 may be used to send signals to other devices (e.g., the processor 1101).
The interface circuit 1102 may, for example, read instructions stored in a memory and send the instructions to the processor 1101. The instructions, when executed by the processor 1101, may cause the electronic device to perform the various steps of the embodiments described above. Of course, the system-on-chip may also include other discrete devices, which are not particularly limited in accordance with embodiments of the present application.
Embodiments of the present application also provide a computer-readable storage medium including computer instructions that, when executed on an electronic device, cause the electronic device to perform a POI update method as provided in the foregoing embodiments.
Embodiments of the present application also provide a computer program product containing executable instructions that, when run on an electronic device, cause the electronic device to perform a POI update method as provided by the previous embodiments.
Embodiments of the present application also provide a computer-readable storage medium including computer instructions that, when executed on a navigation server, cause the navigation server to perform a POI update method as provided by the foregoing embodiments.
Embodiments of the present application also provide a computer program product containing executable instructions that, when run on a navigation server, cause the navigation server to perform the POI update method as provided by the previous embodiments.
It will be apparent to those skilled in the art from this description that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts shown as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application may be essentially or a part contributing to the prior art or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (may be a single-chip microcomputer, a chip or the like) or a processor (processor) to execute all or part of the methods of the embodiments of the present application. And the aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely illustrative of specific embodiments of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present application should be covered by the scope of the present application. Therefore, the protection scope of the application is subject to the protection scope of the claims.

Claims (10)

1. A POI update method applied to a navigation server, the method comprising:
The navigation server receives a first image from an electronic device; the first image is an image shot by the electronic equipment on a first place in the process of Augmented Reality (AR) navigation;
The navigation server estimates a first pose of the electronic device according to the first image and a Visual Positioning Service (VPS) map; the VPS map comprises all images shot at different shooting angles at each position in a plurality of different areas;
The navigation server identifies the first image under the condition that the first pose is determined to be abnormal, so as to obtain a first attribute feature of the first place;
The navigation server acquires a moving track and a shooting direction in the AR navigation process of the electronic equipment, and determines historical point of interest (POI) information of the first place according to the moving track and the shooting direction; the historical POI information comprises all historical attribute characteristics of the first place;
the navigation server updates the historical POI information by using the first attribute feature under the condition that the content of the first historical attribute feature in the historical POI information is determined to be different from the content of the first attribute feature; the first historical attribute features are the same in kind as the first attribute features;
wherein the navigation server estimates a first pose of the electronic device from the first image and a visual positioning service VPS map, comprising:
The navigation server acquires a moving track in the AR navigation process of the electronic equipment, and determines the current first position of the electronic equipment according to the moving track; the navigation server determines a first sub-VPS map of the VPS map corresponding to the first location;
The navigation server searches a target place image matched with the first image from the first sub-VPS map;
the navigation server extracts the characteristic points of the target place image, and matches the characteristic points of the target place image with the first image to obtain target characteristic points successfully matched with the first image;
And the navigation server determines the first pose according to the target feature points and pose information of the target place image.
2. The method of claim 1, wherein the navigation server determining that the first pose is abnormal comprises:
The navigation server acquires a moving track in the AR navigation process of the electronic equipment;
and the navigation server determines that the first pose is abnormal based on the movement track.
3. The method of claim 2, wherein the navigation server determining that the first pose is abnormal based on the movement trajectory comprises:
the navigation server determines the current first position of the electronic equipment according to the movement track;
The navigation server acquires all the poses corresponding to the first position from a first historical pose record; the first historical pose record comprises all shooting positions and poses corresponding to the shooting positions when all electronic equipment performs AR navigation in an area corresponding to the first position before the current moment;
the navigation server calculates the average value of all the poses corresponding to the first position to obtain an average pose;
and the navigation server determines that the first pose is abnormal under the condition that the distance between the first pose and the average pose is larger than a preset threshold value.
4. The method of claim 2, wherein the navigation server determining that the first pose is abnormal based on the movement trajectory comprises:
the navigation server acquires a shooting direction in the AR navigation process of the electronic device;
The navigation server determines the current actual pose of the electronic equipment according to the moving track and the shooting direction;
the navigation server determines that the first pose is abnormal if the first pose is determined to be different from the actual pose.
5. The method of any of claims 1-4, wherein, in the event that the first attribute feature comprises a name, the navigation server identifies the first image to obtain the first attribute feature of the first location, comprising:
the navigation server recognizes the first image using OCR technology to obtain the first attribute feature.
6. The method of claim 2, wherein after the navigation server searches the first sub-VPS map for a target location image matching the first image, the method further comprises:
The navigation server determines a retrieval score according to the matching degree of the target place image and the first image;
The navigation server extracts the characteristic points of the target place image and matches the characteristic points of the target place image with the first image under the condition that the search score is larger than a preset score, so as to obtain target characteristic points successfully matched with the first image;
And under the condition that the search score is smaller than a preset score, the navigation server determines that the first pose is abnormal, and identifies the first image so as to obtain a first attribute characteristic of the first place.
7. The method according to any one of claims 1-4, wherein in case the navigation server updates the historical POI information with the first attribute feature, the method further comprises:
the navigation server updates a location image of a first location in a VPS map using the first image.
8. The method according to any one of claims 1-4, wherein the navigation server, in case it is determined that the content of the first historical attribute feature in the historical POI information is not identical to the content of the first attribute feature, further comprises:
The navigation server sends first indication information to the electronic equipment so that the electronic equipment displays the first indication information on an AR navigation interface; the first indication information is at least used for indicating that the first attribute feature of the first place is changed.
9. A navigation server comprising a memory and one or more processors; the memory is coupled with the processor; wherein the memory has stored therein computer program code comprising computer instructions which, when executed by the processor, cause the navigation server to perform the POI update method of any of claims 1-8.
10. A computer readable storage medium comprising computer instructions which, when run on a navigation server, cause the navigation server to perform the POI update method of any one of claims 1-8.
CN202310185184.2A 2023-02-22 POI updating method and navigation server Active CN117109603B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310185184.2A CN117109603B (en) 2023-02-22 POI updating method and navigation server

Publications (2)

Publication Number Publication Date
CN117109603A CN117109603A (en) 2023-11-24
CN117109603B true CN117109603B (en) 2024-07-09

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant