CN117537820A - Navigation method, electronic device and readable storage medium - Google Patents


Info

Publication number
CN117537820A
CN117537820A
Authority
CN
China
Prior art keywords
information
input
electronic device
user
calibration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311400068.4A
Other languages
Chinese (zh)
Inventor
张晓怡
苏佳明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202311400068.4A priority Critical patent/CN117537820A/en
Publication of CN117537820A publication Critical patent/CN117537820A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • G01C21/203Specially adapted for sailing ships

Abstract

The application discloses a navigation method, an electronic device, and a readable storage medium, belonging to the technical field of navigation. The method includes: receiving a first input of a user; in response to the first input, displaying a shooting preview interface and displaying initial path information in the shooting preview interface, wherein the initial path information is generated based on initial position information and real-time position information of the electronic device; receiving a second input of the user; and in response to the second input, displaying a navigation route, wherein the navigation route is generated based on the initial path information and is used to indicate a return to an initial position corresponding to the initial position information.

Description

Navigation method, electronic device and readable storage medium
Technical Field
The application belongs to the technical field of navigation, and particularly relates to a navigation method, electronic equipment and a readable storage medium.
Background
Currently, common navigation methods typically locate the user through the Global Positioning System (GPS) and plan a navigation route for the user according to the user's positioning information.
However, existing navigation methods can only acquire the user's positioning information as plane (two-dimensional) information. Because an underwater environment varies in depth and GPS cannot acquire the user's depth information, an underwater navigation route planned from GPS positioning information has low accuracy.
Disclosure of Invention
The embodiment of the application aims to provide a navigation method which can improve the accuracy of underwater navigation.
In a first aspect, an embodiment of the present application provides a navigation method, performed by an electronic device, the method including:
receiving a first input of a user;
responding to the first input, displaying a shooting preview interface, and displaying initial path information in the shooting preview interface; wherein the initial path information is generated based on initial location information and real-time location information of the electronic device;
receiving a second input from the user;
displaying a navigation route in response to the second input; the navigation route is generated based on the initial path information, and the navigation route is used for indicating to return to an initial position corresponding to the initial position information.
In a second aspect, an embodiment of the present application provides an electronic device, including:
The first receiving module is used for receiving a first input of a user;
the first response module is used for responding to the first input, displaying a shooting preview interface and displaying initial path information in the shooting preview interface; wherein the initial path information is generated based on initial location information and real-time location information of the electronic device;
the second receiving module is used for receiving a second input of a user;
a second response module for displaying a navigation route in response to the second input; the navigation route is generated based on the initial path information, and the navigation route is used for indicating to return to an initial position corresponding to the initial position information.
In a third aspect, embodiments of the present application also provide an electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the method as described in the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and where the processor is configured to execute a program or instructions to implement a method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product stored in a storage medium, the program product being executable by at least one processor to implement the method according to the first aspect.
In the embodiment of the application, upon receiving a first input of a user, the electronic device responds to the first input by displaying, in a shooting preview interface, initial path information generated based on initial position information and real-time position information of the electronic device. The initial path information represents the path along which the user moves during shooting, so the user can observe his or her moving direction and moving distance in real time through the initial path information. When the user needs to return, a second input of the user is received, and in response to the second input a navigation route indicating a return to the initial position is displayed; the navigation route may be obtained by processing the initial path so as to guide the user back more quickly. In the embodiment of the application, accurate navigation can be realized without GPS positioning, the accuracy of underwater navigation can be improved, and the safety of the user during underwater activities is ensured.
Drawings
FIG. 1 is a flow chart of steps of a navigation method according to an embodiment of the present application;
FIG. 2 is a block diagram of an electronic device according to an embodiment of the present application;
FIG. 3 is an interface schematic diagram of electronic device calibration according to an embodiment of the present application;
FIG. 4 is an interface schematic of automatic jump after calibration of an embodiment of the present application;
FIG. 5 is a schematic diagram of an interface during model entry in accordance with an embodiment of the present application;
FIG. 6 is a schematic diagram of an interface after model entry is completed in an embodiment of the present application;
FIG. 7 is an interface schematic diagram showing navigation routes according to an embodiment of the present application;
FIG. 8 is a schematic diagram of an interface for modifying a marker point according to an embodiment of the present application;
FIG. 9 is an interface schematic of an import target route model according to an embodiment of the present application;
FIG. 10 is a schematic diagram of an interface for importing a route model with a preset route according to an embodiment of the present application;
FIG. 11 is a schematic diagram of an interface for guiding movement of an electronic device during underwater photography according to an embodiment of the present application;
FIG. 12 is a flow chart of steps of another navigation method of an embodiment of the present application;
FIG. 13 is a schematic illustration of a pointing direction interface for a direction marker in accordance with an embodiment of the present application;
FIG. 14 is a block diagram of another electronic device of an embodiment of the present application;
Fig. 15 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 16 is a schematic diagram of a hardware structure of another electronic device according to an embodiment of the present application.
Detailed Description
Technical solutions of the embodiments of the present application will be clearly described below with reference to the accompanying drawings. It is apparent that the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application fall within the protection scope of the present application.
The terms "first," "second," and the like in the description of the present application are used to distinguish between similar objects and do not necessarily describe a particular sequence or chronological order. It is to be understood that the terms so used are interchangeable where appropriate, so that embodiments of the present application can be implemented in sequences other than those illustrated or described herein. Objects identified by "first," "second," etc. are generally of one type, and the number of such objects is not limited; for example, the first object may be one object or multiple objects. In addition, "and/or" in the specification denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
The navigation method provided by the embodiment of the application is described in detail below through specific embodiments and application scenes thereof with reference to the accompanying drawings.
Fig. 1 shows a flowchart of steps of a navigation method according to an embodiment of the present application, the method being performed by an electronic device, the method comprising:
step 101, a first input of a user is received.
In this step, the first input may be an input of a fixed key of the electronic device by the user, a specific gesture input by the user, a pressing input of the electronic device by the user, or the like.
For example, as shown in fig. 4, an input when the user presses the power key PU1 of the electronic device is taken as a first input by the user. Of course, the first input may also be implemented by the user pressing other fixed keys, which is not limited in the embodiment of the present application.
Step 102, responding to a first input, displaying a shooting preview interface, and displaying initial path information in the shooting preview interface; wherein the initial path information is generated based on the initial location information and the real-time location information of the electronic device.
In this step, initial position information of the electronic device may be first acquired, and after the electronic device responds to the first input, a shooting preview interface is displayed on a display screen of the electronic device, and real-time position information of the electronic device is continuously acquired, so as to generate initial path information according to the initial position information and the real-time position information of the electronic device. Wherein the initial path information indicates a corresponding initial path.
In this embodiment of the present application, the shooting preview interface may be a preview interface of a camera program when shooting an image or a video, for example, a user may synchronously display initial path information during a process of recording the video in real time; the initial path information may also be displayed during the picture taking process. The initial path information may be displayed in a form of a small window superimposed in the photographing preview interface, and a user may adjust the size and display position of the window based on the need. The initial path information can be displayed in the shooting preview interface with a certain transparency, so that the image content in the shooting preview interface is prevented from being blocked.
For example, as shown in fig. 5, in response to the first input, an image display area b3 and a route display area a3 exist on a photographing preview interface of the display screen, a photographed image may be displayed through the image display area b3, an initial route corresponding to the initial route information may be displayed in the route display area a3, or a current traveling direction may be displayed in the display area b3, and the current traveling direction may be indicated by an arrow.
Step 103, receiving a second input from the user.
In this step, the second input may be an input of a fixed key of the electronic device by the user, a specific gesture input by the user, a pressing input of the electronic device by the user, or the like.
For example, as shown in fig. 4, when the user clicks the power key PU1 again, the operation of clicking the power key PU1 by the user may be regarded as the second input by the user.
Step 104, responding to the second input, displaying a navigation route; the navigation route is generated based on the initial path information, and the navigation route is used for indicating to return to an initial position corresponding to the initial position information.
In this step, the electronic device stops shooting the image in response to the second input, and displays the navigation route on the display screen of the electronic device. The navigation route is a route obtained by processing the initial path information, for example, the original initial path may be shortened, so that the user can quickly realize the return. The navigation route is used for indicating the initial position of the electronic equipment corresponding to the returned initial position information. The navigation route not only can navigate the user on the ground, but also can navigate the user in an underwater scene or an air scene. It should be noted that, when the electronic device responds to the second input, the image may be continuously captured and the navigation route may be displayed, so that the normal capturing action is not affected.
For example, as shown in FIG. 5, when the user clicks the power key again, the display interface of the electronic device automatically jumps from FIG. 5 to the display interface shown in FIG. 6. FIG. 6 is a schematic diagram of an interface after model entry is completed in the embodiment of the present application. As shown in FIG. 6, after image acquisition is stopped, a selection window a4 is displayed on the display screen; the selection window a4 displays the text "Entry ended. Display navigation route?" together with two option buttons, "Yes" and "No", and the corresponding operation is performed according to the user's selection. Clicking either option button may be accomplished by touching the display directly, by clicking a virtual key, or by pressing a physical key, for example pressing the volume up key to select the "Yes" option button and pressing the volume down key to select the "No" option button. If the user clicks the "Yes" option button, the electronic device responds to the corresponding input, and the display screen interface shown in FIG. 6 automatically jumps to the display screen interface shown in FIG. 7, so that the navigation route L2 is displayed on the display screen of the electronic device to guide the user back. If the user clicks the "No" button, the navigation route is not displayed, and the shooting preview interface may continue to be displayed.
In the embodiment of the application, upon receiving a first input of a user, the electronic device responds to the first input by displaying, in a shooting preview interface, initial path information generated based on initial position information and real-time position information of the electronic device. The initial path information represents the path along which the user moves during shooting, so the user can observe his or her moving direction and moving distance in real time through the initial path information. When the user needs to return, a second input of the user is received, and in response to the second input a navigation route indicating a return to the initial position is displayed; the navigation route may be obtained by processing the initial path so as to guide the user back more quickly. In the embodiment of the application, accurate navigation can be realized without GPS positioning, the accuracy of underwater navigation can be improved, and the safety of the user during underwater activities is ensured.
In some embodiments of the present application, before step 101, further comprising:
step 105, displaying a calibration interface, wherein the calibration interface comprises a first calibration identifier and a device identifier, the first calibration identifier indicates a calibration gesture, and the device identifier indicates a current gesture of the electronic device.
In this step, in order to ensure the accuracy of navigation, it is necessary to calibrate the electronic device for posture before entering the navigation route. The calibration interface is displayed on the display screen of the electronic device first, wherein the calibration interface comprises a first calibration identifier and a device identifier, the first calibration identifier indicates a calibration gesture, and the device identifier indicates the current gesture of the electronic device. By changing the relative positions of the first calibration mark and the device mark, the calibration of the gesture of the electronic device can be realized.
As shown in fig. 2, the electronic device includes an inertial measurement sensor 11 (Inertial Measurement Unit, IMU), and the electronic device obtains posture information of the electronic device through the IMU11, and generates a device identifier indicating a current posture of the electronic device according to the posture information of the electronic device.
An IMU is a device that integrates a plurality of sensors for measuring and monitoring acceleration, angular velocity and direction of an object. Typically, IMUs consist of accelerometers, gyroscopes, and magnetometers, which measure linear acceleration, angular velocity, and magnetic field information of an object to derive the direction and position of the object in space. The IMU is widely applied to the fields of navigation systems, aerospace, unmanned aerial vehicles, robots and the like.
As shown in fig. 4, a model entry button a1 and a model import button b1 are arranged on the display screen of the electronic device, and model entry can be selected by clicking the model entry button a1 on the display screen. When a user opens the shooting function of the electronic device or clicks the model entry button a1, the user can operate by directly touching the display screen, or through the virtual keys or physical keys of the electronic device, where the physical keys include a power key PU1, a volume up key PU2 and a volume down key PU3. Fig. 3 is an interface schematic diagram of electronic device calibration according to the embodiment of the present application. As shown in fig. 3, after the shooting function of the electronic device is turned on and model entry is selected, the calibration interface is displayed on the display screen of the electronic device. The displayed calibration interface includes a first calibration identifier b2 and a device identifier a2, whose display parameters differ: the device identifier a2 is, for example, an ellipse filled with red, and the first calibration identifier b2 is, for example, a blue ellipse; the device identifier a2 and the first calibration identifier b2 may also take other shapes or other colors. The first calibration identifier b2 indicates a calibration posture, and the device identifier a2 indicates the current posture of the electronic device.
Step 106, receiving a third input from the user to the electronic device.
In this step, since the device identifier and the first calibration identifier are used to guide the user to perform the posture calibration operation on the electronic device, in the case where the posture information of the electronic device is changed, the relative position of the device identifier and the first calibration identifier is also changed accordingly. Therefore, the third input to the electronic device by the user is, for example, an operation of performing posture calibration on the electronic device by the user.
In response to the third input, the display parameters of the device identification are updated, step 107.
In this step, the electronic device responds to the third input, acquires real-time posture information of the electronic device through the IMU11, and obtains display parameters of the device identifier representing the current posture of the electronic device according to the real-time posture information of the electronic device. The display parameters are, for example, a display position parameter, a display direction parameter, a display area parameter, and the like of the device identifier.
And step 108, displaying calibration success prompt information under the condition that the display parameters of the equipment identifier are matched with the display parameters of the first calibration identifier.
In this step, the display parameters of the device identifier are continuously updated until they match the display parameters of the first calibration identifier, at which point the calibration succeeds and a calibration success prompt is displayed on the calibration interface.
For example, as shown in fig. 3, the electronic device is, for example, a mobile phone. When the display parameters of the device identifier a2 do not match the display parameters of the first calibration identifier b2, guidance information c2 is generated, and the user moves the electronic device according to the guidance information c2 so as to change the posture of the electronic device. When the display parameters of the device identifier match the display parameters of the first calibration identifier, prompt information d2 is generated, for example reading "calibration is successful", indicating that calibration of the electronic device is complete. In fig. 3 (301), the guidance information c2 reads "the mobile phone needs to rotate leftwards"; in fig. 3 (302), the guidance information c2 reads "rotate rightwards to change the mobile phone orientation"; and in fig. 3 (303), the prompt information d2 reads "calibration is successful". The user moves the electronic device according to the guidance information c2 until the display parameters of the device identifier a2 match the display parameters of the first calibration identifier b2, and the blue posture frame b2 changes to a green posture frame, indicating that calibration of the electronic device is completed. After calibration is completed, the calibration interface shown in fig. 3 automatically jumps to the display screen interface shown in fig. 4.
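The matching check between the device identifier and the first calibration identifier can be sketched as follows; the Euler-angle pose representation, the 5° tolerance, and the left/right decision rule are hypothetical details not specified in the application:

```python
def pose_matches(device_pose, target_pose, tolerance_deg=5.0):
    """Return True when every Euler angle (deg) of the device pose is within
    `tolerance_deg` of the target calibration pose (hypothetical criterion)."""
    return all(abs(d - t) <= tolerance_deg
               for d, t in zip(device_pose, target_pose))

def guidance(device_yaw, target_yaw, tolerance_deg=5.0):
    """Produce a rotate-left/right hint like the guidance information c2,
    or the success prompt d2 once the poses match (sign convention assumed)."""
    delta = target_yaw - device_yaw
    if abs(delta) <= tolerance_deg:
        return "calibration is successful"
    return "rotate leftwards" if delta > 0 else "rotate rightwards"
```

In practice the device pose would be refreshed from the IMU on every frame, and the identifier's display parameters (position, orientation, color) re-rendered from it until `pose_matches` holds.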
In some embodiments of the present application, step 102 includes:
Sub-step 1021, controlling a camera of the electronic device to acquire images, and acquiring coordinate information and posture information of the electronic device corresponding to each frame of image.
In this step, as shown in fig. 2, the electronic device further includes a camera 12 and a pressure sensor 13. After the electronic device calibration is completed, the electronic device controls the camera 12 to acquire a plurality of frame images, and acquires coordinate information and posture information of the electronic device corresponding to each frame image. Wherein the coordinate information of each frame image is generated based on the pose information of each frame image and the coordinate information of the previous frame image. And the coordinate information of each frame of image corresponds to the real-time position information of the electronic equipment shooting each frame of image. The coordinate information includes plane coordinate information and depth coordinate information, and the posture information includes gyroscope data information and acceleration information.
In this step, after the electronic device completes the posture calibration, the initial position information and the initial gyroscope data information of the electronic device are acquired, wherein the initial position information includes initial plane coordinate information and initial depth coordinate information. The initial coordinates characterized by the initial position information are expressed, for example, as (x0, y0, 0); because the depth coordinate information is obtained from the pressure value transmitted by the pressure sensor, and the pressure value is 0 while the user is still on shore and has not entered the water, the initial depth coordinate information is 0. When the camera of the electronic device is controlled to acquire images, the electronic device measures, through the inertial measurement sensor 11, the gyroscope data information and acceleration (ACC) information of the electronic device corresponding to each frame of image, and determines the plane coordinate information of each frame of image according to the gyroscope data information, the acceleration information and the coordinate information of the previous frame of image. The plane coordinate information includes longitude information and latitude information, and the plane coordinate information corresponding to the first frame image is obtained by updating the initial position information with the posture information of the first frame image.
For example, the plane coordinate corresponding to the (n-1)-th frame image is P_{n-1} = (x_{n-1}, y_{n-1}), with gyroscope data information ω_{n-1}, acceleration information a_{n-1} and posture coordinate q_{n-1}; the plane coordinate corresponding to the n-th frame image is P_n = (x_n, y_n), with gyroscope data information ω_n, acceleration information a_n and posture coordinate q_n. Here n is a positive integer, and the posture coordinate represents the posture or orientation of the electronic device in three-dimensional space. The update formula is as follows:
P_n = P_{n-1} + v_{n-1}·Δt + (1/2)·a_{n-1}·Δt²
v_n = v_{n-1} + a_{n-1}·Δt
wherein Δt is the time interval of image acquisition and v_{n-1} indicates the speed corresponding to the (n-1)-th frame image.
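The per-frame coordinate update described above amounts to dead reckoning between consecutive frames. A minimal sketch, assuming 2-D tuples and a start from rest (both illustrative details, not from the application):

```python
def dead_reckon_step(p_prev, v_prev, a_prev, dt):
    """One dead-reckoning update between consecutive frames.

    p_prev: (x, y) plane coordinate of frame n-1
    v_prev: (vx, vy) velocity of frame n-1
    a_prev: (ax, ay) acceleration measured by the IMU for frame n-1
    dt:     image-acquisition time interval Δt
    Returns the frame-n coordinate P_n and velocity v_n.
    """
    p_n = tuple(p + v * dt + 0.5 * a * dt * dt
                for p, v, a in zip(p_prev, v_prev, a_prev))
    v_n = tuple(v + a * dt for v, a in zip(v_prev, a_prev))
    return p_n, v_n

def dead_reckon_path(p0, accels, dt):
    """Integrate a sequence of per-frame accelerations starting at rest at p0,
    yielding the plane coordinates of every frame."""
    path = [p0]
    v = (0.0, 0.0)
    for a in accels:
        p, v = dead_reckon_step(path[-1], v, a, dt)
        path.append(p)
    return path
```

Such integration drifts over time, which is one reason the posture calibration before entry and the pressure-based depth channel matter in this design.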
Further, the electronic device can determine depth coordinate information corresponding to each frame of image by acquiring the pressure value transmitted by the pressure sensor 13. And the coordinate axis of the depth coordinate information corresponding to each frame of image is perpendicular to the plane of the plane coordinate information.
For example, the depth coordinate information corresponding to the n-th frame image is Z_n and the pressure value transmitted by the pressure sensor is P_n; then Z_n = c·P_n, where c is the conversion coefficient between the pressure value and the depth coordinate information and is determined for the same water area.
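The depth computation can be sketched as below; calibrating c from a single known-depth reference reading is an assumed procedure, since the application only states that c is determined for the same water area:

```python
def calibrate_coefficient(known_depth_m, pressure_reading):
    """Determine the conversion coefficient c for the current water area from
    one reference reading at a known depth (hypothetical calibration step)."""
    return known_depth_m / pressure_reading

def depth_from_pressure(pressure_reading, c):
    """Z_n = c * P_n: depth coordinate from the pressure-sensor value.
    A zero reading (user still on shore) yields the initial depth of 0."""
    return c * pressure_reading
```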
For example, as shown in fig. 4, when the electronic device completes the posture calibration, a prompt text c1 may be displayed on the display screen, the prompt text c1 reading, for example, "click the power key to start entering a model". Guided by the prompt text c1, the user can press the power key PU1 of the electronic device as an input; the electronic device responds to the input, controls the camera of the electronic device to acquire images, and acquires the coordinate information and posture information of the electronic device corresponding to each frame of image, and the display screen interface shown in fig. 4 automatically jumps to the display screen interface shown in fig. 5.
Sub-step 1022 displays the initial path information in the photographing preview interface according to the coordinate information and the pose information corresponding to each frame of image.
In this step, the electronic device generates initial path information and displays the initial path information on the display screen according to the coordinate information and the posture information corresponding to each frame of the image.
For example, as shown in fig. 5, an image display area b3 and a route display area a3 exist on a photographing preview interface of a display screen, a photographed image may be displayed through the image display area b3, and an initial route and a current traveling direction corresponding to the initial route information are displayed in the route display area a 3.
In the embodiment of the application, the camera of the electronic equipment is used for collecting the images, and the initial path information is displayed in the shooting preview interface according to the coordinate information and the gesture information corresponding to each frame of image, so that the purpose of guiding the current travelling path of the user can be achieved, and the user can sense the condition of the current travelling path.
In some embodiments of the present application, after the step of displaying the shooting preview interface in step 102, the method further includes:
step 109, a fourth input is received from the user.
In this step, the user may perform a marking operation on the travel route during model entry of the electronic device or during image acquisition. The fourth input of the user can be realized by clicking a mark button on the display screen by the user or by clicking a virtual key or a physical key with mark operation input. For example, as shown in FIG. 4, the user may double-click the volume up key PU2 quickly to effect a fourth input by the user.
In step 110, in response to the fourth input, marking information is displayed in the shooting preview interface, where the marking information is used to mark the location information of the electronic device corresponding to the fourth input, and the marking information is displayed on the initial path indicated by the initial path information.
In this step, each time the electronic device responds to the fourth input, it displays one piece of marking information in the shooting preview interface. The marking information marks the position information of the electronic device at the time of the fourth input and is displayed on the initial path indicated by the initial path information; it includes the position information and the shape information of the mark. In the case that the initial path includes first marking information and second marking information, the navigation route also includes the first marking information and the second marking information, and the first distance on the navigation route is smaller than the second distance on the initial path, where the first distance is the navigation distance between the positions corresponding to the first and second marking information on the navigation route, and the second distance is the distance between the positions corresponding to the first and second marking information on the initial path. Planning the navigation route through the marking information shortens the user's route back to the initial position and saves navigation time.
As shown in fig. 7, the mark points S1, S2, …, Sm indicated by the marking information are, for example, triangular, five-pointed-star or otherwise shaped icons, to which the present application is not limited. Connecting all the mark points with straight lines yields a route from S1 to Sm; the route from Sm back to S1 is taken as the navigation route L2 and displayed on the display screen to guide the user back to the initial position, where m is a positive integer equal to the number of mark points.
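The construction of the return route L2 from the mark points can be illustrated with a small sketch (hypothetical helper names; plain 2-D coordinates assumed): the mark points are connected by straight segments and traversed in reverse, from Sm back to S1, which is typically shorter than retracing the whole initial path.

```python
import math

def path_length(points):
    # Total length of the straight segments between consecutive points.
    return sum(math.hypot(x1 - x0, y1 - y0)
               for (x0, y0), (x1, y1) in zip(points, points[1:]))

def return_route(marks):
    # Traverse the mark points in reverse order: Sm, ..., S2, S1.
    return list(reversed(marks))
```

A short usage example: for an initial path that detours through two corners, the mark-to-mark route skips the detour, so its length is smaller than that of the initial path, matching the first-distance/second-distance relation stated above.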
In some embodiments of the present application, after step 104, further comprising:
step 111, a fifth input is received from the user.
In this step, after the user returns to the initial position along the navigation route, the navigation route can be exported to the image storage space for viewing, and the mark points represented by the marking information on the navigation route can be selected and edited as required.
In response to the fifth input, the third marking information on the navigation route is deleted or the fourth marking information is displayed on the navigation route, step 112.
In this step, the electronic device deletes the third marking information on the navigation route, or adds the fourth marking information to the navigation route, in response to the fifth input. The third marking information is, for example, existing marking information on the navigation route; the fourth marking information is, for example, newly added marking information, or new marking information formed by moving a mark point indicated by existing marking information.
In other embodiments, the navigation route between two pieces of marking information may be deleted by selecting the route segment between them. When doing so, if at least one of the two pieces of marking information corresponds to the last frame of image, the segment between them is deleted directly; otherwise, the coordinates of the two mark points are connected with a straight line, thereby simplifying the route and saving return time.
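The straight-line simplification described above can be sketched as a list operation (illustrative only; the patent does not specify a data structure): the points strictly between two selected mark indices are dropped, and the two remaining mark coordinates become adjacent, forming the direct segment.

```python
def simplify_route(route, i, j):
    # Keep everything up to mark index i and from mark index j onward;
    # route[i] and route[j] becoming adjacent forms the straight segment.
    assert i < j
    return route[:i + 1] + route[j:]
```

For a route that bulges upward between its endpoints, removing the interior points leaves the two marks joined directly, shortening the return trip.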
For example, as shown in fig. 8, the display screen of the electronic device includes a delete control w1, an add control w2 and a modify control w3. The user selects marking information on the navigation route and taps the delete control w1 or the modify control w3 to delete or modify it. The user may also select a location on the navigation route that has no marking information and add marking information there by tapping the add control w2.
Step 113, updating the navigation route based on the deleted third mark information or the displayed fourth mark information.
In this step, the navigation route is updated based on the deleted third marking information or the displayed fourth marking information, forming a new navigation route.
Fig. 12 shows a flow chart of steps of some embodiments of the present application, the method comprising:
step 201, a sixth input of a user is received.
In this step, the user may be navigated to a destination using a preset route stored in the electronic device as the navigation route.
For example, as shown in fig. 9 (901), while on shore the user selects from a plurality of preset route models, including an M-type three-dimensional model M1, a Pacific three-dimensional model M2, an Atlantic three-dimensional model M3, and the like.
Step 202, in response to the sixth input, displaying preset path information, a direction identifier, a second calibration identifier and a device identifier.
In this step, the electronic device imports the preset route model selected by the user in response to the sixth input. Before displaying the preset path information, the direction identifier, the second calibration identifier and the device identifier, the electronic device acquires its own pose information, displays a first calibration identifier and the device identifier in the preview picture, and calibrates itself through the first calibration identifier and the device identifier.
For example, as shown in fig. 9 (902) and 9 (903), after the electronic device imports the selected preset route model, it needs to be calibrated, so the first calibration identifier b5 and the device identifier a5 are generated on the display screen interface. The electronic device is calibrated through the first calibration identifier b5 and the device identifier a5: the user moves the electronic device according to the prompt information c5 displayed on the display screen until the display parameters of the first calibration identifier b5 match those of the device identifier a5, completing the calibration. The prompt information c5 is, for example, "rotate and translate the mobile phone to calibrate the initial position".
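One plausible way to decide that the display parameters of the device identifier "match" those of a calibration identifier is a simple tolerance test on on-screen position and scale; the field names and thresholds below are assumptions, not taken from the patent.

```python
def identifiers_match(device_id, calib_id, pos_tol=5.0, scale_tol=0.05):
    # Declare calibration complete when the on-screen device identifier
    # sits close enough to the calibration identifier in position (px)
    # and in size (scale ratio).
    dx = abs(device_id["x"] - calib_id["x"])
    dy = abs(device_id["y"] - calib_id["y"])
    ds = abs(device_id["scale"] - calib_id["scale"])
    return dx <= pos_tol and dy <= pos_tol and ds <= scale_tol
```

While this returns False, the interface would keep showing prompt c5 and updating the device identifier from the pose sensors; once it returns True, the calibration-success prompt can be displayed.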
In some embodiments of the present application, after the electronic device completes calibration, the preset path information, the direction identifier, the second calibration identifier and the device identifier are displayed on the display screen according to the preset route model. The direction identifier indicates the navigation direction of the electronic device, the second calibration identifier indicates a calibration pose associated with the preset path information, and the device identifier indicates the current pose of the electronic device. The second calibration identifier may be the same as or different from the first calibration identifier.
For example, as shown in fig. 9 (903), the electronic device displays the corresponding preset path L3 in the route display area d5 according to the preset route model, and displays the direction identifier f5 in a fixed area of the display screen. The direction identifier f5 is, for example, a three-dimensional object model. As shown in fig. 13, the direction identifier f5 is a shark model; because it is a three-dimensional object model, the user can perceive the traveling direction on the display screen more intuitively and realistically, with the direction of the shark's head taken as the navigation direction. The pointing direction of the shark's head may indicate that the navigation direction is to the right, to the left, upward, downward, forward or backward.
In some embodiments of the present application, after the preset route model is imported, the parameters of the preset route in the model may be set according to the user's requirements, so that they become target parameters. After the electronic device is calibrated, the preset path information, the direction identifier, the second calibration identifier and the device identifier of the preset route with the target parameters are displayed. The target parameters include the total length of the route, the length of each sub-route, the angle at which each sub-route turns, and the origin and destination coordinate information of the route.
For example, as shown in fig. 10 (1001), when the route model of the preset route selected by the user is an M-type route model, the M-type route it records is a route in the depth direction. Guide information a6 is displayed on the display screen, and the parameters of the M-type route are set as target parameters according to the guide information a6, which is, for example, "please set the total length of the M-type path". The M-type route model comprises a first sub-route, a second sub-route, a third sub-route and a fourth sub-route, where one end of the first sub-route is the starting point of the outbound trip and one end of the fourth sub-route is its end point. The first and fourth sub-routes may be set to the same length, the second and third sub-routes to the same length, and the length of the first sub-route to a preset multiple of that of the second sub-route; the angles at which the first sub-route turns into the second, the second into the third, and the third into the fourth may also be set, yielding a preset route with the target parameters.
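As a numeric illustration of these constraints, the following sketch splits a chosen total length across the four sub-routes (outer pair equal, inner pair equal, each outer sub-route a preset multiple of an inner one). The default multiple of 2 is an assumption for illustration only.

```python
def m_route_sub_lengths(total_length, multiple=2.0):
    # 2*outer + 2*inner = total_length, with outer = multiple * inner,
    # so inner = total_length / (2 * (1 + multiple)).
    inner = total_length / (2.0 * (1.0 + multiple))
    outer = multiple * inner
    return [outer, inner, inner, outer]
```

The returned list gives the lengths of the first through fourth sub-routes; their sum always recovers the total length the user entered against guide information a6.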
As shown in fig. 10 (1002) and 10 (1003), after the parameters of the preset route are set as the target parameters, the electronic device needs to be calibrated. After calibration, guide information b6 is generated on the display screen, for example, "click the power key to start recording and generate the path". When the user clicks the power key, the preset route L6 with the target parameters, the direction identifier f6, the device identifier c6 and the second calibration identifier d6 are displayed on the display screen to guide the user to move according to the preset path information.
Step 203, updating the display state of the direction identifier based on the travel direction corresponding to the preset path information and the current travel direction of the electronic device.
In this step, the direction identifier indicates the navigation direction according to the preset path information. While the user is being navigated, the display state of the direction identifier, i.e. the direction it indicates, is updated based on the traveling direction corresponding to the preset path information and the current traveling direction of the electronic device.
For example, as shown in fig. 9 (903), the direction identifier f5 indicates the navigation direction of the electronic device. The navigation direction indicated by the direction identifier f5 may be obtained as the difference between the current position information of the electronic device and the position information of the target route model. Whether the user's direction is incorrect or correct, prompt information is generated and displayed on the display screen, and the orientation of the direction identifier f5 is changed to indicate the navigation direction. When the user's traveling direction differs from the direction corresponding to the preset path information, the prompt information is, for example, "adjust direction, go to the upper left"; when they are the same, it is, for example, "direction correct, please go straight".
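The difference-based hint can be sketched as follows — a toy 2-D version in which the prompt strings paraphrase those in the example; the coordinate convention (x rightward, y upward) and the function name are assumptions.

```python
def direction_hint(current, target):
    # Difference between the device position and the next preset-route point
    # determines where the shark model should point and what prompt to show.
    dx, dy = target[0] - current[0], target[1] - current[1]
    horiz = "right" if dx > 0 else ("left" if dx < 0 else "")
    vert = "up" if dy > 0 else ("down" if dy < 0 else "")
    if not horiz and not vert:
        return "direction correct, please go straight"
    return "adjust direction, go " + "-".join(p for p in (vert, horiz) if p)
```

A real implementation would work in three dimensions (the routes include a depth direction) and drive the shark model's orientation from the same difference vector.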
Step 204, updating the display state of the device identifier based on the current pose of the electronic device and the calibration pose associated with the preset path information.
In this step, the display state of the device identifier may be updated by comparing the current pose of the electronic device with the calibration pose associated with the preset path information, so as to navigate the user. The display state of the device identifier characterizes the relative positional relationship between the display parameters of the device identifier and those of the second calibration identifier. When, after adjusting the display state of the device identifier, the display parameters of the device identifier match those of the second calibration identifier, the electronic device is shooting in the same pose it had when the preset route model was recorded. There is no required order between step 203 and step 204: they may be performed simultaneously, or either may be performed before the other.
For example, as shown in fig. 11, during navigation, when the matching state of the display parameters of the device identifier a5 and the second calibration identifier b5 is as shown in fig. 11 (1101), the prompt information c5 reads "the mobile phone needs to rotate rightward". When the matching state is as shown in fig. 11 (1102), the prompt information c5 reads "the mobile phone needs to rotate leftward". When it is as shown in fig. 11 (1103), the prompt information c5 reads "the mobile phone needs to rotate upward". When it is as shown in fig. 11 (1104), the prompt information c5 reads "the mobile phone needs to rotate downward".
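The four rotation prompts can be reproduced by comparing the device pose with the calibration pose on two axes and prompting along the axis with the larger offset. The (yaw, pitch) representation, the axis convention and the tolerance are all assumptions made for this sketch.

```python
def rotation_prompt(device_pose, calib_pose, tol=2.0):
    # Poses are (yaw, pitch) in degrees; prompt along the axis with the
    # larger offset, or report a match once both offsets are within tol.
    dyaw = calib_pose[0] - device_pose[0]
    dpitch = calib_pose[1] - device_pose[1]
    if abs(dyaw) <= tol and abs(dpitch) <= tol:
        return "pose matched"
    if abs(dyaw) >= abs(dpitch):
        return "the mobile phone needs to rotate " + ("rightward" if dyaw > 0 else "leftward")
    return "the mobile phone needs to rotate " + ("upward" if dpitch > 0 else "downward")
```

Once "pose matched" is returned, the display parameters of the device identifier a5 would coincide with those of the second calibration identifier b5 and the prompt c5 can be dismissed.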
In the navigation method provided by the embodiment of the application, the execution body may be the electronic device. In the embodiment of the application, an electronic device executing the navigation method is taken as an example to describe the electronic device provided in the embodiment of the application.
Fig. 14 is a block diagram of an electronic device according to an embodiment of the present application, the electronic device 20 including:
a first receiving module 21 for receiving a first input of a user;
a first response module 22 for displaying a photographing preview interface in response to the first input, and displaying initial path information in the photographing preview interface; wherein the initial path information is generated based on initial position information and real-time position information of the electronic device;
a second receiving module 23 for receiving a second input of the user;
a second response module 24 for displaying a navigation route in response to a second input; the navigation route is generated based on the initial path information, and the navigation route is used for indicating to return to an initial position corresponding to the initial position information.
Thus, in the embodiment of the application, upon receiving the first input of the user, the electronic device responds by displaying, in the shooting preview interface, initial path information generated based on the initial position information and the real-time position information of the electronic device. The initial path information represents the user's path of movement during shooting, so the user can observe the moving direction and moving distance in real time. When the user needs to return, a second input is received, and in response a navigation route indicating the way back to the initial position is displayed; the navigation route can be obtained by processing the initial route so as to guide the user back better and faster. In the embodiment of the application, accurate navigation can be realized without GPS positioning, the accuracy of underwater navigation can be improved, and the safety of the user during underwater activities is ensured.
In some embodiments, the electronic device 20 further comprises:
the calibration interface display module is used for displaying a calibration interface, wherein the calibration interface comprises a first calibration identifier and a device identifier, the first calibration identifier indicates a calibration pose, and the device identifier indicates the current pose of the electronic device;
the third receiving module is used for receiving a third input of the electronic equipment by a user;
the third response module is used for responding to the third input and updating the display parameters of the device identifier;
and the calibration prompt module is used for displaying calibration success prompt information in the case that the display parameters of the device identifier match the display parameters of the first calibration identifier.
In some embodiments, the first response module 22 includes:
the image information acquisition sub-module is used for controlling a camera of the electronic device to collect images and acquiring coordinate information and pose information of the electronic device corresponding to each frame of image;
and the path information display sub-module is used for displaying initial path information in the shooting preview interface according to the coordinate information and pose information corresponding to each frame of image.
In some embodiments, the electronic device 20 further comprises:
a fourth receiving module for receiving a fourth input of the user;
and the fourth response module is used for responding to the fourth input by displaying marking information in the shooting preview interface, wherein the marking information is used for marking the position information of the electronic device corresponding to the fourth input, and the marking information is displayed on the initial path indicated by the initial path information.
In some embodiments, in the case that the initial path includes first and second marker information, the navigation route includes the first and second marker information, and a first distance on the navigation route is smaller than a second distance on the initial path, the first distance being a navigation distance between positions on the navigation route corresponding to the first and second marker information, the second distance being a distance between positions on the initial path corresponding to the first and second marker information.
In some embodiments, the electronic device 20 further comprises:
a fifth receiving module for receiving a fifth input of the user;
a fifth response module for deleting the third mark information on the navigation route or displaying the fourth mark information on the navigation route in response to the fifth input;
a route updating module for updating the navigation route based on the deleted third mark information or the displayed fourth mark information.
In some embodiments, the electronic device 20 further comprises:
a sixth receiving module for receiving a sixth input from a user;
the sixth response module is used for responding to the sixth input and displaying preset path information, a direction identifier, a second calibration identifier and a device identifier;
the direction identification updating module is used for updating the display state of the direction identification based on the traveling direction corresponding to the preset path information and the current traveling direction of the electronic equipment;
and the device identifier updating module is used for updating the display state of the device identifier based on the current pose of the electronic device and the calibration pose associated with the preset path information.
The electronic device in the embodiment of the application may be a complete electronic device, or may be a component in an electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. By way of example, the electronic device may be a mobile phone, tablet computer, notebook computer, palmtop computer, vehicle-mounted electronic device, mobile internet device (Mobile Internet Device, MID), augmented reality (augmented reality, AR)/virtual reality (virtual reality, VR) device, robot, wearable device, ultra-mobile personal computer (ultra-mobile personal computer, UMPC), netbook or personal digital assistant (personal digital assistant, PDA), etc., and the embodiments of the present application are not particularly limited.
The electronic device of the embodiment of the application may be a device with an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The electronic device provided in the embodiment of the present application can implement each process implemented by the above method embodiment, so as to achieve the same technical effect, and in order to avoid repetition, a detailed description is omitted here.
Optionally, as shown in fig. 15, the embodiment of the present application further provides an electronic device 100, including a processor 101, a memory 102, and a program or an instruction stored in the memory 102 and capable of running on the processor 101, where the program or the instruction implements each step of any one of the navigation method embodiments described above when executed by the processor 101, and the steps can achieve the same technical effect, so that repetition is avoided, and no further description is given here.
It should be noted that, the electronic device in the embodiment of the present application includes a mobile electronic device and a non-mobile electronic device.
Fig. 16 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 1000 includes, but is not limited to: radio frequency unit 1001, network module 1002, audio output unit 1003, input unit 1004, sensor 1005, display unit 1006, user input unit 1007, interface unit 1008, memory 1009, and processor 1010.
Those skilled in the art will appreciate that the electronic device 1000 may also include a power source (e.g., a battery) for powering the various components, which may be logically connected to the processor 1010 through a power management system so as to manage charging, discharging, power consumption and the like. The electronic device structure shown in fig. 16 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than shown, combine certain components, or arrange components differently, which is not described in detail herein.
Wherein, the user input unit 1007 is configured to receive a first input of a user; the display unit 1006 is configured to display a shooting preview interface in response to the first input, and display initial path information in the shooting preview interface; wherein the initial path information is generated based on initial position information and real-time position information of the electronic device; a user input unit 1007 for receiving a second input of a user; the display unit 1006 is configured to display a navigation route in response to the second input; the navigation route is generated based on the initial path information, and the navigation route is used for indicating to return to an initial position corresponding to the initial position information.
Thus, in the embodiment of the application, upon receiving the first input of the user, the electronic device responds by displaying, in the shooting preview interface, initial path information generated based on the initial position information and the real-time position information of the electronic device. The initial path information represents the user's path of movement during shooting, so the user can observe the moving direction and moving distance in real time. When the user needs to return, a second input is received, and in response a navigation route indicating the way back to the initial position is displayed; the navigation route can be obtained by processing the initial route so as to guide the user back better and faster. In the embodiment of the application, accurate navigation can be realized without GPS positioning, the accuracy of underwater navigation can be improved, and the safety of the user during underwater activities is ensured.
In some embodiments, the display unit 1006 is further configured to display a calibration interface, where the calibration interface includes a first calibration identifier and a device identifier, the first calibration identifier indicates a calibration pose, and the device identifier indicates the current pose of the electronic device; the user input unit 1007 is further configured to receive a third input of the electronic device from the user; the display unit 1006 is further configured to update a display parameter of the device identifier in response to the third input, and to display calibration success prompt information in the case that the display parameters of the device identifier match the display parameters of the first calibration identifier.
In some embodiments, the processor 1010 is further configured to control a camera of the electronic device to collect images, and obtain coordinate information and pose information of the electronic device corresponding to each frame of images; the display unit 1006 is further configured to display initial path information in the shooting preview interface according to coordinate information and pose information corresponding to each frame of image.
In some embodiments, the user input unit 1007 is also used to receive a fourth input from the user; the display unit 1006 is further configured to display, in response to the fourth input, marking information in the shooting preview interface, where the marking information is used to mark location information of the electronic device corresponding to the fourth input, and the marking information is displayed on an initial path indicated by the initial path information.
In some embodiments, the user input unit 1007 is also used to receive a fifth input from the user; the processor 1010 is further configured to control the display unit 1006 to delete the third marker information on the navigation route or display the fourth marker information on the navigation route in response to the fifth input; the processor 1010 is also configured to update the navigation route based on the deleted third marker information or the displayed fourth marker information.
In some embodiments, the user input unit 1007 is further configured to receive a sixth input from the user; the display unit 1006 is further configured to display preset path information, a direction identifier, a second calibration identifier and a device identifier in response to the sixth input; the direction identifier indicates the navigation direction of the electronic device, the second calibration identifier indicates the calibration pose associated with the preset path information, and the device identifier indicates the current pose of the electronic device; the processor 1010 is further configured to update the display state of the direction identifier based on the traveling direction corresponding to the preset path information and the current traveling direction of the electronic device, and to update the display state of the device identifier based on the current pose of the electronic device and the calibration pose associated with the preset path information.
It should be understood that in the embodiment of the present application, the input unit 1004 may include a graphics processing unit (Graphics Processing Unit, GPU) 10041 and a microphone 10042, where the graphics processor 10041 processes image data of a still picture or a video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 1006 may include a display panel 10061, which may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 1007 includes at least one of a touch panel 10071 and other input devices 10072. The touch panel 10071, also referred to as a touch screen, may include two portions: a touch detection device and a touch controller. Other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse and a joystick, which are not described in detail herein. The memory 1009 may be used to store software programs as well as various data, including but not limited to application programs and an operating system. The processor 1010 may integrate an application processor, which primarily handles the operating system, user interfaces, applications and the like, and a modem processor, which primarily handles wireless communication. It will be appreciated that the modem processor described above may alternatively not be integrated into the processor 1010.
The memory 1009 may be used to store software programs as well as various data. The memory 1009 may mainly include a first storage area storing programs or instructions and a second storage area storing data, where the first storage area may store an operating system and the application programs or instructions (such as a sound playing function, an image playing function, etc.) required for at least one function. Further, the memory 1009 may include volatile memory or nonvolatile memory, or both. The nonvolatile memory may be a read-only memory (Read-Only Memory, ROM), a programmable ROM (Programmable ROM, PROM), an erasable PROM (Erasable PROM, EPROM), an electrically erasable PROM (Electrically EPROM, EEPROM) or a flash memory. The volatile memory may be random access memory (Random Access Memory, RAM), static RAM (Static RAM, SRAM), dynamic RAM (Dynamic RAM, DRAM), synchronous DRAM (Synchronous DRAM, SDRAM), double data rate SDRAM (Double Data Rate SDRAM, DDR SDRAM), enhanced SDRAM (Enhanced SDRAM, ESDRAM), synchlink DRAM (Synchlink DRAM, SLDRAM) or direct Rambus RAM (Direct Rambus RAM, DRRAM). The memory 1009 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
The processor 1010 may include one or more processing units; optionally, the processor 1010 integrates an application processor that primarily processes operations involving an operating system, user interface, application programs, and the like, and a modem processor that primarily processes wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor described above may not be integrated into the processor 1010.
An embodiment of the present application further provides a readable storage medium storing a program or instructions which, when executed by a processor, implement the processes of the above navigation method embodiment and can achieve the same technical effects; to avoid repetition, details are not described here again.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer-readable storage medium, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
An embodiment of the present application further provides a chip. The chip includes a processor and a communication interface coupled to the processor, and the processor is configured to run a program or instructions to implement the processes of the above navigation method embodiment and achieve the same technical effects; to avoid repetition, details are not described here again.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip.
An embodiment of the present application provides a computer program product stored in a storage medium, where the program product is executed by at least one processor to implement the processes of the above navigation method embodiment and achieve the same technical effects, which are not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element. Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed; depending on the functions involved, the functions may also be performed in a substantially simultaneous manner or in the reverse order. For example, the described methods may be performed in an order different from that described, and various steps may also be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general hardware platform, and of course may also be implemented by hardware, although in many cases the former is the preferred implementation. Based on such an understanding, the technical solutions of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a computer software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk), comprising several instructions for causing a terminal (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive. Enlightened by the present application, those of ordinary skill in the art may derive many other forms without departing from the spirit of the present application and the scope of the claims, all of which fall within the protection of the present application.

Claims (10)

1. A navigation method performed by an electronic device, the method comprising:
receiving a first input of a user;
in response to the first input, displaying a shooting preview interface, and displaying initial path information in the shooting preview interface; wherein the initial path information is generated based on initial location information and real-time location information of the electronic device;
receiving a second input from the user;
displaying a navigation route in response to the second input; the navigation route is generated based on the initial path information, and the navigation route is used for indicating to return to an initial position corresponding to the initial position information.
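The claimed flow can be illustrated with a minimal sketch (the function and variable names below are illustrative assumptions, not taken from the patent): the device accumulates its sampled real-time positions into the initial path, and a return navigation route back to the initial position can be derived from that path, e.g. by reversing it.

```python
def record_initial_path(positions):
    """Accumulate sampled real-time positions into the initial path.

    `positions` is an iterable of (x, y) coordinates sampled while the
    user moves away from the start; the first sample corresponds to the
    initial position information.
    """
    path = []
    for p in positions:
        if not path or p != path[-1]:  # drop consecutive duplicate samples
            path.append(p)
    return path


def return_route(initial_path):
    """A naive return route: walk the initial path back to its start."""
    return list(reversed(initial_path))


samples = [(0, 0), (1, 0), (1, 0), (2, 1), (3, 1)]
path = record_initial_path(samples)
route = return_route(path)
# the route starts at the latest position and ends at the initial position
```

A real implementation would of course fuse camera, motion-sensor, and positioning data rather than raw coordinate tuples; this only shows the path-then-reverse relationship between claim elements.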
2. The method of claim 1, wherein prior to the receiving the first input from the user, the method further comprises:
displaying a calibration interface, wherein the calibration interface comprises a first calibration identifier and a device identifier, the first calibration identifier indicates a calibration gesture, and the device identifier indicates the current gesture of the electronic device;
receiving a third input of a user to the electronic device;
updating display parameters of the device identifier in response to the third input;
and displaying calibration success prompt information in a case that the display parameters of the device identifier match the display parameters of the first calibration identifier.
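The matching condition in claim 2 can be sketched as an attitude comparison within a tolerance (a hypothetical check, assuming the device attitude is expressed as pitch/roll/yaw angles; the patent does not specify the matching criterion):

```python
def attitude_matches(device_pose, calibration_pose, tol_deg=5.0):
    """Return True when every attitude angle of the device is within
    `tol_deg` degrees of the calibration pose, i.e. the device identifier
    visually coincides with the first calibration identifier."""
    return all(abs(d - c) <= tol_deg
               for d, c in zip(device_pose, calibration_pose))


# (pitch, roll, yaw) in degrees
ok = attitude_matches((0.5, -1.0, 89.0), (0.0, 0.0, 90.0))       # matched
bad = attitude_matches((20.0, 0.0, 90.0), (0.0, 0.0, 90.0))      # pitch off
```

When `ok` becomes true, the interface would display the calibration success prompt described in the claim.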
3. The method of claim 1, wherein displaying initial path information in the capture preview interface comprises:
controlling a camera of the electronic device to capture images, and acquiring coordinate information and gesture information of the electronic device corresponding to each frame of the images;
and displaying the initial path information in the shooting preview interface according to the coordinate information and the gesture information corresponding to each frame of the images.
4. The method of claim 1, wherein after the displaying the shot preview interface, the method further comprises:
receiving a fourth input from the user;
and in response to the fourth input, displaying marking information in the shooting preview interface, wherein the marking information is used for marking the position information of the electronic equipment corresponding to the fourth input, and the marking information is displayed on an initial path indicated by the initial path information.
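Claim 4 attaches marking information to the position on the initial path at which the fourth input occurred. One illustrative way to realize "displayed on the initial path" is to snap the marker to the nearest recorded path point (names and the snapping rule are assumptions for the sketch):

```python
def add_marker(path, position):
    """Return the index of the path point nearest to `position`
    (by squared Euclidean distance); the marker is displayed there."""
    def d2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    return min(range(len(path)), key=lambda i: d2(path[i], position))


path = [(0, 0), (1, 0), (2, 1)]
idx = add_marker(path, (0.9, 0.2))
# the marker snaps to path[idx]
```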
5. The method according to claim 4, wherein in a case that first marking information and second marking information are included on the initial path, the navigation route includes the first marking information and the second marking information, and a first distance on the navigation route is smaller than a second distance on the initial path, the first distance being a navigation distance between positions on the navigation route corresponding to the first marking information and the second marking information, and the second distance being a distance between positions on the initial path corresponding to the first marking information and the second marking information.
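The distance condition in claim 5 says the route between two markers is shorter than the recorded detour between them. A minimal numeric sketch (hypothetical helper names; for illustration the shortcut is taken as the straight line between the markers):

```python
import math


def along_path_distance(path, i, j):
    """Distance between path[i] and path[j] measured along the path."""
    lo, hi = sorted((i, j))
    return sum(math.dist(path[k], path[k + 1]) for k in range(lo, hi))


def shortcut_distance(path, i, j):
    """Straight-line distance between the two marker positions."""
    return math.dist(path[i], path[j])


# a detour: markers at both ends of a U-shaped segment
path = [(0, 0), (0, 2), (2, 2), (2, 0)]
d_path = along_path_distance(path, 0, 3)   # distance along the initial path
d_short = shortcut_distance(path, 0, 3)    # distance on the navigation route
# d_short < d_path, matching the claimed first distance < second distance
```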
6. The method of claim 1, wherein after the displaying the navigation route, the method further comprises:
receiving a fifth input of the user;
deleting third marking information on the navigation route or displaying fourth marking information on the navigation route in response to the fifth input;
and updating the navigation route based on the deleted third marking information or the displayed fourth marking information.
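The update step of claim 6 can be sketched as rebuilding the route from the surviving markers (an assumed simplification in which the route simply visits the marker positions in order; the names are illustrative):

```python
def update_route(markers, delete=None, add=None):
    """Rebuild the navigation route after a marker is deleted or added.

    `markers` is an ordered list of marker positions along the route;
    the updated route visits the remaining markers in order.
    """
    markers = [m for m in markers if m != delete]
    if add is not None:
        markers.append(add)
    return markers


route = update_route([(0, 0), (1, 1), (2, 2)], delete=(1, 1))
extended = update_route([(0, 0)], add=(5, 5))
```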
7. The method according to claim 1, wherein the method further comprises:
receiving a sixth input from the user;
in response to the sixth input, displaying preset path information, a direction identifier, a second calibration identifier, and a device identifier; wherein the direction identifier indicates a navigation direction of the electronic device, the second calibration identifier indicates a calibration gesture associated with the preset path information, and the device identifier indicates a current gesture of the electronic device;
updating a display state of the direction identifier based on a traveling direction corresponding to the preset path information and a current traveling direction of the electronic device;
and updating a display state of the device identifier based on the current gesture of the electronic device and the calibration gesture associated with the preset path information.
8. An electronic device, comprising:
the first receiving module is used for receiving a first input of a user;
the first response module is used for responding to the first input, displaying a shooting preview interface and displaying initial path information in the shooting preview interface; wherein the initial path information is generated based on initial location information and real-time location information of the electronic device;
the second receiving module is used for receiving a second input of a user;
a second response module for displaying a navigation route in response to the second input; the navigation route is generated based on the initial path information, and the navigation route is used for indicating to return to an initial position corresponding to the initial position information.
9. An electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the navigation method of any of claims 1-7.
10. A readable storage medium, characterized in that the readable storage medium has stored thereon a program or instructions which, when executed by a processor, implement the steps of the navigation method according to any of claims 1-7.
CN202311400068.4A 2023-10-25 2023-10-25 Navigation method, electronic device and readable storage medium Pending CN117537820A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311400068.4A CN117537820A (en) 2023-10-25 2023-10-25 Navigation method, electronic device and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311400068.4A CN117537820A (en) 2023-10-25 2023-10-25 Navigation method, electronic device and readable storage medium

Publications (1)

Publication Number Publication Date
CN117537820A true CN117537820A (en) 2024-02-09

Family

ID=89794862

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311400068.4A Pending CN117537820A (en) 2023-10-25 2023-10-25 Navigation method, electronic device and readable storage medium

Country Status (1)

Country Link
CN (1) CN117537820A (en)

Similar Documents

Publication Publication Date Title
CN105594267B (en) The virtual crumbs searched for indoor location road
CN110246182B (en) Vision-based global map positioning method and device, storage medium and equipment
US9699375B2 (en) Method and apparatus for determining camera location information and/or camera pose information according to a global coordinate system
CN112189218A (en) Site-based augmented reality method and device
JP5927966B2 (en) Display control apparatus, display control method, and program
CN108810473B (en) Method and system for realizing GPS mapping camera picture coordinate on mobile platform
WO2019037489A1 (en) Map display method, apparatus, storage medium and terminal
CN112307642B (en) Data processing method, device, system, computer equipment and storage medium
CN103471580A (en) Method for providing navigation information, mobile terminal, and server
KR20110032765A (en) Apparatus and method for providing service using a sensor and image recognition in portable terminal
WO2021088498A1 (en) Virtual object display method and electronic device
US20200265725A1 (en) Method and Apparatus for Planning Navigation Region of Unmanned Aerial Vehicle, and Remote Control
US20210263168A1 (en) System and method to determine positioning in a virtual coordinate system
US20220076469A1 (en) Information display device and information display program
CN115827906B (en) Target labeling method, target labeling device, electronic equipment and computer readable storage medium
US20200252579A1 (en) Electronic apparatus recording medium, and display method
CN113532444B (en) Navigation path processing method and device, electronic equipment and storage medium
CN114972485A (en) Positioning accuracy testing method, positioning accuracy testing apparatus, storage medium, and program product
CN114549633A (en) Pose detection method and device, electronic equipment and storage medium
CN114608591B (en) Vehicle positioning method and device, storage medium, electronic equipment, vehicle and chip
US20120281102A1 (en) Portable terminal, activity history depiction method, and activity history depiction system
TWI798789B (en) Navigation device, navigation system, navigation method, and media storaged the navigation program
CN117537820A (en) Navigation method, electronic device and readable storage medium
CN113063424B (en) Method, device, equipment and storage medium for intra-market navigation
US20220166917A1 (en) Information processing apparatus, information processing method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination