CN107843253B - Navigation control method, mobile terminal, and computer-readable storage medium - Google Patents
- Publication number
- CN107843253B (application CN201711043741.8A)
- Authority
- CN
- China
- Prior art keywords
- camera
- navigator
- navigation
- mobile terminal
- parameter
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3647—Guidance involving output of stored or live camera images or video streams
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Telephone Function (AREA)
- Navigation (AREA)
Abstract
The application discloses a navigation control method, a mobile terminal and a computer-readable storage medium, wherein the navigation control method comprises the following steps: receiving a live-action navigation instruction input by a user; simultaneously starting a camera and a navigator according to the instruction, so as to obtain a current live-action image through the camera to assist the navigator; and changing the working parameters of the camera to reduce the frequency interference of the camera to the navigator. The navigator can therefore accurately receive and send navigation information within the navigation frequency band, improving the accuracy and real-time performance of the navigator; furthermore, because the camera and the navigator work simultaneously, live-action navigation can be formed.
Description
Technical Field
The present application relates to the field of navigation technologies, and in particular, to a navigation control method, a mobile terminal, and a computer-readable storage medium.
Background
At present, navigation services are widely used in daily life and travel, and a navigator built into a mobile terminal lets people navigate anytime and anywhere. In addition to the traditional 2D and 3D navigation modes, current navigation technology combining Virtual Reality (VR) and Augmented Reality (AR) adds new elements to traditional navigation, making it more vivid and realistic; live-action navigation is one of these emerging modes. Live-action navigation usually relies on a camera to capture and process the current picture. However, the frequency generated by the camera, or other frequencies combined from it, generally falls within the operating frequency band of the navigator and thus affects its operation.
Disclosure of Invention
An embodiment of the present application provides a navigation control method of a mobile terminal, where the navigation control method includes: receiving a live-action navigation instruction input by a user; starting a camera and a navigator according to the instruction so as to obtain a current live-action image through the camera to assist the navigator; and changing the working parameters of the camera to reduce the frequency interference of the camera to the navigator.
On the other hand, an embodiment of the present application further provides a mobile terminal, where the mobile terminal includes:
the input module is used for receiving a live-action navigation instruction input by a user;
the starting module is used for starting the camera and the navigator according to the instruction so as to obtain the current live-action image through the camera to assist the navigator;
and the control module is used for changing the working parameters of the camera so as to reduce the frequency interference of the camera on the navigator.
On the other hand, the embodiment of the application also provides a mobile terminal, which comprises a processor, a memory electrically connected with the processor and an input device, wherein the input device is used for receiving the live-action navigation instruction input by a user; the memory stores program data that can be executed by the processor to implement the navigation control method described above.
On the other hand, the embodiment of the present application further provides a computer readable storage medium, where a computer program is stored, and the computer program can be executed to implement the navigation control method described above.
According to the embodiments of the application, the working parameters of the camera are changed when the camera and the navigator are started simultaneously, so as to reduce the frequency interference of the camera to the navigator. The navigator can therefore accurately receive and send navigation information in the navigation frequency band, improving its accuracy and real-time performance; further, because the camera and the navigator work simultaneously, live-action navigation can be formed.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application; those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a flowchart illustrating a navigation control method of a mobile terminal according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of a mobile terminal according to an embodiment of the present application;
fig. 3 is a flowchart illustrating another navigation control method for a mobile terminal according to an embodiment of the present application;
fig. 4 is a flowchart illustrating a navigation control method of a mobile terminal according to another embodiment of the present application;
fig. 5 is a flowchart illustrating a navigation control method of a mobile terminal according to another embodiment of the present application;
fig. 6 is a schematic structural diagram of another mobile terminal provided in an embodiment of the present application;
fig. 7 is a schematic structural diagram of another mobile terminal provided in an embodiment of the present application;
fig. 8 is a schematic structural diagram of another mobile terminal provided in an embodiment of the present application;
fig. 9 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present application.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and embodiments. It is to be noted that the following examples only illustrate the present invention and do not limit its scope. Similarly, the following embodiments are only some, not all, embodiments of the present invention, and all other embodiments obtained by those skilled in the art without creative work fall within the scope of the present invention.
The terms "first", "second" and "third" in the present invention are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first," "second," or "third" may explicitly or implicitly include at least one of the feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise. All directional indicators (such as up, down, left, right, front, and rear … …) in the embodiments of the present invention are only used to explain the relative positional relationship between the components, the movement, and the like in a specific posture (as shown in the drawings), and if the specific posture is changed, the directional indicator is changed accordingly. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the invention. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Note that the camera of a mobile terminal is usually connected to an external device through a MIPI (Mobile Industry Processor Interface) link. In live-action navigation, the current picture is captured and processed with the camera's assistance. However, the MIPI frequency, or other frequencies combined from it, generated by the camera typically falls within the navigation band (BeiDou/GPS/GLONASS, 1550-1605 MHz). In the design of a mobile terminal such as a mobile phone, the receiving antenna of the navigator is usually located at the upper end of the phone, very close to the rear camera, and when the camera is turned on, the co-channel interference it generates easily affects the navigation antenna. This degrades the accuracy and real-time performance of the navigator under live-action navigation and gives users a very bad navigation experience.
The conventional countermeasures generally include spraying EMI (Electromagnetic Interference) ink, wrapping copper foil tied to ground, and the like. That is, the camera's interference frequency is prevented from leaking or flowing back in a concentrated way, and the interference is conducted into the middle-frame ground in time so that it does not radiate outward. However, this approach adds hardware cost, does not necessarily provide sufficient shielding against the interference, and is ineffective for areas of the structure where EMI ink cannot be applied or copper foil cannot be wrapped.
Embodiments of the present application provide a computer-readable storage medium, a mobile terminal, and a navigation control method thereof to solve the above technical problems; see the following embodiments for details.
referring to fig. 1, fig. 1 is a flowchart illustrating a navigation control method of a mobile terminal according to an embodiment of the present disclosure. As shown in fig. 1, the navigation control method of the present embodiment includes the following steps:
step S10: and receiving a live-action navigation instruction input by a user. The instruction can be received through a screen of the mobile terminal, or the instruction can be received through an entity key arranged on the mobile terminal.
In actual operation, when a user input starting the navigation service is received, for example when the user clicks a navigation icon on the touch screen of the mobile terminal, the user is automatically prompted whether live-action navigation is required, and the user's selection request is then received. If the selection request indicates that live-action navigation is needed, the terminal switches to the live-action navigation mode and continues with step S11; if not, the navigator is started directly without any switching. A minimal sketch of this branch is shown below.
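To make the branch concrete, here is a minimal Python sketch of the flow in steps S10-S11. The function and mode names are hypothetical illustrations, not identifiers from the patent:

```python
# Hypothetical sketch of the launch branch (step S10 -> S11); names are
# illustrative and not part of the patent disclosure.
def on_navigation_start(wants_live_action: bool) -> str:
    """Branch on the user's selection after the live-action prompt."""
    if wants_live_action:
        # Step S11: switch to live-action mode and start camera + navigator together.
        return "live_action_mode"
    # Otherwise start the navigator directly, with no mode switching.
    return "plain_navigation"

print(on_navigation_start(True))   # -> live_action_mode
print(on_navigation_start(False))  # -> plain_navigation
```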
Step S11: simultaneously starting the camera and the navigator according to the instruction, so as to obtain the current live-action image through the camera to assist the navigator. The camera can be a rear camera of the mobile terminal.
In this embodiment, the live-action image acquired by the camera may be transmitted to the navigator, and the navigator combines it with the map data in the navigator to form live-action navigation. After acquiring the live-action image, the camera can also process the image before transmitting it to the navigator.
Step S12: changing the working parameters of the camera to reduce the frequency interference of the camera to the navigator.
Referring to fig. 2, fig. 2 is a schematic structural diagram of a mobile terminal according to an embodiment of the present disclosure. Specifically, fig. 2 shows the back of the mobile terminal (the surface opposite the touch screen). As shown in fig. 2, in the design of the mobile terminal 10, the navigation antenna 11 of the navigator is generally located at the upper end of the mobile terminal 10. The upper end refers to the end of the mobile terminal farthest from the ground when the user holds it in the normal or conventional manner. The camera 12 is also typically located at the upper end of the mobile terminal 10, so the navigation antenna 11 and the camera 12 are close to each other. If the camera 12 and the navigator are turned on at the same time, the co-channel interference generated by the camera 12 easily affects the navigation antenna 11, degrading the accuracy and real-time performance of the navigator under live-action navigation and giving users a very bad navigation experience.
Therefore, when the camera 12 and the navigator are started simultaneously, the working parameters of the camera 12 are changed to reduce the camera's frequency interference to the navigator. The working parameters may include a frequency parameter of the camera 12, a power parameter, or both.
It should be noted that in this embodiment the camera adjusts its working parameters only in the live-action navigation mode; in the normal shooting mode, i.e. non-live-action navigation, the camera still uses its preset shooting parameters. Specifically, two schemes of working parameters are preset for the camera: one for the live-action navigation mode and one for the non-live-action navigation mode, i.e. the normal shooting mode. This step also includes asking the user whether to change the working parameters of the camera; after the user's choice to change them is received, the working parameters preset for the live-action navigation mode are selected for shooting.
It should be understood that in the live-action navigation mode the camera serves only as an auxiliary tool assisting the navigator. Imaging quality is not the first priority in this mode: the picture of the current scene, such as a road, does not need to be of very high definition. Navigation is the user's primary requirement, so slightly lower image quality is acceptable and not easily noticed. Therefore, in the live-action navigation mode, the user can be automatically prompted whether to reduce the image quality of the camera to improve navigation precision; when the user selects yes, the camera switches to the working parameters of the navigation mode, so that the navigation antenna is not interfered with while the live-action requirement is still met. A sketch of this parameter selection is shown below.
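As an illustration of the two preset parameter schemes described above, the following sketch models the selection between them. The 400 MHz and 405 MHz clock values match the example given later in this description; the voltage and current figures, and all names, are assumptions made up for illustration:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CameraParams:
    mipi_mhz: float    # MIPI clock frequency parameter
    voltage_v: float   # supply-voltage parameter (illustrative value)
    current_ma: float  # drive-current parameter (illustrative value)

# Two preset schemes: normal shooting vs. live-action navigation.
SHOOTING = CameraParams(mipi_mhz=400.0, voltage_v=2.8, current_ma=200.0)
NAVIGATION = CameraParams(mipi_mhz=405.0, voltage_v=2.6, current_ma=160.0)

def select_params(live_action: bool, user_confirmed: bool) -> CameraParams:
    """Use the navigation preset only in live-action mode, after the user confirms."""
    return NAVIGATION if (live_action and user_confirmed) else SHOOTING

print(select_params(live_action=True, user_confirmed=True))    # navigation preset
print(select_params(live_action=False, user_confirmed=False))  # default shooting preset
```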
Therefore, in this embodiment, changing the working parameters of the camera reduces its frequency interference to the navigator, so that the navigator can accurately receive and transmit navigation information in the navigation band, improving its accuracy and real-time performance. In this embodiment, the mobile terminal includes a mobile phone, a palmtop computer, various wearable devices, and the like.
Referring to fig. 3, fig. 3 is a flowchart illustrating another navigation control method of a mobile terminal according to an embodiment of the present application. In the embodiment, the frequency interference on the navigator is reduced by changing the frequency parameter of the camera. Specifically, as shown in fig. 3, the navigation control method of the present embodiment includes the following steps:
step S30: and receiving a live-action navigation instruction input by a user. The instruction can be received through a touch screen of the mobile terminal, and the instruction can also be received through an entity key arranged on the mobile terminal. The specific steps are as described above, and are not described herein again.
Step S31: and simultaneously starting the camera and the navigator according to the instruction so as to obtain the current live-action image through the camera to assist the navigator. The camera can be a rear camera of the mobile terminal. The specific steps are as described above, and are not described herein again.
Step S32: and changing the frequency parameter of the camera to reduce the frequency interference of the camera to the navigator.
As described above, two frequency parameters may be preset in this step: a shooting frequency parameter used when the camera shoots on its own, which can be the camera's default frequency parameter, and a navigation frequency parameter used in the navigation mode. In the non-navigation mode, i.e. when the camera is used alone, the default shooting frequency parameter is selected. When the user needs live-action navigation, the camera's parameter is changed to the navigation frequency parameter, which is chosen so that its frequency components do not fall on the frequency band of the navigator.
Specifically, the nonlinearity of the camera generates many frequency-multiplied components of the MIPI frequency in use: for an input at a single frequency F0, the output also contains 2F0, 3F0, 4F0, 5F0, and so on. These harmonics are the main interference falling into the navigation band (1550-1605 MHz). For example, with F0 at 400 MHz, the 4th harmonic is 4 x 400 MHz = 1600 MHz, which lands in the high channels of the navigation band and interferes with reception from GLONASS (Global Navigation Satellite System) navigation satellites. Generally, the MIPI frequency of a camera may fluctuate within a certain range, that is, the camera can still shoot anywhere in that range. If the MIPI frequency of the camera is instead set to 405 MHz, the 3rd and 4th harmonics become 3F0 = 1215 MHz and 4F0 = 1620 MHz, both of which avoid the navigator's navigation band (1550-1605 MHz). It should be understood that 405 MHz is not the MIPI frequency that gives the camera its best imaging quality, but the imaging it produces is satisfactory for the live-action navigation mode. Thus, the present embodiment can set the camera's default frequency parameter, i.e. the shooting frequency parameter, to 400 MHz, and its navigation frequency parameter to 405 MHz; the camera can shoot at either frequency. In the non-navigation mode, i.e. when the camera is used alone, the default 400 MHz parameter is selected; when the user needs live-action navigation, the camera's parameter is changed to 405 MHz.
That is to say, this embodiment moves the camera's working frequency away from the navigator's navigation band by increasing the working frequency. It should be understood that in other embodiments the working frequency of the camera may be changed in other ways according to the actual navigation band; for example, the band may also be avoided by reducing the working frequency. A short check of this arithmetic is sketched below.
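The arithmetic above can be checked with a short sketch that enumerates harmonics of a candidate MIPI clock and flags any that land inside the 1550-1605 MHz navigation band; this is a minimal illustration, not code from the patent:

```python
# Navigation band cited above (BeiDou/GPS/GLONASS), in MHz.
NAV_BAND_MHZ = (1550.0, 1605.0)

def interfering_harmonics(mipi_mhz: float, max_order: int = 8):
    """Return (order, frequency in MHz) pairs whose harmonics land in the nav band."""
    return [(n, n * mipi_mhz)
            for n in range(2, max_order + 1)
            if NAV_BAND_MHZ[0] <= n * mipi_mhz <= NAV_BAND_MHZ[1]]

print(interfering_harmonics(400.0))  # [(4, 1600.0)]: 4 x 400 MHz falls in the band
print(interfering_harmonics(405.0))  # []: 3F0 = 1215 MHz and 4F0 = 1620 MHz both clear it
```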
As mentioned above, this embodiment avoids the navigation band of the navigator by changing the frequency parameter of the camera, thereby solving the problem of co-channel interference from the camera to the navigation antenna under live-action navigation. In addition, this embodiment requires no additional hardware measures and can therefore save considerable cost.
Referring to fig. 4, fig. 4 is a flowchart illustrating a navigation control method of a mobile terminal according to another embodiment of the present application. In the embodiment, the frequency interference on the navigator is reduced by changing the power parameter of the camera. Specifically, as shown in fig. 4, the navigation control method of the present embodiment includes the following steps:
step S40: and receiving a live-action navigation instruction input by a user. The instruction can be received through a touch screen of the mobile terminal, and the instruction can also be received through an entity key arranged on the mobile terminal. The specific steps are as described above, and are not described herein again.
Step S41: and simultaneously starting the camera and the navigator according to the instruction so as to obtain the current live-action image through the camera to assist the navigator. The camera can be a rear camera of the mobile terminal. The specific steps are as described above, and are not described herein again.
Step S42: and reducing the power parameter of the camera so as to reduce the frequency interference of the camera to the navigator.
Similarly, as mentioned above, two power parameters may be preset in this step: a shooting power parameter used when the camera shoots on its own, which can be the camera's default power parameter, and a navigation power parameter used in the navigation mode. In the non-navigation mode, i.e. when the camera is used alone, the default shooting power parameter is selected. When the user needs live-action navigation, the camera's parameter is changed to the navigation power parameter, whose value is smaller than the default shooting power parameter, so that the influence on the navigator's frequency band is reduced by reducing the radiated energy.
In this embodiment, the power parameter includes a voltage parameter and a current parameter, so the radiated energy can be reduced by lowering the camera's voltage parameter, its current parameter, or both. It should be understood that the reduction here is relative to the voltage and current parameters required when the camera is used alone; a minimal sketch of this step follows.
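The sketch below illustrates this energy-reduction step. The scale factors, parameter names, and numeric values are assumptions for illustration, since the disclosure does not specify concrete power settings:

```python
# Hypothetical sketch: scale the standalone-shooting voltage/current down for the
# live-action navigation mode. Values and names are illustrative assumptions.
def reduce_radiated_energy(params: dict, voltage_scale: float = 0.9,
                           current_scale: float = 0.8) -> dict:
    """Lower voltage and/or current relative to the standalone-shooting preset."""
    reduced = dict(params)
    reduced["voltage_v"] = params["voltage_v"] * voltage_scale
    reduced["current_ma"] = params["current_ma"] * current_scale
    return reduced

shooting = {"mipi_mhz": 400.0, "voltage_v": 2.8, "current_ma": 200.0}
print(reduce_radiated_energy(shooting))
# -> {'mipi_mhz': 400.0, 'voltage_v': 2.52, 'current_ma': 160.0}
```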
In other embodiments, other parameters of the camera can be changed to achieve the purpose of reducing the radiation energy of the camera, so that the influence of the camera on the navigation frequency band of the navigator is reduced.
Referring to fig. 5, fig. 5 is a flowchart illustrating a navigation control method of a mobile terminal according to another embodiment of the present application. In this embodiment, the frequency interference to the navigator is reduced by both changing the frequency parameter of the camera and reducing its power parameter. Specifically, as shown in fig. 5, the navigation control method of this embodiment includes the following steps:
step S50: and receiving a live-action navigation instruction input by a user. The instruction can be received through a touch screen of the mobile terminal, and the instruction can also be received through an entity key arranged on the mobile terminal. The specific steps are as described above, and are not described herein again.
Step S51: and simultaneously starting the camera and the navigator according to the instruction so as to obtain the current live-action image through the camera to assist the navigator. The camera can be a rear camera of the mobile terminal. The specific steps are as described above, and are not described herein again.
Step S52: and changing the frequency parameter of the camera and reducing the power parameter of the camera so as to reduce the frequency interference of the camera to the navigator.
The specific implementation of this step is as described in step S32 and step S42, and will not be described herein again.
The embodiment of the application also provides another mobile terminal, and the mobile terminal can realize the navigation control method. Please refer to the following embodiments.
Referring to fig. 6, fig. 6 is a schematic structural diagram of another mobile terminal according to an embodiment of the present disclosure. As shown in fig. 6, the mobile terminal 60 of the present embodiment includes an input module 61, a start module 62, and a control module 63.
The input module 61 is configured to receive an instruction of live-action navigation input by a user. The detailed method is as described above and will not be described herein.
The starting module 62 is configured to start the camera and the navigator according to the instruction, so as to obtain the current live-action image through the camera to assist the navigator. The detailed method is as described above and will not be described herein.
The control module 63 is used to change the working parameters of the camera to reduce the frequency interference of the camera to the navigator. The working parameters may include a frequency parameter and a power parameter of the camera. Specifically, the control module 63 can reduce the camera's frequency interference to the navigator by changing the camera's working frequency, by reducing its power parameter, or by doing both. The power parameter may include a voltage parameter, a current parameter, or both. The specific method of changing the working parameters is as described above and is not repeated here.
Referring to fig. 7, fig. 7 is a schematic structural diagram of another mobile terminal according to an embodiment of the present application. As shown in fig. 7, the mobile terminal 70 of the present embodiment may include a processor 71, a memory 72, a bus 73, a camera 74, a navigator 75, and an input device 76. The processor 71 is electrically connected to the memory 72, the camera 74, the navigator 75 and the input device 76 through the bus 73.
The input device 76 is used for receiving the instructions of live-action navigation input by the user. The input device 76 may be a touch screen of the mobile terminal 70 or physical keys on the mobile terminal. The detailed method is as described above and will not be described herein.
The memory 72 stores program data that can be executed by the processor 71 to implement the navigation control method described hereinbefore.
Specifically, the processor 71 is configured to turn on the camera 74 and the navigator 75 at the same time according to the instruction, so as to obtain the current live-action image through the camera 74 to assist the navigator 75. The detailed method is as described above and will not be described herein.
The processor 71 is further configured to change a working parameter of the camera 74 to reduce the frequency interference of the camera 74 with the navigator 75. The working parameters may include a frequency parameter and a power parameter of the camera 74. Specifically, the processor 71 can reduce the frequency interference of the camera 74 to the navigator 75 by changing the working frequency of the camera 74, by reducing its power parameter, or by doing both. The power parameter may include a voltage parameter, a current parameter, or both. The specific method of changing the working parameters is as described above and is not repeated here.
Referring to fig. 8, fig. 8 is a schematic structural diagram of another mobile terminal according to an embodiment of the present application. As shown in fig. 8, the mobile terminal 800 includes an RF circuit 810, a memory 820, an input unit 830, a display unit 840, a sensor 850, an audio circuit 860, a wifi module 870, a processor 880, a power supply 890, a navigator 8110, a camera 8111, and the like. The RF circuit 810, the memory 820, the input unit 830, the display unit 840, the sensor 850, the audio circuit 860, the wifi module 870, the navigator 8110, and the camera 8111 are respectively connected to the processor 880; the power supply 890 provides power to the entire mobile terminal 800.
Specifically, RF circuit 810 is used to receive and transmit signals; the memory 820 is used for storing data instruction information; the input unit 830 is used for inputting information, and may specifically include a touch panel 831 and other input devices 832 such as operation keys; the display unit 840 may include a display panel 841 and the like; the sensor 850 includes an infrared sensor, a laser sensor, etc. for detecting a user approach signal, a distance signal, etc.; a speaker 861 and a microphone 862 are electrically connected to the processor 880 through the audio circuitry 860 for receiving and transmitting sound signals; wifi module 870 is configured to receive and transmit wifi signals.
The input unit 830 is also configured to receive an instruction of live-action navigation input by the user.
The processor 880 is configured to turn on the camera 8111 and the navigator 8110 simultaneously according to the instruction, so as to obtain a current live-action image through the camera 8111 to assist the navigator 8110. The detailed method is as described above and will not be described herein.
The processor 880 is further configured to change a working parameter of the camera 8111 to reduce the frequency interference of the camera 8111 with the navigator 8110. The working parameters may include a frequency parameter and a power parameter of the camera 8111. Specifically, the processor 880 can reduce the frequency interference of the camera 8111 to the navigator 8110 by changing the working frequency of the camera 8111, by reducing its power parameter, or by doing both. The power parameter may include a voltage parameter, a current parameter, or both. The specific method of changing the working parameters is as described above and is not repeated here.
The memory 820 is used for storing information such as operation instructions of the processor 880. For the specific operation flow of the processor 880, please refer to the detailed description of the above method embodiments.
Referring to fig. 9, fig. 9 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present disclosure.
The computer-readable storage medium 90 stores a computer program 91, and the computer program 91 can be executed to implement the navigation control method of the mobile terminal set forth in the above embodiments, which will not be described herein again.
As will be appreciated by those skilled in the art, the computer-readable storage medium 90 may be a physical storage medium such as a USB flash drive or an optical disc, or may be a virtual storage medium such as a server.
To sum up, the navigation control method, mobile terminal and computer-readable storage medium provided by the embodiments of the application change the working parameters of the camera when the camera and the navigator are turned on simultaneously, so as to reduce the frequency interference of the camera to the navigator; the navigator can thus accurately receive and transmit navigation information within the navigation band, improving its accuracy and real-time performance.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings, or which are directly or indirectly applied to other related technical fields, are intended to be included within the scope of the present application.
Claims (4)
1. A navigation control method of a mobile terminal, wherein the mobile terminal comprises a camera and a navigator, and is characterized by comprising the following steps:
receiving a live-action navigation instruction input by a user;
simultaneously starting the camera and the navigator according to the instruction so as to obtain a current live-action image through the camera to assist the navigator;
changing a frequency parameter of the camera and reducing a power parameter of the camera to reduce frequency interference of the camera to the navigator, wherein the reducing the power parameter of the camera comprises reducing a voltage parameter and/or a current parameter of the camera.
2. A mobile terminal, characterized in that the mobile terminal comprises:
a camera and a navigator;
the input module is used for receiving a live-action navigation instruction input by a user;
the starting module is used for simultaneously starting the camera and the navigator according to the instruction so as to obtain a current live-action image through the camera to assist the navigator;
the control module is used for changing the frequency parameter of the camera and reducing the power parameter of the camera so as to reduce the frequency interference of the camera on the navigator, wherein the reduction of the power parameter of the camera comprises the reduction of the voltage parameter and/or the current parameter of the camera.
3. A mobile terminal, comprising a processor, a memory electrically connected to the processor, and an input device, wherein:
the input device is used for receiving a live-action navigation instruction input by a user;
the memory stores program data executable by the processor to implement the navigation control method of claim 1.
4. A computer-readable storage medium, characterized in that the readable storage medium stores a computer program executable to implement the navigation control method of claim 1.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711043741.8A CN107843253B (en) | 2017-10-30 | 2017-10-30 | Navigation control method, mobile terminal, and computer-readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711043741.8A CN107843253B (en) | 2017-10-30 | 2017-10-30 | Navigation control method, mobile terminal, and computer-readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107843253A CN107843253A (en) | 2018-03-27 |
CN107843253B true CN107843253B (en) | 2021-06-01 |
Family
ID=61681123
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711043741.8A Active CN107843253B (en) | 2017-10-30 | 2017-10-30 | Navigation control method, mobile terminal, and computer-readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107843253B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109067429B (en) * | 2018-08-24 | 2020-07-28 | 维沃移动通信有限公司 | Control method and terminal equipment |
CN111491072B (en) * | 2020-04-20 | 2022-05-10 | 维沃移动通信有限公司 | Pixel clock frequency adjusting method and device and electronic equipment |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102338639A (en) * | 2010-07-26 | 2012-02-01 | 联想(北京)有限公司 | Information processing device and information processing method |
CN103379643A (en) * | 2012-04-11 | 2013-10-30 | 中兴通讯股份有限公司 | Method and device for eliminating interference |
CN104838284A (en) * | 2012-11-08 | 2015-08-12 | 蓝泰科尼克有限公司 | Recording method for at least two ToF cameras |
CN106595641A (en) * | 2016-12-29 | 2017-04-26 | 深圳前海弘稼科技有限公司 | Travelling navigation method and device |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101648339B1 (en) * | 2009-09-24 | 2016-08-17 | 삼성전자주식회사 | Apparatus and method for providing service using a sensor and image recognition in portable terminal |
- 2017-10-30: CN201711043741.8A filed in China; granted as CN107843253B (status: active)
Also Published As
Publication number | Publication date |
---|---|
CN107843253A (en) | 2018-03-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8817160B2 (en) | Mobile terminal and method of controlling the same | |
AU2017440899B2 (en) | Photographing method and terminal | |
CN115525383B (en) | Wallpaper display method and device, mobile terminal and storage medium | |
WO2013179712A1 (en) | Information processing device, information processing method, and program | |
CN107843253B (en) | Navigation control method, mobile terminal, and computer-readable storage medium | |
US11131557B2 (en) | Full-vision navigation and positioning method, intelligent terminal and storage device | |
US20220207803A1 (en) | Method for editing image, storage medium, and electronic device | |
CN109089137B (en) | Stuck detection method and device | |
CN111565309B (en) | Display device and distortion parameter determination method, device and system thereof, and storage medium | |
CN108055635A (en) | Acquisition methods, device, storage medium and the terminal of location information | |
CN111083554A (en) | Method and device for displaying live gift | |
CN113191976B (en) | Image shooting method, device, terminal and storage medium | |
CN110971840B (en) | Video mapping method and device, computer equipment and storage medium | |
CN111008083B (en) | Page communication method and device, electronic equipment and storage medium | |
CN114489314A (en) | Augmented reality image display method and related device | |
CN110990623A (en) | Method and device for displaying audio subtitles, computer equipment and storage medium | |
JP2011196787A (en) | Video processing apparatus, video processing method and video imaging apparatus | |
EP4007229A1 (en) | Bandwidth determination method and apparatus, and terminal, and storage medium | |
CN112260845A (en) | Method and device for accelerating data transmission | |
US11922093B2 (en) | Device control method and apparatus | |
CN109688446B (en) | Control method, controlled device and control system | |
CN115086200B (en) | Packet loss type determining method and device, electronic equipment and storage medium | |
CN117911482B (en) | Image processing method and device | |
CN110660031B (en) | Image sharpening method and device and storage medium | |
KR20150040436A (en) | Electronic Device And Method Of Controlling The Same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
CB02 | Change of applicant information | Address after: No. 18 Wusha Beach Road, Chang'an Town, Dongguan 523860, Guangdong. Applicant after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd. Address before: No. 18 Wusha Beach Road, Chang'an Town, Dongguan 523860, Guangdong. Applicant before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd. ||
GR01 | Patent grant | ||