CN116929351A - Navigation method and electronic equipment - Google Patents
Navigation method and electronic equipment
- Publication number
- CN116929351A (application number CN202210342590.0A)
- Authority
- CN
- China
- Prior art keywords
- speed
- visual element
- navigation
- motion
- guiding
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/18—Stabilised platforms, e.g. by gyroscope
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/203—Specially adapted for sailing ships
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/24—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for cosmonautical navigation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01D—MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
- G01D21/00—Measuring or testing not otherwise provided for
- G01D21/02—Measuring two or more variables by means not covered by a single other subclass
Abstract
The present application provides a navigation method and an electronic device, and relates to the field of terminal technologies. The method includes: obtaining a first movement speed of a navigation target; obtaining a guiding speed corresponding to the navigation target; and displaying a moving first visual element based on the first movement speed and the guiding speed, where the first visual element is used for indicating that the navigation target should accelerate or decelerate, and a second movement speed of the first visual element is positively correlated with the absolute value of the difference between the first movement speed and the guiding speed. The technical solution provided by the present application indicates the degree of acceleration or deceleration of the navigation target more intuitively and improves the navigation effect.
Description
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a navigation method and an electronic device.
Background
Travel is an extremely important human activity: through travel, people explore unknown areas, transport materials, exercise, and engage in recreation, and navigation is the technology that keeps travel convenient, safe, and comfortable. With the accumulation of human travel experience and the continuous development of science and technology, navigation technology has made steady long-term progress.
An electronic device can display the current movement speed of a user together with a preset maximum or minimum speed, and the user then adjusts the movement speed according to the maximum or minimum speed so as to move below the maximum speed or above the minimum speed. However, this kind of reminder is easily overlooked by the user, and the resulting navigation effect is poor.
Disclosure of Invention
In view of this, the present application provides a navigation method and an electronic device, which can more intuitively indicate the acceleration or deceleration degree of a navigation target, and improve the navigation effect.
In order to achieve the above object, in a first aspect, an embodiment of the present application provides a navigation method, including:
acquiring a first movement speed of a navigation target;
acquiring a guiding speed corresponding to the navigation target;
and displaying a moving first visual element based on the first movement speed and the guiding speed, wherein the first visual element is used for indicating that the navigation target should accelerate or decelerate, and a second movement speed of the first visual element is positively correlated with the absolute value of the difference between the first movement speed and the guiding speed.
The navigation target may be a device being navigated.
The guidance speed may be a recommended movement speed for the navigation target.
Visual elements can serve as tools and media for conveying information; they may include information elements such as graphics, text, and shapes, and may also include elements in forms such as points, lines, surfaces, colors, and spaces.
In the embodiments of the present application, the first movement speed of the navigation target and the guiding speed corresponding to the navigation target can be obtained, and a moving first visual element is displayed based on the first movement speed and the guiding speed. Because a moving first visual element attracts the user's attention more easily, because the first visual element indicates that the navigation target should accelerate or decelerate, and because the second movement speed of the first visual element is positively correlated with the absolute value of the difference between the first movement speed of the navigation target and the guiding speed, the degree of acceleration or deceleration of the navigation target can be indicated more intuitively, and the navigation effect is improved.
It should be noted that the navigation target and the device for performing navigation may be the same device or may be different devices, for example, the device for performing navigation may be a first device, and the navigation target may be a first device or a second device. In some embodiments, the second device may be a carrier.
It should also be noted that the moving first visual element may be understood as a first visual element whose position in the navigation interface changes. A moving first visual element attracts the user's attention more easily, and can therefore guide the user to accelerate or decelerate in time and improve the navigation effect.
In some embodiments, the second movement speed of the first visual element may be equal to the absolute value of the difference between the first movement speed and the guiding speed.
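For illustration only, the positive correlation between the element's movement speed and the speed deviation could be realised by a simple monotonic mapping such as the one sketched below; the gain and the cap are hypothetical parameters, not values taken from this application.

```python
def element_speed(target_speed: float, guidance_speed: float,
                  gain: float = 1.0, max_speed: float = 50.0) -> float:
    """Map the deviation |first movement speed - guiding speed| to the
    on-screen speed of the first visual element (e.g. pixels per second).

    Any monotonically increasing mapping preserves the positive correlation
    described above; gain and max_speed are illustrative assumptions.
    """
    deviation = abs(target_speed - guidance_speed)
    return min(gain * deviation, max_speed)
```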
In some embodiments, the guiding speed and the first movement speed may be measured relative to the same reference.
In some embodiments, the displaying a moving first visual element based on the first movement speed and the guiding speed includes:
if the first movement speed is less than the guiding speed, displaying the first visual element moving along a first direction;
if the first movement speed is greater than the guiding speed, displaying the first visual element moving along a second direction;
wherein the first direction and the second direction are different.
In some embodiments, the first direction and the second direction are parallel and opposite.
In some embodiments, the first direction is a direction in which a bottom of a navigation interface points to a center position of the navigation interface, the second direction is a direction in which a center position of the navigation interface points to a bottom of the navigation interface, and the navigation interface is an interface including the first visual element.
In some embodiments, the first direction may point from a position near the user to a position far from the user, so that the driver more intuitively perceives the first visual element as moving away from the vehicle; the driver is then more naturally prompted to accelerate and catch up, the driver's reaction time is reduced, and the reliability of assisted driving is improved. The second direction may point from a position far from the user to a position near the user, so that the driver perceives the first visual element as approaching the vehicle and is more naturally prompted to decelerate and keep clear, which likewise reduces the driver's reaction time. Of course, in practical applications, the first direction and the second direction may be other directions.
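A minimal sketch of the direction selection described above, assuming screen-space unit vectors for the first and second directions (from the bottom of the navigation interface toward its centre, and the reverse); the vectors are illustrative assumptions, not definitions from this application.

```python
def element_motion(target_speed: float, guidance_speed: float):
    """Return (direction_vector, speed) for the first visual element.

    First direction: bottom of the interface toward its centre (the element
    appears to move away, prompting the driver to speed up); second
    direction is the opposite (the element approaches, prompting the driver
    to slow down). The unit vectors are screen-space assumptions.
    """
    speed = abs(target_speed - guidance_speed)
    if target_speed < guidance_speed:
        direction = (0.0, -1.0)   # upward on screen: bottom toward centre
    elif target_speed > guidance_speed:
        direction = (0.0, 1.0)    # downward on screen: centre toward bottom
    else:
        direction = (0.0, 0.0)    # speeds match: no motion displayed
    return direction, speed
```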
In some embodiments, the displaying a moving first visual element based on the first movement speed and the guiding speed includes:
displaying the moving first visual element when it is detected that the first movement speed is not consistent with the guiding speed;
stopping displaying the moving first visual element when it is detected that the first movement speed is consistent with the guiding speed.
That is, the user is guided to adjust the movement speed of the user or the carrier as soon as possible, so that the movement speed is consistent with the guiding speed, and the navigation accuracy can be improved.
In some embodiments, the first device stopping displaying the moving first visual element upon detecting that the first movement speed is consistent with the guiding speed may include: the first device immediately stops displaying the moving first visual element when it detects that the first movement speed is consistent with the guiding speed; or the first device stops displaying the moving first visual element only when it detects that the first movement speed is consistent with the guiding speed and the duration of this consistency exceeds a preset second duration threshold. The latter lengthens the time for which the moving first visual element is displayed, guides the user to adjust the movement speed of the user or the carrier for a longer time, and reduces the deviation between the movement speed and the guiding speed.
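One way to realise the "stop displaying only after the speeds have stayed consistent for longer than the second duration threshold" behaviour is a small dwell timer, sketched below; the tolerance and threshold values are hypothetical.

```python
import time

class ConsistencyTimer:
    """Hide the moving visual element only after the first movement speed has
    matched the guiding speed continuously for hold_s seconds (the 'second
    duration threshold'). Tolerance and threshold values are examples."""

    def __init__(self, tolerance: float = 0.5, hold_s: float = 2.0):
        self.tolerance = tolerance
        self.hold_s = hold_s
        self._matched_since = None

    def should_display(self, target_speed: float, guidance_speed: float) -> bool:
        now = time.monotonic()
        if abs(target_speed - guidance_speed) <= self.tolerance:
            if self._matched_since is None:
                self._matched_since = now
            # keep displaying until the match has lasted long enough
            return (now - self._matched_since) < self.hold_s
        self._matched_since = None
        return True
```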
In some embodiments, the displaying a moving first visual element based on the first movement speed and the guiding speed includes:
displaying the moving first visual element when it is detected that the first movement speed is not in a first speed range;
stopping displaying the moving first visual element when it is detected that the first movement speed is in the first speed range;
wherein the guiding speed is a value within the first speed range.
That is, the user is guided to adjust the movement speed of the user or the carrier until it is close to or within the first speed range. This way of guiding the speed allows the user to control the movement speed of the user or the carrier more freely and flexibly, which improves the navigation flexibility and the user experience.
In some embodiments, the displaying a moving first visual element based on the first movement speed and the guiding speed includes:
displaying the moving first visual element when it is detected that the first movement speed is not in a first speed range;
stopping displaying the moving first visual element when it is detected that the first movement speed is consistent with the guiding speed;
wherein the guiding speed is a value within the first speed range.
That is, when the difference between the first movement speed of the navigation target and the recommended guiding speed is large, the first visual element of the movement starts to be displayed to guide the user to adjust the movement speed of the user or the carrier until the movement speed is consistent with the guiding speed.
In some embodiments, the first device stopping displaying the moving first visual element upon detecting that the first movement speed is in the first speed range may include: the first device immediately stops displaying the moving first visual element when it detects that the first movement speed is in the first speed range, which shortens the time for which the moving first visual element is displayed and lets the user control the movement speed of the user or the carrier more freely and flexibly; or the first device stops displaying the moving first visual element only when it detects that the first movement speed is in the first speed range and the duration for which the first movement speed stays in the first speed range exceeds a preset first duration threshold, which lengthens the time for which the moving first visual element is displayed, guides the user to adjust the movement speed of the user or the carrier for a longer time, and reduces the deviation between the movement speed and the guiding speed or the first speed range.
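A comparable sketch for the speed-range variant: the element is shown while the first movement speed is outside the first speed range and hidden once it has stayed inside the range for longer than the first duration threshold; the range bounds and the threshold below are illustrative assumptions.

```python
import time

class SpeedRangePolicy:
    """Show the moving element while the speed is outside [low, high]; hide it
    after the speed has stayed inside the range for hold_s seconds (the
    'first duration threshold'). All values are illustrative."""

    def __init__(self, low: float, high: float, hold_s: float = 2.0):
        self.low, self.high, self.hold_s = low, high, hold_s
        self._inside_since = None

    def should_display(self, target_speed: float) -> bool:
        now = time.monotonic()
        if self.low <= target_speed <= self.high:
            if self._inside_since is None:
                self._inside_since = now
            return (now - self._inside_since) < self.hold_s
        self._inside_since = None
        return True
```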
In some embodiments, the first visual element comprises an arrow or a light wave.
In some embodiments, the method further comprises:
and displaying a moving second visual element based on the first movement speed and the guiding speed, wherein one of the first visual element and the second visual element is used for indicating that the navigation target should accelerate, the other is used for indicating that the navigation target should decelerate, and a third movement speed of the second visual element is positively correlated with the absolute value of the difference between the first movement speed and the guiding speed.
In some embodiments, the first visual element is used for indicating that the navigation target should accelerate, and the second visual element is used for indicating that the navigation target should decelerate. The displaying a moving first visual element based on the first movement speed and the guiding speed includes:
if the first movement speed is less than the guiding speed, displaying the first visual element moving along a first direction;
and the displaying a moving second visual element based on the first movement speed and the guiding speed includes:
if the first movement speed is greater than the guiding speed, displaying the second visual element moving along a second direction;
wherein the first direction and the second direction are different.
That is, by displaying the first visual element and the second visual element in different styles, acceleration and deceleration of the navigation target can each be indicated more intuitively, so that the user can more easily make different reactions to the first visual element and to the second visual element, which further improves the navigation accuracy and the user experience.
Of course, in practical applications, the first visual element may instead be used to indicate that the navigation target should decelerate, and the second visual element may be used to indicate that the navigation target should accelerate. In that case, if the first movement speed is greater than the guiding speed, the first device displays the first visual element moving along the second direction.
In some embodiments, the stopping displaying the moving first visual element may include displaying the stationary first visual element or hiding the first visual element; similarly, ceasing to display the moving second visual element may include displaying the stationary second visual element or hiding the second visual element.
In some embodiments, the acquiring the guiding speed corresponding to the navigation target includes:
acquiring motion environment information corresponding to the navigation target, wherein the motion environment information is used for indicating the environment where the navigation target is located;
the guidance speed is determined based on the motion environment information.
In some embodiments, the first device may obtain, based on the determined motion environment information, a stored guidance speed corresponding to the motion environment information.
In some embodiments, the first device may input the motion environment information to a stored first machine learning model and obtain the guiding speed output by the first machine learning model.
In some embodiments, the first device may obtain a movement route and a movement time limit submitted by the user, and determine the guidance speed based on the movement route and the movement time limit. Of course, in practical applications, the first device may determine the guiding speed in other ways.
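As an illustration of the route-and-time-limit case only, a guiding speed could be estimated as the route length divided by the time limit, optionally clamped to a legal limit; the function name, the clamping behaviour, and the parameter values below are assumptions, not details taken from this application.

```python
from typing import Optional

def guidance_speed_from_route(route_length_m: float,
                              time_limit_s: float,
                              legal_limit_mps: Optional[float] = None) -> float:
    """Estimate a guiding speed (m/s) that covers the movement route within
    the movement time limit; optionally clamp it to a legal speed limit.
    Illustrative sketch only."""
    if time_limit_s <= 0:
        raise ValueError("time limit must be positive")
    speed = route_length_m / time_limit_s
    if legal_limit_mps is not None:
        speed = min(speed, legal_limit_mps)
    return speed

# Example: a 5 km route to be covered within 10 minutes
# guidance_speed_from_route(5_000, 600) -> about 8.33 m/s (30 km/h)
```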
In some embodiments, the motion environment information includes at least one of location information, weather information, and obstacle information.
The location information may be used to indicate the location of the navigation target. In some embodiments, the location information may include the location coordinates, such as longitude and latitude, of the navigation target. In some embodiments, the location information may include at least one of a road identifier, a road segment identifier, and a road segment type of the location of the navigation target, where the road segment type may be defined by professionals such as a traffic authority; for example, road segment types may include a straight lane, a left-turn lane, a right-turn lane, a main road, a secondary road, an auxiliary road, a directional ramp, a semi-directional ramp, a general ramp, a collector-distributor lane, a highway, and the like. In some embodiments, the location information may include a height above the ground. Of course, in practical applications, the location information may also include other information that can indicate the location of the navigation target.
The weather information may be used to indicate weather of the region in which the navigation target is located. In some embodiments, the weather information may include information such as haze, rain, snow, and visibility. Of course, in practical applications, the weather information may also include other information that can indicate weather.
The obstacle information may be information indicating a position and a state of an obstacle within a first preset range of the navigation target, where the obstacle may be an object that obstructs the navigation target from passing through, such as a wall, a guardrail, a pedestrian, a vehicle, and the like. In some embodiments, the obstacle information may include one or more of a type of obstacle, an azimuth of the obstacle, a distance of the obstacle from the navigation target, and a moving speed of the obstacle. Of course, in practical applications, the obstacle information may also include other information that can indicate the position or state of the obstacle.
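For illustration, the motion environment information could be grouped into a simple record such as the following; all field names and default values are hypothetical, not terms defined by this application.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ObstacleInfo:
    kind: str           # e.g. "pedestrian", "vehicle", "guardrail"
    azimuth_deg: float  # bearing of the obstacle relative to the navigation target
    distance_m: float   # distance between the obstacle and the navigation target
    speed_mps: float    # moving speed of the obstacle

@dataclass
class MotionEnvironment:
    # location information: coordinates, road/segment identifiers, segment type, ...
    latitude: float
    longitude: float
    road_segment_type: str = "unknown"
    # weather information: visibility, rain/snow/haze, ...
    weather: str = "clear"
    visibility_m: float = 10_000.0
    # obstacle information within the first preset range
    obstacles: List[ObstacleInfo] = field(default_factory=list)
```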
In a second aspect, an embodiment of the present application provides a navigation device, which is included in an electronic apparatus, and has a function of implementing the method according to any one of the first aspect. The functions may be realized by hardware, or may be realized by hardware executing corresponding software. The hardware or software includes one or more modules or units corresponding to the functions described above. Such as a transceiver module or unit, a processing module or unit, an acquisition module or unit, etc.
In a third aspect, an embodiment of the present application provides an electronic device, including: a memory and a processor, the memory for storing a computer program; the processor is configured to perform the method of any of the above first aspects when the computer program is invoked.
In some embodiments, the electronic device may be carried by the user, or may be disposed on or integrated into a carrier. In these cases the electronic device and the user or the carrier are co-located, and when the user or the carrier moves, it also drives the electronic device to move, so that the electronic device and the user or the carrier have the same motion state. In some embodiments, the electronic device being integrated into a carrier may mean that there is both a mechanical connection and a communication connection between the electronic device and the carrier; when the electronic device is integrated into the carrier, the electronic device may be regarded as part of the carrier, or the electronic device and the carrier may be regarded as the same device. In other embodiments, the electronic device being disposed on a carrier may mean that there is a mechanical connection between the electronic device and the carrier, for example the carrier includes a cradle for mounting the electronic device and the electronic device is mounted on the cradle; alternatively, it may mean that there is no mechanical connection between the electronic device and the carrier as long as they remain relatively stationary, for example the electronic device may be placed on some surface of the carrier.
In some embodiments, the electronic device may be communicatively coupled to the carrier regardless of the positional relationship and the connection relationship between the electronic device and the carrier. In some embodiments, the communication connection may be a connection based on a near field communication technology.
In a fourth aspect, an embodiment of the present application provides a vehicle, where the vehicle includes the electronic device of the third aspect, the electronic device is an on-board device in the vehicle, and the electronic device further includes a head-up display (HUD);
the HUD is for displaying a first visual element.
The HUD may also be referred to as a head-up display. The HUD may be provided in a vehicle or the like and projects an image onto the windshield in front of the driver, so that the driver can see the projected image without lowering their head. The HUD may include an image generation unit and an optical display system. The image generation unit may include a light source, an optical film, and other optical components for generating an image. The optical display system may include a mirror, a control unit, and a front windshield; the mirror and the front windshield cooperate to eliminate image distortion, and the control unit can access the information to be displayed, such as navigation information, so that the image generation unit can generate an image from that information. In some embodiments, the image displayed by the head-up display on the front windshield may be superimposed on the exterior scene outside the front windshield.
In practical applications, the electronic device may be disposed in the vehicle.
In a fifth aspect, an embodiment of the present application provides a chip system, including a processor coupled to a memory, the processor executing a computer program stored in the memory to implement the method of any one of the first aspects.
The chip system can be a single chip or a chip module formed by a plurality of chips.
In a sixth aspect, an embodiment of the present application provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any of the first aspects described above.
In a seventh aspect, embodiments of the present application provide a computer program product for, when run on an electronic device, causing the electronic device to perform the method of any one of the first aspects.
It will be appreciated that the advantages of the second to seventh aspects may be found in the relevant description of the first aspect, and are not described here again.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 2 is a schematic diagram of a display principle of a HUD according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a navigation scenario provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of another navigation scenario provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of a navigation scenario provided by an embodiment of the present application;
FIG. 6 is a flow chart of a navigation method according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a navigation interface according to an embodiment of the present application;
FIG. 8 is a schematic diagram of another navigation interface according to an embodiment of the present application;
FIG. 9 is a schematic diagram of another navigation interface according to an embodiment of the present application;
FIG. 10 is a schematic diagram of another navigation interface according to an embodiment of the present application;
FIG. 11 is a schematic diagram of another navigation interface according to an embodiment of the present application;
FIG. 12 is a schematic diagram of another navigation interface provided by an embodiment of the present application;
FIG. 13 is a schematic diagram of another navigation interface according to an embodiment of the present application;
FIG. 14 is a schematic diagram of another navigation interface provided by an embodiment of the present application;
fig. 15 is a flowchart of another navigation method according to an embodiment of the present application.
Detailed Description
The navigation method provided by the embodiments of the present application can be applied to electronic devices such as mobile phones, tablet computers, wearable devices, vehicle-mounted devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, personal digital assistants (PDA), treadmills, rowing machines, and spinning bicycles, for navigation or guidance in fields such as marine, aviation, aerospace, hydrological, or land traffic, as well as sports and fitness.
Fig. 1 is a schematic structural diagram of an electronic device 100 according to the present application. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, an antenna 1, a communication module 150, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, and a display 194, among others. The sensor module 180 may include a pressure sensor, a gyroscope sensor, a barometric sensor, an acceleration sensor, a distance sensor, a proximity sensor, a fingerprint sensor, a touch sensor, an ambient light sensor, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), and/or a baseband processor, etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are reduced, reducing the latency of the processor 110, and thus improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include controller area network (controller area network, CAN) interfaces, integrated circuit (inter-integrated circuit, I2C) interfaces, integrated circuit built-in audio (inter-integrated circuit sound, I2S) interfaces, pulse code modulation (pulse code modulation, PCM) interfaces, universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interfaces, mobile industry processor interfaces (mobile industry processor interface, MIPI), general-purpose input/output (GPIO) interfaces, subscriber identity module (subscriber identity module, SIM) interfaces, and/or universal serial bus (universal serial bus, USB) interfaces, among others.
The CAN interface is a standard field bus used in automotive computer control systems and embedded industrial control local area networks. The I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL). The I2S interface and the PCM interface can be used for audio communication. The UART interface is a universal serial data bus used for asynchronous communication. The MIPI interface can be used to connect the processor 110 with the display 194 and with peripheral devices such as the camera 193, and includes interfaces such as the camera serial interface (CSI) and the display serial interface (DSI).
The interfaces described above may be used to couple multiple components included in the electronic device 100 so that these components can communicate with one another. For example: the processor 110 may be coupled to the touch sensor through an I2C interface, so that the processor 110 and the touch sensor communicate through the I2C bus interface to implement the touch function of the electronic device 100; the audio module 170 may transmit an audio signal to the communication module 150 through the I2S interface or the PCM interface, so as to play the audio signal through a Bluetooth headset; the processor 110 and the camera 193 communicate through a CSI interface to implement the photographing function of the electronic device 100; the processor 110 and the display 194 communicate through a DSI interface to implement the display function of the electronic device 100; and the USB interface 130 may be used to connect a charger to charge the electronic device 100, to transfer data between the electronic device 100 and peripheral devices such as a mouse, a keyboard, or a game pad, to connect an earphone and play audio through the earphone, or to connect other electronic devices, such as an AR device.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The wireless communication function of the electronic device 100 can be realized by the antenna 1, the communication module 150, and the like.
The antenna 1 is used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas.
The communication module 150 may provide solutions for wireless communication including 2G/3G/4G/5G/wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied on the electronic device 100. The communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. In some embodiments, at least some of the functional modules of the communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
In some embodiments, the antenna 1 and the communication module 150 of the electronic device 100 are coupled such that the electronic device 100 may communicate with a network and other devices through wireless communication techniques. Wireless communication techniques may include global system for mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display 194, and an application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information. The display 194 is for displaying images, videos, and the like.
The electronic device 100 may implement photographing functions through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like. The camera 193 is used to capture still images or video, the ISP is used to process the data fed back by the camera 193, the digital signal processor is used to process digital signals such as digital image signals, and the video codec is used to compress or decompress digital video.
In some embodiments, the display 194 may include a HUD. The HUD may also be referred to as a head-up display. The HUD may be provided in a vehicle or the like and projects an image onto the windshield in front of the driver, so that the driver can see the projected image without lowering their head. The HUD may include an image generation unit and an optical display system. The image generation unit may include a light source, an optical film, and other optical components for generating an image. The optical display system may include a mirror, a control unit, and a front windshield; the mirror and the front windshield cooperate to eliminate image distortion, and the control unit can access the information to be displayed, such as navigation information, so that the image generation unit can generate an image from that information. In some embodiments, the image displayed by the head-up display on the front windshield may be superimposed on the exterior scene outside the front windshield. In some embodiments, as shown in fig. 2, the HUD may include a light source 210, an aspherical mirror 220, and a front windshield 230. The light source 210 projects an image onto the aspherical mirror 220, the aspherical mirror 220 reflects the image onto the front windshield 230, and the image projected on the front windshield 230 is finally superimposed on the scene the driver sees outside the front windshield 230, so that the driver sees a combined virtual-and-real scene.
In some embodiments, the display 194 may include a head mounted display that may be worn on the driver's head. The head mounted display may include a head tracker and an image processing unit, which may be used to track the driver's head position and line-of-sight angle, and an optical display system, which may be used to generate the images to be displayed in accordance with that head position and line-of-sight angle.
In some embodiments, the display 194 may include a multi-function display (MFD). The MFD may also be referred to as a look-down display and may be mounted in the vehicle within the driver's lower field of view, so that the driver can view the displayed information by looking down. The MFD may include a cathode ray tube display and a flat panel display.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store, among other things, an operating system, application programs required for at least one function (such as a navigation function, a game function, etc.), and the like. The storage data area may store data created during use of the electronic device 100, etc. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as a "earpiece", is used to convert the audio electrical signal into a sound signal. When electronic device 100 is answering a telephone call or voice message, voice may be received by placing receiver 170B in close proximity to the human ear.
Microphone 170C, also referred to as a "microphone" or "microphone", is used to convert sound signals into electrical signals.
The earphone interface 170D is used to connect a wired earphone.
The pressure sensor is used to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor may be disposed on the display 194. There are many kinds of pressure sensors, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates made of conductive material; when a force acts on the pressure sensor, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor, and the electronic device 100 may also calculate the touch location from the detection signal of the pressure sensor. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is less than a first pressure threshold acts on the acceleration icon or the deceleration icon, the carrier is accelerated or decelerated; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the acceleration icon or the deceleration icon, the acceleration and deceleration icons can be locked or unlocked. If the acceleration and deceleration icons are in the locked state, the carrier keeps running at its current speed and does not accelerate or decelerate; if they are in the unlocked state, the driver can control the carrier to accelerate or decelerate through touch operations whose intensity is less than the first pressure threshold.
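A minimal sketch of the pressure-dependent icon behaviour described above; the threshold, the speed step, and the icon names are illustrative assumptions.

```python
class SpeedIconController:
    """Light presses on the acceleration/deceleration icon adjust the carrier's
    speed; presses at or above the first pressure threshold toggle the lock.
    Threshold and step values are illustrative."""

    def __init__(self, pressure_threshold: float = 0.6, step_mps: float = 1.0):
        self.pressure_threshold = pressure_threshold
        self.step_mps = step_mps
        self.locked = False

    def on_touch(self, icon: str, pressure: float, current_speed_mps: float) -> float:
        """Return the new target speed after a touch on 'accelerate' or 'decelerate'."""
        if pressure >= self.pressure_threshold:
            self.locked = not self.locked      # firm press: lock or unlock the icons
            return current_speed_mps           # keep the current speed
        if self.locked:
            return current_speed_mps           # locked: light presses are ignored
        if icon == "accelerate":
            return current_speed_mps + self.step_mps
        if icon == "decelerate":
            return max(0.0, current_speed_mps - self.step_mps)
        return current_speed_mps
```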
The gyroscope sensor may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocities of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyroscope sensor. The gyroscope sensor may be used in navigation and in motion-sensing game scenarios.
The air pressure sensor is used for measuring air pressure. In some embodiments, the electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensors, aiding in positioning and navigation.
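The altitude calculation mentioned above is commonly done with the international barometric formula; the sketch below uses the standard sea-level reference pressure, which in practice would be calibrated, and is shown only as a general illustration, not a formula defined by this application.

```python
def altitude_from_pressure(pressure_pa: float,
                           sea_level_pa: float = 101_325.0) -> float:
    """Approximate altitude (m) from barometric pressure using the
    international barometric formula; accuracy depends on calibrating
    the sea-level reference pressure."""
    return 44_330.0 * (1.0 - (pressure_pa / sea_level_pa) ** (1.0 / 5.255))
```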
The acceleration sensor may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary.
The distance sensor is used to measure distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, the electronic device 100 may use the distance sensor to measure the distance between the vehicle and other vehicles, such as the distance to a vehicle in front or the distance to a vehicle behind.
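For a laser-based (time-of-flight) measurement, the distance follows from half the round-trip time of the light pulse; this is a general textbook relation shown only for illustration.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance (m) from a time-of-flight measurement: the light travels to
    the object and back, so the one-way distance is c * t / 2."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0
```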
The proximity light sensor may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, the electronic device 100 may determine that there is an object nearby; when insufficient reflected light is detected, the electronic device 100 may determine that there is no object nearby. In some embodiments, the electronic device 100 may determine whether there is an obstacle such as a wall nearby by means of the proximity light sensor.
The ambient light sensor is used for sensing ambient light brightness. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. In some embodiments, the electronic device 100 may determine the brightness of the environment through the ambient light sensor, so as to adjust or remind the user to adjust the movement speed, turn on or off the light fixture, etc. according to the brightness.
The fingerprint sensor is used for collecting fingerprints. The electronic device 100 may utilize the captured fingerprint characteristics to effect fingerprint unlocking, such as unlocking a door of a vehicle, starting a vehicle, unlocking the display 194, and so forth.
Touch sensors, also known as "touch panels". The touch sensor may be disposed on the display 194, and a touch screen, also referred to as a "touch screen", is formed by the touch sensor and the display 194. The touch sensor is used to detect a touch operation acting on or near it. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor may also be disposed on a surface of the electronic device 100 at a different location than the display 194.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. The motor 191 may also correspond to different vibration feedback effects by touch operations applied to different areas of the display 194. Different application scenarios (such as acceleration, deceleration, impact, damage, etc. of the vehicle) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light that may be used to indicate the status of the electronic device 100, such as the speed of movement, a change in power, a change in oil, whether the seat belt is worn correctly, etc., and may also be used to indicate a message, missed call, notification, etc.
In some embodiments, the electronic device 100 may also include a drive system that may be used to drive the movement of the electronic device 100. The drive system may move structures such as wheels, propellers, robotic arms, and the like. In some embodiments, the drive system may also include a transmission structure, such as a drive shaft, gears, hinges, tracks, links, and the like, coupled to the moving structure. In some embodiments, the drive system further comprises a power structure, such as a pedal, rocker or engine, for generating or receiving mechanical energy, which may be connected to the moving structure by a transmission structure, such that mechanical energy can be transferred to the moving structure.
In some embodiments, the electronic device 100 may be carried by a user, or may be disposed on or integrated into a carrier. In these cases the electronic device 100 and the user or the carrier are co-located, and when the user or the carrier moves, it also drives the electronic device 100 to move, so that the electronic device 100 and the user or the carrier have the same motion state. In some embodiments, the electronic device 100 being integrated into a carrier may mean that there is both a mechanical connection and a communication connection between the electronic device 100 and the carrier (for example, the electronic device 100 may communicate with other components of the carrier through an interface such as the CAN bus in the carrier); when the electronic device 100 is integrated into the carrier, the electronic device 100 may be regarded as part of the carrier, or the electronic device 100 and the carrier may be regarded as the same device. In other embodiments, the electronic device 100 being disposed on a carrier may mean that there is a mechanical connection between the electronic device 100 and the carrier, for example the carrier includes a stand for mounting the electronic device 100 and the electronic device 100 is mounted on the stand; alternatively, it may mean that there is no mechanical connection between the electronic device 100 and the carrier as long as they remain relatively stationary, for example the electronic device 100 may be placed on some surface of the carrier.
In some embodiments, the electronic device 100 may be communicatively coupled to the carrier regardless of the positional relationship and the connection relationship between the electronic device 100 and the carrier. In some embodiments, the communication connection may be a connection based on a near field communication technology such as WIFI.
In order to facilitate understanding of the technical solutions in the embodiments of the present application, several navigation scenarios are first described below.
Referring to fig. 3, which is a schematic diagram of a navigation scenario provided by an embodiment of the present application, the scenario is navigation during on-site (in-person) driving and includes an electronic device 100 and a first carrier 310. As shown in fig. 3, the first carrier 310 is a vehicle and the electronic device 100 is a mobile phone placed in the vehicle, but it will be understood that in practical applications the first carrier 310 may be any other carrier and the electronic device 100 may be any other type of device.
The first carrier 310 may include a space for accommodating a driver and may further include a space for accommodating the electronic device 100. The electronic device 100 may be disposed on or integrated into the first carrier 310, so that the electronic device 100 and the first carrier 310 are co-located and have the same motion state. If the electronic device 100 is integrated into the first carrier 310, the electronic device 100 may be regarded as an in-vehicle device built into the first carrier 310. At least one of the electronic device 100 and the first carrier 310 may include the aforementioned sensor module 180, so that one or more kinds of information required for navigation, such as the current motion state, location, and external environment of the first carrier 310, can be sensed through the sensor module 180. At least one of the electronic device 100 and the first carrier 310 may include the aforementioned display 194 for displaying a navigation interface that includes navigation information such as the movement route and the movement speed; in some embodiments, the display 194 may be a HUD, a head mounted display, or an MFD.
The driver may be located inside the first carrier 310 or outside the first carrier 310, and may drive the first carrier 310 in real time and in the field through components such as a steering wheel, keys, a rudder, a pull rod, a throttle, and a brake. The electronic device 100 obtains one or more types of information required for navigation through its local end or through the first carrier 310, determines navigation information based on the information, and then displays a navigation interface on the display 194 included in the local end of the electronic device 100 or in the first carrier 310. In some embodiments, if the navigation interface is displayed through the HUD, the navigation interface may be displayed on the front windshield based on the HUD as shown in FIG. 2 to provide a more intuitive and realistic display effect. When the driver sees the navigation interface, the driver can adjust the motion state of the first carrier 310, such as accelerating, decelerating, or steering, with reference to the navigation information included in the navigation interface.
Referring to fig. 4, a schematic diagram of another navigation scenario provided by an embodiment of the present application is navigation under remote driving, and includes an electronic device 100 and a second carrier 320, where the electronic device 100 and the second carrier 320 may be connected through a network. As shown in fig. 4, the second carrier 320 is an unmanned aerial vehicle, but it will be appreciated that in practical applications, the second carrier 320 may be any other carrier that can be remotely driven, such as an unmanned automobile, an unmanned submersible vehicle, and the like. The electronic device 100 may be used to remotely drive the second carrier 320, and the electronic device 100 may also be other types of devices.
The second carrier 320 may include the sensor module 180 described above, so as to obtain one or more types of information required for navigation, such as a current motion state, a location, an external environment, and the like of the second carrier 320. The electronic device 100 may include the display 194 of the foregoing for displaying a navigation interface including navigation information such as a movement route and a movement speed.
The driver may trigger the electronic device 100 to generate various control instructions at one location through a key or a touch pad included on the electronic device 100, and the electronic device 100 may send the control instructions to the second carrier 320 at another location through a network connection, and the second carrier 320 may receive the control instructions and move based on the control instructions. The second carrier 320 may acquire one or more information required for navigation, such as a current motion state, a location, an external environment, and the like, through the sensor module 180, and send the information to the electronic device 100, and the electronic device 100 may determine to obtain navigation information according to the information and display a navigation interface on the display 194 included in the electronic device 100, so that the driver may adjust the motion state of the second carrier 320 based on the navigation information included in the navigation interface displayed by the electronic device 100.
Fig. 5 is a schematic diagram of another navigation scenario provided in an embodiment of the present application, where the navigation scenario is navigation during user motion and includes a user. The user may carry or wear the electronic device 100, such as the mobile phone 510 and the AR glasses 520 shown in fig. 5, and move outdoors or indoors on sports equipment such as a treadmill.
In some embodiments, the electronic device 100 may navigate the user alone. Taking the mobile phone 510 as an example, the mobile phone 510 may obtain one or more types of information required for navigation through the sensor module 180 included in the mobile phone 510, determine to obtain navigation information based on the information, and may display a navigation interface including the navigation information through the display 194 included in the mobile phone 510.
In other embodiments, multiple electronic devices 100 may cooperate to navigate a user. Taking the mobile phone 510 and the AR glasses 520 as an example, the mobile phone 510 may obtain one or more information required for navigation from the sensor module 180 included in at least one of the mobile phone 510 and the AR glasses 520, determine to obtain navigation information based on the information, and display a navigation interface including the navigation information in at least one of the mobile phone 510 and the AR glasses 520.
The user may adjust the motion state of the user based on navigation information included in the navigation interface displayed by the mobile phone 510 or the AR glasses 520.
The main technical terms included in the embodiments of the present application will be explained next.
The user may be an object capable of perceiving navigation information obtained by the navigation method. The user may comprise a person. In some embodiments, the user may include other animals other than humans, such as other primates. In some embodiments, the user may comprise a biomimetic robot. In some embodiments, the user may be equivalent to a driver.
The carrier may include vehicles such as automobiles, ships, submarines, airplanes, aircraft, and spacecraft, may include other movable objects such as robots, and may include other devices capable of moving and of accelerating or decelerating.
The navigation interface may be an interface that includes one or more types of navigation information. The navigation interface may be displayed on a display of an electronic device, and it is understood that the style of the same navigation interface displayed on different displays may differ depending on the resolution and size of the display.
The navigation target may be a device to be navigated, such as a vehicle or the like.
The guidance speed is a movement speed recommended for the navigation target.
The following describes the technical solution of the present application in detail with reference to the above navigation scenarios in specific embodiments. In the following embodiments, the first device may be a device for performing navigation, a navigation target, or a device for displaying a navigation interface; the second device may be a navigation target, or may be a device displaying a navigation interface; the third device may be a device displaying a navigation interface. It should also be noted that the speed in the embodiments of the present application does not include a direction, i.e., it may be regarded as a scalar speed (a speed magnitude). It should be further noted that, in the embodiments of the present application, the navigation scene may be a real navigation scene or a virtual or simulated navigation scene; for example, the navigation scenes may all be virtual navigation scenes in an electronic game, and the user, the carrier, and the various terminal devices may be virtual objects in the game. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 6 is a flowchart of a navigation method according to an embodiment of the present application. The first device may be the electronic device 100 in the navigation scenario as shown in fig. 3, or the first device may be the electronic device 100 in the navigation scenario as shown in fig. 5, such as the mobile phone 510 or the AR glasses 520. In the embodiment of the application, the first device is both a device for performing navigation and a navigation target (i.e. a device to be navigated), that is, the first device performs navigation on the local end of the first device. It should be noted that the method is not limited by the specific order shown in fig. 6 and described below, and it should be understood that, in other embodiments, the order of some steps in the method may be interchanged according to actual needs, or some steps in the method may be omitted or deleted. The method comprises the following steps:
S601, the first device obtains a first movement speed of a navigation target, wherein the navigation target is the first device.
The first movement speed may be a speed of the navigation target relative to the ground, or may be a speed of the navigation target relative to a reference other than the ground.
In some embodiments, the first device may acquire the first movement speed through a speed sensor, an acceleration sensor, or the like for detecting the movement speed. In other embodiments, the first device may detect the first speed of movement via a satellite or a base station. Of course, in practical applications, the first device may also acquire the first movement speed in other manners, and the embodiment of the present application does not limit the manner in which the first device acquires the first movement speed.
It should be noted that, the first device may directly call a hardware component such as a sensor of the hardware layer to obtain the first motion speed, or may obtain the first motion speed in a certain application program of the software layer. For example, a map application is included in the first device, which map application includes a movement speed interface for providing a current movement speed, and thus the first device acquires the first movement speed by calling the movement speed interface.
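By way of illustration only, the following Python sketch shows one way the acquisition of the first movement speed could be organised, preferring a hardware-layer sensor and falling back to a map application's movement speed interface. The class SpeedSensor and the method get_current_speed() are hypothetical names introduced for this sketch; they are not APIs prescribed by the embodiment.

```python
from typing import Optional

class SpeedSensor:
    """Stand-in for a hardware-layer speed or acceleration sensor."""
    def read(self) -> Optional[float]:
        # A real driver would query the sensor here; None signals failure.
        return 12.5  # m/s, dummy value for the sketch

def get_first_movement_speed(sensor: SpeedSensor, map_app=None) -> float:
    """Prefer the hardware component of the hardware layer; fall back to the
    movement speed interface of a map application in the software layer."""
    speed = sensor.read()
    if speed is None and map_app is not None:
        speed = map_app.get_current_speed()  # hypothetical interface name
    if speed is None:
        raise RuntimeError("first movement speed unavailable")
    return speed

print(get_first_movement_speed(SpeedSensor()))  # 12.5
```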
S602, the first device acquires a guidance speed corresponding to the navigation target.
In some embodiments, the reference to the guiding speed and the reference to the first movement speed may be the same.
In some embodiments, the first device may obtain motion environment information corresponding to the navigation target, and determine the guidance speed based on the motion environment information. The motion environment information is used for indicating the environment where the navigation target is located. In some embodiments, the athletic environmental information may include at least one of location information, weather information, and obstacle information.
The location information may be used to indicate the location where the navigation target is located. In some embodiments, the location information may include location coordinates, such as longitude and latitude, of where the navigation target is located. In some embodiments, the location information may include at least one of a road identifier, a road segment identifier, and a road segment type where the navigation target is located, where the road segment type may be determined by a professional in a traffic department or the like, for example, the road segment type may include a straight lane, a left turn lane, a right turn lane, a main road, a secondary road, an auxiliary road, a directional ramp, a semi-directional ramp, a general ramp, a distributed lane, a highway, and the like. In some embodiments, the location information may include a ground clearance. Of course, in practical applications, the location information may also include other information that can indicate the location of the navigation target.
In some embodiments, the first device may determine the location information by satellite positioning or base station positioning, or the like. Alternatively, the first device may acquire an image around the navigation target, and acquire location information such as a place name, a guidepost, a road identifier, a link identifier, and a link type from the image.
The weather information may be used to indicate weather of the region in which the navigation target is located. In some embodiments, the weather information may include information such as haze, rain, snow, and visibility. Of course, in practical applications, the weather information may also include other information that can indicate weather.
In some embodiments, the first device may obtain weather information corresponding to the location information from the network based on the obtained location information. Alternatively, the first device may acquire an image around the navigation target and identify corresponding weather information from the image.
The obstacle information may be information indicating a position and a state of an obstacle within a first preset range of the navigation target, where the obstacle may be an object that obstructs the navigation target from passing through, such as a wall, a guardrail, a pedestrian, a vehicle, and the like. In some embodiments, the obstacle information may include one or more of a type of obstacle, an azimuth of the obstacle, a distance of the obstacle from the navigation target, and a moving speed of the obstacle. Of course, in practical applications, the obstacle information may also include other information that can indicate the position or state of the obstacle.
The first preset range may be determined in advance by the first device, and the size of the first preset range is not limited in the embodiment of the present application.
In some embodiments, the first device may detect whether the surrounding of the navigation target includes an obstacle through a sensor such as a radar, a distance sensor, etc., and if so, may further determine information such as a type of the obstacle, an azimuth of the obstacle, a distance of the obstacle from the navigation target, a moving speed of the obstacle, etc. Alternatively, the first device may acquire an image around the navigation target, and recognize the obstacle information from the image.
It should be noted that, in practical application, the first device may also acquire the motion environment information in other manners, and the embodiment of the present application does not limit the manner in which the first device acquires the motion environment information.
When the first device acquires the movement environment information, a guidance speed matching the environment in which the navigation target is located may be determined based on the movement environment information.
In some embodiments, the first device may store a plurality of types of motion environment information and guidance speeds corresponding to the various types of motion environment information, and accordingly, the first device may acquire the stored guidance speeds corresponding to the motion environment information based on the determined motion environment information.
The first device may determine, in advance, a guidance speed corresponding to each of the plurality of pieces of exercise environment information and store the guidance speed. In some embodiments, the corresponding guidance speed may be determined in advance by a relevant technician from various movement environment information, and the various movement environment information and the corresponding guidance speed may be submitted to the first device and stored.
For example, a correspondence between position information and guiding speed may be shown in table 1 below, where the first two columns in table 1 may be the position information where the navigation target is located, and the last column is the guiding speed corresponding to the corresponding position information.
TABLE 1
Table 1 may record a plurality of guidance speeds set by a road traffic planning manager for different intersection types and lanes, where V1 may be a value set in advance. Taking the 2nd row as an example, if the current position of the navigation target is the intersection of a straight lane of an entrance road, the corresponding guiding speed may be 0.7×V1. Taking the 6th row as an example, if the current position of the navigation target is a grade-separated interchange located on a directional ramp, a semi-directional ramp, or an auxiliary ramp, the corresponding guiding speed may be between 0.6×V1 and 0.7×V1.
As another example, the road traffic planning manager has previously specified that, in rainy and snowy weather, the maximum speed is 60 km/h when the visibility is lower than 200 meters, 40 km/h when the visibility is lower than 100 meters, and 20 km/h when the visibility is lower than 50 meters. The developer of the first device may set guidance speeds corresponding to these three visibility levels in rainy and snowy weather based on the maximum speeds set by the road traffic planning manager, such that each guidance speed is less than or equal to the maximum speed under the corresponding visibility.
For another example, when the obstacle information indicates that an obstacle appears in front of the navigation target, and the speed of the obstacle in the traveling direction of the navigation target is 0, the guiding speed may be 0.
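As a minimal, non-limiting sketch of the stored correspondence described above, the following Python snippet keys a small table on a coarse description of the motion environment information. The key names, the reference value V1, and the factors are assumptions made for illustration; they are not the actual contents of Table 1.

```python
V1 = 60.0  # km/h, a value assumed to have been set in advance

# Stored correspondence: (segment type, lane or condition) -> guiding speed
GUIDANCE_TABLE = {
    ("plane_intersection", "straight_lane"): 0.7 * V1,
    ("interchange", "directional_ramp"): 0.65 * V1,
    ("rain_or_snow", "visibility_below_100m"): 40.0,
}

def lookup_guidance_speed(env_info: dict) -> float:
    """Return the stored guiding speed matching the motion environment
    information, or a conservative default when no entry matches."""
    key = (env_info.get("segment_type"), env_info.get("condition"))
    return GUIDANCE_TABLE.get(key, 0.5 * V1)

print(lookup_guidance_speed({"segment_type": "plane_intersection",
                             "condition": "straight_lane"}))  # 42.0
```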
In some embodiments, a speed guiding mark, such as a traffic light or a speed limit sign, may be provided on the road, track, or route on which the navigation target travels, and the speed guiding mark may be used to indicate the guiding speed corresponding to the position where the mark is located. When the navigation target travels to that position, the first device or the navigation target may acquire an image around the navigation target, and the first device may identify the speed guiding mark from the image and then determine, based on the speed guiding mark, the guiding speed corresponding to the position information of that position. The speed guiding mark may be determined in advance by a relevant technician according to the position where it is placed.
In some embodiments, the first device may input the motion environment information to the stored first machine learning model and derive a guidance speed of the first machine learning model output. The first machine learning model may be trained in advance based on a first sample set including a plurality of samples, each sample including motion environment information and a guiding speed of the marker.
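The following Python sketch illustrates the idea of the first machine learning model using scikit-learn; the library, the linear model type, the feature encoding, and the sample values are all assumptions made for this example, since the embodiment does not specify them.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def encode(env_info: dict) -> list:
    """Example feature encoding of motion environment information."""
    return [
        env_info.get("visibility_m", 1000.0),
        1.0 if env_info.get("rain_or_snow", False) else 0.0,
        env_info.get("obstacle_distance_m", 500.0),
    ]

# First sample set: each sample pairs motion environment information with a
# marked guiding speed in km/h. The numbers are purely illustrative.
samples = [
    ({"visibility_m": 150, "rain_or_snow": True,  "obstacle_distance_m": 300}, 55.0),
    ({"visibility_m": 800, "rain_or_snow": False, "obstacle_distance_m": 500}, 100.0),
    ({"visibility_m": 60,  "rain_or_snow": True,  "obstacle_distance_m": 80},  20.0),
]
X = np.array([encode(info) for info, _ in samples])
y = np.array([label for _, label in samples])

first_ml_model = LinearRegression().fit(X, y)  # stands in for the first model
prediction = first_ml_model.predict(np.array([encode({"visibility_m": 90,
                                                      "rain_or_snow": True})]))
print(float(prediction[0]))  # predicted guiding speed, km/h
```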
In some embodiments, the first device may receive a guiding speed submitted by a user. For example, the user may actively set a guiding speed during driving because the user desires to reach the destination in time, or may actively set a desired guiding speed while running on a treadmill.
In some embodiments, the first device may obtain a movement route and a movement time limit submitted by the user, and determine the guidance speed based on the movement route and the movement time limit.
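A simple, assumed reading of this step is that the guidance speed is the average speed needed to complete the submitted route within the submitted time limit; the sketch below reduces the route to its total length, which is a simplification made only for illustration.

```python
def guidance_speed_from_route(route_length_km: float, time_limit_h: float) -> float:
    """Average speed needed to cover the movement route within the movement
    time limit submitted by the user."""
    if time_limit_h <= 0:
        raise ValueError("movement time limit must be positive")
    return route_length_km / time_limit_h  # km/h

# e.g. a 30 km route to be completed within half an hour
print(guidance_speed_from_route(30.0, 0.5))  # 60.0 km/h
```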
In some embodiments, the first device may also determine a first speed range, and the guidance speed may be a value included in the first speed range, where the first speed range may indicate a preferred movement speed for the navigation target. For example, on an expressway the road traffic planning manager sets a minimum speed of 80 km/h and a maximum speed of 120 km/h; ignoring weather information and obstacle information, the first speed range may be from 80 km/h to 120 km/h, and the first device may determine a guidance speed of 100 km/h. As another example, the first device is a vehicle navigation device that detects an obstacle in front of the vehicle; in order to avoid hitting the obstacle while maintaining user comfort, the speed needs to be lowered to 40 km/h (i.e., the guiding speed), and since any speed below 40 km/h would also avoid the obstacle, the first speed range may be 40 km/h and below.
The first device may determine the first speed range in the same manner as or similar to the manner in which the guiding speed is determined.
It should also be noted that the first device may also determine the guiding speed and/or the first speed range in other ways.
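As one possible sketch, the first speed range can be represented as a pair of bounds and the guidance speed picked inside it; choosing the mid-point, as below, reproduces the 100 km/h expressway example but is only one of many valid choices.

```python
from typing import Tuple

def guidance_within_range(speed_range: Tuple[float, float]) -> float:
    """Pick a guiding speed contained in the first speed range; the mid-point
    is an illustrative choice, not a rule of the embodiment."""
    v_min, v_max = speed_range
    return (v_min + v_max) / 2.0

# Expressway example: minimum 80 km/h, maximum 120 km/h -> 100 km/h
print(guidance_within_range((80.0, 120.0)))
```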
It should be further noted that, in practical application, the first device may execute S601 and S602 sequentially, or may execute S601 and S602 simultaneously, and the embodiment of the present application does not limit the order in which the first device executes S601 and S602.
S603, the first device displays a first visual element of the motion based on the first motion speed and the guiding speed, wherein the first visual element is used for indicating acceleration or deceleration of the navigation target, and a second motion speed of the first visual element is positively correlated with an absolute value of a difference value between the first motion speed and the guiding speed.
Visual elements may be used as tools and media for conveying information, and may include information elements such as graphics, text, and shapes, and may also include elements in forms such as dots, lines, planes, colors, and space. For example, the visual element may include an arrow or a stripe.
It should be noted that the first visual element of the motion may be understood as a change in the position of the first visual element in the navigation interface. The first visual element of motion can attract the attention of the user more easily to in time guide the user to accelerate or decelerate, improve navigation effect.
The second movement speed is positively correlated with the absolute value of the difference between the first movement speed and the guiding speed; that is, the larger the absolute value of the difference between the first movement speed of the navigation target and the guiding speed, the larger the second movement speed of the first visual element, so that the user can intuitively feel the magnitude of that difference and take corresponding measures to increase or decrease the movement speed of the navigation target more quickly. In some embodiments, the second motion speed of the first visual element may be equal to the absolute value of the difference between the first motion speed and the guiding speed.
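In the simplest assumed case, the positive correlation can be realised by making the second motion speed proportional to the absolute difference, with a gain of 1 giving equality:

```python
def second_motion_speed(first_speed: float, guiding_speed: float,
                        gain: float = 1.0) -> float:
    """Second motion speed of the first visual element, positively correlated
    with |first motion speed - guiding speed|; gain = 1.0 gives the case in
    which the two quantities are equal."""
    return gain * abs(first_speed - guiding_speed)

print(second_motion_speed(0.0, 40.0))   # 40.0, as in the later start-off example
print(second_motion_speed(60.0, 0.0))   # 60.0, as in the later red-light example
```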
In some embodiments, the reference of the second motion speed of the first visual element may be the same as the reference of the first motion speed and the guiding speed.
In some embodiments, if the first movement speed is less than the guiding speed, the first visual element moving in a first direction is displayed to indicate that the navigation target should accelerate; if the first movement speed is greater than the guiding speed, the first visual element moving in a second direction is displayed to indicate that the navigation target should decelerate.
The first direction and the second direction may be determined in advance by the electronic device, and the first direction and the second direction may be different. In some embodiments, the first direction may be parallel and opposite to the second direction. In some embodiments, the first direction may point from a location closer to the user to a location farther from the user, for example a direction away from the bottom of the navigation interface, so that the driver more intuitively gets the sensation that the first visual element is moving away from the vehicle; this makes the driver more likely to think of accelerating to catch up, reduces the driver's reaction time, and improves the reliability of assisted driving. The second direction may point from a position farther from the user to a position closer to the user, for example a direction toward the bottom of the navigation interface, so that the driver more intuitively gets the sensation that the first visual element is approaching the vehicle; this makes it easier for the driver to think of decelerating to avoid it and reduces the driver's reaction time. Of course, in practical applications, the first direction and the second direction may be other directions.
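A compact sketch of this direction selection is given below; the labels "first_direction" and "second_direction" simply stand for the two directions described above and are not identifiers defined by the embodiment.

```python
from typing import Optional, Tuple

def first_element_motion(first_speed: float,
                         guiding_speed: float) -> Tuple[Optional[str], float]:
    """Choose the motion direction and second motion speed of the first visual
    element: move away from the interface bottom to suggest acceleration,
    toward it to suggest deceleration."""
    diff = abs(first_speed - guiding_speed)
    if first_speed < guiding_speed:
        return "first_direction", diff   # indicate acceleration
    if first_speed > guiding_speed:
        return "second_direction", diff  # indicate deceleration
    return None, 0.0                     # speeds consistent: no moving element

print(first_element_motion(40.0, 80.0))  # ('first_direction', 40.0)
```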
For example, as shown in fig. 7 and 8, the first visual element includes an arrow 700. The first direction is the direction from the bottom of the navigation interface toward the central position of the navigation interface, and the second direction is the direction from the central position of the navigation interface toward the bottom of the navigation interface. If an arrow 700 moving in the first direction is to be displayed, a first set of images comprising fig. 7 and fig. 8 may be displayed sequentially; an arrow 700 may be included in each frame of the first set of images, and for each arrow 700, its position in the earlier displayed image is closer to the bottom of the navigation interface than its position in the later displayed image, so that when the first set of images is displayed sequentially, the arrow 700 moves away from the bottom of the navigation interface. For example, the two arrows 700 included in the dashed box are closer to the bottom of the navigation interface in fig. 7 than in fig. 8, so when fig. 7 is displayed before fig. 8, an animation effect is produced in which the two arrows 700 move away from the bottom of the navigation interface. Conversely, if an arrow 700 moving in the second direction is to be displayed, a second set of images comprising fig. 8 and fig. 7 may be displayed sequentially; an arrow 700 may be included in each frame of the second set of images, and for each arrow 700, its position in the earlier displayed image is farther from the bottom of the navigation interface than its position in the later displayed image, so that when the second set of images is displayed sequentially, the arrow 700 moves toward the bottom of the navigation interface. For example, the two arrows 700 included in the dashed box are farther from the bottom of the navigation interface in fig. 8 than in fig. 7, so when fig. 8 is displayed before fig. 7, an animation effect is produced in which the two arrows 700 move toward the bottom of the navigation interface.
As another example, as shown in fig. 9 and 10, the first visual element includes a light wave 900 (i.e., the dark stripe in the dashed boxes in fig. 9 and 10). The first direction is a direction away from the bottom of the navigation interface, and the second direction is a direction toward the bottom of the navigation interface. If a light wave 900 moving in the first direction is to be displayed, a third set of images comprising fig. 9 and fig. 10 may be displayed sequentially; a light wave 900 may be included in each frame of the third set of images, and for each light wave 900, its position in the earlier displayed image is closer to the bottom of the navigation interface than its position in the later displayed image, so that when the third set of images is displayed sequentially, the light wave 900 moves away from the bottom of the navigation interface. For example, the light wave 900 included in the dashed box is closer to the bottom of the navigation interface in fig. 9 than in fig. 10, so when fig. 9 is displayed before fig. 10, an animation effect is produced in which the light wave 900 moves away from the bottom of the navigation interface. Conversely, if a light wave 900 moving in the second direction is to be displayed, a fourth set of images comprising fig. 10 and fig. 9 may be displayed sequentially; a light wave 900 may be included in each frame of the fourth set of images, and for each light wave 900, its position in the earlier displayed image is farther from the bottom of the navigation interface than its position in the later displayed image, so that when the fourth set of images is displayed sequentially, the light wave 900 moves toward the bottom of the navigation interface. For example, the light wave 900 included in the dashed box is farther from the bottom of the navigation interface in fig. 10 than in fig. 9, so when fig. 10 is displayed before fig. 9, an animation effect is produced in which the light wave 900 moves toward the bottom of the navigation interface.
In some embodiments, if the display is a HUD or AR glasses, the first device may identify a specific object included in the frame captured by the camera, determine the two-dimensional coordinates of the specific object in the frame through a visual recognition algorithm, convert the two-dimensional coordinates of the specific object into three-dimensional coordinates through camera parameters calibrated in advance (such as the focal length, distortion parameters, first displacement matrix, and rotation matrix of the camera), then determine the three-dimensional coordinates of the first visual element based on the three-dimensional coordinates of the specific object, and draw the first visual element into the frame based on the determined three-dimensional coordinates of the first visual element. For example, while the user drives the vehicle and the first device is the vehicle navigation device of the vehicle, the first device may capture a frame in front of the vehicle through the camera, identify a lane line in the frame through machine vision, determine the two-dimensional coordinates of the lane line in the frame, convert the two-dimensional coordinates into three-dimensional coordinates using the calibrated camera parameters, determine the three-dimensional coordinates of the first visual element based on the three-dimensional coordinates of the lane line, and draw the first visual element into the lane in the frame based on the three-dimensional coordinates of the first visual element. In some embodiments, the first device may draw the effect of the first visual element moving through an animation interface provided by the operating system in the first device. The first device may achieve an animation effect of the first visual element moving away from the user by increasing the z-axis coordinate of the first visual element (i.e., the coordinate axis in the direction parallel to the lane), and achieve a visual effect of the first visual element approaching the user by decreasing the z-axis coordinate of the first visual element; alternatively, the first device may perform a matrix translation operation on the first visual element through a preset second displacement matrix, thereby achieving the animation effect of the first visual element moving.
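The sketch below illustrates the two steps just described under strong simplifying assumptions: a pinhole back-projection of a lane-line pixel onto a flat ground plane stands in for the calibrated two-dimensional to three-dimensional conversion, and the element is then animated along the z axis. The intrinsic matrix, camera height, and all numeric values are assumptions; distortion and the rotation and displacement matrices are not modelled here.

```python
import numpy as np

K = np.array([[1000.0, 0.0, 640.0],   # example intrinsic matrix
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
CAMERA_HEIGHT_M = 1.4                  # assumed height of the camera above the road

def pixel_to_ground(u: float, v: float) -> np.ndarray:
    """Back-project a pixel of the recognised lane line onto a flat ground
    plane, giving a 3-D point in camera coordinates (x right, y down, z forward)."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    scale = CAMERA_HEIGHT_M / ray[1]   # intersect the viewing ray with the ground
    return ray * scale

def animate_z(element_xyz: np.ndarray, second_speed_mps: float,
              dt_s: float, receding: bool) -> np.ndarray:
    """Move the first visual element along the z axis (parallel to the lane):
    increase z so it recedes from the user, decrease z so it approaches."""
    step = second_speed_mps * dt_s
    moved = element_xyz.copy()
    moved[2] += step if receding else -step
    return moved

anchor = pixel_to_ground(640.0, 500.0)  # a lane-line pixel below the horizon
print(animate_z(anchor, second_speed_mps=5.0, dt_s=0.016, receding=True))
```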
It should be noted that, in fig. 7 to fig. 9, the navigation interface displayed by the first device is a navigation interface with a head-up view angle, and in order to display the first visual element more intuitively and truly, the first visual element is also displayed at a position (such as a lane) corresponding to a specific object in the navigation interface with a head-up view angle, but it may be understood that, in practical application, the first visual element may also be displayed through other view angles, or may be displayed at other positions in the navigation interface. For example, as shown in fig. 11, the navigation interface is a top view navigation interface, and the light wave 900 is also displayed in the lane from the top view. For another example, as shown in FIG. 12, the navigation interface is still a navigation interface with a head-up view, but the light waves 900 are displayed on the left side of the navigation interface and are not associated with the location of any object included in the navigation interface.
In some embodiments, the first device may display the moving first visual element when detecting that the first motion speed and the guiding speed are inconsistent, and stop displaying the moving first visual element when detecting that the first motion speed and the guiding speed are consistent. With this more precise speed guiding manner, the user can adjust the movement speed of the user or the carrier in time so that the movement speed becomes consistent with the guiding speed, thereby improving the accuracy of navigation.
Stopping displaying the moving first visual element when the first device detects that the first movement speed is consistent with the guiding speed may include: the first device immediately stops displaying the moving first visual element when detecting that the first movement speed is consistent with the guiding speed; or, when the first device detects that the first movement speed is consistent with the guiding speed and the duration for which the first movement speed remains consistent with the guiding speed is longer than a preset second duration threshold, the first device stops displaying the moving first visual element, so that the duration of displaying the moving first visual element is increased, the user is guided for a longer time to adjust the movement speed of the user or the carrier, and the deviation between the movement speed and the guiding speed is reduced.
It should be noted that, the second duration threshold may be set in advance by a related technician, or may be set by a user, and the embodiment of the present application does not limit the manner of setting the second duration threshold and the size of the second duration threshold.
In some embodiments, the first device may display the moving first visual element when detecting that the first movement speed is not in the first speed range, and stop displaying the moving first visual element when detecting that the first movement speed is in the first speed range, so as to guide the user to adjust the movement speed of the user or the carrier toward the first speed range.
In some embodiments, the first device may display the moving first visual element when the first movement speed is detected not to be in the first speed range, and stop displaying the moving first visual element when the first movement speed is detected to be consistent with the guiding speed.
Stopping displaying the moving first visual element when the first device detects that the first movement speed is in the first speed range may include: when the first device detects that the first movement speed is in the first speed range, it immediately stops displaying the moving first visual element, so that the duration of displaying the moving first visual element is reduced and the user can control the movement speed of the first device or the carrier more freely and flexibly; or, when the first device detects that the first movement speed is in the first speed range and the duration for which the first movement speed remains in the first speed range is greater than a preset first duration threshold, the first device stops displaying the moving first visual element, so that the duration of displaying the moving first visual element is increased, the user is guided for a longer time to adjust the movement speed of the user or the carrier, and the deviation between the movement speed and the guiding speed or the first speed range is reduced.
It should be noted that the first duration threshold may be set in advance by a related technician, or may be set by a user, and the embodiment of the present application does not limit the manner of setting the first duration threshold or its size.
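One possible way to combine the first speed range with the first duration threshold is sketched below; the 3-second threshold is an assumed value, and the caller is assumed to track how long the speed has stayed inside the range.

```python
def should_keep_displaying(first_speed: float, v_min: float, v_max: float,
                           seconds_in_range: float,
                           first_duration_threshold_s: float = 3.0) -> bool:
    """Keep showing the moving first visual element while the first movement
    speed is outside the first speed range, and continue until the speed has
    stayed inside the range longer than the first duration threshold."""
    if not (v_min <= first_speed <= v_max):
        return True
    return seconds_in_range <= first_duration_threshold_s

print(should_keep_displaying(70.0, 80.0, 120.0, 0.0))  # True: below the range
print(should_keep_displaying(90.0, 80.0, 120.0, 5.0))  # False: in range long enough
```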
In some embodiments, the first device may further display a moving second visual element. If the first visual element is used to indicate that the navigation target should accelerate, the second visual element may be used to indicate that the navigation target should decelerate; if the first visual element is used to indicate that the navigation target should decelerate, the second visual element may be used to indicate that the navigation target should accelerate. The third motion speed of the second visual element may be positively correlated with the absolute value of the difference between the first motion speed and the guiding speed. When the first visual element is used for indicating acceleration of the navigation target and the second visual element is used for indicating deceleration of the navigation target, the first device displays the first visual element moving along the first direction if the first movement speed is smaller than the guiding speed, and displays the second visual element moving along the second direction if the first movement speed is greater than the guiding speed. When the first visual element is used for indicating deceleration of the navigation target and the second visual element is used for indicating acceleration of the navigation target, the first device displays the second visual element moving along the first direction if the first movement speed is smaller than the guiding speed, and displays the first visual element moving along the second direction if the first movement speed is greater than the guiding speed. That is, by displaying the first visual element and the second visual element in different styles, acceleration and deceleration of the navigation target can be indicated more intuitively and separately, so that the user can more easily make different reactions according to the first visual element or the second visual element, further improving the navigation accuracy and the user experience.
In some embodiments, the first device may display navigation information such as the first visual element and the second visual element at the local end of the first device, for example, the first device is the electronic device 100 in the navigation scene shown in fig. 3, and the electronic device 100 may display the navigation information at the local end of the electronic device 100. In other embodiments, the first device may display the navigation information through a third device, such as the first device is a mobile phone 510 in the navigation scenario shown in fig. 5, and the third device is an AR glasses 520 in the navigation scenario shown in fig. 5, so that the mobile phone 510 may display the navigation information at the local end of the mobile phone 510, and of course, the navigation information may also be displayed through the AR glasses 520.
In some embodiments, the stopping displaying the moving first visual element may include displaying the stationary first visual element or hiding the first visual element; similarly, ceasing to display the moving second visual element may include displaying the stationary second visual element or hiding the second visual element.
In the embodiment of the application, the navigation target is the first device. The first device may acquire a first movement speed of the navigation target and a guiding speed corresponding to the navigation target, and display a moving first visual element based on the first movement speed and the guiding speed. Because the moving first visual element attracts the user's attention more easily, acceleration or deceleration of the navigation target can be indicated by the first visual element; and because the second movement speed of the first visual element is positively correlated with the absolute value of the difference between the first movement speed of the navigation target and the guiding speed, the degree of acceleration or deceleration of the navigation target can be indicated more intuitively, improving the navigation effect.
The following takes the navigation scenario under field driving shown in fig. 3 as an example to illustrate how a user is guided to control acceleration and deceleration of a vehicle by the navigation method provided by the present application.
The user ignites the vehicle and starts off in the right-hand slow lane; the vehicle navigation device determines that the corresponding guiding speed is 40 km/h and that the current first movement speed of the vehicle is 0, so an arrow scrolling forward is displayed on the front windshield through the HUD, and the movement speed of the arrow is 40 km/h (i.e., the absolute value of the difference between the guiding speed and the first movement speed). When the user sees the arrow scrolling forward, the user can determine that the current first movement speed is less than the guiding speed, and therefore steps on the accelerator so that the movement speed of the vehicle gradually approaches the guiding speed; as the absolute value of the difference between the movement speed of the vehicle and the guiding speed keeps decreasing, the speed at which the arrow scrolls forward gradually decreases, until the movement speed of the vehicle is the same as the guiding speed and the arrow stops moving or disappears.
Next, the user drives the vehicle to change from the right-hand slow lane to the left-hand fast lane, and the vehicle navigation device determines, based on the new fast lane and the weather information, that the guidance speed is 80 km/h and the current first movement speed is 40 km/h. Since the current first movement speed is smaller than the guiding speed, an arrow 700 scrolling forward is displayed on the front windshield through the HUD, and the movement speed of the arrow 700 is 40 km/h (i.e., the absolute value of the difference between the guiding speed and the first movement speed), as shown in fig. 13. When the user sees the forward-scrolling arrow 700, the user can determine that the current first movement speed is less than the guiding speed, and therefore steps on the accelerator so that the movement speed of the vehicle gradually approaches the guiding speed; as the absolute value of the difference between the movement speed of the vehicle and the guiding speed keeps decreasing, the forward-scrolling speed of the arrow 700 gradually decreases, until the movement speed of the vehicle is the same as the guiding speed and the arrow 700 stops moving or disappears.
Thereafter, the user drives the vehicle close to an intersection at a current first movement speed of 60 km/h, and the vehicle navigation device detects that the intersection currently shows a red light, so the guidance speed is 0. Since the guiding speed is smaller than the first movement speed, an arrow 700 scrolling backward is displayed on the front windshield through the HUD, and the speed at which the arrow 700 scrolls backward is 60 km/h (i.e., the absolute value of the difference between the guiding speed and the first movement speed), as shown in fig. 14. In some embodiments, the vehicle navigation device may determine a first distance based on the first movement speed and a preset perceived reaction time period, and start displaying the arrow 700 when the distance of the vehicle from the intersection is detected to be the first distance. In some embodiments, the first distance may be the product of the first movement speed and the preset perceived reaction time period, where the perceived reaction time period may be the time required for the driver to react. When the user sees the backward-scrolling arrow 700, the user can determine that the current first movement speed is greater than the guiding speed, and therefore steps on the brake so that the movement speed of the vehicle gradually decreases; as the absolute value of the difference between the movement speed of the vehicle and the guiding speed keeps decreasing, the backward-scrolling speed of the arrow 700 gradually decreases, until the vehicle stops and the arrow 700 stops moving or disappears.
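A small sketch of the first-distance rule mentioned above follows; the 2.5-second perceived reaction time is an assumed value, not one given by the embodiment.

```python
def first_distance_m(first_speed_kmh: float, perceived_reaction_time_s: float) -> float:
    """First distance at which the backward-moving arrow starts to be shown:
    the product of the first movement speed and the perceived reaction time."""
    return (first_speed_kmh / 3.6) * perceived_reaction_time_s  # km/h -> m/s

# e.g. approaching the red light at 60 km/h with an assumed 2.5 s reaction time
print(round(first_distance_m(60.0, 2.5), 1))  # about 41.7 m before the intersection
```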
When the vehicle navigation device then detects that the intersection shows a yellow light, it determines, according to the lane in which the vehicle is located, that the guiding speed is 60 km/h, while the current first movement speed of the vehicle is 0, so an arrow scrolling forward is displayed on the front windshield through the HUD, and the speed at which the arrow scrolls forward is 60 km/h (i.e., the absolute value of the difference between the guiding speed and the first movement speed). When the user sees the forward-scrolling arrow, the user can determine that the current first movement speed is less than the guiding speed, and therefore steps on the accelerator so that the movement speed of the vehicle gradually increases; as the absolute value of the difference between the movement speed of the vehicle and the guiding speed keeps decreasing, the speed at which the arrow scrolls forward gradually decreases, until the movement speed of the vehicle reaches 60 km/h and the arrow stops moving or disappears.
Then, while the vehicle is still traveling at a speed of 60 km/h (i.e., the first movement speed is 60 km/h), another vehicle is detected ahead whose movement speed is 40 km/h, and the vehicle-mounted navigation device determines the guiding speed based on the current movement speed of the preceding vehicle, the distance between the two vehicles, and the preset perceived reaction time period. In some embodiments, the vehicle-mounted navigation device may determine, as the guiding speed, the absolute value of the difference obtained by subtracting the product of the inter-vehicle distance and the perceived reaction time period from the current movement speed of the preceding vehicle. If the determined guiding speed is 30 km/h, then, since the guiding speed is smaller than the first movement speed, an arrow scrolling backward is displayed on the front windshield through the HUD, and the speed at which the arrow scrolls backward is 30 km/h. In some embodiments, the in-vehicle navigation device may also display a backward-scrolling arrow on the front windshield upon detecting a forward collision warning signal (e.g., 2.7 seconds before a predicted collision) from a forward collision warning (FCW) system in the vehicle. When the user sees the backward-scrolling arrow, the user can determine that the current first movement speed is greater than the guiding speed, and therefore steps on the brake so that the movement speed of the vehicle gradually decreases; as the absolute value of the difference between the movement speed of the vehicle and the guiding speed keeps decreasing, the speed at which the arrow scrolls backward also gradually decreases, until the movement speed of the vehicle reaches the guiding speed and the arrow stops moving or disappears.
Fig. 15 is a flowchart of a navigation method according to an embodiment of the present application. The first device may be the electronic device 100 in the navigation scenario shown in fig. 3 or fig. 4, and the second device may be the first carrier 310 in the navigation scenario shown in fig. 3 or the second carrier 320 in the navigation scenario shown in fig. 4. In the embodiment of the application, the first device is a device for performing navigation, and the second device is a navigation target (i.e. a device to be navigated), that is, the first device performs navigation on the second device. It should be noted that the method is not limited by the specific order shown in fig. 15 and described below, and it should be understood that, in other embodiments, the order of some steps in the method may be interchanged according to actual needs, or some steps in the method may be omitted or deleted. The method comprises the following steps:
S1501, a first device acquires a first movement speed of a navigation target, wherein the navigation target is a second device connected to the first device through a network.
The second device may acquire a first movement speed of the second device and send the first movement speed to the first device. Moreover, it should be noted that the manner in which the second device obtains the first movement speed of the second device may be the same as or similar to the manner in which the first device obtains the first movement speed of the first device in the foregoing description, which is not repeated herein.
S1502, the first device acquires a guidance speed corresponding to the navigation target.
In some embodiments, the second device may acquire the motion environment information corresponding to the navigation target and transmit the motion environment information to the first device, and accordingly, the first device may receive the motion environment information and determine the guidance speed based on the motion environment information. Alternatively, in some embodiments, the second device may obtain the motion environment information, determine a guidance speed based on the motion environment information, and send the guidance speed to the first device, and the first device may receive the guidance speed sent by the second device.
In some embodiments, the first device may receive a user submitted boot speed.
In some embodiments, the first device may obtain a movement route and a movement time limit submitted by the user, and determine the guidance speed based on the movement route and the movement time limit.
In some embodiments, the first device may also determine a first speed range, and the boot speed may be a value in the first speed range.
It should be noted that the first device may also determine the guiding speed and/or the first speed range in other ways.
It should be further noted that, in practical application, the first device may execute S1501 and S1502 sequentially, or may execute S1501 and S1502 simultaneously, and the embodiment of the present application does not limit the order in which the first device executes S1501 and S1502.
S1503, the first device displays a moving first visual element based on the first motion speed and the guiding speed, wherein the first visual element is used for indicating acceleration or deceleration of the navigation target, and the second motion speed of the first visual element is positively correlated with the absolute value of the difference between the first motion speed and the guiding speed.
In some embodiments, the first device may further display a second visual element of motion, and if the first visual element is used to indicate navigation target acceleration, the second visual element may be used to indicate navigation target deceleration, and if the first visual element is used to indicate navigation target deceleration, the second visual element may be used to indicate navigation target acceleration, and a third speed of motion of the second visual element may be positively correlated with an absolute value of a difference between the first speed of motion and the guiding speed.
The manner of displaying the moving first visual element or second visual element by the first device based on the first movement speed and the guiding speed may be the same or similar to the manner of displaying the moving first visual element or second visual element by the first device in S603 based on the first movement speed and the guiding speed, which is not described in detail herein.
In some embodiments, the first device may display navigation information such as the first visual element and the second visual element at the local end of the first device. In other embodiments, the first device may also display the navigation information through the second device or the third device. For example, the first device is the electronic device 100 in the navigation scene shown in fig. 3 or fig. 4, and the electronic device 100 may display the navigation information on the local side of the electronic device 100 or may display the navigation information through AR glasses.
In the embodiment of the application, the navigation target is the second device. The first device may obtain a first movement speed of the second device and a guiding speed corresponding to the second device, and display a moving first visual element on the first device based on the first movement speed and the guiding speed. Because the moving first visual element attracts the user's attention more easily, the first visual element can indicate that the second device should accelerate or decelerate; and because the second movement speed of the first visual element is positively correlated with the absolute value of the difference between the first movement speed of the second device and the guiding speed, the degree of acceleration or deceleration of the second device can be indicated more intuitively, improving the navigation effect.
Based on the same inventive concept, the embodiment of the present application further provides an electronic device, which may be the first device in the foregoing, including: a memory and a processor, the memory for storing a computer program; the processor is configured to execute the method described in the above method embodiments when the computer program is invoked.
The electronic device provided in this embodiment may execute the above method embodiment, and its implementation principle is similar to that of the technical effect, and will not be described herein again.
Based on the same inventive concept, the embodiment of the application further provides a vehicle, the vehicle comprises the first device, the first device is a vehicle-mounted device in the vehicle, and the first device further comprises a HUD, and the HUD is used for displaying the first visual element.
Based on the same inventive concept, the embodiment of the application also provides a chip system. The system-on-chip includes a processor coupled to a memory, the processor executing a computer program stored in the memory to implement the method described in the method embodiments above.
The chip system can be a single chip or a chip module formed by a plurality of chips.
The embodiment of the application also provides a computer readable storage medium, on which a computer program is stored, which when being executed by a processor, implements the method described in the above method embodiment.
The embodiment of the application also provides a computer program product which, when run on an electronic device, causes the electronic device to execute the method described in the embodiment of the method.
The integrated units described above, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the method of the above embodiments, and may be implemented by a computer program to instruct related hardware, where the computer program may be stored in a computer readable storage medium, and when the computer program is executed by a processor, the computer program may implement the steps of each of the method embodiments described above. Wherein the computer program comprises computer program code which may be in source code form, object code form, executable file or some intermediate form etc. The computer readable storage medium may include at least: any entity or device capable of carrying computer program code to a photographing device/electronic apparatus, recording medium, computer memory, read-only memory (ROM), random access memory (random access memory, RAM), electrical carrier signals, telecommunications signals, and software distribution media. Such as a U-disk, removable hard disk, magnetic or optical disk, etc. In some jurisdictions, computer readable media may not be electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for parts that are not described or illustrated in detail in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/device and method may be implemented in other manners. For example, the apparatus/device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in the present description and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "upon", "in response to determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
Furthermore, the terms "first," "second," "third," and the like in the present specification and in the appended claims are used to distinguish between the descriptions and are not necessarily intended to indicate or imply relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
Finally, it should be noted that: the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them; although the application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents, and such modifications and substitutions do not depart from the spirit of the application.
Claims (16)
1. A navigation method, comprising:
acquiring a first movement speed of a navigation target;
acquiring a guiding speed corresponding to the navigation target;
and displaying a moving first visual element based on the first movement speed and the guiding speed, wherein the first visual element is used for indicating that the navigation target should accelerate or decelerate, and a second movement speed of the first visual element is positively correlated with the absolute value of the difference between the first movement speed and the guiding speed.
2. The method of claim 1, wherein the displaying a moving first visual element based on the first movement speed and the guiding speed comprises:
if the first movement speed is less than the guiding speed, displaying the first visual element moving along a first direction;
if the first movement speed is greater than the guiding speed, displaying the first visual element moving along a second direction;
wherein the first direction and the second direction are different.
3. The method of claim 2, wherein the first direction and the second direction are parallel and opposite.
4. The method according to claim 2 or 3, wherein the first direction is a direction from the bottom of a navigation interface to a central position of the navigation interface, the second direction is a direction from the central position of the navigation interface to the bottom of the navigation interface, and the navigation interface is an interface comprising the first visual element.
5. The method of any one of claims 1-4, wherein displaying the moving first visual element based on the first movement speed and the guiding speed comprises:
displaying the moving first visual element when it is detected that the first movement speed is inconsistent with the guiding speed;
stopping displaying the moving first visual element when it is detected that the first movement speed is consistent with the guiding speed.
6. The method of any one of claims 1-4, wherein displaying the moving first visual element based on the first movement speed and the guiding speed comprises:
displaying the moving first visual element when it is detected that the first movement speed is not within a first speed range;
stopping displaying the moving first visual element when it is detected that the first movement speed is within the first speed range;
wherein the guiding speed is a value within the first speed range.
7. The method of any one of claims 1-4, wherein displaying the moving first visual element based on the first movement speed and the guiding speed comprises:
displaying the moving first visual element when it is detected that the first movement speed is not within a first speed range;
stopping displaying the moving first visual element when it is detected that the first movement speed is consistent with the guiding speed;
wherein the guiding speed is a value within the first speed range.
8. The method of any one of claims 1-7, wherein the first visual element comprises an arrow or a light wave.
9. The method according to claim 1 or any one of claims 3-8, further comprising:
and displaying a moving second visual element based on the first movement speed and the guiding speed, wherein one of the first visual element and the second visual element is used for indicating acceleration of the navigation target, the other is used for indicating deceleration of the navigation target, and a third movement speed of the second visual element is positively correlated with the absolute value of the difference between the first movement speed and the guiding speed.
10. The method of claim 9, wherein the first visual element is used for indicating acceleration of the navigation target and the second visual element is used for indicating deceleration of the navigation target, and wherein the displaying a moving first visual element based on the first movement speed and the guiding speed comprises:
if the first movement speed is less than the guiding speed, displaying the first visual element moving along a first direction;
and the displaying a moving second visual element based on the first movement speed and the guiding speed comprises:
if the first movement speed is greater than the guiding speed, displaying the second visual element moving along a second direction;
wherein the first direction and the second direction are different.
11. The method according to any one of claims 1-10, wherein the acquiring a guiding speed corresponding to the navigation target comprises:
acquiring motion environment information corresponding to the navigation target, wherein the motion environment information is used for indicating the environment where the navigation target is located;
and determining the guiding speed based on the motion environment information.
12. The method of claim 11, wherein the motion environment information includes at least one of location information, weather information, and obstacle information.
13. An electronic device, comprising: a memory and a processor, the memory for storing a computer program; the processor is configured to perform the method of any of claims 1-12 when the computer program is invoked.
14. A vehicle comprising the electronic device of claim 13, wherein the electronic device is an on-board device in the vehicle, and the electronic device further comprises a head-up display (HUD);
the HUD is configured to display the first visual element.
15. A computer readable storage medium, on which a computer program is stored, which computer program, when being executed by a processor, implements the method according to any of claims 1-12.
16. A computer program product, characterized in that the computer program product, when run on an electronic device, causes the electronic device to perform the method of any of claims 1-12.
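Purely as an illustration of the guiding logic recited in claims 1-7, and not as part of the claimed subject matter, the following Python sketch shows one possible way the display state of the first visual element could be driven by the difference between the navigation target's movement speed and the guiding speed. The function and parameter names, the tolerance band standing in for the first speed range, and the proportionality constant `gain` are assumptions introduced solely for this example.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class Direction(Enum):
    FIRST = "bottom-to-center"   # cue to accelerate (first direction)
    SECOND = "center-to-bottom"  # cue to decelerate (second direction)


@dataclass
class VisualElementState:
    visible: bool
    direction: Optional[Direction]
    element_speed: float  # speed at which the arrow or light wave moves


def update_visual_element(movement_speed: float,
                          guiding_speed: float,
                          tolerance: float = 2.0,
                          gain: float = 0.5) -> VisualElementState:
    """Derive the display state of the guiding visual element.

    `tolerance` stands in for the first speed range of claims 6-7; `gain` is
    an assumed proportionality constant, since the claims only require the
    element speed to be positively correlated with the speed difference.
    """
    diff = movement_speed - guiding_speed
    if abs(diff) <= tolerance:
        # Speed matches the guiding speed (within the range): stop displaying
        # the moving element.
        return VisualElementState(visible=False, direction=None, element_speed=0.0)
    # Below the guiding speed: move along the first direction to prompt acceleration;
    # above it: move along the second direction to prompt deceleration.
    direction = Direction.FIRST if diff < 0 else Direction.SECOND
    return VisualElementState(visible=True, direction=direction,
                              element_speed=gain * abs(diff))


# Example: travelling at 68 km/h against a guiding speed of 60 km/h yields a
# decelerate cue whose element speed (4.0) grows with the speed difference.
print(update_visual_element(68.0, 60.0))
```

A deployed implementation on a HUD or navigation interface would animate the element at the computed speed rather than printing its state, but the sketch captures the direction selection and the positive correlation described above.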
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210342590.0A CN116929351A (en) | 2022-03-31 | 2022-03-31 | Navigation method and electronic equipment |
PCT/CN2023/083368 WO2023185622A1 (en) | 2022-03-31 | 2023-03-23 | Navigation method and electronic device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210342590.0A CN116929351A (en) | 2022-03-31 | 2022-03-31 | Navigation method and electronic equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116929351A true CN116929351A (en) | 2023-10-24 |
Family
ID=88199230
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210342590.0A (CN116929351A, pending) | Navigation method and electronic equipment | 2022-03-31 | 2022-03-31 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN116929351A (en) |
WO (1) | WO2023185622A1 (en) |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003291688A (en) * | 2002-04-03 | 2003-10-15 | Denso Corp | Display method, driving support device and program |
JP2004101280A (en) * | 2002-09-06 | 2004-04-02 | Denso Corp | Car navigation system |
JP2005241288A (en) * | 2004-02-24 | 2005-09-08 | Fuji Heavy Ind Ltd | Vehicle limiting speed alarm device |
JP4609185B2 (en) * | 2005-05-25 | 2011-01-12 | 日産自動車株式会社 | Attention guidance device and attention guidance method |
US10261513B2 (en) * | 2016-12-19 | 2019-04-16 | drive.ai Inc. | Methods for communicating state, intent, and context of an autonomous vehicle |
CN110775063B (en) * | 2019-09-25 | 2021-08-13 | 华为技术有限公司 | Information display method and device of vehicle-mounted equipment and vehicle |
CN112797996A (en) * | 2020-12-18 | 2021-05-14 | 深圳市元征科技股份有限公司 | Vehicle navigation method, device, equipment and medium |
- 2022-03-31: CN application CN202210342590.0A filed (published as CN116929351A, status: pending)
- 2023-03-23: PCT application PCT/CN2023/083368 filed (published as WO2023185622A1, status: unknown)
Also Published As
Publication number | Publication date |
---|---|
WO2023185622A1 (en) | 2023-10-05 |
Similar Documents
Publication | Title | Publication Date |
---|---|---|
CN105788321B (en) | Vehicle communication method, device and system | |
US10982968B2 (en) | Sensor fusion methods for augmented reality navigation | |
KR20160107054A (en) | Vehicle control apparatus and method thereof, vehicle driving assistance apparatus and method thereof, mobile terminal and method thereof | |
US20180056861A1 (en) | Vehicle-mounted augmented reality systems, methods, and devices | |
US20240085208A1 (en) | Ar processing device, augmented reality-based route guidance method and electronic device | |
CN101939620A (en) | A navigation device and method for displaying map information | |
CN110263688B (en) | Driving related guidance providing method and apparatus, and computer readable recording medium | |
CN111192341A (en) | Method and device for generating high-precision map, automatic driving equipment and storage medium | |
KR20190029192A (en) | Automatic Driving control apparatus, vehicle having the same and method for controlling the same | |
JP2019109707A (en) | Display control device, display control method and vehicle | |
KR20160114486A (en) | Mobile terminal and method for controlling the same | |
US11393196B2 (en) | Line of sight assistance device and method | |
WO2023010923A1 (en) | Overpass identification method and apparatus | |
CN205541484U (en) | Electronic device | |
KR101859043B1 (en) | Mobile terminal, vehicle and mobile terminal link system | |
CN115170630B (en) | Map generation method, map generation device, electronic equipment, vehicle and storage medium | |
CN116929351A (en) | Navigation method and electronic equipment | |
KR20200070100A (en) | A method for detecting vehicle and device for executing the method | |
WO2017024458A1 (en) | System, method and apparatus for vehicle and computer readable medium | |
KR20160144643A (en) | Apparatus for prividing around view and vehicle including the same | |
EP4191204A1 (en) | Route guidance device and route guidance method thereof | |
KR20170011881A (en) | Radar for vehicle, and vehicle including the same | |
JP2022549752A (en) | Self-driving car interaction system | |
CN115484288B (en) | Intelligent vehicle searching system and vehicle searching method | |
CN112241662B (en) | Method and device for detecting drivable area |
Legal Events
Code | Title | Date | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||