WO2023185622A1 - Navigation method and electronic device - Google Patents

Navigation method and electronic device

Info

Publication number
WO2023185622A1
WO2023185622A1 (PCT/CN2023/083368)
Authority
WO
WIPO (PCT)
Prior art keywords
speed
visual element
movement
navigation
electronic device
Prior art date
Application number
PCT/CN2023/083368
Other languages
English (en)
French (fr)
Inventor
Liu Min (刘敏)
Lin Youhui (林尤辉)
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2023185622A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation by using measurements of speed or acceleration
    • G01C21/12 Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/18 Stabilised platforms, e.g. by gyroscope
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/203 Instruments for performing navigational calculations specially adapted for sailing ships
    • G01C21/24 Navigation specially adapted for cosmonautical navigation
    • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00 Measuring or testing not otherwise provided for
    • G01D21/02 Measuring two or more variables by means not covered by a single other subclass

Definitions

  • the present application relates to the field of terminal technology, and in particular, to a navigation method and electronic device.
  • Travel is an extremely important human activity. Through travel, various purposes such as exploring unknown areas, transporting materials, and sports and recreation can be achieved. Navigation is a technology that ensures travel convenience, safety, and comfort. With the accumulation of human travel experience and the continuous development of science and technology, navigation technology has also made great progress.
  • the electronic device can display the user's current movement speed and a preset maximum or minimum speed, and the user then adjusts the movement speed according to the maximum or minimum speed, thereby moving at a speed lower than the maximum speed or higher than the minimum speed; however, this reminder method is easily ignored by users, and the navigation effect is poor.
  • the present application provides a navigation method and electronic device, which can more intuitively indicate the degree of acceleration or deceleration of the navigation target and improve the navigation effect.
  • embodiments of the present application provide a navigation method, including:
  • a first visual element of movement is displayed, wherein the first visual element is used to indicate acceleration or deceleration of the navigation target, and a second movement speed of the first visual element is positively related to the absolute value of the difference between the first movement speed and the guidance speed.
  • the navigation target can be the device being navigated.
  • the guidance speed may be a recommended motion speed for the navigation target.
  • Visual elements can be used as tools and media to convey information.
  • Visual elements can include information elements such as graphics, text, shapes, and forms, as well as formal elements such as points, lines, surfaces, colors, and spaces.
  • the first movement speed of the navigation target and the guidance speed corresponding to the navigation target can be obtained, and the moving first visual element is displayed based on the first movement speed and the guidance speed. Since a moving first visual element is more likely to attract the user's attention, acceleration or deceleration of the navigation target can be indicated through the first visual element; and because the second movement speed of the first visual element is positively correlated with the absolute value of the difference between the first movement speed of the navigation target and the guidance speed, the degree of acceleration or deceleration of the navigation target can be indicated more intuitively, thereby improving the navigation effect.
  • the navigation target and the device for navigation can be the same device, or they can be different devices.
  • the device for navigation can be the first device
  • the navigation target can be the first device or a second device.
  • the second device may be a vehicle.
  • the moving first visual element can be understood as the position of the first visual element in the navigation interface changing.
  • the first visual element of movement can more easily attract the user's attention, thereby guiding the user to speed up or slow down in time and improve the navigation effect.
  • the second movement speed of the first visual element may be equal to the absolute value of the difference between the first movement speed and the guidance speed.
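  • As an illustrative sketch only (not part of the original disclosure), this relationship might be expressed as follows; the function and parameter names are assumptions, and gain = 1.0 corresponds to the "equal" embodiment above, while any positive gain preserves the required positive correlation.

```python
def element_speed(target_speed: float, guidance_speed: float, gain: float = 1.0) -> float:
    """Second movement speed of the first visual element.

    The disclosure requires only a positive correlation with
    |first movement speed - guidance speed|; equality (gain = 1.0)
    is one possible embodiment. All names here are illustrative.
    """
    return gain * abs(target_speed - guidance_speed)

# Example: a target moving at 30 km/h under a 50 km/h guidance speed
# yields an element speed of 20.0, so the element moves faster the
# further the target's speed is from the guidance speed.
print(element_speed(30.0, 50.0))  # 20.0
```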
  • the reference for the guidance speed and the reference for the first movement speed may be the same.
  • displaying the moving first visual element based on the first movement speed and the guidance speed includes: if the first movement speed is less than the guidance speed, displaying the first visual element moving in a first direction; if the first movement speed is greater than the guidance speed, displaying the first visual element moving in a second direction, where the first direction and the second direction are different.
  • the first direction and the second direction are parallel and opposite.
  • the first direction is the direction in which the bottom of the navigation interface points to the center of the navigation interface
  • the second direction is the direction in which the center of the navigation interface points to the bottom of the navigation interface
  • the navigation interface is an interface including the first visual element.
  • the first direction may be from a position close to the user to a position far away from the user, so that the driver can more intuitively feel that the first visual element is moving away from the vehicle, making it easier for the driver to associate it with acceleration.
  • the second direction can be from a position far away from the user to a position close to the user, so that the driver can more intuitively feel that the first visual element is approaching the vehicle, making it easier for the driver to think of deceleration and avoidance and reducing the driver's reaction time.
  • the first direction and the second direction can also be other directions.
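  • To make the direction logic concrete, a minimal sketch under the mapping described above (first direction for acceleration, second direction for deceleration) is given below; the screen-space vectors and names are assumptions for illustration, not values prescribed by the disclosure.

```python
# Interface coordinates (assumed): the first direction points from the
# bottom of the navigation interface toward its center, suggesting the
# element pulling away (accelerate); the second direction is the reverse.
FIRST_DIRECTION = (0.0, 1.0)    # bottom -> center of the navigation interface
SECOND_DIRECTION = (0.0, -1.0)  # center -> bottom of the navigation interface

def element_direction(target_speed: float, guidance_speed: float):
    """Choose the movement direction of the first visual element."""
    if target_speed < guidance_speed:
        return FIRST_DIRECTION    # indicate acceleration
    if target_speed > guidance_speed:
        return SECOND_DIRECTION   # indicate deceleration
    return None                   # speeds consistent: no moving element
```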
  • displaying the moving first visual element based on the first movement speed and the guidance speed includes: displaying the moving first visual element when the first movement speed is inconsistent with the guidance speed, and stopping the display of the moving first visual element when the first movement speed is consistent with the guidance speed. In this way, the user is guided to adjust his or her own movement speed, or that of the vehicle, in a timely manner to keep it consistent with the guidance speed, thereby improving the accuracy of navigation.
  • stopping display of the moving first visual element may include: the first device detects that the first movement speed is consistent with the guidance speed and immediately stops displaying the moving first visual element; or, the first device detects that the first movement speed is consistent with the guidance speed and that the duration for which it remains consistent is greater than a preset second duration threshold, and then stops displaying the moving first visual element, thereby extending the duration for which the moving first visual element is displayed, guiding the user to adjust the movement speed of himself or the vehicle for a longer period, and reducing the deviation of the movement speed from the guidance speed.
  • displaying the moving first visual element based on the first movement speed and the guidance speed includes: displaying the moving first visual element when the first movement speed is outside a first speed range, where the guidance speed is a value included in the first speed range.
  • This speed guidance method allows the user to control the movement speed of himself or the vehicle more freely and flexibly, improving the flexibility of navigation and the user experience.
  • displaying the moving first visual element based on the first movement speed and the guidance speed includes: when the first movement speed is outside the first speed range (the guidance speed being a value included in the first speed range), beginning to display the moving first visual element to guide the user to adjust the movement speed of himself or the vehicle until the movement speed is consistent with the guidance speed.
  • stopping display of the moving first visual element may include: when the first device detects that the first movement speed is in the first speed range, it immediately stops displaying the moving first visual element, thereby shortening the display duration and allowing the user to control the movement speed of himself or the vehicle more freely and flexibly; or, when the first device detects that the first movement speed is in the first speed range and the duration for which the first movement speed remains in the first speed range is greater than a preset first duration threshold, it stops displaying the moving first visual element, thereby extending the display duration, guiding the user to adjust the movement speed of himself or the vehicle for a longer period, and reducing the deviation between the movement speed and the guidance speed or the first speed range.
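  • A sketch of the two stopping strategies just described (immediate stop versus stop after the speed has dwelled in the first speed range longer than a duration threshold); the class name, the dwell parameter, and the use of a monotonic clock are illustrative assumptions.

```python
import time

class SpeedRangeGuidance:
    """Decide whether the moving first visual element should still be shown."""

    def __init__(self, low: float, high: float, dwell_s: float = 0.0):
        self.low, self.high = low, high   # the first speed range
        self.dwell_s = dwell_s            # 0.0 reproduces the "immediate stop" variant
        self._entered_at = None           # when the speed last entered the range

    def should_display_moving_element(self, speed: float) -> bool:
        in_range = self.low <= speed <= self.high
        now = time.monotonic()
        if not in_range:
            self._entered_at = None       # left the range: keep guiding the user
            return True
        if self._entered_at is None:
            self._entered_at = now        # first sample inside the range
        # Stop once the speed has stayed in range longer than the threshold.
        return (now - self._entered_at) < self.dwell_s
```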
  • the first visual element includes an arrow or light wave.
  • the method further includes:
  • a second visual element of movement is displayed, wherein one of the first visual element and the second visual element is used to indicate acceleration of the navigation target and the other is used to indicate deceleration of the navigation target, and the third movement speed of the second visual element is positively related to the absolute value of the difference between the first movement speed and the guidance speed.
  • the first visual element is used to indicate acceleration of the navigation target
  • the second visual element is used to indicate deceleration of the navigation target. In this case, displaying the moving first visual element based on the first movement speed and the guidance speed and displaying the moving second visual element based on the first movement speed and the guidance speed use a first direction and a second direction respectively, where the first direction and the second direction are different.
  • the navigated device can thus be more intuitively instructed to accelerate or decelerate, respectively, making it easier for the user to react differently based on the first visual element or the second visual element, further improving navigation accuracy and the user experience.
  • alternatively, the deceleration of the navigation target may be indicated through the first visual element, and the acceleration of the navigation target may be indicated through the second visual element.
  • in this case, if the first movement speed is less than the guidance speed, the first device displays the second visual element moving in the first direction; if the first movement speed is greater than the guidance speed, the first device displays the first visual element moving in the second direction.
  • stopping the display of the moving first visual element may include displaying the stationary first visual element or hiding the first visual element; similarly, stopping the display of the moving second visual element may include displaying the stationary second visual element or hiding the second visual element.
  • the obtaining the guidance speed corresponding to the navigation target includes:
  • the guidance speed is determined based on the motion environment information.
  • the first device may obtain the stored guidance speed corresponding to the determined movement environment information based on the movement environment information.
  • the first device can input the motion environment information to the stored first machine learning model, and obtain the guidance speed output by the first machine learning model.
  • the first device may obtain the movement route and movement time limit submitted by the user, and determine the guidance speed based on the movement route and the movement time limit.
  • the first device can also determine the guidance speed through other methods.
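  • The three alternatives above (a stored guidance speed looked up from the movement environment information, a first machine learning model, and a movement route with a movement time limit) might be sketched as follows; the function names, the lookup table, and the scikit-learn-style predict() call are all assumptions for illustration.

```python
def guidance_speed_from_lookup(environment_key, table):
    """Variant 1: return the stored guidance speed for this environment, if any."""
    return table.get(environment_key)

def guidance_speed_from_model(model, environment_features):
    """Variant 2: query the stored first machine learning model
    (assumes a scikit-learn-like predict() interface)."""
    return float(model.predict([environment_features])[0])

def guidance_speed_from_route(route_length_km: float, time_limit_h: float) -> float:
    """Variant 3: the average speed needed to finish the route within the limit."""
    return route_length_km / time_limit_h

# Example: a 120 km movement route with a 1.5 h movement time limit
# suggests a guidance speed of 80 km/h.
print(guidance_speed_from_route(120.0, 1.5))  # 80.0
```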
  • the motion environment information includes at least one of location information, weather information, and obstacle information.
  • Location information can be used to indicate where the navigation target is located.
  • the location information may include location coordinates of the navigation target, such as longitude and latitude.
  • the location information may include at least one of a road identification, a road segment identification, and a road segment type where the navigation target is located.
  • the road segment type may be determined by professionals such as the transportation department.
  • the road segment type may include straight lanes, left-turn lanes, right-turn lanes, main roads, secondary roads, auxiliary roads, directional ramps, semi-directional ramps, general ramps, collector lanes, express lanes, etc.
  • location information may include height above the ground.
  • the location information may also include other information that can indicate the location of the navigation target.
  • Weather information can be used to indicate the weather in the area where the navigation target is located.
  • weather information may include information such as haze, rain, snow, and visibility.
  • the weather information may also include other information that can indicate the weather.
  • Obstacle information can be used to indicate the location and status of obstacles within the first preset range of the navigation target.
  • the obstacles can be objects that hinder the passage of the navigation target, such as walls, guardrails, pedestrians, vehicles, etc.
  • the obstacle information may include one or more of the type of obstacle, the orientation of the obstacle, the distance of the obstacle from the navigation target, and the moving speed of the obstacle.
  • the obstacle information may also include other information that can indicate the location or status of the obstacle.
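  • For illustration, the movement environment information described above (location, weather, obstacles) could be carried in a structure like the following; every field name and type here is an assumption, since the disclosure only lists the kinds of information involved.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ObstacleInfo:
    kind: str            # e.g. "pedestrian", "vehicle", "guardrail", "wall"
    bearing_deg: float   # orientation of the obstacle relative to the target
    distance_m: float    # distance of the obstacle from the navigation target
    speed_mps: float     # moving speed of the obstacle

@dataclass
class MotionEnvironment:
    # Location information: coordinates and/or road identifiers
    latitude: Optional[float] = None
    longitude: Optional[float] = None
    road_segment_type: Optional[str] = None  # e.g. "express lane", "general ramp"
    # Weather information: haze, rain, snow, visibility, ...
    weather: Optional[str] = None
    visibility_m: Optional[float] = None
    # Obstacles within the first preset range of the navigation target
    obstacles: List[ObstacleInfo] = field(default_factory=list)
```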
  • embodiments of the present application provide a navigation device, which is included in an electronic device and has the function of implementing any of the methods described in the first aspect.
  • Functions can be implemented by hardware, or by hardware executing corresponding software.
  • Hardware or software includes one or more modules or units corresponding to the above functions. For example, a transceiver module or unit, a processing module or unit, an acquisition module or unit, etc.
  • embodiments of the present application provide an electronic device, including: a memory and a processor.
  • the memory is used to store a computer program; and the processor is used to execute any of the methods described in the first aspect when calling the computer program.
  • the electronic device can be carried on the user, or can be placed or integrated on the vehicle.
  • the electronic device and the user or the vehicle are in the same time and space.
  • when the user or the vehicle moves, the electronic device is also driven to move, so the electronic device and the user or vehicle have the same motion state.
  • the electronic device is integrated on the carrier, which may mean that the electronic device has both a mechanical connection and a communication connection with the carrier.
  • the electronic device may be considered part of the carrier, or the electronic device and the carrier may be considered the same device.
  • the electronic device is placed on the carrier, which may mean that the electronic device is mechanically connected to the carrier.
  • the carrier includes a bracket for placing the electronic device, and the electronic device can be installed on the bracket;
  • alternatively, the electronic device being placed on the carrier may mean that there is no mechanical connection between the electronic device and the carrier, as long as the two remain relatively stationary.
  • the electronic device can be placed on a certain plane of the carrier.
  • the electronic device may have a communication connection with the carrier.
  • the communication connection may be a connection based on near field communication technology.
  • embodiments of the present application provide a vehicle.
  • the vehicle includes the electronic device described in any one of the above first aspects.
  • the electronic device is a vehicle-mounted device in the vehicle.
  • the electronic device further includes a head-up display (HUD);
  • This HUD is used to display the first visual element.
  • HUD can also be called a head-up display.
  • a HUD can be installed in carriers such as vehicles to project an image onto the windshield in front of the driver, so that the driver can see the projected image while looking straight ahead.
  • the HUD may include an image generation unit and an optical display system.
  • the image generation unit may include a light source, an optical film, and other optical components for generating images.
  • the optical display system can include a reflector, a control unit and a front windshield.
  • the reflector is fitted to the front windshield to eliminate image distortion.
  • the control unit can be used to access various information that needs to be displayed, such as navigation information, so that the image generation unit can generate an image based on this information.
  • the image displayed by the head-up display on the front windshield can be superimposed on the scenery outside the front windshield.
  • the electronic device may also be installed in the vehicle.
  • embodiments of the present application provide a chip system.
  • the chip system includes a processor.
  • the processor is coupled to a memory.
  • the processor executes a computer program stored in the memory to implement any of the methods described in the first aspect.
  • the chip system may be a single chip or a chip module composed of multiple chips.
  • embodiments of the present application provide a computer-readable storage medium on which a computer program is stored.
  • when the computer program is executed by a processor, the method described in any one of the above-mentioned first aspects is implemented.
  • embodiments of the present application provide a computer program product, which when the computer program product is run on an electronic device, causes the electronic device to execute any of the methods described in the first aspect.
  • Figure 1 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • Figure 2 is a schematic diagram of a display principle of a HUD provided by an embodiment of the present application.
  • Figure 3 is a schematic diagram of a navigation scenario provided by an embodiment of the present application.
  • Figure 4 is a schematic diagram of another navigation scenario provided by an embodiment of the present application.
  • Figure 5 is a schematic diagram of another navigation scenario provided by an embodiment of the present application.
  • Figure 6 is a schematic flow chart of a navigation method provided by an embodiment of the present application.
  • Figure 7 is a schematic diagram of a navigation interface provided by an embodiment of the present application.
  • Figure 8 is a schematic diagram of another navigation interface provided by an embodiment of the present application.
  • Figure 9 is a schematic diagram of another navigation interface provided by an embodiment of the present application.
  • Figure 10 is a schematic diagram of another navigation interface provided by an embodiment of the present application.
  • Figure 11 is a schematic diagram of another navigation interface provided by an embodiment of the present application.
  • Figure 12 is a schematic diagram of another navigation interface provided by an embodiment of the present application.
  • Figure 13 is a schematic diagram of another navigation interface provided by an embodiment of the present application.
  • Figure 14 is a schematic diagram of another navigation interface provided by an embodiment of the present application.
  • Figure 15 is a schematic flowchart of another navigation method provided by an embodiment of the present application.
  • the navigation method provided by the embodiments of the present application can be applied to electronic devices such as mobile phones, tablet computers, wearable devices, vehicle-mounted devices, augmented reality (AR)/virtual reality (VR) devices, laptops, ultra-mobile personal computers (UMPC), netbooks, personal digital assistants (PDA), treadmills, rowing machines, and exercise bicycles, for navigation or guidance in fields such as navigation, aviation, astronomy, hydrology, land transportation, and sports and fitness. The embodiments of this application do not place any restrictions on the specific types of electronic devices.
  • FIG. 1 is a schematic structural diagram of an electronic device 100 provided by this application.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, an antenna 1, a communication module 150, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display 194, etc.
  • the sensor module 180 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a touch sensor, an ambient light sensor, etc.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figures, or some components may be combined, some components may be separated, or some components may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), and/or a baseband processor, etc.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in the processor 110 is a cache memory. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use those instructions or data again, it can call them directly from this memory, which avoids repeated accesses and reduces the waiting time of the processor 110, thus improving the efficiency of the system.
  • processor 110 may include one or more interfaces. Interfaces can include a controller area network (CAN) interface, an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the CAN interface is a standard field bus used in automotive computer control systems and embedded industrial control LANs
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL)
  • the I2S interface and the PCM interface can be used for audio communication
  • the UART interface is a universal serial data bus for asynchronous communication
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display 194 and the camera 193. The MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), etc.
  • the above-mentioned interface can be used to couple multiple components included in the electronic device 100, thereby realizing communication between the multiple components.
  • the processor 110 can couple the touch sensor through the I2C interface, so that the processor 110 and the touch sensor communicate through the I2C bus interface to realize the touch function of the electronic device 100;
  • the audio module 170 can transmit audio to the communication module 150 through the I2S interface or the PCM interface.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices such as a mouse, keyboard, or game controller. It can also be used to connect a headset to play audio through the headset, or to connect other electronic devices, such as AR devices.
  • the interface connection relationships between the modules illustrated in the embodiments of the present application are only schematic illustrations and do not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also use interface connection methods different from those in the above embodiments, or use a combination of multiple interface connection methods.
  • the wireless communication function of the electronic device 100 can be implemented through the antenna 1 and the communication module 150 and so on.
  • Antenna 1 is used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the communication module 150 can provide wireless communication solutions applied on the electronic device 100, including 2G/3G/4G/5G, wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), the global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), etc.
  • the communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • at least part of the functional modules of the communication module 150 may be disposed in the processor 110 .
  • at least part of the functional modules of the communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the antenna 1 of the electronic device 100 and the communication module 150 are coupled so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • Wireless communication technologies can include the global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • GNSS can include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
  • the electronic device 100 implements display functions through a GPU, a display 194, an application processor, and the like.
  • the GPU is an image processing microprocessor and is connected to the display 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display 194 is used to display images, videos, etc.
  • the electronic device 100 can implement the shooting function through an ISP, a camera 193, a video codec, a GPU, a display 194, an application processor, and the like.
  • the camera 193 is used to capture still images or videos
  • the ISP is used to process data fed back by the camera 193
  • the digital signal processor is used to process digital signals such as digital image signals
  • the video codec is used to compress or decompress digital video.
  • display 194 may include a HUD.
  • HUD can also be called a head-up display.
  • a HUD can be installed in carriers such as vehicles to project an image onto the windshield in front of the driver, so that the driver can see the projected image while looking straight ahead.
  • the HUD may include an image generation unit and an optical display system.
  • the image generation unit may include a light source, an optical film, and other optical components for generating images.
  • the optical display system can include a reflector, a control unit and a front windshield. The reflector is fitted to the front windshield to eliminate image distortion.
  • the control unit can be used to access various information that needs to be displayed, such as navigation information, so that the image generation unit can generate an image based on this information.
  • the image displayed by the head-up display on the front windshield can be superimposed on the scenery outside the front windshield.
  • the HUD may include a light source 210 , an aspherical mirror 220 and a front windshield 230 .
  • the light source 210 projects the image to the aspherical mirror 220, and the aspherical mirror 220 reflects the image to the front windshield 230.
  • the image finally projected on the front windshield 230 is superimposed on the driver's view outside the front windshield 230, so that the driver sees a combination of the virtual image and the real scene.
  • the display 194 may include a head mounted display (HMD).
  • the HMD may be worn on the driver's head and includes a helmet tracker, an image processing unit, and an optical display system; the helmet tracker may be used to track the driver's head position and line-of-sight angle.
  • the image processing unit can be used to generate the image that needs to be displayed, consistent with the head position and line-of-sight angle; the optical display system can include eyepieces, goggles, masks, etc., and the image can be displayed on the eyepiece, the goggles, or the mask.
  • the electronic device 100 can determine the driver's head position and line-of-sight angle according to the helmet tracker, produce an image consistent with the head position and line-of-sight angle through the image processing unit, and display the image through the optical display system.
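  • The display cycle implied above (track the head pose, render a pose-consistent image, display it through the optical display system) might look like the following; every name is an illustrative assumption, since the disclosure describes the pipeline only at the component level.

```python
def hmd_frame(helmet_tracker, image_processing_unit, optical_display_system):
    """One display cycle of the sketched head-mounted display pipeline."""
    pose = helmet_tracker.read_pose()           # head position and sight angle
    frame = image_processing_unit.render(pose)  # image consistent with the pose
    optical_display_system.show(frame)          # shown on the goggles or mask
```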
  • display 194 may include a multifunction display (MFD).
  • MFD can also be called a downward-looking display.
  • MFD can be installed in the vehicle within the driver's lower field of view, and the driver can view the displayed information by looking down.
  • MFDs can include cathode ray tube displays and flat panel displays.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement data storage functions, such as saving files like music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes instructions stored in the internal memory 121 to execute various functional applications and data processing of the electronic device 100 .
  • the internal memory 121 may include a program storage area and a data storage area. The program storage area can store the operating system and at least one application program required for a function (such as a navigation function or a game function).
  • the storage data area may store data created during use of the electronic device 100 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one disk storage device, flash memory device, universal flash storage (UFS), etc.
  • the electronic device 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • Speaker 170A, also called a "horn", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to hands-free calls.
  • Receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • when the electronic device 100 answers a call or plays a voice message, the voice can be heard by bringing the receiver 170B close to the ear.
  • Microphone 170C, also called a "mic" or "sound transmitter", is used to convert sound signals into electrical signals.
  • the headphone interface 170D is used to connect wired headphones.
  • Pressure sensors are used to sense pressure signals and convert pressure signals into electrical signals.
  • a pressure sensor may be disposed on the display 194. There are many types of pressure sensors, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors.
  • a capacitive pressure sensor may include at least two parallel plates of conductive material. When a force acts on a pressure sensor, the capacitance between the electrodes changes. The electronic device 100 determines the intensity of the pressure based on the change in capacitance. When a touch operation is performed on the display 194, the electronic device 100 detects the strength of the touch operation according to the pressure sensor. The electronic device 100 may also calculate the touched position based on the detection signal of the pressure sensor.
  • touch operations acting on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation with an intensity less than a first pressure threshold acts on the acceleration icon or deceleration icon, the vehicle is accelerated or decelerated; when a touch operation with an intensity greater than or equal to the first pressure threshold acts on the acceleration icon or deceleration icon, the acceleration and deceleration icons can be locked or unlocked. If the acceleration and deceleration icons are in the locked state, the vehicle maintains the current speed and neither accelerates nor decelerates; if the acceleration and deceleration icons are in the unlocked state, the driver can control the acceleration or deceleration of the vehicle through a touch operation whose intensity is less than the first pressure threshold.
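  • A sketch of the threshold behavior just described; the state names and return strings are assumptions, and only the comparison against the first pressure threshold follows the text above.

```python
class AccelDecelIcon:
    """Light press (< threshold): accelerate/decelerate; firm press toggles the lock."""

    def __init__(self, first_pressure_threshold: float):
        self.first_pressure_threshold = first_pressure_threshold
        self.locked = False

    def on_touch(self, pressure: float, icon: str) -> str:
        if pressure >= self.first_pressure_threshold:
            self.locked = not self.locked        # firm press: lock or unlock the icons
            return "locked" if self.locked else "unlocked"
        if self.locked:
            return "hold current speed"          # locked: light presses are ignored
        return "accelerate" if icon == "acceleration" else "decelerate"
```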
  • the gyro sensor can be used to determine the motion posture of the electronic device 100 .
  • in some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) can be determined through the gyro sensor.
  • Gyroscope sensors can be used for navigation and somatosensory gaming scenarios.
  • Air pressure sensors measure air pressure.
  • the electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor to assist positioning and navigation.
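  • For example, the pressure-to-altitude conversion could use the standard international barometric formula, a common approximation that the disclosure itself does not specify:

```python
def altitude_from_pressure(pressure_hpa: float, sea_level_hpa: float = 1013.25) -> float:
    """Approximate altitude in metres from measured air pressure (barometric formula)."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

print(round(altitude_from_pressure(954.6), 1))  # roughly 500 m above sea level
```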
  • the acceleration sensor can detect the acceleration of the electronic device 100 in various directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected.
  • Distance sensor for measuring distance.
  • Electronic device 100 can measure distance via infrared or laser.
  • the electronic device 100 may use a distance sensor to measure the distance between the vehicle and other vehicles, such as the distance between the vehicle and the vehicle in front or the distance between vehicles behind it.
  • Proximity light sensors may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 100 emits infrared light outwardly through the light emitting diode.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 . In some embodiments, the electronic device 100 can determine whether there are obstacles such as walls around by using a proximity light sensor.
  • the ambient light sensor is used to sense ambient light brightness.
  • Electronic device 100 may adaptively adjust display 194 brightness based on perceived ambient light brightness.
  • the electronic device 100 can determine the brightness of the environment through an ambient light sensor, so as to adjust or remind the user to adjust the movement speed, turn on or off lights, etc. based on the brightness.
  • the fingerprint sensor is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to achieve fingerprint unlocking, such as unlocking the door of the vehicle, starting the vehicle, unlocking the display 194, and so on.
  • Touch sensor also called “touch panel”.
  • the touch sensor can be disposed on the display 194, and the touch sensor and the display 194 form a touch screen, also called a "touch screen”. Touch sensors are used to detect touches on or near them.
  • the touch sensor can pass the detected touch operation to the application processor to determine the touch event type.
  • Visual output related to the touch operation may be provided through display 194 .
  • the touch sensor may also be disposed on the surface of the electronic device 100 at a location different from that of the display 194 .
  • the buttons 190 include a power button, a volume button, etc.
  • Key 190 may be a mechanical key. It can also be a touch button.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100.
  • the motor 191 can generate vibration prompts.
  • the motor 191 can be used for vibration prompts for incoming calls and can also be used for touch vibration feedback.
  • the motor 191 can also produce different vibration feedback effects for touch operations acting on different areas of the display 194, and different application scenarios (such as vehicle acceleration, deceleration, impact, and damage) can also correspond to different vibration feedback effects. The touch vibration feedback effect can also be customized.
  • the indicator 192 can be an indicator light, which can be used to indicate the status of the electronic device 100, such as the movement speed, changes in power, changes in oil level, and whether the seat belt is worn correctly, and can also be used to indicate messages, missed calls, notifications, and so on.
  • the electronic device 100 may further include a driving system, which may be used to drive the electronic device 100 to move.
  • the drive system can include motion structures such as wheels, paddles, propellers, and robotic arms.
  • the drive system may also include a transmission structure connected to the motion structure, such as a transmission shaft, gears, hinges, crawlers, connecting rods, etc.
  • the drive system also includes a power structure for generating or receiving mechanical energy, such as a pedal, a rocker, or an engine; the power structure can be connected to the motion structure through the transmission structure, so that mechanical energy can be transmitted to the motion structure.
  • the electronic device 100 can be carried on the user, or can be placed or integrated on a vehicle.
  • the electronic device 100 and the user or the vehicle are in the same time and space.
  • when the user or the carrier moves, the electronic device 100 is also driven to move, so the electronic device 100 and the user or carrier have the same motion state.
  • the electronic device 100 being integrated on the carrier may mean that the electronic device 100 has both a mechanical connection and a communication connection with the carrier (for example, the electronic device 100 can communicate with the carrier through an interface such as CAN in the carrier).
  • when the electronic device 100 is integrated on the carrier, the electronic device 100 may be considered to be part of the carrier, or the electronic device 100 and the carrier may be considered to be the same device.
  • the electronic device 100 is placed on the carrier, which may mean that the electronic device 100 is mechanically connected to the carrier.
  • if the carrier includes a bracket for placing the electronic device 100, the electronic device 100 may be installed on the bracket; alternatively, the electronic device 100 being placed on the carrier may mean that there is no mechanical connection between the electronic device 100 and the carrier, as long as the two remain relatively stationary; for example, the electronic device 100 can be placed on a flat surface of the carrier.
  • the electronic device 100 may have a communication connection with the carrier.
  • the communication connection may be a connection based on a short-range communication technology such as Wi-Fi.
  • FIG. 3 is a schematic diagram of a navigation scenario provided by an embodiment of the present application.
  • the navigation scenario is navigation under on-site driving and includes an electronic device 100 and a first vehicle 310 .
  • the first carrier 310 is a vehicle
  • the electronic device 100 is a mobile phone placed in the vehicle.
  • the first carrier 310 can also be any other carrier.
  • the electronic device 100 may also be other types of devices.
  • the first carrier 310 may include a space for accommodating the driver, and may also include a space for accommodating the electronic device 100 .
  • the electronic device 100 may be placed or integrated in the first carrier 310. Therefore, the electronic device 100 and the first carrier 310 are in the same time and space and have the same motion state. Among them, if the electronic device 100 is integrated in the first carrier 310 , then the electronic device 100 can be considered as a vehicle-mounted device built into the first carrier 310 .
  • At least one of the electronic device 100 and the first carrier 310 may include the aforementioned sensor module 180, so that one or more types of information required for navigation, such as the current motion state, location, and external environment of the first carrier 310, can be sensed through the sensor module 180.
  • At least one of the electronic device 100 and the first carrier 310 may include the aforementioned display 194 to display a navigation interface including navigation information such as the movement route and movement speed. In some embodiments, the display 194 may be a HUD, HMD, or MFD.
  • the driver can be located inside the first vehicle 310 or outside the first vehicle 310 and drive the first vehicle 310 in real time through components such as a steering wheel, buttons, rudder, pull rod, accelerator and brake.
  • the electronic device 100 obtains one or more types of information required for navigation through the electronic device 100 itself or the first carrier 310, determines the navigation information based on that information, and then displays the navigation interface through the display 194 included in the electronic device 100 itself or the first carrier 310. In some embodiments, if the navigation interface is displayed through a HUD, the navigation interface can be displayed on the front windshield based on the HUD as shown in Figure 2 to provide a more intuitive and realistic display effect. When the driver sees the navigation interface, he can refer to the navigation information included in the navigation interface to adjust the motion state of the first carrier 310, such as accelerating, decelerating, or steering.
  • FIG. 4 is a schematic diagram of another navigation scenario provided by an embodiment of the present application.
  • This navigation scenario is navigation under remote driving, including an electronic device 100 and a second vehicle 320, and the electronic device 100 and the second vehicle 320 can be connected through a network.
  • the second vehicle 320 is a drone, but it can be understood that, in practical applications, the second vehicle 320 can also be any other vehicle that can be driven remotely, such as an unmanned car, an unmanned submersible, and so on.
  • the electronic device 100 can be used to remotely drive the second vehicle 320, and the electronic device 100 can also be other types of devices.
  • the second vehicle 320 may include the aforementioned sensor module 180, thereby being able to obtain one or more information required for navigation such as the current motion state, location, and external environment of the second vehicle 320.
  • the electronic device 100 may include the aforementioned display 194 for displaying a navigation interface including navigation information such as movement route and movement speed.
  • the driver can trigger the electronic device 100 to generate various control instructions through the buttons or touchpad included on the electronic device 100 at one location; the electronic device 100 sends these control instructions through the network connection to the second vehicle 320 at another location, and the second vehicle 320 can receive these control instructions and move based on them.
  • the second vehicle 320 can obtain one or more types of information required for navigation, such as its current motion state, location, and external environment, through the sensor module 180 and send the information to the electronic device 100; the electronic device 100 can determine the navigation information based on this information and display the navigation interface on the display 194 included in the electronic device 100. The driver can then adjust the movement state of the second vehicle 320 based on the navigation information included in the navigation interface displayed by the electronic device 100.
  • FIG. 5 is a schematic diagram of another navigation scenario provided by an embodiment of the present application.
  • the navigation scenario is navigation under user movement, including the user and the electronic device 100.
  • the user can carry or wear the electronic device 100.
  • the user carries a mobile phone 510 and wears AR glasses 520, and exercises outdoors or on indoor treadmills and other sports equipment.
  • electronic device 100 may independently navigate the user.
  • the mobile phone 510 can obtain one or more types of information required for navigation through the sensor module 180 included in the mobile phone 510, determine the navigation information based on the information, and display a navigation interface including the navigation information.
  • multiple electronic devices 100 may cooperate to navigate the user.
  • the mobile phone 510 can obtain one or more types of information required for navigation through the sensor module 180 included in at least one of the mobile phone 510 and the AR glasses 520, determine the navigation information based on the information, and display a navigation interface including the navigation information on at least one of the mobile phone 510 and the AR glasses 520.
  • the user can adjust his or her motion status based on the navigation information included in the navigation interface displayed by the mobile phone 510 or the AR glasses 520 .
  • the user can be an object that can perceive the navigation information obtained by the navigation method.
  • the users can include people.
  • the users may include animals other than humans, such as other primates.
  • the user may include a bionic robot.
  • the user may be equated to the driver.
  • Carriers can include vehicles, ships, submersibles, airplanes, aircraft, and spacecraft, as well as other carriers that can move and accelerate or decelerate, such as robots.
  • the navigation interface may be an interface including one or more types of navigation information.
  • the navigation interface can be displayed on a display of the electronic device, and it is understandable that the same navigation interface can be displayed in different styles on different displays according to different characteristics such as resolution and size of the display.
  • the navigation target can be the device being navigated, such as a vehicle, etc.
  • Guidance speed is the recommended movement speed for navigation targets.
  • the first device may be a device for navigation, a navigation target, or a device for displaying a navigation interface
  • the second device may be a navigation target, or a device for displaying a navigation interface.
  • the third device may be a device that displays the navigation interface.
  • the speed in the embodiments of the present application does not include direction; that is, it can be understood as the magnitude of the velocity (the rate).
  • the navigation scenes mentioned in the embodiments of the present application can be real navigation scenes, or virtual or simulated navigation scenes.
  • the above navigation scenes can all be virtual navigation scenes in electronic games.
  • users, vehicles and various terminal devices can be virtual objects in the game.
  • the following specific embodiments can be combined with each other, and the same or similar concepts or processes may not be described again in some embodiments.
  • the first device may be the electronic device 100 in the navigation scene as shown in FIG. 3 , or the first device may be the electronic device 100 in the navigation scene as shown in FIG. 5 , such as a mobile phone 510 or AR glasses 520 .
  • the first device is both the navigation device and the navigation target (that is, the device being navigated); in other words, the first device navigates for itself. It should be noted that this method is not limited to the specific order described in Figure 6 and below; the order of some steps of the method can be exchanged according to actual needs, or some steps can be omitted or deleted.
  • the method includes the following steps:
  • the first device obtains the first movement speed of the navigation target, where the navigation target is the first device.
  • the first movement speed may be the speed of the navigation target relative to the ground, or the speed of the navigation target relative to other reference objects other than the ground.
  • the first device may obtain the first movement speed through sensors used to detect movement speed, such as a speed sensor and an acceleration sensor. In other embodiments, the first device may detect the first movement speed through satellites or base stations. Of course, in practical applications, the first device may also obtain the first movement speed through other methods. The embodiments of this application do not limit the method by which the first device obtains the first movement speed.
  • the first device can directly call hardware components such as sensors on the hardware layer to obtain the first movement speed.
  • the first movement speed can also be obtained by an application in the software layer.
  • the first device includes a map application, and the map application includes a movement speed interface for providing the current movement speed. Therefore, the first device obtains the first movement speed by calling the movement speed interface.
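  • A sketch of this software-layer path, in which a hypothetical map application exposes a movement speed interface; the disclosure does not define a concrete API, so all names here are assumptions.

```python
class MapApp:
    """Hypothetical map application exposing the current movement speed."""

    def __init__(self, speed_source):
        self._speed_source = speed_source  # e.g. a sensor- or GNSS-backed callable

    def get_movement_speed(self) -> float:
        """The 'movement speed interface': returns the current speed."""
        return float(self._speed_source())

# The first device obtains the first movement speed by calling the interface:
app = MapApp(speed_source=lambda: 13.9)  # stub sensor: 13.9 m/s (about 50 km/h)
first_movement_speed = app.get_movement_speed()
```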
  • the first device obtains the guidance speed corresponding to the navigation target.
  • the reference for the guidance speed and the reference for the first movement speed may be the same.
  • the first device may obtain motion environment information corresponding to the navigation target, and determine the guidance speed based on the motion environment information.
  • the motion environment information is used to indicate the environment where the navigation target is located.
  • the motion environment information may include at least one of location information, weather information, and obstacle information.
  • Location information can be used to indicate where the navigation target is located.
  • the location information may include location coordinates of the navigation target, such as longitude and latitude.
  • the location information may include at least one of a road identification, a road segment identification, and a road segment type where the navigation target is located.
  • the road segment type may be determined by professionals such as the transportation department.
  • the road segment type may include straight lanes, left-turn lanes, right-turn lanes, main roads, secondary roads, auxiliary roads, directional ramps, semi-directional ramps, general ramps, collector lanes, express lanes, and the like.
  • location information may include height above the ground.
  • the location information may also include other information that can indicate the location of the navigation target.
  • the first device may determine the location information through satellite positioning or base station positioning. Alternatively, the first device may obtain an image around the navigation target, and obtain location information such as place name, street sign, road identification, road section identification, and road section type from the image.
  • Weather information can be used to indicate the weather in the area where the navigation target is located.
  • weather information may include information such as haze, rain, snow, and visibility.
  • the weather information may also include other information that can indicate the weather.
  • the first device may obtain weather information corresponding to the location information from the network based on the obtained location information.
  • the first device may obtain images around the navigation target and identify corresponding weather information from the images.
  • Obstacle information can be used to indicate the location and status of obstacles within the first preset range of the navigation target.
  • the obstacles can be objects that hinder the passage of the navigation target, such as walls, guardrails, pedestrians, and vehicles.
  • the obstacle information may include one or more of the type of obstacle, the orientation of the obstacle, the distance of the obstacle from the navigation target, and the moving speed of the obstacle.
  • the obstacle information may also include other information that can indicate the location or status of the obstacle.
  • the first preset range may be determined in advance by the first device, and the embodiment of the present application does not limit the size of the first preset range.
  • the first device can detect whether there are obstacles around the navigation target through sensors such as radar and distance sensors, and if so, further determine information such as the type of the obstacle, the orientation of the obstacle, the distance of the obstacle from the navigation target, and the moving speed of the obstacle. Alternatively, the first device may obtain images around the navigation target and identify the obstacle information from the images.
  • the first device can also obtain the motion environment information through other methods.
  • the embodiments of this application do not limit the method by which the first device obtains the motion environment information.
  • when the first device obtains the motion environment information, it can determine, based on that information, a guidance speed that matches the environment in which the navigation target is located.
  • the first device may store a variety of motion environment information and guidance speeds corresponding to the various motion environment information.
  • based on the determined motion environment information, the first device may retrieve the stored guidance speed corresponding to that motion environment information.
  • the first device can determine in advance the guidance speed corresponding to each of the multiple types of movement environment information and store it.
  • relevant technical personnel may determine the corresponding guidance speed in advance based on various motion environment information, and submit the various motion environment information and the corresponding guidance speed to the first device and store them.
  • the correspondence between one kind of position information and the guidance speed can be shown in Table 1 below, where the first two columns of Table 1 can be the position information of the navigation target, and the last column can be the guidance speed corresponding to that position information.
  • Table 1 can give guidance speeds set by road traffic planning managers for the various lane types at an intersection, where V1 can be a preset value.
  • for one lane type, the corresponding guidance speed can be 0.7*V1; for another, it can be between 0.6*V1 and 0.7*V1.
  • for example, road traffic planning and management personnel have stipulated in advance that in rainy and snowy weather, when visibility is less than 200 meters the maximum speed is 60 kilometers/hour, when visibility is less than 100 meters the maximum speed is 40 kilometers/hour, and when visibility is less than 50 meters the maximum speed is 20 kilometers/hour. The developer of the first device can then set the guidance speeds for these three visibility levels in rain and snow based on the maximum speeds set by the road traffic planning manager, so that the guidance speed is less than or equal to the maximum speed corresponding to that visibility (a lookup of this kind is sketched below).
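A stored correspondence of this kind can be expressed as a few ordered rules. The sketch below encodes only the three rain/snow visibility levels quoted above (maxima of 60, 40, and 20 km/h) and applies an arbitrary 0.9 factor to keep the guidance speed at or below the regulated maximum; the factor and the 100 km/h default are assumptions, not values from the application.

```python
# Sketch of a stored mapping from motion environment information to a
# guidance speed, using the visibility rules quoted above (maxima of
# 60/40/20 km/h for visibility below 200/100/50 m in rain or snow).
# The 0.9 safety factor is an illustrative assumption: it merely keeps the
# guidance speed at or below the regulated maximum for that visibility.

def guidance_speed_for_visibility(visibility_m: float, rain_or_snow: bool,
                                  default_kmh: float = 100.0) -> float:
    if not rain_or_snow:
        return default_kmh
    if visibility_m < 50:
        max_kmh = 20.0
    elif visibility_m < 100:
        max_kmh = 40.0
    elif visibility_m < 200:
        max_kmh = 60.0
    else:
        return default_kmh
    return 0.9 * max_kmh  # guidance speed <= the maximum for this visibility


print(guidance_speed_for_visibility(80.0, rain_or_snow=True))  # 36.0
```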
  • the guidance speed may be 0.
  • the road, track or route traveled by the navigation target may be provided with speed guidance signs, such as traffic lights and speed limit signs.
  • the speed guidance signs may be used to indicate the guidance speed corresponding to the location of the sign, so when the navigation target travels to that location, the first device or the navigation target can obtain an image around the navigation target, and the first device can recognize the speed guidance sign from the image and then determine, based on the sign, the guidance speed corresponding to the position information of that location.
  • the speed guidance mark may be determined in advance by relevant technical personnel based on the location of the speed guidance mark.
  • the first device can input the motion environment information to the stored first machine learning model, and obtain the guidance speed output by the first machine learning model.
  • the first machine learning model can be trained in advance based on the first sample set.
  • the first sample set includes multiple samples, and each sample includes motion environment information and a labelled guidance speed (one possible instantiation is sketched below).
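The application does not fix a model family for the first machine learning model. As one possible instantiation, the sketch below trains a small decision-tree regressor on a toy first sample set; the feature encoding (visibility and obstacle distance in metres) and all numbers are invented for illustration.

```python
# Sketch of the first machine learning model: trained on a first sample set
# in which each sample pairs motion environment information with a labelled
# guidance speed. The feature encoding and toy numbers are illustrative
# assumptions; the application does not specify a model family.
from sklearn.tree import DecisionTreeRegressor

# features: [visibility_m, obstacle_distance_m]; labels: guidance speed km/h
X = [[300, 200], [150, 200], [80, 200], [300, 30], [40, 15]]
y = [100.0, 60.0, 40.0, 50.0, 20.0]

model = DecisionTreeRegressor(max_depth=3).fit(X, y)
print(model.predict([[120, 100]]))  # guidance speed for a new environment
```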
  • the first device may receive a guidance speed submitted by the user. For example, the user actively sets the guidance speed while driving because he hopes to reach the destination in time, or the user actively sets the desired guidance speed while running on a treadmill.
  • the first device may obtain a movement route and a movement time limit submitted by the user, and determine the guidance speed based on the movement route and the movement time limit (a minimal sketch follows).
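One natural reading of this option is that the guidance speed is the remaining route length divided by the remaining time; the sketch below assumes exactly that and nothing more.

```python
# Sketch: deriving a guidance speed from a user-submitted movement route and
# movement time limit, assuming the straightforward reading
# guidance speed = remaining route length / remaining time.

def guidance_from_route(route_length_km: float, time_limit_h: float) -> float:
    if time_limit_h <= 0:
        raise ValueError("time limit must be positive")
    return route_length_km / time_limit_h


print(guidance_from_route(10.0, 0.5))  # 20.0 km/h to cover 10 km in 30 min
```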
  • the first device may further determine a first speed range, and the guidance speed may be a value included in the first speed range, where the first speed range may indicate a preferred range of movement speeds for the navigation target.
  • the minimum speed set by the road traffic planning manager is 80 kilometers per hour and the maximum speed is 120 kilometers per hour.
  • the guidance speed determined by the first device can be 100 kilometers/hour, and the first speed range can be between 80 kilometers/hour and 120 kilometers/hour.
  • for example, the first device is a vehicle-mounted navigation device and detects an obstacle in front of the vehicle.
  • to avoid hitting the obstacle while preserving the user's comfort, the vehicle needs to slow to 40 kilometers/hour (i.e., the guidance speed); in fact, any speed below 40 kilometers/hour would also avoid the obstacle, so the first speed range can be 40 kilometers/hour and below.
  • the way in which the first device determines the first speed range may be the same as or similar to the way in which the guidance speed is determined.
  • the first device may also determine the guidance speed and/or the first speed range through other methods.
  • the first device may execute S601 and S602 one after another or may execute S601 and S602 at the same time.
  • the embodiment of the present application does not limit the order in which the first device executes S601 and S602.
  • the first device displays the moving first visual element based on the first movement speed and the guidance speed, where the first visual element is used to indicate that the navigation target should accelerate or decelerate, and the second movement speed of the first visual element is positively correlated with the absolute value of the difference between the first movement speed and the guidance speed.
  • Visual elements can be used as tools and media to convey information.
  • Visual elements can include information elements such as graphics, text, shapes, and forms, as well as formal elements such as points, lines, surfaces, colors, and spaces.
  • visual elements could include arrows or stripes.
  • the moving first visual element can be understood as the position of the first visual element in the navigation interface will change.
  • the first visual element of movement can more easily attract the user's attention, thereby guiding the user to speed up or slow down in time and improve the navigation effect.
  • the second movement speed is positively correlated with the absolute value of the difference between the first movement speed and the guidance speed. That is, the greater the absolute value of the difference between the first movement speed of the navigation target and the guidance speed, the greater the second movement speed of the first visual element, so that the user can more intuitively feel that difference and then take corresponding measures to increase or decrease the movement speed of the navigation target more promptly.
  • the second movement speed of the first visual element may be equal to the absolute value of the difference between the first movement speed and the guidance speed.
  • the reference for the second speed of movement of the first visual element may be the same as the reference for the first speed of movement and the guide speed.
  • if the first movement speed is less than the guidance speed, the first visual element moving in the first direction is displayed to indicate that the navigation target should accelerate; if the first movement speed is greater than the guidance speed, the first visual element moving in the second direction is displayed to indicate that the navigation target should decelerate.
  • the first direction and the second direction may be determined in advance by the electronic device, and the first direction and the second direction may be different.
  • the first direction may be parallel and opposite to the second direction.
  • the first direction may be from a position close to the user toward a position away from the user.
  • for example, the first direction may be a direction away from the bottom of the navigation interface, so that the driver more intuitively perceives the first visual element as moving away from the vehicle, making it easier for the driver to think of accelerating to catch up and reducing the driver's reaction time.
  • the second direction can be from a position far away from the user toward a position close to the user; for example, the second direction can be a direction toward the bottom of the navigation interface, so that the driver more intuitively perceives the first visual element as approaching the vehicle, making it easier for the driver to think of decelerating and yielding, and reducing the driver's reaction time.
  • in practical applications, the first direction and the second direction can also be other directions (a sketch of the resulting display rule follows).
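Pulling the rule together: the element's movement speed tracks the absolute speed difference and its direction encodes the sign. The sketch below assumes the simplest positively correlated choice, namely that the second movement speed equals |first movement speed − guidance speed|, with +1 standing for the first direction and −1 for the second.

```python
# Sketch of S603's display rule: the first visual element moves in the first
# direction (+1, away from the bottom of the navigation interface) when the
# first movement speed is below the guidance speed, and in the second
# direction (-1) when it is above. Its speed is assumed equal to
# |first movement speed - guidance speed|.

def element_velocity(first_speed: float, guidance_speed: float) -> float:
    diff = guidance_speed - first_speed
    if diff == 0:
        return 0.0            # speeds match: stop or hide the element
    direction = 1 if diff > 0 else -1
    return direction * abs(diff)


print(element_velocity(0.0, 40.0))   # +40.0: roll forward, prompting acceleration
print(element_velocity(90.0, 60.0))  # -30.0: roll backward, prompting deceleration
```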
  • the first visual element includes an arrow 700.
  • the first direction is the direction from the bottom of the navigation interface to the center of the navigation interface
  • the second direction is the direction from the center of the navigation interface to the bottom of the navigation interface.
  • the first image set including FIG. 7 and FIG. 8 may be displayed in sequence.
  • each frame image in the first image set may include the arrow 700, and each arrow 700 may be positioned closer to the bottom of the navigation interface in earlier-displayed images than in later-displayed images, so that when the first image set is displayed in sequence, the arrow 700 moves away from the bottom of the navigation interface.
  • since the two arrows 700 included in the dotted box are closer to the bottom of the navigation interface in FIG. 7 than in FIG. 8, displaying FIG. 7 first and then FIG. 8 generates an animation effect of the two arrows 700 moving away from the bottom of the navigation interface.
  • the second image set including Figure 8 and Figure 7 can be displayed in sequence.
  • each frame image in the second image set may include the arrow 700, and each arrow 700 may be positioned farther from the bottom of the navigation interface in earlier-displayed images than in later-displayed images, so that when the second image set is displayed in sequence, the arrow 700 moves toward the bottom of the navigation interface.
  • since the two arrows 700 included in the dotted box are closer to the bottom of the navigation interface in FIG. 7 than in FIG. 8, displaying FIG. 8 first and then FIG. 7 generates an animation effect of the two arrows 700 moving toward the bottom of the navigation interface.
  • the first visual element includes light waves 900 (ie, the dark stripes in the dotted boxes in Figures 9 and 10).
  • the first direction is a direction away from the bottom of the navigation interface
  • the second direction is a direction close to the bottom of the navigation interface.
  • the third image set including Figure 9 and Figure 10 can be displayed in sequence.
  • each frame image in the third image set may include the light wave 900, and the position of each light wave 900 in earlier-displayed images may be closer to the bottom of the navigation interface than in later-displayed images, so that when the third image set is displayed in sequence, the light wave 900 moves away from the bottom of the navigation interface.
  • since the light wave 900 included in the dotted box is closer to the bottom of the navigation interface in FIG. 9 than in FIG. 10, displaying FIG. 9 first and then FIG. 10 generates an animation effect of the light wave 900 moving away from the bottom of the navigation interface.
  • the fourth image set including Figure 10 and Figure 9 can be displayed in sequence.
  • each frame image in the fourth image set may include the light wave 900, and the position of each light wave 900 in earlier-displayed images may be farther from the bottom of the navigation interface than in later-displayed images, so that when the fourth image set is displayed in sequence, the light wave 900 moves toward the bottom of the navigation interface.
  • the light wave 900 included in the dotted box is closer to the bottom of the navigation interface in Figure 9 than in Figure 10. Then when Figure 10 is displayed first and then Figure 9 is displayed, an animation effect of the light wave 900 moving toward the bottom of the navigation interface can be generated.
  • the first device can identify a specific object in the picture captured by the camera, determine the two-dimensional coordinates of the specific object in the picture through a visual recognition algorithm, convert the two-dimensional coordinates of the specific object into three-dimensional coordinates through pre-calibrated camera parameters (such as the focal length of the camera, distortion parameters, a first displacement matrix, and a rotation matrix), determine the three-dimensional coordinates of the first visual element based on the three-dimensional coordinates of the specific object, and draw the first visual element into the picture based on the determined three-dimensional coordinates of the first visual element.
  • for example, the first device can capture a picture of the scene in front of the vehicle through a camera, identify the lane lines in the picture through machine vision, determine the two-dimensional coordinates of the lane lines in the picture, convert those two-dimensional coordinates into three-dimensional coordinates using the calibrated camera parameters, determine the three-dimensional coordinates of the first visual element based on the three-dimensional coordinates of the lane lines, and draw the first visual element into the lane in the picture based on those three-dimensional coordinates (a simplified sketch follows).
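As a heavily simplified instance of this 2D-to-3D conversion, the sketch below back-projects a lane-line pixel through a pinhole intrinsic matrix and intersects the ray with a flat road plane at a known camera height. Lens distortion and the displacement/rotation matrices of a full calibration are omitted, and all calibration numbers are invented.

```python
# Simplified sketch of converting a lane-line pixel's 2D coordinates into 3D
# coordinates using pre-calibrated camera parameters. Assumes an ideal
# pinhole camera looking straight down the lane, a flat road plane, and a
# known camera height; all numbers are illustrative.
import numpy as np

K = np.array([[1000.0, 0.0, 640.0],    # focal lengths and principal point
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
CAMERA_HEIGHT_M = 1.4                   # camera height above the road plane


def pixel_to_road_point(u: float, v: float) -> np.ndarray:
    """Intersect the back-projected pixel ray with the road plane."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # direction in camera coords
    if ray[1] <= 0:
        raise ValueError("pixel is at or above the horizon")
    scale = CAMERA_HEIGHT_M / ray[1]    # stretch the ray until it hits the road
    return ray * scale                  # [x right, y down, z forward] in metres


print(pixel_to_road_point(700.0, 500.0))  # 3D point where an arrow could be drawn
```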
  • the first device can draw the effect of the movement of the first visual element through an animation interface provided by the operating system in the first device.
  • the first device can realize the animation effect of the first visual element moving away from the user by increasing the z-axis coordinate of the first visual element (that is, the coordinate along the axis parallel to the lane), and realize the visual effect of bringing the first visual element closer to the user by decreasing that z-axis coordinate; alternatively, the first device can perform a matrix translation operation on the first visual element through a preset second displacement matrix, thereby achieving the animation effect of moving the first visual element (sketched below).
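The displacement-matrix variant of this animation can be sketched as one homogeneous translation applied per frame; the 4×4 matrix below translates along the z (lane) axis only, and the step size and frame count are arbitrary.

```python
# Sketch of animating the first visual element by translating it along the
# z axis (the axis parallel to the lane) each frame, expressed as a 4x4
# homogeneous displacement matrix. Increasing z moves the element away from
# the user; decreasing z brings it closer.
import numpy as np


def displacement_matrix(dz: float) -> np.ndarray:
    m = np.eye(4)
    m[2, 3] = dz        # translate along z only
    return m


element = np.array([0.0, 0.0, 5.0, 1.0])   # homogeneous position of the element
step = displacement_matrix(dz=0.5)          # per-frame translation

for frame in range(3):
    element = step @ element
    print(frame, element[:3])  # z grows: the element recedes from the user
```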
  • the navigation interface displayed by the first device is a head-up perspective navigation interface, and in order to display the first visual element more intuitively and truly, the first visual element is also displayed from a head-up perspective.
  • the first visual element can also be displayed through other perspectives, or can be displayed in the navigation interface of other locations.
  • the navigation interface is a navigation interface from a top view
  • the light wave 900 is also displayed in the lane from a top view.
  • the navigation interface is still a head-up perspective navigation interface, but the light wave 900 is displayed on the left side of the navigation interface and is not associated with the position of any object included in the navigation interface.
  • the first device may display the first visual element of movement when detecting that the first movement speed is inconsistent with the guide speed, and stop displaying the first visual element of movement when detecting that the first movement speed is consistent with the guide speed.
  • through this more precise speed guidance, the user can be prompted to adjust his own or the vehicle's movement speed as promptly as possible to keep it consistent with the guidance speed, thereby improving the accuracy of navigation.
  • stopping the display of the moving first visual element when the first movement speed is detected to be consistent with the guidance speed may include: the first device immediately stops displaying the moving first visual element upon detecting that the first movement speed is consistent with the guidance speed; or, the first device stops displaying the moving first visual element only when it detects that the first movement speed is consistent with the guidance speed and has remained consistent for longer than a preset second duration threshold, thereby extending the time for which the moving first visual element is displayed, guiding the user to adjust his own or the vehicle's movement speed for a longer period, and reducing the deviation between the movement speed and the guidance speed.
  • the second duration threshold can be set in advance by relevant technical personnel or set by the user.
  • the embodiment of the present application does not limit the method of setting the second duration threshold and the size of the second duration threshold.
  • the first device may display the moving first visual element when it detects that the first movement speed is not within the first speed range, and stop displaying the moving first visual element when it detects that the first movement speed is within the first speed range, thereby guiding the user to adjust his own or the vehicle's movement speed toward the first speed range. This speed guidance method allows the user to control his own or the vehicle's movement speed more freely and flexibly, improving the flexibility of navigation and the user experience.
  • alternatively, the first device may display the moving first visual element when it detects that the first movement speed is not within the first speed range, and stop displaying the moving first visual element when it detects that the first movement speed is consistent with the guidance speed.
  • stopping the display of the moving first visual element may include: the first device immediately stops displaying the moving first visual element upon detecting that the first movement speed is within the first speed range, thereby shortening the time for which the moving first visual element is displayed and allowing the user to control his own or the vehicle's movement speed more freely and flexibly; or, the first device stops displaying the moving first visual element only when it detects that the first movement speed is within the first speed range and has remained within that range for longer than a preset first duration threshold, thereby extending the display time, guiding the user to adjust his own or the vehicle's movement speed for longer, and reducing the deviation between the movement speed and the guidance speed or the first speed range (the duration-threshold variant is sketched after this passage).
  • the first duration threshold can be set in advance by relevant technical personnel or can be set by the user.
  • the embodiment of the present application does not limit the method of setting the first duration threshold and the size of the first duration threshold.
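The two stopping policies differ only in whether a dwell time is required before the element is hidden. A minimal sketch of the duration-threshold variant, with an invented 2-second first duration threshold and a once-per-second speed sample:

```python
# Sketch of the stop-display policy with a duration threshold: the moving
# first visual element is shown while the first movement speed is outside
# the first speed range, and hidden only after the speed has stayed inside
# the range longer than the first duration threshold (2.0 s is illustrative).

class ElementVisibility:
    def __init__(self, low: float, high: float, duration_threshold_s: float = 2.0):
        self.low, self.high = low, high
        self.threshold = duration_threshold_s
        self.time_in_range = 0.0
        self.visible = False

    def update(self, speed: float, dt_s: float) -> bool:
        if self.low <= speed <= self.high:
            self.time_in_range += dt_s
            if self.time_in_range > self.threshold:
                self.visible = False     # stop displaying the moving element
        else:
            self.time_in_range = 0.0
            self.visible = True          # display the moving element
        return self.visible


vis = ElementVisibility(low=80.0, high=120.0)
for speed in (60.0, 95.0, 96.0, 97.0):   # sampled once per second
    print(vis.update(speed, dt_s=1.0))   # True, True, True, False
```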
  • the first device may also display a moving second visual element: if the first visual element is used to indicate that the navigation target should accelerate, the second visual element may be used to indicate that it should decelerate, and vice versa; the third movement speed of the second visual element may be positively correlated with the absolute value of the difference between the first movement speed and the guidance speed.
  • when the first visual element is used to indicate acceleration and the second visual element is used to indicate deceleration: if the first movement speed is less than the guidance speed, the first device displays the first visual element moving in the first direction; if the first movement speed is greater than the guidance speed, the first device displays the second visual element moving in the second direction.
  • when the first visual element is used to indicate deceleration and the second visual element is used to indicate acceleration: if the first movement speed is less than the guidance speed, the first device displays the second visual element moving in the first direction; if the first movement speed is greater than the guidance speed, the first device displays the first visual element moving in the second direction.
  • by displaying first and second visual elements of different styles, acceleration and deceleration can be indicated more intuitively and separately, making it easier for the user to react differently to the first visual element and the second visual element, further improving navigation accuracy and the user experience.
  • the first device can display navigation information such as the first visual element and the second visual element on the first device itself.
  • for example, if the first device is the electronic device 100 in the navigation scene shown in FIG. 3, the electronic device 100 can display the navigation information on the electronic device 100 itself.
  • the first device can also display the navigation information through a third device.
  • for example, if the first device is the mobile phone 510 in the navigation scene shown in FIG. 5 and the third device is the AR glasses 520 in that scene, then the mobile phone 510 can display the navigation information on the mobile phone 510 itself, and can of course also display the navigation information through the AR glasses 520.
  • stopping the display of the moving first visual element may include displaying a stationary first visual element or hiding the first visual element; similarly, stopping the display of the moving second visual element may include displaying a stationary second visual element or hiding the second visual element.
  • the navigation target is the first device.
  • the first device may obtain the first movement speed of the navigation target and the guidance speed corresponding to the navigation target, and display the moving first visual element based on the first movement speed and the guidance speed. Since the moving first visual element is more likely to attract the user's attention, acceleration or deceleration of the navigation target can be indicated through the first visual element; and because the second movement speed of the first visual element is positively correlated with the absolute value of the difference between the first movement speed of the navigation target and the guidance speed, the degree of acceleration or deceleration can be indicated more intuitively, thereby improving the navigation effect.
  • for example, the user drives the vehicle and starts in the slow lane on the right.
  • the vehicle navigation device determines that the guidance speed corresponding to the slow lane on the right is 40 kilometers/hour, and the current first movement speed of the vehicle is 0; therefore, a forward-rolling arrow is displayed on the front windshield through the HUD, and the arrow's movement speed is 40 kilometers/hour (that is, the absolute value of the difference between the guidance speed and the first movement speed).
  • when the user sees the arrow rolling forward, he can determine that the current first movement speed is less than the guidance speed, so he steps on the accelerator so that the vehicle's movement speed gradually approaches the guidance speed; as the absolute value of the difference between the vehicle's movement speed and the guidance speed keeps decreasing, the speed at which the arrow rolls forward gradually decreases, until the vehicle's movement speed equals the guidance speed and the arrow stops moving or disappears.
  • the vehicle navigation device determines that the guidance speed is 80 kilometers/hour and the current first movement speed is 40 kilometers/hour. Since the current first movement speed is less than the guidance speed, a forward-rolling arrow 700 is displayed on the front windshield through the HUD, and the movement speed of the arrow 700 is 40 kilometers/hour (that is, the absolute value of the difference between the guidance speed and the first movement speed), as shown in FIG. 13.
  • when the user sees the forward-rolling arrow 700, he can determine that the current first movement speed is less than the guidance speed, so he steps on the accelerator so that the vehicle's movement speed gradually approaches the guidance speed; as the absolute value of the difference between the vehicle's movement speed and the guidance speed keeps decreasing, the speed at which the arrow 700 rolls forward also gradually decreases, until the vehicle's movement speed equals the guidance speed and the arrow 700 stops moving or disappears.
  • the vehicle navigation device may determine the first distance based on the first movement speed and a preset sensing reaction duration, and start displaying the arrow 700 when it is detected that the distance of the vehicle from the intersection is the first distance.
  • the first distance may be a product of the first movement speed and a preset sensory reaction duration.
  • the perception reaction duration can be the time required for the driver to react (a worked instance follows).
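A worked instance of this product, under an assumed 2.5-second perception reaction duration (the application does not fix a value):

```python
# Sketch: first distance = first movement speed x perception reaction
# duration; the arrow starts displaying when the vehicle is this far from
# the intersection. The 2.5 s reaction duration is an assumed value.

def first_distance_m(speed_kmh: float, reaction_s: float = 2.5) -> float:
    return (speed_kmh / 3.6) * reaction_s


print(first_distance_m(40.0))  # ~27.8 m before the intersection
```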
  • when the vehicle navigation device detects that the yellow light is currently on at the intersection, it determines, according to the lane in which the vehicle is located, that the guidance speed is 60 kilometers/hour.
  • the current first movement speed of the vehicle is 0, so a forward-rolling arrow is displayed on the front windshield through the HUD, and the arrow rolls forward at a speed of 60 kilometers/hour (that is, the absolute value of the difference between the guidance speed and the first movement speed).
  • when the user sees the arrow rolling forward, he can determine that the current first movement speed is less than the guidance speed, so he steps on the accelerator, causing the vehicle's movement speed to gradually increase; as the absolute value of the difference between the vehicle's movement speed and the guidance speed keeps decreasing, the speed at which the arrow rolls forward gradually decreases, until the vehicle's movement speed reaches 60 kilometers/hour and the arrow stops moving or disappears.
  • the vehicle-mounted navigation device may determine the guidance speed based on the vehicle's current movement speed, the current distance between the vehicle and the vehicle ahead, and a preset perception reaction duration. In some embodiments, the vehicle-mounted navigation device may determine the guidance speed as the absolute value of the difference obtained by subtracting the quotient of the inter-vehicle distance and the perception reaction duration from the vehicle's current movement speed (a sketch follows).
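The formula above is read here as the current speed minus the quotient of the inter-vehicle distance and the perception reaction duration, taken in absolute value; this is an interpretation of the translated text (it is the dimensionally consistent reading), not a confirmed formula, so treat the sketch as illustrative only.

```python
# Sketch of the lead-vehicle guidance speed under one plausible reading of
# the paragraph above: guidance = |current speed - inter-vehicle distance /
# perception reaction duration|. The 2.5 s duration is an assumed value.

def lead_vehicle_guidance_kmh(current_kmh: float, gap_m: float,
                              reaction_s: float = 2.5) -> float:
    closing_budget_kmh = (gap_m / reaction_s) * 3.6  # speed the gap allows
    return abs(current_kmh - closing_budget_kmh)


print(lead_vehicle_guidance_kmh(current_kmh=80.0, gap_m=30.0))  # ~36.8 km/h
```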
  • the vehicle navigation device may also display a backward-rolling arrow on the front windshield when it detects that the forward collision warning (FCW) system in the vehicle sends an early warning signal (for example, 2.7 seconds before a predicted collision).
  • when the user sees the backward-rolling arrow, he can determine that the current first movement speed is greater than the guidance speed, so he steps on the brake, causing the vehicle's movement speed to gradually decrease; as the absolute value of the difference between the vehicle's movement speed and the guidance speed keeps decreasing, the speed at which the arrow rolls backward gradually decreases, until the vehicle travels at the guidance speed and the arrow stops moving or disappears.
  • the first device may be the electronic device 100 in the navigation scene as shown in FIG. 3 or FIG. 4
  • the second device may be the first vehicle 310 in the navigation scene shown in FIG. 3, or the second vehicle 320 in the navigation scene shown in FIG. 4.
  • the first device is a device that performs navigation
  • the second device is a navigation target (that is, the device being navigated). That is, the first device performs navigation on the second device.
  • this method is not limited to the specific sequence shown in FIG. 15 and described below. It should be understood that in other embodiments, the order of some steps of the method can be exchanged according to actual needs, or some steps can be omitted or deleted.
  • the method includes the following steps:
  • the first device obtains the first movement speed of the navigation target, where the navigation target is a second device connected to the first device through a network.
  • the second device can obtain the first movement speed of the second device and send the first movement speed to the first device. It should be noted that the manner in which the second device obtains its first movement speed may be the same as or similar to the aforementioned manner in which the first device obtains the first movement speed of the first device, which will not be described again here.
  • the first device obtains the guidance speed corresponding to the navigation target.
  • the second device can obtain the motion environment information corresponding to the navigation target and send the motion environment information to the first device.
  • the first device can receive the motion environment information and determine the guidance speed based on it.
  • alternatively, the second device can obtain the motion environment information, determine the guidance speed based on it, and send the guidance speed to the first device, which receives the guidance speed sent by the second device (a minimal sketch of this exchange follows).
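The only protocol requirement visible here is that the second device periodically reports a speed (or environment information) to the first device over their network connection. A minimal sketch with invented JSON field names, using a local socket pair as a stand-in for the real link:

```python
# Minimal sketch of the second device reporting its first movement speed to
# the first device over a network connection. The JSON field name is
# invented for illustration; any transport (Wi-Fi, BT, cellular) would do.
import json
import socket

first_dev, second_dev = socket.socketpair()   # stand-in for a real network link

# second device side: measure and send the first movement speed
second_dev.sendall(json.dumps({"first_movement_speed_kmh": 40.0}).encode())

# first device side: receive the speed and go on to compute the display
report = json.loads(first_dev.recv(1024).decode())
print(report["first_movement_speed_kmh"])     # 40.0
```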
  • the first device may receive a guidance speed submitted by the user.
  • the first device may obtain the movement route and movement time limit submitted by the user, and determine the guidance speed based on the movement route and the movement time limit.
  • the first device may also determine a first speed range, and the guide speed may be a value within the first speed range.
  • the first device may also determine the guidance speed and/or the first speed range through other methods.
  • the first device may execute S1501 and S1502 one after another or may execute S1501 and S1502 at the same time.
  • the embodiment of the present application does not limit the order in which the first device executes S1501 and S1502.
  • the first device displays the moving first visual element based on the first movement speed and the guidance speed, where the first visual element is used to indicate that the navigation target should accelerate or decelerate, and the second movement speed of the first visual element is positively correlated with the absolute value of the difference between the first movement speed and the guidance speed.
  • the first device may also display a moving second visual element: if the first visual element is used to indicate that the navigation target should accelerate, the second visual element may be used to indicate that it should decelerate, and vice versa; the third movement speed of the second visual element may be positively correlated with the absolute value of the difference between the first movement speed and the guidance speed.
  • the manner in which the first device displays the moving first visual element or second visual element based on the first movement speed and the guidance speed may be the same as or similar to the manner in which the first device does so in S603, and will not be described again here.
  • the first device may display navigation information such as the first visual element and the second visual element on the first device itself.
  • the first device can also display the navigation information through the second device or the third device. For example, if the first device is the electronic device 100 in the navigation scene as shown in Figure 3 or Figure 4, the electronic device 100 can display the navigation information on the electronic device 100 itself or display the navigation information through AR glasses.
  • the navigation target is the second device.
  • the first device may obtain the first movement speed of the second device and the guidance speed corresponding to the second device, and display the moving first visual element on the first device based on the first movement speed and the guidance speed. Since the moving first visual element is more likely to attract the user's attention, the second device can be instructed to accelerate or decelerate through the first visual element; and because the second movement speed of the first visual element is positively correlated with the absolute value of the difference between the first movement speed of the second device and the guidance speed, the degree of acceleration or deceleration of the second device can be indicated more intuitively, improving the navigation effect.
  • embodiments of the present application also provide an electronic device.
  • the electronic device may be the first device mentioned above, including a memory and a processor.
  • the memory is used to store a computer program; the processor is used to execute the method described in the above method embodiments when calling the computer program.
  • the electronic device provided in this embodiment can execute the above method embodiments, and its implementation principles and technical effects are similar, and will not be described again here.
  • embodiments of the present application also provide a vehicle.
  • the vehicle includes the aforementioned first device.
  • the first device is a vehicle-mounted device in the vehicle.
  • the first device also includes a HUD.
  • the HUD is used to display the first visual element.
  • embodiments of the present application also provide a chip system.
  • the chip system includes a processor coupled to a memory, and the processor executes a computer program stored in the memory to implement the method described in the above method embodiments.
  • the chip system may be a single chip or a chip module composed of multiple chips.
  • Embodiments of the present application also provide a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, the method described in the above method embodiment is implemented.
  • An embodiment of the present application also provides a computer program product.
  • when the computer program product runs on an electronic device, the electronic device is caused to implement the method described in the above method embodiments.
  • if the above-mentioned integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • this application can implement all or part of the processes in the methods of the above embodiments by instructing relevant hardware through a computer program.
  • the computer program can be stored in a computer-readable storage medium.
  • when the computer program is executed by a processor, the steps of each of the above method embodiments may be implemented.
  • the computer program includes computer program code, which may be in the form of source code, object code, executable file or some intermediate form.
  • the computer-readable storage medium may at least include: any entity or device capable of carrying the computer program code to a camera/electronic device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), electrical carrier signals, telecommunication signals, and software distribution media, for example, a USB flash drive, a removable hard disk, a magnetic disk, or an optical disc.
  • in some jurisdictions, according to legislation and patent practice, computer-readable media may not include electrical carrier signals and telecommunication signals.
  • the disclosed apparatus/devices and methods can be implemented in other ways.
  • the apparatus/equipment embodiments described above are only illustrative.
  • the division into modules or units is only a logical functional division; in actual implementation there may be other division methods, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical, or in other forms.
  • the term "if" may be interpreted as "when", "once", "in response to determining", or "in response to detecting", depending on the context. Similarly, the phrase "if it is determined" or "if [the described condition or event] is detected" may be interpreted, depending on the context, as "once it is determined", "in response to determining", "once [the described condition or event] is detected", or "in response to detecting [the described condition or event]".

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Astronomy & Astrophysics (AREA)
  • Navigation (AREA)

Abstract

A navigation method and an electronic device (100), relating to the field of terminal technology. The navigation method includes: obtaining a first movement speed of a navigation target (S601); obtaining a guidance speed corresponding to the navigation target (S602); and displaying a moving first visual element based on the first movement speed and the guidance speed, where the first visual element is used to indicate that the navigation target should accelerate or decelerate, and a second movement speed of the first visual element is positively correlated with the absolute value of the difference between the first movement speed and the guidance speed (S603). The navigation method and the electronic device (100) can more intuitively indicate the degree to which the navigation target should accelerate or decelerate, improving the navigation effect.

Description

Navigation Method and Electronic Device
This application claims priority to Chinese patent application No. 202210342590.0, titled "Navigation Method and Electronic Device", filed with the China National Intellectual Property Administration on March 31, 2022, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of terminal technology, and in particular, to a navigation method and an electronic device.
Background
Travel is an extremely important human activity. Through travel, many purposes can be achieved, such as exploring unknown areas, transporting goods, and exercise and recreation, and navigation is a technology that ensures the convenience, safety, and comfort of travel. With the accumulation of human travel experience and the continuous development of science and technology, navigation technology has made great progress.
An electronic device can display the user's current movement speed together with a preset maximum or minimum speed, and the user then adjusts the movement speed according to that maximum or minimum speed so as to move below the maximum speed or above the minimum speed. However, this kind of reminder is easily overlooked by the user, and the navigation effect is poor.
Summary
In view of this, this application provides a navigation method and an electronic device that can more intuitively indicate the degree to which a navigation target should accelerate or decelerate, improving the navigation effect.
To achieve the above objective, in a first aspect, an embodiment of this application provides a navigation method, including:
obtaining a first movement speed of a navigation target;
obtaining a guidance speed corresponding to the navigation target;
displaying a moving first visual element based on the first movement speed and the guidance speed, where the first visual element is used to indicate that the navigation target should accelerate or decelerate, and a second movement speed of the first visual element is positively correlated with the absolute value of the difference between the first movement speed and the guidance speed.
The navigation target may be the device being navigated.
The guidance speed may be the movement speed recommended for the navigation target.
Visual elements can serve as tools and media for conveying information; they may include informational elements such as graphics, text, shapes, and forms, as well as formal elements such as points, lines, surfaces, colors, and spaces.
In this embodiment of the application, the first movement speed of the navigation target and the guidance speed corresponding to the navigation target can be obtained, and a moving first visual element can be displayed based on them. Since a moving first visual element attracts the user's attention more easily, the first visual element can indicate that the navigation target should accelerate or decelerate, and because the second movement speed of the first visual element is positively correlated with the absolute value of the difference between the first movement speed of the navigation target and the guidance speed, the degree of acceleration or deceleration can be indicated more intuitively, improving the navigation effect.
It should be noted that the navigation target and the device performing navigation may be the same device or different devices; for example, the device performing navigation may be a first device, and the navigation target may be the first device or a second device. In some embodiments, the second device may be a carrier.
It should also be noted that a moving first visual element can be understood as a first visual element whose position in the navigation interface changes. A moving first visual element attracts the user's attention more easily, thereby guiding the user to accelerate or decelerate in time and improving the navigation effect.
In some embodiments, the second movement speed of the first visual element may be equal to the absolute value of the difference between the first movement speed and the guidance speed.
In some embodiments, the reference object for the guidance speed and the reference object for the first movement speed may be the same.
In some embodiments, displaying a moving first visual element based on the first movement speed and the guidance speed includes:
if the first movement speed is less than the guidance speed, displaying the first visual element moving in a first direction;
if the first movement speed is greater than the guidance speed, displaying the first visual element moving in a second direction;
where the first direction and the second direction are different.
In some embodiments, the first direction and the second direction are parallel and opposite.
In some embodiments, the first direction is the direction from the bottom of the navigation interface toward the center of the navigation interface, the second direction is the direction from the center of the navigation interface toward the bottom of the navigation interface, and the navigation interface is the interface that includes the first visual element.
In some embodiments, the first direction may point from a position close to the user toward a position away from the user, so that the driver more intuitively perceives the first visual element as moving away from the carrier, which makes it easier for the driver to think of accelerating to catch up, reduces the driver's reaction time, and improves the reliability of assisted driving; the second direction may point from a position away from the user toward a position close to the user, so that the driver more intuitively perceives the first visual element as approaching the carrier, which makes it easier for the driver to think of decelerating and yielding and reduces the driver's reaction time. Of course, in practical applications, the first direction and the second direction may also be other directions.
In some embodiments, displaying a moving first visual element based on the first movement speed and the guidance speed includes:
when it is detected that the first movement speed is inconsistent with the guidance speed, displaying the moving first visual element;
when it is detected that the first movement speed is consistent with the guidance speed, stopping displaying the moving first visual element.
That is, the user is guided, as far as possible, to adjust his own or the carrier's movement speed in time to keep it consistent with the guidance speed, which can improve the accuracy of navigation.
In some embodiments, stopping displaying the moving first visual element when the first device detects that the first movement speed is consistent with the guidance speed may include: the first device immediately stops displaying the moving first visual element upon detecting that the first movement speed is consistent with the guidance speed; or the first device stops displaying the moving first visual element only when it detects that the first movement speed is consistent with the guidance speed and has remained consistent for longer than a preset second duration threshold, thereby extending the time for which the moving first visual element is displayed, guiding the user to adjust his own or the carrier's movement speed for a longer period, and reducing the deviation between that movement speed and the guidance speed.
In some embodiments, displaying a moving first visual element based on the first movement speed and the guidance speed includes:
when it is detected that the first movement speed is not within a first speed range, displaying the moving first visual element;
when it is detected that the first movement speed is within the first speed range, stopping displaying the moving first visual element;
where the guidance speed is a value included in the first speed range.
That is, it suffices to guide the user to adjust his own or the carrier's movement speed toward the first speed range. This speed guidance method allows the user to control his own or the carrier's movement speed more freely and flexibly, improving the flexibility of navigation and the user experience.
In some embodiments, displaying a moving first visual element based on the first movement speed and the guidance speed includes:
when it is detected that the first movement speed is not within a first speed range, displaying the moving first visual element;
when it is detected that the first movement speed is consistent with the guidance speed, stopping displaying the moving first visual element;
where the guidance speed is a value included in the first speed range.
That is, when the difference between the first movement speed of the navigation target and the recommended guidance speed is large, the moving first visual element starts to be displayed to guide the user to adjust his own or the carrier's movement speed until that movement speed is consistent with the guidance speed.
In some embodiments, stopping displaying the moving first visual element when the first device detects that the first movement speed is within the first speed range may include: the first device immediately stops displaying the moving first visual element upon detecting that the first movement speed is within the first speed range, thereby shortening the display time and allowing the user to control his own or the carrier's movement speed more freely and flexibly; or the first device stops displaying the moving first visual element only when it detects that the first movement speed is within the first speed range and has remained within it for longer than a preset first duration threshold, thereby extending the display time, guiding the user to adjust his own or the carrier's movement speed for longer, and reducing the deviation between that movement speed and the guidance speed or the first speed range.
In some embodiments, the first visual element includes an arrow or a light wave.
In some embodiments, the method further includes:
displaying a moving second visual element based on the first movement speed and the guidance speed, where one of the first visual element and the second visual element is used to indicate that the navigation target should accelerate and the other is used to indicate that the navigation target should decelerate, and a third movement speed of the second visual element is positively correlated with the absolute value of the difference between the first movement speed and the guidance speed.
In some embodiments, the first visual element is used to indicate that the navigation target should accelerate and the second visual element is used to indicate that the navigation target should decelerate; displaying a moving first visual element based on the first movement speed and the guidance speed includes:
if the first movement speed is less than the guidance speed, displaying the first visual element moving in a first direction;
and displaying a moving second visual element based on the first movement speed and the guidance speed includes:
if the first movement speed is greater than the guidance speed, displaying the second visual element moving in a second direction;
where the first direction and the second direction are different.
That is, by displaying first and second visual elements of different styles, acceleration and deceleration of the navigation device can be indicated more intuitively and separately, making it easier for the user to react differently to the first visual element and the second visual element, further improving navigation accuracy and the user experience.
It should be noted that the first visual element may instead be used to indicate that the navigation target should decelerate, and the second visual element to indicate that it should accelerate.
It should also be noted that, in practical applications, when the first visual element indicates deceleration and the second visual element indicates acceleration: if the first movement speed is less than the guidance speed, the first device displays the second visual element moving in the first direction, and if the first movement speed is greater than the guidance speed, the first device displays the first visual element moving in the second direction.
In some embodiments, stopping displaying the moving first visual element may include displaying a stationary first visual element or hiding the first visual element; similarly, stopping displaying the moving second visual element may include displaying a stationary second visual element or hiding the second visual element.
In some embodiments, obtaining a guidance speed corresponding to the navigation target includes:
obtaining motion environment information corresponding to the navigation target, where the motion environment information is used to indicate the environment in which the navigation target is located;
determining the guidance speed based on the motion environment information.
In some embodiments, the first device may retrieve, based on the determined motion environment information, the stored guidance speed corresponding to that motion environment information.
In some embodiments, the first device may input the motion environment information into a stored first machine learning model and obtain the guidance speed output by the first machine learning model.
In some embodiments, the first device may obtain a movement route and a movement time limit submitted by the user and determine the guidance speed based on them. Of course, in practical applications, the first device may also determine the guidance speed in other ways.
In some embodiments, the motion environment information includes at least one of location information, weather information, and obstacle information.
Location information can be used to indicate where the navigation target is located. In some embodiments, the location information may include the location coordinates of the navigation target, such as longitude and latitude. In some embodiments, the location information may include at least one of the road identifier, road segment identifier, and road segment type where the navigation target is located; the road segment type may be determined by professionals such as the transportation department and may include, for example, straight lanes, left-turn lanes, right-turn lanes, main roads, secondary roads, auxiliary roads, directional ramps, semi-directional ramps, general ramps, collector lanes, and express lanes. In some embodiments, the location information may include height above the ground. Of course, in practical applications, the location information may also include other information that can indicate the location of the navigation target.
Weather information can be used to indicate the weather in the area where the navigation target is located. In some embodiments, the weather information may include information such as haze, rain, snow, and visibility. Of course, in practical applications, it may also include other information that can indicate the weather.
Obstacle information can be used to indicate information such as the position and state of obstacles within a first preset range of the navigation target, where an obstacle may be an object that hinders the passage of the navigation target, such as a wall, guardrail, pedestrian, or carrier. In some embodiments, the obstacle information may include one or more of the type of the obstacle, the orientation of the obstacle, the distance of the obstacle from the navigation target, and the moving speed of the obstacle. Of course, in practical applications, it may also include other information that can indicate an obstacle's position or state.
In a second aspect, an embodiment of this application provides a navigation apparatus, which is included in an electronic device and has the function of implementing the method of any one of the above first aspect. The function may be implemented by hardware, or by hardware executing corresponding software; the hardware or software includes one or more modules or units corresponding to the above function, for example, a transceiver module or unit, a processing module or unit, or an obtaining module or unit.
In a third aspect, an embodiment of this application provides an electronic device, including a memory and a processor, where the memory is used to store a computer program and the processor is used to execute the method of any one of the above first aspect when calling the computer program.
In some embodiments, the electronic device may be carried on the user, or may be placed on or integrated into a carrier; the electronic device and the user or the carrier are then in the same place at the same time, and when the user or the carrier moves, the electronic device is carried along, so the electronic device and the user or carrier share the same motion state. In some embodiments, the electronic device being integrated into the carrier may mean that there is both a mechanical connection and a communication connection between them; in that case the electronic device may be regarded as part of the carrier, or the electronic device and the carrier may be regarded as the same device. In other embodiments, the electronic device being placed on the carrier may mean that there is a mechanical connection between them, for example, the carrier includes a bracket for holding the electronic device and the electronic device is mounted on that bracket; or it may mean that there is no mechanical connection between them and they merely remain relatively stationary, for example, the electronic device is placed on some surface of the carrier.
In some embodiments, regardless of the positional and connection relationship between the electronic device and the carrier, the electronic device may have a communication connection with the carrier. In some embodiments, the communication connection may be based on a short-range communication technology.
In a fourth aspect, an embodiment of this application provides a vehicle, which includes the electronic device of any one of the above first aspect; the electronic device is a vehicle-mounted device in the vehicle, and the electronic device further includes a head up display (HUD);
the HUD is used to display the first visual element.
The HUD may also be called a head-up display. The HUD can be installed in a carrier such as a vehicle and is used to project an image onto the windshield in front of the driver so that the driver can see the projected image while looking straight ahead. The HUD may include an image generation unit and an optical display system. The image generation unit may include a light source, optical films, and other optical components for generating an image. The optical display system may include a reflector, a control unit, and the front windshield; the reflector and the front windshield are fitted together to eliminate picture distortion, and the control unit can take in the various information to be displayed, such as navigation information, so that the image unit can generate an image based on it. In some embodiments, the image displayed by the head-up display on the front windshield can be superimposed on the outside scene beyond the front windshield.
It should be noted that, in practical applications, the electronic device may alternatively be placed in the vehicle.
In a fifth aspect, an embodiment of this application provides a chip system, which includes a processor coupled to a memory; the processor executes a computer program stored in the memory to implement the method of any one of the above first aspect.
The chip system may be a single chip or a chip module composed of multiple chips.
In a sixth aspect, an embodiment of this application provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the method of any one of the above first aspect is implemented.
In a seventh aspect, an embodiment of this application provides a computer program product which, when run on an electronic device, causes the electronic device to execute the method of any one of the above first aspect.
It can be understood that, for the beneficial effects of the above second to seventh aspects, reference may be made to the relevant description in the first aspect, and details are not repeated here.
Brief Description of the Drawings
FIG. 1 is a schematic structural diagram of an electronic device provided by an embodiment of this application;
FIG. 2 is a schematic diagram of the display principle of a HUD provided by an embodiment of this application;
FIG. 3 is a schematic diagram of a navigation scene provided by an embodiment of this application;
FIG. 4 is a schematic diagram of another navigation scene provided by an embodiment of this application;
FIG. 5 is a schematic diagram of a navigation scene provided by an embodiment of this application;
FIG. 6 is a schematic flowchart of a navigation method provided by an embodiment of this application;
FIG. 7 is a schematic diagram of a navigation interface provided by an embodiment of this application;
FIG. 8 is a schematic diagram of another navigation interface provided by an embodiment of this application;
FIG. 9 is a schematic diagram of another navigation interface provided by an embodiment of this application;
FIG. 10 is a schematic diagram of another navigation interface provided by an embodiment of this application;
FIG. 11 is a schematic diagram of another navigation interface provided by an embodiment of this application;
FIG. 12 is a schematic diagram of another navigation interface provided by an embodiment of this application;
FIG. 13 is a schematic diagram of another navigation interface provided by an embodiment of this application;
FIG. 14 is a schematic diagram of another navigation interface provided by an embodiment of this application;
FIG. 15 is a schematic flowchart of another navigation method provided by an embodiment of this application.
Detailed Description
The navigation method provided by the embodiments of this application can be applied to electronic devices used for navigation or guidance in maritime, aviation, astronomy, hydrology, land transportation, sports and fitness, and other fields, such as mobile phones, tablet computers, wearable devices, vehicle-mounted devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, personal digital assistants (PDA), treadmills, rowing machines, and exercise bikes. The embodiments of this application do not impose any restrictions on the specific type of electronic device.
Referring to FIG. 1, a schematic structural diagram of an electronic device 100 provided by this application. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, an antenna 1, a communication module 150, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, and a display 194. The sensor module 180 may include a pressure sensor, a gyroscope sensor, a barometric pressure sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a touch sensor, an ambient light sensor, and the like.
It can be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 may include more or fewer components than shown, combine some components, split some components, or arrange components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), and/or a baseband processor. Different processing units may be independent devices or may be integrated in one or more processors.
The controller can be the nerve center and command center of the electronic device 100. The controller can generate operation control signals according to instruction operation codes and timing signals, completing the control of instruction fetching and execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache, which can hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory, which reduces repeated accesses and the waiting time of the processor 110, thereby improving system efficiency.
In some embodiments, the processor 110 may include one or more interfaces, such as a controller area network (CAN) interface, an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface.
The CAN interface is a standard field bus used in automotive computer control systems and embedded industrial control local area networks; the I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a serial clock line (SCL); the I2S interface and the PCM interface can be used for audio communication; the UART interface is a universal serial data bus used for asynchronous communication; the MIPI interface can be used to connect the processor 110 with peripheral components such as the display 194 and the camera 193, and includes interfaces such as a camera serial interface (CSI) and a display serial interface (DSI).
The above interfaces can be used to couple the multiple components included in the electronic device 100 so that these components can communicate with one another. For example, the processor 110 can couple the touch sensor through the I2C interface so that the processor 110 and the touch sensor communicate through the I2C bus interface, implementing the touch function of the electronic device 100; the audio module 170 can pass audio signals to the communication module 150 through the I2S or PCM interface, implementing the function of playing audio through a Bluetooth headset; the processor 110 and the camera 193 communicate through the CSI interface, implementing the shooting function of the electronic device 100; the processor 110 and the display 194 communicate through the DSI interface, implementing the display function of the electronic device 100; and the USB interface 130 can be used to connect a charger to charge the electronic device 100, to transfer data between the electronic device 100 and peripherals such as a mouse, keyboard, or game controller, to connect headphones and play audio through them, and also to connect other electronic devices such as AR devices.
It can be understood that the interface connection relationships between the modules illustrated in this embodiment are merely illustrative and do not constitute a structural limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 may also use interface connection methods different from those in the above embodiments, or a combination of multiple interface connection methods.
The wireless communication function of the electronic device 100 can be implemented through the antenna 1, the communication module 150, and the like.
The antenna 1 is used to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 100 can be used to cover a single communication frequency band or multiple communication frequency bands, and different antennas can also be multiplexed to improve antenna utilization.
The communication module 150 can provide wireless communication solutions applied to the electronic device 100, including 2G/3G/4G/5G, wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR). The communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like. In some embodiments, at least some functional modules of the communication module 150 may be provided in the processor 110, or may be provided in the same device as at least some modules of the processor 110.
In some embodiments, the antenna 1 of the electronic device 100 is coupled with the communication module 150 so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies, which may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The electronic device 100 implements the display function through the GPU, the display 194, and the application processor. The GPU is a microprocessor for image processing, connecting the display 194 and the application processor; it performs mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information. The display 194 is used to display images, videos, and the like.
The electronic device 100 can implement the shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, and the application processor. The camera 193 is used to capture still images or video, the ISP processes the data fed back by the camera 193, the digital signal processor processes digital signals such as digital image signals, and the video codec compresses or decompresses digital video.
In some embodiments, the display 194 may include a HUD, also called a head-up display. The HUD can be installed in a carrier such as a vehicle and is used to project an image onto the windshield in front of the driver so that the driver can see the projected image while looking straight ahead. The HUD may include an image generation unit and an optical display system. The image generation unit may include a light source, optical films, and other optical components for generating an image. The optical display system may include a reflector, a control unit, and the front windshield; the reflector and the front windshield are fitted together to eliminate picture distortion, and the control unit can take in the various information to be displayed, such as navigation information, so that the image unit can generate an image based on it. In some embodiments, the image displayed by the head-up display on the front windshield can be superimposed on the outside scene beyond the front windshield. In some embodiments, as shown in FIG. 2, the HUD may include a light source 210, an aspheric mirror 220, and a front windshield 230: the light source 210 projects the image onto the aspheric mirror 220, the aspheric mirror 220 reflects the image onto the front windshield 230, and the image finally projected on the front windshield 230 is superimposed on the outside scene the driver sees beyond the front windshield 230, so that the driver sees a scene combining the virtual and the real.
In some embodiments, the display 194 may include a head mounted display (HMD). The HMD can be worn on the driver's head and includes a helmet tracker, an image processing unit, and an optical display system. The helmet tracker can be used to track the driver's head position and gaze angle; the image processing unit can be used to generate the image to be displayed in accordance with that head position and gaze angle; and the optical display system may include lenses, goggles, a visor, and the like, on which the image can be displayed. In some embodiments, the electronic device 100 can determine the driver's head position and gaze angle via the helmet tracker, produce a matching image through the image processing unit, and display that image through the optical display system.
In some embodiments, the display 194 may include a multi function display (MFD), also called a head-down display. The MFD can be installed within the driver's lower field of view in the carrier, and the driver can view the displayed information by looking down. The MFD may include cathode-ray-tube displays and flat-panel displays.
The external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, saving files such as music and videos on the external memory card.
The internal memory 121 can be used to store computer-executable program code, which includes instructions. The processor 110 executes the various functional applications and data processing of the electronic device 100 by running the instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area; the program storage area can store the operating system and the application programs required by at least one function (such as a navigation function or a game function), and the data storage area can store data created during the use of the electronic device 100. In addition, the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or universal flash storage (UFS).
The electronic device 100 can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, and the application processor.
The audio module 170 is used to convert digital audio information into an analog audio signal output, and also to convert an analog audio input into a digital audio signal; it can also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110, or some of its functional modules may be provided in the processor 110.
The speaker 170A, also called the "horn", is used to convert audio electrical signals into sound signals. The electronic device 100 can play music or hands-free calls through the speaker 170A.
The receiver 170B, also called the "earpiece", is used to convert audio electrical signals into sound signals. When the electronic device 100 answers a call or voice message, the voice can be heard by bringing the receiver 170B close to the ear.
The microphone 170C, also called the "mic" or "mouthpiece", is used to convert sound signals into electrical signals.
The headset jack 170D is used to connect wired headsets.
The pressure sensor is used to sense pressure signals and can convert pressure signals into electrical signals. In some embodiments, the pressure sensor may be provided on the display 194. There are many types of pressure sensors, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates with conductive material; when a force acts on the pressure sensor, the capacitance between the electrodes changes, and the electronic device 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor, and can also calculate the touch position from the sensor's detection signal. In some embodiments, touch operations acting on the same touch position but with different intensities can correspond to different operation instructions. For example, when a touch operation with an intensity below a first pressure threshold acts on an acceleration icon or a deceleration icon, the carrier accelerates or decelerates; when a touch operation with an intensity greater than or equal to the first pressure threshold acts on the acceleration icon or the deceleration icon, the acceleration and deceleration icons can be locked or unlocked, where, if the icons are locked, the carrier keeps traveling at its current speed, neither accelerating nor decelerating, and if they are unlocked, the driver can control the carrier's acceleration or deceleration through touch operations with intensities below the first pressure threshold (a sketch of this two-level logic follows).
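A sketch of the two-level pressure logic just described; the threshold value and the class interface are invented for illustration:

```python
# Sketch of the two-level touch-pressure logic described above: a press on
# the accelerate/decelerate icon below the first pressure threshold adjusts
# speed; a press at or above the threshold toggles the icons' locked state,
# and while locked the carrier holds its current speed. The 0.6 threshold
# is an illustrative value.
FIRST_PRESSURE_THRESHOLD = 0.6


class SpeedIcons:
    def __init__(self):
        self.locked = False

    def on_touch(self, icon: str, pressure: float) -> str:
        if pressure >= FIRST_PRESSURE_THRESHOLD:
            self.locked = not self.locked
            return "icons locked" if self.locked else "icons unlocked"
        if self.locked:
            return "holding current speed"   # neither accelerate nor decelerate
        return f"{icon} command issued"       # accelerate or decelerate


icons = SpeedIcons()
print(icons.on_touch("accelerate", 0.3))  # accelerate command issued
print(icons.on_touch("accelerate", 0.9))  # icons locked
print(icons.on_touch("decelerate", 0.3))  # holding current speed
```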
The gyroscope sensor can be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocities of the electronic device 100 around three axes (i.e., the x, y, and z axes) can be determined through the gyroscope sensor, which can be used for navigation and motion-sensing game scenarios.
The barometric pressure sensor is used to measure air pressure. In some embodiments, the electronic device 100 calculates altitude from the pressure value measured by the barometric pressure sensor to assist positioning and navigation.
The acceleration sensor can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three axes), and can detect the magnitude and direction of gravity when the electronic device 100 is stationary.
The distance sensor is used to measure distance. The electronic device 100 can measure distance by infrared or laser. In some embodiments, the electronic device 100 can use the distance sensor to measure the distance between the carrier and other carriers, such as the distance between a vehicle and the vehicle in front of or behind it.
The proximity light sensor may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The LED may be an infrared LED. The electronic device 100 emits infrared light outward through the LED and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 can determine that there is no object nearby. In some embodiments, the electronic device 100 can use the proximity light sensor to determine whether there are obstacles such as walls around it.
The ambient light sensor is used to sense the brightness of ambient light. The electronic device 100 can adaptively adjust the brightness of the display 194 according to the perceived ambient brightness. In some embodiments, the electronic device 100 can judge the brightness of its environment through the ambient light sensor, in order to adjust, or remind the user to adjust, the movement speed, or to turn lights on or off, according to that brightness.
The fingerprint sensor is used to collect fingerprints. The electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, such as unlocking the carrier's doors, starting the carrier, or unlocking the display 194.
The touch sensor is also called a "touch panel". The touch sensor can be provided on the display 194, and the touch sensor and the display 194 form a touch screen, also called a "touch-control screen". The touch sensor is used to detect touch operations acting on or near it and can pass the detected touch operation to the application processor to determine the type of touch event; visual output related to the touch operation can be provided through the display 194. In other embodiments, the touch sensor may also be provided on the surface of the electronic device 100 at a position different from that of the display 194.
The keys 190 include a power key, volume keys, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 can receive key input and generate key signal input related to user settings and function control of the electronic device 100.
The motor 191 can generate vibration prompts. The motor 191 can be used for incoming-call vibration prompts as well as touch vibration feedback. Touch operations acting on different areas of the display 194 can correspond to different vibration feedback effects of the motor 191, and different application scenarios (for example, carrier acceleration, deceleration, collision, and damage) can also correspond to different vibration feedback effects. Touch vibration feedback effects can also be customized.
The indicator 192 may be an indicator light and can be used to indicate the status of the electronic device 100, such as movement speed, changes in battery level or fuel level, and whether the seat belt is fastened correctly, and can also be used to indicate messages, missed calls, notifications, and the like.
In some embodiments, the electronic device 100 may further include a drive system that can be used to drive the electronic device 100 to move. The drive system may include motion structures such as wheels, propellers, thrusters, and robotic arms. In some embodiments, the drive system may further include transmission structures connected to the motion structures, such as drive shafts, gears, hinges, tracks, and linkages. In some embodiments, the drive system further includes power structures for generating or receiving mechanical energy, such as pedals, levers, or engines; a power structure can be connected to a motion structure through a transmission structure so that mechanical energy can be transferred to the motion structure.
In some embodiments, the electronic device 100 may be carried on the user, or may be placed on or integrated into a carrier; the electronic device 100 and the user or the carrier are then in the same place at the same time, and when the user or the carrier moves, the electronic device 100 is carried along, so the electronic device 100 and the user or carrier share the same motion state. In some embodiments, the electronic device 100 being integrated into the carrier may mean that there is both a mechanical connection and a communication connection between them (for example, the electronic device 100 can communicate with other components of the carrier through interfaces such as CAN in the carrier); in that case the electronic device 100 may be regarded as part of the carrier, or the electronic device 100 and the carrier may be regarded as the same device. In other embodiments, the electronic device 100 being placed on the carrier may mean that there is a mechanical connection between them, for example, the carrier includes a bracket for holding the electronic device 100 and the electronic device 100 is mounted on that bracket; or it may mean that there is no mechanical connection between them and they merely remain relatively stationary, for example, the electronic device 100 is placed on some surface of the carrier.
In some embodiments, regardless of the positional and connection relationship between the electronic device 100 and the carrier, the electronic device 100 may have a communication connection with the carrier. In some embodiments, the communication connection may be based on a short-range communication technology such as WIFI.
To facilitate understanding of the technical solutions in the embodiments of the present application, several navigation scenarios are first introduced below.
Referring to FIG. 3, a schematic diagram of a navigation scenario provided by an embodiment of the present application: this scenario is navigation during on-site driving, and includes an electronic device 100 and a first vehicle 310. As shown in FIG. 3, the first vehicle 310 is a car and the electronic device 100 is a mobile phone placed in the car; it should be understood, however, that in practical applications the first vehicle 310 may be any other vehicle and the electronic device 100 may be another type of device.
The first vehicle 310 may include a space for accommodating a driver, and may also include a space for accommodating the electronic device 100. The electronic device 100 may be placed or integrated in the first vehicle 310; therefore, the electronic device 100 and the first vehicle 310 are co-located in time and space and have the same motion state. If the electronic device 100 is integrated in the first vehicle 310, the electronic device 100 may be regarded as an in-vehicle device built into the first vehicle 310. At least one of the electronic device 100 and the first vehicle 310 may include the aforementioned sensor module 180, so that one or more kinds of information required for navigation, such as the first vehicle 310's current motion state, position, and external environment, can be perceived through the sensor module 180. At least one of the electronic device 100 and the first vehicle 310 may include the aforementioned display 194, used to display a navigation interface including navigation information such as the movement route and movement speed. In some embodiments, the display 194 may be a HUD, HMD, or MFD.
The driver may be located inside or outside the first vehicle 310, and drives the first vehicle 310 on-site in real time through components such as the steering wheel, buttons, rudder, levers, accelerator, and brake. The electronic device 100 obtains one or more kinds of information required for navigation from the electronic device 100 itself or from the first vehicle 310, determines navigation information based on this information, and then displays the navigation interface on the display 194 included in the electronic device 100 or in the first vehicle 310. In some embodiments, if the navigation interface is displayed through a HUD, the navigation interface can be projected onto the front windshield based on the HUD shown in FIG. 2, to provide a more intuitive and realistic display effect. When the driver sees the navigation interface, the driver can adjust the motion state of the first vehicle 310, for example by accelerating, decelerating, or steering, with reference to the navigation information included in the navigation interface.
Referring to FIG. 4, a schematic diagram of another navigation scenario provided by an embodiment of the present application: this scenario is navigation under remote driving, and includes an electronic device 100 and a second vehicle 320, which may be connected to each other via a network. As shown in FIG. 4, the second vehicle 320 is a drone; it should be understood, however, that in practical applications the second vehicle 320 may be any other remotely drivable vehicle, such as a driverless car or an unmanned submersible. The electronic device 100 may be used to remotely drive the second vehicle 320, and may also be another type of device.
The second vehicle 320 may include the aforementioned sensor module 180, so that one or more kinds of information required for navigation, such as the second vehicle 320's current motion state, position, and external environment, can be obtained. The electronic device 100 may include the aforementioned display 194, used to display a navigation interface including navigation information such as the movement route and movement speed.
The driver may, at one location, trigger the electronic device 100 to generate various control instructions through the buttons or touch panel included on the electronic device 100. The electronic device 100 sends these control instructions over the network connection to the second vehicle 320 at another location, and the second vehicle 320 can receive these control instructions and move based on them. The second vehicle 320 can obtain, through the sensor module 180, one or more kinds of information required for navigation, such as its current motion state, position, and external environment, and send this information to the electronic device 100. The electronic device 100 can determine navigation information based on this information and display the navigation interface on its display 194, so that the driver can adjust the motion state of the second vehicle 320 based on the navigation information included in the navigation interface displayed by the electronic device 100.
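Purely as an illustration of the command-and-telemetry exchange just described — the message fields, transport, addresses, and loop rate below are all invented for the sketch and are not specified by the original text:

```python
# Schematic remote-driving loop: the device sends control commands and
# the vehicle returns telemetry used to build the navigation interface.
# Message fields, transport and rates are illustrative assumptions.
import json
import socket
import time

VEHICLE_ADDR = ("192.0.2.10", 9000)  # placeholder address

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(0.1)

def send_command(throttle: float, steering: float) -> None:
    msg = {"type": "cmd", "throttle": throttle, "steering": steering}
    sock.sendto(json.dumps(msg).encode(), VEHICLE_ADDR)

def poll_telemetry() -> dict | None:
    try:
        data, _ = sock.recvfrom(4096)
        return json.loads(data)  # e.g. {"speed": 12.3, "lat": ..., "lon": ...}
    except socket.timeout:
        return None

while True:
    send_command(throttle=0.3, steering=0.0)
    telemetry = poll_telemetry()
    if telemetry is not None:
        print("vehicle speed:", telemetry.get("speed"))  # feeds the nav UI
    time.sleep(0.05)  # ~20 Hz control loop
```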
Referring to FIG. 5, a schematic diagram of another navigation scenario provided by an embodiment of the present application: this scenario is navigation during user exercise, and includes a user. The user may carry or wear the electronic device 100; as shown in FIG. 5, the user carries a mobile phone 510, wears AR glasses 520, and exercises outdoors or on indoor exercise equipment such as a treadmill.
In some embodiments, a single electronic device 100 can navigate for the user. Taking the mobile phone 510 as an example, the mobile phone 510 can obtain one or more kinds of information required for navigation through its own sensor module 180, determine navigation information based on this information, and display a navigation interface including that navigation information through its own display 194.
In some other embodiments, multiple electronic devices 100 can cooperate to navigate for the user. Taking the mobile phone 510 and the AR glasses 520 as an example, the mobile phone 510 can obtain one or more kinds of information required for navigation from the sensor module 180 included in at least one of the mobile phone 510 and the AR glasses 520, determine navigation information based on this information, and display a navigation interface including that navigation information on at least one of the mobile phone 510 and the AR glasses 520.
The user can adjust his or her own motion state based on the navigation information included in the navigation interface displayed by the mobile phone 510 or the AR glasses 520.
Next, the main technical terms used in the embodiments of the present application are explained.
A user may be an object capable of perceiving the navigation information obtained by the navigation method. The user may include a human. In some embodiments, the user may include animals other than humans, such as other primates. In some embodiments, the user may include a bionic robot. In some embodiments, the user may be equivalent to the driver.
A vehicle may include means of transport such as cars, boats, submersibles, airplanes, aircraft, and spacecraft, and may also include other apparatuses, such as robots, that are capable of moving and of accelerating or decelerating.
A navigation interface may be an interface that includes one or more kinds of navigation information. The navigation interface may be displayed on the display of an electronic device, and it should be understood that, depending on characteristics such as the resolution and size of the display, the same navigation interface may be rendered in different styles on different displays.
A navigation target may be the device being navigated, such as a vehicle.
A guide speed is the movement speed recommended for the navigation target.
The technical solutions of the present application are described in detail below with specific embodiments, in combination with the above navigation scenarios. It should be noted that in the following embodiments, the first device may be the device performing navigation, may be the navigation target, and may be the device displaying the navigation interface; the second device may be the navigation target and may be the device displaying the navigation interface; the third device may be the device displaying the navigation interface. It should also be noted that speed in the embodiments of the present application does not include direction; that is, it can be equated with speed magnitude. It should further be noted that the navigation scenarios mentioned in the embodiments of the present application may be real navigation scenarios, or virtual or simulated ones; for example, the above navigation scenarios may all be virtual navigation scenarios in a video game, and the user, the vehicle, and the various terminal devices may be virtual objects in that game. The following specific embodiments may be combined with one another, and the same or similar concepts or processes may not be repeated in some embodiments.
Referring to FIG. 6, a flowchart of a navigation method provided by an embodiment of the present application. The first device may be the electronic device 100 in the navigation scenario shown in FIG. 3, or the electronic device 100 in the navigation scenario shown in FIG. 5, such as the mobile phone 510 or the AR glasses 520. In this embodiment, the first device is both the device performing navigation and the navigation target (i.e., the device being navigated); that is, the first device navigates for itself. It should be noted that the method is not limited to the specific order shown in FIG. 6 and described below; it should be understood that in other embodiments, the order of some of the steps may be interchanged according to actual needs, or some of the steps may be omitted or deleted. The method includes the following steps:
S601: The first device obtains a first movement speed of the navigation target, where the navigation target is the first device.
The first movement speed may be the speed of the navigation target relative to the ground, or relative to a reference object other than the ground.
In some embodiments, the first device may obtain the first movement speed through sensors for detecting movement speed, such as a speed sensor or an acceleration sensor. In other embodiments, the first device may detect the first movement speed via satellites or base stations. Of course, in practical applications the first device may also obtain the first movement speed in other ways; the embodiments of the present application do not limit the manner in which the first device obtains the first movement speed.
It should be noted that the first device may directly invoke hardware components such as sensors at the hardware layer to obtain the first movement speed, or may obtain the first movement speed from an application at the software layer. For example, the first device includes a map application that provides a movement speed interface for the current movement speed; the first device then obtains the first movement speed by calling that movement speed interface.
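As a rough illustration of the two acquisition paths above — the interface names below (map_app.get_current_speed, speed_sensor.read) are hypothetical, not APIs from the original text:

```python
# Sketch of the two acquisition paths described above: prefer the
# software-layer map application's speed interface, fall back to a
# hardware-layer sensor read. All names are illustrative.
def get_first_movement_speed(map_app, speed_sensor) -> float:
    """Return the navigation target's first movement speed in m/s."""
    if map_app is not None:
        # Software layer: query the map application's speed interface.
        return map_app.get_current_speed()
    # Hardware layer: read the speed sensor directly.
    return speed_sensor.read()
```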
S602: The first device obtains a guide speed corresponding to the navigation target.
In some embodiments, the reference object of the guide speed may be the same as that of the first movement speed.
In some embodiments, the first device may obtain movement environment information corresponding to the navigation target, and determine the guide speed based on that movement environment information. The movement environment information is used to indicate the environment in which the navigation target is located. In some embodiments, the movement environment information may include at least one of position information, weather information, and obstacle information.
The position information may be used to indicate the position of the navigation target. In some embodiments, the position information may include the position coordinates of the navigation target, such as longitude and latitude. In some embodiments, the position information may include at least one of the road identifier, road-segment identifier, and road-segment type of the navigation target's location, where the road-segment type may be determined by professionals such as traffic authorities; for example, road-segment types may include straight-through lanes, left-turn lanes, right-turn lanes, arterial roads, secondary roads, service roads, directional ramps, semi-directional ramps, general ramps, collector-distributor lanes, expressway lanes, and so on. In some embodiments, the position information may include height above ground. Of course, in practical applications the position information may also include other information capable of indicating the position of the navigation target.
In some embodiments, the first device may determine the position information through satellite positioning, base-station positioning, or the like. Alternatively, the first device may obtain images of the navigation target's surroundings and extract position information such as place names, road signs, road identifiers, road-segment identifiers, and road-segment types from the images.
The weather information may be used to indicate the weather in the area where the navigation target is located. In some embodiments, the weather information may include information such as smog, rain, snow, and visibility. Of course, in practical applications the weather information may also include other information capable of indicating the weather.
In some embodiments, the first device may, based on the obtained position information, obtain from the network the weather information corresponding to that position information. Alternatively, the first device may obtain images of the navigation target's surroundings and recognize the corresponding weather information from the images.
The obstacle information may be used to indicate information such as the position and state of obstacles within a first preset range of the navigation target, where an obstacle may be an object that impedes the passage of the navigation target, such as a wall, guardrail, pedestrian, or vehicle. In some embodiments, the obstacle information may include one or more of the obstacle's type, the obstacle's bearing, the distance between the obstacle and the navigation target, and the obstacle's movement speed. Of course, in practical applications the obstacle information may also include other information capable of indicating the position or state of an obstacle.
The first preset range may be determined in advance by the first device; the embodiments of the present application do not limit the size of the first preset range.
In some embodiments, the first device may detect, through sensors such as radar and distance sensors, whether there are obstacles around the navigation target, and if so, further determine information such as the obstacle's type, bearing, distance from the navigation target, and movement speed. Alternatively, the first device may obtain images of the navigation target's surroundings and recognize the obstacle information from the images.
It should be noted that in practical applications the first device may also obtain the movement environment information in other ways; the embodiments of the present application do not limit the manner in which the first device obtains the movement environment information.
When the first device has obtained the movement environment information, it can determine, based on that information, a guide speed that matches the environment in which the navigation target is located.
In some embodiments, the first device may store multiple kinds of movement environment information and the guide speed corresponding to each kind; accordingly, the first device may, based on the determined movement environment information, retrieve the stored guide speed corresponding to that movement environment information.
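A stored mapping of this kind can be as simple as a keyed table. The sketch below is an illustration in the spirit of Table 1 further down; the keys, the base speed V1, and the multipliers are made-up placeholders:

```python
# Minimal sketch of a stored environment-to-guide-speed mapping.
# The keys, V1 and the multipliers are illustrative placeholders.
V1 = 60.0  # km/h, a preset base speed

GUIDE_SPEED_TABLE = {
    ("at-grade intersection", "entry straight-through lane"): 0.7 * V1,
    ("grade-separated intersection", "directional ramp"):     0.65 * V1,
    ("expressway lane", None):                                 2.0 * V1,
}

def lookup_guide_speed(intersection_type: str, lane_type: str | None) -> float | None:
    """Return the stored guide speed, or None if no entry matches."""
    return GUIDE_SPEED_TABLE.get((intersection_type, lane_type))
```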
It should be noted that the first device may determine in advance the guide speed corresponding to each of the multiple kinds of movement environment information and store them. In some embodiments, the relevant technical personnel may determine the corresponding guide speed for each kind of movement environment information in advance, and submit the various kinds of movement environment information together with their corresponding guide speeds to the first device for storage.
For example, one correspondence between position information and guide speed may be as shown in Table 1 below, where the first two columns of Table 1 may be the position information of the navigation target, and the last column is the guide speed corresponding to that position information.
Table 1
Table 1 may be a set of guide speeds set by road traffic planning administrators for motor-vehicle lanes at various intersection types, where V1 may be a preset value. Taking row 2 as an example, if the navigation target is currently located at an at-grade intersection in an entry straight-through lane, the corresponding guide speed may be 0.7*V1. Taking row 6 as an example, if the navigation target is currently located at a grade-separated intersection on a directional ramp, semi-directional ramp, or service road, the corresponding guide speed may be between 0.6*V1 and 0.7*V1.
As another example, road traffic planning administrators have stipulated in advance that in rain or snow, when visibility is below 200 meters the maximum speed is 60 km/h; when visibility is below 100 meters the maximum speed is 40 km/h; and when visibility is below 50 meters the maximum speed is 20 km/h. The developers of the first device can then, based on the maximum speeds set by the road traffic planning administrators, set guide speeds corresponding to the three visibility levels in rain and snow, such that each guide speed is less than or equal to the maximum speed for the corresponding visibility.
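These visibility rules translate directly into a small piece of threshold logic. The sketch below applies a hypothetical 10 km/h margin under the caps quoted above (the margin is our assumption; the text only requires "less than or equal to"):

```python
# Guide speed from visibility in rain/snow, per the caps quoted above.
# The 10 km/h margin under each cap is an illustrative assumption.
MARGIN = 10.0  # km/h

def guide_speed_for_visibility(visibility_m: float) -> float | None:
    """Return a guide speed (km/h), or None if no visibility cap applies."""
    if visibility_m < 50:
        return 20.0 - MARGIN
    if visibility_m < 100:
        return 40.0 - MARGIN
    if visibility_m < 200:
        return 60.0 - MARGIN
    return None  # no rain/snow visibility restriction triggered
```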
As another example, when the obstacle information indicates that an obstacle appears ahead of the navigation target and the obstacle's speed along the navigation target's direction of travel is 0, the guide speed may be 0.
In some embodiments, speed guidance signs, such as traffic lights and speed limit signs, may be placed on the road, track, or route traveled by the navigation target. A speed guidance sign may be used to indicate the guide speed corresponding to the sign's location. Therefore, when the navigation target travels to that location, the first device or the navigation target can obtain images of the navigation target's surroundings; the first device can recognize the speed guidance sign from the images and then determine, based on the sign, the guide speed corresponding to the position information of that location. The speed guidance sign may be determined in advance by relevant technical personnel according to the sign's location.
In some embodiments, the first device may input the movement environment information into a stored first machine learning model and obtain the guide speed output by the first machine learning model. The first machine learning model may be trained in advance on a first sample set, where the first sample set includes multiple samples, each including movement environment information and a labeled guide speed.
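Any supervised regression model fits the description of the "first machine learning model". Purely as an illustration, the sketch below trains a small scikit-learn regressor on invented feature vectors; the feature encoding and the data are made up for the example:

```python
# Illustrative "first machine learning model": a regressor mapping
# encoded environment features to a guide speed. Feature encoding and
# training data are invented here; any regression model would do.
from sklearn.ensemble import RandomForestRegressor

# Each sample: [lane_type_code, visibility_m, obstacle_distance_m]
X = [
    [0, 500.0, 200.0],   # straight lane, clear weather, far obstacle
    [1, 150.0,  80.0],   # ramp, reduced visibility
    [0,  40.0,  30.0],   # straight lane, heavy fog, close obstacle
]
y = [70.0, 40.0, 15.0]   # labeled guide speeds, km/h

model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)
print(model.predict([[1, 120.0, 60.0]]))  # guide speed for a new sample
```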
In some embodiments, the first device may receive a guide speed submitted by the user — for example, a guide speed actively set by the user while driving in the hope of reaching the destination on time, or a desired guide speed actively set by the user while running on a treadmill.
In some embodiments, the first device may obtain a movement route and a movement time limit submitted by the user, and determine the guide speed based on the movement route and the movement time limit.
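One natural reading of this step is guide speed = route length divided by time limit; the sketch below makes that assumption explicit (the original text does not state a formula):

```python
# Sketch of deriving a guide speed from a route and a time limit,
# assuming guide speed = route length / time limit (our reading,
# not a formula stated in the original text).
def guide_speed_from_route(route_length_km: float, time_limit_h: float) -> float:
    if time_limit_h <= 0:
        raise ValueError("time limit must be positive")
    return route_length_km / time_limit_h  # km/h

print(guide_speed_from_route(30.0, 0.5))  # 60.0 km/h
```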
In some embodiments, the first device may further determine a first speed range, and the guide speed may be a value included in the first speed range, where the first speed range may indicate preferable movement speeds for the navigation target. For example, in an expressway lane where road traffic planning administrators have set a minimum speed of 80 km/h and a maximum speed of 120 km/h, and ignoring weather information and obstacle information, the guide speed determined by the first device may be 100 km/h, and the first speed range may be 80 km/h to 120 km/h. As another example, the first device is an in-vehicle navigation device and detects an obstacle ahead of the vehicle; to avoid hitting the obstacle while considering the user's comfort, the first device needs to reduce the speed to 40 km/h (i.e., the guide speed). Of course, any speed below 40 km/h would also avoid the obstacle, so the first speed range may be 40 km/h and below.
It should be noted that the manner in which the first device determines the first speed range may be the same as or similar to the manner in which it determines the guide speed.
It should also be noted that the first device may determine the guide speed and/or the first speed range in other ways as well.
It should further be noted that in practical applications, the first device may perform S601 and S602 sequentially or simultaneously; the embodiments of the present application do not limit the order in which the first device performs S601 and S602.
S603: The first device displays, based on the first movement speed and the guide speed, a moving first visual element, where the first visual element is used to instruct the navigation target to accelerate or decelerate, and the second movement speed of the first visual element is positively correlated with the absolute value of the difference between the first movement speed and the guide speed.
Visual elements can serve as tools and media for conveying information. Visual elements may include informational elements such as graphics, text, shapes, and forms, and may also include formal elements such as points, lines, surfaces, colors, and space. For example, a visual element may include an arrow or stripes.
It should be noted that a moving first visual element can be understood as a first visual element whose position in the navigation interface changes. A moving first visual element attracts the user's attention more easily, thereby guiding the user to accelerate or decelerate in time and improving the navigation effect.
The second movement speed is positively correlated with the absolute value of the difference between the first movement speed and the guide speed; that is, the larger the absolute value of the difference between the navigation target's first movement speed and the guide speed, the larger the second movement speed of the first visual element, so that the user can perceive the magnitude of that absolute difference more intuitively and then take corresponding measures to increase or decrease the navigation target's movement speed more quickly. In some embodiments, the second movement speed of the first visual element may be equal to the absolute value of the difference between the first movement speed and the guide speed.
In some embodiments, the reference object of the first visual element's second movement speed may be the same as the reference objects of the first movement speed and the guide speed.
In some embodiments, if the first movement speed is less than the guide speed, the first visual element is displayed moving in a first direction to instruct the navigation target to accelerate; if the first movement speed is greater than the guide speed, the first visual element is displayed moving in a second direction to instruct the navigation target to decelerate.
The first direction and the second direction may be determined in advance by the electronic device and may be different. In some embodiments, the first direction may be parallel and opposite to the second direction. In some embodiments, the first direction may point from a position near the user to a position away from the user; for example, the first direction may be the direction away from the bottom of the navigation interface, so that the driver more intuitively perceives the first visual element as moving away from the vehicle, more readily associates this with accelerating to catch up, needs less reaction time, and the reliability of assisted driving is improved. The second direction may point from a position away from the user to a position near the user; for example, the second direction may be the direction toward the bottom of the navigation interface, so that the driver more intuitively perceives the first visual element as approaching the vehicle and more readily associates this with decelerating to yield, again reducing the driver's reaction time. Of course, in practical applications the first direction and the second direction may also be other directions.
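A direct way to combine the speed rule and the direction rule above is a single signed velocity; the sketch below is an illustration, and the signed-velocity convention is our own choice:

```python
# Sketch combining the rules above: the element's speed equals
# |v - v_guide| (positively correlated; here simply equal) and its
# sign encodes the direction. The convention is illustrative.
def element_velocity(v: float, v_guide: float) -> float:
    """Velocity of the first visual element along the interface's
    vertical axis: positive = away from the bottom (accelerate),
    negative = toward the bottom (decelerate), zero = no motion."""
    diff = v_guide - v
    speed = abs(diff)
    if diff > 0:
        return +speed   # v < v_guide: move in the first direction
    if diff < 0:
        return -speed   # v > v_guide: move in the second direction
    return 0.0
```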
For example, the navigation interface is shown in FIG. 7 and FIG. 8, and the first visual element includes arrows 700. The first direction is the direction from the bottom of the navigation interface toward its center position, and the second direction is the direction from the center position toward the bottom. To display arrows 700 moving in the first direction, a first image set including FIG. 7 and FIG. 8 may be displayed in sequence. Each frame in the first image set may include the arrows 700, and each arrow 700 may be positioned closer to the bottom of the navigation interface in an earlier-displayed image than in a later-displayed image, so that when the first image set is displayed in sequence, the arrows 700 move away from the bottom of the navigation interface. For instance, compared with FIG. 8, the two arrows 700 in the dashed box of FIG. 7 are closer to the bottom of the navigation interface, so displaying FIG. 7 first and then FIG. 8 produces an animation effect of these two arrows 700 moving away from the bottom. Conversely, to display arrows 700 moving in the second direction, a second image set including FIG. 8 and FIG. 7 may be displayed in sequence. Each frame in the second image set may include the arrows 700, and each arrow 700 may be positioned farther from the bottom of the navigation interface in an earlier-displayed image than in a later-displayed image, so that when the second image set is displayed in sequence, the arrows 700 approach the bottom of the navigation interface. For instance, since the two arrows 700 in the dashed box of FIG. 7 are closer to the bottom than in FIG. 8, displaying FIG. 8 first and then FIG. 7 produces an animation effect of these two arrows 700 moving toward the bottom of the navigation interface.
As another example, as shown in FIG. 9 and FIG. 10, the first visual element includes a light wave 900 (i.e., the dark stripes in the dashed boxes of FIG. 9 and FIG. 10). The first direction is the direction away from the bottom of the navigation interface, and the second direction is the direction toward the bottom of the navigation interface. To display the light wave 900 moving in the first direction, a third image set including FIG. 9 and FIG. 10 may be displayed in sequence. Each frame in the third image set may include the light wave 900, and the light wave 900 may be positioned closer to the bottom of the navigation interface in an earlier-displayed image than in a later-displayed image, so that when the third image set is displayed in sequence, the light wave 900 moves away from the bottom of the navigation interface. For instance, compared with FIG. 10, the light wave 900 in the dashed box of FIG. 9 is closer to the bottom, so displaying FIG. 9 first and then FIG. 10 produces an animation effect of the light wave 900 moving away from the bottom. Conversely, to display the light wave 900 moving in the second direction, a fourth image set including FIG. 10 and FIG. 9 may be displayed in sequence. Each frame in the fourth image set may include the light wave 900, and the light wave 900 may be positioned farther from the bottom of the navigation interface in an earlier-displayed image than in a later-displayed image, so that when the fourth image set is displayed in sequence, the light wave 900 approaches the bottom of the navigation interface. For instance, since the light wave 900 in the dashed box of FIG. 9 is closer to the bottom than in FIG. 10, displaying FIG. 10 first and then FIG. 9 produces an animation effect of the light wave 900 moving toward the bottom of the navigation interface.
In some embodiments, if the display is a HUD or AR glasses, the first device may recognize a specific object in the picture captured by the camera, determine the object's two-dimensional coordinates in the picture via a visual recognition algorithm, convert those two-dimensional coordinates into three-dimensional coordinates using pre-calibrated camera parameters (such as the camera's focal length, distortion parameters, a first displacement matrix, and a rotation matrix), then determine the three-dimensional coordinates of the first visual element based on the object's three-dimensional coordinates, and draw the first visual element into the picture based on the determined three-dimensional coordinates. For example, while the user is driving a car and the first device is the car's in-vehicle navigation device, the first device may capture the picture ahead of the car through the camera, recognize the lane lines in the picture via machine vision, determine the lane lines' two-dimensional coordinates in the picture, convert those two-dimensional coordinates into three-dimensional coordinates based on the camera parameters, determine the three-dimensional coordinates of the first visual element based on the lane lines' three-dimensional coordinates, and draw the first visual element into the lane in the picture accordingly. In some embodiments, the first device may draw the motion effect of the first visual element through the animation interface provided by its operating system. The first device may achieve the animation effect of the first visual element moving away from the user by increasing the first visual element's z-axis coordinate (i.e., along the coordinate axis parallel to the lane), and achieve the effect of the first visual element approaching the user by decreasing its z-axis coordinate; alternatively, the first device may perform a matrix translation operation on the first visual element using a preset second displacement matrix, thereby achieving the animation effect of the first visual element moving.
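As a rough illustration of this coordinate pipeline — the intrinsics, camera height, flat-road-plane assumption, and step size below are all invented for the sketch, and a real calibration would also handle distortion and the rotation/displacement matrices mentioned above:

```python
# Sketch: back-project a detected 2D lane point onto an assumed flat
# road plane, then animate the element by increasing its z (forward)
# coordinate. Intrinsics and camera height are illustrative values.
import numpy as np

K = np.array([[800.0,   0.0, 640.0],   # assumed camera intrinsics
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])
CAM_HEIGHT = 1.4  # meters above the road plane (assumption)

def backproject_to_road(u: float, v: float) -> np.ndarray:
    """Pixel (u, v) -> 3D point on the road plane, in the camera frame
    (x right, y down, z forward; road plane at y = +CAM_HEIGHT)."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    s = CAM_HEIGHT / ray[1]        # scale the ray until it hits the road
    return s * ray

def project_to_pixel(p: np.ndarray) -> tuple[float, float]:
    """3D point in the camera frame -> pixel coordinates."""
    uvw = K @ p
    return uvw[0] / uvw[2], uvw[1] / uvw[2]

# Animate the element sliding forward along the lane (the z axis).
element = backproject_to_road(640.0, 500.0)   # a point low in the image
for _ in range(3):
    element[2] += 0.5                          # move away from the user
    print(project_to_pixel(element))
```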
It should be noted that in FIG. 7 to FIG. 10 above, the navigation interface presented by the first device is a head-up-view interface, and to present the first visual element more intuitively and realistically, the first visual element is also presented from a head-up view at a position corresponding to a specific object in the navigation interface (such as in the lane). It should be understood, however, that in practical applications the first visual element may also be presented from other viewing angles, or at other positions in the navigation interface. For example, as shown in FIG. 11, the navigation interface is a top-down-view interface, and the light wave 900 is likewise displayed in the lane from a top-down view. As another example, as shown in FIG. 12, the navigation interface is still a head-up-view interface, but the light wave 900 is displayed on the left side of the navigation interface and is not associated with the position of any object included in the navigation interface.
In some embodiments, the first device may display the moving first visual element upon detecting that the first movement speed and the guide speed are not equal, and stop displaying the moving first visual element upon detecting that the first movement speed equals the guide speed. Through this more precise speed guidance, the user can be prompted to adjust his or her own, or the vehicle's, movement speed in time to keep it consistent with the guide speed, thereby improving navigation accuracy.
Here, the first device stopping the display of the moving first visual element upon detecting that the first movement speed equals the guide speed may include: the first device stops displaying the moving first visual element immediately upon detecting that the first movement speed equals the guide speed; or, the first device stops displaying the moving first visual element only after detecting that the first movement speed equals the guide speed and that they have remained equal for longer than a preset second duration threshold, thereby extending the time for which the moving first visual element is displayed, guiding the user for longer to adjust his or her own, or the vehicle's, movement speed, and reducing the deviation between that movement speed and the guide speed.
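The hold-for-a-while variant is a small piece of timer logic; the sketch below is one possible reading, with a hypothetical 2 s second duration threshold and equality tolerance:

```python
# Sketch of the "stop displaying only after speeds stay equal for a
# while" variant. The 2 s threshold and the equality tolerance are
# illustrative assumptions.
import time

SECOND_DURATION_THRESHOLD = 2.0  # seconds
TOLERANCE = 0.1                  # speeds within this are "equal"

class ElementVisibility:
    def __init__(self):
        self.equal_since = None  # when the speeds first became equal

    def should_display(self, v: float, v_guide: float) -> bool:
        if abs(v - v_guide) > TOLERANCE:
            self.equal_since = None
            return True          # speeds differ: keep the element moving
        if self.equal_since is None:
            self.equal_since = time.monotonic()
        # Keep displaying until equality has held past the threshold.
        return time.monotonic() - self.equal_since <= SECOND_DURATION_THRESHOLD
```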
It should be noted that the second duration threshold may be set in advance by relevant technical personnel or by the user; the embodiments of the present application do not limit the manner of setting the second duration threshold or its size.
In some embodiments, the first device may display the moving first visual element upon detecting that the first movement speed is outside a first speed range, and stop displaying the moving first visual element upon detecting that the first movement speed is within the first speed range, thereby guiding the user to adjust his or her own, or the vehicle's, movement speed merely toward the first speed range. This speed guidance approach allows the user to control his or her own, or the vehicle's, movement speed more freely and flexibly, improving the flexibility of navigation and the user experience.
In some embodiments, the first device may display the moving first visual element upon detecting that the first movement speed is outside the first speed range, and stop displaying the moving first visual element upon detecting that the first movement speed equals the guide speed.
Here, the first device stopping the display of the moving first visual element upon detecting that the first movement speed is within the first speed range may include: the first device stops displaying the moving first visual element immediately upon detecting that the first movement speed is within the first speed range, thereby shortening the time for which the moving first visual element is displayed and allowing the user to control his or her own, or the vehicle's, movement speed more freely and flexibly; or, the first device stops displaying the moving first visual element only after detecting that the first movement speed is within the first speed range and has remained within it for longer than a preset first duration threshold, thereby extending the time for which the moving first visual element is displayed, guiding the user for longer to adjust his or her own, or the vehicle's, movement speed, and reducing the deviation of that movement speed from the guide speed or the first speed range.
It should be noted that the first duration threshold may be set in advance by relevant technical personnel or by the user; the embodiments of the present application do not limit the manner of setting the first duration threshold or its size.
In some embodiments, the first device may further display a moving second visual element. If the first visual element is used to instruct the navigation target to accelerate, the second visual element may be used to instruct the navigation target to decelerate; if the first visual element is used to instruct the navigation target to decelerate, the second visual element may be used to instruct the navigation target to accelerate. The third movement speed of the second visual element may be positively correlated with the absolute value of the difference between the first movement speed and the guide speed. When the first visual element is used to instruct acceleration and the second visual element to instruct deceleration: if the first movement speed is less than the guide speed, the first device displays the first visual element moving in the first direction; if the first movement speed is greater than the guide speed, the first device displays the second visual element moving in the second direction. When the first visual element is used to instruct deceleration and the second visual element to instruct acceleration: if the first movement speed is less than the guide speed, the first device displays the second visual element moving in the first direction; if the first movement speed is greater than the guide speed, the first device displays the first visual element moving in the second direction. That is, by displaying first and second visual elements of different styles, acceleration and deceleration can each be indicated more intuitively, making it easier for the user to react differently to the first or second visual element, further improving navigation accuracy and the user experience.
In some embodiments, the first device may display navigation information such as the first and second visual elements on the first device itself; for example, if the first device is the electronic device 100 in the navigation scenario shown in FIG. 3, the electronic device 100 can display the navigation information on itself. In other embodiments, the first device may also display the navigation information through a third device; for example, if the first device is the mobile phone 510 in the navigation scenario shown in FIG. 5, and the third device is the AR glasses 520 in that scenario, the mobile phone 510 can display the navigation information on itself or, of course, through the AR glasses 520.
In some embodiments, stopping the display of the moving first visual element may include displaying a stationary first visual element or hiding the first visual element; similarly, stopping the display of the moving second visual element may include displaying a stationary second visual element or hiding the second visual element.
In this embodiment of the present application, the navigation target is the first device. The first device can obtain the first movement speed of the navigation target and the guide speed corresponding to the navigation target, and display the moving first visual element based on the first movement speed and the guide speed. Since a moving first visual element attracts the user's attention more easily, the first visual element can be used to instruct the navigation target to accelerate or decelerate; and since the second movement speed of the first visual element is positively correlated with the absolute value of the difference between the navigation target's first movement speed and the guide speed, the degree of acceleration or deceleration can be indicated more intuitively, improving the navigation effect.
The following uses the on-site-driving navigation scenario shown in FIG. 3 to illustrate how the navigation method provided by the present application guides the user in controlling the vehicle's acceleration and deceleration.
The user starts the car in the right-hand slow lane. The in-vehicle navigation device determines, based on this right-hand slow lane, that the corresponding guide speed is 40 km/h, and the car's current first movement speed is 0. Therefore, forward-scrolling arrows are displayed on the front windshield through the HUD, and the movement speed of the arrows is 40 km/h (i.e., the absolute value of the difference between the guide speed and the first movement speed). When the user sees the forward-scrolling arrows, the user can tell that the current first movement speed is less than the guide speed and therefore presses the accelerator, so that the car's speed gradually approaches the guide speed. As the absolute difference between the car's speed and the guide speed decreases, the forward-scrolling speed of the arrows also gradually decreases, until the car's speed equals the guide speed and the arrows stop moving or disappear.
Next, the user changes from the right-hand slow lane to the left-hand fast lane. The in-vehicle navigation device determines, based on the new fast lane and the weather information, that the guide speed is 80 km/h, while the current first movement speed is 40 km/h. Since the current first movement speed is less than the guide speed, forward-scrolling arrows 700 are displayed on the front windshield through the HUD, and the movement speed of the arrows 700 is 40 km/h (i.e., the absolute value of the difference between the guide speed and the first movement speed), as shown in FIG. 13. When the user sees the forward-scrolling arrows 700, the user can tell that the current first movement speed is less than the guide speed and therefore presses the accelerator, so that the car's speed gradually approaches the guide speed. As the absolute difference between the car's speed and the guide speed decreases, the forward-scrolling speed of the arrows 700 also gradually decreases, until the car's speed equals the guide speed and the arrows 700 stop moving or disappear.
Afterward, the user drives the car toward a crossroads with a current first movement speed of 60 km/h. The in-vehicle navigation device detects that the light at the crossroads is currently red, so the guide speed is 0. Since the guide speed is less than the first movement speed, backward-scrolling arrows 700 are displayed on the front windshield through the HUD, and the backward-scrolling speed of the arrows 700 is 60 km/h (i.e., the absolute value of the difference between the guide speed and the first movement speed), as shown in FIG. 14. In some embodiments, the in-vehicle navigation device may determine a first distance based on the first movement speed and a preset perception-reaction duration, and start displaying the arrows 700 upon detecting that the car's distance from the intersection equals the first distance. In some embodiments, the first distance may be the product of the first movement speed and the preset perception-reaction duration, where the perception-reaction duration may be the time the driver needs to react. When the user sees the backward-scrolling arrows 700, the user can tell that the current first movement speed is greater than the guide speed and therefore presses the brake, so that the car's speed gradually decreases. As the absolute difference between the car's speed and the guide speed decreases, the backward-scrolling speed of the arrows 700 also gradually decreases, until the car stops and the arrows 700 stop moving or disappear.
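The first-distance rule above is a one-line computation; the sketch below works through it with an assumed 2.5 s perception-reaction duration (the original text does not give a value):

```python
# First distance = first movement speed x perception-reaction duration.
# The 2.5 s duration is an assumed value for illustration.
PERCEPTION_REACTION_S = 2.5

def first_distance_m(speed_kmh: float) -> float:
    speed_ms = speed_kmh / 3.6            # km/h -> m/s
    return speed_ms * PERCEPTION_REACTION_S

print(first_distance_m(60.0))  # about 41.7 m before the intersection
```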
When the in-vehicle navigation device detects that the light at the crossroads is now yellow, it determines from the car's lane that the guide speed is 60 km/h, while the car's current first movement speed is 0. Forward-scrolling arrows are therefore displayed on the front windshield through the HUD, and the forward-scrolling speed of the arrows is 60 km/h (i.e., the absolute value of the difference between the guide speed and the first movement speed). When the user sees the forward-scrolling arrows, the user can tell that the current first movement speed is less than the guide speed and therefore presses the accelerator, so that the car's speed gradually increases. As the absolute difference between the car's speed and the guide speed decreases, the forward-scrolling speed of the arrows also gradually decreases, until the car's speed reaches 60 km/h and the arrows stop moving or disappear.
Next, the car continues traveling at 60 km/h (i.e., the first movement speed is 60 km/h). While traveling, another vehicle is detected ahead, moving at 40 km/h. The in-vehicle navigation device determines the guide speed based on that vehicle's current speed, the current distance to that vehicle, and the preset perception-reaction duration. In some embodiments, the in-vehicle navigation device may determine the guide speed as the absolute value of the difference obtained by subtracting, from that vehicle's current speed, the product of the distance and the perception-reaction duration. Suppose the determined guide speed is 30 km/h; since the guide speed is less than the first movement speed, backward-scrolling arrows are displayed on the front windshield through the HUD, and the backward-scrolling speed of the arrows is 30 km/h. In some embodiments, the in-vehicle navigation device may also display the backward-scrolling arrows on the front windshield when it detects that the car's forward collision warning (FCW) system issues a warning signal (for example, 2.7 seconds before a predicted collision). When the user sees the backward-scrolling arrows, the user can tell that the current first movement speed is greater than the guide speed and therefore presses the brake, so that the car's speed gradually decreases. As the absolute difference between the car's speed and the guide speed decreases, the backward-scrolling speed of the arrows also gradually decreases, until the car travels at the guide speed and the arrows stop moving or disappear.
Referring to FIG. 15, a flowchart of a navigation method provided by an embodiment of the present application. The first device may be the electronic device 100 in the navigation scenario shown in FIG. 3 or FIG. 4, and the second device may be the first vehicle 310 in the navigation scenario shown in FIG. 3 or the second vehicle 320 in the navigation scenario shown in FIG. 4. In this embodiment, the first device is the device performing navigation, and the second device is the navigation target (i.e., the device being navigated); that is, the first device navigates for the second device. It should be noted that the method is not limited to the specific order shown in FIG. 15 and described below; it should be understood that in other embodiments, the order of some of the steps may be interchanged according to actual needs, or some of the steps may be omitted or deleted. The method includes the following steps:
S1501: The first device obtains a first movement speed of the navigation target, where the navigation target is a second device connected to the first device via a network.
The second device may obtain its own first movement speed and send it to the first device. It should be noted that the manner in which the second device obtains its first movement speed may be the same as or similar to the manner, described above, in which the first device obtains its own first movement speed, and is not repeated here.
S1502: The first device obtains a guide speed corresponding to the navigation target.
In some embodiments, the second device may obtain the movement environment information corresponding to the navigation target and send it to the first device; accordingly, the first device may receive the movement environment information and determine the guide speed based on it. Alternatively, in some embodiments, the second device may obtain the movement environment information, determine the guide speed based on it, and send the guide speed to the first device, which receives the guide speed sent by the second device.
In some embodiments, the first device may receive a guide speed submitted by the user.
In some embodiments, the first device may obtain a movement route and a movement time limit submitted by the user, and determine the guide speed based on the movement route and the movement time limit.
In some embodiments, the first device may further determine a first speed range, and the guide speed may be a value within the first speed range.
It should be noted that the first device may also determine the guide speed and/or the first speed range in other ways.
It should further be noted that in practical applications, the first device may perform S1501 and S1502 sequentially or simultaneously; the embodiments of the present application do not limit the order in which the first device performs S1501 and S1502.
S1503: The first device displays, based on the first movement speed and the guide speed, a moving first visual element, where the first visual element is used to instruct the navigation target to accelerate or decelerate, and the second movement speed of the first visual element is positively correlated with the absolute value of the difference between the first movement speed and the guide speed.
In some embodiments, the first device may further display a moving second visual element; if the first visual element is used to instruct the navigation target to accelerate, the second visual element may be used to instruct the navigation target to decelerate, and if the first visual element is used to instruct the navigation target to decelerate, the second visual element may be used to instruct the navigation target to accelerate. The third movement speed of the second visual element may be positively correlated with the absolute value of the difference between the first movement speed and the guide speed.
The manner in which the first device displays the moving first or second visual element based on the first movement speed and the guide speed may be the same as or similar to the manner in S603 in which the first device displays the moving first or second visual element based on the first movement speed and the guide speed, and is not repeated here.
In some embodiments, the first device may display navigation information such as the first and second visual elements on the first device itself. In other embodiments, the first device may also display the navigation information through the second device or a third device. For example, if the first device is the electronic device 100 in the navigation scenario shown in FIG. 3 or FIG. 4, the electronic device 100 can display the navigation information on itself, or through AR glasses.
In this embodiment of the present application, the navigation target is the second device. The first device can obtain the first movement speed of the second device and the guide speed corresponding to the second device, and display the moving first visual element on the first device based on the first movement speed and the guide speed. Since a moving first visual element attracts the user's attention more easily, the first visual element can be used to instruct the second device to accelerate or decelerate; and since the second movement speed of the first visual element is positively correlated with the absolute value of the difference between the second device's first movement speed and the guide speed, the degree of acceleration or deceleration of the second device can be indicated more intuitively, improving the navigation effect.
Based on the same inventive concept, an embodiment of the present application further provides an electronic device, which may be the aforementioned first device, including: a memory and a processor, where the memory is used to store a computer program, and the processor is used to execute the method described in the above method embodiments when invoking the computer program.
The electronic device provided in this embodiment can execute the above method embodiments; its implementation principles and technical effects are similar and are not repeated here.
Based on the same inventive concept, an embodiment of the present application further provides a vehicle, which includes the aforementioned first device. The first device is an in-vehicle device in the vehicle and further includes a HUD, which is used to display the first visual element.
Based on the same inventive concept, an embodiment of the present application further provides a chip system. The chip system includes a processor coupled to a memory, and the processor executes a computer program stored in the memory to implement the method described in the above method embodiments.
The chip system may be a single chip or a chip module composed of multiple chips.
An embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the method described in the above method embodiments is implemented.
An embodiment of the present application further provides a computer program product which, when run on an electronic device, causes the electronic device to execute the method described in the above method embodiments.
If the above integrated units are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the above embodiments of the present application may be implemented by a computer program instructing related hardware. The computer program may be stored in a computer-readable storage medium, and when executed by a processor, the steps of each of the above method embodiments can be implemented. The computer program includes computer program code, which may be in source-code form, object-code form, an executable file, some intermediate form, or the like. The computer-readable storage medium may include at least: any entity or apparatus capable of carrying the computer program code to a photographing apparatus/electronic device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, and a software distribution medium — for example, a USB flash drive, a removable hard disk, a magnetic disk, or an optical disc. In some jurisdictions, in accordance with legislation and patent practice, computer-readable media may not be electrical carrier signals or telecommunication signals.
In the above embodiments, the description of each embodiment has its own emphasis; for parts not detailed or described in one embodiment, reference may be made to the relevant descriptions of other embodiments.
A person of ordinary skill in the art may realize that the units and algorithm steps of the examples described in conjunction with the embodiments disclosed herein can be implemented in electronic hardware, or in a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the specific application and the design constraints of the technical solution. Skilled professionals may use different methods to implement the described functions for each specific application, but such implementations should not be considered beyond the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/device and method may be implemented in other ways. For example, the apparatus/device embodiments described above are merely illustrative; for example, the division into modules or units is merely a logical functional division, and there may be other divisions in actual implementation — for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. Furthermore, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
It should be understood that, when used in the specification and appended claims of the present application, the term "comprise" indicates the presence of the described features, wholes, steps, operations, elements, and/or components, but does not exclude the presence or addition of one or more other features, wholes, steps, operations, elements, components, and/or collections thereof.
It should also be understood that the term "and/or" used in the specification and appended claims of the present application refers to any combination and all possible combinations of one or more of the associated listed items, and includes these combinations.
As used in the specification and appended claims of the present application, the term "if" may be interpreted, depending on the context, as "when" or "once" or "in response to determining" or "in response to detecting". Similarly, the phrases "if it is determined" or "if [the described condition or event] is detected" may be interpreted, depending on the context, as meaning "once it is determined" or "in response to determining" or "once [the described condition or event] is detected" or "in response to detecting [the described condition or event]".
In addition, in the descriptions of the specification and appended claims of the present application, the terms "first", "second", "third", etc. are used only to distinguish the descriptions and shall not be understood as indicating or implying relative importance.
References in this specification to "one embodiment" or "some embodiments" and the like mean that a particular feature, structure, or characteristic described in conjunction with that embodiment is included in one or more embodiments of the present application. Thus, the statements "in one embodiment", "in some embodiments", "in some other embodiments", "in still other embodiments", etc. appearing in different places in this specification do not necessarily all refer to the same embodiment, but rather mean "one or more but not all embodiments", unless specifically emphasized otherwise. The terms "comprise", "include", "have", and their variants all mean "including but not limited to", unless specifically emphasized otherwise.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (16)

  1. A navigation method, characterized by comprising:
    obtaining a first movement speed of a navigation target;
    obtaining a guide speed corresponding to the navigation target;
    displaying, based on the first movement speed and the guide speed, a moving first visual element, wherein the first visual element is used to instruct the navigation target to accelerate or decelerate, and a second movement speed of the first visual element is positively correlated with the absolute value of the difference between the first movement speed and the guide speed.
  2. The method according to claim 1, characterized in that displaying, based on the first movement speed and the guide speed, the moving first visual element comprises:
    if the first movement speed is less than the guide speed, displaying the first visual element moving in a first direction;
    if the first movement speed is greater than the guide speed, displaying the first visual element moving in a second direction;
    wherein the first direction and the second direction are different.
  3. The method according to claim 2, characterized in that the first direction and the second direction are parallel and opposite.
  4. The method according to claim 2 or 3, characterized in that the first direction is the direction from the bottom of a navigation interface toward the center position of the navigation interface, the second direction is the direction from the center position of the navigation interface toward the bottom of the navigation interface, and the navigation interface is an interface that includes the first visual element.
  5. The method according to any one of claims 1-4, characterized in that displaying, based on the first movement speed and the guide speed, the moving first visual element comprises:
    when it is detected that the first movement speed is not equal to the guide speed, displaying the moving first visual element;
    when it is detected that the first movement speed is equal to the guide speed, stopping displaying the moving first visual element.
  6. The method according to any one of claims 1-4, characterized in that displaying, based on the first movement speed and the guide speed, the moving first visual element comprises:
    when it is detected that the first movement speed is not within a first speed range, displaying the moving first visual element;
    when it is detected that the first movement speed is within the first speed range, stopping displaying the moving first visual element;
    wherein the guide speed is a value included in the first speed range.
  7. The method according to any one of claims 1-4, characterized in that displaying, based on the first movement speed and the guide speed, the moving first visual element comprises:
    when it is detected that the first movement speed is not within a first speed range, displaying the moving first visual element;
    when it is detected that the first movement speed is equal to the guide speed, stopping displaying the moving first visual element;
    wherein the guide speed is a value included in the first speed range.
  8. The method according to any one of claims 1-7, characterized in that the first visual element comprises an arrow or a light wave.
  9. The method according to claim 1 or any one of claims 3-8, characterized in that the method further comprises:
    displaying, based on the first movement speed and the guide speed, a moving second visual element, wherein one of the first visual element and the second visual element is used to instruct the navigation target to accelerate and the other is used to instruct the navigation target to decelerate, and a third movement speed of the second visual element is positively correlated with the absolute value of the difference between the first movement speed and the guide speed.
  10. The method according to claim 9, characterized in that the first visual element is used to instruct the navigation target to accelerate and the second visual element is used to instruct the navigation target to decelerate, and displaying, based on the first movement speed and the guide speed, the moving first visual element comprises:
    if the first movement speed is less than the guide speed, displaying the first visual element moving in a first direction;
    and displaying, based on the first movement speed and the guide speed, the moving second visual element comprises:
    if the first movement speed is greater than the guide speed, displaying the second visual element moving in a second direction;
    wherein the first direction and the second direction are different.
  11. The method according to any one of claims 1-10, characterized in that obtaining the guide speed corresponding to the navigation target comprises:
    obtaining movement environment information corresponding to the navigation target, the movement environment information being used to indicate the environment in which the navigation target is located;
    determining the guide speed based on the movement environment information.
  12. The method according to claim 11, characterized in that the movement environment information comprises at least one of position information, weather information, and obstacle information.
  13. An electronic device, characterized by comprising: a memory and a processor, the memory being used to store a computer program, and the processor being used to execute the method according to any one of claims 1-12 when invoking the computer program.
  14. A vehicle, characterized in that the vehicle comprises the electronic device according to claim 13, the electronic device is an in-vehicle device in the vehicle, and the electronic device further comprises a head-up display (HUD);
    the HUD is used to display the first visual element.
  15. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the method according to any one of claims 1-12.
  16. A computer program product, characterized in that, when the computer program product runs on an electronic device, it causes the electronic device to execute the method according to any one of claims 1-12.
PCT/CN2023/083368 2022-03-31 2023-03-23 导航方法及电子设备 WO2023185622A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210342590.0A CN116929351A (zh) 2022-03-31 2022-03-31 导航方法及电子设备
CN202210342590.0 2022-03-31

Publications (1)

Publication Number Publication Date
WO2023185622A1 true WO2023185622A1 (zh) 2023-10-05

Family

ID=88199230

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/083368 WO2023185622A1 (zh) 2022-03-31 2023-03-23 导航方法及电子设备

Country Status (2)

Country Link
CN (1) CN116929351A (zh)
WO (1) WO2023185622A1 (zh)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003291688A (ja) * 2002-04-03 2003-10-15 Denso Corp 表示方法、運転支援装置、プログラム
JP2004101280A (ja) * 2002-09-06 2004-04-02 Denso Corp カーナビゲーション装置
JP2005241288A (ja) * 2004-02-24 2005-09-08 Fuji Heavy Ind Ltd 車両制限速度警報装置
JP2006331040A (ja) * 2005-05-25 2006-12-07 Nissan Motor Co Ltd 注意誘導装置及び注意誘導方法
US20180173237A1 (en) * 2016-12-19 2018-06-21 drive.ai Inc. Methods for communicating state, intent, and context of an autonomous vehicle
CN110775063A (zh) * 2019-09-25 2020-02-11 华为技术有限公司 一种车载设备的信息显示方法、装置及车辆
CN112797996A (zh) * 2020-12-18 2021-05-14 深圳市元征科技股份有限公司 一种车辆导航方法、装置、设备及介质

Also Published As

Publication number Publication date
CN116929351A (zh) 2023-10-24

Similar Documents

Publication Publication Date Title
US10982968B2 (en) Sensor fusion methods for augmented reality navigation
KR101730321B1 (ko) 운전자 보조 장치 및 그 제어방법
KR102043060B1 (ko) 자율 주행 장치 및 이를 구비한 차량
US20190041652A1 (en) Display system, display method, and program
KR20180026241A (ko) 차량용 사용자 인터페이스 장치 및 차량
CN107867296A (zh) 安装在车辆上的车辆控制装置和控制该车辆的方法
WO2019052487A1 (zh) 对车辆进行驾驶控制的方法和装置
CN114641668A (zh) 拼车应用程序中的增强现实寻路
US20220215639A1 (en) Data Presentation Method and Terminal Device
KR101711797B1 (ko) 자율 주행 차량의 자동 주차 시스템 및 그 제어방법
US11626028B2 (en) System and method for providing vehicle function guidance and virtual test-driving experience based on augmented reality content
CN111192341A (zh) 生成高精地图的方法、装置、自动驾驶设备及存储介质
WO2019021811A1 (ja) 飛行体、通信端末、通信システム、及びプログラム
WO2023010923A1 (zh) 一种高架识别方法及装置
CN205541484U (zh) 电子装置
WO2023185622A1 (zh) 导航方法及电子设备
CN115170630B (zh) 地图生成方法、装置、电子设备、车辆和存储介质
KR20160144643A (ko) 어라운드 뷰 제공장치 및 이를 구비한 차량
KR20180013162A (ko) 차량용 어라운드 뷰 제공 장치 및 차량
EP4191204A1 (en) Route guidance device and route guidance method thereof
KR101872477B1 (ko) 차량
CN114880408A (zh) 场景构建方法、装置、介质以及芯片
JP7302477B2 (ja) 情報処理装置、情報処理方法および情報処理プログラム
JP2022549752A (ja) 自動運転車インタラクションシステム
KR20170011881A (ko) 차량용 레이더, 및 이를 구비하는 차량

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23777992

Country of ref document: EP

Kind code of ref document: A1