WO2023010923A1 - Elevated road identification method and device - Google Patents

Elevated road identification method and device (一种高架识别方法及装置)

Info

Publication number
WO2023010923A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
elevated
road
ramp
parameter
Prior art date
Application number
PCT/CN2022/091520
Other languages
English (en)
French (fr)
Inventor
邱宇
李庆奇
李康
黄鹏飞
Original Assignee
荣耀终端有限公司
Priority date
Filing date
Publication date
Application filed by 荣耀终端有限公司
Publication of WO2023010923A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 specially adapted for navigation in a road network
    • G01C 21/34 Route searching; Route guidance
    • G01C 21/3407 Route searching; Route guidance specially adapted for specific applications
    • G01C 21/343 Calculating itineraries, i.e. routes leading from a starting point to a series of categorical destinations using a global route restraint, round trips, touristic trips
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S 19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S 19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S 19/42 Determining position
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2218/00 Aspects of pattern recognition specially adapted for signal processing

Definitions

  • the present application relates to the field of navigation technology, in particular to an elevated identification method and device.
  • the elevated road may be referred to simply as an elevated, and is a three-dimensional road erected on the ground road and used for vehicle driving.
  • with an elevated, the road can be divided into the road on the upper side of the elevated and the road on the lower side of the elevated.
  • the user can choose whether to drive the vehicle on the road on the upper side of the elevated road or on the road on the lower side of the elevated road.
  • the user usually needs to perform navigation during the process of driving the vehicle.
  • the terminal equipment used for navigation usually needs to determine the location information of the vehicle, and plan an appropriate route for the vehicle accordingly.
  • the location information of the road on the upper side of the elevated and of the road on the lower side of the elevated may be the same or similar, so the terminal device often cannot determine, from the location information of the vehicle alone, whether the vehicle is on the road on the upper side of the elevated or on the road on the lower side of the elevated, which leads to navigation errors. Navigation errors often send the user's vehicle along the wrong route, which gives the user an extremely poor driving experience and has the disadvantages of a long driving process and high fuel consumption of the vehicle.
  • an embodiment of the present application provides an elevated identification method and device.
  • an elevated identification method including:
  • when the driving state of the vehicle is a target state, the elevated recognition result of the vehicle is determined according to the height change of the vehicle; the target state includes at least one of the following driving states: starting an on-ramp, ending an on-ramp, starting an off-ramp and ending an off-ramp; the elevated recognition result of the vehicle is that the vehicle is traveling on the road on the upper side of the elevated, or that the vehicle is traveling on the road on the lower side of the elevated.
  • in this way, the first parameter at the first moment can be determined according to the GNSS signal, the second parameter of the vehicle at the first moment can be determined through the sensor, the driving state of the vehicle can then be determined by combining the first parameter and the second parameter, and the elevated recognition result of the vehicle can be determined by combining the driving state of the vehicle with the height change of the vehicle.
  • the first parameter includes at least one of the following information: the speed and heading of the vehicle;
  • the second parameter includes at least one of the following information: pitch angle, roll angle and heading angle of the vehicle.
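As a concrete illustration of how such parameters might be obtained on a terminal device, the sketch below reads speed and heading from a GNSS fix and approximates pitch and roll from raw accelerometer axes. This is a common gravity-based approximation under an assumed x-forward, z-up sensor frame, not necessarily the processing used in the claimed method; the field names of `gnss_fix` are hypothetical.

```python
import math

def first_parameter_from_gnss(gnss_fix):
    # Speed (m/s) and course over ground (degrees) as carried in a GNSS fix;
    # the dictionary keys here are illustrative, not a real receiver API.
    return {"speed_mps": gnss_fix["speed"], "heading_deg": gnss_fix["course"]}

def second_parameter_from_imu(ax, ay, az, yaw_deg):
    # Gravity-based pitch/roll estimate for a roughly static or smoothly moving
    # device, assuming x points forward, y left and z up in the sensor frame.
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return {"pitch_deg": pitch, "roll_deg": roll, "heading_deg": yaw_deg}
```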
  • the determining the first parameter of the vehicle at the first moment includes:
  • the first parameter of the vehicle at the first moment is determined.
  • in this way, the terminal device determines the first parameter of the vehicle at the first moment only when certain conditions are met, thereby reducing the number of operations required to determine the first parameter, reducing the amount of data that needs to be processed, and reducing the computational resources used in determining the first parameter.
  • the determining the driving state of the vehicle according to the first parameter and the second parameter includes:
  • the driving state of the vehicle is determined.
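The bullet above leaves the concrete decision logic open; a minimal sketch of one way to combine the two parameters is shown below, using a sustained pitch-up while moving as the signature of an on-ramp and a return to near-level pitch as its end (mirrored for off-ramps). The threshold values are illustrative assumptions, not values from the application.

```python
from enum import Enum, auto

class DrivingState(Enum):
    START_ON_RAMP = auto()
    END_ON_RAMP = auto()
    START_OFF_RAMP = auto()
    END_OFF_RAMP = auto()
    NORMAL = auto()

def classify_driving_state(prev_pitch_deg, pitch_deg, speed_mps,
                           ramp_pitch=3.0, flat_pitch=1.0, min_speed=2.0):
    # Ignore samples where the GNSS speed is too low to judge reliably.
    if speed_mps < min_speed:
        return DrivingState.NORMAL
    # Level -> nose up: the vehicle starts climbing an on-ramp.
    if abs(prev_pitch_deg) < flat_pitch and pitch_deg > ramp_pitch:
        return DrivingState.START_ON_RAMP
    # Nose up -> level again: the on-ramp ends.
    if prev_pitch_deg > ramp_pitch and abs(pitch_deg) < flat_pitch:
        return DrivingState.END_ON_RAMP
    # Level -> nose down: the vehicle starts descending an off-ramp.
    if abs(prev_pitch_deg) < flat_pitch and pitch_deg < -ramp_pitch:
        return DrivingState.START_OFF_RAMP
    # Nose down -> level again: the off-ramp ends.
    if prev_pitch_deg < -ramp_pitch and abs(pitch_deg) < flat_pitch:
        return DrivingState.END_OFF_RAMP
    return DrivingState.NORMAL
```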
  • the elevated road is one floor
  • the determination of the elevated recognition result of the vehicle according to the height change of the vehicle includes:
  • if the driving state of the vehicle is starting an on-ramp, and the height change of the vehicle within the first time period thereafter is greater than a first threshold, it is determined that the elevated identification result of the vehicle is that the vehicle is traveling on the road on the upper side of the elevated;
  • if the driving state of the vehicle is starting an off-ramp, and the absolute value of the height change of the vehicle within the first time period thereafter is greater than the first threshold, it is determined that the elevated identification result of the vehicle is that the vehicle is traveling on the road on the lower side of the elevated;
  • if the driving state of the vehicle is ending an on-ramp, and the height change of the vehicle within the preceding second time period is greater than the first threshold, it is determined that the elevated identification result of the vehicle is that the vehicle is traveling on the road on the upper side of the elevated;
  • if the driving state of the vehicle is ending an off-ramp, and the height change of the vehicle within the preceding second time period is greater than the first threshold, it is determined that the elevated identification result of the vehicle is that the vehicle is traveling on the road on the lower side of the elevated.
  • the elevated identification result of the vehicle can be determined in combination with the height change of the vehicle and the driving state of the vehicle, thereby improving the accuracy of navigation.
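The four rules above map directly onto a small decision function. The sketch below assumes a single-layer elevated, reuses the `DrivingState` enum from the earlier sketch, and treats `height_change` as the altitude delta in metres over the relevant time window; the default threshold is an assumed calibration value, and the magnitude is used for the descent cases.

```python
def single_layer_result(state, height_change, first_threshold=4.0):
    # For "start" states, height_change is measured over the first time period
    # after the state; for "end" states, over the second time period before it.
    if state == DrivingState.START_ON_RAMP and height_change > first_threshold:
        return "upper side of the elevated"
    if state == DrivingState.START_OFF_RAMP and abs(height_change) > first_threshold:
        return "lower side of the elevated"
    if state == DrivingState.END_ON_RAMP and height_change > first_threshold:
        return "upper side of the elevated"
    if state == DrivingState.END_OFF_RAMP and abs(height_change) > first_threshold:
        return "lower side of the elevated"
    return None  # no confident decision; keep the previous recognition result
```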
  • the elevated structure includes at least two floors
  • the determination of the elevated identification result of the vehicle according to the height change of the vehicle includes:
  • if the driving state of the vehicle is starting an on-ramp, and the height change of the vehicle within the first time period thereafter is greater than a first threshold, it is determined that the elevated identification result of the vehicle is that the vehicle is traveling on the road on the upper side of the elevated;
  • if the driving state of the vehicle is starting an off-ramp, and the absolute value of the height change of the vehicle within the first time period thereafter is greater than the first threshold, the elevated identification result of the vehicle is determined according to the state of the vehicle before the off-ramp;
  • if the driving state of the vehicle is ending an on-ramp, and the height change of the vehicle within the preceding second time period is greater than the first threshold, it is determined that the elevated identification result of the vehicle is that the vehicle is traveling on the road on the upper side of the elevated;
  • if the driving state of the vehicle is ending an off-ramp, and the absolute value of the height change of the vehicle within the preceding second time period is greater than the first threshold, the elevated identification result of the vehicle is determined according to the state of the vehicle before the off-ramp.
  • the elevated identification result of the vehicle can be determined in combination with the height change of the vehicle and the driving state of the vehicle, thereby improving the accuracy of navigation.
  • the state of the vehicle before the off-ramp includes one of the following states: the vehicle is driving on the first floor of the elevated, or the vehicle is driving on another floor of the elevated; the first floor of the elevated is the layer of road on the upper side of the elevated that is closest to the road on the lower side of the elevated.
  • each time after the elevated identification result of the vehicle is determined, the change in the floor number of the elevated where the vehicle is located is recorded;
  • the state of the vehicle before the off-ramp is determined according to the change in the floor number of the elevated where the vehicle is located.
  • the state of the vehicle before the off-ramp can be determined, so as to determine the elevated identification result of the vehicle according to the state of the vehicle before the off-ramp.
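For an elevated with two or more layers, the result of an off-ramp depends on which floor the vehicle was on beforehand, which is what the floor-number bookkeeping above provides. The sketch below is one minimal way to keep that record; it reuses `DrivingState` from the earlier sketch, updates the floor only when a ramp ends so each ramp transit is counted once, and the class name and floor-numbering convention are assumptions.

```python
class MultiLayerTracker:
    def __init__(self):
        self.floor = 0  # 0 = road on the lower side of the elevated, 1 = first floor, ...

    def on_result(self, state, height_change, first_threshold=4.0):
        # Climb completed: the vehicle is now one layer higher and on the upper side.
        if state == DrivingState.END_ON_RAMP and height_change > first_threshold:
            self.floor += 1
            return "upper side of the elevated"
        # Descent completed: the outcome depends on the state before the off-ramp.
        if state == DrivingState.END_OFF_RAMP and abs(height_change) > first_threshold:
            was_first_floor = (self.floor == 1)  # state of the vehicle before the off-ramp
            self.floor = max(self.floor - 1, 0)
            return ("lower side of the elevated" if was_first_floor
                    else "upper side of the elevated")
        return None  # no change to the recorded floor
```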
  • the elevated recognition result of the vehicle indicates that the vehicle is driving on the road on the upper side of the elevated road
  • the accuracy of determining the position of the vehicle by the server can be improved, and the accuracy of navigation can be further improved.
  • if the elevated identification result of the vehicle indicates that the vehicle is driving on the road on the lower side of the elevated, and there is a weak signal area ahead in the direction of travel of the vehicle, the navigation method is adjusted from navigating by GNSS signal to navigating by a network positioning method, or to navigating by a network positioning method combined with an inertial navigation method.
  • This implementation method can enable the vehicle to adjust the navigation method before driving to the weak signal area, and reduce the influence of the weak signal area on the navigation accuracy, so as to ensure the navigation accuracy of the terminal device in the weak signal area.
  • the weak signal area includes: a tunnel area or an obstacle shielding area.
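A sketch of the mode switch described above, reduced to a single decision function; the mode names and the `inertial_available` flag are illustrative assumptions, not terminology from the application.

```python
def choose_navigation_mode(recognition_result, weak_signal_ahead, inertial_available=True):
    # Switch away from pure GNSS navigation before entering a tunnel or an
    # obstacle-shielded stretch, while the vehicle is below the elevated.
    if recognition_result == "lower side of the elevated" and weak_signal_ahead:
        if inertial_available:
            return "network positioning + inertial navigation"
        return "network positioning"
    return "GNSS navigation"
```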
  • an embodiment of the present application provides an elevated identification device; the device includes a transceiver and a processor;
  • the transceiver is used to receive GNSS signals from the Global Navigation Satellite System
  • the processor is used to:
  • when the driving state of the vehicle is a target state, the elevated recognition result of the vehicle is determined according to the height change of the vehicle; the target state includes at least one of the following driving states: starting an on-ramp, ending an on-ramp, starting an off-ramp and ending an off-ramp; the elevated recognition result of the vehicle is that the vehicle is traveling on the road on the upper side of the elevated, or that the vehicle is traveling on the road on the lower side of the elevated.
  • the first parameter includes at least one of the following information: the speed and heading of the vehicle;
  • the second parameter includes at least one of the following information: pitch angle, roll angle and heading angle of the vehicle.
  • the processor is configured to determine the first parameter of the vehicle at the first moment, specifically:
  • the first parameter of the vehicle at the first moment is determined.
  • the processor is configured to determine the driving state of the vehicle according to the first parameter and the second parameter, specifically:
  • the driving state of the vehicle is determined.
  • the elevated road is one floor
  • the processor is configured to determine the elevated recognition result of the vehicle according to the height change of the vehicle, specifically:
  • if the driving state of the vehicle is starting an on-ramp, and the height change of the vehicle within the first time period thereafter is greater than a first threshold, it is determined that the elevated identification result of the vehicle is that the vehicle is traveling on the road on the upper side of the elevated;
  • if the driving state of the vehicle is starting an off-ramp, and the absolute value of the height change of the vehicle within the first time period thereafter is greater than the first threshold, it is determined that the elevated identification result of the vehicle is that the vehicle is traveling on the road on the lower side of the elevated;
  • if the driving state of the vehicle is ending an on-ramp, and the height change of the vehicle within the preceding second time period is greater than the first threshold, it is determined that the elevated identification result of the vehicle is that the vehicle is traveling on the road on the upper side of the elevated;
  • if the driving state of the vehicle is ending an off-ramp, and the height change of the vehicle within the preceding second time period is greater than the first threshold, it is determined that the elevated identification result of the vehicle is that the vehicle is traveling on the road on the lower side of the elevated.
  • the elevated structure includes at least two floors
  • the processor determines the elevated identification result of the vehicle according to the height change of the vehicle, specifically:
  • if the driving state of the vehicle is starting an on-ramp, and the height change of the vehicle within the first time period thereafter is greater than a first threshold, it is determined that the elevated identification result of the vehicle is that the vehicle is traveling on the road on the upper side of the elevated;
  • if the driving state of the vehicle is starting an off-ramp, and the absolute value of the height change of the vehicle within the first time period thereafter is greater than the first threshold, the elevated identification result of the vehicle is determined according to the state of the vehicle before the off-ramp;
  • if the driving state of the vehicle is ending an on-ramp, and the height change of the vehicle within the preceding second time period is greater than the first threshold, it is determined that the elevated identification result of the vehicle is that the vehicle is traveling on the road on the upper side of the elevated;
  • if the driving state of the vehicle is ending an off-ramp, and the absolute value of the height change of the vehicle within the preceding second time period is greater than the first threshold, the elevated identification result of the vehicle is determined according to the state of the vehicle before the off-ramp.
  • the state of the vehicle before the off-ramp includes one of the following states: the vehicle is driving on the first floor of the elevated, or the vehicle is driving on another floor of the elevated; the first floor of the elevated is the layer of road on the upper side of the elevated that is closest to the road on the lower side of the elevated.
  • the processor is further configured to:
  • each time after the elevated identification result of the vehicle is determined, the change in the floor number of the elevated where the vehicle is located is recorded;
  • the state of the vehicle before the off-ramp is determined according to the change in the floor number of the elevated where the vehicle is located.
  • the processor is further configured to:
  • the elevated recognition result of the vehicle indicates that the vehicle is driving on the road on the upper side of the elevated road
  • the processor is further configured to:
  • if the elevated identification result of the vehicle indicates that the vehicle is driving on the road on the lower side of the elevated, and there is a weak signal area ahead in the direction of travel of the vehicle, the navigation method is adjusted from navigating by GNSS signal to navigating by a network positioning method, or to navigating by a network positioning method combined with an inertial navigation method.
  • the weak signal area includes: a tunnel area or an obstacle shielding area.
  • an embodiment of the present application provides a terminal device, where the terminal device includes a processor, and when the processor executes a computer program or an instruction in a memory, the method described in the first aspect is executed.
  • an embodiment of the present application provides a terminal device; the terminal device includes a processor and a memory; the memory is used to store computer programs or instructions; and the processor is used to execute the computer programs or instructions stored in the memory, so that the terminal device executes the method described in the first aspect.
  • the present application provides a terminal device, which includes a processor, a memory, and a transceiver; the transceiver is used to receive or send signals; the memory is used to store computer programs or instructions; and the processor is configured to execute the computer programs or instructions stored in the memory, so that the terminal device executes the method described in the first aspect.
  • the present application provides a terminal device, which includes a processor and an interface circuit; the interface circuit is used to receive computer programs or instructions and transmit them to the processor; and the processor is used to run the computer programs or instructions, so that the terminal device executes the method described in the first aspect.
  • the present application provides a computer storage medium, the computer storage medium is used to store computer programs or instructions, and when the computer programs or instructions are executed, the method described in the first aspect is implemented.
  • the present application provides a computer program product including a computer program or an instruction, and when the computer program or instruction is executed, the method described in the first aspect is implemented.
  • the present application provides a chip; the chip includes a processor, and the processor is coupled with a memory and is used to execute a computer program or instructions stored in the memory; when the computer program or instructions are executed, the method described in the first aspect is executed.
  • Embodiments of the present application provide an elevated identification method and device.
  • the first parameter at the first moment is determined according to the GNSS signal;
  • the second parameter of the vehicle at the first moment is determined by the sensor;
  • the driving state of the vehicle is determined by combining the first parameter and the second parameter; and
  • the elevated recognition result of the vehicle is then determined by combining the driving state of the vehicle with the height change of the vehicle, thereby solving the problem in the prior art that the elevated recognition result of the vehicle cannot be determined, reducing the number of navigation errors and improving the accuracy of navigation.
  • since the solution of the present application can improve the accuracy of navigation, it can also improve the user's experience in the vehicle, reduce the time consumed by the driving process, and reduce the fuel consumption of the vehicle, so as to achieve the purpose of energy saving.
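Read together, the earlier sketches suggest one possible per-sample loop on the terminal device. This is hypothetical glue code reusing the helper functions defined in the sketches above, not the claimed implementation.

```python
# One update step: GNSS fix + IMU-derived pitch in, driving state and result out.
def update(prev_pitch_deg, pitch_deg, gnss_fix, height_change, tracker=None):
    state = classify_driving_state(prev_pitch_deg, pitch_deg, gnss_fix["speed"])
    if tracker is not None:                                   # elevated with two or more layers
        return state, tracker.on_result(state, height_change)
    return state, single_layer_result(state, height_change)   # single-layer elevated
```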
  • Fig. 1 is a schematic diagram of a vehicle's upper elevated ramp and lower elevated ramp road network
  • Fig. 2 is a structural schematic diagram of a GNSS system
  • FIG. 3 is a schematic diagram of an interface of an electronic map displayed by a terminal device
  • FIG. 4 is a schematic diagram of a vehicle driving scene provided by an embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of a mobile phone provided by an embodiment of the present application.
  • FIG. 6 is a software structural block diagram of a mobile phone provided by an embodiment of the present application.
  • FIG. 7 is a schematic structural diagram of a vehicle disclosed in an embodiment of the present application.
  • FIG. 8(a) is an example diagram of an interface of a terminal device disclosed in an embodiment of the present application.
  • FIG. 8(b) is an example diagram of an interface of another terminal device disclosed in the embodiment of the present application.
  • FIG. 8(c) is an example diagram of an interface of another terminal device disclosed in the embodiment of the present application.
  • FIG. 9 is a schematic workflow diagram of an elevated identification method disclosed in the embodiment of the present application.
  • Fig. 10(a) is a schematic diagram of a scene in which a vehicle travels on the road on the upper side of an elevated, disclosed in an embodiment of the present application;
  • Fig. 10(b) is a top view of a vehicle traveling on the road on the upper side of an elevated, disclosed in an embodiment of the present application;
  • FIG. 11 is a schematic diagram of an interface of an electronic map displayed by a terminal device provided by an embodiment of the present application.
  • FIG. 12 is a structural block diagram of an implementation manner of a navigation device provided by the present application.
  • FIG. 13 is a structural block diagram of an implementation manner of a chip provided by the present application.
  • references to "one embodiment" or "some embodiments" and the like in this specification mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application.
  • the appearances of the phrases "in one embodiment", "in some embodiments", "in other embodiments" and the like in various places in this specification do not necessarily all refer to the same embodiment, but mean "one or more but not all embodiments", unless specifically stated otherwise.
  • the terms “including”, “comprising”, “having” and variations thereof mean “including but not limited to”, unless specifically stated otherwise.
  • the elevated road may be referred to simply as an elevated, and is a three-dimensional road erected on the ground road and used for driving vehicles.
  • the road can be divided into the road on the upper side of the elevated and the road on the lower side of the elevated.
  • for a single-layer elevated, the road on the upper side of the elevated refers to the elevated road above the ground,
  • and the road on the lower side of the elevated refers to the ground road under the elevated;
  • for a multi-layer elevated, the road on the upper side of the elevated is any layer of elevated road above the ground,
  • and the road on the lower side of the elevated is the ground road located under the elevated layer closest to the ground.
  • Ramp intersections usually include an on-ramp intersection and an off-ramp intersection.
  • when a vehicle goes up onto an elevated road, it first needs to pass through the on-ramp intersection and then drives onto the road on the upper side of the elevated.
  • when the vehicle comes down off the elevated road, or travels from the road on the upper side of one layer of the elevated to the road on the upper side of a lower layer, it first needs to pass through the off-ramp intersection, and then drives onto the road on the lower side of the elevated or onto the road on the upper side of the lower layer.
  • in order to clarify the scene of vehicles driving on elevated roads, Figure 1 is provided.
  • the scene corresponding to this figure includes a layer of elevated roads.
  • the roads include the roads on the upper side of the elevated road and the roads on the lower side of the elevated road.
  • the front of the vehicle includes a solid line with an arrow indicating the direction in which the vehicle is traveling.
  • in the initial stage of driving, the vehicle is located on the left side in Figure 1 and is driving on the road on the lower side of the elevated. After a period of time, the vehicle starts to go up the elevated through the on-ramp intersection of the elevated, thereby driving onto the road on the upper side of the elevated.
  • the position of the vehicle is the position of the vehicle on the right side in Fig. 1 .
  • afterwards, the vehicle travels to the road on the lower side of the elevated through the off-ramp of the elevated.
  • the driving route of the vehicle is: driving on the ground -> driving on the road on the upper side of the elevated road -> driving on the road on the lower side of the elevated road, that is, driving on the ground.
  • the user can usually use a terminal device (such as a mobile phone or a vehicle terminal, etc.) for navigation.
  • navigation on the terminal device may be turned on before the vehicle gets on the elevated, or while the vehicle is driving on the road on the lower side of the elevated.
  • the terminal device mainly determines its own position through the Global Navigation Satellite System (GNSS).
  • GNSS is a space-based radio navigation and positioning system that can provide users with all-weather three-dimensional coordinates, velocity and time information at any point on the earth's surface or in near-earth space.
  • GNSS systems usually include the Global Positioning System (GPS) of the United States, the Global Navigation Satellite System (GLONASS) of Russia, the Galileo (GALILEO) system of the European Union, the BeiDou satellite navigation system of China, and the like.
  • the GPS system is a radio navigation positioning system based on artificial earth satellites, including 24 satellites covering the whole world.
  • the Beidou satellite navigation system is a global satellite navigation system independently developed and operated by China. The system is divided into two generations, namely the Beidou first-generation system and the Beidou second-generation system. The system typically includes four satellites in geosynchronous orbit.
  • the GNSS navigation system generally includes three parts: a space part, a ground monitoring part and a user receiver.
  • the space part of the GNSS navigation system comprises a plurality of satellites 10;
  • the ground monitoring part comprises a ground monitoring and tracking station 20;
  • the ground monitoring and tracking station 20 generally comprises a main control station, monitoring stations and injection stations; and in the GNSS navigation system,
  • the user receiver 30 can receive satellite signals transmitted by a plurality of satellites 10 .
  • the basic principle of the GNSS navigation system is to determine the position of the user receiver through the distance between multiple known satellites and the user receiver.
  • the position of a satellite can be found in the satellite ephemeris according to the time recorded by the on-board clock, and the distance between the user receiver and the satellite can be determined from the time it takes the satellite signal, also known as the GNSS signal, transmitted by the satellite to reach the user receiver.
  • in a specific implementation, the ground monitoring and tracking station 20 can transmit information such as the satellite ephemeris to the multiple satellites 10; the multiple satellites 10 can continuously transmit satellite signals, which usually include the satellite ephemeris and the launch time of the satellite signal;
  • the user receiver 30 can search for and receive the satellite signals, determine the position of each satellite 10 through the satellite ephemeris in the satellite signal, determine the distance between itself and each satellite 10 through its own clock and the launch time of the satellite signal, and further determine its own position information according to the positions of the satellites 10 and the distances between itself and the satellites 10.
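As an illustration of this positioning principle only (not part of the claimed method), the textbook single-point solution below recovers the receiver position and clock bias from four or more pseudoranges by iterative least squares; satellite positions and pseudoranges are assumed to be given in ECEF metres.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def solve_position(sat_positions, pseudoranges, iterations=8):
    # Linearise rho_i = ||s_i - r|| + c*dt around the current guess and solve
    # for the receiver position r and clock bias dt with least squares.
    x = np.zeros(4)                              # [rx, ry, rz, c*dt]
    for _ in range(iterations):
        diffs = sat_positions - x[:3]            # vectors from guess to satellites
        ranges = np.linalg.norm(diffs, axis=1)
        residuals = pseudoranges - (ranges + x[3])
        # Partial derivatives: -unit vector towards each satellite, and 1 for c*dt.
        H = np.hstack([-diffs / ranges[:, None], np.ones((len(ranges), 1))])
        dx, *_ = np.linalg.lstsq(H, residuals, rcond=None)
        x += dx
    return x[:3], x[3] / C                       # position (m) and clock bias (s)
```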
  • a user can perform navigation through a terminal device, which can be a mobile terminal (such as a mobile phone) and a device with a navigation function such as a vehicle.
  • the terminal device displays an electronic map during the navigation process, which is convenient for the user to query the destination and plan the route.
  • the terminal device in the process of navigating the vehicle, can display its own position on the electronic map after determining its own position.
  • the electronic map usually includes the environment around the position of the terminal device and indicates that position in the electronic map; it can further include the route planned for the vehicle and indicate the direction of the vehicle, to meet the user's navigation needs.
  • the terminal device usually determines its own location information according to the received GNSS signal, and further performs navigation for the user according to the location information.
  • the position information of the road on the upper side of the elevated and the road on the lower side of the elevated may be the same or similar.
  • the terminal device cannot determine whether the vehicle is located on the road above the elevated road or on the road below the elevated road based on the determined location information, which may easily lead to navigation errors.
  • FIG. 3 is an electronic map displayed by the terminal device when the navigation accuracy is low.
  • the navigation in this figure indicates that the user's vehicle is located on the side road of North Fourth Ring East Road, under the viaduct; the triangle marked with a solid line in Figure 3 indicates the location of the terminal device as indicated by the navigation. The actual situation, however, is that the vehicle has already driven from under the viaduct onto the viaduct; the triangle marked with a dotted line in Figure 3 indicates the actual position of the vehicle. It can be seen that the actual position of the vehicle is inconsistent with the navigated position shown in the electronic map displayed on the user's terminal device, that is, the recognition of the vehicle's position has deviated.
  • the embodiments of the present application provide an elevated identification method and device to identify whether the vehicle is on the road above the elevated or on the lower side of the elevated, so as to improve the accuracy of navigation.
  • the technical solution of the present application can be applied to the field of vehicle driving, including but not limited to the fields of automated driving (ADS), intelligent driving and intelligent connected vehicle (ICV).
  • FIG. 4 is a schematic diagram of a vehicle driving scene provided in this embodiment.
  • This scenario involves a server, at least one terminal device, and a vehicle corresponding to the terminal device.
  • the server and the terminal device (such as a mobile phone terminal) can be connected through a wireless network.
  • the server can be a service platform or a car networking server that manages mobile terminals, for example, the server is used to receive messages sent by the mobile terminal, determine the location of the vehicle, and provide users with maps and real-time navigation services.
  • electronic maps of multiple regions can be stored in the server.
  • the terminal device is used to send a request to the server to realize the real-time positioning and navigation functions of the vehicle.
  • the vehicle includes a communication module and a processing module, which are used to receive the signal sent by the server and/or the mobile phone terminal, and control the start and stop of the vehicle according to the signal and the preset program, and obtain the status of the vehicle on the elevated or off the elevated.
  • the server may be one or more independent servers or server clusters, or may also be a cloud platform service deployed on the cloud.
  • the server can be a network device, such as a base station (BS); further, the base station can be a base transceiver station (BTS) in a Global System for Mobile Communications (GSM) or Code Division Multiple Access (CDMA) system, a base station (NodeB) in wideband code division multiple access (WCDMA), an evolved base station (eNB/e-NodeB) in LTE, an evolved base station (ng-eNB) in next-generation LTE, a base station (gNB) in NR, or a base station or wireless fidelity (WiFi) access point in a future mobile communication system.
  • the embodiments of the present application do not limit the specific technology and specific equipment form adopted by the network equipment, which may be deployed on the cloud or as an independent computer device.
  • the terminal device in the embodiments of this application may be a device that provides voice and/or data connectivity to users, a handheld device with a wireless connection function, or another processing device connected to a wireless modem, such as a wireless terminal, a vehicle-mounted wireless terminal, a portable device, a wearable device, a mobile phone (or "cellular" phone), or a portable, pocket-sized or handheld terminal, which exchanges voice and/or data with the radio access network.
  • for example, the wireless terminal may be a personal communication service (PCS) phone, a session initiation protocol (SIP) phone, a wireless local loop (WLL) station, or a personal digital assistant (PDA).
  • the wireless terminal may also be a subscriber unit, an access terminal, a user terminal, a user agent, a user device, or user equipment (UE); this application does not limit the type of the terminal device.
  • FIG. 5 is a schematic structural diagram of a mobile phone.
  • the mobile phone may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, speaker 170A, receiver 170B, microphone 170C, earphone interface 170D, sensor module 180, etc.
  • the structure shown in the embodiment of the present invention does not constitute a specific limitation on the mobile phone.
  • the mobile phone may include more or fewer components than shown in the illustration, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated access is avoided, and the waiting time of the processor 110 is reduced, thereby improving the efficiency of the system.
  • processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the wireless communication function of the mobile phone can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in a mobile phone can be used to cover single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied to mobile phones.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • the wireless communication module 160 can provide wireless communication solutions applied to the mobile phone, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and other wireless communication solutions.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2.
  • the antenna 1 of the mobile phone is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the mobile phone can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS) and/or a satellite based augmentation system (SBAS).
  • the mobile phone realizes the display function through the GPU, the display screen 194, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light emitting diode (QLED), etc.
  • the mobile phone may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the mobile phone can realize shooting function through ISP, camera 193 , video codec, GPU, display screen 194 and application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • the light is transmitted to the photosensitive element of the camera through the lens, and the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin color.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
  • the mobile phone may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the mobile phone selects the frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
  • Video codecs are used to compress or decompress digital video.
  • a mobile phone can support one or more video codecs.
  • the mobile phone can play or record video in multiple encoding formats, such as: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the mobile phone.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. Such as saving music, video and other files in the external memory card.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the processor 110 executes various functional applications and data processing of the mobile phone by executing instructions stored in the internal memory 121 .
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • the stored program area can store an operating system, at least one application program required by a function (such as a sound playing function, an image playing function, etc.) and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the mobile phone.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the mobile phone can realize the audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the audio module 170 may also be used to encode and decode audio signals.
  • the audio module 170 may be set in the processor 110 , or some functional modules of the audio module 170 may be set in the processor 110 .
  • Speaker 170A also referred to as a "horn" is used to convert audio electrical signals into sound signals.
  • the cell phone can listen to music through speaker 170A, or listen to hands-free calls.
  • Receiver 170B also called “earpiece” is used to convert audio electrical signals into sound signals.
  • the receiver 170B can be placed close to the human ear to listen to the voice.
  • the microphone 170C, also called a "mike" or "sound transmitter", is used to convert sound signals into electrical signals.
  • the user can put his mouth close to the microphone 170C to make a sound, and input the sound signal to the microphone 170C.
  • the mobile phone may be provided with at least one microphone 170C.
  • the mobile phone can be provided with two microphones 170C, which can also implement a noise reduction function in addition to collecting sound signals.
  • the mobile phone can also be equipped with three, four or more microphones 170C to realize the collection of sound signals, noise reduction, identification of sound sources, and realization of directional recording functions, etc.
  • the earphone interface 170D is used for connecting wired earphones.
  • the earphone interface 170D can be a USB interface 130, or a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the sensor module 180 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
  • the mobile phone may also include a charging management module, a power management module, a battery, buttons, an indicator, and one or more SIM card interfaces, etc., which are not limited in this embodiment of the present application.
  • the software system of the mobile phone can adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture.
  • the embodiment of the present application takes the Android system with layered architecture as an example to illustrate the software structure of the mobile phone.
  • Fig. 6 is a software structural block diagram of an implementation manner of the mobile phone provided by the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
  • the application layer can consist of a series of application packages.
  • the application package may include application programs such as camera, gallery, call, navigation, bluetooth, music, video, and short message.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions. As shown in Figure 6, the application framework layer can include window manager, content provider, view system, phone manager, resource manager and notification manager, etc.
  • a window manager is used to manage window programs.
  • the window manager can acquire the size of the display screen, parameters of each display area on the display interface, and the like.
  • Content providers are used to store and retrieve data and make it accessible to applications.
  • Said data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebook, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on.
  • the view system can be used to build applications.
  • a display interface can consist of one or more views. For example, a display interface including a camera icon.
  • the phone manager is used to provide the communication functions of the mobile phone. For example, the management of call status (including connected, hung up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages, and can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify the download completion, message reminder, etc.
  • the notification manager can also present notifications that appear in the top status bar of the system in the form of a chart or scroll-bar text, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window.
  • for example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, or the indicator light flashes.
  • the Android Runtime includes core library and virtual machine.
  • the Android runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • a system library can include multiple function modules. For example: surface manager (surface manager), media library (media libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of various commonly used audio and video formats, as well as still image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing and layer processing, etc.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer can contain display drivers, camera drivers, audio drivers, sensor drivers, etc.
  • the system library and kernel layer below the application framework layer can also be referred to as the underlying system.
  • the underlying system includes a status monitoring service for identifying changes in the attitude of the mobile phone.
  • the status monitoring service can be set in the system library and/or the kernel layer.
  • the terminal device that executes the overhead identification method provided in the embodiment of the present application may be a vehicle.
  • the overhead identification method can be performed by a vehicle unit in the vehicle, wherein the vehicle unit is usually installed in the center console of the vehicle.
  • vehicle 100 may be a smart vehicle.
  • Fig. 7 is a functional block diagram of the vehicle 100 provided by the embodiment of the present application.
  • vehicle 100 may include various subsystems such as travel system 1002 , sensor system 1004 , planning control system 1006 , one or more peripheral devices 1008 as well as power supply 1010 , computer system 1001 and user interface 1016 .
  • vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple elements.
  • each subsystem and element of the vehicle 100 may be interconnected by wire or wirelessly.
  • Propulsion system 1002 may include components that power vehicle 100 .
  • propulsion system 1002 may include engine 1018 , energy source 1019 , transmission 1020 and wheels 1021 .
  • the engine 1018 may be an internal combustion engine, an electric motor, an air compression engine or other types of an engine or a combination of multiple engines.
  • the combination of various engines may include, for example: a hybrid engine composed of a gasoline engine and an electric motor , a hybrid engine consisting of an internal combustion engine and an air compression engine.
  • the engine 1018 converts the energy source 1019 into mechanical energy.
  • Examples of energy sources 1019 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electrical power. Energy source 1019 may also provide energy to other systems of vehicle 100 .
  • Transmission 1020 may transmit mechanical power from engine 1018 to wheels 1021 .
  • Transmission 1020 may include a gearbox, a differential, and drive shafts.
  • the transmission device 1020 may also include other devices, such as clutches.
  • drive shafts may include one or more axles that may be coupled to one or more wheels 1021 .
  • the sensor system 1004 may include several sensors that sense information about the vehicle 100 itself and the environment surrounding the vehicle 100 .
  • the sensor system 1004 may include a positioning system 1022 (the positioning system may be a GNSS system, may include a GPS system, may also include a Beidou system or other positioning systems), an inertial measurement unit (inertial measurement unit, IMU) 1024, a radar 1026, a laser Range finder 1028 , camera 1030 , computer vision system 1038 and sensor fusion algorithm 1040 .
  • the sensor system 1004 may also include sensors of interior systems of the vehicle 100 (eg, cabin air quality monitor, fuel gauge, oil temperature gauge, etc.). Sensor data from one or more of these sensors can be used to detect the object to be detected and its corresponding properties (position, shape, orientation, velocity, etc.). Such detection and identification is a critical function for safe operation of the vehicle 100 .
  • the global positioning system 1022 may be used to estimate the geographic location of the vehicle 100 .
  • the IMU 1024 is used to sense changes in position and orientation of the vehicle 100 based on inertial acceleration.
  • IMU 1024 may be a combination accelerometer and gyroscope.
  • the radar 1026 may utilize radio signals to sense objects in the environment surrounding the vehicle 100 . In some embodiments, in addition to sensing objects, radar 1026 may be used to sense the velocity or direction of travel of objects.
  • the laser range finder 1028 may utilize laser light to sense objects in the environment of the vehicle 100 .
  • laser rangefinder 1028 may include one or more laser sources, a laser scanner, and one or more detectors, among other system components.
  • Camera 1030 may be used to capture multiple images of the surrounding environment of vehicle 100 .
  • Camera 1030 may be a still camera or a video camera.
  • Computer vision system 1038 is operable to process and analyze images captured by camera 1030 in order to identify objects or features in the environment surrounding vehicle 100 .
  • the objects or features may include traffic signals, road boundaries and objects.
  • the computer vision system 1038 may use object recognition algorithms, Structure from Motion (SFM) algorithms, video tracking, and other computer vision techniques.
  • computer vision system 1038 may be used to map the environment, track objects, estimate the velocity of objects, and the like.
  • the planning control system 1006 is used to control the operation of the vehicle 100 and its components.
  • Planning control system 1006 may include various elements including steering system 1032 , accelerator 1034 , braking unit 1036 , route control system 1042 , and object avoidance system 1044 .
  • the forward direction of the vehicle 100 can be adjusted by operating the steering system 1032 .
  • for example, in one embodiment, the steering system 1032 may be a steering wheel system.
  • Throttle 1034 is used to control the operating speed of engine 1018 and thus the speed of vehicle 100 .
  • the braking unit 1036 is used to control the deceleration of the vehicle 100 .
  • the braking unit 1036 may use friction to slow the wheels 1021 .
  • the brake unit 1036 can convert the kinetic energy of the wheel 1021 into electric current.
  • the braking unit 1036 may also take other forms to slow down the rotation of the wheels 1021 to control the speed of the vehicle 100 .
  • the route planning system 1042 is used to determine the driving route of the vehicle 100 .
  • route planning system 1042 may combine data from sensors 1038, GPS 1022, and one or more predetermined maps to plan a travel route for vehicle 100 that avoids potential objects in the environment.
  • the trajectory planning method provided by the embodiment of the present application can be executed by the route planning system 1042 to output a target driving trajectory for the vehicle 100, where the target driving trajectory includes multiple target waypoints.
  • each target waypoint includes the coordinates of the waypoint, as well as the lateral allowable error and the speed allowable error of the waypoint.
  • the lateral allowable error described herein includes the value range of the lateral allowable error, and in some cases may be understood as an abbreviation for the value range of the lateral allowable error.
  • the lateral direction here refers to the direction perpendicular, or approximately perpendicular, to the direction of travel of the vehicle; the lateral allowable error essentially means the allowable error of the lateral displacement, that is, the value range of the displacement error of the vehicle 100 in the direction perpendicular or approximately perpendicular to the vehicle's travel direction. This will not be repeated in the following text.
  • the control system 1044 is used to generate accelerator, brake, and steering angle control quantities according to the driving route/trajectory output by the route planning system, so as to control the steering system 1032 , the accelerator 1034 and the braking unit 1036 .
  • planning control system 1006 may additionally or alternatively include components other than those shown and described. Alternatively, some of the components shown above may be reduced.
  • Vehicle 100 interacts with external sensors, other vehicles, other computer systems, or users through peripherals 1008 .
  • Peripherals 1008 may include wireless communication system 1046 , on-board computer 1048 , microphone 1050 or speaker 1052 .
  • peripheral device 1008 provides a means for a user of vehicle 100 to interact with user interface 1016 .
  • on-board computer 1048 may provide information to a user of vehicle 100 .
  • User interface 1016 may also operate on-board computer 1048 to receive user input.
  • the on-board computer 1048 can be operated through a touch screen.
  • peripheral devices 1008 may provide a means for vehicle 100 to communicate with other devices located within the vehicle.
  • microphone 1050 may receive audio (eg, voice commands or other audio input) from a user of vehicle 100 .
  • speaker 1052 may output audio to a user of vehicle 100 .
  • Wireless communication system 1046 may communicate wirelessly with one or more devices, either directly or via a communication network.
  • the wireless communication system 1046 may use 3G cellular communications such as CDMA, EVDO or GSM/GPRS, 4G cellular communications such as LTE, or 5G cellular communications.
  • the wireless communication system 1046 can use WiFi to communicate with a wireless local area network (wireless local area network, WLAN).
  • wireless communication system 1046 may communicate directly with the device using an infrared link, Bluetooth, or ZigBee.
  • other wireless protocols may also be used, such as various vehicle communication systems; for example, the wireless communication system 1046 may include one or more dedicated short range communications (DSRC) devices, which may include public or private data communication between vehicles and/or roadside stations.
  • Power supply 1010 may provide power to various components of vehicle 100 .
  • the power source 1010 may be a rechargeable lithium-ion or lead-acid battery.
  • One or more packs of such batteries may be configured as a power source and provide power to various components of the vehicle 100 .
  • power source 1010 and energy source 1019 may be implemented together, such as in an all-electric vehicle.
  • Computer system 1001 may include at least one processor 1013 executing instructions 1015 stored in a non-transitory computer-readable medium such as memory 1014 .
  • Computer system 1001 may also be a plurality of computing devices that control individual components or subsystems of vehicle 100 in a distributed manner.
  • Processor 1013 may be any conventional processor, such as a commercially available CPU.
  • the processor may be a special purpose device such as an ASIC or other hardware based processor.
  • although FIG. 7 functionally illustrates the processor, memory, and other elements of the computer system 1001, those of ordinary skill in the art will appreciate that the processor and memory may actually comprise multiple processors or memories that are not located within the same physical enclosure.
  • the memory may be a hard drive or other storage medium located in a different housing than the computer system 1001 . Accordingly, a reference to a processor will be understood to include references to a collection of processors or memories that may or may not operate in parallel.
  • some components may each have their own processor that only performs calculations related to component-specific functions ; or subsystems such as the traveling system, sensor system, and planning control system may also have their own processors, which are used to perform calculations of related tasks of the corresponding subsystems to achieve corresponding functions.
  • the processor can be located remotely from the vehicle and be in wireless communication with the vehicle. In other aspects, some of the processes described herein are executed on a processor disposed within the vehicle, while others are executed by a remote processor, including taking the necessary steps to perform a single maneuver.
  • memory 1014 may contain instructions 1015 (eg, program logic) executable by processor 1013 to perform various functions of vehicle 100 , including those described above.
  • Memory 1014 may also contain additional instructions, including instructions to send data to, receive data from, interact with, or control one or more of travel system 1002, sensor system 1004, planning control system 1006, and peripherals 1008 .
  • memory 1014 may also store other relevant data, such as road maps, route information, vehicle's position, direction, speed, and other relevant information. Such information may be used by the vehicle 100 or specifically by the computer system 1001 during operation of the vehicle 100 in autonomous, semi-autonomous, or manual modes.
  • a user interface 1016 for providing information to or receiving information from a user of the vehicle 100 may include one or more input/output devices within set of peripheral devices 1008 , such as wireless communication system 1046 , on-board computer 1048 , microphone 1050 and speaker 1052 .
  • Computer system 1001 may control functions of vehicle 100 based on input received from various subsystems (eg, travel system 1002 , sensor system 1004 , and planning control system 1006 ) as well as from user interface 1016 .
  • the computer system 1001 is operable to provide control over many aspects of the vehicle 100 and its subsystems.
  • one or more of these components described above may be installed separately from or associated with the vehicle 100 .
  • memory 1014 may exist partially or completely separate from vehicle 100 .
  • the above-mentioned components may be communicatively coupled together in a wired or wireless manner.
  • FIG. 7 should not be construed as a limitation to this embodiment of the present invention.
  • the above-mentioned vehicle 100 can be a car, truck, motorcycle, bus, boat, airplane, helicopter, lawn mower, recreational vehicle, playground vehicle, construction equipment, tram, golf cart, train, trolley, etc., which is not specifically limited in this embodiment of the present application.
  • the vehicle 100 can receive the GNSS signal, determine its own position through the GNSS signal to realize its own positioning, and can determine, through the elevated identification method provided in the embodiment of the present application, whether the vehicle is located on the road on the upper side of the elevated or on the road on the lower side of the elevated.
  • the elevated identification method provided by the embodiment of the present application will be exemplarily described below in conjunction with the terminal device shown in FIG. 4 and the schematic interface diagrams of the terminal device shown in FIG. 8( a ) to FIG. 8( c ).
  • the vehicle can be navigated through the terminal device.
  • during driving, the vehicle can be navigated by a navigation APP installed in the terminal device.
  • the terminal device can determine the first parameter of the vehicle at the first moment through the elevated identification method provided by the embodiment of the present application, determine the second parameter of the vehicle at the first moment according to the sensor, and then determine the driving state of the vehicle according to the first parameter and the second parameter, so as to obtain the elevated identification result of the vehicle.
  • the first parameter of the vehicle at the first moment may be determined periodically after each startup of the terminal device.
  • the terminal device can determine the first parameter of the vehicle at the first moment when a certain trigger condition is met. For this scenario, for example, the following solutions are disclosed:
  • if the terminal device receives the operation of starting the navigation function, it indicates that the user needs to use the terminal device for navigation. In this case, the first parameter of the vehicle at the first moment is determined.
  • the operation of starting the navigation function may include various forms, for example, may include a touch operation on the navigation APP or a specific gesture operation, etc., which is not limited in this embodiment of the present application.
  • the terminal device receives a location search operation, indicating that the user needs to view the surrounding environment of a certain location, and the user often has navigation needs, then the first parameter of the vehicle at the first moment is determined.
  • Exemplarily refer to an example diagram of a display interface of a terminal device shown in FIG. 8(a).
  • the position indicated by the circle containing the triangle is the National Library of China.
  • the first parameter of the vehicle at the first moment can be determined.
  • if the terminal device receives an operation for indicating a destination, it indicates that the user needs to go to a certain destination, and the user often has navigation needs, so the first parameter of the vehicle at the first moment is determined.
  • the starting point is the current location of the vehicle
  • the ending point is the National Library of China.
  • a first parameter of the vehicle at a first instant can be determined.
  • Navigation methods usually include: taxi, driving, public transportation, walking and cycling, etc.
  • if the navigation mode applied by the terminal device is driving, it indicates that the user needs to drive a vehicle and has a navigation demand.
  • a first parameter of the vehicle at a first instant can be determined.
  • the navigation mode applied by the terminal device is driving.
  • if the speed of the terminal device is greater than the target speed threshold, it indicates that the terminal device is moving relatively fast and that the user carrying the terminal device is driving a vehicle.
  • the user may drive into the road on the upper side of the elevated road, therefore, the first parameter of the vehicle at the first moment can be determined.
  • the target speed threshold may be 30 km/h; of course, the target speed threshold may also be set to other values, which is not limited in this embodiment of the present application.
  • the terminal device can determine the location information of the terminal device according to the GNSS signal, and the terminal device can transmit the location information to a remote server.
  • the server stores the positions of ramps in various places, and determines whether there is a ramp ahead of the vehicle according to the received position information. After determining that there is a ramp ahead of the vehicle, the server transmits corresponding prompt information to the terminal device to prompt that the vehicle is about to drive into the ramp.
  • the terminal device may store the location information of ramp crossings in various places. After determining the location information of the terminal device, the terminal device may match the location information of the terminal device with its own storage to determine whether there is a ramp crossing ahead of the vehicle.
  • the terminal device can also be connected with the equipment in the vehicle, for example, the terminal device can be connected with the vehicle machine installed in the center console of the vehicle.
  • the vehicle-mounted device can store the location information of ramp junctions in various places, and the terminal device can transmit its location information to the device. Based on this, the device can determine whether there is a ramp junction ahead of the vehicle, and after determining that there is a ramp junction ahead, transmit the corresponding prompt information to the terminal device.
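  • as an illustration of the matching described above, the following is a minimal sketch (not taken from the patent; the function name, the planar-coordinate assumption and the thresholds are illustrative) of checking whether a stored ramp junction lies within a given distance ahead of the vehicle:

```python
import math

def ramp_ahead(vehicle_xy, heading_deg, ramp_points_xy,
               max_dist_m=300.0, max_bearing_err_deg=45.0):
    """Return True if any stored ramp junction lies within max_dist_m and
    roughly in front of the vehicle (within max_bearing_err_deg of its heading)."""
    hx, hy = math.sin(math.radians(heading_deg)), math.cos(math.radians(heading_deg))
    for rx, ry in ramp_points_xy:
        dx, dy = rx - vehicle_xy[0], ry - vehicle_xy[1]
        dist = math.hypot(dx, dy)
        if dist == 0.0 or dist > max_dist_m:
            continue
        # Angle between the vehicle heading and the direction to the ramp point.
        cos_ang = (dx * hx + dy * hy) / dist
        if math.degrees(math.acos(max(-1.0, min(1.0, cos_ang)))) <= max_bearing_err_deg:
            return True
    return False

# Example: heading north, a ramp junction about 200 m almost straight ahead.
print(ramp_ahead((0.0, 0.0), 0.0, [(10.0, 200.0)]))  # True
```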
  • Ramp junctions generally include ramp entrances and ramp exits.
  • when the vehicle drives onto the road on the upper side of the elevated, it often needs to pass through the ramp entrance first; and when the vehicle drives from the road on the upper side of the elevated onto the road on the lower side, it often needs to pass through the ramp exit first and then drive onto the lower road.
  • the terminal device can acquire the image in front of the vehicle and, through image analysis, determine whether there is an elevated sign ahead of the vehicle. If it is determined that there is an elevated sign ahead of the vehicle, it indicates that the vehicle is about to go onto or off the elevated. In this case, the first parameter of the vehicle at the first moment can be determined, so as to identify, through the solution provided by the embodiment of this application, whether the vehicle is driving on the road on the upper side of the elevated or on the road on the lower side of the elevated.
  • the terminal device can determine the location of the terminal device according to the GNSS signal and, according to the electronic map, determine whether there is an elevated road around the location. If so, it indicates that the vehicle may go onto or off the elevated, so the first parameter of the vehicle at the first moment can be determined, in order to identify whether the vehicle is on the road on the upper side of the elevated or on the road on the lower side of the elevated.
  • the terminal device may also determine the first parameter of the vehicle at the first moment in other scenarios, which is not limited in this embodiment of the present application.
  • an embodiment of the present application provides an elevated identification method.
  • the elevated identification method provided in the embodiment of the present application includes the following steps:
  • Step S11 Determine the first parameter of the vehicle at the first moment according to the GNSS signal of the global navigation satellite system.
  • the first parameter includes at least one of the following information: speed and heading of the vehicle.
  • the direction of the speed of the vehicle can usually be used as the heading of the vehicle.
  • according to the GNSS signals, the position information of the terminal device at different moments can be determined. It can be understood that when the terminal device is navigating the vehicle, the position information of the terminal device at different moments can reflect the trajectory of the vehicle to a certain extent, and the terminal device can determine the speed and heading of the vehicle accordingly. Among them, the speed of the vehicle can be determined through the distance between the positions of the vehicle at different moments and the corresponding time difference.
  • Figure 10(b), which corresponds to Figure 10(a), is also disclosed, where Figure 10(b) is a top view of Figure 10(a). The road shown in Figure 10(b) is the road on the upper side of the elevated, and the position of the vehicle at each moment is represented by a circle containing a number in Figure 10(b), where the smaller the number in the circle, the earlier the moment at which the vehicle is at this position.
  • it is assumed that the vehicle is at the circle position indicated by the number 1 at time t1, at the circle position indicated by the number 2 at time t2, at the circle position indicated by the number 3 at time t3, and at the circle position indicated by the number 4 at time t4. Since the vehicle travels from left to right, time t1 is earlier than time t2, time t2 is earlier than time t3, and time t3 is earlier than time t4.
  • the speed of the vehicle between time t1 and time t2 is the ratio of the distance between the circle position indicated by the number 1 and the circle position indicated by the number 2 to the time difference between time t1 and time t2; the speed of the vehicle between time t2 and time t3 is the ratio of the distance between the circle position indicated by the number 2 and the circle position indicated by the number 3 to the time difference between time t2 and time t3; and the speed of the vehicle between time t1 and time t4 is the ratio of the distance between the circle position indicated by the number 1 and the circle position indicated by the number 4 to the time difference between time t1 and time t4.
  • the speed of the vehicle is determined from the position of the vehicle at time t1, time t2, time t3 and time t4, respectively, and the time difference between the different times.
  • the heading of the vehicle is the direction indicated by the dotted line including the arrow in FIG. 10( b ).
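  • as an illustration of how the first parameter can be derived from two GNSS fixes, the following is a minimal sketch under assumed inputs (latitude/longitude in degrees, timestamps in seconds); the function names and the haversine/bearing formulas are a common choice, not something specified by the patent:

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GNSS fixes, in metres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def first_parameter(fix_a, fix_b):
    """fix_a/fix_b: (t_seconds, lat_deg, lon_deg). Returns (speed m/s, heading deg)."""
    t1, lat1, lon1 = fix_a
    t2, lat2, lon2 = fix_b
    distance = haversine_m(lat1, lon1, lat2, lon2)
    speed = distance / (t2 - t1)  # ratio of the distance to the time difference
    # Initial bearing from fix_a to fix_b: 0 deg = north, clockwise positive.
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    heading = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return speed, heading

# Example: two fixes one second apart (hypothetical values);
# prints roughly (25.6 m/s, 90 deg), i.e. the vehicle is moving east.
print(first_parameter((0.0, 39.946, 116.322), (1.0, 39.946, 116.3223)))
```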
  • Step S12 according to the sensor, determine the second parameter of the vehicle at the first moment.
  • the first moment is the current moment
  • the first parameter of the vehicle at the first moment may be the first parameter of the current moment
  • the second parameter of the vehicle at the first moment may be the second parameter of the vehicle at the current moment.
  • the first moment can be any moment in a time period.
  • the first parameter of the vehicle at the first moment can be the first parameter at a certain moment in the time period
  • the second parameter of the vehicle at the first moment may be the second parameter of the vehicle at another moment within the time period.
  • the second parameter includes at least one of the following information: pitch angle, roll angle and heading angle of the vehicle.
  • the pitch angle of the vehicle usually refers to the "pitch" angle of the vehicle relative to the XOY plane of the inertial coordinate system;
  • the roll angle of the vehicle usually refers to the lateral inclination angle used to identify the vehicle in the inertial coordinate system;
  • the heading angle of the vehicle usually refers to the angle between the velocity of the center of mass of the vehicle and the horizontal axis of the inertial coordinate system.
  • the second parameter may be collected by a sensor, where the sensor for collecting the second parameter may generally include a gyroscope, and of course, other sensors capable of collecting the second parameter may also be included, which is not limited in this embodiment of the present application.
  • the sensor for collecting the second parameter can be set in the terminal device used for navigation, or can be set in the vehicle. If the sensor is arranged in the vehicle, the sensor can transmit the collected second parameter to the terminal device through the network.
  • the sensor may periodically collect and store the second parameter during the period in which the terminal device receives the GNSS signal. In this case, the second parameter of the vehicle at the first moment can be determined based on the cached second parameter.
  • the sensor may be triggered to collect the second parameter, and the second parameter collected by the sensor may be obtained.
  • Step S13 Determine the driving state of the vehicle according to the first parameter and the second parameter.
  • the driving state of the vehicle generally includes multiple types. Wherein, if the vehicle needs to go on or off the elevated road, the driving state of the vehicle may include start of on-ramp, end of on-ramp, start of off-ramp and end of off-ramp. In addition, if the vehicle is neither on the elevated nor off the elevated, the driving state of the vehicle may include: uphill, downhill, and driving on the road.
  • the driving state of the vehicle may also include other types, which are not limited in this embodiment of the present application.
  • this step the driving state of the vehicle is determined according to the first parameter and the second parameter.
  • this operation can be realized through the following steps:
  • the first parameter and the second parameter are transmitted to a classifier, where the classifier is used to classify the driving state of the vehicle according to the parameters of the vehicle;
  • the driving state of the vehicle is determined according to the output of the classifier.
  • the driving state of the vehicle can be determined based on a classifier, and the classifier can be obtained by training in advance according to vehicle information of the vehicle in different driving states.
  • the classifier can be trained according to the first parameter and the second parameter, and the states output by the classifier can include various driving states of the vehicle.
  • the classifier can be trained by the speed, heading and pitch angle of the vehicle, and the states output by the classifier can include states such as start of on-ramp, end of on-ramp, start of off-ramp, and end of off-ramp.
  • the driving state of the vehicle is determined according to the first parameter and the second parameter
  • the first parameter and the second parameter are the input of the classifier
  • the driving state of the vehicle is the output of the classifier
  • the classifier may include a support vector machine (support vector machine, SVM).
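  • the following is a minimal sketch of such a classifier using scikit-learn's SVC as a stand-in for the SVM mentioned above; the feature layout, the label names and the tiny training set are illustrative assumptions only (a real classifier would be trained offline on many labelled samples of the first and second parameters):

```python
from sklearn.svm import SVC
import numpy as np

# Each sample: [speed (m/s), heading (deg), pitch (deg), roll (deg), heading angle (deg)]
X_train = np.array([
    [12.0,  90.0,  4.5, 0.2,  91.0],   # labelled "start_on_ramp"
    [15.0,  92.0,  0.1, 0.0,  92.0],   # labelled "end_on_ramp"
    [10.0, 270.0, -4.0, 0.1, 269.0],   # labelled "start_off_ramp"
    [14.0, 268.0,  0.0, 0.0, 268.0],   # labelled "end_off_ramp"
])
y_train = ["start_on_ramp", "end_on_ramp", "start_off_ramp", "end_off_ramp"]

clf = SVC(kernel="rbf", gamma="scale")
clf.fit(X_train, y_train)

# At run time, the first parameter (from GNSS) and the second parameter (from the
# gyroscope) at the first moment are concatenated and fed to the classifier.
sample = np.array([[13.0, 91.0, 4.2, 0.1, 90.5]])
print(clf.predict(sample))  # e.g. ["start_on_ramp"]
```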
  • Step S14 when the driving state of the vehicle belongs to the target state, determine the elevated recognition result of the vehicle according to the height change of the vehicle.
  • the target state includes at least one of the following driving states: start of on-ramp, end of on-ramp, start of off-ramp and end of off-ramp, and the elevated identification result of the vehicle is that the vehicle is traveling on the road on the upper side of the elevated, or that the vehicle is traveling on the road on the lower side of the elevated.
  • the driving state of the vehicle is one of the target states, it indicates that the driving state of the vehicle belongs to the target state.
  • there is usually a ramp at the entrance and exit of the elevated.
  • when the vehicle goes onto the elevated, it first needs to go up the ramp and then drive onto the road on the upper side of the elevated.
  • when the vehicle goes off the elevated, it first needs to go down the ramp and then drive onto the road on the lower side of the elevated. Therefore, if the driving state of the vehicle belongs to the target state, it means that the vehicle is starting to go onto the elevated, has finished going onto the elevated, is starting to go off the elevated, or has finished going off the elevated.
  • the embodiment of the present application determines the elevated identification result of the vehicle in combination with the target state to which the vehicle belongs and the height change of the vehicle.
  • the elevated road can have one layer or multiple layers; an elevated with one layer means that the elevated only includes one layer of road on its upper side, and an elevated with multiple layers means that the elevated includes multiple layers of road on its upper side.
  • the identification result of the elevated structure of the vehicle can be determined through the following steps:
  • if the driving state of the vehicle is starting the on-ramp, and the height change of the vehicle within the first time period after that is greater than a first threshold, the elevated identification result of the vehicle is determined to be that the vehicle is traveling on the road on the upper side of the elevated.
  • the first time period is a time period after the time when the driving state of the vehicle is determined. That is to say, if within a period of time after the driving state of the vehicle becomes starting the on-ramp, the height of the vehicle increases and the increase is greater than the first threshold, it is determined that the vehicle is traveling on the road on the upper side of the elevated.
  • the first threshold is generally a positive number, and the specific value of the first threshold can be preset. Alternatively, the specific numerical value of the first threshold may be determined by the height of the local elevated.
  • the first threshold is usually slightly smaller than the height of the elevated. Since the elevated is one floor, in this scheme, the height of the elevated usually refers to the height of the road on the upper side of the elevated when the road on the lower side of the elevated is taken as the reference plane.
  • the terminal device may determine the height of the local overhead through various methods.
  • the terminal device can store the height of the overhead in various places.
  • after the terminal device determines the location information of the terminal device according to the GNSS signal, it can determine the height of the elevated around the location by querying its own storage.
  • after the terminal device determines the location information of the terminal device according to the received GNSS signal, it can transmit the location information to a remote server, and the remote server determines the height of the elevated around the location based on the location information and transmits it to the terminal device.
  • the terminal device can exchange information with the vehicle.
  • the terminal device can transmit its own position information to the vehicle, and the vehicle determines the height of the elevated and then transmits the height of the elevated to the terminal device.
  • the terminal device may also determine the height of the local elevated in other ways, which is not limited in this embodiment of the present application.
  • the duration of the first time period may be preset.
  • the first time period may be determined according to the time t1 required for the vehicle to get on or off the elevated, where the first time period may be slightly longer than the time t1. For example, if t1 is 20 seconds, the first time period may be 25 seconds.
  • the length of the first time period may also be determined in other ways.
  • the terminal device may determine the length of the first time period according to the received setting operation for the first time period, which is not limited in this embodiment of the present application.
  • if the driving state of the vehicle is starting the off-ramp, and the absolute value of the height change of the vehicle within the first time period after that is greater than the first threshold, the elevated identification result of the vehicle is determined to be that the vehicle is traveling on the road on the lower side of the elevated.
  • the first time period is a time period after the time when the driving state of the vehicle is determined. That is to say, if within a period of time after the driving state of the vehicle becomes starting the off-ramp, the height of the vehicle decreases and the decrease is greater than the first threshold, it is determined that the vehicle is traveling on the road on the lower side of the elevated.
  • if the driving state of the vehicle is ending the on-ramp, and the height change of the vehicle in the previous second time period is greater than the first threshold, the elevated identification result of the vehicle is determined to be that the vehicle is traveling on the road on the upper side of the elevated.
  • the second time period is a time period before the time when the driving state of the vehicle is determined. That is to say, if during a period of time before the driving state of the vehicle becomes ending the on-ramp, the height of the vehicle increases and the increase is greater than the first threshold, it is determined that the vehicle is traveling on the road on the upper side of the elevated.
  • the duration of the second time period may be preset.
  • the second time period may be determined according to the time t1 required for the vehicle to get on or off the elevated, where the second time period may be slightly longer than the time t1.
  • the length of the second time period may be the same as that of the first time period, or the two may also be different, which is not limited in this embodiment of the present application.
  • if the driving state of the vehicle is ending the off-ramp, and the height change of the vehicle in the previous second time period is greater than the first threshold, the elevated identification result of the vehicle is determined to be that the vehicle is traveling on the road on the lower side of the elevated.
  • the second time period is a time period before the time when the driving state of the vehicle is determined. That is to say, if during a period of time before the driving state of the vehicle becomes ending the off-ramp, the height of the vehicle decreases and the decrease is greater than the first threshold, it is determined that the vehicle is traveling on the road on the lower side of the elevated.
  • through the above steps, the elevated identification results of the vehicle in different driving states can be determined. Moreover, when determining the elevated identification result, the above steps combine the driving state of the vehicle and the height change of the vehicle to identify whether the vehicle is on the road on the upper side of the elevated or on the road on the lower side of the elevated, which solves the problem in the prior art that the elevated identification result of the vehicle cannot be determined.
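  • the single-layer decision rule described above can be summarised by the following minimal sketch; the state names, the sign convention for the height change and the return values are assumptions for illustration:

```python
def elevated_result_single_layer(driving_state, height_change_m, first_threshold_m):
    """driving_state: 'start_on_ramp', 'end_on_ramp', 'start_off_ramp' or 'end_off_ramp'.
    height_change_m: height at the end of the relevant window minus height at its start
    (the first time period after a 'start_*' state, the second time period before an 'end_*' state)."""
    if driving_state in ("start_on_ramp", "end_on_ramp") and height_change_m > first_threshold_m:
        return "on_upper_road"   # the climb exceeded the first threshold
    if driving_state in ("start_off_ramp", "end_off_ramp") and height_change_m < -first_threshold_m:
        return "on_lower_road"   # the descent exceeded the first threshold
    return "undetermined"

# Example: the vehicle starts up the ramp and climbs 5.5 m against a 5 m threshold.
print(elevated_result_single_layer("start_on_ramp", 5.5, 5.0))   # on_upper_road
print(elevated_result_single_layer("end_off_ramp", -6.0, 5.0))   # on_lower_road
```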
  • the elevated road includes two or more layers, that is, the road on the upper side of the elevated includes at least two layers. If the elevated structure includes at least two floors, in this scenario, in the case of different driving states of the vehicle, the elevated recognition result of the vehicle can be determined through the following steps:
  • if the driving state of the vehicle is starting the on-ramp, and the height change of the vehicle within the first time period after that is greater than a first threshold, the elevated identification result of the vehicle is determined to be that the vehicle is traveling on the road on the upper side of the elevated. The first time period is a time period after the time when the driving state of the vehicle is determined. That is to say, if within a period of time after the driving state of the vehicle becomes starting the on-ramp, the height of the vehicle increases and the increase is greater than the first threshold, it is determined that the vehicle is traveling on the road on the upper side of the elevated.
  • if the driving state of the vehicle is starting the off-ramp, and the absolute value of the height change of the vehicle within the first time period after that is greater than the first threshold, the elevated identification result of the vehicle is determined according to the state of the vehicle before the off-ramp, that is, it is determined whether the vehicle is traveling on the road on the lower side of the elevated.
  • if the absolute value of the height change of the vehicle within the first time period is greater than the first threshold, it indicates that the height of the vehicle has decreased within a period of time after the driving state of the vehicle becomes starting the off-ramp.
  • the elevated can include multiple layers; in this case, it is possible for the vehicle to drive onto the road on the lower side of the elevated, or to drive from a road on a higher layer on the upper side of the elevated to a road on a lower layer on the upper side of the elevated.
  • for example, the road on the upper side of the elevated includes three layers, where the higher the set height, the higher the corresponding layer number: the layer closest to the road on the lower side of the elevated is the first layer, and the layer farthest from the road on the lower side of the elevated is the third layer.
  • if the vehicle starts to go down the ramp, and the absolute value of the height change of the vehicle in the first time period after that is greater than the first threshold, it may be that the vehicle has driven from the third-layer road on the upper side of the elevated to the second-layer road on the upper side of the elevated. Therefore, it is also necessary to determine, according to the state of the vehicle before the off-ramp, whether the elevated identification result of the vehicle is that the vehicle is traveling on the road on the lower side of the elevated.
  • the state of the vehicle before the off-ramp generally includes: the vehicle is traveling on the first-layer elevated, or the vehicle is traveling on another layer of the elevated, where the first-layer elevated is the layer of the elevated closest to the road on the lower side of the elevated.
  • if the state of the vehicle before the off-ramp is that the vehicle is traveling on the first-layer elevated, then in the above step it can be determined that the elevated identification result of the vehicle is that the vehicle is traveling on the road on the lower side of the elevated; if the state of the vehicle before the off-ramp is that the vehicle is traveling on another layer of the elevated, then in the above step it can be determined that the elevated identification result of the vehicle is that the vehicle is traveling on the road on the upper side of the elevated, that is, the vehicle has driven from a road on a higher layer on the upper side of the elevated to a road on a lower layer on the upper side of the elevated.
  • if the driving state of the vehicle is ending the on-ramp, and the height change of the vehicle in the previous second time period is greater than the first threshold, the elevated identification result of the vehicle is determined to be that the vehicle is traveling on the road on the upper side of the elevated.
  • the second time period is a time period before the time when the driving state of the vehicle is determined. That is to say, if during a period of time before the driving state of the vehicle becomes ending the on-ramp, the height of the vehicle increases and the increase is greater than the first threshold, it is determined that the vehicle is traveling on the road on the upper side of the elevated.
  • if the driving state of the vehicle is ending the off-ramp, and the absolute value of the height change of the vehicle in the previous second time period is greater than the first threshold, the elevated identification result of the vehicle is determined according to the state of the vehicle before the off-ramp.
  • the second time period is a time period before the time when the driving state of the vehicle is determined. If the driving state of the vehicle is ending the off-ramp, and the absolute value of the height change of the vehicle within the second time period is greater than the first threshold, it indicates that the height of the vehicle has decreased within a period of time before the driving state of the vehicle becomes ending the off-ramp.
  • the elevated can include multiple layers; in this case, it is possible for the vehicle to drive onto the road on the lower side of the elevated, or to drive from a road on a higher layer on the upper side of the elevated to a road on a lower layer on the upper side of the elevated. Therefore, it is also necessary to determine, according to the state of the vehicle before the off-ramp, whether the elevated identification result of the vehicle is that the vehicle is traveling on the road on the lower side of the elevated.
  • if the state of the vehicle before the off-ramp is that the vehicle is traveling on the first-layer elevated, then in the above step it can be determined that the elevated identification result of the vehicle is that the vehicle is traveling on the road on the lower side of the elevated; if the state of the vehicle before the off-ramp is that the vehicle is traveling on another layer of the elevated, then in the above step it can be determined that the elevated identification result of the vehicle is that the vehicle is traveling on the road on the upper side of the elevated, that is, the vehicle has driven from a road on a higher layer on the upper side of the elevated to a road on a lower layer on the upper side of the elevated.
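  • the multi-layer rule described above, in which a confirmed descent is resolved using the state of the vehicle before the off-ramp, can be sketched as follows; encoding that state as a recorded layer number is an assumption for illustration:

```python
def elevated_result_after_descent(layer_before_off_ramp):
    """layer_before_off_ramp: recorded layer of the upper-side road the vehicle was on
    before the off-ramp (1 = the layer closest to the road under the elevated).
    Returns (identification result, new recorded layer)."""
    if layer_before_off_ramp <= 1:
        # Coming down from the first layer: the vehicle is now on the lower road.
        return "on_lower_road", 0
    # Coming down from a higher layer: still on the upper side, one layer lower.
    return "on_upper_road", layer_before_off_ramp - 1

print(elevated_result_after_descent(1))  # ('on_lower_road', 0)
print(elevated_result_after_descent(3))  # ('on_upper_road', 2)
```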
  • the above steps combine the state of the vehicle before the off-ramp when determining the elevated identification result of the vehicle.
  • the state of the vehicle before going off the ramp can be determined according to the height of the vehicle and the height of each elevated floor.
  • the height of each elevated level generally refers to the height of the elevated road at that level when the road on the lower side of the elevated is taken as the reference plane.
  • for example, the road on the upper side of the elevated includes multiple layers, where the higher the set height, the higher the corresponding layer number, and the layer closest to the road on the lower side of the elevated is the first layer; the height of the nth-layer elevated is the height of the nth-layer road on the upper side of the elevated when the road on the lower side of the elevated is taken as the reference plane.
  • the height of the vehicle can be determined according to a height sensor (such as a barometer, etc.).
  • the height sensor can be installed in the terminal device, or the height sensor can be set in the vehicle, and after collecting the height of the vehicle, transmit the height of the vehicle to the terminal device.
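  • as an illustration of obtaining the vehicle height from a barometer, the following minimal sketch uses the standard-atmosphere approximation; the reference sea-level pressure and the sample readings are assumptions:

```python
def barometric_height_m(pressure_hpa, sea_level_hpa=1013.25):
    """International barometric formula (approximate, troposphere only)."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

# Height change over a time window = height at the end minus height at the start.
h_start = barometric_height_m(1008.0)   # pressure before going up the ramp (hypothetical)
h_end = barometric_height_m(1007.2)     # pressure after going up the ramp (hypothetical)
print(round(h_end - h_start, 1))        # a rise of roughly 7 m, compared against the first threshold
```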
  • the terminal device can determine the height of each layer of the elevated in various ways.
  • the terminal device may store the heights of each layer of the elevated in various places.
  • the terminal device determines the location information of the terminal device according to the GNSS signal, it can determine the elevated roads around the location information, and determine the height of the elevated roads on each floor by querying its own storage.
  • after the terminal device determines the location information of the terminal device according to the received GNSS signal, it can transmit the location information to a remote server, and the remote server determines the height of each layer of the elevated and transmits it to the terminal device, so that the terminal device can determine the height of each layer according to what the server transmits.
  • the terminal device may also determine the height of each layer of the elevated in other ways, which is not limited in this embodiment of the present application.
  • the state of the vehicle before the off-ramp can be determined.
  • the state of the vehicle before the off-ramp can be determined through the following steps:
  • the first step is to record the change in the number of layers of the elevated where the vehicle is located each time after the elevated identification result of the vehicle is determined;
  • the second step is to determine the state of the vehicle before the off-ramp according to the change of the number of floors where the vehicle is located.
  • each time the terminal device determines that the vehicle has driven from its current position onto the road on a higher layer of the elevated, the recorded number of layers of the road on the upper side of the elevated where the vehicle is located is increased by one; each time it determines that the vehicle has driven from its current position onto the road on a lower layer of the elevated, the recorded number of layers is decreased by one.
  • if the vehicle is not on the elevated, the recorded number of layers of the elevated where the vehicle is located can usually be 0. If the recorded number of layers of the elevated where the vehicle is located is greater than 0, it indicates that the vehicle is on the road on the upper side of the elevated; if the recorded number of layers is 0, it indicates that the vehicle is on the road on the lower side of the elevated.
  • for example, the terminal device determines, according to the first parameter and the second parameter of the vehicle, that the driving state of the vehicle is starting the on-ramp, and the height change of the vehicle within the first time period is greater than the first threshold. In this case, the terminal device determines that the vehicle has gone up the elevated once more, and records that the number of layers of the elevated where the vehicle is located is n+1, where n is the previously recorded number of layers.
  • based on the record of the changes in the number of layers of the elevated where the vehicle is located, the state of the vehicle before the off-ramp can be determined.
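  • the layer-count bookkeeping described in the above steps can be sketched as follows; the class and method names are illustrative assumptions, not taken from the patent:

```python
class ElevatedLayerTracker:
    def __init__(self):
        self.layer = 0  # 0 means the vehicle is on the road below the elevated

    def on_ramp_up_confirmed(self):
        """Called when starting the on-ramp is detected and the height rise in the
        following first time period exceeds the first threshold."""
        self.layer += 1

    def on_ramp_down_confirmed(self):
        """Called when a confirmed descent (off-ramp) exceeds the first threshold."""
        self.layer = max(0, self.layer - 1)

    def state_before_off_ramp(self):
        if self.layer == 1:
            return "on_first_layer"
        return "on_higher_layer" if self.layer > 1 else "below_elevated"

tracker = ElevatedLayerTracker()
tracker.on_ramp_up_confirmed()          # vehicle goes up once -> layer 1
print(tracker.state_before_off_ramp())  # on_first_layer
tracker.on_ramp_down_confirmed()        # vehicle comes down -> layer 0
print(tracker.layer)                    # 0
```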
  • An embodiment of the present application provides an elevated identification method.
  • in this method, the first parameter of the vehicle at the first moment is determined according to the GNSS signal, and the second parameter of the vehicle at the first moment is determined through the sensor.
  • the driving state of the vehicle is then determined by combining the first parameter and the second parameter, and the elevated identification result of the vehicle is determined by combining the driving state of the vehicle and the height change of the vehicle. This solves the problem in the prior art that the elevated identification result of the vehicle cannot be determined, and can reduce the number of navigation errors and improve the accuracy of navigation.
  • since the solution of the present application can improve the accuracy of navigation, it can also improve the experience of the user in the vehicle, reduce the time consumed by the driving process, and reduce the fuel consumption of the vehicle, so as to achieve the purpose of energy saving.
  • since the state of the vehicle before the off-ramp is determined according to the change in the number of layers of the elevated where the vehicle is located, the embodiment of the present application can also determine the number of layers of the elevated where the vehicle is located, so as to further improve the accuracy of navigation.
  • if the elevated identification result of the vehicle indicates that the vehicle is traveling on the road on the upper side of the elevated, the elevated identification result of the vehicle and the number of layers of the elevated where the vehicle is located are reported to the server.
  • the server may be a server of the navigation APP, and according to the reported information, the server determines whether the vehicle is traveling on the road on the upper side of the elevated or on the road on the lower side of the elevated. If the terminal device reports not only the elevated identification result of the vehicle but also the number of layers of the elevated where the vehicle is located, then when the vehicle is traveling on the road on the upper side of the elevated, the server can also determine the layer on which the vehicle is located, so that the location of the terminal device can be determined more accurately, which helps to improve the accuracy of navigation.
  • if the elevated identification result of the vehicle shows that the vehicle is traveling on the road on the lower side of the elevated, and there is a weak signal area ahead in the vehicle's driving direction, the navigation method is adjusted from GNSS signal navigation to the network positioning method, or to the network positioning method combined with the inertial navigation method.
  • the weak signal area generally includes: a tunnel area or a block-shielding area.
  • the occluded area refers to an area covered by an occluder resulting in a weaker signal, and the occluder may be a building or vegetation.
  • the terminal device may store the location of each weak signal area, and determine whether the front of the vehicle's driving direction is a weak signal area according to its own storage.
  • alternatively, a remote server can determine whether there is a weak signal area ahead in the vehicle's driving direction; if so, the server transmits a corresponding instruction to the terminal device, so that the terminal device can determine, according to the received instruction, that there is a weak signal area ahead.
  • in the weak signal area, the GNSS signal received by the terminal device is often weak, or the GNSS signal cannot be received at all. If the terminal device continues to navigate based on the GNSS signal, the navigation accuracy is low, and when the GNSS signal cannot be received, navigation is not possible.
  • the terminal device performs navigation through a network positioning method or a network positioning method and an inertial navigation method, which can improve the accuracy of navigation.
  • the network positioning is a positioning technology for determining the position of the terminal device through a network signal received by the terminal device.
  • the network signal may originate from a base station, or may originate from a wireless fidelity (wireless fidelity, WIFI) hotspot.
  • if the network signal comes from a base station, the terminal device can determine the distance between itself and different base stations through the transmission time and receiving time of the network signals transmitted by the different base stations, and then determine the location of the terminal device according to the distances between itself and the different base stations and the locations of the different base stations; if the network signal comes from a WIFI hotspot, the terminal device can determine the distance between itself and different WIFI hotspots through the transmission time and receiving time of the network signals transmitted by the different WIFI hotspots, and then determine the location of the terminal device according to the distances between itself and the different WIFI hotspots and the locations of the different WIFI hotspots.
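  • as an illustration of the network positioning idea above, the following minimal sketch estimates the distance to each base station or WIFI hotspot from the signal propagation time and then solves for the position by linearised least squares; the anchor coordinates and the assumption of synchronised clocks are illustrative:

```python
import numpy as np

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def range_from_times(t_transmit_s, t_receive_s):
    """Distance implied by the one-way propagation delay of the network signal."""
    return (t_receive_s - t_transmit_s) * SPEED_OF_LIGHT

def solve_position(anchors, ranges):
    """Linearised least-squares fix from >= 3 anchors at known (x, y) in metres."""
    (x0, y0), r0 = anchors[0], ranges[0]
    A, b = [], []
    for (xi, yi), ri in zip(anchors[1:], ranges[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(r0**2 - ri**2 + xi**2 - x0**2 + yi**2 - y0**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol  # estimated (x, y) of the terminal device

anchors = [(0.0, 0.0), (500.0, 0.0), (0.0, 400.0)]        # hypothetical base stations
true_pos = np.array([120.0, 80.0])
ranges = [np.linalg.norm(true_pos - np.array(a)) for a in anchors]
print(solve_position(anchors, ranges))                     # approximately [120.  80.]
```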
  • the location of the devices that generate network signals is different from that of satellites.
  • when shelters have a greater impact on the reception of GNSS signals, they may have less impact on the terminal device receiving network signals, so navigating through network positioning can improve the accuracy of navigation.
  • the terminal device can combine the network positioning method and the inertial navigation method to jointly perform navigation.
  • the inertial navigation method is based on vehicle dead reckoning (VDR) technology, which can estimate the instantaneous position of the vehicle through inertial navigation sensors (such as direction sensors and speed sensors).
  • the navigation accuracy can be further improved by using the network positioning method and the inertial navigation method to conduct navigation together.
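  • as an illustration of the inertial navigation (dead reckoning) idea, the following minimal sketch propagates the last known position using speed, heading and elapsed time in a local flat-earth frame; the coordinate convention is an assumption:

```python
import math

def dead_reckon(x_m, y_m, speed_mps, heading_deg, dt_s):
    """heading_deg: 0 = north, clockwise positive; returns the new (x, y) in metres."""
    heading = math.radians(heading_deg)
    return (x_m + speed_mps * dt_s * math.sin(heading),
            y_m + speed_mps * dt_s * math.cos(heading))

# Example: 10 m/s due east for 2 s moves the vehicle 20 m in +x.
print(dead_reckon(0.0, 0.0, 10.0, 90.0, 2.0))  # (20.0, ~0.0)
```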
  • the elevated identification method provided in the embodiment of the present application can effectively reduce navigation errors and improve navigation accuracy.
  • an example is provided below.
  • the terminal device navigates the vehicle through the solutions provided by the prior art and the embodiment of the present application respectively.
  • FIG. 3 is an electronic map displayed by the terminal device for navigating the vehicle when the terminal device navigates the vehicle through the prior art.
  • the navigation in the figure indicates that the user's vehicle is located on the side road of the North Fourth Ring Road East Road, under the viaduct; while the actual location of the user's vehicle is on the elevated road of the North Fourth Ring Road.
  • the five-pointed star in Figure 3 indicates the actual location of the user's vehicle; this location is inconsistent with the location given by the navigation on the user's mobile phone, that is, the navigation has a deviation in recognizing the vehicle's location.
  • FIG. 11 is an electronic map displayed by the terminal device for navigating the vehicle when the terminal device navigates the vehicle through the embodiment of the present application.
  • the elevated identification method of the embodiment of the present application can accurately locate the user's vehicle on the elevated, so that the navigation position of the mobile phone is consistent with the actual position of the vehicle, thereby realizing accurate positioning and navigation.
  • the methods and operations implemented by the terminal device may also be implemented by components (such as chips or circuits) that can be used for the terminal device.
  • the terminal device includes corresponding hardware structures and/or software modules for performing each function.
  • the present application can be implemented in the form of hardware or a combination of hardware and computer software in combination with the units and algorithm steps of each example described in the embodiments disclosed herein. Whether a certain function is executed by hardware or computer software drives hardware depends on the specific application and design constraints of the technical solution. Those skilled in the art may use different methods to implement the described functions for each specific application, but such implementation should not be regarded as exceeding the scope of the present application.
  • the functional modules of the terminal device may be divided according to the above method examples.
  • each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or in the form of software function modules. It should be noted that the division of modules in the embodiment of the present application is schematic, and is only a logical function division, and there may be other division methods in actual implementation.
  • FIG. 12 is a structural block diagram of an implementation manner of a navigation device provided by the present application.
  • the apparatus 1000 may include: a transceiver 1001 and a processor 1002 .
  • the apparatus 1000 may perform the operations performed by the terminal device in the method embodiment shown in FIG. 9 above.
  • the transceiver 1001 is configured to receive GNSS signals of a global navigation satellite system.
  • the processor 1002 is configured to: determine a first parameter of the vehicle at a first moment according to the GNSS signal;
  • determine a second parameter of the vehicle at the first moment according to a sensor; determine the driving state of the vehicle according to the first parameter and the second parameter; and, when the driving state of the vehicle belongs to a target state, determine the elevated recognition result of the vehicle according to the height change of the vehicle, where the target state includes at least one of the following driving states: starting the on-ramp, ending the on-ramp, starting the off-ramp and ending the off-ramp, and the elevated recognition result of the vehicle is that the vehicle is traveling on the road on the upper side of the elevated road, or that the vehicle is traveling on the road on the lower side of the elevated road.
  • the first parameter includes at least one of the following information: the speed and heading of the vehicle;
  • the second parameter includes at least one of the following information: pitch angle, roll angle and heading angle of the vehicle.
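  • the first parameter can be derived from successive GNSS fixes; the sketch below shows one assumed way of estimating speed and heading from two timestamped fixes (an equirectangular approximation chosen for brevity, not taken from the patent).

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def speed_and_heading(fix_a, fix_b):
    """Estimate the first parameter (speed in m/s, heading in degrees, 0 = north)
    from two timestamped GNSS fixes, each given as (t_seconds, lat_deg, lon_deg)."""
    t1, lat1, lon1 = fix_a
    t2, lat2, lon2 = fix_b
    mean_lat = math.radians((lat1 + lat2) / 2.0)
    dx = math.radians(lon2 - lon1) * math.cos(mean_lat) * EARTH_RADIUS_M  # east, metres
    dy = math.radians(lat2 - lat1) * EARTH_RADIUS_M                       # north, metres
    dt = t2 - t1
    speed = math.hypot(dx, dy) / dt if dt > 0 else 0.0
    heading = math.degrees(math.atan2(dx, dy)) % 360.0
    return speed, heading

# Example: two fixes taken one second apart.
speed, heading = speed_and_heading((0.0, 39.99000, 116.45000), (1.0, 39.99000, 116.45020))
```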
  • the processor 1002 is configured to determine the first parameter of the vehicle at the first moment, specifically:
  • when an operation of starting the navigation function is received, when a location search operation is received, when an operation indicating a destination is received, when an operation indicating that the navigation mode is driving is received, when the speed of the terminal device is greater than a target speed threshold, when it is determined according to the GNSS signal that a ramp entrance or exit lies ahead of the vehicle, or when it is determined from an image of the area ahead of the vehicle that an elevated-road sign lies ahead, the first parameter of the vehicle at the first moment is determined.
  • the processor 1002 is configured to determine the driving state of the vehicle according to the first parameter and the second parameter, specifically:
  • the first parameter and the second parameter are transmitted to a classifier, the classifier being configured to classify the driving state of the vehicle according to the parameters of the vehicle, and the driving state of the vehicle is then determined according to the output of the classifier.
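  • the embodiments mention a support vector machine as one possible classifier; the following scikit-learn sketch illustrates that idea, with the feature layout and state labels being assumptions made for this example rather than details taken from the patent.

```python
from sklearn.svm import SVC

# One sample per moment: [speed, heading, pitch, roll, yaw], i.e. the first and
# second parameters described above; labels name the driving states of interest.
DRIVING_STATES = ["start_on_ramp", "end_on_ramp", "start_off_ramp", "end_off_ramp", "other"]

def train_state_classifier(features, labels):
    # An RBF-kernel SVM is one reasonable choice for this small feature space.
    classifier = SVC(kernel="rbf", C=1.0, gamma="scale")
    classifier.fit(features, labels)
    return classifier

def classify_driving_state(classifier, speed, heading, pitch, roll, yaw):
    # yaw stands in for the heading angle of the second parameter.
    return classifier.predict([[speed, heading, pitch, roll, yaw]])[0]
```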
  • where the elevated road has a single level, the processor 1002 is configured to determine the elevated recognition result of the vehicle according to the height change of the vehicle, specifically:
  • if the driving state of the vehicle is starting the on-ramp, and the height change of the vehicle within the first time period thereafter is greater than a first threshold, it is determined that the elevated recognition result of the vehicle is that the vehicle is traveling on the road on the upper side of the elevated road;
  • or, if the driving state of the vehicle is starting the off-ramp, and the absolute value of the height change of the vehicle within the first time period thereafter is greater than the first threshold, it is determined that the elevated recognition result of the vehicle is that the vehicle is traveling on the road on the lower side of the elevated road;
  • or, if the driving state of the vehicle is ending the on-ramp, and the height change of the vehicle within the preceding second time period is greater than the first threshold, it is determined that the elevated recognition result of the vehicle is that the vehicle is traveling on the road on the upper side of the elevated road;
  • or, if the driving state of the vehicle is ending the off-ramp, and the height change of the vehicle within the preceding second time period is greater than the first threshold, it is determined that the elevated recognition result of the vehicle is that the vehicle is traveling on the road on the lower side of the elevated road.
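  • the four single-level rules above can be summarised in a small decision function; the sketch below is only an illustration, and treating the off-ramp comparisons as magnitude checks is an interpretation of the wording rather than a definitive implementation.

```python
def single_level_result(driving_state, height_change_m, first_threshold_m):
    """Decide the elevated recognition result for a single-level elevated road.

    height_change_m is the vehicle's height change over the first time period
    after the state (for the 'start_*' states) or over the second time period
    before it (for the 'end_*' states); first_threshold_m would typically be
    set slightly below the local height of the elevated road.
    """
    if driving_state == "start_on_ramp" and height_change_m > first_threshold_m:
        return "upper_road"   # vehicle is traveling on the road above the elevated
    if driving_state == "start_off_ramp" and abs(height_change_m) > first_threshold_m:
        return "lower_road"   # vehicle is traveling on the road below the elevated
    if driving_state == "end_on_ramp" and height_change_m > first_threshold_m:
        return "upper_road"
    if driving_state == "end_off_ramp" and abs(height_change_m) > first_threshold_m:
        return "lower_road"
    return None               # no elevated recognition result at this moment
```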
  • where the elevated road includes at least two levels, the processor 1002 determines the elevated recognition result of the vehicle according to the height change of the vehicle, specifically:
  • if the driving state of the vehicle is starting the on-ramp, and the height change of the vehicle within the first time period thereafter is greater than a first threshold, it is determined that the elevated recognition result of the vehicle is that the vehicle is traveling on the road on the upper side of the elevated road;
  • or, if the driving state of the vehicle is starting the off-ramp, and the absolute value of the height change of the vehicle within the first time period thereafter is greater than the first threshold, the elevated recognition result of the vehicle is determined according to the state of the vehicle before the off-ramp;
  • or, if the driving state of the vehicle is ending the on-ramp, and the height change of the vehicle within the preceding second time period is greater than the first threshold, it is determined that the elevated recognition result of the vehicle is that the vehicle is traveling on the road on the upper side of the elevated road;
  • or, if the driving state of the vehicle is ending the off-ramp, and the absolute value of the height change of the vehicle within the preceding second time period is greater than the first threshold, the elevated recognition result of the vehicle is determined according to the state of the vehicle before the off-ramp.
  • the state of the vehicle before the off-ramp is one of the following: the vehicle is traveling on the first elevated level, or the vehicle is traveling on another elevated level, where the first elevated level is the elevated-level road closest to the road on the lower side of the elevated road.
  • the processor 1002 is further configured to:
  • determine the state of the vehicle before the off-ramp according to the height of the vehicle and the height of each elevated level;
  • or, record the change in the number of elevated levels on which the vehicle is located each time after the elevated recognition result of the vehicle is determined, and determine the state of the vehicle before the off-ramp according to that recorded change.
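  • one assumed way of tracking the state before the off-ramp is a simple level counter, as sketched below; the class and method names are illustrative, and the counter realises the "change in the number of elevated levels" described above.

```python
class ElevatedLevelTracker:
    """Track which elevated level the vehicle is on (0 = road below the elevated)."""

    def __init__(self):
        self.level = 0                      # start below the elevated road

    def on_ramp_confirmed(self):
        # Called once the on-ramp and height-change checks have both passed.
        self.level += 1
        return "upper_road"

    def off_ramp_confirmed(self):
        # The first elevated level is the one closest to the road underneath,
        # so leaving it means the vehicle is now on the lower road; leaving a
        # higher level only moves the vehicle down one elevated level.
        was_first_level = (self.level == 1)
        self.level = max(self.level - 1, 0)
        return "lower_road" if was_first_level else "upper_road"
```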
  • the processor 1002 is further configured to:
  • report the elevated recognition result of the vehicle to a server after the elevated recognition result of the vehicle is determined; or, if the change in the number of elevated levels on which the vehicle is located is recorded, and the elevated recognition result of the vehicle indicates that the vehicle is traveling on the road on the upper side of the elevated road, report both the elevated recognition result of the vehicle and the number of the elevated level on which the vehicle is located to the server.
  • the processor 1002 is further configured to:
  • if the elevated recognition result of the vehicle indicates that the vehicle is traveling on the road on the lower side of the elevated road, and a weak-signal area lies ahead in the vehicle's direction of travel, adjust the navigation method from navigation by GNSS signals to navigation by the network positioning method, or by the network positioning method combined with the inertial navigation method.
  • the weak signal area includes: a tunnel area or an obstacle shielding area.
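  • the switch away from GNSS-based navigation ahead of a weak-signal area can be expressed as a small selection function; this is only a sketch, and the source names and the fused "network+vdr" mode label are assumptions made for illustration.

```python
def choose_positioning_source(recognition_result, weak_signal_ahead, vdr_available=True):
    """Select the positioning source for the stretch of road ahead.

    recognition_result is 'upper_road' or 'lower_road'; weak_signal_ahead is
    True when a tunnel or a heavily shadowed area lies ahead in the direction
    of travel.
    """
    if recognition_result == "lower_road" and weak_signal_ahead:
        # GNSS reception is expected to degrade: fall back to network
        # positioning, optionally fused with inertial dead reckoning (VDR).
        return "network+vdr" if vdr_available else "network"
    return "gnss"
```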
  • the apparatus 1000 can implement the steps or processes performed by the terminal device in the embodiment of the elevated identification method shown in FIG. 9, and the apparatus 1000 may include modules for executing the method performed by the terminal device in that embodiment.
  • it should be understood that the specific process by which each module performs the corresponding steps above has been described in detail in the above embodiment of the elevated identification method, and for the sake of brevity, details are not repeated here.
  • the embodiment of the present application also provides a navigation device, which includes at least one processor and a communication interface.
  • the communication interface is used to provide information input and/or output to the at least one processor, and the at least one processor is used to execute the methods in the above method embodiments.
  • the embodiment of the present application also provides a terminal device, the terminal device includes a processor, and when the processor executes the computer program or instruction in the memory, the method in the above method embodiment is executed.
  • the embodiment of the present application also provides a terminal device, the terminal device includes a processor and a memory; the memory is used to store computer programs or instructions; the processor is used to execute the computer programs or instructions stored in the memory, so that The terminal device executes the methods in the foregoing method embodiments.
  • the embodiment of the present application also provides a terminal device, which includes a processor, a memory, and a transceiver; the transceiver is used to receive signals or send signals; the memory is used to store computer programs or instructions; the processor is used to Executing the computer programs or instructions stored in the memory, so that the terminal device executes the methods in the above method embodiments.
  • the embodiment of the present application also provides a terminal device, the terminal device includes a processor and an interface circuit; the interface circuit is used to receive computer programs or instructions and transmit them to the processor; the processor is used to run the computer A program or an instruction, so that the terminal device executes the methods in the foregoing method embodiments.
  • FIG. 13 is a structural block diagram of an implementation manner of a chip provided by the present application.
  • the chip shown in FIG. 13 may be a general-purpose processor or a special-purpose processor.
  • the chip 1100 may include at least one processor 1101 . Wherein, the at least one processor 1101 may be used to support the apparatus shown in FIG. 14 to execute the technical solution shown in FIG. 9 .
  • the chip 1100 may further include a transceiver 1102, where the transceiver 1102 operates under the control of the processor 1101 and is used to support the device shown in FIG. 12 in executing the technical solution shown in FIG. 9.
  • the chip 1100 shown in FIG. 13 may further include a storage medium 1103 .
  • the transceiver 1102 may be replaced with a communication interface, and the communication interface provides information input and/or output for the at least one processor 1101 .
  • the chip 1100 shown in FIG. 13 can be implemented using the following circuits or devices: one or more field programmable gate arrays (FPGA), programmable logic devices (PLD), application specific integrated circuits (ASIC), systems on chip (SoC), central processing units (CPU), network processors (NP), digital signal processors (DSP), microcontroller units (MCU), controllers, state machines, gate logic, discrete hardware components, any other suitable circuitry, or any combination of circuits capable of performing the various functions described throughout this application.
  • each step of the above method can be completed by an integrated logic circuit of hardware in a processor or an instruction in the form of software.
  • the steps of the methods disclosed in connection with the embodiments of the present application may be directly implemented by a hardware processor, or implemented by a combination of hardware and software modules in the processor.
  • the software module can be located in a mature storage medium in the field such as random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, register.
  • the storage medium is located in the memory, and the processor reads the information in the memory, and completes the steps of the above method in combination with its hardware. To avoid repetition, no detailed description is given here.
  • the processor in the embodiment of the present application may be an integrated circuit chip, which has a signal processing capability.
  • each step of the above-mentioned method embodiments may be completed by an integrated logic circuit of hardware in a processor or instructions in the form of software.
  • the above-mentioned processor may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • a general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
  • the steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor.
  • the software module can be located in a mature storage medium in the field such as random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, register.
  • the storage medium is located in the memory, and the processor reads the information in the memory, and completes the steps of the above method in combination with its hardware.
  • the memory in the embodiments of the present application may be a volatile memory or a nonvolatile memory, or may include both volatile and nonvolatile memories.
  • the non-volatile memory can be read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM) or flash memory.
  • Volatile memory can be random access memory (RAM), which acts as external cache memory.
  • by way of example and not limitation, many forms of RAM are available, such as static random access memory (SRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), enhanced synchronous dynamic random access memory (ESDRAM), synchlink dynamic random access memory (SLDRAM) and direct rambus random access memory (DR RAM).
  • the embodiment of the present application also provides a computer program product, the computer program product including a computer program or instructions which, when run on a computer, cause the computer to execute the method of any one of the embodiments shown in FIG. 9.
  • the embodiment of the present application also provides a computer storage medium, the computer storage medium storing a computer program or instructions which, when run on a computer, cause the computer to execute the method of any one of the embodiments shown in FIG. 9.
  • the embodiment of the present application also provides a terminal device, the terminal device being a smart device such as a smart phone, a tablet computer or a personal digital assistant, and the smart device includes the above-mentioned location information generation device.
  • the disclosed systems, devices and methods may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the modules is only a logical function division. In actual implementation, there may be other division methods.
  • multiple modules or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • modules described as separate components may or may not be physically separated, and the components shown as modules may or may not be physical units, that is, they may be located in one place, or may be distributed to multiple network units. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional module in each embodiment of the present application may be integrated into one processing unit, each module may exist separately physically, or two or more modules may be integrated into one unit.
  • if the functions described above are realized in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium.
  • based on this understanding, the technical solution of the present application in essence, or the part that contributes to the prior art, or a part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or other media capable of storing program code.
  • the location information generation device, chip, computer storage medium, computer program product, and terminal device provided by the above-mentioned embodiments of the present application are all used to execute the method provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects corresponding to the method provided above, and details are not repeated here.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

An elevated road identification method and apparatus are provided. The method includes: determining a first parameter at a first moment according to a GNSS signal (S11), and determining a second parameter of the vehicle (100) at the first moment by means of a sensor (S12); then determining the driving state of the vehicle (100) by combining the first parameter and the second parameter (S13); and, when the driving state of the vehicle (100) belongs to a target state, determining the elevated recognition result of the vehicle (100) according to the height change of the vehicle (100) (S14). The target state includes at least one of the following driving states: starting the on-ramp, ending the on-ramp, starting the off-ramp and ending the off-ramp, and the elevated recognition result of the vehicle (100) is that the vehicle (100) is traveling on the road on the upper side of the elevated road, or that the vehicle (100) is traveling on the road on the lower side of the elevated road. The method can solve the problem that the elevated recognition result of the vehicle (100) cannot be determined, further reduce the number of navigation errors, and improve navigation accuracy.

Description

一种高架识别方法及装置
本申请要求于2021年8月6日提交到国家知识产权局、申请号为202110904063.X、发明名称为“一种高架识别方法及装置”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及导航技术领域,具体涉及一种高架识别方法及装置。
背景技术
随着城市的发展以及人们生活水平的提高,汽车数量不断增多。为了提升行车速度,舒缓拥堵情况,以及解决道路与行人动线交会的安全问题,许多城市修建了高架道路。
高架道路可简称为高架,是一种架设于地面道路之上,并且用于车辆行驶使用的立体式道路。通过高架,可将道路划分为高架上侧的道路和高架下侧的道路。根据用户的目的地,用户可选择是在高架上侧的道路上行驶车辆,还是在高架下侧的道路上行驶车辆。
另外,用户在驾驶车辆的过程中,通常需要进行导航。在导航过程中,用于导航的终端设备通常需要确定车辆的位置信息,并据此为车辆规划合适的路线。
但是,同一高架的高架上侧的道路和高架下侧的道路的位置信息可能相同或相似,因此终端设备根据车辆的位置信息,往往无法确定车辆位于高架上侧的道路还是高架下侧的道路,从而导致出现导航错误的情况。而导航错误往往会导致驾驶车辆的用户驶入错误的路线,为用户带来极差的驾驶体验,并存在驾驶过程耗时长,车辆油耗高的弊端。
发明内容
为了解决现有技术中,无法识别车辆位于高架上侧的道路还是位于高架下侧的道路的问题,本申请实施例提供一种高架识别方法及装置。
第一方面,本申请实施例提供一种高架识别方法,包括:
根据全球导航卫星系统GNSS信号,确定车辆在第一时刻的第一参数;
根据传感器,确定所述车辆在第一时刻的第二参数;
根据所述第一参数和所述第二参数确定所述车辆的行驶状态;
当所述车辆的行驶状态属于目标状态时,根据所述车辆的高度变化,确定所述车辆的高架识别结果,所述目标状态包括以下行驶状态中的至少一种:开始上匝道、结束上匝道、开始下匝道和结束下匝道,所述车辆的高架识别结果为所述车辆在高架上侧的道路行驶,或者为所述车辆在高架下侧的道路行驶。
本实现方式中,可根据GNSS信号确定在第一时刻的第一参数,以及通过传感器确定车辆在第一时刻的第二参数,然后结合第一参数和第二参数确定车辆的行驶状态,再结合车辆的行驶状态和车辆的高度变化,确定车辆的高架识别结果。
一种可能的实现方式中,所述第一参数至少包括以下信息中的一种:所述车辆的速度和航向;
所述第二参数至少包括以下信息中的一种:所述车辆的俯仰角、横滚角和航向角。
一种可能的实现方式中,所述确定车辆在第一时刻的第一参数,包括:
当接收到启动导航功能的操作时,确定所述车辆在第一时刻的第一参数;
或者,当接收到位置搜索操作时,确定所述车辆在第一时刻的第一参数;
或者,当接收到用于指示目的地的操作时,确定所述车辆在第一时刻的第一参数;
或者,当接收到用于指示导航方式为驾车的操作时,确定所述车辆在第一时刻的第一参数
或者,当终端设备的速度大于目标速度阈值时,确定所述车辆在第一时刻的第一参数;
或者,当根据所述GNSS信号,确定所述车辆的前方包含匝道口时,确定所述车辆在第一时刻的第一参数;
或者,当根据包括所述车辆的前方的图像,确定所述车辆的前方包括高架标志时,确定所述车辆在第一时刻的第一参数。
通过上述方案,终端设备可在满足一定条件的情况下才确定车辆在第一时刻的第一参数,从而减少需要确定第一参数的操作,减少需要处理的数据量,以及能够减少确定第一参数的这一过程中占用的计算资源。
一种可能的实现方式中,所述根据所述第一参数和所述第二参数,确定所述车辆的行驶状态,包括:
向分类器传输所述第一参数和所述第二参数,所述分类器用于根据所述车辆的参数,对所述车辆的行驶状态进行分类;
根据所述分类器的输出,确定所述车辆的行驶状态。
一种可能的实现方式中,所述高架为一层,所述根据所述车辆的高度变化,确定所述车辆的高架识别结果,包括:
如果所述车辆的行驶状态为开始上匝道,并且所述车辆在之后的第一时间段内的高度变化大于第一阈值,确定所述车辆的高架识别结果为所述车辆在高架上侧的道路行驶;
或者,如果所述车辆的行驶状态为开始下匝道,并且所述车辆在之后的所述第一时间段内的高度变化的绝对值大于所述第一阈值,确定所述车辆的高架识别结果为所述车辆在高架下侧的道路行驶;
或者,所述车辆的行驶状态为结束上匝道,并且所述车辆在之前的第二时间段内的高度变化大于所述第一阈值,确定所述车辆的高架识别结果为所述车辆在所述高架上侧的道路行驶;
或者,所述车辆的行驶状态为结束下匝道,并且所述车辆在之前的所述第二时间段内的高度变化大于所述第一阈值,确定所述车辆的高架识别结果为所述车辆在所述高架下侧的道路行驶。
通过上述步骤,能够在高架为一层的情况下,结合车辆的高度变化和车辆所属的行驶状态,确定车辆的高架识别结果,从而提高导航的准确度。
一种可能的实现方式中,所述高架包括至少两层,所述根据所述车辆的高度变化,确定所述车辆的高架识别结果,包括:
如果所述车辆的行驶状态为开始上匝道,并且所述车辆在之后的第一时间段内的高度变化大于第一阈值,确定所述车辆的高架识别结果为所述车辆在高架上侧的道路行驶;
或者,如果所述车辆的行驶状态为开始下匝道,并且所述车辆在之后的所述第一时间段内的高度变化的绝对值大于所述第一阈值,根据所述车辆在下匝道之前的状态,确定所述车辆的高架识别结果;
或者,所述车辆的行驶状态为结束上匝道,并且所述车辆在之前的第二时间段内的高度变化大于所述第一阈值,确定所述车辆的高架识别结果为所述车辆在所述高架上侧的道路行驶;
或者,所述车辆的行驶状态为结束下匝道,并且所述车辆在之前的所述第二时间段内的高度变化的绝对值大于所述第一阈值,根据所述车辆在下匝道之前的状态,确定所述车辆的高架识别结果。
通过上述步骤,能够在高架包括多层的情况下,结合车辆的高度变化和车辆所属的行驶状态,确定车辆的高架识别结果,从而提高导航的准确度。
一种可能的实现方式中,所述车辆在下匝道之前的状态包括以下状态中的一种:所述车辆在第一层高架上行驶和所述车辆在其他层的高架上行驶,所述第一层高架为与所述高架下侧的道路最接近的高架上层的道路。
一种可能的实现方式中,还包括:
根据所述车辆的高度和每层高架的高度,确定所述车辆在下匝道之前的状态;
或者,每次在确定所述车辆的高架识别结果之后,记录所述车辆所在高架的层数的变化;
根据所述车辆所在高架的层数的变化,确定所述车辆在下匝道之前的状态。
通过上述步骤,能够确定车辆在下匝道之前的状态,以便根据车辆在下匝道之前的状态,确定车辆的高架识别结果。
一种可能的实现方式中,还包括:
在确定所述车辆的高架识别结果之后,向服务器上报所述车辆的高架识别结果;
或者,如果记录所述车辆所在高架的层数的变化,并且所述车辆的高架识别结果表明所述车辆在所述高架上侧的道路行驶,向所述服务器上报所述车辆的高架识别结果和所述车辆所在高架的层数。
通过上述步骤,能够提高服务器确定车辆的位置的准确度,进一步提高导航的准确度。
一种可能的实现方式中,还包括:
如果所述车辆的高架识别结果表明所述车辆在高架下侧的道路行驶,并且所述车辆的行驶方向的前方包括弱信号区域,将高架识别方法由通过GNSS信号导航调整为通过网络定位方法导航或通过网络定位方法和惯性高架识别方法导航。
这一实现方式,能够使车辆行驶至弱信号区域之前,调整导航方法,减少弱信号区域对导航准确度的影响,以保障终端设备在弱信号区域的导航准确度。
一种可能的实现方式中,所述弱信号区域包括:隧道区域或遮挡物遮蔽区域。
第二方面,本申请实施例提供一种高架识别装置,所述装置包括:收发器和处理器;
所述收发器用于接收全球导航卫星系统GNSS信号;
所述处理器用于:
根据所述GNSS信号,确定车辆在第一时刻的第一参数;
根据传感器,确定所述车辆在第一时刻的第二参数;
根据所述第一参数和所述第二参数确定所述车辆的行驶状态;
当所述车辆的行驶状态属于目标状态时,根据所述车辆的高度变化,确定所述车辆的高架识别结果,所述目标状态包括以下行驶状态中的至少一种:开始上匝道、结束上匝道、开始下匝道和结束下匝道,所述车辆的高架识别结果为所述车辆在高架上侧的道路行驶,或者为所述车辆在高架下侧的道路行驶。
一种可能的实现方式中,所述第一参数至少包括以下信息中的一种:所述车辆的速度和航向;
所述第二参数至少包括以下信息中的一种:所述车辆的俯仰角、横滚角和航向角。
一种可能的实现方式中,所述处理器用于确定车辆在第一时刻的第一参数,具体为:
当接收到启动导航功能的操作时,确定所述车辆在第一时刻的第一参数;
或者,当接收到位置搜索操作时,确定所述车辆在第一时刻的第一参数;
或者,当接收到用于指示目的地的操作时,确定所述车辆在第一时刻的第一参数;
或者,当接收到用于指示导航方式为驾车的操作时,确定所述车辆在第一时刻的第一参数
或者,当终端设备的速度大于目标速度阈值时,确定所述车辆在第一时刻的第一参数;
或者,当根据所述GNSS信号,确定所述车辆的前方包含匝道口时,确定所述车辆在第一时刻的第一参数;
或者,当根据包括所述车辆的前方的图像,确定所述车辆的前方包括高架标志时,确定所述车辆在第一时刻的第一参数。
一种可能的实现方式中,所述处理器用于根据所述第一参数和所述第二参数,确定所述车辆的行驶状态,具体为:
向分类器传输所述第一参数和所述第二参数,所述分类器用于根据所述车辆的参数,对所述车辆的行驶状态进行分类;
根据所述分类器的输出,确定所述车辆的行驶状态。
一种可能的实现方式中,所述高架为一层,所述处理器用于根据所述车辆的高度变化,确定所述车辆的高架识别结果,具体为:
如果所述车辆的行驶状态为开始上匝道,并且所述车辆在之后的第一时间段内的高度变化大于第一阈值,确定所述车辆的高架识别结果为所述车辆在高架上侧的道路行驶;
或者,如果所述车辆的行驶状态为开始下匝道,并且所述车辆在之后的所述第一时间段内的高度变化的绝对值大于所述第一阈值,确定所述车辆的高架识别结果为所述车辆在高架下侧的道路行驶;
或者,所述车辆的行驶状态为结束上匝道,并且所述车辆在之前的第二时间段内的高度变化大于所述第一阈值,确定所述车辆的高架识别结果为所述车辆在所述高架上侧的道路行驶;
或者,所述车辆的行驶状态为结束下匝道,并且所述车辆在之前的所述第二时间段内的高度变化大于所述第一阈值,确定所述车辆的高架识别结果为所述车辆在所述高架下侧的道路行驶。
一种可能的实现方式中,所述高架包括至少两层,所述处理器根据所述车辆的高度变化,确定所述车辆的高架识别结果,具体为:
如果所述车辆的行驶状态为开始上匝道,并且所述车辆在之后的第一时间段内的高度变化大于第一阈值,确定所述车辆的高架识别结果为所述车辆在高架上侧的道路行驶;
或者,如果所述车辆的行驶状态为开始下匝道,并且所述车辆在之后的所述第一时间段内的高度变化的绝对值大于所述第一阈值,根据所述车辆在下匝道之前的状态,确定所述车辆的高架识别结果;
或者,所述车辆的行驶状态为结束上匝道,并且所述车辆在之前的第二时间段内的高度变化大于所述第一阈值,确定所述车辆的高架识别结果为所述车辆在所述高架上侧的道路行驶;
或者,所述车辆的行驶状态为结束下匝道,并且所述车辆在之前的所述第二时间段内的高度变化的绝对值大于所述第一阈值,根据所述车辆在下匝道之前的状态,确定所述车辆的高架识别结果。
一种可能的实现方式中,所述车辆在下匝道之前的状态包括以下状态中的一种:所述车辆在第一层高架上行驶和所述车辆在其他层的高架上行驶,所述第一层高架为与所述高架下侧的道路最接近的高架上层的道路。
一种可能的实现方式中,所述处理器还用于:
根据所述车辆的高度和每层高架的高度,确定所述车辆在下匝道之前的状态;
或者,每次在确定所述车辆的高架识别结果之后,记录所述车辆所在高架的层数的变化;
根据所述车辆所在高架的层数的变化,确定所述车辆在下匝道之前的状态。
一种可能的实现方式中,所述处理器还用于:
在确定所述车辆的高架识别结果之后,向服务器上报所述车辆的高架识别结果;
或者,如果记录所述车辆所在高架的层数的变化,并且所述车辆的高架识别结果表明所述车辆在所述高架上侧的道路行驶,向所述服务器上报所述车辆的高架识别结果和所述车辆所在高架的层数。
一种可能的实现方式中,所述处理器还用于:
如果所述车辆的高架识别结果表明所述车辆在高架下侧的道路行驶,并且所述车辆的行驶方向的前方包括弱信号区域,将高架识别方法由通过GNSS信号导航调整为通过网络定位方法导航或通过网络定位方法和惯性高架识别方法导航。
一种可能的实现方式中,所述弱信号区域包括:隧道区域或遮挡物遮蔽区域。
第三方面,本申请实施例提供一种终端设备,该终端设备包括处理器,当所述处理器执行存储器中的计算机程序或指令时,如第一方面所述的方法被执行。
第四方面,本申请实施例提供一种终端设备,该终端设备包括处理器和存储器;所述存储器用于存储计算机程序或指令;所述处理器用于执行所述存储器所存储的计算机程序或指令,以使所述终端设备执行如第一方面所述的方法。
第五方面,本申请提供了一种终端设备,该终端设备包括处理器、存储器和收发器;所述收发器用于接收信号或者发送信号;所述存储器用于存储计算机程序或指令;所述处理器用于执行所述存储器所存储的计算机程序或指令,以使所述终端设备执行如第一方面所述的方法。
第六方面,本申请提供了一种终端设备,该终端设备包括处理器和接口电路;所述接口电路,用于接收计算机程序或指令并传输至所述处理器;所述处理器用于运行所述 计算机程序或指令,以使所述终端设备执行如第一方面所述的方法。
第七方面,本申请提供了一种计算机存储介质,所述计算机存储介质用于存储计算机程序或指令,当所述计算机程序或指令被执行时,使得第一方面所述的方法被实现。
第八方面,本申请提供了一种包括计算机程序或指令的计算机程序产品,当所述计算机程序或指令被执行时,使得第一方面所述的方法被实现。
第九方面,本申请提供了一种芯片,所述芯片包括处理器,所述处理器与存储器耦合,用于执行所述存储器中存储的计算机程序或指令,当所述计算机程序或指令被执行时,如第一方面所述的方法被执行。
本申请实施例提供一种高架识别方法及装置。在该方法中,根据GNSS信号确定在第一时刻的第一参数,以及通过传感器确定车辆在第一时刻的第二参数,然后结合第一参数和第二参数确定车辆的行驶状态,再结合车辆的行驶状态和车辆的高度变化,确定车辆的高架识别结果,从而解决现有技术中,无法确定车辆的高架识别结果的问题,并且能够减少导航出现错误的次数,提高导航的准确度。
进一步的,由于本申请的方案可以提高导航的准确度,因此,还能够提高车辆上的用户的体验,减少驾驶过程的耗时,以及减少车辆的油耗,达到节能的目的。
附图说明
图1为一种车辆上高架匝道和下高架匝道路网的示意图;
图2为一种GNSS系统的结构示意图;
图3为一种终端设备显示的电子地图的界面示意图;
图4为本申请实施例提供的一种车辆行驶的场景示意图;
图5为本申请实施例提供的一种手机的结构示意图;
图6为本申请实施例提供的一种手机的软件结构框图;
图7为本申请实施例公开的一种车辆的结构示意图;
图8(a)为本申请实施例公开的一种终端设备的界面的示例图;
图8(b)为本申请实施例公开的又一种终端设备的界面的示例图;
图8(c)为本申请实施例公开的又一种终端设备的界面的示例图;
图9为本申请实施例公开的一种高架识别方法的工作流程示意图;
图10(a)为本申请实施例公开的一种车辆在高架上侧的道路行驶的场景示意图;
图10(b)为本申请实施例公开的一种车辆在高架上侧的道路行驶的俯视图;
图11为利用本申请实施例提供的一种终端设备显示的电子地图的界面示意图;
图12为本申请提供的导航装置的一种实施方式的结构框图;
图13为本申请提供的芯片的一种实施方式的结构框图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行描述。
以下实施例中所使用的术语只是为了描述特定实施例的目的,而并非旨在作为对本申请的限制。如在本申请的说明书和所附权利要求书中所使用的那样,单数表达形式“一个”、“一种”、“所述”、“上述”、“该”和“这一”旨在也包括例如“一个或多个”这种表达形式,除非其上下文中明确地有相反指示。还应当理解,在本申请以下各实施例中,“至少一个”、“一个或多个”是指一个、两个或两个以上。术语“和/或”, 用于描述关联对象的关联关系,表示可以存在三种关系;例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B的情况,其中A、B可以是单数或者复数。字符“/”一般表示前后关联对象是一种“或”的关系。
在本说明书中描述的参考“一个实施例”或“一些实施例”等意味着在本申请的一个或多个实施例中包括结合该实施例描述的特定特征、结构或特点。由此,在本说明书中的不同之处出现的语句“在一个实施例中”、“在一些实施例中”、“在其他一些实施例中”、“在另外一些实施例中”等不是必然都参考相同的实施例,而是意味着“一个或多个但不是所有的实施例”,除非是以其他方式另外特别强调。术语“包括”、“包含”、“具有”及它们的变形都意味着“包括但不限于”,除非是以其他方式另外特别强调。
为了下述各实施例的描述清楚简洁,首先给出相关技术的简要介绍:
为了保障行车速度,舒缓拥堵情况,以及解决道路与行人动线交会的安全问题,目前许多城市修建了高架道路,以应对日益增加的汽车数量。
其中,高架道路可简称为高架,是一种架设于地面道路之上,并且用于车辆行驶使用的立体式道路。通过高架,可将道路划分为高架上侧的道路和高架下侧的道路。对于只有一层高架的场景而言,高架上侧的道路是指高于地面的架空道路,高架下侧的道路是指位于高架下的地面道路;对于有两层以上的高架场景而言,高架上侧的道路是高于地面的任一层架空道路,高架下侧的道路是位于最接近地面的架空道路之下的地面道路。
另外,在高架的出入口通常设置有匝道口。匝道口通常包括上匝道口和下匝道口,车辆在上高架时,首先需要经过上匝道口,然后再驶入高架上侧的道路。另外,车辆在下高架时,或者车辆由某一层的高架上侧的道路,行驶至下层的高架上侧的道路时,首先需要经过下匝道口,然后再驶入高架上侧的道路,或者驶入下层的高架上侧的道路。
为了明确车辆在高架上行驶的场景,提供了图1。参见图1所示的场景示意图,该图所对应的场景包含一层高架,相应的,道路包括高架上侧的道路和高架下侧的道路。并且,车辆前方包含一条带箭头的实线,该箭头表示车辆行驶的方向。在车辆行驶的初始阶段,车辆位于图1中左侧的位置,车辆在高架下侧的道路行驶。一段时间之后,车辆通过上高架匝道口开始上高架,从而驶入高架上侧的道路,这时车辆所处的位置为图1中右侧的车辆所处的位置。然后,在高架上侧的道路行驶一段时间之后,该车辆通过下高架匝道口行驶至高架下侧的道路。这一过程中,车辆的行驶路线为:在地面行驶—>在高架上侧的道路行驶—>在高架下侧的道路行驶,即在地面行驶。
在车辆行驶过程中,用户通常可利用终端设备(例如手机或者车载终端等)进行导航。例如,用户可能在车辆上高架前开启终端设备的导航,也可能在车辆在高架下方的道路行驶时开启导航。目前,终端设备主要通过全球导航卫星系统(Global Navigation Satellite System,GNSS)确定自身的位置。其中,GNSS是一种能够在地球表面或近地空间的任何地点,为用户提供全天候的三维坐标和速度以及时间信息的空基无线电导航定位系统。
GNSS系统通常包括美国的全球定位系统(global positioning system,GPS)、俄罗斯的(globalnaja nawigazionnaja sputnikowaja sistema,GLONASS)系统、欧盟的伽利略(GALILEO)系统和中国的北斗卫星导航系统等。
其中,GPS系统以人造地球卫星为基础的无线电导航的定位系统,包括覆盖全球的24颗卫星。北斗卫星导航系统是中国自主研发、独立运行的全球卫星导航系统。该系统分为两代,即北斗一代和北斗二代系统。该系统通常包括四颗地球同步轨道卫星。
参见图2所示的GNSS导航系统的示意图,其中,GNSS导航系统通常包括空间部分、地面监控部分和用户接收机三大部分。
如图2中所示,GNSS导航系统的空间部分包括多个卫星10,地面监控部分包括地面监控跟踪站20,地面监控跟踪站20通常包括主控站、监控站和注入站,GNSS导航系统的用户接收机30可接收多个卫星10传输的卫星信号。
GNSS导航系统的基本原理是通过多颗已知位置的卫星到用户接收机之间的距离,确定用户接收机的位置。其中,卫星的位置可以根据星载时钟所记录的时间在卫星星历中查出,用户接收机与卫星之间的距离可通过卫星发射的卫星信号传输至用户接收机的时间确定,该卫星信号也可称为GNSS信号。
在导航过程中,地面监控跟踪站20可向多个卫星10传输卫星星历等信息;多个卫星10可持续发射卫星信号,该卫星信号中通常包括卫星星历以及卫星信号的发射时间;用户接收机30可搜索并接收卫星信号,通过卫星信号中的卫星星历,确定卫星10的位置,以及通过自身时钟和卫星信号的发射时间,确定自身与卫星10之间的距离,并进一步根据卫星10的位置,以及自身与卫星10之间的距离,确定自身所处位置的位置信息。
用户可通过终端设备进行导航,该终端设备可以是移动终端(例如手机)和车机等具有导航功能的设备。另外,终端设备在导航过程中显示电子地图,便于用户查询目的地和进行路线规划。
其中,终端设备在为车辆导航的过程中,在确定自身所处的位置之后,可在电子地图上显示自身所在的位置,该电子地图通常包括终端设备所处的位置周边的环境,以及指示车辆在电子地图中的位置,进一步还可以包括为车辆规划的路线,以及指示车辆的前进方向,满足用户的导航需求。
通过上述技术的简要介绍可知,目前终端设备通常根据接收到的GNSS信号,确定需要自身的位置信息,进一步再根据该位置信息,为用户进行导航。
但是,同一高架的高架上侧的道路和高架下侧的道路的位置信息可能相同或相似。这种情况下,终端设备根据确定的位置信息,无法确定车辆位于高架上侧的道路还是高架下侧的道路,从而较易导致出现导航错误。
而导航错误往往会导致驾驶车辆的用户驶入错误的路线,为用户带来极差的驾驶体验,并存在驾驶过程耗时长,车辆油耗高的弊端。
示例性的,图3为终端设备在导航准确性低的情况下显示的电子地图。参见图3,该图中的导航指示用户车辆位于北四环东路辅路上,处于高架桥下,图3中实线标出的三角形表示导航指示的终端设备所处的位置;而用户车辆实际所在的位置已经从高架桥下驶入高架桥上,图3中虚线标出的三角形表示该车辆的实际位置,可见车辆的实际位置与用户的终端设备显示的电子地图中指示的导航位置不一致,导航用户对车辆位置识别发生偏差。
针对上述问题,本申请实施例提供一种高架识别方法及装置,以识别车辆在高架上侧的道路还是在高架下侧的道路,提高导航的准确度。
本申请的技术方案可应用于车辆驾驶领域,包括但不限于自动驾驶(automated driving,ADS)、智能驾驶(intelligent driving)和智能网联车(Intelligent Connected Vehicle,ICV)等领域。本申请提供一种识别车辆是否处于上高架桥、下高架桥状态的技术方案。该技术方案可应用车辆驾驶领域,用于对车辆提供定位、导航的服务。
本申请的技术方案可以应用于任意一种定位系统或导航系统,如图4所示,为本实施例提供的一种车辆行驶的场景示意图。该场景涉及服务器、至少一个终端设备和终端设备对应的车辆。其中,服务器与终端设备(例如手机终端)之间可通过无线网络连接。
进一步地,服务器可以是对手机终端进行管理的服务平台或车联网服务器,例如服务器用于接收手机终端发送的消息,确定车辆位置以及为用户提供地图和实时导航服务。其中,服务器中可存储多个地区的电子地图。
终端设备用于向服务器发送请求,实现车辆的实时定位和导航功能。另外,车辆中包括通信模块和处理模块,用于接收服务器和/或手机终端发送的信号,并根据该信号和预设程序控制车辆启停,获取车辆上高架或下高架状态。
可选的,服务器可以是一个或多个独立的服务器或服务器集群,或者还可以是部署在云端的云平台服务。服务器可以是一种网络设备,比如基站(base station,BS),进一步地,该基站可以是全球移动通信系统(global system for mobile communication,GSM)或码分多址(code division multiple access,CDMA)中的基站(base transceiver station,BTS),也可以是宽带码分多址(wideband-CDMA,WCDMA)中的基站(NodeB),还可以是LTE中的演进型基站(evolutional NodeB,eNB/e-NodeB),或者下一代LTE中的演进型基站(next generation eNB,ng-eNB),或者NR中的基站(gNB),或者,未来移动通信系统中的基站或无线保真(wireless fidelity,WiFi)系统中的接入节点等,本申请的实施例对网络设备所采用的具体技术和具体设备形态不做限定,具体可以是云端部署,还可以是独立的计算机设备等。
本申请实施例中的终端设备,可以是指向用户提供服务和/或数据连通性的设备,具有无线连接功能的手持式设备、或连接到无线调制解调器的其他处理设备,例如无线终端,车载无线终端,便携设备,可穿戴设备,移动电话(或称为“蜂窝”电话),便携式、袖珍式、手持式终端等,它们与无线接入网交换语言和/或数据。例如,个人通信业务(personal communication service,PCS)电话、无绳电话、会话发起协议(SIP)话机、无线本地环路(wireless local loop,WLL)站、个人数字助理(personal digital assistant,PDA)等设备。所述无线终端也可以为订户单元(subscriber unit)、接入终端(access terminal)、用户终端(user terminal)、用户代理(user agent)、用户设备(user device)或用户设备(user equipment,UE)等,本申请对终端设备的类型不进行限定。
以手机为上述终端设备举例,如图5所示,为手机的一种结构示意图。
其中,手机可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180等。
可以理解的是,本发明实施例示意的结构并不构成对手机的具体限定。在本申请另 一些实施例中,手机可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobi le industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
手机的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。手机中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在手机上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
无线通信模块160可以提供应用在手机上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器 110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,手机的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得手机可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
手机通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,手机可以包括1个或N个显示屏194,N为大于1的正整数。
手机可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,手机可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当手机在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。手机可以支持一种或多种视频编解码器。这样,手机可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving  picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展手机的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行手机的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储手机使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
手机可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。手机可以通过扬声器170A收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当手机接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。手机可以设置至少一个麦克风170C。在另一些实施例中,手机可以设置两个麦克风170C,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,手机还可以设置三个,四个或更多麦克风170C,实现采集声音信号,降噪,还可以识别声音来源,实现定向录音功能等。
耳机接口170D用于连接有线耳机。耳机接口170D可以是USB接口130,也可以是3.5mm的开放移动电子设备平台(open mobile terminal platform,OMTP)标准接口,美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
传感器模块180中可以包括压力传感器,陀螺仪传感器,气压传感器,磁传感器,加速度传感器,距离传感器,接近光传感器,指纹传感器,温度传感器,触摸传感器,环境光传感器,骨传导传感器等。
当然,手机还可以包括充电管理模块、电源管理模块、电池、按键、指示器以及1个或多个SIM卡接口等,本申请实施例对此不做任何限制。
仍以手机为上述终端设备举例,手机的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本申请实施例以分层架构的Android系统为例,示 例性说明手机的软件结构。
图6为本申请提供的手机的一种实施方式的软件结构框图。参见图6,分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。
应用程序层可以包括一系列应用程序包。如图6所示,应用程序包可以包括相机,图库,通话,导航,蓝牙,音乐,视频,短信息等应用程序。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。如图6所示,应用程序框架层可以包括窗口管理器,内容提供器,视图系统,电话管理器,资源管理器和通知管理器等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏的大小,获取显示界面上各显示区域的参数等。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括照相机图标的显示界面。
电话管理器用于提供手机的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,电子设备振动,指示灯闪烁等。
Android Runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(media libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。三维图形处理库用于实现三维图形绘图,图像渲染,合成和图层处理等。2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层可以包含显示驱动,摄像头驱动,音频驱动,传感器驱动等。
应用程序框架层以下的系统库和内核层等还可称为底层系统,底层系统中包括用于识别手机姿态变化的状态监测服务,该状态监测服务可设置在系统库和/或内核层内。
在另一种可行的实现方案中,执行本申请实施例提供的高架识别方法的终端设备可为车辆。示例性的,可由车辆内的车机执行所述高架识别方法,其中,所述车机通常安装在车辆的中控台中。
这一实现方案中,该车辆可为智能车辆。图7是本申请实施例提供的车辆100的功能框图。参见图7,车辆100可包括各种子系统,例如行进系统1002、传感器系统1004、规划控制系统1006、一个或多个外围设备1008以及电源1010、计算机系统1001和用户接口1016。
可选地,车辆100可包括更多或更少的子系统,并且每个子系统可包括多个元件。另外,车辆100的每个子系统和元件可以通过有线或者无线互连。
行进系统1002可包括为车辆100提供动力的组件。在一个实施例中,推进系统1002可包括引擎1018、能量源1019、传动装置1020和车轮1021。其中,引擎1018可以是内燃引擎、电动机、空气压缩引擎或其他类型的一种引擎或多种引擎的组合,这里多种引擎的组合,举例来说可以包括:汽油发动机和电动机组成的混动引擎,内燃引擎和空气压缩引擎组成的混动引擎。引擎1018将能量源1019转换成机械能量。
能量源1019的示例包括汽油、柴油、其他基于石油的燃料、丙烷、其他基于压缩气体的燃料、乙醇、太阳能电池板、电池和其他电力来源。能量源1019也可以为车辆100的其他系统提供能量。
传动装置1020可以将来自引擎1018的机械动力传送到车轮1021。传动装置1020可包括变速箱、差速器和驱动轴。在一个实施例中,传动装置1020还可以包括其他器件,比如离合器。其中,驱动轴可包括可耦合到一个或多个车轮1021的一个或多个轴。
传感器系统1004可包括感测关于车辆100自身以及车辆100周边的环境的信息的若干个传感器。例如,传感器系统1004可包括定位系统1022(定位系统可为GNSS系统,可包括GPS系统,也可以包括北斗系统或者其他定位系统)、惯性测量单元(inertial measurement unit,IMU)1024、雷达1026、激光测距仪1028、相机1030、计算机视觉系统1038以及传感器融合算法1040。传感器系统1004还可包括车辆100的内部系统的传感器(例如,车内空气质量监测器、燃油量表、机油温度表等)。来自这些传感器中的一个或多个传感器数据可用于检测待检测的对象及其相应特性(位置、形状、方向、速度等)。这种检测和识别是车辆100实现安全操作的关键功能。
全球定位系统1022可用于估计车辆100的地理位置。IMU 1024用于基于惯性加速度来感测车辆100的位置和朝向变化。在一个实施例中,IMU 1024可以是加速度计和陀螺仪的组合。
雷达1026可利用无线电信号来感测车辆100的周边环境中的物体。在一些实施例中,除了感测物体以外,雷达1026还可用于感测物体的速度或者行进方向。
激光测距仪1028可利用激光来感测车辆100所处环境中的物体。在一些实施例中,激光测距仪1028可包括一个或多个激光源、激光扫描器以及一个或多个检测器,以及其他系统组件。
相机1030可用于捕捉车辆100的周边环境的多个图像。相机1030可以是静态摄像 头或视频摄像头。
计算机视觉系统1038可以操作来处理和分析由相机1030捕捉的图像以便识别车辆100周边环境中的物体或者特征。所述物体或者特征可包括交通信号、道路边界和目标。计算机视觉系统1038可使用物体识别算法、运动中恢复结构(Structure from Motion,SFM)算法、视频跟踪和其他计算机视觉技术。在一些实施例中,计算机视觉系统1038可以用于为环境绘制地图、跟踪物体、估计物体的速度等等。
规划控制系统1006为控制车辆100及其组件的操作。规划控制系统1006可包括各种元件,其中包括转向系统1032、油门1034、制动单元1036、路线控制系统1042以及目标避免系统1044。
通过对转向系统1032的操作可以调整车辆100的前进方向。例如在一个实施例中可以为方向盘系统。
油门1034用于控制引擎1018的操作速度并进而控制车辆100的速度。
制动单元1036用于控制车辆100减速。制动单元1036可使用摩擦力来减慢车轮1021。在其他实施例中,制动单元1036可将车轮1021的动能转换为电流。制动单元1036也可采取其他形式来减慢车轮1021转速从而控制车辆100的速度。
路线规划系统1042用于确定车辆100的行驶路线。在一些实施例中,路线规划系统1042可结合来自传感器1038、GPS 1022和一个或多个预定地图的数据为车辆100规划出能避开环境中潜在目标的行驶路线。本申请实施例提供的轨迹规划方法,即可以由路线规划系统1042执行,以为车辆100输出一条目标行驶轨迹,该目标行驶轨迹包含多个目标路点,其中,多个目标路点中的每个目标路点包含该路点的坐标,以及该路点的横向允许误差和速度允许误差,本文所述的横向允许误差包括横向允许误差的取值范围,在一些情况下可以理解为横向允许误差的取值范围的简称。这里的横向,是指与车辆行进方向垂直的方向或近似垂直方向;横向允许误差,其实质含义为横向位移允许误差,也即车辆100在车辆行进方向的垂直方向或近似垂直方向上,允许的位移误差的取值范围。后文对此不再赘述。
控制系统1044用于根据路线规划系统输出的行驶路线/行驶轨迹生成油门刹车以及转向角的控制量,从而对转向系统1032、油门1034以及制动单元1036进行控制。
当然,在一个实例中,规划控制系统1006可以增加或替换地包括除了所示出和描述的那些以外的组件。或者也可以减少一部分上述示出的组件。
车辆100通过外围设备1008与外部传感器、其他车辆、其他计算机系统或用户之间进行交互。外围设备1008可包括无线通信系统1046、车载电脑1048、麦克风1050或者扬声器1052。
在一些实施例中,外围设备1008提供车辆100的用户与用户接口1016交互的手段。例如,车载电脑1048可向车辆100的用户提供信息。用户接口1016还可操作车载电脑1048来接收用户的输入。在一种实现方式中,车载电脑1048可以通过触摸屏进行操作。在其他情况中,外围设备1008可提供用于车辆100与位于车内的其它设备通信的手段。例如,麦克风1050可从车辆100的用户接收音频(例如,语音命令或其他音频输入)。类似地,扬声器1052可向车辆100的用户输出音频。
无线通信系统1046可以直接地或者经由通信网络来与一个或多个设备无线通信。例如,无线通信系统1046可使用3G蜂窝通信,例如CDMA、EVD0、GSM/GPRS,或者4G蜂窝 通信,例如LTE。或者5G蜂窝通信。无线通信系统1046可利用WiFi与无线局域网(wireless local area network,WLAN)通信。在一些实施例中,无线通信系统1046可利用红外链路、蓝牙或ZigBee与设备直接通信。其他无线协议,例如各种车辆通信系统,例如,无线通信系统1046可包括一个或多个专用短程通信(dedicated short range communications,DSRC)设备,这些设备可包括车辆或者路边台站之间的公共或者私有数据通信。
电源1010可向车辆100的各种组件提供电力。在一个实施例中,电源1010可以为可再充电锂离子或铅酸电池。这种电池的一个或多个电池组可被配置为电源并为车辆100的各种组件提供电力。在一些实施例中,电源1010和能量源1019可一起实现,如全电动车中。
车辆100的部分或所有功能受计算机系统1001控制。计算机系统1001可包括至少一个处理器1013,处理器1013执行存储在例如存储器1014这样的非暂态计算机可读介质中的指令1015。计算机系统1001还可以是采用分布式方式控制车辆100的个体组件或子系统的多个计算设备。
处理器1013可以是任何常规的处理器,诸如商业可获得的CPU。替选地,该处理器可以是诸如ASIC或其它基于硬件的处理器的专用设备。尽管图1功能性地图示了处理器、存储器、以及计算机系统1001的其它元件,但是本领域的普通技术人员应该理解该处理器、存储器实际上可以包括不位于相同物理外壳内的其他多个处理器、或存储器。例如,存储器可以是硬盘驱动器或位于不同于计算机系统1001的外壳内的其它存储介质。因此,对处理器的引用将被理解为包括对可以并行或不并行操作的处理器或存储器的集合的引用。不同于使用单一的处理器来执行此处所描述的步骤,诸如转向组件和减速组件的一些组件每个都可以具有其自己的处理器,所述处理器只执行与特定于组件的功能相关的计算;或者行进系统、传感器系统、规划控制系统等子系统也可以有自己的处理器,用于实现对应子系统的相关任务的计算从而实现相应功能。
在此处所描述的各个方面中,处理器可以位于远离该车辆的地方并且与该车辆进行无线通信。在其它方面中,此处所描述的过程中的一些在布置于车辆内的处理器上执行,而其它则由远程处理器执行,包括采取执行单一操纵的必要步骤。
在一些实施例中,存储器1014可包含指令1015(例如,程序逻辑),指令1015可被处理器1013执行来执行车辆100的各种功能,包括以上描述的那些功能。存储器1014也可包含额外的指令,包括向行进系统1002、传感器系统1004、规划控制系统1006和外围设备1008中的一个或多个发送数据、从其接收数据、与其交互或者对其进行控制的指令。
除了指令1015以外,存储器1014还可存储其他相关数据,例如道路地图、路线信息,车辆的位置、方向、速度以及其它相关信息。这种信息可在车辆100处于自主、半自主或者手动模式的操作期间被车辆100或具体被计算机系统1001使用。
用户接口1016,用于向车辆100的用户提供信息或从其接收信息。可选地,用户接口1016可包括在外围设备1008的集合内的一个或多个输入/输出设备,例如无线通信系统1046、车载电脑1048、麦克风1050和扬声器1052。
计算机系统1001可基于从各种子系统(例如,行进系统1002、传感器系统1004和规划控制系统1006)以及从用户接口1016接收的输入来控制车辆100的功能。在一些实施 例中,计算机系统1001可操作以对车辆100及其子系统的许多方面提供控制。
可选地,上述这些组件中的一个或多个可与车辆100分开安装或关联。例如,存储器1014可以部分或完全地与车辆100分开存在。上述组件可以按有线或者无线方式来通信地耦合在一起。
可选地,上述组件只是一个示例,实际应用中,上述各个模块中的组件有可能根据实际需要增添或者删除,图7不应理解为对本发明实施例的限制。
上述车辆100可以为轿车、卡车、摩托车、公共汽车、船、飞机、直升飞机、割草机、娱乐车、游乐场车辆、施工设备、电车、高尔夫球车、火车和手推车等,本发明实施例不做特别的限定。
在本申请实施例中,车辆100可接收GNSS信号,通过GNSS信号确定自身所处位置,实现对自身的定位,并且可通过本申请实施例提供的高架识别方法确定车辆是位于高架上侧的道路,还是位于高架下侧的道路。
下面结合图4中示出的终端设备,以及图8(a)至图8(c)所示的终端设备的界面示意图,对本申请实施例提供的高架识别方法进行示例性说明。
车辆在行驶过程中，可以通过终端设备进行导航。一种可能的实现方式中，车辆在行驶过程中，可以通过终端设备中安装的导航APP，为车辆进行导航。
其中,终端设备可通过本申请实施例提供的高架识别方法,确定车辆在第一时刻的第一参数,并根据传感器确定车辆在第一时刻的第二参数,在根据第一参数和第二参数确定车辆的行驶状态属于目标状态时,再根据车辆的高度变化,确定车辆是位于高架上侧的道路,还是位于高架下侧的道路。
在一些实施例中,可在终端设备每次启动之后,就周期性确定车辆在第一时刻的第一参数。
在一些实施例中,终端设备可在满足一定触发条件的情况下,再确定车辆在第一时刻的第一参数,针对这一场景,示例性的,公开以下方案:
(1)当终端设备接收到启动导航功能的操作时,确定车辆在第一时刻的第一参数。
如果终端设备接收到启动导航功能的操作,则表明用户需要利用该终端设备进行导航,这种情况下,再确定车辆在第一时刻的第一参数。
所述启动导航功能的操作可包括多种形式,例如,可包括对导航APP的触控操作或特定的手势操作等,本申请实施例对此不作限定。
(2)当终端设备接收到位置搜索操作时,确定车辆在第一时刻的第一参数。
如果终端设备接收到位置搜索操作,表明用户需要查看某一位置周边环境,该用户往往有导航需求,则确定车辆在第一时刻的第一参数。
示例性的,参见图8(a)所示的一种终端设备的显示界面的示例图,该图对应的示例中,终端设备接收到用于搜索中国国家图书馆的位置搜索操作,该显示界面中,包含三角形的圆圈所指示的位置即为中国国家图书馆,这种情况下,可确定车辆在第一时刻的第一参数。
(3)当终端设备接收到用于指示目的地的操作时,确定车辆在第一时刻的第一参数。
如果终端设备接收到用于指示目的地的操作,则表明用户需要前往某一目的地,该 用户往往有导航需求,则确定车辆在第一时刻的第一参数。
示例性的,参见图8(b)所示的一种终端设备的显示界面的示例图,该图对应的示例中,终端设备接收到用于指示目的地为中国国家图书馆的操作,在该显示界面中,起点即为车辆当前所处的位置,终点即为中国国家图书馆。这种情况下,可确定车辆在第一时刻的第一参数。
(4)当终端设备接收到用于指示导航方式为驾车的操作时,确定车辆在第一时刻的第一参数。
在导航过程中,根据导航需求,用户往往会选择不同的导航方式。导航方式通常包括:打车、驾车、公共交通、步行和骑行等。其中,如果终端设备应用的导航方式为驾车,则表明用户需要驾驶车辆,用户有导航需求。这种情况下,可确定车辆在第一时刻的第一参数。
示例性的,参见图8(c)所示的一种终端设备的界面的示例图,该示例中,终端设备应用的导航方式为驾车。
(5)当终端设备的速度大于目标速度阈值时,确定车辆在第一时刻的第一参数。
其中,如果终端设备的速度大于目标速度阈值,则表明终端设备的速度较快,携带终端设备的用户正在驾驶车辆。而用户在驾驶车辆的过程中,可能会驶入高架上侧的道路,因此,可确定车辆在第一时刻的第一参数。
这一方案中,示例性的,目标速度阈值可为30KM/h,当然,该目标速度阈值也可设置为其他值,本申请实施例对此不作限定。
(6)当根据GNSS信号,确定车辆的前方包含匝道口时,确定车辆在第一时刻的第一参数。
其中,终端设备可根据GNSS信号,确定终端设备的位置信息,并且,终端设备可将位置信息传输至远程的服务器。服务器内存储有各地匝道口的位置,并根据接收到的位置信息确定车辆的前方是否包含匝道口。服务器在确定车辆的前方包含匝道口之后,再向终端设备传输相应的提示信息,以提示车辆即将驶入匝道口所在的位置。
或者,终端设备内可存储各地的匝道口的位置信息,在确定终端设备的位置信息之后,终端设备可根据终端设备的位置信息与自身的存储相匹配,以确定车辆前方是否包含匝道口。
另外,终端设备还可与车辆内的设备相连接,例如,终端设备可与车辆的中控台内安装的车机相连接。这种情况下,该设备内可存储各地的匝道口的位置信息,终端设备还可向该设备传输位置信息,该设备据此确定车辆的前方是否包含匝道口,并在确定车辆的前方包含匝道口之后,向终端设备传输相应的提示信息。
匝道口通常包括匝道入口和匝道出口。车辆在行驶过程中,往往需要经过匝道入口,然后再驶入高架上侧的道路。并且,车辆从高架上侧的道路驶入下层的道路时,往往需要经过匝道出口,然后再驶入下层的道路。
如果确定车辆的前方包含匝道口,则表明车辆将要上高架或者将要下高架,这种情况下,可确定车辆在第一时刻的第一参数,以便通过本申请实施例提供的方案,对车辆是否在高架上侧或高架下侧的道路上行驶进行识别。
(7)当根据包括车辆的前方的图像,确定车辆的前方包括高架标志时,确定车辆在第一时刻的第一参数。
终端设备可获取车辆前方的图像,并通过图像分析,确定车辆前方是否包括高架标志,如果确定车辆前方包括高架标志,则表明车辆将要上高架或者将要下高架,这种情况下,可确定车辆在第一时刻的第一参数,以便通过本申请实施例提供的方案,对车辆是否在高架上侧或高架下侧的道路上行驶进行识别。
(8)当根据GNSS信号,确定车辆的周边包括高架时,确定车辆在第一时刻的第一参数。
终端设备可根据GNSS信号,确定终端设备所处的位置,并根据电子地图,确定该位置周边是否包括高架,如果包括,则表明车辆存在上高架或下高架的可能,因此可确定车辆在第一时刻的第一参数,以便对车辆是否在高架上侧或高架下侧的道路上进行识别。
当然,终端设备还可在其他场景下确定车辆在第一时刻的第一参数,本申请实施例对此不作限定。
为了明确本申请提供的方案,以下结合附图,通过各个实施例,对本申请所提供的方案进行介绍说明。
为了解决现有技术中,无法识别车辆位于高架上侧的道路还是位于高架下侧的道路的问题,本申请实施例提供一种高架识别方法。
参见图9所示的工作流程示意图,本申请实施例提供的高架识别方法包括以下步骤:
步骤S11、根据全球导航卫星系统GNSS信号,确定车辆在第一时刻的第一参数。
在本申请实施例的一种可行的设计中,第一参数至少包括以下信息中的一种:车辆的速度和航向。其中,车辆的行驶方向通常可作为车辆的航向。
根据在各个时刻接收到的GNSS信号,可确定终端设备分别在不同时刻所处的位置的位置信息。可以理解的是,当终端设备为车辆导航时,终端设备在不同时刻的位置信息,可在一定程度上反映车辆的轨迹,终端设备可据此确定车辆的速度和航向。其中,通过车辆在不同的时刻之间的距离和时间差,即可确定车辆的速度。
示例性的,参见图10(a)所示的车辆行驶的场景示意图,该示例中,车辆在高架上侧的道路行驶,并且车辆的行驶方向为从左至右。针对这一场景,公开与图10(a)相对应的图10(b),其中图10(b)为针对图10(a)的俯视图,图10(b)中显示的道路即为高架上侧的道路,并且,车辆各个时刻所处的位置在图10(b)中通过包含数字的圆圈表示,其中圆圈内的数字越小,表示车辆位于该位置的时刻越早。其中,设定车辆在时刻t1处于数字1指示的圆圈位置,设定车辆在时刻t2处于数字2指示的圆圈位置,设定车辆在时刻t3处于数字3指示的圆圈位置,设定车辆在时刻t4处于数字4指示的圆圈位置,并且,由于车辆从左向右行驶,因此时刻t1早于时刻t2,时刻t2早于时刻t3,时刻t3早于时刻t4。
这一示例中,车辆在时刻t1和时刻t2之间的速度,即为数字1指示的圆圈位置与数字2指示的圆圈位置之间的距离差值与时刻t1和时刻t2之间的时间差值的比值;车辆在时刻t3和时刻t2之间的速度,即为数字3指示的圆圈位置与数字2指示的圆圈位置之间的距离差值与时刻t3和时刻t2之间的时间差值的比值;车辆在时刻t1和时刻t4之间的速度,即为数字1指示的圆圈位置与数字4指示的圆圈位置之间的距离差值与时刻t1和时刻t4之间的时间差值的比值。
这种情况下,根据车辆分别在时刻t1、时刻t2、时刻t3和时刻t4的位置,以及不 同的时刻之间的时间差,确定车辆的速度。另外,在图10(b)所示的场景下,车辆的航向为图10(b)中包括箭头的虚线所指示的方向。
步骤S12、根据传感器,确定车辆在第一时刻的第二参数。
在一种可行的设计中,第一时刻为当前时刻,相应的,车辆在第一时刻的第一参数可为当前时刻的第一参数。另外,在这一设计中,车辆在第一时刻的第二参数可为车辆在当前时刻的第二参数。
在另一种可行的设计中,第一时刻可为一个时间段内的任意一个时刻,这种情况下,车辆在第一时刻的第一参数可为该时间段内某一个时刻的第一参数,车辆在第一时刻的第二参数可为车辆在该时间段内另一个时刻的第二参数。
一些实施例中,第二参数至少包括以下信息中的一种:车辆的俯仰角、横滚角和航向角。
其中,车辆的俯仰角通常指的是车辆相对于惯性坐标系的XOY平面“俯仰”的角度;车辆的横滚角通常指的是在惯性坐标系中,用于标识车辆的横向倾角;车辆的航向角通常指的是在惯性坐标系下,车辆质心速度与横轴的夹角。
第二参数可通过传感器采集,其中,采集第二参数的传感器通常可包括陀螺仪,当然,也可包括其他能够采集第二参数的传感器,本申请实施例对此不作限定。
其中,采集第二参数的传感器可设置在用于导航的终端设备内,也可设置在车辆内。如果该传感器设置在车辆内,该传感器可通过网络将采集到的第二参数传输至终端设备。
一些实施例中,该传感器可在终端设备接收GNSS信号的过程中,周期性采集第二参数并缓存。这种情况下,基于缓存的第二参数,可确定车辆在第一时刻的第二参数。
一些实施例中,在确定车辆在第一时刻的第一参数的过程中,可触发该传感器进行第二参数的采集,并获取传感器采集的第二参数。
步骤S13、根据第一参数和第二参数,确定车辆的行驶状态。
在本申请实施例提供的方案中,车辆的行驶状态通常包括多种。其中,如果车辆需要上、下高架,车辆的行驶状态可包括开始上匝道、结束上匝道、开始下匝道和结束下匝道。另外,如果车辆未上高架且未下高架,车辆的行驶状态可包括:上坡、下坡和在道路上行驶等。
当然,车辆的行驶状态还可包括其他类型,本申请实施例对此不作限定。
在这一步骤中,根据第一参数和第二参数确定车辆的行驶状态。在本申请实施例提供的一种可行的方案中,该操作可通过以下步骤实现:
首先,向分类器传输第一参数和第二参数,分类器用于根据车辆的参数,对车辆的行驶状态进行分类;
然后,根据分类器的输出,确定车辆的行驶状态。
也就是说,可基于分类器确定车辆的行驶状态,分类器可通过预先根据车辆在不同形状状态下的车辆信息进行训练确定。其中,该分类器可根据第一参数和第二参数训练得到,并且,该分类器输出的状态可包括车辆的各种行驶状态。示例性的,可通过车辆的速度、航向和俯仰角等对分类器进行训练,该分类器输出的状态可包括开始上匝道、结束上匝道、开始下匝道和结束下匝道等状态。
这种情况下,在根据第一参数和第二参数确定车辆的行驶状态时,第一参数和第二参数为分类器的输入,车辆的行驶状态为分类器的输出。
本申请对分类器的类型不做限定,在一种可行的示例中,分类器可包括支持向量机(support vector machine,SVM)。
步骤S14、当车辆的行驶状态属于目标状态时,根据所述车辆的高度变化,确定所述车辆的高架识别结果。
其中,所述目标状态包括以下行驶状态中的至少一种:开始上匝道、结束上匝道、开始下匝道和结束下匝道,所述车辆的高架识别结果为所述车辆在高架上侧的道路行驶,或者为所述车辆在高架下侧的道路行驶。
在本申请实施例中,如果车辆的行驶状态为目标状态中的一种,则表明车辆的行驶状态属于目标状态。
在高架的出入口处通常设置有匝道。车辆在上高架时,首先需要上匝道,然后驶入高架上侧的道路。相应的,车辆在下高架时,首先需要下匝道,然后驶入高架下侧的道路。因此,如果车辆的行驶状态属于目标状态,则表示车辆处于目标状态的时刻正在上高架、已完成上高架、开始下高架或者已完成下高架。
而车辆在上、下高架之后,车辆的高度会发生变化。这种情况下,本申请实施例结合车辆所属的目标状态和车辆的高度变化,确定车辆的高架识别结果。
在实际的路况中,高架可为多层或一层,其中,高架为一层,指的是高架只包括一层位于上侧的道路,高架为多层,指的是高架包括多层位于上侧的道路。
一些实施例中,如果高架为一层,在车辆不同的行驶状态下,可分别通过以下步骤确定车辆的高架识别结果:
(1)如果车辆的行驶状态为开始上匝道,并且车辆在之后的第一时间段内的高度变化大于第一阈值,确定车辆的高架识别结果为车辆在高架上侧的道路行驶。
这一方案中,第一时间段为在确定车辆的行驶状态的时间之后的时间段。也就是说,在车辆的行驶状态为开始上匝道之后的一段时间内,车辆高度增加,并且增加的高度大于第一阈值,这种情况下,可确定车辆已上高架,车辆在高架上侧的道路行驶。
第一阈值通常为正数,并且第一阈值的具体数值可预先设置。或者,第一阈值的具体数值可由当地的高架的高度确定。
如果第一阈值的具体数值由当地的高架的高度确定,则第一阈值通常略小于高架的高度。由于高架为一层,因此,这一方案中,高架的高度通常指的是以高架下侧的道路为基准面的情况下,高架上侧的道路的高度。
其中,终端设备可通过多种方式确定当地的高架的高度。在一种可行的方式中,终端设备内可存储各地的高架的高度,这种情况下,终端设备在根据GNSS信号,确定终端设备的位置信息之后,可通过查询自身的存储,确定该位置周边的高架的高度。
在另一种可行的方式中,终端设备在根据接收到的GNSS信号,确定终端设备的位置信息之后,可将该位置信息传输至远程的服务器,再由远程的服务器根据该位置信息,确定该位置周边的高架的高度,再将该高架的高度传输至终端设备。
在另一种可行的方式中,终端设备可与车辆进行信息交互,这种情况下,终端设备可将自身的位置信息传输至车辆,由车辆确定高架的高度,再将该高架的高度传输至终端设备。
当然,终端设备还可通过其他方式确定当地高架的高度,本申请实施例对此不作限定。
在本申请实施例的一种可行的设计中,第一时间段的时长可预先设置。
在另一种可行的设计中,第一时间段可根据车辆上、下高架时需要耗费的时长t1确定,其中,第一时间段可为比时长t1略长的时长。例如,如果t1为20秒,则第一时间段可为25秒。
在交通规则中,通常对车辆上、下高架的过程中的车速有一定限制。在这一设计中,根据对车速的限制,以及上、下高架时的坡度的长度,可确定车辆在上、下高架时所耗费的时长t1。
当然,还可通过其他方式确定第一时间段的长度,例如,终端设备可根据接收到的针对第一时间段的设置操作,确定第一时间段的长度,本申请实施例对此不作限定。
(2)如果车辆的行驶状态为开始下匝道,并且车辆在之后的第一时间段内的高度变化的绝对值大于第一阈值,确定车辆的高架识别结果为车辆在高架下侧的道路行驶。
这一方案中,第一时间段为在确定车辆的行驶状态的时间之后的时间段。也就是说,在车辆的行驶状态为开始下匝道之后的一段时间内,车辆高度降低,并且减少的高度大于第一阈值,这种情况下,可确定车辆已下高架,车辆在高架下侧的道路行驶。
(3)车辆的行驶状态为结束上匝道,并且车辆在之前的第二时间段内的高度变化大于第一阈值,确定车辆的高架识别结果为车辆在高架上侧的道路行驶。
这一方案中,第二时间段为在确定车辆的行驶状态的时间之前的时间段。也就是说,在车辆的行驶状态为结束上匝道之前的一段时间内,车辆高度增加,并且增加的高度大于第一阈值,这种情况下,可确定车辆已上高架,车辆在高架上侧的道路行驶。
在本申请实施例的一种可行的设计中,第二时间段的时长可预先设置。
在另一种可行的设计中,第二时间段可根据车辆上、下高架时需要耗费的时长t1确定,其中,第二时间段可为比时长t1略长的时长。
另外,第二时间段的长度可与第一时间段的长度相同,或者,二者也可不同,本申请实施例对此不作限定。
(4)车辆的行驶状态为结束下匝道,并且车辆在之前的第二时间段内的高度变化大于第一阈值,确定车辆的高架识别结果为车辆在高架下侧的道路行驶。
这一方案中,第二时间段为在确定车辆的行驶状态的时间之前的时间段。也就是说,在车辆的行驶状态为开始上匝道之后的一段时间内,车辆高度降低,并且减少的高度大于第一阈值,这种情况下,可确定车辆已下高架,车辆在高架下侧的道路行驶。
通过上述步骤,可确定车辆在不同的行驶状态下的高架识别结果。并且,上述步骤在确定车辆的高架识别结果时,结合了车辆的行驶状态和车辆的高度变化,能够识别车辆在高架上侧的道路还是在高架下侧道路,解决了现有技术不能确定车辆的高架识别结果的问题。
有些场景下,高架包括两层或更多层,即高架上侧的道路包括至少两层。如果高架包括至少两层,在这一场景下,在车辆的行驶状态不同的情况下,可分别通过以下步骤确定车辆的高架识别结果:
(1)如果车辆的行驶状态为开始上匝道,并且车辆在之后的第一时间段内的高度变化大于第一阈值,确定车辆的高架识别结果为车辆在高架上侧的道路行驶。
这一方案中,第一时间段在确定车辆的行驶状态的时间之后。也就是说,在车辆的行驶状态为开始上匝道之后的一段时间内,车辆高度增加,并且增加的高度大于第一阈 值,这种情况下,可确定车辆已上高架,车辆在高架上侧的道路行驶。
(2)如果车辆的行驶状态为开始下匝道,并且车辆在之后的第一时间段内的高度变化的绝对值大于第一阈值,根据车辆在下匝道之前的状态,确定车辆的高架识别结果为车辆是否在高架下侧的道路行驶。
如果车辆的行驶状态为开始下匝道,并且车辆在第一时间段内的高度变化的绝对值大于第一阈值,则表明在车辆的行驶状态为开始下匝道之后的一段时间内,车辆高度降低。
由于在这一场景中,高架可包括多层,因此,这种情况下,车辆有可能驶入高架下侧的道路,也可能是从高架上侧的较高层的道路,驶入高架上侧的较低层的道路。例如,高架上侧的道路包括三层,并且设定高度越高,对应的层数越高,与高架下侧的道路最接近的一层为第一层,与高架下侧的道路距离最远的一层为第三层,则车辆开始下匝道,并且车辆在之后的第一时间段内的高度变化的绝对值大于第一阈值,可能是车辆从高架上侧的第三层道路驶入高架上侧的第二层道路。因此,还需要根据车辆在下匝道之前的状态,确定车辆的高架识别结果为车辆是否在高架下侧的道路行驶。
在本申请实施例提供的方案中,车辆在下匝道之前的状态通常包括:车辆在第一层高架上行驶和车辆在其他层的高架上行驶,第一层高架为与高架下侧的道路最接近的高架层。
其中,如果车辆在下匝道之前的状态为车辆在第一层高架上行驶,那么在上述步骤中,可确定车辆的高架识别结果为车辆在高架下侧的道路行驶;如果车辆在下匝道之前的状态为车辆在其他层的高架上行驶,那么在上述步骤中,可确定车辆的高架识别结果为车辆在高架上侧的道路行驶,即车辆是从高架上侧的较高层的道路,驶入高架上侧的较低层的道路。
(3)车辆的行驶状态为结束上匝道,并且车辆在之前的第二时间段内的高度变化大于第一阈值,确定车辆的高架识别结果为车辆在高架上侧的道路行驶。
这一方案中,第二时间段为在确定车辆的行驶状态的时间之前的时间段。也就是说,在车辆的行驶状态为结束上匝道之前的一段时间内,车辆高度增加,并且增加的高度大于第一阈值,这种情况下,可确定车辆已上高架,车辆在高架上侧的道路行驶。
(4)车辆的行驶状态为结束下匝道,并且车辆在之前的第二时间段内的高度变化的绝对值大于所述第一阈值,根据车辆在下匝道之前的状态,确定车辆的高架识别结果。
这一方案中,第二时间段为在确定车辆的行驶状态的时间之前的时间段。如果车辆的行驶状态为结束下匝道,并且车辆在第二时间段内的高度变化的绝对值大于第一阈值,则表明在车辆的行驶状态为开始下匝道之后的一段时间内,车辆高度降低。
由于在这一场景中,高架可包括多层,因此,这种情况下,车辆有可能驶入高架下侧的道路,也可能是从高架上侧的较高层的道路,驶入高架上侧的较低层的道路。因此,还需要根据车辆在下匝道之前的状态,确定车辆的高架识别结果为车辆是否在高架下侧的道路行驶。
其中,如果车辆在下匝道之前的状态为车辆在第一层高架上行驶,那么在上述步骤中,可确定车辆的高架识别结果为车辆在高架下侧的道路行驶;如果车辆在下匝道之前的状态为车辆在其他层的高架上行驶,那么在上述步骤中,可确定车辆的高架识别结果为车辆在高架上侧的道路行驶,即车辆是从高架上侧的较高层的道路,驶入高架上侧的 较低层的道路。
通过上述步骤,可确定高架包括至少两层的情况下,车辆在不同的行驶状态下的高架识别结果。并且,上述步骤在确定车辆的高架识别结果时,结合了车辆在下匝道之前的状态。一些实施例中,可根据车辆的高度和每层高架的高度,确定车辆在下匝道之前的状态。
这一实施例中,每层高架的高度通常指的是以高架下侧的道路为基准面的情况下,该层高架的道路的高度。例如,高架上侧的道路包括多层,并且设定高度越高,对应的层数越高,与高架下侧的道路最接近的一层为第一层,则第n层高架的高度,指的是以高架下侧的道路为基准面的情况下,高架上侧的第n层高架的高度。
其中,车辆的高度可根据高度传感器(例如气压计等)确定。该高度传感器可安装在终端设备内,或者,该高度传感器可设置在车辆内,并在采集到车辆的高度之后,将车辆的高度传输至终端设备。
这一实施例中,终端设备可通过多种方式确定每层高架的高度。在一种可行的方式中,终端设备内可存储各地的高架中每层高架的高度。这种情况下,终端设备在根据GNSS信号,确定终端设备的位置信息之后,可确定该位置信息周边存在的高架,并通过查询自身的存储,确定每层高架的高度。
在另一种可行的方式中,终端设备在根据接收到的GNSS信号,确定终端设备的位置信息之后,可将该位置信息传输至远程的服务器,该远程的服务器确定每层高架的高度,并将其传输至终端设备,以便终端设备根据服务器的传输,确定每层高架的高度。
当然,终端设备还可通过其他方式确定每层高架的高度,本申请实施例对此不作限定。
通过上述方案,基于车辆的高度和每层高架的高度,可确定车辆在下匝道之前的状态。
一些实施例中,可通过以下步骤确定车辆在下匝道之前的状态:
第一步,每次在确定车辆的高架识别结果之后,记录车辆所在高架的层数的变化;
第二步,根据车辆所在高架的层数的变化,确定车辆在下匝道之前的状态。
示例性的,如果高架包括两层以上,这一实施例中,终端设备可每次在确定从当前位置驶入高架上一层的道路之后,则记录的车辆所在高架上侧的道路的层数加一,在确定从当前位置驶入高架下一层的道路之后,则记录的车辆所在高架上侧的道路的层数减一。
这种情况下,如果车辆在还未上高架前,通常可记录车辆所在高架的层数为0,如果记录的车辆所在高架的层数大于0,则表明车辆位于高架上侧的道路,如果记录的车辆所在高架的层数为0,则表明车辆位于高架下侧的道路。
另外,如果终端设备记录的车辆所在高架的层数为n,然后终端设备根据车辆的第一参数和第二参数,确定车辆的行驶状态为开始上匝道,并且车辆在第一时间段内的高度变化大于第一阈值,这种情况下,终端设备确定车辆上了一次高架,则记录车辆所在高架的层数为n+1。
通过上述方案,可根据对车辆所在高架的层数的变化的记录,确定车辆在下匝道之前的状态。
本申请实施例提供一种高架识别方法,在该方法中,根据GNSS信号确定在第一时刻 的第一参数,以及通过传感器确定车辆在第一时刻的第二参数,然后结合第一参数和第二参数确定车辆的行驶状态,再结合车辆的行驶状态和车辆的高度变化,确定车辆的高架识别结果,从而解决现有技术中,无法确定车辆的高架识别结果的问题,并且能够减少导航出现错误的次数,提高导航的准确度。
进一步的,由于本申请的方案可以提高导航的准确度,因此,还能够提高车辆上的用户的体验,减少驾驶过程的耗时,以及减少车辆的油耗,达到节能的目的。
并且,如果在本申请实施例中根据车辆所在高架的层数的变化,确定车辆在下匝道之前的状态,还可确定车辆所在高架的层数,进一步提高导航的准确度。
进一步的,为了提供导航的准确度,一些实施例中,还包括以下步骤:
在确定车辆的高架识别结果之后,向服务器上报车辆的高架识别结果;
或者,如果记录车辆所在高架的层数的变化,并且车辆的高架识别结果表明车辆在高架上侧的道路行驶,向服务器上报车辆的高架识别结果和车辆所在高架的层数。
该服务器可为导航APP的服务器,根据上报的信息,该服务器确定车辆在高架上侧的道路行驶还是在高架下侧的道路行驶。如果终端设备不仅上报车辆的高架识别结果,还上报了车辆所在高架的层数,则车辆在高架上侧的道路行驶时,服务器还可确定车辆所在高架的层数,从而能够更加准确地确定终端设备所在的位置,有助于提高导航的准确度。
进一步的,一些实施例中,还包括以下步骤:
如果车辆的高架识别结果表明车辆在高架下侧的道路行驶,并且车辆的行驶方向的前方包括弱信号区域,将导航方法由通过GNSS信号导航调整为通过网络定位方法或网络定位方法和惯性导航方法进行导航。
其中,弱信号区域通常包括:隧道区域或遮挡物遮蔽区域。遮挡物遮蔽区域指的是由遮挡物遮蔽,导致信号较弱的区域,该遮挡物可以为建筑物或植被等。
一种可行的设计中,终端设备可存储各个弱信号区域的位置,并根据自身的存储,确定车辆的行驶方向的前方是否为弱信号区域。
另一种可行的设计中,可由远程的服务器确定车辆的行驶方向的前方是否为弱信号区域,若是,该服务器向该终端设备传输相应的指示,以便终端设备根据接收到指示确定前方是否为弱信号区域。
当车辆位于弱信号区域时,终端设备接收到的GNSS信号往往较弱,或者无法接收到GNSS信号,如果终端设备继续基于GNSS信号进行导航,导航的准确度较低,甚至在接收不到GNSS信号时,无法实现导航。
这种情况下,终端设备通过网络定位方法或网络定位方法和惯性导航方法进行导航,能够提高导航的准确度。
其中,网络定位是一种通过终端设备接收到的网络信号,确定终端设备的位置的定位技术。该网络信号可来源于基站,也可来源无线保真(wireless fidelity,WIFI)热点。
如果该网络信号来源于基站,则终端设备可通过不同基站传输的网络信号的发射时间与该网络信号的接收时间,确定自身与不同基站之间的距离,然后根据自身与不同基站之间的距离和不同基站的位置,确定终端设备所处位置;如果该网络信号来源于WIFI热点,则终端设备可通过不同WIFI热点传输的网络信号的发射时间与该网络信号的接 收时间,确定自身与不同WIFI热点之间的距离,然后根据自身与不同WIFI热点之间的距离和不同WIFI热点所处的位置,确定终端设备所处的位置。
而生成网络信号的设备(例如基站和WIFI热点)的位置与卫星的位置不同,遮蔽物对GNSS信号的接收有较大影响时,可能对终端设备接收网络信号的影响较小,通过网络定位方法进行导航,可提高导航的准确度。
进一步,终端设备可结合网络定位方法和惯性导航方法,共同进行导航。其中,惯性导航方法依据车辆航位推算(vehicle dead reckoning,VDR)技术,该方法可通过惯导传感器(例如方向传感器和速度传感器)推算车辆的瞬时位置。通过网络定位方法和惯性导航方法共同进行导航,可进一步提高导航的准确度。
本申请实施例提供的高架识别方法,能够有效减少导航错误,提高导航精度。为了明确本申请的优势,以下提供一个示例。
在该示例中,终端设备分别通过现有技术和本申请实施例提供的方案,分别对车辆进行导航。
其中,图3为终端设备通过现有技术对车辆进行导航时,终端设备显示的用于为车辆进行导航的电子地图。图中导航指示用户车辆位于北四环东路辅路上,处于高架桥下;而用户车辆实际所在的位置是北四环主路的高架上,图3中五角星表示用户车辆实际位置,可见车辆实际位置与用户手机导航位置不一致,导航对车辆位置识别发生偏差。
图11为终端设备通过本申请实施例对车辆进行导航时,终端设备显示的用于为车辆进行导航的电子地图。参见图11,采用本申请实施例的高架识别方法可以准确地定位出用户车辆位于高架上,使手机导航位置与车辆实际位置一致,从而实现准确定位和导航。
本文中描述的各个方法实施例可以为独立的方案,也可以根据内在逻辑进行组合,这些方案都落入本申请的保护范围中。
可以理解的是,上述各个方法实施例中,由终端设备实现的方法和操作,也可以由可用于终端设备的部件(例如芯片或者电路)实现。
上述实施例对本申请提供的高架识别方法进行了介绍。可以理解的是,终端设备为了实现上述功能,其包含了执行每一个功能相应的硬件结构和/或软件模块。本领域技术人员应该很容易意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,本申请能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
本申请实施例可以根据上述方法示例对终端设备进行功能模块的划分,例如,可以对应每一个功能划分每一个功能模块,也可以将两个或两个以上的功能集成在一个处理模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。需要说明的是,本申请实施例中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。
以上,结合图1至图11详细说明了本申请实施例提供的方法。以下,结合图12至图13详细说明本申请实施例提供的装置。应理解,装置实施例的描述与方法实施例的描述相互对应,因此,未详细描述的内容可以参见上文方法实施例,为了简洁,这里不再 赘述。
参见图12,图12为本申请提供的导航装置的一种实施方式的结构框图。如图12所示,该装置1000可以包括:收发器1001和处理器1002。该装置1000可以执行上述图9所示方法实施例中终端设备执行的操作。
示例性的,在本申请一种可选的实施例中,所述收发器1001用于接收全球导航卫星系统GNSS信号。所述处理器1002用于:根据所述GNSS信号,确定车辆在第一时刻的第一参数;
根据传感器,确定所述车辆在第一时刻的第二参数;
根据所述第一参数和所述第二参数确定所述车辆的行驶状态;
当所述车辆的行驶状态属于目标状态时,根据所述车辆的高度变化,确定所述车辆的高架识别结果,所述目标状态包括以下行驶状态中的至少一种:开始上匝道、结束上匝道、开始下匝道和结束下匝道,所述车辆的高架识别结果为所述车辆在高架上侧的道路行驶,或者为所述车辆在高架下侧的道路行驶。
一种可能的实现方式中,所述第一参数至少包括以下信息中的一种:所述车辆的速度和航向;
所述第二参数至少包括以下信息中的一种:所述车辆的俯仰角、横滚角和航向角。
一种可能的实现方式中,所述处理器1002用于确定车辆在第一时刻的第一参数,具体为:
当接收到启动导航功能的操作时,确定所述车辆在第一时刻的第一参数;
或者,当接收到位置搜索操作时,确定所述车辆在第一时刻的第一参数;
或者,当接收到用于指示目的地的操作时,确定所述车辆在第一时刻的第一参数;
或者,当接收到用于指示导航方式为驾车的操作时,确定所述车辆在第一时刻的第一参数
或者,当终端设备的速度大于目标速度阈值时,确定所述车辆在第一时刻的第一参数;
或者,当根据所述GNSS信号,确定所述车辆的前方包含匝道口时,确定所述车辆在第一时刻的第一参数;
或者,当根据包括所述车辆的前方的图像,确定所述车辆的前方包括高架标志时,确定所述车辆在第一时刻的第一参数。
一种可能的实现方式中,所述处理器1002用于根据所述第一参数和所述第二参数,确定所述车辆的行驶状态,具体为:
向分类器传输所述第一参数和所述第二参数,所述分类器用于根据所述车辆的参数,对所述车辆的行驶状态进行分类;
根据所述分类器的输出,确定所述车辆的行驶状态。
一种可能的实现方式中,所述高架为一层,所述处理器1002用于根据所述车辆的高度变化,确定所述车辆的高架识别结果,具体为:
如果所述车辆的行驶状态为开始上匝道,并且所述车辆在之后的第一时间段内的高度变化大于第一阈值,确定所述车辆的高架识别结果为所述车辆在高架上侧的道路行驶;
或者,如果所述车辆的行驶状态为开始下匝道,并且所述车辆在之后的所述第一时间段内的高度变化的绝对值大于所述第一阈值,确定所述车辆的高架识别结果为所述车 辆在高架下侧的道路行驶;
或者,所述车辆的行驶状态为结束上匝道,并且所述车辆在之前的第二时间段内的高度变化大于所述第一阈值,确定所述车辆的高架识别结果为所述车辆在所述高架上侧的道路行驶;
或者,所述车辆的行驶状态为结束下匝道,并且所述车辆在之前的所述第二时间段内的高度变化大于所述第一阈值,确定所述车辆的高架识别结果为所述车辆在所述高架下侧的道路行驶。
一种可能的实现方式中,所述高架包括至少两层,所述处理器1002根据所述车辆的高度变化,确定所述车辆的高架识别结果,具体为:
如果所述车辆的行驶状态为开始上匝道,并且所述车辆在之后的第一时间段内的高度变化大于第一阈值,确定所述车辆的高架识别结果为所述车辆在高架上侧的道路行驶;
或者,如果所述车辆的行驶状态为开始下匝道,并且所述车辆在之后的所述第一时间段内的高度变化的绝对值大于所述第一阈值,根据所述车辆在下匝道之前的状态,确定所述车辆的高架识别结果;
或者,所述车辆的行驶状态为结束上匝道,并且所述车辆在之前的第二时间段内的高度变化大于所述第一阈值,确定所述车辆的高架识别结果为所述车辆在所述高架上侧的道路行驶;
或者,所述车辆的行驶状态为结束下匝道,并且所述车辆在之前的所述第二时间段内的高度变化的绝对值大于所述第一阈值,根据所述车辆在下匝道之前的状态,确定所述车辆的高架识别结果。
一种可能的实现方式中,所述车辆在下匝道之前的状态包括以下状态中的一种:所述车辆在第一层高架上行驶和所述车辆在其他层的高架上行驶,所述第一层高架为与所述高架下侧的道路最接近的高架上层的道路。
一种可能的实现方式中,所述处理器1002还用于:
根据所述车辆的高度和每层高架的高度,确定所述车辆在下匝道之前的状态;
或者,每次在确定所述车辆的高架识别结果之后,记录所述车辆所在高架的层数的变化;
根据所述车辆所在高架的层数的变化,确定所述车辆在下匝道之前的状态。
一种可能的实现方式中,所述处理器1002还用于:
在确定所述车辆的高架识别结果之后,向服务器上报所述车辆的高架识别结果;
或者,如果记录所述车辆所在高架的层数的变化,并且所述车辆的高架识别结果表明所述车辆在所述高架上侧的道路行驶,向所述服务器上报所述车辆的高架识别结果和所述车辆所在高架的层数。
一种可能的实现方式中,所述处理器1002还用于:
如果所述车辆的高架识别结果表明所述车辆在高架下侧的道路行驶,并且所述车辆的行驶方向的前方包括弱信号区域,将高架识别方法由通过GNSS信号导航调整为通过网络定位方法导航或通过网络定位方法和惯性高架识别方法导航。
一种可能的实现方式中,所述弱信号区域包括:隧道区域或遮挡物遮蔽区域。
也就是说,该装置1000可以实现对应于图9所示高架识别方法实施例中终端设备所执行的步骤或者流程,该装置1000可以包括用于执行图9所示高架识别方法实施例中终 端设备执行的方法的模块。应理解,各模块执行上述相应步骤的具体过程在上述高架识别方法实施例中已经详细说明,为了简洁,在此不再赘述。
本申请实施例还提供了一种导航装置,该导航装置包括至少一个处理器和通信接口。所述通信接口用于为所述至少一个处理器提供信息输入和/或输出,所述至少一个处理器用于执行上述方法实施例中的方法。
本申请实施例还提供一种终端设备,该终端设备包括处理器,当所述处理器执行存储器中的计算机程序或指令时,如上述方法实施例中的方法被执行。
本申请实施例还提供一种终端设备,该终端设备包括处理器和存储器;所述存储器用于存储计算机程序或指令;所述处理器用于执行所述存储器所存储的计算机程序或指令,以使所述终端设备执行如上述方法实施例中的方法。
本申请实施例还提供一种终端设备,该终端设备包括处理器、存储器和收发器;所述收发器用于接收信号或者发送信号;所述存储器用于存储计算机程序或指令;所述处理器用于执行所述存储器所存储的计算机程序或指令,以使所述终端设备执行如上述方法实施例中的方法。
本申请实施例还提供一种终端设备,该终端设备包括处理器和接口电路;所述接口电路,用于接收计算机程序或指令并传输至所述处理器;所述处理器用于运行所述计算机程序或指令,以使所述终端设备执行如上述方法实施例中的方法。
应理解,上述导航装置可以是一个芯片。例如,参见图13,图13为本申请提供的芯片的一种实施方式的结构框图。图13所示的芯片可以为通用处理器,也可以为专用处理器。该芯片1100可以包括至少一个处理器1101。其中,所述至少一个处理器1101可以用于支持图14所示的装置执行图9所示的技术方案。
可选的,该芯片1100还可以包括收发器1102,收发器1102用于接受处理器1101的控制,用于支持图12所示的装置执行图9所示的技术方案。可选的,图13所示的芯片1100还可以包括存储介质1103。具体的,所述收发器1102可以替换为通信接口,所述通信接口为所述至少一个处理器1101提供信息输入和/或输出。
需要说明的是,图13所示的芯片1100可以使用下述电路或者器件来实现:一个或多个现场可编程门阵列(field programmable gate array,FPGA)、可编程逻辑器件(programmable logic device,PLD)、专用集成芯片(application specific integrated circuit,ASIC)、系统芯片(system on chip,SoC)、中央处理器(central processor unit,CPU)、网络处理器(network processor,NP)、数字信号处理电路(digital signal processor,DSP)、微控制器(micro controller unit,MCU),控制器、状态机、门逻辑、分立硬件部件、任何其他适合的电路、或者能够执行本申请通篇所描述的各种功能的电路的任意组合。
在实现过程中,上述方法的各步骤可以通过处理器中的硬件的集成逻辑电路或者软件形式的指令完成。结合本申请实施例所公开的方法的步骤可以直接体现为硬件处理器执行完成,或者用处理器中的硬件及软件模块组合执行完成。软件模块可以位于随机存储器,闪存、只读存储器,可编程只读存储器或者电可擦写可编程存储器、寄存器等本领域成熟的存储介质中。该存储介质位于存储器,处理器读取存储器中的信息,结合其硬件完成上述方法的步骤。为避免重复,这里不再详细描述。
应注意,本申请实施例中的处理器可以是一种集成电路芯片,具有信号的处理能力。 在实现过程中,上述方法实施例的各步骤可以通过处理器中的硬件的集成逻辑电路或者软件形式的指令完成。上述的处理器可以是通用处理器、数字信号处理器(DSP)、专用集成电路(ASIC)、现场可编程门阵列(FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件。可以实现或者执行本申请实施例中的公开的各方法、步骤及逻辑框图。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。结合本申请实施例所公开的方法的步骤可以直接体现为硬件译码处理器执行完成,或者用译码处理器中的硬件及软件模块组合执行完成。软件模块可以位于随机存储器,闪存、只读存储器,可编程只读存储器或者电可擦写可编程存储器、寄存器等本领域成熟的存储介质中。该存储介质位于存储器,处理器读取存储器中的信息,结合其硬件完成上述方法的步骤。
可以理解,本申请实施例中的存储器可以是易失性存储器或非易失性存储器,或可包括易失性和非易失性存储器两者。其中,非易失性存储器可以是只读存储器(read-only memory,ROM)、可编程只读存储器(programmable ROM,PROM)、可擦除可编程只读存储器(erasable PROM,EPROM)、电可擦除可编程只读存储器(electrically EPROM,EEPROM)或闪存。易失性存储器可以是随机存取存储器(random access memory,RAM),其用作外部高速缓存。通过示例性但不是限制性说明,许多形式的RAM可用,例如静态随机存取存储器(static RAM,SRAM)、动态随机存取存储器(dynamic RAM,DRAM)、同步动态随机存取存储器(synchronous DRAM,SDRAM)、双倍数据速率同步动态随机存取存储器(double data rate SDRAM,DDR SDRAM)、增强型同步动态随机存取存储器(enhanced SDRAM,ESDRAM)、同步连接动态随机存取存储器(synchlink DRAM,SLDRAM)和直接内存总线随机存取存储器(direct rambus RAM,DR RAM)。应注意,本文描述的系统和方法的存储器旨在包括但不限于这些和任意其它适合类型的存储器。
According to the methods provided in the embodiments of the present application, an embodiment of the present application further provides a computer program product. The computer program product includes a computer program or instructions; when the computer program or instructions are run on a computer, the computer is caused to perform the method of any one of the embodiments shown in FIG. 9.
According to the methods provided in the embodiments of the present application, an embodiment of the present application further provides a computer storage medium. The computer storage medium stores a computer program or instructions; when the computer program or instructions are run on a computer, the computer is caused to perform the method of any one of the embodiments shown in FIG. 9.
According to the methods provided in the embodiments of the present application, an embodiment of the present application further provides a terminal device. The terminal device is a smart device, including a smartphone, a tablet computer, a personal digital assistant or the like, and the smart device includes the above apparatus for generating location information.
Those of ordinary skill in the art may realize that the various illustrative logical blocks and steps described in connection with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed by hardware or software depends on the specific application and the design constraints of the technical solution. Skilled persons may use different methods to implement the described functions for each particular application, but such implementation should not be considered to go beyond the scope of the present application.
Those skilled in the art can clearly understand that, for convenience and brevity of description, for the specific working processes of the systems, apparatuses and modules described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not repeated here.
In the several embodiments provided in the present application, it should be understood that the disclosed systems, apparatuses and methods may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; for example, the division of the modules is merely a logical function division, and there may be other division manners in actual implementation; for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses or units, and may be electrical, mechanical or in other forms.
The modules described as separate components may or may not be physically separated, and the components shown as modules may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
In addition, the functional modules in the embodiments of the present application may be integrated into one processing unit, or each module may exist physically alone, or two or more modules may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
The apparatus for generating location information, the chip, the computer storage medium, the computer program product and the terminal device provided in the above embodiments of the present application are all configured to perform the methods provided above. Therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects corresponding to the methods provided above, and details are not repeated here.
It should be understood that, in the various embodiments of the present application, the execution order of the steps should be determined by their functions and internal logic; the sequence numbers of the steps do not imply an execution order and do not constitute any limitation on the implementation process of the embodiments.
Each part of this specification is described in a progressive manner; for identical or similar parts between the embodiments, reference may be made to each other, and each embodiment focuses on its differences from the other embodiments. In particular, since the embodiments of the apparatus for generating location information, the chip, the computer storage medium, the computer program product and the terminal device are basically similar to the method embodiments, their description is relatively brief, and for relevant parts, reference may be made to the description in the method embodiments.
Although preferred embodiments of the present application have been described, those skilled in the art, once they learn of the basic inventive concept, can make further changes and modifications to these embodiments. Therefore, the appended claims are intended to be construed as including the preferred embodiments and all changes and modifications falling within the scope of the present application.
The embodiments of the present application described above do not constitute a limitation on the protection scope of the present application.

Claims (26)

  1. An elevated road identification method, characterized by comprising:
    determining a first parameter of a vehicle at a first moment according to a global navigation satellite system GNSS signal;
    determining a second parameter of the vehicle at the first moment according to a sensor;
    determining a driving state of the vehicle according to the first parameter and the second parameter; and
    when the driving state of the vehicle belongs to a target state, determining an elevated road identification result of the vehicle according to a height change of the vehicle, wherein the target state comprises at least one of the following driving states: starting to go up a ramp, finishing going up a ramp, starting to go down a ramp, and finishing going down a ramp, and the elevated road identification result of the vehicle is that the vehicle is driving on a road on an upper side of an elevated road, or that the vehicle is driving on a road on a lower side of the elevated road.
  2. The method according to claim 1, characterized in that:
    the first parameter comprises at least one of the following information: a speed and a heading of the vehicle; and
    the second parameter comprises at least one of the following information: a pitch angle, a roll angle and a heading angle of the vehicle.
  3. The method according to claim 1, characterized in that determining the first parameter of the vehicle at the first moment comprises:
    when an operation of starting a navigation function is received, determining the first parameter of the vehicle at the first moment;
    or, when a location search operation is received, determining the first parameter of the vehicle at the first moment;
    or, when an operation for indicating a destination is received, determining the first parameter of the vehicle at the first moment;
    or, when an operation for indicating that the navigation mode is driving is received, determining the first parameter of the vehicle at the first moment;
    or, when a speed of the terminal device is greater than a target speed threshold, determining the first parameter of the vehicle at the first moment;
    or, when it is determined, according to the GNSS signal, that a ramp entrance lies ahead of the vehicle, determining the first parameter of the vehicle at the first moment;
    or, when it is determined, according to an image including the area ahead of the vehicle, that an elevated road sign lies ahead of the vehicle, determining the first parameter of the vehicle at the first moment.
  4. The method according to claim 1, characterized in that determining the driving state of the vehicle according to the first parameter and the second parameter comprises:
    transmitting the first parameter and the second parameter to a classifier, wherein the classifier is configured to classify the driving state of the vehicle according to parameters of the vehicle; and
    determining the driving state of the vehicle according to an output of the classifier.
  5. The method according to claim 1, characterized in that the elevated road has one layer, and determining the elevated road identification result of the vehicle according to the height change of the vehicle comprises:
    if the driving state of the vehicle is starting to go up a ramp, and the height change of the vehicle within a subsequent first time period is greater than a first threshold, determining that the elevated road identification result of the vehicle is that the vehicle is driving on the road on the upper side of the elevated road;
    or, if the driving state of the vehicle is starting to go down a ramp, and the absolute value of the height change of the vehicle within the subsequent first time period is greater than the first threshold, determining that the elevated road identification result of the vehicle is that the vehicle is driving on the road on the lower side of the elevated road;
    or, if the driving state of the vehicle is finishing going up a ramp, and the height change of the vehicle within a preceding second time period is greater than the first threshold, determining that the elevated road identification result of the vehicle is that the vehicle is driving on the road on the upper side of the elevated road;
    or, if the driving state of the vehicle is finishing going down a ramp, and the height change of the vehicle within the preceding second time period is greater than the first threshold, determining that the elevated road identification result of the vehicle is that the vehicle is driving on the road on the lower side of the elevated road.
  6. The method according to claim 1, characterized in that the elevated road comprises at least two layers, and determining the elevated road identification result of the vehicle according to the height change of the vehicle comprises:
    if the driving state of the vehicle is starting to go up a ramp, and the height change of the vehicle within a subsequent first time period is greater than a first threshold, determining that the elevated road identification result of the vehicle is that the vehicle is driving on the road on the upper side of the elevated road;
    or, if the driving state of the vehicle is starting to go down a ramp, and the absolute value of the height change of the vehicle within the subsequent first time period is greater than the first threshold, determining the elevated road identification result of the vehicle according to a state of the vehicle before going down the ramp;
    or, if the driving state of the vehicle is finishing going up a ramp, and the height change of the vehicle within a preceding second time period is greater than the first threshold, determining that the elevated road identification result of the vehicle is that the vehicle is driving on the road on the upper side of the elevated road;
    or, if the driving state of the vehicle is finishing going down a ramp, and the absolute value of the height change of the vehicle within the preceding second time period is greater than the first threshold, determining the elevated road identification result of the vehicle according to the state of the vehicle before going down the ramp.
  7. The method according to claim 6, characterized in that:
    the state of the vehicle before going down the ramp comprises one of the following states: the vehicle is driving on a first-layer elevated road, and the vehicle is driving on an elevated road of another layer, wherein the first-layer elevated road is the elevated road layer closest to the road on the lower side of the elevated road.
  8. The method according to claim 6 or 7, characterized by further comprising:
    determining the state of the vehicle before going down the ramp according to a height of the vehicle and a height of each layer of the elevated road;
    or, after the elevated road identification result of the vehicle is determined each time, recording a change in the number of the layer of the elevated road on which the vehicle is located; and
    determining the state of the vehicle before going down the ramp according to the change in the number of the layer of the elevated road on which the vehicle is located.
  9. The method according to claim 8, characterized by further comprising:
    after determining the elevated road identification result of the vehicle, reporting the elevated road identification result of the vehicle to a server;
    or, if the change in the number of the layer of the elevated road on which the vehicle is located is recorded, and the elevated road identification result of the vehicle indicates that the vehicle is driving on the road on the upper side of the elevated road, reporting the elevated road identification result of the vehicle and the number of the layer of the elevated road on which the vehicle is located to the server.
  10. The method according to any one of claims 1 to 9, characterized by further comprising:
    if the elevated road identification result of the vehicle indicates that the vehicle is driving on the road on the lower side of the elevated road, and a weak signal area lies ahead in the driving direction of the vehicle, adjusting the navigation method from navigation by means of GNSS signals to navigation by means of a network positioning method, or to navigation by means of a network positioning method and an inertial navigation method.
  11. The method according to claim 10, characterized in that:
    the weak signal area comprises: a tunnel area or an area shielded by an obstruction.
  12. An elevated road identification apparatus, characterized in that the apparatus comprises a transceiver and a processor;
    the transceiver is configured to receive a global navigation satellite system GNSS signal; and
    the processor is configured to:
    determine a first parameter of a vehicle at a first moment according to the GNSS signal;
    determine a second parameter of the vehicle at the first moment according to a sensor;
    determine a driving state of the vehicle according to the first parameter and the second parameter; and
    when the driving state of the vehicle belongs to a target state, determine an elevated road identification result of the vehicle according to a height change of the vehicle, wherein the target state comprises at least one of the following driving states: starting to go up a ramp, finishing going up a ramp, starting to go down a ramp, and finishing going down a ramp, and the elevated road identification result of the vehicle is that the vehicle is driving on a road on an upper side of an elevated road, or that the vehicle is driving on a road on a lower side of the elevated road.
  13. The apparatus according to claim 12, characterized in that:
    the first parameter comprises at least one of the following information: a speed and a heading of the vehicle; and
    the second parameter comprises at least one of the following information: a pitch angle, a roll angle and a heading angle of the vehicle.
  14. The apparatus according to claim 12, characterized in that the processor is configured to determine the first parameter of the vehicle at the first moment, specifically by:
    when an operation of starting a navigation function is received, determining the first parameter of the vehicle at the first moment;
    or, when a location search operation is received, determining the first parameter of the vehicle at the first moment;
    or, when an operation for indicating a destination is received, determining the first parameter of the vehicle at the first moment;
    or, when an operation for indicating that the navigation mode is driving is received, determining the first parameter of the vehicle at the first moment;
    or, when a speed of the terminal device is greater than a target speed threshold, determining the first parameter of the vehicle at the first moment;
    or, when it is determined, according to the GNSS signal, that a ramp entrance lies ahead of the vehicle, determining the first parameter of the vehicle at the first moment;
    or, when it is determined, according to an image including the area ahead of the vehicle, that an elevated road sign lies ahead of the vehicle, determining the first parameter of the vehicle at the first moment.
  15. The apparatus according to claim 12, characterized in that the processor is configured to determine the driving state of the vehicle according to the first parameter and the second parameter, specifically by:
    transmitting the first parameter and the second parameter to a classifier, wherein the classifier is configured to classify the driving state of the vehicle according to parameters of the vehicle; and
    determining the driving state of the vehicle according to an output of the classifier.
  16. The apparatus according to claim 12, characterized in that the elevated road has one layer, and the processor is configured to determine the elevated road identification result of the vehicle according to the height change of the vehicle, specifically by:
    if the driving state of the vehicle is starting to go up a ramp, and the height change of the vehicle within a subsequent first time period is greater than a first threshold, determining that the elevated road identification result of the vehicle is that the vehicle is driving on the road on the upper side of the elevated road;
    or, if the driving state of the vehicle is starting to go down a ramp, and the absolute value of the height change of the vehicle within the subsequent first time period is greater than the first threshold, determining that the elevated road identification result of the vehicle is that the vehicle is driving on the road on the lower side of the elevated road;
    or, if the driving state of the vehicle is finishing going up a ramp, and the height change of the vehicle within a preceding second time period is greater than the first threshold, determining that the elevated road identification result of the vehicle is that the vehicle is driving on the road on the upper side of the elevated road;
    or, if the driving state of the vehicle is finishing going down a ramp, and the height change of the vehicle within the preceding second time period is greater than the first threshold, determining that the elevated road identification result of the vehicle is that the vehicle is driving on the road on the lower side of the elevated road.
  17. The apparatus according to claim 12, characterized in that the elevated road comprises at least two layers, and the processor determines the elevated road identification result of the vehicle according to the height change of the vehicle, specifically by:
    if the driving state of the vehicle is starting to go up a ramp, and the height change of the vehicle within a subsequent first time period is greater than a first threshold, determining that the elevated road identification result of the vehicle is that the vehicle is driving on the road on the upper side of the elevated road;
    or, if the driving state of the vehicle is starting to go down a ramp, and the absolute value of the height change of the vehicle within the subsequent first time period is greater than the first threshold, determining the elevated road identification result of the vehicle according to a state of the vehicle before going down the ramp;
    or, if the driving state of the vehicle is finishing going up a ramp, and the height change of the vehicle within a preceding second time period is greater than the first threshold, determining that the elevated road identification result of the vehicle is that the vehicle is driving on the road on the upper side of the elevated road;
    or, if the driving state of the vehicle is finishing going down a ramp, and the absolute value of the height change of the vehicle within the preceding second time period is greater than the first threshold, determining the elevated road identification result of the vehicle according to the state of the vehicle before going down the ramp.
  18. The apparatus according to claim 17, characterized in that:
    the state of the vehicle before going down the ramp comprises one of the following states: the vehicle is driving on a first-layer elevated road, and the vehicle is driving on an elevated road of another layer, wherein the first-layer elevated road is the elevated road layer closest to the road on the lower side of the elevated road.
  19. The apparatus according to claim 17 or 18, characterized in that the processor is further configured to:
    determine the state of the vehicle before going down the ramp according to a height of the vehicle and a height of each layer of the elevated road;
    or, after the elevated road identification result of the vehicle is determined each time, record a change in the number of the layer of the elevated road on which the vehicle is located; and
    determine the state of the vehicle before going down the ramp according to the change in the number of the layer of the elevated road on which the vehicle is located.
  20. The apparatus according to claim 19, characterized in that the processor is further configured to:
    after determining the elevated road identification result of the vehicle, report the elevated road identification result of the vehicle to a server;
    or, if the change in the number of the layer of the elevated road on which the vehicle is located is recorded, and the elevated road identification result of the vehicle indicates that the vehicle is driving on the road on the upper side of the elevated road, report the elevated road identification result of the vehicle and the number of the layer of the elevated road on which the vehicle is located to the server.
  21. The apparatus according to any one of claims 12 to 20, characterized in that the processor is further configured to:
    if the elevated road identification result of the vehicle indicates that the vehicle is driving on the road on the lower side of the elevated road, and a weak signal area lies ahead in the driving direction of the vehicle, adjust the navigation method from navigation by means of GNSS signals to navigation by means of a network positioning method, or to navigation by means of a network positioning method and an inertial navigation method.
  22. The apparatus according to claim 21, characterized in that:
    the weak signal area comprises: a tunnel area or an area shielded by an obstruction.
  23. A terminal device, characterized in that the terminal device comprises the apparatus according to any one of claims 12 to 22.
  24. A computer storage medium, characterized in that the computer storage medium stores a computer program or instructions, and when the computer program or instructions are executed, the method according to any one of claims 1 to 11 is performed.
  25. A computer program product, characterized in that the computer program product comprises a computer program or instructions, and when the computer program or instructions are run on a computer, the computer is caused to perform the method according to any one of claims 1 to 11.
  26. A chip, characterized in that the chip comprises a processor, the processor is coupled to a memory and is configured to execute a computer program or instructions stored in the memory, and when the computer program or instructions are executed, the method according to any one of claims 1 to 11 is performed.
PCT/CN2022/091520 2021-08-06 2022-05-07 一种高架识别方法及装置 WO2023010923A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110904063.XA CN113804211B (zh) 2021-08-06 2021-08-06 一种高架识别方法及装置
CN202110904063.X 2021-08-06

Publications (1)

Publication Number Publication Date
WO2023010923A1 true WO2023010923A1 (zh) 2023-02-09

Family

ID=78942753

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/091520 WO2023010923A1 (zh) 2021-08-06 2022-05-07 一种高架识别方法及装置

Country Status (2)

Country Link
CN (1) CN113804211B (zh)
WO (1) WO2023010923A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113804211B (zh) * 2021-08-06 2023-10-03 荣耀终端有限公司 一种高架识别方法及装置
CN114509068A (zh) * 2022-01-04 2022-05-17 海信集团控股股份有限公司 一种多层道路上的车辆的位置判断方法及装置


Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103335655A (zh) * 2013-05-29 2013-10-02 周眉 导航仪及其导航方法
CN106989743A (zh) * 2017-03-31 2017-07-28 上海电机学院 一种能自动感知进出高架道路信息的车辆导航设备
CN108195391A (zh) * 2018-01-29 2018-06-22 千寻位置网络有限公司 基于气压计的在高架上或高架下的检测方法
CN110979339B (zh) * 2019-11-26 2021-03-30 南京市德赛西威汽车电子有限公司 一种基于v2v的前方道路形态重建方法
CN112945230B (zh) * 2021-01-26 2022-03-25 腾讯科技(深圳)有限公司 车辆行车状态的识别方法、装置、计算机设备和存储介质
CN115164911A (zh) * 2021-02-03 2022-10-11 西华大学 基于图像识别的高精准度立交桥快速导航方法

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102012234A (zh) * 2009-09-04 2011-04-13 陈宗炜 一种车辆导航装置
CN102401660A (zh) * 2010-09-17 2012-04-04 环达电脑(上海)有限公司 高架道路的定位方法
US20150066351A1 (en) * 2013-08-30 2015-03-05 Bosch Automotive Products (Suzhou) Co. Ltd. Method and apparatus for providing vehicle navigation information within an elevated road area
JP2015083930A (ja) * 2013-10-25 2015-04-30 アルパイン株式会社 ナビゲーション装置および高架上下道判定方法
CN107764274A (zh) * 2016-08-17 2018-03-06 厦门雅迅网络股份有限公司 一种判别车辆是否行驶在高架道路的方法
CN111127874A (zh) * 2018-10-30 2020-05-08 上海擎感智能科技有限公司 一种高架识别方法及识别系统
CN109883438A (zh) * 2019-03-21 2019-06-14 斑马网络技术有限公司 车辆导航方法、装置、介质及电子设备
CN111310675A (zh) * 2020-02-20 2020-06-19 上海赛可出行科技服务有限公司 基于卷积神经网络的高架识别辅助定位方法
CN113804211A (zh) * 2021-08-06 2021-12-17 荣耀终端有限公司 一种高架识别方法及装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
BAI JIE: "Research on Vehicle Postures Recognition Method Based on Multi-source Sensor Information Fusion at the Overpass", CHINA MASTER'S' THESES FULL-TEXT DATABASE (ELECTRONIC JOURNAL)-INFORMATION & TECHNOLOGY), TIANJIN POLYTECHNIC UNIVERSITY, CN, no. 10, 15 October 2020 (2020-10-15), CN , XP093031492, ISSN: 1674-0246 *

Also Published As

Publication number Publication date
CN113804211B (zh) 2023-10-03
CN113804211A (zh) 2021-12-17

Similar Documents

Publication Publication Date Title
WO2023010922A1 (zh) 一种高架识别方法及装置
WO2023010923A1 (zh) 一种高架识别方法及装置
US10551207B2 (en) Autonomous vehicle sensor data and map integration
WO2020133088A1 (zh) 一种可用于自动驾驶的地图更新系统与方法
JP7192772B2 (ja) 画像処理装置および画像処理方法
US20220215639A1 (en) Data Presentation Method and Terminal Device
WO2022052765A1 (zh) 目标跟踪方法及装置
US20230161034A1 (en) Point cloud registration for lidar labeling
CN114882464B (zh) 多任务模型训练方法、多任务处理方法、装置及车辆
KR20130011351A (ko) 스마트 정보를 이용한 차량 운행 시스템, 이를 위한 단말기, 서비스장치 및 방법
CN110347147A (zh) 用于车辆的定位的方法和系统
WO2023169448A1 (zh) 一种感知目标的方法和装置
CN115170630B (zh) 地图生成方法、装置、电子设备、车辆和存储介质
CN114756700B (zh) 场景库建立方法、装置、车辆、存储介质及芯片
CN115164910B (zh) 行驶路径生成方法、装置、车辆、存储介质及芯片
CN114863717B (zh) 车位推荐方法、装置、存储介质及车辆
CN114937351B (zh) 车队控制方法、装置、存储介质、芯片、电子设备及车辆
WO2022142596A1 (zh) 一种图像处理方法,装置及储存介质
CN114880408A (zh) 场景构建方法、装置、介质以及芯片
CN114537450A (zh) 车辆控制方法、装置、介质、芯片、电子设备及车辆
CN113820732A (zh) 一种导航方法及装置
CN115115822B (zh) 车端图像处理方法、装置、车辆、存储介质及芯片
CN115221260B (zh) 数据处理方法、装置、车辆及存储介质
CN114842454B (zh) 障碍物检测方法、装置、设备、存储介质、芯片及车辆
CN113790733B (zh) 一种导航方法及装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22851643

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE