CN113792589B - Overhead identification method and device - Google Patents


Info

Publication number
CN113792589B
CN113792589B (application CN202110902596.4A)
Authority
CN
China
Prior art keywords
vehicle
overhead
parking
determining
speed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110902596.4A
Other languages
Chinese (zh)
Other versions
CN113792589A (en)
Inventor
邱宇
李康
李庆奇
黄鹏飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202110902596.4A priority Critical patent/CN113792589B/en
Publication of CN113792589A publication Critical patent/CN113792589A/en
Priority to PCT/CN2022/091512 priority patent/WO2023010922A1/en
Application granted granted Critical
Publication of CN113792589B publication Critical patent/CN113792589B/en
Legal status: Active


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation specially adapted for navigation in a road network
    • G01C21/34: Route searching; Route guidance
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00: Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38: Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39: Determining a navigation solution using a satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42: Determining position
    • G01S19/52: Determining velocity

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)

Abstract

The embodiments of the application provide an overhead identification method and device. In the method, a terminal device determines an overhead identification parameter of a vehicle, determines that the vehicle is located on the road on the upper side of an overhead when the overhead identification parameter meets a first condition, and determines that the vehicle is located on the road on the lower side of the overhead when the overhead identification parameter meets a second condition. The overhead identification parameter includes: a vehicle speed of the vehicle or a parking state parameter of the vehicle; the parking state parameter includes at least one of the following: the number of parking times and the parking duration within a first time period, and the distance between adjacent parking places. The first condition indicates a relationship between the vehicle speed and a corresponding speed threshold; the second condition indicates a relationship between the parking state parameter and a corresponding threshold. The scheme of the application solves the prior-art problem of being unable to identify whether a vehicle is on the overhead, reduces the number of navigation errors, and improves navigation accuracy.

Description

Overhead identification method and device
Technical Field
The application relates to the technical field of navigation, in particular to an overhead identification method and device.
Background
With the development of cities and the improvement of people's living standards, the number of automobiles continues to increase. To improve driving speed, relieve congestion, and solve the safety problems at intersections between roads and pedestrian routes, elevated roads have been built in many cities.
An elevated road, which may be called an overhead for short, is a three-dimensional road erected above a ground road for vehicle travel. The overhead divides the road into a road on the upper side of the overhead and a road on the lower side of the overhead. Depending on the destination, a user can choose whether to drive the vehicle on the road on the upper side of the overhead or on the road on the lower side of the overhead.
In addition, the user often needs to navigate while driving the vehicle. During navigation, the terminal device for navigation is usually required to determine the position information of the vehicle and plan a suitable route for the vehicle according to the position information.
However, the position information of the road on the upper side and the road on the lower side of the same overhead may be identical or similar, so the terminal device may be unable to determine from the vehicle's position information whether the vehicle is on the road on the upper side or the road on the lower side of the overhead, which causes navigation errors. Navigation errors often lead the driver onto an incorrect route, give the user an extremely poor driving experience, and also lengthen the driving process and increase the vehicle's fuel consumption.
Disclosure of Invention
To solve the prior-art problem that it cannot be identified whether a vehicle is located on the road on the upper side of an overhead or the road on the lower side of the overhead, the embodiments of the application provide an overhead identification method and device.
In a first aspect, an embodiment of the present application provides an overhead identification method, including:
determining an overhead identification parameter of the vehicle;
determining that the vehicle is located on a road on the upper side of the overhead if the overhead identification parameter of the vehicle meets a first condition;
determining that the vehicle is located on a road on the lower side of the overhead if the overhead identification parameter of the vehicle meets a second condition;
wherein the overhead identification parameters of the vehicle include: a vehicle speed of the vehicle or a parking state parameter of the vehicle;
the parking state parameter of the vehicle includes at least one of the following parameters: the number of parking times and the parking duration within a first time period, and the distance between adjacent parking places;
the first condition is used for indicating the relation between the vehicle speed of the vehicle and a speed threshold value corresponding to the vehicle speed;
the second condition is used for indicating a relationship between a parking state parameter of the vehicle and a threshold value corresponding to the parking state parameter.
Through the above steps, whether the vehicle is located on the road on the upper side or the road on the lower side of the overhead can be determined according to the vehicle's overhead identification parameter, the first condition, and the second condition, which solves the prior-art problem of being unable to identify whether the vehicle is on the overhead.
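The decision logic of the first aspect can be sketched as follows. The threshold values, parameter names, and function names here are illustrative assumptions, since the claims leave the concrete values unspecified:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical thresholds; the patent does not fix concrete values.
SPEED_THRESHOLD_KMH = 60.0       # first condition: sustained speed at/above this
STOP_COUNT_THRESHOLD = 3         # second condition: stops within the first time period
STOP_DURATION_THRESHOLD_S = 30.0 # second condition: a stop lasting longer than this
STOP_SPACING_THRESHOLD_M = 300.0 # second condition: distance between adjacent stops

@dataclass
class OverheadIdParams:
    speed_kmh: Optional[float] = None        # vehicle speed over the second time period
    stop_count: Optional[int] = None         # number of stops in the first time period
    stop_duration_s: Optional[float] = None  # parking duration
    stop_spacing_m: Optional[float] = None   # distance between adjacent stops

def classify(params: OverheadIdParams) -> Optional[str]:
    """Return 'upper', 'lower', or None when neither condition is met."""
    # First condition: speed not less than the corresponding threshold -> upper road.
    if params.speed_kmh is not None and params.speed_kmh >= SPEED_THRESHOLD_KMH:
        return "upper"
    # Second condition: any parking-state parameter exceeds its threshold -> lower road.
    if params.stop_count is not None and params.stop_count > STOP_COUNT_THRESHOLD:
        return "lower"
    if params.stop_duration_s is not None and params.stop_duration_s > STOP_DURATION_THRESHOLD_S:
        return "lower"
    if params.stop_spacing_m is not None and params.stop_spacing_m > STOP_SPACING_THRESHOLD_M:
        return "lower"
    return None
```

The intuition behind the thresholds: traffic on an overhead flows at sustained speed without traffic lights, while a ground road below the overhead forces frequent stops.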
In one possible implementation, the determining an overhead identification parameter of a vehicle includes:
determining an overhead identification parameter of the vehicle when an operation of starting a navigation function is received;
or when receiving the position searching operation, determining the overhead identification parameters of the vehicle;
or, when an operation for indicating a destination is received, determining an overhead identification parameter of the vehicle;
alternatively, when an operation for indicating that the navigation mode is driving is received, an overhead identification parameter of the vehicle is determined;
Or when the speed of the terminal equipment is greater than a target speed threshold value, determining an overhead identification parameter of the vehicle;
or when the front of the vehicle is determined to contain the ramp junction according to the GNSS signals, determining the overhead identification parameters of the vehicle;
or when it is determined that the front of the vehicle includes an overhead sign from an image including the front of the vehicle, determining an overhead recognition parameter of the vehicle;
alternatively, when it is determined that the periphery of the vehicle includes an overhead from the GNSS signal, an overhead identification parameter of the vehicle is determined.
Through this scheme, the terminal device determines the overhead identification parameters of the vehicle only when certain conditions are met, which reduces how often the parameters are determined, reduces the amount of data to be processed, and reduces the computing resources occupied in determining the parameters.
In a possible implementation manner, if the vehicle speed of the vehicle in the second time period is not less than the speed threshold corresponding to the vehicle speed, the overhead identification parameter of the vehicle meets the first condition;
or the parking state parameter of the vehicle comprises the number of parking times in a first time period, and if the number of parking times in the first time period is greater than a number threshold corresponding to the number of parking times, the overhead identification parameter of the vehicle meets the second condition;
or the parking state parameter of the vehicle comprises parking time, and if the parking time of the vehicle is greater than a time threshold corresponding to the parking time, the overhead identification parameter of the vehicle meets the second condition;
or the parking state parameters of the vehicle comprise the distance between adjacent parking places, and if the distance between the adjacent parking places is greater than a distance threshold corresponding to the distance between the adjacent parking places, the overhead identification parameters of the vehicle meet the second condition.
In this implementation, whether the vehicle is on the road on the upper side or the road on the lower side of the overhead can be identified from different types of overhead identification parameters, so the identification modes are diverse and the adaptability is good.
In one possible implementation, the determining the overhead identification parameter of the vehicle includes:
determining a vehicle speed of the vehicle from a sensor;
alternatively, the determining an overhead identification parameter of the vehicle comprises:
determining the position information of the vehicle at different moments according to the GNSS signals;
and determining the speed of the vehicle according to the position information of the vehicle at different moments.
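Deriving the vehicle speed from GNSS position fixes at different moments can be sketched as below; this is a minimal haversine-based estimate, and the function names and `(time, lat, lon)` sampling format are assumptions, not the patent's:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 fixes."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def speed_from_fixes(fixes):
    """fixes: time-ordered list of (t_seconds, lat, lon) GNSS samples.
    Returns the average speed in m/s over the whole span."""
    if len(fixes) < 2:
        return 0.0
    dist = sum(
        haversine_m(fixes[i][1], fixes[i][2], fixes[i + 1][1], fixes[i + 1][2])
        for i in range(len(fixes) - 1)
    )
    dt = fixes[-1][0] - fixes[0][0]
    return dist / dt if dt > 0 else 0.0
```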
In one possible implementation, the determining the vehicle speed of the vehicle according to the position information of the vehicle at different time includes:
taking the time period covering the different moments as a target time period, where the target time period comprises at least two sub-periods, and determining the vehicle speed in each sub-period according to the position information of the vehicle at each moment within the sub-period;
determining the vehicle speed of the vehicle based on the vehicle speed of a first sub-period, wherein the vehicle speed of the first sub-period is greater than a first speed threshold.
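The sub-period filtering described above, which keeps only sub-periods whose speed exceeds the first speed threshold (so that a brief slowdown does not drag down the estimate), might look like this sketch; the averaging choice is an assumption:

```python
def representative_speed(sub_period_speeds, first_speed_threshold):
    """Average only the sub-period speeds above the first speed threshold,
    discarding low-speed sub-periods such as a momentary slowdown."""
    kept = [v for v in sub_period_speeds if v > first_speed_threshold]
    return sum(kept) / len(kept) if kept else 0.0
```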
In one possible implementation, the determining the overhead identification parameter of the vehicle includes:
determining the position information of the vehicle at different moments according to the GNSS signals;
if the position information of the vehicle at different moments indicates that the distance between the vehicle's positions at a first moment and a second moment is within a first distance threshold, determining that the vehicle is in a parking state at the first moment and the second moment;
determining the parking time length of the vehicle according to the time length of the vehicle in the parking state;
or determining the distance between adjacent parking places according to the position information of the vehicle in the parking state;
or determining the parking times in the first time period according to the times of the vehicle in the parking state in the first time period.
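A sketch of deriving all three parking-state parameters from position fixes, under the assumption that consecutive fixes within the first distance threshold mark a parked interval and that contiguous parked pairs merge into one stop; `dist_fn` is injected so any distance function (for example the haversine above) can be used:

```python
def parking_stats(fixes, first_distance_threshold_m, dist_fn):
    """fixes: time-ordered (t_seconds, lat, lon) samples.
    Returns (stop_count, longest_stop_s, spacings_m), where spacings_m are
    the distances between adjacent stops."""
    stops = []  # list of (t_start, t_end, lat, lon) for each detected stop
    cur = None
    for (t0, la0, lo0), (t1, la1, lo1) in zip(fixes, fixes[1:]):
        if dist_fn(la0, lo0, la1, lo1) <= first_distance_threshold_m:
            if cur is None:
                cur = [t0, t1, la0, lo0]  # open a new parked interval
            else:
                cur[1] = t1               # extend the current parked interval
        else:
            if cur is not None:
                stops.append(tuple(cur))
                cur = None
    if cur is not None:
        stops.append(tuple(cur))
    stop_count = len(stops)
    longest = max((e - s for s, e, *_ in stops), default=0.0)
    spacings = [
        dist_fn(stops[i][2], stops[i][3], stops[i + 1][2], stops[i + 1][3])
        for i in range(len(stops) - 1)
    ]
    return stop_count, longest, spacings
```

Comparing `stop_count`, `longest`, and each spacing against their corresponding thresholds then yields the second condition of the claims.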
In one possible implementation, the determining the overhead identification parameter of the vehicle includes:
determining whether the vehicle is in a parking state or not according to the vehicle speed of the vehicle, wherein when the vehicle speed of the vehicle is smaller than a second speed threshold value, the vehicle is in the parking state;
determining the parking time length of the vehicle according to the time length of the vehicle in the parking state;
or determining the distance between the adjacent parking places according to the position information of the vehicle in the parking state;
or determining the parking times in the first time period according to the times of the vehicle in the parking state in the first time period.
In a possible implementation manner, the method further includes:
if the overhead comprises two or more layers, determining the height of the vehicle after determining that the vehicle is located on the road on the upper side of the overhead;
and determining, according to the height of the vehicle and the height of each layer of the overhead, the layer number of the road on the upper side of the overhead where the vehicle is located.
In this implementation, the number of layers of the road on the upper side of the overhead where the vehicle is located can be determined based on the height of the vehicle and the height of each layer of overhead, and the accuracy of determining the position where the vehicle is located is improved.
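Mapping the measured vehicle height to a layer of a multi-layer overhead could be sketched as a nearest-height lookup; the function name and the sample heights are illustrative assumptions:

```python
def overhead_layer(vehicle_height_m, layer_heights_m):
    """layer_heights_m: height above ground of each overhead deck, lowest first.
    Returns the 1-based layer whose deck height is closest to the vehicle's
    measured height, assuming the vehicle was already classified as being
    on the road on the upper side of the overhead."""
    best = min(
        range(len(layer_heights_m)),
        key=lambda i: abs(layer_heights_m[i] - vehicle_height_m),
    )
    return best + 1
```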
In a possible implementation manner, the method further includes:
reporting positioning information of the vehicle to a server, wherein the positioning information comprises information indicating whether the vehicle is located on the road on the upper side of the overhead or the road on the lower side of the overhead;
if the overhead comprises two or more layers, the positioning information further comprises the layer number of the road on the upper side of the overhead where the vehicle is located.
In this implementation manner, the positioning information of the vehicle can be reported to the server, so that the accuracy of the server in determining the position of the vehicle is improved, and the accuracy of navigation is further improved.
In a second aspect, an embodiment of the present application provides an overhead identification device, where the device includes: a transceiver and a processor;
the transceiver is used for receiving GNSS signals;
the processor is configured to: determining an overhead identification parameter of the vehicle;
determining that the vehicle is located on a road on the upper side of the overhead if the overhead identification parameter of the vehicle meets a first condition;
determining that the vehicle is located on a road on the lower side of the overhead if the overhead identification parameter of the vehicle meets a second condition;
wherein the overhead identification parameters of the vehicle include: a vehicle speed of the vehicle or a parking state parameter of the vehicle;
the parking state parameter of the vehicle includes at least one of the following parameters: the number of parking times and the parking duration within a first time period, and the distance between adjacent parking places;
the first condition is used for indicating the relation between the vehicle speed of the vehicle and a speed threshold value corresponding to the vehicle speed;
the second condition is used for indicating a relationship between a parking state parameter of the vehicle and a threshold value corresponding to the parking state parameter.
In one possible implementation, the processor determines an overhead identification parameter of the vehicle, specifically:
determining an overhead identification parameter of the vehicle when an operation to start a navigation function is received;
or when receiving a position searching operation, determining an overhead identification parameter of the vehicle;
or, when an operation for indicating a destination is received, determining an overhead identification parameter of the vehicle;
alternatively, when an operation for indicating that the navigation mode is driving is received, an overhead identification parameter of the vehicle is determined;
Or when the speed of the terminal equipment is greater than a target speed threshold value, determining an overhead identification parameter of the vehicle;
or when the front of the vehicle is determined to contain a ramp junction according to a Global Navigation Satellite System (GNSS) signal, determining an overhead identification parameter of the vehicle;
or when it is determined that the front of the vehicle includes an overhead sign from an image including the front of the vehicle, determining an overhead recognition parameter of the vehicle;
alternatively, when it is determined that the periphery of the vehicle includes an overhead from the GNSS signal, an overhead identification parameter of the vehicle is determined.
In a possible implementation manner, if the vehicle speed of the vehicle in the second time period is not less than the speed threshold corresponding to the vehicle speed, the overhead identification parameter of the vehicle meets the first condition;
or the parking state parameter of the vehicle comprises the number of parking times in a first time period, and if the number of parking times in the first time period is greater than a number threshold corresponding to the number of parking times, the overhead identification parameter of the vehicle meets the second condition;
or the parking state parameter of the vehicle comprises parking time, and if the parking time of the vehicle is greater than a time threshold corresponding to the parking time, the overhead identification parameter of the vehicle meets the second condition;
or the parking state parameter of the vehicle comprises a distance between adjacent parking places, and if the distance between the adjacent parking places is greater than a distance threshold corresponding to the distance between the adjacent parking places, the overhead identification parameter of the vehicle meets the second condition.
In one possible implementation manner, the overhead identification parameter of the vehicle includes a vehicle speed of the vehicle, and the processor determines the overhead identification parameter of the vehicle, specifically:
determining a vehicle speed of the vehicle from a sensor;
or, the processor determines an overhead identification parameter of the vehicle, specifically:
determining the position information of the vehicle at different moments according to the GNSS signals;
and determining the speed of the vehicle according to the position information of the vehicle at different moments.
In a possible implementation manner, the processor determines the vehicle speed of the vehicle according to the position information of the vehicle at different times, specifically:
setting the time periods of the different moments as target time periods, wherein the target time periods comprise at least two sub time periods, and determining the speed of the vehicle in the sub time periods according to the position information of the vehicle at each moment in the sub time periods;
determining a vehicle speed of the vehicle based on a vehicle speed of a first sub-time period, wherein the vehicle speed of the first sub-time period is greater than a first speed threshold.
In a possible implementation manner, the overhead identification parameter of the vehicle includes a parking state parameter of the vehicle, and the processor determines the overhead identification parameter of the vehicle, specifically:
determining the position information of the vehicle at different moments according to the GNSS signals;
if the position information of the vehicle at different moments indicates that the distance between the vehicle's positions at a first moment and a second moment is within a first distance threshold, determining that the vehicle is in a parking state at the first moment and the second moment;
determining the parking time length of the vehicle according to the time length of the vehicle in the parking state;
or determining the distance between adjacent parking places according to the position information when the vehicle is in a parking state;
or determining the parking times in the first time period according to the times of the vehicle in the parking state in the first time period.
In a possible implementation manner, the overhead identification parameter of the vehicle includes a parking state parameter of the vehicle, and the processor determines the overhead identification parameter of the vehicle, specifically:
determining whether the vehicle is in a parking state or not according to the vehicle speed of the vehicle, wherein when the vehicle speed of the vehicle is smaller than a second speed threshold value, the vehicle is in the parking state;
determining the parking time length of the vehicle according to the time length of the vehicle in the parking state;
or determining the distance between the adjacent parking places according to the position information when the vehicle is in the parking state;
or determining the parking times in the first time period according to the times of the vehicle in the parking state in the first time period.
In one possible implementation, the processor is further configured to:
if the overhead comprises two or more layers, determining the height of the vehicle after determining that the vehicle is located on the road on the upper side of the overhead;
and determining, according to the height of the vehicle and the height of each layer of the overhead, the layer number of the road on the upper side of the overhead where the vehicle is located.
In one possible implementation, the processor is further configured to:
reporting positioning information of the vehicle to a server, wherein the positioning information comprises information indicating whether the vehicle is located on the road on the upper side of the overhead or the road on the lower side of the overhead;
if the overhead comprises two or more layers, the positioning information further comprises the layer number of the road on the upper side of the overhead where the vehicle is located.
In a third aspect, an embodiment of the present application provides a terminal device, which includes a processor, and when the processor executes a computer program or instructions in a memory, the method according to the first aspect is performed.
In a fourth aspect, an embodiment of the present application provides a terminal device, where the terminal device includes a processor and a memory; the memory is for storing a computer program or instructions; the processor is adapted to execute the computer program or instructions stored by the memory to cause the terminal device to perform the method according to the first aspect.
In a fifth aspect, the present application provides a terminal device comprising a processor, a memory, and a transceiver; the transceiver is used for receiving signals or sending signals; the memory is used for storing computer programs or instructions; the processor is adapted to execute the computer program or instructions stored by the memory to cause the terminal device to perform the method according to the first aspect.
In a sixth aspect, the present application provides a terminal device comprising a processor and an interface circuit; the interface circuit is used for receiving a computer program or instructions and transmitting the computer program or instructions to the processor; the processor is configured to execute the computer program or instructions to cause the terminal device to perform the method according to the first aspect.
In a seventh aspect, the present application provides a computer storage medium for storing a computer program or instructions which, when executed, cause the method of the first aspect to be carried out.
In an eighth aspect, the present application provides a computer program product comprising a computer program or instructions which, when executed, cause the method of the first aspect to be carried out.
In a ninth aspect, the present application provides a chip comprising a processor coupled with a memory for executing a computer program or instructions stored in the memory, the computer program or instructions when executed, the method according to the first aspect being performed.
The embodiments of the application provide an overhead identification method and device. In the method, a terminal device determines an overhead identification parameter of a vehicle, determines that the vehicle is located on the road on the upper side of the overhead when the overhead identification parameter meets a first condition, and determines that the vehicle is located on the road on the lower side of the overhead when the overhead identification parameter meets a second condition.
When determining whether the vehicle is running on the road on the upper side or the road on the lower side of the overhead, the embodiments of the application combine the overhead identification parameter of the vehicle, the first condition that must be met when the vehicle is on the road on the upper side of the overhead, and the second condition that must be met when the vehicle is on the road on the lower side of the overhead. This makes it possible to determine which road the vehicle is running on, solving the prior-art problem of being unable to identify whether the vehicle is on the overhead. Compared with the prior art, the scheme of the application reduces the number of navigation errors and improves navigation accuracy.
Furthermore, since the scheme improves navigation accuracy, it can improve the user's driving experience, shorten the driving time, and reduce the vehicle's fuel consumption, thereby saving energy.
Drawings
FIG. 1 is a schematic illustration of an overhead with its up-ramp and down-ramp openings;
FIG. 2 is a schematic diagram of a GNSS system;
FIG. 3(a) is a schematic interface diagram of an electronic map displayed by a terminal device;
FIG. 3(b) is a schematic interface diagram of an electronic map displayed by a terminal device;
fig. 4 is a schematic view of a driving scene of a vehicle according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a mobile phone according to an embodiment of the present application;
fig. 6 is a block diagram of a software structure of a mobile phone according to an embodiment of the present disclosure;
FIG. 7 is a schematic illustration of a vehicle according to an exemplary embodiment of the present disclosure;
fig. 8(a) is an exemplary diagram of an interface of a terminal device disclosed in an embodiment of the present application;
fig. 8(b) is an exemplary diagram of an interface of another terminal device disclosed in the embodiment of the present application;
fig. 8(c) is an exemplary diagram of an interface of another terminal device disclosed in the embodiment of the present application;
fig. 9 is a schematic workflow diagram of an overhead identification method according to an embodiment of the present application;
fig. 10(a) is a schematic view of a scene in which a vehicle travels on a road on the upper side of an overhead according to an embodiment of the present application;
fig. 10(b) is a plan view of a vehicle according to an embodiment of the present application running on a road on an upper side of an overhead;
fig. 11 is a schematic workflow diagram of another overhead identification method disclosed in the embodiment of the present application;
fig. 12 is a schematic interface diagram of an electronic map displayed by a terminal device according to an embodiment of the present application;
FIG. 13 is a block diagram illustrating the structure of one embodiment of an overhead identification device provided herein;
fig. 14 is a block diagram of a chip according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
The terminology used in the following examples is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, such as "one or more", unless the context clearly indicates otherwise. It should also be understood that in the following embodiments of the present application, "at least one" and "one or more" mean one, two, or more. The term "and/or" describes an association relationship between associated objects and means that three relationships may exist; for example, "A and/or B" may represent: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
For clarity and conciseness of the following descriptions of the various embodiments, a brief introduction to the related art is first given:
to maintain driving speed, relieve congestion, and eliminate the safety problems that arise where roads intersect pedestrian routes, many cities have built elevated roads to cope with the growing number of automobiles.
An elevated road, or "overhead" for short, is a three-dimensional road erected above a ground road for vehicles to travel on. The overhead divides the roadway into a road on the upper side of the overhead and a road on the lower side of the overhead. In a scene with only one overhead level, the road on the upper side of the overhead is the elevated road above the ground, and the road on the lower side of the overhead is the ground road below it; in a scene with two or more overhead levels, the road on the upper side of the overhead is the elevated road on any level above the ground, and the road on the lower side of the overhead is the ground road below the elevated road closest to the ground.
In addition, ramps are usually provided at the entrances and exits of the overhead. The ramps include on-ramps and off-ramps. To enter the overhead, a vehicle first passes through an on-ramp and then drives onto the road on the upper side of the overhead. To leave the overhead, or to descend from one overhead level to a lower one, the vehicle first passes through an off-ramp and then enters the road on the lower side of the overhead or the elevated road on the lower level.
To clarify the scene in which a vehicle travels on an overhead, fig. 1 is provided. The scene shown in fig. 1 includes a single-level overhead, so the roads include the road on the upper side of the overhead and the road on the lower side of the overhead. The solid arrow in front of the vehicle indicates its direction of travel. At the start of the journey, the vehicle is at the left-side position in fig. 1, traveling on the road on the lower side of the overhead. After a while, the vehicle enters the overhead through the on-ramp, thereby entering the road on the upper side of the overhead, corresponding to the right-side position in fig. 1. After traveling on the upper road for some time, the vehicle descends through the off-ramp to the road on the lower side of the overhead. The driving route of the vehicle in this process is: driving on the ground -> driving on the road on the upper side of the overhead -> driving on the road on the lower side of the overhead, i.e., back on the ground.
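The driving route just described can be sketched as a simple state sequence (a hypothetical illustration for clarity only, not the patent's identification method; the state and event names are invented):

```python
# Road-level state transitions driven by ramp events, mirroring the
# route in fig. 1: ground -> on-ramp -> overhead -> off-ramp -> ground.
TRANSITIONS = {
    ("ground", "on_ramp"): "overhead",
    ("overhead", "off_ramp"): "ground",
}

def drive(start, events):
    """Apply ramp events to the vehicle's road-level state and record the trace."""
    state = start
    trace = [state]
    for ev in events:
        # Events that are impossible in the current state are ignored.
        state = TRANSITIONS.get((state, ev), state)
        trace.append(state)
    return trace

print(drive("ground", ["on_ramp", "off_ramp"]))
# -> ['ground', 'overhead', 'ground']
```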
During driving, a user typically navigates with a terminal device (such as a mobile phone or a vehicle-mounted terminal). For example, the user may turn on navigation before the vehicle enters the overhead, i.e., while the vehicle is traveling on the road on the lower side of the overhead. Currently, a terminal device mainly determines its own position through a Global Navigation Satellite System (GNSS). GNSS is a space-based radio navigation and positioning system that can provide users with all-weather three-dimensional coordinates, speed, and time information at any point on the earth's surface or in near-earth space.
GNSS systems typically include the Global Positioning System (GPS) of the United States, Russia's GLONASS system, the European Union's GALILEO system, and China's BeiDou satellite navigation system, among others.
The GPS system is a radio-navigation positioning system based on artificial earth satellites and includes 24 satellites covering the globe. The BeiDou satellite navigation system is a global satellite navigation system independently developed and operated by China. It comprises two generations, the first-generation and second-generation BeiDou systems; the first-generation system typically includes four geosynchronous-orbit satellites.
Referring to fig. 2, a schematic diagram of a GNSS navigation system is shown, wherein the GNSS navigation system generally includes three major parts, a space part, a ground monitoring part and a user receiver.
As shown in fig. 2, the space portion of the GNSS navigation system includes a plurality of satellites 10, the ground monitoring portion includes a ground monitoring and tracking station 20, the ground monitoring and tracking station 20 generally includes a master control station, a monitoring station and an injection station, and the user receiver 30 of the GNSS navigation system can receive satellite signals transmitted by the plurality of satellites 10.
The basic principle of a GNSS navigation system is to determine the position of a user receiver from the distances between several satellites of known position and the user receiver. The position of each satellite can be obtained from the satellite ephemeris according to the time recorded by the satellite-borne clock, and the distance between the user receiver and a satellite can be determined from the propagation time of the satellite signal (which may also be referred to as a GNSS signal) from the satellite to the user receiver.
During navigation, the ground monitoring and tracking station 20 may transmit information such as satellite ephemeris to a plurality of satellites 10; a plurality of satellites 10 can continuously transmit satellite signals, wherein the satellite signals generally comprise satellite ephemeris and the transmission time of the satellite signals; the user receiver 30 may search for and receive satellite signals, determine the position of the satellite 10 from the satellite ephemeris in the satellite signals, and determine the distance between itself and the satellite 10 from its own clock and the transmission time of the satellite signals, and further determine its own position information from the position of the satellite 10 and the distance between itself and the satellite 10.
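The positioning principle described above — solving for the receiver position from the known positions of several satellites and the measured satellite-receiver distances — can be sketched as follows. This is a minimal illustration, not the patent's method: the satellite coordinates are invented, and the receiver clock error (which a real receiver must also estimate) is ignored for simplicity.

```python
import numpy as np

def locate_receiver(sat_positions, distances, x0=None, iters=20):
    """Estimate a receiver position from satellite positions and measured
    ranges by Gauss-Newton least squares (clock error ignored)."""
    sats = np.asarray(sat_positions, dtype=float)
    d = np.asarray(distances, dtype=float)
    x = np.zeros(3) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(iters):
        diff = x - sats                       # vectors from each satellite to the receiver
        r = np.linalg.norm(diff, axis=1)      # ranges predicted from the current estimate
        J = diff / r[:, None]                 # Jacobian of range w.r.t. receiver position
        dx, *_ = np.linalg.lstsq(J, d - r, rcond=None)
        x = x + dx
        if np.linalg.norm(dx) < 1e-9:         # converged
            break
    return x

# Four satellites at known (made-up) positions; true receiver at (1.0, 2.0, 0.5).
sats = [(10, 0, 20), (0, 10, 20), (-10, 0, 20), (0, -10, 20)]
truth = np.array([1.0, 2.0, 0.5])
ranges = [np.linalg.norm(truth - np.array(s)) for s in sats]
print(np.round(locate_receiver(sats, ranges), 6))
```

With exact ranges the estimate recovers the true position; in practice, a fourth unknown (the receiver clock bias) is solved jointly, which is why at least four satellites are needed.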
The user can navigate through a terminal device having a navigation function, such as a mobile terminal (e.g., a mobile phone) or an in-vehicle terminal. In addition, the terminal device displays an electronic map during navigation, so that the user can conveniently look up a destination and plan a route.
After determining its position while navigating for the vehicle, the terminal device can display that position on an electronic map. The electronic map generally includes the environment around the position of the terminal device and indicates the position of the vehicle on the map, and may further include a route planned for the vehicle and the vehicle's direction of travel, thereby meeting the user's navigation needs.
As can be seen from the above brief introduction, a terminal device currently determines the required position information from received GNSS signals and then navigates for the user based on that position information.
However, the position information of the road on the upper side of an overhead and the road on the lower side of the same overhead may be identical or similar. In this case, the terminal device cannot determine from the position information alone whether the vehicle is on the road on the upper side or the lower side of the overhead, so navigation errors are likely to occur. Navigation errors often lead the driver onto an incorrect route, giving the user an extremely poor driving experience and wasting both time and fuel.
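The ambiguity described above can be shown with a minimal sketch: if map matching uses only the two-dimensional GNSS position, both the upper and lower overhead roads match the same fix. The road names, coordinates, and `match_by_position` helper here are hypothetical illustrations, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class Road:
    name: str
    level: str          # "overhead_upper" or "overhead_lower"
    lat: float
    lon: float

# Two roads at the same 2D position: the elevated road and the ground road below it.
ROADS = [
    Road("North Fourth Ring East Rd (elevated)", "overhead_upper", 39.975, 116.405),
    Road("North Fourth Ring East Rd (ground)",   "overhead_lower", 39.975, 116.405),
]

def match_by_position(lat, lon, tol=1e-3):
    """Return every road whose 2D position matches the GNSS fix."""
    return [r for r in ROADS if abs(r.lat - lat) < tol and abs(r.lon - lon) < tol]

candidates = match_by_position(39.975, 116.405)
print([r.level for r in candidates])
# -> ['overhead_upper', 'overhead_lower']  (both levels match: the fix is ambiguous)
```

Because both candidates survive, additional signals beyond latitude and longitude are needed to decide which road the vehicle is actually on, which is the problem the present application addresses.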
For example, fig. 3(a) and fig. 3(b) show electronic maps displayed when the navigation accuracy of the terminal device is low. A user may start the navigation APP of the terminal device while driving on the road on the lower side of an overhead. Because the position of the overhead overlaps the position of the ground road, when the navigation APP is first started it is difficult for the terminal device to judge whether it is on the overhead or on the ground road. As shown in fig. 3(a), the navigation indication in the figure shows the user's vehicle on the main road of the North Fourth Ring East Road, i.e., on the overhead, and the triangle marked with a solid line indicates the position given by the navigation. The actual position of the vehicle, however, is on the ground; the actual position is inconsistent with the navigation position displayed by the terminal device, a navigation error occurs, and the user may misjudge the vehicle's position.
The user may also start the navigation APP while driving on the road on the upper side of the overhead. In this case, too, when the navigation APP is first started it is difficult for the terminal device to judge whether it is on the overhead or on the ground road. Referring to fig. 3(b), the navigation indication in the figure shows the user's vehicle on the side road of the North Fourth Ring East Road, i.e., on the ground road, and the triangle marked with a solid line indicates the position given by the navigation; however, the vehicle has actually already entered the elevated road from the ground road, and the triangle marked with a dotted line in fig. 3(b) represents the vehicle's actual position. The actual position of the vehicle is therefore inconsistent with the navigation position shown on the electronic map displayed by the terminal device, a navigation error occurs, and the user may misjudge the vehicle's position.
In view of the above problems, embodiments of the present application provide a navigation method and apparatus to improve navigation accuracy of a terminal device.
The technical solutions of the present application can be applied to the field of vehicle driving, including but not limited to autonomous driving (ADS), intelligent driving, and intelligent connected vehicles (ICV). The present application provides a technical solution for identifying whether a vehicle is driving onto or off a viaduct. The solution can be applied in the vehicle-driving field to provide positioning and navigation services for vehicles.
The technical solution of the present application can be applied to any positioning system or navigation system. Fig. 4 is a schematic diagram of a vehicle-driving scene provided by an embodiment. The scene involves a server, at least one terminal device, and a vehicle corresponding to the terminal device. The server and the terminal device (such as a mobile phone) can be connected through a wireless network.
Further, the server may be a service platform for managing the mobile phone terminal or an Internet of Vehicles server; for example, the server receives messages sent by the mobile phone terminal, determines the position of the vehicle, and provides map and real-time navigation services for the user. The server can store electronic maps of a plurality of regions.
The terminal device sends requests to the server to realize real-time positioning and navigation for the vehicle. In addition, the vehicle includes a communication module and a processing module, which are used to receive signals sent by the server and/or the mobile phone terminal, control the starting and stopping of the vehicle according to the signals and a preset program, and acquire the state of the vehicle driving onto or off the overhead.
Optionally, the server may be one or more independent servers or a server cluster, or a cloud platform service deployed in the cloud. The server may also be a network device, such as a base station (BS); for example, a base transceiver station (BTS) in the global system for mobile communications (GSM) or code division multiple access (CDMA), a NodeB in wideband code division multiple access (WCDMA), an evolved NodeB (eNB/e-NodeB) in LTE, a next-generation evolved NodeB (ng-eNB), a base station (gNB) in NR, a base station in a future mobile communication system, or an access node in a wireless fidelity (WiFi) system. The embodiments of the present application do not limit the specific deployment technology or device form of the network device, which may be a cloud device or a stand-alone computer device, etc.
A terminal device in the embodiments of the present application may be a device that provides voice services and/or data connectivity to a user, a handheld device with a wireless connection function, or another processing device connected to a wireless modem, such as a wireless terminal, a vehicle-mounted wireless terminal, a portable device, a wearable device, a mobile phone (or "cellular" phone), or a portable, pocket-sized, or handheld terminal, which exchanges voice and/or data with a radio access network. Examples include personal communication service (PCS) phones, cordless phones, session initiation protocol (SIP) phones, wireless local loop (WLL) stations, and personal digital assistants (PDAs). A wireless terminal may also be called a subscriber unit, an access terminal, a user terminal, a user agent, a user device, or user equipment (UE); the type of the terminal device is not limited in this application.
Taking a mobile phone as an example of the terminal device, as shown in fig. 5, a schematic structural diagram of the mobile phone is shown.
The mobile phone may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, and the like.
It is to be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the mobile phone. In other embodiments of the present application, the mobile phone may include more or fewer components than illustrated, combine certain components, split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. Wherein, the different processing units may be independent devices or may be integrated in one or more processors.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs the instructions or data again, it can call them directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and improves system efficiency.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose-input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, a bus or Universal Serial Bus (USB) interface, and the like.
The wireless communication function of the mobile phone can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to a mobile phone. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to a mobile phone, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the handset is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 so that the handset can communicate with the network and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), code division multiple access (code division multiple access, CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), Long Term Evolution (LTE), LTE, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The mobile phone realizes the display function through the GPU, the display screen 194, the application processor and the like. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the mobile phone may include 1 or N display screens 194, N being a positive integer greater than 1.
The mobile phone can realize shooting functions through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor and the like.
The ISP is used to process data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera's photosensitive element through the lens, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP, which processes it and converts it into an image visible to the naked eye. The ISP can also apply algorithmic optimization to the noise, brightness, and skin color of the image, and can optimize parameters such as the exposure and color temperature of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the handset may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the mobile phone selects the frequency point, the digital signal processor is used for performing fourier transform and the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The handset may support one or more video codecs. Thus, the mobile phone can play or record videos in various encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the mobile phone. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in the external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the cellular phone and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area can store data (such as audio data, a phone book and the like) created in the use process of the mobile phone. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The mobile phone can realize audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The handset can listen to music through the speaker 170A or listen to a hands-free conversation.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into a sound signal. When the mobile phone receives a call or voice information, the receiver 170B can be close to the ear to receive voice.
The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal to the microphone 170C by speaking with the mouth close to the microphone 170C. The mobile phone may be provided with at least one microphone 170C. In other embodiments, the mobile phone may be provided with two microphones 170C to achieve a noise-reduction function in addition to collecting sound signals. In still other embodiments, the mobile phone may further include three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
The headphone interface 170D is used to connect a wired headphone. The headset interface 170D may be the USB interface 130, or may be a 3.5mm open mobile electronic device platform (OMTP) standard interface, a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
The sensor module 180 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
Certainly, the mobile phone may further include a charging management module, a power management module, a battery, a key, an indicator, 1 or more SIM card interfaces, and the like, which is not limited in this embodiment of the present application.
Still taking a mobile phone as an example of the terminal device, the software system of the mobile phone may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. The embodiments of the present application take an Android system with a layered architecture as an example to illustrate the software structure of the mobile phone.
Fig. 6 is a block diagram of a software structure of an embodiment of a mobile phone provided in the present application. Referring to fig. 6, the layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages. As shown in fig. 6, the application package may include applications such as camera, gallery, call, navigation, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions. As shown in fig. 6, the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager may obtain the size of the display screen, obtain parameters of each display area on the display interface, and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and answered, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a camera icon.
The telephone manager is used for providing the communication function of the mobile phone. Such as management of call status (including connection, hangup, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a brief pause without requiring user interaction, for example notifications of download completion or message alerts. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as notifications of applications running in the background, or notifications that appear on the screen as a dialog window. For example, it may prompt text information in the status bar, sound a prompt tone, vibrate the electronic device, or flash an indicator light.
The Android runtime comprises core libraries and a virtual machine, and is responsible for scheduling and managing the Android system. The core libraries comprise two parts: functions that need to be called by the Java language, and the core libraries of Android. The application layer and the application framework layer run in the virtual machine, which executes the Java files of the application layer and the application framework layer as binary files. The virtual machine performs functions such as object life-cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), media libraries (media libraries), three-dimensional graphics processing libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like. The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications. The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc. The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, composition, layer processing and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer may include a display driver, a camera driver, an audio driver, a sensor driver, and the like.
The system library and the kernel layer below the application framework layer may also be referred to as the underlying system. The underlying system includes a state monitoring service for identifying changes in the posture of the mobile phone; the state monitoring service may be arranged in the system library and/or the kernel layer.
In another possible implementation, the terminal device executing the overhead identification method provided by the embodiment of the present application may be a vehicle. For example, the method may be performed by a vehicle machine within the vehicle, where the vehicle machine is typically mounted in the vehicle's center console.
In this implementation, the vehicle may be a smart vehicle. Fig. 7 is a functional block diagram of the vehicle 100 according to the embodiment of the present application. Referring to fig. 7, the vehicle 100 may include various subsystems such as a travel system 1002, a sensor system 1004, a planning control system 1006, one or more peripherals 1008, as well as a power supply 1010, a computer system 1001, and a user interface 1016.
Alternatively, vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple elements. In addition, each of the sub-systems and elements of the vehicle 100 may be interconnected by wire or wirelessly.
The travel system 1002 may include components that provide power for the vehicle 100. In one embodiment, the travel system 1002 may include an engine 1018, an energy source 1019, a transmission 1020, and wheels 1021. The engine 1018 may be an internal combustion engine, an electric motor, an air compression engine, or another type of engine or combination of engines, for example a hybrid engine consisting of a gasoline engine and an electric motor, or a hybrid engine consisting of an internal combustion engine and an air compression engine. The engine 1018 converts the energy source 1019 into mechanical energy.
Examples of energy sources 1019 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electrical power. The energy source 1019 may also provide energy to other systems of the vehicle 100.
The transmission 1020 may transmit mechanical power from the engine 1018 to the wheels 1021. The transmission 1020 may include a gearbox, a differential, and a drive shaft. In one embodiment, the transmission 1020 may also include other devices, such as a clutch. Among other things, the drive shaft may include one or more axles that may be coupled to one or more wheels 1021.
The sensor system 1004 may include several sensors that sense information about the vehicle 100 itself and the environment surrounding the vehicle 100. For example, the sensor system 1004 may include a positioning system 1022 (the positioning system may be a GNSS system, such as the GPS system, the BeiDou system, or another positioning system), an Inertial Measurement Unit (IMU) 1024, a radar 1026, a laser rangefinder 1028, a camera 1030, a computer vision system 1038, and a sensor fusion algorithm 1040. The sensor system 1004 may also include sensors of internal systems of the vehicle 100 (e.g., an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors can be used to detect objects and their corresponding characteristics (position, shape, orientation, velocity, etc.). Such detection and identification is a key function for the safe operation of the vehicle 100.
Global positioning system 1022 may be used to estimate the geographic location of vehicle 100. The IMU 1024 is used to sense position and orientation changes of the vehicle 100 based on inertial acceleration. In one embodiment, the IMU 1024 may be a combination of an accelerometer and a gyroscope.
The radar 1026 may utilize radio signals to sense objects in the surrounding environment of the vehicle 100. In some embodiments, in addition to sensing objects, radar 1026 may also be used to sense the speed or direction of travel of objects.
Laser rangefinder 1028 may utilize a laser to sense objects in the environment in which vehicle 100 is located. In some embodiments, laser rangefinder 1028 may include one or more laser sources, laser scanners, and one or more detectors, among other system components.
The camera 1030 may be used to capture multiple images of the surrounding environment of the vehicle 100. Camera 1030 may be a still camera or a video camera.
The computer vision system 1038 may be operable to process and analyze images captured by the camera 1030 to identify objects or features in the environment surrounding the vehicle 100. The objects or features may include traffic signals, road boundaries, and obstacles. The computer vision system 1038 may use object recognition algorithms, Structure from Motion (SFM) algorithms, video tracking, and other computer vision techniques. In some embodiments, the computer vision system 1038 may be used to map an environment, track objects, estimate the speed of objects, and so forth.
The planning control system 1006 is for controlling the operation of the vehicle 100 and its components. The planning control system 1006 may include various elements including a steering system 1032, a throttle 1034, a brake unit 1036, a route planning system 1042, and a control system 1044.
The forward direction of the vehicle 100 may be adjusted through operation of the steering system 1032, for example, in one embodiment, a steering wheel system.
The throttle 1034 is used to control the operating speed of the engine 1018 and, in turn, the speed of the vehicle 100.
The brake unit 1036 is used to control deceleration of the vehicle 100. The brake unit 1036 may use friction to slow the wheel 1021. In other embodiments, the brake unit 1036 may convert the kinetic energy of the wheel 1021 to an electrical current. The brake unit 1036 may take other forms to slow the rotational speed of the wheels 1021 to control the speed of the vehicle 100.
The route planning system 1042 is used to determine a travel route for the vehicle 100. In some embodiments, the route planning system 1042 may combine data from the sensors 1038, the GPS 1022, and one or more predetermined maps to plan a travel route for the vehicle 100 that avoids potential targets in the environment. The trajectory planning method provided in the embodiment of the present application may be executed by the route planning system 1042 to output a target travel trajectory for the vehicle 100. The target travel trajectory includes a plurality of target waypoints, each of which includes the coordinates of the waypoint together with a lateral allowable error and a speed allowable error for that waypoint. The lateral allowable error described here includes a value range of the lateral allowable error, and in some cases may be understood as shorthand for that value range. The lateral direction means a direction perpendicular or approximately perpendicular to the direction of travel; the lateral allowable error thus means the lateral displacement allowable error, that is, the range of displacement error allowed for the vehicle 100 in a direction perpendicular or approximately perpendicular to the direction of travel. This will not be repeated later.
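As an illustration, a target waypoint of the kind just described might be represented by a simple data structure. The following Python sketch is purely hypothetical — the field names, units, and types are assumptions for illustration and are not part of the disclosure:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TargetWaypoint:
    # Waypoint coordinates, e.g. meters in a local planning frame (assumed).
    x: float
    y: float
    # Value range (min, max) of the allowed lateral displacement error,
    # perpendicular to the direction of travel, in meters (assumed).
    lateral_error_range: Tuple[float, float]
    # Value range (min, max) of the allowed speed error, in m/s (assumed).
    speed_error_range: Tuple[float, float]

# A target travel trajectory is then an ordered list of such waypoints.
trajectory: List[TargetWaypoint] = [
    TargetWaypoint(0.0, 0.0, (-0.2, 0.2), (-0.5, 0.5)),
    TargetWaypoint(5.0, 0.1, (-0.3, 0.3), (-0.5, 0.5)),
]
print(len(trajectory))
```

A downstream control system could then compare the vehicle's actual lateral deviation at each waypoint against `lateral_error_range` to decide whether the trajectory is being tracked acceptably.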
The control system 1044 is configured to generate control quantities for the accelerator, brake, and steering angle according to the driving route/driving track output by the route planning system, so as to control the steering system 1032, the throttle 1034, and the brake unit 1036.
Of course, in one example, the planning control system 1006 may additionally or alternatively include components other than those shown and described, or some of the components shown above may be omitted.
Vehicle 100 interacts with external sensors, other vehicles, other computer systems, or users through peripherals 1008. The peripheral devices 1008 may include a wireless communication system 1046, an in-vehicle computer 1048, a microphone 1050, or speakers 1052.
In some embodiments, the peripheral device 1008 provides a means for a user of the vehicle 100 to interact with a user interface 1016. For example, the in-vehicle computer 1048 may provide information to a user of the vehicle 100. User interface 1016 may also operate in-vehicle computer 1048 to receive user inputs. In one implementation, the onboard computer 1048 may be operated via a touch screen. In other instances, the peripheral device 1008 may provide a means for the vehicle 100 to communicate with other devices located within the vehicle. For example, the microphone 1050 may receive audio (e.g., voice commands or other audio input) from a user of the vehicle 100. Similarly, speakers 1052 may output audio to a user of vehicle 100.
The wireless communication system 1046 may communicate wirelessly with one or more devices, either directly or via a communication network. For example, the wireless communication system 1046 may use 3G cellular communication such as CDMA, EVDO, or GSM/GPRS; 4G cellular communication such as LTE; or 5G cellular communication. The wireless communication system 1046 may communicate with a Wireless Local Area Network (WLAN) using WiFi. In some embodiments, the wireless communication system 1046 may communicate directly with a device using an infrared link, Bluetooth, or ZigBee. Other wireless protocols are also possible, such as various vehicle communication systems; for example, the wireless communication system 1046 may include one or more Dedicated Short Range Communications (DSRC) devices, which may carry public or private data communications between vehicles or roadside stations.
Power supply 1010 may provide power to various components of vehicle 100. In one embodiment, power supply 1010 may be a rechargeable lithium ion or lead acid battery. One or more battery packs of such batteries may be configured as a power source and provide power to various components of the vehicle 100. In some embodiments, the power source 1010 and the energy source 1019 may be implemented together, such as in an all-electric vehicle.
Some or all of the functions of vehicle 100 are controlled by computer system 1001. The computer system 1001 may include at least one processor 1013 that executes instructions 1015 stored in a non-transitory computer-readable medium, such as the memory 1014. The computer system 1001 may also be a plurality of computing devices that control individual components or subsystems of the vehicle 100 in a distributed manner.
Processor 1013 may be any conventional processor, such as a commercially available CPU. Alternatively, the processor may be a dedicated device such as an ASIC or another hardware-based processor. Although fig. 7 functionally illustrates a processor, memory, and other elements of the computer system 1001, those of ordinary skill in the art will appreciate that the processor and memory may actually comprise multiple processors or memories that are not located within the same physical housing. For example, the memory may be a hard drive or other storage medium located in a different enclosure than the computer system 1001. Thus, references to a processor will be understood to include references to a collection of processors or memories that may or may not operate in parallel. Rather than using a single processor to perform the steps described herein, some components, such as the steering component and the deceleration component, may each have their own processor that performs only computations related to that component's specific function; alternatively, subsystems such as the travel system, the sensor system, and the planning control system may each have their own processor to carry out the computations of their respective tasks, so as to realize the corresponding functions.
In various aspects described herein, the processor may be located remotely from the vehicle and in wireless communication with the vehicle. In other aspects, some of the processes described herein are executed on a processor disposed within the vehicle, while others are executed by a remote processor, including taking the steps necessary to execute a single maneuver.
In some embodiments, memory 1014 may include instructions 1015 (e.g., program logic), which instructions 1015 may be executed by processor 1013 to perform various functions of vehicle 100, including those described above. The memory 1014 may also contain additional instructions, including instructions to send data to, receive data from, interact with, or control one or more of the travel system 1002, the sensor system 1004, the planning control system 1006, and the peripherals 1008.
In addition to instructions 1015, memory 1014 may also store other relevant data such as road maps, route information, the location, direction, speed of the vehicle, and other relevant information. Such information may be used by the vehicle 100 or specifically by the computer system 1001 during operation of the vehicle 100 in an autonomous, semi-autonomous, or manual mode.
A user interface 1016 for providing information to and receiving information from a user of the vehicle 100. Optionally, the user interface 1016 may include one or more input/output devices within the set of peripheral devices 1008, such as a wireless communication system 1046, an in-vehicle computer 1048, a microphone 1050, and a speaker 1052.
The computer system 1001 may control the functions of the vehicle 100 based on inputs received from various subsystems (e.g., the travel system 1002, the sensor system 1004, and the planning control system 1006) and from a user interface 1016. In some embodiments, the computer system 1001 is operable to provide control over many aspects of the vehicle 100 and its subsystems.
Optionally, one or more of these components described above may be mounted separately from or associated with the vehicle 100. For example, the memory 1014 may exist partially or completely separate from the vehicle 100. The above components may be communicatively coupled together in a wired or wireless manner.
Optionally, the above components are only an example, in an actual application, components in the above modules may be added or deleted according to an actual need, and fig. 7 should not be construed as limiting the embodiment of the present invention.
The vehicle 100 may be a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a lawn mower, a recreational vehicle, an amusement park vehicle, construction equipment, a tram, a golf cart, a train, a trolley, etc., and the embodiment of the present invention is not particularly limited in this regard.
In the embodiment of the present application, the vehicle 100 may receive a GNSS signal, determine the location of the vehicle according to the GNSS signal so as to position the vehicle, and determine whether there is a navigation delay according to the navigation method provided in the embodiment of the present application.
The overhead identification method provided by the embodiment of the present application is exemplarily described below with reference to the terminal device shown in fig. 4 and fig. 8(a) to 8 (c).
The vehicle can be navigated through the terminal device during driving. In one possible implementation, the vehicle can be navigated during driving through a navigation APP installed in the terminal device, such as the Baidu Maps, Amap, or DiDi navigation APP.
The terminal device can determine an overhead identification parameter of the vehicle through the overhead identification method provided by the embodiment of the application, and determine, according to the overhead identification parameter, whether the vehicle is on the road on the upper side of the overhead or the road below the overhead.
In some embodiments, the overhead identification parameters of the vehicle may be determined periodically after each start-up of the terminal device.
In some embodiments, the terminal device may determine the overhead identification parameter of the vehicle when a certain trigger condition is met. For this scenario, the following example schemes are disclosed:
(1) when the terminal device receives an operation of starting a navigation function, an overhead identification parameter of the vehicle is determined.
If the terminal device receives an operation of starting the navigation function, this indicates that the user needs to use the terminal device for navigation; in this case, the overhead identification parameter of the vehicle is determined.
The operation of starting the navigation function may include various forms, for example, a touch operation or a specific gesture operation on the navigation APP may be included, which is not limited in this embodiment of the present application.
(2) When the terminal device receives the position search operation, the overhead identification parameter of the vehicle is determined.
If the terminal device receives a position search operation, this indicates that the user needs to check the surroundings of a certain position and is likely to have a navigation demand; in this case, the overhead identification parameter of the vehicle is determined.
For example, referring to the exemplary diagram of a display interface of a terminal device shown in fig. 8(a), in the corresponding example the terminal device receives a location search operation for searching for the National Library of China; in the display interface, the location indicated by the circle containing a triangle is the National Library of China. In this case, the overhead identification parameter of the vehicle may be determined.
(3) When the terminal device receives an operation for indicating a destination, an overhead identification parameter of the vehicle is determined.
If the terminal device receives an operation for indicating a destination, this indicates that the user needs to go to a certain destination and is likely to have a navigation demand; in this case, the overhead identification parameter of the vehicle is determined.
For example, referring to the exemplary diagram of a display interface of a terminal device shown in fig. 8(b), in the corresponding example the terminal device receives an operation indicating that the destination is the National Library of China; in the display interface, the starting point is the position where the vehicle is currently located, and the end point is the National Library of China. In this case, the overhead identification parameter of the vehicle may be determined.
(4) When the terminal device receives an operation for indicating that the navigation mode is driving, the overhead identification parameters of the vehicle are determined.
During navigation, users often select different navigation modes according to their needs. Navigation modes generally include: taxi, driving, public transport, walking, cycling, etc. If the navigation mode applied by the terminal device is driving, this indicates that the user needs to drive a vehicle and has a navigation demand. In this case, the overhead identification parameter of the vehicle may be determined.
For example, refer to an exemplary diagram of an interface of a terminal device shown in fig. 8(c), in this example, the navigation mode applied by the terminal device is driving.
(5) When the speed of the terminal device is greater than the target speed threshold, an overhead identification parameter of the vehicle is determined.
If the speed of the terminal device is greater than the target speed threshold, this indicates that the terminal device is moving quickly and that the user carrying it may be driving a vehicle. The user may enter the road on the upper side of the overhead while driving, and therefore the overhead identification parameter of the vehicle may be determined.
In this embodiment, the target speed threshold may be, for example, 30 km/h, but the target speed threshold may also be set to other values, which is not limited in this embodiment of the application.
(6) And when the front of the vehicle is determined to contain the ramp junction according to the GNSS signals, determining the overhead identification parameters of the vehicle.
The terminal device may determine its location information according to the GNSS signal and transmit the location information to a remote server. The server stores the positions of the ramp junctions and determines, according to the received location information, whether there is a ramp junction in front of the vehicle. After determining that there is a ramp junction in front of the vehicle, the server transmits corresponding prompt information to the terminal device to indicate that the vehicle is about to reach a ramp junction.

Alternatively, the terminal device may itself store the position information of the ramp junctions of each region; after determining its own location information, the terminal device can match that location information against its stored data to determine whether there is a ramp junction in front of the vehicle.
In addition, the terminal device may be connected to a device in the vehicle; for example, the terminal device may be connected to a vehicle machine installed in the center console of the vehicle. In this case, that device may store the position information of the ramp junctions of each region; the terminal device may transmit its location information to the device, and the device may determine, based on the location information, whether there is a ramp junction in front of the vehicle and, after determining that there is, transmit corresponding prompt information to the terminal device.
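The position-matching step described above — comparing the vehicle's current GNSS fix against stored ramp junction positions — can be sketched as follows. This is an illustrative sketch only: the 200 m search radius, the function names, and the use of a flat list of coordinates are all assumptions, not part of the disclosure.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS-84 points."""
    r = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def ramp_nearby(vehicle_pos, ramp_positions, radius_m=200.0):
    """Return True if any stored ramp junction lies within radius_m
    of the vehicle's current position."""
    lat, lon = vehicle_pos
    return any(haversine_m(lat, lon, rlat, rlon) <= radius_m
               for rlat, rlon in ramp_positions)

# Hypothetical stored ramp junction positions (lat, lon).
ramps = [(39.9100, 116.4000), (39.9200, 116.4100)]
print(ramp_nearby((39.9101, 116.4001), ramps))  # a ramp is ~14 m away → True
print(ramp_nearby((40.5000, 117.0000), ramps))  # tens of km away → False
```

A production system would likely use a spatial index rather than a linear scan, and would also consider the vehicle's heading so that only junctions ahead of the vehicle trigger the prompt.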
A ramp junction typically includes a ramp entrance and a ramp exit. During driving, a vehicle often needs to pass through a ramp entrance before entering the road on the upper side of the overhead; when a vehicle leaves the road on the upper side of the overhead for the road below, it often needs to pass through a ramp exit.

If it is determined that there is a ramp junction in front of the vehicle, the vehicle is about to enter or leave the overhead. In this case, the overhead identification parameter of the vehicle can be determined, so as to identify through the scheme provided by the embodiment of the application whether the vehicle is on the road on the upper side of the overhead or the road below it.
(7) When it is determined that the front of the vehicle includes the overhead sign from the image including the front of the vehicle, the overhead recognition parameter of the vehicle is determined.
The terminal device can acquire an image of the area in front of the vehicle and determine through image analysis whether it includes an overhead sign. If the front of the vehicle includes an overhead sign, the vehicle is about to enter or leave the overhead; in this case, the overhead identification parameter of the vehicle can be determined, so as to identify through the scheme provided by the embodiment of the application whether the vehicle is on the road on the upper side of the overhead or the road below it.
(8) When it is determined that the periphery of the vehicle includes an overhead from the GNSS signals, an overhead identification parameter of the vehicle is determined.
The terminal device can determine its position according to the GNSS signal and determine according to the electronic map whether the surroundings of that position include an overhead. If so, there is a possibility that the vehicle is entering or leaving the overhead, and the overhead identification parameter of the vehicle can therefore be determined so as to identify whether the vehicle is on the road on the upper side of the overhead or the road below it.
Of course, the terminal device may also determine the overhead identification parameter of the vehicle in other scenarios. Also, after the overhead recognition parameter of the vehicle is determined, it is possible to recognize whether the vehicle is located on a road on the upper side or the lower side of the overhead based on the overhead recognition parameter, so as to solve the problems of the related art.
In order to clarify the aspects provided by the present application, the following description is made of various embodiments with reference to the accompanying drawings.
In order to solve the problem in the prior art of navigation errors caused by the inability to identify whether a vehicle is on the road on the upper side of an overhead or on the road below it, the embodiment of the application provides an overhead identification method.
Referring to a workflow diagram shown in fig. 9, an overhead identification method provided in an embodiment of the present application includes the following steps:
Step S11: determine an overhead identification parameter of the vehicle.
In the solution provided by the embodiment of the present application, the overhead identification parameter may be used to characterize a driving state or a parking state of the vehicle. In one possible implementation, the overhead identification parameters of the vehicle include: a vehicle speed of the vehicle or a parking state parameter of the vehicle.
Wherein the parking state parameter of the vehicle includes at least one of the following parameters: the parking duration, the distance between adjacent parking points, and the number of parking events in a first time period.
In some embodiments, the overhead identification parameter of the vehicle may include a vehicle speed of the vehicle. In this case, in one possible design, the vehicle speed of the vehicle may be determined by a sensor. Wherein the sensor typically comprises a speed sensor that can acquire speed.
The sensor may be provided in the terminal device. In addition, the sensor can also be arranged in the vehicle and transmits the acquired vehicle speed to the terminal equipment.
The sensor can continuously transmit the acquired vehicle speed to the terminal device after the terminal device is started. Alternatively, if the overhead identification parameter of the vehicle is determined only when the terminal device meets a trigger condition, the sensor can be triggered after the trigger condition is met; once triggered, the sensor collects the vehicle speed and transmits it to the terminal device.
In another possible mode, when the terminal device is located in the vehicle, the terminal device can perform information interaction with other devices in the vehicle. Illustratively, the terminal device can perform information interaction with a vehicle machine installed in a center console of a vehicle. In this case, the other devices in the vehicle may determine the vehicle speed of the vehicle based on the relevant information of the vehicle (e.g., information such as the rotational speed of the wheels), and transmit the vehicle speed to the terminal device, thereby causing the terminal device to determine the vehicle speed.
In another possible approach, the vehicle speed of the vehicle may be determined by:
in a first step, position information of the vehicle at different times is determined based on the GNSS signals.
And secondly, determining the speed of the vehicle according to the position information of the vehicle at different moments.
The terminal device may receive GNSS signals transmitted by the satellite system and determine position information of the vehicle at different times based on the GNSS signals. The position information of the vehicle at different moments can reflect the track of the vehicle to a certain extent, and the speed of the vehicle can be determined according to the track information.
For example, referring to a schematic view of a vehicle driving scene shown in fig. 10(a), in this example, the vehicle drives on a road on the upper side of the overhead, and the driving direction of the vehicle is from left to right. For this scenario, fig. 10(b) corresponding to fig. 10(a) is disclosed, wherein fig. 10(b) is a top view for fig. 10(a), wherein the road displayed is the road on the upper side of the overhead, and the position of the vehicle at each moment is represented in fig. 10(b) by a circle containing a number, wherein the smaller the number within the circle, the earlier the moment the vehicle is located at the position is represented. Here, the vehicle is set to be at the circle position indicated by the numeral 1 at the time t1, the vehicle is set to be at the circle position indicated by the numeral 2 at the time t2, the vehicle is set to be at the circle position indicated by the numeral 3 at the time t3, the vehicle is set to be at the circle position indicated by the numeral 4 at the time t4, and since the vehicle travels from left to right, the time t1 is earlier than the time t2, the time t2 is earlier than the time t3, and the time t3 is earlier than the time t 4.
In this example, the vehicle speed between time t1 and time t2 is the ratio of the distance between the circle positions indicated by the numerals 1 and 2 to the time difference between time t1 and time t2; the vehicle speed between time t2 and time t3 is the ratio of the distance between the circle positions indicated by the numerals 2 and 3 to the time difference between time t2 and time t3; and the vehicle speed between time t1 and time t4 is the ratio of the distance between the circle positions indicated by the numerals 1 and 4 to the time difference between time t1 and time t4.
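The distance-over-time computation above can be sketched in a few lines. This is an illustrative reconstruction, assuming positions are (latitude, longitude) GNSS fixes and timestamps are in seconds; the coordinates below are made up for the example:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GNSS fixes."""
    r = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def average_speed(pos_a, t_a, pos_b, t_b):
    """Average vehicle speed (m/s) between two timestamped positions:
    the distance between them divided by the time difference, as in the text."""
    return haversine_m(*pos_a, *pos_b) / (t_b - t_a)

# Hypothetical fixes for the positions at times t1 and t2 (circles 1 and 2).
p1, p2 = (39.9000, 116.4000), (39.9000, 116.4020)
v12 = average_speed(p1, 0.0, p2, 10.0)  # ~170 m covered in 10 s
print(round(v12, 1))
```

Speeds for the other pairs of positions (t2/t3, t1/t4) follow the same pattern with the corresponding fixes substituted in.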
Further, the vehicle may stop during driving. In this case, when the vehicle speed is determined according to the position information of the vehicle at different times, the following steps may be adopted:
Take the time period spanned by the different moments as a target time period, where the target time period comprises at least two sub-periods, and determine the vehicle speed in each sub-period according to the position information of the vehicle at the moments included in that sub-period;
Determine the vehicle speed of the vehicle based on the first sub-periods, that is, the sub-periods whose vehicle speed is greater than a first speed threshold; the vehicle speed of the vehicle may be the average of the vehicle speeds of the first sub-periods.
When the vehicle speed is determined based on the position information at different times, the vehicle may be in a stopped state at certain moments. For this case, in the above scheme, the target time period is divided into at least two sub-periods, and the vehicle speed in each sub-period is determined. If the vehicle speed in a certain sub-period is not greater than the first speed threshold, the vehicle can be considered to be in a parking state during that sub-period, and that sub-period is no longer used in determining the vehicle speed, so the accuracy of the determined vehicle speed is improved.
In this way, the vehicle speed of the vehicle can be determined based on the position information of the vehicle at different times.
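The sub-period filtering described above can be sketched as follows; the function name, the m/s units, and the 1 m/s default for the first speed threshold are illustrative assumptions, not values from the text.

```python
def overall_speed(sub_period_speeds, first_speed_threshold=1.0):
    """Average only the sub-periods in which the vehicle was actually moving.

    Sub-periods whose speed (m/s) does not exceed the threshold are treated
    as parking and excluded, so they do not drag the estimate down.
    """
    moving = [v for v in sub_period_speeds if v > first_speed_threshold]
    if not moving:
        return 0.0  # vehicle was parked for the whole target period
    return sum(moving) / len(moving)
```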
In some embodiments, the overhead identification parameters of the vehicle may include parking state parameters of the vehicle, which typically include at least one of: the parking duration, the distance between adjacent parking places, and the number of parking events in the first time period.
In one possible embodiment, the parking state parameter of the vehicle can be determined by:
in a first step, position information of the vehicle at different times is determined based on the GNSS signals.
Secondly, if the position information of the vehicle at different moments indicates that the distance between the positions of the vehicle at a first moment and a second moment is within a first distance threshold, determining that the vehicle is in a parking state at the first moment and the second moment;
thirdly, determining the parking time length of the vehicle according to the time length of the vehicle in the parking state; or determining the distance between adjacent parking places according to the position information when the vehicle is in a parking state; or determining the number of parking times in the first time period according to the number of times that the vehicle is in the parking state in the first time period.
The parking place is the position of the vehicle when the vehicle is in a parking state. If the vehicle is in the parking state at the first time and the second time, the position where the vehicle is located at the first time or the second time can be used as the parking place.
In addition, the number of times the vehicle enters the parking state in the first time period may generally be considered the number of parking events in the first time period.
Wherein, if the vehicle is in the same position at different times, the vehicle can be considered to be in a parking state. In addition, considering that there may be an error in determining the position information of the vehicle at different times, the vehicle may be considered to be in a stopped state when the distance between the positions where the vehicle is located at different times is smaller than the first distance threshold.
For example, if the distance between the position of the vehicle at time t1 and the position of the vehicle at time t2 is less than the first distance threshold, the vehicle may be considered to be in a parked state at times t1 and t 2.
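As a sketch of this distance-threshold test, assuming positions have already been projected to a local plane in meters and using an illustrative 10 m value for the first distance threshold (neither assumption comes from the text):

```python
def is_parked(pos_t1, pos_t2, first_distance_threshold_m=10.0):
    """Treat the vehicle as parked if two fixes lie within the distance threshold.

    pos_t1 / pos_t2 are (x, y) positions in meters in a local plane; the
    threshold absorbs the GNSS positioning error mentioned in the text.
    """
    dx = pos_t2[0] - pos_t1[0]
    dy = pos_t2[1] - pos_t1[1]
    return (dx * dx + dy * dy) ** 0.5 < first_distance_threshold_m
```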
In another possible manner, the parking state parameter of the vehicle can be determined by:
the method comprises the steps of firstly, determining whether a vehicle is in a parking state according to the vehicle speed of the vehicle, wherein the vehicle is normally in the parking state when the vehicle speed of the vehicle is smaller than a second speed threshold value;
secondly, determining the parking duration of the vehicle according to the duration of the vehicle in the parking state; or determining the distance between adjacent parking places according to the position information when the vehicle is in a parking state; or determining the parking times in the first time period according to the times of the vehicle in the parking state in the first time period.
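The speed-based derivation of the three parking state parameters might be sketched as below; the trace format, the 1 m/s second speed threshold, and the helper name are assumptions for illustration, not part of the claimed method.

```python
def parking_params(trace, second_speed_threshold=1.0):
    """Derive parking-state parameters from (timestamp_s, speed_mps, (x, y)) samples.

    Returns (parking count, total parking duration in seconds, list of
    distances in meters between consecutive parking places).
    """
    count, duration = 0, 0.0
    places, in_park = [], False
    for i, (t, v, pos) in enumerate(trace):
        parked = v < second_speed_threshold
        if parked and not in_park:          # entering a parking state
            count += 1
            places.append(pos)
        if parked and i + 1 < len(trace):   # accumulate time spent parked
            duration += trace[i + 1][0] - t
        in_park = parked
    gaps = [((b[0] - a[0]) ** 2 + (b[1] - a[1]) ** 2) ** 0.5
            for a, b in zip(places, places[1:])]
    return count, duration, gaps
```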
Of course, the parking parameters may also be determined in other manners, which is not limited in the embodiment of the present application.
Step S12, determining whether the vehicle is on the road on the upper side of the overhead or on the road on the lower side of the overhead according to the overhead identification parameter of the vehicle. If the overhead identification parameters of the vehicle accord with a first condition, determining that the vehicle is positioned on a road on the upper side of the overhead; if the overhead identifying parameter of the vehicle meets the second condition, determining that the vehicle is located on a road under the overhead.
When a vehicle runs on a road on the upper side of an overhead, the running state of the vehicle needs to meet a first condition, and the first condition is used for indicating the relation between the vehicle speed of the vehicle and a speed threshold corresponding to the vehicle speed.
For example, when a vehicle travels on a road on the upper side of an overhead, there is usually a certain requirement for the speed of the vehicle in traffic regulations for reasons such as ensuring driving safety and avoiding congestion, and it is usually not desirable that the vehicle stops on the road on the upper side of the overhead. In this case, when the overhead recognition parameter of the vehicle meets the first condition, the terminal device may determine that the vehicle is located on a road on the overhead upper side.
In some embodiments, if the vehicle speed of the vehicle in the second time period is not less than the speed threshold corresponding to the vehicle speed, the overhead identification parameter of the vehicle meets the first condition, and it can be determined that the vehicle is on the road on the upper side of the overhead.
When a vehicle runs on a road on the upper side of an overhead, in order to guarantee the driving safety and avoid congestion, traffic regulations generally require that the vehicle maintain a certain speed. In this case, if the vehicle speed of the vehicle in the second time period is not less than the speed threshold corresponding to the vehicle speed, it indicates that the vehicle always maintains a faster vehicle speed in the second time period, and the overhead recognition parameter of the vehicle may be considered to meet the first condition, and accordingly, the road on which the vehicle is located on the upper side of the overhead may be determined.
In the solution provided in the embodiment of the present application, the specific duration of the second time period and the size of the speed threshold corresponding to the vehicle speed are not limited. In one possible example, the second time period may be 40 seconds, and the speed threshold corresponding to the vehicle speed may be 90 km/h, in which case the vehicle may be considered to be on the road on the upper side of the overhead if the vehicle speed is not less than 90 km/h within 40 seconds. Of course, the second time period may be other time periods, and the speed threshold corresponding to the vehicle speed may also be other speed values.
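Using the example values above (a 40-second second time period and a 90 km/h threshold), the first-condition check might look like the following sketch; the one-sample-per-second scheme and the function name are illustrative assumptions.

```python
def meets_first_condition(speeds_kmh, second_period_s=40, sample_interval_s=1,
                          speed_threshold_kmh=90.0):
    """First condition: the vehicle kept at least 90 km/h for the whole 40 s window.

    speeds_kmh holds one speed sample per sample_interval_s seconds; the
    window length and threshold follow the example values in the text.
    """
    n = second_period_s // sample_interval_s
    window = speeds_kmh[-n:]            # most recent samples only
    return len(window) >= n and all(v >= speed_threshold_kmh for v in window)
```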
In addition, the vehicle is located on the road on the lower side of the overhead when the vehicle meets the second condition. Traffic regulations generally place fewer restrictions on whether a vehicle may stop on the road on the lower side of the overhead. In some embodiments, the parking state parameter of the vehicle includes the number of parking events in the first time period; in this case, the overhead identification parameter of the vehicle meets the second condition if the number of parking events in the first time period is greater than the number threshold corresponding to the number of parking events.
Since traffic regulations do not normally allow a vehicle to stop while it travels on the road on the upper side of the overhead, if the number of times the vehicle has stopped in the first time period is greater than that number threshold, it may be considered that the vehicle is not traveling on the road on the upper side of the overhead; that is, the overhead identification parameter of the vehicle meets the second condition, and it may further be determined that the vehicle is traveling on the road on the lower side of the overhead.
In one possible example, the first time period is 3 minutes and the number threshold corresponding to the number of parking events is 2; in this case, if the vehicle parks more than twice within 3 minutes, it can be considered that the vehicle is traveling on the road on the lower side of the overhead.
Of course, the first time period may also be other time lengths, and the number threshold corresponding to the number of times of parking may also be other numerical values, which is not limited in this embodiment of the application.
In some embodiments, the parking status parameter of the vehicle includes a parking duration, and the overhead identification parameter of the vehicle satisfies the second condition if the parking duration of the vehicle is greater than a duration threshold corresponding to the parking duration.
When a vehicle travels on the road on the upper side of the overhead, traffic regulations usually do not allow it to stop; therefore, even if the vehicle stops due to some contingency, the user generally ends the stopped state as soon as possible. That is, when the vehicle travels on the road on the upper side of the overhead, it stops less frequently, and even when it does stop, the stopped state lasts a relatively short time.
In this case, if the parking duration of the vehicle is greater than the duration threshold corresponding to the parking duration, the vehicle has been parked for a relatively long time; it may therefore be determined that the overhead identification parameter of the vehicle meets the second condition, and further that the vehicle is traveling on the road on the lower side of the overhead.
In the scheme provided by the embodiment of the application, the specific duration of the duration threshold corresponding to the parking duration is not limited. In one possible example, the parking duration corresponds to a duration threshold of 30 seconds, in which case if the parking duration is greater than 30 seconds, it may be assumed that the vehicle is traveling on the road on the lower side of the overhead. Of course, the time length threshold corresponding to the parking time length may also be other time lengths.
In some embodiments, the parking status parameter of the vehicle includes a distance between adjacent parking locations, and the overhead identification parameter of the vehicle complies with the second condition if the distance between adjacent parking locations is greater than a distance threshold corresponding to the distance between adjacent parking locations.
When the vehicle travels on the road on the upper side of the overhead, traffic regulations generally do not allow it to stop. Therefore, if the distance between adjacent parking places is greater than the distance threshold corresponding to that distance, the vehicle has stopped at least twice, with a considerable distance between the stops; in this case, it can be determined that the overhead identification parameter of the vehicle meets the second condition, and further that the vehicle is traveling on the road on the lower side of the overhead.
In one possible example, the distance threshold is 200 meters, in which case if the distance between adjacent parking places is greater than 200 meters, the vehicle may be considered to be traveling on the road below the overhead.
Of course, the distance threshold may have other lengths, which is not limited in this embodiment.
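The three parking-state checks of the second condition (any one parameter exceeding its threshold) can be combined as in the sketch below, using the example thresholds from the text (2 stops in the first time period, 30 s, 200 m); the function shape and parameter names are illustrative assumptions.

```python
def meets_second_condition(park_count, park_duration_s, adjacent_park_gaps_m,
                           count_threshold=2, duration_threshold_s=30.0,
                           gap_threshold_m=200.0):
    """Second condition: any one parking-state parameter exceeds its threshold."""
    if park_count > count_threshold:
        return True                       # parked too often for an overhead road
    if park_duration_s > duration_threshold_s:
        return True                       # parked too long for an overhead road
    return any(g > gap_threshold_m for g in adjacent_park_gaps_m)
```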
In the solution provided by the embodiment of the present application, the parking state parameter of the vehicle includes at least one of the following parameters: the number of parking events in the first time period, the parking duration, and the distance between adjacent parking places. To improve the accuracy of determining whether the vehicle is traveling on the road on the lower side of the overhead, at least two parking state parameters may also be combined to jointly make this determination.
In one possible example, it may be determined that the overhead identification parameter of the vehicle meets the second condition only when the distance between adjacent parking places is greater than the distance threshold corresponding to that distance and the parking duration is greater than the duration threshold corresponding to the parking duration.
For example, if the vehicle is parked at the places 1 and 2 in sequence, the distance between the places 1 and 2 is greater than 200 m, and the parking time at the places 1 and 2 is greater than 1 min, the overhead recognition parameter of the vehicle may be considered to be in accordance with the second condition, and it may be further determined that the vehicle is traveling on the road on the lower side of the overhead.
The embodiment of the application provides an overhead identification method, in which a terminal device determines an overhead identification parameter of a vehicle, determines a road on which the vehicle is positioned on the upper side of an overhead when the overhead identification parameter of the vehicle meets a first condition, and determines a road on which the vehicle is positioned on the lower side of the overhead when the overhead identification parameter of the vehicle meets a second condition.
When determining whether the vehicle runs on the road on the upper side of the overhead or on the road on the lower side of the overhead, the embodiment of the application combines the overhead identification parameter of the vehicle with the first condition that must be met when the vehicle is on the road on the upper side of the overhead and the second condition that must be met when the vehicle is on the road on the lower side of the overhead. It can thereby determine which of the two roads the vehicle is on, solving the problem in the prior art of being unable to determine whether the vehicle is on the overhead. Compared with the prior art, the scheme of the application can reduce navigation errors and improve navigation accuracy.
Furthermore, because the scheme of the application can improve the navigation accuracy, the experience of a user on the vehicle can be improved, the time consumption of the driving process is reduced, the oil consumption of the vehicle is reduced, and the purpose of energy conservation is achieved.
In addition, the elevated structure includes various forms, for example, some elevated structures have only one floor, i.e., the road on the upper side of such elevated structures has only one floor. In addition, some of the elevated frames include two or more layers, and for such an elevated frame, the road on the upper side of the elevated frame is any one of the elevated road layers above the ground, and the road on the lower side of the elevated frame is the ground road located below the elevated road closest to the ground.
If the overhead structure includes two or more layers, the road on the upper side of the overhead structure often includes two or more layers, and in order to improve the accuracy of vehicle navigation, after determining that the vehicle is located on the road on the upper side of the overhead structure, it is often necessary to determine the number of layers of the road on the upper side of the overhead structure on which the vehicle is located. For this situation, referring to the workflow diagram of fig. 11, the embodiment of the present application further includes the following steps:
step S13, the elevated frame comprises more than two layers, and after the road of the vehicle on the upper side of the elevated frame is determined, the height of the vehicle is determined;
and step S14, determining the number of layers of the road on the upper side of the overhead where the vehicle is positioned according to the height of the vehicle and the height of the overhead of each layer.
In the embodiments of the present application, the height of the vehicle may be determined in various ways. In some embodiments, the terminal device may determine the height of the vehicle based on a height sensor (e.g., a barometer).
In some embodiments, the height of each of the levels of the local overhead may be stored in the terminal device. In this case, the terminal device may determine its own position information based on the GNSS signal, and then may query its own memory based on the position information to determine the height of each of the levels of the overhead around the position of the terminal device itself, and then further determine the number of layers of the road on the upper side of the overhead where the vehicle is located based on the height of the vehicle.
In some embodiments, the terminal device may also transmit the location information to a remote server after determining its own location information. The server determines the height of each of the peripheral elevated frames of the terminal device based on the position information after receiving the position information, and transmits the height of each of the peripheral elevated frames to the terminal device, and the terminal device determines the number of floors of the road on the upper side of the elevated frame where the vehicle is located based on the received information and the height of the vehicle.
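Matching the vehicle's height to a layer might be sketched as follows; the nearest-layer strategy, the 3 m tolerance, and the list-of-heights format are illustrative assumptions rather than details from the text.

```python
def overhead_level(vehicle_height_m, layer_heights_m, tolerance_m=3.0):
    """Match the barometer-derived vehicle height to the nearest overhead layer.

    layer_heights_m lists each layer's height above ground, lowest first;
    returns the 1-based layer number, or None if no layer is within tolerance.
    """
    best, best_diff = None, tolerance_m
    for i, h in enumerate(layer_heights_m, start=1):
        diff = abs(vehicle_height_m - h)
        if diff <= best_diff:
            best, best_diff = i, diff
    return best
```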
In some embodiments, the terminal device may interact with information from devices within the vehicle. For example, the terminal device may perform information interaction with a vehicle machine installed in a center console of a vehicle.
The terminal device may transmit the position information of itself and the height of the vehicle to the device in the vehicle, and the device in the vehicle determines the number of floors of the road on the upper side of the overhead where the vehicle is located based on this, and transmits it to the terminal device, so that the terminal device determines the number of floors of the road on the upper side of the overhead where the vehicle is located.
In the prior art, the terminal device for navigation cannot determine whether the vehicle is located on the road on the upper side of the overhead, and accordingly, cannot determine the number of layers of the road on the upper side of the overhead where the vehicle is located.
The scheme provided by the embodiment of the application can determine whether the vehicle runs on the road on the upper side of the overhead, and can further determine the number of layers of the road on the upper side of the overhead where the vehicle is located when the vehicle is determined to run on the road on the upper side of the overhead, so that the accuracy of vehicle navigation is further improved.
In the solution provided in the embodiment of the present application, the method may further include the following steps:
and reporting the positioning information of the vehicle to a server, wherein the positioning information is used for indicating the road of the vehicle positioned on the upper side of the overhead or the road positioned on the lower side of the overhead.
Further, if the overhead structure includes more than two floors, the positioning information further includes the number of floors of the road on the upper side of the overhead structure where the vehicle is located.
Wherein the server may be a server of a navigation application. Compared with the prior art, the positioning information received by the server is more accurate, so that the position of the vehicle can be more accurately determined by the server, and the navigation accuracy is further improved.
The overhead identification method provided by the embodiment of the application can effectively reduce navigation errors and improve navigation precision. To clarify the advantages of the present application, an example is provided below.
In this example, the terminal device respectively navigates the vehicle through solutions provided in the prior art and the embodiment of the present application.
Fig. 3(b) illustrates the electronic map displayed by the terminal device when it navigates the vehicle according to the prior art. The navigation in the figure places the user's vehicle on the northbound Fourth Ring East Road, under the overhead bridge, whereas the actual position of the user's vehicle, represented by the five-pointed star in fig. 3(b), is on the overhead of the Fourth Ring main road. Because the actual position of the vehicle is inconsistent with the navigation position on the user's mobile phone, the navigation misidentifies the vehicle position.
Fig. 12 is an electronic map displayed by the terminal device for navigating the vehicle when the terminal device navigates the vehicle according to the embodiment of the present application. Referring to fig. 12, the overhead recognition method according to the embodiment of the present application can accurately position the vehicle of the user on the overhead, so that the navigation position of the mobile phone is consistent with the actual position of the vehicle, thereby realizing accurate positioning and navigation.
The various method embodiments described herein may be implemented as stand-alone solutions or combined in accordance with inherent logic and are intended to fall within the scope of the present application.
It is to be understood that, in the above-described method embodiments, the method and operations implemented by the terminal device may also be implemented by a component (e.g., a chip or a circuit) that can be used for the terminal device.
The above embodiments describe the overhead identification method provided in the present application. It is understood that the terminal device includes a hardware structure and/or a software module for performing each function in order to implement the above functions. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is performed as hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the terminal device may be divided into the functional modules according to the above method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. It should be noted that, in the embodiment of the present application, the division of the module is schematic, and is only one logic function division, and there may be another division manner in actual implementation.
The method provided by the embodiment of the present application is described in detail above with reference to fig. 1 to 12. Hereinafter, the apparatus provided in the embodiment of the present application will be described in detail with reference to fig. 13 to 14. It should be understood that the description of the apparatus embodiments corresponds to the description of the method embodiments, and therefore, for brevity, details are not repeated here, since the details that are not described in detail may be referred to the above method embodiments.
Referring to fig. 13, fig. 13 is a block diagram illustrating a structure of an embodiment of an overhead identification device according to the present disclosure. As shown in fig. 13, the apparatus 1000 may include: a transceiver 1001 and a processor 1002. The apparatus 1000 may perform the operations performed by the terminal device in the method embodiments shown in fig. 9 or fig. 11.
For example, in an alternative embodiment of the present application, the transceiver 1001 is used for receiving GNSS signals. The processor 1002 is configured to: determining an overhead identification parameter of the vehicle;
if the overhead identification parameters of the vehicle accord with a first condition, determining that the vehicle is positioned on a road on the upper side of the overhead;
determining that the vehicle is located on a road on the lower side of the overhead if the overhead identification parameter of the vehicle meets a second condition;
wherein the overhead identification parameters of the vehicle include: a vehicle speed of the vehicle or a parking state parameter of the vehicle;
the parking state parameter of the vehicle includes at least one of the following parameters: the parking times and the parking duration in the first time period and the distance between adjacent parking places;
the first condition is used for indicating the relation between the vehicle speed of the vehicle and a speed threshold value corresponding to the vehicle speed;
the second condition is used for indicating a relationship between a parking state parameter of the vehicle and a threshold value corresponding to the parking state parameter.
In one possible implementation, the processor determines an overhead identification parameter of the vehicle, specifically:
determining an overhead identification parameter of the vehicle when an operation to start a navigation function is received;
or when receiving a position searching operation, determining an overhead identification parameter of the vehicle;
or, when an operation for indicating a destination is received, determining an overhead identification parameter of the vehicle;
alternatively, when an operation for indicating that the navigation mode is driving is received, determining an overhead identification parameter of the vehicle;
Or when the speed of the terminal equipment is greater than a target speed threshold value, determining an overhead identification parameter of the vehicle;
or when the front of the vehicle is determined to contain the ramp junction according to the GNSS signals, determining the overhead identification parameters of the vehicle;
or when it is determined that the front of the vehicle includes an overhead sign from an image including the front of the vehicle, determining an overhead recognition parameter of the vehicle;
alternatively, when it is determined that the periphery of the vehicle includes an overhead from the GNSS signal, an overhead identification parameter of the vehicle is determined.
In a possible implementation manner, if the vehicle speed of the vehicle in the second time period is not less than the speed threshold corresponding to the vehicle speed, the overhead identification parameter of the vehicle meets the first condition;
or the parking state parameter of the vehicle comprises the number of parking times in a first time period, and if the number of parking times in the first time period is greater than a number threshold corresponding to the number of parking times, the overhead identification parameter of the vehicle meets the second condition;
or the parking state parameter of the vehicle comprises parking time, and if the parking time of the vehicle is greater than a time threshold corresponding to the parking time, the overhead identification parameter of the vehicle meets the second condition;
or the parking state parameter of the vehicle comprises a distance between adjacent parking places, and if the distance between the adjacent parking places is greater than a distance threshold corresponding to the distance between the adjacent parking places, the overhead identification parameter of the vehicle meets the second condition.
In one possible implementation manner, the overhead identification parameter of the vehicle includes a vehicle speed of the vehicle, and the processor determines the overhead identification parameter of the vehicle, specifically:
determining a vehicle speed of the vehicle from a sensor;
or, the processor determines an overhead identification parameter of the vehicle, specifically:
determining the position information of the vehicle at different moments according to the GNSS signals;
and determining the speed of the vehicle according to the position information of the vehicle at different moments.
In a possible implementation manner, the processor determines the vehicle speed of the vehicle according to the position information of the vehicle at different times, specifically:
setting the time periods of the different moments as target time periods, wherein the target time periods comprise at least two sub time periods, and determining the speed of the vehicle in the sub time periods according to the position information of the vehicle at each moment in the sub time periods;
determining a vehicle speed of the vehicle based on a first sub-period of vehicle speed, wherein the first sub-period of vehicle speed is greater than a first speed threshold.
In a possible implementation manner, the overhead identification parameter of the vehicle includes a parking state parameter of the vehicle, and the processor determines the overhead identification parameter of the vehicle, specifically:
determining the position information of the vehicle at different moments according to the GNSS signals;
if the position information of the vehicle at different moments indicates that the distance between the positions of the vehicle at a first moment and a second moment is within a first distance threshold, determining that the vehicle is in a parking state at the first moment and the second moment;
determining the parking time length of the vehicle according to the time length of the vehicle in the parking state;
or determining the distance between adjacent parking places according to the position information when the vehicle is in a parking state;
or determining the parking times in the first time period according to the times of the vehicle in the parking state in the first time period.
In a possible implementation manner, the overhead identification parameter of the vehicle includes a parking state parameter of the vehicle, and the processor determines the overhead identification parameter of the vehicle, specifically:
determining whether the vehicle is in a parking state or not according to the vehicle speed of the vehicle, wherein when the vehicle speed of the vehicle is smaller than a second speed threshold value, the vehicle is in the parking state;
determining the parking time length of the vehicle according to the time length of the vehicle in the parking state;
or determining the distance between the adjacent parking places according to the position information when the vehicle is in the parking state;
or determining the parking times in the first time period according to the times of the vehicle in the parking state in the first time period.
In one possible implementation, the processor is further configured to:
if the overhead comprises two or more layers, determining the height of the vehicle after determining that the vehicle is located on the road on the upper side of the overhead;
and determining, according to the height of the vehicle and the height of each layer of the overhead, on which layer of the overhead the vehicle is located.
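The layer determination can be sketched as below. This is an assumed sketch: the embodiment does not fix the height source (barometer, GNSS altitude, or fusion) or the rounding rule, and the sketch assumes a uniform per-layer height.

```python
def overhead_layer(vehicle_height, ground_height, layer_height, num_layers):
    """Estimate which layer of a multi-layer overhead the vehicle is on.
    vehicle_height: vehicle altitude (e.g. barometer- or GNSS-derived), metres
    ground_height: road altitude at ground level at this location, metres
    layer_height: vertical spacing between layers (assumed uniform), metres
    Returns 0 for the road below the overhead, otherwise the layer number."""
    relative = vehicle_height - ground_height
    if relative < layer_height / 2:
        return 0  # effectively at ground level, i.e. the road below the overhead
    return min(round(relative / layer_height), num_layers)
```

Rounding to the nearest multiple of the layer height tolerates altitude noise of up to half a layer in either direction.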
In one possible implementation, the processor is further configured to:
reporting positioning information of the vehicle to a server, wherein the positioning information indicates whether the vehicle is located on the road on the upper side of the overhead or on the road on the lower side of the overhead;
if the overhead comprises two or more layers, the positioning information further includes the layer of the overhead on which the vehicle is located.
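A positioning report of this shape might look as follows. The field names and JSON encoding are purely illustrative assumptions; the embodiment does not specify a wire format.

```python
import json

def build_positioning_report(device_id, lat, lon, on_upper_side, layer=None):
    """Serialize the positioning information reported to the server: whether
    the vehicle is on the road on the upper side of the overhead or below it,
    plus the layer number when the overhead has more than two layers."""
    report = {
        "device_id": device_id,
        "lat": lat,
        "lon": lon,
        "road": "overhead_upper" if on_upper_side else "overhead_lower",
    }
    if on_upper_side and layer is not None:
        report["layer"] = layer
    return json.dumps(report)
```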
That is, the apparatus 1000 may implement steps or flows corresponding to those executed by the terminal device in the overhead identification method embodiment shown in fig. 9 or fig. 11, and the apparatus 1000 may include modules for executing the method executed by the terminal device in the overhead identification method embodiment shown in fig. 9 or fig. 11. It should be understood that the specific processes of the modules for executing the above corresponding steps have been described in detail in the above embodiment of the overhead identification method, and are not described herein again for brevity.
An embodiment of the present application further provides an overhead identification device, which includes at least one processor and a communication interface. The communication interface is used for providing information input and/or output for the at least one processor, and the at least one processor is used for executing the method in the method embodiment.
Embodiments of the present application further provide a terminal device, which includes a processor, and when the processor executes a computer program or instructions in a memory, the method in the above method embodiments is performed.
The embodiment of the application also provides a terminal device, which comprises a processor and a memory; the memory is for storing a computer program or instructions; the processor is configured to execute the computer program or instructions stored in the memory to cause the terminal device to perform the method as in the above-mentioned method embodiments.
The embodiment of the application also provides terminal equipment, which comprises a processor, a memory and a transceiver; the transceiver is used for receiving signals or sending signals; the memory is for storing a computer program or instructions; the processor is configured to execute the computer program or instructions stored in the memory to cause the terminal device to perform the method as in the above-described method embodiments.
The embodiment of the application also provides terminal equipment, which comprises a processor and an interface circuit; the interface circuit is used for receiving a computer program or instructions and transmitting the computer program or instructions to the processor; the processor is configured to execute the computer program or instructions to cause the terminal device to perform the method as in the above method embodiments.
It should be understood that the overhead identification device may be a chip. For example, referring to fig. 14, fig. 14 is a block diagram of a chip according to an embodiment of the present disclosure. The chip shown in fig. 14 may be a general-purpose processor or a special-purpose processor. The chip 1100 may include at least one processor 1101. Wherein the at least one processor 1101 may be configured to support the apparatus shown in fig. 13 to execute the technical solution shown in fig. 9 or fig. 11.
Optionally, the chip 1100 may further include a transceiver 1102, where the transceiver 1102 is configured to receive control of the processor 1101, and is configured to support the apparatus shown in fig. 13 to execute the technical solution shown in fig. 9 or fig. 11. Optionally, the chip 1100 shown in fig. 14 may further include a storage medium 1103. In particular, the transceiver 1102 may be replaced with a communication interface that provides information input and/or output to the at least one processor 1101.
It should be noted that the chip 1100 shown in fig. 14 can be implemented by using the following circuits or devices: one or more Field Programmable Gate Arrays (FPGAs), Programmable Logic Devices (PLDs), Application Specific Integrated Circuits (ASICs), Systems on Chip (SoCs), Central Processing Units (CPUs), Network Processors (NPs), Digital Signal Processors (DSPs), Micro Controller Units (MCUs), controllers, state machines, gate logic, discrete hardware components, any other suitable circuitry, or any combination of circuits capable of performing the various functions described throughout this application.
In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in a processor or by instructions in the form of software. The steps of a method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware processor, or may be implemented by a combination of hardware and software modules in the processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or registers. The storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the method in combination with its hardware. To avoid repetition, the details are not described here again.
It should be noted that the processor in the embodiments of the present application may be an integrated circuit chip having signal processing capability. In implementation, the steps of the above method embodiments may be performed by integrated logic circuits of hardware in a processor or by instructions in the form of software. The processor described above may be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or registers. The storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the method in combination with its hardware.
It will be appreciated that the memory in the embodiments of the present application may be volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), which is used as an external cache. By way of example but not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), and Direct Rambus RAM (DR RAM). It should be noted that the memory of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
According to the method provided by the embodiments of the present application, an embodiment of the present application further provides a computer program product, which includes a computer program or instructions that, when run on a computer, cause the computer to perform the method of any one of the embodiments shown in fig. 9 or fig. 11.
According to the method provided by the embodiments of the present application, a computer storage medium is further provided. The computer storage medium stores a computer program or instructions, and when the computer program or instructions run on a computer, the computer is caused to execute the method of any one of the embodiments shown in fig. 9 or fig. 11.
According to the method provided by the embodiments of the present application, an embodiment of the present application further provides a terminal device. The terminal device is a smart device, such as a smartphone, a tablet computer, or a personal digital assistant, and includes the above-mentioned overhead identification device.
Those of ordinary skill in the art will appreciate that the various illustrative logical blocks and steps (step) described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the system, the apparatus and the module described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is only one logical functional division, and other divisions may be realized in practice, for example, a plurality of modules or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional modules in the embodiments of the present application may be integrated into one processing unit, or each module may exist alone physically, or two or more modules are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The overhead identification device, the chip, the computer storage medium, the computer program product, and the terminal device provided in the embodiments of the present application are all configured to execute the method provided above, and therefore, the beneficial effects achieved by the overhead identification device can refer to the beneficial effects corresponding to the method provided above, and are not described herein again.
It should be understood that, in the embodiments of the present application, the execution sequence of each step should be determined by its function and inherent logic, and the size of the sequence number of each step does not mean the execution sequence, and does not limit the implementation process of the embodiments.
This specification is described in a progressive manner; for identical or similar parts of the various embodiments, reference may be made to one another, and each embodiment focuses on its differences from the other embodiments. In particular, the embodiments of the overhead identification device, the chip, the computer storage medium, the computer program product, and the terminal device are described relatively briefly because they are substantially similar to the method embodiments; for relevant points, reference may be made to the description of the method embodiments.
While the preferred embodiments of the present application have been described, additional variations and modifications to those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, the appended claims are intended to be interpreted as including the preferred embodiments and all alterations and modifications that fall within the scope of the application.
The above-described embodiments of the present application do not limit the scope of the present application.

Claims (21)

1. An overhead identification method, comprising:
determining an overhead identification parameter of the vehicle;
if the overhead identification parameters of the vehicle meet a first condition, determining that the vehicle is positioned on a road on the upper side of an overhead;
determining that the vehicle is located on a road below the overhead if the overhead identification parameter of the vehicle meets a second condition;
wherein the overhead identification parameters of the vehicle include: a vehicle speed of the vehicle or a parking state parameter of the vehicle;
the parking state parameter of the vehicle includes at least one of the following parameters: the number of parking times in a first time period, the parking duration, and the distance between adjacent parking places;
the first condition is used for indicating the relation between the vehicle speed of the vehicle and a speed threshold value corresponding to the vehicle speed;
the second condition is used for indicating a relationship between a parking state parameter of the vehicle and a threshold value corresponding to the parking state parameter.
2. The method of claim 1, wherein determining an overhead identification parameter for a vehicle comprises:
determining an overhead identification parameter of the vehicle when an operation to start a navigation function is received;
or when receiving a position searching operation, determining an overhead identification parameter of the vehicle;
or, when an operation for indicating a destination is received, determining an overhead identification parameter of the vehicle;
or, when an operation for indicating that the navigation mode is driving is received, determining an overhead identification parameter of the vehicle;
Or when the speed of the terminal equipment is greater than a target speed threshold value, determining an overhead identification parameter of the vehicle;
or when the front of the vehicle is determined to contain the ramp junction according to the GNSS signals, determining the overhead identification parameters of the vehicle;
or when the front of the vehicle is determined to comprise an overhead mark according to the image comprising the front of the vehicle, determining an overhead identification parameter of the vehicle;
alternatively, when it is determined that the periphery of the vehicle includes an overhead from the GNSS signal, an overhead identification parameter of the vehicle is determined.
3. The method of claim 1,
if the vehicle speed of the vehicle in a second time period is not less than the speed threshold corresponding to the vehicle speed, the overhead identification parameter of the vehicle meets the first condition;
or the parking state parameter of the vehicle comprises the number of parking times in a first time period, and if the number of parking times in the first time period is greater than a number threshold corresponding to the number of parking times, the overhead identification parameter of the vehicle meets the second condition;
or the parking state parameter of the vehicle comprises parking time, and if the parking time of the vehicle is greater than a time threshold corresponding to the parking time, the overhead identification parameter of the vehicle meets the second condition;
or the parking state parameter of the vehicle comprises a distance between adjacent parking places, and if the distance between the adjacent parking places is greater than a distance threshold corresponding to the distance between the adjacent parking places, the overhead identification parameter of the vehicle meets the second condition.
4. The method of claim 1, wherein the overhead identification parameter of the vehicle comprises a vehicle speed of the vehicle, and wherein determining the overhead identification parameter of the vehicle comprises:
determining a vehicle speed of the vehicle from a sensor;
alternatively, the determining an overhead identification parameter of the vehicle comprises:
determining the position information of the vehicle at different moments according to the GNSS signals;
and determining the speed of the vehicle according to the position information of the vehicle at different moments.
5. The method of claim 4, wherein determining the vehicle speed of the vehicle based on the position information of the vehicle at different times comprises:
taking the time period formed by the different moments as a target time period, wherein the target time period comprises at least two sub-time periods, and determining the vehicle speed of the vehicle in each sub-time period according to the position information of the vehicle at each moment in the sub-time period;
determining a vehicle speed of the vehicle based on a vehicle speed of a first sub-time period, wherein the vehicle speed of the first sub-time period is greater than a first speed threshold.
6. The method of claim 1, wherein the overhead identification parameter of the vehicle comprises a parking status parameter of the vehicle, and wherein determining the overhead identification parameter of the vehicle comprises:
determining the position information of the vehicle at different moments according to the GNSS signals;
if the position information of the vehicle at different moments indicates that the distance between the positions of the vehicle at a first moment and a second moment is within a first distance threshold, determining that the vehicle is in a parking state at the first moment and the second moment;
determining the parking time length of the vehicle according to the time length of the vehicle in the parking state;
or determining the distance between adjacent parking places according to the position information when the vehicle is in a parking state;
or determining the parking times in the first time period according to the times of the vehicle in the parking state in the first time period.
7. The method of claim 1, wherein the overhead identification parameter of the vehicle comprises a parking status parameter of the vehicle, and wherein determining the overhead identification parameter of the vehicle comprises:
determining whether the vehicle is in a parking state or not according to the vehicle speed of the vehicle, wherein when the vehicle speed of the vehicle is smaller than a second speed threshold value, the vehicle is in the parking state;
determining the parking time length of the vehicle according to the time length of the vehicle in the parking state;
or determining the distance between the adjacent parking places according to the position information when the vehicle is in the parking state;
or determining the parking times in the first time period according to the times of the vehicle in the parking state in the first time period.
8. The method of any one of claims 1 to 7, further comprising:
if the overhead comprises two or more layers, determining the height of the vehicle after determining that the vehicle is located on the road on the upper side of the overhead;
and determining, according to the height of the vehicle and the height of each layer of the overhead, on which layer of the overhead the vehicle is located.
9. The method of claim 8, further comprising:
reporting positioning information of the vehicle to a server, wherein the positioning information indicates whether the vehicle is located on the road on the upper side of the overhead or on the road on the lower side of the overhead;
if the overhead comprises two or more layers, the positioning information further includes the layer of the overhead on which the vehicle is located.
10. An overhead identification device, the device comprising: a transceiver and a processor;
the transceiver is used for receiving GNSS signals;
the processor is configured to: determining an overhead identification parameter of the vehicle;
determining that the vehicle is located on a road on the upper side of the overhead if the overhead identification parameter of the vehicle meets a first condition;
determining that the vehicle is located on a road on the lower side of the overhead if the overhead identification parameter of the vehicle meets a second condition;
wherein the overhead identification parameters of the vehicle include: a vehicle speed of the vehicle or a parking state parameter of the vehicle;
the parking state parameter of the vehicle includes at least one of the following parameters: the number of parking times in a first time period, the parking duration, and the distance between adjacent parking places;
the first condition is used for indicating the relation between the vehicle speed of the vehicle and a speed threshold value corresponding to the vehicle speed;
the second condition is used for indicating a relationship between a parking state parameter of the vehicle and a threshold value corresponding to the parking state parameter.
11. The apparatus of claim 10, wherein the processor determines an overhead identification parameter of the vehicle, in particular:
determining an overhead identification parameter of the vehicle when an operation to start a navigation function is received;
or when receiving a position searching operation, determining an overhead identification parameter of the vehicle;
or, when an operation for indicating a destination is received, determining an overhead identification parameter of the vehicle;
or, when an operation for indicating that the navigation mode is driving is received, determining an overhead identification parameter of the vehicle;
Or when the speed of the terminal equipment is greater than a target speed threshold value, determining an overhead identification parameter of the vehicle;
or when the front of the vehicle is determined to contain a ramp junction according to a Global Navigation Satellite System (GNSS) signal, determining an overhead identification parameter of the vehicle;
or when it is determined that the front of the vehicle includes an overhead sign from an image including the front of the vehicle, determining an overhead recognition parameter of the vehicle;
alternatively, when it is determined that the periphery of the vehicle includes an overhead from the GNSS signal, an overhead identification parameter of the vehicle is determined.
12. The apparatus of claim 10,
if the vehicle speed of the vehicle in a second time period is not less than the speed threshold corresponding to the vehicle speed, the overhead identification parameter of the vehicle meets the first condition;
or the parking state parameter of the vehicle comprises the number of parking times in a first time period, and if the number of parking times in the first time period is greater than a number threshold corresponding to the number of parking times, the overhead identification parameter of the vehicle meets the second condition;
or the parking state parameter of the vehicle comprises parking time, and if the parking time of the vehicle is greater than a time threshold corresponding to the parking time, the overhead identification parameter of the vehicle meets the second condition;
or the parking state parameters of the vehicle comprise the distance between adjacent parking places, and if the distance between the adjacent parking places is greater than a distance threshold corresponding to the distance between the adjacent parking places, the overhead identification parameters of the vehicle meet the second condition.
13. The apparatus of claim 10, wherein the overhead identification parameter of the vehicle comprises a vehicle speed of the vehicle, and wherein the processor determines the overhead identification parameter of the vehicle, in particular:
determining a vehicle speed of the vehicle from a sensor;
or, the processor determines an overhead identification parameter of the vehicle, specifically:
determining the position information of the vehicle at different moments according to the GNSS signals;
and determining the speed of the vehicle according to the position information of the vehicle at different moments.
14. The apparatus according to claim 13, wherein the processor determines the vehicle speed of the vehicle based on the position information of the vehicle at different times, specifically:
taking the time period formed by the different moments as a target time period, wherein the target time period comprises at least two sub-time periods, and determining the vehicle speed of the vehicle in each sub-time period according to the position information of the vehicle at each moment in the sub-time period;
determining a vehicle speed of the vehicle based on a vehicle speed of a first sub-time period, wherein the vehicle speed of the first sub-time period is greater than a first speed threshold.
15. The apparatus of claim 10, wherein the overhead identification parameter of the vehicle comprises a parking status parameter of the vehicle, and wherein the processor determines the overhead identification parameter of the vehicle, in particular:
determining the position information of the vehicle at different moments according to the GNSS signals;
if the position information of the vehicle at different moments indicates that the distance between the positions of the vehicle at a first moment and a second moment is within a first distance threshold, determining that the vehicle is in a parking state at the first moment and the second moment;
determining the parking time length of the vehicle according to the time length of the vehicle in the parking state;
or determining the distance between adjacent parking places according to the position information when the vehicle is in a parking state;
or determining the parking times in the first time period according to the times of the vehicle in the parking state in the first time period.
16. The apparatus of claim 10, wherein the overhead identification parameter of the vehicle comprises a parking status parameter of the vehicle, and wherein the processor determines the overhead identification parameter of the vehicle, in particular:
determining whether the vehicle is in a parking state or not according to the vehicle speed of the vehicle, wherein when the vehicle speed of the vehicle is smaller than a second speed threshold value, the vehicle is in the parking state;
determining the parking time length of the vehicle according to the time length of the vehicle in the parking state;
or determining the distance between the adjacent parking places according to the position information when the vehicle is in the parking state;
or determining the parking times in the first time period according to the times of the vehicle in the parking state in the first time period.
17. The apparatus of any of claims 10 to 16, wherein the processor is further configured to:
if the overhead comprises two or more layers, determining the height of the vehicle after determining that the vehicle is located on the road on the upper side of the overhead;
and determining, according to the height of the vehicle and the height of each layer of the overhead, on which layer of the overhead the vehicle is located.
18. The apparatus of claim 17, wherein the processor is further configured to:
reporting positioning information of the vehicle to a server, wherein the positioning information indicates whether the vehicle is located on the road on the upper side of the overhead or on the road on the lower side of the overhead;
if the overhead comprises two or more layers, the positioning information further includes the layer of the overhead on which the vehicle is located.
19. A terminal device, characterized in that it comprises the apparatus of any of claims 10 to 18.
20. A computer storage medium having stored thereon a computer program or instructions which, when executed, cause the method of any one of claims 1-9 to be performed.
21. A chip comprising a processor coupled with a memory for executing a computer program or instructions stored in the memory, the computer program or instructions, when executed, performing the method of any of claims 1-9.
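Taken together, the decision rule described in claims 1 and 3 can be sketched as follows. This is an illustrative Python sketch only: the claims specify the comparisons but not the threshold values, the order of the checks, or what happens when neither condition is met.

```python
def classify_road(speed_2nd_period, speed_threshold,
                  parking_count_1st_period, count_threshold,
                  parking_duration, duration_threshold,
                  adjacent_parking_distance, distance_threshold):
    """Sustained speed at or above the speed threshold meets the first
    condition (road on the upper side of the overhead); frequent, long, or
    widely spaced stops meet the second condition (road below the overhead)."""
    if speed_2nd_period >= speed_threshold:
        return "overhead_upper"   # first condition met
    if (parking_count_1st_period > count_threshold
            or parking_duration > duration_threshold
            or adjacent_parking_distance > distance_threshold):
        return "overhead_lower"   # second condition met
    return "undetermined"
```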
CN202110902596.4A 2021-08-06 2021-08-06 Overhead identification method and device Active CN113792589B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110902596.4A CN113792589B (en) 2021-08-06 2021-08-06 Overhead identification method and device
PCT/CN2022/091512 WO2023010922A1 (en) 2021-08-06 2022-05-07 Overpass identification method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110902596.4A CN113792589B (en) 2021-08-06 2021-08-06 Overhead identification method and device

Publications (2)

Publication Number Publication Date
CN113792589A CN113792589A (en) 2021-12-14
CN113792589B true CN113792589B (en) 2022-09-09

Family

ID=79181528

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110902596.4A Active CN113792589B (en) 2021-08-06 2021-08-06 Overhead identification method and device

Country Status (2)

Country Link
CN (1) CN113792589B (en)
WO (1) WO2023010922A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113792589B (en) * 2021-08-06 2022-09-09 荣耀终端有限公司 Overhead identification method and device
CN117367487A (en) * 2022-07-01 2024-01-09 荣耀终端有限公司 Climbing state identification method and device
CN114979949B (en) * 2022-07-26 2022-12-27 荣耀终端有限公司 Flight state identification method and flight state identification device

Citations (1)

Publication number Priority date Publication date Assignee Title
CN112997126A (en) * 2020-12-25 2021-06-18 华为技术有限公司 Vehicle calling method, intelligent vehicle and equipment

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
US20070233353A1 (en) * 2006-03-28 2007-10-04 Alexander Kade Enhanced adaptive cruise control system with forward vehicle collision mitigation
CN103376112A (en) * 2012-04-18 2013-10-30 德尔福技术有限公司 Elevated road auxiliary judging system, navigation equipment with same and navigation method thereof
CN107764274B (en) * 2016-08-17 2021-03-02 厦门雅迅网络股份有限公司 Method for judging whether vehicle runs on elevated road or not
CN108802769B (en) * 2018-05-30 2022-11-18 千寻位置网络有限公司 Detection method and device of GNSS terminal on or under overhead
CN111127874B (en) * 2018-10-30 2022-03-08 上海擎感智能科技有限公司 Overhead identification method and identification system
CN109917440B (en) * 2019-04-09 2021-07-13 广州小鹏汽车科技有限公司 Combined navigation method, system and vehicle
CN113819910A (en) * 2019-09-29 2021-12-21 百度在线网络技术(北京)有限公司 Method and device for identifying overpass zone in vehicle navigation
CN111860322B (en) * 2020-07-20 2022-10-11 吉林大学 Unstructured pavement type identification method based on multi-source sensor information fusion
CN113792589B (en) * 2021-08-06 2022-09-09 荣耀终端有限公司 Overhead identification method and device

Patent Citations (1)

Publication number Priority date Publication date Assignee Title
CN112997126A (en) * 2020-12-25 2021-06-18 华为技术有限公司 Vehicle calling method, intelligent vehicle and equipment

Also Published As

Publication number Publication date
WO2023010922A1 (en) 2023-02-09
CN113792589A (en) 2021-12-14

Similar Documents

Publication Publication Date Title
CN113792589B (en) Overhead identification method and device
WO2020133088A1 (en) System and method for updating map for self-driving
CN108860141B (en) Parking method, parking device and storage medium
WO2023010923A1 (en) Overpass identification method and apparatus
CN114882464B (en) Multi-task model training method, multi-task processing method, device and vehicle
CN110347147A (en) The method and system of positioning for vehicle
WO2023169448A1 (en) Method and apparatus for sensing target
CN115170630B (en) Map generation method, map generation device, electronic equipment, vehicle and storage medium
CN114756700B (en) Scene library establishing method and device, vehicle, storage medium and chip
CN115164910B (en) Travel route generation method, travel route generation device, vehicle, storage medium, and chip
CN115297461B (en) Data interaction method and device, vehicle, readable storage medium and chip
CN114863717B (en) Parking stall recommendation method and device, storage medium and vehicle
CN114937351B (en) Motorcade control method and device, storage medium, chip, electronic equipment and vehicle
CN113820732A (en) Navigation method and device
CN113790732A (en) Position information generation method and device
CN113790733B (en) Navigation method and device
CN115115822B (en) Vehicle-end image processing method and device, vehicle, storage medium and chip
CN114842454B (en) Obstacle detection method, device, equipment, storage medium, chip and vehicle
CN115221260B (en) Data processing method, device, vehicle and storage medium
CN115257628B (en) Vehicle control method, device, storage medium, vehicle and chip
CN114911630B (en) Data processing method and device, vehicle, storage medium and chip
EP4296132A1 (en) Vehicle control method and apparatus, vehicle, non-transitory storage medium and chip
CN115221261A (en) Map data fusion method and device, vehicle and storage medium
CN117325837A (en) Vehicle brake control method and device, readable storage medium, chip and vehicle
CN115150756A (en) Vehicle networking method, vehicle, computer readable storage medium and chip

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant