CN112146671B - Path planning method, related equipment and computer readable storage medium

Path planning method, related equipment and computer readable storage medium

Info

Publication number
CN112146671B
CN112146671B (application CN202010895471.9A)
Authority
CN
China
Prior art keywords
road
lane
level
vehicle
driving
Prior art date
Legal status
Active
Application number
CN202010895471.9A
Other languages
Chinese (zh)
Other versions
CN112146671A (en)
Inventor
熊福祥
王萌
唐炉亮
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202010895471.9A
Publication of CN112146671A
Application granted
Publication of CN112146671B
Legal status: Active
Anticipated expiration


Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 — Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 — Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 — Route searching; Route guidance
    • G01C21/3407 — Route searching; Route guidance specially adapted for specific applications
    • G01C21/3446 — Details of route searching algorithms, e.g. Dijkstra, A*, arc-flags, using precalculated routes

Abstract

The application relates to the field of navigation and provides a path planning method, related equipment and a computer-readable storage medium. The method includes the following steps: first, receiving a road-level planned driving path from a first vehicle-mounted terminal, where the road-level planned driving path includes N road track points and each road track point includes its longitude and latitude information; the road-level planned driving path is determined by the second cloud server according to the travel information of the first vehicle-mounted terminal in combination with a road-level map; then, performing matching in a lane-level driving route database according to the longitude and latitude information corresponding to each of the N track points to obtain a first drivable route set; and finally, determining a lane-level planned driving path in the first drivable route set. By implementing the method and the device, both the flexibility and the safety of the autonomous vehicle can be taken into account.

Description

Path planning method, related equipment and computer readable storage medium
Technical Field
The present application relates to the field of navigation, and in particular, to a path planning method, a related device, and a computer-readable storage medium.
Background
Automatic driving is a mainstream application in the field of artificial intelligence. Automatic driving technology relies on the cooperation of computer vision, radar, monitoring devices, global positioning systems and the like, so that a motor vehicle can drive automatically without active human operation. Autonomous vehicles use various computing systems to transport passengers from one location to another. Some autonomous vehicles may require some initial or continuous input from an operator (such as a driver or passenger). An autonomous vehicle permits the operator to switch from a manual driving mode to an autonomous driving mode, or to modes in between. Because automatic driving does not require a human to drive the motor vehicle, it can, in theory, effectively avoid human driving errors, reduce traffic accidents, and improve road transportation efficiency. Therefore, automatic driving technology is receiving increasing attention.
In existing implementations, a conventional navigation map cannot support lane-level path planning for an autonomous vehicle because its precision is too low. A lane-level driving path can be planned based on a high-precision map, but some data in a high-precision map is confidential, which undoubtedly makes it more difficult for an autonomous vehicle to plan a lane-level driving path. Therefore, how to balance the flexibility and the safety of the autonomous vehicle is an urgent technical problem to be solved.
Disclosure of Invention
The application provides a path planning method, related equipment and a computer-readable storage medium, which can determine a lane-level planned driving path from a lane-level driving route database and can balance the flexibility and the safety of an autonomous vehicle.
In a first aspect, a path planning method is provided, which may be applied to a first cloud server storing a lane-level driving route database. The method may include the following steps: first, receiving a road-level planned driving path from a first vehicle-mounted terminal, where the road-level planned driving path includes N road track points and each road track point includes its longitude and latitude information (longitude and latitude together form a coordinate pair); N is an integer greater than 1. Here, the road-level planned driving path may be a path determined by the second cloud server according to the travel information of the first vehicle-mounted terminal in combination with a road-level map; specifically, a road-level planned driving path refers to a path that causes a vehicle to travel on a specified road. Then, a first drivable route set is matched in the lane-level driving route database according to the longitude and latitude information corresponding to each of the N track points; here, a drivable route means a route on which a vehicle can travel while satisfying traffic regulations. Finally, a lane-level planned driving path is determined in the first drivable route set, where the lane-level planned driving path can be used by the first vehicle-mounted terminal to control the vehicle to drive according to the lane-level planned driving path.
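For illustration, the following is a minimal Python sketch of the flow described in this aspect. It is not the claimed implementation: the class names (RoadTrackPoint, LaneRoute), the fixed matching radius, and the use of route length as the selection cost are assumptions introduced only to make the steps concrete.

```python
# Illustrative sketch of the first-aspect flow (assumed names and structures).
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt
from typing import List


@dataclass
class RoadTrackPoint:
    lng: float  # longitude of the road track point
    lat: float  # latitude of the road track point


@dataclass
class LaneRoute:
    route_id: str
    polyline: List[RoadTrackPoint]  # lane-level geometry of a drivable route
    length_m: float                 # used here as a simple selection cost


def haversine_m(a: RoadTrackPoint, b: RoadTrackPoint) -> float:
    """Great-circle distance in metres between two longitude/latitude points."""
    lng1, lat1, lng2, lat2 = map(radians, (a.lng, a.lat, b.lng, b.lat))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lng2 - lng1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(h))


def match_drivable_routes(track: List[RoadTrackPoint],
                          lane_db: List[LaneRoute],
                          radius_m: float = 30.0) -> List[LaneRoute]:
    """First drivable route set: lane routes passing near any road track point."""
    matched = []
    for route in lane_db:
        if any(haversine_m(p, q) <= radius_m for p in track for q in route.polyline):
            matched.append(route)
    return matched


def choose_lane_level_path(candidates: List[LaneRoute]) -> LaneRoute:
    """Pick one lane-level planned driving path, here simply the shortest candidate."""
    return min(candidates, key=lambda r: r.length_m)
```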
By implementing the embodiment of the application, the first cloud server can plan the lane-level planned driving path based on the road-level planned driving path sent to the vehicle-mounted terminal by the second cloud server, and determine the lane-level planned driving path in the lane-level driving route database, so that both the flexibility and the safety of the autonomous vehicle can be taken into account.
In one possible implementation, the N road track points include a start road track point and an end road track point, and matching the first drivable route set in the lane-level driving route database according to the longitude and latitude information corresponding to each of the N track points includes: matching, in the lane-level driving route database, M intersections covered by the N road track points according to their respective longitude and latitude information, where M is an integer less than N; and sequentially acquiring the drivable routes within each of the M intersections and the drivable route corresponding to each road track point to obtain the first drivable route set. By implementing this embodiment of the application, the first cloud server can match the intersections in the lane-level driving route database according to the longitude and latitude information corresponding to each of the N road track points, so as to obtain the drivable routes within each of the M intersections and the drivable route corresponding to each road track point; when the first drivable route set is obtained in this way, both the flexibility and the safety of the autonomous vehicle can be taken into account.
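One way to read this implementation is sketched below. The Intersection structure, the planar distance approximation, and the 50 m matching radius are assumptions made for illustration; the patent itself does not specify how intersections are represented or matched.

```python
# Sketch of intersection matching within the road-level path (assumed structures).
from dataclasses import dataclass
from math import cos, radians, hypot
from typing import Dict, List, Tuple

LngLat = Tuple[float, float]  # (longitude, latitude) of a road track point


@dataclass
class Intersection:
    intersection_id: str
    centre: LngLat
    drivable_routes: List[str]  # ids of drivable routes inside this intersection


def approx_dist_m(a: LngLat, b: LngLat) -> float:
    """Rough planar distance in metres, adequate for nearby points."""
    mean_lat = radians((a[1] + b[1]) / 2)
    dx = (a[0] - b[0]) * 111_320 * cos(mean_lat)
    dy = (a[1] - b[1]) * 110_540
    return hypot(dx, dy)


def match_intersections(track: List[LngLat], intersections: List[Intersection],
                        radius_m: float = 50.0) -> List[Intersection]:
    """Return the M intersections covered by the N road track points (M < N)."""
    return [inter for inter in intersections
            if any(approx_dist_m(p, inter.centre) <= radius_m for p in track)]


def first_drivable_route_set(track: List[LngLat],
                             intersections: List[Intersection],
                             routes_per_point: Dict[int, List[str]]) -> List[str]:
    """Sequentially collect drivable routes per intersection and per track point."""
    route_ids: List[str] = []
    for inter in match_intersections(track, intersections):
        route_ids.extend(inter.drivable_routes)
    for idx in range(len(track)):
        route_ids.extend(routes_per_point.get(idx, []))
    return route_ids
```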
In one possible implementation, the lane-level travel route database includes at least one of lane travel route information, lane travel route type information, and lane travel route connectivity attribute information.
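Purely as an illustration of what such a database could look like, the following sketch lays the three kinds of information out as two SQLite tables; the table and column names are assumptions, not the storage format actually used by the first cloud server.

```python
# Hypothetical storage layout for a lane-level driving route database (assumed schema).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE lane_route (
    route_id      TEXT PRIMARY KEY,
    geometry_wkt  TEXT NOT NULL,   -- lane travel route information (polyline geometry)
    route_type    TEXT NOT NULL,   -- lane travel route type, e.g. 'straight', 'left_turn'
    road_id       TEXT NOT NULL
);
CREATE TABLE lane_route_connectivity (
    from_route_id TEXT NOT NULL,   -- connectivity attribute: which routes join end-to-end
    to_route_id   TEXT NOT NULL,
    PRIMARY KEY (from_route_id, to_route_id)
);
""")
conn.execute("INSERT INTO lane_route VALUES (?, ?, ?, ?)",
             ("r1", "LINESTRING(114.05 22.54, 114.06 22.55)", "straight", "roadA"))
conn.commit()
```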
In a possible implementation manner, the road-level planned driving route is a route determined by the second cloud server according to the travel information of the first vehicle-mounted terminal and in combination with a road-level map.
In one possible implementation, the method may further include the following steps: first, the first cloud server receives a road-level planned driving path from a second vehicle-mounted terminal, where this road-level planned driving path includes M road track points and each road track point includes its longitude and latitude information; M is an integer greater than 1. Then, if the similarity between the longitude and latitude information contained in at least Q groups of road track points is greater than a target threshold, the first cloud server sends the lane-level planned driving path corresponding to the first vehicle-mounted terminal to the second vehicle-mounted terminal; each of the Q groups of road track points includes one of the M road track points and one of the N road track points. In this embodiment of the application, when the first cloud server determines the lane-level planned driving path of the second vehicle-mounted terminal, it can use the lane-level planned driving path of the first vehicle-mounted terminal as a reference, so that the safety of the autonomous vehicle is guaranteed and the path planning efficiency can be improved.
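A hedged sketch of this reuse check is shown below. The similarity metric (distance mapped linearly to [0, 1]), the 0.8 threshold and the value of Q are illustrative assumptions; the patent only requires that at least Q groups of track points exceed a target similarity threshold.

```python
# Sketch of path reuse between two vehicle-mounted terminals (assumed similarity metric).
from math import cos, radians, hypot
from typing import List, Optional, Tuple

LngLat = Tuple[float, float]


def point_similarity(a: LngLat, b: LngLat, scale_m: float = 100.0) -> float:
    """Map the distance between two track points to a similarity in [0, 1]."""
    mean_lat = radians((a[1] + b[1]) / 2)
    dx = (a[0] - b[0]) * 111_320 * cos(mean_lat)
    dy = (a[1] - b[1]) * 110_540
    return max(0.0, 1.0 - hypot(dx, dy) / scale_m)


def reuse_lane_level_path(first_track: List[LngLat], second_track: List[LngLat],
                          first_lane_path: List[str], q: int = 3,
                          threshold: float = 0.8) -> Optional[List[str]]:
    """Return the first terminal's lane-level path if at least q point pairs
    are more similar than the target threshold; otherwise return None."""
    similar_pairs = sum(
        1 for p2 in second_track
        if any(point_similarity(p2, p1) > threshold for p1 in first_track))
    return first_lane_path if similar_pairs >= q else None
```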
In a second aspect, an embodiment of the present application further provides a path planning method, which may be applied to a first vehicle-mounted terminal on a vehicle, where the vehicle has an automatic driving mode and a manual driving mode. The method may include the following steps: first, sending a road-level planned driving path to a first cloud server, where the road-level planned driving path includes N road track points and each road track point includes its longitude and latitude information; N is an integer greater than 1; then, receiving a lane-level planned driving path determined by the first cloud server, where the lane-level planned driving path is determined by the first cloud server through the path planning method according to any one of the implementations of the first aspect; and controlling the vehicle to drive according to the lane-level planned driving path.
By implementing the embodiment of the application, the first cloud server can plan the lane-level planned driving path based on the road-level planned driving path sent to the vehicle-mounted terminal by the second cloud server, thereby decoupling road information from lane information. After the vehicle receives the lane-level planned driving path, a navigation service can be provided according to it, and both the flexibility and the safety of the autonomous vehicle can be taken into account.
In a third aspect, embodiments of the present application provide a vehicle navigation device, which may implement the functions described in the method according to the first aspect, where the functions may be implemented by hardware, or may be implemented by hardware executing corresponding software, where the hardware or software includes one or more units or modules corresponding to the functions.
In one possible implementation, the apparatus may include a transceiver and a processor, where the processor is configured to support the apparatus in performing the corresponding functions in the method according to the first aspect, and the transceiver is configured to support communication between the apparatus and other apparatuses. The apparatus may also include a memory, coupled to the processor, that stores the program instructions and data necessary for the apparatus.
In a fourth aspect, an embodiment of the present application provides a vehicle navigation device, which can implement the functions described in the method according to the second aspect, where the functions can be implemented by hardware, or by hardware executing corresponding software, and the hardware or software includes one or more units or modules corresponding to the functions.
In one possible implementation, the apparatus may include a transceiver and a processor, where the processor is configured to support the apparatus in performing the corresponding functions in the method according to the second aspect, and the transceiver is configured to support communication between the apparatus and other apparatuses. The apparatus may also include a memory, coupled to the processor, that stores the program instructions and data necessary for the apparatus.
In a fifth aspect, embodiments of the present application provide a chip, in which instructions are stored, which when run on a car navigation device cause the chip to perform the method of the first aspect described above.
In a sixth aspect, embodiments of the present application provide a chip, in which instructions are stored, which when run on a car navigation device cause the chip to perform the method of the second aspect described above.
In a seventh aspect, the present application provides a computer-readable storage medium, in which a computer program is stored, the computer program including program instructions, which, when executed by a processor, cause the processor to execute the method of the first aspect or the second aspect.
In an eighth aspect, embodiments of the present application further provide a computer program, where the computer program includes computer software instructions, and the computer software instructions, when executed by a computer, cause the computer to perform the method of the first aspect or the second aspect.
Drawings
Fig. 1a is a schematic network architecture diagram of a vehicle navigation system according to an embodiment of the present disclosure;
fig. 1b is a functional block diagram of an automatic driving device 100 according to an embodiment of the present disclosure;
FIG. 1c is a schematic structural diagram of an automatic driving system according to an embodiment of the present disclosure;
fig. 2 is a schematic flow chart of a path planning method according to an embodiment of the present application;
fig. 3a is a schematic diagram of a planned path according to an embodiment of the present application;
fig. 3b is a schematic diagram of a road-level planned driving path according to an embodiment of the present disclosure;
fig. 3c is a schematic diagram of lane driving line information provided in an embodiment of the present application;
FIG. 3d is a schematic diagram of another lane driving line information provided by the embodiment of the present application;
fig. 3e is a schematic diagram of a road-level planned driving path according to an embodiment of the present disclosure;
fig. 3f is a schematic diagram of a matched intersection in a road-level planned driving path according to an embodiment of the present application;
fig. 3g is a schematic diagram of lane driving line information according to an embodiment of the present application;
fig. 3h is a schematic diagram of lane driving line information according to an embodiment of the present disclosure;
fig. 3i is a schematic diagram of lane driving line information provided in an embodiment of the present application;
fig. 3j is a schematic diagram of a road-level planned driving path according to an embodiment of the present disclosure;
fig. 4a is a schematic flowchart of another path planning method provided in the embodiment of the present application;
fig. 4b is a schematic diagram of two road-level planned driving paths according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a vehicle navigation device according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of another vehicle navigation device provided in the embodiment of the present application;
fig. 7 is a schematic structural diagram of a cloud server according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments.
The terms "first" and "second" and the like in the description and drawings of the present application are used for distinguishing different objects or for distinguishing different processes for the same object, and are not used for describing a specific order of the objects. Furthermore, the terms "including" and "having," and any variations thereof, as referred to in the description of the present application, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements but may alternatively include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. It should be noted that in the embodiments of the present application, words such as "exemplary" or "for example" are used to mean serving as examples, illustrations or descriptions. Any embodiment or design method described herein as "exemplary" or "e.g.," should not be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion. In the examples of the present application, "A and/or B" means both A and B, and A or B. "A, and/or B, and/or C" means A, B, C, or A, B, C, or A and B and C.
In order to facilitate understanding of the technical solutions described in the present application, some terms in the present application are explained below:
(1) Autonomous vehicle (self-driving automobile)
In the embodiment of the application, an autonomous vehicle, also called an unmanned vehicle, a computer-driven vehicle or a wheeled mobile robot, is an intelligent vehicle that realizes unmanned driving through a computer system. In practical applications, autonomous vehicles rely on the cooperative use of artificial intelligence, visual computing, radar, monitoring devices and global positioning systems to allow computer devices to operate motor vehicles automatically and safely without any active human operation.
(2) Map layer information
In the embodiment of the present application, a layer may be regarded as a specific map. Specifically, the layer information may be two-dimensional information or three-dimensional information. In the embodiment of the present application, two-dimensional information is also called vector information, where a vector is a quantity having both magnitude and direction. Illustratively, the two-dimensional information may be the coordinate information of an obstacle in a road. In the embodiment of the present application, three-dimensional information includes, in addition to the two-dimensional information, some abstract information used to reflect the characteristics of an object. For example, the three-dimensional information may be the coordinate information of an obstacle in the road and the size of the obstacle.
(3) Road, lane and intersection
In the embodiment of the present application, a road refers to a passage on which vehicles travel and which connects two places.
In the embodiment of the present application, a lane is a passage for a single column of vehicles traveling in the same direction; common lanes include different types such as straight lanes, left-turn lanes and right-turn lanes. One road includes one or more lanes. For example, one road may include four lanes: 1 left-turn lane, 2 straight lanes and 1 right-turn lane.
In the embodiment of the application, an intersection is the junction of two or more roads and is also a place where vehicle and pedestrian traffic necessarily gathers, turns and disperses. Generally, intersections may be classified according to the number of intersecting roads, for example into three-way, four-way and multi-way intersections. Intersections may be at-grade or grade-separated. For example, the main forms of at-grade intersections are cross-shaped intersections, X-shaped intersections, T-shaped intersections, Y-shaped intersections, staggered intersections, multi-road intersections, and the like.
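As a concrete illustration of these definitions, the following minimal data model (with assumed names) expresses the example above of one road with a left-turn lane, two straight lanes and a right-turn lane.

```python
# Minimal illustrative model of roads, lanes and intersections (assumed names).
from dataclasses import dataclass, field
from enum import Enum
from typing import List


class LaneType(Enum):
    LEFT_TURN = "left_turn"
    STRAIGHT = "straight"
    RIGHT_TURN = "right_turn"


@dataclass
class Lane:
    lane_id: str
    lane_type: LaneType


@dataclass
class Road:
    road_id: str
    lanes: List[Lane] = field(default_factory=list)


@dataclass
class Intersection:
    intersection_id: str
    connected_roads: List[str] = field(default_factory=list)  # three or more branches, etc.


# Example from the text: one road with 1 left-turn, 2 straight and 1 right-turn lane.
road = Road("roadA", [
    Lane("roadA_l1", LaneType.LEFT_TURN),
    Lane("roadA_l2", LaneType.STRAIGHT),
    Lane("roadA_l3", LaneType.STRAIGHT),
    Lane("roadA_l4", LaneType.RIGHT_TURN),
])
```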
(4) Road-level map
In an embodiment of the present application, the road-level map may include road-level static layer information. In particular, the road-level static layer information is used to indicate the static distribution condition of the roads in the road network environment. For example, the road-level static layer information may include road geometry, road curvature, road heading, road speed limit, number of lanes, longitudinal slope, and lateral slope information.
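For illustration only, the road-level static layer information listed above could be held in a container such as the following; the field names and units are assumptions, not the layer format actually used.

```python
# Hypothetical container for road-level static layer information (field names assumed).
from dataclasses import dataclass


@dataclass
class RoadStaticLayer:
    road_id: str
    geometry_wkt: str          # road geometry as a polyline
    curvature: float           # 1/m
    heading_deg: float         # road heading
    speed_limit_kmh: float
    lane_count: int
    longitudinal_slope: float  # grade along the road, e.g. 0.03 = 3 %
    lateral_slope: float       # cross slope


layer = RoadStaticLayer("roadA", "LINESTRING(114.05 22.54, 114.06 22.55)",
                        0.001, 90.0, 60.0, 4, 0.02, 0.01)
```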
In this embodiment, the second cloud server is a server providing a road-level navigation service.
In this embodiment, the first cloud server is a server providing lane-level navigation service.
It should be noted that the second cloud server and the first cloud server may be network side devices. Taking the first cloud server as an example, the first cloud server may be one server, or a server cluster composed of a plurality of servers, or a cloud computing service center. In an embodiment of the present application, the first cloud server maintains a simplified lane-level map (e.g., a lane-level driving route database).
The path planning method provided in the embodiments of the application may be applied to a device with an automatic driving control function (for example, a first cloud server), or to a vehicle with an automatic driving function, as specifically described below:
In one implementation, at least two cloud servers are used to implement the path planning method provided in the embodiments of the application. The at least two cloud servers obtain a lane-level planned driving path according to the travel information reported by the vehicle, in combination with the maps they respectively store, and send the lane-level planned driving path to the first vehicle-mounted terminal on the vehicle, so that the first vehicle-mounted terminal controls the vehicle to drive according to the lane-level planned driving path. Specifically, the second cloud server can obtain a road-level planned driving path according to the travel information reported by the vehicle in combination with a road-level map, where the road-level planned driving path includes a plurality of road track points and each road track point includes its longitude and latitude information; then, the first cloud server matches the longitude and latitude information corresponding to each of the plurality of road track points in a lane-level driving route database (a simplified lane-level map) to obtain a first drivable route set, so that the lane-level planned driving path can be planned based on the first drivable route set. In this implementation, the decoupling of road information and lane information is achieved. After the vehicle receives the lane-level planned driving path, a navigation service can be provided according to it, and both the flexibility and the safety of the autonomous vehicle can be taken into account. It can be appreciated that the autonomous vehicle does not need to rely on a high-precision map for autonomous driving, which can improve the flexibility of determining the lane-level planned driving path.
It should be noted that the road level map may refer to a conventional navigation map.
Fig. 1a is a schematic network architecture diagram of a vehicle navigation system according to an embodiment of the present disclosure. As shown in fig. 1a, the vehicle navigation system architecture includes a vehicle 10, a second cloud server 20, and a first cloud server 30. In practical applications, the second cloud server 20 and the first cloud server 30 may establish communication connections with the plurality of vehicles 10 through a wired network or a wireless network; meanwhile, the second cloud server 20 and the first cloud server 30 may also perform information interaction through a wired network or a wireless network. The wireless network may be any wireless network based on a communication technology standard, such as a Long Term Evolution (LTE) wireless network.
In the embodiment of the present application, as shown in fig. 1a, the vehicle 10 includes an automatic driving apparatus 100.
In the embodiment of the present application, the second cloud server 20 stores a road-level map; for example, the road-level map includes road-level static layer information. The second cloud server 20 may run a stored program related to controlling the automatic driving of a vehicle and control the vehicle 10 through the multidimensional data included in the road-level static layer information (for example, a road-level planned driving path indicates how the vehicle is to be driven). For example, the programs related to controlling the automatic driving of the automobile may be: a program for managing the interaction between an automatically driven automobile and obstacles on the road, a program for controlling the route or speed of an automatically driven automobile, and a program for controlling the interaction between an automatically driven automobile and other automatically driven automobiles on the road.
In the embodiment of the present application, the first cloud server 30 stores a lane-level driving route database, specifically, the lane-level driving route database may be generated by the first cloud server using V2X trajectory data, and may be considered as a simplified lane-level map. The first cloud server 30 runs the stored program related to controlling the automatic driving of the vehicle, and controls the vehicle 10 according to the lane-level planned driving path determined by the lane-level driving route database (for example, the lane-level planned driving path indicates how the vehicle is driven). For example, a program associated with controlling the automatic driving of a vehicle may be a program that manages the interaction of an automatically driven vehicle with obstacles on the road, a program that controls the route or speed of an automatically driven vehicle, and a program that controls the interaction of an automatically driven vehicle with other automatically driven vehicles on the road.
Fig. 1b is a functional block diagram of the automatic driving device 100 according to the embodiment of the present application. In some embodiments, the autopilot device 100 may be configured in a fully automatic driving mode, a partially automatic driving mode, or a manual driving mode. Taking the driving-automation levels proposed by the Society of Automotive Engineers (SAE) as an example, the fully automatic driving mode may be L5, which means that all driving operations are completed by the vehicle and the human driver does not need to keep attention; the partially automatic driving mode may be L1, L2, L3 or L4, where L1 means that the vehicle handles one of steering and acceleration/deceleration and the human driver is responsible for the remaining driving operations; L2 means that the vehicle handles both steering and acceleration/deceleration and the human driver is responsible for the remaining driving actions; L3 means that most driving operations are completed by the vehicle and the human driver needs to stay attentive in case intervention is occasionally needed; L4 means that all driving operations are completed by the vehicle and the human driver does not need to keep attention, but the road and environmental conditions are limited; the manual driving mode may be L0, which means that the automobile is driven entirely by the human driver.
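The level descriptions above can be summarized in a simple lookup table, shown below purely for reference; the wording is a paraphrase of this paragraph, not the official SAE definitions.

```python
# Summary of the SAE driving-automation levels as described above (illustrative only).
SAE_LEVELS = {
    "L0": "manual driving: the human driver has full authority",
    "L1": "vehicle handles either steering or acceleration/deceleration",
    "L2": "vehicle handles both steering and acceleration/deceleration",
    "L3": "vehicle does most driving; the human driver must stay attentive",
    "L4": "vehicle does all driving within defined road and environmental conditions",
    "L5": "vehicle does all driving; no human attention required",
}

FULL_AUTOMATION = {"L5"}
PARTIAL_AUTOMATION = {"L1", "L2", "L3", "L4"}
```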
In practical applications, the autonomous driving apparatus 100 may control itself while in the autonomous driving mode, and may determine the current state of the vehicle and the surrounding environment through human operation, determine the possible behavior of at least one other vehicle in the surrounding environment, determine a confidence level corresponding to the likelihood that the other vehicle performs the possible behavior, and control the autonomous driving apparatus 100 based on the determined information. When the autonomous driving apparatus 100 is in the fully autonomous driving mode, it may be placed into operation without human interaction.
In general, the autopilot device 100 may include various subsystems such as a travel system 102, a sensing system 104, a control system 106, one or more peripherals 108, as well as a power source 110, a computer system 112, and a user interface 116. In some implementations, the autopilot device 100 may include more or fewer subsystems and each subsystem may include multiple elements. In addition, each of the subsystems and elements of the autopilot device 100 may be interconnected by wire or wirelessly.
In an embodiment of the present application, the travel system 102 may include components that provide powered motion to the autopilot device 100. In some implementations, the travel system 102 may include an engine 118, an energy source 119, a transmission 120, and wheels/tires 121. The engine 118 may be an internal combustion engine, an electric motor, an air compression engine, or other types of engine combinations, such as a hybrid engine of a gasoline engine and an electric motor, a hybrid engine of an internal combustion engine and an air compression engine. In practice, the engine 118 converts the energy source 119 into mechanical energy.
In the present embodiment, the energy source 119 may include, but is not limited to: gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, or other sources of electrical power. The energy source 119 may also provide energy to other systems of the autopilot device 100.
In the subject embodiment, the transmission 120 may transmit mechanical power from the engine 118 to the wheels 121. The transmission 120 may include a gearbox, a differential, and a drive shaft. In some implementations, the transmission 120 may also include other devices, such as a clutch. Wherein the drive shaft comprises one or more shafts that may be coupled to one or more wheels 121.
In an embodiment of the present application, the sensing system 104 may include several sensors that sense information about the environment surrounding the autopilot device 100. For example, the sensing system 104 may include a global positioning system 122 (here, the global positioning system may be a GPS system, a BeiDou system, or another positioning system), an Inertial Measurement Unit (IMU) 124, a radar 126, a laser range finder 128, and a camera 130. The sensing system 104 may also include sensors that monitor systems internal to the autopilot device 100, such as an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, and the like. Sensor data from one or more of these sensors may be used to detect an object and its corresponding characteristics (e.g., position, shape, orientation, velocity, etc.). Such detection and identification is a key function for the safe operation of the autonomous driving apparatus 100.
In the present embodiment, the global positioning system 122 may be used to estimate the geographic location of the autonomous device 100; the IMU 124 is used to sense position and orientation changes of the autopilot device 100 based on inertial acceleration. In some implementations, the IMU 124 may be a combination of an accelerometer and a gyroscope.
In an embodiment of the present application, the radar 126 may utilize radio signals to sense objects within the surrounding environment of the autopilot device 100. In some implementations, in addition to sensing objects, the radar 126 may also be used to sense the speed and/or heading of an object.
In the present embodiment, the laser rangefinder 128 may utilize a laser to sense objects in the environment in which the autopilot device 100 is located. In some implementations, the laser rangefinder 128 may include one or more laser sources, laser scanners, and one or more monitors, among other system components.
In an embodiment of the present application, the camera 130 may be used to capture multiple images of the surrounding environment of the autonomous device 100. In some implementations, the camera 130 may be a still camera or a video camera, and the embodiments of the present application are not particularly limited.
In the subject embodiment, the control system 106 may control the operation of the autopilot device 100 and components. Control system 106 may include various elements including a steering system 132, a throttle 134, a braking unit 136, a computer vision system 140, a route control system 142, and an obstacle avoidance system 144.
In the present embodiment, the steering system 132 is operable to adjust the heading of the autopilot device 100. For example, in one embodiment, the steering system 132 may be a steering wheel system.
In the subject embodiment, the throttle 134 is used to control the speed of operation of the engine 118 and, in turn, the speed of the autopilot device 100.
In the present embodiment, the brake unit 136 is used to control the speed of the autopilot device 100. The brake unit 136 may use friction to slow the wheel 121. In some implementations, the brake unit 136 may convert the kinetic energy of the wheel 121 into an electrical current. The brake unit 136 may take other forms to slow the rotational speed of the wheels 121 to control the speed of the autopilot device 100.
In an embodiment of the present application, the computer vision system 140 may be operable to process and analyze images captured by the camera 130 to identify objects and/or features in the environment surrounding the autonomous device 100. In some implementations, the objects and/or features mentioned here may include, but are not limited to: traffic signals, road boundaries and obstacles. The computer vision system 140 may use object recognition algorithms, Structure from Motion (SFM) algorithms, visual tracking, and other computer vision techniques. In some implementations, the computer vision system 140 can be used to map the environment, track objects, estimate the speed of objects, and so forth.
In the present embodiment, the route control system 142 is used to determine the travel route of the automatic driving device 100. In some implementations, the route control system 142 may combine data from sensors, the global positioning system 122, and one or more predetermined maps to determine a travel route for the autopilot device 100.
In the present embodiment, obstacle avoidance system 144 is used to identify, evaluate, and avoid or otherwise negotiate potential obstacles in the environment of autonomous driving apparatus 100.
It will be appreciated that in some implementations, the control system 106 may additionally or alternatively include components other than those shown and described in fig. 1b, or some of the above-described components may be omitted, which is not particularly limited in the embodiments of the present application.
In the present embodiment, the autopilot device 100 interacts with external sensors, other vehicles, other computer systems, or users through peripherals 108. The peripheral devices 108 may include a wireless communication system 146, an in-vehicle computer 148, a microphone 150, and/or speakers 152.
In some implementations, the peripheral device 108 provides a means for a user of the autopilot device 100 to interact with the user interface 116. For example, the onboard computer 148 may provide information to a user of the autopilot device 100. The user interface 116 may also operate the in-vehicle computer 148 to receive user input. The onboard computer 148 may be operated via a touch screen. In other cases, the peripheral device 108 may provide a means for the autopilot device 100 to communicate with other devices within the vehicle. For example, the microphone 150 may receive audio, such as voice commands or other audio input, from a user of the autopilot device 100. Similarly, the speaker 152 may output audio to a user of the autopilot device 100.
In the present embodiment, the wireless communication system 146 may wirelessly communicate with one or more devices directly or via a communication network. For example, the wireless communication system 146 may use 3G cellular communication such as CDMA, EVDO or GSM/GPRS, 4G cellular communication such as LTE, or 5G cellular communication. In some implementations, the wireless communication system 146 may communicate with a Wireless Local Area Network (WLAN) using WiFi. In some implementations, the wireless communication system 146 can communicate directly with a device using an infrared link, Bluetooth, or ZigBee. Other wireless protocols may also be used; for example, for various vehicle communication systems, the wireless communication system 146 may include one or more Dedicated Short-Range Communications (DSRC) devices.
In an embodiment of the present application, the power supply 110 may provide power to various components of the autopilot device 100. In some implementations, the power supply 110 can be a rechargeable lithium-ion or lead-acid battery. One or more battery packs of such batteries may be configured as a power source to provide power to the various components of the autopilot device 100. In some implementations, the power supply 110 and the energy source 119 may be implemented together, for example configured together as in some all-electric vehicles.
In the present embodiment, some or all of the functionality of the autopilot device 100 is controlled by the computer system 112. The computer system 112 may include at least one processor 113, and the processor 113 executes instructions 115 stored in a non-transitory computer-readable storage medium, such as the data storage device 114. The computer system 112 may also be a plurality of computing devices that control individual components or subsystems of the autopilot device 100 in a distributed manner.
In some implementations, the processor 113 may be any conventional processor, such as a commercially available Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, and so forth. Although FIG. 1b functionally shows the processor, memory and other elements within the same physical housing, those skilled in the art will appreciate that the processor, computer system or memory may actually comprise multiple processors, computer systems or memories that are not stored within the same physical housing. For example, the memory may be a hard drive or other storage medium located in a different physical enclosure. Thus, references to a processor or computer system are to be understood as including references to a collection of processors, computer systems or memories that may or may not operate in parallel. Rather than using a single processor to perform the steps described herein, some components, such as the steering component and the deceleration component, may each have their own processor that performs only computations related to the function of the particular component.
In various aspects described herein, the processor 113 may be located remotely from the vehicle and in wireless communication with the vehicle. In other aspects, some of the processes described herein are executed on a processor disposed within the vehicle and others are executed by a remote processor, including taking the steps necessary to perform a single operation.
In some implementations, the data storage device 114 may include instructions 115 (e.g., program logic), the instructions 115 being executable by the processor 113 to perform various functions of the autopilot device 100, including those described above. The data storage device 114 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of the travel system 102, the sensing system 104, the control system 106, and the peripheral devices 108.
In addition to the instructions 115, the data storage device 114 may also store data such as road maps (also known as traditional navigation maps), route messages, the location, direction, speed, and other vehicle data of the vehicle, as well as other information. The above information may be used by the autopilot device 100 and the computer system 112 during operation of the autopilot device 100 in autonomous, semi-autonomous, and/or manual modes.
For example, in the subject embodiment, the data storage device 114 obtains environmental information of the vehicle from the sensors 104 or other components of the autopilot device 100. The environment information may be, for example, lane line information, the number of lanes, road boundary information, road travel parameters, traffic signals, green belt information, and whether there is a pedestrian, a vehicle, etc., in the environment where the vehicle is currently located. The data storage device 114 may also store status information of the vehicle itself, as well as status information of other vehicles with which the vehicle has interaction. The status information may include, but is not limited to: speed, acceleration, heading angle, etc. of the vehicle. For example, the vehicle obtains the distance between the other vehicle and itself, the speed of the other vehicle, and the like based on the speed measurement and distance measurement functions of the radar 126. In this case, then, the processor 113 may retrieve the vehicle data from the data storage device 114 and determine a driving strategy that satisfies the safety requirements based on the environmental information in which the vehicle is located.
In an embodiment of the present application, a user interface 116 is used to provide information to and receive information from a user of the autopilot device 100. In some implementations, the user interface 116 may include one or more input/output devices within the set of peripheral devices 108, such as one or more of the wireless communication system 146, the in-vehicle computer 148, the microphone 150, and the speaker 152.
In the present embodiment, the computer system 112 may control the functions of the autopilot device 100 based on inputs received from various subsystems (e.g., the travel system 102, the sensing system 104, and the control system 106) and from the user interface 116. For example, the computer system 112 may utilize input from the control system 106 to control the steering system 132 to avoid obstacles detected by the sensing system 104 and the obstacle avoidance system 144. In some implementations, the computer system 112 is operable to provide control over many aspects of the autopilot device 100 and its subsystems.
In some implementations, one or more of the above-described components may be mounted or associated separately from the autopilot device 100. For example, the data storage device 114 may be partially or completely separate from the autopilot device 100. The above components may be communicatively coupled together in a wired and/or wireless manner.
In some implementations, the above-described components are but one example. In practical applications, components in the above modules may be added or deleted according to practical needs, and fig. 1b should not be construed as limiting the embodiments of the present application.
An autonomous vehicle traveling on a roadway, such as autonomous device 100, may recognize objects within its surrounding environment to determine whether to adjust the speed at which autonomous device 100 is currently traveling. Here, the object may be another vehicle, a traffic control device, or another type of object. In some implementations, each identified object may be considered independently and the speed at which the autonomous vehicle is to adjust determined based on the respective characteristics of the object, e.g., its current driving data, acceleration, vehicle separation, etc.
In some implementations, the autonomous driving apparatus 100 or a computer device associated with the autonomous driving apparatus 100 (e.g., the computer system 112, the computer vision system 140, the data storage 114 as shown in fig. 1 b) may predict behavior of the identified object based on characteristics of the identified object and a state of the surrounding environment (e.g., traffic, rain, ice on the road, etc.). In some implementations, each identified object depends on the behavior of each other, and thus, the behavior of a single identified object may also be predicted by considering all identified objects together. The autonomous driving apparatus 100 is able to adjust its speed based on the predicted behavior of the recognized object. In other words, the autonomous driving apparatus 100 can determine what steady state the vehicle will need to be adjusted to based on the predicted behavior of the object (e.g., the adjustment operation may include acceleration, deceleration, or stopping). In this process, other factors may also be considered to determine the speed of the autonomous device 100, such as the lateral position of the autonomous device 100 in the road being traveled, the curvature of the road, the proximity of static and dynamic objects, and the like.
In addition to providing instructions to adjust the speed of the autonomous vehicle, the computer device may also provide instructions to modify the steering angle of the vehicle 100 to cause the autonomous vehicle to follow a given trajectory and/or to maintain a safe lateral and longitudinal distance from objects in the vicinity of the autonomous vehicle (e.g., vehicles in adjacent lanes on a road).
In the embodiment of the present application, the automatic driving device 100 may be a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a lawn mower, an amusement ride, a playground vehicle, construction equipment, a trolley, a golf cart, a train, a cart, or the like, and the embodiment of the present application is not particularly limited.
In some implementations, the autopilot device 100 can also include hardware structures and/or software modules that implement the functionality described above in the context of a hardware structure, a software module, or a combination of hardware structures and software modules. Whether any of the above-described functions is performed by hardware structures, software modules, or both, depends upon the particular application for which the solution is used and may involve constraints.
Fig. 1b presents a functional block diagram of the autopilot device 100, and the automatic driving system 101 in the autopilot device 100 is presented below. Fig. 1c is a schematic structural diagram of an automatic driving system according to an embodiment of the present application. Fig. 1b and 1c illustrate the autopilot device 100 from different perspectives; for example, the computer system 101 of Fig. 1c is the computer system 112 of Fig. 1b. As shown in FIG. 1c, the computer system 101 comprises a processor 103, where the processor 103 is coupled to a system bus 105. In practical applications, the processor 103 may be one or more processors, each of which may include one or more processor cores. A display adapter (video adapter) 107 may drive a display 109, and the display 109 is coupled to the system bus 105. The system bus 105 is coupled to an input/output (I/O) bus 113 through a bus bridge 111. The I/O interface 115 is coupled to the I/O bus. The I/O interface 115 communicates with various I/O devices, such as an input device 117 (e.g., keyboard, mouse, touch screen, etc.), a multimedia tray 121 (e.g., CD-ROM, multimedia interface, etc.), a transceiver 123 (which can send and/or receive radio communication signals), a camera 155 (which can capture static and moving digital video images), and an external USB interface 125. Optionally, the interface connected to the I/O interface 115 may be a USB interface.
The processor 103 may be any conventional processor, including a reduced instruction set computing ("RISC") processor, a complex instruction set computing ("CISC") processor, or a combination of the above. Alternatively, the processor may be a dedicated device such as an application specific integrated circuit ("ASIC"). Alternatively, the processor 103 may be a neural network processor or a combination of a neural network processor and a conventional processor as described above.
Optionally, in various embodiments described herein, the computer system 101 may be located remotely from the autonomous vehicle and may communicate wirelessly with the autonomous vehicle 100. In other aspects, some processes described herein are performed on a processor disposed within an autonomous vehicle, others being performed by a remote processor, including taking the actions required to perform a single maneuver.
Computer system 101 may communicate with software deploying server 149 via network interface 129. The network interface 129 is a hardware network interface, such as a network card. The network 127 may be an external network, such as the internet, or an internal network, such as an ethernet or a Virtual Private Network (VPN). Alternatively, the network 127 may be a wireless network, such as a WiFi network, a cellular network, or the like.
The hard drive interface is coupled to the system bus 105, and the hard drive interface is connected to the hard disk drive. The system memory 135 is coupled to the system bus 105. Data running in the system memory 135 may include the operating system 137 and the application programs 143 of the computer 101.
The operating system includes a shell 139 and a kernel 141. The shell 139 is an interface between the user and the kernel of the operating system. The shell is the outermost layer of the operating system. The shell manages the interaction between the user and the operating system: it waits for user input, interprets the user input to the operating system, and processes the output results of the operating system.
The kernel 141 consists of those portions of the operating system that manage memory, files, peripherals, and system resources. Interacting directly with the hardware, the operating system kernel typically runs processes and provides inter-process communication, CPU time slice management, interrupt handling, memory management, I/O management, and the like.
The application programs 143 include programs related to controlling the automatic driving of a vehicle, such as programs for managing the interaction of an automatically driven vehicle with obstacles on the road, programs for controlling the route or speed of an automatically driven vehicle, and programs for controlling the interaction of an automatically driven vehicle with other automatically driven vehicles on the road. The application programs 143 also reside on the system of the deploying server 149. In one embodiment, the computer system 101 may download the application program 143 from the deploying server 149 when the application program 143 needs to be executed.
As another example, the application program 143 may be an application that controls the vehicle to calculate a driving strategy based on sensor data acquired in real time. The sensor data acquired in real time can include environmental information, state information of the vehicle itself, and state information of objects with which the vehicle may interact. Specifically, the environmental information is information about the environment where the vehicle is currently located (e.g., green belt distribution, lanes, traffic lights, etc.), and the state information may include, but is not limited to, the speed, acceleration, and heading angle of the vehicle. For example, the vehicle obtains the distance between another vehicle and itself, the speed of the other vehicle, and the like based on the speed measurement and distance measurement functions of the radar 126. The processor 103 of the computer system 101 may invoke the application program 143 to obtain a driving strategy.
In some implementations, since the first cloud server stores the lane-level travel route database, the first cloud server may match the first travelable route set in the lane-level travel route database according to the longitude and latitude information corresponding to each of the N road track points, and in this case, the application program 141 may be an application program that controls the vehicle to drive according to the lane-level planned travel route acquired in the first travelable route set.
Sensor 153 is associated with computer system 101. The sensor 153 is used to detect the environment surrounding the computer 101. For example, the sensor 153 may detect an animal, a car, an obstacle, a crosswalk, and the like, and further, the sensor may detect an environment around the animal, the car, the obstacle, the crosswalk, and the like, such as: the environment surrounding the animal, e.g., other animals present around the animal, weather conditions, brightness of the surrounding environment, etc. Alternatively, if the computer 101 is located on an autonomous automobile, the sensor may be a camera, infrared sensor, chemical detector, microphone, or the like. Sensor 153, when activated, senses information at preset intervals and provides the sensed information to computer system 101 in real time.
Alternatively, in various embodiments described herein, the computer system 101 may be located remotely from the autopilot device 100 and may communicate wirelessly with the autopilot device 100. Transceiver 123 may transmit the autopilot task, sensor data collected by sensors 153, and other data to computer system 101; control instructions sent by computer system 101 may also be received. The autopilot may execute control commands received by transceiver 123 from computer system 101 and perform corresponding driving operations. In other aspects, some of the processes described herein are provided for execution on a processor within an autonomous vehicle, others being executed by a remote processor, including taking the actions required to perform a single operation.
Based on the system architecture shown in fig. 1a, and with reference to the flowchart of the path planning method provided in the embodiment of the present application shown in fig. 2, the following specifically illustrates how the first cloud server provides a navigation service for a vehicle in the embodiment of the present application. The method may include, but is not limited to, the following steps:
step S200, a first cloud server receives a road-level planned driving path from a first vehicle-mounted terminal, wherein the road-level planned driving path comprises N road track points, and each road track point comprises longitude and latitude information of the road track point; n is an integer greater than 1.
In the embodiment of the present application, a sequence of points or a curve connecting a start position and an end position is referred to as a path, and a strategy for constructing the path is referred to as path planning. Generally, path planning can be divided into planning a road-level planned driving path and planning a lane-level planned driving path. The road-level planned driving path is a path that enables a vehicle to drive on a specified road. The lane-level planned driving path is a path accurate to the sub-meter level that enables a vehicle to drive in a specified lane. For example, as shown in fig. 3a, the road-level planned driving path is a curve from a start position A to an end position B.
In this embodiment of the application, the first vehicle-mounted terminal on the vehicle may send a navigation request to the second cloud server through a wireless network, where the navigation request includes at least the travel information of the vehicle; for example, the travel information may include the start position and the end position of the vehicle. Correspondingly, the second cloud server receives the navigation request sent by the first vehicle-mounted terminal on the vehicle, and then the second cloud server may determine a road-level planned driving path according to the start position and the end position in combination with a road-level map. In particular, the road-level map may be a conventional navigation map.
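The embodiment does not specify which algorithm the second cloud server runs on the road-level map. As a minimal sketch, assuming the road-level map can be treated as a weighted road graph, a standard shortest-path search such as Dijkstra's algorithm could yield the road-level planned driving path; the function name and data layout below are illustrative only.

import heapq

def road_level_plan(road_graph, start, end):
    """Dijkstra over a road-level graph {node: [(neighbor, length_m), ...]}.
    Returns the list of road nodes from start to end, or None if unreachable."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == end:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nbr, length in road_graph.get(node, []):
            nd = d + length
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    if end not in dist:
        return None
    path, node = [end], end
    while node != start:
        node = prev[node]
        path.append(node)
    return list(reversed(path))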
It will be appreciated that each road track point has corresponding identification information (for example, the identification information is a track point number).
In some possible implementations, the navigation request may also include identification information, where different identification information is used to distinguish different vehicles. For example, the identification information may be a device identifier of the vehicle-mounted terminal, a user account used for logging in to the vehicle-mounted terminal, a unique identifier of the vehicle, or another preset identifier. In some possible implementations, the navigation request may also include heading information. The heading information is used to indicate the current heading of the vehicle, namely the direction in which the vehicle head faces. Specifically, the heading information may be acquired by an on-board sensor. In some possible implementations, the navigation request may also include path constraint information. The path constraint information refers to constraint conditions imposed on the path planning. For example, a path constraint condition may instruct the second cloud server to plan a driving path from the start position to the end position that must pass through a target position specified by the user. For another example, the path constraint information may instruct the second cloud server to plan the driving path with the shortest distance.
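For illustration, the navigation request described above might be modeled as a small data structure carrying the mandatory travel information and the optional identification, heading, and path constraint fields; the field names in this sketch are hypothetical and are not prescribed by the embodiment.

from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class NavigationRequest:
    # Mandatory travel information
    start_position: Tuple[float, float]   # (longitude, latitude)
    end_position: Tuple[float, float]     # (longitude, latitude)
    # Optional fields described in the embodiment
    identification: Optional[str] = None  # device id / user account / vehicle id
    heading_deg: Optional[float] = None   # direction the vehicle head faces
    constraints: List[str] = field(default_factory=list)  # e.g. "via:<lng,lat>", "shortest_distance"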
In this embodiment of the application, the road-level planned driving path includes a plurality of road track points. For example, as shown in fig. 3b, the road-level planned driving path includes road track point 1, road track point 2, road track point 3, and road track point 4, where the longitude and latitude information corresponding to road track point 1 is (LngA, LatA), the longitude and latitude information corresponding to road track point 2 is (LngB, LatB), the longitude and latitude information corresponding to road track point 3 is (LngC, LatC), and the longitude and latitude information corresponding to road track point 4 is (LngD, LatD).
Step S202, the first cloud server matches a first drivable route set in the lane-level driving route database according to the longitude and latitude information corresponding to each of the N road track points.
In this application, the lane-level travel route database may be generated by the first cloud server using V2X (vehicle to everything) trajectory data, and the lane-level travel route database may be used as a supplement to the lane-level information of a conventional map (e.g., the road-level map of this application). In practical applications, collision risks from vehicles in front of and behind in the current lane and from vehicles in the blind spots of adjacent lanes can be identified based on the lane-level travel route database, and a lane-level navigation service can be provided for users.
In this embodiment of the present application, the implementation process of matching the first drivable route set in the lane-level driving route database according to the longitude and latitude information corresponding to each of the N road track points may include: first, matching the M intersections existing among the N road track points in the lane-level driving route database according to the longitude and latitude information corresponding to each of the N road track points, where M is an integer less than N; then, sequentially acquiring the travelable routes in each of the M intersections and the travelable route corresponding to each road track point to obtain the first travelable route set.
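The two-step matching just described can be sketched as follows; the lane_db interface (match_intersection, routes_in_intersection, routes_near_point) is a hypothetical abstraction of the lane-level driving route database and is not defined in the original disclosure.

def match_first_travelable_route_set(track_points, lane_db):
    """Illustrative two-step matching.

    track_points: list of (lng, lat) for the N road track points.
    lane_db: hypothetical object exposing
        lane_db.match_intersection(lng, lat)      -> intersection id or None
        lane_db.routes_in_intersection(inter_id)  -> iterable of lane driving lines
        lane_db.routes_near_point(lng, lat)       -> iterable of lane driving lines
    """
    # Step 1: find the M (< N) intersections the track points fall into.
    intersections = []
    for lng, lat in track_points:
        inter = lane_db.match_intersection(lng, lat)
        if inter is not None and inter not in intersections:
            intersections.append(inter)

    # Step 2: collect, in order, the travelable routes inside each intersection
    # and the travelable routes matched to each road track point.
    first_route_set = []
    for lng, lat in track_points:
        inter = lane_db.match_intersection(lng, lat)
        if inter is not None:
            first_route_set.extend(lane_db.routes_in_intersection(inter))
        else:
            first_route_set.extend(lane_db.routes_near_point(lng, lat))
    return intersections, first_route_set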
In the present embodiment, the lane-level travel route database may include one or more of the following attributes: lane driving line information, lane driving line type information, and lane driving line connectivity attribute information. The first cloud server can perform path planning according to these attributes.
Specifically, a lane driving line may refer to the center line between two lane lines, or may refer to a driving line deviating from the center line within a preset range, while a lane line refers to a boundary line of a lane. The lane driving line type may include a real lane driving line and a virtual lane driving line. For example, the lane driving lines may include virtual lane driving lines located within intersections and real lane driving lines located on the lanes. The following describes the lane driving line information according to the embodiment of the present application with reference to fig. 3c to 3j.
In the present embodiment, the lane-level travel route database may be regarded as a simplified lane-level map. Specifically, when the lane-level driving route database is constructed, the lane driving lines are broken at intersections, which serve as nodes; a road segment is formed between nodes, and a road segment includes the information of the lane driving lines within the segment as well as the information of the lane driving lines within the intersection. It should be noted that a lane driving line is formed by a plurality of nodes, and each node carries corresponding longitude and latitude information.
Taking the lane-level driving route database shown in fig. 3c as an example, a road segment is composed of one or several lane driving lines, and each lane driving line is composed of nodes; in fig. 3c the nodes are represented by black circles. A node is a connection point that establishes a connection relationship, and lane driving lines that connect to one another share a common node. Taking the first road segment as an example, the first road segment is composed of three lane driving lines, and each lane driving line is composed of several nodes; for example, as shown in fig. 3c, the lane driving line L1 includes node 1, node 2, node 3, node 4, and node 5. Illustratively, the distance between adjacent nodes may be less than 1 meter, or may be between 1 meter and 1.5 meters, which is not specifically limited in this application.
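For illustration, the node / lane driving line / road segment organization described above might be represented as follows; the class names and the example coordinates are assumptions of this sketch, not data from the disclosure.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    node_id: int
    lng: float
    lat: float

@dataclass
class LaneDrivingLine:
    line_id: str                  # e.g. "L1"
    line_type: str                # "real" or "virtual" (virtual inside intersections)
    nodes: List[Node]             # ordered; direction follows the node order
    next_line_ids: List[str] = field(default_factory=list)  # connectivity attribute

@dataclass
class RoadSegment:
    segment_id: str
    lane_driving_lines: List[LaneDrivingLine]

# Example: lane driving line L1 of the first road segment (coordinates made up,
# spaced roughly 1 meter apart in longitude).
l1 = LaneDrivingLine(
    line_id="L1",
    line_type="real",
    nodes=[Node(i, 114.05 + i * 1e-5, 22.55) for i in range(1, 6)],  # nodes 1..5
    next_line_ids=["L4"],
)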
As shown in fig. 3d, in the lane-level driving route database, the first cloud server divides the road-level planned driving path into four road segments along the driving direction of the vehicle, namely a first road segment, a second road segment, a third road segment, and a fourth road segment, where the third road segment is located within the intersection.
Specifically, as shown in fig. 3d, the first road segment includes 3 lane driving lines (for example, of the real lane driving line type): the lane driving line L1 is composed of node 1, node 2, node 3, node 4, and node 5; the lane driving line L2 is composed of node 6, node 7, node 8, node 9, and node 10; the lane driving line L3 is composed of node 11, node 12, node 13, node 14, and node 15. The second road segment includes 3 lane driving lines (for example, of the real lane driving line type): the lane driving line L4 is composed of node 5 and node 16; the lane driving line L5 is composed of node 10 and node 17; the lane driving line L6 is composed of node 15 and node 18. The third road segment includes a plurality of lane driving lines (for example, of the virtual lane driving line type), for example, L7, L8, and L9. The fourth road segment includes 3 lane driving lines (for example, of the real lane driving line type): the lane driving line L10 is composed of node 19, node 20, and node 21; the lane driving line L11 is composed of node 22, node 23, and node 24; the lane driving line L12 is composed of node 25, node 26, and node 27.
In practical applications, the lane driving line information corresponding to each of the first to fourth road segments may include the contents shown in Tables 1 to 4, namely the lane driving line number, the node numbers, and the number of the next connected lane driving line. It will be appreciated that the direction of a lane driving line may be determined by the orientation between its nodes.
For the first road segment, as shown in fig. 3d, the first road segment contains 3 lane driving lines. The lane driving line L1 includes node 1, node 2, node 3, node 4, and node 5; the lane driving line L2 includes node 6, node 7, node 8, node 9, and node 10; the lane driving line L3 includes node 11, node 12, node 13, node 14, and node 15. Further, the lane driving line L1 connects to the lane driving line L4 in the second road segment, the lane driving line L2 connects to the lane driving line L5 in the second road segment, and the lane driving line L3 connects to the lane driving line L6 in the second road segment. Specifically, the lane driving line information corresponding to the first road segment may be as shown in Table 1:
Lane driving line number    Node numbers                                   Next lane driving line number
L1                          node 1, node 2, node 3, node 4, node 5         L4
L2                          node 6, node 7, node 8, node 9, node 10        L5
L3                          node 11, node 12, node 13, node 14, node 15    L6
TABLE 1
For the second road segment, which is close to the intersection as shown in fig. 3d, the second road segment contains 3 lane driving lines. The lane driving line L4 includes node 5 and node 16; the lane driving line L5 includes node 10 and node 17; the lane driving line L6 includes node 15 and node 18. Further, the lane driving line L4 connects to the lane driving line L7 in the third road segment, the lane driving line L5 connects to the lane driving line L8 in the third road segment, and the lane driving line L6 connects to the lane driving line L9 in the third road segment. Specifically, the lane driving line information corresponding to the second road segment may be as shown in Table 2:
Lane driving line number    Node numbers        Next lane driving line number
L4                          node 5, node 16     L72
L5                          node 10, node 17    L71, L81, L82
L6                          node 15, node 18    L91, L92
TABLE 2
For the third road segment, as shown in fig. 3d, the third road segment is located within an intersection and includes a plurality of virtual lane driving lines within the intersection. For example, the plurality of virtual lane driving lines may include L81, L82, L91, and L92. Further, the lane driving line L81 connects to the lane driving line L11 in the fourth road segment, the lane driving line L82 connects to the lane driving line L12 in the fourth road segment, the lane driving line L91 connects to the lane driving line L10 in the fourth road segment, and the lane driving line L92 connects to the lane driving line L11 in the fourth road segment. Specifically, the lane driving line information corresponding to the third road segment may be as shown in Table 3:
Lane driving line number    Node numbers        Next lane driving line number
L81                         node 17, node 22    L11
L82                         node 17, node 25    L12
L91                         node 18, node 19    L10
L92                         node 18, node 22    L11
TABLE 3
For the fourth road segment, as shown in fig. 3d, the fourth road segment contains 3 lane driving lines. The lane driving line L10 includes node 19, node 20, and node 21; the lane driving line L11 includes node 22, node 23, and node 24; the lane driving line L12 includes node 25, node 26, and node 27. Further, the lane driving line L10 connects to the lane driving line L13 in the next road segment, the lane driving line L11 connects to the lane driving line L14 in the next road segment, and the lane driving line L12 connects to the lane driving line L15 in the next road segment. Specifically, the lane driving line information corresponding to the fourth road segment may be as shown in Table 4:
Lane driving line number    Node numbers                     Next lane driving line number
L10                         node 19, node 20, node 21        L13
L11                         node 22, node 23, node 24        L14
L12                         node 25, node 26, node 27        L15
TABLE 4
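The connectivity recorded in Tables 1 to 4 amounts to an adjacency mapping from each lane driving line to its possible successors. The sketch below loads that mapping (values copied from the tables above) and enumerates chains of connected lane driving lines; the helper function is illustrative and not part of the disclosure.

# Next-lane-driving-line adjacency assembled from Tables 1-4 (illustrative).
NEXT_LINES = {
    "L1": ["L4"], "L2": ["L5"], "L3": ["L6"],
    "L4": ["L72"], "L5": ["L71", "L81", "L82"], "L6": ["L91", "L92"],
    "L81": ["L11"], "L82": ["L12"], "L91": ["L10"], "L92": ["L11"],
    "L10": ["L13"], "L11": ["L14"], "L12": ["L15"],
}

def enumerate_chains(start_line, depth):
    """Return every chain of connected lane driving lines up to the given depth."""
    if depth == 0 or start_line not in NEXT_LINES:
        return [[start_line]]
    chains = []
    for nxt in NEXT_LINES[start_line]:
        for tail in enumerate_chains(nxt, depth - 1):
            chains.append([start_line] + tail)
    return chains

# e.g. enumerate_chains("L2", 3) yields chains such as L2 -> L5 -> L81 -> L11.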
For example, a certain road-level planned driving path includes 4 consecutive road track points, and the first cloud server performs matching in the lane-level driving route database according to the longitude and latitude information corresponding to each of the 4 consecutive road track points, so as to determine the intersections existing among the 4 road track points. For example, as shown in fig. 3e, the road-level planned driving path determined by the second cloud server includes four road track points P1, P2, P3, and P4, where P1 is the starting road track point, P4 is the destination road track point, and P2 and P3 are intermediate road track points. First, according to the intermediate road track points, their positional relationship with the intersection circles is judged in the lane-level driving route database to obtain the intersections. As shown in fig. 3f, P2 does not match an intersection and P3 matches an intersection. At this time, it may be determined that there is 1 intersection in the road-level planned driving path. Then, the travelable routes in this intersection and the travelable route corresponding to each road track point are sequentially determined, and the first travelable route set is obtained.
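For illustration, the "matches an intersection" test can be implemented as a point-in-circle check, assuming each intersection circle is stored as a center and a radius in the lane-level driving route database; this representation and the haversine helper below are assumptions of this sketch.

import math

def haversine_m(lng1, lat1, lng2, lat2):
    """Great-circle distance in meters between two (lng, lat) points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lng2 - lng1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def match_intersections(track_points, intersection_circles):
    """track_points: [(lng, lat)]; intersection_circles: [(id, lng, lat, radius_m)].
    Returns the ordered list of intersection ids hit by the track points."""
    hits = []
    for lng, lat in track_points:
        for inter_id, c_lng, c_lat, radius in intersection_circles:
            if haversine_m(lng, lat, c_lng, c_lat) <= radius and inter_id not in hits:
                hits.append(inter_id)
    return hits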
For example, as shown in fig. 3g, for road track point P1, the travelable routes corresponding to P1 include L22, L23, and L24; for road track point P2, the travelable routes of the road segment through which P2 communicates with the first intersection are obtained, where the travelable routes corresponding to P2 include L22, L23, and L24. For road track point P3, since P3 lies within the intersection, the travelable routes in the intersection can be determined according to the entrance road segment and the exit road segment. For example, as shown in fig. 3h, since the lane change attribute of L22 is left lane change, the lane change attribute of L23 is left lane change plus straight ahead, and the lane change attribute of L24 is straight ahead, the travelable routes corresponding to road track point P3 include L22-L3, L22-L2, L23-L3, L23-L2, L23-L1, and L24-L1. For road track point P4, the travelable routes corresponding to P4 include L1, L2, and L3, so that the first travelable route set can be obtained. For example, as shown in fig. 3i, in the lane-level map, the first travelable route set includes road segment information, in-segment lane driving line information, intersection information, intra-intersection dashed-line driving route information, and the information of the position points constituting the driving route. As can be seen from the figure, the first travelable route set may be encoded in different encoding manners; for example, 2004, 200401, and 200402 may be used to represent the road segment information.
For another example, a certain road-level planned driving path includes 7 consecutive road track points, and the first cloud server performs matching in the lane-level driving route database according to the longitude and latitude information corresponding to each of the 7 consecutive road track points, so as to determine the intersections existing among the 7 road track points. As shown in fig. 3j, the road-level planned driving path determined by the second cloud server includes seven road track points P1, P2, P3, P4, P5, P6, and P7, where P1 is the starting road track point, P7 is the end road track point, and P2-P6 are intermediate road track points. First, the positional relationship between the intermediate road track points and the intersection circles is judged in sequence in the lane-level driving route database to obtain the intersections. As shown in fig. 3j, P2 matches an intersection, P4 matches an intersection, and P6 matches an intersection. At this time, it may be determined that there are 3 intersections in the road-level planned driving path. In this case, the first cloud server first obtains the travelable routes of each road segment through which the starting road track point communicates with the first intersection; then it determines the travelable routes by which the vehicle enters the intersection according to the traveling direction from the starting road track point to the first intersection; then, according to the distance between the starting road track point and each lane driving line, it selects the travelable route that is closest and that belongs to the same road segment as the travelable route of the starting road track point. For the road track points that do not involve an intersection, the travelable routes communicating with the other road track points are acquired. For the last intersection, the travelable routes exiting the last intersection are first selected according to the direction from the last intersection to the end road track point; then the travelable route that is closest to the end road track point, measured by the distance between the end road track point and the lane driving line, and that belongs to the same road segment is selected as the travelable route corresponding to the end road track point.
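The distance-based selection of the travelable route closest to the starting or end road track point might look like the sketch below; approximating the distance to a lane driving line by the distance to its nearest node is an assumption of this sketch (reasonable here, since adjacent nodes are at most about 1.5 meters apart), and the distance function is expected to be something like the haversine helper sketched earlier.

def closest_route(point, candidate_routes, node_coords, distance_m):
    """Pick the travelable route whose lane driving line passes closest to `point`.

    point: (lng, lat) of the starting or end road track point.
    candidate_routes: {line_id: [node_id, ...]} for routes in the same road segment.
    node_coords: {node_id: (lng, lat)}.
    distance_m: distance function taking (lng1, lat1, lng2, lat2) in meters.
    """
    best_line, best_dist = None, float("inf")
    for line_id, node_ids in candidate_routes.items():
        for nid in node_ids:
            n_lng, n_lat = node_coords[nid]
            d = distance_m(point[0], point[1], n_lng, n_lat)
            if d < best_dist:
                best_line, best_dist = line_id, d
    return best_line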
Step S204, the first cloud server determines a lane-level planning driving path in the first driving route set.
After determining the first set of travelable routes, the first cloud server may determine a lane-level planned travel route in the first set of travelable routes according to a driving strategy. For example, the driving strategy may include, but is not limited to, a shortest path strategy, a shortest travel time strategy, and the like.
For example, referring to fig. 3c, the first drivable route set includes 3 alternative lane-level planned driving paths, namely path 1, path 2, and path 3. The first cloud server calculates the estimated driving time corresponding to each alternative lane-level planned driving path according to the path distance corresponding to that path; for example, the estimated driving time corresponding to path 1 is t1, the estimated driving time corresponding to path 2 is t2, and the estimated driving time corresponding to path 3 is t3, where t2 < t1 < t3. In this case, the first cloud server selects path 2 as the lane-level planned driving path of the vehicle according to the shortest driving time strategy.
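A minimal sketch of this selection step follows; estimating the driving time as path distance divided by an assumed average speed is a simplification of this sketch, since the embodiment only states that the estimated driving time is calculated from the path distance.

def pick_lane_level_path(candidate_paths, strategy="shortest_time"):
    """candidate_paths: [{"name": ..., "distance_m": ..., "avg_speed_mps": ...}]."""
    if strategy == "shortest_path":
        return min(candidate_paths, key=lambda p: p["distance_m"])
    # default: shortest estimated travel time (distance / assumed average speed)
    return min(candidate_paths, key=lambda p: p["distance_m"] / p["avg_speed_mps"])

paths = [
    {"name": "path 1", "distance_m": 1200, "avg_speed_mps": 10.0},  # t1 = 120 s
    {"name": "path 2", "distance_m": 900,  "avg_speed_mps": 10.0},  # t2 = 90 s
    {"name": "path 3", "distance_m": 1500, "avg_speed_mps": 10.0},  # t3 = 150 s
]
assert pick_lane_level_path(paths)["name"] == "path 2"  # t2 < t1 < t3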
Step S206, the first cloud server sends the lane-level planned driving path to the first vehicle-mounted terminal.
Step S208, the first vehicle-mounted terminal receives the lane-level planned driving path from the first cloud server and controls the vehicle to drive according to the lane-level planned driving path.
In one example, the autonomous vehicle may drive autonomously according to the lane-level planned driving path. In another example, the first vehicle-mounted terminal may provide the vehicle navigation service to the user by voice broadcast, so that the user can drive the vehicle along the lane-level planned driving path according to the broadcast voice.
By implementing the embodiment of the application, the first cloud server can plan the lane-level planned driving path based on the road-level planned driving path that the second cloud server sent to the vehicle-mounted terminal, so that road information and lane information are decoupled and both the flexibility and the safety of the autonomous vehicle can be taken into account.
In some possible implementation manners, after the first cloud server determines the lane-level planned driving path of the first vehicle-mounted terminal, the first cloud server receives the road-level planned driving path from the second vehicle-mounted terminal, and in this case, the first cloud server may determine the lane-level planned driving path of the second vehicle-mounted terminal with reference to the lane-level planned driving path of the first vehicle-mounted terminal. Referring to fig. 4a, a path planning method provided in the embodiment of the present application may include, but is not limited to, the following steps:
Step S400, the first cloud server receives a road-level planned driving path from a second vehicle-mounted terminal, where the road-level planned driving path includes M road track points, and each road track point includes longitude and latitude information of the road track point; M is an integer greater than 1.
Step S402, judging whether, between the road-level planned driving path corresponding to the second vehicle-mounted terminal and the road-level planned driving path corresponding to the first vehicle-mounted terminal, the similarity between the longitude and latitude information contained in at least Q groups of road track points is greater than a target threshold; if yes, step S404 is executed; if not, step S406 is executed.
In the embodiment of the present application, the similarity is also called similarity measurement, which is a measurement for comprehensively evaluating the similarity between two things. It will be appreciated that the closer two things are, the greater their similarity.
Illustratively, the vehicle determines, from the sensor data acquired in real time, that the automatic driving strategy for the first road segment it travels on is full automation, while the first cloud server determines, from the high-precision map, that the automatic driving strategy for the first road segment is high automation. In the prior art, full automation (L5) means that all driving operations are completed by the vehicle and a human driver does not need to keep attention. High automation (L4) means that all driving operations are completed by the vehicle and the human driver does not need to keep attention, but the road and environmental conditions are restricted. The difference between these two driving strategies is that high automation (L4) restricts the road and environmental conditions, whereas full automation (L5) does not. It can be understood that the similarity between high automation (L4) and full automation (L5) is very high; for example, the similarity between these two driving strategies is determined to be 0.85 by a similarity calculation formula (e.g., the Euclidean distance formula).
In the embodiment of the present application, the Q value may be determined according to a preset formula whose inputs are M, N, and a constant a, where M is the number of road track points included in the road-level planned driving path corresponding to the second vehicle-mounted terminal, N is the number of road track points included in the road-level planned driving path corresponding to the first vehicle-mounted terminal, and a is a constant; for example, a may be 1, or a may be 2.
For example, as shown in fig. 4b, the road-level planned driving path 1 sent by the first vehicle-mounted terminal to the first cloud server includes 4 road track points, where the longitude and latitude information corresponding to road track point 11 is (LngA1, LatA1), that corresponding to road track point 12 is (LngB1, LatB1), that corresponding to road track point 13 is (LngC1, LatC1), and that corresponding to road track point 14 is (LngD1, LatD1). The road-level planned driving path 2 sent by the second vehicle-mounted terminal to the first cloud server also includes 4 road track points, where the longitude and latitude information corresponding to road track point 21 is (LngA2, LatA2), that corresponding to road track point 22 is (LngB2, LatB2), that corresponding to road track point 23 is (LngC2, LatC2), and that corresponding to road track point 24 is (LngD2, LatD2). In this case, the first cloud server sequentially determines the similarity between the longitude and latitude information corresponding to road track point 11 and that corresponding to road track point 21 (Q1 = 0.8), the similarity between road track point 12 and road track point 22 (Q2 = 0.7), the similarity between road track point 13 and road track point 23 (Q3 = 0.9), and the similarity between road track point 14 and road track point 24 (Q4 = 0.85). With the target threshold being 0.75, it is thereby determined that the similarity between the longitude and latitude information contained in 3 groups of road track points is greater than the target threshold, and step S404 is performed.
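As an illustration, the pairwise similarity check of step S402 might be implemented as below; the exponential mapping from distance to similarity and the 100-meter scale are assumptions of this sketch, since the embodiment only requires some similarity measure (for example, one based on the Euclidean distance formula).

import math

def latlng_similarity(p1, p2, scale_m=100.0):
    """Map the distance between two (lng, lat) points to a similarity in (0, 1]."""
    # crude planar approximation, adequate for nearby points
    d = math.hypot((p1[0] - p2[0]) * 111320 * math.cos(math.radians(p1[1])),
                   (p1[1] - p2[1]) * 111320)
    return math.exp(-d / scale_m)

def reuse_lane_level_path(path1_points, path2_points, q, threshold=0.75):
    """Return True if at least q aligned pairs of road track points are more
    similar than the threshold, i.e. step S404 applies."""
    matches = sum(1 for a, b in zip(path1_points, path2_points)
                  if latlng_similarity(a, b) > threshold)
    return matches >= q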
In the embodiment of the present application, the target threshold may be preset, or may be set in combination with actual requirements.
In a possible implementation manner, the first cloud server performs step S402 only when it is determined that the road-level planned driving path 1 and the road-level planned driving path 2 have the same advancing direction. For example, the advancing direction of the road-level planned driving path 1 is the direction from track point A to track point B, and the advancing direction of the road-level planned driving path 2 is the direction from track point A to track point C, where track point B is adjacent to track point C. At this time, it may be determined that the road-level planned driving path 1 and the road-level planned driving path 2 have the same advancing direction.
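The same-advancing-direction check might be sketched as a comparison of the overall bearings of the two road-level planned driving paths; the 45-degree tolerance used below is an assumption of this sketch.

import math

def bearing_deg(p_from, p_to):
    """Initial bearing from one (lng, lat) point to another, in degrees."""
    lng1, lat1 = map(math.radians, p_from)
    lng2, lat2 = map(math.radians, p_to)
    dlng = lng2 - lng1
    y = math.sin(dlng) * math.cos(lat2)
    x = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlng)
    return math.degrees(math.atan2(y, x)) % 360.0

def same_advancing_direction(path1_points, path2_points, tol_deg=45.0):
    """Compare the overall bearing of the two paths from first to last track point."""
    b1 = bearing_deg(path1_points[0], path1_points[-1])
    b2 = bearing_deg(path2_points[0], path2_points[-1])
    diff = abs(b1 - b2) % 360.0
    return min(diff, 360.0 - diff) <= tol_deg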
Step S404, the first cloud server sends the lane-level planned driving path to the second vehicle-mounted terminal.
Step S406, the first cloud server plans the lane-level planned driving path according to the road-level planned driving path sent by the second vehicle-mounted terminal.
In this embodiment of the application, the implementation process of the first cloud server planning the lane-level planned driving path according to the road-level planned driving path sent by the second vehicle-mounted terminal may refer to the method described above, which is not repeated herein.
In the embodiment of the application, when the first cloud server determines the lane-level planned driving path of the second vehicle-mounted terminal, the lane-level planned driving path of the first vehicle-mounted terminal can be used as a reference, so that the safety of an automatic driving vehicle can be guaranteed, and the path planning efficiency can be improved.
The foregoing embodiments have focused on how to plan a driving path for a vehicle at the lane level, and the following describes embodiments of the apparatus according to the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference should be made to the embodiments of the method of the present application.
In order to better implement the foregoing aspects of the embodiments of the present application, an embodiment of the present application provides a first cloud server, where the first cloud server is configured with units for executing the method in any one of the foregoing first aspects. Specifically, the first cloud server determines the lane-level planned driving path according to the road-level planned driving path provided by the second cloud server, in combination with a lane-level map. Please refer to fig. 5, which is a schematic structural diagram of a first cloud server 50 according to an embodiment of the present disclosure. As shown in fig. 5, the first cloud server 50 may include:
the first receiving unit 500 is configured to receive a road-level planned driving path from a first vehicle-mounted terminal, where the road-level planned driving path includes N road track points, and each road track point includes longitude and latitude information of the road track point; n is an integer greater than 1;
the processing unit 502 is configured to match a first drivable path set in the lane-level driving path database according to the respective longitude and latitude information corresponding to the N road track points;
a driving path planning unit 504 is configured to determine a lane-level planned driving path in the first set of drivable paths.
In one possible implementation, the processing unit 502 includes a matching unit 5021 and a travelable route determination unit 5022, wherein,
the matching unit 5021 is used for matching M intersections existing in the N road track points in the lane-level driving route database according to the longitude and latitude information corresponding to the N road track points; m is an integer less than N; the travelable route determining unit 5022 is configured to sequentially obtain travelable routes in each intersection of the M intersections and travelable routes corresponding to each road track point, so as to obtain a first travelable route set.
In one possible implementation, the lane-level travel route database includes at least one of lane travel route information, lane travel route type information, and lane travel route connectivity attribute information.
In a possible implementation manner, the road-level planned driving path is a path determined by the second cloud server according to the travel information of the first vehicle-mounted terminal and in combination with a road-level map.
In a possible implementation manner, the first cloud server 50 may further include:
a second receiving unit 506, configured to receive a road-level planned driving path from a second vehicle-mounted terminal, where the road-level planned driving path includes M road track points, and each road track point includes longitude and latitude information of the road track point; m is an integer greater than 1;
a sending unit 508, configured to send the lane-level planned driving path corresponding to the first vehicle-mounted terminal to the second vehicle-mounted terminal when the similarity between the longitude and latitude information contained in at least Q groups of road track points is greater than a target threshold; each group of the Q groups of road track points includes one of the M road track points and one of the N road track points.
It should be noted that, in the embodiment of the present application, the first cloud server may refer to the related description of the path planning method in the embodiment of the method described in fig. 2 and fig. 4a, and details are not repeated here.
In order to better implement the foregoing aspects of the embodiments of the present application, an embodiment of the present application provides a vehicle navigation apparatus, where the apparatus is configured with units for executing the method according to any one of the foregoing second aspects, so as to drive according to the lane-level planned driving path determined by the first cloud server. Specifically, please refer to fig. 6, which is a schematic structural diagram of a vehicle navigation apparatus 60 according to an embodiment of the present application. The apparatus 60 may include:
the transmitting unit 600 is configured to transmit a road-level planned driving path to a first cloud server, where the road-level planned driving path includes N road track points, and each road track point includes longitude and latitude information of the road track point; n is an integer greater than 1;
a receiving unit 602, configured to receive the lane-level planned driving path determined by the first cloud server; the lane-level planned driving path is determined by the first cloud server through the path planning method according to any one of the first aspect;
and a driving unit 604, configured to control the vehicle to drive according to the lane-level planned driving path.
It should be noted that, in the vehicle navigation device described in the embodiment of the present application, reference may be made to the related description of the path planning method in the method embodiment described in fig. 2 and fig. 4a, and details are not repeated herein.
It is also noted that the automatic driving apparatus may include the above-described vehicle navigation apparatus.
Referring to fig. 7, fig. 7 is a schematic structural diagram of a cloud server according to an embodiment of the present disclosure. For example, the cloud server may be the second cloud server or the first cloud server. The cloud server 70 includes at least one processor 701, at least one memory 702, and at least one communication interface 703. In addition, the cloud server may further include general components such as an antenna, which are not described in detail herein.
The processor 701 may be a general purpose Central Processing Unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of programs according to the above schemes.
A communication interface 703 for communicating with other devices or a communication network.
The memory 702 may be, but is not limited to, a read-only memory (ROM) or another type of static storage device that can store static information and instructions, a random access memory (RAM) or another type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be self-contained and coupled to the processor via a bus. The memory may also be integral to the processor.
The memory 702 is used for storing application program codes for executing the above schemes, and the processor 701 controls the execution. The processor 701 is configured to execute application program code stored in the memory 702. For example, the memory 702 stores code that may perform the path planning methods provided above in fig. 2 or fig. 4 a.
It should be noted that, for the functions of the cloud server 70 described in the embodiment of the present application, reference may be made to the related description in the method embodiments described in fig. 2 and fig. 4a, and details are not repeated herein.
Embodiments of the present invention further provide a computer storage medium that stores instructions which, when executed on a computer or processor, cause the computer or processor to perform one or more steps of the method according to any of the above embodiments. If the constituent modules of the above-mentioned apparatus are implemented in the form of software functional units and sold or used as independent products, they may be stored in the computer-readable storage medium. Based on this understanding, the technical solutions of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product, and the computer product is stored in the computer-readable storage medium.
The computer readable storage medium may be an internal storage unit of the device according to the foregoing embodiment, for example, a hard disk or a memory. The computer readable storage medium may be an external storage device of the above-described apparatus, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like. Further, the computer-readable storage medium may include both an internal storage unit and an external storage device of the device. The computer-readable storage medium is used for storing the computer program and other programs and data required by the apparatus. The above-described computer-readable storage medium may also be used to temporarily store data that has been output or is to be output.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments may be implemented by a computer program, which can be stored in a computer-readable storage medium, and can include the processes of the above embodiments of the methods when the computer program is executed. And the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
The steps in the method of the embodiment of the application can be sequentially adjusted, combined and deleted according to actual needs.
The modules in the device can be merged, divided and deleted according to actual needs.
It is to be understood that one of ordinary skill in the art would recognize that the elements and algorithm steps of the various examples described in connection with the embodiments disclosed herein can be implemented as electronic hardware or as combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
Those of skill would appreciate that the functions described in connection with the various illustrative logical blocks, modules, and algorithm steps disclosed in the various embodiments disclosed herein may be implemented as hardware, software, firmware, or any combination thereof. If implemented in software, the functions described in the various illustrative logical blocks, modules, and steps may be stored on or transmitted over as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. The computer-readable medium may include a computer-readable storage medium, which corresponds to a tangible medium, such as a data storage medium, or any communication medium including a medium that facilitates transfer of a computer program from one place to another (e.g., according to a communication protocol). In this manner, a computer-readable medium may generally correspond to (1) a tangible computer-readable storage medium that is not transitory, or (2) a communication medium, such as a signal or carrier wave. A data storage medium may be any available medium that can be accessed by one or more computers or one or more processors to retrieve instructions, code and/or data structures for implementing the techniques described herein. The computer program product may include a computer-readable medium.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (15)

1. A path planning method, applied to a first cloud server, wherein a lane-level driving line database is stored in the first cloud server; the method comprises the following steps:
receiving a road-level planned driving path from a first vehicle-mounted terminal, wherein the road-level planned driving path comprises N road track points, and each road track point comprises longitude and latitude information of the road track point; n is an integer greater than 1;
matching a first travelable route set in the lane-level travel route database according to the longitude and latitude information corresponding to the N road track points respectively;
and determining a lane-level planning driving path in the first driving route set.
2. The method of claim 1, wherein the N road track points comprise a start road track point and an end road track point; and the matching of a first travelable route set in the lane-level travel route database according to the longitude and latitude information respectively corresponding to the N road track points comprises:
according to the longitude and latitude information corresponding to the N road track points, M intersections existing in the N road track points are matched in the lane level driving line database; m is an integer less than N;
and sequentially acquiring the travelable route in each intersection of the M intersections and the travelable route corresponding to each road track point to obtain a first travelable route set.
3. The method of claim 1, wherein the lane-level travel route database includes at least one of lane travel route information, lane travel route type information, lane travel route connectivity attribute information.
4. The method of claim 1, wherein the road-level planned travel path is a path determined by a second cloud server according to travel information of the first vehicle-mounted terminal and in combination with a road-level map.
5. The method of claim 1, wherein the method further comprises:
receiving a road-level planned driving path from a second vehicle-mounted terminal, wherein the road-level planned driving path comprises M road track points, and each road track point comprises longitude and latitude information of the road track point; m is an integer greater than 1;
if the similarity between the longitude and latitude information contained in at least Q groups of road track points is larger than a target threshold value, the first cloud server sends the lane-level planning driving path corresponding to the first vehicle-mounted terminal to the second vehicle-mounted terminal; each group of the Q groups of the road track points comprises one of the M road track points and one of the N road track points.
6. A path planning method is applied to a first vehicle-mounted terminal on a vehicle, and comprises the following steps:
sending a road-level planned driving path to a first cloud server, wherein the road-level planned driving path comprises N road track points, and each road track point comprises longitude and latitude information of the road track point; n is an integer greater than 1;
receiving a lane-level planning driving path determined by the first cloud server; the lane-level planned driving path is determined by the first cloud server through the path planning method according to any one of claims 1 to 5;
and controlling the vehicle to drive according to the lane-level planned driving path.
7. A vehicle navigation device, applied to a first cloud server, wherein the first cloud server stores a lane-level driving route database; the device comprises:
the first receiving unit is used for receiving a road-level planned driving path from a first vehicle-mounted terminal, wherein the road-level planned driving path comprises N road track points, and each road track point comprises longitude and latitude information of the road track point; n is an integer greater than 1;
the processing unit is used for matching a first drivable route set in the lane level driving route database according to the respective longitude and latitude information corresponding to the N road track points;
and the driving path planning unit is used for determining a lane-level planning driving path in the first driving route set.
8. The apparatus according to claim 7, wherein the processing unit comprises a matching unit and a travelable route determination unit, wherein,
the matching unit is used for matching M intersections existing in the N road track points in the lane level driving route database according to the respective longitude and latitude information corresponding to the N road track points; m is an integer less than N;
the travelable route determining unit is configured to sequentially acquire travelable routes in each intersection of the M intersections and travelable routes corresponding to each road track point, and obtain a first travelable route set.
9. The apparatus of claim 7, wherein the lane-level travel route database includes at least one of lane travel route information, lane travel route type information, lane travel route connectivity attribute information.
10. The apparatus of claim 7, wherein the road-level planned travel path is a path determined by a second cloud server according to travel information of the first vehicle-mounted terminal and in combination with a road-level map.
11. The apparatus of any one of claims 7-10, further comprising:
the second receiving unit is used for receiving a road-level planned driving path from a second vehicle-mounted terminal, wherein the road-level planned driving path comprises M road track points, and each road track point comprises longitude and latitude information of the road track point; m is an integer greater than 1;
a sending unit, configured to send the lane-level planned driving path corresponding to the first vehicle-mounted terminal to the second vehicle-mounted terminal when the similarity between the longitude and latitude information contained in at least Q groups of road track points is greater than a target threshold; each group of the Q groups of road track points comprises one of the M road track points and one of the N road track points.
12. A vehicular navigation apparatus applied to a first onboard terminal on a vehicle, the apparatus comprising:
the system comprises a sending unit, a first cloud server and a second cloud server, wherein the sending unit is used for sending a road-level planned driving path to the first cloud server, the road-level planned driving path comprises N road track points, and each road track point comprises longitude and latitude information of the road track point; n is an integer greater than 1;
the receiving unit is used for receiving the lane-level planning driving path determined by the first cloud server; the lane-level planned driving path is determined by the first cloud server through the path planning method according to any one of claims 1 to 5;
and the driving unit is used for controlling the vehicle to drive according to the lane-level planned driving path.
13. A first cloud server, comprising a processor and a memory, the processor and the memory being interconnected, wherein the memory is configured to store a computer program, the computer program comprising program instructions, and wherein the processor is configured to invoke the program instructions to perform the method of any one of claims 1-5.
14. An in-vehicle terminal, characterized in that it comprises a processor and a memory, said processor and memory being interconnected, wherein said memory is used for storing a computer program, said computer program comprising program instructions, said processor being configured for invoking said program instructions for performing the method according to claim 6.
15. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program comprising program instructions that, when executed by a processor, cause the processor to perform the method according to any of claims 1-5 or 6.
CN202010895471.9A 2020-08-31 2020-08-31 Path planning method, related equipment and computer readable storage medium Active CN112146671B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010895471.9A CN112146671B (en) 2020-08-31 2020-08-31 Path planning method, related equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010895471.9A CN112146671B (en) 2020-08-31 2020-08-31 Path planning method, related equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN112146671A CN112146671A (en) 2020-12-29
CN112146671B true CN112146671B (en) 2022-10-28

Family

ID=73889832

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010895471.9A Active CN112146671B (en) 2020-08-31 2020-08-31 Path planning method, related equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN112146671B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112735169B (en) * 2021-01-06 2022-01-28 山东旗帜信息有限公司 Vehicle route restoration method, equipment and medium
CN113008255A (en) * 2021-01-28 2021-06-22 沈阳美行科技有限公司 Navigation method and navigation device
CN113190768B (en) 2021-04-29 2024-03-12 北京百度网讯科技有限公司 Path processing method and device
CN113329073A (en) * 2021-05-26 2021-08-31 上海声通信息科技股份有限公司 Full-media intelligent Internet of vehicles platform
DE102022002768A1 (en) 2021-07-30 2023-02-02 Mercedes-Benz Group AG System and method for spook location of a vehicle
CN113776555A (en) * 2021-08-18 2021-12-10 南斗六星系统集成有限公司 Method for calculating automatic driving road coverage mileage based on road network slice
CN113682318B (en) * 2021-09-30 2022-09-06 阿波罗智能技术(北京)有限公司 Vehicle running control method and device
CN114328594B (en) * 2021-11-25 2022-11-01 北京掌行通信息技术有限公司 Method and device for judging running path of vehicle, storage medium and terminal
CN114964292B (en) * 2022-05-30 2023-10-20 国汽智控(北京)科技有限公司 Global path planning method, device, electronic equipment and storage medium
CN116698054B (en) * 2023-08-03 2023-10-27 腾讯科技(深圳)有限公司 Road matching method, device, electronic equipment and storage medium

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104880193A (en) * 2015-05-06 2015-09-02 石立公 Lane-level navigation system and lane-level navigation method thereof
CN108139218B (en) * 2015-10-15 2020-12-08 华为技术有限公司 Navigation system, apparatus and method
CN108027242B (en) * 2015-11-30 2021-06-01 华为技术有限公司 Automatic driving navigation method, device and system, vehicle-mounted terminal and server
CN106017491B (en) * 2016-05-04 2019-08-02 玉环看知信息科技有限公司 A kind of navigation path planning method, system and navigation server
CN108663059A (en) * 2017-03-29 2018-10-16 高德信息技术有限公司 A kind of navigation path planning method and device
CN107462243B (en) * 2017-08-04 2019-09-20 浙江大学 A kind of cloud control automatic Pilot task creating method based on high-precision map
US10496098B2 (en) * 2017-09-12 2019-12-03 Baidu Usa Llc Road segment-based routing guidance system for autonomous driving vehicles
CN109631927A (en) * 2018-12-29 2019-04-16 北斗天地股份有限公司 A kind of paths planning method and device
CN111380547B (en) * 2018-12-29 2022-05-17 沈阳美行科技股份有限公司 Mature path track determining method and device, computer equipment and storage medium
CN110160552B (en) * 2019-05-29 2021-05-04 百度在线网络技术(北京)有限公司 Navigation information determination method, device, equipment and storage medium
CN110530392B (en) * 2019-09-29 2021-10-08 武汉中海庭数据技术有限公司 Path planning method and device based on combination of traditional map and high-precision map
CN111102988A (en) * 2020-01-03 2020-05-05 北京汽车集团有限公司 Map-based path planning method, server, vehicle-mounted terminal, and storage medium
CN111337045A (en) * 2020-03-27 2020-06-26 北京百度网讯科技有限公司 Vehicle navigation method and device

Also Published As

Publication number Publication date
CN112146671A (en) 2020-12-29

Similar Documents

Publication Publication Date Title
CN112146671B (en) Path planning method, related equipment and computer readable storage medium
CN111123952B (en) Trajectory planning method and device
CN110379193B (en) Behavior planning method and behavior planning device for automatic driving vehicle
EP4071661A1 (en) Automatic driving method, related device and computer-readable storage medium
WO2022027304A1 (en) Testing method and apparatus for autonomous vehicle
WO2021102955A1 (en) Path planning method for vehicle and path planning apparatus for vehicle
US9451020B2 (en) Distributed communication of independent autonomous vehicles to provide redundancy and performance
CN112859830B (en) Design operation region ODD judgment method, device and related equipment
CN113968216B (en) Vehicle collision detection method and device and computer readable storage medium
CN110789533B (en) Data presentation method and terminal equipment
CN113160547B (en) Automatic driving method and related equipment
CN113492830B (en) Vehicle parking path planning method and related equipment
CN112672942B (en) Vehicle lane changing method and related equipment
CN112585045A (en) Electromechanical braking method and electromechanical braking device
CN113954858A (en) Method for planning vehicle driving route and intelligent automobile
CN113859265A (en) Reminding method and device in driving process
CN114792149A (en) Track prediction method and device and map
CN112829762A (en) Vehicle running speed generation method and related equipment
CN114782638A (en) Method and device for generating lane line, vehicle, storage medium and chip
CN113963535A (en) Driving decision determination method and device and electronic equipment storage medium
CN114764980A (en) Vehicle turning route planning method and device
WO2022061725A1 (en) Traffic element observation method and apparatus
WO2022267004A1 (en) Path planning method and apparatus
CN113799794B (en) Method and device for planning longitudinal movement parameters of vehicle
WO2022041820A1 (en) Method and apparatus for planning lane-changing trajectory

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant