WO2022021982A1 - Method for determining a drivable area, intelligent driving system, and smart car - Google Patents

Method for determining a drivable area, intelligent driving system, and smart car

Info

Publication number
WO2022021982A1
WO2022021982A1 (PCT/CN2021/091159, CN2021091159W)
Authority
WO
WIPO (PCT)
Prior art keywords
drivable area
drivable
historical
area
current
Prior art date
Application number
PCT/CN2021/091159
Other languages
English (en)
French (fr)
Inventor
郭剑艇
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to EP21849359.1A (publication EP4184119A4)
Publication of WO2022021982A1


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26: specially adapted for navigation in a road network
    • G01C 21/34: Route searching; Route guidance
    • G01C 21/3453: Special cost functions, i.e. other than distance or default speed limit of road segments
    • G01C 21/3461: Preferred or disfavoured areas, e.g. dangerous zones, toll or emission zones, intersections, manoeuvre types, segments such as motorways, toll roads, ferries
    • G01C 21/3492: Special cost functions employing speed data or traffic data, e.g. real-time or historical
    • G01C 21/36: Input/output arrangements for on-board computers
    • G01C 21/3626: Details of the output of route guidance instructions
    • G01C 21/3658: Lane guidance
    • G01C 21/38: Electronic maps specially adapted for navigation; Updating thereof
    • G01C 21/3804: Creation or updating of map data
    • G01C 21/3807: Creation or updating of map data characterised by the type of data
    • G01C 21/3815: Road data
    • G01C 21/3822: Road feature data, e.g. slope data
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 60/00: Drive control systems specially adapted for autonomous road vehicles
    • B60W 60/001: Planning or execution of driving tasks
    • B60W 60/0015: Planning or execution of driving tasks specially adapted for safety
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/70: using pattern recognition or machine learning
    • G06V 10/82: using neural networks
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/50: Context or environment of the image
    • G06V 20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/58: Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V 20/588: Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/01: Detecting movement of traffic to be counted or controlled
    • G08G 1/0104: Measuring and analyzing of parameters relative to traffic conditions
    • G08G 1/0108: based on the source of data
    • G08G 1/0112: based on the source of data from the vehicle, e.g. floating car data [FCD]
    • G08G 1/0125: Traffic data processing
    • G08G 1/0129: Traffic data processing for creating historical data or processing based on historical data
    • G08G 1/0137: Measuring and analyzing of parameters relative to traffic conditions for specific applications
    • G08G 1/0141: for traffic information dissemination
    • G08G 1/0145: for active traffic flow control

Definitions

  • The present application relates to the field of smart cars, and in particular to a method for determining a drivable area applied to intelligent driving, an intelligent driving system, and a smart car.
  • Autonomous driving technology has broad application prospects and important research significance. Autonomous driving detects roads and obstacles through sensing devices and performs driving operations autonomously, which can improve driving safety, reduce the incidence of traffic accidents, and reduce human and economic losses; at the same time, autonomous driving can cooperate with intelligent transportation systems to allocate road resources more reasonably and ease urban congestion. At this stage, autonomous driving technology is still in the research and testing phase, and the detection of drivable areas is an indispensable part of advanced driver assistance and autonomous driving.
  • The drivable area detection method determines the drivable area from the current input of the sensing devices.
  • The existing technology aims to determine the road surface in real time and identify the drivable area through means such as machine learning.
  • Automatic driving based on the current drivable area detection technology only plans and controls the drivable path according to the detected current drivable area, so the accuracy of the path planning is limited.
  • The embodiments of the present application provide a drivable area determination method applied to intelligent driving, an intelligent driving system, and an intelligent car, which solve the problems that existing drivable area determination is an instantaneous determination, its accuracy is limited, and it cannot self-learn.
  • an embodiment of the present application provides a drivable area determination method applied to intelligent driving, including:
  • the intelligent driving system obtains the environmental information around the location of the vehicle and determines the current drivable area;
  • the intelligent driving system queries the historical drivable area database according to the location of the vehicle, and obtains the information of the corresponding historical drivable area;
  • the intelligent driving system superimposes the current drivable area and the historical drivable area to obtain the drivable area for this determination.
  • In the embodiments of the present application, previous drivable area determination results are persisted to the historical drivable area database and applied to subsequent determination processes, so that the intelligent driving system can select this trip's driving route from the current drivable area and output a drivable area for this determination that includes the optimal route.
  • The intelligent driving system determines whether the superimposed drivable area includes a drivable lane that satisfies the first length, and if so, takes that drivable lane as the drivable area for this determination.
  • Otherwise, the current drivable area is used as the drivable area for this determination.
  • In this way, the historical drivable area at the vehicle's location is used as a factor in determining the drivable area this time.
  • When this embodiment of the present application superimposes the current drivable area and the historical drivable areas, the drivable area for this determination may be obtained as follows:
  • when the drivable area obtained by superimposing the current drivable area and the previous k historical drivable areas includes a drivable lane satisfying the first length, the superimposed drivable area is used as the drivable area for this determination;
  • when the drivable area obtained by superimposing the current drivable area and the previous k historical drivable areas does not include a drivable lane satisfying the first length, it is determined whether the drivable area obtained by superimposing the current drivable area and only the previous historical drivable area contains a drivable lane satisfying the first length, and if so, that superimposed area is used as the drivable area for this determination;
  • k is a positive integer greater than or equal to 2.
  • Preferentially, the current drivable area and the previous k historical drivable areas are superimposed; if a drivable lane satisfying the first length exists in that superposition, the lane was drivable both this time and in each of the previous k passes, so the combination is likely to contain the optimal route.
  • Sub-optimally, when the superposition over the previous k passes contains no such lane, the current drivable area is superimposed with only the previous historical drivable area; if a drivable lane satisfying the first length exists in this superposition, the lane was drivable in this and the previous driving process.
  • When the current drivable area includes a drivable lane satisfying the first length, the current drivable area is updated into the historical drivable area database.
  • The embodiment of the present application may update the historical drivable area database using the drivable area obtained by superposition. For example:
  • when the drivable area obtained by superimposing the current drivable area and the previous k historical drivable areas includes a drivable lane satisfying the first length, the superimposed drivable area is output as the drivable area for this determination and recorded in the historical drivable area database;
  • when the drivable area obtained by superimposing the current drivable area and the previous k historical drivable areas does not include a drivable lane satisfying the first length, but the drivable area obtained by superimposing the current drivable area and only the previous historical drivable area does, the latter is output as the drivable area for this determination and recorded in the historical drivable area database;
  • when none of the drivable areas obtained by superposition includes a drivable lane satisfying the first length, and the current drivable area includes a drivable lane satisfying the first length, the current drivable area is output as the drivable area for this determination and recorded in the historical drivable area database.
  • In addition, the intelligent driving system can also delete the record with the longest storage time when more than k records correspond to a given piece of positioning data in the historical drivable area database.
  • This improves the timeliness of the data recorded in the historical drivable area database and prevents overly old data from affecting subsequent drivable area determinations.
  • the aforementioned first length represents a range perceived by the intelligent driving system
  • the drivable lanes satisfying the first length are lanes that are entirely drivable within the range perceived by the intelligent driving system.
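  • As an illustration of this check, the following sketch (a minimal, hypothetical Python example, not taken from the application) tests whether a lane in a binary drivable-area map is drivable over the whole perceived length, i.e. whether it satisfies the first length; the map layout, lane column boundaries, and function names are illustrative assumptions.

    import numpy as np

    def lane_is_fully_drivable(drivable_map: np.ndarray,
                               lane_columns: slice,
                               perception_rows: int) -> bool:
        """Return True if every cell of the lane is drivable over the whole
        perceived length (the "first length").

        drivable_map    -- binary map, 1 = drivable, 0 = not drivable; row 0 is
                           assumed to be the cell directly ahead of the vehicle
        lane_columns    -- column range occupied by the lane in the map
        perception_rows -- number of rows covering the perception range
        """
        lane_patch = drivable_map[:perception_rows, lane_columns]
        return bool(np.all(lane_patch == 1))

    def find_drivable_lane(drivable_map, lane_boundaries, perception_rows):
        """Return the first lane (as a column slice) that satisfies the
        first-length condition, or None if no such lane exists."""
        for left, right in lane_boundaries:   # lane boundaries from lane-line detection
            cols = slice(left, right)
            if lane_is_fully_drivable(drivable_map, cols, perception_rows):
                return cols
        return None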
  • the historical drivable area database records information of historical drivable areas, including historical drivable area records, positioning data, and road feature points.
  • the intelligent driving system stores the information of the historical drivable area with the second length as the granularity.
  • The second length may be an empirical value, such as 1 meter or 10 meters; the smaller the second length, the more historical drivable area information needs to be stored and the more storage space is occupied.
  • the information of the historical drivable area further includes a tag number.
  • Each historical drivable area record corresponds to a tag number, and the tag number corresponds to positioning data.
  • the intelligent driving system can query the tag number of the positioning data according to the positioning data of the location of the vehicle, and output the information of all historical drivable areas matching the tag number.
  • the outputted historical drivable area is greater than or equal to the perception range of the intelligent driving system, that is, the outputted historical drivable area covers the perception range of the vehicle's forward direction.
  • Specifically, the intelligent driving system obtains historical drivable areas whose length along the vehicle's direction of travel, starting from the vehicle's location, is greater than or equal to the sensing range; that is, it obtains the information of the historical drivable areas corresponding to multiple tag numbers along the driving direction, so that the range represented by the collection of the historical drivable areas corresponding to these tag numbers is greater than or equal to the perception range in the vehicle's forward direction.
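  • A minimal sketch of this query step is shown below; it assumes the historical drivable area database is an in-memory mapping from tag number to record, that each tag covers one storage-granularity segment (the second length, taken as 1 meter here), and that a separate index maps quantized positioning data to tag numbers. All names and the 100-meter default perception range are illustrative assumptions, not values from the application.

    from math import ceil

    SECOND_LENGTH_M = 1.0    # assumed storage granularity ("second length")

    def query_history(segment_positions, tag_index, history_db,
                      perception_range_m=100.0):
        """Collect historical drivable-area records along the driving direction
        until the concatenated segments cover at least the perception range.

        segment_positions -- quantized positioning keys of the segments ahead of
                             the vehicle, ordered along the forward direction
        tag_index         -- maps a quantized position to a tag number
        history_db        -- maps a tag number to its historical record
        """
        needed = ceil(perception_range_m / SECOND_LENGTH_M)
        records = []
        for key in segment_positions[:needed]:
            tag = tag_index.get(key)
            if tag is not None and tag in history_db:
                records.append(history_db[tag])
        return records    # an empty list means: no match, initialize as in stage T1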
  • The method further includes: the intelligent driving system registers the current drivable area with the historical drivable area according to the road feature points at the vehicle's location, so that the road feature points of the current drivable area coincide on the map with the road feature points included in the queried historical drivable area information.
  • an intelligent driving system including:
  • the detection module is used to obtain the environmental information around the location of the vehicle and determine the current drivable area
  • a query module configured to query the historical drivable area database according to the location of the vehicle to obtain the information of the corresponding historical drivable area;
  • the fusion module is used to superimpose the current drivable area and the historical drivable area to obtain the drivable area for this determination.
  • The fusion module is specifically configured to determine whether the drivable area obtained by superposition includes a drivable lane that satisfies the first length, and if so, to use that drivable lane as the drivable area for this determination.
  • The fusion module is specifically configured to use the current drivable area as the drivable area for this determination when the superimposed drivable area does not include a drivable lane satisfying the first length.
  • The fusion module is specifically configured to use the superimposed area as the drivable area for this determination when the drivable area obtained by superimposing the current drivable area and the previous k historical drivable areas includes a drivable lane satisfying the first length;
  • and, when the drivable area obtained by superimposing the current drivable area and the previous k historical drivable areas does not include a drivable lane satisfying the first length, to determine whether the drivable area obtained by superimposing the current drivable area and only the previous historical drivable area contains such a lane, and if so, to use that area as the drivable area for this determination, where k is a positive integer greater than or equal to 2.
  • the fusion module is specifically configured to update the current drivable area to the historical drivable area database when the current drivable area includes a drivable lane satisfying the first length.
  • the information of the historical drivable area includes historical drivable area records, positioning data, and road feature points.
  • the fusion module is specifically configured to register the current drivable area with the historical drivable area according to the road feature points at the vehicle's location, so that the road feature points of the current drivable area coincide on the map with the road feature points included in the queried historical drivable area information.
  • an embodiment of the present application further provides a smart car, including a processor, a memory, and a perception device,
  • the memory is used for storing the historical drivable area database;
  • the perception device is used to obtain environmental information around the location of the vehicle
  • the processor for executing instructions to implement the method as described in any specific implementation of the first aspect.
  • an embodiment of the present application provides an apparatus for determining a drivable area, including a processor, a communication interface, and a memory; the memory is used for storing instructions, the processor is used for executing the instructions, and the communication interface is used for receiving or transmitting data; wherein the processor executes the instructions to implement the method as described in any specific implementation manner of the first aspect above.
  • embodiments of the present application provide a non-transitory computer storage medium, where the computer storage medium stores a computer program, and when the computer program is executed by a processor, the method described in the first aspect or any specific implementation of the first aspect is implemented.
  • On the basis of the implementations provided by the above aspects, the present application may further combine them to provide more implementations.
  • The embodiment of the present application discloses a drivable area determination method applied to intelligent driving: the current drivable area is determined according to the environmental information around the vehicle's location; the historical drivable area database is queried according to the vehicle's location to obtain the information of the corresponding historical drivable area; and the current drivable area is superimposed with the historical drivable area to obtain the drivable area for this determination.
  • The embodiment of the present application provides a method for determining the drivable area for this determination based on the information of the historical drivable area, and outputs the optimal drivable area by fusing the stored historical drivable area information with the current drivable area.
  • the embodiments of the present application can combine the detection of road feature points to improve the accuracy of fusion.
  • the historical drivable area can also be updated, and changes in road conditions can be updated to the historical record in time.
  • An aging mechanism for historical information may also be set: after the information of the current drivable area is updated into the historical drivable area database, at least one piece of old data is discarded, so that the algorithm eliminates the influence of outdated information while avoiding the interference of individual outliers.
  • FIG. 1 is a schematic diagram of the hardware structure of a vehicle-mounted computing system provided by an embodiment of the present application
  • FIG. 2 is a schematic diagram of a logical structure of an intelligent driving system provided by an embodiment of the present application
  • FIG. 3 is a schematic time-sequence diagram of drivable area fusion determination provided by an embodiment of the present application;
  • FIG. 4 is a schematic diagram of a fusion of a current drivable area and a historical drivable area provided by an embodiment of the present application;
  • FIG. 5 is a schematic diagram of a sensing range of an intelligent driving system provided by an embodiment of the present application.
  • FIG. 6 is a schematic flowchart of a method for judging a drivable area applied to intelligent driving according to an embodiment of the application
  • FIG. 7 is a schematic flowchart of another method for judging a drivable area applied to intelligent driving provided by an embodiment of the application;
  • FIG. 8 is a schematic diagram of an apparatus for judging a drivable area provided by an embodiment of the application.
  • the embodiments of the present application are applied to the field of smart cars, and as shown in FIG. 1 , it is a schematic diagram of the hardware structure of a vehicle-mounted computing system applicable to the embodiments of the present application.
  • the in-vehicle computing system may include an in-vehicle processing system 101 , and devices/devices/networks directly or indirectly connected to the in-vehicle processing system 101 .
  • the in-vehicle processing system 101 includes a processor 103 and a system memory 135 , and the processor 103 is connected to other components/interfaces on the in-vehicle processing system 101 through a system bus 105 .
  • the processor 103 may be one or more processors, each of which may include one or more processor cores.
  • the in-vehicle processing system 101 is connected to other components of the vehicle, such as the display 109, the interactive device 117, the multimedia device 121, the positioning device 123, and the sensing device 153 (cameras and various sensors, etc.).
  • the in-vehicle processing system 101 is connected to the external network 127 through the network interface 129 , and exchanges information with the server 149 through the external network 127 .
  • the network interface 129 may send and/or receive communication signals.
  • the system bus 105 is coupled to an input/output (I/O) bus 113 through a bus bridge 111 .
  • the I/O interface 115 communicates with various I/O devices, such as the interactive device 117 (for example, a keyboard, a mouse, or a touch screen), a multimedia device 121 (for example, a compact disc read-only memory (CD-ROM) or a multimedia interface), the positioning device 123, a universal serial bus (USB) interface 125, and the sensing device 153 (for example, a camera, which can capture still and dynamic digital video images).
  • the interaction device 117 is used to realize the message interaction between the smart car and the driver.
  • the driver can select the driving mode and driving style model of the smart car through the interaction device 117 .
  • the interaction device 117 may be integrated with the display 109 .
  • the processor 103 may be any conventional processor, including a reduced instruction set computing (reduced instruction set computer, RISC) processor, a complex instruction set computing (complex instruction set computer, CISC) processor, or a combination thereof.
  • the processor may be a dedicated device such as an application specific integrated circuit (ASIC).
  • the processor 103 may be a neural network processor or a combination of a neural network processor and the above-mentioned conventional processors.
  • the processor 103 may include a main controller (also referred to as a central controller) and an advanced driver assistance system controller. Among them, the main controller is the control center of the computer system.
  • the advanced driver assistance system controller is used to control the route of automatic driving or assisted automatic driving, etc.
  • the display 109 may be any one or more display devices installed in the vehicle.
  • the displays 109 may include: a head up display (HUD), an instrument panel, a display dedicated to a passenger, and the like.
  • the onboard processing system 101 may communicate with the deployed server 149 via the network interface 129.
  • Network interface 129 may be a hardware network interface, such as a network card.
  • the network 127 may be an external network, such as the Internet, or an internal network, such as an Ethernet network or a virtual private network (VPN).
  • the network 127 may also be a wireless network, such as a WiFi network, a cellular network, and the like.
  • the memory interface 131 is coupled to the system bus 105 .
  • the memory interface is connected to the memory.
  • System memory 135 is coupled to system bus 105 .
  • Data running in system memory 135 may include operating system 137 and application programs 143 .
  • the operating system 137 includes a shell 139 and a kernel 141 .
  • the shell 139 is an interface between the user and the kernel of the operating system.
  • the shell is the outermost layer of the operating system. The shell manages the interaction between the user and the operating system, waits for user input, interprets the user's input to the operating system, and processes various operating system output.
  • Kernel 141 consists of those parts of the operating system that manage memory, files, peripherals, and system resources. Interacting directly with hardware, the operating system kernel usually runs processes and provides inter-process communication, providing CPU time slice management, interrupts, memory management, and IO management, among others.
  • the intelligent driving system 143 includes programs related to controlling the automatic driving of the vehicle.
  • For example, a program for processing the information about the surrounding environment of the vehicle obtained by in-vehicle devices, such as a program implementing the drivable area determination method provided by the embodiments of the present application.
  • Another example is a program that controls the route or speed of an autonomous vehicle, a program that controls the interaction between the autonomous vehicle and other autonomous vehicles on the road, and so on.
  • the intelligent driving system 143 may also exist on the system of the server 149 . In one embodiment, when the intelligent driving system 143 needs to be executed, the in-vehicle processing system 101 may download the installation program of the intelligent driving system 143 from the server 149 .
  • the perception device 153 is associated with the onboard processing system 101 .
  • the perception device 153 is used to detect the environment around the vehicle.
  • the perception device 153 can detect animals, other vehicles, obstacles, pedestrian crossings, lane lines, etc. around the vehicle; further, the perception device 153 can also detect the environment around objects such as the above-mentioned animals, vehicles, obstacles, and pedestrian crossings, for example, the weather conditions and the brightness of the surrounding environment.
  • Perception devices can include cameras, infrared sensors, chemical detectors, and microphones.
  • the sensing device 153 may also include a speed sensor for measuring speed information (such as velocity and acceleration) of the vehicle (that is, the vehicle in which the vehicle-mounted computing system shown in FIG. 1 is located).
  • the sensing device 153 may further include a lidar sensor for detecting the reflected signal of the laser signal sent by the lidar, thereby obtaining a laser point cloud.
  • Lidar can be mounted above the vehicle to send laser signals.
  • the positioning device 123 includes a global positioning system (GPS), an inertial navigation system (INS) and other devices or subsystems used to determine the position of the vehicle.
  • the processor 103 executes various instructions to implement various functions of the intelligent driving system 143 .
  • the embodiment of the present application provides a structural example of the intelligent driving system 143 .
  • the intelligent driving system 143 may include a detection module 1431 , a query module 1432 , a fusion module 1433 , and a vehicle control module 1434 .
  • the detection module 1431 is used to obtain the environmental information around the location of the vehicle and determine the current drivable area. Specifically, the detection module 1431 determines the surrounding environment information through the positioning device 123 and the sensing device 153, including information on obstacles in the area around the vehicle (such as the position, size, attitude, and speed of obstacles, including but not limited to people, vehicles, and roadblocks) as well as lane information. The detection module 1431 then determines the current drivable area according to this surrounding environment information.
  • the query module 1432 is used to query the historical drivable area database according to the location of the vehicle and obtain the information of the corresponding historical drivable area, that is, the drivable path previously recorded at that location.
  • the fusion module 1433 is configured to superimpose the current drivable area and the historical drivable area to obtain the current drivable area.
  • the fusion module 1433 is specifically configured to determine whether the superimposed drivable area includes a drivable lane that satisfies the first length, and if so, to use that drivable lane as the drivable area for this determination.
  • the fusion module 1433 notifies the vehicle control module 1434 to control the driving path of the vehicle according to the output current drivable area.
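  • The module split described above can be summarized by the following skeleton; it is an illustrative sketch only, and the method names and signatures are assumptions rather than the application's interfaces. Only the four-module structure of FIG. 2 (detection, query, fusion, vehicle control) is taken from the description.

    class IntelligentDrivingSystem:
        """Skeleton of the logical structure of FIG. 2 (illustrative only)."""

        def __init__(self, detection, query, fusion, vehicle_control):
            self.detection = detection              # detection module 1431
            self.query = query                      # query module 1432
            self.fusion = fusion                    # fusion module 1433
            self.vehicle_control = vehicle_control  # vehicle control module 1434

        def determine_drivable_area(self, sensor_frame, position):
            # 1) detect the current drivable area from the surrounding environment
            current_area, lane_lines = self.detection.detect(sensor_frame, position)
            # 2) query the historical drivable area database by position
            history = self.query.lookup(position)
            # 3) fuse current and historical areas; fall back to the current area
            #    when no fused area contains a lane of the first length
            this_area = self.fusion.fuse(current_area, history, lane_lines)
            # 4) hand the result to path planning / vehicle control
            self.vehicle_control.plan(this_area)
            return this_area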
  • the memory 133 is used to store historical drivable area information and maps (such as high-precision maps), and to update the stored content.
  • the detection module 1431 acquires the information of the map in a certain area around the vehicle from the memory 133 according to the current position of the vehicle.
  • the information of the map includes: road signs in a certain area around the vehicle, such as road lines, lane lines, parking lines, and the like.
  • the detection module 1431 exchanges information with the sensing device 153, obtains obstacles and road signs in the area around the vehicle, and performs registration operations.
  • the memory 133 may be a memory specially used for storing maps, or may be a general memory. This embodiment of the present application does not limit this.
  • the above modules can be implemented by software and/or hardware. And, any one or more modules can be set independently or integrated together. This embodiment of the present application does not specifically limit this. In one example, any one or more of these modules may act as logical function modules in the main controller or in the ADAS controller.
  • the in-vehicle computing system may be entirely located on the vehicle, or part of the processing logic may be located in a cloud network connected to the vehicle through the network.
  • the onboard processing system 101 may be located remotely from the autonomous vehicle and may communicate wirelessly with the autonomous vehicle.
  • some of the processes described herein are performed on a processor disposed within the autonomous vehicle, others are performed by a remote processor.
  • all or part of the functions of the query module 1432 are implemented on a server deployed in the cloud.
  • FIG. 1 is only an example, which does not limit the computer system to which the embodiments of the present application are applicable.
  • the drivable area in this embodiment of the present application may include a structured road surface, a semi-structured road surface, an unstructured road surface, and other road surface areas that can be used for driving a smart car.
  • Structured pavement generally has a road edge line and a single pavement structure, such as urban arterial roads, expressways, national roads, provincial roads, etc.
  • the structural layer of this pavement implements certain standards, and the color and material of the surface layer are uniform.
  • Semi-structured pavement refers to the general non-standardized pavement.
  • the surface layer of the pavement has a large difference in color and material, such as parking lots, squares, etc., as well as some branch roads.
  • Unstructured pavement has no structural layer; it corresponds to natural road scenes.
  • Autonomous driving needs to realize path planning, and it is necessary to realize the detection of drivable areas.
  • the drivable area is generally detected based on image segmentation technology.
  • automatic driving can solve the detection and identification of structured and unstructured roads.
  • For unmanned vehicles in the wild, it is necessary to solve the detection of unstructured roads.
  • The basic methods obtain the basic structural features of the road surface based on the road surface color, road model, and road surface texture features. From these features, the road edge lines and the basic direction of the road (go straight, turn left, turn right, sharp left, sharp right) and other latent information are further obtained. These features can be extracted by traditional segmentation and extraction methods or by machine learning methods.
  • the detection of the drivable area is mainly used for automatic driving to provide path planning assistance.
  • the detection can cover the entire road surface, or only part of the road information can be extracted, such as the road direction, the road midpoint, or the lane lines in a certain area ahead; as long as road path planning and obstacle avoidance can be realized in combination with a high-precision map, it is not necessary to extract the complete drivable area of the road surface.
  • the intelligent driving system 143 can further perform an analysis function and divide the road surface within the detection range into lanes.
  • the drivable area detection result after dividing the lanes is used in the subsequent processing.
  • the drivable area detection method based on the network model generally uses a large amount of driving data to train the network model in the training stage. After the network model is fixed, the drivable area determination method will not be gradually optimized as the number of driving repetitions increases. For the repeated route driving that often occurs in conventional driving, the previous historical driving experience cannot be referenced in time through the network model.
  • the embodiment of the present application proposes an experience-based drivable area determination method, which stores the previous drivable route, and continuously optimizes the historical drivable area information through an algorithm.
  • the current drivable area and the historical drivable area information are combined and superimposed to obtain an optimized drivable area for this determination.
  • the embodiment of the present application can store the historical drivable area information in the historical drivable area database as binary maps, so that the historical drivable area information is kept under persistent storage without occupying much space.
  • the stored historical drivable area information can be continuously optimized with the number of repetitions of the trip. In the above manner, the embodiment of the present application enables the determination of the drivable area to have the self-learning capability.
  • FIG. 3 is a schematic diagram of the time sequence for determining a drivable area provided by an embodiment of the present application.
  • T1, T2, T3, ... respectively denote the first, second, third, ... time the vehicle passes through the same determination location.
  • The smart car passes the determination point for the first time at T1; the vehicle-mounted computing system performs drivable area initialization in stage T1 and obtains the drivable area information for that pass of the determination point.
  • In subsequent passes, the drivable area is determined and updated by fusion. Specifically:
  • a drivable area determination is made at a determination point the vehicle passes; if no historical drivable area record related to the location to be determined is matched in the historical drivable area database, the location is processed according to the operations of stage T1.
  • the detection module is first called to detect the drivable area and output the drivable area information.
  • Store the drivable area information output in the T1 stage including drivable area description information, GPS data, road feature points, and so on.
  • the drivable area information output in the T1 stage is used as the initialized historical drivable area information and recorded in the historical area database.
  • the GPS data is used as a reference for storage, and a unique tag number can be assigned to identify each record, so as to facilitate query.
  • If a matching record exists, the historical drivable area information is read from the historical area database, and the operations of stage T2 are performed.
  • Specifically, the GPS data of the determination location is used to match the GPS data in the historical database; after a successful match, the tag number of the historical drivable area is output, and the historical drivable area information under the corresponding tag number is retrieved for fusion.
  • the detection module detects the current drivable area at the determined location.
  • the fusion method is invoked to fuse the current drivable area with the historical drivable area.
  • the drivable area determination method is used to obtain the drivable area for this pass.
  • The drivable area output this time is used as historical information to update the historical area information; the updated content is the drivable area information under the corresponding tag number.
  • When the vehicle passes through the determination point again (stages T3, T4, ...), the operations of stage T2 are repeated, and optimized drivable area information is output.
  • FIG. 4 is a schematic diagram of drivable area fusion provided by an embodiment of the present application.
  • Diagram (a) shows the drivable area recorded in the historical drivable area database when the vehicle is at the current determination location, indicated by forward slashes.
  • Diagram (b) shows the current drivable area detected by the detection module, indicated by backslashes.
  • Diagram (c) shows the optimized drivable area output after fusion with the historical drivable area determination.
  • the optimized drivable area can be combined with historical experience information, and the smallest drivable area containing the optimal route can be output preferentially.
  • For example, the non-drivable area of the right lane in diagram (a) may be the entry area of a ramp, where decelerating vehicles are about to enter the ramp and obstacles are highly probable, whereas the middle lane is a through lane and is most probably clear.
  • FIG. 5 is a schematic diagram of the sensing range of a sensing device provided by an embodiment of the present application; the embodiment of the present application plans the drivable path within this sensing range.
  • the embodiment of the present application provides a method for storing historical drivable area information; the historical drivable area information may be stored as a low-resolution binary image (for example, 128×128).
  • the granularity of the recorded historical drivable area information may be one meter, that is, one piece of data is stored for every 1 meter the vehicle moves, and the stored data includes three parts: drivable area information, GPS data, and road feature points.
  • the drivable area is stored as a binary image
  • the road feature points are the scale-invariant feature transform (SIFT) point data after the camera has eliminated moving objects according to target detection, and the road feature points can be stored as text
  • The GPS data is stored in a list and has a unique tag number that associates the GPS data with the binary map and the feature point data.
  • the corresponding tag number can be queried through GPS data, and the corresponding binary map and feature point data can be obtained according to the tag number.
  • the system may map the GPS data to areas within a range of no more than one meter to the same tag number.
  • For each area, k pieces of historical drivable area information are kept, denoted S n-1, S n-2, ..., S n-k, representing the drivable area information output on the (n-1)-th, (n-2)-th, ..., (n-k)-th passes through the area.
  • the value of k may be 10.
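  • A possible in-memory layout for such a record is sketched below: a per-segment structure holding the GPS fix, the SIFT road feature points, and the previous k binary drivable-area maps, plus a tag index that maps GPS data quantized to roughly one meter onto tag numbers. The field names and the crude quantization by rounding degrees are illustrative assumptions, not the application's storage format.

    from dataclasses import dataclass, field
    import numpy as np

    @dataclass
    class HistoryRecord:
        tag: int                  # unique tag number identifying this segment
        gps: tuple                # (latitude, longitude) stored with the record
        keypoints: np.ndarray     # SIFT keypoint coordinates (x, y), moving objects removed
        descriptors: np.ndarray   # corresponding SIFT descriptors
        past_areas: list = field(default_factory=list)  # S n-1 ... S n-k binary maps (e.g. 128x128), newest first

    def quantize_gps(lat: float, lon: float, cell_deg: float = 1e-5) -> tuple:
        """Map GPS fixes lying within roughly one meter of each other to the same
        key (1e-5 degrees of latitude is on the order of one meter)."""
        return (round(lat / cell_deg), round(lon / cell_deg))

    class TagIndex:
        """Associates quantized GPS data with tag numbers (dict-like lookup)."""

        def __init__(self):
            self._index = {}

        def get(self, key, default=None):
            return self._index.get(key, default)

        def assign(self, lat: float, lon: float) -> int:
            """Return the tag for this position, creating a new one on first visit."""
            key = quantize_gps(lat, lon)
            if key not in self._index:
                self._index[key] = len(self._index)   # next unused tag number
            return self._index[key]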
  • an embodiment of the present application provides a schematic flowchart of a method for fusion determination of a drivable area, and the method includes:
  • Step 601 query the historical drivable area database according to the location positioning information (GPS) to obtain the information of the corresponding historical drivable area.
  • Step 602 Register the current drivable area with the historical drivable area, that is, align the historical drivable area with the current drivable area according to feature points such as lane lines.
  • the embodiments of the present application may use the feature points combined with SIFT to complete the registration operation through the existing general feature extraction and registration methods for computer vision.
  • SIFT feature detection mainly includes the following basic steps:
  • Scale-space extrema detection: search image locations over all scales, identifying potential scale- and rotation-invariant interest points through a difference-of-Gaussian function.
  • Keypoint localization: at each candidate location, the position and scale are determined by fitting a model; keypoints are chosen based on their stability.
  • Orientation assignment: assign one or more orientations to each keypoint location based on the local gradient directions of the image; all subsequent operations on the image data are performed relative to the orientation, scale, and position of the keypoints, thereby providing invariance to these transformations.
  • Keypoint description: in the neighborhood around each keypoint, measure the local image gradients at the selected scale; these gradients are transformed into a representation that tolerates relatively large local shape deformations and lighting changes.
  • The road feature information we store is the SIFT feature point information of the region.
  • In the registration stage, we also perform SIFT feature extraction on the current area. Once the SIFT feature vectors of the two images have been generated, the Euclidean distance between keypoint feature vectors is used as the similarity metric for keypoints in the two images: take a keypoint of the current image and find the two closest keypoints in the historical image by traversal; if the closest distance divided by the second-closest distance is less than a certain threshold, the pair is accepted as a matching point pair.
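  • A sketch of this registration step is given below, assuming an OpenCV build that provides cv2.SIFT_create and assuming, for simplicity, that the stored feature point coordinates and the binary drivable-area maps share one grid coordinate frame; the 0.75 ratio threshold and the choice of a homography are likewise assumptions, not requirements of the application.

    import cv2
    import numpy as np

    def extract_road_features(gray_image, static_mask=None):
        """Detect SIFT features; an optional mask removes regions covered by
        moving objects so that only road features remain."""
        sift = cv2.SIFT_create()
        keypoints, descriptors = sift.detectAndCompute(gray_image, static_mask)
        points = np.float32([kp.pt for kp in keypoints])
        return points, descriptors

    def register_history_to_current(cur_pts, cur_desc, hist_pts, hist_desc,
                                    hist_drivable_map, ratio=0.75):
        """Align a historical drivable area with the current one using SIFT
        matching and the nearest/second-nearest distance ratio test."""
        matcher = cv2.BFMatcher(cv2.NORM_L2)
        matches = matcher.knnMatch(hist_desc, cur_desc, k=2)

        # keep a pair only if closest distance / second-closest distance < ratio
        good = [pair[0] for pair in matches
                if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance]
        if len(good) < 4:                    # a homography needs at least 4 pairs
            return None

        src = np.float32([hist_pts[m.queryIdx] for m in good]).reshape(-1, 1, 2)
        dst = np.float32([cur_pts[m.trainIdx] for m in good]).reshape(-1, 1, 2)
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        if H is None:
            return None

        h, w = hist_drivable_map.shape[:2]
        return cv2.warpPerspective(hist_drivable_map, H, (w, h),
                                   flags=cv2.INTER_NEAREST)   # keep the map binary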
  • Step 603 Superimpose the current drivable area and the historical drivable area.
  • Step 604 Determine the current drivable area according to the superimposed drivable area.
  • The superposition yields S com, which is the highest-priority output. If no complete lane exists in S com, the area obtained by superimposing the current drivable area S n with only the previous historical drivable area S n-1 is used as the output.
  • the source of lane line information is not limited, and may come from a detection module in the system or other existing methods.
  • A complete lane may mean that, with the vehicle as the reference, a lane contains no obstacle target within the perception range.
  • If the area obtained by superimposing S n and S n-1 does not contain a complete lane either, S n, i.e., the output of the drivable area detection module, is used as the output.
  • The fused areas are generated as follows: S com is obtained by taking the logical AND of the previous k historical drivable areas S n-1, ..., S n-k with the current drivable area S n; the fallback area is obtained by taking the logical AND of only the previous historical drivable area S n-1 with the current drivable area S n.
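  • With the drivable areas stored as registered binary maps, this superposition can be written as an element-wise logical AND, as in the sketch below; the operator choice follows the "(and)" above, and the function names are illustrative assumptions.

    import numpy as np

    def superimpose(*areas):
        """Element-wise AND of registered binary drivable-area maps: a cell is
        drivable in the fused map only if it is drivable in every input map."""
        fused = areas[0].astype(bool)
        for area in areas[1:]:
            fused &= area.astype(bool)
        return fused.astype(np.uint8)

    def generate_fused_areas(s_n, history):
        """history = [S n-1, ..., S n-k], newest first.
        Returns (S com, fallback): the AND over the current area and all k
        historical areas, and the AND over the current and the previous area only."""
        s_com = superimpose(s_n, *history)
        s_fallback = superimpose(s_n, history[0]) if history else s_n
        return s_com, s_fallback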
  • This embodiment also provides a method for updating the historical drivable area information. If the current drivable area S n contains a complete lane, the historical drivable area information is updated: S n is kept as historical area information and the historical area information from S n-k and earlier is discarded (forgetting); otherwise, the historical drivable area is not updated (bad-value discarding).
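  • This update/forgetting rule might look like the sketch below: when S n contains a complete lane it is pushed into the history for the tag and anything older than the k most recent areas is dropped, otherwise the history is left untouched (bad-value discarding). The value k = 10 follows the example above; the record layout is the illustrative one from the storage sketch, so the code is an assumption rather than the application's implementation.

    K_HISTORY = 10   # number of historical areas kept per tag (example value above)

    def update_history(record, s_n, has_complete_lane, k=K_HISTORY):
        """Update the historical drivable-area information for one tag.

        record            -- the record for this tag (see the storage sketch)
        s_n               -- the current drivable area (binary map)
        has_complete_lane -- result of the first-length check on s_n
        """
        if not has_complete_lane:
            return                            # bad value: do not pollute the history
        record.past_areas.insert(0, s_n)      # newest first: S n, S n-1, ...
        del record.past_areas[k:]             # forget anything older than k passes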
  • FIG. 7 is a schematic diagram of a process flow for determining a drivable area provided by an embodiment of the present application. The flow includes:
  • Step 701 The sensing device acquires environmental information around the vehicle.
  • Step 702 The detection module obtains the environmental information around the vehicle input by the sensing device, and obtains the lane line information.
  • Step 703 The detection module obtains positioning information, and obtains the current drivable area according to the environmental information.
  • Step 704 The query module searches the historical drivable area information in the historical drivable area database in combination with the positioning information.
  • Step 705 The fusion module combines feature point matching to register the historical drivable area with the current drivable area (aligning lane lines, etc.).
  • Step 706 Perform S com area fusion with the current drivable area information and the registered historical drivable area information.
  • Step 707 Combined with the lane line detection result, if the S com contains a complete lane, output it as the determined drivable area.
  • Step 708 If a complete lane is not contained in S com, proceed to fuse the current drivable area with only the previous historical drivable area.
  • Step 709 Combined with the lane line detection results, if that fused area contains a complete lane, it is output as the determined drivable area.
  • Step 710 If it does not contain a complete lane either, S n is directly output as the determined drivable area.
  • Step 711 After the determined drivable area is output, the historical area is updated according to the aforementioned method of the embodiment of the present application, and the updated historical area serves as the historical drivable area database in subsequent determinations.
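  • Tying the sketches above together, one determination cycle along the lines of steps 704 and 706 to 711 might look like the following; every helper used here (query_history, generate_fused_areas, find_drivable_lane, update_history) is one of the illustrative functions sketched earlier, not an API defined by the application, and the registration of step 705 is assumed to have been applied already so that all maps share one grid.

    def determine_this_drivable_area(s_n, position_key, tag_index, history_db,
                                     lane_boundaries, perception_rows):
        """One determination cycle for a single tag (illustrative sketch)."""
        # step 704: look up the historical drivable areas for this position
        records = query_history([position_key], tag_index, history_db)
        history = records[0].past_areas if records else []

        # steps 706-710: fuse and choose the output with priority
        # S com, then the fallback area, then the current area S n
        s_com, s_fallback = generate_fused_areas(s_n, history)
        output = s_n
        for candidate in (s_com, s_fallback):
            if find_drivable_lane(candidate, lane_boundaries, perception_rows) is not None:
                output = candidate
                break

        # step 711: update the history, discarding S n if it has no complete lane
        if records:
            has_lane = find_drivable_lane(s_n, lane_boundaries, perception_rows) is not None
            update_history(records[0], s_n, has_lane)
        return output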
  • the drivable area determination is continuously updated and optimized as the number of repetitions increases.
  • the definition and storage method of the historical drivable area is used to represent and persist the historical experience information of the drivable area in automatic driving.
  • the aforementioned fusion determination method is used to fuse the current drivable area with the historical drivable area, and output the drivable area where the optimal route is located.
  • the update method of the historical drivable area is used to update the historical drivable area and can set forgetting rules and discarding rules, so that the algorithm forgets long-past information or discards old data after a new drivable area is added, updating road conditions in time and avoiding the interference of individual outliers.
  • the embodiments of the present application provide a method and process for determining a drivable area based on historical information, combined with lane line detection, and output the optimal drivable area by fusing the stored historical drivable area information with the current drivable area.
  • the embodiment of the present application provides a storage method and an update process of a historical drivable area, converts the historical drivable area into a binary image, and stores it according to the current positioning information, which reduces the space required for storage.
  • the embodiment of the present application provides a self-learning update method for historical drivable areas, which stores historically determined drivable areas through coding and combines them with the real-time determination of the current drivable area, so that the selection of the drivable area is gradually optimized as the historical records grow.
  • the storage manner of the historical drivable area information may be other manners, for example, storing a numerical matrix, storing it as map information, and the like.
  • the fusion determination method of the historical drivable area and the current drivable area may be other methods, for example, adding or weighted multiplication of the historical area and the current area.
  • the method for updating the historical drivable area may also be other methods, for example, updating the historical drivable area by using the drivable area after fusion determination.
  • the embodiment of the present application enables the drivable area determination to preferentially output the drivable area where the optimal route is located, improves the accuracy of drivable area determination, reduces the exploration process of the automatic driving algorithm, and improves the riding experience of automatic driving.
  • the judgment algorithm has self-learning ability and realizes automatic optimization based on historical records.
  • the apparatus 800 for determining a drivable area may be a computing device, or may be a module in an intelligent driving system or a mobile data center (MDC), in order to realize the functions described in the foregoing processes.
  • the drivable area determination device 800 at least includes: a processor 810, a communication interface 820 and a memory 830, the processor 810, the communication interface 820 and the memory 830 are connected to each other through a bus 840, wherein,
  • the processor 810 may have various specific implementation forms.
  • the processor 810 executes related operations according to program units stored in the memory, and the program units may be instructions, or computer programs.
  • the processor 810 may be a central processing unit (central processing unit, CPU) or a graphics processing unit (graphics processing unit, GPU), and the processor 810 may also be a single-core processor or a multi-core processor.
  • the processor 810 may be a combination of a CPU and a hardware chip.
  • the above-mentioned hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD) or a combination thereof.
  • the above-mentioned PLD can be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), a general-purpose array logic (generic array logic, GAL) or any combination thereof.
  • the processor 810 can also be independently implemented by a logic device with built-in processing logic, such as an FPGA or a digital signal processor (digital signal processor, DSP).
  • the communication interface 820 can be a wired interface or a wireless interface for communicating with other modules or devices; the wired interface can be an Ethernet interface, a controller area network (CAN) interface, a local interconnect network (LIN) interface, or a FlexRay interface, and the wireless interface can be a cellular network interface, a wireless local area network interface, or the like.
  • the communication interface 820 in this embodiment of the present application may be specifically configured to receive environmental data collected by the sensing device 153 .
  • the bus 840 may be a CAN bus or other internal bus for interconnecting various systems or devices in the vehicle.
  • the bus 840 can be divided into an address bus, a data bus, a control bus, and the like. For ease of presentation, only one thick line is used in FIG. 8, but it does not mean that there is only one bus or one type of bus.
  • the drivable area determination apparatus may further include a memory 830, and the storage medium of the memory 830 may be a volatile memory and a nonvolatile memory, or may include both volatile and nonvolatile memories.
  • the non-volatile memory may be read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), or flash memory.
  • Volatile memory may be random access memory (RAM), which acts as an external cache.
  • Many forms of RAM are available, for example dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), enhanced synchronous dynamic random access memory (ESDRAM), synchronous link dynamic random access memory (SLDRAM), and direct rambus RAM (DR RAM).
  • the memory 830 can also be used to store program code and data, so that the processor 810 can call the program code stored in the memory 830 to implement the functions of the intelligent driving system in the foregoing embodiments.
  • the drivable area determination device 800 may include more or less components than those shown in FIG. 8 , or have different component configurations.
  • the above embodiments may be implemented in whole or in part by software, hardware, firmware or any other combination.
  • the above-described embodiments may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded or executed on a computer, all or part of the processes or functions described in the embodiments of the present application are generated.
  • the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or in a wireless manner (for example, infrared, radio, or microwave).
  • the computer-readable storage medium may be any available medium that a computer can access, or a data storage device such as a server, a data center, or the like containing one or more sets of available media.
  • the usable media may be magnetic media (eg, floppy disks, hard disks, magnetic tapes), optical media (eg, DVDs), or semiconductor media.
  • the semiconductor medium may be a solid state drive (SSD).
  • the steps in the method of the embodiment of the present application may be sequentially adjusted, combined or deleted according to actual needs; the modules in the device of the embodiment of the present application may be divided, combined or deleted according to actual needs.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Theoretical Computer Science (AREA)
  • Analytical Chemistry (AREA)
  • Chemical & Material Sciences (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

A drivable area determination method applied to intelligent driving: a current drivable area is determined according to environment information around the position of a vehicle; a historical drivable area library is queried according to the position of the vehicle to obtain information of a corresponding historical drivable area (601); and the current drivable area is superimposed with the historical drivable area (603) to obtain the drivable area for this pass (604).

Description

可行驶区域判定的方法、智能驾驶系统和智能汽车 技术领域
本申请涉及智能汽车领域,尤其涉及应用于智能驾驶的可行驶区域判定的方法、智能驾驶系统和智能汽车。
背景技术
随着经济的发展,汽车的保有量迅速增加,而汽车技术也在与计算机技术发生越来越多的融合。近年来,智能汽车已成为车辆发展的新趋势,越来越多的汽车采用了辅助驾驶(driver assistance)、自动驾驶(automated driving)或智能网联驾驶(intelligent network driving)的系统。这类系统在行驶过程中通过车载图像采集设备和车载传感器等感知设备智能化探测障碍物、感知周围环境,利用车载计算平台(例如,移动数据中心(mobile data center,MDC))决策车辆的行驶路径并控制车辆的行驶状态。
自动驾驶技术具有广泛的应用前景和重要的研究意义。自动驾驶通过感知设备检测出道路和障碍物,自主地进行驾驶操作,可以提升驾驶的安全性,降低交通事故发生率,减少人员和经济的损失;同时,自动驾驶也可以与智能交通系统配合,更合理地分配道路资源,缓解城市拥堵。现阶段的自动驾驶技术还处在研究和测试阶段,而可行驶区域检测,则是高级辅助驾驶以及自动驾驶必不可少的环节。
可行驶区域检测方法,是根据当前感知设备的输入,确定可行驶区域的方法。现有技术旨在通过机器学习等手段,对路面进行即时判定,识别可行驶区域。而基于当前可行驶区域检测技术的自动驾驶,只是根据检测出的当前可行驶区域进行可行驶路径的规划控制,路径规划的准确度受限。
发明内容
本申请实施例提供了一种应用于智能驾驶的可行驶区域判定方法、智能驾驶系统和智能汽车,解决现有可行驶区域判定为即时判定,准确度受限,无法自学习的问题。
第一方面,本申请实施例提供了一种应用于智能驾驶的可行驶区域判定方法,包括:
智能驾驶系统获取车辆所处位置周围的环境信息,确定当前可行驶区域;
所述智能驾驶系统根据所述车辆所处位置,查询历史可行驶区域库,得到对应的历史可行驶区域的信息;
所述智能驾驶系统将所述当前可行驶区域与所述历史可行驶区域叠加,得到本次的可行驶区域。
本申请实施例通过将在先的可行驶区域的判定结果持久化到历史可行驶区域库,并应用于后续的可行驶区域判定过程,从而使得智能驾驶系统可以从当前可行驶区域选择本次的行驶路径,以输出包括最优路径的本次可行驶区域。
所述智能驾驶系统判断叠加得到的可行驶区域是否包含满足第一长度的可行驶车道,如果是,则将满足第一长度的可行驶车道作为本次的可行驶区域。当叠加得到的可行驶区域不包含满足第一长度的可行驶车道时,将当前可行驶区域作为所述本次的可行驶区域。
本申请实施例将车辆所在位置的历史可行驶区域作为判定本次可行驶区域的一个因素,在一种可能的实施方式中,仅在叠加得到的可行驶区域存在满足第一长度的可行驶车道时,才将该叠加得到的可行驶区域作为本次可行驶区域进行输出,此时,满足第一长度的可行驶车道可以作为最优路径,从而使得输出的本次可行驶区域准确度更高。
在一种可能的实施方式中,本申请实施例对当前可行驶区域和历史可行驶区域进行叠加,得到本次的可行驶区域可以为:
当所述当前可行驶区域与在先的k次历史可行驶区域叠加得到的可行驶区域包含满足第一长度的可行驶车道时,则将叠加得到的可行驶区域作为本次的可行驶区域;
当所述当前可行驶区域与在先的k次历史可行驶区域叠加得到的可行驶区域不包含满足第一长度的可行驶车道时,判断当前可行驶区域与上一次历史可行驶区域叠加得到的可行驶区域是否包含满足第一长度的可行驶车道,如果是,则将当前可行驶区域与上一次历史可行驶区域叠加得到的可行驶区域作为本次的可行驶区域;
其中,k为大于等于2的正整数。
上述可行驶区域叠加的两种方法,优选可以采用当前可行驶区域与前k次历史可行驶区域叠加,在当前可行驶区域与前k次历史可行驶区域叠加存在满足第一长度的可行驶车道,表示结合前k次的历史可行驶区域包含了最优的路径,且该路径在前k次行驶过程中同样大概率为优选路径;次优地,在当前可行驶区域与前k次历史可行驶区域叠加后不存在满足第一长度的可行驶车道时,可以将前可行驶区域与上一次历史可行驶区域叠加,若叠加后的区域存在满足第一长度的可行驶车道,则表示该满足第一长度的可行驶车道在本次和上一次行驶过程中均为可行驶车道。
在一种可能的实施方式中,在当前可行驶区域包括满足第一长度的可行驶车道时,将当前可行驶区域更新到历史可行驶区域库。
在另一种可能的实现方式中,本申请实施例可以使用叠加得到的可行驶区域更新历史可行驶区域库。例如:
当所述当前可行驶区域与在先的k次历史可行驶区域叠加得到的可行驶区域包含满足第一长度的可行驶车道时,则将叠加得到的可行驶区域作为本次的可行驶区域输出,且记录到历史可行驶区域库;
当所述当前可行驶区域与在先的k次历史可行驶区域叠加得到的可行驶区域不包含满足第一长度的可行驶车道,且当前可行驶区域与上一次历史可行驶区域叠加得到的可行驶区域包含满足第一长度的可行驶车道时,将当前可行驶区域与上一次历史可行驶区域叠加得到的可行驶区域作为本次的可行驶区域输出,且记录到历史可行驶区域库;
当前述叠加得到的可行驶区域均不包含满足第一长度的可行驶车道,且所述当前可行驶区域包含满足第一长度的可行驶车道时,将当前可行驶区域作为所述本次的可行驶区域输出,且记录到历史可行驶区域库。
所述智能驾驶系统还可以在历史可行驶区域库中某个定位数据对应的记录超过k条时,删除存储时间最久的记录。
通过上述更新方法,提高了历史可行驶区域库记载的数据的时效性,避免过老的数据存储在历史可行驶区域库中,影响后续可行驶区域的判定结果。
在一种示例中,前述的第一长度表示所述智能驾驶系统感知的范围,所述满足第一长度的可行驶车道表示在智能驾驶系统感知的范围内均为可行驶区域的车道。
在一种可能的实施方式中,历史可行驶区域库中记录有历史可行驶区域的信息,包括历史可行驶区域记录、定位数据以及道路特征点。所述智能驾驶系统以第二长度为粒度存储历史可行驶区域的信息。其中,所述第二长度可以为经验值,例如1米或10米等等,第二长度约小,则需要存储的历史可行驶区域的信息越多,占用更多的存储空间。
在一种示例中,所述历史可行驶区域的信息还包括标签号。每条历史可行驶区域记录对应一个标签号,所述标签号对应定位数据。智能驾驶系统可以根据车辆所在位置的定位数据查询该定位数据的标签号,输出与所述标签号匹配的所有历史可行驶区域的信息。
可以理解的是,输出的所述历史可行驶区域大于或等于所述智能驾驶系统的感知范围,即输出的历史可行驶区域覆盖车辆前进方向的感知范围。此时,当第二长度小于所述智能驾驶系统的感知范围半径时,智能驾驶系统获取所述车辆所在位置的延车辆前进方向长度大于或等于所述感知范围的历史可行驶区域,即获取前进方向上多个标签号对应的历史可行驶区域的信息,从而使得多个标签号对应的历史可行驶区域的集合表示的范围大于或等于所述车辆前进方向的感知范围。
在一种可能的实施方式中,在查询历史可行驶区域库,得到对应的历史可行驶区域的信息之后,所述方法还包括:所述智能驾驶系统根据所述车辆所在位置的道路特征点,对当前可行驶区域和历史可行驶区域进行配准,使得当前可行驶区域的道路特征点与查询得到的历史可行驶区域的信息中包括的道路特征点在地图上重合。
第二方面,本申请实施例提供了一种智能驾驶系统,包括:
检测模块,用于获取车辆所处位置周围的环境信息,确定当前可行驶区域;
查询模块,用于根据所述车辆所处位置,查询历史可行驶区域库,得到对应的历史可行驶区域的信息;
融合模块,用于将所述当前可行驶区域与所述历史可行驶区域叠加,得到本次的可行驶区域。
在一种可能的实施方式中,所述融合模块,具体用于判断叠加得到的可行驶区域是否包含满足第一长度的可行驶车道,如果是,则将满足第一长度的可行驶车道作为本次的可行驶区域。
所述融合模块,具体用于当叠加得到的可行驶区域不包含满足第一长度的可行驶车道时,将当前可行驶区域作为所述本次的可行驶区域。
所述融合模块,具体用于当所述当前可行驶区域与在先的k次历史可行驶区域叠加得到的可行驶区域包含满足第一长度的可行驶车道时,将叠加得到的可行驶区域作为本次的可行驶区域;
所述融合模块,具体用于当所述当前可行驶区域与在先的k次历史可行驶区域叠加得到的可行驶区域不包含满足第一长度的可行驶车道时,判断当前可行驶区域与上一次历史可行驶区域叠加得到的可行驶区域是否包含满足第一长度的可行驶车道,如果是,则将当前可行驶区域与上一次历史可行驶区域叠加得到的可行驶区域作为本次的可行驶区域,其中,k为大于等于2的正整数。
所述融合模块,具体用于在当前可行驶区域包括满足第一长度的可行驶车道时,将当前可行驶区域更新到历史可行驶区域库。
历史可行驶区域的信息包括历史可行驶区域记录、定位数据以及道路特征点。
所述融合模块,具体用于根据所述车辆所在位置的道路特征点,对当前可行驶区域和历史可行驶区域进行配准,使得当前可行驶区域的道路特征点与查询得到的历史可行驶区域的信息中包括的道路特征点在地图上重合。
第三方面,本申请实施例还提供了一种智能汽车,包括处理器、存储器以及感知设备,
所述存储器,用于存储历史可行驶区域库;
所述感知设备,用于获取车辆所处位置周围的环境信息;
所述处理器,用于执行指令以实施如第一方面任意具体实现方式中所描述方法。
第四方面,本申请实施例提供一种可行驶区域判定装置,包括处理器、通信接口以及存储器;所述存储器用于存储指令,所述处理器用于执行所述指令,所述通信接口用于接收或者发送数据;其中,所述处理器执行所述指令以实施如上述第一方面的任意具体实现方式中所描述方法。
第五方面,本申请实施例提供一种非瞬态计算机存储介质,所述计算机介质存储有计算机程序,所述计算机程序被处理器执行时实现如上述第一方面或者第一方面的任意具体实现方式中所描述方法。
本申请在上述各方面提供的实现方式的基础上,还可以进行进一步组合以提供更多实现方式。
本申请实施例公开了一种应用于智能驾驶的可行驶区域判定方法,根据车辆所处位置周围的环境信息,确定当前可行驶区域,根据所述车辆所处位置,查询历史可行驶区域库,得到对应的历史可行驶区域的信息,将所述当前可行驶区域与所述历史可行驶区域叠加,得到本次的可行驶区域。本申请实施例提供了基于历史可行驶区域的信息判定本次可行驶区域方法,通过存储的历史可行驶区域信息与当前可行驶区域融合,输出最优的可行驶区域。进一步的,本申请实施例可以结合道路特征点检测,提高融合的准确度。本申请实施例还可以更新历史可行驶区域,及时将路况变动更新到历史记录中。更进一步的,本申请实施例还可以设置历史信息老化机制,在更新了本次的可形式区域的信息到历史可行驶区域库后,丢弃至少一条老数据,使得算法可以消除过时的信息的影响,同时避免受到个别异常值的干扰。
附图说明
图1是本申请实施例提供的一种车载计算系统的硬件结构示意图;
图2是本申请实施例提供的智能驾驶系统的逻辑结构示意图;
图3是本申请实施例提供的一种可行驶区域融合判定的时序流程示意图;
图4是本申请实施例提供的当前可行驶区域与历史可行驶区域的融合示意图;
图5是本申请实施例提供的一种智能驾驶系统感知范围示意图。
图6是申请实施例提供的一种应用于智能驾驶的可行驶区域判断方法流程示意图;
图7是申请实施例提供的另一种应用于智能驾驶的可行驶区域判断方法流程示意图;
图8是申请实施例提供的一种可行驶区域判断装置示意图。
具体实施方式
下面结合附图对本申请实施例进行详细的阐述。
本申请实施例应用于智能汽车领域,如图1所示,为可适用于本申请实施例的一种车载计算系 统的硬件结构示意图。该车载计算系统可以包括车载处理系统101,以及与车载处理系统101直接或间接连接的设备/器件/网络等。
参见图1,车载处理系统101包括处理器103和系统内存135,处理器103通过系统总线105与车载处理系统101上的其他部件/接口相连。处理器103可以是一个或者多个处理器,其中每个处理器都可以包括一个或多个处理器核。车载处理系统101通过各种接口(例如,适配器107、I/O接口115、USB接口125等)与车载的其他部件相连,例如,显示器109、交互设备117、多媒体设备121、定位设备123以及感知设备153(摄像头和各种传感器等)。车载处理系统101通过网络接口129连接到外部网络127,通过外部网络127与服务器149进行信息交互。所述网络接口129可以发送和/或接收通信信号。
在车载处理系统101中,系统总线105通过总线桥111和输入输出(input/output,I/O)总线113耦合。I/O接口115和多种I/O设备进行通信,比如交互设备117(如键盘、鼠标、触摸屏等),多媒体设备121(如只读光盘(compact disc read-only memory,CD-ROM)、多媒体接口等),定位设备123,通用串行总线(universal serial bus,USB)接口125和感知设备153(摄像头可以捕捉静态和动态数字视频图像)。
交互设备117,用于实现智能汽车和驾驶员的消息交互,驾驶员可以通过交互设备117选择智能汽车的驾驶模式和驾驶风格模型等。在一种可能的实施方式中,交互设备117可以与显示器109集成。
其中,处理器103可以是任何传统处理器,包括精简指令集计算(reduced instruction set computer,RISC)处理器、复杂指令集计算(complex instruction set computer,CISC)处理器或上述的组合。可选地,处理器可以是诸如专用集成电路(application specific integrated circuit,ASIC)的专用装置。可选地,处理器103可以是神经网络处理器或者是神经网络处理器和上述传统处理器的组合。可选地,处理器103可以包括:主控制器(也可以称作中控)和先进驾驶辅助系统控制器。其中,主控制器是计算机系统的控制中心。先进驾驶辅助系统控制器用于对自动驾驶或辅助自动驾驶的路线等进行控制。
显示器109,可以是车辆中安装的任意一个或多个显示设备。例如,显示器109可以包括:平视显示器(head up display,HUD)、仪表盘以及专供乘客使用的显示器等。
车载处理系统101可以通过网络接口129和服务器(deploying server)149通信。网络接口129可以是硬件网络接口,比如,网卡。网络127可以是外部网络,比如因特网,也可以是内部网络,比如以太网或者虚拟私人网络(virtual private network,VPN)。可选地,网络127还可以是无线网络,比如WiFi网络,蜂窝网络等。
存储器接口131和系统总线105耦合。存储器接口和存储器相连接。
系统内存135和系统总线105耦合。运行在系统内存135的数据可以包括操作系统137和应用程序143。
操作系统137包括壳层(shell)139和内核(kernel)141。shell 139是介于使用者和操作系 统之内核间的一个接口。shell是操作系统最外面的一层。shell管理使用者与操作系统之间的交互,等待使用者的输入,向操作系统解释使用者的输入,并且处理各种各样的操作系统的输出结果。
内核141由操作系统中用于管理存储器、文件、外设和系统资源的那些部分组成。直接与硬件交互,操作系统内核通常运行进程,并提供进程间的通信,提供CPU时间片管理、中断、内存管理,以及IO管理等等。
智能驾驶系统143包括控制车辆自动驾驶相关的程序。例如,对车载设备所获取到的包含车辆周围环境信息进行处理的程序,如用于实现本申请实施例所提供的可行驶路径判定方法的程序。又如,控制自动驾驶车辆路线或者速度的程序,控制自动驾驶车辆和路上其他自动驾驶车辆交互的程序等。智能驾驶系统143也可以存在于服务器149的系统上。在一个实施例中,在需要执行智能驾驶系统143时,车载处理系统101可以从服务器149下载智能驾驶系统143的安装程序。
感知设备153和车载处理系统101关联。感知设备153用于探测车辆周围的环境。举例来说,感知设备153可以探测车辆周围的动物、其他车辆、障碍物、人行横道和车道线等,进一步感知设备153还可以探测上述动物、车辆、障碍物和人行横道等物体周围的环境,例如,天气条件,以及周围环境的光亮度等。感知设备可以包括摄像头、红外线感应器、化学检测器,以及麦克风等。感知设备153还可以包括速度传感器,用于测量本车辆(即图1所示的车载计算系统所在的车辆)的速度信息(如速度、加速度等);角度传感器,用于测量车辆的方向信息,以及车辆与车辆周边的物体/对象之间的相对角度等。感知设备153还可以包括激光雷达传感器,用于探测激光雷达发送的激光信号的反射信号,从而得到激光点云。激光雷达可以安装在车辆上方,用于发送激光信号。
定位设备123,包括全球定位系统(global positioning system,GPS)、惯性导航系统(inertial navigation system,INS)等用于确定车辆位置的设备或子系统。
在本申请的一些实施例中,处理器103执行各种指令,实现智能驾驶系统143的各种功能。具体的,如图2所示,本申请实施例提供了一种所述智能驾驶系统143的结构示例,智能驾驶系统143可以包括检测模块1431,查询模块1432,融合模块1433,以及车辆控制模块1434。
检测模块1431,用于获取车辆所处位置周围的环境信息,确定当前可行驶区域。具体的,检测模块1431通过定位设备123和感知设备153确定周围的环境信息,包括车辆周边区域内的障碍物的信息(如障碍物的位置和大小等,包括但不限于人、车辆、路障等实物的位置、大小、姿态和速度),以及车道信息等等。检测模块1431根据周围的环境信息确定当前可行驶区域。
查询模块1432,用于根据所述车辆所处位置,查询历史可行驶区域库,得到对应的历史可行驶区域的信息.所述历史可行驶区域库中记录有历史上本车辆通过所述车辆所处位置时记录的可行驶路径。
融合模块1433,用于将所述当前可行驶区域与所述历史可行驶区域叠加,得到本次的可行驶区域。
所述融合模块1433,具体用于判断叠加得到的可行驶区域是否包含满足第一长度的可行驶车道,如果是,则将满足第一长度的可行驶车道作为本次的可行驶区域。
进一步的,融合模块1433通知车辆控制模块1434按照输出的本次可行驶区域控制车辆的行驶路径。
存储器133用于存储历史可行驶区域信息以及地图(如高精地图),并对存储的内容进行更新。检测模块1431根据车辆当前的位置,从存储器133中获取该车辆周边一定区域内的地图的信息。其中,所述地图的信息包括:该车辆周边一定区域内的道路标识,如道路线、车道线、停车线等。检测模块1433与感知设备153进行信息交互,获取车辆周边区域中的障碍物情况和道路标识等等,进行配准操作。
其中,存储器133可以是专门用于存储地图的存储器,也可以是通用的存储器。本申请实施例对此不进行限定。
上述各模块(检测模块1431,查询模块1432,融合模块1433,以及车辆控制模块1434)均可以通过软件和/或硬件实现。并且,其中的任意一个或多个模块可以独立设置,也可以集成在一起。本申请实施例对此不进行具体限定。在一个示例中,这些模块中的任意一个或多个可以作为主控制器中或者ADAS控制器中的逻辑功能模块。
该车载计算系统可以全部位于车辆上,或者部分处理逻辑位于与车辆通过网络相连的云端网络中。例如,车载处理系统101可位于远离自动驾驶车辆的地方,并且可与自动驾驶车辆无线通信。在其它方面,本文所述的一些过程在设置在自动驾驶车辆内的处理器上执行,其它由远程处理器执行。例如,查询模块1432的全部或部分功能在云端部署的服务器上实现。
需要说明的是,图1所示的计算机系统仅为示例,其不对本申请实施例可适用的计算机系统构成限定。
本申请实施例中的可行驶区域可以包括了结构化的路面、半结构化的路面、非结构化的路面等可用于智能汽车行驶的路面区域。结构化的路面一般是有道路边缘线,路面结构单一,比如城市主干道,高速、国道、省道等,这个路面的结构层执行一定的标准,面层的颜色和材质统一。半结构化的路面是指一般的非标准化的路面,路面面层是颜色和材质差异较大,比如停车场,广场等,还有一些分支道路。非结构化的路面没有结构层,天然的道路场景。自动驾驶需要实现路径规划,就必须要实现对可行驶区域的检测。在基于计算机视觉的自动驾驶系统中,一般基于图像分割技术检测可行驶区域。对于城市车辆,自动驾驶解决结构化路面和非结构化路面的检测和识别就可以,对于野外的无人驾驶车辆,需要解决非结构化路面的检测。
对不同的环境,有很多不同的检测方法,基本的方法有基于路面颜色、道路模型、路面纹理特征等获取路面的基本结构特征,通过这些特征进一步的获得道路边缘线、道路的基本方向(直走、左转、右转、左急转、右急转)等潜在信息。通过传统的分割提取方法或者机器学习的方法对这些特征可以进行提取。
可行驶区域的检测主要是用于自动驾驶提供路径规划辅助。检测可以是针对整个路面的检测,也可以只提取出部分的道路信息,比如前方一定区域内的道路走向或者道路中点或车道线等,只要能结合高精度地图实现道路路径规划和障碍物躲避即可,不一定要提取出完整的路面可行驶区域。而对于没有清晰的道路标识的半结构化或非结构化路面,此时路面上不存在车道的划分,智能驾驶系统143进一步还可以执行分析功能,对检测范围内的路面进行车道划分,并将划分车道后的可行驶区域检测结果应用在后续的处理过程中。
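The preceding paragraphs describe obtaining the drivable area by segmenting the road surface with traditional extraction or machine-learning methods. The sketch below is only a minimal illustration of turning a per-pixel segmentation output into the kind of low-resolution binary drivable-area mask used later in this description; the placeholder model, the drivable class index, and the 128x128 grid are assumptions for illustration, not part of the original disclosure.
```python
import numpy as np

DRIVABLE_CLASS = 1          # assumed index of the "road/drivable" class in the segmenter output
MASK_SHAPE = (128, 128)     # low-resolution binary map, matching the storage format described later

def segmentation_logits(image: np.ndarray, num_classes: int = 3) -> np.ndarray:
    """Placeholder for a real per-pixel road-surface classifier.

    Returns an array of shape (num_classes, H, W); here it is random, purely so the
    sketch runs end to end.
    """
    h, w = image.shape[:2]
    rng = np.random.default_rng(0)
    return rng.standard_normal((num_classes, h, w))

def drivable_mask_from_image(image: np.ndarray) -> np.ndarray:
    """Convert per-pixel class scores into a binary drivable-area mask S_n."""
    logits = segmentation_logits(image)
    labels = logits.argmax(axis=0)                   # per-pixel class decision
    mask = (labels == DRIVABLE_CLASS).astype(np.uint8)
    # Downsample to the low-resolution grid used for persistent storage.
    ys = np.linspace(0, mask.shape[0] - 1, MASK_SHAPE[0]).astype(int)
    xs = np.linspace(0, mask.shape[1] - 1, MASK_SHAPE[1]).astype(int)
    return mask[np.ix_(ys, xs)]

if __name__ == "__main__":
    frame = np.zeros((360, 640, 3), dtype=np.uint8)  # stand-in camera frame
    s_n = drivable_mask_from_image(frame)
    print(s_n.shape, s_n.dtype, int(s_n.sum()))
```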
对于自动驾驶场景中的重复道路,特别是通勤道路线,现有技术无法像人类驾驶员一样,在多次通行之后,对历史区域的路况进行预判。例如,哪个路段的部分车道存在坑洼,哪个路段为复杂路口,哪个路段有较多的路边停车,等等。而人类驾驶员可以根据经验进行预判,提前进行规避或采取更合理的驾驶策略。而基于网络模型的可行驶区域检测方法一般是在训练阶段,通过大量驾驶数据,训练网络模型,在网络模型固定之后,可行驶区域的判定方法不会随着驾驶重复次数增长而逐步优化。而对于常规驾驶上经常出现的重复线路驾驶,通过网络模型也无法及时参考在先的历史驾驶经验。
本申请实施例提出了一种基于经验的可行驶区域判定方法,将在先的可行驶路径存储下来,并通过算法持续优化历史可行驶区域信息。在行驶过程中进行可行驶区域判定时,将当前可行驶区域与历史可行驶区域信息组合叠加,得到优化的本次可行驶区域。
In a possible implementation, the embodiments of this application may store the historical drivable area information in the historical drivable area library as binary images, so that the information can be persisted without occupying a large amount of space. The stored historical drivable area information can be continuously refined as the same route is driven repeatedly. In this way, the drivable area determination acquires a self-learning capability.
FIG. 3 is a timing diagram of drivable area determination provided by an embodiment of this application, where T1, T2, T3, ... denote the first, second, third, ... time the vehicle passes the same judgment location. When the intelligent vehicle passes the judgment location for the first time, the in-vehicle computing system performs drivable area initialization in phase T1 and obtains the drivable area information for that pass; in the subsequent phases T2, T3, ..., the drivable area is determined by fusion and then updated. Specifically:
While the vehicle is driving, drivable area determination is performed at each judgment location it passes. If no historical drivable area record matching the location to be judged is found in the historical drivable area library, the location is processed according to the T1 phase.
In phase T1, the detection module is first invoked to perform drivable area detection and output the drivable area information. The drivable area information output in phase T1 is stored, including the drivable area description information, the GPS data, and the road feature points. The drivable area information output in phase T1 serves as the initialized historical drivable area information and is recorded in the historical area library. In a specific implementation, each recorded drivable area is stored keyed on its GPS data, and a unique label number can be assigned to identify each record so as to facilitate queries.
When the vehicle passes the judgment location for the second time (phase T2), the historical drivable area information is read from the historical area library and the T2-phase operations are performed. Specifically, the GPS data of the judgment location is matched against the GPS data in the historical library; after a successful match, the label number of the historical drivable area is output, and the historical drivable area information stored under that label number is retrieved for fusion.
In phase T2, the detection module detects the current drivable area at the judgment location. For the current drivable area output in phase T2, the fusion method is invoked to fuse the current drivable area with the historical drivable area, and the drivable area for this pass is obtained according to the drivable area determination method.
The drivable area output this time is then taken as historical information, and the historical area information is updated, that is, the drivable area information under the corresponding label number is updated.
When the vehicle passes the judgment location again (phases T3, T4, ...), the T2-phase operations are repeated and optimized drivable area information is output.
FIG. 4 is a schematic diagram of drivable area fusion provided by an embodiment of this application. Diagram (a) shows, with forward hatching, the drivable area recorded in the historical drivable area library for the current judgment location; diagram (b) shows, with reverse hatching, the current drivable area detected by the detection module; diagram (c) shows the optimized drivable area output after determination combined with the historical drivable area. The optimized drivable area can draw on historical experience information and preferentially output the smallest drivable area that contains the optimal path. For example, the non-drivable region of the right lane in diagram (a) may be a ramp merge area where vehicles decelerating to enter the ramp may be present, so obstacles are relatively likely there, whereas the middle lane is a through lane and is most likely to be clear. FIG. 5 is a schematic diagram of the range perceived by the sensing device provided by an embodiment of this application; in the embodiments of this application, the drivable path is planned within this perception range.
An embodiment of this application provides a method for storing historical drivable area information: the information may be stored as low-resolution binary images (for example, 128*128). Illustratively, the granularity of the recorded historical drivable area information may be one meter, that is, one record is stored for every meter the vehicle moves. Each stored record consists of three parts: the drivable area information, the GPS data, and the road feature points. The drivable area is stored as a binary image; the road feature points are scale-invariant feature transform (SIFT) points extracted from the camera images after moving objects have been removed based on object detection, and may be stored as text; the GPS data is kept in a list, with a unique label number associating the GPS data with the binary image and the feature point data.
When querying the historical drivable area data, the corresponding label number can be looked up from the GPS data, and the corresponding binary image and feature point data are then retrieved by label number. Illustratively, the system may map GPS positions that differ by no more than one meter to the same label number.
For each location (that is, each region distinguished by the granularity), k copies of historical drivable area information are stored, denoted S_{n-1}, S_{n-2}, ..., S_{n-k}, representing the drivable area information output on the (n-1)-th, (n-2)-th, ..., (n-k)-th pass through the region. Illustratively, k may take the value 10.
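A minimal sketch of the storage and lookup scheme just described, assuming an in-memory store keyed by label number; the names HistoryStore and Record, the dict layout, and the use of locally projected metric coordinates for the one-meter matching tolerance are illustrative assumptions rather than the format required by the filing.
```python
import numpy as np
from dataclasses import dataclass, field

K_HISTORY = 10          # k records kept per location (the text suggests k = 10)
GRANULARITY_M = 1.0     # one record per metre of travel

@dataclass
class Record:
    masks: list = field(default_factory=list)      # up to K_HISTORY binary 128x128 masks, newest first
    gps: tuple = (0.0, 0.0)                        # reference fix, assumed already in local metric coords
    features: np.ndarray | None = None             # SIFT descriptors for registration

class HistoryStore:
    def __init__(self):
        self._records: dict[int, Record] = {}
        self._next_label = 0

    def _find_label(self, gps: tuple) -> int | None:
        # Positions within ~1 m of an existing record map to the same label number.
        for label, rec in self._records.items():
            if np.hypot(gps[0] - rec.gps[0], gps[1] - rec.gps[1]) <= GRANULARITY_M:
                return label
        return None

    def lookup(self, gps: tuple) -> Record | None:
        label = self._find_label(gps)
        return self._records.get(label) if label is not None else None

    def update(self, gps: tuple, mask: np.ndarray, features: np.ndarray | None = None) -> None:
        label = self._find_label(gps)
        if label is None:                           # first pass: initialise a new record (phase T1)
            label = self._next_label
            self._next_label += 1
            self._records[label] = Record(gps=gps)
        rec = self._records[label]
        rec.masks.insert(0, mask.astype(np.uint8))  # newest record becomes S_{n-1} for the next pass
        del rec.masks[K_HISTORY:]                   # forget anything older than S_{n-k}
        if features is not None:
            rec.features = features
```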
FIG. 6 is a schematic flowchart of a drivable area fusion determination method provided by an embodiment of this application. The method includes:
Step 601: query the historical drivable area library according to the positioning information (GPS) to obtain the information of the corresponding historical drivable area.
Step 602: register the current drivable area with the historical drivable area, that is, align the historical drivable area with the current drivable area according to feature points such as lane lines.
In a possible implementation, the registration can be completed using SIFT feature points together with the feature extraction and registration methods commonly used in computer vision. SIFT, the scale-invariant feature transform, is a description used in the field of image processing. The description is scale-invariant, can detect key points in an image, and is a local feature descriptor.
SIFT feature detection mainly includes the following basic steps:
Scale-space extrema detection: image positions are searched over all scales, and potential interest points that are invariant to scale and rotation are identified with a difference-of-Gaussians function.
Keypoint localization: at each candidate position, the location and scale are determined by fitting a fine model. Key points are selected according to their stability.
Orientation assignment: one or more orientations are assigned to each keypoint position based on the local gradient directions of the image. All subsequent operations on the image data are performed relative to the orientation, scale, and position of the key points, thereby providing invariance to these transformations.
Keypoint description: the local image gradients are measured at the selected scale in a neighborhood around each keypoint. These gradients are transformed into a representation that tolerates relatively large local shape deformation and illumination changes. The road feature information stored here is exactly the SIFT feature point information of the region.
Registration: in the registration stage, SIFT features are likewise extracted from the current region. Once the SIFT feature vectors of the two images have been generated, the Euclidean distance between keypoint feature vectors is used as the similarity measure between key points in the two images. For a keypoint of the current image, the two nearest key points in the historical image are found by traversal; if the nearest distance divided by the second-nearest distance is below a threshold, the pair is judged to be a match. From the matching results, the 30 pairs with the highest matching scores are selected as reference point pairs, the remaining matched pairs are used to compute the scale and orientation parameters, and a translation-rotation-scaling transform is then applied according to these parameters to obtain the registered historical drivable area.
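A compact sketch of this registration step using OpenCV's SIFT implementation. The ratio-test threshold, the number of retained matches, and the use of estimateAffinePartial2D to fit the translation-rotation-scale transform are illustrative choices rather than the exact procedure above (which selects the 30 best pairs as reference points); the sketch also recomputes SIFT on a stored reference image and assumes the stored mask shares that image's pixel frame.
```python
import cv2
import numpy as np

RATIO = 0.75   # Lowe-style ratio test threshold (the text only says "some threshold")

def register_history_to_current(cur_img: np.ndarray,
                                hist_img: np.ndarray,
                                hist_mask: np.ndarray) -> np.ndarray:
    """Warp the stored historical drivable-area mask into the current frame's coordinates."""
    to_gray = lambda im: cv2.cvtColor(im, cv2.COLOR_BGR2GRAY) if im.ndim == 3 else im
    cur, hist = to_gray(cur_img), to_gray(hist_img)

    sift = cv2.SIFT_create()
    kp_cur, des_cur = sift.detectAndCompute(cur, None)
    kp_hist, des_hist = sift.detectAndCompute(hist, None)
    if des_cur is None or des_hist is None:
        return hist_mask                              # nothing to register against

    pairs = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des_cur, des_hist, k=2)
    good = [p[0] for p in pairs if len(p) == 2 and p[0].distance < RATIO * p[1].distance]
    if len(good) < 4:
        return hist_mask                              # too few matches for a reliable transform

    good.sort(key=lambda m: m.distance)               # keep the best pairs, loosely mirroring the
    good = good[:60]                                  # "reference point pairs" selection in the text
    src = np.float32([kp_hist[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_cur[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)

    M, _ = cv2.estimateAffinePartial2D(src, dst)      # translation + rotation + uniform scale
    if M is None:
        return hist_mask
    h, w = cur.shape[:2]
    return cv2.warpAffine(hist_mask, M, (w, h), flags=cv2.INTER_NEAREST)
```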
Step 603: superimpose the current drivable area with the historical drivable area.
Step 604: determine the drivable area for this pass according to the superimposed drivable area.
The superposition yields S_com, which is the most preferred output. If no complete lane exists in S_com, then S'_com is used as the output.
Depending on the application system, the lane line information can be obtained in two ways. First, it can be read directly from a high-definition map according to the GPS positioning result; this approach is limited by the accuracy of GPS positioning and by whether a high-definition map is available. Second, it can be extracted directly from the image of the current region with computer vision methods, which can in turn be divided into traditional image processing (edge extraction and the like) and deep learning (LaneNet and the like). For the embodiments of this application, the source of the lane line information is not limited; it may come from the detection module in the system or from other existing means.
To determine whether a complete drivable lane exists, the detection result of the detection module in the in-vehicle processing system is also needed; by comparing the detection result with the coordinates of the lane lines, it can be determined whether the drivable area contains a complete drivable lane. Illustratively, a complete lane may be a lane in which, taking the ego vehicle as the reference, no obstacle target is present within the perception range.
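Below is one way such a complete-lane check could look, assuming the lanes are supplied as boolean masks over the same grid as the drivable-area mask and that "complete" means every lane cell within the perception range is drivable; the lane representation and the function name are assumptions for illustration.
```python
import numpy as np

def has_complete_lane(drivable: np.ndarray, lane_masks: list[np.ndarray]) -> bool:
    """Return True if at least one lane is drivable over its whole extent in the mask.

    drivable   : HxW mask (1 = drivable) restricted to the perception range.
    lane_masks : one HxW boolean mask per lane, marking the cells belonging to that lane.
    """
    drivable = drivable.astype(bool)
    for lane in lane_masks:
        lane = lane.astype(bool)
        if lane.any() and drivable[lane].all():   # no non-drivable (obstacle) cell inside the lane
            return True
    return False

if __name__ == "__main__":
    d = np.ones((128, 128), np.uint8)
    d[:, 40:50] = 0                               # a blocked strip
    lanes = [np.zeros((128, 128), bool), np.zeros((128, 128), bool)]
    lanes[0][:, 30:60] = True                     # crosses the blocked strip -> incomplete
    lanes[1][:, 70:100] = True                    # fully clear -> complete
    print(has_complete_lane(d, lanes))            # True
```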
If S_{n-1} does not contain a complete lane (in which case S'_com cannot contain one either), S_n is used as the output, that is, the result output by the drivable area detection module is used directly.
The generation of S_com includes:
accumulating S_{n-1}, S_{n-2}, ..., S_{n-k} and outputting a binary matrix S according to a threshold (for example, k/2);
taking the AND of S with the current drivable area S_n to obtain S_com.
The generation of S'_com: the AND of S_{n-1} and the current drivable area S_n is taken to obtain S'_com.
To summarize: S_com is the superposition of the current drivable area with the previous k historical drivable areas, S'_com is the superposition of the current drivable area with the most recent historical drivable area, and S_n is the current drivable area output by the detection module.
This embodiment also provides a method for updating the historical drivable area information: if the current drivable area S_n contains a complete lane, the historical drivable area information is updated, keeping S_n as historical area information and discarding the historical area information older than S_{n-k} (forgetting); otherwise the historical drivable area is not updated (bad-value discarding).
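A sketch of the fusion and update rules described above, operating on 0/1 numpy masks. The k/2 voting threshold, the AND operations, and the keep/forget behaviour follow the text, while the function names and the reuse of has_complete_lane from the earlier sketch are assumptions.
```python
import numpy as np

def fuse_with_history(s_n: np.ndarray, history: list[np.ndarray], k: int = 10) -> dict:
    """Compute S_com (vote over up to k histories, AND current) and S'_com (AND with S_{n-1})."""
    hist = [h.astype(np.uint8) for h in history[:k]]
    out = {"s_n": s_n.astype(np.uint8), "s_com": None, "s_com_prev": None}
    if hist:
        votes = np.sum(hist, axis=0)                                  # accumulate S_{n-1} ... S_{n-k}
        s_vote = (votes >= max(1, len(hist) / 2)).astype(np.uint8)    # threshold, e.g. k/2
        out["s_com"] = s_vote & out["s_n"]                            # S_com = S AND S_n
        out["s_com_prev"] = hist[0] & out["s_n"]                      # S'_com = S_{n-1} AND S_n
    return out

def decide_output(fused: dict, lane_masks: list[np.ndarray]) -> np.ndarray:
    """Prefer S_com, then S'_com, then fall back to the raw detection S_n."""
    for key in ("s_com", "s_com_prev"):
        cand = fused[key]
        if cand is not None and has_complete_lane(cand, lane_masks):
            return cand
    return fused["s_n"]

def update_history(history: list[np.ndarray], s_n: np.ndarray,
                   lane_masks: list[np.ndarray], k: int = 10) -> list[np.ndarray]:
    """Keep S_n only if it contains a complete lane; age out records beyond S_{n-k}."""
    if has_complete_lane(s_n, lane_masks):              # bad values are simply discarded
        history = [s_n.astype(np.uint8)] + list(history)
    return list(history)[:k]                            # forget anything older than S_{n-k}
```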
FIG. 7 is a schematic flowchart of drivable area determination provided by an embodiment of this application, including:
Step 701: the sensing device acquires the environment information around the vehicle.
Step 702: the detection module obtains the environment information input by the sensing device and derives the lane line information.
Step 703: the detection module obtains the positioning information and determines the current drivable area from the environment information.
Step 704: the query module, using the positioning information, queries the historical drivable area library for the historical drivable area information.
Step 705: the fusion module, using feature point matching, registers the historical drivable area with the current drivable area (so that the lane lines and the like are aligned).
Step 706: the current drivable area information and the registered historical drivable area information are fused into S_com.
Step 707: combined with the lane line detection result, if S_com contains a complete lane, it is output as the determined drivable area.
Step 708: if S_com does not contain a complete lane, the S'_com fusion is performed.
Step 709: combined with the lane line detection result, if S'_com contains a complete lane, it is output as the determined drivable area.
Step 710: if S'_com does not contain a complete lane, S_n is output directly as the determined drivable area.
Step 711: after the determined drivable area is output, the historical area is updated according to the method described earlier in the embodiments of this application, and the updated historical area is used as the historical drivable area library in subsequent determinations.
By repeating the above process, the drivable area determination keeps updating and optimizing itself as the number of repeated passes increases.
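Tying the pieces together, the sketch below strings the earlier helpers (drivable_mask_from_image, HistoryStore, register_history_to_current, fuse_with_history, decide_output, has_complete_lane) into one pass of the FIG. 7 flow; the orchestration, argument shapes, and the source of the lane masks and reference image are all assumptions made for illustration.
```python
import numpy as np

def determine_drivable_area(frame: np.ndarray,
                            gps: tuple,
                            lane_masks: list[np.ndarray],
                            store: "HistoryStore",
                            hist_ref_img: np.ndarray | None = None) -> np.ndarray:
    """One pass of the FIG. 7 flow: detect, query, register, fuse, decide, update."""
    s_n = drivable_mask_from_image(frame)                 # steps 701-703: current drivable area
    record = store.lookup(gps)                            # step 704: query the history library

    history = []
    if record is not None:
        history = record.masks
        if hist_ref_img is not None:                      # step 705: registration against a stored view
            history = [register_history_to_current(frame, hist_ref_img, m) for m in history]

    fused = fuse_with_history(s_n, history)               # steps 706 and 708: S_com and S'_com
    result = decide_output(fused, lane_masks)             # steps 707, 709, 710: pick the output

    # Step 711: persist this pass; the store ages out anything older than S_{n-k}.
    if has_complete_lane(s_n, lane_masks):
        store.update(gps, s_n)
    return result
```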
The definition and storage method of the historical drivable area is used to represent and persist the historical experience information of drivable areas in automated driving. The fusion determination method described above is used to fuse the current drivable area with the historical drivable area and to output the drivable area containing the optimal route. The update method of the historical drivable area is used to update the historical drivable area, and forgetting rules and discard rules can be set so that the algorithm can forget information from long ago or discard old data after a new drivable area is added, thereby reflecting changes in road conditions in time and avoiding interference from individual outliers.
The embodiments of this application provide a history-based drivable area determination method and flow which, combined with lane line detection, fuse the stored historical drivable area information with the current drivable area and output the optimal drivable area.
The embodiments of this application provide a storage method and an update flow for the historical drivable area: the historical drivable area is converted into a binary image and stored according to the current positioning information, which reduces the storage space required. The embodiments of this application also provide a self-learning update method for the historical drivable area: the historically determined drivable areas are stored in encoded form and combined with the instantaneous determination of the current drivable area, so that the selection of the drivable area can be progressively optimized as the historical records accumulate.
In specific implementations, the historical drivable area information may be stored in other ways, for example as a numerical matrix or as map information. The fusion determination of the historical drivable area and the current drivable area may also be performed in other ways, for example by adding the historical area and the current area or by weighted multiplication. The update method of the historical drivable area may likewise take other forms, for example updating the historical drivable area with the drivable area obtained after fusion determination.
The embodiments of this application enable the drivable area determination to preferentially output the drivable area containing the optimal route, improve the accuracy of drivable area determination, reduce the exploration process of the automated driving algorithm, and improve the riding experience of automated driving, while giving the drivable area determination algorithm a self-learning capability and achieving automatic optimization based on historical records.
图8为本申请实施例提供的一种可行驶区域判定装置的结构示意图,所述可行驶区域判定装置800可以为计算设备,也可以为智能驾驶系统或移动数据中心MDC中的一个模块,用于实现前述各流程描述的功能。该可行驶区域判定装置800至少包括:处理器810、通信接口820以及存储器830,所述处理器810、通信接口820以及存储器830通过总线840相互连接,其中,
所述处理器810执行各种操作的具体实现可参照上述方法实施例中获取环境信息、可行驶区域融合等具体操作。处理器810可以有多种具体实现形式,处理器810根据内存中存储的程序单元执 行相关的操作,程序单元可以是指令,或称计算机程序。处理器810可以为中央处理器(central processing unit,CPU)或图像处理器(graphics processing unit,GPU),处理器810还可以是单核处理器或多核处理器。
处理器810可以由CPU和硬件芯片的组合。上述硬件芯片可以是专用集成电路(application-specific integrated circuit,ASIC),可编程逻辑器件(programmable logic device,PLD)或其组合。上述PLD可以是复杂可编程逻辑器件(complex programmable logic device,CPLD),现场可编程逻辑门阵列(field-programmable gate array,FPGA),通用阵列逻辑(generic array logic,GAL)或其任意组合。
处理器810也可以单独采用内置处理逻辑的逻辑器件来实现,例如FPGA或数字信号处理器(digital signal processor,DSP)等。
通信接口820可以为有线接口或无线接口,用于与其他模块或设备进行通信,有线接口可以是以太接口、控制器局域网络(controller area network,CAN)接口、局域互联网络(local interconnect network,LIN)以及FlexRay接口,无线接口可以是蜂窝网络接口或使用无线局域网接口等。例如,本申请实施例中通信接口820具体可用于接收感知设备153采集的环境数据。
总线840可以是CAN总线或其他实现车内各个系统或设备之间互连的内部总线。所述总线840可以分为地址总线、数据总线、控制总线等。为便于表示,图8中仅用一条粗线表示,但并不表示仅有一根总线或一种类型的总线。
可选地,该可行驶区域判定装置还可以包括存储器830,存储器830的存储介质可以是易失性存储器和非易失性存储器,或可包括易失性和非易失性存储器两者。其中,非易失性存储器可以是只读存储器(read-only memory,ROM)、可编程只读存储器(programmable ROM,PROM)、可擦除可编程只读存储器(erasable PROM,EPROM)、电可擦除可编程只读存储器(electrically EPROM,EEPROM)或闪存。易失性存储器可以是随机存取存储器(random access memory,RAM),其用作外部高速缓存。通过示例性但不是限制性说明,许多形式的RAM可用,例如静态随机存取存储器(static RAM,SRAM)、动态随机存取存储器(DRAM)、同步动态随机存取存储器(synchronous DRAM,SDRAM)、双倍数据速率同步动态随机存取存储器(double data date SDRAM,DDR SDRAM)、增强型同步动态随机存取存储器(enhanced SDRAM,ESDRAM)、同步连接动态随机存取存储器(synchlink DRAM,SLDRAM)和直接内存总线随机存取存储器(direct rambus RAM,DR RAM)。存储器530也可用于存储程序代码和数据,以便于处理器810调用存储器830中存储的程序代码实现前述各实施例中的智能驾驶系统的功能。此外,可行驶区域判定装置800可能包含相比于图8展示的更多或者更少的组件,或者有不同的组件配置方式。
在上述实施例中,对各个实施例的描述都各有侧重,某个实施例中没有详述的部分,可以参见其它实施例的相关描述。
上述实施例,可以全部或部分地通过软件、硬件、固件或其他任意组合来实现。当使用软件实现时,上述实施例可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载或执行所述计算机程序指令时,全部或部分地产生按照本申 请实施例所述的流程或功能。所述计算机可以为通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集合的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质。半导体介质可以是固态硬盘(solid state drive,SSD)。
本申请实施例方法中的步骤可以根据实际需要进行顺序调整、合并或删减;本申请实施例装置中的模块可以根据实际需要进行划分、合并或删减。
以上对本申请实施例进行了详细介绍,本文中应用了具体个例对本申请的原理及实施方式进行了阐述,以上实施例的说明只是用于帮助理解本申请的方法及其核心思想;同时,对于本领域的一般技术人员,依据本申请的思想,在具体实施方式及应用范围上均会有改变之处,综上所述,本说明书内容不应理解为对本申请的限制。

Claims (18)

  1. 一种应用于智能驾驶的可行驶区域判定方法,其特征在于,包括:
    智能驾驶系统获取车辆所处位置周围的环境信息,确定当前可行驶区域;
    所述智能驾驶系统根据所述车辆所处位置,查询历史可行驶区域库,得到对应的历史可行驶区域的信息;
    所述智能驾驶系统将所述当前可行驶区域与所述历史可行驶区域叠加,得到本次的可行驶区域。
  2. 如权利要求1所述的方法,其特征在于,所述智能驾驶系统将所述当前可行驶区域与所述历史可行驶区域叠加,得到优化后的可行驶区域包括:
    所述智能驾驶系统判断叠加得到的可行驶区域是否包含满足第一长度的可行驶车道,如果是,则将满足第一长度的可行驶车道作为本次的可行驶区域。
  3. 如权利要求2所述的方法,其特征在于,
    当叠加得到的可行驶区域不包含满足第一长度的可行驶车道时,将当前可行驶区域作为所述本次的可行驶区域。
  4. 如权利要求2或3所述的方法,其特征在于,
    当所述当前可行驶区域与在先的k次历史可行驶区域叠加得到的可行驶区域包含满足第一长度的可行驶车道时,则将叠加得到的可行驶区域作为本次的可行驶区域;
    当所述当前可行驶区域与在先的k次历史可行驶区域叠加得到的可行驶区域不包含满足第一长度的可行驶车道时,判断当前可行驶区域与上一次历史可行驶区域叠加得到的可行驶区域是否包含满足第一长度的可行驶车道,如果是,则将当前可行驶区域与上一次历史可行驶区域叠加得到的可行驶区域作为本次的可行驶区域,其中,k为大于等于2的正整数。
  5. 如权利要求1-4任一所述的方法,其特征在于,还包括:
    在当前可行驶区域包括满足第一长度的可行驶车道时,将当前可行驶区域更新到历史可行驶区域库。
  6. 如权利要求2-4任一所述的方法,其特征在于,
    所述第一长度表示所述智能驾驶系统感知的范围,所述满足第一长度的可行驶车道表示在智能驾驶系统感知的范围内均为可行驶区域的车道。
  7. 如权利要求1-6任一所述的方法,其特征在于,历史可行驶区域的信息包括历史可行驶区域记录、定位数据以及道路特征点。
  8. 如权利要求1-7任一所述的方法,其特征在于,所述智能驾驶系统以第二长度为粒度存储历史可行驶区域的信息。
  9. 如权利要求1-8任一所述的方法,其特征在于,在查询历史可行驶区域库,得到对应的历史可行驶区域的信息之后,所述方法还包括:
    所述智能驾驶系统根据所述车辆所在位置的道路特征点,对当前可行驶区域和历史可行驶区域进行配准,使得当前可行驶区域的道路特征点与查询得到的历史可行驶区域的信息中包括的道路特征点在地图上重合。
  10. 如权利要求1-9任一所述的方法,其特征在于,查询得到的所述历史可行驶区域大于或等于所述智能驾驶系统的感知范围。
  11. 一种智能驾驶系统,其特征在于,包括:
    检测模块,用于获取车辆所处位置周围的环境信息,确定当前可行驶区域;
    查询模块,用于根据所述车辆所处位置,查询历史可行驶区域库,得到对应的历史可行驶区域的信息;
    融合模块,用于将所述当前可行驶区域与所述历史可行驶区域叠加,得到本次的可行驶区域。
  12. 如权利要求11所述的系统,其特征在于,
    所述融合模块,具体用于判断叠加得到的可行驶区域是否包含满足第一长度的可行驶车道,如果是,则将满足第一长度的可行驶车道作为本次的可行驶区域。
  13. 如权利要求12所述的系统,其特征在于,
    所述融合模块,具体用于当叠加得到的可行驶区域不包含满足第一长度的可行驶车道时,将当前可行驶区域作为所述本次的可行驶区域。
  14. 如权利要求12或13所述的系统,其特征在于,
    所述融合模块,具体用于当所述当前可行驶区域与在先的k次历史可行驶区域叠加得到的可行驶区域包含满足第一长度的可行驶车道时,将叠加得到的可行驶区域作为本次的可行驶区域;
    所述融合模块,具体用于当所述当前可行驶区域与在先的k次历史可行驶区域叠加得到的可行驶区域不包含满足第一长度的可行驶车道时,判断当前可行驶区域与上一次历史可行驶区域叠加得到的可行驶区域是否包含满足第一长度的可行驶车道,如果是,则将当前可行驶区域与上一次历史可行驶区域叠加得到的可行驶区域作为本次的可行驶区域,其中,k为大于等于2的正整数。
  15. 如权利要求12-14任一所述的系统,其特征在于,
    所述融合模块,具体用于在当前可行驶区域包括满足第一长度的可行驶车道时,将当前可行驶区域更新到历史可行驶区域库。
  16. 如权利要求12-15任一所述的系统,其特征在于,历史可行驶区域的信息包括历史可行驶区域记录、定位数据以及道路特征点。
  17. 如权利要求16所述的系统,其特征在于,
    所述融合模块,具体用于根据所述车辆所在位置的道路特征点,对当前可行驶区域和历史可行驶区域进行配准,使得当前可行驶区域的道路特征点与查询得到的历史可行驶区域的信息中包括的道路特征点在地图上重合。
  18. 一种智能汽车,其特征在于,包括处理器、存储器以及感知设备,
    所述存储器,用于存储历史可行驶区域库;
    所述感知设备,用于获取车辆所处位置周围的环境信息;
    所述处理器,用于执行指令以实施如权利要求1至10任一项所述的方法。
PCT/CN2021/091159 2019-09-05 2021-04-29 可行驶区域判定的方法、智能驾驶系统和智能汽车 WO2022021982A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP21849359.1A EP4184119A4 (en) 2019-09-05 2021-04-29 METHOD FOR DETERMINING THE MOVABLE AREA, INTELLIGENT DRIVE SYSTEM AND INTELLIGENT VEHICLE

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN201910839081 2019-09-05
CN202010726575.7 2020-07-25
CN202010726575.7A CN112444258A (zh) 2019-09-05 2020-07-25 可行驶区域判定的方法、智能驾驶系统和智能汽车

Publications (1)

Publication Number Publication Date
WO2022021982A1 true WO2022021982A1 (zh) 2022-02-03

Family

ID=74733164

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/091159 WO2022021982A1 (zh) 2019-09-05 2021-04-29 可行驶区域判定的方法、智能驾驶系统和智能汽车

Country Status (3)

Country Link
EP (1) EP4184119A4 (zh)
CN (1) CN112444258A (zh)
WO (1) WO2022021982A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116912790A (zh) * 2023-06-16 2023-10-20 武汉环宇智行科技有限公司 一种车道线检测方法及装置

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112444258A (zh) * 2019-09-05 2021-03-05 华为技术有限公司 可行驶区域判定的方法、智能驾驶系统和智能汽车
JP7442424B2 (ja) * 2020-11-05 2024-03-04 株式会社日立製作所 走行領域管理装置、走行領域管理システム及び走行領域管理方法
CN113420687A (zh) * 2021-06-29 2021-09-21 三一专用汽车有限责任公司 可行驶区域的获取方法、装置和车辆
CN114355874B (zh) * 2021-11-11 2024-03-22 北京百度网讯科技有限公司 路径规划方法、装置、电子设备及自动行驶设备

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103748431A (zh) * 2011-08-31 2014-04-23 日产自动车株式会社 可行驶区域显示装置
CN107622684A (zh) * 2017-09-14 2018-01-23 华为技术有限公司 信息传输方法、交通控制单元和车载单元
US20180284772A1 (en) * 2017-04-03 2018-10-04 nuTonomy Inc. Processing a request signal regarding operation of an autonomous vehicle
CN109532379A (zh) * 2018-11-12 2019-03-29 重庆科技学院 自适应调整平衡运输车及其智能控制系统和方法
CN110633597A (zh) * 2018-06-21 2019-12-31 北京京东尚科信息技术有限公司 一种可行驶区域检测方法和装置
CN112444258A (zh) * 2019-09-05 2021-03-05 华为技术有限公司 可行驶区域判定的方法、智能驾驶系统和智能汽车

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3106836B1 (en) * 2015-06-16 2018-06-06 Volvo Car Corporation A unit and method for adjusting a road boundary
US10444763B2 (en) * 2016-03-21 2019-10-15 Ford Global Technologies, Llc Systems, methods, and devices for fusion of predicted path attributes and drive history
WO2018063245A1 (en) * 2016-09-29 2018-04-05 The Charles Stark Draper Laboratory, Inc. Autonomous vehicle localization
CN110832417B (zh) * 2016-12-30 2023-06-09 辉达公司 使用高清地图为自主车辆生成路线
US10520319B2 (en) * 2017-09-13 2019-12-31 Baidu Usa Llc Data driven map updating system for autonomous driving vehicles
CN108510737B (zh) * 2018-04-12 2020-04-10 中南大学 一种融合风环境的无人驾驶车辆电源实时监控方法及装置

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103748431A (zh) * 2011-08-31 2014-04-23 日产自动车株式会社 可行驶区域显示装置
US20180284772A1 (en) * 2017-04-03 2018-10-04 nuTonomy Inc. Processing a request signal regarding operation of an autonomous vehicle
CN107622684A (zh) * 2017-09-14 2018-01-23 华为技术有限公司 信息传输方法、交通控制单元和车载单元
CN110633597A (zh) * 2018-06-21 2019-12-31 北京京东尚科信息技术有限公司 一种可行驶区域检测方法和装置
CN109532379A (zh) * 2018-11-12 2019-03-29 重庆科技学院 自适应调整平衡运输车及其智能控制系统和方法
CN112444258A (zh) * 2019-09-05 2021-03-05 华为技术有限公司 可行驶区域判定的方法、智能驾驶系统和智能汽车

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4184119A4 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116912790A (zh) * 2023-06-16 2023-10-20 武汉环宇智行科技有限公司 一种车道线检测方法及装置

Also Published As

Publication number Publication date
CN112444258A (zh) 2021-03-05
EP4184119A4 (en) 2024-01-10
EP4184119A1 (en) 2023-05-24

Similar Documents

Publication Publication Date Title
WO2022021982A1 (zh) 可行驶区域判定的方法、智能驾驶系统和智能汽车
JP6602352B2 (ja) 自律走行車用の計画フィードバックに基づく決定改善システム
US20220001871A1 (en) Road vector fields
JP6637088B2 (ja) ウォルシュ・カーネル・投影技術に基づく自動運転車両の位置決め
EP3497405B1 (en) System and method for precision localization and mapping
CN112740268B (zh) 目标检测方法和装置
CN111874006B (zh) 路线规划处理方法和装置
CN112212874B (zh) 车辆轨迹预测方法、装置、电子设备及计算机可读介质
CN109426256A (zh) 自动驾驶车辆的基于驾驶员意图的车道辅助系统
JP2021514885A (ja) 自動運転車のlidar測位に用いられるディープラーニングに基づく特徴抽出方法
JP2021515178A (ja) 自動運転車両においてrnnとlstmを用いて時間平滑化を行うlidar測位
CN110096053A (zh) 用于自动驾驶车辆的驾驶轨迹生成方法、系统和机器可读介质
US10152635B2 (en) Unsupervised online learning of overhanging structure detector for map generation
WO2023179027A1 (zh) 一种道路障碍物检测方法、装置、设备及存储介质
KR20230012953A (ko) 운전 가능 표면 주석 달기를 위한 머신 러닝 기반 프레임워크
CN109085818A (zh) 基于车道信息控制自动驾驶车辆的车门锁的方法和系统
WO2023060963A1 (zh) 道路信息的识别方法、装置、电子设备、车辆及介质
WO2023179028A1 (zh) 一种图像处理方法、装置、设备及存储介质
WO2023179030A1 (zh) 一种道路边界检测方法、装置、电子设备、存储介质和计算机程序产品
WO2024008086A1 (zh) 轨迹预测方法及其装置、介质、程序产品和电子设备
US20210048819A1 (en) Apparatus and method for determining junction
US20220129683A1 (en) Selecting data for deep learning
US20230211808A1 (en) Radar-based data filtering for visual and lidar odometry
US12116008B2 (en) Attentional sampling for long range detection in autonomous vehicles
CN117470254B (zh) 一种基于雷达服务的车载导航系统和方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21849359

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2021849359

Country of ref document: EP

Effective date: 20230216