CN115033651A - Method, system and apparatus for determining road boundaries for vehicles - Google Patents


Info

Publication number
CN115033651A
Authority
CN
China
Prior art keywords
result
road boundary
module
data
road
Prior art date
Legal status
Pending
Application number
CN202110193167.4A
Other languages
Chinese (zh)
Inventor
李千山
陆亚辉
袁圆
Current Assignee
Bayerische Motoren Werke AG
Original Assignee
Bayerische Motoren Werke AG
Priority date
Filing date
Publication date
Application filed by Bayerische Motoren Werke AG
Priority to CN202110193167.4A
Publication of CN115033651A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20: Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/29: Geographical information databases
    • G06F 16/24: Querying
    • G06F 16/245: Query processing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present disclosure relates to a method for determining a road boundary for a vehicle, comprising: obtaining a first result for a road boundary from a first road boundary module; obtaining a second result for the road boundary from a second road boundary module; determining a difference between the first result and the second result; and in response to the difference not being greater than a predetermined threshold, determining the road boundary from the second result, and in response to the difference being greater than the predetermined threshold, determining the road boundary from a combination of the first result and the second result, wherein the first road boundary module is based on first data and the second road boundary module is based on second data, and wherein the first data is closer to real-time road conditions than the second data. The present disclosure also relates to systems and devices for determining road boundaries for vehicles.

Description

Method, system and apparatus for determining road boundaries for vehicles
Technical Field
The present disclosure relates to methods, systems, and devices for determining road boundaries for vehicles.
Background
During the automated driving of a vehicle, MAP data around the vehicle is often extracted from a high-precision map (HD MAP) according to the vehicle's positioning information in order to plan its driving trajectory. For example, after the road on which the vehicle is to travel has been determined by route-level planning (or the lane on which it is to travel by lane-level planning), MAP data within a range around the vehicle (for example, 50 meters ahead of and 10 meters behind the vehicle) is extracted from the HD MAP based on the vehicle's positioning information (for example, its position in the global coordinate system), as shown in fig. 1A. The MAP data extracted from the HD MAP may include information about the road or lane, such as the center line and the boundaries on both sides of the lane, and the vehicle can travel according to this information.
Disclosure of Invention
It is an object of the present disclosure to provide methods, systems and apparatus for determining a road boundary for a vehicle.
According to a first aspect of the present disclosure, a method for determining a road boundary for a vehicle is provided. The method comprises the following steps: obtaining a first result for a road boundary from a first road boundary module; obtaining a second result for the road boundary from a second road boundary module; determining a difference between the first result and the second result; and in response to the difference not being greater than a predetermined threshold, determining the road boundary from the second result, and in response to the difference being greater than the predetermined threshold, determining the road boundary from a combination of the first result and the second result, wherein the first road boundary module is based on first data and the second road boundary module is based on second data, and wherein the first data is closer to real-time road conditions than the second data.
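Putting the steps of this aspect together, the decision flow can be sketched as a minimal Python illustration (the result type, the lateral-offset metric, and the pointwise "narrowest corridor" combination rule are simplifying assumptions for illustration, not taken from the claims):

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) samples in a shared road coordinate system

@dataclass
class BoundaryResult:
    left: List[Point]   # left boundary samples (positive-y side of the center line)
    right: List[Point]  # right boundary samples (negative-y side of the center line)

def lateral_gap(a: List[Point], b: List[Point]) -> float:
    # Largest lateral (y) offset between corresponding boundary samples.
    return max(abs(pa[1] - pb[1]) for pa, pb in zip(a, b))

def determine_road_boundary(first: BoundaryResult,   # e.g. camera-based, near real-time
                            second: BoundaryResult,  # e.g. extracted from an HD map
                            threshold: float = 0.5) -> BoundaryResult:
    diff = max(lateral_gap(first.left, second.left),
               lateral_gap(first.right, second.right))
    if diff <= threshold:
        # Results corroborate each other: keep the more stable map-based result.
        return second
    # Results disagree: per sample, keep whichever boundary lies nearer the
    # center line (y = 0), i.e. the narrowest corridor either source indicates.
    left = [min(p, q, key=lambda pt: pt[1]) for p, q in zip(first.left, second.left)]
    right = [max(p, q, key=lambda pt: pt[1]) for p, q in zip(first.right, second.right)]
    return BoundaryResult(left, right)
```

When the two sources agree to within the threshold, the map-based result is returned unchanged; otherwise the combined corridor is never wider than either input suggests.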
In an exemplary embodiment of this aspect, the first data is from sensors onboard the vehicle and the second data is from a pre-stored database.
In one exemplary embodiment of this aspect, the sensor is a camera and the database is a high precision map.
In an exemplary embodiment of this aspect, the roadway includes one or more lanes.
In one exemplary embodiment of the present aspect, determining the road boundary from a combination of the first result and the second result includes: respectively converting the first result and the second result into the same coordinate system to obtain a first conversion result and a second conversion result; superimposing the first conversion result and the second conversion result to obtain a combined result; and determining the road boundary according to the narrowest road boundary indicated by the combined result.
In an exemplary embodiment of this aspect, either of the first road boundary module and the second road boundary module is a module onboard the vehicle or a remote module.
According to a second aspect of the present disclosure, a system for determining a road boundary for a vehicle is provided. The system comprises: a first road boundary module configured to provide a first result for a road boundary based on first data; a second road boundary module configured to provide a second result for the road boundary based on second data, wherein the first data is closer to real-time road conditions than the second data; and a determination module configured to: obtain the first result from the first road boundary module; obtain the second result from the second road boundary module; determine a difference between the first result and the second result; and in response to the difference not being greater than a predetermined threshold, determine the road boundary from the second result, and in response to the difference being greater than the predetermined threshold, determine the road boundary from a combination of the first result and the second result.
In an exemplary embodiment of this aspect, the first data is from a sensor onboard the vehicle and the second data is from a pre-stored database.
In one exemplary embodiment of this aspect, the sensor is a camera and the database is a high precision map.
In an exemplary embodiment of this aspect, the roadway includes one or more lanes.
In an exemplary embodiment of this aspect, the determination module is further configured to determine the road boundary from a combination of the first result and the second result by: respectively converting the first result and the second result into the same coordinate system to obtain a first conversion result and a second conversion result; superimposing the first conversion result and the second conversion result to obtain a combined result; and determining the road boundary according to the narrowest road boundary indicated by the combined result.
In an exemplary embodiment of this aspect, any of the first road boundary module, the second road boundary module and the determination module is a module onboard the vehicle or a remote module.
According to a third aspect of the present disclosure, an apparatus for determining a road boundary for a vehicle is provided. The apparatus comprises: one or more processors; and one or more memories configured to store a series of computer-executable instructions, wherein the series of computer-executable instructions, when executed by the one or more processors, cause the one or more processors to perform the method as described above.
In an exemplary embodiment of the present aspect, the apparatus is mounted on the vehicle.
According to a fourth aspect of the disclosure, a non-transitory computer-readable storage medium is provided. The non-transitory computer-readable storage medium has stored thereon a series of computer-executable instructions that, when executed by one or more computing devices, cause the one or more computing devices to perform the method as described above.
Other features of the present disclosure and advantages thereof will become apparent from the following detailed description of exemplary embodiments thereof, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description, serve to explain the principles of the disclosure.
The present disclosure may be more clearly understood from the following detailed description, taken with reference to the accompanying drawings, in which:
fig. 1A is a schematic diagram schematically showing map data of the vehicle surroundings extracted from a high-precision map.
Fig. 1B to 1E are schematic diagrams schematically illustrating a method for determining a road boundary for a vehicle according to one embodiment of the present disclosure.
FIG. 2 is a flow chart schematically illustrating a method for determining a road boundary for a vehicle, according to one embodiment of the present disclosure.
FIG. 3 is a block diagram schematically illustrating a system for determining a road boundary for a vehicle, according to one embodiment of the present disclosure.
Fig. 4 is an exemplary block diagram schematically illustrating a general hardware system applicable to the present disclosure according to an embodiment of the present disclosure.
Note that in the embodiments described below, the same reference numerals are used in common between different drawings to denote the same portions or portions having the same functions, and a repetitive description thereof will be omitted. In some cases, similar reference numbers and letters are used to denote similar items, and thus, once an item is defined in one figure, it need not be discussed further in subsequent figures.
Detailed Description
The present disclosure will now be described with reference to the accompanying drawings, which illustrate several embodiments of the disclosure. It should be understood, however, that the present disclosure may be presented in many different ways and is not limited to the embodiments described below; rather, the embodiments described below are intended to provide a more complete disclosure of the present disclosure, and to fully convey the scope of the disclosure to those skilled in the art. It is also to be understood that the embodiments disclosed herein can be combined in various ways to provide further additional embodiments.
It is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. All terms (including technical and scientific terms) used herein have the meaning commonly understood by one of ordinary skill in the art unless otherwise defined. Well-known functions or constructions may not be described in detail for brevity and/or clarity.
Herein, the term "A or B" includes "A and B" and "A or B" rather than exclusively meaning only "A" or only "B", unless otherwise specifically stated.
In this document, the term "exemplary" means "serving as an example, instance, or illustration," and not as a "model" that is to be reproduced exactly. Any implementation exemplarily described herein is not necessarily to be construed as preferred or advantageous over other implementations. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, brief summary or the detailed description.
In addition, "first," "second," and like terms may also be used herein for reference purposes only, and thus are not intended to be limiting. For example, the terms "first," "second," and other such numerical terms referring to structures or elements do not imply a sequence or order unless clearly indicated by the context.
It will be further understood that the terms "comprises/comprising," "includes" and/or "including," when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The MAP data in the HD MAP is relatively static compared to real-time road conditions. If a temporary situation such as road construction or roadside parking arises in the real environment (which may narrow the road/lane), for example as shown in fig. 1B (where the filled rectangle a at the edge of the road schematically represents a temporary obstacle in the real environment, hereinafter referred to as "obstacle a"), it cannot be reflected in the MAP data extracted from the HD MAP; that is, the extracted MAP data still looks as shown in fig. 1A. Planning a travel trajectory on the basis of such MAP data would then give the vehicle a wrong trajectory, for example one that collides with the obstacle.
FIG. 2 is a flow chart schematically illustrating a method 100 for determining a road boundary for a vehicle, according to one embodiment of the present disclosure. Specifically, the method 100 includes steps S11 through S14 (including steps S14-1 and S14-2) as described below.
Step S11: a first result for the road boundary is obtained from the first road boundary module. Step S12: a second result for the road boundary is obtained from the second road boundary module. The first road boundary module determines a road boundary based on the first data, so the first result is a road boundary determined based on the first data; likewise, the second road boundary module determines a road boundary based on the second data, so the second result is a road boundary determined based on the second data. Since the first data is closer to the real-time road conditions than the second data (i.e., the second data is relatively static while the first data is relatively real-time; for example, the first data comes from a sensor onboard the vehicle and the second data from a pre-stored database), the first result is more likely to be close to the real-time road conditions than the second result.
In one particular example, the first road boundary module may be a module that determines a road boundary (i.e., the first result) based on relatively real-time data sensed from a sensor (e.g., LiDAR, a camera, etc.), and the second road boundary module may be a module that determines a road boundary (i.e., the second result) based on relatively static MAP data extracted from the HD MAP. In the case where a temporary obstacle a exists at the road edge as shown in fig. 1B, the second result obtained from the second road boundary module may be as shown in fig. 1A, which shows the road boundaries on both sides of the vehicle extending substantially straight ahead. A driving trajectory planned based on such a result generally follows the center line of the road itself (e.g., the dotted line in the figure), so the vehicle is likely to have an accident such as a collision or scrape at the temporary obstacle a. The first result obtained from the first road boundary module, in contrast, may be as shown in fig. 1C, which shows the road boundaries on both sides of the vehicle narrowing significantly at the temporary obstacle a. When a travel trajectory is planned based on this result, an avoiding trajectory is planned at the temporary obstacle a instead of one that keeps the vehicle on the center line of the road.
In one particular example, either of the first and second results for the roadway boundary may include coordinates of a centerline of the roadway and coordinates of respective roadway boundaries located on either side of the centerline of the roadway. For example, the first result may include a first centerline coordinate (it being understood that when referring to these coordinates, it is meant a set of discrete or continuous coordinate values rather than a single one, since the result is representative of a range of road conditions around the vehicle), a first left boundary coordinate, and a first right boundary coordinate. The second result may include a second centerline coordinate, a second left boundary coordinate, and a second right boundary coordinate.
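As a minimal illustration, such a result could be modelled as follows (the record and field names are hypothetical; the patent only specifies that each result carries center-line and boundary coordinates):

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]  # one (x, y) sample; each result holds many of them

@dataclass
class RoadBoundaryResult:
    centerline: List[Point]  # discrete samples of the road/lane center line
    left: List[Point]        # boundary samples on the left of the center line
    right: List[Point]       # boundary samples on the right of the center line

# e.g. a short straight stretch, sampled every 5 m:
first_result = RoadBoundaryResult(
    centerline=[(0.0, 0.0), (5.0, 0.0), (10.0, 0.0)],
    left=[(0.0, 2.0), (5.0, 2.0), (10.0, 2.0)],
    right=[(0.0, -2.0), (5.0, -2.0), (10.0, -2.0)],
)
```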
Step S13: a difference between the first result and the second result is determined. Results for road boundaries from different sources typically come in their own coordinate systems. For example, the first result, determined based on data sensed from the sensor, may be presented in a sensor coordinate system (which in some cases may be equivalent to a body coordinate system), while the second result, determined based on MAP data extracted from the HD MAP, may be presented in a map coordinate system (which in some cases may be equivalent to a global coordinate system). The first result and the second result are therefore transformed into the same coordinate system before the difference between them is determined. For example, the first result may be converted into the coordinate system used by the second result to obtain a first conversion result, and the first conversion result may then be compared with the second result (although the second result is not coordinate-converted in this example, it may still be referred to herein as a "second conversion result") to determine the difference between the first result and the second result.
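Converting a sensor-frame result into the map frame amounts to a rigid 2-D transform given the vehicle pose; below is a sketch under the assumption of a planar pose (position plus heading), with hypothetical function and parameter names:

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]

def to_map_frame(points_vehicle: List[Point],
                 vehicle_xy: Point,
                 vehicle_heading_rad: float) -> List[Point]:
    """Rotate each point by the vehicle heading, then translate by its position."""
    c, s = math.cos(vehicle_heading_rad), math.sin(vehicle_heading_rad)
    x0, y0 = vehicle_xy
    return [(x0 + c * x - s * y, y0 + s * x + c * y) for x, y in points_vehicle]
```

A point one meter ahead of a vehicle that sits at (10, 20) and faces "north" (heading pi/2) thus lands at roughly (10, 21) in the map frame.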
Determining the difference between the first result and the second result includes determining the differences between the coordinates of the respective road boundaries on either side of the center line of the road. For example, a first difference between the first converted left boundary coordinates in the first conversion result and the second left boundary coordinates in the second result is determined, and a second difference between the first converted right boundary coordinates and the second right boundary coordinates is determined. The difference between the first result and the second result may then be determined based on, for example, either one of the first difference and the second difference, the greater of the two, or their sum.
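For instance, the per-side differences could be computed as mean lateral offsets between matched boundary samples; the following sketch is one possible metric (the patent does not fix a particular one, and the function names are illustrative):

```python
from typing import List, Tuple

Point = Tuple[float, float]

def boundary_difference(a: List[Point], b: List[Point]) -> float:
    """Mean absolute lateral (y) offset between corresponding boundary samples."""
    n = min(len(a), len(b))
    return sum(abs(pa[1] - pb[1]) for pa, pb in zip(a, b)) / n

def result_difference(first_left: List[Point], first_right: List[Point],
                      second_left: List[Point], second_right: List[Point]) -> float:
    d1 = boundary_difference(first_left, second_left)    # the "first difference"
    d2 = boundary_difference(first_right, second_right)  # the "second difference"
    return max(d1, d2)  # alternatives: d1 alone, d2 alone, or d1 + d2, as noted above
```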
Step S14-1: in response to the difference not being greater than the predetermined threshold, the road boundary is determined according to the second result. If the difference determined in step S13 is small, e.g., not greater than the predetermined threshold, the first result based on the relatively real-time data and the second result based on the relatively static data may be considered to have verified each other, and both may be regarded as reliable. When it is consistent with the real environment, the second result based on the relatively static data (e.g., a road boundary determined from MAP data extracted from the HD MAP) has higher stability and accuracy than the first result based on the relatively real-time data (a road boundary determined from data sensed from the sensor). In this case, therefore, determining the road boundary from the second result can be chosen as the basis for planning the driving trajectory.
Step S14-2: in response to the difference being greater than the predetermined threshold, the road boundary is determined from a combination of the first result and the second result. If the difference determined in step S13 is large, e.g., greater than the predetermined threshold, at least one of the first result and the second result may be considered unreliable, and the road boundary may then be determined based on both. In one embodiment, the first result and the second result are respectively converted into the same coordinate system to obtain a first conversion result and a second conversion result; the first conversion result and the second conversion result are superimposed to obtain a combined result; and the road boundary is determined according to the narrowest road boundary indicated by the combined result. For example, the first result may be converted into the coordinate system used by the second result to obtain the first conversion result, and the first conversion result may then be superimposed on the second result (although the second result is not coordinate-converted in this example, it may still be referred to herein as the "second conversion result") to obtain the combined result. Note that if the coordinate conversion to be performed in this step is the same as the one already performed in step S13 (for example, both steps convert into the same target coordinate system), the conversion result obtained in step S13 (for example, the first conversion result) may be reused directly instead of converting again.
In one specific example, the relatively static second result determined based on the MAP data extracted from the HD MAP, as shown in fig. 1A, and the relatively real-time first result determined based on the data sensed from the sensor, as shown in fig. 1C, are superimposed (e.g., layer-wise) in the same coordinate system, yielding the combined result shown in fig. 1D. There are various ways to determine the road boundary from the combined result. In one embodiment, the left boundary of the road may be determined from the mean (or median) of the first left boundary and the second left boundary, and the right boundary from the mean (or median) of the first right boundary and the second right boundary. In another embodiment, the road boundary may be determined from the narrowest road boundary indicated by the combined result. In the example shown in fig. 1D, for the left side of the road, the first left boundary in the first result shown in fig. 1C is closer to the center line of the road, and thus may be taken as the left boundary of the road. For the right side, the second right boundary in the second result shown in fig. 1A is closer to the center line, and thus may be taken as the right boundary of the road, as shown in fig. 1E. The travel trajectory may then be planned according to the determined road boundaries, for example as indicated by the dashed line in fig. 1E.
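The per-sample "narrowest boundary" selection can be sketched as follows, using hypothetical coordinates in the spirit of figs. 1A to 1D (road running along +x, center line near y = 0, left boundary at positive y, right boundary at negative y):

```python
from typing import List, Tuple

Point = Tuple[float, float]

def narrowest(left_a: List[Point], left_b: List[Point],
              right_a: List[Point], right_b: List[Point]):
    # For each sample, keep whichever boundary lies closer to the center line,
    # so the combined corridor is never wider than either input suggests.
    left = [min(p, q, key=lambda pt: pt[1]) for p, q in zip(left_a, left_b)]
    right = [max(p, q, key=lambda pt: pt[1]) for p, q in zip(right_a, right_b)]
    return left, right

# First (sensor-based) result: left boundary narrowed near an obstacle.
left_first = [(0.0, 2.0), (5.0, 1.2), (10.0, 2.0)]
right_first = [(0.0, -2.0), (5.0, -2.0), (10.0, -2.0)]
# Second (HD-map-based) result: straight corridor, slightly narrower on the right.
left_second = [(0.0, 2.0), (5.0, 2.0), (10.0, 2.0)]
right_second = [(0.0, -1.8), (5.0, -1.8), (10.0, -1.8)]

left, right = narrowest(left_first, left_second, right_first, right_second)
# left keeps the narrowed sample from the first result; right comes from the second.
```

This also illustrates the point made below: each side of the combined boundary may be drawn partly from one result and partly from the other.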
It should be noted that, for simplicity, in the illustrated example the narrowest road boundary indicated by the combined result comes from a single result (the first or the second) on each side of the road. It should be understood that, for each side, one portion of the narrowest boundary may come from one result while another portion comes from the other result.
Either of the first road boundary module and the second road boundary module may be a module onboard the vehicle or a remote module. For example, the first road boundary module may be a functional module implemented by a cloud server that determines the road boundary based on data collected by sensors onboard the vehicle. The execution subject of the method 100 may likewise be a module onboard the vehicle or a remote module. For example, the method 100 may be implemented by an onboard processor executing program instructions, or by a processor in a cloud server; in the latter case, the cloud server may send the finally determined road boundary to the vehicle so that the vehicle plans its driving trajectory, or may send a driving trajectory planned from the finally determined road boundary so that the vehicle directly controls driving according to it.
The road referred to in this disclosure may include one or more lanes. That is, a road in this disclosure may be understood as the lane on which the vehicle is or will be driving, in which case the road boundary is the boundary of that lane; alternatively, it may be understood as the road containing that lane, in which case the road boundary is the boundary of the road.
The present disclosure also provides a system for determining a road boundary for a vehicle. The system comprises: a first road boundary module configured to provide a first result for a road boundary based on first data; a second road boundary module configured to provide a second result for the road boundary based on second data, wherein the first data is closer to the real-time road conditions than the second data; and a determination module configured to: obtain the first result from the first road boundary module; obtain the second result from the second road boundary module; determine a difference between the first result and the second result; and in response to the difference not being greater than a predetermined threshold, determine the road boundary from the second result, and in response to the difference being greater than the predetermined threshold, determine the road boundary from a combination of the first result and the second result.
FIG. 3 is a block diagram schematically illustrating a system 200 for determining a road boundary for a vehicle, according to one embodiment of the present disclosure. The system 200 includes a first road boundary module 210, a second road boundary module 220, and a determination module 230. Any of the first road boundary module 210, the second road boundary module 220, and the determination module 230 may be a module loaded on a vehicle or may be a remote module.
The first road boundary module 210 comprises a route unit 212 that, based on the vehicle's current localized position and on the lane on which the vehicle is or will be traveling as determined by route planning, extracts from a high-precision map (HD MAP) 211 data about the lane around that position, which may include, for example, the coordinates of the lane's center line and of the lane boundaries on both sides of it. The data relating to the lane boundaries may be provided to the determination module 230 as a relatively static result 213.
The second road boundary module 220 includes a camera 222. The camera 222 may photograph the real environment 221 in real time, so that the second road boundary module 220 may detect the boundary of the lane in real time based on the real environment 221 to provide a relatively real-time result 223 to the determination module 230. It should be noted that various existing camera-based techniques for detecting road/lane boundaries can be utilized to obtain relatively real-time results 223, and the present disclosure is not limited thereto.
The determination module 230 compares the result 223 of the lane boundary detected from the data captured by the camera 222 with the result 213 of the lane boundary extracted from the HD MAP. If the difference between results 213 and 223 is small, it can be attributed to detection error in the camera-based result, and the relatively static result 213 can be selected as the lane boundary; that is, the vehicle can travel according to the lane information given by the HD MAP. If the difference is large, it may instead be attributed to the data in the HD MAP being outdated, and the relatively real-time result 223 may be superimposed with the relatively static result 213. For example, the two results may be converted into the same coordinate system (e.g., a map coordinate system) and then superimposed as layers. The vehicle can then control its driving trajectory according to the lane boundary shown after the layers are superimposed.
Fig. 4 is an exemplary block diagram schematically illustrating a generic hardware system 300 applicable to the present disclosure according to an embodiment of the present disclosure. A system 300, which is an example of a hardware device that may be applied to aspects of the present disclosure, will now be described with reference to fig. 4. System 300 may be any machine configured to perform processing and/or computing, and may be, but is not limited to, a workstation, a server, a desktop computer, a laptop computer, a tablet computer, a personal data assistant, a smart phone, a vehicle computer, or any combination thereof. The system 200 for determining a roadway boundary for a vehicle according to embodiments of the present disclosure described above may be implemented in whole or at least in part by the system 300 or a similar device or system.
System 300 may include components connected to bus 302 or in communication with bus 302, possibly via one or more interfaces. For example, the system 300 may include a bus 302, as well as one or more processors 304, one or more input devices 306, and one or more output devices 308. The one or more processors 304 may be any type of processor, and may include, but are not limited to, one or more general purpose processors and/or one or more special purpose processors (e.g., special purpose processing chips). The various steps in the methods described above may be implemented by one or more processors 304 executing instructions.
Input device 306 may be any type of device that can input information to a computing device, which may include, but is not limited to, a mouse, a keyboard, a touch screen, a microphone, and/or a remote control. Output device 308 may be any type of device that can present information and may include, but is not limited to, a display, speakers, a video/audio output terminal, a vibrator, and/or a printer.
The system 300 may also include a non-transitory storage device 310 or be connected with a non-transitory storage device 310. The non-transitory storage device 310 may be any storage device that is non-transitory and that may enable data storage, and may include, but is not limited to, a magnetic disk drive, an optical storage device, solid state memory, a floppy disk, a hard disk, a magnetic tape, or any other magnetic medium, an optical disk, or any other optical medium, a ROM (read only memory), a RAM (random access memory), a cache memory, and/or any other memory chip/chip set, and/or any other medium from which a computer may read data, instructions, and/or code. The non-transitory storage device 310 may be detachable from the interface. The non-transitory storage device 310 may have data/instructions/code for implementing the methods, steps, and processes described above. For example, the HD MAP 211 described above may be stored, at least in part, in the non-transitory storage device 310.
The system 300 may also include a communication device 312. The communication device 312 may be any type of device or system capable of communicating with external devices and/or with a network, and may include, but is not limited to, a modem, a network card, an infrared communication device, a wireless communication device, and/or a chipset, such as a Bluetooth device, an 802.11 device, a WiFi device, a WiMax device, a cellular communication device, a satellite communication device, and/or the like.
When the system 300 is used as an on-board device, it may also be connected to external devices, such as a GPS receiver and sensors for sensing different environmental data, for example acceleration sensors, wheel speed sensors, gyroscopes, and so on. In this manner, the system 300 may, for example, receive location data and sensor data indicative of the driving condition of the vehicle. When the system 300 is used as an on-board device, it may also be connected to other equipment of the vehicle (e.g., an engine system, wipers, an anti-lock brake system, etc.) to control the operation and handling of the vehicle.
In addition, the non-transitory storage device 310 may have map information and software elements so that the processor 304 may perform route guidance processing. In addition, the output device 308 may include a display for displaying a map, a position marker of the vehicle, and an image indicating the running condition of the vehicle. The output device 308 may also include a speaker or an interface with headphones for audio guidance.
The bus 302 may include, but is not limited to, an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnect (PCI) bus. In particular, for on-board devices, the bus 302 may also include a Controller Area Network (CAN) bus or other architecture designed for application on a vehicle.
System 300 may also include a working memory 314, which may be any type of working memory that can store instructions and/or data useful for the operation of processor 304, which may include, but is not limited to, a random access memory and/or a read-only memory device.
Software elements may be located in working memory 314 including, but not limited to, an operating system 316, one or more application programs 318, drivers, and/or other data and code. Instructions for performing the methods and steps described above may be included in one or more application programs 318. Executable code or source code for the instructions of the software elements may be stored in a non-transitory computer-readable storage medium, such as storage device 310 described above, and may be read into working memory 314 by compilation and/or installation. Executable or source code for the instructions of the software elements may also be downloaded from a remote location.
It is also to be understood that variations may be made in accordance with specific requirements. For example, customized hardware might also be used, and/or particular elements might be implemented in hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. In addition, connections to other computing devices, such as network input/output devices, may be employed. For example, some or all of the methods or apparatuses according to embodiments of the present disclosure may be implemented by programming hardware (e.g., programmable logic circuitry including Field Programmable Gate Arrays (FPGAs) and/or Programmable Logic Arrays (PLAs)) in assembly or hardware programming languages (e.g., Verilog, VHDL, C++) using logic and algorithms according to the present disclosure.
It should also be understood that the components of system 300 may be distributed across a network. For example, some processes may be performed using one processor, while other processes may be performed by another processor that is remote from the one processor. Other components of the system 300 may also be similarly distributed. As such, system 300 may be construed as a distributed computing system performing processing at multiple locations.
Although aspects of the present disclosure have been described thus far with reference to the accompanying drawings, the above-described methods, systems, and apparatuses are merely exemplary, and the scope of the present disclosure is not limited by these aspects but only by the appended claims and their equivalents. Various elements may be omitted or substituted by equivalent elements. In addition, the steps may be performed in an order different from that described in the present disclosure. Further, the various elements may be combined in various ways. It is also important to note that, as technology develops, many of the elements described may be replaced by equivalent elements that appear after the present disclosure.

Claims (15)

1. A method for determining a road boundary for a vehicle, comprising:
obtaining a first result for a road boundary from a first road boundary module;
obtaining a second result for the road boundary from a second road boundary module;
determining a difference between the first result and the second result; and
determining the road boundary from the second result in response to the difference not being greater than a predetermined threshold, and determining the road boundary from a combination of the first result and the second result in response to the difference being greater than a predetermined threshold,
wherein the first road boundary module is based on first data and the second road boundary module is based on second data, and wherein the first data is closer to real-time road conditions than the second data.
2. The method of claim 1, wherein the first data is from a sensor onboard the vehicle and the second data is from a pre-stored database.
3. The method of claim 2, wherein the sensor is a camera and the database is a high precision map.
4. The method of claim 1, wherein the road comprises one or more lanes.
5. The method of claim 1, wherein determining the road boundary from a combination of the first result and the second result comprises:
converting the first result and the second result, respectively, into the same coordinate system to obtain a first converted result and a second converted result;
superimposing the first converted result and the second converted result to obtain a combined result; and
determining the road boundary according to the narrowest road boundary indicated by the combined result.
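The combination recited above can be sketched as follows. This is an illustrative sketch only, not the claimed implementation: it assumes each result is a pair of (left, right) lateral boundary offsets, models the coordinate conversion as a simple translation into a shared map frame, and takes the narrowest corridor as the larger left bound paired with the smaller right bound; all names are hypothetical.

```python
def to_map_frame(offsets, origin=0.0):
    """Translate (left, right) lateral boundary offsets into a shared
    map frame; a stand-in for the claimed coordinate conversion."""
    left, right = offsets
    return (origin + left, origin + right)

def combine_results(first, second, origin=0.0):
    """Superimpose the two converted results and keep the narrowest
    boundary: the larger (inner) left bound and the smaller right bound."""
    l1, r1 = to_map_frame(first, origin)
    l2, r2 = to_map_frame(second, origin)
    return (max(l1, l2), min(r1, r2))
```

Taking the narrowest indicated boundary is the conservative choice: the vehicle stays within whichever result constrains it more.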
6. The method of claim 1, wherein each of the first road boundary module and the second road boundary module is a module onboard the vehicle or a remote module.
7. A system for determining a road boundary for a vehicle, comprising:
a first road boundary module configured to provide a first result for a road boundary based on first data;
a second road boundary module configured to provide a second result for the road boundary based on second data, wherein the first data is closer to real-time road conditions than the second data; and
a determination module configured to:
obtaining the first result from the first road boundary module;
obtaining the second result from the second road boundary module;
determining a difference between the first result and the second result; and
determining the road boundary from the second result in response to the difference not being greater than a predetermined threshold, and determining the road boundary from a combination of the first result and the second result in response to the difference being greater than a predetermined threshold.
8. The system of claim 7, wherein the first data is from a sensor onboard the vehicle and the second data is from a pre-stored database.
9. The system of claim 8, wherein the sensor is a camera and the database is a high precision map.
10. The system of claim 8, wherein the road comprises one or more lanes.
11. The system of claim 7, wherein the determination module is further configured to determine the road boundary from a combination of the first result and the second result based on:
converting the first result and the second result, respectively, into the same coordinate system to obtain a first converted result and a second converted result;
superimposing the first converted result and the second converted result to obtain a combined result; and
determining the road boundary according to the narrowest road boundary indicated by the combined result.
12. The system of claim 7, wherein each of the first road boundary module, the second road boundary module, and the determination module is a module onboard the vehicle or a remote module.
13. An apparatus for determining a road boundary for a vehicle, comprising:
one or more processors; and
one or more memories configured to store a series of computer-executable instructions,
wherein the series of computer-executable instructions, when executed by the one or more processors, cause the one or more processors to perform the method recited in any one of claims 1-6.
14. The apparatus of claim 13, wherein the apparatus is loaded on the vehicle.
15. A non-transitory computer-readable storage medium having stored thereon a series of computer-executable instructions that, when executed by one or more computing devices, cause the one or more computing devices to perform the method of any of claims 1-6.
CN202110193167.4A 2021-02-20 2021-02-20 Method, system and apparatus for determining road boundaries for vehicles Pending CN115033651A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110193167.4A CN115033651A (en) 2021-02-20 2021-02-20 Method, system and apparatus for determining road boundaries for vehicles


Publications (1)

Publication Number Publication Date
CN115033651A true CN115033651A (en) 2022-09-09

Family

ID=83117690

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110193167.4A Pending CN115033651A (en) 2021-02-20 2021-02-20 Method, system and apparatus for determining road boundaries for vehicles

Country Status (1)

Country Link
CN (1) CN115033651A (en)

Similar Documents

Publication Publication Date Title
EP3660807A1 (en) Server device and vehicle
US11663835B2 (en) Method for operating a navigation system
KR102630991B1 (en) Method for determining driving posision of vehicle, apparatus thereof and driving control system
US20220324387A1 (en) Display control system, display control method, and non-transitory storage medium
JP2008262481A (en) Vehicle control device
CN115033651A (en) Method, system and apparatus for determining road boundaries for vehicles
JP2022139009A (en) Drive support device, drive support method, and program
CN112885087A (en) Method, apparatus, device and medium for determining road condition information and program product
CN114973742A (en) Method, system and device for verifying positioning information of vehicle
EP4064220A1 (en) Method, system and device for detecting traffic light for vehicle
CN113494911B (en) Method and system for positioning vehicle
JP7203905B2 (en) CONTROL DEVICE, MOVING OBJECT, CONTROL METHOD AND PROGRAM
JP7307824B1 (en) Information processing device, mobile object, system, information processing method, and program
JP7415849B2 (en) Programs for vehicle systems and object signs
JP7494231B2 (en) Information processing device, mobile object, server, program, and method
JP7449206B2 (en) Communication control device, vehicle, program, and communication control method
JP7203902B2 (en) CONTROL DEVICE, MOVING OBJECT, CONTROL METHOD AND PROGRAM
WO2023286303A1 (en) Vehicle control appraratus
CN115457805B (en) Control device, moving body, control method, and computer-readable storage medium
US20240132101A1 (en) Autonomous driving system in heterogeneous sd map and hd map environment and management method for the autonomous driving
CN114973170A (en) Autonomous driving method and system
JP7126629B1 (en) Information integration device, information integration method, and information integration program
US20230306752A1 (en) Information processing apparatus, moving object, system, information processing method, and server
US20230266133A1 (en) Information processing apparatus, moving object, server, and method
JP2022048829A (en) Communication control device, vehicle, program, and communication control method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination