CN114291084A - Method and device for controlling a vehicle - Google Patents

Method and device for controlling a vehicle

Info

Publication number
CN114291084A
CN114291084A
Authority
CN
China
Prior art keywords
obstacle
determining
road boundary
response
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210033318.4A
Other languages
Chinese (zh)
Inventor
高斌
刘祥
张双
朱晓星
薛晶晶
杨凡
王成法
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202210033318.4A
Publication of CN114291084A
Legal status: Pending

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095Predicting travel path or likelihood of collision
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/09626Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages where the origin of the information is within the own vehicle, e.g. a local storage device, digital map
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096708Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
    • G08G1/096725Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information generates an automatic action on the vehicle control
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/165Anti-collision systems for passive traffic, e.g. including static obstacles, trees

Abstract

Embodiments of the present disclosure disclose a method and apparatus for controlling a vehicle. One embodiment of the method comprises: acquiring a point cloud at a road boundary collected during travel of the vehicle; in response to determining that the point cloud includes an obstacle, determining the road boundary attribute of the position where the point cloud is located according to preset map data; determining whether the obstacle is a false detection obstacle based on the road boundary attribute and the point cloud of the obstacle; and controlling the vehicle to continue traveling at the current traveling speed in response to determining that the obstacle is a false detection obstacle. This embodiment avoids false detection of obstacles affecting the normal travel of the vehicle.

Description

Method and device for controlling a vehicle
Technical Field
The disclosed embodiments relate to the field of computer technology, and in particular, to a method and apparatus for controlling a vehicle.
Background
While an autonomous vehicle is driving, path planning and driving control are realized by comprehensively analyzing and processing information collected by the various sensors (such as cameras, lidar, and the like) mounted on the vehicle. For example, obstacle detection may be performed on a laser point cloud collected by a lidar installed on an autonomous vehicle, and motion estimation may be performed on the detected obstacles so as to avoid them. In some cases, various objects may be present near the road boundary, such as green plants, railings, and so forth. If these objects are falsely detected as obstacles, the normal driving of the autonomous vehicle may be affected. For example, if branches of roadside green plants protruding into the road are falsely detected as obstacles on the road, the vehicle may detour around them or stop.
Disclosure of Invention
The disclosed embodiments provide a method and apparatus for controlling a vehicle.
In a first aspect, embodiments of the present disclosure provide a method for controlling a vehicle, the method comprising: acquiring a point cloud at a road boundary collected during travel of the vehicle; in response to determining that the point cloud includes an obstacle, determining the road boundary attribute of the position where the point cloud is located according to preset map data; determining whether the obstacle is a false detection obstacle based on the road boundary attribute and the point cloud of the obstacle; and controlling the vehicle to continue traveling at the current traveling speed in response to determining that the obstacle is a false detection obstacle.

In some embodiments, in response to determining that the obstacle is not a false detection obstacle, control information is sent to the vehicle based on the position and state of the obstacle so as to control the vehicle to avoid a collision with the obstacle.

In some embodiments, the road boundary attribute comprises green plants; and the determining whether the obstacle is a false detection obstacle based on the road boundary attribute and the point cloud of the obstacle includes: in response to determining that the obstacle is stationary, performing the following first operation steps: in response to determining that the obstacle is an unknown-type obstacle, determining the minimum distance between the point data located at the innermost side of the road boundary in the point cloud of the obstacle and the road boundary on the side where the obstacle is located; in response to determining that the minimum distance is smaller than a preset threshold, determining that the obstacle is a false detection obstacle; and in response to determining that the minimum distance is greater than or equal to the preset threshold, determining that the obstacle is not a false detection obstacle.

In some embodiments, the road boundary attribute comprises a wall or a railing; and the determining whether the obstacle is a false detection obstacle based on the road boundary attribute and the point cloud of the obstacle includes: in response to determining that the obstacle is stationary, performing the following second operation steps: determining whether the obstacle is located outside the road boundary; in response to determining that the obstacle is located outside the road boundary, determining that the obstacle is a false detection obstacle; and in response to determining that the obstacle is located inside the road boundary, determining that the obstacle is not a false detection obstacle.

In some embodiments, the state includes a stationary state and a moving state; and the sending of control information to the vehicle based on the position and state of the obstacle includes: in response to determining that the state is the moving state, determining the position and moving speed of the obstacle according to the point cloud of the obstacle; and sending control information to the vehicle according to the position and moving speed of the obstacle so as to control the vehicle to avoid a collision with the obstacle.
In a second aspect, embodiments of the present disclosure provide an apparatus for controlling a vehicle, the apparatus comprising: an acquisition unit configured to acquire a point cloud at a road boundary collected during travel of the vehicle; a first determining unit configured to determine, in response to determining that the point cloud includes an obstacle, the road boundary attribute of the position where the point cloud is located according to preset map data; a second determining unit configured to determine whether the obstacle is a false detection obstacle based on the road boundary attribute and the point cloud of the obstacle; and a first control unit configured to control the vehicle to continue traveling at the current traveling speed in response to determining that the obstacle is a false detection obstacle.

In some embodiments, the apparatus further comprises: a second control unit configured to send, in response to determining that the obstacle is not a false detection obstacle, control information to the vehicle based on the position and state of the obstacle so as to control the vehicle to avoid a collision with the obstacle.

In some embodiments, the road boundary attribute comprises green plants; and the second determining unit is further configured to: in response to determining that the obstacle is stationary, perform the following first operation steps: in response to determining that the obstacle is an unknown-type obstacle, determining the minimum distance between the point data located at the innermost side of the road boundary in the point cloud of the obstacle and the road boundary on the side where the obstacle is located; in response to determining that the minimum distance is smaller than a preset threshold, determining that the obstacle is a false detection obstacle; and in response to determining that the minimum distance is greater than or equal to the preset threshold, determining that the obstacle is not a false detection obstacle.

In some embodiments, the road boundary attribute comprises a wall or a railing; and the second determining unit is further configured to: in response to determining that the obstacle is stationary, perform the following second operation steps: determining whether the obstacle is located outside the road boundary; in response to determining that the obstacle is located outside the road boundary, determining that the obstacle is a false detection obstacle; and in response to determining that the obstacle is located inside the road boundary, determining that the obstacle is not a false detection obstacle.

In some embodiments, the state includes a stationary state and a moving state; and the second control unit is further configured to: in response to determining that the state is the moving state, determine the position and moving speed of the obstacle according to the point cloud of the obstacle; and send control information to the vehicle according to the position and moving speed of the obstacle so as to control the vehicle to avoid a collision with the obstacle.
In a third aspect, an embodiment of the present disclosure provides an apparatus, including: one or more processors; a storage device, on which one or more programs are stored, which, when executed by the one or more processors, cause the one or more processors to implement the method as described in any implementation manner of the first aspect.
In a fourth aspect, the disclosed embodiments provide a computer-readable medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the method as described in any implementation manner of the first aspect.
According to the method and apparatus for controlling a vehicle provided by the embodiments of the present disclosure, a point cloud at a road boundary collected during travel of the vehicle is first acquired. Obstacle identification is then performed on the point cloud, and if the point cloud includes an obstacle, the road boundary attribute of the position where the point cloud is located is determined according to preset map data. Next, whether the obstacle is a false detection obstacle is determined based on the road boundary attribute and the point cloud of the obstacle. If the obstacle is a false detection obstacle, the vehicle is controlled to continue traveling at the current traveling speed. In this way, the influence of falsely detected obstacles on the normal travel of the vehicle is avoided.
Drawings
Other features, objects and advantages of the disclosure will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which one embodiment of the present disclosure may be applied;
FIG. 2 is a flow chart of one embodiment of a method for controlling a vehicle according to the present disclosure;
FIG. 3 is a schematic diagram of one application scenario of a method for controlling a vehicle according to the present disclosure;
FIG. 4 is a flow chart of yet another embodiment of a method for controlling a vehicle according to the present disclosure;
FIG. 5 is a schematic block diagram of one embodiment of an apparatus for controlling a vehicle according to the present disclosure;
FIG. 6 is a schematic block diagram of a computer system suitable for use in implementing an electronic device of an embodiment of the present disclosure.
Detailed Description
The present disclosure is described in further detail below with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that, in the present disclosure, the embodiments and features of the embodiments may be combined with each other without conflict. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 illustrates an exemplary system architecture 100 for a method for controlling a vehicle or an apparatus for controlling a vehicle to which embodiments of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include vehicles 101, 102, 103, a network 104, and a server 105. The network 104 is used to provide a medium for communication links between the vehicles 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The vehicles 101, 102, 103 may interact with the server 105 over the network 104 to receive or send messages and the like. The vehicles 101, 102, 103 may be equipped with various information collection devices, such as image collection devices, binocular cameras, sensors, lidar, and the like, which may be used to collect environment information inside and outside the vehicles 101, 102, 103. The vehicles 101, 102, 103 may further be equipped with an onboard intelligent brain (not shown in the figure), which may receive the information collected by the information collection devices, analyze and process it, and then control the vehicles 101, 102, 103 to perform corresponding operations (e.g., continue driving, emergency stop) according to the processing result. The vehicles 101, 102, 103 may be vehicles with an autonomous driving mode, including fully autonomous vehicles and vehicles that can be switched into an autonomous driving mode.
The vehicles 101, 102, 103 may be various types of vehicles including, but not limited to, large buses, tractors, city buses, medium buses, large trucks, minicars, and the like.
The server 105 may be a server that provides various services, such as a backend server that processes information sent by the vehicles 101, 102, 103. The backend server may perform various analysis processes on the received information and transmit control information to the vehicles 101, 102, 103 according to the processing result to control the vehicles 101, 102, 103.
The server 105 may be hardware or software. When the server 105 is hardware, it may be implemented as a distributed server cluster composed of a plurality of servers, or may be implemented as a single server. When the server 105 is software, it may be implemented as multiple pieces of software or software modules (e.g., to provide distributed services), or as a single piece of software or software module. And is not particularly limited herein.
It should be understood that the number of vehicles, networks, and servers in FIG. 1 is merely illustrative. There may be any number of vehicles, networks, and servers, as desired for implementation.
It should be noted that the method for controlling the vehicle provided in the embodiment of the present application may be executed by the onboard intelligent brains installed on the vehicles 101, 102, 103, or may be executed by the server 105. Accordingly, the device for controlling the vehicle may be provided in the in-vehicle intelligent brain mounted on the vehicles 101, 102, 103, or may be provided in the server 105.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method for controlling a vehicle according to the present disclosure is shown. The method for controlling a vehicle includes the steps of:
Step 201, acquiring the point cloud at the road boundary collected during travel of the vehicle.
In the present embodiment, the executing subject of the method for controlling the vehicle (for example, the onboard intelligent brain of the vehicles 101, 102, 103 shown in fig. 1, or the server 105) may acquire, through a wired or wireless connection, point cloud data of the road environment ahead collected by the lidar during travel of the vehicle. Here, each point datum in the point cloud data may include three-dimensional coordinates. In general, the three-dimensional coordinates of a point datum include information on the X-axis, Y-axis, and Z-axis. The executing subject may determine the road boundary of the road on which the vehicle is located in various ways. For example, the road boundary may be determined from a high-precision map. As another example, the executing subject may identify the road boundary based on the point cloud data collected by the lidar. Thereafter, the executing subject may extract the point cloud at the road boundary from the acquired point cloud data. As an example, the point cloud at the road boundary may refer to the point cloud within a preset range of the position where the road boundary is located. Here, the preset range may be set according to actual needs.
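As an illustrative sketch (not part of the patent), extracting the point cloud "at the road boundary" as described above might look like the following, assuming the road boundary is available as a sampled polyline in the same X-Y frame as the lidar points; the function name and the preset range value are assumptions:

```python
import numpy as np

def points_near_boundary(points, boundary, preset_range=1.0):
    """Keep the points whose horizontal (X-Y) distance to the nearest
    sampled road-boundary point is within a preset range.

    points:   (N, 3) array of X/Y/Z lidar coordinates
    boundary: (M, 2) array of X/Y samples along the road boundary
    """
    # Pairwise X-Y distances between every point and every boundary sample.
    d = np.linalg.norm(points[:, None, :2] - boundary[None, :, :], axis=2)
    # Each point's distance to its nearest boundary sample.
    nearest = d.min(axis=1)
    return points[nearest <= preset_range]
```

With a sufficiently dense boundary polyline, the distance to the nearest sample approximates the distance to the boundary itself.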
Step 202, in response to determining that the point cloud includes an obstacle, determining the road boundary attribute of the position where the point cloud is located according to preset map data.
In this embodiment, the executing subject may perform obstacle identification and tracking on the point cloud at the road boundary, so as to identify whether an obstacle is included in the point cloud, and which point data in the point cloud are used to describe the same obstacle. Here, the obstacles may include, but are not limited to, trees, warning signs, traffic signs, pedestrians, animals, vehicles, and the like.
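The patent does not specify how obstacle identification is performed; as a hedged stand-in, a naive single-link Euclidean clustering that groups the boundary point cloud into candidate obstacles could look like this (the radius parameter and function name are assumptions, and a production system would use a proper clustering or detection model):

```python
import numpy as np

def cluster_obstacles(points, radius=0.5):
    """Group points closer than `radius` (in the X-Y plane) into candidate
    obstacles, returning a list of index groups (one group per obstacle)."""
    n = len(points)
    labels = list(range(n))  # union-find parent array

    def find(i):
        # Find the representative of i's cluster, with path halving.
        while labels[i] != i:
            labels[i] = labels[labels[i]]
            i = labels[i]
        return i

    # Union every pair of points within `radius` of each other.
    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(points[i, :2] - points[j, :2]) <= radius:
                labels[find(i)] = find(j)

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())
```

Each returned group of indices then plays the role of "the point data used to describe the same obstacle" in the text above.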
If the point cloud includes an obstacle, the executing subject may determine the road boundary attribute of the position where the point cloud is located according to preset map data. Here, the map data may be a high-precision map for assisting the travel of the vehicle; it may include road data such as lane information (the position, type, width, gradient, and curvature of lane lines), and may further include information on fixed objects around the lane, such as traffic signs and road boundary attributes. Here, a road boundary attribute may be used to indicate the attribute of an object at the road boundary; as an example, road boundary attributes may include, but are not limited to, green plants, railings, walls, and the like.
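As a minimal sketch of looking up the road boundary attribute from preset map data (the map representation here, a flat list of position/attribute pairs, is an assumption; real high-precision maps store much richer geometry):

```python
import math

def boundary_attribute(map_entries, position):
    """Return the road-boundary attribute ('green_plant', 'railing',
    'wall', ...) recorded in the map entry nearest to `position`.

    map_entries: list of ((x, y), attribute) pairs from the preset map data
    position:    (x, y) location of the point cloud
    """
    x, y = position
    # Pick the map entry whose recorded location is closest to the query point.
    nearest = min(map_entries, key=lambda e: math.hypot(e[0][0] - x, e[0][1] - y))
    return nearest[1]
```

The attribute returned here is what the subsequent steps branch on (green plants vs. wall or railing).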
Step 203, determining whether the obstacle is a false detection obstacle based on the road boundary attribute and the point cloud of the obstacle.
In this embodiment, the execution subject may determine whether the obstacle is a false detection obstacle based on the road boundary attribute and the point cloud of the obstacle. Here, the point cloud of the obstacle may refer to a point cloud composed of point data for describing the obstacle.
In some optional implementations of this embodiment, the road boundary attribute may include green plants; and the step 203 may be specifically performed as follows:
in response to determining that the obstacle is stationary, first operation steps S11-S13 are performed.
In this implementation, the executing subject may determine whether the obstacle is stationary according to the point cloud of the obstacle. If the obstacle is not stationary but moving, the executing subject may determine the position and moving speed of the obstacle from the point cloud of the obstacle and send control information to the vehicle according to the position and moving speed of the obstacle, so as to control the vehicle to avoid a collision with the obstacle. In some cases, there may be moving obstacles such as people or animals under the green plants at the road boundary; in such cases the vehicle needs to avoid the obstacle according to its position and moving speed so as to avoid a collision. If the obstacle is stationary, the executing subject may perform the first operation steps S11-S13.
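The position and moving speed mentioned above could, under the simplifying assumption that the obstacle's centroid tracks its motion between two lidar frames, be estimated as follows (the function name and frame interval are assumptions):

```python
import numpy as np

def estimate_position_and_speed(prev_points, curr_points, dt):
    """Estimate an obstacle's current X-Y position and moving speed from
    the centroids of its point cloud in two consecutive frames.

    prev_points, curr_points: (N, 3) point clouds of the same obstacle
    dt: time between the two frames, in seconds
    """
    p_prev = prev_points[:, :2].mean(axis=0)
    p_curr = curr_points[:, :2].mean(axis=0)
    speed = float(np.linalg.norm(p_curr - p_prev) / dt)  # metres per second
    return p_curr, speed
```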
Step S11, in response to determining that the obstacle is an unknown-type obstacle, determining the minimum distance between the point data located at the innermost side of the road boundary in the point cloud of the obstacle and the road boundary on the side where the obstacle is located.
In this implementation, the executing subject may identify the type of the obstacle. For example, the executing subject may use a trained point cloud recognition model to identify obstacles in the point cloud. The point cloud recognition model may be a machine learning model. The executing subject may input the point cloud at the input side of the point cloud recognition model and obtain the type of the obstacle at the output side. The types of obstacles may include trees, pedestrians, vehicles, and so on. Obstacles that cannot be identified are labeled by the point cloud recognition model as unknown-type obstacles.
In response to determining that the obstacle is an unknown-type obstacle, the executing subject may determine the minimum distance between the point data located at the innermost side of the road boundary in the point cloud of the obstacle and the road boundary on the side where the obstacle is located. Here, the road boundary may include an inner side and an outer side: the inner side is the side used for vehicle travel, which may include lane lines, while the outer side is the side not used for vehicle travel, which may include green plants, railings, walls, and the like.
Step S12, in response to determining that the minimum distance is smaller than a preset threshold, determining that the obstacle is a false detection obstacle.
In the present implementation, the executing subject may determine whether the minimum distance determined in step S11 is less than a preset threshold. Here, the threshold may be set according to actual needs; for example, it may be set to 4 meters. If the minimum distance is smaller than the preset threshold, the obstacle is determined to be a false detection obstacle. In practice, suppose the road boundary attribute of a certain road boundary is determined from the map data to be green plants, the obstacle at that road boundary is stationary and of unknown type, and the minimum distance from the point data located at the innermost side of the road boundary in the point cloud of the obstacle to the road boundary is smaller than the preset threshold; the obstacle is then likely a branch of the green plants extending to the inner side of the road boundary. Accordingly, the executing subject may determine that the obstacle is a false detection obstacle. In practical applications, when detecting obstacles based on a point cloud, the point cloud of an obstacle is in some cases projected onto the XOY plane. In that case, branches of green plants above the road that extend to the inner side of the road boundary may be erroneously detected as obstacles. The length by which such branches extend to the inner side of the road boundary is usually within a certain range, so these falsely detected obstacles can be filtered out with a preset threshold.
In step S13, in response to determining that the minimum distance is greater than or equal to a preset threshold, it is determined that the obstacle is not a false detection obstacle.
In the present implementation, if it is determined that the minimum distance determined in step S11 is greater than or equal to the preset threshold, the executing subject may determine that the obstacle is not a false detection obstacle.
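Putting the first operation steps S11-S13 together, a hedged sketch could look as follows; the function and parameter names are assumptions, the 4-metre default follows the example threshold given above, and the "innermost" point is approximated as the obstacle point protruding deepest past a sampled boundary polyline:

```python
import numpy as np

def first_operation_steps(obstacle_points, boundary, stationary, obstacle_type,
                          threshold=4.0):
    """Steps S11-S13 for the 'green plants' road boundary attribute.

    Returns True if the obstacle is judged a false detection (e.g. a branch
    protruding only a short way inside the boundary), False otherwise.
    """
    if not stationary or obstacle_type != "unknown":
        return False  # the check applies only to stationary, unknown-type obstacles
    # Each obstacle point's X-Y distance to its nearest boundary sample.
    d = np.linalg.norm(obstacle_points[:, None, :2] - boundary[None, :, :], axis=2)
    per_point = d.min(axis=1)
    # S11: the innermost point protrudes deepest from the boundary, so its
    # distance to the boundary is the maximum of the per-point distances.
    min_distance = per_point.max()
    # S12/S13: compare against the preset threshold (e.g. 4 metres).
    return bool(min_distance < threshold)
```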
In some optional implementations of this embodiment, the road boundary attribute may include a wall or a rail; and the step 203 may be specifically performed as follows:
in response to determining that the obstacle is stationary, second operation steps S21-S23 are performed.
In this implementation, the executing subject may determine whether the obstacle is stationary according to the point cloud of the obstacle. If the obstacle is stationary, the executing subject may perform the second operation steps S21-S23.
In step S21, it is determined whether the obstacle is located outside the road boundary.
In this implementation, the executing subject may determine whether the obstacle is located outside the road boundary according to the point cloud of the obstacle.
In step S22, in response to determining that the obstacle is located outside the road boundary, the obstacle is determined to be a false detection obstacle.
In this implementation, if it is determined that the obstacle is located outside the road boundary, the executing subject may determine that the obstacle is a false detection obstacle. In practice, suppose the road boundary attribute of a road boundary at a certain position is determined from the map data to be a wall or a railing, the obstacle at that road boundary is stationary, and the obstacle is located outside the road boundary; the obstacle is then likely the falsely detected wall or railing of the road boundary itself. Accordingly, the executing subject may determine that the obstacle is a false detection obstacle.
In step S23, in response to determining that the obstacle is located inside the road boundary, it is determined that the obstacle is not a false detection obstacle.
In this implementation, the execution subject may determine that the obstacle is not a false detection obstacle if it is determined that the obstacle is located inside the road boundary.
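The inside/outside test in steps S21-S23 can be sketched with a 2-D cross product, under the assumption that the boundary is locally a directed segment whose left side is the inner (drivable) side of the road; all names here are illustrative:

```python
import numpy as np

def is_outside_boundary(obstacle_points, p0, p1):
    """Return True if the obstacle centroid lies outside the road boundary,
    modelled locally as the directed segment p0 -> p1 whose left side is
    the inner (drivable) side of the road."""
    centroid = obstacle_points[:, :2].mean(axis=0)
    v = np.asarray(p1, dtype=float) - np.asarray(p0, dtype=float)  # boundary direction
    w = centroid - np.asarray(p0, dtype=float)
    cross = v[0] * w[1] - v[1] * w[0]  # z-component of the 2-D cross product
    return bool(cross < 0)  # negative: centroid on the right, i.e. outside
```

A stationary obstacle found outside the boundary would then be treated as a false detection of the wall or railing itself (S22), while one inside would not (S23).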
Step 204, in response to determining that the obstacle is a false detection obstacle, controlling the vehicle to continue traveling at the current traveling speed.
In the present embodiment, if it is determined that the obstacle is the false detection obstacle, the execution subject may control the vehicle to continue traveling at the current traveling speed.
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of the method for controlling a vehicle according to the present embodiment. In the application scenario of fig. 3, the onboard intelligent brain (not shown in the figure) of the vehicle 301 first acquires the point cloud at the road boundary collected during the vehicle's travel. Then, if the point cloud is identified as including the obstacle 302, the onboard intelligent brain may determine the road boundary attribute "green plants" of the position where the point cloud is located according to preset map data. Next, the onboard intelligent brain determines whether the obstacle 302 is a false detection obstacle based on the road boundary attribute "green plants" and the point cloud of the obstacle 302. If the obstacle 302 is determined to be a false detection obstacle, the onboard intelligent brain may control the vehicle to continue traveling at the current traveling speed.
According to the method provided by the embodiment of the disclosure, whether the obstacle identified in the point cloud at the road boundary is the false detection obstacle is determined based on the road boundary attribute, and if the obstacle is the false detection obstacle, the vehicle is controlled to continue to run at the current running speed, so that the influence of the false detection obstacle on the normal running of the vehicle is avoided.
With further reference to FIG. 4, a flow 400 of yet another embodiment of a method for controlling a vehicle is shown. The process 400 of the method for controlling a vehicle includes the steps of:
Step 401, acquiring the point cloud at the road boundary collected during the driving of the vehicle.
In this embodiment, step 401 is similar to step 201 of the embodiment shown in fig. 2, and is not described here again.
Step 402, in response to recognizing that the point cloud includes an obstacle, determining the road boundary attribute at the position of the point cloud according to preset map data.
In this embodiment, step 402 is similar to step 202 of the embodiment shown in fig. 2, and is not described herein again.
Step 403, determining whether the obstacle is a false detection obstacle based on the road boundary attribute and the point cloud of the obstacle.
In this embodiment, step 403 is similar to step 203 of the embodiment shown in fig. 2, and is not described herein again.
Step 404, in response to determining that the obstacle is a false detection obstacle, controlling the vehicle to continue traveling at the current traveling speed.
In this embodiment, step 404 is similar to step 204 of the embodiment shown in fig. 2, and is not described here again.
Step 405, in response to determining that the obstacle is not a false detection obstacle, sending control information to the vehicle based on the position and state of the obstacle to control the vehicle to avoid a collision with the obstacle.
In the present embodiment, if it is determined that the obstacle is not a false detection obstacle, the execution subject may transmit control information to the vehicle based on the position and state of the obstacle to control the vehicle to avoid a collision with the obstacle. Here, the state of the obstacle may include a stationary state and a moving state. If the state of the obstacle is the stationary state, the execution subject may transmit control information to the vehicle according to the position of the obstacle to control the vehicle to stop traveling or to travel around the obstacle.
In some optional implementations of this embodiment, the state may include a stationary state and a moving state; and in step 405, sending control information to the vehicle based on the position and state of the obstacle may be specifically performed as follows:
First, in response to determining that the state is the moving state, the position and moving speed of the obstacle are determined from the point cloud of the obstacle.
In this implementation, the execution subject may determine the state of the obstacle. If the state of the obstacle is the moving state, the execution subject may determine the position and the moving speed of the obstacle from the point cloud of the obstacle. As an example, the execution subject may determine the moving speed of the obstacle from the point cloud of the obstacle acquired at the current acquisition time and the point cloud of the obstacle acquired at the previous acquisition time, where the moving speed may include a magnitude and a direction.
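As an illustration of this estimation step, the sketch below derives the obstacle's moving speed from two consecutive point-cloud frames, taking the obstacle position to be the centroid of its points. The centroid estimator, the 2-D point representation, and all names are assumptions for illustration; the embodiment does not prescribe a particular estimator.

```python
# Sketch: estimate an obstacle's moving speed from the point clouds
# acquired at two consecutive acquisition times. The obstacle position
# is approximated by the centroid of its points (an assumed choice).

def centroid(points):
    """Centroid of a list of (x, y) tuples."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def moving_speed(cloud_prev, cloud_curr, dt):
    """Return (speed magnitude, (vx, vy)) between two frames dt apart."""
    (x0, y0), (x1, y1) = centroid(cloud_prev), centroid(cloud_curr)
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    magnitude = (vx ** 2 + vy ** 2) ** 0.5
    return magnitude, (vx, vy)
```

A centroid shift of 1 m between frames 0.5 s apart, for example, yields a speed magnitude of 2 m/s directed along the shift.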
Then, control information is transmitted to the vehicle according to the position and the moving speed of the obstacle to control the vehicle to avoid collision with the obstacle.
In this implementation, the execution subject may transmit control information to the vehicle according to the position and the moving speed of the obstacle to control the vehicle to avoid a collision with the obstacle.
For example, the execution subject may predict whether the vehicle will collide with the obstacle if it continues traveling at the current speed, based on the moving speed of the obstacle, the traveling speed of the vehicle, and the distances between the obstacle and the vehicle on the X axis and the Y axis. Here, the traveling speed of the vehicle includes a magnitude and a direction. For example, the execution subject may make this prediction as follows. First, a first relative speed of the obstacle and the vehicle on the X axis is determined from the X-axis components of the moving speed of the obstacle and the traveling speed of the vehicle. Second, a second relative speed of the obstacle and the vehicle on the Y axis is determined from the Y-axis components of the moving speed of the obstacle and the traveling speed of the vehicle. Third, the distance between the obstacle and the vehicle on the X axis is taken as the X-axis distance, and the ratio of the X-axis distance to the first relative speed is calculated to obtain a first time. Fourth, the distance between the obstacle and the vehicle on the Y axis is taken as the Y-axis distance, and the ratio of the Y-axis distance to the second relative speed is calculated to obtain a second time. Finally, in response to determining that the difference between the first time and the second time is less than a preset time interval, it is predicted that the vehicle will collide with the obstacle if it continues traveling at the current traveling speed.
If it is predicted that the vehicle will collide with the obstacle while continuing to travel at the current speed, the execution subject may transmit control information to the vehicle. Here, the control information may be used to control the vehicle to avoid a collision with the obstacle. For example, the control information may be used to control the vehicle to stop traveling, or to travel around the obstacle.
If it is predicted that the vehicle will not collide with the obstacle while continuing to travel at the current speed, the execution subject may control the vehicle to continue traveling at the current traveling speed.
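The prediction steps described above can be sketched as follows. The per-axis sign convention, the positive-time guard, the default interval value, and all names are assumptions for illustration rather than details fixed by the embodiment:

```python
# Sketch of the per-axis collision prediction: compute the relative
# speed and the gap-closing time on each axis, then predict a collision
# when the two times are close and both positive (illustrative guard).

def predicts_collision(obstacle_v, vehicle_v, dx, dy, max_gap=0.5):
    """obstacle_v / vehicle_v: (vx, vy) tuples; dx, dy: axis distances."""
    rel_vx = vehicle_v[0] - obstacle_v[0]   # first relative speed (X axis)
    rel_vy = vehicle_v[1] - obstacle_v[1]   # second relative speed (Y axis)
    if rel_vx == 0 or rel_vy == 0:          # gap on that axis never closes
        return False
    t1 = dx / rel_vx                        # first time
    t2 = dy / rel_vy                        # second time
    return t1 > 0 and t2 > 0 and abs(t1 - t2) < max_gap
```

When both gaps close at nearly the same moment (the two times differ by less than the preset interval), the vehicle and the obstacle are expected to occupy the same region, so a collision is predicted.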
As can be seen from fig. 4, compared with the embodiment corresponding to fig. 2, the flow 400 of the method for controlling a vehicle in the present embodiment highlights the step of transmitting control information to the vehicle based on the position and state of the obstacle when the obstacle is not a false detection obstacle. Therefore, the scheme described in this embodiment not only avoids the influence of a false detection obstacle on the normal traveling of the vehicle, but also ensures safe traveling of the vehicle when the obstacle is not a false detection obstacle.
With further reference to fig. 5, as an implementation of the methods shown in the above figures, the present disclosure provides one embodiment of an apparatus for controlling a vehicle, which corresponds to the method embodiment shown in fig. 2, and which may be particularly applied in various electronic devices.
As shown in fig. 5, the apparatus 500 for controlling a vehicle of the present embodiment includes: an acquisition unit 501, a first determination unit 502, a second determination unit 503, and a first control unit 504. Wherein the obtaining unit 501 is configured to obtain a point cloud at a road boundary acquired during the driving of the vehicle; the first determining unit 502 is configured to determine, according to preset map data, a road boundary attribute at a position where the point cloud is located, in response to recognizing that an obstacle is included in the point cloud; the second determination unit 503 is configured to determine whether the obstacle is a false detection obstacle based on the road boundary attribute and the point cloud of the obstacle; the first control unit 504 is configured to control the vehicle to continue traveling at the current traveling speed in response to a determination that the obstacle is a false detection obstacle.
In this embodiment, specific processes of the obtaining unit 501, the first determining unit 502, the second determining unit 503, and the first controlling unit 504 of the apparatus 500 for controlling a vehicle and technical effects brought by the specific processes can refer to related descriptions of step 201, step 202, step 203, and step 204 in the corresponding embodiment of fig. 2, which are not repeated herein.
In some optional implementations of this embodiment, the apparatus 500 further includes: a second control unit (not shown in the figure) configured to send control information to the vehicle to control the vehicle to avoid a collision with the obstacle based on the position and state of the obstacle in response to determining that the obstacle is not a false detection obstacle.
In some optional implementations of this embodiment, the road boundary attribute includes green plants; and the second determining unit 503 is further configured to: in response to determining that the obstacle is stationary, performing a first operation of: in response to the fact that the obstacle is determined to be an unknown type obstacle, determining the minimum distance between the point data located at the innermost side of the road boundary in the point cloud of the obstacle and the road boundary where the obstacle is located; in response to determining that the minimum distance is smaller than a preset threshold value, determining that the obstacle is a false detection obstacle; determining that the obstacle is not a false detection obstacle in response to determining that the minimum distance is greater than or equal to the preset threshold.
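As a minimal sketch of the first operational step for the "green plant" attribute, the snippet below models the road boundary as a vertical line x = boundary_x. That geometry, and all function and variable names, are assumptions made only for illustration; in practice the boundary comes from the preset map data and is generally a polyline.

```python
# Sketch of the "green plant" check: for a stationary, unknown-type
# obstacle, take the minimum distance from the obstacle's points to the
# road boundary (modeled here as the line x = boundary_x) and compare
# it with a preset threshold. Points protruding only slightly past the
# boundary are treated as vegetation, i.e. a false detection.

def is_false_detection_green(points, boundary_x, threshold):
    """points: obstacle point cloud as (x, y) tuples."""
    min_dist = min(abs(x - boundary_x) for x, _ in points)
    return min_dist < threshold
```

With this convention, an obstacle whose nearest point lies within the threshold of the boundary line is judged a false detection, and anything reaching farther into the road is treated as a real obstacle.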
In some optional implementations of this embodiment, the road boundary attribute includes a wall or a rail; and the second determining unit 503 is further configured to: in response to determining that the obstacle is stationary, performing the following second operational step: determining whether the obstacle is located outside a road boundary; determining the obstacle to be a false detection obstacle in response to determining that the obstacle is located outside a road boundary; determining that the obstacle is not a false detection obstacle in response to determining that the obstacle is located inside a road boundary.
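The second operational step for the "wall or rail" attribute can likewise be sketched with an assumed convention that the drivable side lies at x < boundary_x; the geometry and names are illustrative only, not prescribed by the embodiment.

```python
# Sketch of the wall/rail check: a stationary obstacle whose points all
# lie beyond the boundary line is outside the road, hence a false
# detection; otherwise it is inside the road and treated as real.

def is_false_detection_wall(points, boundary_x):
    """points: obstacle point cloud as (x, y) tuples."""
    return all(x >= boundary_x for x, _ in points)
```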
In some optional implementations of this embodiment, the state includes a stationary state and a moving state; and the second control unit is further configured to: in response to determining that the state is a moving state, determining the position and the moving speed of the obstacle according to the point cloud of the obstacle; and sending control information to the vehicle according to the position and the moving speed of the obstacle so as to control the vehicle to avoid collision with the obstacle.
Referring now to fig. 6, a schematic diagram of an electronic device (e.g., the server in fig. 1 or an onboard intelligent brain installed in the vehicles 101, 102, 103) 600 suitable for implementing embodiments of the present disclosure is shown. The electronic device shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 6, electronic device 600 may include a processing means (e.g., central processing unit, graphics processor, etc.) 601 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage means 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the electronic device 600 are also stored. The processing device 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
Generally, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 607 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 608 including, for example, tape, hard disk, etc.; and a communication device 609. The communication means 609 may allow the electronic device 600 to communicate with other devices wirelessly or by wire to exchange data. While fig. 6 illustrates an electronic device 600 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 6 may represent one device or may represent multiple devices as desired.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 609, or may be installed from the storage means 608, or may be installed from the ROM 602. The computer program, when executed by the processing device 601, performs the above-described functions defined in the methods of embodiments of the present disclosure.
It should be noted that the computer readable medium described in the embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In embodiments of the present disclosure, however, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquiring point clouds at a road boundary acquired in the driving process of a vehicle; in response to the fact that the point cloud comprises the obstacle, determining the road boundary attribute of the position where the point cloud is located according to preset map data; determining whether the obstacle is a false detection obstacle or not based on the road boundary attribute and the point cloud of the obstacle; and controlling the vehicle to continue running according to the current running speed in response to the fact that the obstacle is determined to be the false detection obstacle.
Computer program code for carrying out operations for embodiments of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor includes an acquisition unit, a first determination unit, a second determination unit, and a first control unit. The names of these units do not in some cases constitute a limitation of the unit itself, and for example, the acquisition unit may also be described as a "unit that acquires a point cloud at a road boundary acquired during travel of the vehicle".
The foregoing description is only exemplary of the preferred embodiments of the disclosure and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention in the embodiments of the present disclosure is not limited to the specific combination of the above-mentioned features, but also encompasses other embodiments in which any combination of the above-mentioned features or their equivalents is made without departing from the inventive concept as defined above. For example, the above features and (but not limited to) technical features with similar functions disclosed in the embodiments of the present disclosure are mutually replaced to form the technical solution.

Claims (12)

1. A method for controlling a vehicle, comprising:
acquiring point clouds at a road boundary acquired in the driving process of a vehicle;
in response to the fact that the point cloud comprises the obstacle, determining road boundary attributes of the position where the point cloud is located according to preset map data, wherein the road boundary attributes comprise railings;
determining whether the obstacle is a false detection obstacle based on the road boundary attribute and the point cloud of the obstacle; the determining whether the obstacle is a false detection obstacle based on the road boundary attribute and the point cloud of the obstacle includes: in response to determining that the obstacle is stationary, determining whether the obstacle is located outside a road boundary; in response to determining that the obstacle is located outside a road boundary, determining that the obstacle is a false positive obstacle;
and controlling the vehicle to continue running according to the current running speed in response to the fact that the obstacle is determined to be the false detection obstacle.
2. The method of claim 1, wherein the method further comprises:
in response to determining that the obstacle is not a false positive obstacle, sending control information to the vehicle to control the vehicle to avoid a collision with the obstacle based on the position and state of the obstacle.
3. The method of claim 1, wherein the road boundary attribute comprises green plants; and
the determining whether the obstacle is a false detection obstacle based on the road boundary attribute and the point cloud of the obstacle includes:
in response to determining that the obstacle is stationary, performing a first operational step of:
in response to the fact that the obstacle is determined to be an unknown type obstacle, determining the minimum distance between the point data located at the innermost side of the road boundary in the point cloud of the obstacle and the road boundary at the side where the obstacle is located;
in response to determining that the minimum distance is less than a preset threshold, determining that the obstacle is a false detection obstacle;
determining that the obstacle is not a false detection obstacle in response to determining that the minimum distance is greater than or equal to the preset threshold.
4. The method of claim 1, wherein the road boundary attribute comprises a wall; and
the determining whether the obstacle is a false detection obstacle based on the road boundary attribute and the point cloud of the obstacle includes:
in response to determining that the obstacle is stationary, performing the following second operational step:
determining whether the obstacle is located outside a road boundary;
in response to determining that the obstacle is located outside a road boundary, determining that the obstacle is a false positive obstacle;
determining that the obstacle is not a false detection obstacle in response to determining that the obstacle is located inside a road boundary.
5. The method of claim 2, wherein the states include a stationary state and a moving state; and
the transmitting control information to the vehicle based on the position and state of the obstacle includes:
in response to determining that the state is a moving state, determining a position and a moving speed of the obstacle from the point cloud of the obstacle;
and sending control information to the vehicle according to the position and the moving speed of the obstacle so as to control the vehicle to avoid collision with the obstacle.
6. An apparatus for controlling a vehicle, comprising:
an acquisition unit configured to acquire a point cloud at a road boundary acquired during a vehicle traveling;
the first determining unit is configured to respond to the fact that the point cloud comprises obstacles, and determine road boundary attributes at the position of the point cloud according to preset map data, wherein the road boundary attributes comprise railings;
a second determination unit configured to determine whether the obstacle is a false detection obstacle based on the road boundary attribute and the point cloud of the obstacle; the second determination unit is further configured to: in response to determining that the obstacle is stationary, determining whether the obstacle is located outside a road boundary; in response to determining that the obstacle is located outside a road boundary, determining that the obstacle is a false positive obstacle;
a first control unit configured to control the vehicle to continue traveling at a current traveling speed in response to determining that the obstacle is a false detection obstacle.
7. The apparatus of claim 6, wherein the apparatus further comprises:
a second control unit configured to transmit control information to the vehicle to control the vehicle to avoid a collision with the obstacle based on a position and a state of the obstacle in response to determining that the obstacle is not a false detection obstacle.
8. The apparatus of claim 6, wherein the road boundary attribute comprises green plants; and
the second determination unit is further configured to:
in response to determining that the obstacle is stationary, performing a first operational step of:
in response to the fact that the obstacle is determined to be an unknown type obstacle, determining the minimum distance between the point data located at the innermost side of the road boundary in the point cloud of the obstacle and the road boundary at the side where the obstacle is located;
in response to determining that the minimum distance is less than a preset threshold, determining that the obstacle is a false detection obstacle;
determining that the obstacle is not a false detection obstacle in response to determining that the minimum distance is greater than or equal to the preset threshold.
9. The apparatus of claim 6, wherein the road boundary attribute comprises a wall; and
the second determination unit is further configured to:
in response to determining that the obstacle is stationary, performing the following second operational step:
determining whether the obstacle is located outside a road boundary;
in response to determining that the obstacle is located outside a road boundary, determining that the obstacle is a false positive obstacle;
determining that the obstacle is not a false detection obstacle in response to determining that the obstacle is located inside a road boundary.
10. The apparatus of claim 7, wherein the states comprise a stationary state and a moving state; and
the second control unit is further configured to:
in response to determining that the state is a moving state, determining a position and a moving speed of the obstacle from the point cloud of the obstacle;
and sending control information to the vehicle according to the position and the moving speed of the obstacle so as to control the vehicle to avoid collision with the obstacle.
11. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon,
when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-5.
12. A computer-readable medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the method of any one of claims 1-5.
CN202210033318.4A 2019-10-09 2019-10-09 Method and device for controlling a vehicle Pending CN114291084A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210033318.4A CN114291084A (en) 2019-10-09 2019-10-09 Method and device for controlling a vehicle

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910953744.8A CN110696826B (en) 2019-10-09 2019-10-09 Method and device for controlling a vehicle
CN202210033318.4A CN114291084A (en) 2019-10-09 2019-10-09 Method and device for controlling a vehicle

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201910953744.8A Division CN110696826B (en) 2019-10-09 2019-10-09 Method and device for controlling a vehicle

Publications (1)

Publication Number Publication Date
CN114291084A true CN114291084A (en) 2022-04-08

Family

ID=69199263

Family Applications (3)

Application Number Title Priority Date Filing Date
CN202210031998.6A Pending CN114291082A (en) 2019-10-09 2019-10-09 Method and device for controlling a vehicle
CN202210033318.4A Pending CN114291084A (en) 2019-10-09 2019-10-09 Method and device for controlling a vehicle
CN201910953744.8A Active CN110696826B (en) 2019-10-09 2019-10-09 Method and device for controlling a vehicle

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202210031998.6A Pending CN114291082A (en) 2019-10-09 2019-10-09 Method and device for controlling a vehicle

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201910953744.8A Active CN110696826B (en) 2019-10-09 2019-10-09 Method and device for controlling a vehicle

Country Status (1)

Country Link
CN (3) CN114291082A (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113468922B (en) * 2020-03-31 2023-04-18 宇通客车股份有限公司 Road boundary identification method and device based on radar point cloud
CN111582173A (en) * 2020-05-08 2020-08-25 东软睿驰汽车技术(沈阳)有限公司 Automatic driving method and system
CN112801024B (en) * 2021-02-09 2023-08-29 广州小鹏自动驾驶科技有限公司 Detection information processing method and device
CN115808929B (en) * 2023-01-19 2023-04-14 禾多科技(北京)有限公司 Vehicle simulation obstacle avoidance method and device, electronic equipment and computer readable medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04193641A (en) * 1990-11-28 1992-07-13 Nissan Motor Co Ltd Obstacle detection device for vehicle
JPH08122432A (en) * 1994-10-20 1996-05-17 Honda Motor Co Ltd Mobile detector
JPH08124080A (en) * 1994-10-20 1996-05-17 Honda Motor Co Ltd Obstacle detector of vehicle
CN104334427A (en) * 2012-05-24 2015-02-04 罗伯特·博世有限公司 Method and device for avoiding or mitigating a collision of a vehicle with an obstacle
US20170185089A1 (en) * 2015-12-27 2017-06-29 Toyota Motor Engineering & Manufacturing North America, Inc. Detection of overhanging objects
US20170369051A1 (en) * 2016-06-28 2017-12-28 Toyota Motor Engineering & Manufacturing North America, Inc. Occluded obstacle classification for vehicles
CN109839922A (en) * 2017-11-28 2019-06-04 百度在线网络技术(北京)有限公司 For controlling the method and device of automatic driving vehicle

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4893118B2 (en) * 2006-06-13 2012-03-07 日産自動車株式会社 Avoidance control device, vehicle including the avoidance control device, and avoidance control method
DE102010025612A1 (en) * 2010-06-30 2011-03-10 Daimler Ag Method for warning driver of towing vehicle with trailer before collision with stationary obstacles on driving route with curve, involves sending signal and/or carrying out interference when wheel of axle of vehicle is approximated to range
WO2013098995A1 (en) * 2011-12-28 2013-07-04 トヨタ自動車株式会社 Obstruction determination device
DE102013221369A1 (en) * 2013-10-22 2015-04-23 Robert Bosch Gmbh Procedure for bypassing an obstacle
CN104850834A (en) * 2015-05-11 2015-08-19 Hefei Institutes of Physical Science, Chinese Academy of Sciences Road boundary detection method based on three-dimensional laser radar
KR101714273B1 (en) * 2015-12-11 2017-03-08 Hyundai Motor Company Method and apparatus for controlling path of autonomous driving system
CN105825173B (en) * 2016-03-11 2019-07-19 Fuzhou Huaying Heavy Industry Machinery Co., Ltd. General road and lane detection system and method
CN107226088B (en) * 2016-03-25 2022-03-08 Panasonic Intellectual Property Corporation of America Controller, driving control method, and program
CN106485233B (en) * 2016-10-21 2020-01-17 Shenzhen Horizon Robotics Technology Co., Ltd. Method and device for detecting travelable area and electronic equipment
CN106842231B (en) * 2016-11-08 2019-03-22 Chang'an University Road edge identification and tracking method
JP6547735B2 (en) * 2016-12-22 2019-07-24 Toyota Motor Corporation Collision avoidance support device
CN107169464B (en) * 2017-05-25 2019-04-09 Institute of Agricultural Resources and Regional Planning, Chinese Academy of Agricultural Sciences Road boundary detection method based on laser point cloud
CN107169468A (en) * 2017-05-31 2017-09-15 Beijing Jingdong Shangke Information Technology Co., Ltd. Method and device for controlling a vehicle
JP6972793B2 (en) * 2017-09-04 2021-11-24 Nissan Motor Co., Ltd. Driving control method and driving control device for driving support vehicles
US10788831B2 (en) * 2017-10-06 2020-09-29 Wipro Limited Method and device for identifying center of a path for navigation of autonomous vehicles
CN109017786B (en) * 2018-08-09 2020-09-22 Beijing Idriverplus Technology Co., Ltd. Vehicle obstacle avoidance method
CN109255181B (en) * 2018-09-07 2019-12-24 Baidu Online Network Technology (Beijing) Co., Ltd. Obstacle distribution simulation method and device based on multiple models and terminal
CN109583384A (en) * 2018-11-30 2019-04-05 Baidu Online Network Technology (Beijing) Co., Ltd. Obstacle avoidance method and device for autonomous vehicles
CN109740484A (en) * 2018-12-27 2019-05-10 Banma Network Technology Co., Ltd. Method, apparatus and system for road obstacle identification
CN110147748B (en) * 2019-05-10 2022-09-30 Anhui Polytechnic University Mobile robot obstacle identification method based on road edge detection

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04193641A (en) * 1990-11-28 1992-07-13 Nissan Motor Co Ltd Obstacle detection device for vehicle
JPH08122432A (en) * 1994-10-20 1996-05-17 Honda Motor Co Ltd Mobile detector
JPH08124080A (en) * 1994-10-20 1996-05-17 Honda Motor Co Ltd Obstacle detector of vehicle
CN104334427A (en) * 2012-05-24 2015-02-04 罗伯特·博世有限公司 Method and device for avoiding or mitigating a collision of a vehicle with an obstacle
US20170185089A1 (en) * 2015-12-27 2017-06-29 Toyota Motor Engineering & Manufacturing North America, Inc. Detection of overhanging objects
US20170369051A1 (en) * 2016-06-28 2017-12-28 Toyota Motor Engineering & Manufacturing North America, Inc. Occluded obstacle classification for vehicles
CN109839922A (en) * 2017-11-28 2019-06-04 Baidu Online Network Technology (Beijing) Co., Ltd. Method and device for controlling an autonomous vehicle

Also Published As

Publication number Publication date
CN114291082A (en) 2022-04-08
CN110696826B (en) 2022-04-01
CN110696826A (en) 2020-01-17

Similar Documents

Publication Publication Date Title
CN110696826B (en) Method and device for controlling a vehicle
US20210001841A1 (en) Obstacle Avoidance Method and Apparatus for Autonomous Driving Vehicle
CN110654381B (en) Method and device for controlling a vehicle
CN110687549B (en) Obstacle detection method and device
US10642268B2 (en) Method and apparatus for generating automatic driving strategy
CN110654380B (en) Method and device for controlling a vehicle
US10809723B2 (en) Method and apparatus for generating information
US10369993B2 (en) Method and device for monitoring a setpoint trajectory to be traveled by a vehicle for being collision free
CN111401255A (en) Method and device for identifying divergent intersection
CN114212108A (en) Automatic driving method, device, vehicle, storage medium and product
CN112558036B (en) Method and device for outputting information
CN112622923B (en) Method and device for controlling a vehicle
CN112526477B (en) Method and device for processing information
CN112528711A (en) Method and apparatus for processing information
CN114056337B (en) Method, device and computer program product for predicting vehicle running behavior
CN114771533A (en) Control method, device, equipment, vehicle and medium for automatic driving vehicle
CN112668371B (en) Method and device for outputting information
CN112885087A (en) Method, apparatus, device and medium for determining road condition information and program product
CN110362086B (en) Method and device for controlling an autonomous vehicle
CN115981344B (en) Automatic driving method and device
CN113401127B (en) Path updating method, path updating device, electronic equipment and computer readable medium
CN115848358B (en) Vehicle parking method, device, electronic equipment and computer readable medium
US20230024799A1 (en) Method, system and computer program product for the automated locating of a vehicle
CN109116357B (en) Method, device and server for synchronizing time
CN114333368A (en) Voice reminding method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination