CN117681868A - Vehicle obstacle avoidance collision detection method, device, computer equipment and storage medium - Google Patents

Vehicle obstacle avoidance collision detection method, device, computer equipment and storage medium

Info

Publication number
CN117681868A
CN117681868A
Authority
CN
China
Prior art keywords
vehicle
target
information
self
boundary frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311504946.7A
Other languages
Chinese (zh)
Inventor
彭帅
李小刚
邹欣
潘文博
刘叶叶
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Foss Hangzhou Intelligent Technology Co Ltd
Original Assignee
Foss Hangzhou Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Foss Hangzhou Intelligent Technology Co Ltd
Priority to CN202311504946.7A
Publication of CN117681868A


Abstract

The application relates to a vehicle obstacle avoidance collision detection method and device, computer equipment and a storage medium. The method comprises the following steps: first, target information is acquired and target relative information is determined based on the own-vehicle information, the own-vehicle coordinate system and the target information; then an own-vehicle bounding box and a target bounding box are determined based on the own-vehicle information and the target relative information respectively; finally, the relative positional relationship between the own-vehicle bounding box and the target bounding box is judged, and whether the own vehicle and the target collide is determined based on that relationship. In other words, during vehicle obstacle avoidance planning, whether the own vehicle and the target collide can be determined from their positional relationship, using own-vehicle information and target information taken at the same instant either before or during the planning. This safeguards the vehicle during the obstacle avoidance process and in turn improves the safety of autonomous driving.

Description

Vehicle obstacle avoidance collision detection method, device, computer equipment and storage medium
Technical Field
The present disclosure relates to the field of autonomous driving technologies, and in particular to a vehicle obstacle avoidance collision detection method and apparatus, a computer device, and a storage medium.
Background
Autonomous driving technology relies on the cooperation of artificial intelligence, visual computing, radar, monitoring devices and a global positioning system, so that a computer can automatically and safely control the vehicle without any active human operation. Trajectory planning is one of the key technologies of autonomous driving: it plans an effective driving path on the basis of vehicle-road coordination. Typically, a path from the start point to the end point is planned with a global path planning method according to high-precision map information. When a potential collision target appears ahead on the road, the vehicle can acquire, in real time through its perception system, specific information about targets such as vehicles and roadblocks appearing ahead, and can plan, with a local path planning method, the route it needs to travel to avoid colliding with the target.
However, since the vehicle and the dangerous collision target may be in motion at any time, key information such as their speed, acceleration and position also changes over time. The prior art generally performs only real-time collision detection and obstacle avoidance path planning, without considering that the dangerous collision target changes dynamically during the obstacle avoidance process and may still cause a collision, which greatly reduces the safety of obstacle avoidance.
Accordingly, there is a need in the art for a way to ensure and improve the obstacle avoidance safety of vehicles.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a vehicle obstacle avoidance collision detection method, apparatus, computer device, and computer-readable storage medium that are capable of ensuring and improving the obstacle avoidance safety of a vehicle.
In a first aspect, the present application provides a method for obstacle avoidance collision detection of a vehicle. The method comprises the following steps:
acquiring target information, and determining target relative information based on the own-vehicle information, the own-vehicle coordinate system and the target information;
determining an own-vehicle bounding box and a target bounding box based on the own-vehicle information and the target relative information respectively;
and judging the relative positional relationship between the own-vehicle bounding box and the target bounding box, and determining whether the own vehicle and the target collide based on the relative positional relationship.
Optionally, in one embodiment of the present application, the target information includes a target position and a target heading angle, and determining the target relative information based on the own-vehicle information, the own-vehicle coordinate system and the target information includes:
establishing a reference coordinate system based on the own-vehicle information and the own-vehicle coordinate system;
and determining the target relative position and the target relative heading angle based on the reference coordinate system and the target information.
Optionally, in an embodiment of the present application, determining the own-vehicle bounding box and the target bounding box based on the own-vehicle information and the target relative information respectively includes:
representing own-vehicle boundary points and target boundary points based on the reference coordinate system together with the own-vehicle information and the target relative information respectively;
and determining the own-vehicle bounding box and the target bounding box based on the own-vehicle boundary points and the target boundary points.
Optionally, in an embodiment of the present application, judging the relative positional relationship between the own-vehicle bounding box and the target bounding box, and determining whether the own vehicle and the target collide based on the relative positional relationship, includes:
judging, for each of the four sides of the target bounding box, whether the side intersects the own-vehicle bounding box, and if so, the own vehicle and the target collide;
if none of the sides intersects, judging whether a containment relationship exists between the own-vehicle bounding box and the target bounding box.
Optionally, in an embodiment of the present application, judging whether the four sides of the target bounding box intersect the own-vehicle bounding box includes:
if a side is perpendicular to the reference coordinate system, judging whether the abscissa and ordinates of the side fall within the abscissa range and ordinate range of the own-vehicle bounding box.
Optionally, in an embodiment of the present application, judging whether the four sides of the target bounding box intersect the own-vehicle bounding box further includes:
if a side is not perpendicular to the reference coordinate system, constructing a linear equation based on the side, judging whether the abscissa range of the side overlaps that of the own-vehicle bounding box, and if so, judging, based on the overlap and the linear equation, whether the corresponding ordinates fall within the ordinate range of the own-vehicle bounding box.
Optionally, in an embodiment of the present application, judging whether a containment relationship exists between the own-vehicle bounding box and the target bounding box includes:
determining the distances from the target bounding box to the coordinate axes, and if the distances are not greater than the target length or the target width, a containment relationship exists between the own-vehicle bounding box and the target bounding box.
In a second aspect, the present application further provides a vehicle obstacle avoidance collision detection device. The device comprises:
an information acquisition module, used for acquiring target information and determining target relative information based on the own-vehicle information, the own-vehicle coordinate system and the target information;
a bounding box determination module, used for determining an own-vehicle bounding box and a target bounding box based on the own-vehicle information and the target relative information respectively;
and a collision detection module, used for judging the relative positional relationship between the own-vehicle bounding box and the target bounding box and determining whether the own vehicle and the target collide based on the relative positional relationship.
In a third aspect, the present application also provides a computer device. The computer device comprises a memory storing a computer program and a processor which, when executing the computer program, implements the steps of the methods described in the above embodiments.
In a fourth aspect, the present application also provides a computer-readable storage medium. The computer readable storage medium has stored thereon a computer program which, when executed by a processor, implements the steps of the method described in the above embodiments.
According to the vehicle obstacle avoidance collision detection method and device, the computer equipment and the storage medium, target information is first acquired and target relative information is determined based on the own-vehicle information and the target information, the target information comprising a target position and a target heading angle; an own-vehicle bounding box and a target bounding box are then determined based on the own-vehicle information and the target relative information respectively; finally, the relative positional relationship between the own-vehicle bounding box and the target bounding box is judged, and whether the own vehicle and the target collide is determined based on that relationship. In other words, during vehicle obstacle avoidance planning, whether the own vehicle and the target collide can be determined from their positional relationship, using own-vehicle information and target information taken at the same instant either before or during the planning, thereby safeguarding the obstacle avoidance process and further improving the safety of autonomous driving.
Drawings
FIG. 1 is a diagram of an application environment of a vehicle obstacle avoidance collision detection method in one embodiment;
FIG. 2 is a flow chart of a vehicle obstacle avoidance collision detection method in one embodiment;
FIG. 3 is a schematic diagram of determining an own-vehicle bounding box and a target bounding box in one embodiment;
FIG. 4 is a flow chart of determining whether the four sides of a target bounding box intersect an own-vehicle bounding box in one embodiment;
FIG. 5 is a flow chart of determining whether the four sides of a target bounding box intersect an own-vehicle bounding box in another embodiment;
FIG. 6 is a flow chart of determining whether a containment relationship exists between an own-vehicle bounding box and a target bounding box in one embodiment;
FIG. 7 is a schematic diagram of an own-vehicle bounding box overlapping a target bounding box in one embodiment;
FIG. 8 is a schematic diagram of a containment relationship between an own-vehicle bounding box and a target bounding box in one embodiment;
FIG. 9 is a block diagram of a vehicle obstacle avoidance collision detection device in one embodiment;
FIG. 10 is an internal structure diagram of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
The term "system" as used herein refers to mechanical and electrical hardware, software, firmware, electronic control components, processing logic, and/or processor devices, which may provide the described functionality alone or in combination. May include, but is not limited to, an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) that executes one or more software or firmware programs, a memory containing software or firmware instructions, a combinational logic circuit, and/or other components.
The vehicle obstacle avoidance collision detection method provided by the embodiments of the application can be applied to the application environment shown in fig. 1. Fig. 1 shows a side view of a vehicle 10 disposed on, and able to travel across, a travel surface 70 (e.g., a paved road surface). The vehicle 10 includes an on-board navigation system 24, a computer-readable storage medium (memory) 23 holding a digitized road map 25, a spatial monitoring system 100, a vehicle controller 50, a Global Positioning System (GPS) sensor 52, a human/machine interface (HMI) device 60, and, in one embodiment, an autonomous controller 65 and a telematics controller 75. For the purposes of this application, the vehicle 10 includes, but is not limited to, a commercial vehicle, an industrial vehicle, an agricultural vehicle, a passenger vehicle, an aircraft, a watercraft, a train, an all-terrain vehicle, a personal mobility device, a robot, and similar mobile platforms.
In one embodiment, the spatial monitoring system 100 includes: one or more spatial sensors and systems configured to monitor a viewable area 32 in front of the vehicle 10; and a spatial monitoring controller 110. The spatial sensors configured to monitor the viewable area 32 in front of the vehicle 10 include, for example, a lidar sensor 34, a radar sensor 36, a digital camera 38, and the like. Each spatial sensor is mounted on the vehicle to monitor all or a portion of the viewable area 32 and to detect nearby remote objects, such as road features, lane markings, buildings, pedestrians, road signs, traffic control lights and signs, other vehicles, and geographic features proximal to the vehicle 10. The spatial monitoring controller 110 generates a digital representation of the viewable area 32 based on data input from the spatial sensors. The spatial monitoring controller 110 may evaluate the inputs from the spatial sensors to determine the linear range, relative speed, and trajectory of each nearby remote object relative to the vehicle 10. The spatial sensors may be disposed at various locations on the vehicle 10, including the front corners, rear sides, and mid sides. In one embodiment, the spatial sensors may include, but are not limited to, a front radar sensor and a camera. The spatial sensors are arranged so that the spatial monitoring controller 110 can monitor traffic flow, including approaching vehicles, intersections, lane markings, and other objects surrounding the vehicle 10. A lane marker detection processor (not shown) may estimate the road based on data generated by the spatial monitoring controller 110. The spatial sensors of the vehicle spatial monitoring system 100 may include object location sensing devices comprising range sensors, such as FM-CW (frequency modulated continuous wave) radar, pulse and FSK (frequency shift keying) radar, and lidar (light detection and ranging) devices, as well as ultrasonic devices, that rely on effects such as Doppler-effect measurements to locate forward objects. The object location devices may include Charge Coupled Device (CCD) or Complementary Metal Oxide Semiconductor (CMOS) video image sensors, as well as other camera/video image processors that utilize digital photography methods to 'view' forward objects, including one or more vehicles.
The lidar sensor 34 measures the range, or distance, to an object based on pulsed and reflected laser beams. The radar sensor 36 determines the range, angle and/or speed of an object based on radio waves. The camera 38 includes an image sensor, a lens, and a camera controller. The image sensor is an electro-optical device that converts an optical image into an electronic signal using a multi-dimensional array of photosensitive sensing elements. The camera controller is operatively connected to the image sensor to monitor the viewable area 32. The camera controller is arranged to control the image sensor to capture an image of the field of view (FOV) associated with the viewable area 32 projected onto the image sensor via the lens. The optical lens may be a pinhole lens, a fisheye lens, a stereoscopic lens, a telescopic lens, and the like. The camera 38 periodically captures image files associated with the viewable area 32 via the image sensor at a desired rate (e.g., 30 image files per second). Each image file is a 2D or 3D pixelated representation of all or a portion of the viewable area 32, captured at the original resolution of the camera 38. In one embodiment, the image file is a 24-bit image including RGB (red-green-blue) visible-light spectral values and depth values representing the viewable area 32. Other embodiments of the image file may include a 2D or 3D image at some resolution level depicting a black-and-white or gray-scale visible-light spectrum of the viewable area 32, an infrared spectrum of the viewable area 32, or other images, which this application does not specifically limit. In one embodiment, images from multiple image files may be evaluated for parameters related to brightness and/or luminance. Alternatively, images may be evaluated based on RGB color components, brightness, texture, contours, or combinations thereof. The image sensor communicates with an encoder that performs digital signal processing (DSP) on each image file. The image sensor of the camera 38 may be configured to capture images at a nominal standard-definition resolution (e.g., 640x480 pixels), at a nominal high-definition resolution (e.g., 1440x1024 pixels), or at another suitable resolution. The image sensor of the camera 38 may capture still images or, alternatively, digital video images at a predetermined image capture rate. In one embodiment, the image files are sent to the camera controller as encoded data files that are stored in a non-transitory digital data storage medium for on-board or off-board analysis.
The camera 38 is disposed and positioned on the vehicle 10 in a position capable of capturing images of the viewable area 32, wherein the viewable area 32 includes at least a portion of the travel surface 70 forward of the vehicle 10, including the trajectory of the vehicle 10. The viewable area 32 may also include the surrounding environment, for example vehicle traffic, roadside objects, pedestrians and other features, the sky, the horizon, the travel lanes, and oncoming vehicles in front of the vehicle 10. Other cameras (not shown) may also be included, for example a second camera disposed on a rear or side portion of the vehicle 10 for monitoring the area behind the vehicle 10 and the area to its right or left.
The autonomous controller 65 is used to implement autonomous driving or advanced driver assistance system (ADAS) vehicle functionality. Such functionality may include a vehicle onboard control system capable of providing a level of driving automation. The terms 'driver' and 'operator' describe the person responsible for directing the operation of the vehicle 10, who may be involved in controlling one or more vehicle functions or in directing an autonomous vehicle. Driving automation can include dynamic driving and vehicle operation. Driving automation may include some level of automatic control or intervention involving a single vehicle function (e.g., steering, acceleration, and/or braking), with the driver continuously in overall control of the vehicle 10. Driving automation may include some level of automatic control or intervention involving simultaneous control of multiple vehicle functions (e.g., steering, acceleration, and/or braking), again with the driver continuously in overall control of the vehicle 10. Driving automation may include simultaneous automatic control of the vehicle driving functions (including steering, acceleration, and braking), in which the driver relinquishes control of the vehicle for a period of time during a journey. Driving automation may also include simultaneous automatic control of the vehicle driving functions (including steering, acceleration, and braking), in which the driver relinquishes control of the vehicle 10 for the entire journey. Driving automation comprises hardware and controllers arranged to monitor the spatial environment in various driving modes for performing various driving tasks during dynamic vehicle operation. Driving automation includes, but is not limited to, cruise control, adaptive cruise control, lane-change warning, intervention and control, automatic stopping, acceleration, braking, and the like. Autonomous vehicle functions include, but are not limited to, adaptive cruise control (ACC) operations, lane guidance and lane keeping operations, lane changing operations, steering assist operations, object avoidance operations, parking assist operations, vehicle braking operations, vehicle speed and acceleration operations, and vehicle lateral movement operations such as lane guidance, lane keeping and lane changing operations. On this basis, a brake command may be generated by the autonomous controller 65 in response to an autonomous control function, independently of any action by the vehicle operator.
Operator controls may be included in the passenger compartment of the vehicle 10 and may include, but are not limited to, a steering wheel, an accelerator pedal, a brake pedal, and operator input devices that are elements of the HMI device 60. The operator controls enable a vehicle operator to interact with the running vehicle 10 and direct its operation to provide passenger transport. In some embodiments of the vehicle 10, these operator controls may be omitted, including the steering wheel, accelerator pedal, brake pedal, gear-range selector, and other similar control devices.
The HMI device 60 provides human-machine interaction for directing the infotainment system, the Global Positioning System (GPS) sensor 52, the navigation system 24, and similar operational functions, and may include a controller. The HMI device 60 monitors operator requests and provides information to the operator, including status, service, and maintenance information of vehicle systems. The HMI device 60 may communicate with and/or control the operation of a plurality of operator interface devices capable of conveying messages associated with the operation of the automatic vehicle control systems. The HMI device 60 may also communicate with one or more devices that monitor biometric data associated with the vehicle operator, including, for example, eye gaze location, posture, and head position tracking. For simplicity of description, the HMI device 60 is depicted as a single device, but in embodiments of the system of the present application it may be provided as multiple controllers and associated sensing devices. The operator interface devices may include devices capable of transmitting a message prompting an operator action, for example an electronic visual display module such as a liquid crystal display (LCD) device, a head-up display (HUD), an audio feedback device, a wearable device, or a haptic seat. The operator interface devices capable of prompting an operator action may be controlled by or through the HMI device 60. The HUD may project, into the operator's field of view, information reflected onto the interior side of the vehicle's windshield, including conveying a confidence level associated with the operation of one of the automatic vehicle control systems. The HUD may also provide augmented-reality information, such as lane position, vehicle path, direction and/or navigation information, and the like.
The on-board navigation system 24 provides navigation support and information to the vehicle operator based on the digitized road map 25. The autonomous controller 65 controls autonomous vehicle operation or ADAS vehicle functions based on the digitized road map 25.
The vehicle 10 may include a telematics controller 75, which includes a wireless telematics communication system capable of off-vehicle communication, including communication with a communication network 90 having wireless and wired communication capabilities. The telematics controller 75 is capable of off-vehicle communications, including short-range vehicle-to-vehicle (V2V) communication and/or vehicle-to-everything (V2x) communication, which may include communication with infrastructure monitors (e.g., traffic cameras). Alternatively or additionally, the telematics controller 75 has a wireless telematics communication system capable of short-range wireless communication with a handheld device (e.g., a cellular telephone, a satellite telephone, or another telephone device). In one embodiment, the handheld device includes a software application with a wireless protocol for communicating with the telematics controller 75, and the handheld device can perform off-vehicle communications, including communication with the off-board server 95 via the communication network 90. Alternatively or additionally, the telematics controller 75 performs off-vehicle communications directly, communicating with the off-board server 95 via the communication network 90.
The term "controller" and related terms (e.g., microcontroller, control unit, processor, and the like) refer to one or various combinations of the following: application specific integrated circuit(s) (ASIC), field Programmable Gate Array (FPGA), electronic circuit(s), central processing unit(s), e.g., microprocessor(s) and associated non-transitory memory component(s) (indicated by memory 23) in the form of memory and storage (read-only, programmable read-only, random access, hard drive, etc.). The non-transitory memory component is capable of storing machine-readable instructions in the form of: one or more software or firmware programs or routines, combinational logic circuit(s), input/output circuit(s) and devices, signal conditioning and buffering circuitry, and other components accessible by the one or more processors to implement the corresponding functionality. The input/output circuit(s) and devices include analog/digital converters and related devices that monitor inputs from the sensors, which can be monitored at a preset sampling frequency or in response to a trigger event. Software, firmware, programs, instructions, control routines, code, algorithms, and similar terms refer to a controller-executable instruction set, including calibration and lookup tables. Each controller executes control routine(s) for providing the respective function. The routine may be performed at regular intervals, for example, every 100 microseconds during ongoing operation. Alternatively, the routine may be executed in response to a triggering event. Communication between the controllers, actuators, and/or sensors may be implemented using direct wired point-to-point links, networked communication bus links, wireless links, or other suitable communication links. The communication includes corresponding exchanged data signals, including, for example, conductive medium-based electrical signals, air-based electromagnetic signals, optical waveguide-based optical signals, and the like. The data signals may include discrete, analog or digitized analog signals representing inputs from the sensors, actuator commands, and communications between the controllers. The term "signal" refers to a physically identifiable indicator of conveyed information and may be a corresponding waveform (e.g., electrical, optical, magnetic, mechanical, or electromagnetic), such as, for example, DC, AC, sine wave, triangular wave, square wave, vibration, and the like, capable of propagating through a medium. A parameter is defined as a measurable quantity that represents a physical property of a device or other element that can be identified using one or more sensors and/or physical models. The parameter may have a discrete value, e.g., "1" or "0", or be infinitely variable in value.
In one embodiment, as shown in fig. 2, there is provided a vehicle obstacle avoidance collision detection method, including the steps of:
s201: target information is acquired, and target relative information is determined based on the vehicle information, the vehicle coordinate system and the target information.
In the embodiment of the application, target information is first acquired; the target information includes a target position and a target heading angle, and target relative information is determined based on the own-vehicle information, the own-vehicle coordinate system and the target information. Specifically, the own-vehicle information includes the position, speed, acceleration, heading angle and other information of the own vehicle at the current detection instant. The detection instant may be before the vehicle obstacle avoidance planning, or may be an instant within the route obtained by the obstacle avoidance planning; likewise, the target information should be the position information at the corresponding instant. When the own-vehicle information is information from before the obstacle avoidance planning, the corresponding target information is the target position, target heading angle and other information at the same instant before planning; when the own-vehicle information is information at some instant within the planned obstacle avoidance route, the corresponding target information is the target position, target heading angle and other information at that same instant, obtained from the target's predicted future trajectory in the obstacle avoidance planning. The target information is then corrected based on the own-vehicle information, i.e. the current position information of the own vehicle, to obtain the corrected target relative information, namely the target position information expressed in the same coordinate system at the current instant.
S203: an own-vehicle bounding box and a target bounding box are determined based on the own-vehicle information and the target relative information respectively.
In the embodiment of the application, after the target information is acquired and the target relative information is determined from the own-vehicle information and the target information, an own-vehicle bounding box and a target bounding box are determined based on the own-vehicle information and the target relative information respectively. Specifically, the own-vehicle bounding box is determined from the own-vehicle position information, wheelbase information and safety distances, and the target bounding box is determined from the target's center position point and its length and width.
S205: the relative positional relationship between the own-vehicle bounding box and the target bounding box is judged, and whether the own vehicle and the target collide is determined based on the relative positional relationship.
In the embodiment of the application, after the own-vehicle bounding box and the target bounding box are determined, the relative positional relationship between them is judged, and whether the own vehicle and the target collide is determined based on that relationship. The possible relationships are: completely disjoint, overlapping, or containing. When the own-vehicle bounding box and the target bounding box overlap, i.e. partially intersect, the own vehicle and the target collide at the detection instant; when the own-vehicle bounding box contains the target bounding box, or the target bounding box contains the own-vehicle bounding box, the own vehicle and the target likewise collide at the detection instant.
In the vehicle obstacle avoidance collision detection method described above, target information is first acquired and target relative information is determined based on the own-vehicle information and the target information, the target information comprising a target position and a target heading angle; an own-vehicle bounding box and a target bounding box are then determined based on the own-vehicle information and the target relative information respectively; finally, the relative positional relationship between the two bounding boxes is judged and whether the own vehicle and the target collide is determined based on that relationship. In other words, during vehicle obstacle avoidance planning, whether the own vehicle and the target collide can be determined from their positional relationship, using own-vehicle information and target information taken at the same instant either before or during the planning, thereby safeguarding the obstacle avoidance process and further improving the safety of autonomous driving.
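For orientation, the disjoint/overlap/containment decision of S205 can be prototyped with an off-the-shelf geometry library. The sketch below uses Shapely with hypothetical corner coordinates; it is illustrative only and is not the patent's method, which works the overlap and containment tests out by hand in the embodiments that follow.

```python
# Prototype of the S205 decision using Shapely (corner values are hypothetical).
from shapely.geometry import Polygon

ego_box = Polygon([(-1.5, -1.0), (3.5, -1.0), (3.5, 1.0), (-1.5, 1.0)])
target_box = Polygon([(2.0, 0.5), (6.0, 0.5), (6.0, 2.3), (2.0, 2.3)])

overlaps = ego_box.overlaps(target_box)   # partial intersection of the two boxes
contains = ego_box.contains(target_box) or target_box.contains(ego_box)
print(overlaps or contains)               # True means a collision is detected
```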
In one embodiment of the present application, the target information includes a target position and a target heading angle, and determining the target relative information based on the own-vehicle information, the own-vehicle coordinate system and the target information includes:
S301: a reference coordinate system is established based on the own-vehicle information and the own-vehicle coordinate system.
S303: the target relative position and the target relative heading angle are determined based on the reference coordinate system and the target information.
In one embodiment of the present application, the own-vehicle information and the target information are information from the vehicle's obstacle avoidance planning process: the own-vehicle information is the information at a certain sampling instant along the planned obstacle avoidance route, and the target information for the same sampling instant is taken from the target's predicted future trajectory and is given relative to the own-vehicle coordinate system at the planning instant. Since the own-vehicle coordinate system moves with the vehicle, the own-vehicle coordinate system at a sampling instant and that at the planning instant are not the same coordinate system. A new reference coordinate system, namely the own-vehicle coordinate system at the sampling instant, is therefore established from the own-vehicle information and the own-vehicle coordinate system at the planning instant, with its origin still at the center of the own vehicle's rear axle. At the current detection instant, i.e. the current sampling instant, the own-vehicle coordinates comprise position information and heading angle information in the form (PxEgot, PyEgot, HEgot); the target information obtained for the same instant likewise comprises position information and heading angle information in the form (PxObjt, PyObjt, HObjt). The target relative position and target relative heading angle are then determined based on the reference coordinate system and the target information; that is, the information given in the own-vehicle coordinate system of the planning instant is converted into the information (PxObjNew, PyObjNew, HObjNew) in the reference coordinate system anchored at the current detection instant, i.e. the current sampling instant. Specifically, this can be calculated with the following formulas.
PxObjNew=PxObjt*cos(HEgot)+PyObjt*sin(HEgot)-(PxEgot*cos(HEgot)+PyEgot*sin(HEgot))
PyObjNew=PxEgot*sin(HEgot)-PyEgot*cos(HEgot)-(PxObjt*sin(HEgot)-PyObjt*cos(HEgot))
HObjNew=HObjt-HEgot, HObjNew∈[-π,π]
Wherein (PxEgot, PyEgot, HEgot) is the own-vehicle coordinate information at the sampling instant, expressed in the own-vehicle coordinate system of the planning instant, comprising the own-vehicle x coordinate PxEgot, the own-vehicle y coordinate PyEgot and the own-vehicle heading angle HEgot; and (PxObjt, PyObjt, HObjt) is the target coordinate information in the own-vehicle coordinate system of the planning instant, comprising the target x coordinate PxObjt, the target y coordinate PyObjt and the target heading angle HObjt.
In this embodiment, by establishing the reference coordinate system based on the own vehicle, and determining the target relative position and the target relative heading angle from the reference coordinate system, the own-vehicle information and the target information, the target coordinates can be corrected, making the collision detection more accurate.
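As a concrete illustration, the transform above can be coded directly. The following sketch (function and variable names are ours, not the patent's) algebraically simplifies the three formulas into a standard 2D change of frame and wraps the relative heading into [-π, π]:

```python
import math

def to_ego_frame(px_ego_t, py_ego_t, h_ego_t, px_obj_t, py_obj_t, h_obj_t):
    """Re-express the target pose, given in the planning-instant ego frame,
    in the reference frame anchored at the sampling-instant ego pose."""
    c, s = math.cos(h_ego_t), math.sin(h_ego_t)
    dx, dy = px_obj_t - px_ego_t, py_obj_t - py_ego_t
    px_obj_new = dx * c + dy * s      # PxObjNew from the first formula
    py_obj_new = -dx * s + dy * c     # PyObjNew from the second formula
    # HObjNew = HObjt - HEgot, wrapped into [-pi, pi].
    h_obj_new = math.atan2(math.sin(h_obj_t - h_ego_t), math.cos(h_obj_t - h_ego_t))
    return px_obj_new, py_obj_new, h_obj_new
```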
In one embodiment of the present application, determining the own-vehicle bounding box and the target bounding box based on the own-vehicle information and the target relative information respectively includes:
S401: own-vehicle boundary points and target boundary points are represented based on the reference coordinate system together with the own-vehicle information and the target relative information respectively.
S403: the own-vehicle bounding box and the target bounding box are determined based on the own-vehicle boundary points and the target boundary points.
In one embodiment of the present application, as shown in fig. 3, the own-vehicle boundary points (RangeX1, RangeX2, RangeY1, RangeY2) are determined based on the own-vehicle information (PxEgot, PyEgot, HEgot), the vehicle width, the wheelbase, the safety distances and the like, where WEgo is the width of the own vehicle, EgoLr is the distance from the rear axle to the rear edge of the vehicle, EgoLf is the distance from the rear axle to the front edge of the vehicle, SafeMarginLgt is the longitudinal safety distance and SafeMarginLat is the lateral safety distance; the own-vehicle bounding box EgoBoundingBox is then determined from these boundary points. For the target, the coordinates of the target boundary points, i.e. the four vertices, are determined based on the target relative information, the target center position point and the target length and width; the specific calculation formulas are shown below.
Px2=Px1-Lobj*cos(HObjNew)
Py2=Py1-Lobj*sin(HObjNew)
Px3=Px4+Lobj*cos(HObjNew)
Py3=Py4+Lobj*sin(HObjNew)
Wherein Wobj is the width of the target vehicle and Lobj is the target length; (PxObjNew, PyObjNew, HObjNew) is the target relative information, comprising the target relative x coordinate PxObjNew, the target relative y coordinate PyObjNew and the target relative heading angle HObjNew.
In the present embodiment, the own-vehicle bounding box and the target bounding box can be represented intuitively by expressing the own-vehicle boundary points and the target boundary points based on the reference coordinate system together with the own-vehicle information and the target relative information respectively, and then determining the two bounding boxes from those boundary points.
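The boundary-point construction can be sketched as follows. The patent gives only the formulas for P2 and P3; deriving all four corners from the target's center pose and half-dimensions is one standard way to complete the picture, and the exact corner numbering here is our assumption. The ego-box expressions likewise follow the fig. 3 quantities under the assumption that the reference-frame origin sits at the rear-axle center with the heading along +x:

```python
import math

def ego_bounding_box(ego_lf, ego_lr, w_ego, safe_margin_lgt, safe_margin_lat):
    """Axis-aligned ego box (RangeX1, RangeX2, RangeY1, RangeY2) in the
    reference frame, inflated by the longitudinal/lateral safety margins."""
    range_x1 = -(ego_lr + safe_margin_lgt)       # rear edge
    range_x2 = ego_lf + safe_margin_lgt          # front edge
    range_y1 = -(w_ego / 2.0 + safe_margin_lat)  # right edge
    range_y2 = w_ego / 2.0 + safe_margin_lat     # left edge
    return range_x1, range_x2, range_y1, range_y2

def target_corners(px_new, py_new, h_new, l_obj, w_obj):
    """Corners P1..P4 of the target box from its center pose and size;
    adjacent corners are length (P1-P2, P3-P4) or width (P2-P3, P4-P1) apart."""
    c, s = math.cos(h_new), math.sin(h_new)
    hl, hw = l_obj / 2.0, w_obj / 2.0
    local = [(hl, hw), (-hl, hw), (-hl, -hw), (hl, -hw)]  # box-frame corners
    return [(px_new + x * c - y * s, py_new + x * s + y * c) for x, y in local]
```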
In one embodiment of the present application, judging the relative positional relationship between the own-vehicle bounding box and the target bounding box and determining whether the own vehicle and the target collide based on the relative positional relationship includes:
S501: for each of the four sides of the target bounding box, it is judged whether the side intersects the own-vehicle bounding box; if so, the own vehicle and the target collide.
S503: if none of the sides intersects, it is judged whether a containment relationship exists between the own-vehicle bounding box and the target bounding box.
In one embodiment of the present application, the relative positional relationship between the own-vehicle bounding box and the target bounding box is completely disjoint, overlapping or containing. Specifically, whether the two boxes overlap is determined mainly by judging whether the four sides of the target bounding box intersect the own-vehicle bounding box. If any side intersects, the two boxes overlap and the own vehicle and the target collide. If no side intersects, it must further be judged whether a containment relationship exists, i.e. whether the own-vehicle bounding box lies within the target bounding box, or the target bounding box lies within the own-vehicle bounding box; if a containment relationship exists, the own vehicle and the target also collide. Only when the own-vehicle bounding box and the target bounding box are completely disjoint, i.e. they neither overlap nor contain one another, do the own vehicle and the target not collide.
In this embodiment, by judging for each of the four sides of the target bounding box whether it intersects the own-vehicle bounding box (a collision occurring if so), and otherwise judging whether a containment relationship exists between the two bounding boxes, whether the own vehicle and the target collide can be determined accurately in each case.
In one embodiment of the present application, judging whether the four sides of the target bounding box intersect the own-vehicle bounding box includes:
if a side is perpendicular to the reference coordinate system, judging whether the abscissa and ordinates of the side fall within the abscissa range and ordinate range of the own-vehicle bounding box.
In one embodiment of the present application, when judging whether the four sides of the target bounding box intersect the own-vehicle bounding box, take one side as an example, as shown in fig. 4, with the P1 point coordinates (ObjBodyPx(1), ObjBodyPy(1)) and the P2 point coordinates (ObjBodyPx(2), ObjBodyPy(2)). The two points P1 and P2 are first sorted by x coordinate so that, after sorting, P1x < P2x; it is then judged whether the side P1-P2 intersects the own-vehicle bounding box. If the side P1-P2 is perpendicular to the reference coordinate system, i.e. ObjBodyPx(1) = ObjBodyPx(2), its line equation is x = b (a constant); it is then judged whether the ordinates of the side overlap the ordinate range of the own-vehicle bounding box, and if there is no overlap, the side P1-P2 does not intersect the own-vehicle bounding box. There are two no-intersection cases, YSmallerThanMin and YBiggerThanMax; in every other case the side P1-P2 overlaps the own-vehicle bounding box. No intersection means that the y values of both points P1 and P2 are smaller than the minimum y of the own-vehicle bounding box, or that the y values of both points are larger than the maximum y of the own-vehicle bounding box. The specific ranges are expressed by the following formulas.
YSmallerThanMin=ObjP1Y<RangeY1&&ObjP2Y<RangeY1
YBiggerThanMax=ObjP1Y>RangeY2&&ObjP2Y>RangeY2
Where RangeY1 is the minimum y value of the own-vehicle bounding box, RangeY2 is the maximum y value of the own-vehicle bounding box, and ObjP1Y and ObjP2Y are the y values of the two points P1 and P2.
In this embodiment, if a side is perpendicular to the reference coordinate system, whether the target bounding box and the own-vehicle bounding box overlap can be determined accurately by judging whether the side's abscissa and ordinates fall within the abscissa range and ordinate range of the own-vehicle bounding box.
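A sketch of this vertical-edge test (names are illustrative; the predicate mirrors the YSmallerThanMin/YBiggerThanMax formulas above):

```python
def vertical_edge_hits_box(edge_x, obj_p1y, obj_p2y,
                           range_x1, range_x2, range_y1, range_y2):
    """Side P1-P2 with ObjBodyPx(1) == ObjBodyPx(2): it can only intersect
    the ego box if its x lies in [RangeX1, RangeX2] and its y values are
    not entirely below RangeY1 or entirely above RangeY2."""
    if not (range_x1 <= edge_x <= range_x2):
        return False
    y_smaller_than_min = obj_p1y < range_y1 and obj_p2y < range_y1
    y_bigger_than_max = obj_p1y > range_y2 and obj_p2y > range_y2
    return not (y_smaller_than_min or y_bigger_than_max)
```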
In one embodiment of the present application, judging whether the four sides of the target bounding box intersect the own-vehicle bounding box further includes:
if a side is not perpendicular to the reference coordinate system, constructing a linear equation based on the side, judging whether the abscissa range of the side overlaps that of the own-vehicle bounding box, and if so, judging, based on the overlap interval and the linear equation, whether the corresponding ordinates fall within the ordinate range of the own-vehicle bounding box.
In one embodiment of the present application, when judging whether the four sides of the target bounding box intersect the own-vehicle bounding box, as before, take one side as an example, with the P1 point coordinates (ObjBodyPx(1), ObjBodyPy(1)) and the P2 point coordinates (ObjBodyPx(2), ObjBodyPy(2)); the two points P1 and P2 are first sorted by x coordinate so that, after sorting, P1x < P2x, and it is then judged whether the side P1-P2 intersects the own-vehicle bounding box. As shown in fig. 5, if the side P1-P2 is not perpendicular to the reference coordinate system, i.e. ObjBodyPx(1) ≠ ObjBodyPx(2), the linear equation y = kx + b is constructed. It is first judged whether the x ranges overlap; if they do, the ends of the overlap interval are found as OnEdgeX1 = max(ObjP1X, RangeX1) and OnEdgeX2 = min(ObjP2X, RangeX2). The corresponding y coordinates are then determined from the linear equation, i.e. OnEdgeY1 = k*OnEdgeX1+b and OnEdgeY2 = k*OnEdgeX2+b. As before, there are two no-intersection cases, YSmallerThanMin and YBiggerThanMax, and in every other case the side P1-P2 overlaps the own-vehicle bounding box. No intersection means that the y values at the two points corresponding to OnEdgeX1 and OnEdgeX2 are both smaller than the minimum y of the own-vehicle bounding box, or both larger than the maximum y of the own-vehicle bounding box. The specific ranges are expressed by the following formulas.
YSmallerThanMin=OnEdgeY1<RangeY1&&OnEdgeY2<RangeY1
YBiggerThanMax=OnEdgeY1>RangeY2&&OnEdgeY2>RangeY2
Wherein RangeY1 is the minimum y value of the own-vehicle bounding box, RangeY2 is the maximum y value of the own-vehicle bounding box, and OnEdgeY1 and OnEdgeY2 are the y values at the two points corresponding to the x coordinates OnEdgeX1 and OnEdgeX2.
In this embodiment, if a side is not perpendicular to the reference coordinate system, a linear equation is constructed from the side; whether the abscissa range of the side overlaps that of the own-vehicle bounding box is judged, and if so, whether the corresponding ordinates fall within the ordinate range of the own-vehicle bounding box is judged from the overlap interval and the linear equation, so that whether the target bounding box and the own-vehicle bounding box overlap can be determined accurately.
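The non-vertical case can be sketched the same way (again with illustrative names); the overlap interval [OnEdgeX1, OnEdgeX2] and the line y = kx + b follow the derivation above:

```python
def slanted_edge_hits_box(p1, p2, range_x1, range_x2, range_y1, range_y2):
    """Side P1-P2 with distinct x values; p1 and p2 are (x, y) tuples."""
    (x1, y1), (x2, y2) = sorted([p1, p2])   # ensure P1x < P2x
    on_edge_x1 = max(x1, range_x1)          # left end of the x-overlap
    on_edge_x2 = min(x2, range_x2)          # right end of the x-overlap
    if on_edge_x1 > on_edge_x2:
        return False                        # x ranges do not overlap
    k = (y2 - y1) / (x2 - x1)               # line y = k*x + b through P1-P2
    b = y1 - k * x1
    on_edge_y1 = k * on_edge_x1 + b
    on_edge_y2 = k * on_edge_x2 + b
    y_smaller_than_min = on_edge_y1 < range_y1 and on_edge_y2 < range_y1
    y_bigger_than_max = on_edge_y1 > range_y2 and on_edge_y2 > range_y2
    return not (y_smaller_than_min or y_bigger_than_max)
```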
In one embodiment of the present application, judging whether a containment relationship exists between the own-vehicle bounding box and the target bounding box includes:
determining the distances from the target bounding box to the coordinate axes, and if the distances are not greater than the target length or the target width, a containment relationship exists between the own-vehicle bounding box and the target bounding box.
In one embodiment of the present application, when none of the four sides of the target bounding box intersects the own-vehicle bounding box, it is judged whether a containment relationship exists between the two boxes. Specifically, as shown in fig. 6, the distances from the target bounding box to the coordinate axes are first determined: the distance DistP0ToEdge1 from the side P1-P2 to the EgoBoundingBox, the distance DistP0ToEdge2 from the side P2-P3, the distance DistP0ToEdge3 from the side P3-P4, and the distance DistP0ToEdge4 from the side P4-P1 are calculated. It is then judged whether each distance is greater than the target length or the target width; if not, a containment relationship exists between the own-vehicle bounding box and the target bounding box. Specifically, a containment relationship exists if the following formula is satisfied.
DistP0ToEdge1<=Lobj&&DistP0ToEdge3<=Lobj&&DistP0ToEdge2<=Wobj&&DistP0ToEdge4<=Wobj
Wherein Lobj is the target length, Wobj is the target width, DistP0ToEdge1 is the distance from the side P1-P2 to the EgoBoundingBox, DistP0ToEdge2 is the distance from the side P2-P3 to the EgoBoundingBox, DistP0ToEdge3 is the distance from the side P3-P4 to the EgoBoundingBox, and DistP0ToEdge4 is the distance from the side P4-P1 to the EgoBoundingBox.
In this embodiment, by determining the distances from the target bounding box to the coordinate axes and checking them against the target length and target width as in the formula above, whether a containment relationship exists between the own-vehicle bounding box and the target bounding box, and hence whether a collision occurs, can be determined accurately.
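A sketch of this containment test. The patent does not define P0 explicitly; taking it as the reference-frame origin (the rear-axle center of the own vehicle) is our assumption, as is the edge numbering, and the comparison constants are kept exactly as in the formula above:

```python
import math

def dist_point_to_line(px, py, a, b):
    """Perpendicular distance from point (px, py) to the line through a and b."""
    (ax, ay), (bx, by) = a, b
    return abs((bx - ax) * (ay - py) - (ax - px) * (by - ay)) / math.hypot(bx - ax, by - ay)

def containment_exists(corners, l_obj, w_obj, p0=(0.0, 0.0)):
    """corners are P1..P4 of the target box; P0 is assumed to be the origin."""
    p1, p2, p3, p4 = corners
    dist_p0_to_edge1 = dist_point_to_line(*p0, p1, p2)
    dist_p0_to_edge2 = dist_point_to_line(*p0, p2, p3)
    dist_p0_to_edge3 = dist_point_to_line(*p0, p3, p4)
    dist_p0_to_edge4 = dist_point_to_line(*p0, p4, p1)
    return (dist_p0_to_edge1 <= l_obj and dist_p0_to_edge3 <= l_obj
            and dist_p0_to_edge2 <= w_obj and dist_p0_to_edge4 <= w_obj)
```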
The steps of the vehicle obstacle avoidance collision detection method are described below in a specific embodiment. First, in S601, target information is acquired, and target relative information is determined based on the own-vehicle information, the own-vehicle coordinate system and the target information. Specifically, in S603-S605, a reference coordinate system is established based on the own-vehicle information and the own-vehicle coordinate system, and the target relative position and the target relative heading angle are determined based on the reference coordinate system and the target information.
Then, in S607, an own-vehicle bounding box and a target bounding box are determined based on the own-vehicle information and the target relative information respectively. Specifically, in S609-S611, own-vehicle boundary points and target boundary points are represented based on the reference coordinate system together with the own-vehicle information and the target relative information respectively, and the own-vehicle bounding box and the target bounding box are determined from those boundary points.
Finally, in S613, the relative positional relationship between the own-vehicle bounding box and the target bounding box is judged, and whether the own vehicle and the target collide is determined based on that relationship. Specifically, in S615-S617, it is judged for each of the four sides of the target bounding box whether the side intersects the own-vehicle bounding box; if so, the own vehicle and the target collide, and if not, it is judged whether a containment relationship exists between the two boxes. Fig. 7 shows the own-vehicle bounding box and the target bounding box overlapping, and fig. 8 shows a containment relationship between them. The specific judging method is as follows: S619, if a side is perpendicular to the reference coordinate system, judge whether its abscissa and ordinates fall within the abscissa range and ordinate range of the own-vehicle bounding box; S621, if a side is not perpendicular to the reference coordinate system, construct a linear equation from the side, judge whether the abscissa range of the side overlaps that of the own-vehicle bounding box, and if so, judge from the overlap interval and the linear equation whether the corresponding ordinates fall within the ordinate range of the own-vehicle bounding box; S623, determine the distances from the target bounding box to the coordinate axes, and if no distance is greater than the target length or the target width, a containment relationship exists between the own-vehicle bounding box and the target bounding box.
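Putting the pieces together, the walkthrough S601-S623 corresponds to the following end-to-end sketch, built from the helper sketches given in the embodiments above (all names remain illustrative):

```python
def collision_at_sample(ego_pose, ego_geom, target_pose_plan, target_size):
    """ego_pose = (PxEgot, PyEgot, HEgot); ego_geom = (EgoLf, EgoLr, WEgo,
    SafeMarginLgt, SafeMarginLat); target_pose_plan = (PxObjt, PyObjt, HObjt);
    target_size = (Lobj, Wobj). Returns True if a collision is detected."""
    # S603-S605: target pose in the sampling-instant reference frame.
    px, py, h = to_ego_frame(*ego_pose, *target_pose_plan)
    # S607-S611: ego bounding box and target corners in that frame.
    rx1, rx2, ry1, ry2 = ego_bounding_box(*ego_geom)
    corners = target_corners(px, py, h, *target_size)
    # S615-S621: any target side intersecting the ego box means collision.
    for a, b in zip(corners, corners[1:] + corners[:1]):
        (x1, y1), (x2, y2) = sorted([a, b])
        if x1 == x2:
            hit = vertical_edge_hits_box(x1, y1, y2, rx1, rx2, ry1, ry2)
        else:
            hit = slanted_edge_hits_box((x1, y1), (x2, y2), rx1, rx2, ry1, ry2)
        if hit:
            return True
    # S623: otherwise fall back to the containment test.
    return containment_exists(corners, *target_size)
```

In this sketch, a target box lying entirely inside the ego box already triggers the side tests, so the final containment check only needs to cover the case of the ego box lying inside the target box, which matches the distance test of S623.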
It should be understood that, although the steps in the flowcharts involved in the above embodiments are shown sequentially as indicated by the arrows, these steps are not necessarily executed in the order indicated. Unless explicitly stated herein, the order of execution of these steps is not strictly limited, and they may be executed in other orders. Moreover, at least some of the steps in the flowcharts involved in the above embodiments may include multiple sub-steps or stages, which are not necessarily executed at the same time but may be executed at different times; nor is the order of execution of these sub-steps or stages necessarily sequential, as they may be executed in turn or alternately with at least part of the other steps or of the sub-steps or stages of other steps.
Based on the same inventive concept, the embodiment of the application also provides a vehicle obstacle avoidance collision detection device for realizing the vehicle obstacle avoidance collision detection method. The implementation of the solution provided by the device is similar to the implementation described in the above method, so the specific limitation in the embodiments of the vehicle obstacle avoidance collision detection device provided below may refer to the limitation of the vehicle obstacle avoidance collision detection method hereinabove, and will not be repeated herein.
In one embodiment, as shown in fig. 9, there is provided a vehicle obstacle avoidance collision detection device 900 comprising: an information acquisition module 901, a bounding box determination module 903, and a collision detection module 905, wherein:
an information acquisition module 901, configured to acquire target information and determine target relative information based on the own-vehicle information, the own-vehicle coordinate system and the target information;
a bounding box determination module 903, configured to determine an own-vehicle bounding box and a target bounding box based on the own-vehicle information and the target relative information respectively;
a collision detection module 905, configured to judge the relative positional relationship between the own-vehicle bounding box and the target bounding box and determine whether the own vehicle and the target collide based on the relative positional relationship.
In one embodiment of the present application, the target information includes a target position and a target heading angle, and the information acquisition module is further configured to:
establish a reference coordinate system based on the own-vehicle information and the own-vehicle coordinate system;
and determine the relative position and the relative heading angle of the target based on the reference coordinate system and the target information.
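As a concrete illustration of this step, a minimal Python sketch under the assumption, not stated herein, that the reference coordinate system is anchored at the own vehicle's position and aligned with its heading angle:

import math

def to_reference_frame(ego_x, ego_y, ego_heading, tgt_x, tgt_y, tgt_heading):
    # Express the target pose in a frame centered on the own vehicle and
    # rotated by its heading (an assumed convention for the reference frame).
    dx, dy = tgt_x - ego_x, tgt_y - ego_y
    c, s = math.cos(ego_heading), math.sin(ego_heading)
    rel_x = c * dx + s * dy             # longitudinal offset (relative position)
    rel_y = -s * dx + c * dy            # lateral offset (relative position)
    diff = tgt_heading - ego_heading    # relative heading angle, wrapped below
    rel_heading = math.atan2(math.sin(diff), math.cos(diff))
    return rel_x, rel_y, rel_heading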
In one embodiment of the present application, the bounding box determination module is further configured to:
represent own-vehicle boundary points and target boundary points based on the reference coordinate system together with the own-vehicle information and the target relative information, respectively;
and determine the own-vehicle bounding box and the target bounding box based on the own-vehicle boundary points and the target boundary points.
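Purely as an illustration, a sketch of turning a pose plus dimensions into the four boundary points of a rectangular bounding box; the rectangle assumption and the corner ordering are choices of this sketch:

import math

def box_corners(cx, cy, heading, length, width):
    # Corners of a rectangle centered at (cx, cy) rotated by heading (radians),
    # ordered front-left, front-right, rear-right, rear-left.
    c, s = math.cos(heading), math.sin(heading)
    offsets = [( length / 2,  width / 2), ( length / 2, -width / 2),
               (-length / 2, -width / 2), (-length / 2,  width / 2)]
    return [(cx + c * dx - s * dy, cy + s * dx + c * dy) for dx, dy in offsets]

Applied to the own vehicle, this yields the own-vehicle boundary points (with the pose taken in the reference frame, typically near the origin); applied to the target's relative pose, it yields the target boundary points.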
In one embodiment of the present application, the collision detection module is further configured to:
judge, for each of the four edges of the target bounding box, whether the edge intersects the own-vehicle bounding box, and if any edge intersects, determine that the own vehicle collides with the target;
and if no edge intersects, judge whether a containment relationship exists between the own-vehicle bounding box and the target bounding box.
In one embodiment of the present application, the collision detection module is further configured to:
and if an edge of the target bounding box is perpendicular to an axis of the reference coordinate system, judge whether the abscissa and the ordinate of the edge fall within the abscissa range and the ordinate range of the own-vehicle bounding box.
In one embodiment of the present application, the collision detection module is further configured to:
if an edge of the target bounding box is not perpendicular to the axes of the reference coordinate system, construct a linear equation from the edge, judge whether the abscissa range of the edge overlaps the abscissa range of the own-vehicle bounding box, and if an overlap exists, judge, based on the overlap and the linear equation, whether the corresponding ordinates fall within the ordinate range of the own-vehicle bounding box.
In one embodiment of the present application, the collision detection module is further configured to:
and determine the distance from the target bounding box to the coordinate axes; if the distance is not greater than the target length or the target width, determine that a containment relationship exists between the own-vehicle bounding box and the target bounding box.
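Putting the earlier sketches together, a hypothetical end-to-end check matching the three-module split above (information acquisition, bounding box determination, collision detection); it reuses to_reference_frame, box_corners, and collides from the sketches above and again assumes the reference frame is centered on the own vehicle:

def detect_collision(ego_pose, ego_dims, tgt_pose, tgt_dims):
    # ego_pose/tgt_pose are (x, y, heading) in world coordinates;
    # ego_dims/tgt_dims are (length, width).
    rel = to_reference_frame(*ego_pose, *tgt_pose)           # information acquisition
    ego_x = (-ego_dims[0] / 2, ego_dims[0] / 2)              # axis-aligned own-vehicle box
    ego_y = (-ego_dims[1] / 2, ego_dims[1] / 2)
    tgt_corners = box_corners(rel[0], rel[1], rel[2], *tgt_dims)  # bounding boxes
    return collides(tgt_corners, ego_x, ego_y, rel, *tgt_dims)   # collision detection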
The modules in the above vehicle obstacle avoidance collision detection device may be implemented in whole or in part by software, hardware, or a combination thereof. Each of the above modules may be embedded in, or independent of, a processor in the computer device in hardware form, or may be stored in a memory in the computer device in software form, so that the processor can call and execute the operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a terminal, and whose internal structure may be as shown in fig. 10. The computer device includes a processor, a memory, a communication interface, a display screen, and an input device connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The communication interface of the computer device is used for wired or wireless communication with an external terminal; the wireless communication may be realized through Wi-Fi, a mobile cellular network, NFC (near field communication), or other technologies. The computer program, when executed by the processor, implements a vehicle obstacle avoidance collision detection method. The display screen of the computer device may be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer device may be a touch layer covering the display screen, a key, a trackball, or a touchpad provided on the housing of the computer device, or an external keyboard, touchpad, or mouse.
It will be appreciated by those skilled in the art that the structure shown in fig. 10 is merely a block diagram of part of the structure related to the solution of the present application and does not constitute a limitation on the computer device to which the solution of the present application is applied; a particular computer device may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of the method embodiments described above when the computer program is executed.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, implements the steps of the method embodiments described above.
In an embodiment, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the steps of the method embodiments described above.
It should be noted that, user information (including but not limited to user equipment information, user personal information, etc.) and data (including but not limited to data for analysis, stored data, presented data, etc.) referred to in the present application are information and data authorized by the user or sufficiently authorized by each party.
Those skilled in the art will appreciate that all or part of the processes of the methods described above may be implemented by instructing relevant hardware through a computer program, and the computer program may be stored in a non-volatile computer-readable storage medium; when executed, the computer program may include the processes of the embodiments of the methods described above. Any reference to a memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, resistive random access memory (ReRAM), magnetoresistive random access memory (MRAM), ferroelectric random access memory (FRAM), phase change memory (PCM), graphene memory, and the like. The volatile memory may include random access memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM is available in a variety of forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM). The databases referred to in the embodiments provided herein may include at least one of a relational database and a non-relational database. The non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the embodiments provided herein may be, but are not limited to, general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic devices, data processing logic devices based on quantum computing, and the like.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in a combination of these technical features, the combination should be considered within the scope of this specification.
The above embodiments merely represent several implementations of the present application, and their description is relatively specific and detailed, but they are not to be construed as limiting the scope of the patent application. It should be noted that those of ordinary skill in the art may make various modifications and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.

Claims (10)

1. A method for detecting obstacle avoidance collisions of a vehicle, the method comprising:
acquiring target information, and determining target relative information based on own-vehicle information, an own-vehicle coordinate system, and the target information;
determining an own-vehicle bounding box and a target bounding box based on the own-vehicle information and the target relative information, respectively;
and judging the relative positional relationship between the own-vehicle bounding box and the target bounding box, and determining whether the own vehicle and the target collide based on the relative positional relationship.
2. The method of claim 1, wherein the target information comprises a target position and a target heading angle, and the determining target relative information based on the own-vehicle information, the own-vehicle coordinate system, and the target information comprises:
establishing a reference coordinate system based on the own-vehicle information and the own-vehicle coordinate system;
and determining the relative position and the relative heading angle of the target based on the reference coordinate system and the target information.
3. The method of claim 2, wherein the determining an own-vehicle bounding box and a target bounding box based on the own-vehicle information and the target relative information, respectively, comprises:
representing own-vehicle boundary points and target boundary points based on the reference coordinate system together with the own-vehicle information and the target relative information, respectively;
and determining the own-vehicle bounding box and the target bounding box based on the own-vehicle boundary points and the target boundary points.
4. The method of claim 1, wherein the judging the relative positional relationship between the own-vehicle bounding box and the target bounding box, and determining whether the own vehicle and the target collide based on the relative positional relationship comprises:
judging, for each of the four edges of the target bounding box, whether the edge intersects the own-vehicle bounding box, and if any edge intersects, determining that the own vehicle collides with the target;
and if no edge intersects, judging whether a containment relationship exists between the own-vehicle bounding box and the target bounding box.
5. The method of claim 4, wherein the judging whether each of the four edges of the target bounding box intersects the own-vehicle bounding box comprises:
if an edge is perpendicular to an axis of the reference coordinate system, judging whether the abscissa and the ordinate of the edge fall within the abscissa range and the ordinate range of the own-vehicle bounding box.
6. The method of claim 4, wherein the judging whether each of the four edges of the target bounding box intersects the own-vehicle bounding box further comprises:
if an edge is not perpendicular to the axes of the reference coordinate system, constructing a linear equation from the edge, judging whether the abscissa range of the edge overlaps the abscissa range of the own-vehicle bounding box, and if an overlap exists, judging, based on the overlap and the linear equation, whether the corresponding ordinates fall within the ordinate range of the own-vehicle bounding box.
7. The method of claim 4, wherein the judging whether a containment relationship exists between the own-vehicle bounding box and the target bounding box comprises:
determining the distance from the target bounding box to the coordinate axes, and if the distance is not greater than the target length or the target width, determining that a containment relationship exists between the own-vehicle bounding box and the target bounding box.
8. A vehicle obstacle avoidance collision detection device, the device comprising:
an information acquisition module, configured to acquire target information and determine target relative information based on own-vehicle information, an own-vehicle coordinate system, and the target information;
a bounding box determination module, configured to determine an own-vehicle bounding box and a target bounding box based on the own-vehicle information and the target relative information, respectively;
and a collision detection module, configured to judge the relative positional relationship between the own-vehicle bounding box and the target bounding box, and determine whether the own vehicle and the target collide based on the relative positional relationship.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any one of claims 1 to 7.
10. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 7.
CN202311504946.7A 2023-11-13 2023-11-13 Vehicle obstacle avoidance collision detection method, device, computer equipment and storage medium Pending CN117681868A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311504946.7A CN117681868A (en) 2023-11-13 2023-11-13 Vehicle obstacle avoidance collision detection method, device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117681868A (en) 2024-03-12

Family

ID=90132872

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311504946.7A Pending CN117681868A (en) 2023-11-13 2023-11-13 Vehicle obstacle avoidance collision detection method, device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117681868A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination