CN115507873B - Route planning method, device, equipment and medium based on bus tail traffic light - Google Patents

Route planning method, device, equipment and medium based on bus tail traffic light Download PDF

Info

Publication number
CN115507873B
CN115507873B CN202211365316.1A CN202211365316A CN115507873B CN 115507873 B CN115507873 B CN 115507873B CN 202211365316 A CN202211365316 A CN 202211365316A CN 115507873 B CN115507873 B CN 115507873B
Authority
CN
China
Prior art keywords
traffic light
target bus
sequence
coordinate
tail
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211365316.1A
Other languages
Chinese (zh)
Other versions
CN115507873A (en
Inventor
李敏
张�雄
龙文
齐新迎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
GAC Aion New Energy Automobile Co Ltd
Original Assignee
GAC Aion New Energy Automobile Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by GAC Aion New Energy Automobile Co Ltd filed Critical GAC Aion New Energy Automobile Co Ltd
Priority to CN202211365316.1A priority Critical patent/CN115507873B/en
Publication of CN115507873A publication Critical patent/CN115507873A/en
Application granted granted Critical
Publication of CN115507873B publication Critical patent/CN115507873B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/3453Special cost functions, i.e. other than distance or default speed limit of road segments
    • G01C21/3492Special cost functions, i.e. other than distance or default speed limit of road segments employing speed data or traffic data, e.g. real-time or historical
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiment of the disclosure discloses a route planning method, a route planning device, route planning equipment and a route planning medium based on a bus tail traffic light. One embodiment of the method comprises: acquiring a road image sequence to be detected and a road radar sensing data sequence; performing position identification on each road radar sensing data in the road radar sensing data sequence to generate external connection frame information of the target bus to obtain an external connection frame information sequence of the target bus; generating a traffic light position information sequence of a target bus position; performing image detection on each image to be detected in the road image sequence to be detected to generate an obstacle outer boundary frame information group to obtain an obstacle outer boundary frame information group sequence; generating a coordinate set of a traffic light shielding area at the tail of the target bus; and planning the current vehicle path in an occlusion area corresponding to the coordinate set of the target bus tail traffic light occlusion area to obtain a planned path. The embodiment can improve the running safety and the passing efficiency of the vehicle.

Description

Route planning method, device, equipment and medium based on bus tail traffic light
Technical Field
The embodiment of the disclosure relates to the technical field of computers, in particular to a route planning method, a route planning device, route planning equipment and a route planning medium based on tail traffic lights of buses.
Background
To facilitate the detection of traffic lights by autonomous vehicles and the observation of traffic lights by drivers, some bus tails are equipped with traffic light display devices. The traffic light display equipment can display the opposite traffic lights at the front intersection of the bus in real time, so that the driver can observe the traffic lights conveniently. At present, when path planning is performed, the method generally adopted is as follows: by means of a path planning algorithm, a planned path is determined which can bypass the obstacle. In addition, if the traffic light of the bus station in front of the current vehicle is shielded, and the current vehicle is in the blind area of the traffic light, the path planning is stopped, and the vehicle moves along with the front vehicle in a passive following mode.
However, the inventor finds that when the path planning is performed in the above manner, the following technical problems often exist:
firstly, when a current vehicle does not enter a solid line road section, if a traffic light at the front bus position is shielded, the current vehicle enters a following moving mode, the current vehicle can follow the vehicle blindly, namely the traffic light is changed into a red light or is about to be changed into the red light after the front vehicle passes through an intersection, and the current vehicle still follows the vehicle, so that the current vehicle is easy to run the red light, higher potential safety hazard exists, the driving safety is reduced, and in addition, if the current vehicle keeps a longer distance from the front vehicle, the current vehicle is easy to be jammed by other vehicles, and the passing efficiency of the current vehicle is reduced;
secondly, under the condition that the traffic lights at the tail of the bus are shielded, the influence of the dead zone of the traffic lights at the tail of the bus on the planned route of the current vehicle and the influence of the planned route on the passing efficiency of the vehicle are not fully considered, so that the generated planned route enables the current vehicle not to well avoid the dead zone of the traffic lights in the moving process, and the current vehicle enters a following mode, thereby reducing the driving safety.
The above information disclosed in this background section is only for enhancement of understanding of the background of the inventive concept and, therefore, it may contain information that does not form the prior art that is already known to a person of ordinary skill in the art in this country.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Some embodiments of the present disclosure propose bus tail traffic light based path planning methods, apparatus, devices and media to address one or more of the technical problems mentioned in the background section above.
In a first aspect, some embodiments of the present disclosure provide a method for planning a route based on a tail traffic light of a bus, the method including: acquiring a road image sequence to be detected and a road radar perception data sequence in response to the fact that traffic light information of the tail of the target bus does not exist in the acquired target road image; performing position identification on each road radar sensing data in the road radar sensing data sequence to generate external connecting frame information of the target bus to obtain an external connecting frame information sequence of the target bus; generating a position information sequence of a traffic light of a target bus position based on the information sequence of the external connecting frame of the target bus; performing image detection on each image to be detected in the road image sequence to be detected to generate an obstacle outer boundary frame information group, and obtaining an obstacle outer connecting frame information group sequence; generating a coordinate set of a tail traffic light shielding area of the target bus based on the position information sequence of the parking spaces of the target bus and the information group sequence of the outer connecting frame of the obstacle; and planning the current vehicle path in an occlusion area corresponding to the coordinate set of the traffic light occlusion area at the tail of the target bus to obtain a planned path.
In a second aspect, some embodiments of the present disclosure provide a route planning device based on a bus tail traffic light, the device comprising: the acquisition unit is configured to respond to the fact that traffic light information of the tail of the target bus does not exist in the acquired target road image, and acquire a road image sequence to be detected and a road radar perception data sequence; the identification unit is configured to identify the position of the target bus to each road radar sensing data in the road radar sensing data sequence to generate the external connecting frame information of the target bus, so as to obtain an external connecting frame information sequence of the target bus; the first generating unit is configured to generate a target bus position traffic light position information sequence based on the target bus external frame information sequence; the image detection unit is configured to perform image detection on each image to be detected in the road image sequence to be detected so as to generate an obstacle outer border frame information group and obtain an obstacle outer border frame information group sequence; the second generation unit is configured to generate a coordinate set of a traffic light shielding area at the tail of the target bus based on the position information sequence of the parking space traffic light of the target bus and the information group sequence of the outer connecting frame of the obstacle; and the path planning unit is configured to plan the current vehicle path in an occlusion area corresponding to the coordinate set of the target bus tail traffic light occlusion area to obtain a planned path.
In a third aspect, some embodiments of the present disclosure provide an electronic device, comprising: one or more processors; a storage device having one or more programs stored thereon, which when executed by one or more processors, cause the one or more processors to implement the method described in any of the implementations of the first aspect.
In a fourth aspect, some embodiments of the present disclosure provide a computer readable medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method described in any of the implementations of the first aspect.
The above embodiments of the present disclosure have the following advantages: according to the route planning method based on the bus tail traffic light, the possibility that the current vehicle runs the red light can be reduced, and the passing efficiency of the current vehicle can be improved. Specifically, the reason for reducing the traffic efficiency of the current vehicle is that: when current vehicle does not get into the solid line highway section, if the place ahead bus parking stall traffic lights are sheltered from, lead to getting into the mode of removing with the car, can make current vehicle blind with the car, the place ahead vehicle passes through crossing back traffic lights and becomes the red light or be about to become the red light, current vehicle is still following the car yet, with this lead to current vehicle to appear easily running the possibility of red light, there is higher potential safety hazard, driving safety has been reduced, in addition, if keep far away distance with the front truck, then easily by other vehicles plus the jam, lead to reducing the current efficiency of current vehicle. Based on this, according to the route planning method based on the bus tail traffic light of some embodiments of the present disclosure, firstly, it is determined that route planning is performed under the condition that the bus seat traffic light is shielded. Therefore, the road image sequence to be detected and the road radar perception data sequence are obtained in response to the fact that the traffic light information of the tail of the target bus does not exist in the obtained target road image. Then, in order to determine the position of the traffic lights of the bus space, the position of the target bus is identified for each road radar sensing data in the road radar sensing data sequence to generate the external frame information of the target bus, and the external frame information sequence of the target bus is obtained. Then, considering that the bus is moving, the bus rear traffic light is also moving. Therefore, a target bus parking space traffic light position information sequence is generated based on the target bus outer connecting frame information sequence. And generating a position information sequence of the traffic lights of the parking spaces of the target bus to correspond to the traffic lights at the tail of the bus at different moments. Then, in order to determine the area where the traffic lights are blocked, image detection is carried out on each image to be detected in the road image sequence to be detected so as to generate an obstacle outer boundary frame information group, and an obstacle outer connection frame information group sequence is obtained. And then, generating a coordinate set of the shielding area of the tail traffic light of the target bus based on the position information sequence of the parking space traffic light of the target bus and the information group sequence of the outer connecting frame of the barrier. Therefore, the area where the traffic light at the tail of the target bus is shielded can be determined. And finally, planning the current vehicle path in an occlusion area corresponding to the coordinate set of the traffic light occlusion area at the tail of the target bus to obtain a planned path. The generated planned path can avoid the occlusion area to the maximum extent by planning the current vehicle path in the occlusion area. Meanwhile, the current vehicle does not need to keep a longer distance from the front vehicle. Therefore, the passing efficiency of the current vehicle can be ensured to some extent. Thus, the planned path can be used for reducing the possibility that the current vehicle runs the red light and improving the traffic efficiency of the current vehicle.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and elements are not necessarily drawn to scale.
Fig. 1 is a flow diagram of some embodiments of a bus tail traffic light based path planning method according to the present disclosure;
FIG. 2 is a schematic structural diagram of some embodiments of a bus tail traffic light based path planning apparatus according to the present disclosure;
FIG. 3 is a schematic block diagram of an electronic device suitable for use in implementing some embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete. It should be understood that the drawings and the embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings. The embodiments and features of the embodiments in the present disclosure may be combined with each other without conflict.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that references to "a", "an", and "the" modifications in this disclosure are intended to be illustrative rather than limiting, and that those skilled in the art will recognize that "one or more" may be used unless the context clearly dictates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 illustrates a flow 100 of some embodiments of a bus tail traffic light based path planning method according to the present disclosure. The route planning method based on the bus tail traffic light comprises the following steps:
step 101, in response to the fact that traffic light information of the tail of the target bus does not exist in the obtained target road image, obtaining a road image sequence to be detected and a road radar perception data sequence.
In some embodiments, in response to detecting that the traffic light information of the tail of the target bus does not exist in the acquired target road image, the executing body of the bus tail-based path planning method may acquire the road image sequence to be detected and the road radar sensing data sequence in a wired manner or a wireless manner. The execution main body can also determine whether traffic light information of the tail of the target bus exists in the target road image through a preset image detection algorithm. Secondly, the target road image and the road image to be detected in the road image sequence to be detected can be road images shot by a vehicle-mounted camera of the current vehicle. In contrast, the target road image is captured at a time before the road image to be detected is captured. Each road radar sensing data in the road radar sensing data sequence may be point cloud data measured by a vehicle-mounted radar of the current vehicle. Specifically, each road image to be detected in the road image sequence to be detected corresponds to the same timestamp as each road radar sensing data in the road radar sensing data sequence.
By way of example, the image detection algorithm described above may include, but is not limited to, at least one of: FCN (full volumetric Networks) model, resnet (Residual neural Network) model, VGG (Visual Geometry Group Network) model, google Net (deep neural Network) model, and the like. The vehicle-mounted radar may be a laser radar, a millimeter wave radar, or the like.
And 102, identifying the position of the target bus to generate the external frame information of the target bus so as to obtain the external frame information sequence of the target bus.
In some embodiments, the executing agent may perform position recognition on the target bus by using various methods on each road radar sensing data in the road radar sensing data sequence to generate external frame information of the target bus, so as to obtain an external frame information sequence of the target bus.
In some optional implementations of some embodiments, each road radar-perception data in the above-described sequence of road radar-perception data may include a set of road radar-perception coordinates. And the executing main body carries out target bus position recognition on each road radar sensing data in the road radar sensing data sequence to generate the external connecting frame information of the target bus, so as to obtain the external connecting frame information sequence of the target bus, and the executing main body can comprise the following steps:
firstly, carrying out target bus detection on the road image to be detected corresponding to the road radar sensing data in the road image sequence to be detected so as to generate a detected external rectangular frame. The target bus can be a bus with traffic lights in front of the current bus and at the tail of the current bus. And detecting a target bus by using a preset target detection model and the road image to be detected corresponding to the road radar sensing data in the road image sequence to be detected so as to generate a detected external rectangular frame. Here, the circumscribed rectangular frame after detection may be a rectangular frame in a camera coordinate system. The detected external rectangular frame can be used for representing information such as the position of the target bus, the posture of the target bus, the size of the target bus and the like.
As an example, the above object detection model may include, but is not limited to, at least one of: a G-CRF (Gaussian-Conditional Random Field) model, a DenSeCRF (Fully-Connected Conditional Random Field) model, an MRF (MRF-Markov Random Field) model, and an SPP (Spatial Pyramid Pooling) model. And secondly, selecting a road radar sensing coordinate matched with the detected external rectangular frame from the road radar sensing data to obtain a road radar sensing coordinate group. The matching can be that the road radar sensing coordinate is in the external rectangular frame after detection. Thus, the set of road radar perceived coordinates may be used to characterize the target bus.
In practice, each road radar sensing coordinate in the road radar sensing data may be converted to the camera coordinate system before being selected, so as to facilitate screening. And is not particularly limited herein.
And thirdly, determining the external frame of the target bus of each road radar sensing coordinate in the road radar sensing coordinate group. And determining the external frame of the target bus of each road radar sensing coordinate in the road radar sensing coordinate group by a minimum convex hull method. Here, the target bus circumscribing frame may be a three-dimensional frame within the camera coordinate system.
And fourthly, determining the external connecting frame of the target bus as the external connecting frame information of the target bus.
And 103, generating a position information sequence of the traffic lights of the parking spaces of the target bus based on the information sequence of the outer connecting frame of the target bus.
In some embodiments, the execution subject may generate the sequence of position information of the traffic light of the target bus space in various ways based on the sequence of information of the external frame of the target bus.
In some optional implementation manners of some embodiments, generating a position information sequence of a parking space traffic light of a target bus based on the information sequence of the external connecting frame of the target bus may include the following steps:
and selecting the coordinates of the target bus external connection frame corresponding to the preset traffic light position from the target bus external connection frame included by the external connection frame information of each target bus in the target bus external connection frame information sequence to serve as the coordinate information of the traffic light of the target bus position, so as to obtain the position information sequence of the traffic light of the target bus position. Wherein, each target bus parking space traffic light position information in the target bus parking space traffic light position information sequence can comprise a target bus parking space traffic light coordinate. Here, the coordinates of the position where the traffic light is set on the external frame of the target bus may be determined as the coordinates information of the traffic light of the bus position of the target bus.
In practice, information such as the size of a bus with a traffic light mounted at the tail of the bus, the position of the traffic light relative to the bus and the like can be stored in a database. So as to determine the position of the traffic light from the external frame of the target bus.
And 104, performing image detection on each image to be detected in the road image sequence to be detected to generate an obstacle outer boundary frame information group, so as to obtain an obstacle outer boundary frame information group sequence.
In some embodiments, the executing body may perform image detection on each to-be-detected image in the to-be-detected road image sequence to generate an obstacle outside bounding box information group, so as to obtain an obstacle outside bounding box information group sequence.
In some optional implementation manners of some embodiments, the performing main body performs image detection on each to-be-detected image in the to-be-detected road image sequence to generate an obstacle outside bounding box information group, and obtains an obstacle outside bounding box information group sequence, which may include the following steps:
firstly, obstacle recognition is carried out on the road image to be detected, and an obstacle recognition information set is obtained. Each obstacle identification information in the obstacle identification information group may include: obstacle size information and an obstacle position and attitude matrix. Next, obstacle recognition may be performed by the image detection algorithm. Here, each obstacle identification information in the obstacle identification information group may be used to characterize one obstacle in the road image. The obstacle size information may be a size value of the obstacle. For example, the obstacle size information may include an obstacle length value, a height value, a width value, and the like. The obstacle position and posture matrix can be used for representing the position coordinates and the posture of the obstacle.
And secondly, generating obstacle external connection frame information by using obstacle size information and an obstacle position and posture matrix included by each obstacle identification information in the obstacle identification information group to obtain an obstacle external connection frame information group. Each obstacle external connection frame information in the obstacle external connection frame information group comprises an obstacle external rectangle and an obstacle external connection frame vertex coordinate group. Each obstacle external frame information in the obstacle external frame information group may include an obstacle external rectangle and an obstacle external frame vertex coordinate group. The obstacle circumscribed rectangle may be a three-dimensional rectangle in the camera coordinate system of the above-described onboard camera. The set of obstacle bounding box vertex coordinates may include eight obstacle bounding box vertex coordinates. The obstacle circumscribing bounding box vertex coordinates may characterize the vertices of the obstacle circumscribing rectangle. Here, first, the obstacle coordinates and the direction vector in the obstacle position and orientation matrix may be determined. And secondly, constructing an obstacle external connection rectangle according to the length value, the width value and the height value of the obstacle included by the obstacle size information in the direction represented by the direction vector. Meanwhile, a vertex coordinate set of the outer connecting frame of the obstacle can be obtained. And finally, determining the external obstacle border corresponding to the same obstacle and the vertex coordinate set of the external obstacle border as the external obstacle border information. Thus, the obstacle outside-frame information group can be obtained.
And 105, generating a coordinate set of the shielding area of the tail traffic light of the target bus based on the position information sequence of the parking space traffic light of the target bus and the information group sequence of the outer connecting frame of the obstacle.
In some embodiments, the execution main body may generate a coordinate set of a traffic light shielding area at the tail of the target bus in various ways based on the position information sequence of the traffic light at the parking space of the target bus and the information group sequence of the outside-connected border of the obstacle.
In some optional implementation manners of some embodiments, the executing main body generates a coordinate set of a target bus tail traffic light blocking area based on the target bus parking space traffic light position information sequence and the obstacle external connection frame information group sequence, and may include the following steps:
firstly, a camera plane is constructed based on the camera coordinates of the vehicle-mounted camera of the current vehicle. The camera plane may be a plane where a camera ordinate in a camera coordinate system is located. Here, it may be determined that the camera plane is parallel to the ground. Thus, the camera plane can be determined by the ordinate of the coordinate point of the onboard camera.
In practice, a traffic light blind area coordinate set is generated, and not only five surfaces (except the bottom surface) of the circumscribed rectangle of the barrier corresponding to each barrier are projected to a camera plane, so that a projection surface is obtained. The projection surface of the barriers corresponding to the traffic lights at different positions on the camera plane needs to be determined. Here, the plane projection may be a projection plane in which four vertex coordinates corresponding to each plane are projected and the projected points are connected to form a plane. Then, the union of the projection surfaces is taken as the complete projection surface. And finally, each coordinate point on the complete projection surface is the traffic light blind area coordinate in the traffic light blind area coordinate set.
And secondly, for a target bus parking space traffic light coordinate included by each target bus parking space traffic light position information in the target bus parking space traffic light position information sequence and an obstacle external connection frame information group corresponding to the target bus parking space traffic light position information in the obstacle external connection frame information group sequence, executing the following steps to generate a single-frame traffic light shielding area coordinate set:
the first substep is to determine four plane vertex coordinates of an intersecting quadrilateral between the camera plane and the obstacle external rectangles included in each obstacle external bounding box information group in the obstacle external bounding box information group as a plane vertex coordinate group, and obtain a plane vertex coordinate group set. Wherein the intersecting quadrilateral may be a cross-section of the camera plane in an obstacle enclosing rectangle. Each planar vertex coordinate in each planar vertex coordinate set may be generated by:
Figure 912640DEST_PATH_IMAGE001
wherein,
Figure 626518DEST_PATH_IMAGE002
representing coordinate points.
Figure 547201DEST_PATH_IMAGE003
Indicating the above-mentioned barrierAnd the obstacle external border information comprises the vertex coordinates of a first obstacle external border in the vertex coordinates of the four obstacle external borders corresponding to the top surface of the obstacle external border in the vertex coordinate set of the obstacle external border.
Figure 473568DEST_PATH_IMAGE004
And the vertex coordinates of the external obstacle frame corresponding to the vertex coordinates of the first external obstacle frame in the vertex coordinates of the four external obstacle frames corresponding to the bottom surface of the external obstacle frame in the vertex coordinate set of the external obstacle frame included in the information of the external obstacle frame.
Figure 960044DEST_PATH_IMAGE005
Representing the coordinates of the vertices of the above-mentioned plane.
Figure 313665DEST_PATH_IMAGE006
And a vertical coordinate value indicating a camera coordinate in the camera plane.
Figure 315119DEST_PATH_IMAGE007
The first two data are shown taking vectors in parentheses.
Figure 920544DEST_PATH_IMAGE008
Third data taking vectors in parentheses.
In practice, p1 and p5 may be one edge of a rectangle surrounding the corresponding obstacle, so that there is a correspondence between p1, p5, and p 9. By analogy, other vertex coordinates also correspond to each other. Thus, four plane vertex coordinates can be obtained.
And a second substep, taking the traffic light coordinates of the target bus station as a starting point, respectively determining four ray intersection point coordinates of four plane vertex coordinates included in each plane vertex coordinate set in the plane vertex coordinate set and intersected with the camera plane as ray intersection point coordinate sets, and obtaining a ray intersection point coordinate set. Wherein each ray corner coordinate in each ray corner coordinate set may be generated by the following formula:
Figure 120581DEST_PATH_IMAGE009
wherein,
Figure 786049DEST_PATH_IMAGE010
representing the above-mentioned ray corner coordinates.
Figure 71537DEST_PATH_IMAGE011
And representing the coordinates of the traffic light of the target bus station.
Figure 215073DEST_PATH_IMAGE012
Represents the trim coefficient for the trim equation.
And a third substep of generating a coordinate set of a single-frame traffic light shielding area based on the ray intersection point coordinate set and the plane vertex coordinate set. For the four ray intersection point coordinates included by each ray intersection point coordinate set and the four plane vertex coordinates included by the corresponding plane vertex coordinate set, a single-frame traffic light shielding area coordinate set can be generated through the following steps: first, a minimum bounding polygon between the coordinates of the four ray intersection points and the coordinates of the vertices of the four planes may be determined. Here, the minimum circumscribed polygon may represent a complete projection plane corresponding to the coordinates of the traffic light of the target bus station. Secondly, determining each coordinate of the area where the minimum external polygon of the transverse projection is located as a single-frame traffic light shielding area coordinate to obtain a single-frame traffic light shielding area coordinate set.
And thirdly, combining the coordinates of each single-frame traffic light shielding area in each generated single-frame traffic light shielding area coordinate set to generate a target bus tail traffic light shielding area coordinate set. The combination processing may be to determine a union set of the first projected area coordinates in each generated single-frame traffic light occlusion area coordinate set, so as to obtain a target bus tail traffic light occlusion area coordinate set.
Optionally, before the executing body performs the current vehicle path planning in the occlusion area corresponding to the coordinate set of the traffic light occlusion area at the tail of the bus to obtain the planned path, the executing body may further execute the following steps:
the method comprises the steps of firstly, determining the distance value between the single-frame traffic light sheltering area coordinate in each single-frame traffic light sheltering area coordinate set in each generated single-frame traffic light sheltering area coordinate set and the corresponding target bus station traffic light coordinate. The distance value between the coordinates of the single-frame traffic light shielding area and the corresponding coordinates of the target bus station traffic light can be determined through a distance formula between two points.
And secondly, determining the tail traffic light shielding area coordinates of the target bus with the corresponding distance values meeting the preset distance condition in each generated single-frame traffic light shielding area coordinate set as the tail traffic light shielding area coordinates after screening to obtain a tail traffic light shielding area coordinate set after screening. The preset distance condition may be that the distance value is smaller than a preset distance threshold (for example, 50 meters).
And 106, planning the current vehicle path in an occlusion area corresponding to the coordinate set of the traffic light occlusion area at the tail of the target bus to obtain a planned path.
In some embodiments, the executing body may perform current vehicle path planning in an occlusion area corresponding to the coordinate set of the target bus tail traffic light occlusion area to obtain a planned path. Wherein the planned path may be generated by:
the method comprises the steps of firstly, acquiring a longitudinal traffic light coordinate and a transverse traffic light coordinate set corresponding to the position of a current vehicle. The transverse traffic light coordinate set matched with the vehicle positioning coordinate of the current vehicle can be selected from preset high-precision map data. The longitudinal traffic light coordinates may be coordinates of a traffic light at a crossing on the same road as the current vehicle. The transverse traffic light coordinates in the transverse traffic light coordinate set may be coordinates of traffic lights of other roads at a crossing in front of the current vehicle. Secondly, the matching can be that the traffic light represented by the traffic light coordinates and the current vehicle positioning coordinates are on the same road. Here, the longitudinal traffic light coordinates and the lateral traffic light coordinates may be obtained by converting high-precision map traffic light coordinates in a map coordinate system selected from the high-precision map data to a camera coordinate system of the above-mentioned vehicle-mounted camera. At least one lateral traffic light coordinate may be included in the set of lateral traffic light coordinates. Each lateral traffic light coordinate may correspond to a traffic light of one direction of the intersection.
And secondly, generating a corresponding first projected area coordinate set for each obstacle external connection frame information in each obstacle external connection frame information group in the obstacle external connection frame information group sequence and each transverse traffic light coordinate in the transverse traffic light coordinate group. The first projected area coordinate set can be generated through the generation mode of the single-frame traffic light shielding area coordinate set.
And thirdly, combining the generated first projected area coordinates in each first projected area coordinate set to generate a transverse traffic light shielding area coordinate set. The combining process may be to determine a union set of the first projected area coordinates in the generated first projected area coordinate sets, so as to obtain a transverse traffic light blocking area coordinate set.
And fourthly, generating a longitudinal traffic light sheltering area coordinate set based on the barrier external connection frame information group sequence and the longitudinal traffic light coordinate. The specific implementation manner and the technical effects of the fourth step may be those steps of generating a coordinate set of a transverse traffic light shielding region, which are not described herein again.
Fifthly, path planning can be carried out on an occlusion area corresponding to the coordinate set of the target bus tail traffic light occlusion area in the space-time coordinate system through the following formula so as to generate a planned path:
Figure 472879DEST_PATH_IMAGE013
wherein,
Figure 433882DEST_PATH_IMAGE014
representing the planned path.
Figure 79103DEST_PATH_IMAGE015
Representing the minimization objective function.
Figure 150964DEST_PATH_IMAGE016
The objective representing the minimized objective function is the planned path in an iterative process.
Figure 528856DEST_PATH_IMAGE017
Indicating the time of day.
Figure 270547DEST_PATH_IMAGE018
The time corresponding to the first road image in the road image sequence is shown.
Figure 796206DEST_PATH_IMAGE019
Indicating the time corresponding to the last road image in the road image sequence.
Figure 281545DEST_PATH_IMAGE020
A loss function is represented for generating loss values for the path coordinates on the planned path.
Figure 310681DEST_PATH_IMAGE021
On the representation of the planned path
Figure 816748DEST_PATH_IMAGE017
The path coordinates of the time of day.
Figure 439491DEST_PATH_IMAGE022
On the representation of the planned path
Figure 56417DEST_PATH_IMAGE017
Loss value of path coordinates of time of day.
Figure 940059DEST_PATH_IMAGE023
And an abscissa value representing coordinates of a path on the planned path.
Figure 554711DEST_PATH_IMAGE024
Representing the current vehicle traffic efficiency loss function.
Figure 727067DEST_PATH_IMAGE025
Is shown in
Figure 944422DEST_PATH_IMAGE018
To
Figure 151412DEST_PATH_IMAGE019
A loss value of traffic efficiency of the planned path within the time period.
Figure 468124DEST_PATH_IMAGE026
The transition number loss function is shown.
Figure 127775DEST_PATH_IMAGE027
A weight coefficient (e.g., 0.8) representing the movement velocity term.
Figure 758608DEST_PATH_IMAGE028
A weight coefficient (e.g., 0.2) representing the degree of switch to term.
Figure 616842DEST_PATH_IMAGE029
Is shown in
Figure 635614DEST_PATH_IMAGE018
To
Figure 720245DEST_PATH_IMAGE019
The number of times a path u within a time period needs to be rerouted.
Figure 279402DEST_PATH_IMAGE030
On the representation of the planned path
Figure 195405DEST_PATH_IMAGE017
The abscissa value of the path coordinate of the time.
Figure 54253DEST_PATH_IMAGE031
And a ordinate value indicating a path coordinate on the planned path.
Figure 485234DEST_PATH_IMAGE032
On the representation of the planned path
Figure 785765DEST_PATH_IMAGE017
The ordinate value of the time path coordinate.
Figure 759537DEST_PATH_IMAGE033
Figure 854532DEST_PATH_IMAGE034
Represents a preset weight value for participating in calculating a loss value (e.g.,
Figure 772810DEST_PATH_IMAGE033
the value may be taken to be 0.7,
Figure 814715DEST_PATH_IMAGE034
values may be 0.3).
Figure 439732DEST_PATH_IMAGE035
And representing the coordinate set of the shielded area of the traffic light at the tail of the target bus.
Figure 33524DEST_PATH_IMAGE036
And representing the coordinate set of the transverse traffic light shielding area.
Figure 48884DEST_PATH_IMAGE037
And representing the coordinate set of the longitudinal traffic light shielding area. The other can indicate that the path coordinate does not belong to the coordinate set of the target transverse traffic light occlusion area and the coordinate set of the target longitudinal traffic light occlusion area.
Figure 956798DEST_PATH_IMAGE038
Represents a 2-way expression.
In addition, in the iteration process, the generation of the planning path needs to meet the condition that the path coordinate can only move towards the positive direction in the space-time coordinate system. Meanwhile, a planned path in the iteration process can be provided for the formula through a preset path planning algorithm. Here, the planned path algorithm may include, but is not limited to, at least one of: an a-star algorithm, an artificial potential field, a random tree algorithm, a dixtre (Dijkstra) algorithm, etc. Finally, the above formula can be solved by ISAM (Incremental Smoothing And Mapping method), GTSAM (nonlinear optimization library), and the like. In practice, the path planning method may not be considered when the angle between the connection line between the traffic light of the target bus station and the optical center of the current vehicle-mounted camera and the optical axis of the vehicle-mounted camera is out of a preset threshold range (for example, plus or minus 30 degrees).
The above formulas and the related contents serve as an invention point of the embodiment of the disclosure, and the technical problem mentioned in the background art is solved, namely, under the condition that the traffic lights at the tail of the bus are shielded, the influence of the dead zone of the traffic lights at the tail of the bus on the planned route of the current vehicle and the influence of the planned route on the vehicle passing efficiency are not fully considered, so that the generated planned route enables the current vehicle not to well avoid the dead zone of the traffic lights in the moving process, and the current vehicle enters a following mode, thereby reducing the driving safety. Factors that lead to reduced driving safety and traffic efficiency tend to be as follows: under the condition that the traffic light at the tail of the bus is shielded, the influence of the dead zone of the traffic light at the tail of the bus on the planned route of the current vehicle and the influence of the planned route on the passing efficiency of the vehicle are not fully considered, so that the generated planned route enables the current vehicle not to well avoid the dead zone of the traffic light in the moving process. If the above-mentioned factors are solved, driving safety and traffic efficiency can be improved. To achieve this, first, by generating the obstacle-bordering rectangles and the obstacle-bordering bounding box vertex coordinates, it is possible to determine the position coordinates and the space occupation area of the obstacle in the camera coordinate system. Next, it is considered that the autonomous vehicle performs traffic light information recognition by taking a road image with an on-vehicle camera. Thus, by constructing the camera plane, it may be used to improve the accuracy of the generated occluded traffic light region. Then, the formula of generating the plane vertex coordinates can be used for determining the four vertex coordinates of the cross section of the obstacle circumscribed rectangle corresponding to the obstacle in the camera plane. Then, by the formula for generating the coordinates of the intersection points of the transverse traffic light rays, the coordinates of the corner points of the shadow area which can be presented by the circumscribed rectangle of the obstacle corresponding to the obstacle in the camera plane can be determined under the condition that the longitudinal traffic light and each transverse traffic light are respectively the light source. Then, through the combination processing, the coordinates of the target traffic light blind areas in all the shielded areas can be determined. Then, by introducing a space-time coordinate system, the optimal planning path can be conveniently found in the time series. In addition, the existence of the blind area coordinates of the traffic light at a far position is considered to be unnecessary for the current vehicle, so the blind area coordinates of the traffic light which do not meet the conditions are filtered out through the preset distance condition. Thus, the accuracy of the generated target blind area coordinates is improved. In addition, if only the need to avoid the occlusion region is considered in generating the planned route, the generated planned route may need to be changed lanes many times. This not only reduces the safety of the vehicle but also reduces the efficiency of vehicle passage. Therefore, factors influencing the vehicle passing efficiency are considered, and a current vehicle passing efficiency loss function is introduced. Therefore, the passing efficiency of the vehicle can be improved to a certain degree. Secondly, because the corresponding sheltering areas of different traffic lights are different. Therefore, the traffic light shielding area is divided into three parts, namely a target bus tail traffic light shielding area coordinate set, a transverse traffic light shielding area coordinate set and a longitudinal traffic light shielding area coordinate set. Thus, the loss value of coordinates in different areas can be determined with finer granularity. Therefore, different weight values are set for the coordinates of traffic light blind areas in different shielding areas. Therefore, the planning path with the minimum loss value can be conveniently selected in the space-time coordinate system. Thus, driving safety and traffic efficiency can be improved.
Optionally, the executing body may further send the planned path to a control terminal of the current vehicle, so that the control terminal controls the current vehicle to move along the planned path.
The above embodiments of the present disclosure have the following advantages: according to the route planning method based on the bus tail traffic light, the possibility that the current vehicle runs the red light can be reduced, and the passing efficiency of the current vehicle can be improved. Specifically, the reason for reducing the traffic efficiency of the current vehicle is that: when current vehicle does not get into the solid line highway section, if the place ahead bus parking stall traffic lights are sheltered from, lead to getting into the mode of removing with the car, can make current vehicle blind with the car, the place ahead vehicle passes through crossing back traffic lights and becomes the red light or be about to become the red light, current vehicle is still following the car yet, with this lead to current vehicle to appear easily running the possibility of red light, there is higher potential safety hazard, driving safety has been reduced, in addition, if keep far away distance with the front truck, then easily by other vehicles plus the jam, lead to reducing the current efficiency of current vehicle. Based on this, according to the route planning method based on the bus tail traffic light of some embodiments of the present disclosure, firstly, it is determined that route planning is performed under the condition that the bus seat traffic light is shielded. Therefore, the road image sequence to be detected and the road radar perception data sequence are obtained in response to the fact that the traffic light information of the tail of the target bus does not exist in the obtained target road image. Then, in order to determine the position of the traffic light of the bus station, the position of the target bus is identified for each road radar sensing data in the road radar sensing data sequence to generate the external frame information of the target bus, and the external frame information sequence of the target bus is obtained. Then, considering that the bus is moving, the bus rear traffic light is also moving. Therefore, a target bus parking space traffic light position information sequence is generated based on the target bus outer connecting frame information sequence. And generating a position information sequence of the traffic lights of the parking spaces of the target bus to correspond to the traffic lights at the tail of the bus at different moments. Then, in order to determine the area where the traffic lights are blocked, image detection is carried out on each image to be detected in the road image sequence to be detected so as to generate an obstacle outer boundary frame information group, and an obstacle outer connection frame information group sequence is obtained. And then, generating a coordinate set of a target bus tail traffic light shielding area based on the target bus parking space traffic light position information sequence and the barrier external connection frame information group sequence. Therefore, the area where the traffic light at the tail of the target bus is shielded can be determined. And finally, planning the current vehicle path in an occlusion area corresponding to the coordinate set of the target bus tail traffic light occlusion area to obtain a planned path. The generated planned path can avoid the occlusion area to the maximum extent by planning the current vehicle path in the occlusion area. Meanwhile, the current vehicle does not need to keep a longer distance from the front vehicle. Therefore, the passing efficiency of the current vehicle can be ensured to some extent. Therefore, the planned path can be used for reducing the possibility that the current vehicle runs the red light and improving the traffic efficiency of the current vehicle.
With further reference to fig. 2, as an implementation of the methods shown in the above figures, the present disclosure provides some embodiments of a route planning device based on a bus tail traffic light, which correspond to those of the method shown in fig. 1, and the device may be applied to various electronic devices.
As shown in fig. 2, a bus tail traffic light-based path planning apparatus 200 according to some embodiments includes: an acquisition unit 201, a recognition unit 202, a first generation unit 203, an image detection unit 204, a second generation unit 205, and a path planning unit 206. The acquisition unit 201 is configured to acquire a road image sequence to be detected and a road radar perception data sequence in response to detecting that traffic light information of the tail of a target bus does not exist in the acquired target road image; the identification unit 202 is configured to perform position identification on each road radar sensing data in the road radar sensing data sequence to generate external frame information of the target bus, so as to obtain an external frame information sequence of the target bus; a first generating unit 203 configured to generate a sequence of position information of a traffic light of a target bus space based on the sequence of information of the external frame of the target bus; the image detection unit 204 is configured to perform image detection on each image to be detected in the road image sequence to be detected to generate an obstacle outer border frame information group, so as to obtain an obstacle outer border frame information group sequence; a second generating unit 205, configured to generate a coordinate set of a traffic light blocking area at the tail of the target bus based on the position information sequence of the parking space traffic light of the target bus and the information group sequence of the external connection frame of the obstacle; and the path planning unit 206 is configured to plan the current vehicle path in an occlusion area corresponding to the coordinate set of the target bus tail traffic light occlusion area, so as to obtain a planned path.
It will be understood that the units described in the apparatus 200 correspond to the various steps in the method described with reference to fig. 1. Thus, the operations, features and resulting advantages described above with respect to the method are also applicable to the apparatus 200 and the units included therein, and are not described herein again.
Referring now to FIG. 3, a block diagram of an electronic device 300 suitable for use in implementing some embodiments of the present disclosure is shown. The electronic device shown in fig. 3 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 3, the electronic device 300 may include a processing means 301 (e.g., a central processing unit, a graphics processor, etc.) that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 302 or a program loaded from a storage means 308 into a Random Access Memory (RAM) 303. In the RAM 303, various programs and data necessary for the operation of the electronic apparatus 300 are also stored. The processing device 301, the ROM 302, and the RAM 303 are connected to each other via a bus 304. An input/output (I/O) interface 305 is also connected to bus 304.
Generally, the following devices may be connected to the I/O interface 305: input devices 306 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 307 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage devices 308 including, for example, magnetic tape, hard disk, etc.; and a communication device 309. The communication means 309 may allow the electronic device 300 to communicate wirelessly or by wire with other devices to exchange data. While fig. 3 illustrates an electronic device 300 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 3 may represent one device or may represent multiple devices, as desired.
In particular, according to some embodiments of the present disclosure, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, some embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In some such embodiments, the computer program may be downloaded and installed from a network through the communication device 309, or installed from the storage device 308, or installed from the ROM 302. The computer program, when executed by the processing apparatus 301, performs the above-described functions defined in the methods of some embodiments of the present disclosure.
It should be noted that the computer readable medium described above in some embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present disclosure, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients, servers may communicate using any currently known or future developed network Protocol, such as HTTP (Hyper Text Transfer Protocol), and may interconnect with any form or medium of digital data communication (e.g., a communications network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), the Internet (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the apparatus; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquiring a road image sequence to be detected and a road radar perception data sequence in response to the fact that traffic light information of the tail of the target bus does not exist in the acquired target road image; performing position identification on each road radar sensing data in the road radar sensing data sequence to generate external connecting frame information of the target bus to obtain an external connecting frame information sequence of the target bus; generating a position information sequence of a traffic light of a target bus position based on the information sequence of the external connecting frame of the target bus; performing image detection on each image to be detected in the road image sequence to be detected to generate an obstacle outer boundary frame information group, and obtaining an obstacle outer connecting frame information group sequence; generating a coordinate set of a tail traffic light sheltered area of the target bus based on the position information sequence of the parking spaces of the target bus and the information group sequence of the outer connecting frame of the barrier; and planning the current vehicle path in an occlusion area corresponding to the coordinate set of the traffic light occlusion area at the tail of the target bus to obtain a planned path.
Computer program code for carrying out operations for embodiments of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, smalltalk, C + +, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in some embodiments of the present disclosure may be implemented by software, and may also be implemented by hardware. The described units may also be provided in a processor, and may be described as: a processor includes an acquisition unit, a recognition unit, a first generation unit, an image detection unit, a second generation unit, and a path planning unit. The names of these units do not in some cases constitute a limitation to the unit itself, and for example, the acquisition unit may also be described as a "unit that acquires a road image sequence to be detected and a road radar perception data sequence".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), application Specific Integrated Circuits (ASICs), application Specific Standard Products (ASSPs), systems on a chip (SOCs), complex Programmable Logic Devices (CPLDs), and the like.
The foregoing description is only exemplary of the preferred embodiments of the disclosure and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention in the embodiments of the present disclosure is not limited to the specific combination of the above-mentioned features, but also encompasses other embodiments in which any combination of the above-mentioned features or their equivalents is made without departing from the inventive concept as defined above. For example, the above features and (but not limited to) technical features with similar functions disclosed in the embodiments of the present disclosure are mutually replaced to form the technical solution.

Claims (10)

1. A route planning method based on a bus tail traffic light comprises the following steps:
acquiring a road image sequence to be detected and a road radar perception data sequence in response to the fact that no traffic light information of the tail of the target bus exists in the acquired target road image;
performing position identification on each road radar perception data in the road radar perception data sequence to generate external frame information of the target bus to obtain an external frame information sequence of the target bus;
generating a target bus tail traffic light position information sequence based on the target bus external frame information sequence;
performing image detection on each image to be detected in the road image sequence to be detected to generate an obstacle outer boundary frame information group, and obtaining an obstacle outer boundary frame information group sequence;
generating a coordinate set of a target bus tail traffic light sheltered area based on the target bus tail traffic light position information sequence and the barrier external border information group sequence;
planning a current vehicle path in an occlusion area corresponding to the coordinate set of the traffic light occlusion area at the tail of the target bus to obtain a planned path;
the method comprises the following steps that a shielding area corresponding to a target bus tail traffic light shielding area coordinate set carries out current vehicle path planning to obtain a planned path, and the method comprises the following steps:
acquiring a longitudinal traffic light coordinate and a transverse traffic light coordinate set corresponding to the position of the current vehicle;
generating a corresponding first projected area coordinate set for each obstacle external connection frame information in each obstacle external connection frame information group in the obstacle external connection frame information group sequence and each transverse traffic light coordinate in the transverse traffic light coordinate group;
combining the generated first projected area coordinates in each first projected area coordinate set to generate a transverse traffic light shielding area coordinate set;
generating a longitudinal traffic light shielding area coordinate set based on the barrier external connection frame information group sequence and the longitudinal traffic light coordinate;
and performing path planning on an occlusion area corresponding to the coordinate set of the target bus tail traffic light occlusion area in a space-time coordinate system through the following formula to generate a planned path:
Figure DEST_PATH_IMAGE001
wherein,
Figure 986843DEST_PATH_IMAGE002
the planned path is represented by a representation of the planned path,
Figure DEST_PATH_IMAGE003
it is shown that the objective function is minimized,
Figure 982612DEST_PATH_IMAGE004
the objective representing the minimized objective function is the planned path in an iterative process,
Figure DEST_PATH_IMAGE005
which is indicative of the time of day,
Figure 965612DEST_PATH_IMAGE006
representing the time instant corresponding to the first road image in said sequence of road images,
Figure DEST_PATH_IMAGE007
representing the time instant corresponding to the last road image in said sequence of road images,
Figure 564083DEST_PATH_IMAGE008
a loss function for generating a loss value for a path coordinate on the planned path,
Figure DEST_PATH_IMAGE009
on the representation of the planned path
Figure 581718DEST_PATH_IMAGE005
The coordinates of the path at the time of day,
Figure 810705DEST_PATH_IMAGE010
on the representation of the planned path
Figure 608897DEST_PATH_IMAGE005
The loss value of the path coordinates of the time instant,
Figure DEST_PATH_IMAGE011
an abscissa value representing coordinates of a path on the planned path,
Figure 745480DEST_PATH_IMAGE012
represents a current vehicle traffic efficiency loss function,
Figure DEST_PATH_IMAGE013
is shown in
Figure 352042DEST_PATH_IMAGE006
To
Figure 751930DEST_PATH_IMAGE007
A loss value of traffic efficiency of the planned path within the time period,
Figure 37418DEST_PATH_IMAGE014
a function representing the number of passes of the conversion,
Figure DEST_PATH_IMAGE015
a weight coefficient representing the moving speed term,
Figure 712113DEST_PATH_IMAGE016
a weight coefficient representing the number of passes of the conversion,
Figure DEST_PATH_IMAGE017
is shown in
Figure 173181DEST_PATH_IMAGE006
To
Figure 9550DEST_PATH_IMAGE007
Paths within a time period
Figure 782334DEST_PATH_IMAGE018
The number of times of lane changes is required,
Figure DEST_PATH_IMAGE019
on the representation of the planned path
Figure 995141DEST_PATH_IMAGE005
The abscissa value of the path coordinate of the time,
Figure 310716DEST_PATH_IMAGE020
a ordinate value representing a path coordinate on the planned path,
Figure DEST_PATH_IMAGE021
on the representation of the planned path
Figure 849145DEST_PATH_IMAGE005
The ordinate value of the path coordinate of the time,
Figure 814669DEST_PATH_IMAGE022
Figure DEST_PATH_IMAGE023
representing a preset weight value for participating in calculating a loss value,
Figure 831167DEST_PATH_IMAGE024
representing the coordinate set of the traffic light shielding area at the tail of the target bus,
Figure DEST_PATH_IMAGE025
representing coordinates of the blocked area of the transverse traffic lightThe collection of the data is carried out,
Figure 266827DEST_PATH_IMAGE026
representing the coordinate set of the longitudinal traffic light shielding area, other representing path coordinates do not belong to the coordinate set of the target transverse traffic light shielding area and the coordinate set of the target longitudinal traffic light shielding area,
Figure DEST_PATH_IMAGE027
represents a 2-way expression.
2. The method of claim 1, wherein the method further comprises:
and sending the planned path to a control terminal of the current vehicle so that the control terminal can control the current vehicle to move along the planned path.
3. The method of claim 1, wherein each road radar perception data in the sequence of road radar perception data comprises a set of road radar perception coordinates; and
the said every road radar perception data to in the said road radar perception data sequence carry on the recognition of position of target bus in order to produce the outside of target bus connects the frame information, include:
detecting a target bus to the road image to be detected corresponding to the road radar sensing data in the road image sequence to be detected so as to generate a detected external rectangular frame;
selecting a road radar sensing coordinate matched with the detected external rectangular frame from the road radar sensing data to obtain a road radar sensing coordinate group;
determining the external frame of the target bus of each road radar sensing coordinate in the road radar sensing coordinate group;
and determining the external connecting frame of the target bus as the external connecting frame information of the target bus.
4. The method of claim 1, wherein generating a sequence of target bus tailed traffic light position information based on the sequence of target bus frame-out information comprises:
and selecting the external connecting frame coordinates of the target bus corresponding to the preset traffic light position on the external connecting frame of the target bus, wherein the external connecting frame coordinates of each target bus in the external connecting frame information sequence of the target bus comprise the external connecting frame coordinate information of the target bus to serve as the tail traffic light coordinate information of the target bus, and obtaining the tail traffic light position information sequence of the target bus, wherein each tail traffic light position information of the target bus in the tail traffic light position information sequence of the target bus comprises the tail traffic light coordinate of the target bus.
5. The method according to claim 4, wherein the image detection of each image to be detected in the road image sequence to be detected to generate an obstacle outer boundary frame information group comprises:
carrying out obstacle identification on the road image to be detected to obtain an obstacle identification information group, wherein each obstacle identification information in the obstacle identification information group comprises: obstacle size information and an obstacle position and posture matrix;
and generating obstacle external frame information by using obstacle size information and an obstacle position and posture matrix included by each obstacle identification information in the obstacle identification information group to obtain an obstacle external frame information group, wherein each obstacle external frame information in the obstacle external frame information group comprises an obstacle external rectangle and an obstacle external frame vertex coordinate group.
6. The method of claim 5, wherein generating a target bus tail traffic light occlusion area coordinate set based on the target bus tail traffic light position information sequence and the barrier border information group sequence comprises:
constructing a camera plane based on the camera coordinates of the vehicle-mounted camera of the current vehicle;
for target bus tail traffic light coordinates that every target bus tail traffic light position information includes in the target bus tail traffic light position information sequence and in the external frame information group sequence of barrier with the external frame information group of barrier that target bus tail traffic light position information corresponds connects the frame information group outward, carries out following step and shelters from regional coordinate set in order to generate single frame traffic light:
determining four plane vertex coordinates of an intersecting quadrangle between the camera plane and the obstacle external connection rectangles included in each obstacle external connection frame information in the obstacle external connection frame information groups to serve as plane vertex coordinate groups, and obtaining a plane vertex coordinate group set;
respectively determining four ray intersection point coordinates of four plane vertex coordinates included by each plane vertex coordinate set in the plane vertex coordinate set and intersected with the camera plane as ray intersection point coordinate sets by taking the traffic light coordinates of the tail of the target bus as a starting point, and obtaining a ray intersection point coordinate set;
generating a single-frame traffic light shielding area coordinate set based on the ray intersection point coordinate set and the plane vertex coordinate set;
and combining the coordinates of each single-frame traffic light shielding area in each generated single-frame traffic light shielding area coordinate set to generate a target bus tail traffic light shielding area coordinate set.
7. The method of claim 6, wherein before planning a current vehicle path in an occlusion area corresponding to the coordinate set of the occlusion area of the tail traffic light of the target bus to obtain a planned path, the method further comprises:
determining a distance value between the single-frame traffic light occlusion area coordinate in each single-frame traffic light occlusion area coordinate set in each generated single-frame traffic light occlusion area coordinate set and the corresponding target bus tail traffic light coordinate;
and determining the tail traffic light shielding area coordinates of the target bus with the corresponding distance values meeting the preset distance condition in the generated single-frame traffic light shielding area coordinate set as the tail traffic light shielding area coordinates of the screened tail of the bus to obtain a tail traffic light shielding area coordinate set of the screened tail of the bus.
8. A route planning device based on bus tail traffic lights comprises:
the acquisition unit is configured to respond to the fact that traffic light information of the tail of the target bus does not exist in the acquired target road image, and acquire a road image sequence to be detected and a road radar perception data sequence;
the identification unit is configured to identify the position of the target bus to each road radar sensing data in the road radar sensing data sequence to generate the external connecting frame information of the target bus, so as to obtain an external connecting frame information sequence of the target bus;
a first generating unit configured to generate a sequence of target bus tail traffic light position information based on the sequence of target bus external frame information;
the image detection unit is configured to perform image detection on each image to be detected in the road image sequence to be detected so as to generate an obstacle outer border frame information group and obtain an obstacle outer border frame information group sequence;
the second generation unit is configured to generate a coordinate set of a target bus tail traffic light shielding area based on the target bus tail traffic light position information sequence and the barrier external connection frame information group sequence;
the route planning unit is configured to plan a current vehicle route in an occlusion area corresponding to the coordinate set of the target bus tail traffic light occlusion area to obtain a planned route;
the method comprises the following steps that a shielding area corresponding to a target bus tail traffic light shielding area coordinate set carries out current vehicle path planning to obtain a planned path, and the method comprises the following steps:
acquiring a longitudinal traffic light coordinate and a transverse traffic light coordinate set corresponding to the position of the current vehicle;
generating a corresponding first projected area coordinate set for each obstacle external border information in each obstacle external border information group in the obstacle external border information group sequence and each transverse traffic light coordinate in the transverse traffic light coordinate group;
combining the generated first projected area coordinates in each first projected area coordinate set to generate a transverse traffic light shielding area coordinate set;
generating a longitudinal traffic light shielding area coordinate set based on the barrier external connection frame information group sequence and the longitudinal traffic light coordinate;
and performing path planning on an occlusion area corresponding to the coordinate set of the target bus tail traffic light occlusion area in a space-time coordinate system through the following formula to generate a planned path:
Figure 976157DEST_PATH_IMAGE001
wherein,
Figure 598899DEST_PATH_IMAGE002
the planned path is represented by a representation of the planned path,
Figure 12563DEST_PATH_IMAGE003
it is shown that the objective function is minimized,
Figure 771572DEST_PATH_IMAGE004
the objective representing the minimized objective function is the planned path in an iterative process,
Figure 245278DEST_PATH_IMAGE005
which is indicative of the time of day,
Figure 89738DEST_PATH_IMAGE006
representing the time instant corresponding to the first road image in said sequence of road images,
Figure 41513DEST_PATH_IMAGE007
representing the time instant corresponding to the last road image in said sequence of road images,
Figure 920608DEST_PATH_IMAGE008
a loss function for generating a loss value for a path coordinate on the planned path,
Figure 565215DEST_PATH_IMAGE009
on the representation of the planned path
Figure 896971DEST_PATH_IMAGE005
The coordinates of the path at the time of day,
Figure 918016DEST_PATH_IMAGE010
on the representation of the planned path
Figure 386038DEST_PATH_IMAGE005
The loss value of the path coordinates of the time instant,
Figure 201547DEST_PATH_IMAGE011
an abscissa value representing coordinates of a path on the planned path,
Figure 20599DEST_PATH_IMAGE012
represents a current vehicle traffic efficiency loss function,
Figure 579756DEST_PATH_IMAGE013
is shown in
Figure 167863DEST_PATH_IMAGE006
To
Figure 888695DEST_PATH_IMAGE007
A loss value of traffic efficiency of the planned path within the time period,
Figure 195042DEST_PATH_IMAGE014
presentation tradeThe number of passes loss function is a function of,
Figure 433257DEST_PATH_IMAGE015
a weight coefficient representing the movement velocity term,
Figure 266083DEST_PATH_IMAGE016
a weight coefficient representing the number of passes of the conversion,
Figure 33182DEST_PATH_IMAGE017
is shown in
Figure 951460DEST_PATH_IMAGE006
To
Figure 993365DEST_PATH_IMAGE007
Paths within a time period
Figure 415119DEST_PATH_IMAGE018
The number of times of lane changes is required,
Figure 612839DEST_PATH_IMAGE019
on the representation of the planned path
Figure 752833DEST_PATH_IMAGE005
The abscissa value of the path coordinate of the time,
Figure 332850DEST_PATH_IMAGE020
a ordinate value representing a path coordinate on the planned path,
Figure 874690DEST_PATH_IMAGE021
on the representation of the planned path
Figure 249171DEST_PATH_IMAGE005
The ordinate value of the path coordinate of the time,
Figure 876461DEST_PATH_IMAGE022
Figure 994590DEST_PATH_IMAGE023
representing a preset weight value for participating in calculating a loss value,
Figure 390936DEST_PATH_IMAGE024
representing the coordinate set of the traffic light shielding area at the tail of the target bus,
Figure 936318DEST_PATH_IMAGE025
representing the set of lateral traffic light occlusion region coordinates,
Figure 50905DEST_PATH_IMAGE026
representing the coordinate set of the longitudinal traffic light shielding area, other representing path coordinates do not belong to the coordinate set of the target transverse traffic light shielding area and the coordinate set of the target longitudinal traffic light shielding area,
Figure 972724DEST_PATH_IMAGE027
represents a 2-way expression.
9. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon,
when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-7.
10. A computer-readable medium, on which a computer program is stored, wherein the computer program, when being executed by a processor, carries out the method according to any one of claims 1-7.
CN202211365316.1A 2022-11-03 2022-11-03 Route planning method, device, equipment and medium based on bus tail traffic light Active CN115507873B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211365316.1A CN115507873B (en) 2022-11-03 2022-11-03 Route planning method, device, equipment and medium based on bus tail traffic light

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211365316.1A CN115507873B (en) 2022-11-03 2022-11-03 Route planning method, device, equipment and medium based on bus tail traffic light

Publications (2)

Publication Number Publication Date
CN115507873A CN115507873A (en) 2022-12-23
CN115507873B true CN115507873B (en) 2023-03-10

Family

ID=84513283

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211365316.1A Active CN115507873B (en) 2022-11-03 2022-11-03 Route planning method, device, equipment and medium based on bus tail traffic light

Country Status (1)

Country Link
CN (1) CN115507873B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007233864A (en) * 2006-03-02 2007-09-13 Denso Corp Dead angle support information notification device and program
CN103366588A (en) * 2013-07-23 2013-10-23 苏州卫生职业技术学院 Bus tail portion traffic light caution light
CN110542931A (en) * 2018-05-28 2019-12-06 北京京东尚科信息技术有限公司 traffic light detection method and device, electronic equipment and computer readable medium
CN214376955U (en) * 2021-02-03 2021-10-08 北京千方科技股份有限公司 Public traffic display system of traffic lights
WO2021246534A1 (en) * 2020-06-01 2021-12-09 엘지전자 주식회사 Route providing device and route providing method therefor

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111522350B (en) * 2020-07-06 2020-10-09 深圳裹动智驾科技有限公司 Sensing method, intelligent control equipment and automatic driving vehicle
US11733054B2 (en) * 2020-12-11 2023-08-22 Motional Ad Llc Systems and methods for implementing occlusion representations over road features

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007233864A (en) * 2006-03-02 2007-09-13 Denso Corp Dead angle support information notification device and program
CN103366588A (en) * 2013-07-23 2013-10-23 苏州卫生职业技术学院 Bus tail portion traffic light caution light
CN110542931A (en) * 2018-05-28 2019-12-06 北京京东尚科信息技术有限公司 traffic light detection method and device, electronic equipment and computer readable medium
WO2021246534A1 (en) * 2020-06-01 2021-12-09 엘지전자 주식회사 Route providing device and route providing method therefor
CN214376955U (en) * 2021-02-03 2021-10-08 北京千方科技股份有限公司 Public traffic display system of traffic lights

Also Published As

Publication number Publication date
CN115507873A (en) 2022-12-23

Similar Documents

Publication Publication Date Title
Geyer et al. A2d2: Audi autonomous driving dataset
CN111079619B (en) Method and apparatus for detecting target object in image
US11217012B2 (en) System and method for identifying travel way features for autonomous vehicle motion control
CN111542860B (en) Sign and lane creation for high definition maps of autonomous vehicles
CN111874006B (en) Route planning processing method and device
WO2021003452A1 (en) Determination of lane connectivity at traffic intersections for high definition maps
CN112740268B (en) Target detection method and device
CN108509820B (en) Obstacle segmentation method and device, computer equipment and readable medium
CN115540896B (en) Path planning method and device, electronic equipment and computer readable medium
CN112258519B (en) Automatic extraction method and device for way-giving line of road in high-precision map making
WO2021003487A1 (en) Training data generation for dynamic objects using high definition map data
JP2022003508A (en) Trajectory planing model training method and device, electronic apparatus, computer-readable storage medium, and computer program
US11699234B2 (en) Semantic segmentation ground truth correction with spatial transformer networks
US20230049383A1 (en) Systems and methods for determining road traversability using real time data and a trained model
CN114550116A (en) Object identification method and device
CN115468578B (en) Path planning method and device, electronic equipment and computer readable medium
CN114972758A (en) Instance segmentation method based on point cloud weak supervision
CN117470258A (en) Map construction method, device, equipment and medium
CN115507873B (en) Route planning method, device, equipment and medium based on bus tail traffic light
US12026954B2 (en) Static occupancy tracking
CN115468579B (en) Path planning method and device, electronic equipment and computer readable medium
CN114643984A (en) Driving risk avoiding method, device, equipment, medium and product
CN115376365B (en) Vehicle control method, device, electronic equipment and computer readable medium
US20240221386A1 (en) Occupancy tracking based on depth information
CN116740682B (en) Vehicle parking route information generation method, device, electronic equipment and readable medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant