CN113012445A - Intelligent traffic control system and control method thereof - Google Patents
- Publication number
- CN113012445A (application CN201911314761.3A)
- Authority
- CN
- China
- Prior art keywords
- information
- traffic
- unit
- dynamic map
- vehicle
- Prior art date
- Legal status: Pending (an assumption, not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/07—Controlling traffic signals
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0968—Systems involving transmission of navigation instructions to the vehicle
Abstract
An embodiment of the application provides an intelligent traffic control system and a control method thereof. The intelligent traffic control system comprises: a detection unit that detects information relating to traffic elements based on the detection results of roadside sensing devices and vehicle-mounted sensing devices; a fusion unit that fuses, in space and/or time, the traffic-element information detected by the detection unit from the detection results of different sensing devices; a positioning unit that performs positioning according to the processing result of the fusion unit to obtain the position information of the traffic elements; and a dynamic map unit that updates the information of a dynamic map based on that position information, the updated dynamic map being transmitted to the subjects participating in traffic.
Description
Technical Field
The present application relates to the field of electronic information technology.
Background
With the development of automatic driving technology, more and more autonomous vehicles will be on the road, and the realization of the automatic driving function depends on various kinds of road information being available.
Currently, there are intelligent traffic control systems capable of providing road information and/or performing traffic control. For example, some intelligent traffic control systems can determine the congestion condition of a road; some can control traffic lights according to the traffic flow on a road; and some can send real-time traffic information to vehicles travelling on the road.
It should be noted that the above background description is provided only for a clear and complete description of the technical solutions of the present application and to aid the understanding of those skilled in the art. Those solutions are not to be considered known to the person skilled in the art merely because they are set forth in this background section.
Disclosure of Invention
The inventors of the present application have found that existing intelligent traffic control systems have limited functionality. For example, existing systems can provide only simple information to a vehicle and cannot supply the richer information required for automatic driving; moreover, it is difficult for them to ensure the safety of pedestrians and vehicles on roads, especially at intersections.
An embodiment of the application provides an intelligent traffic control system and a control method thereof. The intelligent traffic control system can update a dynamic map based on the detection results of the vehicle-mounted sensing devices of vehicles travelling on the road and of the roadside sensing devices, and sends the updated dynamic map to each subject participating in traffic. Traffic information can thus be provided to each traffic participant in time, the safety of pedestrians and vehicles is protected, and the automatic driving of vehicles is facilitated.
According to a first aspect of embodiments of the present application, there is provided an intelligent traffic control system, including:
a detection (detection) unit that detects information about a traffic element (traffic element) based on detection results of the roadside sensing device and the vehicle-mounted sensing device;
a fusion (fusion) unit that spatially and/or temporally fuses the information on the traffic element detected by the detection unit based on the detection results of different sensing devices;
a positioning (positioning) unit that performs positioning according to a processing result of the fusion (fusion) unit to obtain position information of the traffic element; and
a dynamic map (dynamic map) unit that updates information of a dynamic map based on the location information of the traffic element, the updated dynamic map being transmitted to a subject participating in traffic,
wherein the information related to traffic elements comprises:
information of vehicles, and/or information of pedestrians, and/or information of traffic lights, and/or information of traffic police gestures, and/or information of road signs, and/or information of lane lines, and/or information of zebra stripes, and/or information of detection results of events.
According to a second aspect of embodiments of the present application, there is provided a control method of an intelligent traffic control system, the control method of the intelligent traffic control system including:
detecting information related to a traffic element (traffic element) based on detection results of the roadside sensing device and the vehicle-mounted sensing device;
fusing information related to traffic elements in space and/or time;
positioning according to the fused processing result to obtain the position information of the traffic element; and
updating information of a dynamic map based on the location information of the traffic element, the updated dynamic map being transmitted to a subject participating in traffic,
wherein the information related to traffic elements comprises:
information of vehicles, and/or information of pedestrians, and/or information of traffic lights, and/or information of traffic police gestures, and/or information of road signs, and/or information of lane lines, and/or information of zebra stripes, and/or information of detection results of events.
The beneficial effect of this application lies in the following: the intelligent traffic control system can update the dynamic map based on the detection results of the vehicle-mounted sensing devices of vehicles travelling on the road and of the roadside sensing devices, and send the updated dynamic map to each subject participating in traffic, so that traffic information is provided to each traffic participant in time, the safety of pedestrians and vehicles is protected, and the automatic driving of vehicles is facilitated.
Specific embodiments of the present invention are disclosed in detail with reference to the following description and drawings, indicating the manner in which the principles of the invention may be employed. It should be understood that the embodiments of the invention are not so limited in scope. The embodiments of the invention include many variations, modifications and equivalents within the spirit and scope of the appended claims.
Features that are described and/or illustrated with respect to one embodiment may be used in the same way or in a similar way in one or more other embodiments, in combination with or instead of the features of the other embodiments.
It should be emphasized that the term "comprises/comprising" when used herein, is taken to specify the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps or components.
Drawings
The accompanying drawings, which are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention. It is obvious that the drawings in the following description are only some embodiments of the invention, and that for a person skilled in the art, other drawings can be derived from them without inventive effort. In the drawings:
fig. 1 is a schematic diagram of an intelligent traffic control system of a first aspect of an embodiment of the present application;
FIG. 2 is a schematic diagram of the processing performed by the edge calculation unit within the roadside unit;
FIG. 3 is a schematic diagram of the processing performed by the in-vehicle edge calculation unit;
fig. 4 is a schematic diagram of a control method of an intelligent traffic control system according to a second aspect of the embodiment of the present application.
Detailed Description
The foregoing and other features of the invention will become apparent from the following description taken in conjunction with the accompanying drawings. In the description and drawings, particular embodiments of the invention have been disclosed in detail as being indicative of some of the embodiments in which the principles of the invention may be employed, it being understood that the invention is not limited to the embodiments described, but, on the contrary, is intended to cover all modifications, variations, and equivalents falling within the scope of the appended claims.
In the embodiments of the present application, the terms "first", "second", and the like are used for distinguishing different elements by reference, but do not denote a spatial arrangement, a temporal order, or the like of the elements, and the elements should not be limited by the terms. The term "and/or" includes any and all combinations of one or more of the associated listed terms. The terms "comprising," "including," "having," and the like, refer to the presence of stated features, elements, components, and do not preclude the presence or addition of one or more other features, elements, components, and elements.
In the embodiments of the present application, the singular forms "a" and "an" include the plural forms and are to be construed broadly as "a kind of" rather than limited to the meaning of "one"; furthermore, the term "the" should be understood to include both the singular and the plural, unless the context clearly dictates otherwise. Further, the term "according to" should be understood as "at least partially according to", and the term "based on" as "based at least partially on", unless the context clearly dictates otherwise.
First aspect of the embodiments
A first aspect of an embodiment of the present application provides an intelligent traffic control system.
Fig. 1 is a schematic diagram of an intelligent traffic control system of a first aspect of an embodiment of the present application, and as shown in fig. 1, the intelligent traffic control system 1 includes: a detection (detection) unit 11, a fusion (fusion) unit 12, a positioning (localization) unit 13, and a dynamic map (dynamic map) unit 14.
In at least one embodiment, the detection unit 11 is capable of detecting information about traffic elements (traffic elements) based on detection results of the on-vehicle sensor device and the roadside sensor device.
In at least one embodiment, the fusion unit 12 fuses (fusion), in space and/or time, the information related to the traffic element detected by the detection unit 11 based on the detection results of different sensing devices.
In at least one embodiment, the positioning unit 13 performs positioning processing according to the processing result of the fusion unit 12 to obtain the position information of the traffic element.
In at least one embodiment, the dynamic map unit 14 updates information of the dynamic map based on the position information of the traffic element, and the updated dynamic map may be transmitted to the respective subjects participating in the traffic.
According to the first aspect of the embodiment of the application, the intelligent traffic control system can update the dynamic map based on the detection results of the vehicle-mounted sensing equipment and the roadside sensing equipment of the vehicle running on the road, and send the updated dynamic map to each main body participating in traffic, so that traffic information can be provided to each main body participating in traffic in time, safety of pedestrians and vehicles is guaranteed, and automatic driving of the vehicle is facilitated.
In the first aspect of the embodiment of the present application, the detection unit 11 is capable of detecting information related to a traffic element (traffic element) based on detection results of the vehicle-mounted sensing device and the roadside sensing device.
The vehicle-mounted sensing devices include, for example: vehicle-mounted cameras, and/or vehicle-mounted laser sensors, and/or vehicle-mounted radar, etc. The roadside sensing devices include, for example: monitoring cameras above or beside the road, e.g., fixed cameras or pan-tilt-zoom (PTZ) cameras; and/or radar above or beside the road, e.g., millimeter-wave radar; and/or other sensors such as lidar (Lidar), time-of-flight (TOF) sensors, infrared detectors, etc.
The information related to the traffic elements detected by the detection unit 11 may include: information of vehicles, for example the kind of vehicle, i.e., car, van, minibus, truck, motorcycle, bicycle, etc.; and/or pedestrian information; and/or traffic light information; and/or information of the gestures of a traffic police officer; and/or road sign information; and/or lane line information, e.g., solid lines, dashed lines, etc.; and/or zebra crossing information; and/or event detection result information, for example congestion (jam), traffic accident (accident), lane change or avoidance (avoid), wrong-way driving (reverse), vehicle counting (counting), low speed (low speed), high speed (high speed), disabled person (disabled), elderly person (old), child (child), occlusion (occlusion), danger (risk), smoke (smoke), fog (fog), visibility (visibility), and the like.
In at least one embodiment, the input to the detection unit 11 may be the detection results of the roadside sensing device and the vehicle-mounted sensing device, such as image frames captured by a camera and/or a point cloud (point cloud) detected by a radar. The detection unit 11 may output the information related to the traffic element in the following forms: information of a bounding box (bounding box) of a traffic element, for example the coordinates (x, y) of the centre point of the bounding box together with its height h and width w; text information indicating a vehicle type; the speed of a moving object among the traffic elements (e.g., detected from radar data); the motion trajectory (trajectory) of a traffic element; coordinate information representing the position at which an event occurred, and/or text information representing the type of the event; text information indicating whether smoke or fog is present in the environment; and the like.
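As an illustration only, the detection output described above might be represented as a simple record; the field names below are hypothetical and not part of the application:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Detection:
    """One detection-unit output for a traffic element (hypothetical schema)."""
    cx: float                               # bounding-box centre x (pixels)
    cy: float                               # bounding-box centre y (pixels)
    w: float                                # bounding-box width (pixels)
    h: float                                # bounding-box height (pixels)
    label: str                              # e.g. "car", "pedestrian"
    speed_mps: Optional[float] = None       # from radar data, if available
    trajectory: List[Tuple[float, float]] = field(default_factory=list)

det = Detection(cx=412.0, cy=233.5, w=80.0, h=48.0, label="car", speed_mps=12.4)
print(det.label, det.speed_mps)
```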
In at least one embodiment, as shown in fig. 1, the detection unit 11 may include: a roadside detection unit 111 and/or an on-board detection unit 112. The roadside detection unit 111 may be disposed, for example, in an Edge Computing Unit (ECU) on the road side (Road Side), in which a Graphics Processing Unit (GPU), a Field-Programmable Gate Array (FPGA), and/or the like may be provided; the on-board detection unit 112 may be provided in an Edge Computing Unit (ECU) of the vehicle, in which a GPU, an FPGA, or the like may likewise be provided.
In at least one embodiment, the calculation rate of the detection unit 11 may be, for example, at least 30 frames per second: the detection results for at least 30 frames of images are calculated per second, and the calculation result is output at least 30 times per second. Thus, when the speed of a vehicle is 60 km/h, the vehicle advances 16.7 meters per second, and travels about 0.56 meter during the time the detection unit 11 processes one frame.
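The frame-rate arithmetic above can be checked with a small helper (illustrative only):

```python
def metres_per_frame(speed_kmh: float, fps: float) -> float:
    """Distance a vehicle covers during the processing of one image frame."""
    return speed_kmh * 1000.0 / 3600.0 / fps

# At 60 km/h and 30 frames per second, the vehicle moves about 0.56 m per frame.
print(round(metres_per_frame(60, 30), 2))
```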
Further, in at least one embodiment, when the speed of a traffic element (e.g., a pedestrian or a vehicle) is to be detected, target tracking and trajectory extraction (trajectory extraction) are required for that traffic element.
In at least one embodiment, the delay (delay) incurred by the detection unit 11 may be less than 20 milliseconds (ms).
In at least one embodiment, the fusion unit 12 may spatially and/or temporally fuse (fusion) the information related to the traffic element obtained by the detection unit 11. Through the fusion processing of the fusion unit 12, the accuracy (precision) of the camera's imaging-depth (depth) information and of the speed information can be improved, thereby reducing detection errors in severe weather and poor lighting conditions.
In one embodiment, the fusion unit 12 may fuse the bounding box of the traffic element detected from the images of the monitoring camera and the vehicle-mounted camera with the detection result from the radar or other sensor, that is, perform the first fusion. For example, the bounding box of the traffic element detected from the images of the monitoring camera and the vehicle-mounted camera and the detection result from the radar or other sensor are matched and aligned with each other in time and/or space, so that the accurate speed of the traffic element or the position of the bounding box in the image, or the like, is obtained.
In another embodiment, the fusion unit 12 may further perform a second fusion on the basis of the first fusion, that is, fuse the result of the first fusion with geographic location information, such as information of a Global Positioning System (GPS), to obtain a second fusion result. For example, the result of the first fusion is that the vehicle travels in a wide area, and the information of the geographical position shows that a part of the wide area is a green belt of the roadside, whereby the result of the second fusion is that the traveling area of the vehicle is limited on the road beside the green belt.
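A minimal sketch of the "first fusion" step described above, under the assumptions that radar detections have already been projected into the image plane and that a simple nearest-neighbour gate suffices for association (the gate value and data layout are illustrative, not part of the application):

```python
import math

def first_fusion(boxes, radar_dets, gate_px=50.0):
    """Associate camera bounding-box centres with radar detections.

    boxes:      list of (cx, cy) bounding-box centres in image pixels
    radar_dets: list of (u, v, speed_mps), radar detections projected
                into the image plane with their measured speeds
    Returns one fused record per box; speed is None when no radar
    detection falls within the association gate.
    """
    fused = []
    for cx, cy in boxes:
        best_speed, best_dist = None, gate_px
        for u, v, speed in radar_dets:
            d = math.hypot(cx - u, cy - v)
            if d < best_dist:
                best_speed, best_dist = speed, d
        fused.append({"cx": cx, "cy": cy, "speed_mps": best_speed})
    return fused

print(first_fusion([(100.0, 100.0)], [(105.0, 98.0, 13.9)]))
```

In a real system the second fusion would then clip or validate these records against geographic information (e.g., excluding positions that fall on a green belt).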
In at least one implementation, the fusion Unit 12 may be provided in an Edge Computing Unit (ECU) of the Road Side (Road Side).
In the first aspect of the embodiment of the present application, the positioning unit 13 can perform positioning according to the processing result of the fusion unit 12, and obtain the position information of the traffic element. For example, the positioning unit 13 calculates the position of the traffic element in the fusion result of the fusion unit 12 on the dynamic map, which is, for example, a coordinate value or the like on the dynamic map.
As shown in fig. 1, the positioning unit 13 may include: roadside locating units 131. The roadside locating Unit 131 may be provided in an Edge Computing Unit (ECU) of the roadside (Road Side), for example. The positioning unit 13 may further include: and an onboard positioning unit 132. The on-vehicle positioning Unit 132 may be provided in an Edge Computing Unit (ECU) of the vehicle, for example.
In at least one embodiment, the positioning unit 13 calculates the position of each traffic element on the dynamic map according to the position of that element in the fusion result of the fusion unit 12 and the corresponding position of the fusion result on the dynamic map. For example, in the fusion result, the position of a certain vehicle in a monitoring image is represented as image coordinates (x_image, y_image). The monitoring image is captured by a roadside monitoring camera with a fixed position, so the range covered by the image on the dynamic map is known; the roadside positioning unit 131 can therefore determine the vehicle's location on the dynamic map, represented as map coordinates (x_map, y_map), from the correspondence between the image's coverage and the corresponding range on the dynamic map together with the vehicle's position in the monitoring image. The position of a traffic element on the image plane is thus translated into a position on the map plane. The fusion result may include images captured by several monitoring cameras as well as the detection results of sensors such as radar, and the images captured by the monitoring cameras may partially overlap, which helps avoid missed detections of traffic elements. In addition, if the fusion result contains three-dimensional information about a traffic element, the component perpendicular to the image plane may be retained.
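The image-plane-to-map-plane translation can be sketched as a simple linear mapping, under the simplifying assumption that the fixed camera's footprint is an axis-aligned rectangle on the map (a real deployment would more likely use a calibrated homography):

```python
def image_to_map(x_img, y_img, img_size, map_bounds):
    """Translate an image-plane position into dynamic-map coordinates.

    img_size:   (width, height) of the monitoring image in pixels
    map_bounds: ((x0, y0), (x1, y1)) map corners of the camera footprint,
                assumed axis-aligned for this sketch
    """
    w, h = img_size
    (x0, y0), (x1, y1) = map_bounds
    return (x0 + x_img / w * (x1 - x0),
            y0 + y_img / h * (y1 - y0))

# The image centre maps to the centre of the 40 m x 20 m footprint.
print(image_to_map(960, 540, (1920, 1080), ((0.0, 0.0), (40.0, 20.0))))
```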
In at least one embodiment, the positioning (localization) unit 13 may also calculate the position of each traffic element on the dynamic map in the image captured by the vehicle-mounted camera according to the position information of the vehicle-mounted camera on the dynamic map and the relative position of the vehicle and other traffic elements obtained based on the image captured by the vehicle-mounted camera. For example, the position information of the vehicle-mounted camera of the vehicle on the dynamic map is known from the GPS information of the vehicle, and a pedestrian is present 10 meters away from the vehicle in the image captured by the vehicle-mounted camera, so that the vehicle-mounted positioning unit 132 can calculate the position information of the pedestrian on the dynamic map based on the relative position of the pedestrian and the vehicle in the image captured by the vehicle-mounted camera and the position information of the vehicle-mounted camera on the dynamic map.
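The pedestrian example above can be sketched as follows, assuming the relative position is available as a distance and bearing in the vehicle frame (all parameter names are illustrative):

```python
import math

def pedestrian_map_position(cam_x, cam_y, heading_deg, dist_m, bearing_deg):
    """Map position of a pedestrian from the on-board camera's GPS position,
    the vehicle heading, and the pedestrian's distance/bearing relative to
    the camera (a sketch; angles measured counter-clockwise from map x-axis)."""
    theta = math.radians(heading_deg + bearing_deg)
    return cam_x + dist_m * math.cos(theta), cam_y + dist_m * math.sin(theta)

# A pedestrian 10 m straight ahead of a vehicle at (100, 200) heading along x.
x, y = pedestrian_map_position(100.0, 200.0, 0.0, 10.0, 0.0)
print(round(x, 1), round(y, 1))
```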
In at least one embodiment, when the positioning unit 13 has both the roadside positioning unit 131 and the vehicle-mounted positioning unit 132, each may calculate the positions of the traffic elements on the dynamic map, and the union of the two calculation results may be used as the positioning result for each traffic element. When the roadside positioning unit 131 and the vehicle-mounted positioning unit 132 obtain different positioning results for the same traffic element, the result with the higher priority may be used, according to the following order: when the vehicle's GPS is a high-precision GPS, a positioning result based on the position information of the high-precision GPS has the highest priority; a positioning result based on the fusion of the detection results of the road's monitoring camera and radar has lower priority; and, when the vehicle's GPS is of normal accuracy, a positioning result obtained from the on-board camera's position on the dynamic map and the relative positions of the vehicle and other traffic elements in the images captured by the on-board camera has the lowest priority.
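The priority rule above might be implemented as a simple lookup; the source labels below are invented for illustration and do not appear in the application:

```python
# Lower number = higher priority, following the ordering given in the text.
PRIORITY = {
    "high_precision_gps": 0,   # high-precision GPS position information
    "roadside_fusion": 1,      # roadside monitoring camera + radar fusion
    "onboard_relative": 2,     # normal-accuracy GPS + camera relative position
}

def resolve(candidates):
    """Pick the positioning result with the highest priority for one traffic
    element; candidates is a list of (source_label, position) pairs."""
    return min(candidates, key=lambda c: PRIORITY[c[0]])[1]

pos = resolve([("onboard_relative", (12.1, 5.0)),
               ("roadside_fusion", (12.4, 5.2))])
print(pos)  # → (12.4, 5.2): the roadside fusion outranks the on-board estimate
```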
In the first aspect of the embodiment of the present application, the dynamic map unit 14 updates the information of the dynamic map based on the position information of the traffic element calculated by the positioning unit 13. For example, the dynamic map unit 14 updates the information of the dynamic map using the position and the travel speed of the vehicle and the position and the movement speed of the pedestrian.
In at least one embodiment, the dynamic map unit 14 updates the information of the dynamic map at a frequency of at least 20 Hz, i.e., at least 20 updates per second.
In at least one embodiment, the dynamic map unit 14 may be provided in an Edge Computing Unit (ECU) on the road side (Road Side) or in a cloud (cloud) server. For example, when the dynamic map unit 14 is provided in the roadside edge computing unit, the roadside unit (RSU) may send the updated information of the dynamic map to each subject participating in traffic; for another example, when the dynamic map unit 14 is provided in a cloud server, the roadside unit may transmit the position information of the traffic elements to the cloud server, receive the updated dynamic map information from the cloud server, and forward it to each subject participating in traffic. In addition, when the dynamic map unit 14 is in the roadside edge computing unit, the delay of the intelligent traffic control system is shorter; when it is in the cloud server, the delay is longer.
In at least one embodiment, the updated information of the dynamic map is sent to the various entities participating in the traffic. The subject participating in the traffic may be, for example, a vehicle to which updated dynamic map information is transmitted when the distance of the vehicle to the intersection is less than a predetermined distance (e.g., 100 meters), for example. Further, the updated information of the dynamic map may also be transmitted to the pedestrian, for example, the dynamic map unit 14 transmits the dynamic map information to the mobile device of the pedestrian within a predetermined range of the intersection.
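The distance-gated broadcast to traffic participants near an intersection could look like the following sketch (the intersection coordinates, the 100 m threshold from the text, and the subject record format are illustrative assumptions):

```python
import math

INTERSECTION = (0.0, 0.0)   # illustrative intersection position on the map
RANGE_M = 100.0             # predetermined distance mentioned in the text

def recipients(subjects):
    """Return the ids of subjects (id, x, y) close enough to the intersection
    to receive the updated dynamic map."""
    return [sid for sid, x, y in subjects
            if math.hypot(x - INTERSECTION[0], y - INTERSECTION[1]) < RANGE_M]

# car-1 is 50 m away (inside the gate); car-2 is about 120 m away (outside).
print(recipients([("car-1", 30.0, 40.0), ("car-2", 90.0, 80.0)]))
```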
In at least one embodiment, the position information of the traffic element obtained by the dynamic map unit 14 is text information, and the text information may be stored in at least one layer of the dynamic map, thereby saving the storage capacity. For example, the position and/or speed information of each traffic element is stored in text in one layer of the dynamic map. In addition, the position information of the traffic element may also include image information corresponding to the text information, and the image information may also be stored in at least one layer of the dynamic map.
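Storing positions and speeds as text in one layer of the dynamic map might, for instance, use compact JSON strings; the layer name and schema here are assumed, not specified by the application:

```python
import json

dynamic_map = {"layers": {"traffic_elements": []}}

def update_layer(elements):
    """Write each traffic element's position/speed as a compact text record
    into one layer of the dynamic map (a minimal sketch of the text-layer
    idea; field names are hypothetical)."""
    dynamic_map["layers"]["traffic_elements"] = [
        json.dumps({"id": e["id"], "pos": e["pos"], "speed": e.get("speed")})
        for e in elements
    ]

update_layer([{"id": "ped-7", "pos": [110.0, 200.0], "speed": 1.4}])
print(dynamic_map["layers"]["traffic_elements"][0])
```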
In the first aspect of the embodiment of the present application, as illustrated in fig. 1, the intelligent traffic control system 1 further includes: a communication unit 15. The communication unit 15 is used for communication between the sensing devices and the detection unit 11, and/or between the positioning unit 13 and the dynamic map unit 14, and/or between the dynamic map unit 14 and the subjects participating in traffic.
In one embodiment, when the dynamic map unit 14 is located at a different position, the communication configuration may change accordingly. For example, when the dynamic map unit 14 is provided in an Edge Computing Unit (ECU), the communication unit 15 may be based on an LTE-V2X (PC5) communication architecture. For another example, when the dynamic map unit 14 is provided in a cloud server, the communication unit 15 may be based on an LTE/5G communication architecture (V2I, V2N, V2P, or V2V).
In another embodiment, the transmitter in the communication unit 15 may perform uploading (upload). For example, the roadside unit may transmit the detection or fusion results (e.g., the positions of traffic elements) to the dynamic map unit 14 through the communication unit 15; for another example, the vehicle-mounted device may transmit, through the communication unit 15, information such as the positions of traffic elements detected by the vehicle's edge computing unit and/or the vehicle's GPS information.
In another embodiment, the transmitter in the communication unit 15 may perform downloading (download). For example, the roadside unit may send the dynamic map to a vehicle or to a pedestrian's mobile terminal; for another example, the cloud server may send the dynamic map to a vehicle, and/or a pedestrian's mobile terminal, and/or a traffic monitoring center.
In yet another embodiment, the receiver in the communication unit 15 may perform the reception of data. For example, the mobile terminal of the vehicle and/or the pedestrian and/or the traffic monitoring center may receive information of the dynamic map from the road side unit or the cloud server, where the received information of the dynamic map may be, for example, text information for indicating a position and/or a movement speed of a traffic element, and further, the received information of the dynamic map may include picture information.
In at least one embodiment, the communication delay of the communication unit 15 is less than 20 milliseconds.
In the first aspect of the embodiment of the present application, the units of the intelligent traffic control system 1 need not be concentrated in the same hardware device; they may instead be distributed between the edge computing unit of the roadside unit and the vehicle-mounted edge computing unit, which together implement the functions of the intelligent traffic control system 1 of the present application.
FIG. 2 is a schematic diagram of the processing performed by the edge calculation unit within the roadside unit. As shown in fig. 2, the edge calculation unit 20 in the roadside unit may have a CPU 201 and/or a GPU 202 therein. As shown in fig. 2, the processing of the edge calculation unit 20 may include the following operations:
an operation 21 of performing detection of a traffic element, detection of a feature (e.g., a kind and/or a color, etc.) of the traffic element, event detection, and detection of a position of the traffic element in an image based on an image acquired by a monitoring camera provided on a road, wherein the number of the monitoring cameras is at least one;
in operation 27, broadcasting (broadcasting) the updated information of the dynamic map, for example, as shown in fig. 2, the roadside unit sends the updated information of the dynamic map to the vehicle in the form of text.
Fig. 3 is a schematic diagram of the processing performed by the in-vehicle edge calculation unit. As shown in fig. 3, the in-vehicle edge calculation unit 30 may have therein a CPU 301 and/or a GPU 302. As shown in fig. 3, the processing of the edge calculation unit 30 may include the following operations:
in operation 35, the updated information of the dynamic map is transmitted, for example, as shown in fig. 3, the on-board edge calculation unit 30 transmits the updated information of the dynamic map to the roadside unit in the form of text and/or image.
According to the first aspect of the embodiment of the application, the intelligent traffic control system can update the dynamic map based on the detection results of the vehicle-mounted sensing equipment and the roadside sensing equipment of the vehicle running on the road, and send the updated dynamic map to each main body participating in traffic, so that traffic information can be provided to each main body participating in traffic in time, safety of pedestrians and vehicles is guaranteed, and automatic driving of the vehicle is facilitated.
Second aspect of the embodiments
A second aspect of the embodiments of the present application provides a control method of an intelligent traffic control system, which corresponds to the intelligent traffic control system of the first aspect of the embodiments of the present application.
Fig. 4 is a schematic diagram of a control method of an intelligent traffic control system according to a second aspect of an embodiment of the present application, and as shown in fig. 4, the control method of the intelligent traffic control system includes:
an operation 42 of spatially and/or temporally fusing information related to the traffic elements;
In at least one embodiment, the information related to traffic elements includes: information of vehicles, and/or information of pedestrians, and/or information of traffic lights, and/or information of traffic police gestures, and/or information of road signs, and/or information of lane lines, and/or information of zebra stripes, and/or information of detection results of events.
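A minimal sketch of operation 42 and the subsequent positioning and map-update steps may help fix ideas; the identifier-based merge, the placeholder coordinates, and the data layout are all illustrative assumptions, not the patented algorithms:

```python
def fuse(roadside, onboard):
    """Spatial/temporal fusion, sketched here as merging detections of
    the same element (shared identifier) seen by different sensors."""
    merged = {d["id"]: d for d in roadside + onboard}
    return list(merged.values())

def position(fused):
    """Positioning step: attach a dynamic-map coordinate to each fused
    element (stubbed with a placeholder position)."""
    return [{**d, "map_pos": (0.0, 0.0)} for d in fused]

def update_map(dynamic_map, positioned):
    """Map-update step: write each positioned element into the dynamic map."""
    for d in positioned:
        dynamic_map[d["id"]] = d
    return dynamic_map

# Example: the same vehicle seen by both roadside and on-board sensors
# collapses to a single map entry after fusion.
roadside = [{"id": "veh-1", "kind": "vehicle"},
            {"id": "ped-3", "kind": "pedestrian"}]
onboard = [{"id": "veh-1", "kind": "vehicle"}]
dm = update_map({}, position(fuse(roadside, onboard)))
```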
According to the second aspect of the embodiments of the present application, the intelligent traffic control system can update the dynamic map based on the detection results of the vehicle-mounted sensing devices of vehicles traveling on the road and of the roadside sensing devices, and transmit the updated dynamic map to each subject participating in traffic. Traffic information can thus be provided to each such subject in a timely manner, helping to ensure the safety of pedestrians and vehicles and to facilitate automated driving.
An embodiment of the present application further provides a computer-readable program which, when executed in an intelligent traffic control system, causes the intelligent traffic control system to execute the control method of the intelligent traffic control system according to the second aspect of the embodiments.
An embodiment of the present application further provides a storage medium storing a computer-readable program, where the computer-readable program causes an intelligent traffic control system to execute the control method of the intelligent traffic control system according to the second aspect of the embodiments.
The apparatuses described in connection with the embodiments of the present application may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. For example, one or more of the functional blocks shown in the figures, and/or one or more combinations of those functional blocks, may correspond to respective software modules of a computer program flow, or to respective hardware modules. These software modules may respectively correspond to the operations shown in the second aspect of the embodiments. These hardware modules may be implemented, for example, by burning the corresponding software modules into a field-programmable gate array (FPGA).
A software module may reside in RAM, flash memory, ROM, EPROM, EEPROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. A storage medium may be coupled to the processor such that the processor can read information from, and write information to, the storage medium; alternatively, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The software module may be stored in the memory of a mobile terminal or in a memory card insertable into the mobile terminal. For example, if the electronic device employs a large-capacity MEGA-SIM card or a large-capacity flash memory device, the software module may be stored in that MEGA-SIM card or flash memory device.
One or more of the functional blocks described with respect to the figures, and/or one or more combinations of those functional blocks, may be implemented as a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any suitable combination thereof designed to perform the functions described herein. They may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The present application has been described in conjunction with specific embodiments, but it should be understood by those skilled in the art that these descriptions are intended to be illustrative, and not limiting. Various modifications and adaptations of the present application may occur to those skilled in the art based on the teachings herein and are within the scope of the present application.
With respect to the embodiments including the above embodiments, the following remarks are also disclosed:
1. an intelligent traffic control system, characterized in that the intelligent traffic control system comprises:
a detection unit that detects information about a traffic element based on detection results of a roadside sensing device and a vehicle-mounted sensing device, the vehicle-mounted sensing device including at least a vehicle-mounted camera, and the roadside sensing device including at least a surveillance camera and a radar;
a fusion unit that spatially and/or temporally fuses the information on the traffic element detected by the detection unit based on detection results of different sensing devices;
a positioning unit that performs positioning according to a processing result of the fusion unit to obtain position information of the traffic element; and
a dynamic map unit that updates information of a dynamic map based on the location information of the traffic element, the updated dynamic map being transmitted to a subject participating in traffic,
wherein the information related to traffic elements comprises:
information of vehicles, and/or information of pedestrians, and/or information of traffic lights, and/or information of traffic police gestures, and/or information of road signs, and/or information of lane lines, and/or information of zebra stripes, and/or information of detection results of events.
2. The intelligent traffic control system according to supplementary note 1, wherein,
the fusion unit fuses the bounding boxes of the traffic elements detected from the images of the surveillance camera and the vehicle-mounted camera with the detection results obtained from the radar or other sensors.
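A hedged sketch of the fusion described in supplementary note 2 above: associate each camera bounding box with the nearest radar return, assuming, purely for illustration, that the radar points have already been projected into the image plane; the nearest-neighbor rule and distance threshold are assumptions, not the disclosed method:

```python
import math

def associate(bboxes, radar_points, max_dist=50.0):
    """Pair each bounding box (x1, y1, x2, y2) with the nearest projected
    radar point (px, py) within max_dist pixels; unmatched boxes are skipped."""
    pairs = []
    for bi, (x1, y1, x2, y2) in enumerate(bboxes):
        cx, cy = (x1 + x2) / 2.0, (y1 + y2) / 2.0
        best, best_d = None, max_dist
        for ri, (px, py) in enumerate(radar_points):
            d = math.hypot(cx - px, cy - py)
            if d < best_d:
                best, best_d = ri, d
        if best is not None:
            pairs.append((bi, best))
    return pairs

# The first box matches the nearby radar point; the second has no return in range.
pairs = associate([(0, 0, 100, 100), (200, 200, 300, 300)], [(52, 48), (640, 480)])
```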
3. The intelligent traffic control system according to supplementary note 1, wherein,
the positioning unit calculates, on the dynamic map, the position of the traffic element contained in the fusion result of the fusion unit.
4. The intelligent traffic control system according to supplementary note 3, wherein,
the positioning unit calculates the position of each traffic element on the dynamic map according to the position of each traffic element in the fusion result of the fusion unit and the corresponding position of the fusion result in the dynamic map.
5. The intelligent traffic control system according to supplementary note 3, wherein,
the positioning unit further calculates, for each traffic element appearing in the image captured by the vehicle-mounted camera, its position on the dynamic map, according to the position information of the vehicle-mounted camera on the dynamic map and the relative positions of the vehicle and the other traffic elements obtained from that image.
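The computation in supplementary note 5 can be illustrated as below, assuming a flat local map frame with the camera's heading measured counter-clockwise from the map x-axis; this coordinate convention, and the (forward, left) offset layout, are assumptions made only for the sketch:

```python
import math

def element_map_position(cam_xy, cam_heading_rad, rel_forward, rel_left):
    """Place a traffic element on the map from the vehicle-mounted camera's
    own map position plus the element's camera-relative (forward, left)
    offset, by rotating the offset into map axes and translating."""
    cx, cy = cam_xy
    dx = rel_forward * math.cos(cam_heading_rad) - rel_left * math.sin(cam_heading_rad)
    dy = rel_forward * math.sin(cam_heading_rad) + rel_left * math.cos(cam_heading_rad)
    return (cx + dx, cy + dy)

# A pedestrian 10 m directly ahead of a camera facing along the map x-axis.
pos = element_map_position((100.0, 50.0), 0.0, 10.0, 0.0)
```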
6. The intelligent traffic control system according to supplementary note 1, wherein,
the frequency with which the dynamic map unit updates the information of the dynamic map is at least 20 Hz.
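A minimal sketch of holding the at-least-20 Hz update rate of supplementary note 6; the stub update callback and the fixed cycle count are assumptions for illustration:

```python
import time

def run_updates(update_fn, hz=20, cycles=3):
    """Invoke update_fn at the target frequency by sleeping away the
    remainder of each period after the update work completes."""
    period = 1.0 / hz
    for _ in range(cycles):
        start = time.monotonic()
        update_fn()
        time.sleep(max(0.0, period - (time.monotonic() - start)))

updates = []
run_updates(lambda: updates.append(time.monotonic()))
```

A real dynamic-map unit would likely be event- or sensor-driven rather than a fixed sleep loop; the sketch only shows the rate constraint.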
7. The intelligent traffic control system according to supplementary note 1, wherein,
the position information of the traffic element obtained by the dynamic map unit is text information or image information, and the position information is stored in at least one layer of the dynamic map.
8. The intelligent traffic control system according to supplementary note 1, wherein,
the dynamic map unit transmits information of the dynamic map to a mobile device of a vehicle or a pedestrian within a predetermined range of an intersection.
9. The intelligent traffic control system according to supplementary note 1, wherein the intelligent traffic control system further comprises:
a communication unit for communication between the sensing device and the detection unit, and/or between the positioning unit and the dynamic map unit, and/or between the dynamic map unit and a subject participating in traffic.
10. The intelligent traffic control system according to supplementary note 9, wherein,
the communication delay of the communication unit is less than 20 milliseconds.
11. A control method of an intelligent traffic control system is characterized by comprising the following steps:
detecting information related to traffic elements (traffic elements) based on detection results of roadside sensing devices and vehicle-mounted sensing devices, wherein the vehicle-mounted sensing devices at least comprise vehicle-mounted cameras, and the roadside sensing devices at least comprise monitoring cameras and radars;
fusing information related to traffic elements in space and/or time;
positioning according to the fused processing result to obtain the position information of the traffic element; and
updating information of a dynamic map based on the location information of the traffic element, the updated dynamic map being transmitted to a subject participating in traffic,
wherein the information related to traffic elements comprises:
information of vehicles, and/or information of pedestrians, and/or information of traffic lights, and/or information of traffic police gestures, and/or information of road signs, and/or information of lane lines, and/or information of zebra stripes, and/or information of detection results of events.
Claims (10)
1. An intelligent traffic control system, characterized in that the intelligent traffic control system comprises:
a detection unit that detects information relating to a traffic element based on detection results of a roadside sensing device and a vehicle-mounted sensing device, the vehicle-mounted sensing device including at least a vehicle-mounted camera, and the roadside sensing device including at least a surveillance camera and a radar;
a fusion unit that spatially and/or temporally fuses information on the traffic elements detected by the detection unit based on detection results of different sensing devices;
a positioning unit that performs positioning according to the processing result of the fusion unit to obtain the position information of the traffic element; and
a dynamic map unit that updates information of a dynamic map based on the position information of the traffic element, the updated dynamic map being transmitted to a subject participating in traffic,
wherein the information related to traffic elements comprises:
information of vehicles, and/or information of pedestrians, and/or information of traffic lights, and/or information of traffic police gestures, and/or information of road signs, and/or information of lane lines, and/or information of zebra stripes, and/or information of detection results of events.
2. The intelligent traffic control system of claim 1,
the fusion unit fuses the bounding boxes of the traffic elements detected from the images of the surveillance camera and the vehicle-mounted camera with the detection results obtained from the radar or other sensors.
3. The intelligent traffic control system of claim 1,
the positioning unit calculates, on the dynamic map, the position of the traffic element contained in the fusion result of the fusion unit.
4. The intelligent traffic control system of claim 3,
the positioning unit calculates the position of each traffic element on the dynamic map according to the position of each traffic element in the fusion result of the fusion unit and the corresponding position of the fusion result in the dynamic map.
5. The intelligent traffic control system of claim 3,
the positioning unit further calculates, for each traffic element appearing in the images captured by the vehicle-mounted camera, its position on the dynamic map, according to the position information of the vehicle-mounted camera on the dynamic map and the relative positions of the vehicle and the other traffic elements obtained from those images.
6. The intelligent traffic control system of claim 1,
the frequency with which the dynamic map unit updates the information of the dynamic map is at least 20 Hz.
7. The intelligent traffic control system of claim 1,
the position information of the traffic element obtained by the dynamic map unit is text information or image information, and the position information is stored in at least one layer of the dynamic map.
8. The intelligent traffic control system of claim 1,
the dynamic map unit transmits information of the dynamic map to a mobile device of a vehicle or a pedestrian within a predetermined range of an intersection.
9. The intelligent traffic control system according to claim 1, wherein the intelligent traffic control system further comprises:
a communication unit for communication between the sensing device and the detection unit, and/or between the positioning unit and the dynamic map unit, and/or between the dynamic map unit and a subject participating in traffic.
10. A control method of an intelligent traffic control system is characterized by comprising the following steps:
detecting information related to traffic elements based on detection results of roadside sensing equipment and vehicle-mounted sensing equipment, wherein the vehicle-mounted sensing equipment at least comprises a vehicle-mounted camera, and the roadside sensing equipment at least comprises a monitoring camera and a radar;
fusing information related to traffic elements in space and/or time;
positioning according to the fused processing result to obtain the position information of the traffic element; and
updating information of a dynamic map based on the location information of the traffic element, the updated dynamic map being transmitted to a subject participating in traffic,
wherein the information related to traffic elements comprises:
information of vehicles, and/or information of pedestrians, and/or information of traffic lights, and/or information of traffic police gestures, and/or information of road signs, and/or information of lane lines, and/or information of zebra stripes, and/or information of detection results of events.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911314761.3A CN113012445A (en) | 2019-12-19 | 2019-12-19 | Intelligent traffic control system and control method thereof |
JP2020190191A JP2021099793A (en) | 2019-12-19 | 2020-11-16 | Intelligent traffic control system and control method for the same |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911314761.3A CN113012445A (en) | 2019-12-19 | 2019-12-19 | Intelligent traffic control system and control method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113012445A true CN113012445A (en) | 2021-06-22 |
Family
ID=76381619
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911314761.3A Pending CN113012445A (en) | 2019-12-19 | 2019-12-19 | Intelligent traffic control system and control method thereof |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP2021099793A (en) |
CN (1) | CN113012445A (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113682307B (en) * | 2021-08-06 | 2023-09-12 | 南京市德赛西威汽车电子有限公司 | Visual lane change assisting method and system |
CN113823087B (en) * | 2021-09-09 | 2022-10-11 | 中国信息通信研究院 | Method and device for analyzing RSS performance of roadside sensing system and test system |
JP7213940B1 (en) | 2021-11-25 | 2023-01-27 | 三菱電機株式会社 | Dynamic map delivery system |
CN114283389A (en) * | 2021-12-14 | 2022-04-05 | 北京百度网讯科技有限公司 | Distributed information processing method, device, equipment, system and storage medium |
CN115230722A (en) * | 2022-09-23 | 2022-10-25 | 北京小马易行科技有限公司 | Vehicle control method, device, computer readable storage medium and processor |
CN116958763A (en) * | 2023-05-04 | 2023-10-27 | 浙江大学 | Feature-result-level-fused vehicle-road collaborative sensing method, medium and electronic equipment |
CN117492454B (en) * | 2024-01-03 | 2024-03-15 | 中建科工集团智慧停车科技有限公司 | Unmanned vehicle control method, device, equipment and medium based on intelligent rod |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1091899A (en) * | 1996-09-13 | 1998-04-10 | Oki Electric Ind Co Ltd | Road monitoring system |
CN108990010A (en) * | 2017-06-01 | 2018-12-11 | 松下电器(美国)知识产权公司 | Communication means, trackside machine and communication system |
CN109102702A (en) * | 2018-08-24 | 2018-12-28 | 南京理工大学 | Vehicle speed measuring method based on video encoder server and Radar Signal Fusion |
CN109949439A (en) * | 2019-04-01 | 2019-06-28 | 星觅(上海)科技有限公司 | Driving outdoor scene information labeling method, apparatus, electronic equipment and medium |
CN110083163A (en) * | 2019-05-20 | 2019-08-02 | 三亚学院 | A kind of 5G C-V2X bus or train route cloud cooperation perceptive method and system for autonomous driving vehicle |
History
- 2019-12-19: CN application CN201911314761.3A, patent CN113012445A (status: active, pending)
- 2020-11-16: JP application JP2020190191A, patent JP2021099793A (status: active, pending)
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114141019A (en) * | 2021-12-15 | 2022-03-04 | 阿波罗智联(北京)科技有限公司 | Traffic control method, apparatus, medium, and program product |
CN114141019B (en) * | 2021-12-15 | 2023-03-28 | 阿波罗智联(北京)科技有限公司 | Traffic control method, apparatus, medium, and program product |
CN116405905A (en) * | 2022-12-20 | 2023-07-07 | 联通智网科技股份有限公司 | Information processing method, device, equipment and storage medium |
CN116405905B (en) * | 2022-12-20 | 2024-01-30 | 联通智网科技股份有限公司 | Information processing method, device, equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
JP2021099793A (en) | 2021-07-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113012445A (en) | Intelligent traffic control system and control method thereof | |
CN109791565B (en) | ADAS field of view visual supplement V2X | |
US11967230B2 (en) | System and method for using V2X and sensor data | |
US10964216B2 (en) | Method for providing information about a vehicle's anticipated driving intention | |
US9786171B2 (en) | Systems and methods for detecting and distributing hazard data by a vehicle | |
US11009356B2 (en) | Lane marking localization and fusion | |
US10369995B2 (en) | Information processing device, information processing method, control device for vehicle, and control method for vehicle | |
CN111508276B (en) | High-precision map-based V2X reverse overtaking early warning method, system and medium | |
JPWO2016208067A1 (en) | Vehicle position determination device and vehicle position determination method | |
CN112477860A (en) | Vehicle control device | |
JP2009181315A (en) | Object detection device | |
US20230148097A1 (en) | Adverse environment determination device and adverse environment determination method | |
CN108875658A (en) | A kind of object identifying method based on V2X communication apparatus | |
US20230118619A1 (en) | Parking-stopping point management device, parking-stopping point management method, and vehicle device | |
US20230120095A1 (en) | Obstacle information management device, obstacle information management method, and device for vehicle | |
US11361687B2 (en) | Advertisement display device, vehicle, and advertisement display method | |
US20220292847A1 (en) | Drive assist device, drive assist method, and program | |
CN116935693A (en) | Collision early warning method, vehicle-mounted terminal and storage medium | |
JP7359099B2 (en) | Mobile object interference detection device, mobile object interference detection system, and mobile object interference detection program | |
CN110763244B (en) | Electronic map generation system and method | |
JP2022027305A (en) | Risky driving detection device, risky driving detection system, and risky driving detection program | |
JP2021068315A (en) | Estimation method and estimation system of lane condition | |
CN112950995A (en) | Parking assistance device, corresponding method, vehicle and server | |
CN114973706B (en) | Vehicle-road cooperative communication method and device, traffic signal control equipment and road side equipment | |
WO2023145494A1 (en) | Information processing device and information processing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||