CN115100844A - Emergency lane occupation behavior recognition system and method and terminal equipment


Info

Publication number
CN115100844A
CN115100844A (application number CN202210493538.5A)
Authority
CN
China
Prior art keywords
vehicle
information
roadside
emergency lane
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210493538.5A
Other languages
Chinese (zh)
Inventor
Inventor not disclosed (不公告发明人)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Huichen Software Co ltd
Original Assignee
Shenzhen Huichen Software Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Huichen Software Co ltd filed Critical Shenzhen Huichen Software Co ltd
Priority to CN202210493538.5A
Publication of CN115100844A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 - Traffic control systems for road vehicles
    • G08G 1/01 - Detecting movement of traffic to be counted or controlled
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/88 - Radar or analogous systems specially adapted for specific applications
    • G01S 13/886 - Radar or analogous systems specially adapted for specific applications for alarm systems
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 21/00 - Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B 21/02 - Alarms for ensuring the safety of persons
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 - Traffic control systems for road vehicles
    • G08G 1/01 - Detecting movement of traffic to be counted or controlled
    • G08G 1/017 - Detecting movement of traffic to be counted or controlled identifying vehicles
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 - Traffic control systems for road vehicles
    • G08G 1/16 - Anti-collision systems
    • G08G 1/165 - Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 - Traffic control systems for road vehicles
    • G08G 1/16 - Anti-collision systems
    • G08G 1/166 - Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes

Abstract

The application relates to the technical field of radar signal processing and provides a system, a method, and a terminal device for recognizing emergency lane occupation behavior. The system comprises at least two roadside monitoring modules, at least one roadside processing module, and a central control module. The roadside monitoring modules identify first vehicle information and second vehicle information of each vehicle; the roadside processing module collects multi-frame static echo signals, receives the first and second vehicle information, and fuses them to obtain vehicle fusion information; the central control module determines roadside edge lines from the static echo signals and determines the emergency lane area from the roadside edge lines, and, when a vehicle is detected inside the emergency lane area, identifies the vehicle's driving state from the vehicle fusion information and generates corresponding warning information. Because emergency lane occupation is detected from the fused radar and RSU information, the cost is low and the computation is small, and the recognition efficiency, recognition precision, and stability are improved.

Description

Emergency lane occupation behavior recognition system and method and terminal equipment
Technical Field
The application belongs to the technical field of radar signal processing, and particularly relates to a system and a method for recognizing emergency lane occupation behaviors, a terminal device and a readable storage medium.
Background
On expressways and other high-speed public roads, an emergency lane is generally provided so that a driver in need can overtake or, in the event of vehicle failure or danger to passengers, stop and take refuge in an emergency.
In practice, illegal behaviors such as abnormal occupation of the emergency lane occur from time to time. Occupying the emergency lane obstructs the vehicles that genuinely need it and endangers driving safety.
Related methods for recognizing emergency lane occupation generally detect and identify acquired image data and vehicle information through cameras, RSUs, and other devices to determine whether illegal emergency lane occupation exists. Such methods suffer from heavy computation, time delay, and low detection efficiency and precision, and cannot meet the detection requirements of high-speed public roads.
Disclosure of Invention
The embodiment of the application provides a recognition system and method for emergency lane occupation behaviors, a terminal device and a readable storage medium, and can solve the problems of large calculation amount, time delay and low detection efficiency and precision of a related recognition method for emergency lane occupation behaviors.
In a first aspect, an embodiment of the present application provides a recognition system for emergency lane occupation behavior, where the recognition system for emergency lane occupation behavior includes at least two roadside monitoring modules, at least one roadside processing module, and a central control module; the roadside monitoring module is in communication connection with the roadside processing module, and the roadside processing module is in communication connection with the central control module; the roadside monitoring module comprises at least one RSU and at least one radar device; the RSU communicates with an on-board unit (OBU);
the roadside monitoring module is used for identifying first vehicle information and second vehicle information of each vehicle on the expressway and sending the first vehicle information and the second vehicle information to the roadside processing module;
the roadside processing module is used for collecting multi-frame static echo signals on an expressway and sending the multi-frame static echo signals to the central control module;
the central control module is used for determining a roadside edge line of the expressway according to the static echo signal and determining an emergency lane area according to the roadside edge line;
the road side processing module is further used for receiving the first vehicle information and the second vehicle information, and performing fusion processing on the first vehicle information and the second vehicle information to obtain vehicle fusion information;
and the central control module is also used for identifying the running state of the vehicle according to the vehicle fusion information and generating corresponding warning information according to the running state when the vehicle is detected to be positioned in the emergency lane area.
In a second aspect, an embodiment of the present application provides a method for identifying an emergency lane occupancy behavior, which is implemented by the identification system according to the first aspect, and includes the following steps executed by a central control module:
determining roadside edge lines of the expressway according to the static echo signals, and determining an emergency lane area according to the roadside edge lines;
and when the vehicle is detected to be positioned in the emergency lane area, identifying the running state of the vehicle according to the vehicle fusion information, and generating corresponding warning information according to the running state.
In a third aspect, an embodiment of the present application provides a terminal device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the method for identifying emergency lane occupancy behavior according to the second aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, which stores a computer program, and the computer program, when executed by a processor, implements the method for recognizing emergency lane occupancy behavior according to the second aspect.
In a fifth aspect, the present application provides a computer program product, which when running on a terminal device, causes the terminal device to execute the method for recognizing emergency lane occupation behavior according to the second aspect.
Compared with the prior art, the embodiments of the application have the following advantages: the vehicle fusion information is obtained by fusing radar and RSU data, and the emergency lane line is calibrated from the radar echo signals, so the cost is low and the computation is small; whether the emergency lane area is occupied is detected in real time from the vehicle fusion information and corresponding warning information is generated, which reduces interference from external factors such as night and extreme weather and improves the recognition efficiency, recognition precision, and stability of emergency lane occupation detection.
It is to be understood that the beneficial effects of the second aspect to the fifth aspect can be referred to the related description of the first aspect, and are not described herein again.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings required for the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a schematic structural diagram of an emergency lane occupation behavior recognition system provided in an embodiment of the present application;
FIG. 2 is a schematic structural diagram of a central control module according to an embodiment of the present disclosure;
FIG. 3 is another schematic structural diagram of a central control module according to an embodiment of the present disclosure;
fig. 4 is a flowchart illustrating a method for recognizing emergency lane occupancy behavior according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used only to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather mean "one or more, but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
In order to explain the technical solution proposed in the present application, the following description will be given by way of specific examples.
Example one
Fig. 1 shows a structure diagram of an emergency lane occupancy behavior recognition system 1 provided in the present application, where the emergency lane occupancy behavior recognition system 1 includes at least two roadside monitoring modules 101, at least one roadside processing module 102, and a central control module 103; the roadside monitoring module 101 is in communication connection with the roadside processing module 102, and the roadside processing module 102 is in communication connection with the central control module 103; the roadside monitoring module 101 includes at least one RSU and at least one radar device; the RSU communicates with an on-board unit (OBU);
the roadside monitoring module 101 is configured to identify first vehicle information and second vehicle information of each vehicle on an expressway, and send the first vehicle information and the second vehicle information to the roadside processing module 102;
the roadside processing module 102 is configured to collect multiple frames of static echo signals on an expressway and send the static echo signals to the central control module 103;
the central control module 103 is configured to determine a roadside edge line of the expressway according to the static echo signal, and determine an emergency lane area according to the roadside edge line;
the roadside processing module 102 is further configured to receive the first vehicle information and the second vehicle information, and perform fusion processing on the first vehicle information and the second vehicle information to obtain vehicle fusion information;
the central control module 103 is further configured to, when it is detected that the vehicle is located in the emergency lane area, identify a driving state of the vehicle according to the vehicle fusion information, and generate corresponding warning information according to the driving state.
Specifically, the emergency lane occupation behavior recognition system 1 comprises at least two roadside monitoring modules 101, at least one roadside processing module 102, and a central control module 103; each roadside monitoring module 101 consists of at least one RSU and at least one radar device. An RSU (Road Side Unit), also called a roadside unit, is a device installed at the roadside in an ETC electronic toll collection system; it communicates with the on-board unit (OBU) of a vehicle using DSRC (Dedicated Short Range Communications) technology and recognizes vehicle identity information to perform electronic toll collection. Every two adjacent roadside monitoring modules 101 are preset to communicate with the same roadside processing module 102, and the roadside monitoring modules 101 are communicatively connected with the roadside processing module 102, so that the roadside processing module 102 controls the working states of the roadside monitoring modules 101 located on its two sides; each roadside processing module 102 is communicatively connected to the central control module 103, so that the central control module 103 controls the state of each roadside processing module 102. In this embodiment, the roadside monitoring modules 101 are installed on side bars or gantries of the expressway, and the roadside processing modules 102 are located at the roadside of the expressway.
Specifically, the roadside monitoring module 101 identifies, in real time through the RSU and the radar device, the first vehicle information and the second vehicle information of each vehicle on the expressway and sends them to the roadside processing module 102. On receiving the first vehicle information and the second vehicle information of each vehicle, the roadside processing module 102 matches and fuses them to obtain the vehicle fusion information of each vehicle, and sends the vehicle fusion information to the central control module 103. The radar device of the roadside monitoring module 101 transmits electromagnetic wave signals toward the expressway in real time, receives the static echo signals, and sends them to the roadside processing module 102; the roadside processing module 102 collects multiple frames of static echo signals through the radar device and forwards them to the central control module 103, which identifies the roadside edge line of the expressway at the current road section from the static echo signals and, taking the roadside edge line as the outer boundary, pushes it inward by the preset road width of the emergency lane to obtain the emergency lane area. The central control module 103 determines the vehicle position of each vehicle in real time from the vehicle fusion information; when it detects that a vehicle's position is located in the emergency lane area, it identifies the driving state of that vehicle from the vehicle fusion information and generates corresponding warning information according to the driving state.
By way of example and not limitation, since the expressway is long, to simplify the calculation flow the expressway is divided into a plurality of road sections based on the positions of the roadside monitoring modules 101; the section length can be set according to actual requirements. For example, if the distance between every two roadside monitoring modules 101 is 100 m, the expressway is divided into road sections 100 m long; correspondingly, each roadside processing module 102 collects, through the roadside monitoring modules 101 on its two sides, the vehicle fusion information and static echo signals of a 100 m expressway section and transmits them to the central control module 103 for processing. The radar devices and RSUs need to cover the expressway, the emergency lane, and the roadside objects (such as roadside edge barriers, trees, and billboards) of the current road section (i.e., the stretch between the two roadside monitoring modules 101).
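The mapping from a vehicle's longitudinal position to its covering road section can be sketched as follows; the 100 m spacing follows the example above, while the coordinate origin (the start of the monitored stretch) is an assumption for illustration.

```python
def segment_index(vehicle_x, segment_length=100.0):
    """Map a vehicle's longitudinal position (metres along the expressway)
    to the index of the road section covering it, i.e. which pair of
    adjacent roadside monitoring modules observes the vehicle."""
    return int(vehicle_x // segment_length)
```

Each roadside processing module would then handle only the vehicles whose segment index matches its own section.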
As shown in fig. 2, in one embodiment, the central control module 103 includes a screening unit 1031, a fitting unit 1032 and a region determining unit 1033; the screening unit 1031 is in communication with the fitting unit 1032, the fitting unit 1032 is in communication with the region determining unit 1033;
the screening unit 1031 is configured to screen the static echo signals, determine roadside echo points of the expressway, and send the roadside echo points to the fitting unit 1032;
the fitting unit 1032 is configured to perform linear fitting processing on the roadside echo points to obtain roadside edge lines, and send the roadside edge lines to the region determining unit 1033;
the region determining unit 1033 is configured to determine the emergency lane region according to the roadside edge line and a preset road width; the emergency lane area is an area on the expressway, which takes the roadside edge line as a long edge and takes the preset road width as the emergency lane width.
Specifically, the central control module 103 includes, but is not limited to, a screening unit 1031, a fitting unit 1032, and a region determination unit 1033; wherein the screening unit 1031 is in communication with the fitting unit 1032, and the fitting unit 1032 is in communication with the region determining unit 1033.
Specifically, the screening unit 1031 is configured to screen the multi-frame static echo signals of the expressway and to identify the roadside echo points of the expressway according to the characteristics of its roadside edge (guardrails are generally installed along the roadside edge, the road surface material differs from the guardrail material, and the guardrail material differs between road sections, so the static echo intensities obtained from the road surface and from the roadside edge differ correspondingly); the identified roadside echo points are sent to the fitting unit 1032.
the fitting unit 1032 is configured to perform linear fitting processing on the multiple roadside echo points of the identified expressway to obtain roadside edge lines, and send the roadside edge lines to the region determining unit 1033;
the area determination unit 1033 is configured to determine an emergency lane area according to the roadside edge line and the preset road width of the emergency lane; the emergency lane area is an area which takes roadside edge lines as long sides on the expressway and takes the preset road width as the emergency lane width; since the emergency lanes at different road sections have different widths, the preset road width can be determined according to the position of the current road section, and it can be understood that, in general, the width of the emergency lane should be not less than 3M based on the design specifications of public roads to ensure that most types of vehicles normally pass through.
In an embodiment, the screening unit 1031 is specifically configured to obtain static echo signals of multiple frames at the same position, calculate an average value of the static echo signals of the multiple frames, select a static echo signal larger than the average value as the roadside echo point, and send the roadside echo point to the fitting unit 1032.
Specifically, since the roadside edge guardrail of the expressway is generally made of metal materials such as Q235 steel, and the pavement of the expressway is generally made of asphalt, cement, and the like, the static echo signal intensity of the roadside edge guardrail of the expressway is often greater than that of the pavement of the expressway; correspondingly, the screening unit 1031 is specifically configured to obtain multiple frames of static echo signals at the same position on the expressway, calculate an average value of the static echo signals of the multiple frames, select a static echo signal larger than the average value as a roadside echo point of the expressway, and send the roadside echo point to the fitting unit 1032.
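The mean-threshold screening described above can be sketched as follows; the frame/point data layout is an illustrative assumption.

```python
def screen_roadside_echo_points(frames):
    """Screen multi-frame static echo signals: points whose echo intensity
    exceeds the average over all frames are kept as roadside echo points,
    since guardrail returns are typically stronger than road-surface returns.

    `frames` is a list of frames; each frame is a list of
    (x, y, intensity) tuples.
    """
    samples = [pt for frame in frames for pt in frame]
    if not samples:
        return []
    mean_intensity = sum(pt[2] for pt in samples) / len(samples)
    return [(x, y) for (x, y, inten) in samples if inten > mean_intensity]
```

The surviving (x, y) points are then handed to the fitting stage.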
In an embodiment, the fitting unit 1032 is specifically configured to calculate and determine slopes of a plurality of the roadside echo points, perform straight line fitting processing on the roadside echo points when it is detected that the slopes satisfy a first preset condition, and perform curve fitting processing on the roadside echo points when it is detected that the slopes do not satisfy the first preset condition.
Specifically, the fitting unit 1032 is specifically configured to calculate a corresponding slope based on the multiple roadside echo points, where the slope is used as a slope of a roadside edge of the expressway, determine the roadside edge of the current road as a straight line when it is detected that the slope satisfies a first preset condition, perform straight line fitting processing based on the multiple roadside echo points, and perform curve fitting processing on the roadside echo points when it is detected that the slope does not satisfy the first preset condition; the first preset condition is an identification condition for detecting whether a line (i.e., a current road section) formed by a plurality of road-side echo points is a straight line or a curve, and the first preset condition may be determined according to a slope calculation method.
Specifically, to simplify the calculation process and reduce the computation: based on the length of the current road section, the roadside echo point closest to the road exit of the current road section is set as the starting point, and two other roadside echo points at different distances from the starting point are selected (e.g., if the current road section is 100 m long, one roadside echo point 10 m from the starting point and one 100 m from it). The slope between the starting point and each of the two other roadside echo points is calculated to determine whether the road between them is a curve or a straight line, and the corresponding fitting is then applied to the current road section to obtain the roadside edge line of the expressway.
Specifically, taking the roadside echo point of the current road section closest to the road exit as the starting point, let K1 be the slope from the starting point to the first roadside echo point and K2 the slope from the starting point to the second. When

|K1 - K2| / |K1| <= α,

the slope of the current road section satisfies the first preset condition, the road section is judged to be a straight road, and straight-line fitting is performed on the roadside echo points of the current road section using an algorithm such as the least square method. When

|K1 - K2| / |K1| > α,

the slope of the current road section does not satisfy the first preset condition, the road section is judged to be a curved road, and curve fitting is performed on the roadside echo points of the current road section. Here α is the discrimination threshold, which may be set between 0.01 and 0.10 according to the actual situation.
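The straight/curve discrimination above can be sketched as follows. The relative-slope-difference test and the default threshold are reconstructed from the surrounding description (the original equation images were not preserved), so treat the exact form as an assumption.

```python
def classify_segment(points, alpha=0.05):
    """Judge whether a road section is straight or curved from three roadside
    echo points: the starting point and two points at different distances.
    Slopes K1, K2 are taken from the start to each point; if their relative
    difference |K1 - K2| / |K1| is within alpha (assumed 0.01-0.10), the
    section is treated as straight, otherwise as a curve."""
    (x0, y0), (x1, y1), (x2, y2) = points
    k1 = (y1 - y0) / (x1 - x0)
    k2 = (y2 - y0) / (x2 - x0)
    if k1 == 0:
        return "straight" if k2 == 0 else "curve"
    return "straight" if abs(k1 - k2) / abs(k1) <= alpha else "curve"
```

A straight section then goes to least-squares line fitting, a curved one to spline fitting.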
Specifically, when the current road section is detected to be a straight road and straight-line fitting is performed by the least square method, suppose the current road section has n roadside echo points whose spatial coordinates are, in order, (x_1, y_1), (x_2, y_2), ..., (x_n, y_n), and let the line equation be y = kx + b, where k is the slope of the line and b its intercept. The fitting error is

E = sum over i of (y_i - (k*x_i + b))^2.

The coefficients k and b of the linear equation are solved by minimizing this error and can be expressed as:

k = (n*sum(x_i*y_i) - sum(x_i)*sum(y_i)) / (n*sum(x_i^2) - (sum(x_i))^2),
b = (sum(y_i) - k*sum(x_i)) / n.

Substituting k and b gives the fitted line equation y = kx + b, i.e., the equation of the straight roadside portion of the current road section, which determines its roadside edge line.
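A minimal sketch of the closed-form least-squares line fit described above:

```python
def fit_line_least_squares(points):
    """Least-squares fit y = k*x + b over the roadside echo points of a
    straight section, using the closed-form normal equations."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxy = sum(x * y for x, y in points)
    sxx = sum(x * x for x, _ in points)
    k = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - k * sx) / n
    return k, b
```

The returned (k, b) pair defines the roadside edge line of the current section.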
When the current road section is detected to be a curved road, the roadside edge line is determined by cubic spline fitting with natural boundary conditions. The roadside echo points obtained on the current road section are sorted by x coordinate as (x_0, y_0), (x_1, y_1), ..., (x_n, y_n). These n+1 points define n subintervals [x_0, x_1], [x_1, x_2], ..., [x_{n-1}, x_n]. On [x_i, x_{i+1}] (i = 0, ..., n-1) the curve through the roadside echo points is assumed to be S(x) = S_i(x), where S_i(x) is a cubic function:

S_i(x) = a_i + b_i*(x - x_i) + c_i*(x - x_i)^2 + d_i*(x - x_i)^3.

Here a_i, b_i, c_i, d_i are the polynomial coefficients of the cubic function on the i-th subinterval, ordered from low to high powers of x, and S(x), S'(x), S''(x) are required to be continuous. Let h_i = x_{i+1} - x_i denote the step between two adjacent points. Solving the cubic function on each subinterval means solving its four coefficients a_i, b_i, c_i, d_i.

From the interpolation condition S_i(x_i) = y_i it follows that:

a_i = y_i.

Let m_i = S_i''(x_i) = 2*c_i; then the remaining coefficients can be calculated as:

c_i = m_i / 2,
d_i = (m_{i+1} - m_i) / (6*h_i),
b_i = (y_{i+1} - y_i)/h_i - h_i*m_i/2 - h_i*(m_{i+1} - m_i)/6.

Thus a_i, b_i, c_i, d_i can all be expressed algebraically in terms of the m_i; correspondingly, solving a_i, b_i, c_i, d_i reduces to solving the linear system with the m_i as unknowns:

h_{i-1}*m_{i-1} + 2*(h_{i-1} + h_i)*m_i + h_i*m_{i+1} = 6*[(y_{i+1} - y_i)/h_i - (y_i - y_{i-1})/h_{i-1}], for i = 1, ..., n-1,

with the natural boundary conditions m_0 = m_n = 0. In this system the y coordinates y_i of the roadside echo points and the steps h_i = x_{i+1} - x_i are all known values; solving it gives m = [m_0, m_1, ..., m_n], which is substituted into the algebraic expressions for a_i, b_i, c_i, d_i to obtain the coefficients of the cubic spline fitting function on each subinterval, thereby fitting the curved portion of the roadside edge and determining the roadside edge line. Alternatively, the curve fitting may be performed by the least square method.
In one embodiment, the roadside monitoring module 101 is specifically configured to identify first vehicle information of each vehicle on the highway based on the RSU, detect second vehicle information of each vehicle on the highway based on the radar, and send the first vehicle information and the second vehicle information to the roadside processing module 102; the first vehicle information comprises an OBUID, license plate information, first vehicle position information and first vehicle running speed; the second vehicle information includes a vehicle ID, second vehicle position information, and a second vehicle travel speed;
the roadside processing module 102 is configured to perform information matching and fusion processing on first vehicle position information, first vehicle running speed, second vehicle position information, and second vehicle running speed, and perform conversion processing on the OBUID to obtain vehicle fusion information of each vehicle on the highway; the vehicle fusion information comprises a target vehicle ID, target vehicle type information, target license plate information, target vehicle position information and target vehicle running speed.
Specifically, the roadside monitoring module 101 communicates with the on-board unit OBU through the RSU to identify first vehicle information (including but not limited to an OBUID, license plate information, first vehicle position information, and first vehicle running speed) of each vehicle on the expressway; it then transmits an electromagnetic wave signal through the radar and collects the echo signal to obtain second vehicle information (including but not limited to a vehicle ID, second vehicle position information, and second vehicle running speed) of each vehicle, and sends the first vehicle information and the second vehicle information to the roadside processing module 102. The roadside processing module 102 matches the first vehicle running speed with the second vehicle running speed and the first vehicle position information with the second vehicle position information, fuses the matched first and second vehicle information, and converts the OBUID to obtain the vehicle type information, thereby determining the vehicle fusion information of each vehicle on the expressway; the vehicle fusion information includes, but is not limited to, a target vehicle ID, target vehicle type information, target license plate information, target vehicle position information, and target vehicle running speed.
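The matching-and-fusion step above can be sketched as nearest-neighbour association between RSU records and radar tracks. The following Python sketch is illustrative only: the class names, the tolerances (3 m, 5 km/h), and the OBUID-to-type lookup are assumptions, not details from the patent.

```python
import math
from dataclasses import dataclass

@dataclass
class RsuVehicle:            # first vehicle information (from RSU/OBU)
    obu_id: str
    plate: str
    pos: tuple               # (x, y) in metres
    speed: float             # km/h

@dataclass
class RadarVehicle:          # second vehicle information (from radar)
    vehicle_id: int
    pos: tuple
    speed: float

def vehicle_type_from_obu(obu_id):
    """Placeholder for the OBUID conversion step; a real system would look
    the vehicle type up in the ETC registration database."""
    return "passenger_car" if obu_id.startswith("P") else "truck"

def match_and_fuse(rsu_list, radar_list, max_pos_err=3.0, max_speed_err=5.0):
    """Pair each radar track with the closest RSU record whose position and
    speed agree within the given tolerances, then merge the pair into one
    fused record per vehicle."""
    fused = []
    for rv in radar_list:
        best, best_d = None, float("inf")
        for sv in rsu_list:
            d = math.dist(rv.pos, sv.pos)
            if (d < best_d and d <= max_pos_err
                    and abs(rv.speed - sv.speed) <= max_speed_err):
                best, best_d = sv, d
        if best is not None:
            fused.append({
                "target_vehicle_id": rv.vehicle_id,
                "target_plate": best.plate,
                "target_vehicle_type": vehicle_type_from_obu(best.obu_id),
                "target_pos": rv.pos,          # radar position as reference
                "target_speed": rv.speed,
            })
    return fused
```

Radar position and speed are kept as the fused reference here because the radar updates continuously, while the RSU contributes the identity fields (plate, type) it alone can provide.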
As shown in fig. 3, in one embodiment, the central control module 103 further includes a matching unit 1034 and a status monitoring unit 1035; the matching unit 1034 communicates with the status monitoring unit 1035;
the matching unit 1034 is configured to, when it is detected that the target vehicle position information matches the emergency lane area, send corresponding target license plate information to the state monitoring unit 1035;
the state monitoring unit 1035 is configured to compare the target license plate information with preset license plate information, and generate first warning information when it is detected that the target license plate information matches the preset license plate information;
when the target license plate information is not consistent with preset license plate information, detecting whether the running speed of the target vehicle meets preset running conditions; when the target vehicle running speed is detected to meet the preset running condition, determining that the vehicle is in a deceleration parking state, tracking the vehicle, and generating corresponding second warning information;
and generating corresponding third warning information when the target vehicle running speed is detected not to meet the preset running condition.
Specifically, the central control module 103 further includes a matching unit 1034 and a state monitoring unit 1035; the matching unit 1034 communicates with the state monitoring unit 1035. The matching unit 1034 is configured to determine the target vehicle position information of each vehicle in real time according to the vehicle fusion information and, when detecting that the target vehicle position information of a certain vehicle matches the emergency lane area (that is, the target vehicle position lies within the emergency lane area), send the corresponding target license plate information to the state monitoring unit 1035. Upon receiving the target license plate information, the state monitoring unit 1035 compares it with the preset license plate information; when the target license plate information matches the preset license plate information, it judges that the current vehicle is a rescue vehicle carrying out a rescue and generates the first warning information. The first warning information comprises the vehicle type information and license plate information of the current vehicle and a notification that the current vehicle is carrying out a rescue; it is sent to the management terminal and to all vehicles whose distance from the current vehicle meets the preset distance threshold, so that the management terminal can follow the running state of the vehicle in real time and the safety of the rescue route is improved. Meanwhile, the expressway ahead of the vehicle is searched for other vehicles occupying the emergency lane area, and the vehicle is informed of the result. The management terminal is the terminal equipment (a mobile phone, a notebook computer, and the like) of the manager who monitors expressway safety and implements road assistance.
The preset license plate information refers to the information of rescue vehicles carrying out rescue, for example, a police vehicle whose license plate carries the character for "police", or a license plate number registered as an ambulance; the preset distance threshold may be set according to actual conditions, for example, 500 m. Correspondingly, when the current vehicle is detected to be a police vehicle whose plate carries the "police" character, the first warning information is generated, the vehicle type information, license plate information, and the notification that the current vehicle is carrying out a rescue are sent to all vehicles within 500 m of the police vehicle, and, when another vehicle is detected to occupy the emergency lane area on the expressway ahead, the police vehicle is informed of the occupation result.
Specifically, the state monitoring unit 1035 is further configured to, when detecting that the target license plate information of a vehicle located in the emergency lane area does not match the preset license plate information, determine that the vehicle occupies the emergency lane and detect whether the target vehicle running speed of the vehicle satisfies the preset running condition. When the target vehicle running speed satisfies the preset running condition, the vehicle is determined to be in a deceleration-parking state and may stop on the expressway, so road safety assistance is applied to the vehicle, the vehicle is tracked, and corresponding second warning information is generated. The preset running condition is the monitoring condition for determining whether the vehicle is in the deceleration-parking state; to reduce the amount of data calculation and simplify the detection process, it is set as the running speed being smaller than a preset speed threshold.
Specifically, the state monitoring unit 1035 is further configured to, when detecting that the target vehicle running speed of a vehicle located in the emergency lane area does not satisfy the preset running condition (i.e., is greater than or equal to the preset speed threshold), determine that the vehicle is an abnormally running vehicle; the cause of abnormal running may include, but is not limited to, a vehicle fault, or abnormal driving caused intentionally by the driver or unintentionally by a physical abnormality. The unit obtains the vehicle fault detection result of the vehicle and the physical state information of the driver, and generates corresponding third warning information. The third warning information comprises the license plate information, vehicle type information, vehicle running speed and vehicle position information, the notification that the vehicle occupies the emergency lane area, and the cause of the abnormal running; it is sent to the management terminal and to all vehicles whose distance from the current vehicle meets the preset distance threshold, so that the management terminal can determine the working state of the vehicle and the physical state of the driver, apply road safety assistance, and remind surrounding vehicles to perform safety avoidance operations. Meanwhile, the expressway ahead of the vehicle is searched for other vehicles occupying the emergency lane area, and the vehicle is informed of the result, so as to avoid further safety accidents.
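The three-way decision described above (first/second/third warning) reduces to a short classification rule. A minimal Python sketch follows; the function name, the rescue-plate set, and the 60 km/h default are illustrative assumptions for the example, not fixed by the patent.

```python
def classify_vehicle_in_emergency_lane(plate, speed_kmh, rescue_plates,
                                       min_speed_kmh=60.0):
    """Classify a vehicle detected inside the emergency lane area.

    Returns the warning level:
      1 - rescue vehicle carrying out a rescue (first warning)
      2 - non-rescue vehicle decelerating/parking (second warning, start tracking)
      3 - non-rescue vehicle running abnormally at speed (third warning)
    """
    if plate in rescue_plates:
        return 1    # plate matches preset rescue-vehicle information
    if speed_kmh < min_speed_kmh:
        return 2    # below the minimum speed limit: deceleration-parking state
    return 3        # at or above the threshold: abnormal running
```

Keeping the rule as a pure function of (plate, speed) makes it trivial to unit-test and to re-tune the speed threshold per road segment.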
In one embodiment, the state monitoring unit 1035 is specifically configured to, when it is detected that the target license plate information does not match preset license plate information, compare the running speed of the target vehicle with a preset speed threshold, when it is detected that the running speed of the vehicle is less than the preset speed threshold, determine that the vehicle is in a deceleration stop state, track the vehicle according to a clutter map CFAR algorithm based on the target vehicle position information, and generate corresponding second warning information according to vehicle fusion information of the vehicle.
Specifically, the state monitoring unit 1035 is configured to, when detecting that the target license plate information does not match the preset license plate information, determine that the vehicle occupies the emergency lane and compare the target vehicle running speed with the preset speed threshold. When the running speed of the vehicle is detected to be less than the preset speed threshold, the vehicle is determined to be in a deceleration-parking state; the vehicle is tracked according to the clutter map CFAR algorithm based on the target vehicle position information, and corresponding second warning information is generated according to the vehicle fusion information of the vehicle. The preset speed threshold is the minimum speed limit on the expressway. For example, if the minimum speed limit of the slow lane on the current expressway is 60 km/h, the preset running condition is correspondingly set as the speed being less than 60 km/h; when the target vehicle running speed of a certain vehicle is detected to be less than 60 km/h, the vehicle is judged to be in a deceleration-parking state, which constitutes an emergency lane occupation behavior, and traffic assistance may be required. The vehicle is then tracked through a preset algorithm (including but not limited to the clutter map CM-CFAR algorithm), and corresponding second warning information is generated. The second warning information comprises the license plate information, vehicle type information, vehicle running speed and vehicle position information, and the notification that the vehicle occupies the emergency lane area; it is sent to the management terminal and to all vehicles whose distance from the current vehicle meets the preset distance threshold, so that the management terminal can determine the working state of the vehicle and the physical state of the driver and passengers, and apply road safety assistance. Meanwhile, the expressway ahead of the vehicle is searched for other vehicles occupying the emergency lane area, and the vehicle is informed of the result, so as to avoid further safety accidents.
In one embodiment, when it is detected that a vehicle located in the emergency lane area does not match the preset license plate information, the working state of the vehicle is normal (i.e., the vehicle has no fault), and the physical states of the driver and passengers are stable and without abnormality, a warning message is sent to the vehicle owner, warning that the vehicle is occupying the emergency lane in violation of the relevant road traffic safety regulations and must leave the emergency lane area immediately.
In a specific application, a plurality of radar devices are deployed on the expressway. When a vehicle is tracked, the echo signal returned by a vehicle moving too slowly is weak and may be ignored, causing the related vehicle information of that vehicle to be lost. Therefore, the vehicle fusion information of the vehicle in the deceleration-parking state at frame I (namely the current frame) is obtained in real time, and the predicted vehicle position at frame I+1 is computed from the vehicle fusion information based on a uniform-acceleration state model. Taking the predicted position at frame I+1 as the circle centre and Q metres as the radius (Q is set according to the target vehicle type information: the larger the vehicle length and width values, the larger the corresponding Q value), a prediction range for frame I+1 is defined. Detection is then performed within the prediction range of frame I+1 through the clutter map CM-CFAR algorithm, the echo point cloud data of the vehicle at frame I+1 is determined and clustered to obtain a clustered target point, and the predicted vehicle position at frame I+2 is in turn computed from the vehicle fusion information at frame I+1 based on the uniform-acceleration state model. The vehicle is continuously tracked through the clutter map CM-CFAR algorithm in this way until its speed is detected to be 0, at which point the stop position information of the vehicle is determined.
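The predict-and-gate step of this tracking loop can be sketched in a few lines: propagate the state with the uniform-acceleration model, then keep only echo points inside the circular gate of radius Q around the prediction. This is an illustrative sketch of those two steps only (the CM-CFAR detection and clustering stages are omitted); the function names are assumptions.

```python
import numpy as np

def predict_next_position(pos, vel, acc, dt):
    """Uniform-acceleration state model:
    x_{k+1} = x_k + v_k * dt + 0.5 * a_k * dt**2 (applied per axis)."""
    pos = np.asarray(pos, dtype=float)
    vel = np.asarray(vel, dtype=float)
    acc = np.asarray(acc, dtype=float)
    return pos + vel * dt + 0.5 * acc * dt ** 2

def gate_points(points, centre, radius_q):
    """Keep only echo points inside the circular prediction gate of radius Q
    centred on the predicted position (Q grows with the vehicle's length
    and width, as described above)."""
    points = np.asarray(points, dtype=float)
    d = np.linalg.norm(points - np.asarray(centre, dtype=float), axis=1)
    return points[d <= radius_q]
```

In the full loop, the gated points would be fed to CM-CFAR detection and clustering to yield the frame I+1 target, whose fused state then seeds the frame I+2 prediction.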
This embodiment determines the vehicle fusion information from the RSU and radar information and marks the corresponding emergency lane based on the radar signal, with low cost and a small amount of calculation; it detects in real time whether an occupation behavior exists in the emergency lane area based on the vehicle fusion information and generates corresponding warning information, reducing the interference of external factors such as night and extreme weather and improving the recognition efficiency, recognition accuracy and stability of emergency lane occupation behavior recognition.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply any order of execution, and the execution order of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Embodiment Two
Fig. 4 shows a flowchart of an identification method for emergency lane occupancy behavior provided in an embodiment of the present application, and for convenience of description, only a part related to the embodiment of the present application is shown.
Referring to fig. 4, the method for identifying an emergency lane occupation behavior provided in the embodiment of the present application includes the following steps executed by a central control module:
s101, roadside edge lines of the expressway are determined according to the static echo signals, and an emergency lane area is determined according to the roadside edge lines.
S102, when the vehicle is detected to be located in the emergency lane area, identifying the running state of the vehicle according to the vehicle fusion information, and generating corresponding warning information according to the running state.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the second embodiment described above may refer to the corresponding process in the first embodiment, and is not repeated herein.
The embodiment calibrates the corresponding emergency lane line based on the radar signal, is low in cost and small in calculated amount, detects whether the occupation behavior of the emergency lane area exists or not in real time based on the vehicle fusion information, generates corresponding warning information, reduces the interference of external factors such as night and extreme weather, and improves the recognition efficiency, recognition accuracy and stability of the occupation behavior of the emergency lane.
Fig. 5 is a schematic structural diagram of the terminal device provided in this embodiment. As shown in fig. 5, the terminal device 5 of this embodiment includes: at least one processor 50 (only one is shown in fig. 5), a memory 51, and a computer program 52 stored in the memory 51 and executable on the at least one processor 50, wherein the processor 50 executes the computer program 52 to implement the steps in any of the above-mentioned embodiments of the method for recognizing emergency lane occupancy behavior.
The terminal device 5 is specifically a central control module, which may be a desktop computer, a notebook, a palm computer, a cloud server, or other computing devices. The terminal device may include, but is not limited to, a processor 50, a memory 51. Those skilled in the art will appreciate that fig. 5 is only an example of the terminal device 5, and does not constitute a limitation to the terminal device 5, and may include more or less components than those shown, or combine some components, or different components, such as an input/output device, a network access device, and the like.
The Processor 50 may be a Central Processing Unit (CPU), and the Processor 50 may be other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware components, and the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 51 may in some embodiments be an internal storage unit of the terminal device 5, such as a hard disk or a memory of the terminal device 5. In other embodiments, the memory 51 may also be an external storage device of the terminal device 5, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital Card (SD), a Flash memory Card (Flash Card), and the like, which are provided on the terminal device 5. Further, the memory 51 may also include both an internal storage unit and an external storage device of the terminal device 5. The memory 51 is used for storing an operating system, an application program, a BootLoader (BootLoader), data, and other programs, such as program codes of the computer programs. The memory 51 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. For the specific working processes of the units and modules in the system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not described herein again.
An embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the foregoing method embodiments.
The embodiments of the present application provide a computer program product, which when running on a mobile terminal, enables the mobile terminal to implement the steps in the above method embodiments when executed.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and can implement the steps of the embodiments of the methods described above when the computer program is executed by a processor. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer readable medium may include at least: any entity or device capable of carrying computer program code to a photographing apparatus/terminal apparatus, a recording medium, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium. Such as a usb-disk, a removable hard disk, a magnetic or optical disk, etc. In certain jurisdictions, computer-readable media may not be an electrical carrier signal or a telecommunications signal in accordance with legislative and patent practice.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the above-described apparatus/network device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when the actual implementation is performed, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, that is, may be located in one place, or may also be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the embodiments of the present application, and they should be construed as being included in the present application.

Claims (10)

1. The identification system for the emergency lane occupation behavior is characterized by comprising at least two roadside monitoring modules, at least one roadside processing module and a central control module; the roadside monitoring module is in communication connection with the roadside processing module, and the roadside processing module is in communication connection with the central control module; the roadside monitoring module comprises at least one RSU and at least one radar device; the RSU is communicated with an On Board Unit (OBU);
the roadside monitoring module is used for identifying first vehicle information and second vehicle information of each vehicle on the expressway and sending the first vehicle information and the second vehicle information to the roadside processing module;
the roadside processing module is used for collecting multi-frame static echo signals on an expressway and sending the multi-frame static echo signals to the central control module;
the central control module is used for determining a roadside edge line of the expressway according to the static echo signal and determining an emergency lane area according to the roadside edge line;
the road side processing module is further used for receiving the first vehicle information and the second vehicle information, and performing fusion processing on the first vehicle information and the second vehicle information to obtain vehicle fusion information;
and the central control module is also used for identifying the running state of the vehicle according to the vehicle fusion information and generating corresponding warning information according to the running state when the vehicle is detected to be positioned in the emergency lane area.
2. The identification system of emergency lane occupancy behavior of claim 1, wherein the central control module comprises a screening unit, a fitting unit and a region determination unit; the screening unit is communicated with the fitting unit, and the fitting unit is communicated with the region determining unit;
the screening unit is used for screening the static echo signals, determining roadside echo points of the expressway and sending the roadside echo points to the fitting unit;
the fitting unit is used for performing linear fitting processing on the road side echo points to obtain road side edge lines and sending the road side edge lines to the region determining unit;
the region determining unit is used for determining the emergency lane region according to the roadside edge line and the preset road width; the emergency lane area is an area on the expressway, which takes the roadside edge line as a long edge and takes the preset road width as the emergency lane width.
3. The system according to claim 2, wherein the screening unit is specifically configured to obtain static echo signals of multiple frames at the same position, calculate an average value of the static echo signals of the multiple frames, select a static echo signal larger than the average value as the roadside echo point, and send the roadside echo point to the fitting unit.
4. The emergency lane occupancy behavior recognition system according to claim 2, wherein the fitting unit is specifically configured to calculate and determine slopes of a plurality of roadside echo points, perform straight line fitting processing on the roadside echo points when it is detected that the slopes satisfy a first preset condition, and perform curve fitting processing on the roadside echo points when it is detected that the slopes do not satisfy the first preset condition.
5. The emergency lane occupancy behavior recognition system of claim 1, wherein the roadside monitoring module is specifically configured to recognize first vehicle information of each vehicle on the highway based on the RSU, detect second vehicle information of each vehicle on the highway based on the radar, and send the first vehicle information and the second vehicle information to the roadside processing module; the first vehicle information comprises an OBUID, license plate information, first vehicle position information and first vehicle running speed; the second vehicle information includes a vehicle ID, second vehicle position information, and a second vehicle travel speed;
the road side processing module is used for performing information matching and fusion processing on first vehicle position information, first vehicle running speed, second vehicle position information and second vehicle running speed, and performing conversion processing on the OBUID to obtain vehicle fusion information of each vehicle on the highway; the vehicle fusion information comprises a target vehicle ID, target vehicle type information, target license plate information, target vehicle position information and target vehicle running speed.
6. The emergency lane occupancy behavior recognition system of claim 5, wherein the central control module further comprises a matching unit and a status monitoring unit; the matching unit is communicated with the state monitoring unit;
the matching unit is used for sending corresponding target license plate information to the state monitoring unit when the matching of the target vehicle position information and the emergency lane area is detected;
the state monitoring unit is used for comparing the target license plate information with preset license plate information and generating first warning information when the target license plate information is detected to be consistent with the preset license plate information;
when the target license plate information is not consistent with preset license plate information, detecting whether the running speed of the target vehicle meets preset running conditions; when the target vehicle running speed is detected to meet the preset running condition, determining that the vehicle is in a deceleration parking state, tracking the vehicle, and generating corresponding second warning information;
and generating corresponding third warning information when the target vehicle running speed is detected not to meet the preset running condition.
7. The system according to claim 6, wherein the status monitoring unit is specifically configured to compare the driving speed of the target vehicle with a preset speed threshold when the target license plate information is detected not to match with preset license plate information, determine that the vehicle is in a deceleration stop state when the driving speed of the vehicle is detected to be less than the preset speed threshold, track the vehicle according to a clutter map CFAR algorithm based on the position information of the target vehicle, and generate corresponding second warning information according to vehicle fusion information of the vehicle.
8. An identification method for emergency lane occupation behavior, which is implemented based on the identification system of any one of claims 1 to 7, and comprises the following steps executed by a central control module:
determining roadside edge lines of the expressway according to the static echo signals, and determining an emergency lane area according to the roadside edge lines;
and when the vehicle is detected to be positioned in the emergency lane area, identifying the running state of the vehicle according to the vehicle fusion information, and generating corresponding warning information according to the running state.
9. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the identification method according to claim 8 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the identification method according to claim 8.
CN202210493538.5A 2022-05-07 2022-05-07 Emergency lane occupation behavior recognition system and method and terminal equipment Pending CN115100844A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210493538.5A CN115100844A (en) 2022-05-07 2022-05-07 Emergency lane occupation behavior recognition system and method and terminal equipment

Publications (1)

Publication Number Publication Date
CN115100844A true CN115100844A (en) 2022-09-23

Family

ID=83287197

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210493538.5A Pending CN115100844A (en) 2022-05-07 2022-05-07 Emergency lane occupation behavior recognition system and method and terminal equipment

Country Status (1)

Country Link
CN (1) CN115100844A (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101914890A (en) * 2010-08-31 2010-12-15 中交第二公路勘察设计研究院有限公司 Airborne laser measurement-based highway reconstruction and expansion investigation method
CN205068798U (en) * 2015-10-29 2016-03-02 长安大学 A road controlling means for detecting vehicle is violating regulations to take emergent lane
CN107516421A (en) * 2017-08-02 2017-12-26 深圳市盛路物联通讯技术有限公司 A kind of Emergency Vehicle Lane monitoring method and device
CN207068238U (en) * 2017-02-07 2018-03-02 德阳力久云智知识产权运营有限公司 A kind of Emergency Vehicle Lane passes through device
US20180293684A1 (en) * 2016-09-07 2018-10-11 Southeast University Supervision and penalty method and system for expressway emergency lane occupancy
CN110570664A (en) * 2019-09-23 2019-12-13 山东科技大学 automatic detection system for highway traffic incident
CN110718058A (en) * 2019-09-26 2020-01-21 东南大学 Active safety terminal-based expressway emergency lane occupation detection and disposal method
CN111260808A (en) * 2020-01-17 2020-06-09 河北德冠隆电子科技有限公司 Free flow vehicle charging device, system and method based on multi-data fusion
CN210804753U (en) * 2019-09-04 2020-06-19 武汉微智创大科技有限公司 High-speed emergency lane occupation monitoring device
CN113167885A (en) * 2021-03-03 2021-07-23 华为技术有限公司 Lane line detection method and lane line detection device
CN113409607A (en) * 2021-03-30 2021-09-17 新奇点智能科技集团有限公司 Road condition information pushing system, method, device, equipment and storage medium
CN114141022A (en) * 2020-09-03 2022-03-04 丰图科技(深圳)有限公司 Emergency lane occupation behavior detection method and device, electronic equipment and storage medium
CN114333347A (en) * 2022-01-07 2022-04-12 深圳市金溢科技股份有限公司 Vehicle information fusion method and device, computer equipment and storage medium
CN114419885A (en) * 2022-01-14 2022-04-29 重庆长安汽车股份有限公司 Vehicle violation snapshot method and system based on high-precision positioning and vehicle-mounted sensing capability

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Zhang Guangwei; Deng Kazhong; Zhang Yonghong; Yu Haikun: "Road detection in SAR images based on multiple polarization channels", Remote Sensing for Land & Resources, no. 04, pages 32-35 *
Jiao Chunyu; Chang Wenge: "Research on the adaptability of road extraction algorithms for ultra-wideband SAR images", Radar Science and Technology, no. 06, pages 40-46 *

Similar Documents

Publication Publication Date Title
US11074813B2 (en) Driver behavior monitoring
CN110400478B (en) Road condition notification method and device
US7884739B2 (en) Systems and devices for assessing fines for traffic disturbances
CN109559532B (en) Expressway exit diversion area vehicle road cooperative safety early warning control method
EP2806413B1 (en) Vehicle behavior prediction device and vehicle behavior prediction method, and driving assistance device
EP3403219A1 (en) Driver behavior monitoring
CN110942623B (en) Auxiliary traffic accident handling method and system
CN112289054A (en) Road safety early warning method, OBU, RSU, MEC equipment and system
EP3886076A1 (en) Warning system for a host automotive vehicle
CN113538917A (en) Collision early warning method and collision early warning device
CN112918471A (en) Anti-collision control method, device and equipment for vehicle and storage medium
CN113947892A (en) Abnormal parking monitoring method and device, server and readable storage medium
CN113888860A (en) Method and device for detecting abnormal running of vehicle, server and readable storage medium
US20230242138A1 (en) Method and device for evaluating driver by using adas
CN115346370B (en) Intersection anti-collision system and method based on intelligent traffic
CN115100844A (en) Emergency lane occupation behavior recognition system and method and terminal equipment
CN116434607A (en) Expressway early warning method, device, equipment and readable storage medium
CN116129675A (en) Method, device and equipment for early warning of collision between people and vehicles
CN112447055A (en) Monitoring and warning method for low-speed driving on expressway
CN115966100B (en) Driving safety control method and system
CN116434522A (en) Intersection risk level assessment method and processing device
CN116923419A (en) Risk assessment method and system for unmanned boundary scene and electronic equipment
CN117496758A (en) Vehicle guidance method, electronic device, and computer-readable storage medium
CN111833651A (en) Highway automobile rear-end collision prevention system and working method thereof
CN114648891A (en) Dangerous vehicle prompting method, electronic equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination