CN115100863A - Road monitoring method, device, equipment and storage medium - Google Patents

Road monitoring method, device, equipment and storage medium

Info

Publication number
CN115100863A
Authority
CN
China
Prior art keywords
vehicle
determining
predicted
monitored
track
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210720750.0A
Other languages
Chinese (zh)
Other versions
CN115100863B (en)
Inventor
冯琦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
PEOPLE'S PUBLIC SECURITY UNIVERSITY OF CHINA
Original Assignee
PEOPLE'S PUBLIC SECURITY UNIVERSITY OF CHINA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by PEOPLE'S PUBLIC SECURITY UNIVERSITY OF CHINA filed Critical PEOPLE'S PUBLIC SECURITY UNIVERSITY OF CHINA
Priority to CN202210720750.0A
Publication of CN115100863A
Application granted
Publication of CN115100863B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 - Traffic control systems for road vehicles
    • G08G 1/01 - Detecting movement of traffic to be counted or controlled
    • G08G 1/0104 - Measuring and analyzing of parameters relative to traffic conditions
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 - Administration; Management
    • G06Q 10/04 - Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 - Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q 50/10 - Services
    • G06Q 50/26 - Government or public services
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 21/00 - Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B 21/18 - Status alarms
    • G08B 21/24 - Reminder alarms, e.g. anti-loss alarms
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 - Traffic control systems for road vehicles
    • G08G 1/01 - Detecting movement of traffic to be counted or controlled
    • G08G 1/052 - Detecting movement of traffic to be counted or controlled with provision for determining speed or overspeed
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 - Traffic control systems for road vehicles
    • G08G 1/16 - Anti-collision systems
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 - Television systems
    • H04N 7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/183 - Closed-circuit television [CCTV] systems for receiving images from a single remote source
    • H04N 7/185 - Closed-circuit television [CCTV] systems for receiving images from a single remote source from a mobile camera, e.g. for remote control

Landscapes

  • Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Tourism & Hospitality (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Development Economics (AREA)
  • Theoretical Computer Science (AREA)
  • Marketing (AREA)
  • General Health & Medical Sciences (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Educational Administration (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Multimedia (AREA)
  • Primary Health Care (AREA)
  • Emergency Management (AREA)
  • Game Theory and Decision Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention belongs to the technical field of public safety and discloses a road monitoring method, a road monitoring device, road monitoring equipment and a storage medium. The method comprises the following steps: when a preset moment is reached, acquiring intersection image information collected by an unmanned aerial vehicle arranged at a target intersection; determining the predicted driving track of each vehicle to be monitored according to the intersection image information; determining the view angle interval of the driver of each vehicle to be monitored according to the intersection image information; predicting predicted accident vehicles with an accident risk according to the view angle interval and the predicted driving track; and scheduling the unmanned aerial vehicle to remind the drivers of the predicted accident vehicles so as to prevent traffic accidents. In this way, scraping accidents are prevented in advance, traffic pressure can be effectively relieved, and traffic accidents and congestion are avoided.

Description

Road monitoring method, device, equipment and storage medium
Technical Field
The invention relates to the technical field of public safety, in particular to a road monitoring method, a road monitoring device, road monitoring equipment and a storage medium.
Background
At present, traffic scheduling and traffic monitoring basically rely on fixed cameras to monitor and record the driving state of vehicles in real time. However, this method can only record the driving state; if an accident occurs, the information is transmitted to the traffic management department and related personnel are notified to handle it. It cannot predict in advance whether vehicles are likely to scrape each other, and at an intersection with heavy traffic flow a traffic accident easily causes congestion.
The above is only for the purpose of assisting understanding of the technical aspects of the present invention, and does not represent an admission that the above is prior art.
Disclosure of Invention
The invention mainly aims to provide a road monitoring method, a road monitoring device, road monitoring equipment and a storage medium, and aims to solve the technical problem that the prior art cannot monitor roads in real time to prevent vehicles from scraping each other in a driver's blind spot.
In order to achieve the above object, the present invention provides a road monitoring method, comprising the steps of:
when the preset moment is reached, acquiring intersection image information collected by an unmanned aerial vehicle arranged at a target intersection;
determining the predicted running track of each vehicle to be monitored according to the intersection image information;
determining the view angle interval of the driver of each vehicle to be monitored according to the intersection image information;
predicting a predicted accident vehicle with accident risk according to the view angle interval and the predicted driving track;
and scheduling the unmanned aerial vehicle to remind a driver corresponding to the predicted accident vehicle so as to prevent traffic accidents.
Optionally, determining the predicted driving track of each vehicle to be monitored according to the intersection image information includes:
determining a plurality of vehicles to be monitored according to the intersection image information;
acquiring wheel images of each vehicle to be monitored;
and determining the predicted running track of each vehicle to be monitored according to the wheel image.
Optionally, the determining the predicted driving track of each vehicle to be monitored according to the wheel image includes:
acquiring current position and attitude information of each vehicle to be monitored;
determining the wheel angle of each vehicle to be monitored according to the wheel image;
determining a track starting point according to the current position and attitude information;
determining a track direction according to the wheel angle;
and determining the predicted running track of each vehicle to be monitored according to the track starting point and the track direction.
Optionally, the determining the view angle interval of the driver of each vehicle to be monitored according to the intersection image information includes:
determining the head image of the driver of each vehicle to be monitored according to the intersection image information;
determining the positions of both eyes of a driver of each vehicle to be monitored according to the head image;
taking the central point of the positions of the two eyes of the driver of each vehicle to be monitored as a visual central point;
and determining the view angle interval of the driver of each vehicle to be monitored according to the visual center point and the vehicle side mark points of each vehicle to be monitored.
Optionally, the predicting a predicted accident vehicle with an accident risk according to the view angle interval and the predicted driving track includes:
acquiring the current speed and the driver position of each vehicle to be monitored;
taking the position of the driver as the circle center and the view angle interval as a central angle to obtain the fan-shaped view angle area of the driver of each vehicle to be monitored;
determining, according to the predicted running track and the current speed, the time running track of each vehicle to be monitored as it changes over time;
obtaining a time track curve of each vehicle to be monitored according to each time running track;
and predicting the accident vehicle with the accident risk according to the view angle area and the time track curve.
Optionally, the predicting an accident vehicle with an accident risk according to the view angle area and the time trajectory curve includes:
determining possible collision vehicles corresponding to the vehicles to be monitored according to the time track curve;
determining a target time track curve of the target monitoring vehicle corresponding to each visual angle area;
acquiring a matching time track curve of a possible collision vehicle corresponding to each target monitoring vehicle;
determining a time visual angle area track of the visual angle area moving along with time according to the target time track curve;
and predicting the predicted accident vehicle with the accident risk according to the time visual angle area track and the matched time track curve.
Optionally, the predicting an accident vehicle with an accident risk according to the time-view region trajectory and the matching time trajectory curve includes:
calculating the minimum inter-vehicle distance between the possible collision vehicle and the target monitoring vehicle according to the time visual angle area track and the matched time track curve;
determining a risk time point with a collision risk according to the minimum inter-vehicle distance;
acquiring a position included angle between the possible collision vehicle and the target monitoring vehicle at the risk time point;
and when the position included angle and the minimum inter-vehicle distance meet the predicted collision condition, taking the possible collision vehicle and the target monitoring vehicle as predicted accident vehicles.
In addition, in order to achieve the above object, the present invention also provides a road monitoring device, including:
the image acquisition module is used for acquiring intersection image information acquired by the unmanned aerial vehicle arranged at the target intersection when the preset moment is reached;
the track prediction module is used for determining the predicted running track of each vehicle to be monitored according to the intersection image information;
the visual field determining module is used for determining the visual field angle interval of the driver of each vehicle to be monitored according to the intersection image information;
the accident prediction module is used for predicting a predicted accident vehicle with accident risk according to the view angle interval and the predicted running track;
and the automatic reminding module is used for scheduling the unmanned aerial vehicle to remind a driver corresponding to the predicted accident vehicle so as to prevent traffic accidents.
In addition, to achieve the above object, the present invention also provides a road monitoring apparatus, including: a memory, a processor and a road monitoring program stored on the memory and operable on the processor, the road monitoring program being configured to implement the steps of the road monitoring method as described above.
In addition, to achieve the above object, the present invention further provides a storage medium having a road monitoring program stored thereon, wherein the road monitoring program, when executed by a processor, implements the steps of the road monitoring method as described above.
When the preset moment is reached, intersection image information acquired by the unmanned aerial vehicle arranged at the target intersection is acquired; the predicted running track of each vehicle to be monitored is determined according to the intersection image information; the view angle interval of the driver of each vehicle to be monitored is determined according to the intersection image information; the predicted accident vehicles with an accident risk are predicted according to the view angle interval and the predicted driving track; and the unmanned aerial vehicle is scheduled to remind the drivers of the predicted accident vehicles so as to prevent traffic accidents. In this way, the unmanned aerial vehicle collects the intersection image information once the preset moment is reached, the predicted driving track of each vehicle to be monitored and the view angle interval of the driver are determined based on the intersection image information, the predicted accident vehicles possibly having an accident risk are determined based on the view angle interval and the predicted driving track, and finally the unmanned aerial vehicle is dispatched to remind the drivers of the predicted accident vehicles, so that traffic accidents are avoided, scraping accidents are prevented in advance, traffic pressure can be effectively reduced, and traffic accidents and traffic jams are prevented.
Drawings
FIG. 1 is a schematic structural diagram of a road monitoring device of a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a first embodiment of a road monitoring method according to the present invention;
FIG. 3 is a schematic flow chart of a road monitoring method according to a second embodiment of the present invention;
FIG. 4 is a schematic diagram of a vehicle driving track in an embodiment of a road monitoring method according to the present invention;
FIG. 5 is a schematic diagram illustrating a method for determining whether a position angle satisfies a predicted collision condition according to an embodiment of the road monitoring method of the present invention;
fig. 6 is a block diagram showing the structure of the road monitoring device according to the first embodiment of the present invention.
The implementation, functional features and advantages of the present invention will be further described with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a road monitoring device in a hardware operating environment according to an embodiment of the present invention.
As shown in fig. 1, the road monitoring apparatus may include: a processor 1001, such as a Central Processing Unit (CPU), a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. The communication bus 1002 is used to enable connection and communication between these components. The user interface 1003 may include a display and an input unit such as a keyboard, and may optionally also include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a Wireless-Fidelity (Wi-Fi) interface). The memory 1005 may be a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), such as a disk memory. The memory 1005 may alternatively be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the configuration shown in FIG. 1 is not intended to be limiting of a roadway monitoring device, and may include more or fewer components than shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 1, a memory 1005, which is a storage medium, may include therein an operating system, a network communication module, a user interface module, and a road monitoring program.
In the road monitoring apparatus shown in fig. 1, the network interface 1004 is mainly used for data communication with a network server, and the user interface 1003 is mainly used for data interaction with a user. The road monitoring device calls the road monitoring program stored in the memory 1005 through the processor 1001 and executes the road monitoring method provided by the embodiments of the present invention.
An embodiment of the present invention provides a road monitoring method, and referring to fig. 2, fig. 2 is a schematic flow chart of a first embodiment of the road monitoring method of the present invention.
In this embodiment, the road monitoring method includes the following steps:
step S10: and when the preset moment is reached, acquiring intersection image information acquired by unmanned aerial vehicles arranged at the target intersection.
It should be noted that the execution subject of this embodiment is an information processing terminal, which may be a computer, a server, or other devices capable of implementing this function, and this embodiment is not limited thereto.
It should be understood that most current road monitoring relies on fixed cameras to monitor the traffic conditions and vehicles at an intersection. However, this method can only detect a traffic accident or traffic jam after it has occurred and then notify the relevant personnel to deal with it; it cannot detect risk in advance and prompt the driver to prevent the accident. In the scheme of this embodiment, the unmanned aerial vehicle acquires the image information of the intersection, the predicted driving track of each vehicle to be monitored and the view angle interval of the driver are determined based on the intersection image information, the predicted accident vehicles possibly having an accident risk are then determined based on the view angle interval and the predicted driving track, and finally the unmanned aerial vehicle is scheduled to remind the drivers of the predicted accident vehicles. In this way, traffic accidents are avoided, scraping accidents are prevented in advance, traffic pressure can be effectively reduced, and traffic accidents and traffic jams are prevented.
In specific implementation, the preset moment refers to a preset time period during which the unmanned aerial vehicle carries out real-time monitoring, for example a period with heavy traffic flow such as the morning and evening rush hours (7 to 8 am, 5 to 7 pm, etc.) or important holidays, which is not limited in this embodiment.
It should be noted that the target intersection may be any intersection where the unmanned aerial vehicle is disposed, may be an intersection with a large traffic flow or a narrow intersection that is easy to block, may be a t-shaped intersection, or may be an intersection, and the setting of the target intersection may be set by a user or an administrator, which is not limited in this embodiment.
It should be understood that the drone refers to any model of drone equipped with an image acquisition device, which is not limited in this embodiment, and multiple drones may be monitoring at one target intersection, which is not limited in this embodiment.
In specific implementation, the intersection image information refers to image information of all vehicles, pedestrians and non-motor vehicles within a circular area with a preset radius around the target intersection, wherein the preset radius refers to a radius of any length set by the management unit, for example: 10 meters, 20 meters, or 50 meters.
Step S20: and determining the predicted running track of each vehicle to be monitored according to the intersection image information.
It should be noted that the vehicles to be monitored refer to all vehicles within a preset radius of the target intersection.
It should be understood that determining the predicted travel track of each vehicle to be monitored according to the intersection image information refers to: firstly, determining a plurality of vehicles to be monitored as monitoring targets according to the intersection image information, and then predicting the running track according to the wheel images of the vehicles to be monitored.
Further, in order to accurately determine the predicted travel track of each vehicle to be monitored, step S20 includes: determining a plurality of vehicles to be monitored according to the intersection image information; acquiring wheel images of each vehicle to be monitored; and determining the predicted running track of each vehicle to be monitored according to the wheel image.
In specific implementation, the step of determining a plurality of vehicles to be monitored according to the intersection image information includes: and determining a plurality of targets which are determined as motor vehicles from the intersection image information by means of image recognition, and then taking all vehicle targets as vehicles to be monitored.
It should be noted that, acquiring the wheel image of each vehicle to be monitored refers to: images of the steered wheels (generally, front wheels) of each vehicle to be monitored are extracted from the intersection image information as wheel images.
It should be understood that determining the predicted travel track of each vehicle to be monitored from the wheel images refers to: and determining the wheel angle and the track direction of each vehicle to be monitored according to the wheel image so as to determine the predicted running track of each vehicle to be monitored.
By the method, the wheel images of the vehicles to be monitored are accurately determined based on the intersection image information, so that the predicted running tracks of the vehicles to be monitored are determined.
Further, in order to accurately determine the predicted travel track of each vehicle to be monitored, the step of determining the predicted travel track of each vehicle to be monitored according to the wheel image comprises the following steps: acquiring current position and attitude information of each vehicle to be monitored; determining the wheel angle of each vehicle to be monitored according to the wheel image; determining a track starting point according to the current position and attitude information; determining a track direction according to the wheel angle; and determining the predicted running track of each vehicle to be monitored according to the track starting point and the track direction.
In a specific implementation, the current position and posture information refers to the current position coordinates of each vehicle to be monitored, the current posture of the vehicle body and other relevant information, wherein the posture of the vehicle body includes, but is not limited to, the current orientation angle of the vehicle body and its included angle with the traffic lane.
It should be noted that, determining the wheel angle of each vehicle to be monitored according to the wheel image means: and determining the included angle of the wheels of each vehicle to be monitored relative to the vehicle body according to the wheel image as the wheel angle.
It should be understood that determining the track starting point from the current position and posture information refers to: and taking the current position coordinates of each vehicle to be monitored in the current position posture information as a track starting point.
In a specific implementation, determining the track direction according to the wheel angle means determining a steering angle of the vehicle to be monitored according to the wheel angle, so as to determine a deflection angle and a direction of a running track of the vehicle to be monitored.
It should be noted that, determining the predicted travel track of each vehicle to be monitored according to the track starting point and the track direction refers to: and taking the track starting point as the running starting point of the vehicle to be monitored, and then making a curve according to the track direction to be used as the predicted running track of the vehicle to be monitored.
By the method, accurate calculation and obtaining of the predicted running track of the vehicle to be monitored are achieved, and the calculation of the predicted running track is more accurate.
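As a non-limiting illustration of this step, the following Python sketch builds a predicted running track from the track starting point and the track direction. It assumes a simple single-track (bicycle) geometry with a hypothetical wheelbase and preview length, neither of which is specified in this description; only the inputs (current position, vehicle body orientation and wheel angle) come from the steps above, and the result is the circular arc that the later embodiments rely on.

import math

def predict_running_track(start_xy, body_heading, wheel_angle,
                          wheelbase=2.7, preview_length=20.0, steps=50):
    # Track starting point: current position coordinates of the vehicle.
    # Track direction: deflection derived from the wheel angle.
    # wheelbase (2.7 m) and preview_length (20 m) are assumed placeholders.
    points = [tuple(start_xy)]
    x, y, theta = start_xy[0], start_xy[1], body_heading
    step = preview_length / steps
    for _ in range(steps):
        # curvature of a single-track model: tan(wheel angle) / wheelbase
        theta += step * math.tan(wheel_angle) / wheelbase
        x += step * math.cos(theta)
        y += step * math.sin(theta)
        points.append((x, y))
    return points  # sampled predicted running track (an arc for a fixed wheel angle)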
Step S30: and determining the view angle interval of the driver of each vehicle to be monitored according to the intersection image information.
It should be understood that the view angle interval refers to: the angle range, from the leftmost side to the rightmost side of the area in front of the vehicle, that the driver can observe while seated in the cockpit of the vehicle to be monitored.
In specific implementation, the step of determining the view angle interval of the driver of each vehicle to be monitored according to the intersection image information includes: determining a head image of the driver according to the intersection image information, then determining a visual center point of the driver, and then determining a visual field angle interval of the driver based on the vehicle side mark points of each vehicle to be monitored.
Further, in order to accurately obtain the view angle interval of the driver of each vehicle to be monitored, step S30 includes: determining the head image of the driver of each vehicle to be monitored according to the intersection image information; determining the positions of the two eyes of the driver of each vehicle to be monitored according to the head images; taking the central point of the positions of the two eyes of the driver of each vehicle to be monitored as a visual central point; and determining the view angle interval of the driver of each vehicle to be monitored according to the visual center point and the vehicle side mark points of each vehicle to be monitored.
It should be noted that the head image refers to an image of the head area of the driver in the cabin of each vehicle to be monitored, and then the image positions where the eyes of the driver are located are determined based on the head image.
It should be understood that taking the center point of the positions of both eyes of the driver of each vehicle to be monitored as the visual center point means: and connecting the positions of the two eyes of the driver of each vehicle to be monitored, and taking the midpoint of the connecting line as a visual center point.
In specific implementation, the step of determining the view angle interval of the driver of each vehicle to be monitored according to the visual center point and the vehicle-side mark point of each vehicle to be monitored refers to: and taking the visual center point as a vertex, and then taking a connecting line between the visual center point and two vehicle side mark points of each vehicle to be monitored as a side to obtain the maximum visual field angles of the left side and the right side of the driver, wherein the visual field angle interval is an interval from 0 degree to the maximum visual field angle. The vehicle side mark points of the vehicles to be monitored are the middle points of the leftmost side and the rightmost side of the front windshield of each vehicle to be monitored.
By the method, the visual angle interval of the driver of each vehicle to be monitored is accurately calculated, so that the possibility of collision and accident between the vehicles is more accurately predicted subsequently.
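For clarity, a minimal Python sketch of this computation is given below. It assumes the visual center point and the two vehicle side mark points have already been projected into a common two-dimensional coordinate system (this description does not fix a particular frame); the view angle interval is then the interval from 0 to the angle spanned by the two connecting lines.

import math

def view_angle_interval(visual_center, left_mark_point, right_mark_point):
    # Connecting lines from the visual center point (vertex) to the two
    # vehicle side mark points form the sides of the maximum view angle.
    def bearing(p):
        return math.atan2(p[1] - visual_center[1], p[0] - visual_center[0])
    diff = bearing(left_mark_point) - bearing(right_mark_point)
    # wrap the difference into (-pi, pi] and take its magnitude
    max_view_angle = abs(math.atan2(math.sin(diff), math.cos(diff)))
    return (0.0, max_view_angle)  # view angle interval in radians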
Step S40: and predicting the predicted accident vehicle with the accident risk according to the view angle interval and the predicted driving track.
It should be noted that predicting a predicted accident vehicle with an accident risk according to the view angle interval and the predicted driving track means: predicting, according to the view angle interval and the predicted driving track, whether the position included angle and the minimum distance between a vehicle to be monitored and another vehicle to be monitored meet the preset collision condition, and if so, judging the two vehicles to be monitored as predicted accident vehicles.
Step S50: and scheduling the unmanned aerial vehicle to remind a driver corresponding to the predicted accident vehicle so as to prevent traffic accidents.
It should be understood that after the predicted accident vehicles are determined, two unmanned aerial vehicles are scheduled to fly to positions in front of the view of the driver of each predicted accident vehicle and send out reminding information, so as to remind the drivers to take evasive measures. The reminding information includes, but is not limited to, voice, an alarm, and the like, which is not limited in this embodiment.
In this embodiment, when the preset moment is reached, intersection image information acquired by the unmanned aerial vehicle arranged at the target intersection is acquired; the predicted running track of each vehicle to be monitored is determined according to the intersection image information; the view angle interval of the driver of each vehicle to be monitored is determined according to the intersection image information; the predicted accident vehicles with an accident risk are predicted according to the view angle interval and the predicted driving track; and the unmanned aerial vehicle is scheduled to remind the drivers of the predicted accident vehicles so as to prevent traffic accidents. In this way, the unmanned aerial vehicle collects the intersection image information once the preset moment is reached, the predicted driving track of each vehicle to be monitored and the view angle interval of the driver are determined based on the intersection image information, the predicted accident vehicles possibly having an accident risk are determined based on the view angle interval and the predicted driving track, and finally the unmanned aerial vehicle is dispatched to remind the drivers of the predicted accident vehicles, so that traffic accidents are avoided, scraping accidents are prevented in advance, traffic pressure can be effectively reduced, and traffic accidents and traffic jams are prevented.
Referring to fig. 3, fig. 3 is a flowchart illustrating a road monitoring method according to a second embodiment of the present invention.
Based on the first embodiment, the road monitoring method in this embodiment includes, in the step S40:
step S401: and acquiring the current speed and the driver position of each vehicle to be monitored.
It should be noted that the current speed refers to the current driving speed of each vehicle to be monitored, and the driver position refers to the specific position coordinates of each vehicle to be monitored relative to the target intersection.
Step S402: and taking the position of the driver as the center of a circle, and taking the view angle interval as a central angle to obtain the view angle area of the driver of each fan-shaped vehicle to be monitored.
It should be understood that, taking the position of the driver as the center of the circle and taking the view angle interval as the center angle to obtain the fan-shaped view angle area of the driver of each vehicle to be monitored refers to: a fan-shaped area can be obtained by taking the position of a driver as the center of a circle, then taking a view angle interval as a central angle and combining a preset effective view distance as a radius, and then taking the fan-shaped area as the view angle area of each driver. The effective viewing distance is preset, and is used to represent an average farthest viewing distance that can be distinguished by a driver when driving the vehicle, and may be preset by a management unit, which is not limited in this embodiment.
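A possible representation of this fan-shaped view angle area, written as a Python sketch, is shown below. The 50 m effective viewing distance and the use of the driving direction as the bisector of the sector are assumptions made only for the example; both are left to the management unit in the description above.

import math

def view_angle_area(driver_position, driving_direction, max_view_angle,
                    effective_view_distance=50.0):
    # Sector: driver position as circle center, view angle interval as the
    # central angle, preset effective viewing distance as the radius.
    return {"center": tuple(driver_position),
            "radius": effective_view_distance,
            "bisector": driving_direction,
            "half_angle": max_view_angle / 2.0}

def point_in_view_area(area, point):
    # True when a point (e.g. another vehicle) lies inside the sector.
    dx, dy = point[0] - area["center"][0], point[1] - area["center"][1]
    if math.hypot(dx, dy) > area["radius"]:
        return False
    diff = math.atan2(dy, dx) - area["bisector"]
    diff = math.atan2(math.sin(diff), math.cos(diff))  # wrap into (-pi, pi]
    return abs(diff) <= area["half_angle"]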
Step S403: and determining the time running track of each vehicle to be monitored according to the time change according to the predicted running track and the current speed.
In specific implementation, the position change of each vehicle to be monitored along with the time change is simulated according to the predicted running track and the current speed, so that the time running track of each vehicle to be monitored is formed.
Step S404: and obtaining a time track curve of each vehicle to be monitored according to the running track of each time.
It should be noted that obtaining the time trajectory curve of each vehicle to be monitored according to each time travel trajectory means that an intersection coordinate system is established based on the central point of the target intersection, and then the time travel trajectory of each vehicle is substituted into the intersection coordinate system, so as to obtain the time trajectory curve of each vehicle to be monitored and a relational expression of the time trajectory curve.
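By way of illustration only, the sketch below turns a sampled predicted running track into the time trajectory described in this step: the vehicle is assumed to travel along the track at its current speed (treated as constant for the sketch), and each sample is time-stamped in the intersection coordinate system.

import math

def time_trajectory_curve(track_points, current_speed, dt=0.1, horizon=10.0):
    # track_points: predicted running track sampled in the intersection
    # coordinate system; current_speed in m/s. Returns (t, x, y) samples.
    if current_speed <= 0.0 or len(track_points) < 2:
        return [(0.0, track_points[0][0], track_points[0][1])]
    samples = [(0.0, track_points[0][0], track_points[0][1])]
    t, target, travelled = 0.0, current_speed * dt, 0.0
    for (x0, y0), (x1, y1) in zip(track_points, track_points[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        travelled += seg
        while seg > 0.0 and travelled >= target and t < horizon:
            # interpolate back along the segment to the exact arc length
            frac = 1.0 - (travelled - target) / seg
            t += dt
            samples.append((t, x0 + frac * (x1 - x0), y0 + frac * (y1 - y0)))
            target += current_speed * dt
    return samples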
Step S405: and predicting the accident vehicle with the accident risk according to the view angle area and the time track curve.
It should be understood that predicting an accident-risk predicted vehicle from the perspective area and the time trajectory curve refers to: and calculating the minimum spacing distance and the position included angle between the vehicles to be monitored based on the intersection coordinate system according to the visual angle area and the time track curve, so as to predict the predicted accident vehicle with the accident risk.
Further, in order to determine a vehicle likely to collide and a target monitoring vehicle in the intersection coordinate system, the step S405 further determines a predicted accident vehicle, and includes: determining possible collision vehicles corresponding to the vehicles to be monitored according to the time track curve; determining a target time trajectory curve of the target monitoring vehicle corresponding to each visual angle area; acquiring a matching time track curve of a possible collision vehicle corresponding to each target monitoring vehicle; determining a time visual angle area track of the visual angle area moving along with time according to the target time track curve; and predicting the predicted accident vehicle with the accident risk according to the time visual angle area track and the matched time track curve.
In specific implementation, the step of determining the possible collision vehicle corresponding to each vehicle to be monitored according to the time trajectory curve includes: and determining another vehicle to be monitored which is closest to each vehicle to be monitored when the vehicle to be monitored runs according to the time track curve of each vehicle to be monitored. And then the other vehicle to be monitored is taken as a possible collision vehicle.
It should be noted that determining the target time trajectory curve of the target monitoring vehicle corresponding to each view angle area means: taking any one vehicle to be monitored, together with its corresponding view angle area, as the prediction reference, regarding that vehicle as the target monitoring vehicle, and taking the time trajectory curve corresponding to the target monitoring vehicle as the target time trajectory curve.
It should be understood that, after the target monitoring vehicles are determined, the time trajectory curves of the possible collision vehicles corresponding to the respective target monitoring vehicles are taken as the matching time trajectory curves.
In specific implementation, the step of determining the time view area track of the view area moving along with the time according to the target time track curve means that the view area is substituted into the target time track curve, so that a corresponding relation and a track of the view area along with the advance of the vehicle, namely the change of the view area along with the time, can be obtained, and the corresponding relation and the track are the time view area track.
It should be noted that, the predicting an accident-prone vehicle according to the time-view region trajectory and the matching time trajectory curve means: and calculating the minimum inter-vehicle distance between the target monitoring vehicle and the possible collision vehicle in the driving process according to the time visual angle area track and the matched time track curve, then determining the position included angle of the two vehicles at the minimum inter-vehicle distance, and finally judging whether the predicted collision condition is met or not based on the minimum inter-vehicle distance and the position included angle so as to determine the predicted accident vehicle.
In this way, the predicted accident vehicle with the accident risk can be accurately predicted according to the time-view zone trajectory changing along with time and the matching time trajectory curve.
Further, in order to accurately determine a predicted accident vehicle through data, the step of predicting the predicted accident vehicle with the accident risk according to the time viewing angle area track and the matched time track curve comprises the following steps: calculating the minimum inter-vehicle distance between the possible collision vehicle and the target monitoring vehicle according to the time visual angle area track and the matching time track curve; determining a risk time point with a collision risk according to the minimum inter-vehicle distance; acquiring a position included angle between the possible collision vehicle and the target monitoring vehicle at the risk time point; and when the position included angle and the minimum inter-vehicle distance meet the predicted collision condition, taking the possible collision vehicle and the target monitoring vehicle as predicted accident vehicles.
It should be understood that calculating the minimum inter-vehicle distance between the possible collision vehicle and the target monitoring vehicle according to the time-view region trajectory and the matching time trajectory curve means: and calculating the distance when the two curves are closest to each other based on the expressions of the time visual angle area track and the matching time track curve in the intersection coordinate system. The intersection coordinate system is a virtual two-dimensional plane coordinate system which is constructed by taking a central point of the target intersection as an origin and covers the road surface, and the central point of the intersection is a preset point, which is not limited in this embodiment.
As shown in fig. 4, since the target monitoring vehicle and the possible collision vehicle move based on the steering of their wheels, each movement trajectory is an arc, where y1 is the time view angle area trajectory, y2 is the matching time trajectory curve, point A is the current position of the target monitoring vehicle with coordinates (e, f), and point B is the current position of the possible collision vehicle with coordinates (g, h).
The expression of the time view angle area trajectory is:
(x - a)² + (y - b)² = r², with x > e and y < f,
and the expression of the matching time trajectory curve is:
(x - c)² + (y - d)² = R², with x > g and y > h,
where (a, b) is the center and r the radius of the circle corresponding to the time view angle area trajectory, and (c, d) is the center and R the radius of the circle corresponding to the matching time trajectory curve.
The minimum inter-vehicle distance is obtained as follows: the distance between the target monitoring vehicle and the possible collision vehicle is obtained at each time point as time changes, and the smallest of these distances is taken as the minimum inter-vehicle distance.
It should be understood that after the minimum inter-vehicle distance is obtained, the time at this time, i.e., the time at which the target monitoring vehicle has traveled to the minimum inter-vehicle distance, is determined.
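A minimal sketch of these two operations, assuming both trajectories have been sampled at the same time points (for example with the time_trajectory_curve helper sketched earlier, itself an assumption rather than part of this description), could look as follows.

import math

def minimum_inter_vehicle_distance(target_samples, matching_samples):
    # target_samples:   (t, x, y) samples of the target monitoring vehicle,
    #                   i.e. the time view angle area trajectory y1.
    # matching_samples: (t, x, y) samples of the possible collision vehicle,
    #                   i.e. the matching time trajectory curve y2.
    # Returns the minimum inter-vehicle distance and the risk time point.
    best_distance, risk_time_point = float("inf"), None
    for (t, x1, y1), (_, x2, y2) in zip(target_samples, matching_samples):
        d = math.hypot(x1 - x2, y1 - y2)
        if d < best_distance:
            best_distance, risk_time_point = d, t
    return best_distance, risk_time_point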
In a specific implementation, obtaining the position included angle between the possible collision vehicle and the target monitoring vehicle at the risk time point refers to taking, as the position included angle, the included angle at the risk time point between the line connecting the possible collision vehicle and the target monitoring vehicle and the positive x-axis direction of the intersection coordinate system.
It should be noted that the predicted collision condition is that the minimum inter-vehicle distance is smaller than the safe inter-vehicle distance, or the position included angle is not in the visual field angle range. The safe distance is the preset minimum safe distance without scratch accidents, and can be set by a management unit, which is not limited in this embodiment.
It should be understood that, as shown in fig. 5, at the risk time point the target monitoring vehicle is at point C and the possible collision vehicle is at point D, the position included angle is β, and the dotted-line region spanned by the angle α is the view angle area of the driver of the target monitoring vehicle. Therefore, the view area of the target monitoring vehicle is determined according to the time view angle area track, it is then determined whether the position of the possible collision vehicle at that moment lies within the view angle interval and the view area, and when the angle β is not within the angle α interval, the target monitoring vehicle and the possible collision vehicle are judged to be predicted accident vehicles.
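The test sketched below combines the two parts of the predicted collision condition described above: a minimum inter-vehicle distance below the safe inter-vehicle distance, or a position included angle β that is not inside the driver's view angle interval α. The 1.5 m safe distance is a hypothetical placeholder, and expressing β in the same frame as the interval is an assumption of the sketch; the actual values and conventions are preset by the management unit.

import math

def is_predicted_accident(min_distance, target_position, collision_position,
                          view_angle_interval, safe_distance=1.5):
    # Position included angle beta: angle between the line C -> D and the
    # positive x-axis of the intersection coordinate system, assumed here to
    # be expressed in the same frame as the view angle interval (alpha).
    beta = math.atan2(collision_position[1] - target_position[1],
                      collision_position[0] - target_position[0])
    alpha_low, alpha_high = view_angle_interval
    outside_view = not (alpha_low <= beta <= alpha_high)
    # Predicted collision condition: too close, or outside the driver's view.
    return min_distance < safe_distance or outside_view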
By this method, predicted accident vehicles can be accurately identified, so that early warning and reminding can be carried out precisely, the probability of avoiding traffic accidents is improved, and resources are saved to the greatest extent because unnecessary reminders are not issued.
In this embodiment, the current speed and the driver position of each vehicle to be monitored are obtained; the fan-shaped view angle area of the driver of each vehicle to be monitored is obtained by taking the position of the driver as the circle center and the view angle interval as a central angle; the time running track of each vehicle to be monitored, changing over time, is determined according to the predicted running track and the current speed; a time track curve of each vehicle to be monitored is obtained according to each time running track; and the predicted accident vehicles with an accident risk are predicted according to the view angle area and the time track curve. In this way, whether a collision danger exists between the vehicles to be monitored is accurately judged based on the coordinate system and the curves, so that the predicted accident vehicles with an accident risk can be accurately predicted.
In addition, an embodiment of the present invention further provides a storage medium, where a road monitoring program is stored on the storage medium, and the road monitoring program, when executed by a processor, implements the steps of the road monitoring method as described above.
Since the storage medium adopts all technical solutions of all the embodiments described above, at least all the beneficial effects brought by the technical solutions of the embodiments described above are achieved, and are not described in detail herein.
Referring to fig. 6, fig. 6 is a block diagram illustrating a first embodiment of the road monitoring device according to the present invention.
As shown in fig. 6, a road monitoring device according to an embodiment of the present invention includes:
the image acquisition module 10 is configured to acquire intersection image information acquired by the unmanned aerial vehicle arranged at the target intersection when the preset time is reached.
And the track prediction module 20 is configured to determine a predicted driving track of each vehicle to be monitored according to the intersection image information.
And the visual field determining module 30 is configured to determine a visual field angle interval of a driver of each vehicle to be monitored according to the intersection image information.
And the accident prediction module 40 is used for predicting the predicted accident vehicle with the accident risk according to the view angle interval and the predicted running track.
And the automatic reminding module 50 is used for scheduling the unmanned aerial vehicle to remind a driver corresponding to the predicted accident vehicle so as to prevent traffic accidents.
In this embodiment, when the preset moment is reached, intersection image information acquired by the unmanned aerial vehicle arranged at the target intersection is acquired; the predicted running track of each vehicle to be monitored is determined according to the intersection image information; the view angle interval of the driver of each vehicle to be monitored is determined according to the intersection image information; the predicted accident vehicles with an accident risk are predicted according to the view angle interval and the predicted driving track; and the unmanned aerial vehicle is scheduled to remind the drivers of the predicted accident vehicles so as to prevent traffic accidents. In this way, the unmanned aerial vehicle collects the intersection image information once the preset moment is reached, the predicted driving track of each vehicle to be monitored and the view angle interval of the driver are determined based on the intersection image information, the predicted accident vehicles possibly having an accident risk are determined based on the view angle interval and the predicted driving track, and finally the unmanned aerial vehicle is dispatched to remind the drivers of the predicted accident vehicles, so that traffic accidents are avoided, scraping accidents are prevented in advance, traffic pressure can be effectively reduced, and traffic accidents and traffic jams are prevented.
In one embodiment, the trajectory prediction module 20 is further configured to determine a plurality of vehicles to be monitored according to the intersection image information; acquiring wheel images of each vehicle to be monitored; and determining the predicted running track of each vehicle to be monitored according to the wheel image.
In an embodiment, the trajectory prediction module 20 is further configured to obtain current position and posture information of each vehicle to be monitored; determining the wheel angle of each vehicle to be monitored according to the wheel image; determining a track starting point according to the current position and attitude information; determining a track direction according to the wheel angle; and determining the predicted running track of each vehicle to be monitored according to the track starting point and the track direction.
In one embodiment, the view field determining module 30 is further configured to determine a head image of a driver of each vehicle to be monitored according to the intersection image information; determining the positions of the two eyes of the driver of each vehicle to be monitored according to the head images; taking the central point of the positions of the two eyes of the driver of each vehicle to be monitored as a visual central point; and determining the view angle interval of the driver of each vehicle to be monitored according to the visual center point and the vehicle side mark points of each vehicle to be monitored.
In an embodiment, the accident prediction module 40 is further configured to obtain a current speed and a driver position of each vehicle to be monitored; obtain the fan-shaped view angle area of the driver of each vehicle to be monitored by taking the position of the driver as the circle center and the view angle interval as a central angle; determine, according to the predicted running track and the current speed, the time running track of each vehicle to be monitored as it changes over time; obtain a time track curve of each vehicle to be monitored according to each time running track; and predict the predicted accident vehicle with the accident risk according to the view angle area and the time track curve.
In an embodiment, the accident prediction module 40 is further configured to determine, according to the time trajectory curve, possible collision vehicles corresponding to each vehicle to be monitored; determining a target time trajectory curve of the target monitoring vehicle corresponding to each visual angle area; acquiring a matching time track curve of a possible collision vehicle corresponding to each target monitoring vehicle; determining a time visual angle area track of the visual angle area moving along with time according to the target time track curve; and predicting the predicted accident vehicle with the accident risk according to the time visual angle area track and the matched time track curve.
In an embodiment, the accident prediction module 40 is further configured to calculate a minimum inter-vehicle distance between the potential collision vehicle and the target monitoring vehicle according to the time-view area trajectory and the matching time trajectory curve; determining a risk time point with a collision risk according to the minimum interval vehicle distance; acquiring a position included angle between the possible collision vehicle and the target monitoring vehicle at the risk time point; and when the position included angle and the minimum inter-vehicle distance meet the predicted collision condition, taking the possible collision vehicle and the target monitoring vehicle as predicted accident vehicles.
It should be understood that the above is only an example, and the technical solution of the present invention is not limited in any way, and in a specific application, a person skilled in the art may set the technical solution as needed, and the present invention is not limited thereto.
It should be noted that the above-described work flows are only exemplary, and do not limit the scope of the present invention, and in practical applications, a person skilled in the art may select some or all of them to achieve the purpose of the solution of the embodiment according to actual needs, and the present invention is not limited herein.
In addition, the technical details that are not described in detail in this embodiment may refer to the road monitoring method provided in any embodiment of the present invention, and are not described herein again.
Further, it is to be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solution of the present invention or portions thereof that contribute to the prior art may be embodied in the form of a software product, where the computer software product is stored in a storage medium (e.g. Read Only Memory (ROM)/RAM, magnetic disk, optical disk), and includes several instructions for enabling a terminal device (e.g. a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A method of road monitoring, the method comprising:
when the preset moment is reached, acquiring intersection image information collected by an unmanned aerial vehicle arranged at a target intersection;
determining the predicted running track of each vehicle to be monitored according to the intersection image information;
determining the view angle interval of the driver of each vehicle to be monitored according to the intersection image information;
predicting a predicted accident vehicle with accident risk according to the view angle interval and the predicted driving track;
and scheduling the unmanned aerial vehicle to remind a driver corresponding to the predicted accident vehicle so as to prevent traffic accidents.
2. The method of claim 1, wherein said determining a predicted travel trajectory for each vehicle to be monitored from said intersection image information comprises:
determining a plurality of vehicles to be monitored according to the intersection image information;
acquiring wheel images of each vehicle to be monitored;
and determining the predicted running track of each vehicle to be monitored according to the wheel image.
3. The method of claim 2, wherein said determining the predicted running track of each vehicle to be monitored according to the wheel image comprises:
acquiring current position and attitude information of each vehicle to be monitored;
determining the wheel angle of each vehicle to be monitored according to the wheel image;
determining a track starting point according to the current position and attitude information;
determining a track direction according to the wheel angle;
and determining the predicted running track of each vehicle to be monitored according to the track starting point and the track direction.
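Claim 3 amounts to extrapolating a track from the current pose and the wheel angle. The sketch below makes the simplest possible assumption, a straight-line track along the heading plus wheel angle with a fixed sampling step; a real implementation would more plausibly use a kinematic (e.g. bicycle) model, and none of these names or parameters come from the patent.

```python
# Straight-line track extrapolation under an assumed constant direction.
import math
from typing import List, Tuple

Point = Tuple[float, float]


def predict_track(position: Point,
                  heading_rad: float,
                  wheel_angle_rad: float,
                  step_m: float = 1.0,
                  n_points: int = 20) -> List[Point]:
    """Predict a running track from the track starting point and the track direction."""
    x0, y0 = position                              # track starting point (current pose)
    direction = heading_rad + wheel_angle_rad      # track direction from the wheel angle
    return [(x0 + i * step_m * math.cos(direction),
             y0 + i * step_m * math.sin(direction))
            for i in range(n_points + 1)]


# Example: a vehicle at the origin heading east with the wheels turned 10 degrees left.
track = predict_track((0.0, 0.0), heading_rad=0.0, wheel_angle_rad=math.radians(10))
```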
4. The method of claim 1, wherein said determining the view angle interval of the driver of each vehicle to be monitored according to the intersection image information comprises:
determining the head image of the driver of each vehicle to be monitored according to the intersection image information;
determining the positions of the two eyes of the driver of each vehicle to be monitored according to the head images;
taking the central point between the positions of the two eyes of the driver of each vehicle to be monitored as the visual center point;
and determining the view angle interval of the driver of each vehicle to be monitored according to the visual center point and the vehicle side mark points of each vehicle to be monitored.
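Claim 4's view angle interval can be pictured as the pair of bearings from the visual center point (the midpoint of the two eye positions) to the two vehicle-side mark points. The coordinate frame, the choice of mark points, and the function names below are assumptions for illustration only.

```python
# Bearings from the visual center point to two assumed vehicle-side mark points.
import math
from typing import Tuple

Point = Tuple[float, float]


def view_angle_interval(left_eye: Point, right_eye: Point,
                        left_mark: Point, right_mark: Point) -> Tuple[float, float]:
    """Return the (lower, upper) bearing of the driver's view angle interval, in radians."""
    cx = (left_eye[0] + right_eye[0]) / 2.0        # visual center point
    cy = (left_eye[1] + right_eye[1]) / 2.0
    a1 = math.atan2(left_mark[1] - cy, left_mark[0] - cx)
    a2 = math.atan2(right_mark[1] - cy, right_mark[0] - cx)
    return (min(a1, a2), max(a1, a2))              # ignores wrap-around at +/- pi
```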
5. The method of claim 1, wherein said predicting a predicted accident vehicle with an accident risk according to the view angle interval and the predicted running track comprises:
acquiring the current speed and the driver position of each vehicle to be monitored;
taking the driver position as the circle center and the view angle interval as the central angle, obtaining the fan-shaped view angle area of the driver of each vehicle to be monitored;
determining the time running track, which changes with time, of each vehicle to be monitored according to the predicted running track and the current speed;
obtaining a time track curve of each vehicle to be monitored according to the time running tracks;
and predicting the predicted accident vehicle with the accident risk according to the view angle area and the time track curve.
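Claim 5 combines a fan-shaped region test with a constant-speed traversal of the predicted track. The sketch below shows one plausible reading under those assumptions; the sector radius, the sampling scheme, and the names are illustrative, not taken from the patent.

```python
# Fan-shaped view area test and constant-speed traversal of a predicted track.
import math
from typing import List, Tuple

Point = Tuple[float, float]


def in_view_sector(driver_pos: Point, interval: Tuple[float, float],
                   radius_m: float, target: Point) -> bool:
    """True if `target` falls inside the fan-shaped view angle area."""
    dx, dy = target[0] - driver_pos[0], target[1] - driver_pos[1]
    if math.hypot(dx, dy) > radius_m:
        return False
    bearing = math.atan2(dy, dx)
    return interval[0] <= bearing <= interval[1]   # ignores wrap-around at +/- pi


def position_at_time(track: List[Point], speed_mps: float, t_s: float) -> Point:
    """Walk `speed * t` metres along the predicted track (the time track curve)."""
    remaining = speed_mps * t_s
    for (x0, y0), (x1, y1) in zip(track, track[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        if remaining <= seg:
            f = remaining / seg if seg else 0.0
            return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))
        remaining -= seg
    return track[-1]                               # past the end of the predicted track
```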
6. The method of claim 5, wherein said predicting the predicted accident vehicle with the accident risk according to the view angle area and the time track curve comprises:
determining possible collision vehicles corresponding to the vehicles to be monitored according to the time track curves;
determining a target time track curve of the target monitoring vehicle corresponding to each view angle area;
acquiring a matching time track curve of the possible collision vehicle corresponding to each target monitoring vehicle;
determining a time view angle area track, along which the view angle area moves with time, according to the target time track curve;
and predicting the predicted accident vehicle with the accident risk according to the time view angle area track and the matching time track curve.
7. The method of claim 6, wherein said predicting the predicted accident vehicle with the accident risk according to the time view angle area track and the matching time track curve comprises:
calculating the minimum inter-vehicle distance between the possible collision vehicle and the target monitoring vehicle according to the time view angle area track and the matching time track curve;
determining a risk time point with a collision risk according to the minimum inter-vehicle distance;
acquiring a position included angle between the possible collision vehicle and the target monitoring vehicle at the risk time point;
and when the position included angle and the minimum inter-vehicle distance meet the predicted collision condition, taking the possible collision vehicle and the target monitoring vehicle as predicted accident vehicles.
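Claims 6 and 7 reduce to sampling both vehicles' time track curves, locating the minimum inter-vehicle distance and the corresponding risk time point, and then testing a collision condition on that distance and the position included angle. The patent does not spell out the condition, so the sketch below uses an assumed one (vehicles too close while the other vehicle lies outside the driver's view angle interval) with illustrative thresholds; all names are placeholders.

```python
# Minimum inter-vehicle distance, risk time point, and an assumed collision test.
import math
from typing import Callable, Tuple

Point = Tuple[float, float]
TimeTrack = Callable[[float], Point]  # e.g. a lambda wrapping position_at_time from the claim 5 sketch


def minimum_gap(target: TimeTrack, other: TimeTrack,
                horizon_s: float = 10.0, dt_s: float = 0.1) -> Tuple[float, float]:
    """Return (minimum inter-vehicle distance, risk time point) over the horizon."""
    best_d, best_t, t = float("inf"), 0.0, 0.0
    while t <= horizon_s:
        (x1, y1), (x2, y2) = target(t), other(t)
        d = math.hypot(x2 - x1, y2 - y1)
        if d < best_d:
            best_d, best_t = d, t
        t += dt_s
    return best_d, best_t


def position_angle(target: TimeTrack, other: TimeTrack, t_s: float, dt_s: float = 0.1) -> float:
    """Bearing of the other vehicle relative to the target vehicle's direction of travel."""
    (x1, y1), (x1n, y1n) = target(t_s), target(t_s + dt_s)
    heading = math.atan2(y1n - y1, x1n - x1)
    x2, y2 = other(t_s)
    rel = math.atan2(y2 - y1, x2 - x1) - heading
    return (rel + math.pi) % (2 * math.pi) - math.pi   # normalise to [-pi, pi)


def is_predicted_accident(min_gap_m: float, angle_rad: float,
                          view_interval: Tuple[float, float],
                          gap_threshold_m: float = 3.0) -> bool:
    """Assumed condition: closer than the threshold while outside the view angle interval."""
    return min_gap_m < gap_threshold_m and not (view_interval[0] <= angle_rad <= view_interval[1])
```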
8. A road monitoring device, comprising:
the image acquisition module is used for acquiring intersection image information acquired by the unmanned aerial vehicle arranged at the target intersection when the preset moment is reached;
the track prediction module is used for determining the predicted running track of each vehicle to be monitored according to the intersection image information;
the visual field determining module is used for determining the view angle interval of the driver of each vehicle to be monitored according to the intersection image information;
the accident prediction module is used for predicting a predicted accident vehicle with accident risk according to the view angle interval and the predicted running track;
and the automatic reminding module is used for scheduling the unmanned aerial vehicle to remind a driver corresponding to the predicted accident vehicle so as to prevent traffic accidents.
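For completeness, a hypothetical layout of the modules listed in claim 8; each attribute is just a stand-in for the corresponding step sketched under claims 1 to 7, and none of the names come from the patent.

```python
# Illustrative container mirroring the modules of claim 8 (names assumed).
from dataclasses import dataclass
from typing import Callable


@dataclass
class RoadMonitoringDevice:
    image_acquisition: Callable    # intersection image at the preset moment
    track_prediction: Callable     # predicted running track per vehicle
    view_determination: Callable   # driver's view angle interval per vehicle
    accident_prediction: Callable  # predicted accident vehicles
    auto_reminder: Callable        # schedule the drone to remind the driver
```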
9. Road monitoring equipment, characterized in that the equipment comprises: a memory, a processor, and a road monitoring program stored on the memory and operable on the processor, the road monitoring program being configured to implement the road monitoring method of any one of claims 1 to 7.
10. A storage medium having stored thereon a road monitoring program which, when executed by a processor, implements a road monitoring method according to any one of claims 1 to 7.
CN202210720750.0A 2022-06-23 2022-06-23 Road monitoring method, device, equipment and storage medium Active CN115100863B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210720750.0A CN115100863B (en) 2022-06-23 2022-06-23 Road monitoring method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210720750.0A CN115100863B (en) 2022-06-23 2022-06-23 Road monitoring method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN115100863A (en) 2022-09-23
CN115100863B (en) 2023-03-24

Family

ID=83293608

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210720750.0A Active CN115100863B (en) 2022-06-23 2022-06-23 Road monitoring method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115100863B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005125828A (en) * 2003-10-21 2005-05-19 Fujitsu Ten Ltd Vehicle surrounding visually confirming system provided with vehicle surrounding visually confirming device
JP2009086788A (en) * 2007-09-28 2009-04-23 Hitachi Ltd Vehicle surrounding monitoring device
CN104021370A (en) * 2014-05-16 2014-09-03 浙江传媒学院 Driver state monitoring method based on vision information fusion and driver state monitoring system based on vision information fusion
JP2015225366A (en) * 2014-05-26 2015-12-14 株式会社リコー Accident prevention system, accident prevention device, and accident prevention method
FR3024687A1 (en) * 2015-03-12 2016-02-12 Jean Claude Galland METHOD AND DEVICE FOR IMPROVING THE VIEW OF THE DRIVER OF A MOTOR VEHICLE

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117351706A (en) * 2023-10-07 2024-01-05 广东艾百智能科技有限公司 Highway number monitoring and data closed-loop analysis system

Also Published As

Publication number Publication date
CN115100863B (en) 2023-03-24

Similar Documents

Publication Publication Date Title
CN112965502B (en) Visual tracking confirmation method, device, equipment and storage medium
US11380193B2 (en) Method and system for vehicular-related communications
EP3361466B1 (en) Risk-based driver assistance for approaching intersections of limited visibility
US20190340522A1 (en) Event prediction system, event prediction method, recording media, and moving body
CN113366497B (en) Agent Prioritization for Autonomous Vehicles
US10800455B2 (en) Vehicle turn signal detection
EP2990290B1 (en) Method and system for post-collision manoeuvre planning and vehicle equipped with such system
EP2201496B1 (en) Inattentive state determination device and method of determining inattentive state
US11945435B2 (en) Devices and methods for predicting collisions and/or intersection violations
JP5278419B2 (en) Driving scene transition prediction device and vehicle recommended driving operation presentation device
US11003925B2 (en) Event prediction system, event prediction method, program, and recording medium having same recorded therein
GB2545321A (en) Predicting vehicle movements based on driver body language
JP5691237B2 (en) Driving assistance device
CN108961839A (en) Driving lane change method and device
US20210312193A1 (en) Devices and methods for predicting intersection violations and/or collisions
CN110936960A (en) Driving assisting method and system
CN115100863B (en) Road monitoring method, device, equipment and storage medium
CN117022323A (en) Intelligent driving vehicle behavior analysis and prediction system and method
JP6811429B2 (en) Event prediction system, event prediction method, program, and mobile
CN113428160B (en) Dangerous scene prediction method, device and system, electronic equipment and storage medium
US20210309221A1 (en) Devices and methods for determining region of interest for object detection in camera images
JP2019128697A (en) Information processing method and information processing device
Minoura et al. Driving support by estimating vehicle behavior
CN116168542B (en) Early warning method and system based on behavior monitoring of large vehicle
CN115139999B (en) Vehicle and pedestrian anti-collision control method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant