CN111105644A - Vehicle blind area monitoring and driving control method and device and vehicle road cooperative system - Google Patents


Info

Publication number
CN111105644A
Authority
CN
China
Prior art keywords
vehicle, blind area, road, driving, information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911158440.9A
Other languages
Chinese (zh)
Inventor
张帆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
JD Digital Technology Holdings Co Ltd
Original Assignee
JD Digital Technology Holdings Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by JD Digital Technology Holdings Co Ltd filed Critical JD Digital Technology Holdings Co Ltd
Priority to CN201911158440.9A
Publication of CN111105644A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems
    • G08G1/167: Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60Q: ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q9/00: Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
    • G08G1/09: Arrangements for giving variable traffic instructions
    • G08G1/0962: Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967: Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096708: Systems involving transmission of highway information where the received information might be used to generate an automatic action on the vehicle control
    • G08G1/096725: Systems involving transmission of highway information where the received information generates an automatic action on the vehicle control
    • G08G1/096766: Systems involving transmission of highway information where the system is characterised by the origin of the information transmission
    • G08G1/096783: Systems involving transmission of highway information where the origin of the information is a roadside individual element
    • G08G1/0968: Systems involving transmission of navigation instructions to the vehicle

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Human Computer Interaction (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application relates to a vehicle blind area monitoring and driving control method, a device and a vehicle road cooperative system, wherein the vehicle blind area monitoring method comprises the following steps: acquiring a road image and a blind area corresponding to a vehicle; acquiring a first motion trail of a first object on the road according to the road image; determining a second object which conflicts with the vehicle in the blind area in a preset time period according to the first motion track; and generating blind area monitoring information corresponding to the vehicle according to the second motion track of the second object. According to the technical scheme, the blind area monitoring information is generated according to the traffic conflict situation in the vehicle blind area, and a vehicle driver or an automatic driving vehicle can make a driving decision according to the blind area monitoring information, so that the driving safety of the road is improved, and the traffic safety of the whole road is further improved.

Description

Vehicle blind area monitoring and driving control method and device and vehicle road cooperative system
Technical Field
The application relates to the technical field of automobiles, in particular to a vehicle blind area monitoring and driving control method, a vehicle blind area monitoring and driving control device and a vehicle road cooperative system.
Background
With the development of technologies such as the Internet of Vehicles, 5G and cloud computing, the field of unmanned driving or automatic driving is receiving more and more attention.
Although the unmanned-vehicle industry is developing rapidly, the current technical route mainly relies on sensing and decision-making by on-board equipment, with a large computing system installed in the vehicle, and it still cannot fully meet the various requirements of automatic driving, especially the requirements on safety. In addition, the load borne by such an intelligent vehicle is extremely high, and so is its cost. Because the current unmanned vehicle must perceive and decide by itself, the requirements on its environment perception and recognition capability are extremely high.
In unmanned and automatic driving applications, predicting the future motion tracks of dynamic objects that may appear on the road, such as automobiles, pedestrians and non-motor vehicles, is important for improving the driving safety and accuracy of the vehicle. The existing prediction is generally completed by the vehicle itself: the vehicle acquires the relevant motion parameters of the dynamic objects within its detection range and predicts the future motion track of each dynamic object in combination with the current environmental conditions.
However, because the vehicle itself has a detection blind area, traffic hazards existing in the blind area cannot be found. Especially at an intersection with complex traffic conditions, the vehicle cannot sense moving objects outside the detection range of its sensors, so their motion tracks cannot be predicted, and the safety of the unmanned vehicle passing through the intersection is difficult to guarantee.
Disclosure of Invention
In order to solve the technical problems or at least partially solve the technical problems, embodiments of the present application provide a vehicle blind area monitoring and driving control method, a vehicle blind area monitoring and driving control device, and a vehicle and road coordination system.
In a first aspect, an embodiment of the present application provides a vehicle blind area monitoring method, including:
acquiring a road image and a blind area corresponding to a vehicle;
acquiring a first motion trail of a first object on the road according to the road image;
determining a second object which conflicts with the vehicle in the blind area in a preset time period according to the first motion track;
and generating blind area monitoring information corresponding to the vehicle according to the second motion track of the second object.
Optionally, the acquiring the road image includes:
acquiring a driving route of the vehicle;
and acquiring a road image corresponding to the driving route.
Optionally, the acquiring of the road image and the blind area corresponding to the vehicle includes:
acquiring a driving route of the vehicle;
determining a blind area of the vehicle according to the driving route;
and acquiring a road image corresponding to the blind area.
Optionally, the obtaining of the blind area corresponding to the vehicle includes:
acquiring a driving route and attribute information of the vehicle;
acquiring road condition information corresponding to the driving route;
and determining a blind area corresponding to at least one driving position of the vehicle on the driving route according to the attribute information and the road condition information.
Optionally, the obtaining of the blind area corresponding to the vehicle includes:
acquiring a driving route of the vehicle and the selected blind area type;
acquiring road condition information corresponding to the driving route;
and determining a blind area corresponding to at least one driving position of the vehicle on the driving route according to the blind area type and the road condition information.
Optionally, the determining, according to the first motion trajectory, a second object that conflicts with the vehicle traveling in the blind area within a preset time period includes:
acquiring a driving route, a driving speed and vehicle position information of the vehicle;
determining a driving track of the vehicle on the road according to the driving route;
determining an intersection point of the first motion track and the driving track in the blind area;
determining first time when the vehicle reaches the intersection point according to the running speed and the vehicle position information, and determining second time when the first object reaches the intersection point according to the first motion track;
and when the difference value between the first time and the second time is smaller than or equal to a preset threshold value, determining the first object as a second object which conflicts with the vehicle in the blind area in the preset time period.
Optionally, the method further includes:
and when the vehicle meets the preset reminding condition, sending the blind area monitoring information to a terminal corresponding to the vehicle.
In a second aspect, an embodiment of the present application provides a vehicle travel control method, including:
receiving blind area monitoring information corresponding to a vehicle, wherein the blind area monitoring information is generated according to the embodiment of the vehicle blind area monitoring method;
when the vehicle is determined to have a driving conflict in the blind area of the vehicle according to the driving information of the vehicle and the blind area monitoring information, generating driving conflict information;
and carrying out running control according to the running conflict information.
In a third aspect, an embodiment of the present application provides a vehicle blind area monitoring device, including:
the first acquisition module is used for acquiring a road image and a blind area corresponding to a vehicle;
the second acquisition module is used for acquiring a first motion track of a first object on the road according to the road image;
the determining module is used for determining a second object which conflicts with the vehicle in the blind area in a preset time period according to the first motion track;
and the generating module is used for generating blind area monitoring information corresponding to the vehicle according to the second motion track of the second object.
In a fourth aspect, an embodiment of the present application provides a travel control apparatus including:
the system comprises a receiving module, a judging module and a judging module, wherein the receiving module is used for receiving blind area monitoring information corresponding to a vehicle, and the blind area monitoring information is generated according to the embodiment of the vehicle blind area monitoring method;
the generating module is used for generating driving conflict information when the driving conflict of the vehicle in the blind area of the vehicle is determined according to the driving information of the vehicle and the blind area monitoring information;
and the control module is used for carrying out running control according to the running conflict information.
In a fifth aspect, an embodiment of the present application provides a vehicle-road coordination system, including: the camera device and the calculating device are arranged on a road;
the camera device is used for shooting the road and sending the shot road image to the computing device;
the computing device is used for acquiring a road image and a blind area corresponding to a vehicle; acquiring a first motion trail of a first object on the road according to the road image; determining a second object which conflicts with the vehicle in the blind area in a preset time period according to the first motion track; and generating blind area monitoring information corresponding to the vehicle according to the second motion track of the second object.
Optionally, the system further comprises: a vehicle-mounted terminal located on a vehicle;
the computing device is used for sending the blind area monitoring information to the vehicle-mounted terminal;
the vehicle-mounted terminal is used for receiving blind area monitoring information corresponding to the vehicle; when the vehicle is determined to have a driving conflict in the blind area of the vehicle according to the driving information of the vehicle and the blind area monitoring information, generating driving conflict information; and carrying out running control according to the running conflict information.
In a sixth aspect, an embodiment of the present application provides an electronic device, including: the system comprises a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory complete mutual communication through the communication bus;
the memory is used for storing a computer program;
the processor is configured to implement the above method steps when executing the computer program.
In a seventh aspect, an embodiment of the present application provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the above method steps.
Compared with the prior art, the technical scheme provided by the embodiment of the application has the following advantages:
the method comprises the steps of monitoring objects on a road by shooting road images, determining the motion tracks of the objects, judging whether the objects conflict with vehicles in the range of blind areas of the vehicles or not, and generating corresponding blind area monitoring information if the traffic conflicts are possible. The blind area monitoring information can be sent to the vehicle-mounted terminal for traffic reminding according to preset conditions, and a vehicle driver or an automatic driving vehicle can make a driving decision according to the blind area monitoring information, so that the driving safety of the road is improved, and the traffic safety of the whole road is further improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without inventive exercise.
Fig. 1 is a block diagram of a road side portion based on a vehicle-road cooperative system according to an embodiment of the present application;
fig. 2 is a schematic deployment diagram of a roadside system based on vehicle-road cooperation according to an embodiment of the present application;
fig. 3 is a schematic deployment diagram of a roadside system based on vehicle-road coordination according to another embodiment of the present application;
fig. 4 is a flowchart of a vehicle blind area monitoring method according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a blind area option interface provided in an embodiment of the present application;
FIG. 6 is a flowchart of a vehicle blind spot monitoring method according to another embodiment of the present application;
fig. 7 is a flowchart of a vehicle blind area monitoring method according to another embodiment of the present application;
fig. 8 is a flowchart of a vehicle driving control method according to an embodiment of the present application;
fig. 9 is a block diagram of a vehicle blind area monitoring device according to an embodiment of the present disclosure;
fig. 10 is a block diagram of a travel control apparatus according to an embodiment of the present application;
fig. 11 is a block diagram of a vehicle-road coordination system according to an embodiment of the present disclosure;
fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The method of the embodiment of the application is mainly applied to a vehicle-road cooperative system.
Vehicle-road cooperation refers to connecting all elements in a traffic system, namely all vehicles and the roadside infrastructure, through wireless communication, so as to form a complete system that provides dynamic information sharing. The roadside part of the vehicle-road cooperation system collects traffic information on the road, performs recognition processing with edge computing equipment, and provides more comprehensive and accurate auxiliary information to vehicles in time.
The vehicle-road cooperation system of the embodiment of the application includes an imaging device 10 and a computing device 20 which are provided on a road. Wherein the computing device 20 comprises: an edge computing device 21 and a central computing device 22.
Fig. 1 is a block diagram of a road side portion based on a vehicle-road cooperative system according to an embodiment of the present application. As shown in fig. 1, at least one camera device 10 is arranged on the road every first preset length to photograph the road section of that first preset length; at least two camera devices 10 are connected to one edge computing device 21; and a first preset number of edge computing devices 21 are connected to one central computing device 22.
And the camera device 10 is used for uploading the shot image to the edge calculation device connected with the camera device. The edge computing device 21 is used for performing recognition processing on the image and sending the recognition result to a central computing device connected with the edge computing device. And the central computing device 22 is used for carrying out data processing according to the identification result. The edge computing device 21 may be an edge computing industrial personal computer, and the central computing device 22 may be an edge computing workstation.
Fig. 2 is a schematic deployment diagram of a road side system based on vehicle-road cooperation according to an embodiment of the present application. As shown in fig. 2, on an expressway, at least one image pickup device 10 is provided every first preset length of road to photograph the road section of that first preset length. At least two camera devices 10 are connected to one edge computing device 21. A first preset number of edge computing devices 21 are connected to one central computing device 22.
For example, 1 image pickup device 10 may be provided at each end of a 100-meter road segment, and the 2 image pickup devices 10 photograph the 100-meter section from opposite directions. The 2 image pickup devices 10 are connected to 1 edge computing device 21, and 5 edge computing devices 21 are connected to 1 central computing device 22.
The camera devices 10 and the edge computing devices 21 are connected to a Power over Ethernet switch 41, and the central computing device 22 is connected to a core Power over Ethernet switch 42.
The roadside system further includes: the firewall device 50, the edge computing device 21, and the central computing device 22 are connected to a cloud server on the network side through the firewall device 50.
Fig. 3 is a schematic deployment diagram of a roadside system based on vehicle-road cooperation according to another embodiment of the present application. As shown in fig. 3, at least two image pickup devices 10 are provided on each side of the intersection, and the image pickup devices 10 photograph toward the intersection. The camera devices 10 arranged on each side are connected to one edge computing device 21, and each edge computing device 21 is connected to 1 central computing device 22.
For example, 2 cameras 10 are provided on each side of the intersection, and the 2 cameras on each side are connected to one edge computing device 21. The intersection is thus provided with 4 edge computing devices 21 in total, and all 4 edge computing devices 21 are connected to 1 central computing device 22.
The image pickup device 10 and the edge computing device 21 are connected to a power over ethernet switch 41, and the central computing device 22 is connected to a core power over ethernet switch 42.
The edge computing device 21 and the central computing device 22 may be connected to a cloud server, and upload an image recognition result or a data processing result to the cloud server, or receive an instruction or data sent by the cloud server.
First, a vehicle blind area monitoring method provided by the embodiment of the invention is described below.
Fig. 4 is a flowchart of a vehicle blind area monitoring method according to an embodiment of the present application. As shown in fig. 4, the method comprises the steps of:
step S11, acquiring road images and blind areas corresponding to vehicles;
step S12, acquiring a first motion trail of a first object on the road according to the road image;
step S13, determining a second object which is in conflict with the vehicle running in the blind area within a preset time period according to the first motion trail;
and step S14, generating blind area monitoring information corresponding to the vehicle according to the second motion track of the second object.
In this embodiment, the road image is captured, objects on the road are monitored, the movement tracks of the objects are determined, whether the objects conflict with the vehicle in the blind area range of the vehicle is judged, and if the traffic conflict is possible, corresponding blind area monitoring information is generated. The blind area monitoring information can be sent to the vehicle-mounted terminal for traffic reminding according to preset conditions, and a vehicle driver or an automatic driving vehicle can make a driving decision according to the blind area monitoring information, so that the driving safety of the road is improved, and the traffic safety of the whole road is further improved.
Optionally, the first object comprises a dynamic object and/or a static object. The dynamic object includes: motor vehicles, bicycles, pedestrians, etc., static objects including: traffic lights, road barriers, obstacles to road maintenance, vehicles parked on the road in the event of a traffic accident, and the like. For dynamic objects, the object information may include: type of object (e.g., car, truck, van, bicycle, electric bike, pedestrian, etc.), size, object location, direction of movement, speed of movement, etc. For static objects, the object information may include: type of object (traffic lights, road barriers, roadblocks, vehicles, etc.), size, location, etc.
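For illustration only, the object information described above could be organized as in the following Python sketch; the class and field names are assumptions of this description, not part of the embodiment:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional, Tuple

class ObjectType(Enum):
    CAR = "car"
    TRUCK = "truck"
    BICYCLE = "bicycle"
    PEDESTRIAN = "pedestrian"
    TRAFFIC_LIGHT = "traffic_light"
    ROAD_BARRIER = "road_barrier"

@dataclass
class RoadObject:
    obj_type: ObjectType
    position: Tuple[float, float]      # road-plane coordinates in meters
    size: Tuple[float, float]          # length and width in meters
    # Motion fields are only populated for dynamic objects
    direction: Optional[float] = None  # heading in degrees
    speed: Optional[float] = None      # meters per second

    @property
    def is_dynamic(self) -> bool:
        # Static objects (signal lights, barriers, parked vehicles) carry no motion data
        return self.speed is not None
```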
For a static object located within the range of the vehicle blind area, blind area monitoring information corresponding to the vehicle is generated according to object information such as the type, position and size of the static object. For a dynamic object, its motion track can be predicted to judge whether it will enter the vehicle blind area and whether it will have a traffic conflict with the vehicle after entering the blind area; for a moving object that may have a traffic conflict with the vehicle in the vehicle blind area, blind area monitoring information corresponding to the vehicle is generated according to the motion track of the moving object. Therefore, by monitoring both dynamic and static objects, the traffic condition in the vehicle blind area can be monitored more comprehensively and accurately, a vehicle driver or an automatic driving vehicle can make an accurate driving decision according to the blind area monitoring information, traffic conflicts are avoided, the driving safety of the vehicle on the road is improved, and the safety of the whole road traffic is further improved.
In addition, in this embodiment, not only the dynamic and/or static objects already in the vehicle blind area are monitored, but all objects that may enter the vehicle blind area are monitored. For example, a pedestrian may not be in the vehicle blind area at present, but prediction of the pedestrian's motion track shows that the pedestrian may enter the vehicle blind area, so a probability of traffic conflict exists. If the pedestrian's motion track is close to or passes through the vehicle blind area, the pedestrian may, in an emergency, enter the blind area and conflict with the vehicle; such a pedestrian is therefore also monitored, and the pedestrian's motion track information is added to the reminding information for the vehicle. In this way the road monitoring range is further expanded, possible traffic conflicts in the blind area are reminded in advance, sudden traffic conflicts are avoided, the driving safety of the vehicle on the road is improved, and the safety of the whole road traffic is further improved.
In an alternative embodiment, in order to reduce the amount of computation and improve the monitoring efficiency, only the road image on the vehicle travel route may be acquired for blind area monitoring. In the above step S11, the acquiring the road image includes: acquiring a driving route of a vehicle; and acquiring a road image corresponding to the driving route.
For example, based on the traveling route information of the vehicle, it is determined that the traveling locus of the vehicle is traveling from east to west on the road and turning right at the intersection a. Therefore, it is possible to acquire only the road image of the intersection a, and the road images of the east and north sides of the intersection after the vehicle turns right.
In another alternative embodiment, in order to further reduce the amount of calculation and improve the monitoring efficiency, a blind area is determined by the vehicle travel route, and only the road image on the vehicle travel route that is associated with the blind area is acquired for blind area monitoring. The step S11 includes: acquiring a driving route of a vehicle; determining a blind area of the vehicle according to the driving route; and acquiring a road image corresponding to the blind area.
For example, based on the traveling route information of the vehicle, it is determined that the traveling locus of the vehicle is traveling from east to west on the road and turning right at the intersection a. Therefore, only the road image on the north side of the intersection where the vehicle turns right can be acquired.
In an alternative embodiment, the blind area corresponding to the vehicle may be obtained in at least one of the following ways.
In a first manner, the blind area corresponding to the vehicle is determined according to the attribute information of the vehicle itself.
In the step S11, the obtaining of the blind area corresponding to the vehicle includes the following steps:
step A1, acquiring the driving route and attribute information of the vehicle;
step A2, acquiring road condition information corresponding to a driving route;
and A3, determining a blind area corresponding to at least one driving position of the vehicle on the driving route according to the attribute information and the road condition information.
Wherein the attribute information of the vehicle itself includes: size, effective braking distance, blind zone range, visual range, etc.
For an automatic driving vehicle, detection devices such as radars are installed on the vehicle, but the number and positions of these detection devices are limited, so the surroundings of the vehicle cannot be monitored comprehensively. For example, if the radar detection distance ranges from 20 cm to 100 m from the vehicle, the vehicle blind area actually covers 0 cm to 20 cm from the vehicle as well as the region beyond 100 m. If the radar detection angle ranges from -25 degrees to 15 degrees, the vehicle blind areas cover -180 degrees to -25 degrees and 15 degrees to 180 degrees.
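The following minimal Python sketch shows how the blind-area ranges in the radar example above could be derived from a sensor's detection coverage; the function name and the way the ranges are represented are illustrative assumptions:

```python
def blind_zones(detect_min_m: float, detect_max_m: float,
                angle_min_deg: float, angle_max_deg: float):
    """Return the distance and angle intervals NOT covered by the sensor.

    Example from the text: a radar covering 0.2 m to 100 m and -25 to 15 degrees
    leaves blind distance zones of [0, 0.2 m) and (100 m, inf), and blind angle
    zones of [-180, -25) and (15, 180] degrees.
    """
    blind_distances = [(0.0, detect_min_m), (detect_max_m, float("inf"))]
    blind_angles = [(-180.0, angle_min_deg), (angle_max_deg, 180.0)]
    return blind_distances, blind_angles

# Values taken from the example in the text: 20 cm to 100 m, -25 to 15 degrees
print(blind_zones(0.2, 100.0, -25.0, 15.0))
```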
In addition, the road condition information includes: fixed objects near a certain driving position on the vehicle's driving route that block the line of sight of the driver or the detection equipment, such as buildings, trees, green belts, street lamps and the like. For example, if there is a tree on the right side of the vehicle, the tree may block the detection device on the vehicle so that objects behind the tree cannot be detected; the area behind the tree is therefore also a blind area of the vehicle.
In this embodiment, the blind area corresponding to the driving position of the vehicle can be determined by combining the road condition information corresponding to the driving route of the vehicle and the attribute information of the vehicle.
In a second manner, the blind area corresponding to the vehicle is determined according to the blind area type selected by the user.
In the step S11, the obtaining of the blind area corresponding to the vehicle includes the following steps:
step B1, acquiring the driving route of the vehicle and the selected blind area type;
step B2, acquiring road condition information corresponding to the driving route;
and step B3, determining a blind area corresponding to at least one driving position of the vehicle on the driving route according to the type of the blind area and the road condition information.
The blind area monitoring method and system can provide the user with an interface for blind area monitoring, with blind area type options on the interface, so that the user can select the types of blind areas to be monitored as needed. Fig. 5 is a schematic diagram of a blind area option interface provided in the embodiment of the present application. As shown in fig. 5, the blind area types may be divided, according to the driving scene, into a left-turn blind area, a right-turn blind area, a reversing blind area, a parking blind area, a starting blind area and the like, and the user determines the blind areas to be monitored and reminded about by selecting the corresponding options. Each blind area type may correspond to a plurality of blind areas. Optionally, all vehicle blind areas, such as a front blind area, a rear-view mirror blind area, an A/B-pillar blind area, a close-range blind area, a turning blind area, a long-range blind area and the like, may also be provided on the blind area option interface, and the user may select among them according to his own needs.
In an alternative embodiment, the road image includes: at least two road images photographed at a preset time interval, and/or at least two road images extracted at a preset time interval from a photographed road video. The above step S12 includes: processing and identifying the road image to obtain the first object.
Optionally, a moving object in the image may be identified by a three-frame difference method. First, three consecutive images are acquired and defined as image1, image2 and image3. A frame difference operation is performed on image1 and image2 to obtain a difference d1, and a frame difference operation is performed on image2 and image3 to obtain a difference d2. After smoothing and thresholding, d1 and d2 are converted into binary images. A bitwise AND operation is then performed on the two binary images to obtain the identification result.
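A rough OpenCV sketch of the three-frame difference procedure just described is given below; the blur kernel and threshold value are illustrative assumptions, and the frames are assumed to be grayscale images of equal size:

```python
import cv2

def three_frame_difference(image1, image2, image3, thresh=25):
    """Detect moving regions from three consecutive grayscale frames."""
    # Frame differences d1 = |image2 - image1|, d2 = |image3 - image2|
    d1 = cv2.absdiff(image2, image1)
    d2 = cv2.absdiff(image3, image2)
    # Smooth, then threshold each difference into a binary image
    d1 = cv2.GaussianBlur(d1, (5, 5), 0)
    d2 = cv2.GaussianBlur(d2, (5, 5), 0)
    _, b1 = cv2.threshold(d1, thresh, 255, cv2.THRESH_BINARY)
    _, b2 = cv2.threshold(d2, thresh, 255, cv2.THRESH_BINARY)
    # Bitwise AND keeps only the pixels that changed in both intervals
    motion_mask = cv2.bitwise_and(b1, b2)
    return motion_mask
```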
After the first object is identified, prediction of the motion track of the first object may be implemented by using the Kalman filter provided by the computer vision library OpenCV. The Kalman filter predicts the motion track of an object through a recursive calculation process of continuously predicting the object state and updating the state prediction based on the measurement results.
However, directly processing the actual coordinate points with the Kalman filter does not immediately yield usable predicted points. In actual use, the output of the filter first lags behind the current coordinate point (the coordinate point at the moment the image pickup device recognizes the target), then catches up with and moves ahead of the current point, falls behind again, and so on. Thus, when predicting a dynamic object, the predicted point may stay behind the current point during an initial period of time, and even if later predicted points catch up with the actual points, the target may already have changed its direction of motion or moved out of view. Therefore, directly using the Kalman filter to predict the motion of an object gives a poor prediction effect.
In the present embodiment, the following technical means are adopted to overcome the above problems.
Fig. 6 is a flowchart of a vehicle blind area monitoring method according to another embodiment of the present application. As shown in fig. 6, the first object information includes: position information, direction of motion, and speed of motion; the position information comprises actual coordinates of the first object. The method further comprises the following steps:
step S21, determining coordinate information of a first prediction point according to a preset prediction algorithm and an actual coordinate, a motion direction and a motion speed;
step S22, multiplying the difference between the actual coordinate information and the first prediction point coordinate information by a preset coefficient to obtain a product result;
step S23, adding the product result and the actual coordinate information to obtain a second predicted point coordinate;
and step S24, obtaining a first motion track of the first object according to the second predicted point coordinates.
In this embodiment, the difference between the actual point and the predicted point is multiplied by a preset coefficient and then added to the coordinates of the actual point, thereby realizing the motion prediction. During actual prediction, the predicted point may be ahead of the actual point in the initial stage, but the distance between the predicted point and the actual point is gradually shortened in the subsequent stages until the two finally coincide, so the prediction effect for dynamic objects is more accurate.
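Assuming a Kalman filter (for example OpenCV's cv2.KalmanFilter) supplies the first predicted point, steps S21 to S24 might be sketched as follows; the coefficient value and function name are illustrative assumptions:

```python
import numpy as np

def corrected_prediction(actual_xy, first_predicted_xy, coeff=0.8):
    """Steps S22-S23: second predicted point = actual + coeff * (actual - first predicted).

    When the raw filter output lags behind the actual point, the difference vector
    points along the direction of motion, so the corrected point is pushed ahead of
    the actual point; as the filter converges the correction shrinks to zero.
    """
    actual = np.asarray(actual_xy, dtype=float)
    first_predicted = np.asarray(first_predicted_xy, dtype=float)
    return actual + coeff * (actual - first_predicted)

# Example: the filter output lags 2 m behind an object at (10, 0) moving along x
print(corrected_prediction((10.0, 0.0), (8.0, 0.0)))  # -> [11.6  0. ]
```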
In this embodiment, for the identification, detection and trajectory prediction of objects in step S12, an evaluation data set for computer vision algorithms in automatic driving scenes, such as the KITTI data set, may be referred to. The data set is used to evaluate the performance of computer vision technologies such as stereo images (stereo), optical flow (optical flow), visual odometry (visual odometry), 3D object detection (object detection) and 3D tracking (tracking) in a vehicle-mounted environment.
Fig. 7 is a flowchart of a vehicle blind spot monitoring method according to another embodiment of the present application. As shown in fig. 7, step S13 includes the steps of:
step S31, acquiring the driving route, the driving speed and the vehicle position information of the vehicle;
step S32, determining the driving track of the vehicle on the road according to the driving route;
step S33, determining the intersection point of the first motion track and the driving track in the blind area;
step S34, determining a first time when the vehicle reaches the intersection point according to the running speed and the vehicle position information, and determining a second time when the first object reaches the intersection point according to the first motion track;
and step S35, when the difference value between the first time and the second time is less than or equal to the preset threshold value, determining the first object as a second object which conflicts with the vehicle running in the blind area within the preset time period.
Alternatively, the preset time period in step S13 is a time period in the future from the current time, and the preset time period may be determined according to the information of the position, the speed, the direction, and the like of the first object.
For example, the monitoring area of the imaging device is an intersection, the first object is a pedestrian, the first object is located on the north side of the intersection at present, the pedestrian passes through the road from east to west, the walking speed of the pedestrian is 1 m/s, and the road width is 35 m. The pedestrian may enter the vehicle blind area in the road crossing process and conflict with the vehicle in traffic, and the time of the pedestrian passing the road is 35 seconds, so that the preset time period can be determined to be within 35 seconds from the current time. Or, considering that an emergency may occur during the pedestrian passing through the road, the time for the pedestrian to pass through the road is prolonged, and the preset time period may be prolonged to be within 40 seconds in the future from the current time.
Optionally, the preset time period in step S13 may also be determined according to the signal lamp timing information. If the pedestrian possibly enters the vehicle blind area in the road crossing process and collides with the vehicle in traffic, and the green time of a sidewalk signal lamp is 40 seconds, the preset time period can be set to be within 40 seconds from the current time; or, the red light time of the traffic lane signal lights in the south and north directions of the intersection is 60 seconds, and the preset time period can be prolonged to be within 60 seconds in the future from the current time. According to practical situations, the determination of the preset time period may be any one of the above manners or a combination of a plurality of manners, and may also be determined by other manners, which is not described herein again.
In this embodiment, the times at which the vehicle and the first object respectively reach the intersection point in the blind area may be calculated according to their motion tracks. If the time difference is less than or equal to a preset threshold, for example 10 seconds, the vehicle and the first object may have a traffic conflict, and the driver of the vehicle needs to be reminded at this time, so that the vehicle driver or the autonomous vehicle can make an accurate driving decision according to the traffic conflict situation, thereby avoiding the occurrence of the traffic conflict, improving the driving safety of the vehicle on the road, and further improving the safety of the whole road traffic.
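A simplified Python sketch of the conflict judgment in steps S31 to S35 follows; it assumes straight-line motion at constant speed, and the function names, example coordinates and preset values are illustrative assumptions rather than part of the embodiment:

```python
import math

def arrival_time(position, point, speed_mps):
    """Time (s) to reach a point, assuming straight-line motion at constant speed."""
    distance = math.dist(position, point)
    return distance / speed_mps if speed_mps > 0 else math.inf

def has_blind_zone_conflict(vehicle_pos, vehicle_speed,
                            object_pos, object_speed,
                            intersection_point,
                            preset_period_s=35.0, threshold_s=10.0):
    """Steps S34-S35: compare arrival times at the intersection of the two tracks.

    The intersection point is assumed to have been found inside the blind area
    (step S33). The object counts as a conflicting "second object" if both
    arrivals fall inside the preset period and their difference is within the
    threshold.
    """
    t_vehicle = arrival_time(vehicle_pos, intersection_point, vehicle_speed)  # first time
    t_object = arrival_time(object_pos, intersection_point, object_speed)     # second time
    within_period = max(t_vehicle, t_object) <= preset_period_s
    return within_period and abs(t_vehicle - t_object) <= threshold_s

# Example inspired by the text: pedestrian at 1 m/s, 15 m from the conflict point;
# vehicle at 10 m/s, 100 m away; arrival times 15 s and 10 s, difference 5 s -> conflict
print(has_blind_zone_conflict((100.0, 0.0), 10.0, (0.0, 15.0), 1.0, (0.0, 0.0)))
```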
Optionally, the probability of collision between the second object and the vehicle may be calculated, the second object with higher probability is selected, and the blind area monitoring information is generated according to the second motion track.
Optionally, the method further includes: when the vehicle accords with the preset reminding condition, the blind area monitoring information is sent to the terminal corresponding to the vehicle.
For example, it may be set that when there is a traffic collision in a blind area, the blind area monitoring information is transmitted to a terminal corresponding to the vehicle. Or the blind area monitoring information is fed back to the terminal sending the request blind area monitoring information, or the corresponding terminal of the vehicle is subscribed with the service for obtaining the blind area monitoring information, so that the corresponding blind area monitoring information can be sent in real time according to the position or the route of the vehicle.
In this embodiment, the method further includes: when it is determined according to the first motion track that a second object conflicting with the vehicle exists in the blind area, the state of a signal lamp may be controlled according to the conflict.
For example, when it is found that a driving conflict may occur between the vehicle and a pedestrian crossing the road when the vehicle turns right at the intersection, the right-turn signal lamp may be controlled to be in a red state when the vehicle reaches the intersection, and the time for which the right-turn signal lamp maintains the red state may be determined according to the time required for the pedestrian to cross the road; when that time is reached, the right-turn signal lamp is controlled to turn green so that the vehicle can turn right. Traffic conflicts are thereby avoided and traffic safety on the road is improved.
The following describes a vehicle travel control method provided by an embodiment of the present invention.
Fig. 8 is a flowchart of a vehicle driving control method according to an embodiment of the present application. As shown in fig. 8, the method is applied to a vehicle-mounted terminal, and comprises the following steps:
step S41, receiving blind area monitoring information corresponding to the vehicle, wherein the blind area monitoring information is generated according to the embodiment of the vehicle blind area monitoring method;
step S42, when determining that the vehicle has a driving conflict in the blind area of the vehicle according to the driving information of the vehicle and the blind area monitoring information, generating driving conflict information;
in step S43, the driving control is performed based on the driving collision information.
The blind area monitoring information may include: the second motion tracks of all second objects that have a traffic conflict with the vehicle, or the second motion tracks of the second objects that have a relatively high probability of traffic conflict with the vehicle. According to the blind area monitoring information and the driving information of the vehicle, such as the driving route, driving speed and position information, the vehicle-mounted terminal determines whether the vehicle has a driving conflict on the road, and generates the driving conflict information when a conflict exists.
The driving conflict information may include: information indicating that, at the current driving speed, position and driving route, the vehicle may have a traffic conflict with other vehicles, pedestrians, etc. at a certain intersection, as well as the speed, driving direction, etc. of the vehicles or pedestrians involved in the conflict.
Based on the driving conflict information, corresponding driving control operations can be executed. For example, for a vehicle driven by a driver, a reminder can be given according to the driving conflict information, such as generating a new driving route and recommending that the driver change route, or recommending that the driver change the driving speed; for an autonomous vehicle, the driving speed may be automatically adjusted or the driving route may be changed, etc., according to the driving conflict information.
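The terminal-side control is not prescribed in detail by the embodiment; the following sketch merely illustrates one possible policy, with all names, thresholds and actions being assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class DrivingConflict:
    intersection_id: str
    other_object: str        # e.g. "pedestrian", "vehicle"
    time_gap_s: float        # predicted arrival-time difference at the conflict point

def travel_control(conflict: DrivingConflict, autonomous: bool) -> str:
    """Return a control action for an autonomous vehicle, or a reminder for a driver."""
    if autonomous:
        # An automated vehicle may slow down or re-route based on the conflict information
        if conflict.time_gap_s < 3.0:
            return "decelerate and yield before " + conflict.intersection_id
        return "reduce speed and monitor " + conflict.other_object
    # A human driver only receives a reminder (e.g. a voice or on-screen message)
    return (f"Warning: possible conflict with a {conflict.other_object} "
            f"at {conflict.intersection_id}; consider changing speed or route")

print(travel_control(DrivingConflict("intersection A", "pedestrian", 2.0), autonomous=True))
```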
In this embodiment, the vehicle-mounted terminal performs travel control according to the blind area monitoring information, so that the safety of road travel is improved, and the safety of the whole road traffic is further improved.
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application.
Fig. 9 is a block diagram of a vehicle blind area monitoring device according to an embodiment of the present disclosure, which may be implemented as part or all of an electronic device through software, hardware, or a combination of the two. As shown in fig. 9, the vehicle blind area monitoring device includes:
the first obtaining module 51 is configured to obtain a road image and a blind area corresponding to a vehicle;
the second obtaining module 52 is configured to obtain a first motion trajectory of a first object on the road according to the road image;
a determining module 53, configured to determine, according to the first motion trajectory, a second object that conflicts with the vehicle traveling in the blind area;
and the generating module 54 is configured to generate blind area monitoring information corresponding to the vehicle according to the second motion track of the second object.
Fig. 10 is a block diagram of a driving control device provided in an embodiment of the present application, which may be implemented as part or all of an electronic device by software, hardware, or a combination of the two. As shown in fig. 10, the travel control device includes:
the receiving module 61 is configured to receive blind area monitoring information corresponding to a vehicle, where the blind area monitoring information is generated according to the vehicle blind area monitoring method embodiment;
the generating module 62 is configured to generate travel conflict information when it is determined that a vehicle has a travel conflict in a blind area of the vehicle according to the travel information of the vehicle and the blind area monitoring information;
and the control module 63 is used for carrying out running control according to the running conflict information.
A vehicle-road coordination system provided in the embodiment of the present application is specifically described below.
Fig. 11 is a block diagram of a vehicle-road coordination system according to an embodiment of the present application, and as shown in fig. 11, the system includes: an image capture device 10 and a computing device 20. The image pickup device 10 is used for shooting roads and sending the shot road images to the computing device 20. The calculating device 20 is used for acquiring a road image and a blind area corresponding to the vehicle; acquiring a first motion trail of a first object on a road according to the road image; determining a second object which conflicts with the vehicle running in a blind area within a preset time period according to the first motion track; and generating blind area monitoring information corresponding to the vehicle according to the second motion track of the second object.
Optionally, the system further comprises an in-vehicle terminal 30 located on the vehicle. And the computing device 20 is used for sending the blind area monitoring information to the vehicle-mounted terminal 30. The vehicle-mounted terminal 30 is used for receiving blind area monitoring information corresponding to the vehicle; when the vehicle is determined to have a driving conflict in the blind area of the vehicle according to the driving information of the vehicle and the blind area monitoring information, generating driving conflict information; and performing running control according to the running conflict information.
As shown in fig. 11, in the present embodiment, the computing device 20 may include: an edge computing device 21 deployed on the road.
The edge calculating device 21 is used for acquiring a road image and a blind area corresponding to a vehicle; acquiring a first motion trail of a first object on a road according to the road image; determining a second object which conflicts with the vehicle running in the blind area according to the first motion track; and generating blind area monitoring information corresponding to the vehicle according to the second motion track of the second object.
As shown in fig. 11, the computing device 20 may further include: and a cloud server 23 disposed on the network side. The cloud server 23 is used for acquiring a driving route and attribute information of the vehicle; acquiring road condition information corresponding to a driving route; and determining a blind area corresponding to at least one driving position of the vehicle on the driving route according to the attribute information and the road condition information. Or, the cloud server 23 is configured to obtain a driving route of the vehicle and the selected blind area type; acquiring road condition information corresponding to a driving route; and determining a blind area corresponding to at least one driving position of the vehicle on the driving route according to the type of the blind area and the road condition information. The cloud server 23 transmits the blind area corresponding to the vehicle to the edge computing device 21.
An embodiment of the present application further provides an electronic device, as shown in fig. 12, the electronic device may include: the system comprises a processor 1501, a communication interface 1502, a memory 1503 and a communication bus 1504, wherein the processor 1501, the communication interface 1502 and the memory 1503 complete communication with each other through the communication bus 1504.
A memory 1503 for storing a computer program;
the processor 1501, when executing the computer program stored in the memory 1503, implements the steps of the method embodiments described below.
The communication bus mentioned in connection with the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in the figure, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The Memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the processor.
The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
The present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, carries out the steps of the above method embodiments.
It should be noted that, for the above-mentioned apparatus, electronic device and computer-readable storage medium embodiments, since they are basically similar to the method embodiments, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiments.
It is further noted that, herein, relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The foregoing are merely exemplary embodiments of the present invention, which enable those skilled in the art to understand or practice the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (14)

1. A vehicle blind zone monitoring method, comprising:
acquiring a road image and a blind area corresponding to a vehicle;
acquiring a first motion trail of a first object on the road according to the road image;
determining a second object which conflicts with the vehicle in the blind area in a preset time period according to the first motion track;
and generating blind area monitoring information corresponding to the vehicle according to the second motion track of the second object.
2. The method of claim 1, wherein the acquiring the road image comprises:
acquiring a driving route of the vehicle;
and acquiring a road image corresponding to the driving route.
3. The method of claim 1, wherein the obtaining of the road image and the blind area corresponding to the vehicle comprises:
acquiring a driving route of the vehicle;
determining a blind area of the vehicle according to the driving route;
and acquiring a road image corresponding to the blind area.
4. The method of claim 1, wherein the obtaining the corresponding blind zone of the vehicle comprises:
acquiring a driving route and attribute information of the vehicle;
acquiring road condition information corresponding to the driving route;
and determining a blind area corresponding to at least one driving position of the vehicle on the driving route according to the attribute information and the road condition information.
5. The method of claim 1, wherein the obtaining the corresponding blind zone of the vehicle comprises:
acquiring a driving route of the vehicle and the selected blind area type;
acquiring road condition information corresponding to the driving route;
and determining a blind area corresponding to at least one driving position of the vehicle on the driving route according to the blind area type and the road condition information.
6. The method according to claim 1, wherein the determining, according to the first motion trajectory, of the second object that conflicts with the vehicle in the blind area within the preset time period comprises:
acquiring a driving route, a driving speed and vehicle position information of the vehicle;
determining a driving trajectory of the vehicle on the road according to the driving route;
determining an intersection point of the first motion trajectory and the driving trajectory in the blind area;
determining a first time at which the vehicle reaches the intersection point according to the driving speed and the vehicle position information, and determining a second time at which the first object reaches the intersection point according to the first motion trajectory;
and when the difference between the first time and the second time is less than or equal to a preset threshold, determining the first object as the second object that conflicts with the vehicle in the blind area within the preset time period.
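A purely illustrative sketch of the claim-6 test follows, assuming straight-line segments for both the driving trajectory and the first motion trajectory, a polygonal blind area, and a constant object speed standing in for the timing carried by the first motion trajectory; the 2-second threshold and all function names are invented for the example.

```python
import math
from typing import List, Optional, Tuple

Point = Tuple[float, float]

def segment_intersection(p1: Point, p2: Point, q1: Point, q2: Point) -> Optional[Point]:
    """Intersection point of segments p1-p2 and q1-q2, or None if they do not cross."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, q1, q2
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(denom) < 1e-9:
        return None                       # parallel or collinear segments
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    u = ((x1 - x3) * (y1 - y2) - (y1 - y3) * (x1 - x2)) / denom
    if 0.0 <= t <= 1.0 and 0.0 <= u <= 1.0:
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None

def point_in_polygon(pt: Point, polygon: List[Point]) -> bool:
    """Ray-casting test: True if pt lies inside the polygon."""
    x, y = pt
    inside = False
    for i in range(len(polygon)):
        (x1, y1), (x2, y2) = polygon[i], polygon[(i + 1) % len(polygon)]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def is_second_object(vehicle_pos: Point, vehicle_end: Point, vehicle_speed: float,
                     object_start: Point, object_end: Point, object_speed: float,
                     blind_area: List[Point], threshold_s: float = 2.0) -> bool:
    """The object is a 'second object' if the two tracks cross inside the blind area
    and the difference between the two arrival times is within the threshold."""
    cross = segment_intersection(vehicle_pos, vehicle_end, object_start, object_end)
    if cross is None or not point_in_polygon(cross, blind_area):
        return False
    t_vehicle = math.dist(vehicle_pos, cross) / max(vehicle_speed, 1e-6)   # first time
    t_object = math.dist(object_start, cross) / max(object_speed, 1e-6)    # second time
    return abs(t_vehicle - t_object) <= threshold_s
```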
7. The method of claim 1, further comprising:
and when the vehicle meets a preset reminding condition, sending the blind area monitoring information to a terminal corresponding to the vehicle.
8. A vehicle driving control method, comprising:
receiving blind area monitoring information corresponding to a vehicle, the blind area monitoring information being generated according to the method of any one of claims 1 to 7;
generating driving conflict information when it is determined, according to driving information of the vehicle and the blind area monitoring information, that the vehicle has a driving conflict in the blind area of the vehicle;
and performing driving control according to the driving conflict information.
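A minimal sketch, under assumed message formats, of how a vehicle-side controller might act on the monitoring information described in claim 8; the DrivingInfo fields, the speed cut-off and the action labels are all hypothetical and not taken from the disclosure.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class DrivingInfo:
    speed_mps: float                 # current vehicle speed
    position: tuple                  # (x, y) in the same road-plane frame as the monitoring info

def handle_blind_area_monitoring_info(driving_info: DrivingInfo,
                                      monitoring_info: List[Dict]) -> Dict:
    """Decide whether a driving conflict exists in the blind area and return
    driving-conflict information together with a simple control action."""
    if not monitoring_info:
        return {"conflict": False, "action": "keep_speed"}
    # Any reported second object is treated as a potential conflict in the blind area;
    # a real controller would re-check its timing against the vehicle's own state.
    conflict_info = {
        "conflict": True,
        "objects": [m["object_id"] for m in monitoring_info],
        "ego_speed_mps": driving_info.speed_mps,
    }
    # Graded response, for illustration only: warn at low speed, otherwise decelerate.
    conflict_info["action"] = "warn" if driving_info.speed_mps < 3.0 else "decelerate"
    return conflict_info
```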
9. A vehicle blind area monitoring device, comprising:
a first acquisition module, used for acquiring a road image and a blind area corresponding to a vehicle;
a second acquisition module, used for acquiring a first motion trajectory of a first object on the road according to the road image;
a determining module, used for determining, according to the first motion trajectory, a second object that conflicts with the vehicle in the blind area within a preset time period;
and a generating module, used for generating blind area monitoring information corresponding to the vehicle according to a second motion trajectory of the second object.
10. A driving control device, comprising:
a receiving module, configured to receive blind area monitoring information corresponding to a vehicle, where the blind area monitoring information is generated according to the method of any one of claims 1 to 7;
a generating module, configured to generate driving conflict information when it is determined, according to driving information of the vehicle and the blind area monitoring information, that the vehicle has a driving conflict in the blind area of the vehicle;
and a control module, configured to perform driving control according to the driving conflict information.
11. A vehicle-road coordination system, comprising: a camera device and a computing device arranged on a road;
wherein the camera device is used for capturing images of the road and sending the captured road image to the computing device;
and the computing device is used for acquiring a road image and a blind area corresponding to a vehicle; acquiring a first motion trajectory of a first object on the road according to the road image; determining, according to the first motion trajectory, a second object that conflicts with the vehicle in the blind area within a preset time period; and generating blind area monitoring information corresponding to the vehicle according to a second motion trajectory of the second object.
12. The system of claim 11, further comprising: a vehicle-mounted terminal located on a vehicle;
the computing device is used for sending the blind area monitoring information to the vehicle-mounted terminal;
the vehicle-mounted terminal is used for receiving the blind area monitoring information corresponding to the vehicle; generating driving conflict information when it is determined, according to driving information of the vehicle and the blind area monitoring information, that the vehicle has a driving conflict in the blind area of the vehicle; and performing driving control according to the driving conflict information.
13. An electronic device, comprising: a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with one another through the communication bus;
the memory is used for storing a computer program;
and the processor is used for implementing the method steps of any one of claims 1 to 8 when executing the computer program.
14. A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the method steps of any one of claims 1 to 8.
CN201911158440.9A 2019-11-22 2019-11-22 Vehicle blind area monitoring and driving control method and device and vehicle road cooperative system Pending CN111105644A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911158440.9A CN111105644A (en) 2019-11-22 2019-11-22 Vehicle blind area monitoring and driving control method and device and vehicle road cooperative system

Publications (1)

Publication Number Publication Date
CN111105644A true CN111105644A (en) 2020-05-05

Family

ID=70420709

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911158440.9A Pending CN111105644A (en) 2019-11-22 2019-11-22 Vehicle blind area monitoring and driving control method and device and vehicle road cooperative system

Country Status (1)

Country Link
CN (1) CN111105644A (en)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104200689A (en) * 2014-08-28 2014-12-10 长城汽车股份有限公司 Road early warning method and device
CN104376735A (en) * 2014-11-21 2015-02-25 中国科学院合肥物质科学研究院 Driving safety early-warning system and method for vehicle at blind zone crossing
CN105774809A (en) * 2014-12-26 2016-07-20 中国移动通信集团公司 Traveling dead zone prompting method and device
CN105206108A (en) * 2015-08-06 2015-12-30 同济大学 Early warning method against vehicle collision based on electronic map
CN108932868A (en) * 2017-05-26 2018-12-04 奥迪股份公司 The danger early warning system and method for vehicle
CN107564334A (en) * 2017-08-04 2018-01-09 武汉理工大学 A kind of parking lot vehicle blind zone danger early warning system and method
CN109521802A (en) * 2017-09-19 2019-03-26 博世(上海)智能科技有限公司 Method, apparatus and equipment for tracing of the movement
CN108010383A (en) * 2017-09-29 2018-05-08 北京车和家信息技术有限公司 Blind zone detection method, device, terminal and vehicle based on driving vehicle
CN110031876A (en) * 2018-01-11 2019-07-19 中南大学 A kind of vehicle mounted guidance tracing point offset antidote based on Kalman filtering
CN108802707A (en) * 2018-08-31 2018-11-13 中国科学院电子学研究所 The improved kalman filter method for target following
CN110379157A (en) * 2019-06-04 2019-10-25 深圳市速腾聚创科技有限公司 Road blind area monitoring method, system, device, equipment and storage medium
CN110430401A (en) * 2019-08-12 2019-11-08 腾讯科技(深圳)有限公司 Vehicle blind zone method for early warning, prior-warning device, MEC platform and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
BINGLIHANSHUANG: "Key Points at First Glance: Repurposing the Kalman Filter (KalmanFilter) in OpenCV3 for Motion Prediction", 《HTTPS://BLOG.CSDN.NET/M0_37857300/ARTICLE/DETAILS/79117062》 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111627252A (en) * 2020-06-10 2020-09-04 上海商汤智能科技有限公司 Vehicle early warning method and device, electronic equipment and storage medium
CN112158197A (en) * 2020-08-21 2021-01-01 恒大新能源汽车投资控股集团有限公司 Vehicle blind area obstacle avoiding method, device and system
CN112158197B (en) * 2020-08-21 2021-08-27 恒大新能源汽车投资控股集团有限公司 Vehicle blind area obstacle avoiding method, device and system
CN111932941A (en) * 2020-08-24 2020-11-13 重庆大学 Intersection vehicle early warning method and system based on vehicle-road cooperation
CN112489450A (en) * 2020-12-21 2021-03-12 北京百度网讯科技有限公司 Traffic intersection vehicle flow control method, road side equipment and cloud control platform
CN114179826A (en) * 2021-12-17 2022-03-15 中汽创智科技有限公司 Start control method, device and equipment for automatic driving vehicle and storage medium

Similar Documents

Publication Publication Date Title
CN113998034B (en) Rider assistance system and method
US10803329B2 (en) Vehicular control system
CN110430401B (en) Vehicle blind area early warning method, early warning device, MEC platform and storage medium
CN111105644A (en) Vehicle blind area monitoring and driving control method and device and vehicle road cooperative system
CN109033951B (en) System and method for detecting occluding objects based on graphics processing
US9983591B2 (en) Autonomous driving at intersections based on perception data
US9910443B1 (en) Drive control apparatus and method for autonomous vehicle
US10223910B2 (en) Method and apparatus for collecting traffic information from big data of outside image of vehicle
US10147002B2 (en) Method and apparatus for determining a road condition
US20170154225A1 (en) Predicting and Responding to Cut In Vehicles and Altruistic Responses
US20100030474A1 (en) Driving support apparatus for vehicle
US11120690B2 (en) Method and device for providing an environmental image of an environment of a mobile apparatus and motor vehicle with such a device
JP6627152B2 (en) Vehicle control device, vehicle control method, and program
JP2021099793A (en) Intelligent traffic control system and control method for the same
CN111145569A (en) Road monitoring and vehicle running control method and device and vehicle-road cooperative system
CN111025297A (en) Vehicle monitoring method and device, electronic equipment and storage medium
JP7321277B2 (en) Driving support device, vehicle control device, driving support system, and driving support method
CN110658809B (en) Method and device for processing travelling of movable equipment and storage medium
CN112721931A (en) Vehicle meeting method, device, equipment and storage medium
CN113043955A (en) Road condition information display device and method and vehicle
CN112498343A (en) Vehicle steering control system and method
JP2019191839A (en) Collision avoidance device
CN113808418A (en) Road condition information display system, method, vehicle, computer device and storage medium
CN114333339B (en) Deep neural network functional module de-duplication method
JP7359099B2 (en) Mobile object interference detection device, mobile object interference detection system, and mobile object interference detection program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Room 221, 2 / F, block C, 18 Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant after: Jingdong Technology Holding Co.,Ltd.

Address before: Room 221, 2 / F, block C, 18 Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant before: Jingdong Digital Technology Holding Co.,Ltd.

Address after: Room 221, 2 / F, block C, 18 Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant after: Jingdong Digital Technology Holding Co.,Ltd.

Address before: Room 221, 2 / F, block C, 18 Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant before: JINGDONG DIGITAL TECHNOLOGY HOLDINGS Co.,Ltd.

CB02 Change of applicant information