CN111354182A - Driving assisting method and system - Google Patents


Info

Publication number
CN111354182A
CN111354182A
Authority
CN
China
Prior art keywords
vehicle
target object
road
driving
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811564618.5A
Other languages
Chinese (zh)
Inventor
童华江
姚浪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Banma Zhixing Network Hongkong Co Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN201811564618.5A
Publication of CN111354182A
Legal status: Pending

Classifications

    • G08G 1/166 — Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • B60W 50/08 — Interaction between the driver and the control system
    • G08G 1/0116 — Measuring and analyzing of parameters relative to traffic conditions, based on data from roadside infrastructure, e.g. beacons
    • G08G 1/017 — Detecting movement of traffic to be counted or controlled, identifying vehicles
    • G08G 1/04 — Detecting movement of traffic using optical or ultrasonic detectors
    • G08G 1/042 — Detecting movement of traffic using inductive or magnetic detectors
    • G08G 1/052 — Detecting movement of traffic with provision for determining speed or overspeed
    • G08G 1/096783 — Systems involving transmission of highway information, e.g. weather, speed limits, where the origin of the information is a roadside individual element
    • B60W 2050/0062 — Adapting control system settings
    • B60W 2050/0075 — Automatic parameter input, automatic initialising or calibrating means
    • B60W 2556/45 — External transmission of data to or from the vehicle

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a driving assistance method comprising the following steps: acquiring road data, the road data comprising static and/or dynamic information, collected by at least one roadside sensing device, of all or some of the objects within each road range; identifying all or some vehicle objects among the objects based on the road data; for a predetermined vehicle object, determining from the road data a target object that poses a driving risk to the predetermined vehicle object; and transmitting information about the target object to the predetermined vehicle object. The invention also discloses a driving assistance method executed on a vehicle, a corresponding roadside sensing device, and a driving assistance system.

Description

Driving assisting method and system
Technical Field
The present invention relates to the field of vehicle driving assistance, and in particular to the use of road environment data to assist vehicle driving.
Background
As the automotive industry enters the internet and intelligence era, sensors and computing units in and around the vehicle can provide ever more driving-related data and computing power. These data and capabilities can assist vehicle driving more effectively than before, making driving simpler, smarter, and safer.
Safety and convenience are the driver's chief concerns when driving. In existing on-board driver-assistance schemes, sensors on the vehicle collect data during driving, such as the distance to the vehicle ahead, the vehicle's own speed, and its real-time position; an on-board computing unit then analyzes the data and provides assistance based on the result. This approach is limited, on the one hand, by the sensors installed on the vehicle: it cannot be used on vehicles that lack them. On the other hand, on-board sensors can only perceive a small area around the vehicle and cannot provide information about the driving environment farther away, which is an obvious limitation.
Existing road-monitoring equipment only measures traffic flow, headway, vehicle speed, and the like; it can offer a driver little more than traffic-flow prompts and cannot effectively assist vehicle driving.
With the development of V2X (vehicle-to-everything) technology, collaborative environment-awareness systems have appeared. Such a system uses data from the vehicle and its surroundings together to assist driving. However, how to construct the environmental data, and how to fuse the vehicle's own data with it, remain open problems for such systems.
How to provide assistance information for vehicle driving conveniently, accurately, and quickly, without modifying the vehicle itself, is one of the problems urgently awaiting a solution in this field.
Therefore, a new driving assistance scheme is needed, one that provides more accurate and comprehensive assistance information, so that the driver notices potential risks on the road in time and can adjust the vehicle's driving behavior immediately, making driving safer.
Disclosure of Invention
To this end, the present invention provides a new driving assistance solution for a vehicle in an attempt to solve or at least alleviate at least one of the problems presented above.
According to an aspect of the present invention, a driving assistance method is provided. The method comprises the following steps: acquiring road data, the road data comprising static and/or dynamic information, collected by at least one roadside sensing device, of the objects within each road range; identifying all or some vehicle objects among the objects based on the road data; for a predetermined vehicle object among the vehicle objects, determining from the road data a target object that poses a driving risk to the predetermined vehicle object; and sending information about the target object to the vehicle so that the vehicle can control its driving accordingly.
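The overall pipeline just described — acquire road data, identify vehicle objects, determine risk-posing targets, and push them to each vehicle — can be sketched as below. This is an illustrative sketch only: the shape of `road_data`, the `poses_risk_to` rule, and the `send` callback are assumptions for exposition, not the patented procedure.

```python
def poses_risk_to(obj, vehicle, radius=50.0):
    """Toy risk rule (assumption): anything within `radius` metres ahead."""
    gap = obj["position"] - vehicle["position"]
    return 0 < gap < radius

def assist_driving(road_data, send):
    """Identify vehicle objects in the road data, find objects posing a
    risk to each one, and push the result to that vehicle via `send`."""
    vehicles = [o for o in road_data if o["type"] == "vehicle"]
    for vehicle in vehicles:
        targets = [o for o in road_data
                   if o is not vehicle and poses_risk_to(o, vehicle)]
        if targets:
            send(vehicle["id"], targets)
```

A vehicle object here is any record the identification step has labelled `"vehicle"`; objects beyond the risk radius are simply not reported.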
Optionally, in the driving assistance method according to the present invention, the step of determining the target object comprises: determining, based on the travel characteristics of the vehicle object and the characteristics of each object in the road data, an object at risk of colliding with the vehicle object as the target object, and determining a risk degree value for the target object.
Optionally, in the driving assistance method according to the present invention, the step of determining the target object comprises: determining, based on the travel characteristics of the vehicle object and the characteristics of each object in the road data, an object that influences the travel of the vehicle object on the road as the target object, and determining a risk degree value for the target object.
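The patent leaves the collision-risk test itself unspecified. One common way such a determination step could be realized is a time-to-collision (TTC) check, sketched below; the field names, the 5-second threshold, and the linear risk mapping are illustrative assumptions, not claimed details.

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    """Position (m) and speed (m/s) along the road axis, from road data."""
    obj_id: str
    position: float
    speed: float

def find_target_objects(vehicle, objects, ttc_threshold=5.0):
    """Flag objects ahead whose time-to-collision falls below a threshold.

    Returns (object, risk_value) pairs; the risk degree value grows toward
    1.0 as the predicted collision time approaches zero.
    """
    targets = []
    for obj in objects:
        gap = obj.position - vehicle.position
        closing_speed = vehicle.speed - obj.speed
        if gap > 0 and closing_speed > 0:          # object ahead, being caught up
            ttc = gap / closing_speed              # seconds until collision
            if ttc < ttc_threshold:
                risk = 1.0 - ttc / ttc_threshold   # 0..1 risk degree value
                targets.append((obj, round(risk, 2)))
    return targets
```

The second optional embodiment (objects merely influencing travel) would use a broader predicate in place of the TTC test, but the same target-plus-risk-value output shape.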
Optionally, in the driving assistance method according to the present invention, the step of sending information about the target object to the vehicle comprises: sending the information about the target object together with its associated risk degree value.
Optionally, the driving assistance method according to the present invention further comprises the steps of: for the vehicle object, searching the road data for driving-related objects relevant to its travel; and sending the driving-related objects to the vehicle so that the vehicle presents them and the target object according to their relative positions.
According to another aspect of the present invention, there is provided a driving assistance method performed in a vehicle traveling on a road on which roadside sensing devices are deployed, comprising the steps of: receiving information about a target object, the target object being determined for the vehicle by a roadside sensing device from road data and posing a driving risk to the vehicle; and presenting the target object so that the driver can control the vehicle's travel according to the information about it.
Optionally, in the driving assistance method according to the present invention, the step of presenting the target object comprises: presenting the target object based on its characteristics, the size and travel direction of the vehicle, and the relative positions of the target object and the vehicle.
Optionally, the driving assistance method according to the present invention further comprises the steps of: receiving the risk degree value of the target object; and presenting the target object in a presentation mode corresponding to that risk degree value.
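A presentation mode keyed to the risk degree value might be realized as a simple tier lookup, as in the sketch below. The tiers, colours, and alert flags are illustrative assumptions; the patent requires only that the presentation correspond to the risk degree value.

```python
def presentation_style(risk_value):
    """Map a 0..1 risk degree value to a display style for the in-vehicle
    interface. Thresholds and style fields are illustrative assumptions."""
    if risk_value >= 0.7:
        return {"color": "red", "blink": True, "audio_alert": True}
    if risk_value >= 0.3:
        return {"color": "orange", "blink": False, "audio_alert": False}
    return {"color": "yellow", "blink": False, "audio_alert": False}
```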
Optionally, the driving assistance method according to the present invention further comprises the step of receiving a driving-related object, the driving-related object being an object in the road data that is relevant to the travel of the vehicle; and the step of presenting the target object comprises displaying the target object together with the driving-related object.
Optionally, in the driving assistance method according to the present invention, the step of displaying the target object together with the driving-related object comprises: presenting the driving-related object based on its characteristics, the size and travel direction of the vehicle, and the relative positions of the driving-related object and the vehicle.
Optionally, in the driving assistance method according to the present invention, the step of displaying the target object together with the driving-related object comprises: displaying the target object with emphasis; and/or displaying the driving prompt information less prominently.
According to still another aspect of the present invention, there is provided a roadside sensing device comprising: a sensor group, each sensor of which is adapted to obtain static and dynamic information of the objects within its coverage; a storage unit adapted to store road data comprising the static and/or dynamic information of the objects within its coverage; and a computing unit adapted to perform the driving assistance method according to the present invention.
According to still another aspect of the present invention, there is provided a driving assistance system comprising the roadside sensing device according to the present invention and a vehicle that runs on the road and performs the driving assistance method according to the present invention.
According to still another aspect of the present invention, there is provided a vehicle driving assistance method comprising the steps of: acquiring road data, the road data comprising static and/or dynamic information, collected by at least one roadside sensing device, of all or some of the objects within each road range; identifying all or some vehicle objects among the objects based on the road data; determining from the road data a first target object and/or a second target object, the first target object posing a driving risk to one or more of the vehicle objects, and the one or more vehicle objects posing a driving risk to the second target object; and transmitting information about the first target object to the one or more vehicle objects, and/or transmitting information about the one or more vehicle objects to the second target object.
According to still another aspect of the present invention, there is provided a vehicle driving assistance method comprising the steps of: acquiring road data, the road data comprising static and/or dynamic information, collected by at least one roadside sensing device, of all or some of the objects within each road range; identifying all or some vehicle objects among the objects based on the road data; for a predetermined vehicle object among the vehicle objects, determining from the road data a target object that poses a driving risk to the predetermined vehicle object; and transmitting information about the target object to a mobile terminal associated with the predetermined vehicle object.
According to still another aspect of the present invention, there is also provided a computing device. The computing device includes at least one processor and a memory storing program instructions, wherein the program instructions are configured to be executed by the at least one processor and include instructions for performing the driving assistance method described above.
According to still another aspect of the present invention, there is also provided a readable storage medium storing program instructions that, when read and executed by a computing device, cause the computing device to perform the driving assistance method described above.
According to the driving assistance scheme of the present invention, roadside sensing devices sense and aggregate the static and dynamic information on the roads within their coverage to form road data, and a driving-risk analysis is carried out for each vehicle to determine the potential driving risks relevant to it. The analysis result is then provided to the vehicle, so that the driver can conveniently adjust the way the vehicle is driven according to this driving-risk information, improving driving safety.
In addition, according to the driving assistance scheme of the present invention, objects within an enlarged range around the vehicle that may pose risks to its driving can be displayed in the vehicle. This solves the problem that a driver can only see objects within his or her own line of sight and therefore may lack sufficient time to avoid driving risks.
Drawings
To the accomplishment of the foregoing and related ends, certain illustrative aspects are described herein in connection with the following description and the annexed drawings, which are indicative of various ways in which the principles disclosed herein may be practiced, and all aspects and equivalents thereof are intended to be within the scope of the claimed subject matter. The above and other objects, features and advantages of the present disclosure will become more apparent from the following detailed description read in conjunction with the accompanying drawings. Throughout this disclosure, like reference numerals generally refer to like parts or elements.
FIG. 1 shows a schematic view of a travel assist system according to an embodiment of the invention;
FIG. 2 shows a schematic diagram of a roadside sensing device according to one embodiment of the invention;
FIG. 3 shows a schematic view of a method of assisting driving according to an embodiment of the invention;
FIG. 4 shows a schematic view of a method of assisting driving according to an embodiment of the invention; and
FIG. 5 illustrates a schematic diagram of an interface presenting a target object according to another embodiment of the invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Fig. 1 shows a schematic view of a driving assistance system 100 according to an embodiment of the invention. As shown in Fig. 1, the driving assistance system 100 includes a vehicle 110 and a roadside sensing device 200. The vehicle 110 travels on a road 140 that includes a plurality of lanes 150, and may switch between lanes 150 according to road conditions and its driving target.
The roadside sensing device 200 is deployed at the roadside and uses its various sensors to collect information within a predetermined range around it, in particular road data related to the road.
The roadside sensing device 200 has a predetermined coverage. Depending on the coverage of each device 200 and the road conditions, a sufficient number of devices can be deployed on both sides of the road to cover it completely. Alternatively, instead of covering the entire road, roadside sensing devices 200 may be deployed only at road feature points (corners, intersections, and merge or diversion points) to obtain the road's feature data. The present invention is not limited by the number of roadside sensing devices 200 or by how much of the road they cover.
When deploying roadside sensing devices 200, the positions of the devices are calculated from the coverage of a single device 200 and the conditions of the road 140. The coverage of a roadside sensing device 200 depends at least on its mounting height and the effective range of its sensors, while the conditions of the road 140 include its length, the number of lanes 150, its curvature, its grade, and so on. The deployment locations may be calculated in any manner known in the art.
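As a rough illustration of such a deployment calculation, the sketch below spaces devices along a straight road using a simple Pythagorean model of how mounting height reduces effective ground coverage. The geometry and the 10% overlap factor are assumptions for exposition, not a method claimed by the patent.

```python
import math

def deployment_positions(road_length_m, mount_height_m, sensor_range_m):
    """Place device centres along the road axis so that adjacent coverage
    circles overlap. Effective ground radius shrinks with mounting height
    (simple Pythagorean model); all constants are illustrative."""
    ground_radius = math.sqrt(max(sensor_range_m**2 - mount_height_m**2, 0.0))
    spacing = 2 * ground_radius * 0.9          # 10% overlap between neighbours
    count = max(1, math.ceil(road_length_m / spacing))
    return [min(spacing * i + ground_radius, road_length_m) for i in range(count)]
```

Real deployments would also account for curvature, grade, and occlusions, which this straight-road sketch ignores.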
After the deployment locations are determined, the roadside sensing devices 200 are installed there. Because the data they sense includes motion data of many objects, the roadside sensing devices 200 are clock-synchronized, that is, the time of each device 200 is kept consistent with that of the vehicles 110 and the cloud platform.
Subsequently, the position of each deployed roadside sensing device 200 is determined. Since the device 200 is to provide driving assistance to vehicles 110 traveling at high speed on the road 140, its position, as an absolute position, must be highly accurate. There are many ways to obtain such a high-accuracy absolute position; according to one embodiment, a Global Navigation Satellite System (GNSS) may be used.
The roadside sensing device 200 uses its sensors to collect and perceive the static conditions of the road within its coverage (lane lines 120, guardrails, median strips, parking spaces, road gradient and banking, standing water and snow, and the like) and its dynamic conditions (moving vehicles 110, pedestrians 130, and dropped objects), and fuses the data from the different sensors to form the road data for that road section. The road data contains static and dynamic information about every object within the coverage of the device 200, in particular within the road area. The roadside sensing device 200 can then compute driving-related information for each vehicle from the road data, such as whether the vehicle faces a potential collision or other driving risk, and the traffic conditions outside the vehicle's field of view (e.g., the road situation beyond a curve or ahead of a preceding vehicle).
A vehicle 110 entering the coverage of a roadside sensing device 200 can communicate with it. A typical communication method is V2X. A vehicle may also communicate with roadside sensing devices 200 over the mobile internet provided by a mobile carrier, using 5G, 4G, or 3G. Since vehicles travel at high speed and communication latency should be as short as possible, the embodiments of the present invention generally adopt V2X. However, any communication means that meets the latency requirements of the present invention falls within its scope.
The vehicle 110 may receive driving-related information related to the vehicle 110 from the roadside sensing device 200. For example, the vehicle 110 may acquire, from the roadside sensing device 200, various driving-related objects related to driving of the vehicle 110 on a road in road data, such as lanes on the road, traffic restriction marks above the road, pedestrians around the road or street lamps around the road, signs, other vehicles, and the like. The vehicle 110 may display these travel-related objects on its control interface to present all surrounding road information to the driver in one place.
The vehicle 110 may receive driving-related information related to the vehicle 110 and road data for the segment of road in various ways. In one implementation, vehicles 110 entering the coverage area of roadside sensing devices 200 may receive such information and data automatically. In another implementation, the vehicle 110 may issue a request, and the roadside sensing device 200 sends driving-related information related to the vehicle 110 and road data of the section of road to the vehicle 110 in response to the request, so that the driver controls the driving behavior of the vehicle 110 based on the information.
The invention is not limited to the specific way in which the vehicle 110 receives driving-related information and road data for the road segment, and all ways in which such information and data can be received and the driving behavior of the vehicle 110 controlled accordingly by the driver's reference are within the scope of the invention.
Optionally, the driving assistance system 100 further comprises a server 160. Although only one server 160 is shown in Fig. 1, it should be understood that the server 160 may be a cloud service platform consisting of multiple servers. Each roadside sensing device 200 transmits the road data it senses to the server 160. The server 160 can combine the road data according to the location of each roadside sensing device 200 to form road data for the entire road, and can process that data further to produce driving-related information such as traffic conditions, accident sections, and expected transit times for the whole road.
The server 160 may transmit the whole-road data and driving-related information it forms to every roadside sensing device 200, or it may transmit to a given device 200 only the road data and driving-related information for the section covered by that device and its several neighbours. In this way, the vehicle 110 can obtain driving-related information for a larger range from the roadside sensing device 200. Of course, the vehicle 110 may also obtain the driving-related information and road data directly from the server 160, without going through a roadside sensing device 200.
If roadside sensing devices 200 are deployed on all roads within an area and transmit their road data to the server 160, a picture of road traffic within that area can be formed at the server 160. The vehicle 110 can receive this from the server 160 and control its driving behavior accordingly.
Roadside sensing devices 200 may also establish communication with neighbouring devices 200 and exchange data directly, without the server 160. As roadside sensing devices 200 gain ever more computing power, and given the bandwidth limits and latency requirements of communication with the server 160, more and more information can be processed locally at the device 200 by edge computing. The processed information can then be sent directly to neighbouring devices 200 rather than via the server 160. This is more efficient for information that only needs to be exchanged between adjacent devices 200, for example forwarding the traffic-light state within one device's coverage to its neighbours.
A roadside sensing device 200 may receive the same information from both the server 160 and the surrounding devices 200. It therefore merges information based on timestamp and content, discards stale duplicates, and provides only the latest information to the vehicles 110 within its coverage.
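The timestamp-based merge just described might look like this in outline; the message fields `content_id` and `timestamp` are assumed names for "content of the information" and "time stamp", not fields specified by the patent.

```python
def merge_messages(*sources):
    """Merge message streams from the server and neighbouring devices,
    keeping only the newest copy of each logical message."""
    latest = {}
    for source in sources:
        for msg in source:
            key = msg["content_id"]
            if key not in latest or msg["timestamp"] > latest[key]["timestamp"]:
                latest[key] = msg
    return sorted(latest.values(), key=lambda m: m["content_id"])
```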
It should be noted that a mobile terminal, such as a smartphone carried by the driver or a vehicle-mounted smart speaker, may also be connected to the network formed by the server 160 and the roadside sensing devices 200. Some information intended for the vehicle, if suitable for a mobile terminal (e.g., navigation information, parking-space information, driving suggestions, or information about the vehicle's surroundings), may also be sent to the mobile terminal so that it can help operate the associated vehicle based on that information.
FIG. 2 shows a schematic diagram of a roadside sensing device 200 according to one embodiment of the invention. As shown in fig. 2, the roadside sensing device 200 includes a communication unit 210, a sensor group 220, a storage unit 230, and a calculation unit 240.
The roadside sensing device 200 communicates with each vehicle 110 entering its coverage, providing driving-related information to the vehicle 110 and receiving the vehicle's own driving information from it. Meanwhile, the roadside sensing device 200 also needs to communicate with the server 160 and with other nearby roadside sensing devices 200. The communication unit 210 provides these communication functions. It may employ various communication methods, including but not limited to Ethernet, V2X, and 5G/4G/3G mobile communication, so long as data communication is completed with as little delay as possible. In one embodiment, the roadside sensing device 200 communicates with vehicles 110 entering its coverage and with neighbouring devices 200 using V2X, and with the server 160 over, for example, a high-speed internet connection.
The sensor group 220 includes various sensors, for example, radar sensors such as a millimeter wave radar 222 and a laser radar 224, and image sensors such as a camera 226 and an infrared probe 228 having a light supplement function. For the same object, various sensors can obtain different properties of the object, for example, a radar sensor can perform object velocity and acceleration measurements, and an image sensor can obtain the shape and relative angle of the object.
The sensor group 220 uses the respective sensors to collect and sense the static conditions (lane lines 120, guardrails, isolation belts, roadside parking spaces, road gradient and inclination, water and snow on the road, etc.) and dynamic conditions (running vehicles 110, pedestrians 130, and spilled objects) of the roads in the coverage area, and stores the data collected and sensed by the respective sensors in the storage unit 230.
The calculation unit 240 fuses the data sensed by the sensors to form road data for the road segment, and also stores the road data in the storage unit 230. In addition, the calculation unit 240 may further perform data analysis based on the road data, identify one or more vehicles and vehicle motion information therein, and further determine driving-related information for the vehicle 110. Such data and information may be stored in storage unit 230 for transmission to vehicle 110 or server 160 via communication unit 210.
Specifically, the calculation unit 240 may acquire static information on a predetermined range of road positions, which is stored in advance. After the roadside sensing device 200 is deployed at a certain position of a road, the range of the road covered by the sensing device 200 is fixed. Static information of the predetermined range, such as road width, number of lanes, turning radius, etc., within the range may be obtained. There are a number of ways to obtain static information of a road. In one embodiment, this static information may be pre-stored in the perceiving device 200 at the time of deployment of the perceiving device 200. In another embodiment, the location information of the perceiving device may be obtained first, and then a request containing the location information may be sent to the server 160, so that the server 160 returns the static information of the relevant road range according to the request.
Subsequently, the calculating unit 240 processes the raw data of each sensor separately to form sensing data such as distance measurements, speed measurements, type identification, and size identification. Then, based on the obtained static road data, one sensor's data is taken as the reference in each situation and the data of the other sensors is used for calibration, finally forming unified road data.
The invention is not limited to the particular manner in which the data of the various sensors is fused to form the road data. Any approach is within the scope of the present invention as long as the resulting road data contains static and dynamic information for the various objects within the predetermined range of the road location.
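Since the specification deliberately leaves the fusion method open, the following is only an illustrative sketch of one minimal way such complementary sensor data could be merged: radar detections, which carry speed, are associated by proximity with camera detections, which carry shape/type. The `Detection` class, its attribute names, and the distance threshold are all hypothetical, not part of the disclosed method.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """A single-sensor observation of one object (hypothetical schema)."""
    x: float            # position along the road (m)
    y: float            # lateral position (m)
    speed: float = 0.0  # radar-provided speed (m/s), if any
    shape: str = ""     # camera-provided shape/type, if any

def fuse(radar_dets, camera_dets, max_dist=2.0):
    """Associate radar and camera detections of the same object by
    proximity and merge their complementary attributes."""
    fused = []
    for r in radar_dets:
        # nearest camera detection to this radar detection, if any
        match = min(camera_dets,
                    key=lambda c: (c.x - r.x) ** 2 + (c.y - r.y) ** 2,
                    default=None)
        if match and ((match.x - r.x) ** 2 + (match.y - r.y) ** 2) ** 0.5 <= max_dist:
            fused.append(Detection(x=r.x, y=r.y, speed=r.speed, shape=match.shape))
        else:
            fused.append(r)  # radar-only object, no visual attributes
    return fused
```

In a real deployment the association would of course also account for timestamps, sensor calibration, and tracking across frames; this sketch only shows the attribute-merging idea.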
According to one embodiment, each vehicle 110 entering the coverage area of the roadside sensing device 200 actively communicates with the sensing device 200 through various communication means (e.g., V2X). Accordingly, the vehicle 110 may transmit its vehicle travel information to the sensing device 200. The vehicle travel information describes the state of the vehicle during travel and includes, for example, the current time at which the information is generated and the size, speed, acceleration, angular velocity, and position of the vehicle. The calculation unit 240 may then further fuse the vehicle travel information obtained from the vehicle 110 with the previously formed road data to form new road data.
In addition, the storage unit 230 may store various calculation models, such as a collision detection model, a license plate recognition model, a parking space recognition model, a parking lot entrance/exit device model, and the like. These computational models may be used by the computational unit 240 to implement the corresponding steps in the method 300 described below with reference to fig. 3.
According to one embodiment of the present invention, an object can be established in the road data for each item on the road, such as static street light objects, sign objects, and lane objects, as well as dynamic vehicle objects, pedestrian objects, temporarily fallen objects, temporary lane-change objects, and the like. Subsequently, the calculation unit 240 uses the various calculation models stored in the storage unit 230 to calculate the interactions between the objects, thereby obtaining various driving-related information.
Fig. 3 shows a schematic representation of a method 300 for driver assistance for a vehicle according to an embodiment of the invention. The driving assistance method 300 is suitably performed in the roadside sensing device 200 shown in fig. 2. As shown in fig. 3, the driving assistance method 300 starts at step S310.
In step S310, road data is acquired. The road data comprises static and/or dynamic information of all or part of the objects within the road range of the at least one roadside sensing device respectively. As described above with reference to fig. 1, the roadside sensing device 200 is generally fixedly disposed near a certain road, and thus has a corresponding road position. In addition, the roadside sensing device 200 has a predetermined coverage area depending on at least the arrangement height of the sensing device 200, the effective distance for sensing by the sensors in the sensing device 200, and the like. Once the roadside sensing device 200 is deployed at a side of a certain road, a predetermined range of the road that can be covered by the sensing device can be determined according to the specific positions, heights and effective sensing distances of the sensing device and the road.
The roadside sensing device 200 collects and/or senses the static conditions (lane lines 120, guardrails, isolation strips, etc.) and dynamic conditions (running vehicles 110, pedestrians 130, and spilled objects) of the road in the coverage area by using its various sensors to obtain and store various sensor data.
As described above, the roadside sensing device 200 includes various sensors, for example, radar sensors such as the millimeter wave radar 222 and the laser radar 224, and image sensors such as the camera 226 and the infrared probe 228 having a light supplement function, and the like. For the same object, various sensors can obtain different properties of the object, for example, a radar sensor can perform object velocity and acceleration measurements, and an image sensor can obtain the shape and relative angle of the object.
In step S310, processing and fusion may be performed based on the obtained various sensor raw data, thereby forming unified road data. In one embodiment, step S310 may further include a substep S312. In step S312, static information on a predetermined range of road positions, which is stored in advance, is acquired. After the roadside sensing device is deployed at a certain position of a road, the range of the road covered by the sensing device is fixed. Static information of the predetermined range, such as road width, number of lanes, lane lines, speed limit signs, turning radius, etc. within the range can be obtained. There are a number of ways to obtain static information of a road. In one embodiment, this static information may be pre-stored in the perceiving device 200 at the time of deployment. In another embodiment, the location information of the perceiving device may be obtained first, and then a request containing the location information may be sent to the server 160, so that the server 160 returns the static information of the relevant road range according to the request.
Subsequently, in step S314, the raw data of each sensor is processed separately to form sensing data about the distance, speed, type, and size of dynamic objects and the size, content, location, etc. of the various static objects. Next, in step S316, based on the road static data obtained in step S312, one sensor's data is taken as the reference and the data of the other sensors is used for calibration, finally forming unified road data.
Steps S312-S316 describe one way to obtain road data. The invention is not limited to the particular manner in which the data of the various sensors is fused to form the road data. Any approach is within the scope of the present invention as long as the resulting road data contains static and dynamic information for the various objects within the predetermined range of the road location.
According to one embodiment, each vehicle 110 entering the coverage area of the roadside sensing device 200 actively communicates with the sensing device 200 through various communication means (e.g., V2X). Therefore, as described in step S318, the vehicle 110 transmits its vehicle travel information to the sensing device 200. The vehicle travel information describes the state of the vehicle during travel and includes, for example, the current time at which the information is generated and the size, speed, acceleration, angular velocity, and position of the vehicle. The method S310 further includes a step S319, in which the vehicle travel information obtained in step S318 is further fused with the road data formed in step S316 to form new road data.
After each roadside sensing device 200 collects static and/or dynamic information of objects within its coverage area or covered road area, the information collected by at least one roadside sensing device 200 that is adjacent may be combined to form road data for a segment of road. This combination may be performed in the server 160 coupled to the roadside sensing devices 200, or may be performed in any of the roadside sensing devices 200.
Next, in step S320, one or more vehicle objects within the coverage of the sensing device are identified based on the road data obtained at step S310. The identification in step S320 includes two aspects. One aspect is vehicle recognition, i.e., identifying which objects in the road data are vehicle objects. Vehicle objects have distinctive motion characteristics, such as a relatively high speed, traveling in a lane in one direction, and generally not colliding with other objects. A conventional classification detection model or a deep-learning-based model may be constructed from these motion characteristics and applied to the road data, thereby determining the vehicle objects in the road data and motion characteristics such as their motion trajectories.
Another aspect of the identification is identifying a vehicle identification. For the recognized vehicle object, its vehicle identification is further determined. One way to determine the identity of the vehicle is to determine the unique license plate of the vehicle, for example by means of image recognition or the like. When the license plate of the vehicle cannot be identified, another way to determine the vehicle identifier may be to generate a unique mark of the vehicle by combining the size, type, position information, driving speed, and the like of the vehicle object. The vehicle identification is the unique identification of the vehicle object within the road section and is used to distinguish it from other vehicle objects. The vehicle identification is used in subsequent data transmission and is transmitted in different road side sensing devices in the road so as to facilitate overall analysis.
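The two-branch identifier logic above (license plate if recognized, otherwise a surrogate mark built from observable attributes) can be sketched as follows. This is only an illustration under assumed attribute names; the patent does not specify how the surrogate mark is composed, so a simple hash over the attributes stands in here.

```python
import hashlib

def vehicle_identifier(plate, size=None, vtype=None, position=None, speed=None):
    """Return a unique identifier for a vehicle object within a road segment.

    Prefer the recognized license plate; when plate recognition fails,
    derive a stable surrogate ID from the vehicle's observable attributes
    (size, type, position, driving speed) -- attribute names are hypothetical."""
    if plate:
        return "plate:" + plate
    key = "%s|%s|%s|%s" % (size, vtype, position, speed)
    return "anon:" + hashlib.sha1(key.encode("utf-8")).hexdigest()[:12]
```

The surrogate branch is deterministic, so the same observed vehicle yields the same identifier each time, which is what allows the mark to be passed between roadside sensing devices for overall analysis.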
Alternatively, after the vehicle is identified, vehicle matching may be required, that is, matching the vehicle object to be analyzed with the vehicle object that is to receive the target-object-related information produced by the analysis. Vehicle matching can be performed through various matching modes, alone or in combination, such as license plate matching, matching by driving speed and type, and fuzzy matching by position information. According to one embodiment, the vehicle 110 may bind its license plate information through V2X or application verification, and the license plate information may then be matched to the vehicle data of the corresponding license plate in the roadside sensing device and the server, thereby implementing license plate matching.
It should be noted that in step S320, it is not required that all the vehicle objects be recognized, and only a part of the vehicle objects may be recognized as necessary. The invention is not limited to the number of vehicle objects to be identified in step S320.
Subsequently, in step S330, with respect to the vehicle object identified in step S320, a target object that constitutes a travel risk for the vehicle object is found from the road data acquired in step S310. It should be noted that the present invention is not limited to the case where the subsequent processes from step S330 should be performed on all the vehicle objects acquired at step S320. According to one embodiment, the subsequent processing may be performed on one or more vehicle objects selected from the vehicle objects identified in step S320, or on the vehicle according to a request sent by the vehicle, or on all identified vehicle objects. All of these processing methods are within the scope of the present invention.
As described above with reference to fig. 2, in the storage unit of the roadside sensing device 200, various calculation models are stored in advance. These calculation models can be used to calculate the association of the vehicle object with other objects in the road data and to determine a target object that would constitute a driving risk for the driving of the vehicle object.
There are various ways in which the target object for a vehicle object can be found from the road data. According to one embodiment, the driving risk is a collision risk. For this purpose, a collision assessment model may be used to calculate the probability of a collision between the vehicle object and another object, as well as information such as the expected time and location of the collision, based on the characteristics of the vehicle object (position, size, driving direction and/or driving speed, etc.) and the characteristics of the other object (position, size, movement speed and/or direction, etc.). Alternatively, a collision risk value may be determined from the collision probability and used as the risk degree value, and an object whose risk degree exceeds a certain threshold may be determined as a target object.
The collision assessment model may employ various algorithms such as deep learning algorithms and graphical analysis algorithms, or combinations of these algorithms, to make the collision assessment. The invention is not limited to the specific implementation of the collision assessment model, and all calculation methods that can perform collision assessment for the vehicle object and the target object are within the scope of the invention.
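The text leaves the collision assessment model open, so the sketch below shows only one simple instance: a constant-velocity closest-approach check that yields the expected time of closest approach and flags a collision risk when the predicted minimum separation falls below a safety radius. The radius and time horizon values are illustrative assumptions, not parameters from the disclosure.

```python
import math

def collision_risk(p1, v1, p2, v2, radius=2.0, horizon=10.0):
    """Estimate collision risk between two objects under a
    constant-velocity assumption.

    p1, v1: position (x, y) in m and velocity (vx, vy) in m/s of the vehicle.
    p2, v2: the same for the other object.
    Returns (time_to_closest_approach, min_distance, at_risk)."""
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]   # relative position
    vx, vy = v2[0] - v1[0], v2[1] - v1[1]   # relative velocity
    vv = vx * vx + vy * vy
    # time of closest approach, clamped to [0, horizon]
    t = 0.0 if vv == 0 else max(0.0, min(horizon, -(rx * vx + ry * vy) / vv))
    dx, dy = rx + vx * t, ry + vy * t       # separation at that time
    dmin = math.hypot(dx, dy)
    return t, dmin, dmin < radius
```

For example, a vehicle at the origin driving at 10 m/s toward an oncoming object 50 m ahead driving at 10 m/s yields a closest approach of 0 m after 2.5 s, so the object would be flagged as a target object.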
According to another embodiment, the driving risk is any risk that may contribute to an accident while the vehicle is driving on the road. For example, continuous curves, large curves, sand temporarily scattered on the road surface, road grade, accumulated water and snow, temporary road surface damage, etc. may cause accidents during driving. For this purpose, an accident risk assessment model can be used to determine one or more objects that may cause the vehicle object to have an accident. Optionally, the risk degree value of each such object, i.e., the probability that the object's characteristics will cause the vehicle to have an accident, can also be determined, and an object whose risk degree value is higher than a predetermined threshold may be determined as a target object.
The accident risk assessment model may be constructed from historical empirical data, for example by assigning risk degree values to road slope objects based on a large amount of historical data on accident-prone road sections and road slopes. The assessment model may be updated based on newly collected data to provide more accurate assessment results.
The accident risk assessment model can be constructed by various algorithms such as various big data analysis algorithms and deep learning algorithms, the invention is not limited to the specific construction mode of the accident risk assessment model, and all modes which can determine one or more objects in road data which can cause accidents of the vehicle object and the risk degree value of the objects are within the protection scope of the invention.
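Whatever form the assessment model takes, the threshold-based selection of target objects described above reduces to the same generic step, sketched here. `risk_model` is a hypothetical stand-in for either assessment model (it could wrap a deep-learning model or a historical-data lookup), and the threshold value is an assumption.

```python
def select_target_objects(objects, risk_model, threshold=0.5):
    """Score every road object with a risk-assessment model and keep those
    whose risk degree value exceeds the threshold as target objects,
    ordered from highest to lowest risk."""
    targets = []
    for obj in objects:
        risk = risk_model(obj)       # risk degree value in [0, 1]
        if risk > threshold:
            targets.append((obj, risk))
    return sorted(targets, key=lambda t: t[1], reverse=True)
```

Returning the risk degree value alongside each target object matches the optional behavior described later, where the value is transmitted to the vehicle to control how prominently each target is displayed.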
It should be noted that since driving risk is bidirectional, in addition to objects posing a risk to the traveling vehicle, the traveling vehicle also poses a risk to some objects. The collision assessment model and the accident risk assessment model can therefore also determine the target objects that are put at risk by the travel of the vehicle.
After generating the target object for the vehicle object in step S330, the generated target object is transmitted to the vehicle corresponding to the vehicle object in step S340. According to one embodiment, the vehicle has automatically established communication with the roadside sensing device 200 when it enters the coverage area of the device 200, so the target object may be sent to the vehicle 110 via the previously established channel. Subsequently, the target object may be displayed on the control interface of the vehicle 110 so that the driver of the vehicle 110 pays attention to the target object and accordingly changes the driving behavior of the vehicle 110, thereby improving the driving safety of the vehicle 110.
Optionally, in order to communicate the risk prompt more clearly to the driver of the vehicle 110, the target object may be sent to the vehicle 110 together with its risk degree value. In this way, the vehicle may display the target object differently depending on the risk degree value. For example, an object with a higher degree of risk is displayed in an emphasized, blinking manner that stands out from the background, while an object with a lower degree of risk is displayed in a normal manner. The driver can thus distinguish between the target objects and concentrate on those with a higher degree of risk.
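As a minimal sketch of this risk-dependent presentation, a mapping from risk degree value to display treatment might look as follows. The two cut-off values and the style keys are purely illustrative assumptions; the patent only requires that higher-risk targets be emphasized (e.g., blinking) and lower-risk ones drawn normally.

```python
def display_style(risk):
    """Map a risk degree value (assumed to lie in [0, 1]) to a display
    treatment: high-risk targets are highlighted and blink, medium-risk
    targets are highlighted without blinking, low-risk targets are normal."""
    if risk >= 0.7:
        return {"emphasis": "highlight", "blink": True}
    if risk >= 0.3:
        return {"emphasis": "highlight", "blink": False}
    return {"emphasis": "normal", "blink": False}
```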
Optionally, in order to present the driving risk more comprehensively on the vehicle 110, according to one embodiment, the driving assistance method 300 may further include step S350. In step S350, travel-related objects related to the travel of the vehicle object determined in step S320 are searched for in the road data. The travel-related objects include stationary and moving objects within a predetermined range of the vehicle object that may affect its driving, as well as reference objects. For example, the travel-related objects may include vehicle objects traveling around the vehicle, lane objects, street lamps, sign objects, and the like.
Subsequently, in step S360, the travel-related object determined in step S350 is also transmitted to the vehicle 110. Thus, the vehicle 110 has a travel-related object and a target object. According to one embodiment, the objects may be displayed according to the relative positions of the travel-related object and the target object with respect to the vehicle 110, thereby allowing the driver to more intuitively determine the position of the target object. If the target object is highlighted in different ways according to the risk degree value, the driver can more intuitively locate the target object having a higher degree of risk and change the driving manner of the vehicle 110 as early as possible according to the distance of this target object with respect to the vehicle, thereby reducing the risk so as to improve the driving safety of the vehicle.
In addition, it should be noted that a mobile terminal, such as a smart phone carried by a vehicle driver, a vehicle-mounted smart sound box, etc., may also be connected to the network formed by the server 160 and the roadside sensing device 200. In step S340 and step S360, in addition to transmitting the information related to the target object and/or the information related to the travel-related object to the vehicle corresponding to the vehicle object, the information related to the target object and/or the information related to the travel-related object may be transmitted to a mobile terminal associated with the vehicle so as to display the information on the mobile terminal or on the vehicle via the mobile terminal.
In addition, it should also be noted that although the processing of steps S330 to S360 is described for a single vehicle object, the processing may be applied globally, i.e., for each object in the road data, the objects that pose a travel risk to it and the objects to which it poses a travel risk are determined respectively. The associated information may then be transmitted to each vehicle object and each associated object that can receive the reminder information.
For example, if a pedestrian carrying a mobile terminal is likely to be struck by an approaching vehicle object, not only will the vehicle receive a risk indication that it may collide with the pedestrian; the pedestrian's mobile terminal will also receive a prompt about the risk that the vehicle may hit the pedestrian.
Fig. 4 shows a schematic representation of a driving assistance method 400 according to another embodiment of the invention. The driving assistance method 400 is adapted to be executed in a vehicle 110, and the vehicle 110 runs on a road on which the roadside sensing device 200 is disposed.
The method 400 includes step S410. In step S410, a target object is received from the roadside sensing device 200 through a predetermined communication method. The target object is generated by the roadside sensing device 200 using the method 300 described above with reference to fig. 3, and will not be described in detail here.
Subsequently, in step S420, the target object is displayed in the vehicle 110 so that the driver can obtain the relevant information of the target object and adapt the driving manner of the vehicle accordingly, thereby improving the driving safety and efficiency of the vehicle 110. In one embodiment, the vehicle 110 has a display interface, and the information of the target object may be displayed in this interface.
Alternatively, according to an embodiment of the present invention, in order to present the information of the target object to the driver more reasonably, the display in step S420 may be performed according to the characteristics of the target object, the size and traveling direction of the vehicle, and the relative positions of the target object and the vehicle. For example, when the target object is a spilled object on the road, the current position of the vehicle is used as the reference position and the current driving direction as the reference direction, and the spilled object is displayed at the corresponding position on the screen according to its position relative to the vehicle, so that the driver can notice it while still far away and take avoidance measures as early as possible.
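The vehicle-as-reference rendering described above amounts to a coordinate transform from world coordinates into the vehicle's own frame. The sketch below illustrates one way to compute it; the heading convention (angle of the driving direction, in radians, measured counter-clockwise from the world x-axis) is an assumption for illustration only.

```python
import math

def to_vehicle_frame(obj_pos, vehicle_pos, heading):
    """Convert a world-coordinate object position into the vehicle's frame:
    the vehicle sits at the origin, and the returned pair is
    (distance ahead of the vehicle, distance to the vehicle's left)."""
    dx = obj_pos[0] - vehicle_pos[0]
    dy = obj_pos[1] - vehicle_pos[1]
    fwd = (math.cos(heading), math.sin(heading))  # unit driving direction
    left = (-fwd[1], fwd[0])                      # 90 deg counter-clockwise
    ahead = dx * fwd[0] + dy * fwd[1]
    lateral = dx * left[0] + dy * left[1]
    return ahead, lateral
```

Once every object is expressed this way, placing it on the screen is a simple scaling of `(lateral, ahead)` to pixel coordinates, with the driving direction always pointing up.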
In addition, according to an embodiment, when the target object is received in step S410, the risk level value of the target object may also be received. A specific example of how the roadside sensing device 200 generates the risk degree value has been described above with reference to fig. 3, and is not described herein again. Accordingly, in step S420, the target object may also be presented according to the presentation manner corresponding to the risk degree value. For example, a target object with a higher degree of risk is displayed in an emphasized manner that is clearly different from the background and is displayed in a blinking manner, while a target object with a lower degree of risk is displayed in a normal manner. In this way the driver can focus on different target objects and can focus on target objects with a higher degree of risk.
In addition, optionally, the driving assistance method 400 further includes a step S430, and in the step S430, the travel-related object is received. The driving-related object is an object related to the driving of the vehicle 110 in the road data, and the specific manner in which the roadside sensing device 200 acquires the driving-related object from the road data has been described above with reference to the method 300 in fig. 3, and is not described herein again.
Accordingly, when the target object is presented in step S420, the travel-related objects are presented together with it. Displaying the travel-related objects and the target object simultaneously gives the driver of the vehicle 110 a more intuitive sense of where the target object is located. For example, according to one embodiment, each travel-related object is presented based on its characteristics, the size and traveling direction of the vehicle, and the relative positions of the object and the vehicle, i.e., in substantially the same way as the target object. In this way, all objects are displayed according to their positions relative to the vehicle 110, allowing the driver to determine the position of the target object more intuitively.
FIG. 5 illustrates a schematic diagram of an interface 500 for presenting a target object, according to an embodiment of the invention. As shown in fig. 5, the interface 500 is displayed with the current position of the vehicle as the reference position and the traveling direction of the vehicle as the reference direction, and each travel-related object 510 and the target object 520 is displayed on the screen at its position relative to the vehicle. In order to make the driver perceive the target object 520 more intuitively, the travel-related objects 510 are displayed in a weakened state, such as blurred, while the target object 520 is displayed in a highlighted state, such as with solid lines, in red, or blinking. In addition, when there are a plurality of target objects 520, they may be emphasized in different ways according to their risk degree values, so that the driver can more intuitively locate a target object with a higher degree of risk and change the driving manner of the vehicle 110 as early as possible according to that object's distance from the vehicle, thereby reducing the risk and improving driving safety.
As shown in fig. 5, although the target object 520 is not visible in the current field of view of the vehicle 110 (it is blocked by the travel-related object 510), information about the target object 520 can be obtained from the road data sensed by the roadside sensing device 200 and displayed on the screen 500. In addition, by blurring the travel-related object 510 that blocks the target object 520, the driver is made aware of the presence of the target object 520 and can take measures in time to prevent a traffic accident. For example, in the example given in fig. 5, the risk of a traffic accident caused by a pedestrian (target object 520) suddenly emerging from behind a large vehicle (travel-related object 510) can be prevented. This significantly improves driving safety.
It should also be noted that the driving assistance method 400 described above with reference to fig. 4 and 5 may also be performed on a mobile terminal associated with the vehicle 110, in addition to being adapted to be performed in the vehicle 110. For example, a driver of a vehicle may carry a smartphone or a car speaker as a mobile terminal. These mobile terminals may receive the vehicle surrounding object information when the vehicle enters the coverage area of the roadside sensing device 200 to perform the driving assistance method 400 described above with reference to fig. 4, or the mobile terminals may forward the information to the associated vehicle for subsequent processing after receiving the relevant information.
According to the driving assistance scheme of the present invention, the roadside sensing device senses and aggregates static and dynamic information about the road within its coverage to form road data, performs driving analysis for each vehicle, and provides the vehicle with the target objects that may pose a driving risk as the analysis result, so that the driver can conveniently change the driving manner of the vehicle according to this target object information, improving the driving safety and efficiency of the vehicle.
In addition, according to the driving assist scheme of the present invention, a manner is provided in which the target object and the travel-related object are displayed more intuitively on the vehicle, so that it is possible for the driver of the vehicle to obtain a prompt for the target object out of his field of view, so as to drive the vehicle more safely.
It should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules or units or components of the devices in the examples disclosed herein may be arranged in a device as described in this embodiment or alternatively may be located in one or more devices different from the devices in this example. The modules in the foregoing examples may be combined into one module or may be further divided into multiple sub-modules.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
Furthermore, some of the described embodiments are described herein as a method or combination of method elements that can be performed by a processor of a computer system or by other means of performing the described functions. A processor having the necessary instructions for carrying out the method or method elements thus forms a means for carrying out the method or method elements. Further, the elements of the apparatus embodiments described herein are examples of the following apparatus: the apparatus is used to implement the functions performed by the elements for the purpose of carrying out the invention.
As used herein, unless otherwise specified, the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object merely indicates that different instances of like objects are being referred to, and is not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as described herein. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The present invention has been disclosed in an illustrative rather than a restrictive sense, and the scope of the present invention is defined by the appended claims.

Claims (19)

1. A driving assistance method of a vehicle, comprising the steps of:
acquiring road data, wherein the road data comprises static and/or dynamic information, acquired by at least one roadside sensing device, of all or part of the objects within each road range;
identifying all or part of the vehicle objects among the objects based on the road data;
for one or more of the vehicle objects, determining, according to the road data, a target object constituting a driving risk for the one or more vehicle objects; and
sending information related to the target object to the one or more vehicle objects.
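The four steps of claim 1 — acquire road data, identify vehicle objects, determine risk-constituting target objects, and send the related information — can be sketched as a minimal roadside pipeline. This is illustrative only: the `RoadObject` structure, the fixed-radius risk criterion, and all names below are assumptions made for the sketch, not part of the claimed method.

```python
from dataclasses import dataclass

@dataclass
class RoadObject:
    """One entry of the road data (assumed shape for this sketch)."""
    obj_id: str
    kind: str        # e.g. "vehicle", "pedestrian", "obstacle"
    position: tuple  # (x, y) in road coordinates, metres
    speed: float     # m/s; 0.0 for static objects

def identify_vehicles(road_data):
    """Step 2: identify the vehicle objects among all objects."""
    return [o for o in road_data if o.kind == "vehicle"]

def find_risk_targets(vehicle, road_data, radius=30.0):
    """Step 3 (placeholder criterion): treat any other object within
    `radius` metres of the vehicle as constituting a driving risk."""
    vx, vy = vehicle.position
    targets = []
    for obj in road_data:
        if obj.obj_id == vehicle.obj_id:
            continue
        ox, oy = obj.position
        if ((vx - ox) ** 2 + (vy - oy) ** 2) ** 0.5 <= radius:
            targets.append(obj)
    return targets

def assist(road_data):
    """Steps 1-4: map each at-risk vehicle id to the related
    information of its target objects."""
    messages = {}
    for vehicle in identify_vehicles(road_data):
        targets = find_risk_targets(vehicle, road_data)
        if targets:  # only message vehicles that actually face a risk
            messages[vehicle.obj_id] = [
                {"id": t.obj_id, "kind": t.kind, "position": t.position}
                for t in targets
            ]
    return messages
```

In a deployment, `assist` would run on the roadside device's computing unit, with the resulting messages pushed to each vehicle over the roadside-to-vehicle link.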
2. The driving assistance method according to claim 1, wherein the step of acquiring road data comprises:
acquiring pre-stored static information relating to each road range;
obtaining static and/or dynamic information of objects within the road range using sensors deployed in roadside sensing devices within the road range; and
combining the pre-stored static information and the information obtained by the respective sensors to generate the road data.
3. The driving assistance method according to claim 2, wherein the step of acquiring road data within a road range further comprises:
receiving vehicle travel information sent, in a predetermined communication mode, by vehicles within the road range; and
combining the pre-stored static information, the information obtained by the respective sensors, and the received vehicle travel information to generate the road data.
4. The driving assistance method according to any one of claims 1 to 3, wherein the step of identifying all or part of the vehicle objects based on the road data comprises:
determining, based on the motion characteristics of each object, the objects belonging to vehicles; and
identifying the identity of each vehicle object to determine all or part of the vehicle objects.
5. The driving assistance method according to any one of claims 1 to 4, wherein the step of determining a target object constituting a driving risk for a predetermined vehicle object comprises:
determining, based on the driving characteristics of the predetermined vehicle object and the characteristics of each object in the road data, an object having a collision risk with the predetermined vehicle object as the target object, and determining a risk degree value of the target object.
6. The driving assistance method according to any one of claims 1 to 5, wherein the step of determining a target object constituting a driving risk for a predetermined vehicle object comprises:
determining, based on the driving characteristics of the predetermined vehicle object and the characteristics of each object in the road data, an object influencing the travel of the predetermined vehicle object on the road as the target object, and determining a risk degree value of the target object.
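Claims 5 and 6 leave both the collision-risk criterion and the risk degree value unspecified. One common heuristic (an assumption made here for illustration, not taken from the patent) is one-dimensional time-to-collision along the lane, thresholded into coarse risk degrees:

```python
def time_to_collision(ego_pos, ego_speed, obj_pos, obj_speed):
    """1-D time-to-collision along the road axis, in seconds.

    Positions in metres, speeds in m/s; the ego vehicle is assumed to be
    behind the object. Returns None when the gap is not closing, i.e.
    there is no collision risk under this simple model.
    """
    gap = obj_pos - ego_pos
    closing_speed = ego_speed - obj_speed
    if gap <= 0 or closing_speed <= 0:
        return None
    return gap / closing_speed

def risk_degree(ttc, high=3.0, medium=6.0):
    """Map a time-to-collision onto a coarse risk degree value.

    The 0-3 scale and the threshold values are invented for this
    sketch: 3 = imminent, 2 = near-term, 1 = distant, 0 = no risk.
    """
    if ttc is None:
        return 0
    if ttc < high:
        return 3
    if ttc < medium:
        return 2
    return 1
```

A real roadside device would evaluate risk in two dimensions and account for lane geometry, but the thresholding structure would be similar.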
7. The driving assistance method according to any one of claims 1 to 6, wherein the step of sending the information related to the target object to the predetermined vehicle comprises:
sending the information related to the target object and the associated risk degree value to the predetermined vehicle.
8. The driving assistance method according to any one of claims 1 to 7, further comprising the steps of:
for the predetermined vehicle object, searching the road data for a travel-related object related to the travel of the predetermined vehicle object; and
sending information related to the travel-related object to the predetermined vehicle, so that the predetermined vehicle presents the travel-related object and the target object based on their relative positions.
9. A driving assistance method performed in a vehicle that travels on a road on which a roadside sensing device is deployed, the method comprising the steps of:
receiving information related to a target object, wherein the target object is determined for the vehicle by the roadside sensing device according to road data and constitutes a driving risk to the vehicle; and
presenting the target object so that a driver of the vehicle can control the travel of the vehicle according to the target object.
10. The driving assistance method according to claim 9, wherein the step of presenting the target object comprises:
presenting the target object based on the features of the target object, the size and travel direction of the vehicle, and the relative position of the target object and the vehicle.
11. The driving assistance method according to claim 9 or 10, further comprising the steps of:
receiving a risk degree value of the target object; and
presenting the target object in a presentation mode corresponding to the risk degree value.
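Claim 11 ties the presentation of the target object to its risk degree value but does not prescribe concrete styles. A hypothetical in-vehicle mapping (the 0-3 degree scale and every style attribute below are invented for illustration) might look like:

```python
# Hypothetical mapping from risk degree value to a display style;
# the patent does not prescribe any concrete presentation modes.
PRESENTATION_MODES = {
    3: {"color": "red",    "blink": True,  "emphasis": "highlight"},
    2: {"color": "orange", "blink": False, "emphasis": "highlight"},
    1: {"color": "yellow", "blink": False, "emphasis": "normal"},
    0: {"color": "gray",   "blink": False, "emphasis": "de-emphasized"},
}

def presentation_mode(risk_degree_value):
    """Choose how a target object is rendered on the in-vehicle display,
    falling back to the lowest-emphasis style for unknown values."""
    return PRESENTATION_MODES.get(risk_degree_value, PRESENTATION_MODES[0])
```

The same lookup could also drive the emphasized/de-emphasized split of claim 14, with travel-related objects always rendered in the degree-0 style.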
12. The driving assistance method according to any one of claims 9 to 11, further comprising the step of:
receiving information related to a travel-related object, wherein the travel-related object is an object in the road data related to the travel of the vehicle;
wherein the step of presenting the target object comprises displaying the target object together with the travel-related object.
13. The driving assistance method according to claim 12, wherein the step of displaying the target object together with the travel-related object comprises:
presenting the travel-related object based on the features of the travel-related object, the size and travel direction of the vehicle, and the relative position of the travel-related object and the vehicle.
14. The driving assistance method according to claim 12 or 13, wherein the step of displaying the target object together with the travel-related object comprises:
displaying the target object with emphasis; and/or
displaying the driving prompt information in a de-emphasized manner.
15. A roadside sensing device deployed at a road location, comprising:
sensors, each adapted to obtain static and dynamic information of objects within a predetermined range;
a storage unit adapted to store road data including the static and dynamic information of each object within the predetermined range; and
a computing unit adapted to perform the method of any one of claims 1 to 8.
16. A driving assistance system comprising:
a plurality of roadside sensing devices according to claim 15, deployed at roadside locations along a road; and
a vehicle that travels on the road and performs the driving assistance method according to any one of claims 9 to 14.
17. A computing device, comprising:
at least one processor; and
a memory storing program instructions configured to be executed by the at least one processor, the program instructions comprising instructions for performing the method of any one of claims 1 to 14.
18. A driving assistance method of a vehicle, comprising the steps of:
acquiring road data, wherein the road data comprises static and/or dynamic information, collected by at least one roadside sensing device, of all or part of the objects within each road range;
identifying all or part of the vehicle objects among the objects based on the road data;
determining a first target object and/or a second target object according to the road data, wherein the first target object constitutes a driving risk for one or more of the vehicle objects, and the one or more vehicle objects constitute a driving risk for the second target object; and
sending information related to the first target object to the one or more vehicle objects, and/or sending information related to the one or more vehicle objects to the second target object.
19. A driving assistance method of a vehicle, comprising the steps of:
acquiring road data, wherein the road data comprises static and/or dynamic information, collected by at least one roadside sensing device, of all or part of the objects within each road range;
identifying all or part of the vehicle objects among the objects based on the road data;
for one or more of the vehicle objects, determining, according to the road data, a target object constituting a driving risk for the one or more vehicle objects; and
sending information related to the target object to a mobile terminal associated with the one or more vehicle objects.
CN201811564618.5A 2018-12-20 2018-12-20 Driving assisting method and system Pending CN111354182A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811564618.5A CN111354182A (en) 2018-12-20 2018-12-20 Driving assisting method and system

Publications (1)

Publication Number Publication Date
CN111354182A true CN111354182A (en) 2020-06-30

Family

ID=71193603

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811564618.5A Pending CN111354182A (en) 2018-12-20 2018-12-20 Driving assisting method and system

Country Status (1)

Country Link
CN (1) CN111354182A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101782646A (en) * 2009-01-19 2010-07-21 财团法人工业技术研究院 All-round environment sensing system and method
DE102011088805A1 (en) * 2011-12-16 2013-06-20 Bayerische Motoren Werke Aktiengesellschaft Method for developing and/or testing of driver assistance system for motor vehicle, involves determining several scenarios in modeling of prior collision phase by using Monte Carlo simulation based on driving situation
CN106740834A (en) * 2015-11-24 2017-05-31 中国移动通信集团公司 A kind of method and device of auxiliary vehicle meeting
CN108417087A (en) * 2018-02-27 2018-08-17 浙江吉利汽车研究院有限公司 A kind of vehicle safety traffic system and method
CN108694859A (en) * 2017-02-28 2018-10-23 大唐高鸿信息通信研究院(义乌)有限公司 A kind of trackside node high risk vehicle alarm prompt method suitable for vehicle-mounted short distance communication network
CN108765982A (en) * 2018-05-04 2018-11-06 东南大学 Signalized crossing speed guiding system and bootstrap technique under bus or train route cooperative surroundings
CN108831190A (en) * 2018-08-02 2018-11-16 钟祥博谦信息科技有限公司 Vehicle collision avoidance method, apparatus and equipment

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111951573A (en) * 2020-07-21 2020-11-17 华设设计集团股份有限公司 Intelligent public transportation system and method based on vehicle-road cooperation technology
CN111932882B (en) * 2020-08-13 2022-05-06 广东飞达交通工程有限公司 Real-time early warning system, method and equipment for road accidents based on image recognition
CN111932882A (en) * 2020-08-13 2020-11-13 广东飞达交通工程有限公司 Real-time early warning system, method and equipment for road accidents based on image recognition
CN112435475A (en) * 2020-11-23 2021-03-02 北京软通智慧城市科技有限公司 Traffic state detection method, device, equipment and storage medium
CN112634354A (en) * 2020-12-21 2021-04-09 紫清智行科技(北京)有限公司 Road side sensor-based networking automatic driving risk assessment method and device
CN112634354B (en) * 2020-12-21 2021-08-13 紫清智行科技(北京)有限公司 Road side sensor-based networking automatic driving risk assessment method and device
CN112712719B (en) * 2020-12-25 2022-05-03 阿波罗智联(北京)科技有限公司 Vehicle control method, vehicle-road coordination system, road side equipment and automatic driving vehicle
CN112712719A (en) * 2020-12-25 2021-04-27 北京百度网讯科技有限公司 Vehicle control method, vehicle-road coordination system, road side equipment and automatic driving vehicle
CN114694368A (en) * 2020-12-28 2022-07-01 比亚迪股份有限公司 Vehicle management and control system
CN113581199A (en) * 2021-06-30 2021-11-02 银隆新能源股份有限公司 Vehicle control method and device
CN114399906A (en) * 2022-03-25 2022-04-26 四川省公路规划勘察设计研究院有限公司 Vehicle-road cooperative driving assisting system and method
CN114724366A (en) * 2022-03-29 2022-07-08 北京万集科技股份有限公司 Driving assistance method, device, equipment, storage medium and program product
CN114724366B (en) * 2022-03-29 2023-06-20 北京万集科技股份有限公司 Driving assistance method, device, equipment and storage medium
CN114822022A (en) * 2022-04-13 2022-07-29 中国第一汽车股份有限公司 Data processing method and device for cooperative vehicle and road sensing, vehicle and storage medium
CN115273508A (en) * 2022-06-17 2022-11-01 智道网联科技(北京)有限公司 Vehicle travel guidance method, device, electronic device, and storage medium

Similar Documents

Publication Publication Date Title
CN111354182A (en) Driving assisting method and system
US20220032884A1 (en) Systems and methods for causing a vehicle response based on traffic light detection
US20220317700A1 (en) Systems and methods for detecting low-height objects in a roadway
CN113284366B (en) Vehicle blind area early warning method, early warning device, MEC platform and storage medium
CN111429739A (en) Driving assisting method and system
CN111354222A (en) Driving assisting method and system
CN110400478A (en) A kind of road condition notification method and device
CN110942623B (en) Auxiliary traffic accident handling method and system
CN111354214B (en) Auxiliary parking method and system
US20170330463A1 (en) Driving support apparatus and driving support method
US20090240432A1 (en) Vehicle-installation obstacle detection apparatus
CN110936960A (en) Driving assisting method and system
JP2019536184A (en) Road detection using traffic sign information
JP6129268B2 (en) Vehicle driving support system and driving support method
CN110940347A (en) Auxiliary vehicle navigation method and system
CN110763244B (en) Electronic map generation system and method
JP2022056153A (en) Temporary stop detection device, temporary stop detection system, and temporary stop detection program
JP6681044B2 (en) Reverse running detection system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20201218

Address after: Room 603, 6/F, Roche Plaza, 788 Cheung Sha Wan Road, Kowloon, China

Applicant after: Zebra smart travel network (Hong Kong) Ltd.

Address before: Fourth Floor, Capital Building, P.O. Box 847, Grand Cayman

Applicant before: Alibaba Group Holding Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20200630