CN111429739A - Driving assisting method and system - Google Patents

Driving assisting method and system

Info

Publication number
CN111429739A
Authority
CN
China
Prior art keywords
vehicle
driving
road
information
prompt
Prior art date
Legal status
Pending
Application number
CN201811565096.0A
Other languages
Chinese (zh)
Inventor
童华江
姚浪
Current Assignee
Banma Zhixing Network Hongkong Co Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd
Priority to CN201811565096.0A
Publication of CN111429739A
Legal status: Pending (current)

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/09623 Systems involving the acquisition of information from passive traffic signs by means mounted on the vehicle
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/167 Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0062 Adapting control system settings
    • B60W2050/0075 Automatic parameter input, automatic initialising or calibrating means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00 Input parameters relating to data
    • B60W2556/45 External transmission of data to or from the vehicle

Abstract

The invention discloses a driving assistance method, which comprises the following steps: acquiring road data, wherein the road data comprises static and/or dynamic information of all or some of the objects within a road range, acquired respectively by at least one roadside sensing device; identifying a vehicle object in a set state based on the road data; generating, from the road data, driving prompt information related to the travel of the vehicle object in the set state on the road; and transmitting the generated driving prompt information to the vehicle in the set state, or to a vehicle whose travel is related to that vehicle, so that the vehicle receiving the driving prompt information performs vehicle driving control accordingly. The invention also discloses a corresponding roadside sensing device and a driving assistance system.

Description

Driving assisting method and system
Technical Field
The present invention relates to the field of vehicle driving assistance, and in particular to the field of using road environment data to assist in vehicle driving.
Background
As the automotive industry enters the internet and intelligent era, sensors and computing units in and around the vehicle can provide increasingly rich driving-related data and computing power. These data and capabilities can assist vehicle driving more effectively than before, making driving simpler, more intelligent, and safer.
Safety and convenience are the driver's main concerns when driving a vehicle. In existing vehicle-mounted driving assistance schemes, sensors on the vehicle collect data during driving, such as the distance to the vehicle ahead, the vehicle's own speed, and its real-time position; an on-board computing unit then analyzes the data and provides driving assistance based on the analysis results. This approach is limited, on the one hand, to the sensors installed on the vehicle, i.e. it cannot be implemented on vehicles not equipped with the relevant sensors. On the other hand, vehicle sensors can only perceive data in a small range around the vehicle and cannot provide information about the driving environment farther away, which is an obvious limitation.
Existing road monitoring equipment only measures traffic flow, vehicle spacing, vehicle speed, and the like; it can provide only limited traffic prompt information for vehicle driving and cannot effectively assist the driving of vehicles.
With the development of Internet of Vehicles (V2X) technology, collaborative environment awareness systems have appeared. Such a system can use data from the vehicle and its surrounding environment together to assist driving. However, how to construct the environmental data, and how to fuse the vehicle's own data with the environmental data, are problems faced by collaborative environment awareness systems.
How to provide assistance information for vehicle driving conveniently, accurately, and quickly, without modifying the vehicle itself, is one of the problems that urgently needs to be solved in this field.
Therefore, a new driving assistance scheme for vehicles is needed, one that can provide more accurate and comprehensive driving assistance information so that the driver accurately notices the various conditions occurring on the vehicle and the road, can adjust the driving behavior of the vehicle immediately, and can drive more safely.
Disclosure of Invention
To this end, the present invention provides a new driving assistance solution for a vehicle in an attempt to solve or at least alleviate at least one of the problems presented above.
According to an aspect of the present invention, a driving assist method is provided. The method comprises the following steps: acquiring road data, wherein the road data comprises static and/or dynamic information of all or part of objects in a road range respectively acquired by at least one roadside sensing device; identifying a vehicle object in a set state based on the road data; generating driving prompt information related to the driving of the vehicle object in the set state on the road according to road data; and sending the running prompting information to the vehicle in the set state or the vehicle related to the running of the vehicle in the set state so that the vehicle receiving the running prompting information performs vehicle driving control according to the running prompting information.
According to another aspect of the present invention, there is provided a driving assistance method performed in a vehicle that runs on a road on which a roadside sensing device is disposed, the method including the steps of: receiving driving prompt information, wherein the driving prompt information is generated for the vehicle by the road side sensing equipment according to the road data and is related to the driving of the vehicle on the road; and presenting the driving prompt information so that a driver of the vehicle can control the driving of the vehicle according to the driving prompt information.
According to still another aspect of the present invention, there is provided a roadside sensing device including: a group of sensors, each adapted to obtain static and dynamic information of objects within a coverage area; a storage unit adapted to store road data comprising static and/or dynamic information of objects within said coverage area; and a calculation unit adapted to perform the driving assistance method according to the present invention.
According to still another aspect of the present invention, there is provided a driving assistance system including the roadside sensing device according to the present invention, and a vehicle, running on a road, and performing a driving assistance method according to the present invention.
According to still another aspect of the present invention, there is provided a driving assistance method for a vehicle, including the steps of: acquiring road data, wherein the road data comprises static and/or dynamic information of all or some of the objects within a road range, acquired respectively by at least one roadside sensing device; identifying a vehicle object in a set state based on the road data; generating, from the road data, driving prompt information related to the travel of the vehicle object in the set state on the road; and sending the driving prompt information to a mobile terminal, wherein the mobile terminal is associated with the vehicle in the set state or with a related vehicle.
According to still another aspect of the present invention, there is also provided a computing device. The computing device includes at least one processor and a memory storing program instructions, wherein the program instructions are configured to be executed by the at least one processor and include instructions for performing the above-described driving assistance method.
According to still another aspect of the present invention, there is also provided a readable storage medium storing program instructions that, when read and executed by a computing device, cause the computing device to perform the above-described driving assistance method.
According to the driving assistance scheme of the present invention, the roadside sensing devices sense and aggregate the static and dynamic information on the road within their coverage to form road data, perform driving analysis for each vehicle, and provide the analysis results, i.e. the driving prompt information, to the vehicles. The driver can then conveniently adjust the driving of the vehicle according to the prompts, improving driving safety and efficiency.
Drawings
To the accomplishment of the foregoing and related ends, certain illustrative aspects are described herein in connection with the following description and the annexed drawings, which are indicative of various ways in which the principles disclosed herein may be practiced, and all aspects and equivalents thereof are intended to be within the scope of the claimed subject matter. The above and other objects, features and advantages of the present disclosure will become more apparent from the following detailed description read in conjunction with the accompanying drawings. Throughout this disclosure, like reference numerals generally refer to like parts or elements.
FIG. 1 shows a schematic view of a travel assist system according to an embodiment of the invention;
FIG. 2 shows a schematic diagram of a roadside sensing device according to one embodiment of the invention;
FIG. 3 shows a schematic view of a method of assisting driving according to an embodiment of the invention;
FIG. 4 shows a schematic view of a method of assisting driving according to an embodiment of the invention; and
FIG. 5 shows a schematic view of a driving assistance method according to another embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Fig. 1 shows a schematic view of a driving assistance system 100 according to an embodiment of the invention. As shown in fig. 1, the driving assistance system 100 includes a vehicle 110 and a roadside sensing device 200. The vehicle 110 travels on a road 140. The road 140 includes a plurality of lanes 150. While traveling on the road 140, the vehicle 110 may switch between different lanes 150 according to the road conditions and its driving target.
The roadside sensing device 200 is disposed at the periphery of the road, and collects various information within a predetermined range around the roadside sensing device 200, particularly road data related to the road, using various sensors it has.
The roadside sensing device 200 has a predetermined coverage area. According to the coverage area of each roadside sensing device 200 and the condition of the road, a sufficient number of roadside sensing devices 200 can be deployed on both sides of the road so that the entire road is fully covered. Alternatively, according to an embodiment, instead of fully covering the entire road, the roadside sensing devices 200 may be deployed only at the feature points of each road (corners, intersections, and diversions) to obtain the characteristic data of the road. The present invention is not limited by the specific number of roadside sensing devices 200 or by the coverage of the road.
When the roadside sensing devices 200 are deployed, the positions at which they are to be deployed are calculated according to the coverage area of a single roadside sensing device 200 and the condition of the road 140. The coverage area of a roadside sensing device 200 depends at least on its mounting height and the effective sensing distance of its sensors. The condition of the road 140 includes the road length, the number of lanes 150, the road curvature and grade, and so on. The deployment locations of the sensing devices 200 may be calculated in any manner known in the art.
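As a purely illustrative sketch of such a calculation (not part of the claimed method), the following Python snippet estimates how many devices would be needed to cover a straight road segment; the function names, parameters, and simplified geometry are assumptions introduced here for illustration. In practice, curvature, grade, and occlusions would shrink the usable spacing.

```python
import math

def effective_coverage_radius(sensor_range_m: float, mount_height_m: float) -> float:
    """Ground-projected coverage radius of one roadside sensing device.

    Simplified model: the sensor's effective range is taken as a slant distance,
    so the usable ground radius shrinks slightly with mounting height.
    """
    if sensor_range_m <= mount_height_m:
        return 0.0
    return math.sqrt(sensor_range_m ** 2 - mount_height_m ** 2)

def devices_for_full_coverage(road_length_m: float, sensor_range_m: float,
                              mount_height_m: float, overlap_ratio: float = 0.1) -> int:
    """Number of devices needed to fully cover a straight road with some overlap."""
    radius = effective_coverage_radius(sensor_range_m, mount_height_m)
    spacing = 2 * radius * (1 - overlap_ratio)   # distance between adjacent devices
    return max(1, math.ceil(road_length_m / spacing))

# Example: a 2 km straight road, 200 m effective sensing range, devices mounted at 6 m.
print(devices_for_full_coverage(2000, 200, 6))
```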
After the deployment location is determined, the roadside sensing device 200 is deployed at the determined location. Since the data that the roadside sensing device 200 needs to sense includes motion data of a large number of objects, clock synchronization of the roadside sensing device 200 is performed, that is, the time of each sensing device 200 is kept consistent with the time of the vehicle 110 and the cloud platform.
Subsequently, the position of each deployed roadside sensing device 200 is determined. Since the sensing device 200 is to provide driving assistance for vehicles 110 traveling at high speed on the road 140, its absolute position must be determined with high accuracy. There are a number of ways to calculate the high-accuracy absolute position of the sensing device 200. According to one embodiment, a Global Navigation Satellite System (GNSS) may be used to determine a high-accuracy position.
The roadside sensing device 200 collects and senses, using its sensors, the static conditions (lane lines 120, guardrails, isolation belts, parking spaces, and the like) and dynamic conditions (traveling vehicles 110, pedestrians 130, and sprinklers) of the road within its coverage area, and fuses the sensing data of the different sensors to form the road data of that road section. The road data comprises static and dynamic information of all objects within the coverage area of the sensing device 200, in particular within the road-related area. The roadside sensing device 200 may then calculate driving-related information for each vehicle based on the road data, such as whether the vehicle has a potential collision risk, or traffic conditions outside the vehicle's field of view (such as the road condition after a curve or ahead of a preceding vehicle).
A vehicle 110 entering the coverage area of a roadside sensing device 200 may communicate with that device. A typical communication method is V2X. Of course, the vehicle may also communicate with the roadside sensing devices 200 over the mobile internet provided by a mobile communication service provider, using mobile communication means such as 5G, 4G, and 3G. Considering that the vehicle travels at high speed and that the communication delay should be as short as possible, the V2X communication method is adopted in the typical embodiment of the present invention. However, any communication means that can meet the delay requirements of the present invention is within its scope.
The vehicle 110 may receive driving-related information related to the vehicle 110 from the roadside sensing device 200. For example, the vehicle 110 may acquire, from the roadside sensing device 200, various driving-related objects related to driving of the vehicle 110 on a road in road data, such as lanes on the road, traffic restriction marks above the road, pedestrians around the road or street lamps around the road, signs, other vehicles, and the like. The vehicle 110 may display these travel-related objects on its control interface to present all surrounding road information to the driver in one place.
The vehicle 110 may receive driving-related information related to the vehicle 110 and road data for the segment of road in various ways. In one implementation, vehicles 110 entering the coverage area of roadside sensing devices 200 may receive such information and data automatically. In another implementation, the vehicle 110 may issue a request, and the roadside sensing device 200 sends driving-related information related to the vehicle 110 and road data of the section of road to the vehicle 110 in response to the request, so that the driver controls the driving behavior of the vehicle 110 based on the information.
The present invention is not limited to the particular manner in which the vehicle 110 receives the driving-related information and the road data for the road segment, and all manners in which such information and data may be received and the driving behavior of the vehicle 110 controlled accordingly are within the scope of the present invention.
Optionally, the driving assistance system 100 further comprises a server 160. Although only one server 160 is shown in fig. 1, it should be understood that the server 160 may be a cloud service platform consisting of a plurality of servers. Each roadside sensing device 200 transmits the sensed road data to the server 160. The server 160 may combine the road data based on the location of each roadside sensing device 200 to form road data for the entire road. The server 160 may also further process the road data to form driving-related information for the whole road, such as traffic conditions, accident sections, and expected transit times.
The server 160 may transmit the road data and driving-related information formed for the whole road to each roadside sensing device 200, or it may transmit to a given roadside sensing device 200 only the road data and driving-related information for the road section corresponding to the several roadside sensing devices 200 adjacent to it. In this way, the vehicle 110 may obtain a wider range of driving-related information from the roadside sensing device 200. Of course, the vehicle 110 may also obtain the driving-related information and the road data directly from the server 160 without passing through the roadside sensing device 200.
If roadside sensing devices 200 are deployed on all roads within an area and the roadside sensing devices 200 transmit road data to the server 160, an indication of road traffic within the area may be formed at the server 160. Vehicle 110 may receive the indication from server 160 and control the driving behavior of vehicle 110 accordingly.
The roadside sensing devices 200 may establish communication and perform communication transmission with the surrounding roadside sensing devices 200 directly without the server 160. Since the roadside sensing device 200 has more and more powerful computing power, more and more information can be processed locally at the roadside sensing device 200 by edge calculation in consideration of the bandwidth limitation and time delay requirement for communication between the roadside sensing device 200 and the server 160. The processed information may also be directly transmitted to the surrounding roadside sensing devices 200 without being transmitted via the server 160. This communication is more efficient for information that only needs to be exchanged between adjacent roadside sensing devices 200. For example, the traffic light information within the coverage area of one roadside sensing device 200 is forwarded to the surrounding roadside sensing devices 200.
A roadside sensing device 200 may receive the same information both from the server 160 and from surrounding roadside sensing devices 200. It therefore needs to merge the information based on timestamps and content, remove old duplicate information, and provide the latest information to the vehicles 110 within its coverage.
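A minimal sketch of such timestamp-based merging is given below, assuming each message carries a content identifier and a timestamp; the field names and message structure are assumptions, not part of the disclosed protocol.

```python
from typing import Iterable

def merge_messages(*sources: Iterable[dict]) -> list[dict]:
    """Merge messages received from the server and neighbouring devices,
    keeping only the newest message for each content identifier."""
    latest: dict[str, dict] = {}
    for source in sources:
        for msg in source:
            key = msg["content_id"]                       # e.g. "traffic_light:intersection_12"
            if key not in latest or msg["timestamp"] > latest[key]["timestamp"]:
                latest[key] = msg                         # newer message replaces an older duplicate
    return sorted(latest.values(), key=lambda m: m["timestamp"])

from_server = [{"content_id": "light:12", "timestamp": 100.0, "state": "red"}]
from_neighbour = [{"content_id": "light:12", "timestamp": 103.5, "state": "green"}]
print(merge_messages(from_server, from_neighbour))        # keeps only the newer "green" message
```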
It should be noted that a mobile terminal, such as a smart phone carried by a vehicle driver, may also be connected to the network formed by the server 160 and the roadside sensing device 200. Some information to the vehicle, if appropriate to the mobile terminal (e.g., navigation information, parking space information, driving directions, etc.), may also be sent to the mobile terminal so that the mobile terminal may maneuver the associated vehicle based on the information.
FIG. 2 shows a schematic diagram of a roadside sensing device 200 according to one embodiment of the invention. As shown in fig. 2, the roadside sensing device 200 includes a communication unit 210, a sensor group 220, a storage unit 230, and a calculation unit 240.
The roadside sensing devices 200 communicate with each vehicle 110 entering its coverage area to provide driving-related information to the vehicle 110 and to receive vehicle driving information of the vehicle from the vehicle 110. Meanwhile, the roadside sensing device 200 also needs to communicate with the server 160 and other surrounding roadside sensing devices 200. The communication unit 210 provides a communication function for the roadside sensing device 200. The communication unit 210 may employ various communication methods including, but not limited to, ethernet, V2X, 5G, 4G, and 3G mobile communication, etc., as long as they can complete data communication with as little time delay as possible. In one embodiment, roadside sensing devices 200 may communicate with vehicle 110 entering its coverage area and surrounding roadside sensing devices 200 using V2X, and roadside sensing devices 200 may communicate with server 160 using, for example, a high speed internet.
The sensor group 220 includes various sensors, for example, radar sensors such as a millimeter wave radar 222 and a laser radar 224, and image sensors such as a camera 226 and an infrared probe 228 having a light supplement function. For the same object, various sensors can obtain different properties of the object, for example, radar sensors can make object velocity and acceleration measurements, while image sensors can obtain object shape, relative angle, etc.
The sensor group 220 collects and senses static conditions (lane lines 120, guardrails, isolation belts, roadside parking spaces, etc.) and dynamic conditions (running vehicles 110, pedestrians 130, and sprinklers) of roads in the coverage area using the respective sensors, and stores data collected and sensed by the respective sensors in the storage unit 230.
The computing unit 240 fuses the data sensed by the sensors to form road data for the road section and stores the road data in the storage unit 230. In addition, the calculation unit 240 may further perform data analysis based on the road data, identify one or more vehicles and their motion information, and further determine driving-related information for the vehicle 110. Such data and information may be stored in the storage unit 230 for transmission to the vehicle 110 or the server 160 via the communication unit 210.
Specifically, the calculation unit 240 may acquire static information on a predetermined range of road positions, which is stored in advance. After the roadside sensing device 200 is deployed at a certain position of a road, the range of the road covered by the sensing device 200 is fixed. Static information of the predetermined range, such as road width, number of lanes, turning radius, etc., within the range may be obtained. There are a number of ways to obtain static information of a road. In one embodiment, this static information may be pre-stored in the perceiving device 200 at the time of deployment of the perceiving device 200. In another embodiment, the location information of the perceiving device may be obtained first, and then a request containing the location information may be sent to the server 160, so that the server 160 returns the static information of the relevant road range according to the request.
Subsequently, the calculation unit 240 processes the raw sensor data of the different sensors to form sensing data such as distance measurements, speed measurements, type identification, and size identification. Then, based on the obtained static road data, different sensor data are used as a reference in different cases, with the remaining sensor data added for calibration, so that unified road data are finally formed.
The invention is not limited to the particular manner in which the data of the various sensors is fused to form the roadway data. This approach is within the scope of the present invention as long as the road data contains static and dynamic information for various objects within a predetermined range of the road location.
According to one embodiment, each vehicle 110 entering the coverage area of the roadside sensing device 200 actively communicates with the sensing device 200 through various communication means (e.g., V2X). Accordingly, the vehicle 110 may transmit its vehicle travel information to the sensing device 200. The vehicle travel information includes, for example, the time at which it was generated and the size, speed, acceleration, angular velocity, and position of the vehicle. The calculation unit 240 may then further fuse the vehicle travel information obtained from the vehicle 110 with the previously formed road data to form new road data.
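The following Python sketch illustrates one plausible way to fold a vehicle's self-reported travel information into previously formed road data by nearest-position matching; the matching radius, field names, and override policy are assumptions for illustration, not the disclosed fusion procedure.

```python
import math

def fuse_vehicle_report(road_objects: list[dict], report: dict,
                        match_radius_m: float = 3.0) -> None:
    """Attach a vehicle's self-reported state to the nearest sensed vehicle object.

    road_objects: vehicle objects produced by sensor fusion, each with x/y in metres.
    report: travel information sent by the vehicle over V2X (time, speed, position, ...).
    """
    def dist(obj: dict) -> float:
        return math.hypot(obj["x"] - report["x"], obj["y"] - report["y"])

    candidates = [o for o in road_objects if o.get("type") == "vehicle"]
    if not candidates:
        return
    nearest = min(candidates, key=dist)
    if dist(nearest) <= match_radius_m:
        # Assume self-reported speed/heading is more precise than the radar estimate,
        # so it overrides the sensed values; the sensed position is kept as a cross-check.
        nearest.update(speed=report["speed"],
                       heading=report.get("heading"),
                       reported_at=report["time"])

objects = [{"type": "vehicle", "x": 12.0, "y": 3.4, "speed": 21.0}]
fuse_vehicle_report(objects, {"x": 12.5, "y": 3.1, "speed": 22.3, "time": 1700000000.0})
print(objects)
```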
In addition, the storage unit 230 may store various calculation models, such as a collision detection model, a license plate recognition model, a parking space recognition model, a parking lot entrance/exit device model, and the like. These computational models may be used by the computational unit 240 to implement the corresponding steps in the method 300 described below with reference to fig. 3.
According to one embodiment of the present invention, object information can be established in the road data for each entity on the road, for example static street light objects, sign objects, and lane objects, as well as dynamic vehicle objects, pedestrian objects, temporary fallen-object objects, temporary lane-change objects, and the like. Subsequently, the calculation unit 240 uses the various calculation models stored in the storage unit 230 to calculate the interactions between these objects, thereby obtaining various kinds of driving-related information.
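As an illustration of how such road objects and models might be represented, the following sketch defines a generic road-object record and a deliberately crude stand-in for a collision-detection model; the class, fields, thresholds, and straight-line prediction logic are assumptions and not the patented models.

```python
import math
from dataclasses import dataclass, field

@dataclass
class RoadObject:
    obj_id: str
    kind: str                 # "vehicle", "pedestrian", "lane", "sign", "fallen_object", ...
    x: float                  # position in a local road frame, metres
    y: float
    static: bool = True
    speed: float = 0.0        # m/s, only meaningful for dynamic objects
    heading: float = 0.0      # radians
    attributes: dict = field(default_factory=dict)

def collision_risk(a: RoadObject, b: RoadObject, horizon_s: float = 3.0) -> bool:
    """Crude stand-in for a collision-detection model: do straight-line predictions
    of two dynamic objects come within 2 m of each other inside the horizon?"""
    steps = int(horizon_s * 2) + 1
    for i in range(steps):
        t = 0.5 * i
        ax = a.x + a.speed * math.cos(a.heading) * t
        ay = a.y + a.speed * math.sin(a.heading) * t
        bx = b.x + b.speed * math.cos(b.heading) * t
        by = b.y + b.speed * math.sin(b.heading) * t
        if math.hypot(ax - bx, ay - by) < 2.0:
            return True
    return False

car = RoadObject("v1", "vehicle", 0.0, 0.0, static=False, speed=15.0, heading=0.0)
pedestrian = RoadObject("p1", "pedestrian", 30.0, 1.0, static=False, speed=0.0)
print(collision_risk(car, pedestrian))   # True: the car reaches the pedestrian within 3 s
```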
Fig. 3 shows a schematic representation of a method 300 for driver assistance for a vehicle according to an embodiment of the invention. The driving assistance method 300 is suitably performed in the roadside sensing device 200 shown in fig. 2. As shown in fig. 3, the driving assistance method 300 starts at step S310.
In step S310, road data is acquired. The road data comprises static and/or dynamic information of all or part of the objects within the road range of the at least one roadside sensing device respectively. As described above with reference to fig. 1, the roadside sensing device 200 is generally fixedly disposed near a certain road, and thus has a corresponding road position. In addition, the roadside sensing device 200 has a predetermined coverage area depending on at least the arrangement height of the sensing device 200, the effective distance for sensing by the sensors in the sensing device 200, and the like. Once the roadside sensing device 200 is deployed at a side of a certain road, a predetermined range of the road that can be covered by the sensing device can be determined according to the specific positions, heights and effective sensing distances of the sensing device and the road.
The roadside sensing device 200 collects and/or senses the static conditions (lane lines 120, guardrails, isolation strips, etc.) and dynamic conditions (running vehicles 110, pedestrians 130, and sprinklers) of the road in the coverage area by using the various sensors thereof to obtain and store various sensor data.
As described above, the roadside sensing device 200 includes various sensors, for example, radar sensors such as the millimeter wave radar 222 and the laser radar 224, and image sensors such as the camera 226 and the infrared probe 228 having a light supplement function, and the like. For the same object, various sensors can obtain different properties of the object, for example, a radar sensor can perform object velocity and acceleration measurements, and an image sensor can obtain the shape and relative angle of the object.
In step S310, processing and fusion may be performed based on the obtained various sensor raw data, thereby forming unified road data. In one embodiment, step S310 may further include a substep S312. In step S312, static information on a predetermined range of road positions, which is stored in advance, is acquired. After the roadside sensing device is deployed at a certain position of a road, the range of the road covered by the sensing device is fixed. Static information of the predetermined range, such as road width, number of lanes, lane lines, speed limit signs, turning radius, etc. within the range can be obtained. There are a number of ways to obtain static information of a road. In one embodiment, this static information may be pre-stored in the perceiving device 200 at the time of deployment. In another embodiment, the location information of the perceiving device may be obtained first, and then a request containing the location information may be sent to the server 160, so that the server 160 returns the static information of the relevant road range according to the request.
Subsequently, in step S314, the raw sensor data of the different sensors are processed to form sensing data such as distance and speed measurements, the type and size of dynamic objects, and the size, content, and location of various static objects. Next, in step S316, based on the static road data obtained in step S312, calibration is performed using certain sensor data as a reference and the remaining sensor data as corrections, finally forming unified road data.
Steps S312-S316 describe one way to obtain road data. The invention is not limited to the particular manner in which the data of the various sensors is fused to form the road data. Any approach is within the scope of the present invention as long as the road data contains static and dynamic information for various objects within a predetermined range of the road location.
According to one embodiment, each vehicle 110 entering the coverage area of the roadside sensing device 200 actively communicates with the sensing device 200 through various communication means (e.g., V2X). Therefore, as described in step S318, the vehicle 110 transmits its vehicle travel information to the sensing device 200. The vehicle travel information includes, for example, the time at which it was generated and the size, speed, acceleration, angular velocity, and position of the vehicle. Step S310 further includes a sub-step S319, in which the vehicle travel information obtained in step S318 is fused with the road data formed in step S316 to form new road data.
After each roadside sensing device 200 collects static and/or dynamic information of objects within its coverage area or covered road area, the information collected by at least one roadside sensing device 200 that is adjacent may be combined to form road data for a segment of road. This combination may be performed in the server 160 coupled to the roadside sensing devices 200, or may be performed in any of the roadside sensing devices 200.
Next, in step S320, one or more vehicle objects within the coverage of the sensing device are identified based on the road data obtained in step S310. The identification in step S320 has two aspects. The first aspect is vehicle recognition, i.e. identifying which objects in the road data are vehicle objects. Vehicle objects have distinctive motion characteristics, such as relatively high speed, traveling along a lane in one direction, and generally not colliding with other objects. A conventional classification detection model or a deep-learning-based model may be constructed from these motion characteristics and applied to the road data, thereby identifying the vehicle objects in the road data and determining motion characteristics such as their motion trajectories.
The second aspect of the identification is determining a vehicle identifier. For each recognized vehicle object, its vehicle identifier is further determined. One way to determine the identifier is to determine the vehicle's unique license plate, for example by image recognition. When the license plate cannot be recognized, another way is to generate a unique mark for the vehicle by combining the size, type, position information, traveling speed, and the like of the vehicle object. The vehicle identifier is the unique identifier of the vehicle object within the road section and is used to distinguish it from other vehicle objects. It is used in subsequent data transmission and is passed between the different roadside sensing devices along the road to facilitate overall analysis.
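A minimal sketch of such identifier generation is shown below; preferring a recognized license plate and otherwise hashing a fingerprint of size, type, and first-seen motion state is one plausible reading of the text, and all names here are illustrative.

```python
import hashlib

def vehicle_identifier(obj: dict) -> str:
    """Return a unique identifier for a recognised vehicle object.

    Prefer the recognised license plate; otherwise derive a fingerprint from the
    vehicle's size, type, and first-seen position/speed, as the description suggests.
    """
    if obj.get("license_plate"):
        return "plate:" + obj["license_plate"]
    fingerprint = "|".join(str(obj.get(k)) for k in
                           ("length_m", "width_m", "vehicle_type",
                            "first_seen_x", "first_seen_y", "first_seen_speed"))
    return "anon:" + hashlib.sha1(fingerprint.encode("utf-8")).hexdigest()[:12]

print(vehicle_identifier({"license_plate": "浙A12345"}))
print(vehicle_identifier({"length_m": 4.8, "width_m": 1.9, "vehicle_type": "sedan",
                          "first_seen_x": 103.2, "first_seen_y": 7.5,
                          "first_seen_speed": 22.0}))
```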
Optionally, after the vehicle is identified, vehicle matching is performed, that is, the vehicle object to be analyzed is matched with the vehicle that needs to receive the driving prompt information resulting from the analysis. Vehicle matching can be performed through various matching modes or combinations thereof, such as license plate matching, matching of traveling speed and vehicle type, and fuzzy matching of position information. According to one embodiment, the vehicle 110 may bind its license plate information through V2X or application verification, and the license plate information can then be matched to the vehicle data of the corresponding license plate in the roadside sensing device and the server, thereby implementing license plate matching.
In step S320, in addition to identifying the vehicle objects, the vehicle objects having a set state may be identified. There are various ways to determine a vehicle object having the set state. According to one embodiment, a vehicle object in the set state can be a vehicle object with special driving requirements, for example an ambulance, a police vehicle, or a vehicle in a fleet. Whether a vehicle has the set state may then be determined from the attributes of the vehicle object. According to another embodiment, a vehicle in the set state may be a vehicle in an abnormal driving state. When recognizing vehicle objects from the road data, the running state of each vehicle object may also be recognized, and vehicles whose running state is abnormal compared with other vehicles may be regarded as being in the set state. In addition, some vehicles may directly send information to the roadside sensing device 200 to indicate that they have the set state, so that driving assistance services can be provided for them.
The set state is not limited to the cases indicated above; according to an embodiment of the present invention, the set state may even apply to all vehicles, or to any single vehicle, traveling on the road.
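The following sketch shows one way such a set-state decision could look, mirroring the three triggers just described (special vehicle attribute, self-declared request, and abnormal driving compared with peers); the thresholds and field names are assumptions.

```python
def has_set_state(vehicle: dict, peer_speeds: list[float]) -> bool:
    """Decide whether a vehicle object is in a 'set state' deserving driving assistance."""
    # Trigger 1: special driving requirements, determined from vehicle attributes.
    if vehicle.get("vehicle_type") in {"ambulance", "police", "fleet_member"}:
        return True
    # Trigger 2: the vehicle told the roadside sensing device directly.
    if vehicle.get("requested_assistance"):
        return True
    # Trigger 3: driving state markedly different from surrounding traffic.
    if peer_speeds:
        mean = sum(peer_speeds) / len(peer_speeds)
        if abs(vehicle.get("speed", mean) - mean) > 0.5 * max(mean, 1.0):
            return True
    return False

print(has_set_state({"vehicle_type": "sedan", "speed": 40.0}, [20.0, 22.0, 21.0]))  # True: far above peers
```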
Subsequently, in step S330, for the vehicle object in the set state identified in step S320, driving prompt information related to the travel of that vehicle object on the road is generated from the road data acquired in step S310. As described above with reference to fig. 2, various calculation models are stored in advance in the storage unit of the roadside sensing device 200. These models can be used to calculate the interaction of vehicle objects with other objects in the road data and to generate the driving prompts. Examples of various driving prompt messages are given below with reference to fig. 4. It should be noted that the present invention is not limited to these examples; any driving prompt information that can be generated from the road data and that relates to the travel of the vehicle object on the road is within the scope of the present invention.
After the travel guidance information is generated for the vehicle object in step S330, the generated travel guidance information is transmitted to the vehicle corresponding to the vehicle object in the set state in step S340. According to one embodiment, the vehicle has automatically established communication with the roadside sensing device 200 when entering the coverage area of the device 200, so that driving prompts may be sent to the vehicle 110 via the previously established channel. Subsequently, the travel guidance information may be displayed on the control interface of the vehicle 110 so that the driver of the vehicle 110 pays attention to this guidance information and accordingly changes the travel behavior of the vehicle 110, thereby improving the driving safety of the vehicle 110.
In addition, since the travel of the vehicle in the set state also affects other vehicles, the driving prompt information generated in step S330 may also be transmitted to vehicles related to the travel of the vehicle in the set state, so that those vehicles can likewise perform vehicle driving control based on the driving prompt information.
In addition, it should be noted that a mobile terminal, such as a smart phone carried by a vehicle driver, a vehicle-mounted smart speaker, etc., may also be connected to the network formed by the server 160 and the roadside sensing device 200. In step S340, in addition to transmitting the travel guidance information to the vehicle corresponding to the vehicle object, the travel guidance information may be transmitted to a mobile terminal associated with the vehicle so that the associated vehicle is manipulated by a user of the mobile terminal according to the received travel guidance information.
Fig. 4 shows a schematic diagram of a driving assistance method 400 according to another embodiment of the invention. Fig. 4 further elaborates on the driving assistance method 300 shown in fig. 3; steps that are the same as, or correspond to, steps of the method shown in fig. 3 are indicated by the same step numbers and are not described again.
As shown in fig. 4, step S330 may include a plurality of embodiments for generating the travel guidance information.
According to one embodiment of the invention, the driving prompt information includes a continuous lane change prompt for the vehicle. Step S330 then includes step S410, in which each lane object is extracted from the road data acquired in step S310. Each lane object includes, for example, the position of a lane on the road and its lane lines. Subsequently, in step S412, the number of times the vehicle object crosses a lane object within a predetermined time is determined. This may be achieved, for example, by an image recognition model, and the invention is not limited to a particular image recognition model. When the number of crossings exceeds a predetermined number, a continuous lane change prompt is generated for the vehicle.
In this way, when the vehicle object is found to weave frequently across lanes on the road, the continuous lane change prompt can be generated so that the driver of the vehicle 110 becomes aware of the abnormal driving.
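A simplified sketch of this check is given below, assuming the roadside device keeps a per-vehicle history of (timestamp, lane) samples; the window length, threshold, and data layout are illustrative assumptions only.

```python
def continuous_lane_change_prompt(lane_history: list[tuple[float, str]],
                                  window_s: float = 60.0,
                                  max_changes: int = 3) -> str | None:
    """lane_history: (timestamp, lane_id) samples for one vehicle object, oldest first.
    Returns a prompt when the vehicle crossed lanes more than max_changes times
    inside the sliding window, otherwise None."""
    changes = [t for (t, cur), (_, prev)
               in zip(lane_history[1:], lane_history[:-1]) if cur != prev]
    if not changes:
        return None
    newest = lane_history[-1][0]
    recent = [t for t in changes if newest - t <= window_s]
    if len(recent) > max_changes:
        return f"Continuous lane changes: {len(recent)} changes in the last {window_s:.0f} s"
    return None

history = [(0, "L1"), (10, "L2"), (20, "L1"), (30, "L3"), (40, "L2"), (50, "L1")]
print(continuous_lane_change_prompt(history))
```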
According to a further embodiment of the invention, the driving prompt information includes a vehicle speed warning prompt. In this case, step S330 includes step S420, in which a road speed-limit object is extracted from the road data obtained in step S310. The road speed-limit object can be obtained by image recognition of various static objects within the coverage of the roadside sensing device 200. The maximum or minimum speed limit of the road may be obtained, for example, by recognizing speed-limit signs above the road, and the speed limit of a lane is obtained by recognizing speed-limit markings on the lane. The present invention is not limited to a specific manner of obtaining the speed-limit object; any manner in which the speed-limit object can be constructed by analyzing the road data to obtain speed-limit information is within the scope of the present invention.
Subsequently, in step S422, the traveling speed of the vehicle object is compared with the speed limit object, and when the traveling speed of the vehicle is out of the road speed range defined by the road speed limit object, a vehicle speed warning prompt is generated. It should be noted that the traveling speed of the vehicle object may also be compared with the speed limit of the lane in which the vehicle object is located to determine whether the speed of the vehicle on the lane is beyond the speed limit range, i.e., exceeds the highest speed or is below the lowest speed, etc.
In this way, when the speed of the vehicle object on the road is found to be too high or too low, a vehicle speed warning prompt may be generated so that the driver of the vehicle 110 can be aware of the abnormality in the vehicle running.
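A minimal sketch of this comparison is shown below, assuming the speed-limit objects are represented as (minimum, maximum) pairs in km/h and that lane-level limits take precedence over road-level limits; these representational choices are assumptions.

```python
def speed_warning(vehicle_speed_kmh: float,
                  road_limits: tuple[float, float] | None = None,
                  lane_limits: tuple[float, float] | None = None) -> str | None:
    """Compare the sensed travel speed with road-level and lane-level speed-limit objects."""
    limits = lane_limits or road_limits          # prefer the lane the vehicle is actually in
    if limits is None:
        return None
    low, high = limits
    if vehicle_speed_kmh > high:
        return f"Speed warning: {vehicle_speed_kmh:.0f} km/h exceeds the {high:.0f} km/h limit"
    if vehicle_speed_kmh < low:
        return f"Speed warning: {vehicle_speed_kmh:.0f} km/h is below the {low:.0f} km/h minimum"
    return None

print(speed_warning(128.0, road_limits=(60.0, 120.0)))
```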
According to one embodiment of the invention, the driving prompt information includes a traffic light prompt. In this case, step S330 includes step S430, in which a traffic light object ahead of the vehicle object in its traveling direction is acquired from the road data obtained in step S310. Given the vehicle's speed and the coverage distance of a roadside sensing device 200, the vehicle may travel through the coverage of more than one roadside sensing device 200 within the period of a single traffic light change. Optionally, therefore, in step S430 the traffic light object may be acquired from the road data of one or more roadside sensing devices 200 adjacent to and ahead of the current one in the traveling direction of the vehicle object.
Subsequently, in step S432, a traffic light prompt is generated based on the traffic light object and the traveling speed of the vehicle object. For example, the traffic light prompt may indicate, based on the traveling speed of the vehicle object, that a traffic light will be encountered in a certain number of seconds, thereby prompting the driver to pay attention to the traffic light information.
Alternatively, the traffic light system may be coupled to the roadside sensing devices 200 within its coverage area via a communication protocol such as V2X. Optionally, the traffic light system may also be connected to the server 160, such that the roadside sensing devices 200 may be connected to the traffic light system via the server 160.
In this case, the step S330 further includes a step S434, in which the control information of the traffic light object is acquired from the traffic light system at the step S434. The control information includes, for example, the current state of the traffic light, the time when the traffic light changes to the next light, the changed state, and the like. Thus, in step S432, a traffic light prompt may be generated from the traffic light object obtained in step S430, the traffic light control information obtained in step S434, and the traveling speed of the vehicle object. The traffic light prompt at this time may include the current traffic light state, and prompt information such as whether the vehicle can pass through the traffic light without stopping at the current speed.
In this case, the driver of the vehicle can obtain specific information on the traffic light ahead, together with advice on whether the light can be passed safely at the current speed.
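One possible reading of steps S430-S434 is sketched below: the prompt combines distance, travel speed, and the traffic-light control information (current phase and time to the next change). The message wording, field names, and the simple arrival-time comparison are assumptions.

```python
def traffic_light_prompt(distance_m: float, speed_mps: float,
                         current_phase: str, seconds_to_change: float) -> str:
    """Combine a traffic-light object, its control information, and the vehicle's speed."""
    if speed_mps <= 0:
        return f"Traffic light ahead is {current_phase}"
    eta_s = distance_m / speed_mps               # estimated arrival time at the stop line
    if current_phase == "green":
        verdict = ("you can pass without stopping at the current speed"
                   if eta_s < seconds_to_change
                   else "the light will change before you arrive")
    else:
        verdict = ("the light should be green when you arrive"
                   if eta_s > seconds_to_change
                   else "prepare to stop")
    return f"Traffic light in about {eta_s:.0f} s ({current_phase}): {verdict}"

print(traffic_light_prompt(distance_m=300, speed_mps=15,
                           current_phase="green", seconds_to_change=25))
```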
According to one embodiment of the invention, the driving prompt information includes a vehicle avoidance prompt. In this case, step S330 includes step S440, in step S440, a rear vehicle object within a predetermined range rearward of the vehicle object traveling direction is acquired from the road data.
Alternatively, considering that only vehicles in the same lane need to be avoided in general, step S440 may further include acquiring a rear vehicle object located within a predetermined range behind the driving direction on the same lane as the vehicle object from the road data.
In addition, optionally, since the coverage of the roadside sensing device is limited and the traveling speed of the vehicle is faster, in step S440, the road data of the current roadside sensing device and the adjacent roadside sensing device (behind the current roadside sensing device in the traveling direction of the vehicle object) may be combined and the vehicle object behind the current vehicle object may be acquired from the road data.
Subsequently, in step S442, it is determined whether the rear vehicle objects include a special vehicle. If a special vehicle is present, a vehicle avoidance prompt is generated to prompt the driver to move to another lane and give way to the special vehicle behind. This is particularly useful in situations such as urgently transporting injured persons after a road accident.
According to embodiments of the invention, there are various ways to determine whether a rear vehicle object is a special vehicle. For example, in one embodiment this may be determined by image recognition (special vehicles such as police cars and ambulances have distinctive shapes and markings). In another embodiment, the vehicle may explicitly declare itself to be a special vehicle when communicating with the roadside sensing device. The invention is not limited to a specific way of determining whether a vehicle object is a special vehicle.
By generating a vehicle avoidance prompt when a special vehicle such as a police car or an ambulance appears within a predetermined range behind a vehicle object, the driver can change lanes in time to give way to the special vehicle behind, improving road traffic efficiency.
The above describes generating a vehicle avoidance prompt according to whether there is a special vehicle behind the vehicle object. It should be noted that the vehicle avoidance prompt may also be generated starting from the special vehicle itself: for example, avoidance prompts may be generated for all vehicle objects located within a predetermined distance ahead of the special vehicle in its traveling direction on the same lane. The invention is not limited to particular conditions for generating vehicle avoidance prompts; all methods that generate vehicle avoidance prompts for vehicles meeting given conditions are within the protection scope of the invention.
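A minimal sketch of the rear-looking variant is given below, assuming each vehicle object carries a lane identifier and a longitudinal coordinate along the road; the look-back distance and field names are illustrative assumptions.

```python
def avoidance_prompt(vehicle: dict, road_objects: list[dict],
                     look_back_m: float = 200.0) -> str | None:
    """Generate a vehicle-avoidance prompt when a special vehicle (ambulance, police car, ...)
    is within look_back_m behind the vehicle on the same lane."""
    for obj in road_objects:
        if obj.get("type") != "vehicle" or not obj.get("is_special"):
            continue
        same_lane = obj.get("lane_id") == vehicle.get("lane_id")
        gap = vehicle["s"] - obj["s"]          # longitudinal positions along the road, metres
        if same_lane and 0 < gap <= look_back_m:
            return (f"Special vehicle ({obj.get('special_kind', 'emergency')}) "
                    f"{gap:.0f} m behind - please change lanes to give way")
    return None

objects = [{"type": "vehicle", "is_special": True, "special_kind": "ambulance",
            "lane_id": "L2", "s": 950.0}]
print(avoidance_prompt({"lane_id": "L2", "s": 1070.0}, objects))
```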
According to a further embodiment of the invention, the driving prompt information includes a route avoidance prompt. In this case, step S330 includes step S450, in which a travel path plan associated with the travel of a special vehicle is acquired. For example, when a special vehicle needs to travel from one location on a road to a destination, in order to ensure its rapid transit, a route request may be sent to the roadside sensing device or server 160 to which the special vehicle is communicatively connected, so that a travel path plan is generated for the request. The travel path plan may then be acquired from the server 160 or the roadside sensing device 200. Subsequently, in step S452, it is determined whether the travel path plan overlaps with the road in the road data acquired in step S310. If there is road overlap, a route avoidance prompt is generated in step S454.
Alternatively, the travel path plan may be in units of lanes, i.e., a lane-based travel path. In this case, when there is road overlap, it is further determined whether the vehicle object is on a lane of the travel path plan, and the route avoidance guidance prompt is generated only when the vehicle object is on the lane, thereby providing the travel prompt more accurately.
In addition, the travel path plan may optionally contain information on the time at which the special vehicle passes each road (e.g., estimated time based on the travel speed of the special vehicle). In this case, the route avoidance guidance includes the time information, so that the driver can consider when to perform lane avoidance.
By generating the route avoidance prompt, the driver learns that a special vehicle will pass through the lane in which the vehicle is traveling at some time in the near future, and can therefore change lanes as early as possible, improving traffic efficiency on that lane.
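The following sketch shows one plausible overlap-and-lane check combining steps S450-S454 with the optional lane and timing refinements; the path representation and message text are assumptions for illustration.

```python
def route_avoidance_prompt(vehicle: dict, special_path: list[dict],
                           current_road_id: str) -> str | None:
    """special_path: the special vehicle's planned route, one entry per road segment,
    optionally with the lane and an estimated passing time."""
    for segment in special_path:
        if segment["road_id"] != current_road_id:
            continue                                     # no overlap with this road section
        # If the plan is lane-based, only warn vehicles on that lane.
        if segment.get("lane_id") and segment["lane_id"] != vehicle.get("lane_id"):
            continue
        eta = segment.get("eta_s")
        when = f" in about {eta:.0f} s" if eta is not None else ""
        return f"A special vehicle will pass through your lane{when} - please avoid this lane"
    return None

path = [{"road_id": "R7", "lane_id": "L1", "eta_s": 90.0}]
print(route_avoidance_prompt({"lane_id": "L1"}, path, current_road_id="R7"))
```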
According to one embodiment of the invention, the driving prompt information includes a fleet prompt. In this case, step S330 includes step S460, in which a fleet object within a predetermined range is acquired. The fleet object includes one or more fleet vehicles. A fleet refers to vehicles that travel at approximately the same speed and in the same driving state from the same starting point to the same destination; these vehicles constitute the fleet. Fleets are particularly common for trucks used in transportation, where the following vehicles simply travel along with the head vehicle. The vehicles in a fleet typically use a specific communication protocol so that they can communicate with one another. According to one embodiment of the invention, the fleet informs the roadside sensing device of the fleet information when it is within the device's coverage area, so the roadside sensing device can obtain the fleet object from the road data.
Subsequently, in step S462, it is determined whether the vehicle object is within a predetermined distance of the fleet object obtained in step S460; if so, a fleet prompt is generated to indicate that a fleet is present near the vehicle, so that the driver pays attention and keeps clear while driving.
According to one embodiment, the fleet prompt includes a fleet lane change prompt. To this end, step S330 further includes step S464, in which an indication that the fleet object is about to change lanes is obtained. This may be derived by identifying the driving characteristics of the fleet object, in particular of the head vehicle. Subsequently, in step S466, it is determined whether the vehicle object is in the lane-change target lane of the fleet object and behind the fleet in its traveling direction; if so, a fleet lane change prompt is generated to warn that the fleet ahead is changing into the vehicle's lane and that the driver should give way and keep a safe distance.
According to another embodiment, the fleet prompt includes a fleet avoidance prompt. To this end, step S330 further includes step S468, in which it is determined whether the vehicle object is interposed between the vehicle objects of the fleet on the road; if so, a fleet avoidance prompt is generated to inform the vehicle that it is inside the fleet and should change lanes as soon as possible to leave it.
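The three fleet-related prompts can be sketched together as follows, assuming the fleet object exposes its members' longitudinal positions, its current lane, and (when applicable) its lane-change target; these fields, the proximity threshold, and the message wording are illustrative assumptions.

```python
def fleet_prompts(vehicle: dict, fleet: dict, near_m: float = 150.0) -> list[str]:
    """Generate the fleet proximity, fleet lane-change, and fleet avoidance prompts."""
    prompts = []
    members = fleet["vehicles"]
    head = max(m["s"] for m in members)        # longitudinal coordinates along the road, metres
    tail = min(m["s"] for m in members)

    # 1. Fleet proximity prompt (step S462).
    if tail - near_m <= vehicle["s"] <= head + near_m:
        prompts.append("A fleet is travelling nearby - keep your distance")

    # 2. Fleet lane-change prompt (steps S464-S466): vehicle is on the target lane, behind the fleet head.
    if fleet.get("target_lane_id") == vehicle["lane_id"] and vehicle["s"] < head:
        prompts.append("The fleet ahead is changing into your lane - please give way")

    # 3. Fleet avoidance prompt (step S468): vehicle is interposed between fleet members.
    if vehicle["lane_id"] == fleet["lane_id"] and tail < vehicle["s"] < head:
        prompts.append("You are inside a fleet - please change lanes to leave it")
    return prompts

fleet = {"vehicles": [{"s": 500.0}, {"s": 460.0}, {"s": 420.0}],
         "lane_id": "L3", "target_lane_id": "L2"}
print(fleet_prompts({"s": 470.0, "lane_id": "L3"}, fleet))
```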
Fig. 5 shows a schematic illustration of a driving assistance method 500 according to another embodiment of the invention. The driving assistance method 500 is adapted to be performed in a vehicle 110 traveling on a road along which roadside sensing devices 200 are disposed. The method 500 includes step S510, in which driving prompt information is received from a nearby roadside sensing device 200 via a predetermined communication method. The driving prompt information is generated by the roadside sensing device 200 using the methods 300 and 400 described above with reference to figs. 3 and 4, and is not described again here.
Subsequently, in step S520, the travel guidance information is displayed in the vehicle 110 so that the driver of the vehicle can obtain the guidance information and thus change the manner of travel of the vehicle, thereby improving the travel safety and efficiency of the vehicle 110. In one embodiment, vehicle 110 has a display interface, and the driving information may be displayed in the display interface.
Alternatively, according to an embodiment of the present invention, in order to more reasonably provide the driving guidance information for the driver, the guidance target information associated with the driving guidance information may be further obtained from the roadside sensing device in step S530. For example, when the travel guidance information is a vehicle continuous lane change guidance, the guidance target information includes the vehicle itself and the lane target information. For example, when the travel guidance information is a vehicle avoidance guidance, the guidance target information may include the vehicle itself and a specific vehicle target behind the vehicle.
In this case, in step S520 the driving prompt information is presented together with the prompt object information, which makes it easier for the driver to understand what the prompt refers to.
In addition, optionally, to make the presentation of the driving prompt information even more informative, the traveling object information related to the driving of the vehicle may be further acquired from the road data of the roadside sensing device 200 in step S540. The traveling object information may include, for example, the vehicles, lanes and street lamps around the vehicle.
In this case, in step S520 the driving prompt information is presented together with the traveling object information and the prompt object information.
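One way to think about steps S530, S540 and S520 together is as assembling a single presentation payload; the sketch below is a hypothetical illustration and the field names are not taken from the patent:

    from typing import Dict, List

    def build_presentation_payload(prompt: Dict,
                                   prompt_objects: List[Dict],
                                   travel_objects: List[Dict]) -> Dict:
        # Merge the driving prompt (S520) with the prompt object information (S530)
        # and the surrounding traveling object information (S540) for display.
        return {"prompt": prompt,
                "prompt_objects": prompt_objects,   # what the prompt refers to
                "travel_objects": travel_objects}   # context around the vehicle

    # Hypothetical usage
    payload = build_presentation_payload(
        {"type": "vehicle_avoidance_prompt", "message": "Special vehicle approaching from behind."},
        [{"kind": "vehicle", "id": "ambulance-3"}],
        [{"kind": "lane", "id": "L1"}, {"kind": "street_lamp", "id": "SL-12"}])
    print(payload)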
It should also be noted that the driving assistance method 500 described above with reference to fig. 5 may be performed not only in the vehicle 110 but also on a mobile terminal associated with the vehicle 110. For example, the driver may carry a smartphone, or the vehicle may be equipped with a smart speaker, acting as the mobile terminal. When the vehicle enters the coverage area of the roadside sensing device 200, such a mobile terminal may receive the driving prompt information and execute the driving assistance method 500 described above with reference to fig. 5.
Alternatively, the traveling objects and the prompt objects may be displayed stereoscopically according to their sizes and their relative distances from the vehicle, and the prompt objects and the driving prompt information may be highlighted. This gives the driver a clearer driving prompt, making it easier to adjust the way the vehicle is driven according to the prompt and thereby improving driving safety.
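The size-and-distance scaling mentioned above could, for instance, be approximated with a simple pinhole-style projection; the constants and field names below are illustrative assumptions only:

    from typing import Dict

    def render_size_px(real_size_m: float, distance_m: float,
                       focal: float = 800.0, min_px: int = 8) -> int:
        # Farther objects are drawn smaller: a crude pinhole-style scaling.
        return max(min_px, int(focal * real_size_m / max(distance_m, 1.0)))

    def style_for(obj: Dict, is_prompt_object: bool) -> Dict:
        # Prompt objects are highlighted so the driver can spot what the prompt refers to.
        return {"size_px": render_size_px(obj["size_m"], obj["distance_m"]),
                "highlight": is_prompt_object}

    # Hypothetical usage: a nearby fleet truck (prompt object) and a distant car
    print(style_for({"size_m": 2.5, "distance_m": 20.0}, True))
    print(style_for({"size_m": 1.8, "distance_m": 80.0}, False))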
According to the driving assistance scheme described above, the roadside sensing device senses and aggregates the static and dynamic information of the road within its coverage area to form road data, performs driving analysis for each vehicle, and provides the analysis result, i.e. the driving prompt information, to the vehicle. The driver can then adjust the way the vehicle is driven according to this prompt, which improves the driving safety and efficiency of the vehicle.
It should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding the understanding of one or more of the various inventive aspects. The disclosed method, however, should not be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules or units or components of the devices in the examples disclosed herein may be arranged in a device as described in this embodiment or alternatively may be located in one or more devices different from the devices in this example. The modules in the foregoing examples may be combined into one module or may be further divided into multiple sub-modules.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
Furthermore, some of the described embodiments are described herein as a method or combination of method elements that can be performed by a processor of a computer system or by other means of performing the described functions. A processor having the necessary instructions for carrying out the method or method elements thus forms a means for carrying out the method or method elements. Further, the elements of the apparatus embodiments described herein are examples of the following apparatus: the apparatus is used to implement the functions performed by the elements for the purpose of carrying out the invention.
As used herein, unless otherwise specified, the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object merely indicates that different instances of like objects are being referred to, and is not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as described herein. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The present invention has been disclosed in an illustrative rather than a restrictive sense, and the scope of the present invention is defined by the appended claims.

Claims (28)

1. A driving assistance method of a vehicle, comprising the steps of:
acquiring road data, wherein the road data comprises static and/or dynamic information of all or part of objects in a road range, which is acquired by at least one road side sensing device;
identifying a vehicle object in a set state based on the road data;
generating driving prompt information related to the driving of the vehicle object in the set state on the road according to the road data;
and sending the driving prompt information to the vehicle in the set state or to a vehicle related to the driving of the vehicle in the set state, so that the vehicle receiving the driving prompt information performs vehicle driving control according to the driving prompt information.
2. The driving assistance method according to claim 1, the step of acquiring road data including:
acquiring static information which is stored in advance and relates to each road range;
obtaining static and/or dynamic information of objects within the road range using sensors deployed in roadside sensing devices within the road range;
combining the pre-stored static information and information obtained by the respective sensors to generate the road data.
3. The driving assist method according to claim 2, the step of acquiring road data within a road range including:
receiving vehicle driving information sent by vehicles within the road range via a predetermined communication method; and
combining the pre-stored static information, the information obtained by the respective sensors, and the received vehicle driving information to generate the road data.
4. The driving assist method according to any one of claims 1 to 3, the step of identifying the vehicle object of the set state based on the road data including:
determining a vehicle object belonging to the vehicle based on the motion characteristics of each object; and
recognizing the identity and motion state of each vehicle object to determine the vehicle object belonging to the set state.
5. The driving assistance method according to any one of claims 1 to 4, wherein the driving prompt information includes a vehicle continuous lane change prompt, and the step of generating the driving prompt information includes:
acquiring lane objects from the road data; and
when the vehicle object crosses the lane object more than a preset number of times within a preset time, generating the vehicle continuous lane change prompt.
6. The driving assistance method according to any one of claims 1 to 5, the driving prompt information including a vehicle speed warning prompt, the step of generating the driving prompt information including:
acquiring a road speed limiting object from the road data; and
when the driving speed of the vehicle is outside the road speed range defined by the road speed limiting object, generating the vehicle speed warning prompt.
7. The driving assistance method according to any one of claims 1 to 6, wherein the driving prompt information includes a traffic light prompt, and the step of generating the driving prompt information includes:
acquiring a traffic light object ahead of the vehicle object in the driving direction from the road data; and
generating the traffic light prompt based on the traffic light object and the driving speed of the vehicle object.
8. The driving assist method according to claim 7, further comprising the step of:
obtaining control information for the traffic light object from a traffic light system coupled to the roadside sensing device;
wherein the step of generating the traffic light prompt includes generating the traffic light prompt based on the traffic light object, the control information of the traffic light object, and the driving speed of the vehicle object.
9. The driving assistance method according to claim 7 or 8, the step of acquiring a traffic light object comprising:
acquiring the traffic light object from the road data of an adjacent roadside sensing device.
10. The driving assistance method according to any one of claims 1 to 9, wherein the driving prompt information includes a vehicle avoidance prompt, and the step of generating the driving prompt information includes:
acquiring a rear vehicle object within a predetermined range behind the vehicle object traveling direction from the road data; and
if the rear vehicle object is a special vehicle, generating the vehicle avoidance prompt.
11. The driving assist method according to claim 10, the step of acquiring the rear vehicle object comprising:
acquiring the rear vehicle object from behind the vehicle object in the traveling direction on the same lane as the vehicle object.
12. The driving assist method according to claim 10 or 11, the step of acquiring the rear vehicle object comprising:
acquiring the rear vehicle object from the road data and from the road data of an adjacent roadside sensing device.
13. The driving assist method according to any one of claims 10 to 12, the step of determining that the rear vehicle object is a special vehicle comprising:
determining that the rear vehicle object is a special vehicle using image recognition; and/or
determining that the rear vehicle object is a special vehicle based on an indication sent by the rear vehicle object to the roadside sensing device.
14. The driving assistance method according to any one of claims 1 to 13, wherein the driving prompt information includes a route avoidance prompt, and the step of generating the driving prompt information includes:
obtaining a driving path plan associated with the driving of the special vehicle;
determining whether the driving path plan overlaps a road in the road data; and
if there is an overlap, generating the route avoidance prompt.
15. The driving assistance method according to claim 14, the acquired driving path plan including time information on when the special vehicle passes through the road, and the route avoidance prompt including the time information.
16. The driving assistance method according to claim 14 or 15, the driving path plan including lane information on the road, the step of generating the route avoidance prompt including:
when the lane in which the vehicle object is driving is the same as the lane in the driving path plan, generating the route avoidance prompt.
17. The driving assistance method according to any one of claims 1 to 16, the driving prompt information including a fleet prompt, the step of generating the driving prompt information including:
acquiring a fleet object within the predetermined range, wherein the fleet object comprises at least one or more fleet vehicles; and
generating the fleet prompt when the vehicle object is within a predetermined distance from the fleet object.
18. The driving assistance method according to claim 17, the fleet prompt comprising a fleet lane change prompt, the method further comprising the steps of:
acquiring an indication that the fleet object is to change lanes; and
when the vehicle object is on the lane change target lane of the fleet object and behind the fleet object in the direction of travel, generating the fleet lane change prompt.
19. The driving assistance method according to claim 17 or 18, the fleet prompt comprising a fleet avoidance prompt, the method further comprising the steps of:
generating the fleet avoidance prompt when the vehicle object is between the one or more fleet vehicle objects on a road of travel.
20. A driving assistance method performed in a vehicle that runs on a road on which a roadside sensing device is disposed, the method comprising the steps of:
receiving driving prompt information, wherein the driving prompt information is generated for the vehicle by the roadside sensing device according to road data and is related to the driving of the vehicle on the road; and
presenting the driving prompt information so that a driver of the vehicle can control the driving of the vehicle according to the driving prompt information.
21. The driving assist method according to claim 20, further comprising the step of:
receiving prompt object information in the road data that is associated with the driving prompt information; and
presenting the driving prompt information together with the prompt object information.
22. The driving assist method according to claim 21, further comprising the step of:
receiving traveling object information in the road data that is associated with the traveling of the vehicle; and
presenting the driving prompt information together with the traveling object information and the prompt object information.
23. The driving assistance method according to any one of claims 20 to 22, the step of presenting the driving prompt information including:
presenting in consideration of the relative positions of the vehicle, the traveling object and the prompt object, and of the sizes of the traveling object and the prompt object; and
highlighting the prompt object and the driving prompt information.
24. A roadside sensing device deployed at a road location, comprising:
sensors adapted to obtain static and dynamic information of each object within a predetermined range;
a storage unit adapted to store the road data including static and dynamic information of each object within the predetermined range; and
a computing unit adapted to perform the method of any of claims 1-12.
25. The roadside sensing apparatus of claim 24, the sensors comprising one or more of:
millimeter wave radar, laser radar, camera, infrared probe.
26. A driving assistance system comprising:
a plurality of roadside sensing devices as claimed in claim 24 or 25, deployed at roadside locations; and
a vehicle that travels on the road and that executes the driving assist method according to any one of claims 20 to 23.
27. A computing device, comprising:
at least one processor; and
a memory storing program instructions configured for execution by the at least one processor, the program instructions comprising instructions for performing the method of any of claims 1-23.
28. A driving assistance method of a vehicle, comprising the steps of:
acquiring road data, wherein the road data comprises static and/or dynamic information of all or part of objects in a road range, which is acquired by at least one road side sensing device;
identifying a vehicle object in a set state based on the road data;
generating driving prompt information related to the driving of the vehicle object in the set state on the road according to the road data;
and sending the driving prompt information to a mobile terminal, wherein the mobile terminal is associated with the vehicle in the set state or with a vehicle related to the driving of the vehicle in the set state.
CN201811565096.0A 2018-12-20 2018-12-20 Driving assisting method and system Pending CN111429739A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811565096.0A CN111429739A (en) 2018-12-20 2018-12-20 Driving assisting method and system

Publications (1)

Publication Number Publication Date
CN111429739A true CN111429739A (en) 2020-07-17

Family

ID=71545509

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811565096.0A Pending CN111429739A (en) 2018-12-20 2018-12-20 Driving assisting method and system

Country Status (1)

Country Link
CN (1) CN111429739A (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111540237A (en) * 2020-05-19 2020-08-14 河北德冠隆电子科技有限公司 Method for automatically generating vehicle safety driving guarantee scheme based on multi-data fusion
CN111932882A (en) * 2020-08-13 2020-11-13 广东飞达交通工程有限公司 Real-time early warning system, method and equipment for road accidents based on image recognition
CN112466114A (en) * 2020-11-19 2021-03-09 南京代威科技有限公司 Traffic monitoring system and method based on millimeter wave technology
CN112776821A (en) * 2021-01-28 2021-05-11 宁波均联智行科技股份有限公司 V2X-based method and device for instant messaging between vehicles
CN114170803A (en) * 2021-12-15 2022-03-11 阿波罗智联(北京)科技有限公司 Roadside sensing system and traffic control method
CN114694368A (en) * 2020-12-28 2022-07-01 比亚迪股份有限公司 Vehicle management and control system
WO2022206978A1 (en) * 2021-01-01 2022-10-06 许军 Roadside millimeter-wave radar calibration method based on vehicle-mounted positioning apparatus
US20220324476A1 (en) * 2021-04-12 2022-10-13 International Business Machines Corporation Autonomous self-driving vehicles user profiles
WO2023016464A1 (en) * 2021-08-12 2023-02-16 华为技术有限公司 Interaction method and apparatus for trajectory information
CN115713866A (en) * 2022-10-11 2023-02-24 悉地(苏州)勘察设计顾问有限公司 Road static information active service method based on vehicle running characteristics

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007026881A1 (en) * 2005-09-01 2007-03-08 Pioneer Corporation Driving support system, driving support apparatus, driving support method, driving support program and recording medium
CN104616516A (en) * 2013-11-04 2015-05-13 深圳市赛格导航科技股份有限公司 Driving safety auxiliary control method and driving safety auxiliary control system
CN105761521A (en) * 2015-12-31 2016-07-13 重庆邮电大学 Real-time traffic guidance roadside system and real-time traffic guidance method based on Internet of Vehicles
CN106205169A (en) * 2016-07-20 2016-12-07 天津职业技术师范大学 Based on the major trunk roads crossing inlet road method for controlling driving speed that bus or train route is collaborative
CN107680012A (en) * 2016-08-01 2018-02-09 奥迪股份公司 Vehicle DAS (Driver Assistant System) and method
CN106467112A (en) * 2016-10-11 2017-03-01 斑马信息科技有限公司 Vehicle-mounted DAS (Driver Assistant System)
CN107067718A (en) * 2016-12-29 2017-08-18 盯盯拍(深圳)技术股份有限公司 Traffic accident responsibility appraisal procedure, traffic accident responsibility apparatus for evaluating and traffic accident responsibility assessment system
CN107798861A (en) * 2017-11-30 2018-03-13 湖北汽车工业学院 A kind of vehicle cooperative formula formation running method and system
CN108011947A (en) * 2017-11-30 2018-05-08 湖北汽车工业学院 A kind of vehicle cooperative formula formation driving system
CN108010360A (en) * 2017-12-27 2018-05-08 中电海康集团有限公司 A kind of automatic Pilot context aware systems based on bus or train route collaboration
CN108417087A (en) * 2018-02-27 2018-08-17 浙江吉利汽车研究院有限公司 A kind of vehicle safety traffic system and method
CN108447291A (en) * 2018-04-03 2018-08-24 南京锦和佳鑫信息科技有限公司 A kind of Intelligent road facility system and control method
CN108765982A (en) * 2018-05-04 2018-11-06 东南大学 Signalized crossing speed guiding system and bootstrap technique under bus or train route cooperative surroundings
CN108831190A (en) * 2018-08-02 2018-11-16 钟祥博谦信息科技有限公司 Vehicle collision avoidance method, apparatus and equipment

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111540237A (en) * 2020-05-19 2020-08-14 河北德冠隆电子科技有限公司 Method for automatically generating vehicle safety driving guarantee scheme based on multi-data fusion
CN111540237B (en) * 2020-05-19 2021-09-28 河北德冠隆电子科技有限公司 Method for automatically generating vehicle safety driving guarantee scheme based on multi-data fusion
CN111932882A (en) * 2020-08-13 2020-11-13 广东飞达交通工程有限公司 Real-time early warning system, method and equipment for road accidents based on image recognition
CN111932882B (en) * 2020-08-13 2022-05-06 广东飞达交通工程有限公司 Real-time early warning system, method and equipment for road accidents based on image recognition
CN112466114A (en) * 2020-11-19 2021-03-09 南京代威科技有限公司 Traffic monitoring system and method based on millimeter wave technology
CN114694368A (en) * 2020-12-28 2022-07-01 比亚迪股份有限公司 Vehicle management and control system
WO2022206978A1 (en) * 2021-01-01 2022-10-06 许军 Roadside millimeter-wave radar calibration method based on vehicle-mounted positioning apparatus
CN112776821A (en) * 2021-01-28 2021-05-11 宁波均联智行科技股份有限公司 V2X-based method and device for instant messaging between vehicles
US20220324476A1 (en) * 2021-04-12 2022-10-13 International Business Machines Corporation Autonomous self-driving vehicles user profiles
WO2023016464A1 (en) * 2021-08-12 2023-02-16 华为技术有限公司 Interaction method and apparatus for trajectory information
CN114170803A (en) * 2021-12-15 2022-03-11 阿波罗智联(北京)科技有限公司 Roadside sensing system and traffic control method
EP4198944A1 (en) * 2021-12-15 2023-06-21 Apollo Intelligent Connectivity (Beijing) Technology Co., Ltd. Roadside sensing system and traffic control method
CN115713866A (en) * 2022-10-11 2023-02-24 悉地(苏州)勘察设计顾问有限公司 Road static information active service method based on vehicle running characteristics
CN115713866B (en) * 2022-10-11 2023-08-22 悉地(苏州)勘察设计顾问有限公司 Road static information active service method based on vehicle operation characteristics

Similar Documents

Publication Publication Date Title
CN111429739A (en) Driving assisting method and system
CN111354182A (en) Driving assisting method and system
US10800455B2 (en) Vehicle turn signal detection
CN108122432B (en) Method for determining data of traffic situation
US8620571B2 (en) Driving assistance apparatus, driving assistance method, and driving assistance program
WO2016009600A1 (en) Drive assist device
US10832577B2 (en) Method and system for determining road users with potential for interaction
CN111354214B (en) Auxiliary parking method and system
JP4258485B2 (en) Vehicle overtaking support device
JP5326230B2 (en) Vehicle driving support system, driving support device, vehicle, and vehicle driving support method
CN110942623B (en) Auxiliary traffic accident handling method and system
WO2020057406A1 (en) Driving aid method and system
JP2016218732A (en) Automobile peripheral information display system
CN110562222B (en) Emergency braking control method for curve scene, vehicle-mounted device and storage medium
CN111354222A (en) Driving assisting method and system
KR20190133623A (en) Method for supporting a guidance of at least one motor vehicle, assistance system and motor vehicle
CN109383367B (en) Vehicle exterior notification device
WO2020057407A1 (en) Vehicle navigation assistance method and system
JP2017102739A (en) Vehicle control device
CN110662683A (en) Driving support device and driving support method
US20170103271A1 (en) Driving assistance system and driving assistance method for vehicle
JP2017045130A (en) Driving support device, computer program, and driving support system
JP2008097279A (en) Vehicle exterior information display device
JP7165827B2 (en) Automobile assistance method
KR100534689B1 (en) Method for guide of complexity road of a navigation system on vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20201222

Address after: Room 603, 6 / F, Roche Plaza, 788 Cheung Sha Wan Road, Kowloon, China

Applicant after: Zebra smart travel network (Hong Kong) Ltd.

Address before: The big Cayman capital building, a four - story mailbox 847

Applicant before: Alibaba Group Holding Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20200717