CN111354222A - Driving assistance method and system - Google Patents


Info

Publication number
CN111354222A
Authority
CN
China
Prior art keywords
vehicle
road
objects
information
road data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811565073.XA
Other languages
Chinese (zh)
Inventor
童华江
姚浪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Banma Zhixing Network Hongkong Co Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN201811565073.XA priority Critical patent/CN111354222A/en
Publication of CN111354222A publication Critical patent/CN111354222A/en
Pending legal-status Critical Current

Classifications

    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/096783 Systems involving transmission of highway information, e.g. weather, speed limits, where the origin of the information is a roadside individual element
    • G08G1/164 Anti-collision systems; centralised systems, e.g. external to vehicles
    • G08G1/165 Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a driving assistance method, comprising the following steps: acquiring road data, the road data comprising static and/or dynamic information, acquired by at least one roadside sensing unit, of all or some of the objects within its road range; identifying all or some vehicle objects based on the road data; for a target vehicle object among the vehicle objects, determining the vehicle-surrounding objects of the target vehicle object; and sending information about the vehicle-surrounding objects to the target vehicle so that the target vehicle presents the positional relationship between the vehicle and those objects. The invention also discloses a driving assistance method executed on the vehicle, a corresponding roadside sensing device, and a driving assistance system.

Description

Driving assistance method and system
Technical Field
The present invention relates to the field of vehicle driving assistance, and in particular to the field of using road environment data to assist in vehicle driving.
Background
As the automotive industry enters the internet and intelligence era, sensors and computing units in and around the vehicle can provide ever more driving-related data and computing power. These data and capabilities can assist vehicle driving more effectively than before, making driving simpler, smarter, and safer.
Safety and convenience are usually the driver's chief concerns when driving a vehicle. In existing vehicle-mounted driving assistance schemes, sensors on the vehicle collect data during driving, such as the distance to the vehicle ahead, the vehicle's own speed, and its real-time position, and an on-board computing unit then analyzes these data and provides driving assistance based on the result. This solution is limited, on the one hand, by the sensors installed on the vehicle: it cannot be implemented on vehicles that are not equipped with them. On the other hand, on-board sensors can only sense a small range around the vehicle and cannot provide information about the driving environment farther away, an obvious limitation.
Existing road monitoring equipment only measures traffic flow, inter-vehicle distance, vehicle speed, and the like. It can offer drivers little more than traffic-flow prompts and cannot effectively assist vehicle driving.
With the development of vehicle-to-everything (V2X) technology, cooperative environment awareness systems have appeared. Such a system can use data about the vehicle and its surroundings together to assist driving. However, how to construct the environment data, and how to fuse the vehicle's own data with it, are problems such systems still face.
How to provide assistance information for vehicle driving conveniently, accurately, and quickly, without modifying the vehicle itself, is one of the problems urgently awaiting a solution in this field.
Therefore, a new driving assistance scheme is needed that can provide the vehicle with more accurate and comprehensive driving assistance information, so that the driver accurately notices the potential risks related to driving on the road and can adjust driving behavior immediately, making vehicle driving safer.
Disclosure of Invention
To this end, the present invention provides a new driving assistance solution for a vehicle in an attempt to solve or at least alleviate at least one of the problems presented above.
According to one aspect of the present invention, a driving assistance method is provided. The method comprises the following steps: acquiring road data, wherein the road data comprises static and/or dynamic information of all or some objects within a road range, acquired by at least one roadside sensing device; identifying all or some vehicle objects based on the road data; for a target vehicle object among the vehicle objects, determining the vehicle-surrounding objects of the target vehicle object; and sending information about the vehicle-surrounding objects to the vehicle so that the vehicle presents the mutual positional relationship between itself and those objects.
Optionally, in the driving assistance method according to the present invention, the step of determining the vehicle-surrounding objects of the target vehicle object comprises: acquiring from the road data, as vehicle-surrounding objects, the objects within a predetermined distance of the target vehicle object that are relevant to its travel.
Optionally, in the driving assistance method according to the present invention, the step of sending the vehicle-surrounding objects to the target vehicle comprises: sending the target vehicle object together with the vehicle-surrounding objects to the vehicle.
According to another aspect of the present invention, there is provided a driving assistance method performed in a vehicle traveling on a road on which roadside sensing devices are disposed, the method comprising the steps of: receiving vehicle-surrounding object information, the vehicle-surrounding objects having been generated for the vehicle by a roadside sensing device from the road data and indicated as associated with the vehicle; and presenting the vehicle-surrounding objects.
Optionally, in the driving assistance method according to the present invention, the step of presenting the vehicle-surrounding objects comprises: presenting them based on their features, the size and traveling direction of the vehicle, and their positions relative to the vehicle.
Optionally, in the driving assistance method according to the present invention, the step of presenting the vehicle-surrounding objects further comprises: presenting them with the vehicle as the reference.
Optionally, in the driving assistance method according to the present invention, the step of presenting the vehicle-surrounding objects with the vehicle as the reference comprises: presenting the vehicle at a predetermined location in the display area; and presenting each surrounding object according to its position relative to the vehicle.
Optionally, in the driving assistance method according to the present invention, the step of presenting the vehicle-surrounding objects with the vehicle as the reference comprises: taking a predetermined position behind the vehicle along its traveling direction as a projection point, and presenting the surrounding objects and the vehicle according to their relative position, direction, and speed.
Optionally, in the driving assistance method according to the present invention, the predetermined position is higher than the vehicle.
Optionally, in the driving assistance method according to the present invention, receiving the vehicle-surrounding objects comprises receiving the vehicle object together with the surrounding objects, and the speed, position, and/or size of the vehicle is determined from the speed, position, and/or size features of the received vehicle object.
Optionally, in the driving assistance method according to the present invention, the step of displaying the vehicle-surrounding objects comprises: highlighting the vehicle object; and de-emphasizing the display of the surrounding objects.
According to still another aspect of the present invention, there is provided a roadside sensing device, comprising: sensors, each adapted to obtain static and dynamic information of the objects within its coverage; a storage unit adapted to store road data comprising static and/or dynamic information of the objects within the device's coverage; and a computing unit adapted to perform the driving assistance method according to the present invention.
According to still another aspect of the present invention, there is provided a driving assistance system comprising the roadside sensing device according to the present invention and a vehicle that runs on a road and performs the driving assistance method according to the present invention.
According to still another aspect of the present invention, there is provided a driving assistance method of a vehicle, comprising the steps of: acquiring road data, wherein the road data comprises static and/or dynamic information, acquired by at least one roadside sensing unit, of all or some objects within its road range; identifying all or some vehicle objects based on the road data; for a target vehicle object among the vehicle objects, determining the vehicle-surrounding objects of the target vehicle object; and sending information about the vehicle-surrounding objects to a mobile terminal associated with the target vehicle so that the mobile terminal or the target vehicle presents the positional relationship between the vehicle and those objects.
According to still another aspect of the present invention, there is also provided a computing device. The computing device includes at least one processor and a memory storing program instructions, wherein the program instructions are configured to be executed by the at least one processor and include instructions for performing the driving assistance method described above.
According to still another aspect of the present invention, there is also provided a readable storage medium storing program instructions that, when read and executed by a computing device, cause the computing device to perform the driving assistance method described above.
According to the driving assistance scheme of the invention, the static and dynamic information on the roads within the coverage of the roadside sensing devices is sensed and aggregated into road data, a vehicle-specific analysis is performed for each vehicle to form the set of objects surrounding it, and that set is provided to the vehicle. The surrounding objects can thus be displayed on the vehicle in all directions, giving the driver an all-around impression of the vehicle's surroundings.
In addition, according to the driving assistance scheme of the present invention, information about the objects around the vehicle can be displayed on the vehicle. A driver is ordinarily limited by his or her field of view and can only see objects within the line of sight, whereas the present scheme can provide a 360-degree, blind-spot-free view of the objects around the vehicle, so the driver understands the situation around the vehicle more comprehensively and driving risk is reduced.
Drawings
To the accomplishment of the foregoing and related ends, certain illustrative aspects are described herein in connection with the following description and the annexed drawings, which are indicative of various ways in which the principles disclosed herein may be practiced, and all aspects and equivalents thereof are intended to be within the scope of the claimed subject matter. The above and other objects, features and advantages of the present disclosure will become more apparent from the following detailed description read in conjunction with the accompanying drawings. Throughout this disclosure, like reference numerals generally refer to like parts or elements.
FIG. 1 shows a schematic view of a driving assistance system according to an embodiment of the invention;
FIG. 2 shows a schematic diagram of a roadside sensing device according to one embodiment of the invention;
FIG. 3 shows a schematic view of a driving assistance method according to one embodiment of the invention;
FIG. 4 shows a schematic view of a driving assistance method according to another embodiment of the invention; and
FIG. 5 illustrates a schematic diagram of an interface presenting vehicle-surrounding objects according to another embodiment of the invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Fig. 1 shows a schematic view of a driving assistance system 100 according to an embodiment of the invention. As shown in fig. 1, the driving assistance system 100 includes a vehicle 110 and a roadside sensing device 200. The vehicle 110 travels on a road 140 that includes a plurality of lanes 150. While driving on the road 140, the vehicle 110 may switch between lanes 150 according to road conditions and its driving target.
The roadside sensing device 200 is disposed at the side of the road and uses its various sensors to collect information within a predetermined range around it, in particular road data related to the road.
The roadside sensing device 200 has a predetermined coverage. Depending on the coverage of each device 200 and the road conditions, a sufficient number of devices can be deployed on both sides of the road to cover it entirely. Alternatively, instead of covering the whole road, devices 200 may be deployed at the feature points of each road (corners, intersections, and diversions) to obtain the road's feature data. The invention is not limited by the specific number of roadside sensing devices 200 or by their coverage of the road.
When deploying the roadside sensing devices 200, their positions are calculated from the coverage of a single device 200 and the conditions of the road 140. The coverage of a device 200 depends at least on its mounting height and the effective sensing distance of its sensors, while the road conditions include road length, number of lanes 150, curvature, and grade. The deployment locations may be calculated in any manner known in the art; one illustrative calculation is sketched below.
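Purely as an illustration, the following is a minimal sketch of such a spacing estimate for a straight, flat road segment; the function name, the overlap margin, and the flat-terrain geometry are assumptions for this sketch rather than the patent's method:

```python
import math

def estimate_device_count(road_length_m: float,
                          mount_height_m: float,
                          sensor_range_m: float,
                          overlap_m: float = 20.0) -> int:
    """Estimate how many roadside sensing devices are needed to fully
    cover a straight road segment.

    Assumes flat terrain: the ground coverage radius of one device is
    the horizontal leg of the right triangle formed by the mounting
    height and the sensor's effective (slant) sensing range.
    """
    if sensor_range_m <= mount_height_m:
        raise ValueError("sensor range must exceed mounting height")
    ground_radius = math.sqrt(sensor_range_m ** 2 - mount_height_m ** 2)
    # keep adjacent coverage zones overlapping by a safety margin
    effective_span = 2 * ground_radius - overlap_m
    return math.ceil(road_length_m / effective_span)

# e.g. a 2 km road section, 6 m poles, 150 m effective sensing range
print(estimate_device_count(2000, 6, 150))  # -> 8
```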
After the deployment locations are determined, the roadside sensing devices 200 are installed there. Since the data a device 200 must sense includes the motion data of a large number of objects, the devices are clock-synchronized, i.e., the time of each device 200 is kept consistent with the time of the vehicles 110 and of the cloud platform.
Subsequently, the position of each deployed roadside sensing device 200 is determined. Since the device 200 is to provide driving assistance for vehicles 110 traveling at high speed on the road 140, its absolute position must be known with high accuracy. There are a number of ways to obtain such a high-accuracy absolute position; according to one embodiment, a Global Navigation Satellite System (GNSS) may be used.
The roadside sensing device 200 uses its sensors to collect and sense the static conditions of the road within its coverage (lane lines 120, guardrails, median strips, parking spaces, road gradient and inclination, accumulated water and snow on the road, and the like) and its dynamic conditions (moving vehicles 110, pedestrians 130, and spilled objects), and fuses the sensing data of the different sensors into road data for the road section. The road data comprises static and dynamic information about all objects within the device's coverage, in particular within the road-related field. The roadside sensing device 200 may then calculate driving-related information for each vehicle based on the road data, for example whether the vehicle faces a potential collision or driving risk, or traffic conditions outside the vehicle's field of view (such as the road after a curve, or the road ahead of a preceding vehicle).
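The patent does not prescribe a concrete representation for the fused road data; the following hypothetical object model is shown only to make the static/dynamic split concrete, and all class and field names are invented for illustration:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class RoadObject:
    object_id: str                      # unique within this road section
    kind: str                           # "vehicle", "pedestrian", "lane_line", ...
    position: Tuple[float, float]       # coordinates in a shared road frame
    size: Optional[Tuple[float, float]] = None   # (length, width) in metres
    speed_mps: float = 0.0              # 0.0 for static objects
    heading_deg: Optional[float] = None # travel direction, dynamic objects only
    timestamp: float = 0.0              # clock-synchronized capture time

@dataclass
class RoadData:
    device_id: str                      # which roadside sensing device produced it
    static_objects: List[RoadObject] = field(default_factory=list)   # lane lines, guardrails, ...
    dynamic_objects: List[RoadObject] = field(default_factory=list)  # vehicles, pedestrians, spills, ...
```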
A vehicle 110 entering the coverage of a roadside sensing device 200 may communicate with it. A typical communication method is V2X. Of course, the vehicle may instead reach the roadside sensing devices 200 over the mobile internet provided by a mobile communication service provider, using 5G, 4G, or 3G. Considering that vehicles travel at high speed and communication latency should be as short as possible, embodiments of the present invention generally adopt V2X; however, any communication means that meets the latency requirements of the present invention falls within its scope.
The vehicle 110 may receive driving-related information concerning itself from the roadside sensing device 200. For example, the vehicle 110 may acquire from the device 200 the objects around it in the road data, such as the lanes on the road, traffic restriction signs above the road, pedestrians near the road, street lights, signs, and other vehicles, including not only objects ahead of the driving vehicle 110 but also objects behind and above it. The vehicle 110 may display these surrounding objects on its control interface, presenting all surrounding road information to the driver in one place.
The vehicle 110 may receive this driving-related information and the road data for the road section in various ways. In one implementation, a vehicle 110 entering the coverage of a roadside sensing device 200 receives them automatically. In another, the vehicle 110 issues a request, in response to which the device 200 sends it the surrounding-object information and the road data, so that an all-around presentation can be made on the vehicle 110 based on this information, giving the driver an all-around impression of the vehicle's surroundings.
The invention is not limited to the specific way in which the vehicle 110 receives the driving-related information and road data; all ways in which such information can be received, and the vehicle's driving behavior controlled accordingly with the driver's reference to it, are within the scope of the invention.
Optionally, the driving assistance system 100 further comprises a server 160. Although only one server 160 is shown in fig. 1, it should be understood that the server 160 may be a cloud service platform consisting of multiple servers. Each roadside sensing device 200 sends its sensed road data to the server 160, which may combine the data based on the devices' locations to form road data for the entire road. The server 160 may also process this further into driving-related information such as traffic conditions, accident sections, and expected transit times for the whole road.
The server 160 may send the road data and driving-related information for the entire road to every roadside sensing device 200, or send each device 200 only the data and information for the road section corresponding to its several neighboring devices. In this way, a vehicle 110 can obtain information about a wider range of road and surroundings from the roadside sensing device 200. Of course, the vehicle 110 may also obtain the surroundings information and road data directly from the server 160 without passing through a roadside sensing device 200.
If roadside sensing devices 200 are deployed on all roads within an area and send their road data to the server 160, an overview of road traffic within the area can be formed at the server 160. A vehicle 110 may receive this overview from the server 160 and control its driving behavior accordingly.
The roadside sensing devices 200 may also establish communication and exchange data with neighboring devices directly, without the server 160. As the devices gain ever more computing power, and in view of the bandwidth limits and latency requirements of communication with the server 160, more and more information can be processed locally at the device 200 through edge computing. The processed information may likewise be sent directly to neighboring devices 200 rather than via the server 160. This is more efficient for information that only needs to be exchanged between adjacent devices, for example forwarding the traffic-light information within one device's coverage to its neighbors.
A roadside sensing device 200 may receive the same information both from the server 160 and from neighboring devices 200. It therefore needs to merge information based on timestamp and content, discard stale duplicates, and provide the latest information to the vehicles 110 within its coverage.
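A minimal sketch of the timestamp-based merging just described; the dictionary keying and field names are assumptions for this sketch:

```python
from typing import Dict, Iterable

def merge_reports(local: Dict[str, dict], incoming: Iterable[dict]) -> Dict[str, dict]:
    """Merge object reports arriving from the server and from neighboring
    roadside devices, keeping only the newest report per object.

    `local` maps object_id -> report; every report is assumed to carry
    at least 'object_id' and 'timestamp' fields.
    """
    for report in incoming:
        oid = report["object_id"]
        known = local.get(oid)
        # drop stale duplicates: keep whichever report is more recent
        if known is None or report["timestamp"] > known["timestamp"]:
            local[oid] = report
    return local
```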
It should be noted that a mobile terminal, such as a smartphone carried by the driver or a vehicle-mounted smart speaker, may also be connected to the network formed by the server 160 and the roadside sensing devices 200. Information intended for the vehicle that also suits the mobile terminal (e.g., navigation information, parking space information, driving advice, vehicle-surroundings information) may likewise be sent to the terminal so that it can help maneuver the associated vehicle based on that information.
FIG. 2 shows a schematic diagram of a roadside sensing device 200 according to one embodiment of the invention. As shown in fig. 2, the roadside sensing device 200 includes a communication unit 210, a sensor group 220, a storage unit 230, and a calculation unit 240.
The roadside sensing device 200 communicates with each vehicle 110 entering its coverage, providing the vehicle 110 with surroundings information and driving-related information and receiving the vehicle's travel information in return. It also communicates with the server 160 and with neighboring roadside sensing devices 200. The communication unit 210 provides these communication functions and may employ various communication methods, including but not limited to Ethernet, V2X, and 5G/4G/3G mobile communication, as long as data communication completes with as little latency as possible. In one embodiment, the device 200 communicates via V2X with the vehicles 110 entering its coverage and with neighboring devices 200, and with the server 160 via, for example, high-speed internet.
The sensor group 220 includes various sensors, for example radar sensors such as a millimeter-wave radar 222 and a lidar 224, and image sensors such as a camera 226 and an infrared camera 228 with supplemental illumination. For the same object, different sensors obtain different properties: a radar sensor can measure an object's speed and acceleration, while an image sensor can obtain its shape and relative angle.
The sensor group 220 uses these sensors to collect and sense the static conditions of the road within the coverage (lane lines 120, guardrails, median strips, roadside parking spaces, road gradient and inclination, water and snow on the road, etc.) and its dynamic conditions (moving vehicles 110, pedestrians 130, and spilled objects), and stores the collected and sensed data in the storage unit 230.
The computing unit 240 fuses the data sensed by the sensors into road data for the road section and stores it in the storage unit 230 as well. The computing unit 240 may further analyze the road data, identify one or more vehicles and their motion information, and determine driving-related information for the vehicle 110. Such data and information may be stored in the storage unit 230 and sent to the vehicle 110 or the server 160 via the communication unit 210.
Specifically, the computing unit 240 may acquire pre-stored static information about the road within the device's predetermined range. Once the roadside sensing device 200 is deployed at a given position, the road range it covers is fixed, and static information for that range, such as road width, number of lanes, and turning radius, can be obtained. There are several ways to do so: in one embodiment, this static information is pre-stored in the device 200 when it is deployed; in another, the device's position information is obtained first and a request containing it is sent to the server 160, which returns the static information for the relevant road range.
Subsequently, the computing unit 240 processes the raw data of each sensor separately to form sensing data such as distance measurements, speed measurements, type identification, and size identification. Then, based on the static road data, it takes different sensor data as the reference in different situations and calibrates it against the other sensors' data, finally forming unified road data.
The invention is not limited to the particular manner in which the data of the various sensors is fused to form the roadway data. This approach is within the scope of the present invention as long as the road data contains static and dynamic information for various objects within a predetermined range of the road location.
According to one embodiment, each vehicle 110 entering the coverage of the roadside sensing device 200 actively communicates with it through various communication means (e.g., V2X). The vehicle 110 may thereby send the device 200 its travel information, i.e., the information the vehicle has about its own driving, including for example the current time at which the information is generated and the vehicle's size, speed, acceleration, angular velocity, and position. The computing unit 240 may then further fuse the travel information obtained from the vehicle 110 into the previously formed road data to produce new road data.
In addition, the storage unit 230 may store various calculation models, such as a collision detection model, a license plate recognition model, a parking space recognition model, and a parking lot entrance/exit device model. These models may be used by the computing unit 240 to implement the corresponding steps of the method 300 described below with reference to fig. 3.
According to one embodiment of the present invention, an object can be established in the road data for each item on the road, such as static street light objects, sign objects, and lane objects, and dynamic vehicle objects, pedestrian objects, temporary falling-object objects, temporary lane-change objects, and the like. The computing unit 240 then uses the calculation models stored in the storage unit 230 to compute the interactions between the various objects, thereby obtaining various kinds of driving-related information.
Fig. 3 shows a schematic representation of a driving assistance method 300 for a vehicle according to an embodiment of the invention. The method 300 is suitably performed in the roadside sensing device 200 shown in fig. 2. As shown in fig. 3, the method 300 starts at step S310.
In step S310, road data is acquired. The road data comprises static and/or dynamic information of all or some objects within the road range of at least one roadside sensing device. As described above with reference to fig. 1, a roadside sensing device 200 is generally fixed near a road and therefore has a corresponding road position. It also has a predetermined coverage that depends at least on its mounting height and the effective sensing distance of its sensors. Once the device 200 is deployed beside a road, the predetermined range of the road it can cover is determined by the specific positions and heights of the device and the road and by the effective sensing distances.
The roadside sensing device 200 uses its various sensors to collect and/or sense the static conditions of the road within its coverage (lane lines 120, guardrails, median strips, etc.) and its dynamic conditions (moving vehicles 110, pedestrians 130, and spilled objects), obtaining and storing various sensor data.
As described above, the roadside sensing device 200 includes various sensors, for example radar sensors such as the millimeter-wave radar 222 and the lidar 224, and image sensors such as the camera 226 and the infrared camera 228 with supplemental illumination. For the same object, different sensors obtain different properties: a radar sensor can measure an object's speed and acceleration, while an image sensor can obtain its shape and relative angle.
In step S310, the obtained raw sensor data may be processed and fused to form unified road data. In one embodiment, step S310 may further include a sub-step S312, in which pre-stored static information about the road range is acquired. After the roadside sensing device is deployed at a position on a road, the road range it covers is fixed, and static information for that range, such as road width, number of lanes, lane lines, speed limit signs, and turning radius, can be obtained. There are several ways to do so: in one embodiment, this static information is pre-stored in the device 200 at deployment; in another, the device's position information is obtained first and a request containing it is sent to the server 160, which returns the static information for the relevant road range.
Subsequently, in step S314, the raw data of each sensor is processed separately to form sensing data covering the distance, speed, type, and size of dynamic objects and the size, content, and location of the various static objects. Next, in step S316, based on the static road data obtained in step S312, different sensor data are taken as the reference and calibrated against the other sensors' data, finally forming unified road data.
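As a rough illustration of the calibration in step S316, the rule below, taking radar as the kinematic reference and the camera for type and size, is one plausible choice rather than anything mandated by the text:

```python
def fuse_detections(radar_obj: dict, camera_obj: dict) -> dict:
    """Fuse a radar track and a camera detection that have already been
    associated with the same physical object into one unified record.

    Radar serves as the reference for ranging and speed; the camera
    contributes the type and size that radar cannot classify.
    """
    return {
        "position": radar_obj["position"],     # radar ranging as the reference
        "speed_mps": radar_obj["speed_mps"],   # radar (e.g. Doppler) speed
        "type": camera_obj["type"],            # image-classification result
        "size": camera_obj["size"],            # derived from the bounding box
        # keep the later of the two clock-synchronized capture times
        "timestamp": max(radar_obj["timestamp"], camera_obj["timestamp"]),
    }
```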
Steps S312-S316 describe one way to obtain road data. The invention is not limited to the particular manner in which the data of the various sensors are fused into road data; any approach is within the scope of the invention as long as the road data contains static and dynamic information about the various objects within the predetermined range of the road position.
According to one embodiment, each vehicle 110 entering the coverage of the roadside sensing device 200 actively communicates with it through various communication means (e.g., V2X). Thus, as described in step S318, the vehicle 110 sends its travel information to the device 200. This travel information includes, for example, the current time at which it is generated and the vehicle's size, speed, acceleration, angular velocity, and position. Step S310 further includes a sub-step S319, in which the travel information obtained in step S318 is fused into the road data formed in step S316 to form new road data.
After each roadside sensing device 200 has collected static and/or dynamic information about the objects within its coverage or covered road section, the information collected by at least one neighboring device 200 may be combined to form the road data for a road section. This combination may be performed in the server 160 coupled to the devices, or in any one of the devices themselves.
Next, in step S320, one or more vehicle objects within the device's coverage are identified based on the road data obtained in step S310. This identification has two aspects. The first is vehicle recognition, i.e., identifying which objects in the road data are vehicles. Vehicle objects have distinctive motion characteristics: relatively high speed, travel within a lane in one direction, and, as a rule, no collisions with other objects. A conventional classification model or a deep-learning-based model can be built on these characteristics and applied to the road data to determine the vehicle objects and their motion characteristics, such as their trajectories.
The second aspect is determining a vehicle identifier for each recognized vehicle object. One way is to determine the vehicle's unique license plate, for example by image recognition. When the license plate cannot be recognized, another way is to generate a unique mark for the vehicle by combining the vehicle object's size, type, position information, driving speed, and the like. This identifier uniquely identifies the vehicle object within the road section, distinguishes it from other vehicle objects, is used in subsequent data transmission, and is passed between the roadside sensing devices along the road to facilitate overall analysis.
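A hypothetical sketch of the fallback identifier just described; the hashing scheme and prefixes are invented, and in practice such a mark would presumably be assigned once at first detection and then carried along by object tracking:

```python
import hashlib
from typing import Optional, Tuple

def make_vehicle_id(plate: Optional[str],
                    vtype: str,
                    size: Tuple[float, float],
                    position: Tuple[float, float],
                    speed_mps: float) -> str:
    """Return an identifier unique to this vehicle within the road section.

    Prefer the recognized license plate; if none could be read, derive a
    mark from the vehicle object's observable features at first detection.
    """
    if plate:
        return f"plate:{plate}"
    features = f"{vtype}|{size}|{position}|{speed_mps:.1f}"
    digest = hashlib.sha1(features.encode("utf-8")).hexdigest()[:12]
    return f"feat:{digest}"
```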
Optionally, after a vehicle is identified, vehicle matching is required, i.e., the vehicle object to be analyzed must be matched with the vehicle that is to receive the resulting surrounding-object information. Matching can use various modes or combinations of them, such as license plate matching, driving speed and type matching, and fuzzy matching on position information. According to one embodiment, the vehicle 110 may bind its license plate information through V2X or application verification; that license plate can then be matched to the vehicle data for the corresponding plate in the roadside sensing device and the server, implementing license plate matching.
It should be noted that step S320 does not require all vehicle objects to be recognized; only some of them may be recognized as needed. The invention is not limited by the number of vehicle objects identified in step S320.
Subsequently, in step S330, for a vehicle object identified in step S320, in particular a target vehicle object to be processed further, the vehicle-surrounding objects associated with it are searched for in the road data acquired in step S310. As described above with reference to fig. 2, various calculation models are pre-stored in the storage unit of the roadside sensing device 200. These models can be used to compute the association between the vehicle object and the other objects in the road data and to determine the vehicle object's surrounding objects.
The surrounding objects can be found from the road data in various ways. According to one embodiment, they may be all objects in the road data whose distance from the vehicle object is within a predetermined range. For this purpose, an object search model may compute the distance between the vehicle object and a target object from the vehicle object's features (position, size, traveling direction, and/or traveling speed) and the target object's features (position, size, moving speed, and/or direction), and select as surrounding objects those target objects whose distance from the vehicle object falls within the predetermined range.
The object search model may employ various algorithms for the distance calculation, such as graph analysis algorithms or combinations of algorithms. The invention is not limited to a specific implementation of the object search model; any calculation method that can determine the relative distance between the vehicle object and a target object is within the scope of the invention.
According to another embodiment, only the objects within the predetermined range that are relevant to the vehicle object's travel may be selected as surrounding objects. Travel-relevant objects are those that may affect the vehicle's travel on the road, including, for example, vehicles traveling around it (both ahead and behind), oncoming vehicles on a two-way road, the road's lane objects, material spilled on the road surface, accumulated water and snow, temporary roadblocks, street lamps, and pedestrians. For this purpose, a travel-correlation model may determine the one or more target objects relevant to the vehicle object's travel. Such a model can be built from a large amount of historical data using various algorithms, such as big data analysis and deep learning.
Thus, based on the object search model and the travel-correlation model, an object within a predetermined distance of the vehicle object and relevant to its travel can be determined to be a vehicle-surrounding object, as sketched below.
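Combining the two models might look like the following sketch; the Euclidean distance, the 100 m threshold, and the relevance predicate are stand-ins for the patent's object search model and travel-correlation model:

```python
import math

# object kinds a travel-correlation model might deem travel-relevant (assumed)
RELEVANT_KINDS = {"vehicle", "pedestrian", "lane_line", "roadblock", "spill"}

def find_surrounding_objects(target, road_objects, max_dist_m: float = 100.0):
    """Select the objects within a predetermined distance of the target
    vehicle object that are also relevant to its travel.

    Each object is assumed to expose .object_id, .kind, and .position
    as (x, y) in a shared local road frame (see RoadObject above).
    """
    def distance(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    return [
        obj for obj in road_objects
        if obj.object_id != target.object_id
        and distance(obj.position, target.position) <= max_dist_m
        and obj.kind in RELEVANT_KINDS     # stand-in for the learned model
    ]
```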
After the surrounding objects have been generated for the target vehicle object in step S330, they are sent in step S340 to the vehicle corresponding to that vehicle object. According to one embodiment, the vehicle automatically established communication with the roadside sensing device 200 when it entered the device's coverage, so the surrounding objects may be sent to the vehicle 110 over the previously established channel. They may then be displayed on the control interface of the vehicle 110 so that the driver can observe the surrounding environment 360 degrees without blind spots and maneuver the vehicle 110 accordingly, improving its driving safety.
Optionally, to present the surroundings more clearly on the vehicle 110, the vehicle object itself may be sent together with the surrounding objects. The on-board system can also provide information about the vehicle, such as its speed, traveling direction, size, and current position, but that information is generated solely by sensors deployed on the vehicle and may be incomplete or inaccurate, for example when the vehicle has been modified or is carrying objects that extend beyond its length. The information about the vehicle obtained by the roadside sensing device 200, using sensors deployed outside the vehicle, is more accurate. The vehicle object information from the road data may therefore also be sent to the vehicle 110, which can then display the vehicle object together with its surrounding objects based on that information, or on a combination of it and the information acquired by the on-board system, providing a more accurate 360-degree all-around impression.
In addition, it should be noted that a mobile terminal, such as a smartphone carried by the driver or a vehicle-mounted smart speaker, may also be connected to the network formed by the server 160 and the roadside sensing devices 200. In step S340, besides being sent to the vehicle corresponding to the vehicle object, the surrounding-object information may be sent to a mobile terminal associated with the vehicle, for display on the terminal or, via the terminal, on the vehicle.
Fig. 4 shows a schematic representation of a driving assistance method 400 according to another embodiment of the invention. The method 400 is adapted to be executed in a vehicle 110 traveling on a road on which roadside sensing devices 200 are disposed.
The method 400 includes step S410, in which the vehicle-surrounding objects are received from the nearby roadside sensing device 200 through a predetermined communication means. These objects are generated by the roadside sensing device 200 using the method 300 described above with reference to fig. 3 and are not described again here.
Subsequently, in step S420, the surrounding objects are displayed in the vehicle 110, so that the driver obtains a 360-degree all-around impression of the vehicle's surroundings and can refer to it while driving, improving the vehicle's safety and efficiency. In one embodiment, the vehicle 110 has a display interface in which the surrounding objects may be displayed.
Optionally, according to an embodiment of the present invention, the surrounding objects are displayed together with the vehicle object, giving the driver a clearer impression of how they relate to the vehicle. Since the road data acquired by the roadside sensing device 200 contains more comprehensive information about the vehicle object (particularly when the vehicle has been modified or carries articles extending beyond its size), the vehicle object itself is also received along with the surrounding objects in step S410.
In addition, according to one embodiment, when displaying the surrounding objects, both they and the vehicle can be presented based on the surrounding objects' features (size, speed, and direction), the vehicle's own features (size, speed, and traveling direction), and their positions relative to the vehicle. For example, the surrounding objects may be presented by their relative positions and directions with respect to the vehicle, i.e., with the vehicle at the focus of the screen, providing a 360-degree all-around effect.
Specifically, the visual focus position on the display screen is first determined from the relative position of the driver's eyes and the screen while driving. This focus may be a predetermined position in the display area and varies with the relative position of the screen and the driver's seat.
The vehicle itself is then presented at this visual focus position. The direction of the vehicle's head on the screen can be made consistent with the displayed driving direction, matching the viewer's visual intuition. Then, based on the vehicle's position and orientation, the surrounding objects are presented according to their positions relative to the vehicle. The visual focus is generally placed below the middle of the display area, so that when the surrounding objects are arranged around the vehicle object, those behind the vehicle can also be shown, giving a 360-degree omnidirectional display.
To give a more intuitive impression, it is desirable to render the image in a style similar to human vision, for example with depth of field: surrounding objects farther from the viewpoint are drawn at a relatively smaller scale. In this presentation, an image projection point can be set at a predetermined position behind the vehicle along its traveling direction, and the surrounding objects are projected onto the screen according to their position, size, and speed relative to the vehicle. The projection point can further be set higher than the vehicle, providing a look-down view of the vehicle and its surroundings so that more detail of the environment can be seen.
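A minimal sketch of such a projection; placing the virtual viewpoint behind and above the vehicle and dividing by depth is the standard pinhole trick, while the specific offsets and focal constant here are arbitrary illustration values:

```python
import math
from typing import Optional, Tuple

def project_to_screen(obj_xy: Tuple[float, float],
                      vehicle_xy: Tuple[float, float],
                      heading_rad: float,
                      cam_back_m: float = 15.0,
                      focal: float = 400.0) -> Optional[Tuple[float, float, float]]:
    """Project a road-plane point into display coordinates using a virtual
    viewpoint placed behind (and, implicitly, above) the vehicle along its
    traveling direction.

    Returns (screen_x, screen_y, scale); the scale shrinks with distance
    from the viewpoint, producing the depth-of-field effect, or None when
    the point lies behind the viewpoint and is not drawn.
    """
    # rotate into a vehicle-centred frame whose +y axis is the travel
    # direction; heading_rad is measured from +y (compass convention)
    dx = obj_xy[0] - vehicle_xy[0]
    dy = obj_xy[1] - vehicle_xy[1]
    cos_h, sin_h = math.cos(heading_rad), math.sin(heading_rad)
    lateral = dx * cos_h - dy * sin_h   # left/right of the vehicle
    forward = dx * sin_h + dy * cos_h   # ahead of (+) or behind (-) the vehicle
    depth = forward + cam_back_m        # distance from the virtual viewpoint
    if depth <= 0:
        return None
    scale = focal / depth               # pinhole-style perspective division
    return lateral * scale, forward * scale, scale
```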
Fig. 5 shows a schematic diagram of an interface 500 presenting the objects in a vehicle's surroundings according to an embodiment of the invention. As shown in fig. 5, the interface 500 takes the vehicle's current position as the reference position and its traveling direction as the reference direction, and each surrounding object 520 is displayed on the screen at its position relative to the vehicle object 510. The display provides a bird's-eye view from behind the vehicle object 510, introducing a depth-of-field effect. In addition, to give the driver a more intuitive impression, the surrounding objects 520 are displayed de-emphasized, for example blurred, while the vehicle object 510 itself is highlighted, for example in red.
As shown in fig. 5, although some surrounding objects 520 are not visible in the current field of view of the vehicle 110 (being blocked by other objects 520, behind the vehicle 110, etc.), roadside sensing by the device 200 allows more comprehensive information about the surrounding objects 520 to be obtained from the road data and displayed on the screen 500. This presentation gives the driver a different visual angle, in particular a 360-degree all-around view, making it easy to grasp the vehicle's surroundings accurately while driving and to take driving actions accordingly, improving driving safety.
It should also be noted that, besides being adapted to be performed in the vehicle 110, the driving assistance method 400 described above with reference to figs. 4 and 5 may also be performed on a mobile terminal associated with the vehicle 110. For example, the driver may carry a smartphone, or the car may have a smart speaker, serving as the mobile terminal. Such a terminal may receive the surrounding-object information when the vehicle enters the coverage of the roadside sensing device 200 and perform the method 400 itself, or forward the received information to the associated vehicle for subsequent processing.
According to the driving assistance scheme of the invention, the roadside sensing devices sense and aggregate the static and dynamic information on the roads within their coverage to form road data, a driving analysis is performed for each vehicle, and the objects around each vehicle are provided to it as the analysis result, so that the driver can conveniently obtain more comprehensive information about the vehicle's surroundings, improving driving safety and efficiency.
In addition, the scheme provides a more intuitive way of displaying the vehicle object and the objects related to it on the vehicle, so that the driver can gain an impression of the surroundings beyond his or her field of view and drive more safely.
It should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules or units or components of the devices in the examples disclosed herein may be arranged in a device as described in this embodiment or alternatively may be located in one or more devices different from the devices in this example. The modules in the foregoing examples may be combined into one module or may be further divided into multiple sub-modules.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that, while some embodiments described herein include some features that are included in other embodiments but not others, combinations of features of different embodiments are meant to be within the scope of the invention and to form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
Furthermore, some of the embodiments are described herein as methods, or as combinations of method elements, that can be implemented by a processor of a computer system or by other means of carrying out the described functions. A processor provided with the necessary instructions for carrying out such a method or method element thereby forms a means for carrying out that method or method element. Likewise, the elements of the apparatus embodiments described herein are examples of apparatus for carrying out the functions performed by those elements for the purpose of practicing the invention.
As used herein, unless otherwise specified, the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object merely indicates that different instances of like objects are being referred to, and is not intended to imply that the objects so described must be in a given sequence, whether temporally, spatially, in ranking, or in any other manner.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having the benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as described herein. It should also be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The present invention has been disclosed in an illustrative rather than a restrictive sense, and its scope is defined by the appended claims.

Claims (18)

1. A driving assistance method for a vehicle, comprising the steps of:
acquiring road data, wherein the road data comprises: static and/or dynamic information, acquired by at least one roadside sensing unit, of all or part of the objects within the corresponding road range;
identifying all or part of the vehicle objects based on the road data;
for a target vehicle object among the vehicle objects, determining the vehicle peripheral objects of the target vehicle object; and
sending information on the vehicle peripheral objects to the target vehicle, so that the target vehicle presents the positional relationship between the vehicle and the vehicle peripheral objects.
2. The driving assistance method according to claim 1, wherein the step of acquiring road data comprises:
acquiring pre-stored static information about the corresponding road range;
obtaining static and/or dynamic information of the objects within the road range using sensors deployed in the roadside sensing device covering that road range; and
combining the pre-stored static information with the information obtained by the respective sensors to generate the road data.
3. The driving assistance method according to claim 2, wherein the step of acquiring road data within the road range comprises:
receiving, in a predetermined communication mode, vehicle travel information sent by vehicles within the road range; and
combining the pre-stored static information, the information obtained by the respective sensors, and the received vehicle travel information to generate the road data.
4. The driving assistance method according to any one of claims 1-3, wherein the step of identifying all or part of the vehicle objects based on the road data comprises:
determining, based on the motion characteristics of each object, the objects that are vehicles; and
identifying the identity of each vehicle object so as to determine all or part of the vehicle objects.
5. The driving assistance method according to any one of claims 1-4, wherein the step of determining the vehicle peripheral objects of the target vehicle object comprises:
acquiring, from the road data, the objects that are within a predetermined distance of the target vehicle object and are relevant to its travel, as the vehicle peripheral objects of the target vehicle object.
6. The driving assistance method according to any one of claims 1-5, wherein the step of sending the vehicle peripheral object information to the target vehicle comprises:
sending the target vehicle object together with the vehicle peripheral objects to the target vehicle.
7. A driving assistance method performed in a vehicle traveling on a road on which a roadside sensing device is deployed, the method comprising the steps of:
receiving vehicle peripheral object information, the vehicle peripheral objects being generated for the vehicle by the roadside sensing device from road data and indicating the objects associated with the vehicle; and
presenting the vehicle peripheral objects.
8. The driving assistance method according to claim 7, wherein the step of presenting the vehicle peripheral objects comprises:
presenting each vehicle peripheral object based on its features, on the size and traveling direction of the vehicle, and on the position of the vehicle peripheral object relative to the vehicle.
9. The driving assistance method according to claim 8, wherein the step of presenting the vehicle peripheral objects further comprises:
presenting the vehicle peripheral objects with the vehicle as the reference.
10. The driving assistance method according to claim 9, wherein the step of presenting the vehicle peripheral objects with the vehicle as the reference comprises:
presenting the vehicle at a predetermined location of a display area; and
presenting each vehicle peripheral object according to its position relative to the vehicle.
11. The driving assistance method according to claim 10, wherein the step of presenting the vehicle peripheral objects with the vehicle as the reference comprises:
presenting the vehicle peripheral objects and the vehicle according to their relative position and speed relationship, with a predetermined position behind the vehicle, taken along its traveling direction, as the projection point.
12. The driving assistance method according to claim 11, wherein the predetermined position is higher than the vehicle.
13. The driving assistance method according to any one of claims 7-12, wherein receiving the vehicle peripheral object information comprises receiving the vehicle object together with the vehicle peripheral objects, and
wherein the speed, position and/or size of the vehicle are determined from the received speed, position and/or size characteristics of the vehicle object.
14. The driving assistance method according to any one of claims 7-13, wherein the step of presenting the vehicle peripheral objects comprises:
displaying the vehicle object with emphasis; and
displaying the vehicle peripheral objects in a de-emphasized manner.
15. A roadside sensing device deployed at a road location, comprising:
sensors adapted to obtain static and dynamic information of objects within a predetermined range;
a storage unit adapted to store road data comprising the static and dynamic information of the objects within the predetermined range; and
a computing unit adapted to perform the method of any one of claims 1-6.
16. A driving assistance system comprising:
a plurality of roadside sensing devices as recited in claim 15, deployed at roadside locations along a road; and
a vehicle that travels on the road and executes the driving assistance method according to any one of claims 7-15.
17. A computing device, comprising:
at least one processor; and
a memory storing program instructions configured to be executed by the at least one processor, the program instructions comprising instructions for performing the method of any one of claims 1-14.
18. A driving assistance method for a vehicle, comprising the steps of:
acquiring road data, wherein the road data comprises: static and/or dynamic information, acquired by at least one roadside sensing unit, of all or part of the objects within the corresponding road range;
identifying all or part of the vehicle objects based on the road data;
for a target vehicle object among the vehicle objects, determining the vehicle peripheral objects of the target vehicle object; and
sending the vehicle peripheral object information to a mobile terminal associated with the target vehicle, so that the mobile terminal or the target vehicle presents the positional relationship between the vehicle and the vehicle peripheral objects.
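For illustration only (not part of the claims; the function and its parameters are invented names): claims 11 and 12 describe projecting the scene from a predetermined position behind and above the vehicle, which corresponds to a familiar chase-camera view. A minimal perspective projection of that kind might look like the following sketch.

```python
# Hypothetical helper for illustration; back, up and f are assumed values.
import numpy as np

def chase_camera_project(points, ego_pos, heading, back=8.0, up=5.0, f=300.0):
    """Project ground-plane road points onto a 2-D view whose camera sits
    `back` meters behind and `up` meters above the ego vehicle, looking
    forward along the vehicle's heading (radians)."""
    c, s = np.cos(heading), np.sin(heading)
    cam = np.array([ego_pos[0] - back * c, ego_pos[1] - back * s, up])
    out = []
    for p in np.atleast_2d(points):
        d = np.array([p[0], p[1], 0.0]) - cam       # ray from camera to point
        # camera frame: forward along heading, lateral to its left, height up
        forward = d[0] * c + d[1] * s
        lateral = -d[0] * s + d[1] * c
        height = d[2]
        if forward <= 0:                             # point is behind the camera
            out.append(None)
            continue
        out.append((f * lateral / forward, f * height / forward))
    return out
```

Under this projection, objects ahead of the vehicle land near the center of the view while the vehicle itself appears in the lower foreground, which is the presentation relationship recited in claims 10-12.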
CN201811565073.XA 2018-12-20 2018-12-20 Driving assisting method and system Pending CN111354222A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811565073.XA 2018-12-20 2018-12-20 Driving assisting method and system


Publications (1)

Publication Number Publication Date
CN111354222A true CN111354222A (en) 2020-06-30

Family

ID=71195491

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811565073.XA Driving assisting method and system 2018-12-20 2018-12-20

Country Status (1)

CN (1) CN111354222A (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101782646A (en) * 2009-01-19 2010-07-21 财团法人工业技术研究院 All-round environment sensing system and method
KR20160071734A (en) * 2014-12-12 2016-06-22 삼성전자주식회사 Method and apparatus for traffic safety
CN108022450A (en) * 2017-10-31 2018-05-11 华为技术有限公司 A kind of auxiliary driving method and traffic control unit based on cellular network
CN108417087A (en) * 2018-02-27 2018-08-17 浙江吉利汽车研究院有限公司 A kind of vehicle safety traffic system and method
CN108694859A (en) * 2017-02-28 2018-10-23 大唐高鸿信息通信研究院(义乌)有限公司 A kind of trackside node high risk vehicle alarm prompt method suitable for vehicle-mounted short distance communication network
CN108765982A (en) * 2018-05-04 2018-11-06 东南大学 Signalized crossing speed guiding system and bootstrap technique under bus or train route cooperative surroundings


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112382085A (en) * 2020-10-20 2021-02-19 华南理工大学 System and method suitable for intelligent vehicle traffic scene understanding and beyond visual range perception
CN112735130A (en) * 2020-12-25 2021-04-30 北京百度网讯科技有限公司 Traffic data processing method and device, electronic equipment and medium
CN112735130B (en) * 2020-12-25 2022-05-10 阿波罗智联(北京)科技有限公司 Traffic data processing method and device, electronic equipment and medium
US11821746B2 (en) 2020-12-25 2023-11-21 Apollo Intelligent Connectivity (Beijing) Technology Co., Ltd. Method and apparatus of processing traffic data, device and medium
CN114694368A (en) * 2020-12-28 2022-07-01 比亚迪股份有限公司 Vehicle management and control system
CN113238496A (en) * 2021-04-20 2021-08-10 东风汽车集团股份有限公司 Parallel driving controller control system, method and medium of integrated on-board unit (OBU)
CN114373295A (en) * 2021-11-30 2022-04-19 江铃汽车股份有限公司 Driving safety early warning method, system, storage medium and equipment
CN114648870A (en) * 2022-02-11 2022-06-21 行云新能科技(深圳)有限公司 Edge calculation system, edge calculation decision prediction method, and computer-readable storage medium
CN114399924A (en) * 2022-02-15 2022-04-26 青岛海信网络科技股份有限公司 Vehicle, edge computing device, server and information transmission method

Similar Documents

Publication Publication Date Title
CN111354182A (en) Driving assisting method and system
CN111354222A (en) Driving assisting method and system
US20230005364A1 (en) Systems and methods for monitoring traffic lane congestion
US10984557B2 (en) Camera calibration using traffic sign recognition
US11727799B2 (en) Automatically perceiving travel signals
CN111429739A (en) Driving assisting method and system
CN115143987A (en) System and method for collecting condition information associated with a road segment
CN115380196A (en) System and method for determining road safety
US10650256B2 (en) Automatically perceiving travel signals
CN110942623B (en) Auxiliary traffic accident handling method and system
WO2020057406A1 (en) Driving aid method and system
CN111354214B (en) Auxiliary parking method and system
CN111595357B (en) Visual interface display method and device, electronic equipment and storage medium
CN110940347B (en) Auxiliary vehicle navigation method and system
US20180299893A1 (en) Automatically perceiving travel signals
CN111094095B (en) Method and device for automatically sensing driving signal and vehicle
US10732420B2 (en) Head up display with symbols positioned to augment reality
US20170103271A1 (en) Driving assistance system and driving assistance method for vehicle
US20180300566A1 (en) Automatically perceiving travel signals
JP2008097279A (en) Vehicle exterior information display device
CN113212451A (en) Rearview auxiliary system for intelligent driving automobile
CN110763244B (en) Electronic map generation system and method
JP2022056153A (en) Temporary stop detection device, temporary stop detection system, and temporary stop detection program
JP7432198B2 (en) Situation awareness estimation system and driving support system
JP7449497B2 (en) Obstacle information acquisition system

Legal Events

Date Code Title Description

PB01 Publication

SE01 Entry into force of request for substantive examination

TA01 Transfer of patent application right
Effective date of registration: 20201218
Address after: Room 603, 6/F, Roche Plaza, 788 Cheung Sha Wan Road, Kowloon, China
Applicant after: Zebra smart travel network (Hong Kong) Ltd.
Address before: Fourth Floor, Capital Building, P.O. Box 847, Grand Cayman, Cayman Islands
Applicant before: Alibaba Group Holding Ltd.

RJ01 Rejection of invention patent application after publication
Application publication date: 20200630