CN113147751A - Driving assistance system, method and readable storage medium for vehicle - Google Patents

Driving assistance system, method and readable storage medium for vehicle

Info

Publication number
CN113147751A
CN113147751A (application CN202010008846.5A)
Authority
CN
China
Prior art keywords
pedestrian
motor vehicle
current vehicle
driving assistance
vehicle user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202010008846.5A
Other languages
Chinese (zh)
Inventor
曲彤
唐帅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Audi AG
Original Assignee
Audi AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Audi AG
Priority to CN202010008846.5A
Publication of CN113147751A
Legal status: Withdrawn

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/09 Taking automatic action to avoid collision, e.g. braking and steering
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/143 Alarm means

Abstract

The invention provides a driving assistance system for a vehicle, a vehicle comprising the same, and a corresponding method and readable storage medium. The driving assistance system includes: an image acquisition unit configured to acquire images of pedestrians and/or non-motor vehicle users around a current vehicle; an image processing unit configured to detect a behavior condition of a pedestrian and/or a non-motor vehicle user based on the acquired image of the pedestrian and/or the non-motor vehicle user; a risk assessment unit configured to estimate a risk value of the presence of a collision object in an undetectable area of the current vehicle based on the detected behavior condition of the pedestrian and/or non-motor vehicle user; and a driving assistance unit configured to provide corresponding driving assistance for the current vehicle when the estimated risk value exceeds a preset risk threshold. With the solution of the invention, the risk value of a collision object being present in an undetectable area of the current vehicle can be determined.

Description

Driving assistance system, method and readable storage medium for vehicle
Technical Field
The present invention relates to the field of vehicles, and more particularly, to a driving assistance system for a vehicle, a vehicle including the same, and a corresponding driving assistance method and computer-readable storage medium.
Background
In order to reduce traffic accidents, a safety-oriented driving assistance system, such as a collision warning system, is usually installed on a vehicle. Such a system can sense the road environment through sensors such as radar, laser, ultrasonic, and infrared sensors, determine the possibility of a collision from the detected information, and warn the driver in time when a danger is possible.
However, current advanced driver assistance systems (e.g., ADAS systems) can only respond to vehicles or surroundings that are directly visible. Although the sensors of an ADAS system can cover the driver's blind spots and capture as much of the visible area as possible, some moving objects (e.g., pedestrians, bicycles, or even vehicles) may be occluded by other objects. The driving assistance system of the vehicle then obtains incomplete information about these moving objects, or no information at all, or perceives them too late to perform an evasive maneuver. As a result, timely hazard assessment and evasive action are impossible, and traffic accidents occur.
Therefore, there is a need for a driving assistance system capable of predicting the presence of a collision object in an undetectable area of a current vehicle.
Disclosure of Invention
To address this technical problem, the invention provides a solution that determines a risk value for the presence of a collision object in an undetectable area of the current vehicle based on the behavior conditions of pedestrians and/or non-motor vehicle users around the current vehicle, with the aim of reducing the vehicle's collision risk and improving driving safety.
Specifically, according to a first aspect of the present invention, there is provided a driving assistance system for a vehicle, the system including:
an image acquisition unit configured to acquire images of pedestrians and/or non-motor vehicle users around a current vehicle;
an image processing unit configured to detect a behavior condition of the pedestrian and/or non-motor vehicle user based on the acquired image of the pedestrian and/or non-motor vehicle user;
a risk assessment unit configured to estimate a risk value of the presence of a collision object in an undetectable area of the current vehicle based on the detected behavior condition of the pedestrian and/or non-motor vehicle user;
a driving assistance unit configured to provide corresponding driving assistance for the current vehicle when the estimated risk value exceeds a preset risk threshold.
In one embodiment, the behavioral conditions of the pedestrian and/or non-motor vehicle user include: a pedestrian and/or a non-motor vehicle user's motion state, gaze direction, and/or facial expression.
In one embodiment, the image processing unit is further configured to detect any one or more of the following based on the acquired images of the pedestrian and/or non-motor vehicle user:
detecting whether the motion state of the pedestrian and/or non-motor vehicle user is inconsistent with the movement direction permitted by the traffic light and/or whether there is a change in the motion state;
detecting whether the gaze of the pedestrian and/or non-motor vehicle user has been directed toward an undetectable area of the current vehicle for longer than a preset time threshold; and/or
detecting whether there is a change in the facial expression of the pedestrian and/or non-motor vehicle user and/or whether there is an expression associated with surprise or fear.
In one embodiment, the risk assessment unit is further configured to: assign different weights to the respective situations detected by the image processing unit, and compute a weighted sum of the detection results to estimate the risk value of the presence of a collision object in the undetectable area of the current vehicle.
In one embodiment, the risk assessment unit is further configured to determine the position and/or distance, relative to the current vehicle, of a collision object possibly present in the undetectable area based on the position and gaze direction of the pedestrian and/or non-motor vehicle user in the acquired image.
In one embodiment, the driving assistance unit is further configured to, when the risk value estimated by the risk assessment unit exceeds a preset risk threshold:
providing alert information to the current vehicle to alert the current vehicle of the presence of a collision object in an undetectable area and to indicate the location and/or distance of the collision object relative to the current vehicle; and/or
providing a corresponding driving assistance operation to control the current vehicle to decelerate and/or adjust its driving direction.
According to another aspect of the present invention, there is provided a vehicle including the driving assistance system according to any one of the above.
According to still another aspect of the present invention, there is provided a driving assistance method for a vehicle, the method including:
acquiring images of pedestrians and/or non-motor vehicle users around a current vehicle;
detecting a behavioral condition of the pedestrian and/or non-motor vehicle user based on the acquired image of the pedestrian and/or non-motor vehicle user;
estimating a risk value for the presence of a collision object in an undetectable area of the current vehicle based on the detected behavior condition of the pedestrian and/or non-motor vehicle user; and
providing corresponding driving assistance for the current vehicle when the estimated risk value exceeds a preset risk threshold.
In one embodiment, the behavioral conditions of the pedestrian and/or non-motor vehicle user include: a pedestrian and/or a non-motor vehicle user's motion state, gaze direction, and/or facial expression.
In one embodiment, said detecting a behavioral condition of said pedestrian and/or non-motor vehicle user based on said acquired image of said pedestrian and/or non-motor vehicle user comprises:
detecting whether the motion state of the pedestrian and/or non-motor vehicle user is inconsistent with the movement direction permitted by the traffic light and/or whether there is a change in the motion state;
detecting whether the gaze of the pedestrian and/or non-motor vehicle user has been directed toward an undetectable area of the current vehicle for longer than a preset time threshold; and/or
detecting whether there is a change in the facial expression of the pedestrian and/or non-motor vehicle user and/or whether there is an expression associated with surprise or fear.
In one embodiment, the estimating a risk value of the presence of a collision object in an undetectable area of the current vehicle includes:
assigning different weights to the respective situations detected by the image processing unit, and computing a weighted sum of the detection results to estimate the risk value of the presence of a collision object in the undetectable area of the current vehicle.
In one embodiment, the method further comprises:
determining a position and/or distance of a collision object possibly present in the undetectable area with respect to a current vehicle based on a location and gaze direction of the pedestrian and/or non-motor vehicle user in the acquired image.
In one embodiment, providing the corresponding driving assistance to the current vehicle comprises:
providing alert information to the current vehicle to alert the current vehicle of the presence of a collision object in an undetectable area and to indicate the location and/or distance of the collision object relative to the current vehicle; and/or
providing a corresponding driving assistance operation to control the current vehicle to decelerate and/or adjust its driving direction.
According to a further aspect of the invention, there is provided a computer apparatus comprising a memory storing a computer program and a processor that implements any of the methods described above when executing the computer program.
According to a further aspect of the invention, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any of the above.
With the solution of the invention, the behavior conditions of pedestrians and/or non-motor vehicle users can be detected more accurately from their images in the visible detection area around the current vehicle; a risk value can be determined for the presence of a collision object in an undetectable area that the current vehicle cannot detect or see because it is occluded by an adjacent vehicle; and the current vehicle can be controlled to adjust its driving state according to that risk value. This greatly reduces the risk of collision between the current vehicle and a collision object in the undetectable area and significantly improves driving safety.
Drawings
Non-limiting and non-exhaustive embodiments of the present invention are described, by way of example, with reference to the following drawings, in which:
fig. 1 is a schematic view showing an application scenario of a driving assistance system and method according to an embodiment of the present invention;
fig. 2 shows a schematic diagram of a driving assistance system according to an embodiment of the invention;
fig. 3 shows a schematic view of a driving assistance system according to another embodiment of the invention;
fig. 4 shows a flowchart of a driving assistance method according to an embodiment of the invention.
Detailed Description
In order to make the above and other features and advantages of the present invention more apparent, the present invention is further described below with reference to the accompanying drawings. It is to be understood by those of ordinary skill in the art that the specific embodiments described herein are for purposes of illustration only and are not to be construed as limiting.
Fig. 1 is a schematic view showing an application scenario of a driving assistance system and method according to an embodiment of the present invention.
Specifically, the vehicle driving assistance system and method according to an embodiment of the invention may be applied to the current vehicle 10 in the application scenario shown in fig. 1. The current vehicle 10 may have an image acquisition unit 101 and an image processing unit 102 mounted thereon. In the embodiment shown in fig. 1, the image acquisition unit 101 may acquire images of pedestrians and/or non-motor vehicle users around the current vehicle through sensors mounted on the current vehicle, such as a camera, an ultrasonic sensor, or a radar. The image processing unit 102 may be, for example, a processing device installed in an in-vehicle system, in an intelligent mobile terminal of a vehicle user, in an online server (not shown), or in any other location where the acquired image can be processed. In one embodiment, the image acquisition unit 101 may also acquire the relevant images by communicating with infrastructure (e.g., roadside cameras) or an online server connected to the current vehicle 10, and the image processing unit 102 may process the images through a connection with the online server. Preferably, the image acquisition unit 101 may obtain not only images of pedestrians and/or non-motor vehicle users around the current vehicle, but also images of the surrounding environment, for example traffic signs at or around the current vehicle's position, areas of preset types such as bus stops, subway stations, and shopping malls (e.g., building 60), and surrounding vehicles.
As shown in fig. 1, when an adjacent vehicle 20 is on the right side of the current vehicle 10, its occlusion creates an undetectable area on that side which the sensors of the current vehicle cannot cover. If vehicles 40, 50 are present in this undetectable area, the pedestrian 30 or another non-motor vehicle user who observes them may stop moving forward, and his or her facial expression may change, possibly showing surprise or fear. The image acquisition unit 101 may acquire images of the pedestrian 30 or other non-motor vehicle user so that a behavior condition can be detected, such as the motion state (e.g., a sudden stop), gaze direction, and/or facial expression (e.g., a change from a neutral expression to one of surprise or fear). The image processing unit 102 processes the behavior and expression content of the captured images to detect the behavior conditions of the pedestrian and/or non-motor vehicle user. The risk assessment unit 103 then estimates, from these behavior conditions, a risk value for the presence of a collision object in the undetectable area of the current vehicle.
Since the adjacent vehicle 20 on the right side of the current vehicle 10 may partially block the view of the driver of the current vehicle 10 and block the detection signals of its sensors, the vehicles 40, 50, whose paths cross the traveling route of the current vehicle, may lie in the blind spot of the current vehicle 10. If the possible presence of the collision objects 40, 50 is not determined in time and no corresponding measures are taken, the risk that the current vehicle 10 collides with them will be relatively high.
Advantageously, the driving assistance system and method according to the present invention can estimate in advance a risk value for a collision object being present in the undetectable area of the current vehicle by taking into account the behavior conditions of pedestrians and/or non-motor vehicle users. If the estimated risk value exceeds a preset risk threshold, it is determined that a collision object 40, 50 exists in the undetectable area of the current vehicle 10, and corresponding driving assistance is provided to the current vehicle 10, for example reminding the driver in time to decelerate, stop, or take similar action. Thus, with the solution of the present invention, the driving state of the current vehicle 10 can be adjusted in advance, the risk of collision between the current vehicle 10 and a collision object in the undetectable area is reduced, and driving safety is improved.
It should be noted that the term "collision object" used herein should be interpreted broadly: the moving direction of the collision object need not be lateral with respect to the current vehicle; any object that poses a risk of collision with the current vehicle 10 qualifies.
Fig. 2 schematically shows a schematic view of a driving assistance system according to an embodiment of the invention.
As shown in fig. 2, the driving assistance system 100 according to an embodiment of the present invention includes an image acquisition unit 101, an image processing unit 102, a risk assessment unit 103, and a driving assistance unit 104.
In particular, the image acquisition unit 101 is configured to acquire images of pedestrians and/or non-motor vehicle users around the current vehicle, and may additionally acquire images of the surroundings, such as the environmental building 60 and neighboring vehicles.
Preferably, the image acquisition unit 101 may be configured to acquire images of pedestrians and/or non-motor vehicle users ahead of and to the left and right of the current vehicle, for example within a preset detectable area. The image acquisition unit 101 may be implemented in at least one of the following ways: it may include a sensor mounted on the vehicle 10; it may be connected to sensors mounted on the vehicle 10 to acquire images of the pedestrian 30 and/or a non-motor vehicle user; or it may communicate with neighboring vehicles 20, infrastructure, or an online server to acquire images of pedestrians 30 and/or non-motor vehicle users around the vehicle 10. The sensors may include cameras, millimeter-wave radars, lidar, ultrasonic sensors, any other suitable sensor, or combinations thereof.
The image processing unit 102 is configured to detect a behavior condition of the pedestrian and/or non-motor vehicle user based on the acquired image of the pedestrian and/or non-motor vehicle user. In one embodiment, the behavioral conditions of the pedestrian and/or non-motor vehicle user include: a pedestrian and/or a non-motor vehicle user's motion state, gaze direction, and/or facial expression.
The risk assessment unit 103 is configured for estimating a risk value for the presence of a collision object in an undetectable area of the current vehicle 10 based on the detected behavior condition of the pedestrian 30 and/or the non-motor vehicle user. The driving assistance unit 104 is configured to provide corresponding driving assistance for the current vehicle when the estimated risk value exceeds a preset risk threshold. For example, both the risk assessment unit 103 and the driving assistance unit 104 may be implemented by respective units in an in-vehicle terminal system or an online server.
With the above-described driving assistance system, the current vehicle can, in a situation where an occluded, undetectable area cannot be sensed, obtain images of pedestrians or non-motor vehicle users around it, determine their behavior conditions, such as motion state, gaze direction, and/or facial expression, by analyzing the images, and then estimate a risk value for the presence of a collision object in the undetectable area based on those behavior conditions. When the risk value exceeds a preset risk threshold, it is determined that a collision object exists around the current vehicle, and corresponding driving assistance, such as automatic deceleration, braking, or stopping, is provided in time.
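The end-to-end flow described above can be summarized in a short, self-contained sketch. Every function body, name, and threshold value below is a placeholder assumption for illustration only and is not the claimed implementation.

    def acquire_surrounding_image():
        """Image acquisition unit: return one camera/radar frame (stubbed here)."""
        return object()

    def detect_behavior_conditions(image):
        """Image processing unit: flags for motion change, gaze toward blind area, expression change."""
        return {"motion": True, "gaze": False, "expression": True}

    def estimate_blind_area_risk(conditions) -> float:
        """Risk assessment unit: e.g. a weighted sum of the detected conditions."""
        return 9.0

    def provide_driving_assistance() -> None:
        """Driving assistance unit: warn the driver and/or decelerate."""
        print("warning issued, decelerating")

    RISK_THRESHOLD = 6.0  # assumed preset risk threshold
    frame = acquire_surrounding_image()
    if estimate_blind_area_risk(detect_behavior_conditions(frame)) > RISK_THRESHOLD:
        provide_driving_assistance()
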
In one embodiment, the image processing unit is further configured to detect any one or more of the following based on the acquired images of the pedestrian and/or non-motor vehicle user:
detecting whether the motion state of the pedestrian and/or non-motor vehicle user is inconsistent with the movement direction permitted by the traffic light and/or whether there is a change in the motion state;
detecting whether the gaze of the pedestrian and/or non-motor vehicle user has been directed toward an undetectable area of the current vehicle for longer than a preset time threshold; and/or
detecting whether there is a change in the facial expression of the pedestrian and/or non-motor vehicle user and/or whether there is an expression associated with surprise or fear.
It will be understood that detecting whether the motion state of the pedestrian and/or non-motor vehicle user is inconsistent with the movement direction permitted by the traffic light mainly serves to detect whether the current movement violates traffic rules, while the facial expression and gaze direction of the pedestrian and/or non-motor vehicle user can become important bases for inferring a collision object in the undetectable area. Detecting whether there is a change in the motion state of the pedestrian 30 and/or non-motor vehicle user may include detecting whether the direction or speed of movement changes, for example whether the user suddenly stops moving or changes direction. Detecting whether the gaze of the pedestrian 30 and/or non-motor vehicle user has been directed toward an undetectable area of the current vehicle for longer than a preset time threshold may include measuring how long the pedestrian 30 and/or non-motor vehicle user looks in the direction of the undetectable area and comparing that time with the threshold. In addition, whether there is a change in the facial expression of the pedestrian 30 and/or non-motor vehicle user, and/or whether there is an expression related to surprise or fear, can be detected from the shape of facial features (e.g., eyes, mouth, facial muscles).
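A minimal sketch of the three behavior checks described above follows. The observation fields, helper names, and the 1.5-second gaze threshold are assumptions made for illustration; the invention does not prescribe this concrete data layout.

    from dataclasses import dataclass

    @dataclass
    class PedestrianObservation:
        moving: bool                      # is the user currently moving?
        was_moving: bool                  # was the user moving in the previous window?
        obeys_signal: bool                # does the motion match the direction permitted by the traffic light?
        gaze_toward_blind_area_s: float   # seconds the gaze has pointed toward the undetectable area
        expression: str                   # e.g. "neutral", "surprise", "fear"
        prev_expression: str

    def detect_behavior_flags(obs: PedestrianObservation,
                              gaze_time_threshold_s: float = 1.5) -> dict:
        """Return a boolean flag for each of the three detection conditions."""
        motion_flag = (not obs.obeys_signal) or (obs.moving != obs.was_moving)
        gaze_flag = obs.gaze_toward_blind_area_s > gaze_time_threshold_s
        expression_flag = (obs.expression != obs.prev_expression) or \
                          obs.expression in ("surprise", "fear")
        return {"motion": motion_flag, "gaze": gaze_flag, "expression": expression_flag}
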
In an embodiment, the risk assessment unit 103 is further configured to assign different weights to the respective situations detected by the image processing unit and to compute a weighted sum of the detection results to estimate the risk value of the presence of a collision object in the undetectable area of the current vehicle. For example: when a change in the motion state of a surrounding pedestrian and/or non-motor vehicle user is detected, such as a sudden stop, a weight value of 5 is assigned; if it is detected that the gaze of the pedestrian and/or non-motor vehicle user has remained in one direction toward the undetectable area for longer than a predetermined period, a weight value of 3 is assigned; and if the facial expression of a pedestrian and/or non-motor vehicle user is detected to be surprised or frightened, a weight value of 4 is assigned. The weighted sum of the detected situations then gives a more accurate estimate of the risk value of the presence of a collision object in the undetectable area of the current vehicle.
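The weighted-sum estimate can be sketched as follows, using the example weights above (5 for a sudden motion change, 3 for prolonged gaze toward the blind area, 4 for a surprised or frightened expression). The risk threshold of 6 is an assumed value for illustration only.

    WEIGHTS = {"motion": 5, "gaze": 3, "expression": 4}
    RISK_THRESHOLD = 6.0  # assumed preset risk threshold

    def estimate_risk(flags: dict, weights: dict = WEIGHTS) -> float:
        """Weighted sum over the detected situations (1 if detected, 0 otherwise)."""
        return sum(w * (1.0 if flags.get(name) else 0.0) for name, w in weights.items())

    flags = {"motion": True, "gaze": False, "expression": True}
    risk = estimate_risk(flags)            # 5 + 4 = 9
    if risk > RISK_THRESHOLD:
        print("collision object likely present in the undetectable area")
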
In an embodiment, the risk assessment unit 103 is further configured to determine the position and/or distance of a collision object possibly present in the undetectable area with respect to the current vehicle based on the position and gaze direction of the pedestrian and/or non-motor vehicle user in the acquired image.
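One possible geometric reading of this step is sketched below: project a point along the pedestrian's gaze ray in the ego-vehicle coordinate frame and report its bearing and distance. The coordinate convention and the fixed range along the gaze ray are illustrative assumptions, not part of the disclosure.

    import math

    def estimate_object_position(ped_xy, gaze_heading_rad, assumed_range_m=15.0):
        """Project a point along the pedestrian's gaze ray (ego frame: x forward, y left)
        and return its position, distance, and bearing relative to the current vehicle."""
        px, py = ped_xy
        ox = px + assumed_range_m * math.cos(gaze_heading_rad)
        oy = py + assumed_range_m * math.sin(gaze_heading_rad)
        distance = math.hypot(ox, oy)      # ego vehicle is at the origin
        bearing = math.atan2(oy, ox)       # direction of the suspected collision object
        return (ox, oy), distance, bearing

    # Example: pedestrian 8 m ahead and 5 m to the left, gazing to the right (-90 degrees).
    print(estimate_object_position((8.0, 5.0), -math.pi / 2))
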
In an embodiment, the driving assistance unit 104 is further configured for, when the risk value estimated by the risk assessment unit exceeds a preset risk threshold:
providing alert information to the current vehicle to alert the current vehicle of the presence of a collision object in an undetectable area and to indicate the location and/or distance of the collision object relative to the current vehicle; and/or
providing a corresponding driving assistance operation to control the current vehicle to decelerate and/or adjust its driving direction.
For example, providing the warning information to the current vehicle 10 may include displaying an emergency warning mark on an in-vehicle display: red may indicate that the undetectable area where the collision object may exist is, for example, 1 to 20 meters from the current vehicle, yellow that it is 20 to 60 meters away, and green that it is 60 to 100 meters away.
Depending on the emergency warning mark, the current vehicle may adopt different driving assistance strategies, for example emergency braking for a red warning, timely deceleration for a yellow warning, and gentle deceleration for a green warning.
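A minimal sketch of the colour-coded warning and the matching strategy follows, using the distance bands from the example above (red 1-20 m, yellow 20-60 m, green 60-100 m). The strategy names are illustrative assumptions.

    def warning_level(distance_m: float) -> str:
        """Map the estimated distance to the collision object to a warning colour."""
        if distance_m <= 20:
            return "red"
        if distance_m <= 60:
            return "yellow"
        if distance_m <= 100:
            return "green"
        return "none"

    STRATEGY = {
        "red": "emergency braking",
        "yellow": "timely deceleration",
        "green": "gentle deceleration",
        "none": "no action",
    }

    level = warning_level(35.0)
    print(level, "->", STRATEGY[level])   # yellow -> timely deceleration
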
Fig. 3 shows a schematic view of a driving assistance system according to another embodiment of the invention.
As shown in fig. 3, the driving assistance system 100 may further include a machine learning unit 105, which may be configured to train a risk analysis model for the risk assessment unit 103 using machine learning. The risk assessment unit 103 is then further configured to determine the risk value according to the trained risk analysis model.
In one embodiment, the machine learning unit 105 may be located on an online server, and the image acquisition unit 101, the image processing unit 102, and/or the risk assessment unit 103 are connected to the machine learning unit 105 on the online server via a mobile network, Wi-Fi, or the like. The machine learning unit 105 is further configured to train the risk analysis model using a large number of acquired images of pedestrians and/or non-motor vehicle users together with the subsequently observed results of whether collision objects were actually present. The training data may come from the current vehicle and/or from at least one other vehicle.
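As a rough illustration of how such a risk analysis model could be trained from logged behavior flags and the subsequently observed presence of a collision object, the sketch below fits a logistic-regression classifier. The choice of algorithm, the feature layout, and the toy data are assumptions; the invention does not name a specific learning method.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Each row: [motion_flag, gaze_flag, expression_flag];
    # label: 1 if a collision object was later confirmed in the blind area, else 0.
    X = np.array([[1, 1, 1], [1, 0, 1], [0, 1, 0], [0, 0, 0], [1, 1, 0], [0, 0, 1]])
    y = np.array([1, 1, 0, 0, 1, 0])

    model = LogisticRegression().fit(X, y)

    # The risk assessment unit could then use the predicted probability as the risk value.
    risk_value = model.predict_proba(np.array([[1, 0, 1]]))[0, 1]
    print(f"estimated risk value: {risk_value:.2f}")
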
According to another aspect of the present invention, there is also provided a vehicle including the driving assistance system described in any one of the above.
Fig. 4 shows a flowchart of a driving assistance method for a vehicle according to an embodiment of the invention.
As shown in fig. 4, the driving assistance method S100 includes:
S200: acquiring images of pedestrians and/or non-motor vehicle users around a current vehicle;
S300: detecting a behavioral condition of the pedestrian and/or non-motor vehicle user based on the acquired image of the pedestrian and/or non-motor vehicle user;
S400: estimating a risk value for the presence of a collision object in an undetectable area of the current vehicle based on the detected behavior condition of the pedestrian and/or non-motor vehicle user; and
S500: providing corresponding driving assistance for the current vehicle when the estimated risk value exceeds a preset risk threshold.
Preferably, the behavior conditions of the pedestrian and/or non-motor vehicle user include: a pedestrian and/or a non-motor vehicle user's motion state, gaze direction, and/or facial expression.
Preferably, said detecting a behavioural condition of said pedestrian and/or non-motor vehicle user based on said acquired image of said pedestrian and/or non-motor vehicle user comprises:
detecting whether the motion state of the pedestrian and/or non-motor vehicle user is inconsistent with the movement direction permitted by the traffic light and/or whether there is a change in the motion state;
detecting whether the gaze of the pedestrian and/or non-motor vehicle user has been directed toward an undetectable area of the current vehicle for longer than a preset time threshold; and/or
detecting whether there is a change in the facial expression of the pedestrian and/or non-motor vehicle user and/or whether there is an expression associated with surprise or fear.
Preferably, the estimating of the risk value of the presence of the collision object in the undetectable area of the present vehicle includes:
assigning different weights to the respective situations detected by the image processing unit, and computing a weighted sum of the detection results to estimate the risk value of the presence of a collision object in the undetectable area of the current vehicle.
Preferably, the method further comprises:
determining a position and/or distance of a collision object possibly present in the undetectable area with respect to a current vehicle based on a location and gaze direction of the pedestrian and/or non-motor vehicle user in the acquired image.
Preferably, providing the respective driving assistance to the current vehicle includes:
providing alert information to the current vehicle to alert the current vehicle of the presence of a collision object in an undetectable area and to indicate the location and/or distance of the collision object relative to the current vehicle; and/or
providing a corresponding driving assistance operation to control the current vehicle to decelerate and/or adjust its driving direction.
For a more specific description of the driving assistance method of the present invention, reference may be made to the above description of the driving assistance system of the present invention, which is not described herein again.
Another aspect of the invention also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any one of the above. For example, the computer program, when executed by a processor, is capable of instructing the processor and/or the respective component to carry out the steps of: acquiring images of pedestrians and/or non-motor vehicle users around a current vehicle; and detecting a behavioral condition of the pedestrian and/or non-motor vehicle user based on the acquired image of the pedestrian and/or non-motor vehicle user; estimating a risk value for the presence of a collision object in an undetectable area of the current vehicle based on the detected behavior condition of the pedestrian and/or non-motor vehicle user; and providing corresponding driving assistance to the current vehicle when the estimated risk value exceeds a preset risk threshold.
Although the invention has been described above by way of example with the application scenario of fig. 1, it will be appreciated by those skilled in the art that the invention is equally applicable to other situations where there is a possible collision object in an undetectable area.
Further, it should be understood that each unit in the above-described driving assistance system 100 for a vehicle may be implemented entirely or partially in software, hardware, or a combination thereof. The units can be embedded in, or independent of, a processor of the computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can call them and execute the corresponding operations.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory having stored thereon a computer program operable on the processor, the processor implementing the steps of the method of any of the above embodiments when executing the computer program. The computer device may be a server or a vehicle-mounted terminal. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement the vehicle driving assist method of the invention.
Those skilled in the art will appreciate that the schematic diagram of the driving assistance system 100 shown in fig. 2 is only a block diagram of a part of the structure related to the present application and does not constitute a limitation of the computer device to which the present application is applied; a specific computer device may include more or fewer components than those shown in the figure, combine some components, or have a different arrangement of components.
It will be understood by those of ordinary skill in the art that all or part of the steps of the methods according to the above embodiments of the present invention may be implemented by instructing the relevant hardware through a computer program, which may be stored in a non-volatile computer-readable storage medium and which, when executed, may carry out the steps of the above method embodiments. Any reference to memory, storage, database, or other medium used in the embodiments provided herein can include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory.
The features of the above embodiments may be arbitrarily combined, and for the sake of brevity, all possible combinations of the features in the above embodiments are not described, but should be construed as being within the scope of the present disclosure as long as there is no contradiction between the combinations of the features.
While the invention has been described in connection with the embodiments, it is to be understood by those skilled in the art that the foregoing description and drawings are merely illustrative and not restrictive of the broad invention, and that the invention is not limited to the disclosed embodiments. Various modifications and variations are possible without departing from the spirit of the invention.

Claims (15)

1. A driving assistance system for a vehicle, the driving assistance system comprising:
an image acquisition unit configured to acquire images of pedestrians and/or non-motor vehicle users around a current vehicle;
an image processing unit configured to detect a behavior condition of the pedestrian and/or non-motor vehicle user based on the acquired image of the pedestrian and/or non-motor vehicle user;
a risk assessment unit configured to estimate a risk value of the presence of a collision object in an undetectable area of the current vehicle based on the detected behavior condition of the pedestrian and/or non-motor vehicle user;
a driving assistance unit configured to provide corresponding driving assistance for the current vehicle when the estimated risk value exceeds a preset risk threshold.
2. The driving assistance system according to claim 1, wherein the behavior condition of the pedestrian and/or the non-motor vehicle user includes: a pedestrian and/or a non-motor vehicle user's motion state, gaze direction, and/or facial expression.
3. The driving assistance system according to claim 1 or 2, wherein the image processing unit is further configured to detect any one or more of the following based on the acquired images of the pedestrian and/or non-motor vehicle user:
detecting whether the motion state of the pedestrian and/or non-motor vehicle user is inconsistent with the movement direction permitted by the traffic light and/or whether there is a change in the motion state;
detecting whether the gaze of the pedestrian and/or non-motor vehicle user has been directed toward an undetectable area of the current vehicle for longer than a preset time threshold; and/or
detecting whether there is a change in the facial expression of the pedestrian and/or non-motor vehicle user and/or whether there is an expression associated with surprise or fear.
4. The driving assistance system according to claim 3, wherein the risk assessment unit is further configured to: assign different weights to the respective situations detected by the image processing unit, and compute a weighted sum of the detection results to estimate the risk value of the presence of a collision object in the undetectable area of the current vehicle.
5. The driving assistance system according to claim 4, wherein the risk assessment unit is further configured to determine an orientation and/or a distance of a collision object possibly present in the undetectable area with respect to a current vehicle based on a position and a gaze direction of the pedestrian and/or non-motor vehicle user in the acquired image.
6. The driving assistance system according to claim 5, wherein the driving assistance unit is further configured to, when the risk value estimated by the risk assessment unit exceeds a preset risk threshold:
providing alert information to the current vehicle to alert the current vehicle of the presence of a collision object in an undetectable area and to indicate the location and/or distance of the collision object relative to the current vehicle; and/or
providing a corresponding driving assistance operation to control the current vehicle to decelerate and/or adjust its driving direction.
7. A vehicle comprising the driving assistance system according to any one of claims 1 to 6.
8. A driving assistance method for a vehicle, the method comprising:
acquiring images of pedestrians and/or non-motor vehicle users around a current vehicle;
detecting a behavioral condition of the pedestrian and/or non-motor vehicle user based on the acquired image of the pedestrian and/or non-motor vehicle user;
estimating a risk value for the presence of a collision object in an undetectable area of the current vehicle based on the detected behavior condition of the pedestrian and/or non-motor vehicle user; and
providing corresponding driving assistance for the current vehicle when the estimated risk value exceeds a preset risk threshold.
9. The method of claim 8, wherein the behavioral conditions of the pedestrian and/or non-motor vehicle user comprise: a pedestrian and/or a non-motor vehicle user's motion state, gaze direction, and/or facial expression.
10. The method of claim 8 or 9, wherein said detecting a behavioral condition of the pedestrian and/or non-motor vehicle user based on the acquired image of the pedestrian and/or non-motor vehicle user comprises:
detecting whether the motion state of the pedestrian and/or non-motor vehicle user is inconsistent with the movement direction permitted by the traffic light and/or whether there is a change in the motion state;
detecting whether the gaze of the pedestrian and/or non-motor vehicle user has been directed toward an undetectable area of the current vehicle for longer than a preset time threshold; and/or
detecting whether there is a change in the facial expression of the pedestrian and/or non-motor vehicle user and/or whether there is an expression associated with surprise or fear.
11. The method of claim 10, wherein the estimating a risk value for the presence of a collision object in an undetectable area of the current vehicle comprises:
assigning different weights to the respective situations detected by the image processing unit, and computing a weighted sum of the detection results to estimate the risk value of the presence of a collision object in the undetectable area of the current vehicle.
12. The method of claim 11, wherein the method further comprises:
determining a position and/or distance of a collision object possibly present in the undetectable area with respect to a current vehicle based on a location and gaze direction of the pedestrian and/or non-motor vehicle user in the acquired image.
13. The method of claim 12, wherein providing the respective driving assistance to the current vehicle comprises:
providing alert information to the current vehicle to alert the current vehicle of the presence of a collision object in an undetectable area and to indicate the location and/or distance of the collision object relative to the current vehicle; and/or
providing a corresponding driving assistance operation to control the current vehicle to decelerate and/or adjust its driving direction.
14. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the method of any one of claims 8 to 13 when executing the computer program.
15. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method of any one of claims 8 to 13.
CN202010008846.5A 2020-01-06 2020-01-06 Driving assistance system, method and readable storage medium for vehicle Withdrawn CN113147751A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010008846.5A CN113147751A (en) 2020-01-06 2020-01-06 Driving assistance system, method and readable storage medium for vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010008846.5A CN113147751A (en) 2020-01-06 2020-01-06 Driving assistance system, method and readable storage medium for vehicle

Publications (1)

Publication Number Publication Date
CN113147751A true CN113147751A (en) 2021-07-23

Family

ID=76881315

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010008846.5A Withdrawn CN113147751A (en) 2020-01-06 2020-01-06 Driving assistance system, method and readable storage medium for vehicle

Country Status (1)

Country Link
CN (1) CN113147751A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109844838A (en) * 2016-10-25 2019-06-04 三菱电机株式会社 Peripheral information decision maker and peripheral information determination method
KR20190123640A (en) * 2018-04-24 2019-11-01 현대모비스 주식회사 Pedestrian detection and risk estimation apparatus and method thereof
CN110550105A (en) * 2018-05-30 2019-12-10 奥迪股份公司 Driving assistance system and method
CN109334566A (en) * 2018-08-31 2019-02-15 百度在线网络技术(北京)有限公司 For providing the method, apparatus, equipment and storage medium of vehicle external feedback


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20210723