CN118254819A - Driving support method, program product, and driving support device for vehicle - Google Patents


Info

Publication number
CN118254819A
CN118254819A (application CN202410392212.2A)
Authority
CN
China
Prior art keywords
vehicle
driving assistance
driving
around
looking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410392212.2A
Other languages
Chinese (zh)
Inventor
支蓉
张武强
王宝锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mercedes Benz Group AG
Original Assignee
Mercedes Benz Group AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mercedes Benz Group AG filed Critical Mercedes Benz Group AG
Priority to CN202410392212.2A priority Critical patent/CN118254819A/en
Publication of CN118254819A publication Critical patent/CN118254819A/en
Pending legal-status Critical Current

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W50/0098 Details of control systems ensuring comfort, safety or stability not otherwise provided for
    • B60W50/16 Tactile feedback to the driver, e.g. vibration or force feedback to the driver on the steering wheel or the accelerator pedal
    • B60W2050/0001 Details of the control system
    • B60W2050/0043 Signal treatments, identification of variables or parameters, parameter estimation or state estimation
    • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40 Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403 Image sensing, e.g. optical camera
    • B60W2552/00 Input parameters relating to infrastructure
    • B60W2552/50 Barriers
    • B60W2554/00 Input parameters relating to objects
    • B60W2554/80 Spatial relation or speed relative to objects
    • B60W2554/801 Lateral distance
    • B60W2554/802 Longitudinal distance

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to a driving assistance method for a vehicle, comprising at least the following steps: acquiring a 360-degree look-around image of the vehicle; generating a drivable region around the vehicle using a neural network model based on the look-around image, wherein the drivable region is generated from parameters identified from the look-around image by the neural network model, the parameters comprising at least one of the following group: road edge height, distance between obstacles, distance of obstacles relative to the vehicle; and providing a driver of the vehicle with corresponding auxiliary guidance measures at least according to the drivable region. The invention also relates to a corresponding computer program product and a driving assistance device for a vehicle. The method and device can guide the driver in real time to perform reasonable driving operations in different environments, reliably reducing the risk of scratching and thereby improving both the driving safety of the vehicle and the driving experience of the driver.

Description

Driving support method, program product, and driving support device for vehicle
Technical Field
The invention relates to the technical field of driving assistance of vehicles, in particular to a driving assistance method for vehicles. The invention also relates to a corresponding computer program product and a corresponding driving assistance device for a vehicle.
Background
In recent years, with technological progress and rising living standards, demands on the intelligence and safety of vehicles keep increasing. In complex road conditions, especially when meeting oncoming traffic or parking on narrow roads, inexperienced drivers often perceive the surrounding environment and relative distances inaccurately, which may prevent them from performing the correct driving operations and creates a risk of scratching.
Currently, there are solutions that detect the environment by ultrasonic sensors or cameras in order to issue warnings, or that provide obstacle information to the driver while driving to prevent scratching. However, these solutions do not let the driver intuitively understand the current driving environment, and they do not resolve the difficulty inexperienced drivers face when meeting oncoming traffic or parking in complex road conditions.
Disclosure of Invention
It is therefore an object of the present invention to propose an improved driving assistance method for a vehicle, by means of which the driver can be guided in real time to perform reasonable driving operations in different environments. This reliably reduces the risk of scratching and effectively improves the driving experience, thereby improving both the driving safety of the vehicle and the driver's experience. A further object of the invention is to propose a corresponding computer program product and a corresponding driving assistance device for a vehicle.
According to a first aspect of the present invention, there is provided a driving assistance method for a vehicle, wherein the driving assistance method includes at least the steps of:
S1: acquiring a 360-degree looking-around image of the vehicle;
S2: generating a drivable region around the vehicle using a neural network model based on the look-around image, wherein the drivable region is generated from parameters identified from the look-around image by the neural network model, the parameters comprising at least one of the following group: road edge height, distance between obstacles, distance of obstacles relative to the vehicle;
S3: providing a driver of the vehicle with corresponding auxiliary guiding measures at least according to the drivable area.
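Steps S1 to S3 can be sketched in Python as follows; the model call is a stand-in for the trained network, and all names, dimensions and threshold values are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class SceneParams:
    """Parameters the neural network model identifies from the look-around image (step S2)."""
    curb_height_m: float    # road edge height
    obstacle_gap_m: float   # distance between obstacles
    obstacle_dist_m: float  # distance of the nearest obstacle to the vehicle

def identify_params(look_around_image) -> SceneParams:
    # Stand-in for the trained neural network model; a real system would
    # run inference on the stitched 360-degree image here (step S2).
    return SceneParams(curb_height_m=0.10, obstacle_gap_m=2.40, obstacle_dist_m=0.90)

def guidance_for(look_around_image, chassis_height_m: float = 0.16,
                 body_width_m: float = 1.90) -> str:
    p = identify_params(look_around_image)            # S1 image -> S2 parameters
    drivable = (p.curb_height_m < chassis_height_m    # curb clears the chassis
                and p.obstacle_gap_m > body_width_m)  # gap wider than the body
    # S3: map the drivable-region decision onto an auxiliary guidance measure
    return "follow planned path" if drivable else "stop: no drivable region"
```

A production system would replace `identify_params` with real model inference and feed the result into a path planner rather than a fixed string.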
Compared with the prior art, the driving assistance method for a vehicle first acquires a 360-degree look-around image of the vehicle, which can clearly show the road edge height, the obstacle distribution and other environmental conditions around the vehicle. The look-around image is then used as input to a trained neural network model, which identifies the required parameters; based on these parameters, a drivable region is generated into which the vehicle can drive without scratching or collision. An optimized driving path can be planned from the drivable region, and corresponding auxiliary guidance measures are provided to the driver. With the help of these measures, the driver can smoothly carry out operations such as meeting oncoming traffic or parking on narrow roads under complex road conditions, which reduces the vehicle's risk of scratching and improves driving safety. At the same time, the driver can confidently and correctly drive the vehicle under different road conditions, which improves the driver's sense of achievement and overall driving experience.
According to an exemplary embodiment of the invention, the auxiliary guidance measures comprise at least one of the following group: driving path guidance, guidance on the opening degree of the brake pedal and/or the accelerator pedal, and steering wheel rotation guidance.
According to an exemplary embodiment of the invention, the auxiliary guiding means are implemented in the form of speech and/or images and/or vibrations.
According to an exemplary embodiment of the invention, the vehicle performs additional driving assistance functions comprising at least one of the following group: collision risk warning, emergency braking, and folding of the rear-view mirrors.
According to an exemplary embodiment of the present invention, in an automatic driving state of the vehicle, the vehicle automatically travels to the drivable region according to the auxiliary guiding means.
According to an exemplary embodiment of the invention, in order to provide the auxiliary guiding measure, navigation information and/or a driving speed of the vehicle are additionally taken into account in addition to the drivable region.
According to an exemplary embodiment of the invention, the driving assistance method is only implemented when the driving speed of the vehicle is below a speed threshold.
According to an exemplary embodiment of the present invention, an area satisfying at least one of the following conditions is generated as the drivable area: the height of the road edge is smaller than the height of the chassis of the vehicle, the distance between the obstacles is larger than the width of the vehicle body, and the distance between the obstacles and the vehicle is larger than a preset distance.
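The three conditions can be written as a single predicate. This sketch uses illustrative parameter names and checks all three conditions jointly, although the embodiment requires only at least one of them:

```python
def is_drivable(curb_height_m: float, obstacle_gap_m: float, obstacle_dist_m: float,
                chassis_height_m: float, body_width_m: float, preset_dist_m: float) -> bool:
    return (curb_height_m < chassis_height_m      # road edge lower than the chassis
            and obstacle_gap_m > body_width_m     # gap between obstacles wider than the body
            and obstacle_dist_m > preset_dist_m)  # clearance above the preset distance
```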
According to an exemplary embodiment of the invention, the parameters are additionally measured by an ultrasonic radar and/or an infrared camera and/or a laser radar and/or a millimeter wave radar of the vehicle and are fused with the recognition result of the neural network model.
According to an exemplary embodiment of the present invention, when the distance between obstacles is less than a distance threshold, a narrow-road scene is determined, including narrow-road meeting, narrow-road parking and narrow-road passing, and function options for the driving assistance measures are automatically popped up.
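A minimal sketch of this scene trigger; the threshold value and the option labels are illustrative assumptions:

```python
NARROW_ROAD_OPTIONS = ("narrow-road meeting", "narrow-road parking", "narrow-road passing")

def popup_options(obstacle_gap_m: float, gap_threshold_m: float = 2.5):
    """Return the function options to pop up when a narrow-road scene is
    detected; an empty tuple means no pop-up is shown."""
    return NARROW_ROAD_OPTIONS if obstacle_gap_m < gap_threshold_m else ()
```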
According to an exemplary embodiment of the present invention, the look-around image and the generated drivable region are displayed on an on-board screen of the vehicle.
According to an exemplary embodiment of the present invention, the look-around image is formed by stitching images captured by a plurality of look-around cameras, and/or a distance grid is added to the look-around image.
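One way to place such a distance grid is to overlay concentric rings on the stitched bird's-eye image, vehicle at the centre. This helper only computes the ring radii in pixels; the parameter names and values are illustrative:

```python
def distance_grid_radii(max_range_m: float, step_m: float, px_per_m: float) -> list:
    """Pixel radii of concentric distance rings to overlay on the stitched
    look-around image (one ring every step_m metres out to max_range_m)."""
    rings = int(max_range_m / step_m)
    return [round(step_m * (i + 1) * px_per_m) for i in range(rings)]
```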
According to a second aspect of the present invention, there is provided a computer program product comprising a computer program, wherein the computer program, when executed by one or more processors, is capable of performing the driving assistance method according to the present invention.
According to a third aspect of the present invention, there is provided a driving assistance device for a vehicle, wherein the driving assistance device includes at least:
- a look-around camera assembly configured to capture look-around images of the vehicle and having a plurality of look-around cameras;
- a control unit which obtains the look-around image from the look-around camera assembly and is configured to implement the driving assistance method according to the invention with the computer program product of the invention, in order to provide auxiliary guidance measures to the driver of the vehicle;
- an execution unit communicatively connected with the control unit and configured to execute the auxiliary guidance measures.
Drawings
The principles, features and advantages of the present invention may be better understood by describing the present invention in more detail with reference to the drawings. The drawings include:
Fig. 1 shows a schematic flow chart of a driving assistance method for a vehicle according to an exemplary embodiment of the invention;
Figs. 2a and 2b show schematic views of the driving situation of a narrow-road meeting and of the drivable region in this driving situation, respectively;
Figs. 3a and 3b show schematic views of the driving situation of narrow-road parking and of the drivable region in this driving situation, respectively;
Fig. 4 shows a schematic block diagram of a driving assistance device for a vehicle according to an exemplary embodiment of the invention.
Detailed Description
In order to make the technical problems, technical solutions and advantageous technical effects to be solved by the present invention more apparent, the present invention will be further described in detail with reference to the accompanying drawings and a plurality of exemplary embodiments. It should be understood that the detailed description and specific examples are intended for purposes of illustration only and are not intended to limit the scope of the invention.
This specification presents the method's operational steps as described in the examples or flowcharts, but with conventional, non-inventive effort the method may include more or fewer steps. The order of steps recited in the embodiments is merely one possible execution order and does not represent the only one.
Fig. 1 shows a schematic flow chart of a driving assistance method for a vehicle according to an exemplary embodiment of the invention. Figs. 2a and 2b show schematic views of the driving situation of a narrow-road meeting and of the drivable region in this driving situation, respectively. Figs. 3a and 3b show schematic views of the driving situation of narrow-road parking and of the drivable region in this driving situation, respectively.
As shown in fig. 1, the driving assistance method for a vehicle according to the present invention includes at least the steps of:
S1: a 360-degree look-around image of the vehicle is acquired by a look-around camera assembly of the vehicle. The look-around image can clearly show environmental conditions such as the road edge height and the obstacle distribution around the vehicle. The look-around camera assembly may comprise four look-around cameras, which are configured as wide-angle cameras and arranged around the vehicle, for example at the front and rear emblems and at the left and right rear-view mirrors;
S2: a drivable region around the vehicle, into which the vehicle can drive along a reasonable driving path without scratching or collision, is generated using a neural network model based on the look-around image. The neural network model is trained in advance and stored in a control unit for the driving assistance method; with the look-around image as input, it outputs parameters for analyzing the drivable region, or outputs the drivable region directly. The drivable region is generated from parameters identified from the look-around image by the neural network model, the parameters including at least one of the following group: road edge height, distance between obstacles, distance of obstacles relative to the vehicle. By comparing the identified parameters with the vehicle's own dimensions, a drivable region around the vehicle can be determined;
S3: corresponding auxiliary guidance measures are provided to the driver of the vehicle at least according to the drivable region, so that the driver can drive the vehicle to a desired position along a reasonable driving path without scratching.
In this way, the driver can, with the help of the auxiliary guidance measures provided by the driving assistance method, smoothly carry out operations such as narrow-road passing, narrow-road meeting or narrow-road parking under complex road conditions, which reduces the vehicle's risk of scratching and improves driving safety. At the same time, the driver can confidently and correctly drive the vehicle under different road conditions, which improves the driver's sense of achievement and overall driving experience.
In step S2, the drivable region is, for example, set so as to satisfy at least one of the following conditions, in particular all of them: the road edge height is smaller than the chassis height of the vehicle, the distance between obstacles is larger than the vehicle body width, and the distance between obstacles and the vehicle is larger than a preset distance. This ensures that the vehicle can drive into the drivable region without scratching. Here, the ground truth of the drivable region in the vehicle's surroundings is annotated in the 3D world, in particular jointly annotated with the help of high-precision sensors such as lidar and ultrasonic radar, and the neural network model is trained with this pre-annotated data set, so that with a look-around image as input it can output parameters for analyzing the drivable region or output the drivable region directly. The trained neural network model is stored in advance in a control unit of the vehicle for the driving assistance method.
It is also conceivable, for example, that in step S2 the parameters are additionally measured by other detection devices of the vehicle, such as ultrasonic radar, infrared cameras, lidar or millimeter-wave radar, and the measurement results are fused with the neural network model's analysis of the look-around image for generating the drivable region. This can improve the detection accuracy of the individual parameters and allows the drivable region to be generated more accurately.
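One common rule for combining the camera-based estimate with an additional range sensor is an inverse-variance weighted average. The patent does not specify a particular fusion rule, so this is only an illustrative choice:

```python
def fuse_estimates(nn_value: float, nn_var: float,
                   sensor_value: float, sensor_var: float) -> float:
    """Inverse-variance weighted fusion of the neural network estimate with a
    radar/ultrasonic measurement; the more certain source gets more weight."""
    w_nn, w_sensor = 1.0 / nn_var, 1.0 / sensor_var
    return (w_nn * nn_value + w_sensor * sensor_value) / (w_nn + w_sensor)
```

With equal variances this reduces to a plain average; a more certain sensor pulls the fused value toward its own reading.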
For example, when the distance between the identified obstacles is smaller than the distance threshold, the current scene of the vehicle is determined to be a narrow-road scene, which may include narrow-road meeting, narrow-road parking and narrow-road passing, and function options for the driving assistance measures are automatically popped up. Particularly in complex scenarios, the driver can have the assistance carried out in a targeted and smooth manner by selecting among these options. For example, the driver may choose between a meeting/parking/passing option; in the case of narrow-road parking the driver may select the desired parking space or vehicle-head orientation, and in the case of a narrow-road meeting the driver may confirm the navigation planning information and, if necessary, choose to pull over. Of course, additional function options that a person skilled in the art would consider of interest are also conceivable.
As shown in figs. 2a and 2b, in the narrow-road scene of a narrow-road meeting, for example, a 360° look-around image of the vehicle 1 is captured by the look-around camera assembly of the vehicle 1. The image shows the surrounding obstacles 2, which may be oncoming vehicles, traffic barriers (such as traffic cones) or other road users (such as pedestrians or cyclists); further obstacles that a person skilled in the art would consider of interest, such as plants or buildings, are also conceivable. A drivable region 3 around the vehicle 1, into which the vehicle 1 can drive without scratching, is then generated using the neural network model of the vehicle 1 on the basis of the look-around image, and auxiliary guidance measures are provided to the driver of the vehicle 1 according to the drivable region 3, for example a planned driving path 4 along which the driver can drive the vehicle 1 so as to smoothly complete the narrow-road meeting.
As shown in figs. 3a and 3b, in the narrow-road scene of narrow-road parking, for example, a 360° look-around image of the vehicle 1 is captured by the look-around camera assembly of the vehicle 1, in which the surrounding obstacles 2 are displayed. A drivable region 3 around the vehicle 1 is then generated using the neural network model of the vehicle 1 based on the look-around image, the drivable region comprising an empty space in which parking is possible. Auxiliary guidance measures are then provided to the driver of the vehicle 1 according to the drivable region 3, for example a planned driving path 4 along which the driver can drive the vehicle 1 so as to smoothly park it in the empty space.
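A toy geometric sketch of deriving a steering hint toward a target point inside the drivable region, e.g. the centre of the detected empty space. This is not the patent's planner, and the coordinate convention (x forward, y to the left) is an assumption:

```python
import math

def steering_hint_deg(vehicle_xy, heading_rad, target_xy) -> float:
    """Angle in degrees (positive = turn left) from the vehicle's heading to
    the target point, assuming x forward and y to the left of the vehicle."""
    dx = target_xy[0] - vehicle_xy[0]
    dy = target_xy[1] - vehicle_xy[1]
    angle = math.atan2(dy, dx) - heading_rad
    angle = math.atan2(math.sin(angle), math.cos(angle))  # normalise to (-pi, pi]
    return math.degrees(angle)
```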
Of course, other narrow-road scenarios that a person skilled in the art would consider of interest, such as narrow-road passing, are also conceivable.
In step S2, the look-around image captured by the look-around camera assembly, in particular stitched together from the images of the plurality of look-around cameras, and the generated drivable region are displayed on the on-board screen of the vehicle. This enables the driver to see the current surroundings and the drivable region more clearly. In particular, a distance grid is added to the look-around image, which enables the driver to judge more accurately the relative distance between the vehicle and an obstacle or between several obstacles.
Illustratively, the auxiliary guidance measures provided in step S3 may comprise at least one of the following group: driving path guidance, guidance on the opening degree of the brake pedal and/or the accelerator pedal, and steering wheel rotation guidance. For example, in the narrow-road meeting situation of fig. 2b, in addition to the guiding driving path 4, the driver can also be guided to turn the steering wheel to the left and to depress the brake pedal in good time while driving, in order to avoid an undesired collision with an obstacle.
The auxiliary guidance provided in step S3 can be implemented, for example, in the form of speech, for example by an in-vehicle loudspeaker giving the voice prompt: "Please turn the steering wheel 10° to the left." The auxiliary guidance can also be implemented in the form of images and displayed on the on-board screen, or in the form of vibrations, for example by vibrating the accelerator pedal and/or the brake pedal to guide the driver accordingly. Any combination of the above is possible, for example displaying the driving path on the on-board screen while prompting the driver by voice to drive the vehicle along the displayed path.
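The modality dispatch can be sketched as a small table of renderers; the channel names and message formats are illustrative assumptions:

```python
def render_guidance(measure: str, modalities=("voice", "image")) -> list:
    """Send one auxiliary guidance measure to the configured output channels
    (speech, on-screen image, pedal vibration) and return what each emits."""
    renderers = {
        "voice": lambda m: f"[loudspeaker] {m}",
        "image": lambda m: f"[screen] {m}",
        "vibration": lambda m: "[pedal vibrates]",
    }
    return [renderers[channel](measure) for channel in modalities]
```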
Illustratively, in step S3 an additional driving assistance function is performed in addition to the auxiliary guidance measures, which may include at least one of the following group: collision risk warning, emergency braking, and folding of the rear-view mirrors. This can help the driver cope with the current driving situation more easily and avoid unexpected collision accidents as far as possible. Of course, further driving assistance functions that a person skilled in the art would consider of interest are also conceivable.
For example, when the vehicle is in the automatic driving state, the vehicle automatically travels to a desired drivable region in accordance with the auxiliary guiding means provided in step S3. The automatic driving state may be initiated manually by the driver or automatically.
In order to provide the auxiliary guidance measures in step S3, the navigation information of the vehicle is additionally taken into account; from it, the driving direction of the vehicle can be determined, and the corresponding guidance is provided by combining the driving direction with the drivable region. The driving speed of the vehicle can also be considered in order to ensure driving safety. In particular, the driving assistance method is only carried out when the driving speed of the vehicle is below a speed threshold, for example 10 km/h; when the driving speed is above the threshold, the driver is prompted to reduce the vehicle speed.
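The speed gate can be sketched as follows; 10 km/h is the example threshold from the embodiment, while the prompt text is an assumption:

```python
def assistance_state(speed_kmh: float, threshold_kmh: float = 10.0):
    """Return (active, prompt): assistance runs only below the speed
    threshold, otherwise the driver is prompted to slow down."""
    if speed_kmh < threshold_kmh:
        return True, None
    return False, "Please reduce speed to use the driving assistance"
```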
Fig. 4 shows a schematic block diagram of a driving assistance device 100 for a vehicle according to an exemplary embodiment of the invention.
As shown in fig. 4, the driving assistance device 100 according to the invention comprises a look-around camera assembly 10 configured to capture 360° look-around images of the vehicle 1. The look-around camera assembly 10 can have a plurality of, in particular four, look-around cameras 11, which are configured as wide-angle cameras and arranged around the vehicle 1, for example at the front and rear emblems and at the left and right rear-view mirrors. Of course, the look-around camera assembly 10 can also have any other number of look-around cameras 11 that a person skilled in the art would consider of interest.
As shown in fig. 4, the driving assistance device 100 comprises a control unit 20 communicatively connected to the look-around camera assembly 10 in order to acquire the captured look-around images from it. The control unit 20 is configured to implement the driving assistance method according to the invention with the computer program product according to the invention, in order to provide auxiliary guidance measures to the driver of the vehicle 1; the computer program product comprises a computer program which, when executed by one or more processors, can carry out the driving assistance method according to the invention. The control unit 20 may be integrated into a vehicle control unit of the vehicle.
As shown in fig. 4, the driving assistance device 100 includes an execution unit 30 communicatively connected with the control unit 20 and configured to execute the auxiliary guidance measures provided by the control unit 20. The execution unit 30 may have, for example, an in-vehicle loudspeaker for carrying out the auxiliary guidance in the form of speech, or an on-board screen for carrying it out in the form of images. The execution unit 30 may also have a vibrator mounted on the brake pedal and/or the accelerator pedal and/or the steering wheel.
The driving assistance device 100 also includes a detection device configured to measure various parameters of obstacles in the surrounding environment; it may be, for example, an ultrasonic radar, an infrared camera, a lidar or a millimeter-wave radar.
The foregoing explanation of the embodiments describes the invention only in the framework of the examples. Of course, the individual features of the embodiments can be combined with one another freely without departing from the framework of the invention, as long as they are technically interesting.
Other advantages and alternative embodiments of the invention will be apparent to those skilled in the art. Therefore, the invention in its broader aspects is not limited to the specific details, the representative structures, and illustrative examples shown and described. Rather, various modifications and substitutions may be made by those skilled in the art without departing from the basic spirit and scope of the invention.

Claims (10)

1. A driving assistance method for a vehicle (1), characterized in that the driving assistance method comprises at least the steps of:
S1: acquiring a 360° look-around image of the vehicle (1);
S2: generating a drivable region (3) around the vehicle (1) using a neural network model based on the look-around image, wherein the drivable region (3) is generated from parameters identified from the look-around image by the neural network model, the parameters comprising at least one of the following group: a road edge height, a distance between obstacles, a distance of an obstacle relative to the vehicle (1);
S3: providing corresponding auxiliary guidance measures to the driver of the vehicle (1) at least according to the drivable region (3).
2. The driving assistance method according to claim 1, characterized in that,
The auxiliary guiding means comprises at least one of the group of: guiding a driving path, guiding the opening degree of a brake pedal and/or an accelerator pedal, and guiding the rotation of a steering wheel; and/or
The auxiliary guiding measures are implemented in the form of speech and/or images and/or vibrations.
3. The driving assistance method according to claim 1 or 2, characterized in that,
The vehicle (1) performs additional driving assistance functions, which comprise at least one of the following group: collision risk warning, emergency braking, and folding of the rearview mirrors; and/or
In an automatic driving state of the vehicle (1), the vehicle (1) automatically travels into the drivable region (3) in accordance with the auxiliary guiding measures.
4. The driving assistance method according to any one of the preceding claims, characterized in that,
In order to provide the auxiliary guiding measures, navigation information and/or the driving speed of the vehicle (1) are additionally taken into account in addition to the drivable region (3); and/or
The driving assistance method is only implemented when the driving speed of the vehicle (1) is below a speed threshold.
5. The driving assistance method according to any one of the preceding claims, characterized in that,
An area satisfying at least one of the following conditions is generated as the drivable region (3): the road edge height is smaller than the chassis height of the vehicle (1), the distance between obstacles is greater than the vehicle body width, and the distance of an obstacle relative to the vehicle (1) is greater than a preset distance; and/or
The parameters are additionally measured by means of an ultrasonic radar and/or an infrared camera and/or a laser radar and/or a millimeter wave radar of the vehicle (1), and the measurements are fused with the recognition results of the neural network model.
6. The driving assistance method according to any one of the preceding claims, characterized in that,
When the distance between obstacles is smaller than a distance threshold, a narrow-road scene is determined, the narrow-road scene comprising narrow-road meeting, narrow-road parking, and narrow-road passing, and function options related to the auxiliary guiding measures are automatically popped up.
7. The driving assistance method according to any one of the preceding claims, characterized in that,
In step S2, the looking-around image and the generated drivable region (3) are displayed on an on-board screen of the vehicle (1).
8. The driving assistance method according to any one of the preceding claims, characterized in that,
The looking-around image is formed by stitching together images captured by a plurality of looking-around cameras (11), and/or a distance grid is overlaid on the looking-around image.
9. A computer program product comprising a computer program, characterized in that the computer program, when executed by one or more processors, carries out the driving assistance method according to any one of claims 1 to 8.
10. A driving assistance device (100) for a vehicle (1), characterized in that the driving assistance device (100) comprises at least:
- a looking-around camera assembly (10) configured to capture looking-around images of the vehicle (1) and having a plurality of looking-around cameras (11);
- a control unit (20) which obtains the looking-around image from the looking-around camera assembly (10) and is configured to implement the driving assistance method according to any one of claims 1 to 8 by means of the computer program product of claim 9, in order to provide auxiliary guidance to a driver of the vehicle (1);
- an execution unit (30) communicatively connected to the control unit (20) and configured to execute the auxiliary guiding measures.
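The drivable-region criterion of claims 1 and 5 can be sketched as follows. Per claim 5, an area is generated as drivable when it satisfies at least one of the listed conditions; all names, data structures, and threshold values below are illustrative assumptions, since the patent specifies no concrete implementation:

```python
from dataclasses import dataclass

@dataclass
class AreaParams:
    """Parameters identified from the look-around image (claim 1)."""
    road_edge_height_m: float    # road edge height
    obstacle_gap_m: float        # distance between obstacles
    obstacle_distance_m: float   # distance of an obstacle relative to the vehicle

def is_drivable(p: AreaParams, chassis_height_m: float,
                body_width_m: float, preset_distance_m: float) -> bool:
    """Claim 5: the area is drivable if at least one condition holds."""
    return (p.road_edge_height_m < chassis_height_m
            or p.obstacle_gap_m > body_width_m
            or p.obstacle_distance_m > preset_distance_m)
```

For example, an area whose curb is lower than the vehicle chassis would qualify as drivable even when the obstacle gap is narrower than the body width, reflecting the "at least one of" wording of claim 5.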
CN202410392212.2A 2024-04-02 2024-04-02 Driving support method, program product, and driving support device for vehicle Pending CN118254819A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410392212.2A CN118254819A (en) 2024-04-02 2024-04-02 Driving support method, program product, and driving support device for vehicle

Publications (1)

Publication Number Publication Date
CN118254819A true CN118254819A (en) 2024-06-28

Family

ID=91610694



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination