CN110884429A - Driving assistance system and method based on laser radar - Google Patents

Driving assistance system and method based on laser radar

Info

Publication number
CN110884429A
Authority
CN
China
Prior art keywords
image
road condition
vertical distance
radar
integrated
Prior art date
Legal status
Granted
Application number
CN201911216728.7A
Other languages
Chinese (zh)
Other versions
CN110884429B (en)
Inventor
奚斌嵩
刘军帅
朱凯文
任鑫
李东浩
叶胜伟
王卿海
钱严
原小雅
Current Assignee
Anhui Jianghuai Automobile Group Corp
Original Assignee
Anhui Jianghuai Automobile Group Corp
Priority date
Filing date
Publication date
Application filed by Anhui Jianghuai Automobile Group Corp
Priority to CN201911216728.7A
Publication of CN110884429A
Application granted
Publication of CN110884429B
Active legal status (current)
Anticipated expiration legal status

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 - Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/303 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using joined images, e.g. multiple camera images
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/804 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for lane monitoring
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/8093 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for obstacle warning

Abstract

The invention discloses a driving assistance system and a driving assistance method based on a laser radar. In the system, a radar sensor acquires image information of the road conditions within a preset range and sends the acquired road condition image to a radar controller; the radar controller receives the road condition image, adjusts the brightness of the image elements contained in it using HSL (hue, saturation, lightness) color adjustment to obtain an image to be integrated, and then sends the image to be integrated to the vehicle-mounted entertainment host; the vehicle-mounted entertainment host reads a locally pre-stored top view of the vehicle model, integrates the image to be integrated with the top view, and displays the integrated image. By rendering the objects in the current road conditions with different HSL values, the invention lets the driver intuitively see potholes and raised or lowered road edges on complex road sections, improving convenience and safety while driving.

Description

Driving assistance system and method based on laser radar
Technical Field
The invention relates to the technical field of automobile driving, and in particular to a driving assistance system and method based on a laser radar.
Background
With economic development, people's requirements for passenger cars are no longer limited to basic transportation; drivers increasingly pursue safety and comfort. To meet these demands, Advanced Driver Assistance System (ADAS) functions that help the driver operate the automobile are being developed. The mainstream ADAS functions currently on the market include parking assistance and following/cruise control under specific road conditions, but there is no function that assists the driver in passing through complicated road conditions (e.g., pothole-ridden or narrow road sections).
For drivers, complex road conditions are often confusing: when a vehicle travels on a section with complex road conditions, it is difficult to judge clearances and turning angles by eye alone, and accidents are more likely to happen.
The above is only for the purpose of assisting understanding of the technical aspects of the present invention, and does not represent an admission that the above is prior art.
Disclosure of Invention
The invention mainly aims to provide a driving assistance system and method based on a laser radar, so as to solve the technical problem in the prior art that effective driving assistance cannot be provided when a driver passes through a complicated road section.
In order to achieve the above object, the present invention provides a driving assistance system based on a lidar, the system comprising: the system comprises a radar sensor, a radar controller and a vehicle-mounted entertainment host;
the radar sensor is used for acquiring image information of road conditions in a preset range and sending the acquired road condition images to the radar controller;
the radar controller is used for receiving the road condition image and performing brightness adjustment on image elements contained in the road condition image in an HSL (hue, saturation, lightness) color adjustment mode to obtain an image to be integrated;
the radar controller is used for sending the image to be integrated to the vehicle-mounted entertainment host;
the vehicle-mounted entertainment host is used for reading a vehicle model top view which is pre-stored locally, integrating the image to be integrated with the vehicle model top view and displaying the integrated image.
Preferably, the radar sensor is further configured to acquire image information of a road condition within a preset range to obtain a current road condition image;
the radar sensor is also used for determining the vertical distance information between the vehicle and each road surface object according to the current road condition image;
the radar sensor is further used for sending the current road condition image and the vertical distance information to the radar controller as collected road condition images.
Preferably, the radar controller is further configured to receive the road condition image and extract vertical distance information in the road condition image;
the radar controller is further configured to classify the vertical distances included in the vertical distance information according to a preset distance range, and perform brightness adjustment on image elements included in the road condition image in an HSL color adjustment mode according to the classification result to obtain an image to be integrated.
Preferably, the preset distance range includes: a lower distance limit and an upper distance limit;
the radar controller is further configured to classify vertical distances included in the vertical distance information according to the distance lower limit value and the distance upper limit value to obtain a vertical distance set;
the radar controller is further configured to search a corresponding brightness adjustment strategy in a pre-constructed mapping relation according to the vertical distance set, where the mapping relation includes a corresponding relation between the vertical distance set and the brightness adjustment strategy;
and the radar controller is also used for performing brightness adjustment on image elements contained in the road condition image in an HSL (hue, saturation, lightness) color adjustment mode according to the brightness adjustment strategy to obtain an image to be integrated.
Preferably, the radar controller is further configured to select, according to the vertical distance information, a target image element of which the vertical distance is smaller than or equal to the distance lower limit value from the current road condition image;
the radar controller is further used for obtaining the actual vertical distance of the target image element, judging whether the actual vertical distance is smaller than a preset early warning threshold value, if so, generating a braking early warning, and sending the braking early warning to the vehicle-mounted entertainment host.
In addition, in order to achieve the above object, the present invention further provides a driving assistance method based on a laser radar, including:
the radar sensor acquires image information of road conditions within a preset range and sends the acquired road condition images to the radar controller;
the radar controller receives the road condition image and performs brightness adjustment on image elements contained in the road condition image in an HSL (hue, saturation, lightness) color adjustment mode to obtain an image to be integrated;
the radar controller sends the image to be integrated to the vehicle-mounted entertainment host;
and the vehicle-mounted entertainment host reads a locally pre-stored vehicle model top view, integrates the image to be integrated with the vehicle model top view, and displays the integrated image.
Preferably, the step in which the radar sensor collects image information of road conditions within a preset range and sends the collected road condition image to the radar controller includes:
the radar sensor acquires image information of road conditions within a preset range to obtain a current road condition image;
the radar sensor determines vertical distance information between the vehicle and each road surface object according to the current road condition image;
and the radar sensor sends the current road condition image and the vertical distance information to the radar controller as collected road condition images.
Preferably, the step in which the radar controller receives the road condition image and performs brightness adjustment on image elements contained in the road condition image in an HSL color adjustment mode to obtain an image to be integrated includes:
the radar controller receives the road condition image and extracts vertical distance information in the road condition image;
and the radar controller classifies the vertical distances contained in the vertical distance information according to a preset distance range, and adjusts the brightness of image elements contained in the road condition image in an HSL color adjustment mode according to the classification result to obtain an image to be integrated.
Preferably, the preset distance range includes: a lower distance limit and an upper distance limit;
the step in which the radar controller classifies the vertical distances contained in the vertical distance information according to the preset distance range and adjusts the brightness of image elements contained in the road condition image in an HSL color adjustment mode according to the classification result to obtain an image to be integrated includes:
the radar controller classifies the vertical distance contained in the vertical distance information according to the distance lower limit value and the distance upper limit value to obtain a vertical distance set;
the radar controller searches a corresponding brightness adjustment strategy in a pre-constructed mapping relation according to the vertical distance set, wherein the mapping relation comprises the corresponding relation between the vertical distance set and the brightness adjustment strategy;
and the radar controller adjusts the brightness of image elements contained in the road condition image in an HSL color adjustment mode according to the brightness adjustment strategy to obtain an image to be integrated.
Preferably, after the step in which the radar sensor determines the vertical distance information between each object in the current road condition image and the radar sensor, the method further includes:
the radar controller selects a target image element with the vertical distance smaller than or equal to the distance lower limit value from the current road condition image according to the vertical distance information;
and the radar controller acquires the actual vertical distance of the target image element, judges whether the actual vertical distance is smaller than a preset early warning threshold value or not, generates a braking early warning if the actual vertical distance is smaller than the preset early warning threshold value, and sends the braking early warning to the vehicle-mounted entertainment host computer for prompting.
In the driving assistance system based on the laser radar provided by the invention, the radar sensor acquires image information of the road conditions within a preset range and sends the acquired road condition image to the radar controller; the radar controller receives the road condition image, adjusts the brightness of the image elements contained in it using HSL color adjustment to obtain an image to be integrated, and then sends the image to be integrated to the vehicle-mounted entertainment host; the vehicle-mounted entertainment host reads a locally pre-stored top view of the vehicle model, integrates the image to be integrated with the top view, and displays the integrated image. By rendering the objects in the current road conditions with different HSL values, the invention lets the driver intuitively see potholes and raised or lowered road edges on complex road sections, improving convenience and safety while driving.
Drawings
FIG. 1 is a schematic diagram of a first embodiment of the lidar-based driving assistance system of the present invention;
FIG. 2 is a schematic diagram of the vehicle-mounted entertainment host displaying a pothole road section in the lidar-based driving assistance system of the present invention;
FIG. 3 is a schematic diagram of the vehicle-mounted entertainment host displaying a narrow road section in the lidar-based driving assistance system of the present invention;
FIG. 4 is a schematic flow chart of a first embodiment of the lidar-based driving assistance method of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1, fig. 1 is a schematic diagram of a first embodiment of a driving assistance system based on a lidar according to the present invention.
As shown in fig. 1, the laser radar-based driving assistance system may include: radar sensor 10, radar controller 20, and in-vehicle entertainment host 30.
Those skilled in the art will appreciate that the configuration shown in fig. 1 does not constitute a limitation of the lidar-based driving assistance system, which may include more or fewer components than shown, combine some components, or arrange the components differently.
Note that the radar sensor 10 may be a laser radar capable of emitting a laser beam to detect characteristics of a target such as its position and speed. In this embodiment, the radar sensor 10 includes three laser radars: one is arranged at the center of the front bumper of the vehicle, and the other two front-corner laser radars are arranged on both sides of the vehicle head. The three laser radars are mounted at the same level, so that they cover the entire road surface up to 4 meters ahead of the front wheels and 1 meter to either side of the vehicle. The radar controller 20 may be a control device capable of controlling the radar sensor 10 and of computing and processing data and images. The in-vehicle entertainment host 30 may be an Electronic Control Unit (ECU) mounted on the vehicle, also called a "driving computer", "in-vehicle computer", or "vehicle central control system".
The radar sensor 10 is configured to acquire image information of road conditions within a preset range, and send an acquired road condition image to the radar controller 20;
it should be noted that the preset range is a measurement range that can be covered by the radar sensor 10. The image information acquisition is to measure the information such as images, positions, heights and the like of various objects on the road surface in front of the vehicle in real time, so as to obtain corresponding road condition images or images.
In a specific implementation, the radar sensor 10 may transmit the collected road condition image to the radar controller 20 after collecting the image information of the road condition within the preset range.
Further, complex road conditions are generally pothole sections or narrow sections: a pothole section may contain obstacles of different heights and pits of different depths, while a narrow section usually has objects higher or lower than the road edge on both sides. Therefore, this embodiment measures the vertical distance between each road-surface object and the horizontal plane in which the radar sensor is located, and then adjusts the hue, saturation and lightness used to render that object according to the numerical range its vertical distance falls into, so that potholes, obstacles and road edges can be distinguished.
Specifically, the radar sensor 10 in this embodiment is further configured to perform image information acquisition on a road condition within a preset range, so as to obtain a current road condition image; then determining the vertical distance information between the vehicle and each object according to the current road condition image; and then the current road condition image and the vertical distance information are sent to the radar controller 20 as the collected road condition image.
The radar controller 20 is configured to receive the road condition image and perform brightness adjustment on the image elements contained in it using HSL color adjustment, so as to obtain an image to be integrated;
it should be understood that HSL is a representation of points in the RGB color model in a cylindrical coordinate system. HSL is chroma (Hue), Saturation (Saturation), and brightness/brightness (Lightness).
The chroma (H) is a basic attribute of color, and is a name of a commonly-known color, such as red, yellow, and the like. The saturation (S) is the purity of the color, and the higher the saturation, the purer the color, and the lower the saturation, the gray gradually becomes. The lightness (V) or brightness (L) may represent the degree of shading of an object.
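As a quick illustration of the model (not part of the patent text), the standard-library conversion below shows that when saturation is 0 the hue has no effect and the lightness alone sets the gray level, which is the property this embodiment relies on:

    import colorsys

    # With saturation fixed at 0, hue is irrelevant and the lightness alone
    # determines the gray level: L=0.0 gives black, L=1.0 gives white.
    for lightness in (0.0, 0.5, 1.0):
        r, g, b = colorsys.hls_to_rgb(0.0, lightness, 0.0)  # note: colorsys uses the HLS order
        print(lightness, tuple(round(c * 255) for c in (r, g, b)))
    # 0.0 (0, 0, 0)
    # 0.5 (128, 128, 128)
    # 1.0 (255, 255, 255)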
In this embodiment, HSL color adjustment characterizes the height profile of each object on the road surface by adjusting the hue, saturation, and lightness of the image elements (i.e., the objects on the road surface) in the road condition image.
Since the height of an object is reflected most intuitively by how light or dark it appears in the image, this embodiment preferably adjusts only the brightness of the image elements contained in the road condition image.
In a specific implementation, the radar controller 20 is further configured to receive the road condition image and extract the vertical distance information in it; it then classifies the vertical distances contained in that information according to a preset distance range and adjusts the brightness of the image elements contained in the road condition image using HSL color adjustment according to the classification result, so as to obtain the image to be integrated.
In this embodiment, the preset distance range includes a lower distance limit and an upper distance limit. Correspondingly, the radar controller 20 is further configured to classify the vertical distances contained in the vertical distance information according to the lower and upper limit values to obtain vertical distance sets (for example, vertical distances greater than or equal to 122.5 mm form a first class, vertical distances less than or equal to 0 mm form a second class, and vertical distances between 0 and 122.5 mm form a third class); to look up the corresponding brightness adjustment strategy in a pre-constructed mapping relation (the mapping relation records the correspondence between vertical distance sets and brightness adjustment strategies, e.g., the first class corresponds to a first brightness adjustment strategy and the second class to a second one); and to adjust the brightness of the image elements contained in the road condition image using HSL color adjustment according to that strategy, so as to obtain the image to be integrated.
The first brightness adjustment strategy (for road-surface objects in the first class) sets hue 0, saturation 0, lightness 255; the second strategy (second class) sets hue 0, saturation 0, lightness 0; and the third strategy (third class) sets hue 0, saturation 0, and a lightness between 0 and 255.
Specifically, a road-surface point whose vertical distance below the horizontal plane of the vehicle radar is greater than or equal to 122.5 mm (a pit) can be rendered in the color (hue 0, saturation 0, lightness 255), i.e. pure white; a point whose vertical distance is less than or equal to 0 mm (at or above the radar plane, i.e. a raised obstacle) is rendered in (hue 0, saturation 0, lightness 0), i.e. pure black; and a point whose vertical distance lies between 0 and 122.5 mm is rendered in gray (hue 0, saturation 0, lightness between 0 and 255), with every 0.5 mm change in distance corresponding to one lightness step.
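A minimal sketch of the mapping just described (the function names are illustrative and not taken from the patent; distances are assumed to be in millimetres, measured downward from the radar's horizontal plane):

    def lightness_from_vertical_distance(d_mm: float) -> int:
        # Thresholds follow the embodiment: <= 0 mm (at or above the radar plane,
        # a raised obstacle) gives black; >= 122.5 mm (a deep pit) gives white;
        # in between, every 0.5 mm corresponds to one lightness step.
        if d_mm <= 0.0:
            return 0
        if d_mm >= 122.5:
            return 255
        return min(255, round(d_mm / 0.5))

    def hsl_for_vertical_distance(d_mm: float) -> tuple[int, int, int]:
        # Hue and saturation stay 0, so every pixel is a pure gray level.
        return (0, 0, lightness_from_vertical_distance(d_mm))

For example, lightness_from_vertical_distance(61.0) returns 122, a mid gray.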
Referring to fig. 2, fig. 2 is a schematic diagram of the vehicle-mounted entertainment host displaying a pothole road section in the lidar-based driving assistance system of the present invention.
As shown in fig. 2, when the vehicle travels on a pothole section, the driver decelerates and turns on the in-vehicle entertainment host 30, which can display a close-range image of the road ahead. In fig. 2, black indicates a raised area (the darker, the higher) and white indicates a pit (the lighter, the deeper). The driver can therefore steer around high bumps and deep pits at low speed, avoiding tire sinking or vehicle jolting and improving driving safety and comfort.
Referring to fig. 3, fig. 3 is a schematic diagram of a vehicle-mounted entertainment host displaying a narrow road section in the laser radar-based driving assistance system of the present invention.
As shown in fig. 3, when the vehicle travels on a narrow road, the driver decelerates and turns on the in-vehicle entertainment host 30, which displays a close-range image of the road ahead. The black raised areas on the left and right in fig. 3 indicate the road edges. The driver can thus steer the vehicle precisely through the narrow section at low speed, avoiding errors of visual judgment and improving driving safety and comfort.
The radar controller 20 is configured to send the image to be integrated to the in-vehicle entertainment host 30;
in a specific implementation, the radar controller 20, after obtaining the image to be integrated, may send the image to be integrated to the in-vehicle entertainment host 30 for subsequent image integration.
The vehicle-mounted entertainment host 30 is configured to read a locally pre-stored top view of the vehicle model, integrate the image to be integrated with the top view of the vehicle model, and display the integrated image.
It should be understood that the top view of the vehicle model is a picture containing a top view image of the vehicle model, and the top view of the vehicle model can display a part of the vehicle body of the vehicle model, so that a user can clearly and intuitively know road condition information in front of the vehicle.
In a specific implementation, when the to-be-integrated image is acquired, the in-vehicle entertainment host 30 may read a locally pre-stored top view of the vehicle model, then integrate the to-be-integrated image with the top view of the vehicle model, and display the integrated image.
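A minimal sketch of this integration step, assuming both pictures are ordinary image files and that the top view is simply pasted at the bottom centre of the processed road image; the layout and the use of the Pillow library are assumptions, not details given by the patent:

    from PIL import Image

    def integrate_with_top_view(road_image_path: str, top_view_path: str) -> Image.Image:
        # Overlay the vehicle-model top view on the lower edge of the processed
        # road image so the driver sees the road ahead relative to the car body.
        road = Image.open(road_image_path).convert("RGBA")
        car = Image.open(top_view_path).convert("RGBA")
        x = (road.width - car.width) // 2   # centre the car horizontally
        y = road.height - car.height        # flush with the bottom edge
        road.alpha_composite(car, dest=(x, y))
        return road.convert("RGB")          # ready for display on the host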
Further, in order to warn the driver in time when an obstacle in front of the vehicle is too high, in this embodiment the radar controller 20 is further configured to select, according to the vertical distance information, the target image elements whose vertical distance is less than or equal to the lower distance limit from the current road condition image; it then acquires the actual vertical distance of each target image element, judges whether that distance is smaller than a preset early-warning threshold, and if so generates a braking early warning and sends it to the in-vehicle entertainment host 30.
The braking early warning can prompt the driver, by voice broadcast, on-screen display, or buzzer, that there is an excessively high obstacle in front of the vehicle, reminding the driver to avoid it and thus improving driving safety.
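A small sketch of this check, assuming the target image elements have already been selected as described above; the threshold value and function name are illustrative only:

    def brake_warning_needed(target_distances_mm, warning_threshold_mm: float = 50.0) -> bool:
        # target_distances_mm: actual vertical distances of the image elements
        # already selected as raised obstacles (vertical distance <= lower limit).
        # Returns True when any of them is below the early-warning threshold,
        # in which case a braking early warning is sent to the entertainment host.
        return any(d < warning_threshold_mm for d in target_distances_mm)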
In the driving assistance system based on the laser radar of this embodiment, the radar sensor acquires image information of the road conditions within a preset range and sends the acquired road condition image to the radar controller; the radar controller receives the road condition image, adjusts the brightness of the image elements contained in it using HSL color adjustment to obtain an image to be integrated, and then sends the image to be integrated to the vehicle-mounted entertainment host; the vehicle-mounted entertainment host reads a locally pre-stored top view of the vehicle model, integrates the image to be integrated with the top view, and displays the integrated image. By rendering the objects in the current road conditions with different HSL values, this embodiment lets the driver intuitively see potholes and raised or lowered road edges on complex road sections, improving convenience and safety while driving.
Referring to fig. 4, fig. 4 is a schematic flowchart of a first embodiment of a driving assistance method based on a lidar according to the present invention.
The driving assistance method based on the laser radar provided by the embodiment comprises the following steps:
step S10: the radar sensor acquires image information of road conditions within a preset range and sends the acquired road condition images to the radar controller;
it should be noted that the radar sensor may be a laser radar capable of emitting a laser beam to detect a characteristic amount such as a position, a speed, and the like of the target. In this embodiment, the radar sensor includes three laser radars, one of which is arranged in the center of the front bumper of the vehicle, and the other two front corner laser radars are arranged on both sides, and the three laser radars are arranged at the same level, so that all roads in the range of 1 meter each from 4 meters ahead of the wheels to the left and right of the vehicle can be covered. The radar controller may be a control device capable of controlling the radar sensor, and the control device has a function of performing arithmetic and processing on data and images. The vehicle-mounted entertainment host may be an Electronic Control Unit (ECU) mounted on a vehicle, which is also called a "driving computer", "vehicle-mounted computer", or a "vehicle central Control system".
It should be understood that the preset range is the measurement range covered by the radar sensor. Image information acquisition means acquiring the image, position, height and other information of each object on the road surface in front of the vehicle, thereby obtaining the road condition image.
In concrete implementation, the radar sensor can send the collected road condition image to the radar controller after acquiring the image information of the road condition within the preset range.
Further, complex road conditions are generally pothole sections or narrow sections: a pothole section may contain obstacles of different heights and pits of different depths, while a narrow section usually has objects higher or lower than the road edge on both sides. Therefore, this embodiment measures the vertical distance between each object in the road condition image and the horizontal plane in which the radar sensor is located, and then characterizes the object with the hue, saturation and lightness corresponding to that vertical distance, so that potholes, obstacles and road edges can be distinguished.
Specifically, in this embodiment, the radar sensor may acquire image information of a road condition within a preset range to obtain a current road condition image; then determining the vertical distance information between the vehicle and each object according to the current road condition image; and then the current road condition image and the vertical distance information are used as collected road condition images to be sent to the radar controller.
Step S20: the radar controller receives the road condition image and performs brightness adjustment on image elements contained in the road condition image in an HSL (hue, saturation, lightness) color adjustment mode to obtain an image to be integrated;
It should be understood that HSL is a representation of points of the RGB color model in a cylindrical coordinate system; the letters stand for hue, saturation, and lightness.
Hue (H) is the basic attribute of a color, i.e. its common name, such as red or yellow. Saturation (S) is the purity of the color, with a value range of 0-100%: the higher the saturation, the purer the color, and the lower the saturation, the more it tends toward gray. Lightness (L), sometimes called brightness, also ranges from 0-100% and represents how light or dark the color is.
In this embodiment, HSL color adjustment represents the height of each object on the road surface by adjusting the hue, saturation, and lightness of the image elements in the road condition image.
Since the height of an object is reflected most intuitively by how light or dark it appears in the image, this embodiment preferably adjusts only the brightness of the image elements contained in the road condition image.
In a specific implementation, the radar controller may receive the road condition image and extract the vertical distance information in it; it then classifies the vertical distances contained in that information according to a preset distance range and adjusts the brightness of the image elements contained in the road condition image using HSL color adjustment according to the classification result, so as to obtain the image to be integrated.
In this embodiment, the preset distance range includes a lower distance limit and an upper distance limit. Accordingly, the radar controller may classify the vertical distances contained in the vertical distance information according to the lower and upper limit values to obtain vertical distance sets (for example, vertical distances greater than or equal to 122.5 mm form a first class, vertical distances less than or equal to 0 mm form a second class, and vertical distances between 0 and 122.5 mm form a third class); then look up the corresponding brightness adjustment strategy in a pre-constructed mapping relation, which records the correspondence between vertical distance sets and brightness adjustment strategies; and finally adjust the brightness of the image elements contained in the road condition image using HSL color adjustment according to that strategy, so as to obtain the image to be integrated.
The brightness adjustment strategy corresponding to the first class of vertical distances sets hue 0, saturation 0, lightness 255; the strategy corresponding to the second class sets hue 0, saturation 0, lightness 0; and the strategy corresponding to the third class sets hue 0, saturation 0, and a lightness between 0 and 255.
Specifically, a road-surface point whose vertical distance below the horizontal plane of the vehicle radar is greater than or equal to 122.5 mm (a pit) can be rendered in the color (hue 0, saturation 0, lightness 255), i.e. pure white; a point whose vertical distance is less than or equal to 0 mm (at or above the radar plane, i.e. a raised obstacle) is rendered in (hue 0, saturation 0, lightness 0), i.e. pure black; and a point whose vertical distance lies between 0 and 122.5 mm is rendered in gray (hue 0, saturation 0, lightness between 0 and 255), with every 0.5 mm change in distance corresponding to one lightness step.
Step S30: the radar controller sends the image to be integrated to the vehicle-mounted entertainment host;
in a specific implementation, after the radar controller obtains the image to be integrated, the image to be integrated is sent to the vehicle-mounted entertainment host for subsequent image integration.
Step S40: and the vehicle-mounted entertainment host reads a locally pre-stored vehicle model top view, integrates the image to be integrated with the vehicle model top view, and displays the integrated image.
It should be understood that the top view of the vehicle model is a picture containing a top view image of the vehicle model, and the top view of the vehicle model can display a part of the vehicle body of the vehicle model, so that a user can clearly and intuitively know road condition information in front of the vehicle.
In a specific implementation, when the vehicle-mounted entertainment host acquires the image to be integrated, the vehicle-mounted entertainment host reads a locally pre-stored vehicle model top view, then integrates the image to be integrated with the vehicle model top view, and displays the integrated image.
Further, in order to warn the driver in time when an obstacle in front of the vehicle is too high, in this embodiment the radar controller is further configured to select, according to the vertical distance information, the target image elements whose vertical distance is less than or equal to the lower distance limit from the current road condition image; it then acquires the actual vertical distance of each target image element, judges whether that distance is smaller than a preset early-warning threshold, and if so generates a braking early warning and sends it to the vehicle-mounted entertainment host. The braking early warning can prompt the driver, by voice broadcast, on-screen display, or buzzer, that there is an excessively high obstacle in front of the vehicle, reminding the driver to avoid it and thus improving driving safety.
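Putting steps S10 to S40 together, one pass of the method could look like the sketch below, reusing the distance-to-lightness helper and the warning check sketched earlier; the sensor and host interfaces (acquire_vertical_distances, warn, display) are placeholders, since the patent does not define a programming interface:

    def assisted_driving_cycle(radar_sensor, entertainment_host, top_view_path,
                               lower_limit_mm=0.0, warning_threshold_mm=50.0):
        # S10: acquire, for each image element, its vertical distance from the
        # radar's horizontal plane (placeholder sensor API).
        distances_mm = radar_sensor.acquire_vertical_distances()
        # S20: derive the gray-scale image to be integrated from the distances.
        shaded = [[lightness_from_vertical_distance(d) for d in row] for row in distances_mm]
        # Early warning for elements classified as raised obstacles.
        obstacles = [d for row in distances_mm for d in row if d <= lower_limit_mm]
        if brake_warning_needed(obstacles, warning_threshold_mm):
            entertainment_host.warn("obstacle ahead")        # placeholder host API
        # S30/S40: integrate with the locally stored top view and display it.
        entertainment_host.display(shaded, top_view_path)    # placeholder host API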
In the driving assistance method based on the laser radar of this embodiment, the radar sensor acquires image information of the road conditions within a preset range and sends the acquired road condition image to the radar controller; the radar controller receives the road condition image, adjusts the brightness of the image elements contained in it using HSL color adjustment to obtain an image to be integrated, and then sends the image to be integrated to the vehicle-mounted entertainment host; the vehicle-mounted entertainment host reads a locally pre-stored top view of the vehicle model, integrates the image to be integrated with the top view, and displays the integrated image. By rendering the objects in the current road conditions with different HSL values, this embodiment lets the driver intuitively grasp potholes and obstacles in the road surface on complex road sections, improving convenience and safety while driving.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or system that comprises that element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly also by hardware, but in many cases the former is the better implementation. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk), which includes instructions for enabling a terminal device (e.g., a mobile phone, computer, server, air conditioner, or network device) to execute the methods according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A lidar-based driver assistance system, the system comprising: the system comprises a radar sensor, a radar controller and a vehicle-mounted entertainment host;
the radar sensor is used for acquiring image information of road conditions in a preset range and sending the acquired road condition images to the radar controller;
the radar controller is used for receiving the road condition image and performing brightness adjustment on image elements contained in the road condition image in an HSL (hue, saturation, lightness) color adjustment mode to obtain an image to be integrated;
the radar controller is used for sending the image to be integrated to the vehicle-mounted entertainment host;
the vehicle-mounted entertainment host is used for reading a vehicle model top view which is pre-stored locally, integrating the image to be integrated with the vehicle model top view and displaying the integrated image.
2. The system of claim 1, wherein the radar sensor is further configured to acquire image information of a road condition within a preset range to obtain an image of a current road condition;
the radar sensor is also used for determining the vertical distance information between the vehicle and each road surface object according to the current road condition image;
the radar sensor is further used for sending the current road condition image and the vertical distance information to the radar controller as collected road condition images.
3. The system of claim 1, wherein the radar controller is further configured to receive the road condition image and extract vertical distance information from the road condition image;
the radar controller is further configured to classify the vertical distances included in the vertical distance information according to a preset distance range, and perform brightness adjustment on image elements included in the road condition image in an HSL color adjustment mode according to the classification result to obtain an image to be integrated.
4. The system of claim 3, wherein the preset distance range comprises: a lower distance limit and an upper distance limit;
the radar controller is further configured to classify vertical distances included in the vertical distance information according to the distance lower limit value and the distance upper limit value to obtain a vertical distance set;
the radar controller is further configured to search a corresponding brightness adjustment strategy in a pre-constructed mapping relation according to the vertical distance set, where the mapping relation includes a corresponding relation between the vertical distance set and the brightness adjustment strategy;
and the radar controller is also used for performing brightness adjustment on image elements contained in the road condition image in an HSL (hue, saturation, lightness) color adjustment mode according to the brightness adjustment strategy to obtain an image to be integrated.
5. The system of claim 4, wherein the radar controller is further configured to select a target image element with a vertical distance less than or equal to the distance lower limit value from the current road condition image according to the vertical distance information;
the radar controller is further used for obtaining the actual vertical distance of the target image element, judging whether the actual vertical distance is smaller than a preset early warning threshold value, if so, generating a braking early warning, and sending the braking early warning to the vehicle-mounted entertainment host.
6. A method of assisted driving based on lidar, the method comprising:
the radar sensor acquires image information of road conditions within a preset range and sends the acquired road condition images to the radar controller;
the radar controller receives the road condition image and performs brightness adjustment on image elements contained in the road condition image in an HSL (hue, saturation, lightness) color adjustment mode to obtain an image to be integrated;
the radar controller sends the image to be integrated to the vehicle-mounted entertainment host;
and the vehicle-mounted entertainment host reads a locally pre-stored vehicle model top view, integrates the image to be integrated with the vehicle model top view, and displays the integrated image.
7. The method as claimed in claim 6, wherein the step of the radar sensor collecting image information of the road condition within a preset range and transmitting the collected road condition image to the radar controller comprises:
the radar sensor acquires image information of road conditions within a preset range to obtain a current road condition image;
the radar sensor determines vertical distance information between the vehicle and each road surface object according to the current road condition image;
and the radar sensor sends the current road condition image and the vertical distance information to the radar controller as collected road condition images.
8. The method as claimed in claim 6, wherein the step of the radar controller receiving the road condition image and performing brightness adjustment on image elements included in the road condition image in an HSL color adjustment mode to obtain an image to be integrated comprises:
the radar controller receives the road condition image and extracts vertical distance information in the road condition image;
and the radar controller classifies the vertical distances contained in the vertical distance information according to a preset distance range, and adjusts the brightness of image elements contained in the road condition image in an HSL color adjustment mode according to the classification result to obtain an image to be integrated.
9. The method of claim 8, wherein the preset distance range comprises: a lower distance limit and an upper distance limit;
the step in which the radar controller classifies the vertical distances contained in the vertical distance information according to the preset distance range and adjusts the brightness of image elements contained in the road condition image in an HSL color adjustment mode according to the classification result to obtain an image to be integrated comprises:
the radar controller classifies the vertical distance contained in the vertical distance information according to the distance lower limit value and the distance upper limit value to obtain a vertical distance set;
the radar controller searches a corresponding brightness adjustment strategy in a pre-constructed mapping relation according to the vertical distance set, wherein the mapping relation comprises the corresponding relation between the vertical distance set and the brightness adjustment strategy;
and the radar controller adjusts the brightness of image elements contained in the road condition image in an HSL color adjustment mode according to the brightness adjustment strategy to obtain an image to be integrated.
10. The method of claim 9, wherein after the step of the radar sensor measuring vertical distance information between each object in the current road condition image and the radar sensor, the method further comprises:
the radar controller selects a target image element with the vertical distance smaller than or equal to the distance lower limit value from the current road condition image according to the vertical distance information;
and the radar controller acquires the actual vertical distance of the target image element, judges whether the actual vertical distance is smaller than a preset early warning threshold value or not, generates a braking early warning if the actual vertical distance is smaller than the preset early warning threshold value, and sends the braking early warning to the vehicle-mounted entertainment host.
CN201911216728.7A 2019-11-29 2019-11-29 Driving assistance system and method based on laser radar Active CN110884429B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911216728.7A CN110884429B (en) 2019-11-29 2019-11-29 Driving assistance system and method based on laser radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911216728.7A CN110884429B (en) 2019-11-29 2019-11-29 Driving assistance system and method based on laser radar

Publications (2)

Publication Number Publication Date
CN110884429A true CN110884429A (en) 2020-03-17
CN110884429B CN110884429B (en) 2021-06-08

Family

ID=69750002

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911216728.7A Active CN110884429B (en) 2019-11-29 2019-11-29 Driving assistance system and method based on laser radar

Country Status (1)

Country Link
CN (1) CN110884429B (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102812703A (en) * 2010-03-18 2012-12-05 爱信精机株式会社 Image display device
US20160104272A1 (en) * 2014-10-10 2016-04-14 Ncku Research And Development Foundation Auto-contrast enhancement system
CN109849782A (en) * 2017-11-30 2019-06-07 比亚迪股份有限公司 Virtual panoramic auxiliary driving device and its display methods, vehicle
CN109094567A (en) * 2018-09-29 2018-12-28 奇瑞汽车股份有限公司 Automobile safety protective method and apparatus

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113569692A (en) * 2021-07-22 2021-10-29 上汽通用五菱汽车股份有限公司 Driving assistance method, system, device, and computer-readable storage medium
CN113569692B (en) * 2021-07-22 2024-02-09 上汽通用五菱汽车股份有限公司 Driving assistance method, system, apparatus, and computer-readable storage medium

Also Published As

Publication number Publication date
CN110884429B (en) 2021-06-08

Similar Documents

Publication Publication Date Title
US11961162B2 (en) Imaging apparatus, image processing apparatus, display system, and vehicle
US11056003B1 (en) Occupant facing vehicle display
US11008016B2 (en) Display system, display method, and storage medium
AU2018219084B2 (en) Using wheel orientation to determine future heading
US7486175B2 (en) Vehicle drive assist system
CN112084232B (en) Vehicle driving risk assessment method and device based on visual field information of other target vehicles
US10732420B2 (en) Head up display with symbols positioned to augment reality
US20230184560A1 (en) Visual interface display method and apparatus, electronic device, and storage medium
CN109952491B (en) Method and system for generating a representation of an object detected by a perception system of a vehicle
US11364842B2 (en) Notification device
CN111505617B (en) Vehicle positioning method, device, equipment and storage medium
CN115273023A (en) Vehicle-mounted road pothole identification method and system and automobile
US8170284B2 (en) Apparatus and method for displaying image of view in front of vehicle
CN110884429B (en) Driving assistance system and method based on laser radar
CN113343738A (en) Detection method, device and storage medium
CN108705972B (en) Vehicle transverse control information display system
CN114559935A (en) Virtual zebra crossing projection control method and device, storage medium and navigation system
JPH0778240A (en) Calculating method for road disappearing point
JP2023536812A (en) Systems and methods for informing vehicle occupants of the severity and location of potential vehicle threats
JP2018098567A (en) Imaging apparatus, image processing apparatus, display system, and vehicle
WO2017179174A1 (en) Moving body surroundings display method and moving body surroundings display apparatus
JP2018101850A (en) Imaging device, image processing apparatus, display system, and vehicle
KR20170055624A (en) Apparatus for displaying traffic lane using head-up display and method thereof
JPH0778255A (en) Method for extracting straight line part of image
JP2019049808A (en) Information display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant