CN116513054A - Vehicle driving assisting method and device, vehicle and storage medium - Google Patents
- Publication number
- CN116513054A (application number CN202310506012.0A)
- Authority
- CN
- China
- Prior art keywords
- haze
- vehicle
- target
- characteristic information
- surrounding environment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/27—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q1/00—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
- B60Q1/02—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
- B60Q1/04—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights
- B60Q1/06—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle
- B60Q1/08—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle automatically
- B60Q1/085—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle automatically due to special conditions, e.g. adverse weather, type of road, badly illuminated road signs or potential dangers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/105—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8053—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for bad weather conditions or night vision
Abstract
The application provides a vehicle driving assisting method and device, a vehicle, and a storage medium, where the method includes the following steps: respectively performing haze recognition on the target images corresponding to each vehicle-mounted camera to obtain a first haze recognition result of the around-view cameras and a second haze recognition result of the front-view camera; determining, based on the first haze recognition result and the second haze recognition result, whether haze exists in the surrounding environment where the target vehicle is located, and determining a haze level when haze exists, where the haze levels include light haze, medium haze, heavy haze, and group haze (a localized fog bank); and, when haze exists in the surrounding environment where the target vehicle is located, controlling a target vehicle lamp of the target vehicle based on the haze level. Performing haze recognition on target images from both the front-view camera and the around-view cameras improves the accuracy of haze recognition, enables automatic control of the target vehicle lamp according to the recognition result, and thereby improves driving safety.
Description
Technical Field
The present disclosure relates to the field of automatic driving technologies, and in particular, to a vehicle driving assisting method and apparatus, a vehicle, and a storage medium.
Background
With rapid economic growth and accelerating urbanization, haze pollution weather occurs more and more frequently. During haze pollution weather, molecules, water vapor, and a large number of suspended particles in the atmosphere form an aerosol that strongly absorbs, scatters, and reflects light, reducing atmospheric visibility and seriously disrupting road, railway, aviation, and shipping traffic. Moreover, the large number of pollutant particles suspended in the air during haze weather poses a serious health hazard to people exposed to it. Detecting haze weather promptly and accurately is therefore of great significance to daily life, normal vehicle traffic, agricultural production, and the like.
In the prior art, haze detection mainly judges the degree of haze pollution by measuring the content of atmospheric aerosol particles in the air. Such detection relies on professional measuring instruments to analyze the components of the air, places high demands on the instruments, and involves a complex measurement procedure, which makes it unsuitable for haze detection while a vehicle is driving. In addition, many drivers, unaware of the conditions around the vehicle, fail to actively turn on the front and rear fog lamps, which easily leads to accidents. How to improve the accuracy of haze detection and improve driving safety has therefore become an urgent technical problem to be solved.
Disclosure of Invention
In view of the foregoing, an object of the present application is to provide a vehicle driving assisting method and apparatus, a vehicle, and a storage medium, which improve the accuracy of haze recognition by performing haze recognition on the target images corresponding to a front-view camera and around-view cameras, thereby enabling automatic control of target vehicle lamps according to the recognition results and improving driving safety.
The embodiment of the application provides a vehicle auxiliary driving method, which comprises the following steps:
acquiring target images of the surrounding environment of a target vehicle, respectively captured by each vehicle-mounted camera; the vehicle-mounted cameras are divided into around-view cameras and a front-view camera according to their installation positions;
respectively performing haze recognition on the target images corresponding to each vehicle-mounted camera to obtain a first haze recognition result of the around-view cameras and a second haze recognition result of the front-view camera;
determining whether haze exists in the surrounding environment where the target vehicle is located or not based on the first haze identification result and the second haze identification result, and determining a haze level when the haze exists;
and when the haze exists in the surrounding environment where the target vehicle is located, controlling a target car lamp of the target vehicle based on the haze level of the surrounding environment where the target vehicle is located.
In one possible implementation manner, the determining, based on the first haze recognition result and the second haze recognition result, whether the surrounding environment where the target vehicle is located has haze, and determining the haze level when haze exists, includes:
judging whether the first haze identification result of the around-view cameras is haze-free;
if the first haze identification result is haze-free, judging whether the second haze identification result of the front-view camera is haze-free; if the second haze identification result is haze-free, determining that the surrounding environment where the target vehicle is located is haze-free; and if the second haze identification result is haze, determining that group haze (a fog bank ahead of the vehicle) exists in the surrounding environment where the target vehicle is located.
In one possible implementation manner, after the judging whether the first haze recognition result of the around-view cameras is haze-free, the vehicle driving assisting method further includes:
if the first haze identification result is haze, determining, based on the haze recognition results of each vehicle-mounted camera, that the surrounding environment where the target vehicle is located has haze, and determining the haze level.
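The around-view/front-view decision described above can be sketched as follows. This is an illustrative reading of the claims, not the patent's actual implementation; the function name and the convention that a result of 0.0 means haze-free are assumptions.

```python
def detect_haze(around_results, front_result):
    # around_results: haze recognition results of the around-view cameras;
    # front_result: haze recognition result of the front-view camera.
    # By the assumed convention, 0.0 means haze-free and a positive haze
    # concentration value means haze was recognized in that image.
    if all(r == 0.0 for r in around_results):
        if front_result == 0.0:
            return "no_haze"     # neither camera group detects haze
        return "group_haze"      # haze only ahead of the vehicle: a fog bank
    return "haze"                # around-view cameras detect ambient haze

print(detect_haze([0.0, 0.0, 0.0, 0.0], 0.4))  # group_haze
```

The group-haze branch captures why both camera groups matter: a fog bank can sit ahead of the vehicle while the immediate surroundings are still clear.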
In a possible implementation manner, the determining, based on the haze recognition results of each vehicle-mounted camera, that the surrounding environment where the target vehicle is located has haze, and determining the haze level, includes:
sorting the haze recognition results of the plurality of target images, and screening out the largest haze recognition result from the sorted haze recognition results;
detecting a haze concentration value interval in which the maximum haze identification result is located;
if the maximum haze identification result is in the first interval, the haze level of the surrounding environment where the target vehicle is located is light haze;
if the maximum haze identification result is in the second interval, the haze level of the surrounding environment where the target vehicle is located is medium haze;
if the maximum haze identification result is in the third interval, the haze level of the surrounding environment where the target vehicle is located is heavy haze;
the haze concentration value in the first interval is smaller than the haze concentration value in the second interval, and the haze concentration value in the second interval is smaller than the haze concentration value in the third interval.
In one possible embodiment, the target vehicle lamp includes a vehicle-mounted fog lamp and a vehicle-mounted double-flash lamp (hazard lamp); when it is determined that the surrounding environment where the target vehicle is located has haze, the controlling a target lamp of the target vehicle based on the haze level of the surrounding environment where the target vehicle is located includes:
if the haze level of the surrounding environment where the target vehicle is located is light haze or group haze, controlling the vehicle-mounted fog lamp to be turned on (in an illuminating state);
and if the haze level of the surrounding environment where the target vehicle is located is medium haze or heavy haze, controlling both the vehicle-mounted fog lamp and the vehicle-mounted double-flash lamp to be turned on.
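The lamp-control rule above amounts to a small mapping from haze level to lamp states; a minimal sketch, where the function name, level strings, and lamp identifiers are assumptions for illustration:

```python
def lamps_to_turn_on(haze_level):
    # Light haze or group haze (fog bank): fog lamps only.
    if haze_level in ("light", "group"):
        return {"fog_lamp"}
    # Medium or heavy haze: fog lamps plus the double-flash (hazard) lamps.
    if haze_level in ("medium", "heavy"):
        return {"fog_lamp", "double_flash"}
    return set()  # no haze: leave both lamps off

print(sorted(lamps_to_turn_on("heavy")))  # ['double_flash', 'fog_lamp']
```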
In one possible implementation manner, for any one of the target images, haze recognition is performed on the target image according to the following steps to obtain the haze recognition result of the target image:
extracting haze characteristic information from the target image; the haze characteristic information comprises hue characteristic information, saturation characteristic information and brightness characteristic information;
comparing the haze characteristic information of the target image with haze characteristic information of a plurality of reference images, and determining a reference image matched with the target image;
and determining the determined reference haze concentration value corresponding to the reference image as a haze identification result of the target image.
In one possible implementation manner, the comparing the haze characteristic information of the target image with the haze characteristic information of a plurality of reference images, and determining the reference image matched with the target image includes:
For each reference image, determining a hue characteristic information difference value according to the hue characteristic information and the reference hue characteristic information of the reference image, determining a saturation characteristic information difference value according to the saturation characteristic information and the reference saturation characteristic information of the reference image, and determining a brightness characteristic information difference value according to the brightness characteristic information and the reference brightness characteristic information of the reference image;
detecting whether the hue characteristic information difference value, the saturation characteristic information difference value and the brightness characteristic information difference value are smaller than a preset threshold value;
if yes, the reference image is determined to be the reference image matched with the target image.
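The matching procedure above can be sketched as follows, treating each image's haze characteristic information as a (hue, saturation, brightness) tuple. The threshold value, the data layout, and the sample reference values are assumptions for illustration, not the patent's calibration.

```python
def find_matching_reference(target_feat, references, threshold=0.1):
    # target_feat: (hue, saturation, brightness) of the target image.
    # references: list of (reference_feature_tuple, reference_haze_concentration).
    for ref_feat, concentration in references:
        diffs = [abs(t - r) for t, r in zip(target_feat, ref_feat)]
        # The reference matches when every characteristic information
        # difference value is smaller than the preset threshold.
        if all(d < threshold for d in diffs):
            return concentration  # used as the haze recognition result
    return None  # no reference image matched the target image

references = [((0.55, 0.10, 0.80), 0.2), ((0.58, 0.05, 0.60), 0.5)]
print(find_matching_reference((0.56, 0.08, 0.78), references))  # 0.2
```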
The embodiment of the application also provides a vehicle auxiliary driving device, which comprises:
the acquisition module is used for acquiring target images of the surrounding environment of the target vehicle, respectively captured by each vehicle-mounted camera; the vehicle-mounted cameras are divided into around-view cameras and a front-view camera according to their installation positions;
the haze recognition module is used for respectively performing haze recognition on the target images corresponding to each vehicle-mounted camera to obtain a first haze recognition result of the around-view cameras and a second haze recognition result of the front-view camera;
The haze level determining module is used for determining whether haze exists in the surrounding environment where the target vehicle is located or not based on the first haze identification result and the second haze identification result, and determining the haze level when the haze exists;
and the car light control module is used for controlling the target car light of the target car based on the haze level of the surrounding environment where the target car is located when the surrounding environment where the target car is located is determined to have haze.
The embodiment of the application also provides a vehicle, comprising a processor, a memory, and a bus, where the memory stores machine-readable instructions executable by the processor; when the vehicle is running, the processor and the memory communicate via the bus, and the machine-readable instructions, when executed by the processor, perform the steps of the vehicle driving assisting method described above.
Embodiments of the present application also provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the vehicle driving assisting method described above.
The embodiments of the present application provide a vehicle driving assisting method and device, a vehicle, and a storage medium, where the vehicle driving assisting method includes the following steps: acquiring target images of the surrounding environment of a target vehicle, respectively captured by each vehicle-mounted camera, where the vehicle-mounted cameras are divided into around-view cameras and a front-view camera according to their installation positions; respectively performing haze recognition on the target images corresponding to each vehicle-mounted camera to obtain a first haze recognition result of the around-view cameras and a second haze recognition result of the front-view camera; determining, based on the first haze recognition result and the second haze recognition result, whether haze exists in the surrounding environment where the target vehicle is located, and determining a haze level when haze exists; and, when haze exists in the surrounding environment where the target vehicle is located, controlling a target vehicle lamp of the target vehicle based on the haze level. Performing haze recognition on target images from both the front-view camera and the around-view cameras improves the accuracy of haze recognition, enables automatic control of the target vehicle lamp according to the recognition result, and thereby improves driving safety.
In order to make the above objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered limiting the scope, and that other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flowchart of a method for assisting driving of a vehicle according to an embodiment of the present disclosure;
fig. 2 is a schematic installation diagram of a vehicle-mounted camera according to an embodiment of the present application;
FIG. 3 is a second flowchart of a method for assisting driving of a vehicle according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of a driving assisting device for a vehicle according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a vehicle according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it should be understood that the accompanying drawings in the present application are only for the purpose of illustration and description, and are not intended to limit the protection scope of the present application. In addition, it should be understood that the schematic drawings are not drawn to scale. A flowchart, as used in this application, illustrates operations implemented according to some embodiments of the present application. It should be appreciated that the operations of the flow diagrams may be implemented out of order and that steps without logical context may be performed in reverse order or concurrently. Moreover, one or more other operations may be added to the flow diagrams and one or more operations may be removed from the flow diagrams as directed by those skilled in the art.
In addition, the described embodiments are only some, but not all, of the embodiments of the present application. The components of the embodiments of the present application, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as provided in the accompanying drawings, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, are intended to be within the scope of the present application.
In order to enable one skilled in the art to utilize the present disclosure, the following embodiments are provided in connection with a particular application scenario "vehicle assisted driving", and the generic principles defined herein may be applied to other embodiments and applications without departing from the spirit or scope of the disclosure.
The method, the device, the vehicle, and the computer-readable storage medium described below in the embodiments of the present application may be applied to any scenario requiring haze detection while driving; the embodiments of the present application do not limit the specific application scenario, and any scheme using the vehicle driving assisting method, device, vehicle, and storage medium provided in the embodiments of the present application is within the scope of protection of the present application.
First, application scenarios applicable to the present application will be described. The method and the device can be applied to the technical field of automatic driving.
According to research, in the prior art, haze detection mainly judges the degree of haze pollution by measuring the content of atmospheric aerosol particles in the air. Such detection relies on professional measuring instruments to analyze the components of the air, places high demands on the instruments, and involves a complex measurement procedure, which makes it unsuitable for haze detection while a vehicle is driving. In addition, many drivers, unaware of the conditions around the vehicle, fail to actively turn on the front and rear fog lamps, which easily leads to accidents. How to improve the accuracy of haze detection and improve driving safety has therefore become an urgent technical problem to be solved.
Based on this, the embodiment of the application provides a vehicle driving assisting method, which improves the accuracy of haze recognition by performing haze recognition on the target images corresponding to the front-view camera and the around-view cameras, thereby enabling automatic control of the target vehicle lamps according to the recognition results and improving driving safety.
Referring to fig. 1, fig. 1 is a flowchart of a method for assisting driving of a vehicle according to an embodiment of the present application. As shown in fig. 1, a vehicle driving assisting method provided in an embodiment of the present application includes:
S101: acquiring target images of the surrounding environment of a target vehicle, respectively captured by each vehicle-mounted camera; the vehicle-mounted cameras are divided into around-view cameras and a front-view camera according to their installation positions.
In the step, target images of the surrounding environment of the target vehicle, which are respectively shot by each vehicle-mounted camera, are acquired.
Referring to fig. 2, fig. 2 is an installation schematic diagram of a vehicle-mounted camera according to an embodiment of the present application. As shown in fig. 2, the around-view cameras are installed at positions around the vehicle body and the front-view camera is installed at the vehicle window (windshield) position; the number of around-view cameras may be 4 and the number of front-view cameras may be 1, and the specific arrangement may be set according to the type of the target vehicle.
S102: respectively performing haze recognition on the target images corresponding to each vehicle-mounted camera to obtain a first haze recognition result of the around-view cameras and a second haze recognition result of the front-view camera.
In this step, haze recognition is performed on the target images corresponding to each vehicle-mounted camera to obtain the first haze recognition result of the around-view cameras and the second haze recognition result of the front-view camera.
The first haze identification result and the second haze identification result are haze concentration values of the target image.
Here, since the front-view camera and the around-view cameras capture the surrounding environment from different angles, haze recognition needs to be performed on the target images captured by both the front-view camera and the around-view cameras.
In one possible implementation manner, for any one of the target images, haze recognition is performed on the target image according to the following steps to obtain the haze recognition result of the target image:
a: extracting haze characteristic information from the target image; the haze characteristic information comprises hue characteristic information, saturation characteristic information and brightness characteristic information.
Here, the haze characteristic information is extracted from the target image, and includes hue characteristic information, saturation characteristic information, and brightness characteristic information.
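One simple way to obtain hue, saturation, and brightness characteristic information is to average per-pixel HSV values. This is a sketch using Python's standard `colorsys` module; the patent does not specify the extraction method, so the function name and channel-mean approach are assumptions.

```python
import colorsys

def haze_features(pixels):
    # pixels: iterable of (r, g, b) tuples with channels scaled to [0, 1].
    # Convert each pixel to HSV and average each channel, yielding the
    # image-level hue, saturation, and brightness characteristic information.
    hsv = [colorsys.rgb_to_hsv(r, g, b) for r, g, b in pixels]
    n = len(hsv)
    return tuple(sum(px[i] for px in hsv) / n for i in range(3))

# A uniformly grey (hazy-looking) patch: zero saturation, mid brightness.
print(haze_features([(0.5, 0.5, 0.5)] * 4))  # (0.0, 0.0, 0.5)
```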
B: and comparing the haze characteristic information of the target image with the haze characteristic information of a plurality of reference images, and determining the reference image matched with the target image.
The haze characteristic information of the target image is compared with the haze characteristic information of the plurality of reference images, and the reference image matched with the target image is determined.
In one possible implementation manner, the comparing the haze characteristic information of the target image with the haze characteristic information of a plurality of reference images, and determining the reference image matched with the target image includes:
a: for each reference image, determining a hue characteristic information difference value according to the hue characteristic information and the reference hue characteristic information of the reference image, determining a saturation characteristic information difference value according to the saturation characteristic information and the reference saturation characteristic information of the reference image, and determining a brightness characteristic information difference value according to the brightness characteristic information and the reference brightness characteristic information of the reference image.
Here, for each reference image, the hue characteristic information difference value is the difference between the hue characteristic information of the target image and the reference hue characteristic information of the reference image, the saturation characteristic information difference value is the difference between the saturation characteristic information of the target image and the reference saturation characteristic information of the reference image, and the brightness characteristic information difference value is the difference between the brightness characteristic information of the target image and the reference brightness characteristic information of the reference image.
b: detecting whether the hue characteristic information difference value, the saturation characteristic information difference value and the brightness characteristic information difference value are all smaller than a preset threshold value.
Here, it is detected whether the hue characteristic information difference value, the saturation characteristic information difference value and the brightness characteristic information difference value are all smaller than the preset threshold value.
The preset threshold value is set in advance based on expert experience.
c: if yes, the reference image is determined to be the reference image matched with the target image.
Here, if all three difference values are smaller than the preset threshold value, the reference image is determined as a reference image matching the target image.
C: determining the reference haze concentration value corresponding to the matched reference image as the haze recognition result of the target image.
Here, the reference haze concentration value corresponding to the matched reference image is determined as the haze recognition result of the target image.
That is, the haze recognition result of the target image is the reference haze concentration value of the matched reference image; in other words, the haze recognition result is a haze concentration value.
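The feature-comparison procedure of steps A to C can be illustrated with a short sketch. The `HazeFeatures`/`ReferenceImage` structures, the use of a single scalar per channel, and the threshold value are hypothetical simplifications for illustration, not the claimed implementation.

```python
# Illustrative sketch of steps A-C: compare a target image's hue/
# saturation/brightness features against reference images and return the
# matched reference's haze concentration. All names and the threshold
# value are hypothetical assumptions.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class HazeFeatures:
    hue: float
    saturation: float
    brightness: float

@dataclass
class ReferenceImage:
    features: HazeFeatures
    haze_concentration: float  # reference haze concentration value

PRESET_THRESHOLD = 0.1  # assumed expert-set threshold (step b)

def recognize_haze(target: HazeFeatures,
                   references: List[ReferenceImage]) -> Optional[float]:
    """Return the haze concentration of a matching reference image, or None."""
    for ref in references:
        hue_diff = abs(target.hue - ref.features.hue)
        sat_diff = abs(target.saturation - ref.features.saturation)
        bri_diff = abs(target.brightness - ref.features.brightness)
        # Steps b/c: all three differences must fall below the preset threshold
        if max(hue_diff, sat_diff, bri_diff) < PRESET_THRESHOLD:
            return ref.haze_concentration  # step C: matched reference's value
    return None  # no reference matched
```

With a single reference whose concentration is 2.5, a target image whose features differ by less than the threshold matches and yields 2.5, while a clearly different image yields no match.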
S103: and determining whether haze exists in the surrounding environment where the target vehicle is located or not based on the first haze identification result and the second haze identification result, and determining the haze level when the haze exists.
In the step, whether haze exists in the surrounding environment where the target vehicle is located or not is determined according to the first haze identification result and the second haze identification result, and the haze level is determined when the haze exists.
The haze levels include light haze, moderate haze, heavy haze, and group fog.
In one possible implementation manner, the determining, based on the first haze recognition result and the second haze recognition result, whether the surrounding environment where the target vehicle is located has haze, and determining the haze level when haze exists, includes:
(1): judging whether the first haze recognition result of the surround-view camera is haze-free.
Here, it is detected whether the first haze recognition result of the surround-view camera is haze-free.
(2): if the first haze recognition result is haze-free, judging whether the second haze recognition result of the front-view camera is haze-free; if the second haze recognition result indicates haze, determining that the surrounding environment where the target vehicle is located has haze and that the haze level is group fog; and if the second haze recognition result is also haze-free, determining that the surrounding environment where the target vehicle is located is haze-free.
Here, if the first haze recognition result is haze-free, whether the second haze recognition result of the front-view camera is haze-free is judged; if the second haze recognition result indicates haze, it is determined that the surrounding environment where the target vehicle is located has haze and the haze level is group fog; and if the second haze recognition result is also haze-free, it is determined that the surrounding environment where the target vehicle is located is haze-free.
In a group-fog scene, only the front-view camera can detect the fog while the plurality of surround-view cameras all report the no-fog category; in this case, the detection result is that haze exists and the haze level is group fog.
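The decision of steps (1) and (2), including the group-fog case just described, can be summarized in a small sketch. Representing a haze-free result as `None` and the returned labels are illustrative assumptions, not the claimed implementation.

```python
# Sketch of the haze-presence decision: group fog is flagged when only
# the front-view camera reports haze while every surround-view camera
# reports the no-haze category. "None" standing for a haze-free result
# and the string labels are illustrative assumptions.
from typing import List, Optional, Tuple

def detect_haze(surround_results: List[Optional[float]],
                front_result: Optional[float]) -> Tuple[str, Optional[str]]:
    surround_has_haze = any(r is not None for r in surround_results)
    if not surround_has_haze:
        if front_result is None:
            return ("no_haze", None)      # both camera types are haze-free
        return ("haze", "group_fog")      # only the front-view camera sees fog
    # The surround-view cameras also see haze: the level is then graded by
    # the maximum haze concentration, as described in the following steps.
    return ("haze", "graded_by_concentration")
```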
In one possible implementation manner, after the judging whether the first haze recognition result of the surround-view camera is haze-free, the vehicle driving assisting method further includes:
if the first haze recognition result indicates haze, determining, based on the haze recognition results of the respective vehicle-mounted cameras, that the surrounding environment where the target vehicle is located has haze, and determining the haze level.
Here, if the first haze recognition result indicates haze, it is determined according to the haze recognition results of the respective vehicle-mounted cameras that haze exists in the surrounding environment where the target vehicle is located, and the haze level is determined.
The haze recognition result is a haze concentration value.
In a possible implementation manner, the determining that the surrounding environment where the target vehicle is located has haze based on the haze recognition results of the respective vehicle-mounted cameras, and determining the haze level, includes:
i: and sorting the haze recognition results of the plurality of target images, and screening out the largest haze recognition result from the sorted haze recognition results.
Here, the haze recognition results of the plurality of target images are ranked, and the largest haze recognition result is selected from the ranked plurality of haze recognition results.
II: and detecting a haze concentration value interval in which the maximum haze identification result is located.
III: if the maximum haze recognition result is in the first interval, the haze level of the surrounding environment where the target vehicle is located is light haze; if the maximum haze recognition result is in the second interval, the haze level is moderate haze; and if the maximum haze recognition result is in the third interval, the haze level is heavy haze.
Here, if the maximum haze recognition result is in the first interval, the haze level of the surrounding environment where the target vehicle is located is light haze; if it is in the second interval, the haze level is moderate haze; and if it is in the third interval, the haze level is heavy haze.
For light haze, moderate haze and heavy haze scenes, the haze needs to be detected not only by the front-view camera but also by the surround-view cameras. To ensure the accuracy of the haze level determination and the personal safety of drivers, the haze level is determined from the haze concentration value interval in which the maximum haze recognition result falls.
In a specific embodiment, the haze recognition results of the plurality of target images are sorted, the largest haze recognition result, 4, is screened out from the sorted haze recognition results, and the haze concentration value interval in which this maximum result falls is detected. If the first interval [0, 1) corresponds to light haze, the second interval [1, 3) corresponds to moderate haze, and the third interval [3, 4] corresponds to heavy haze, then, since the maximum haze recognition result 4 falls in the third interval, the haze level is determined to be heavy haze.
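Grading by the maximum haze recognition result can be sketched as follows, assuming contiguous illustrative intervals — [0, 1) for light, [1, 3) for moderate, and 3 or above for heavy haze. The interval boundaries are calibration choices made for illustration, not prescribed values.

```python
# Sketch of steps I-III: take the largest haze concentration among all
# target images and map it onto concentration intervals. The boundaries
# 1 and 3 are illustrative assumptions.
from typing import List

def haze_level_from_results(results: List[float]) -> str:
    max_result = max(results)   # step I: largest recognition result
    if max_result < 1:          # first interval: light haze
        return "light"
    if max_result < 3:          # second interval: moderate haze
        return "moderate"
    return "heavy"              # third interval: heavy haze
```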
S104: and when the haze exists in the surrounding environment where the target vehicle is located, controlling a target car lamp of the target vehicle based on the haze level of the surrounding environment where the target vehicle is located.
In the step, when the fact that haze exists in the surrounding environment where the target vehicle is located is determined, the target car lamp of the target vehicle is controlled according to the haze level of the surrounding environment where the target vehicle is located.
In one possible embodiment, the target vehicle lamp includes a vehicle-mounted fog lamp and a vehicle-mounted double-flash lamp; the controlling, when it is determined that the surrounding environment where the target vehicle is located has haze, the target vehicle lamp of the target vehicle based on the haze level of the surrounding environment includes:
i: if the haze level of the surrounding environment where the target vehicle is located is light haze or group fog, controlling the vehicle-mounted fog lamp to be in an irradiation state.
Here, if the haze level of the surrounding environment where the target vehicle is located is light haze or group fog, the vehicle-mounted fog lamp is controlled to be in the irradiation state.
ii: if the haze level of the surrounding environment where the target vehicle is located is moderate haze or heavy haze, controlling the vehicle-mounted fog lamp and the vehicle-mounted double-flash lamp to be in an irradiation state.
Here, if the haze level of the surrounding environment where the target vehicle is located is moderate haze or heavy haze, the vehicle-mounted fog lamp and the vehicle-mounted double-flash lamp are controlled to be in the irradiation state.
The automatic driving controller sends signals to the light controller through the LIN bus to perform the corresponding light operation. For haze-free scenes, no light operation is needed. For group fog and light haze scenes, the vehicle-mounted fog lamp is turned on; for moderate and heavy haze scenes, the vehicle-mounted fog lamp and the vehicle-mounted double-flash lamp are turned on.
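The lamp-control rule of S104 reduces to a simple mapping. The level labels and command names below are hypothetical, and the actual LIN-bus signalling to the light controller is abstracted away.

```python
# Sketch of S104's lamp rule: fog lamp for light haze or group fog, fog
# lamp plus double-flash lamp for moderate or heavy haze, and no light
# operation otherwise. Labels and command names are illustrative.
def lamp_commands(haze_level: str) -> set:
    if haze_level in ("light", "group_fog"):
        return {"fog_lamp_on"}
    if haze_level in ("moderate", "heavy"):
        return {"fog_lamp_on", "double_flash_on"}
    return set()  # haze-free: no light operation needed
```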
Further, referring to fig. 3, fig. 3 is a second flowchart of a vehicle driving assisting method according to an embodiment of the present application. As shown in fig. 3, S301: detecting whether the first haze recognition result of the surround-view camera is haze-free; S302: if yes, judging whether the second haze recognition result of the front-view camera is haze-free; S303: if the second haze recognition result is haze-free, determining that the surrounding environment is haze-free; S304: if the second haze recognition result indicates haze, determining that haze exists and the haze level is group fog; S305: if the first haze recognition result indicates haze, determining, according to the haze recognition results of the plurality of vehicle-mounted cameras, that haze exists in the surrounding environment where the target vehicle is located, and determining the haze level.
The embodiment of the application provides a vehicle driving assisting method, which includes: acquiring target images of the surrounding environment of a target vehicle respectively captured by each vehicle-mounted camera, where the vehicle-mounted cameras are divided into surround-view cameras and a front-view camera according to installation position; performing haze recognition on the target images corresponding to the respective vehicle-mounted cameras to obtain a first haze recognition result of the surround-view camera and a second haze recognition result of the front-view camera; determining, based on the first haze recognition result and the second haze recognition result, whether haze exists in the surrounding environment where the target vehicle is located, and determining a haze level when haze exists; and, when haze exists in the surrounding environment where the target vehicle is located, controlling a target vehicle lamp of the target vehicle based on the haze level of the surrounding environment. By performing haze recognition on the target images corresponding to both the front-view camera and the surround-view cameras, the accuracy of haze recognition is improved, automatic control of the target vehicle lamp is realized according to the recognition result, and driving safety can be improved.
Referring to fig. 4, fig. 4 is a schematic structural diagram of a vehicle driving assisting device according to an embodiment of the present application. As shown in fig. 4, the vehicle driving assisting device 400 includes:
the acquiring module 410, configured to acquire target images of the surrounding environment of a target vehicle respectively captured by each vehicle-mounted camera, where the vehicle-mounted cameras are divided into surround-view cameras and a front-view camera according to installation position;
the haze recognition module 420, configured to perform haze recognition on the target images corresponding to the respective vehicle-mounted cameras to obtain a first haze recognition result of the surround-view camera and a second haze recognition result of the front-view camera;
the haze level determining module 430, configured to determine, based on the first haze recognition result and the second haze recognition result, whether the surrounding environment where the target vehicle is located has haze, and to determine a haze level when haze exists;
and the lamp control module 440, configured to control, when it is determined that the surrounding environment where the target vehicle is located has haze, a target vehicle lamp of the target vehicle based on the haze level of the surrounding environment.
In one possible implementation manner, when determining, based on the first haze recognition result and the second haze recognition result, whether the surrounding environment where the target vehicle is located has haze and determining the haze level when haze exists, the haze level determining module 430 is specifically configured to:
judge whether the first haze recognition result of the surround-view camera is haze-free;
if the first haze recognition result is haze-free, judge whether the second haze recognition result of the front-view camera is haze-free; if the second haze recognition result indicates haze, determine that the surrounding environment where the target vehicle is located has haze and that the haze level is group fog; and if the second haze recognition result is also haze-free, determine that the surrounding environment where the target vehicle is located is haze-free.
In one possible implementation, the haze level determination module 430 is further configured to:
if the first haze recognition result indicates haze, determine, based on the haze recognition results of the respective vehicle-mounted cameras, that the surrounding environment where the target vehicle is located has haze, and determine the haze level.
In one possible implementation manner, when determining, based on the haze recognition results of the respective vehicle-mounted cameras, that the surrounding environment where the target vehicle is located has haze and determining the haze level, the haze level determining module 430 is specifically configured to:
sorting the haze recognition results of the plurality of target images, and screening out the largest haze recognition result from the sorted haze recognition results;
Detecting a haze concentration value interval in which the maximum haze identification result is located;
if the maximum haze identification result is in the first interval, the haze level of the surrounding environment where the target vehicle is located is light haze;
if the maximum haze recognition result is in the second interval, the haze level of the surrounding environment where the target vehicle is located is moderate haze;
if the maximum haze identification result is in the third interval, the haze level of the surrounding environment where the target vehicle is located is heavy haze;
the haze concentration value in the first interval is smaller than the haze concentration value in the second interval, and the haze concentration value in the second interval is smaller than the haze concentration value in the third interval.
In one possible implementation, the target vehicle lamp includes a vehicle-mounted fog lamp and a vehicle-mounted double-flash lamp; when controlling, upon determining that the surrounding environment where the target vehicle is located has haze, the target vehicle lamp based on the haze level of that surrounding environment, the lamp control module 440 is specifically configured to:
if the haze level of the surrounding environment where the target vehicle is located is light haze or group fog, control the vehicle-mounted fog lamp to be in an irradiation state;
and if the haze level of the surrounding environment where the target vehicle is located is moderate haze or heavy haze, control the vehicle-mounted fog lamp and the vehicle-mounted double-flash lamp to be in an irradiation state.
In one possible implementation manner, when performing haze recognition on any one of the target images according to the following steps to obtain the haze recognition result of that target image, the haze recognition module 420 is specifically configured to:
extracting haze characteristic information from the target image; the haze characteristic information comprises hue characteristic information, saturation characteristic information and brightness characteristic information;
comparing the haze characteristic information of the target image with haze characteristic information of a plurality of reference images, and determining a reference image matched with the target image;
and determine the reference haze concentration value corresponding to the matched reference image as the haze recognition result of the target image.
In one possible implementation manner, when comparing the haze characteristic information of the target image with the haze characteristic information of a plurality of reference images to determine the reference image matching the target image, the haze recognition module 420 is specifically configured to:
for each reference image, determine a hue characteristic information difference value according to the hue characteristic information and the reference hue characteristic information of the reference image, determine a saturation characteristic information difference value according to the saturation characteristic information and the reference saturation characteristic information of the reference image, and determine a brightness characteristic information difference value according to the brightness characteristic information and the reference brightness characteristic information of the reference image;
detect whether the hue characteristic information difference value, the saturation characteristic information difference value and the brightness characteristic information difference value are all smaller than a preset threshold value;
if yes, determine the reference image as the reference image matching the target image.
The embodiment of the application provides a vehicle driving assisting device, which includes: the acquiring module, configured to acquire target images of the surrounding environment of a target vehicle respectively captured by each vehicle-mounted camera, where the vehicle-mounted cameras are divided into surround-view cameras and a front-view camera according to installation position; the haze recognition module, configured to perform haze recognition on the target images corresponding to the respective vehicle-mounted cameras to obtain a first haze recognition result of the surround-view camera and a second haze recognition result of the front-view camera; the haze level determining module, configured to determine, based on the first haze recognition result and the second haze recognition result, whether haze exists in the surrounding environment where the target vehicle is located, and to determine a haze level when haze exists; and the lamp control module, configured to control, when it is determined that the surrounding environment where the target vehicle is located has haze, a target vehicle lamp of the target vehicle based on the haze level of the surrounding environment. By performing haze recognition on the target images corresponding to both the front-view camera and the surround-view cameras, the accuracy of haze recognition is improved, automatic control of the target vehicle lamp is realized according to the recognition result, and driving safety can be improved.
Referring to fig. 5, fig. 5 is a schematic structural diagram of a vehicle according to an embodiment of the present application. As shown in fig. 5, the vehicle 500 includes a processor 510, a memory 520, and a bus 530.
The memory 520 stores machine-readable instructions executable by the processor 510. When the vehicle 500 runs, the processor 510 communicates with the memory 520 through the bus 530; when the machine-readable instructions are executed by the processor 510, the steps of the vehicle driving assisting method in the method embodiment shown in fig. 1 can be performed. For a specific implementation, reference may be made to the method embodiment, which is not repeated here.
The embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the vehicle driving assisting method in the method embodiment shown in fig. 1 can be performed. For a specific implementation, reference may be made to the method embodiment, which is not repeated here.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. The above-described apparatus embodiments are merely illustrative, for example, the division of the units is merely a logical function division, and there may be other manners of division in actual implementation, and for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some communication interface, device or unit indirect coupling or communication connection, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer readable storage medium executable by a processor. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
Finally, it should be noted that: the foregoing examples are merely specific embodiments of the present application, and are not intended to limit the scope of the present application, but the present application is not limited thereto, and those skilled in the art will appreciate that while the foregoing examples are described in detail, the present application is not limited thereto. Any person skilled in the art may modify or easily conceive of the technical solution described in the foregoing embodiments, or make equivalent substitutions for some of the technical features within the technical scope of the disclosure of the present application; such modifications, changes or substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (10)
1. A vehicle driving assisting method, characterized by comprising:
acquiring target images of the surrounding environment of a target vehicle, which are respectively captured by each vehicle-mounted camera; the vehicle-mounted cameras are divided into a surround-view camera and a front-view camera;
respectively performing haze recognition on the target images corresponding to the vehicle-mounted cameras to obtain a first haze recognition result of the surround-view camera and a second haze recognition result of the front-view camera;
determining whether haze exists in the surrounding environment where the target vehicle is located or not based on the first haze identification result and the second haze identification result, and determining a haze level when the haze exists;
and when the haze exists in the surrounding environment where the target vehicle is located, controlling a target car lamp of the target vehicle based on the haze level of the surrounding environment where the target vehicle is located.
2. The vehicle driving assisting method according to claim 1, wherein the determining whether the surrounding environment in which the target vehicle is located has haze based on the first haze recognition result and the second haze recognition result, and determining the haze level when haze exists, comprises:
judging whether the first haze identification result of the surround-view camera is haze-free;
if the first haze identification result is haze-free, judging whether the second haze identification result of the front-view camera is haze-free; if the second haze identification result indicates haze, determining that the surrounding environment where the target vehicle is located has haze and that the haze level is group fog; and if the second haze identification result is also haze-free, determining that the surrounding environment where the target vehicle is located is haze-free.
3. The vehicle driving assisting method according to claim 2, characterized in that after the judging whether the first haze identification result of the surround-view camera is haze-free, the vehicle driving assisting method further comprises:
if the first haze identification result indicates haze, determining, based on the haze identification results of the respective vehicle-mounted cameras, that the surrounding environment where the target vehicle is located has haze, and determining the haze level.
4. The vehicle driving assisting method according to claim 3, wherein the determining that the surrounding environment in which the target vehicle is located has haze based on the haze identification results of the respective vehicle-mounted cameras, and determining the haze level, comprises:
Sorting the haze recognition results of the plurality of target images, and screening out the largest haze recognition result from the sorted haze recognition results;
detecting a haze concentration value interval in which the maximum haze identification result is located;
if the maximum haze identification result is in the first interval, the haze level of the surrounding environment where the target vehicle is located is light haze;
if the maximum haze identification result is in the second interval, the haze level of the surrounding environment where the target vehicle is located is moderate haze;
if the maximum haze identification result is in the third interval, the haze level of the surrounding environment where the target vehicle is located is heavy haze;
the haze concentration value in the first interval is smaller than the haze concentration value in the second interval, and the haze concentration value in the second interval is smaller than the haze concentration value in the third interval.
5. The vehicle driving assisting method according to claim 1, characterized in that the target vehicle lamp comprises a vehicle-mounted fog lamp and a vehicle-mounted double-flash lamp; and the controlling, when it is determined that the surrounding environment where the target vehicle is located has haze, a target lamp of the target vehicle based on the haze level of the surrounding environment where the target vehicle is located comprises:
if the haze level of the surrounding environment where the target vehicle is located is light haze or group fog, controlling the vehicle-mounted fog lamp to be in an irradiation state;
and if the haze level of the surrounding environment where the target vehicle is located is moderate haze or heavy haze, controlling the vehicle-mounted fog lamp and the vehicle-mounted double-flash lamp to be in an irradiation state.
6. The vehicle driving assisting method according to claim 1, wherein, for any one of the target images, the performing haze recognition on the target image according to the following steps to obtain a haze recognition result of the target image comprises:
extracting haze characteristic information from the target image; the haze characteristic information comprises hue characteristic information, saturation characteristic information and brightness characteristic information;
comparing the haze characteristic information of the target image with haze characteristic information of a plurality of reference images, and determining a reference image matched with the target image;
and determining the reference haze concentration value corresponding to the matched reference image as the haze recognition result of the target image.
7. The vehicle driving assisting method according to claim 6, wherein the comparing the haze characteristic information of the target image with the haze characteristic information of a plurality of reference images and determining the reference image matching the target image comprises:
for each reference image, determining a hue characteristic information difference value according to the hue characteristic information and the reference hue characteristic information of the reference image, determining a saturation characteristic information difference value according to the saturation characteristic information and the reference saturation characteristic information of the reference image, and determining a brightness characteristic information difference value according to the brightness characteristic information and the reference brightness characteristic information of the reference image;
detecting whether the hue characteristic information difference value, the saturation characteristic information difference value and the brightness characteristic information difference value are each smaller than a corresponding preset threshold value;
and if so, determining that reference image as the reference image matching the target image.
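The threshold comparison of claim 7 can be illustrated as below. The concrete threshold values, the first-match policy, and the `(features, concentration)` pairing of the reference data are assumptions for the sketch, not fixed by the claim.

```python
def match_reference(target_feat, references, thresholds=(0.05, 0.05, 0.05)):
    """Find a matching reference image by per-channel feature differences.

    target_feat: (hue, saturation, brightness) of the target image.
    references:  list of ((hue, sat, brightness), haze_concentration) pairs.
    thresholds:  preset per-channel difference thresholds (assumed values).

    Returns the haze concentration of the first reference whose hue,
    saturation and brightness differences are all below the thresholds,
    or None when no reference matches.
    """
    for ref_feat, concentration in references:
        diffs = [abs(t - r) for t, r in zip(target_feat, ref_feat)]
        if all(d < th for d, th in zip(diffs, thresholds)):
            return concentration
    return None
```

A nearest-neighbor search over the combined differences would be a natural alternative when several references pass the thresholds.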
8. A vehicle driving assisting device, characterized by comprising:
an acquisition module, configured to acquire target images of the surrounding environment of the target vehicle respectively captured by each vehicle-mounted camera, the vehicle-mounted cameras being divided into surround-view cameras and front-view cameras according to their installation positions;
a haze recognition module, configured to perform haze recognition on the target image corresponding to each vehicle-mounted camera, to obtain a first haze recognition result for the surround-view cameras and a second haze recognition result for the front-view cameras;
a haze level determining module, configured to determine, based on the first haze recognition result and the second haze recognition result, whether haze exists in the surrounding environment where the target vehicle is located, and to determine the haze level when haze exists;
and a vehicle lamp control module, configured to, when it is determined that haze exists in the surrounding environment where the target vehicle is located, control the target lamp of the target vehicle based on the haze level of the surrounding environment where the target vehicle is located.
9. A vehicle, characterized by comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor; when the vehicle is running, the processor and the memory communicate via the bus, and the machine-readable instructions, when executed by the processor, perform the steps of the vehicle driving assisting method according to any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, performs the steps of the vehicle driving assisting method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310506012.0A CN116513054A (en) | 2023-05-06 | 2023-05-06 | Vehicle driving assisting method and device, vehicle and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116513054A true CN116513054A (en) | 2023-08-01 |
Family ID: 87399066
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310506012.0A Pending CN116513054A (en) | 2023-05-06 | 2023-05-06 | Vehicle driving assisting method and device, vehicle and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116513054A (en) |
2023-05-06: Application CN202310506012.0A filed in CN; publication CN116513054A, status Pending.
Similar Documents
Publication | Title
---|---
US10152640B2 (en) | System and method for verification of lamp operation
JP5100159B2 (en) | Automatic switching control method for in-vehicle projector
US7952490B2 (en) | Method for identifying the activation of the brake lights of preceding vehicles
CN108357418B (en) | Preceding vehicle driving intention analysis method based on tail lamp identification
CN111815959B (en) | Vehicle violation detection method and device and computer readable storage medium
US20190164267A1 (en) | Failed vehicle estimation system, failed vehicle estimation method and computer-readable non-transitory storage medium
JP2002083297A (en) | Object recognition method and object recognition device
CN102806867A (en) | Image processing device and light distribution control method
CN106183981B (en) | Automobile-based obstacle detection method and apparatus, and automobile
JP2017017635A (en) | Failure diagnosis apparatus
CN108482367A (en) | Method, apparatus and system for assisted driving based on an intelligent rearview mirror
CN108407806A (en) | Auxiliary driving method and device
KR101818542B1 (en) | System for improving the reliability of traffic lane recognition and method thereof
CN111505617B (en) | Vehicle positioning method, device, equipment and storage medium
CN111967384A (en) | Vehicle information processing method, device, equipment and computer readable storage medium
CN110386088A (en) | System and method for executing vehicle variance analysis
CN113408364B (en) | Temporary license plate recognition method, system, device and storage medium
CN116513054A (en) | Vehicle driving assisting method and device, vehicle and storage medium
CN109895694B (en) | Lane departure early warning method and device and vehicle
CN113581196B (en) | Method and device for early warning of vehicle running, computer equipment and storage medium
CN115082894A (en) | Distance detection method, vehicle high beam control method, device, medium and vehicle
CN113370991A (en) | Driving assistance method, device, equipment, storage medium and computer program product
CN116438584A (en) | Image processing method
CN112528923A (en) | Video analysis method and device, electronic equipment and storage medium
CN112634624A (en) | Bus standard stop detection method and system based on intelligent video analysis
Legal Events
Code | Title
---|---
PB01 | Publication
SE01 | Entry into force of request for substantive examination