CN116080644A - Vehicle running control method and device and electronic equipment - Google Patents

Vehicle running control method and device and electronic equipment

Info

Publication number
CN116080644A
Authority
CN
China
Prior art keywords
coordinate point
point set
vehicle
lane line
control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111293446.4A
Other languages
Chinese (zh)
Inventor
李小波
张青山
党诗芽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Mobile Communications Group Co Ltd
China Mobile Shanghai ICT Co Ltd
CM Intelligent Mobility Network Co Ltd
Original Assignee
China Mobile Communications Group Co Ltd
China Mobile Shanghai ICT Co Ltd
CM Intelligent Mobility Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Mobile Communications Group Co Ltd, China Mobile Shanghai ICT Co Ltd, CM Intelligent Mobility Network Co Ltd filed Critical China Mobile Communications Group Co Ltd
Priority to CN202111293446.4A priority Critical patent/CN116080644A/en
Publication of CN116080644A publication Critical patent/CN116080644A/en
Pending legal-status Critical Current


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/10 Path keeping
    • B60W30/12 Lane keeping
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformation in the plane of the image
    • G06T3/40 Scaling the whole image or part thereof
    • G06T3/4007 Interpolation-based scaling, e.g. bilinear interpolation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/12 Edge-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40 Photo or light sensitive means, e.g. infrared sensors
    • B60W2420/403 Image sensing, e.g. optical camera
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2552/00 Input parameters relating to infrastructure
    • B60W2552/53 Road markings, e.g. lane marker or crosswalk
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256 Lane; Road marking
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Abstract

The invention provides a vehicle running control method and device and electronic equipment. The vehicle running control method comprises the following steps: acquiring a target image shot by a monocular camera of a vehicle; performing Gaussian filtering on the grayscale image corresponding to the target image, and obtaining a lane line edge coordinate point set corresponding to a lane line in the target image through an edge detection algorithm and a region extraction algorithm; determining a reference coordinate point set according to the lane line edge coordinate point set; and controlling the running of the vehicle according to the reference coordinate point set. In this scheme, Gaussian filtering is applied to the grayscale image corresponding to the target image, the lane line edge coordinate point set is obtained through the edge detection algorithm and the region extraction algorithm, and the reference coordinate point set is determined from the lane line edge coordinate point set. The step of forming a vehicle control track curve by linear fitting is thus bypassed: the vehicle is controlled directly from the reference coordinate point set, which solves the problem of low control precision that arises when the vehicle is controlled according to a fitted control track curve.

Description

Vehicle running control method and device and electronic equipment
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a vehicle running control method and device, and an electronic device.
Background
Among the driver assistance functions of a vehicle, lane keeping is further divided into ordinary lane keeping and lane centering. Both functions are realized by recognizing the lane lines ahead of the vehicle with a front-view camera and calculating, from the lane lines on both sides, a reference line that indicates whether the vehicle is within the lane or centered in it, which is then used to control the vehicle.
In existing lane centering control schemes, the lane line information consists of all the lane line positions extracted from the visual image. This information is converted into an equation in the vehicle coordinate system or the camera coordinate system, and a control track for the vehicle is finally obtained either by taking the center curve between the two side lines or by offsetting from a single side line; the vehicle is then controlled along that track.
However, when a single vehicle-mounted camera is used for lane keeping control in the existing schemes, the accuracy of the vehicle control track curve is limited, because the positioning accuracy of a monocular camera with respect to the lane lines generally reaches only the sub-meter level, while improving the positioning accuracy with a binocular camera increases the cost of the whole vehicle.
Disclosure of Invention
The embodiments of the present invention provide a vehicle running control method, a vehicle running control device and electronic equipment, to solve the problem in the prior art that control precision is low when a vehicle is controlled by means of a vehicle control track curve formed by linear fitting from a monocular camera.
In order to solve the technical problems, the embodiment of the invention provides the following technical scheme:
the embodiment of the invention provides a vehicle running control method, which comprises the following steps:
acquiring a target image shot by a monocular camera of a vehicle;
carrying out Gaussian filtering treatment on the gray level image corresponding to the target image, and obtaining a lane line edge coordinate point set corresponding to a lane line in the target image through an edge detection algorithm and a region extraction algorithm;
determining a reference coordinate point set according to the lane line edge coordinate point set;
and controlling the running of the vehicle according to the reference coordinate point set.
Optionally, performing Gaussian filtering processing on a gray level image corresponding to the target image, and obtaining a lane line edge coordinate point set corresponding to a lane line in the target image through an edge detection algorithm and a region extraction algorithm, where the step includes:
acquiring a gray image corresponding to the target image based on RGB information;
carrying out Gaussian filtering treatment on the gray level image to obtain a treated gray level image;
processing the processed gray level image by using an edge detection algorithm and an extraction algorithm to obtain a lane line edge coordinate point set corresponding to a lane line in the target image;
the lane line edge coordinate point set is a CMOS array coordinate point set in the monocular camera.
Optionally, determining a reference coordinate point set according to the lane line edge coordinate point set includes:
carrying out coordinate repair on the lane line edge coordinate point set by utilizing an interpolation mode to obtain a repaired lane line edge coordinate point set;
and determining a reference coordinate point set according to the left lane line coordinate point set and the right lane line coordinate point set in the repaired lane line edge coordinate point set.
Optionally, determining the reference coordinate point set according to the left lane line coordinate point set and the right lane line coordinate point set in the repaired lane line edge coordinate point set includes:
clustering and averaging the abscissas of the left lane line coordinate points and of the right lane line coordinate points of the target row in the lane line edge coordinate point set, to obtain a left reference point coordinate and a right reference point coordinate of the target row;
determining the reference coordinates of the target row according to the left reference point coordinates and the right reference point coordinates of the target row;
and arranging the reference coordinates of all the target rows in the lane line edge coordinate point set according to the row sequence to obtain the reference coordinate point set.
Optionally, controlling the vehicle running according to the reference coordinate point set includes:
calibrating a control point corresponding to a longitudinal centerline of the vehicle in the monocular camera;
controlling the running of the vehicle according to the control points, the target reference coordinate point set and a preset control mode;
the target reference coordinate point set is a corresponding reference coordinate point set in a preset duration of the current time.
Optionally, the control points include a first control point and a second control point;
according to the control point, the target reference coordinate point set and a preset control mode, the vehicle running is controlled, and the method comprises the following steps:
controlling the running of the vehicle according to the position control point, the vehicle yaw angle, the target reference coordinate point set and a preset control mode;
the position control point is the one of the first control point and the second control point that has the larger row coordinate;
the vehicle yaw angle is determined from the first control point and the second control point.
Optionally, calibrating a control point corresponding to a longitudinal centerline of the vehicle in the monocular camera includes:
calibrating column coordinates of control points corresponding to a longitudinal center line of the vehicle in the monocular camera in a static calibration mode;
and/or,
and calibrating row coordinates of a control point corresponding to the longitudinal center line of the vehicle in the monocular camera in a dynamic calibration mode.
The embodiment of the invention also provides a vehicle running control device, which comprises:
the acquisition module is used for acquiring a target image shot by a monocular camera of the vehicle;
the processing module is used for carrying out Gaussian filtering processing on the gray level image corresponding to the target image, and obtaining a lane line edge coordinate point set corresponding to a lane line in the target image through an edge detection algorithm and a region extraction algorithm;
the determining module is used for determining a reference coordinate point set according to the lane line edge coordinate point set;
and the control module is used for controlling the running of the vehicle according to the reference coordinate point set.
Optionally, the processing module includes:
an acquisition unit, configured to acquire a gray image corresponding to the target image based on RGB information;
the first processing unit is used for carrying out Gaussian filtering processing on the gray level image to obtain a processed gray level image;
the second processing unit is used for processing the processed gray level image by utilizing an edge detection algorithm and an extraction algorithm to obtain a lane line edge coordinate point set corresponding to a lane line in the target image;
the lane line edge coordinate point set is a CMOS array coordinate point set in the monocular camera.
Optionally, the determining module includes:
the repair unit is used for carrying out coordinate repair on the lane line edge coordinate point set by utilizing an interpolation mode to obtain a repaired lane line edge coordinate point set;
the first determining unit is used for determining a reference coordinate point set according to the left lane line coordinate point set and the right lane line coordinate point set in the repaired lane line edge coordinate point set.
Optionally, the first determining unit is specifically configured to:
clustering and averaging the abscissas of the left lane line coordinate points and of the right lane line coordinate points of the target row in the lane line edge coordinate point set, to obtain a left reference point coordinate and a right reference point coordinate of the target row;
determining the reference coordinates of the target row according to the left reference point coordinates and the right reference point coordinates of the target row;
and arranging the reference coordinates of all the target rows in the lane line edge coordinate point set according to the row sequence to obtain the reference coordinate point set.
Optionally, the control module includes:
a calibration unit for calibrating a control point corresponding to a longitudinal center line of the vehicle in the monocular camera;
the control unit is used for controlling the running of the vehicle according to the control points, the target reference coordinate point set and a preset control mode;
the target reference coordinate point set is a corresponding reference coordinate point set in a preset duration of the current time.
Optionally, the control points include a first control point and a second control point;
the control unit is specifically configured to:
controlling the running of the vehicle according to the position control point, the vehicle yaw angle, the target reference coordinate point set and a preset control mode;
the position control point is the one of the first control point and the second control point that has the larger row coordinate;
the vehicle yaw angle is determined from the first control point and the second control point.
Optionally, the calibration unit is specifically configured to:
calibrate column coordinates of a control point corresponding to the longitudinal center line of the vehicle in the monocular camera in a static calibration mode;
and/or,
calibrate row coordinates of a control point corresponding to the longitudinal center line of the vehicle in the monocular camera in a dynamic calibration mode.
The embodiment of the invention also provides electronic equipment, which comprises: a processor, a memory, and a program stored on the memory and executable on the processor, which when executed by the processor, implements the steps of the vehicle travel control method as set forth in any one of the above.
The embodiment of the present invention also provides a readable storage medium having a program stored thereon, which when executed by a processor, implements the steps in the vehicle running control method as described in any one of the above.
The beneficial effects of the invention are as follows:
according to the scheme, through Gaussian filtering processing is carried out on the gray level image corresponding to the target image shot by the obtained monocular camera, the lane line edge coordinate point set corresponding to the lane line in the target image is obtained through the edge detection algorithm and the region extraction algorithm, the reference coordinate point set is determined according to the lane line edge coordinate point set, the process of forming the vehicle control track curve through linear fitting is bypassed, vehicle running is controlled through the reference coordinate point set, and the problem of low control precision caused by vehicle control according to the vehicle control track curve is solved.
Drawings
FIG. 1 shows one of the flowcharts of a vehicle travel control method provided by an embodiment of the present invention;
FIG. 2 is a flow chart showing a second method for controlling vehicle driving according to an embodiment of the present invention;
fig. 3 is a schematic diagram showing a structure of a vehicle running control apparatus according to an embodiment of the present invention;
fig. 4 shows a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The present invention will be described in detail below with reference to the drawings and the specific embodiments thereof in order to make the objects, technical solutions and advantages of the present invention more apparent.
To address the problem in the prior art that control precision is low when a vehicle is controlled by a vehicle control track curve formed by linear fitting from a monocular camera, the present invention provides a vehicle running control method, a vehicle running control device and electronic equipment.
As shown in fig. 1, an embodiment of the present invention provides a vehicle running control method, including:
step 101: a target image taken by a monocular camera of the vehicle is acquired.
Preferably, the monocular camera is a front-view camera provided at a front end of the vehicle.
Specifically, the target image may be extracted from a video stream photographed by a monocular camera.
Step 102: and carrying out Gaussian filtering processing on the gray level image corresponding to the target image, and obtaining a lane line edge coordinate point set corresponding to a lane line in the target image through an edge detection algorithm and a region extraction algorithm.
The process of obtaining the lane line edge coordinate point set in the target image is as follows: the target image is converted to grayscale to obtain a grayscale image, and the grayscale image is processed by an edge detection algorithm and a region extraction algorithm to obtain the lane line edge coordinate point set.
Optionally, the edge detection algorithm is the Canny edge detection algorithm, and the region extraction algorithm is a region of interest (ROI) extraction algorithm.
Step 103: and determining a reference coordinate point set according to the lane line edge coordinate point set.
That is, the lane line edge coordinate point set is processed to obtain the reference coordinate point set. By determining the reference coordinate point set directly, this step bypasses the process of forming a vehicle control track curve by linear fitting, which avoids the positioning error introduced after the camera recognizes the lane line as well as the fitting error introduced during curve fitting.
Step 104: and controlling the running of the vehicle according to the reference coordinate point set.
According to the embodiment of the invention, the running of the vehicle is controlled through the reference coordinate point set, which solves the problem of low control precision caused by controlling the vehicle according to a vehicle control track curve.
Optionally, performing Gaussian filtering processing on a gray level image corresponding to the target image, and obtaining a lane line edge coordinate point set corresponding to a lane line in the target image through an edge detection algorithm and a region extraction algorithm, where the step includes:
acquiring a gray image corresponding to the target image based on RGB information;
carrying out Gaussian filtering treatment on the gray level image to obtain a treated gray level image;
processing the processed gray level image by using an edge detection algorithm and an extraction algorithm to obtain a lane line edge coordinate point set corresponding to a lane line in the target image;
the lane line edge coordinate point set is a CMOS array coordinate point set in the monocular camera.
Specifically, the process of obtaining the lane line edge coordinate point set in the target image is as follows: a grayscale image is extracted from the target image based on its red, green and blue (RGB) primary color information, Gaussian filtering is applied to the grayscale image, and the filtered image is processed by the Canny edge detection algorithm and an ROI region extraction algorithm to obtain the lane line edge coordinate point set.
Taking a 1280×720 monocular camera as an example, the obtained lane line edge coordinate point set is a set of coordinate points of the lane lines in the complementary metal oxide semiconductor (CMOS) image sensor array of the monocular camera. The set of all coordinate points of the CMOS image sensor array of a 1280×720 camera is:
{(m, n) | m = 1, 2, …, 720; n = 1, 2, …, 1280}, i.e. every (row, column) pixel coordinate of the sensor array.
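As an illustration only, the image-processing step described above can be prototyped with OpenCV and NumPy. This is a minimal sketch rather than the patented implementation: the kernel size, the Canny thresholds and the trapezoidal ROI polygon are assumptions chosen for a 1280×720 front-view image.

```python
import cv2
import numpy as np

def lane_edge_points(bgr_image):
    """Return the (row, col) coordinates of lane-line edge pixels.

    Sketch of the described pipeline: grayscale conversion, Gaussian
    filtering, Canny edge detection, then a region-of-interest mask.
    Kernel size, thresholds and ROI polygon are illustrative values.
    """
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)    # grayscale image
    blurred = cv2.GaussianBlur(gray, (5, 5), 1.5)         # Gaussian filtering
    edges = cv2.Canny(blurred, 50, 150)                   # Canny edge detection

    # Region-of-interest extraction: keep only a trapezoid ahead of the vehicle.
    h, w = edges.shape
    roi = np.zeros_like(edges)
    polygon = np.array([[(0, h), (w, h), (int(0.6 * w), int(0.55 * h)),
                         (int(0.4 * w), int(0.55 * h))]], dtype=np.int32)
    cv2.fillPoly(roi, polygon, 255)
    masked = cv2.bitwise_and(edges, roi)

    rows, cols = np.nonzero(masked)                       # lane line edge coordinate point set
    return np.stack([rows, cols], axis=1)
```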
Optionally, determining a reference coordinate point set according to the lane line edge coordinate point set includes:
carrying out coordinate repair on the lane line edge coordinate point set by utilizing an interpolation mode to obtain a repaired lane line edge coordinate point set;
and determining a reference coordinate point set according to the left lane line coordinate point set and the right lane line coordinate point set in the repaired lane line edge coordinate point set.
That is, the coordinates in a discontinuous lane line edge coordinate point set are repaired by interpolation, and mean processing is then performed on the coordinate values of the left lane line coordinate point set and the right lane line coordinate point set in the repaired set to obtain the coordinate point set of the reference points (the reference coordinate point set).
The coordinates in the reference coordinate point set are all positive values.
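A minimal sketch of the interpolation-based coordinate repair follows, assuming the lane line edge points have already been split per side and are reduced to one column value per sensor row; rows where a dashed marking leaves no detected edge are filled with numpy.interp. The per-side splitting and the exact interpolation scheme are assumptions, not taken from the patent text.

```python
import numpy as np

def repair_lane_columns(rows, cols, num_rows=720):
    """Fill gaps in one lane line (left or right) by linear interpolation.

    rows, cols: edge-pixel coordinates belonging to a single lane line.
    Returns one column value per sensor row over the detected extent,
    with missing rows filled by numpy.interp; NaN elsewhere.
    """
    rows = np.asarray(rows)
    cols = np.asarray(cols, dtype=float)

    cols_per_row = np.full(num_rows, np.nan)
    for r in np.unique(rows):
        # Average the edge columns detected on this row (mean processing).
        cols_per_row[r] = cols[rows == r].mean()

    known = np.flatnonzero(~np.isnan(cols_per_row))
    if known.size < 2:
        return cols_per_row
    idx = np.arange(known[0], known[-1] + 1)
    # Linear interpolation over the gaps between detected rows.
    cols_per_row[idx] = np.interp(idx, known, cols_per_row[known])
    return cols_per_row
```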
As a preferred embodiment, determining the reference coordinate point set according to the left lane line coordinate point set and the right lane line coordinate point set in the repaired lane line edge coordinate point set includes:
clustering and averaging the abscissa of the left lane line coordinate point and the abscissa of the right lane line coordinate point of the target line in the lane line edge coordinate point set to obtain a left reference point coordinate and a right reference point coordinate of the target line;
determining the reference coordinates of the target row according to the left reference point coordinates and the right reference point coordinates of the target row;
and arranging the reference coordinates of all the target rows in the lane line edge coordinate point set according to the row sequence to obtain the reference coordinate point set.
Taking the 720th row of the CMOS image sensor array as an example of the target row: the abscissas of the left lane line coordinate points and of the right lane line coordinate points of this row are clustered according to a clustering algorithm to obtain the left reference point coordinate and the right reference point coordinate of the row, and the reference coordinate of the row is then obtained from these two coordinates. The reference coordinates of the other rows of the CMOS image sensor array are calculated by repeating the above steps, and the obtained reference coordinates are sorted in row order to obtain the reference coordinate point set.
The reference coordinate point set is the set of coordinates of the center points of the lane, which facilitates the subsequent lane centering control of the vehicle.
Preferably, the clustering algorithm is the Mean Shift clustering algorithm.
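Continuing the sketch above, the per-row reference coordinates can be obtained by pairing the repaired left and right column values and taking their midpoint. The clustering step is replaced here by the per-row averaging already done in repair_lane_columns, so treat this as a simplification of the Mean Shift approach named in the text, not a reproduction of it.

```python
import numpy as np

def reference_points(left_cols, right_cols):
    """Midpoint of the left and right lane columns, row by row.

    left_cols, right_cols: per-row column values returned by
    repair_lane_columns (NaN where a lane line was not seen).
    Returns an array of (row, reference_column) pairs ordered by row.
    """
    left_cols = np.asarray(left_cols, dtype=float)
    right_cols = np.asarray(right_cols, dtype=float)
    valid = ~np.isnan(left_cols) & ~np.isnan(right_cols)
    rows = np.flatnonzero(valid)
    centers = (left_cols[valid] + right_cols[valid]) / 2.0
    return np.stack([rows, centers], axis=1)   # reference coordinate point set
```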
Optionally, controlling the vehicle running according to the reference coordinate point set includes:
calibrating a control point corresponding to a longitudinal centerline of the vehicle in the monocular camera;
controlling the running of the vehicle according to the control points, the target reference coordinate point set and a preset control mode;
the target reference coordinate point set is a corresponding reference coordinate point set in a preset duration of the current time.
Specifically, the process of controlling the running of the vehicle according to the reference coordinate point set is as follows. The coordinates of a control point of the vehicle are identified in the monocular camera; since the monocular camera may not be mounted at the middle of the vehicle, the control point is aligned with the longitudinal centerline of the vehicle by off-line calibration of the vehicle, and the coordinates of the control point may vary with the vehicle speed. The target reference coordinate point set taken from the reference coordinate point set is then used as the set of points for the vehicle to follow, and a preset control mode makes the control point follow the coordinates in the reference coordinate point set, thereby controlling the running of the vehicle in the lane, i.e. achieving centering control of the vehicle in the lane.
The preset control mode may be a Stanley control algorithm, a linear quadratic regulator (LQR) control algorithm, or a model predictive control (MPC) algorithm, and different preset control modes may be selected according to requirements.
Preferably, the target reference coordinate point set may be coordinates in the reference coordinate point set within a preset duration of the current time determined according to the vehicle speed, for example, the preset duration is 1-2s. That is, coordinates in the corresponding reference coordinate point set are selected as the target reference coordinate point set within 1-2s according to the current vehicle speed.
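One way to read this speed-dependent selection is as a look-ahead window in image rows: the faster the vehicle, the further ahead (i.e. the smaller the row index) the reference points that are kept. The sketch below assumes a hypothetical calibrated mapping row_at_distance() from longitudinal distance to sensor row; both that helper and the handling of the 1-2 s horizon are illustrative assumptions.

```python
def target_reference_points(ref_points, speed_mps, horizon_s=1.5,
                            row_at_distance=lambda d: max(0, int(720 - 40 * d))):
    """Keep the reference points the vehicle will reach within the horizon.

    ref_points: (row, column) pairs from reference_points(), ordered by row.
    speed_mps:  current vehicle speed in m/s.
    row_at_distance: hypothetical calibration mapping a longitudinal
    distance in metres to a sensor row (smaller row = further ahead).
    """
    look_ahead_m = speed_mps * horizon_s
    min_row = row_at_distance(look_ahead_m)
    # Rows at or below min_row (closer to the vehicle) fall inside the horizon.
    return [(r, c) for r, c in ref_points if r >= min_row]
```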
As a preferred embodiment, the control points include a first control point and a second control point;
according to the control point, the target reference coordinate point set and a preset control mode, the vehicle running is controlled, and the method comprises the following steps:
controlling the running of the vehicle according to the position control point, the vehicle yaw angle, the target reference coordinate point set and a preset control mode;
the position control point is the one of the first control point and the second control point that has the larger row coordinate;
the vehicle yaw angle is determined from the first control point and the second control point.
Specifically, two control points corresponding to the longitudinal center line of the vehicle are selected as the first control point and the second control point; the one with the larger row coordinate is used as the position control point of the vehicle, and the angle between the line through the two control points and the tangential direction at the central control point of the vehicle is used as the vehicle yaw angle. The central control point of the vehicle may be the center point of the vehicle body.
The position control point, the vehicle yaw angle and the target reference coordinate point set are used as control inputs of the preset control mode, so that the vehicle is driven along the lane center; this further improves the control precision and is better suited to centering control of the vehicle on curves.
The preset control mode may be a Stanley control algorithm, an LQR control algorithm or an MPC algorithm.
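The role of the two control points can be illustrated as follows: the position control point is the one with the larger row coordinate (closer to the vehicle), and the yaw angle is taken from the direction of the segment joining the two control points. The steering law below is a generic Stanley-style sketch working directly in pixel units, with an assumed gain and an assumed pixels-to-metres factor; the patent leaves the choice between Stanley, LQR and MPC open, so this is not the claimed controller.

```python
import math

def stanley_like_steering(first_cp, second_cp, ref_near, ref_far,
                          speed_mps, k=0.8, metres_per_pixel=0.01):
    """Steering command (rad) from the control points and two reference points.

    first_cp, second_cp: (row, col) control points on the vehicle centreline.
    ref_near, ref_far:   two target reference points, nearest first.
    k, metres_per_pixel: assumed tuning gain and pixel scale.
    """
    # Position control point: the one with the larger row coordinate (closer to the vehicle).
    position_cp, far_cp = ((first_cp, second_cp)
                           if first_cp[0] > second_cp[0] else (second_cp, first_cp))

    # Image-space headings of the vehicle centreline and of the reference path
    # (forward = decreasing row index).
    vehicle_heading = math.atan2(far_cp[1] - position_cp[1], position_cp[0] - far_cp[0])
    path_heading = math.atan2(ref_far[1] - ref_near[1], ref_near[0] - ref_far[0])
    heading_error = path_heading - vehicle_heading        # yaw-angle error term

    # Cross-track error: column offset between the nearest reference point
    # and the position control point, converted to metres.
    cross_track_m = (ref_near[1] - position_cp[1]) * metres_per_pixel

    # Stanley-style combination: heading term plus speed-scaled cross-track term.
    return heading_error + math.atan2(k * cross_track_m, max(speed_mps, 0.1))
```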
Optionally, calibrating a control point corresponding to a longitudinal centerline of the vehicle in the monocular camera includes:
calibrating column coordinates of control points corresponding to a longitudinal center line of the vehicle in the monocular camera in a static calibration mode;
and/or,
and calibrating row coordinates of a control point corresponding to the longitudinal center line of the vehicle in the monocular camera in a dynamic calibration mode.
Specifically, the calibration method of calibrating the control point corresponding to the longitudinal center line of the vehicle in the monocular camera is classified into two kinds of static calibration and dynamic calibration. The primary purpose is to determine the relationship of the control point to the longitudinal centerline of the vehicle and to the central control point of the vehicle.
The static calibration mode is mainly used to calibrate the column coordinate of the vehicle control point. The specific process is as follows: the vehicle is placed on a designed calibration fixture, and when the monocular camera is located at the center of the front windshield, the orientation parameters of the camera are adjusted so that the center line of the camera pixel array is aligned with a marking line on the ground that coincides with the center line of the vehicle. If the camera is not at the center of the vehicle, the number of offset columns of the pixel array is calculated from the camera parameters and the offset of the center line, and the center column index of the camera pixels is offset accordingly.
The dynamic calibration mode mainly calibrates the row coordinate of the control point in the pixel array. The row coordinate of the control point changes with the vehicle speed: the faster the speed, the smaller the row coordinate of the control point. The specific values need to be calibrated into a corresponding relation curve according to the monocular camera parameters and the control parameters.
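The speed-to-row relationship from dynamic calibration can be stored as a small lookup table and interpolated at run time. The calibration pairs below are placeholders, not measured values; a real table would come from the calibration drive described above.

```python
import numpy as np

# Hypothetical calibration table: vehicle speed (m/s) -> control point row.
CAL_SPEEDS = np.array([5.0, 10.0, 20.0, 30.0])
CAL_ROWS   = np.array([620.0, 560.0, 480.0, 420.0])

def control_point_row(speed_mps):
    """Row coordinate of the control point for the current speed.

    Linearly interpolates the calibration curve; the faster the vehicle,
    the smaller (further ahead) the returned row, as the text describes.
    """
    return float(np.interp(speed_mps, CAL_SPEEDS, CAL_ROWS))
```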
The flow of the vehicle running control method is specifically described below with reference to fig. 2.
A target image is extracted from the video stream shot by the front-view camera (monocular camera). Image processing is performed on the target image: a grayscale image is extracted based on the RGB information, Gaussian filtering is applied, and the lane line edge coordinate point set is obtained with the Canny edge detection algorithm and the ROI region extraction algorithm. The lane line edge coordinate point set is repaired by interpolation, and the left lane line coordinate point set and the right lane line coordinate point set are extracted from the repaired set. The reference coordinate point set is then calculated: the abscissas of the left and right lane line coordinate points of the same row are clustered and averaged to obtain the left and right reference point coordinates of that row, the reference coordinate of the row is determined from them, and the reference coordinates of the rows are arranged in row order to obtain the reference coordinate point set. The control point of the vehicle is identified in the monocular camera, and the control point together with the target reference coordinate point set taken from the reference coordinate point set is input into a control algorithm, such as the Stanley control algorithm, the LQR control algorithm or the MPC algorithm. The control algorithm converts the control point and the target reference coordinate point set into control information of the vehicle, such as the steering wheel angle and steering rate, and the vehicle is controlled according to this information. During control, the vehicle speed is acquired and fed into the control algorithm so that the algorithm can control the running of the vehicle accurately according to the speed.
According to the embodiment of the invention, the pixel position of the vehicle center line (the control point) and the processed pixel positions of the reference points (the target reference coordinate set) are used as the inputs of the lateral control of the vehicle, and the vehicle control signal is output directly. This effectively eliminates the various errors introduced in the conventional method when a lane line equation (vehicle control track curve) is generated, improves the accuracy of lane centering control on the basis of the original hardware, improves real-time performance, and reduces the dependence on computing resources.
It should be noted that other driving modes besides centering control, for example driving close to a lane line, that are implemented using the image shot by the monocular camera, the obtained lane line edge coordinate point set on the CMOS sensor and the control point of the vehicle calibrated in the monocular camera also fall within the protection scope of the embodiments of the present invention.
As shown in fig. 3, an embodiment of the present invention further provides a vehicle running control apparatus, including:
an acquisition module 301, configured to acquire a target image captured by a monocular camera of a vehicle;
the processing module 302 is configured to perform gaussian filtering processing on a gray level image corresponding to the target image, and obtain a lane line edge coordinate point set corresponding to a lane line in the target image through an edge detection algorithm and a region extraction algorithm;
a determining module 303, configured to determine a reference coordinate point set according to the lane line edge coordinate point set;
and the control module 304 is used for controlling the running of the vehicle according to the reference coordinate point set.
According to the embodiment of the invention, Gaussian filtering is performed on the grayscale image corresponding to the target image shot by the monocular camera, the lane line edge coordinate point set corresponding to the lane line in the target image is obtained through the edge detection algorithm and the region extraction algorithm, and the reference coordinate point set is determined according to the lane line edge coordinate point set. The process of forming a vehicle control track curve by linear fitting is bypassed, the running of the vehicle is controlled through the reference coordinate point set, and the problem of low control precision caused by controlling the vehicle according to a vehicle control track curve is solved.
Optionally, the processing module 302 includes:
an acquisition unit, configured to acquire a gray image corresponding to the target image based on RGB information;
the first processing unit is used for carrying out Gaussian filtering processing on the gray level image to obtain a processed gray level image;
the second processing unit is used for processing the processed gray level image by utilizing an edge detection algorithm and an extraction algorithm to obtain a lane line edge coordinate point set corresponding to a lane line in the target image;
the lane line edge coordinate point set is a CMOS array coordinate point set in the monocular camera.
Optionally, the determining module 303 includes:
the repair unit is used for carrying out coordinate repair on the lane line edge coordinate point set by utilizing an interpolation mode to obtain a repaired lane line edge coordinate point set;
the first determining unit is used for determining a reference coordinate point set according to the left lane line coordinate point set and the right lane line coordinate point set in the repaired lane line edge coordinate point set.
Optionally, the first determining unit is specifically configured to:
clustering and averaging the abscissas of the left lane line coordinate points and of the right lane line coordinate points of the target row in the lane line edge coordinate point set, to obtain a left reference point coordinate and a right reference point coordinate of the target row;
determining the reference coordinates of the target row according to the left reference point coordinates and the right reference point coordinates of the target row;
and arranging the reference coordinates of all the target rows in the lane line edge coordinate point set according to the row sequence to obtain the reference coordinate point set.
Optionally, the control module 304 includes:
a calibration unit for calibrating a control point corresponding to a longitudinal center line of the vehicle in the monocular camera;
the control unit is used for controlling the running of the vehicle according to the control points, the target reference coordinate point set and a preset control mode;
the target reference coordinate point set is a corresponding reference coordinate point set in a preset duration of the current time.
Optionally, the control points include a first control point and a second control point;
the control unit is specifically configured to:
controlling the running of the vehicle according to the position control point, the vehicle yaw angle, the target reference coordinate point set and a preset control mode;
the position control point is the one of the first control point and the second control point that has the larger row coordinate;
the vehicle yaw angle is determined from the first control point and the second control point.
Optionally, the calibration unit is specifically configured to:
calibrate column coordinates of a control point corresponding to the longitudinal center line of the vehicle in the monocular camera in a static calibration mode;
and/or,
calibrate row coordinates of a control point corresponding to the longitudinal center line of the vehicle in the monocular camera in a dynamic calibration mode.
It should be noted that, the vehicle running control apparatus provided in the embodiment of the present invention is an apparatus capable of executing the above-described vehicle running control method, and all embodiments of the above-described vehicle running control method are applicable to the apparatus, and the same or similar technical effects can be achieved.
As shown in fig. 4, an embodiment of the present invention further provides an electronic device, including: a processor 400; and a memory 410 connected to the processor 400 through a bus interface, the memory 410 storing programs and data used by the processor 400 in performing operations, the processor 400 calling and executing the programs and data stored in the memory 410.
Wherein the electronic device further comprises a transceiver 420, the transceiver 420 being connected to the bus interface for receiving and transmitting data under the control of the processor 400; specifically, the processor 400 calls and executes the programs and data stored in the memory 410, and the processor 400 performs the following processes:
acquiring a target image shot by a monocular camera of a vehicle;
carrying out Gaussian filtering treatment on the gray level image corresponding to the target image, and obtaining a lane line edge coordinate point set corresponding to a lane line in the target image through an edge detection algorithm and a region extraction algorithm;
determining a reference coordinate point set according to the lane line edge coordinate point set;
and controlling the running of the vehicle according to the reference coordinate point set.
Optionally, the processor 400 is specifically configured to:
acquiring a gray image corresponding to the target image based on RGB information;
carrying out Gaussian filtering treatment on the gray level image to obtain a treated gray level image;
processing the processed gray level image by using an edge detection algorithm and an extraction algorithm to obtain a lane line edge coordinate point set corresponding to a lane line in the target image;
the lane line edge coordinate point set is a CMOS array coordinate point set in the monocular camera.
Optionally, the processor 400 is specifically configured to:
carrying out coordinate repair on the lane line edge coordinate point set by utilizing an interpolation mode to obtain a repaired lane line edge coordinate point set;
and determining a reference coordinate point set according to the left lane line coordinate point set and the right lane line coordinate point set in the repaired lane line edge coordinate point set.
Optionally, the processor 400 is specifically configured to:
clustering and averaging the abscissas of the left lane line coordinate points and of the right lane line coordinate points of the target row in the lane line edge coordinate point set, to obtain a left reference point coordinate and a right reference point coordinate of the target row;
determining the reference coordinates of the target row according to the left reference point coordinates and the right reference point coordinates of the target row;
and arranging the reference coordinates of all the target rows in the lane line edge coordinate point set according to the row sequence to obtain the reference coordinate point set.
Optionally, the processor 400 is specifically configured to:
calibrating a control point corresponding to a longitudinal centerline of the vehicle in the monocular camera;
controlling the running of the vehicle according to the control points, the target reference coordinate point set and a preset control mode;
the target reference coordinate point set is a corresponding reference coordinate point set in a preset duration of the current time.
Optionally, the control points include a first control point and a second control point;
the processor 400 is specifically configured to:
controlling the running of the vehicle according to the position control point, the vehicle yaw angle, the target reference coordinate point set and a preset control mode;
the position control point is the one of the first control point and the second control point that has the larger row coordinate;
the vehicle yaw angle is determined from the first control point and the second control point.
Optionally, the processor 400 is specifically configured to:
calibrating column coordinates of control points corresponding to a longitudinal center line of the vehicle in the monocular camera in a static calibration mode;
and/or,
and calibrating row coordinates of a control point corresponding to the longitudinal center line of the vehicle in the monocular camera in a dynamic calibration mode.
Wherein in fig. 4, a bus architecture may comprise any number of interconnected buses and bridges, and in particular one or more processors represented by processor 400 and various circuits of memory represented by memory 410, linked together. The bus architecture may also link together various other circuits such as peripheral devices, voltage regulators, power management circuits, etc., which are well known in the art and, therefore, will not be described further herein. The bus interface provides a user interface 430. Transceiver 420 may be a number of elements, including a transmitter and a receiver, providing a means for communicating with various other apparatus over a transmission medium. The processor 400 is responsible for managing the bus architecture and general processing, and the memory 410 may store data used by the processor 400 in performing operations.
In addition, a specific embodiment of the present invention also provides a readable storage medium having stored thereon a computer program, wherein the program when executed by a processor implements the steps in the vehicle running control method as described in any one of the above.
In the several embodiments provided in this application, it should be understood that the disclosed methods and apparatus may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may be physically included separately, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in hardware plus software functional units.
The integrated units implemented in the form of software functional units described above may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device or the like) to perform part of the steps of the methods according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
While the foregoing is directed to the preferred embodiments of the present invention, it will be appreciated by those skilled in the art that various modifications and changes can be made without departing from the principles of the present invention, and such modifications and changes are intended to be within the scope of the present invention.

Claims (10)

1. A vehicle travel control method characterized by comprising:
acquiring a target image shot by a monocular camera of a vehicle;
carrying out Gaussian filtering treatment on the gray level image corresponding to the target image, and obtaining a lane line edge coordinate point set corresponding to a lane line in the target image through an edge detection algorithm and a region extraction algorithm;
determining a reference coordinate point set according to the lane line edge coordinate point set;
and controlling the running of the vehicle according to the reference coordinate point set.
2. The vehicle running control method according to claim 1, wherein the performing Gaussian filtering processing on the gray-scale image corresponding to the target image, and obtaining the lane line edge coordinate point set corresponding to the lane line in the target image by an edge detection algorithm and a region extraction algorithm, includes:
acquiring a gray image corresponding to the target image based on RGB information;
carrying out Gaussian filtering treatment on the gray level image to obtain a treated gray level image;
processing the processed gray level image by using an edge detection algorithm and an extraction algorithm to obtain a lane line edge coordinate point set corresponding to a lane line in the target image;
the lane line edge coordinate point set is a CMOS array coordinate point set in the monocular camera.
3. The vehicle travel control method according to claim 1, characterized in that determining a reference coordinate point set from the lane line edge coordinate point set includes:
carrying out coordinate repair on the lane line edge coordinate point set by utilizing an interpolation mode to obtain a repaired lane line edge coordinate point set;
and determining a reference coordinate point set according to the left lane line coordinate point set and the right lane line coordinate point set in the repaired lane line edge coordinate point set.
4. The vehicle travel control method according to claim 3, characterized in that determining the reference coordinate point set from the left lane line coordinate point set and the right lane line coordinate point set in the repaired lane line edge coordinate point set includes:
clustering and averaging the abscissas of the left lane line coordinate points and of the right lane line coordinate points of the target row in the lane line edge coordinate point set, to obtain a left reference point coordinate and a right reference point coordinate of the target row;
determining the reference coordinates of the target row according to the left reference point coordinates and the right reference point coordinates of the target row;
and arranging the reference coordinates of all the target rows in the lane line edge coordinate point set according to the row sequence to obtain the reference coordinate point set.
5. The vehicle travel control method according to claim 1, characterized in that controlling the vehicle travel according to the reference coordinate point set includes:
calibrating a control point corresponding to a longitudinal centerline of the vehicle in the monocular camera;
controlling the running of the vehicle according to the control points, the target reference coordinate point set and a preset control mode;
the target reference coordinate point set is a corresponding reference coordinate point set in a preset duration of the current time.
6. The vehicle travel control method according to claim 5, characterized in that the control points include a first control point and a second control point;
according to the control point, the target reference coordinate point set and a preset control mode, the vehicle running is controlled, and the method comprises the following steps:
controlling the running of the vehicle according to the position control point, the vehicle yaw angle, the target reference coordinate point set and a preset control mode;
the position control point is the one of the first control point and the second control point that has the larger row coordinate;
the vehicle yaw angle is determined from the first control point and the second control point.
7. The vehicle travel control method according to claim 5, characterized in that calibrating a control point corresponding to a longitudinal center line of the vehicle in the monocular camera includes:
calibrating column coordinates of control points corresponding to a longitudinal center line of the vehicle in the monocular camera in a static calibration mode;
and/or,
and calibrating row coordinates of a control point corresponding to the longitudinal center line of the vehicle in the monocular camera in a dynamic calibration mode.
8. A vehicle travel control apparatus characterized by comprising:
the acquisition module is used for acquiring a target image shot by a monocular camera of the vehicle;
the processing module is used for carrying out Gaussian filtering processing on the gray level image corresponding to the target image, and obtaining a lane line edge coordinate point set corresponding to a lane line in the target image through an edge detection algorithm and a region extraction algorithm;
the determining module is used for determining a reference coordinate point set according to the lane line edge coordinate point set;
and the control module is used for controlling the running of the vehicle according to the reference coordinate point set.
9. An electronic device, comprising: a processor, a memory, and a program stored on the memory and executable on the processor, which when executed by the processor, implements the steps of the vehicle running control method according to any one of claims 1 to 7.
10. A readable storage medium, characterized in that the readable storage medium has stored thereon a program which, when executed by a processor, realizes the steps in the vehicle running control method according to any one of claims 1 to 7.
CN202111293446.4A 2021-11-03 2021-11-03 Vehicle running control method and device and electronic equipment Pending CN116080644A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111293446.4A CN116080644A (en) 2021-11-03 2021-11-03 Vehicle running control method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111293446.4A CN116080644A (en) 2021-11-03 2021-11-03 Vehicle running control method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN116080644A true CN116080644A (en) 2023-05-09

Family

ID=86203064

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111293446.4A Pending CN116080644A (en) 2021-11-03 2021-11-03 Vehicle running control method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN116080644A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination