CN113204234B - Vehicle control method and vehicle control system - Google Patents

Vehicle control method and vehicle control system

Info

Publication number
CN113204234B
CN113204234B (application CN202010042891.2A)
Authority
CN
China
Prior art keywords
road condition
boundary
condition image
vehicle
parameters
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010042891.2A
Other languages
Chinese (zh)
Other versions
CN113204234A (en)
Inventor
徐暄翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Acer Inc
Original Assignee
Acer Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Acer Inc filed Critical Acer Inc
Priority to CN202010042891.2A priority Critical patent/CN113204234B/en
Publication of CN113204234A publication Critical patent/CN113204234A/en
Application granted granted Critical
Publication of CN113204234B publication Critical patent/CN113204234B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0253 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0221 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D1/0223 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

A vehicle control method and a vehicle control system are provided. The vehicle control method is suitable for a vehicle control system on a vehicle. A video stream including a plurality of road condition images is captured toward the front of the vehicle. A drivable area is detected in each road condition image. Each road condition image is divided into a plurality of vertical strip portions, and a plurality of boundary values of the drivable area with respect to the vertical strip portions are obtained, wherein the road condition images include a current road condition image and at least one previous road condition image. A plurality of region boundary variation parameters respectively corresponding to the vertical strip portions are obtained according to the boundary values of the current road condition image and the boundary values of the at least one previous road condition image. The driving state of the vehicle is controlled according to the region boundary variation parameters.

Description

Vehicle control method and vehicle control system
Technical Field
The present invention relates to driving assistance technology, and more particularly, to a vehicle control method and a vehicle control system.
Background
With the continuous research and development of autonomous vehicles in recent years, autonomous driving technology has advanced rapidly. Many related technologies, such as sensing technology, object recognition technology, and positioning technology, have been developed to a level that substantially meets the needs of autonomous vehicles. An ideal autonomous driving system can accurately detect the drivable area of the road surface to avoid dangerous situations such as a collision of the vehicle or deviation of the vehicle from the road. Many methods have been proposed to detect the drivable area in various driving situations, for example, by capturing a road condition image in front of the vehicle and analyzing it. For instance, a disparity map generated by a dual-lens camera device can be used to effectively detect the drivable area, and other object recognition techniques and deep learning architectures can also detect it effectively. However, how to use the information of the drivable area to improve driving safety and stability remains a concern for those skilled in the art.
Disclosure of Invention
In view of the above, the present invention provides a vehicle control method and a vehicle control system, which can control a vehicle according to a boundary variation trend of a drivable region, so as to improve the safety and stability of an automatic driving system and an auxiliary driving system.
An embodiment of the invention provides a vehicle control method, which is suitable for a vehicle control system on a vehicle and comprises the following steps: capturing a video stream comprising a plurality of road condition images toward the front of the vehicle; detecting a drivable area in each road condition image; dividing each road condition image into a plurality of vertical strip portions, and obtaining a plurality of boundary values of the drivable area with respect to the vertical strip portions respectively, wherein the road condition images comprise a current road condition image and at least one previous road condition image; obtaining a plurality of region boundary variation parameters respectively corresponding to the vertical strip portions according to the boundary values of the current road condition image and the boundary values of the at least one previous road condition image; and controlling the driving state of the vehicle according to the region boundary variation parameters.
An embodiment of the invention provides a vehicle control system, which is suitable for a vehicle and comprises a vehicle control device, an image capturing device, a storage device and a controller. The image capturing device captures a video stream comprising a plurality of road condition images toward the front of the vehicle. The controller is coupled to the vehicle control device, the image capturing device and the storage device, and is configured to execute instructions in the storage device to: detect a drivable area in each road condition image; divide each road condition image into a plurality of vertical strip portions, and obtain a plurality of boundary values of the drivable area with respect to the vertical strip portions respectively, wherein the road condition images comprise a current road condition image and at least one previous road condition image; obtain a plurality of region boundary variation parameters respectively corresponding to the vertical strip portions according to the boundary values of the current road condition image and the boundary values of the at least one previous road condition image; and control the driving state of the vehicle according to the region boundary variation parameters.
Based on the above, in the embodiments of the invention, after the drivable region in each road condition image is detected, the boundary value of the drivable region with respect to each vertical bar portion is obtained. Then, the region boundary variation parameters of each vertical bar portion can be obtained by comparing the boundary values of the current road condition image with the corresponding boundary values of the at least one previous road condition image. Therefore, the vehicle control system can estimate the variation trend of the drivable area according to the region boundary variation parameters and control the driving state of the vehicle accordingly, so that collisions between the vehicle and obstacles are avoided and driving safety is improved.
In order to make the above features and advantages of the present invention more comprehensible, embodiments accompanied with figures are described in detail below.
Drawings
Fig. 1 is a schematic diagram of a control system for a vehicle according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a vehicle control system and a vehicle according to an embodiment of the invention.
Fig. 3 is a flowchart of a vehicle control method according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of a vehicle control method according to an embodiment of the invention.
Fig. 5 is a flowchart of a vehicle control method according to an embodiment of the present invention.
FIG. 6 is a schematic diagram of boundary values with respect to each vertical stripe portion in accordance with an embodiment of the present invention.
Wherein:
10: a vehicle control system;
110: a vehicle control device;
120: a storage device;
130: an image capturing device;
140: a controller;
V1: a vehicle;
P1, P2: boundary variation trend parameters;
P3: a boundary variation amplitude parameter;
B1 to B3: boundary values;
img_t0, img_t1: previous road condition images;
img_t2: a current road condition image;
B4, B12: vertical strip portions;
Img1: a road condition image;
S301 to S305, S501 to S503: steps.
Detailed Description
Some embodiments of the invention will be described in detail below with reference to the drawings, wherein the same reference numerals in different drawings denote the same or similar elements. These examples are only a part of the present invention and do not disclose all possible embodiments of the invention. Rather, these embodiments are merely examples of methods and systems that may be used with the present invention.
Fig. 1 is a schematic diagram of a control system for a vehicle according to an embodiment of the present invention. Referring to fig. 1, the vehicle control system 10 includes a vehicle control device 110, a storage device 120, an image capturing device 130, and a controller 140. In one embodiment, the vehicle control system 10 may be configured in a wide variety of vehicles, such as a passenger car, bus, golf cart, tourist coach, truck, or van, etc. The invention is not limited to the type of vehicle equipped with the vehicle control system 10.
The vehicle control device 110 is, for example, a steering device, a brake device, a throttle device, a navigation device, or other vehicle element that can be used to control the driving state of the vehicle. The driving state is, for example, a vehicle speed, a braking state, a driving direction or route planning, and the like.
The storage device 120 is, for example, any type of fixed or removable random access memory (Random Access Memory, RAM), read-only memory (Read-Only Memory, ROM), flash memory (Flash Memory), hard disk, the like, or a combination thereof, and is used to store data, program codes, images, and other information that may be used in the operation of the vehicle control system 10. That is, the storage device 120 is further configured to record a plurality of instructions executable by the controller 140.
The image capturing device 130 is disposed on the vehicle for capturing a video stream including a plurality of images toward the front of the vehicle. The image capturing device 130 may include a camera lens module having a lens and a photosensitive element. The photosensitive element is used for sensing the intensity of light entering the lens so as to generate an image. The photosensitive element may be, for example, a charge coupled device (charge coupled device, CCD), a complementary metal oxide semiconductor (complementary metal-oxide semiconductor, CMOS) element, or another element, and the invention is not limited in this regard.
The controller 140 is coupled to the vehicle control device 110, the camera device 130 and the storage device 120 for controlling the overall operation of the vehicle control system 10. In the present embodiment, the controller 140 is, for example, a central processing unit (Central Processing Unit, CPU), or other programmable Microprocessor (Microprocessor), digital signal processor (Digital Signal Processor, DSP), programmable controller, application specific integrated circuit (Application Specific Integrated Circuits, ASIC), programmable logic device (Programmable Logic Device, PLD), or other hardware device with operation capability, but the disclosure is not limited thereto.
Fig. 2 is a schematic diagram of a vehicle control system and a vehicle according to an embodiment of the invention. Referring to fig. 2, if the vehicle control system 10 is applied to the driving environment of the vehicle V1, the image capturing device 130 may be disposed inside or outside the vehicle V1. However, fig. 2 is only an exemplary illustration, and the number and the actual positions of the image capturing devices 130 are not limited, and may be designed according to the practical application. In one embodiment, the image capturing device 130 may be disposed on a front windshield (front glass) of the vehicle V1.
Fig. 3 is a flowchart of a vehicle control method according to an embodiment of the present invention. Referring to fig. 3, the method of the present embodiment is applicable to the vehicle control system 10 in the above embodiments, and the detailed steps of controlling the vehicle according to the variation trend of the drivable area are described below in conjunction with the elements of the vehicle control system 10.
In step S301, the image capturing device 130 captures a video stream including a plurality of road condition images toward the front of the vehicle. Specifically, the image capturing device 130 continuously captures road condition images at regular time intervals, and the video stream is composed of the road condition images corresponding to different capturing time points. The time interval is the inverse of the frame rate of the video stream. For example, assuming that the frame rate of the video stream is 60 fps, which means that the image capturing device 130 captures 60 road condition images per second, the time interval between consecutive road condition images is 1/60 second.
In step S302, the controller 140 detects a drivable region in each road condition image. Here, the drivable region is the region within the road condition image determined to be drivable by the vehicle, and the present invention does not limit the method for detecting the drivable region. For example, patent publication No. TW201913557 discloses a deep learning model capable of detecting a drivable area. Patent publication No. TW201327473 also discloses a method of detecting a drivable area using a disparity map. The non-patent document Dan Levi et al., "StixelNet: A Deep Convolutional Network for Obstacle Detection and Road Segmentation," BMVC 2015, also discloses a method for detecting a drivable region by combining object recognition, image segmentation and deep learning models. It should be noted that the drivable region in the road condition image may be represented based on image coordinates, that is, the boundary of the drivable region may be represented by image coordinates.
In step S303, the controller 140 divides each road condition image into a plurality of vertical bar portions, and obtains a plurality of boundary values of the drivable region with respect to the vertical bar portions, respectively. In one embodiment, the road condition images include a current road condition image and at least one previous road condition image. Specifically, each road condition image can be divided into a plurality of vertical bar portions, but the number and width of the vertical bar portions are not limited by the present invention and can be set according to practical requirements. The controller 140 then generates, for each vertical bar portion, a boundary value of the drivable region corresponding to that vertical bar portion.
In an embodiment, when the drivable region is generated based on columns of pixels (stixels), for example by the drivable region detection method of the above-mentioned non-patent document, the road condition image is already divided into a plurality of vertical bar portions during the detection of the boundary of the drivable region, and the estimation result of the boundary is already expressed as a plurality of boundary values respectively corresponding to the vertical bar portions. In another embodiment, if the boundary of the drivable area varies continuously from pixel column to pixel column, the controller 140 may, after dividing each road condition image into a plurality of vertical bar portions, calculate the average value of the boundary of the drivable area within each vertical bar portion to obtain a plurality of boundary values corresponding to the vertical bar portions.
For example, FIG. 6 is a schematic diagram of boundary values with respect to each vertical stripe portion according to an embodiment of the present invention. Referring to fig. 6, in the present embodiment, the drivable region is the region obtained by removing obstacles (e.g., other vehicles) from the lane region. The road condition image Img1 is divided into N vertical stripe portions, for example, 50 vertical stripe portions. Each vertical bar portion has its own boundary value of the drivable region, and the boundary values can be represented by Y-axis pixel coordinates. For example, the vertical bar portion B4 corresponds to the boundary value Y4 of the drivable region, and the vertical bar portion B12 corresponds to the boundary value Y12 of the drivable region. Furthermore, these boundary values of the drivable region in the road condition image Img1 can be expressed as (n_bar, Y), where n_bar is the index of the vertical bar portion and Y is the Y-axis pixel coordinate at which the boundary lies.
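The extraction of one boundary value per vertical strip can be sketched as follows. This is a minimal Python sketch assuming the detected drivable region is given as a binary H x W mask; the function name strip_boundary_values, the mask format and the averaging rule are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def strip_boundary_values(drivable_mask: np.ndarray, n_strips: int = 50) -> np.ndarray:
    """Return one Y-axis boundary value per vertical strip portion.

    drivable_mask: H x W boolean array, True where a pixel belongs to the detected
    drivable region (assumed representation for illustration only).
    """
    h, w = drivable_mask.shape
    col_groups = np.array_split(np.arange(w), n_strips)   # columns belonging to each strip
    values = np.full(n_strips, float(h - 1))              # default: boundary at the image bottom
    for i, cols in enumerate(col_groups):
        sub = drivable_mask[:, cols]
        rows, local_cols = np.nonzero(sub)
        if rows.size == 0:
            continue                                      # no drivable pixel in this strip
        # Topmost drivable row per column, averaged over the strip, mirroring the
        # "average the boundary within each vertical bar portion" step in the text.
        tops = [rows[local_cols == c].min() for c in np.unique(local_cols)]
        values[i] = float(np.mean(tops))
    return values   # shape (n_strips,), Y-axis pixel coordinates of the boundary
```

Pairing each returned value with its strip index then yields the (n_bar, Y) pairs described for fig. 6.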
In step S304, the controller 140 obtains a plurality of region boundary variation parameters respectively corresponding to the vertical bar portions according to the boundary values of the current road condition image and the boundary values of the at least one previous road condition image. In one embodiment, the controller 140 can obtain the region boundary variation parameters corresponding to each of the vertical bar portions, and thus the variation trend of the drivable area, by comparing the boundary value of each vertical bar portion in the current road condition image with the boundary value of the same vertical bar portion in the previous road condition image, one by one. In one embodiment, the region boundary variation parameters may include a plurality of boundary variation trend parameters and/or a plurality of boundary variation amplitude parameters respectively corresponding to the vertical stripe portions. The boundary variation trend parameter is the variation speed of the boundary value, and the boundary variation amplitude parameter is the variation acceleration of the boundary value. The boundary variation trend parameter may be used to represent the expansion speed or contraction speed of the drivable area, and the boundary variation amplitude parameter may be used to represent the expansion acceleration or contraction acceleration of the drivable area.
In detail, fig. 4 is a schematic diagram of a vehicle control method according to an embodiment of the present invention. Referring to fig. 4, in the present example, the image capturing device 130 captures the previous road condition image img_t1 after capturing the previous road condition image img_t0, and captures the current road condition image img_t2 after capturing the previous road condition image img_t1. The controller 140 can obtain a plurality of boundary values B1-B3 of the drivable region in the current road condition image img_t2, the previous road condition image img_t1 and the previous road condition image img_t0 with respect to the plurality of vertical bars by using the drivable region detection model.
In the example of fig. 4, the controller 140 may obtain the boundary variation trend parameters P2 between the current road condition image img_t2 and the previous road condition image img_t1 according to the current road condition image img_t2 and the previous road condition image img_t1 (i.e. the first previous road condition image). Specifically, the controller 140 subtracts the boundary values B2 of the previous road condition image img_t1 from the boundary values B3 of the current road condition image img_t2 to obtain a plurality of first boundary difference values respectively corresponding to the vertical bar portions. For example, the controller 140 subtracts the boundary value (n_bar = 1, Y) of the previous road condition image img_t1 from the boundary value (n_bar = 1, Y) of the current road condition image img_t2, thereby obtaining the first boundary difference value of the first vertical stripe portion. Next, the controller 140 divides the first boundary difference values corresponding to the vertical bar portions by the time interval Δt to obtain the boundary variation trend parameters P2 between the current road condition image img_t2 and the previous road condition image img_t1. Assuming that there are 50 vertical stripe portions, the controller can obtain 50 boundary variation trend parameters P2.
Similarly, the controller 140 may obtain the boundary variation trend parameter P1 between the previous road condition image img_t1 and the previous road condition image img_t0 according to the previous road condition image img_t1 (i.e. the first previous road condition image) and the previous road condition image img_t0 (i.e. the second previous road condition image). Specifically, the controller 140 subtracts the boundary values B1 of the previous road condition image img_t0 from the boundary values B2 of the previous road condition image img_t1 to obtain a plurality of second boundary difference values corresponding to the vertical bar portions. Next, the controller 140 divides the second boundary difference values corresponding to the vertical bar portions by the time interval Δt to obtain the boundary variation trend parameter P1 between the previous road condition image img_t1 and the previous road condition image img_t0. Assuming that there are 50 vertical stripe portions, the controller can obtain 50 boundary variation trend parameters P1.
In addition, the controller 140 may subtract the boundary variation trend parameters P1 between the previous road condition image img_t1 and the previous road condition image img_t0 from the boundary variation trend parameters P2 between the current road condition image img_t2 and the previous road condition image img_t1, and divide the result by the time interval Δt to obtain the boundary variation amplitude parameters P3. Assuming 50 vertical stripe portions, the controller can obtain 50 boundary variation amplitude parameters P3.
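Given the per-strip boundary values B1, B2 and B3 of the images img_t0, img_t1 and img_t2 and the frame interval Δt, the trend parameters P1 and P2 and the amplitude parameters P3 can be computed element-wise. A minimal sketch, assuming the boundary values are NumPy arrays of equal length (the function and argument names are illustrative):

```python
import numpy as np

def boundary_variation_parameters(b1: np.ndarray,  # boundary values of img_t0
                                  b2: np.ndarray,  # boundary values of img_t1
                                  b3: np.ndarray,  # boundary values of img_t2
                                  dt: float):      # frame interval, e.g. 1/60 s
    """Per-strip boundary variation trend (P1, P2) and amplitude (P3) parameters."""
    p1 = (b2 - b1) / dt   # trend between img_t1 and img_t0 (boundary "speed")
    p2 = (b3 - b2) / dt   # trend between img_t2 and img_t1
    p3 = (p2 - p1) / dt   # amplitude: change of the trend (boundary "acceleration")
    return p1, p2, p3

# With 50 vertical strip portions and a 60 fps stream, each of p1, p2 and p3
# is an array of 50 values, matching the counts given in the text:
# p1, p2, p3 = boundary_variation_parameters(b1, b2, b3, dt=1.0 / 60.0)
```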
Returning to fig. 3, after obtaining the plurality of region boundary variation parameters corresponding to the vertical bar portions, in step S305, the controller 140 controls the driving state of the vehicle according to the region boundary variation parameters. Specifically, the controller 140 may estimate the boundary variation trend of the drivable region from these parameters and control the driving state of the vehicle accordingly, for example by decelerating, accelerating, braking, providing a warning, changing the driving direction, prompting a driving direction, planning a driving path, and so on.
In one embodiment, in response to at least a portion of the boundary variation trend parameters between the current road condition image and the previous road condition image meeting the forward change condition, which represents that the boundary of the drivable area is moving away from the vehicle, the controller 140 may control the vehicle control device 110 to increase the vehicle speed. On the other hand, in response to at least a portion of the boundary variation trend parameters between the current road condition image and the previous road condition image meeting the negative change condition, which represents that the boundary of the drivable area is moving toward the vehicle, the controller 140 may control the vehicle control device 110 to reduce the vehicle speed. Specifically, the controller 140 may determine whether the forward change condition or the negative change condition is met according to the signs of at least a portion of the boundary variation trend parameters.
In one embodiment, when a boundary variation trend parameter is positive, it indicates that the boundary variation trend parameter meets the forward change condition. When a boundary variation trend parameter is negative, the boundary variation trend parameter meets the negative change condition. However, the forward change condition and the negative change condition are designed according to the preset position of the image coordinate origin, and a person skilled in the art can change the design according to actual requirements. In addition, in one embodiment, the controller 140 may consider only the boundary variation trend parameters of the vertical bar portions directly in front of the driving route of the vehicle. For example, the controller 140 may determine whether to decelerate or accelerate the vehicle by taking the boundary variation trend parameters of the 15th vertical bar portion to the 35th vertical bar portion.
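One way to evaluate the forward and negative change conditions over the strips directly ahead is sketched below, continuing the arrays from the previous sketch. The 15th to 35th strip range follows the example in the text; aggregating by the mean and the sign convention (which depends on where the image origin is placed) are assumptions.

```python
import numpy as np

def change_condition(p2: np.ndarray, first: int = 15, last: int = 35) -> str:
    """Classify the trend of the drivable-area boundary ahead of the vehicle."""
    ahead = p2[first:last + 1]          # trend parameters of the strips straight ahead
    if float(np.mean(ahead)) > 0.0:
        return "forward"                # boundary moving away from the vehicle
    return "negative"                   # boundary approaching the vehicle
```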
In addition, in one embodiment, the controller 140 may further determine the vehicle speed acceleration according to at least a portion of the boundary variation amplitude parameters. Alternatively, the controller 140 may further determine the vehicle speed deceleration according to at least a portion of the boundary variation amplitude parameters. In detail, after knowing from the boundary variation trend parameters that the boundary of the drivable region is approaching or moving away from the vehicle, the controller 140 may estimate, according to the boundary variation amplitude parameters, the acceleration with which the boundary of the drivable region approaches or moves away from the vehicle. Thus, the controller 140 can determine the vehicle speed acceleration or the vehicle speed deceleration according to the boundary variation amplitude parameters.
For example, fig. 5 is a flowchart of a vehicle control method according to an embodiment of the invention. Referring to fig. 5, in step S501, the controller 140 determines whether the boundary variation trend parameters among the region boundary variation parameters are greater than zero. If the determination in step S501 is yes, it represents that the drivable area in front of the vehicle is expanding. Therefore, in step S502, the controller 140 determines to maintain or increase the vehicle speed, and determines the vehicle speed acceleration according to the boundary variation amplitude parameters among the region boundary variation parameters. For example, the controller 140 may control the throttle opening according to the region boundary variation parameters after determining to increase the vehicle speed.
On the other hand, if the determination in step S501 is no, it represents that the drivable area in front of the vehicle is shrinking. Therefore, in step S503, the controller 140 determines the vehicle speed deceleration according to the boundary variation amplitude parameters among the region boundary variation parameters. For example, the controller 140 may control the braking force according to the boundary variation amplitude parameters after determining to reduce the vehicle speed. In other words, the controller 140 can control the vehicle speed acceleration and the vehicle speed deceleration by controlling the throttle or the braking force, and the throttle and the braking force can be adaptively adjusted based on the boundary variation amplitude parameters.
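The decision flow of fig. 5 can then be sketched as a mapping from the aggregated trend and amplitude parameters to a throttle or brake command. This is only an illustrative sketch: the aggregation by mean, the gain factor and the command interface are assumptions, not values from the patent.

```python
import numpy as np

def decide_speed_command(p2_ahead: np.ndarray, p3_ahead: np.ndarray,
                         gain: float = 0.01) -> dict:
    """Map the fig. 5 flow (S501-S503) onto throttle/brake commands."""
    trend = float(np.mean(p2_ahead))       # aggregated boundary variation trend
    amplitude = float(np.mean(p3_ahead))   # aggregated boundary variation amplitude
    if trend > 0.0:                        # S501 yes: drivable area ahead is expanding
        # S502: maintain or increase speed; scale throttle with the amplitude parameter
        return {"throttle": min(1.0, gain * abs(amplitude)), "brake": 0.0}
    # S501 no: drivable area ahead is shrinking
    # S503: decelerate; scale brake force with the amplitude parameter
    return {"throttle": 0.0, "brake": min(1.0, gain * abs(amplitude))}
```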
In summary, in the embodiments of the present invention, the boundary variation trend of the drivable area can be obtained by comparing the boundary values of the drivable area in a plurality of road condition images, and the driving state of the vehicle can be controlled according to this boundary variation trend, thereby avoiding collisions between the vehicle and obstacles and improving safety. In addition, the region boundary variation parameters representing the boundary variation trend of the drivable region can also help an autonomous vehicle decision system or an advanced driver assistance system provide more accurate judgments and immediate warnings, thereby improving the driving safety and stability of the vehicle.
Although the present invention has been described with reference to the above embodiments, it should be understood that the invention is not limited thereto, but rather is capable of modification and variation without departing from the spirit and scope of the present invention.

Claims (8)

1. A vehicle control method for a vehicle control system on a vehicle, the method comprising:
capturing a video stream comprising a plurality of road condition images towards the front of the vehicle;
detecting a drivable area in each road condition image;
dividing each road condition image into a plurality of vertical strip parts, and obtaining a plurality of boundary values of the drivable area relative to the vertical strip parts respectively, wherein the road condition images comprise a current road condition image and at least one previous road condition image; wherein the at least one previous road condition image comprises a first previous road condition image;
acquiring a plurality of region boundary variation parameters respectively corresponding to the vertical strip parts according to the boundary values of the current road condition image and the boundary values of the at least one previous road condition image, wherein the region boundary variation parameters comprise a plurality of boundary variation trend parameters respectively corresponding to the vertical strip parts; wherein the step of obtaining the region boundary variation parameters corresponding to the vertical strip portions according to the boundary values of the current road condition image and the boundary values of the at least one previous road condition image comprises the following steps:
subtracting the boundary values of the first previous road condition image from the boundary values of the current road condition image to obtain a plurality of first boundary difference values corresponding to the vertical strip parts respectively; and
dividing the first boundary difference values corresponding to the vertical strip parts by a time interval to obtain the boundary variation trend parameters between the current road condition image and the first previous road condition image;
and controlling the driving state of the vehicle according to the region boundary variation parameters.
2. The method according to claim 1, wherein the region boundary variation parameters include a plurality of boundary variation amplitude parameters corresponding to the vertical bar portions, the at least one previous road condition image further includes a second previous road condition image, and the step of obtaining the region boundary variation parameters corresponding to the vertical bar portions according to the boundary values of the current road condition image and the boundary values of the at least one previous road condition image includes:
subtracting the boundary values of the second previous road condition image from the boundary values of the first previous road condition image to obtain a plurality of second boundary difference values corresponding to the vertical strip parts respectively;
dividing the second boundary difference values corresponding to the vertical strip parts by the time interval to obtain the boundary variation trend parameters between the first previous road condition image and the second previous road condition image; and
and dividing the subtraction result between the boundary variation trend parameters between the current road condition image and the first previous road condition image and the boundary variation trend parameters between the first previous road condition image and the second previous road condition image by the time interval to obtain the boundary variation amplitude parameters.
3. The vehicle control method according to claim 2, wherein the step of controlling the driving state of the vehicle according to the region boundary variation parameters comprises:
in response to at least part of the boundary variation trend parameters between the current road condition image and the first previous road condition image conforming to a forward variation condition, controlling a vehicle control device to increase the vehicle speed; and
and controlling the vehicle control device to reduce the vehicle speed in response to at least part of the boundary variation trend parameters between the current road condition image and the first previous road condition image conforming to the negative variation condition.
4. The vehicle control method according to claim 3, characterized in that the step of controlling the vehicle control device to increase the vehicle speed includes:
determining the vehicle speed acceleration of the vehicle according to at least part of the boundary variation amplitude parameters,
wherein the step of controlling the vehicle control device to reduce the vehicle speed includes:
and determining the speed deceleration of the vehicle according to at least part of the boundary variation amplitude parameters.
5. A vehicle control system for a vehicle, comprising:
a vehicle control device;
a memory device for storing a plurality of instructions;
the image pickup device is used for picking up a video stream comprising a plurality of road condition images towards the front of the vehicle; and
a processor, coupled to the vehicle control device, the camera device, and the storage device, configured to execute the instructions to:
detecting a drivable area in each road condition image;
dividing each road condition image into a plurality of vertical strip parts, and obtaining a plurality of boundary values of the drivable area relative to the vertical strip parts respectively, wherein the road condition images comprise a current road condition image and at least one previous road condition image; wherein the at least one previous road condition image comprises a first previous road condition image;
acquiring a plurality of region boundary variation parameters respectively corresponding to the vertical strip parts according to the boundary values of the current road condition image and the boundary values of the at least one previous road condition image, wherein the region boundary variation parameters comprise a plurality of boundary variation trend parameters respectively corresponding to the vertical strip parts, and obtaining the region boundary variation parameters comprises:
subtracting the boundary values of the first previous road condition image from the boundary values of the current road condition image to obtain a plurality of first boundary difference values corresponding to the vertical strip parts respectively; and
dividing the first boundary difference values corresponding to the vertical strip parts by a time interval to obtain the boundary variation trend parameters between the current road condition image and the first previous road condition image;
the vehicle control device is controlled according to the regional boundary change parameters so as to control the driving state of the vehicle.
6. The vehicle control system of claim 5, wherein the region boundary variation parameters include a plurality of boundary variation amplitude parameters respectively corresponding to the vertical bar portions, the at least one previous road condition image further includes a second previous road condition image, and the processor is further configured to:
subtracting the boundary values of the second previous road condition image from the boundary values of the first previous road condition image to obtain a plurality of second boundary difference values corresponding to the vertical strip parts respectively;
dividing the second boundary difference values corresponding to the vertical strip parts by the time interval to obtain the boundary variation trend parameters between the first previous road condition image and the second previous road condition image; and
and dividing the subtraction result between the boundary variation trend parameters between the current road condition image and the first previous road condition image and the boundary variation trend parameters between the first previous road condition image and the second previous road condition image by the time interval to obtain the boundary variation amplitude parameters.
7. The vehicle control system of claim 5, wherein the processor is further configured to:
in response to at least part of the boundary variation trend parameters between the current road condition image and the first previous road condition image conforming to a forward variation condition, controlling the vehicle control device to increase the vehicle speed; and
and controlling the vehicle control device to reduce the vehicle speed in response to at least part of the boundary variation trend parameters between the current road condition image and the first previous road condition image conforming to the negative variation condition.
8. The vehicle control system of claim 7, wherein the processor is further configured to:
the vehicle speed deceleration of the vehicle is determined according to at least part of the boundary variation amplitude parameters, or the vehicle speed acceleration of the vehicle is determined according to at least part of the boundary variation amplitude parameters.
CN202010042891.2A 2020-01-15 2020-01-15 Vehicle control method and vehicle control system Active CN113204234B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010042891.2A CN113204234B (en) 2020-01-15 2020-01-15 Vehicle control method and vehicle control system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010042891.2A CN113204234B (en) 2020-01-15 2020-01-15 Vehicle control method and vehicle control system

Publications (2)

Publication Number Publication Date
CN113204234A (en) 2021-08-03
CN113204234B (en) 2023-08-22

Family

ID=77024756

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010042891.2A Active CN113204234B (en) 2020-01-15 2020-01-15 Vehicle control method and vehicle control system

Country Status (1)

Country Link
CN (1) CN113204234B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007316685A (en) * 2006-05-23 2007-12-06 Nissan Motor Co Ltd Traveling path boundary detection device and traveling path boundary detection method
JP2014149776A (en) * 2013-02-04 2014-08-21 Fuji Heavy Ind Ltd Vehicle outside environment recognition device and vehicle outside environment recognition method
CN104417431A (en) * 2013-09-06 2015-03-18 昆达电脑科技(昆山)有限公司 Running information indicating system
CN108124122A (en) * 2016-11-29 2018-06-05 法乐第(北京)网络科技有限公司 Image treatment method, device and vehicle
JP2019220070A (en) * 2018-06-22 2019-12-26 株式会社Soken Electronic control unit

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9586593B2 (en) * 2014-09-25 2017-03-07 Nissan North America, Inc. Method and system of assisting a driver of a vehicle
JP2017223461A (en) * 2016-06-13 2017-12-21 パナソニックIpマネジメント株式会社 Radar device and detection method

Also Published As

Publication number Publication date
CN113204234A (en) 2021-08-03

Similar Documents

Publication Publication Date Title
KR101276871B1 (en) Method and apparatus for collision avoidance of vehicle
JP4987573B2 (en) Outside monitoring device
JP6613795B2 (en) Display control device and vehicle control device
KR20190039648A (en) Method for monotoring blind spot of vehicle and blind spot monitor using the same
JP7206583B2 (en) Information processing device, imaging device, device control system, moving object, information processing method and program
JP5974607B2 (en) Vehicle travel control device
WO2014084122A1 (en) On-board control device
JP2009286279A (en) Drive support device for vehicle
EP3915857B1 (en) A parking assist apparatus and a method of controlling the parking assist apparatus
JP7472832B2 (en) Vehicle control device, vehicle control method, and vehicle control computer program
KR102304851B1 (en) Ecu, autonomous vehicle including the ecu, and method of recognizing near vehicle for the same
US11458936B2 (en) Drive assist apparatus
JP7119317B2 (en) Information processing device, imaging device, device control system, moving object, information processing method, and information processing program
JP7062782B2 (en) Vehicle control method and vehicle control device
JP5411671B2 (en) Object detection device and driving support system
CN114194184A (en) Vehicle and method of controlling vehicle
JP5011186B2 (en) Driving support device and driving support system
CN113204234B (en) Vehicle control method and vehicle control system
JP6387710B2 (en) Camera system, distance measuring method, and program
TWI723657B (en) Vehicle control method and vehicle control system
JP2003208602A (en) Rear-side side alarm device for vehicle
JP2008276307A (en) Video image processing apparatus, video image processing system, and navigation device
JP4807763B1 (en) Outside monitoring device
JP6969245B2 (en) Information processing device, image pickup device, device control system, mobile body, information processing method, and information processing program
EP3540643A1 (en) Image processing apparatus and image processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant