CN114715188A - Vehicle automatic driving control system and vehicle - Google Patents

Vehicle automatic driving control system and vehicle

Info

Publication number
CN114715188A
CN114715188A
Authority
CN
China
Prior art keywords
vehicle
detection device
image detection
view image
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210299054.7A
Other languages
Chinese (zh)
Inventor
沈琦
白富儒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Jianzhi Robot Technology Co ltd
Original Assignee
Suzhou Jianzhi Robot Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Jianzhi Robot Technology Co ltd filed Critical Suzhou Jianzhi Robot Technology Co ltd
Priority to CN202210299054.7A priority Critical patent/CN114715188A/en
Publication of CN114715188A publication Critical patent/CN114715188A/en
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001Planning or execution of driving tasks
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0001Details of the control system
    • B60W2050/0002Automatic control, details of type of controller or control system architecture

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)

Abstract

The application discloses a vehicle automatic driving control system comprising an image detection device group, a radar detection device group and a control device deployed on a vehicle. The image detection device group acquires environment image information around the vehicle; the radar detection device group acquires target detection information around the vehicle; and the control device fuses the environment image information, the target detection information and the map information of a navigation map, establishes an environment model of the vehicle, performs path planning based on the environment model, and generates vehicle control instructions according to the planned path so as to control the travel of the vehicle through the vehicle control instructions. With the technical solution provided by the application, an accurate environment model of the vehicle can be constructed, so that the path can be planned accurately, the travel of the vehicle can be effectively and automatically controlled according to the planned path, and stable and comfortable driving of the vehicle is ensured. The application also discloses a vehicle, which has corresponding technical effects.

Description

Vehicle automatic driving control system and vehicle
Technical Field
The application relates to the technical field of computer application, in particular to a vehicle automatic driving control system and a vehicle.
Background
At present, vehicles, as an important means of transportation, bring great convenience to people's lives, and the number of vehicles in use keeps increasing. With the progress of science and technology, people's requirements for the intelligence of vehicles are becoming higher and higher, and more and more attention is being paid to the automatic driving of vehicles.
Automatic driving mainly relies on technologies such as artificial intelligence, visual computing, radar, global positioning and vehicle-road cooperation, so that the vehicle has the capabilities of environment perception, path planning and autonomous control. A computer can then control the vehicle autonomously, and the vehicle can be driven automatically and safely without any human intervention.
How to effectively and automatically control the travel of a vehicle is a technical problem that urgently needs to be solved by those skilled in the art.
Disclosure of Invention
The application aims to provide a vehicle automatic driving control system and a vehicle, so as to effectively and automatically control the travel of the vehicle and ensure that the vehicle runs stably and comfortably.
In order to solve the above technical problem, the application provides the following technical solutions:
a vehicle automatic driving control system includes an image detection device group, a radar detection device group, and a control device disposed on a vehicle, wherein,
the image detection device group is used for acquiring environment image information around the vehicle;
the radar detection device group is used for acquiring target detection information around the vehicle;
the control device is used for performing information fusion on the environment image information, the target detection information and the map information of the navigation map, establishing an environment model of the vehicle, performing path planning based on the environment model, and generating a vehicle control instruction according to a planned path so as to control the vehicle to travel through the vehicle control instruction.
In a specific embodiment of the present application, the image detection device group includes at least three forward-view image detection devices, four side-view image detection devices, four surround-view image detection devices, and one rear-view image detection device.
In one embodiment of the present application, the three forward-view image detection devices include a first forward-view image detection device, a second forward-view image detection device, and a third forward-view image detection device, each disposed at a front portion of the vehicle, and a detection distance of the first forward-view image detection device < a detection distance of the second forward-view image detection device < a detection distance of the third forward-view image detection device.
In a specific embodiment of the present application, the four side-view image detection devices are respectively disposed at the front fenders and the rear-view mirrors of the vehicle. The side-view image detection devices disposed at the front fenders of the vehicle are used for capturing environment images in front of the sides of the vehicle, and the side-view image detection devices disposed at the rear-view mirrors of the vehicle are used for capturing environment images behind the sides of the vehicle. The horizontal field angle of each side-view image detection device is greater than 90°, and the fields of view of the side-view image detection devices disposed on the same side of the vehicle overlap.
In a specific embodiment of the present application, the four surround-view image detection devices are respectively disposed at the front, rear, left, and right of the vehicle, and the environment images collected by the four surround-view image detection devices are spliced to form a 360° view.
In one embodiment of the present application, the one rear-view image detection device is disposed at a rear portion of the vehicle for capturing an environmental image behind the vehicle.
In one embodiment of the present application, the radar detection device group includes at least four corner radars, and the four corner radars are respectively disposed at corner portions of the vehicle.
In an embodiment of the present application, the navigation map is an Advanced Driving Assistance System (ADAS) navigation map.
In one embodiment of the present application, the system further comprises an interactive device,
and the interactive device is used for sending a user's interaction instruction to the control device and for outputting and displaying the driving state of the vehicle.
A vehicle comprising the vehicle automatic driving control system of any one of the above embodiments.
By applying the technical solution provided by the embodiments of the application, the image detection device group deployed on the vehicle acquires environment image information around the vehicle, the radar detection device group deployed on the vehicle acquires target detection information around the vehicle, and the control device deployed on the vehicle fuses the environment image information, the target detection information and the map information of the navigation map, establishes an environment model of the vehicle, performs path planning based on the environment model, generates vehicle control instructions according to the planned path, and controls the travel of the vehicle through the vehicle control instructions. Because the environment model of the vehicle can be accurately constructed from the environment image information, the target detection information and the map information of the navigation map, the path can be accurately planned based on the environment model, the travel of the vehicle can be effectively and automatically controlled according to the planned path, and stable and comfortable driving of the vehicle is ensured.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic structural diagram of an automatic driving control system of a vehicle according to an embodiment of the present application;
FIG. 2 is a schematic diagram of an automatic driving control process of a vehicle according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a specific deployment example in an embodiment of the present application;
FIG. 4 is a schematic diagram illustrating effects of environment image information and an environment model in an embodiment of the present application;
FIG. 5 is a schematic diagram of the overall structure of a vehicle automatic driving control system according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a vehicle according to an embodiment of the present application.
Detailed Description
In order that those skilled in the art will better understand the disclosure, the following detailed description will be given with reference to the accompanying drawings. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to FIG. 1, which shows a schematic structural diagram of a vehicle automatic driving control system provided in an embodiment of the present application, the system may include an image detection device group 110, a radar detection device group 120, and a control device 130 disposed on a vehicle.
The image detection device group 110 is configured to acquire environment image information around the vehicle;
a radar detection device group 120 for acquiring target detection information around the vehicle;
and the control device 130 is configured to perform information fusion on the environment image information, the target detection information, and the map information of the navigation map, establish an environment model of the vehicle, perform path planning based on the environment model, and generate a vehicle control instruction according to the planned path, so as to control the vehicle to travel through the vehicle control instruction.
In the embodiment of the present application, the image detection device group 110, the radar detection device group 120, and the control device 130 may be disposed on the vehicle, and the control device 130 may be communicatively connected to the image detection device group 110 and the radar detection device group 120, respectively. Specifically, the control device 130 may be communicatively connected to the image detection device group 110 through an LVDS (Low-Voltage Differential Signaling) interface, and to the radar detection device group 120 through a CAN (Controller Area Network) bus.
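As an illustration of the CAN-side interface, the following is a minimal Python sketch of polling corner-radar target frames, assuming the python-can package, a SocketCAN channel named can0, and a hypothetical set of arbitration IDs and payload layout; a real radar defines its own DBC/message format, so every decoded field here is an assumption.

    import struct
    import can  # pip install python-can

    # Hypothetical arbitration IDs, one per corner radar (E1-E4 in FIG. 3).
    RADAR_TARGET_IDS = {0x500, 0x501, 0x502, 0x503}

    def read_radar_target(bus: can.BusABC, timeout: float = 0.1):
        """Poll one CAN frame and decode it into (range_m, azimuth_deg, speed_mps)."""
        msg = bus.recv(timeout=timeout)
        if msg is None or msg.arbitration_id not in RADAR_TARGET_IDS:
            return None
        # Hypothetical payload layout: three little-endian int16 values with fixed scaling.
        range_raw, azimuth_raw, speed_raw = struct.unpack_from("<hhh", msg.data)
        return range_raw * 0.01, azimuth_raw * 0.1, speed_raw * 0.01

    if __name__ == "__main__":
        bus = can.interface.Bus(channel="can0", interface="socketcan")
        target = read_radar_target(bus)
        if target is not None:
            print("range %.2f m, azimuth %.1f deg, speed %.2f m/s" % target)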
During the running process of the vehicle, the image detection device group 110 and the radar detection device group 120 are in an operating state, the image detection device group 110 can acquire the image information of the environment around the vehicle in real time, and the radar detection device group 120 can acquire the target detection information around the vehicle in real time. The control device 130 can acquire the environment image information of the image detection device group 110 and the target detection information of the radar detection device group 120 in real time through communication with the image detection device group 110 and the radar detection device group 120.
The control device 130 may fuse the environment image information, the target detection information, and the map information of the navigation map, such as road information and static and dynamic target information, and may establish an environment model of the vehicle based on the fused information. The process of establishing the environment model may be regarded as abstracting the actual physical space into an abstract space that an algorithm can process, so as to realize the mapping between the physical space and the abstract space. The environment model may include road information, static maps, map topology, traffic rules, static/dynamic targets, road route information, safe destinations, and the like. The environment of the vehicle can be accurately obtained through the environment model, for example the position of the vehicle in the road, whether there are obstacles around the vehicle, the distance between the vehicle and the obstacles at the current moment, and the predicted distance between the vehicle and the obstacles in the next preset time period. The control device 130 may perform the corresponding operations using an NVIDIA Xavier computing unit.
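To make the mapping into an "abstract space" more concrete, the following is a minimal Python sketch of what such a fused environment model could look like as a data structure; all class names, field names and types are illustrative assumptions rather than the patent's actual format.

    from dataclasses import dataclass, field
    from typing import Dict, List, Tuple

    @dataclass
    class TrackedTarget:
        position: Tuple[float, float]   # (x, y) in the vehicle frame, metres
        velocity: Tuple[float, float]   # (vx, vy) in metres per second
        is_static: bool                 # static obstacle vs. moving target

    @dataclass
    class EnvironmentModel:
        ego_lane_offset: float          # lateral offset of the ego vehicle in its lane
        lane_lines: List[List[Tuple[float, float]]] = field(default_factory=list)
        targets: List[TrackedTarget] = field(default_factory=list)
        traffic_rules: Dict[str, float] = field(default_factory=dict)  # e.g. speed limit from the map

        def nearest_obstacle_distance(self) -> float:
            """Distance to the closest fused target, as used by the path planner."""
            if not self.targets:
                return float("inf")
            return min((t.position[0] ** 2 + t.position[1] ** 2) ** 0.5 for t in self.targets)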
After the control device 130 establishes the environment model of the vehicle, path planning may further be performed based on the environment model. Specifically, the current position of the vehicle on the navigation map and the local obstacle distribution may be determined based on the environment model, and the optimal path from the current position point to the next target position point may be selected, which may include path search and path smoothing. According to the planned path, vehicle control instructions, such as instructions for acceleration, deceleration and steering, can be generated, and the travel of the vehicle can be controlled through the vehicle control instructions. Specifically, after the control device 130 generates a vehicle control instruction, the instruction may be sent to the corresponding vehicle body execution devices, for example to the vehicle control unit (VCU), the electric power steering (EPS) system and the brake, so as to control the acceleration, deceleration, steering and the like of the vehicle.
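One common way to turn a planned local path into a steering request is pure pursuit; the Python sketch below illustrates that idea under assumed wheelbase and look-ahead values. The patent does not specify a control law, so this is only an illustrative stand-in for the "generate a vehicle control instruction according to the planned path" step.

    import math
    from typing import List, Tuple

    WHEELBASE_M = 2.8   # assumed wheelbase
    LOOKAHEAD_M = 8.0   # assumed look-ahead distance

    def steering_from_path(path_xy: List[Tuple[float, float]]) -> float:
        """Return a front-wheel steering angle (rad) toward a look-ahead point.

        path_xy is the planned local path in the vehicle frame:
        x forward, y to the left, with the ego vehicle at the origin.
        """
        if not path_xy:
            return 0.0
        # Pick the first path point at least LOOKAHEAD_M ahead of the vehicle.
        target = next((p for p in path_xy if math.hypot(p[0], p[1]) >= LOOKAHEAD_M), path_xy[-1])
        ld = math.hypot(target[0], target[1])
        if ld < 1e-3:
            return 0.0
        # Classic pure-pursuit curvature: kappa = 2 * y / ld^2.
        curvature = 2.0 * target[1] / (ld * ld)
        return math.atan(WHEELBASE_M * curvature)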
For ease of understanding, the automatic driving control process of the vehicle is described below with reference to the example shown in FIG. 2. When the vehicle is started, the control device 130 may plan a target path according to a preset destination based on the positioning of the inertial navigation system and the map information of the navigation map; this target path may be regarded as a global path. After the vehicle starts, the control device controls the vehicle to travel along the target path, and road and target information may be obtained through the image detection device group 110 and the radar detection device group 120, including the environment image information around the vehicle obtained in real time through the image detection device group 110 and the target detection information around the vehicle obtained in real time through the radar detection device group 120; specifically, road characteristics, static targets, dynamic targets and the like can be obtained. The environment image information, the target detection information and the map information of the navigation map obtained in real time are fused, the environment of the vehicle is perceived in real time, and the environment model of the vehicle is established. The environment model may include road information, static maps, map topology, traffic rules, static/dynamic targets, road route information, safe destinations, and the like. Task planning, such as mode state, road nodes and routes, can be performed based on the environment model, and path planning can then be performed based on the environment model and the task plan; the planned path can be regarded as a local path. Path control commands can be generated according to the planned path, and after arbitration of the path control commands, the vehicle control instructions can be accurately generated to dynamically control the vehicle. Operations such as lane changing and reasonable acceleration and deceleration are realized by controlling vehicle body execution devices such as the steering, braking and power systems, so that the traffic efficiency is higher.
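As a compact illustration of this start-up and driving loop, the Python sketch below wires the perceive-fuse-plan-control steps together; every callable and the dummy wiring in the demo are placeholders invented here, not modules or APIs defined by the patent.

    from typing import Callable

    def autonomous_driving_loop(perceive: Callable, fuse: Callable, plan_local_path: Callable,
                                generate_and_send_commands: Callable,
                                reached_destination: Callable[[], bool]) -> None:
        while not reached_destination():
            images, targets, map_info = perceive()            # cameras, radars, navigation map
            env_model = fuse(images, targets, map_info)       # environment model of the vehicle
            local_path = plan_local_path(env_model)           # local path along the global route
            generate_and_send_commands(local_path, env_model) # arbitrated vehicle control instructions

    if __name__ == "__main__":
        # Dummy wiring that runs the loop exactly once, just to show the data flow.
        steps = iter([False, True])
        autonomous_driving_loop(
            perceive=lambda: ("images", "targets", "map"),
            fuse=lambda i, t, m: {"targets": []},
            plan_local_path=lambda env: [(0.0, 0.0), (10.0, 0.5)],
            generate_and_send_commands=lambda path, env: print("command for path", path),
            reached_destination=lambda: next(steps),
        )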
During the running process of the vehicle, the safe running of the vehicle can be further ensured through an operation condition monitor, a data recorder, a diagnosis device and the like.
By applying the system provided by the embodiments of the application, the image detection device group deployed on the vehicle acquires environment image information around the vehicle, the radar detection device group deployed on the vehicle acquires target detection information around the vehicle, and the control device deployed on the vehicle fuses the environment image information, the target detection information and the map information of the navigation map, establishes an environment model of the vehicle, performs path planning based on the environment model, generates vehicle control instructions according to the planned path, and controls the travel of the vehicle through the vehicle control instructions. Because the environment model of the vehicle can be accurately constructed from the environment image information, the target detection information and the map information of the navigation map, the path can be accurately planned based on the environment model, the travel of the vehicle can be effectively and automatically controlled according to the planned path, and stable and comfortable driving of the vehicle is ensured.
In one embodiment of the present application, the image detection device group 110 includes at least three forward-view image detection devices, four side-view image detection devices, four surround-view image detection devices, and one rear-view image detection device. With these at least twelve image detection devices reasonably deployed, the acquired environment image information can comprehensively cover the environment around the vehicle while the hardware cost is effectively controlled.
The three forward-view image detection devices may include a first forward-view image detection device, a second forward-view image detection device, and a third forward-view image detection device, all disposed at the front portion of the vehicle, such as at the front windshield, and the detection distance of the first forward-view image detection device < the detection distance of the second forward-view image detection device < the detection distance of the third forward-view image detection device. The first forward-view image detection device can be regarded as a short-range camera with a detection distance of about 60 meters, used for capturing the driving environment, such as obstacles and lane lines, within 60 meters in front of the vehicle; the second forward-view image detection device can be regarded as a medium-range camera with a detection distance of about 150 meters, used for capturing the driving environment, such as obstacles and lane lines, within 150 meters in front of the vehicle; and the third forward-view image detection device can be regarded as a long-range camera with a detection distance of about 250 meters, used for capturing obstacles within 250 meters in front of the vehicle. An obstacle may be another vehicle, an isolation facility on the road, or the like. A forward-view image detection device may be a camera, a video camera, or the like.
The four side-view image detection devices are respectively disposed at the front fenders and the rear-view mirrors of the vehicle. The side-view image detection devices disposed at the front fenders of the vehicle are used for capturing environment images in front of the sides of the vehicle, and the side-view image detection devices disposed at the rear-view mirrors of the vehicle are used for capturing environment images behind the sides of the vehicle. The horizontal field angle of each side-view image detection device is greater than 90°, for example it may be set to 100°, and the fields of view of the side-view image detection devices disposed on the same side of the vehicle overlap. The four side-view image detection devices can acquire the environment images on both sides of the vehicle and provide information for the control device 130 to control lane changes. The detection distance of a side-view image detection device can reach about 80 meters. A side-view image detection device may be a camera, a video camera, or the like.
The four surround-view image detection devices are respectively disposed at the front, rear, left, and right of the vehicle, and the environment images collected by the four surround-view image detection devices are spliced to form a 360° view. By splicing the environment images collected by the four surround-view image detection devices, an environment image covering a 360° range around the vehicle can be obtained, providing information for the control device 130 to control parking of the vehicle. The detection distance of a surround-view image detection device can reach about 20 meters. A surround-view image detection device may specifically be a fisheye camera.
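As an illustration of the splicing step, the following sketch warps four pre-undistorted surround-view images onto a common ground-plane canvas with OpenCV; the homographies and canvas size are assumed to come from an offline calibration that is not shown here, so this is only a minimal sketch of one possible stitching approach.

    import cv2  # pip install opencv-python
    import numpy as np

    CANVAS_SIZE = (800, 800)  # output width x height in pixels, assumed to cover roughly 20 m around the vehicle

    def stitch_surround_view(images, homographies):
        """Splice four undistorted camera images (front, rear, left, right) into one top view.

        homographies: list of 3x3 numpy arrays mapping each image onto the ground-plane canvas,
        assumed to come from an offline extrinsic calibration.
        """
        canvas = np.zeros((CANVAS_SIZE[1], CANVAS_SIZE[0], 3), dtype=np.uint8)
        for img, H in zip(images, homographies):
            warped = cv2.warpPerspective(img, H, CANVAS_SIZE)
            # Simple overwrite blending: copy warped pixels wherever they are non-black.
            mask = warped.any(axis=2)
            canvas[mask] = warped[mask]
        return canvas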
The one rear-view image detection device is disposed at the rear portion of the vehicle, such as at the rear windshield, for capturing the environment image behind the vehicle. By monitoring information behind the vehicle, the control device 130 can effectively prevent rear-end collisions. The detection distance of the rear-view image detection device can reach about 50 meters. The rear-view image detection device may be a camera, a video camera, or the like.
It should be noted that the specific deployment positions of the forward-view, side-view, surround-view, and rear-view image detection devices may be adjusted according to actual conditions, as long as the acquired environment image information can comprehensively cover the environment around the vehicle.
In one embodiment of the present application, the radar detection device group 120 includes at least four corner radars, which are respectively disposed at the corners of the vehicle. Target detection information on the front and rear sides of the vehicle can be acquired through the four corner radars, providing information for the control device 130 to control lane changes of the vehicle. A corner radar may specifically be a millimeter-wave radar, and its detection distance can reach 50 meters.
FIG. 3 shows a specific deployment example of the image detection devices included in the image detection device group 110 and the corner radars included in the radar detection device group 120 on the vehicle in the embodiment of the present application.
The first forward-view image detection device A1 has a field of view of 120° horizontal × 55° vertical, i.e. FOV 120°(H) × 55°(V), and a detection distance of up to 60 m; the second forward-view image detection device A2 has a field of view of 60°(H) × 33°(V) and a detection distance of up to 150 m; the third forward-view image detection device A3 has a field of view of 30°(H) × 16°(V) and a detection distance of up to 250 m.
The four side-view image detection devices B1, B2, B3 and B4 each have a field of view of 96°(H) × 49°(V) and a detection distance of up to 80 m.
The four surround-view image detection devices C1, C2, C3 and C4 each have a field of view of 216°(H) × 124°(V) and a detection distance of up to 20 m. The surround-view image detection device C1 is deployed close to the side-view image detection device B1, the surround-view image detection device C3 is deployed close to the side-view image detection device B3, and the devices C3 and B3 are therefore not distinguished in FIG. 3.
The rear-view image detection device D has a field of view of 96°(H) × 49°(V) and a detection distance of up to 50 m.
The four corner radars E1, E2, E3 and E4 each have a field of view of 165°(H) × 10°(V) and a detection distance of up to 50 m.
Through this arrangement, the twelve image detection devices and the four corner radars can perform full-coverage detection of the environment around the vehicle without blind areas.
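For reference, the deployment of FIG. 3 can also be summarized as configuration data; the structure below is an illustrative Python sketch, with the field-of-view and range values taken from the description above.

    # name: (horizontal FOV in degrees, vertical FOV in degrees, detection distance in metres)
    SENSOR_LAYOUT = {
        "front_near_A1": (120, 55, 60),
        "front_mid_A2": (60, 33, 150),
        "front_far_A3": (30, 16, 250),
        "side_B1_to_B4": (96, 49, 80),            # four side-view cameras share this specification
        "surround_C1_to_C4": (216, 124, 20),      # four fisheye surround-view cameras
        "rear_D": (96, 49, 50),
        "corner_radar_E1_to_E4": (165, 10, 50),   # four corner millimetre-wave radars
    }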
The control device 130 establishes the environment model of the vehicle by fusing the environment image information around the vehicle acquired by the image detection device group 110, the target detection information around the vehicle acquired by the radar detection device group 120, and the map information of the navigation map, so that the targets around the vehicle can be accurately identified; combining radar and visual perception ensures the stability and reliability of the vehicle automatic driving control system. As shown in FIG. 4, the six peripheral images show the twelve image detection devices monitoring the surroundings of the vehicle in real time, and the middle image shows the established environment model of the vehicle, from which the real-time road conditions around the vehicle and the relative relationship between the targets and the vehicle can be seen.
In one embodiment of the present application, the navigation map may be an ADAS (Advanced Driving Assistance System) navigation map. The control device 130 may implement the automatic-driving-related functions through meter-level positioning and the ADAS navigation map, giving the system high availability.
Of course, on some highway sections, the control device 130 may also implement the automatic-driving-related functions through decimeter-level high-precision positioning and a decimeter-level high-precision navigation map to improve control precision.
In one embodiment of the present application, the vehicle automatic driving control system may further include an interactive device for sending a user's interaction instructions to the control device 130 and for outputting and displaying the driving state of the vehicle.
The interactive device may be a device with a display screen for human-computer interaction. When the vehicle is started, a user can set a destination through the interactive device, which sends a corresponding interaction instruction to the control device 130; after receiving the destination-setting instruction, the control device 130 can plan a driving path according to the destination. The user can also turn the automatic driving mode on or off through the interactive device, which sends the corresponding interaction instruction to the control device 130; after receiving the instruction, the control device 130 turns the automatic driving mode on or off accordingly, so as to better meet the user's needs. As shown in FIG. 2, the destination, the automatic driving mode switch, the mode state and the like can be controlled through human-machine interaction. While controlling the travel of the vehicle, the control device 130 may output and display the driving state of the vehicle to the user through the interactive device, so that the user can learn the driving state of the vehicle in time.
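The exchange between the interactive device and the control device 130 could be realized, for example, with simple structured messages; the JSON message shapes and class name below are assumptions for illustration, since the patent does not define a protocol.

    import json

    class ControlDeviceHmiEndpoint:
        """Receives interaction instructions and reports the driving state back to the HMI."""

        def __init__(self):
            self.autopilot_on = False
            self.destination = None

        def handle_message(self, raw: str) -> None:
            msg = json.loads(raw)
            if msg.get("type") == "set_destination":
                self.destination = msg["destination"]      # would trigger global route planning
            elif msg.get("type") == "set_autopilot":
                self.autopilot_on = bool(msg["enabled"])   # switches the automatic driving mode

        def driving_state(self) -> str:
            """State string pushed back to the interactive device for display."""
            return json.dumps({"autopilot_on": self.autopilot_on, "destination": self.destination})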
FIG. 5 shows a schematic diagram of the overall structure of the vehicle automatic driving control system. The control device 130 is connected through LVDS interfaces to the three forward-view image detection devices, the four surround-view image detection devices, the four side-view image detection devices, and the one rear-view image detection device to obtain the environment image information around the vehicle, and is communicatively connected through the CAN bus to the four corner radars to obtain the target detection information around the vehicle. The navigation map is accessed through the Ethernet. After fusing the environment image information and the target detection information and establishing the environment model of the vehicle, the control device generates vehicle control instructions based on the environment model and the navigation map and sends them to the corresponding vehicle body execution devices through the vehicle body CAN bus to control the travel of the vehicle. The control device also communicates with the interactive device through the Ethernet to receive the user's interaction instructions and to output and display the driving state of the vehicle. This ensures the stability and reliability of the vehicle automatic driving control system, achieves higher traffic efficiency, and ensures stable and comfortable driving of the vehicle.
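On the actuation side, the dispatch of vehicle control instructions over the vehicle body CAN bus could look like the sketch below; the message IDs, scaling and signal layout are invented for illustration and would in practice be defined by the VCU, EPS and brake suppliers.

    import struct
    import can  # pip install python-can

    VCU_CMD_ID, EPS_CMD_ID, BRAKE_CMD_ID = 0x110, 0x111, 0x112  # hypothetical body-CAN message IDs

    def send_vehicle_command(bus: can.BusABC, accel_mps2: float, steer_deg: float) -> None:
        """Split one arbitrated planner output into separate actuator commands."""
        if accel_mps2 >= 0.0:
            # Positive requests go to the powertrain (VCU); the brake stays released.
            bus.send(can.Message(arbitration_id=VCU_CMD_ID,
                                 data=struct.pack("<f", accel_mps2), is_extended_id=False))
        else:
            # Negative requests become a brake deceleration command.
            bus.send(can.Message(arbitration_id=BRAKE_CMD_ID,
                                 data=struct.pack("<f", -accel_mps2), is_extended_id=False))
        # Steering-angle request to the electric power steering (EPS) system.
        bus.send(can.Message(arbitration_id=EPS_CMD_ID,
                             data=struct.pack("<f", steer_deg), is_extended_id=False))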
Corresponding to the above system embodiment, the embodiment of the present application further provides a vehicle, as shown in fig. 6, including the above vehicle automatic driving control system.
The embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same or similar parts among the embodiments are referred to each other.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The principle and the implementation of the present application are explained in the present application by using specific examples, and the above description of the embodiments is only used to help understanding the technical solution and the core idea of the present application. It should be noted that, for those skilled in the art, it is possible to make several improvements and modifications to the present application without departing from the principle of the present application, and such improvements and modifications also fall within the scope of the claims of the present application.

Claims (10)

1. A vehicle automatic driving control system comprising an image detection device group, a radar detection device group, and a control device disposed on a vehicle, wherein,
the image detection device group is used for acquiring environment image information around the vehicle;
the radar detection device group is used for acquiring target detection information around the vehicle;
the control device is used for performing information fusion on the environment image information, the target detection information and the map information of the navigation map, establishing an environment model of the vehicle, performing path planning based on the environment model, and generating a vehicle control instruction according to a planned path so as to control the vehicle to travel through the vehicle control instruction.
2. The vehicle automatic driving control system according to claim 1, characterized in that the image detection device group includes at least three forward-view image detection devices, four side-view image detection devices, four surround-view image detection devices, and one rear-view image detection device.
3. The vehicle automatic driving control system according to claim 2, wherein the three forward-view image detection devices include a first forward-view image detection device, a second forward-view image detection device, and a third forward-view image detection device, each of which is disposed in a front portion of the vehicle, and a detection distance of the first forward-view image detection device < a detection distance of the second forward-view image detection device < a detection distance of the third forward-view image detection device.
4. The vehicle automatic driving control system according to claim 2, wherein the four side-view image detection devices are respectively disposed at the front fenders and the rear-view mirrors of the vehicle, the side-view image detection devices disposed at the front fenders of the vehicle are configured to capture environment images in front of the sides of the vehicle, the side-view image detection devices disposed at the rear-view mirrors of the vehicle are configured to capture environment images behind the sides of the vehicle, each side-view image detection device has a horizontal field angle greater than 90°, and the fields of view of the side-view image detection devices disposed on the same side of the vehicle overlap.
5. The vehicle automatic driving control system according to claim 2, wherein the four surround-view image detection devices are respectively disposed at the front, rear, left, and right portions of the vehicle, and the environment images collected by the four surround-view image detection devices are spliced to form a 360° view.
6. The vehicle automatic driving control system of claim 2, wherein the one rear-view image detection device is disposed at a rear portion of the vehicle for capturing an environment image behind the vehicle.
7. The vehicle automatic driving control system of claim 1, wherein the radar detection device group includes at least four corner radars, the four corner radars being respectively disposed at corners of the vehicle.
8. The vehicle automatic driving control system of claim 1, wherein the navigation map is an Advanced Driving Assistance System (ADAS) navigation map.
9. The vehicle automatic driving control system according to any one of claims 1 to 8, characterized by further comprising an interactive device,
wherein the interactive device is configured to send a user's interaction instruction to the control device and to output and display the driving state of the vehicle.
10. A vehicle characterized by comprising the vehicle automatic driving control system according to any one of claims 1 to 9.
CN202210299054.7A 2022-03-25 2022-03-25 Vehicle automatic driving control system and vehicle Pending CN114715188A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210299054.7A CN114715188A (en) 2022-03-25 2022-03-25 Vehicle automatic driving control system and vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210299054.7A CN114715188A (en) 2022-03-25 2022-03-25 Vehicle automatic driving control system and vehicle

Publications (1)

Publication Number Publication Date
CN114715188A true CN114715188A (en) 2022-07-08

Family

ID=82239459

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210299054.7A Pending CN114715188A (en) 2022-03-25 2022-03-25 Vehicle automatic driving control system and vehicle

Country Status (1)

Country Link
CN (1) CN114715188A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116811914A (en) * 2023-06-29 2023-09-29 重庆亿连信息科技有限公司 Unmanned vehicle-mounted obstacle sensing system and method
TWI840021B (en) * 2022-12-15 2024-04-21 富智捷股份有限公司 Driving control method based on dynamic environmental information and related equipment



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination