CN113147748A - ADAS display method and system based on AR live-action navigation - Google Patents

ADAS display method and system based on AR live-action navigation

Info

Publication number
CN113147748A
Authority
CN
China
Prior art keywords
information
vehicle
image
live
adas
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110326005.3A
Other languages
Chinese (zh)
Inventor
刘卫东
李甜甜
王爱春
黄少堂
Current Assignee
Jiangling Motors Corp Ltd
Original Assignee
Jiangling Motors Corp Ltd
Priority date
Filing date
Publication date
Application filed by Jiangling Motors Corp Ltd filed Critical Jiangling Motors Corp Ltd
Priority to CN202110326005.3A
Publication of CN113147748A

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095 Predicting travel path or likelihood of collision
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/06 Road conditions
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146 Display means

Abstract

The invention provides an ADAS display method based on AR live-action navigation, which comprises the following steps: acquiring, in real time, a live-action image received by an AR navigation control module, wherein the live-action image comprises a road image in front of the current vehicle and a vehicle image; acquiring, in real time, road information and obstacle information received by an ADAS control module, and performing fusion processing and path planning on the road information and the obstacle information through the ADAS control module to output an ADAS control signal and corresponding driving assistance state information, wherein the road information comprises the lane lines on both sides of the current vehicle, and the obstacle information comprises the speed, distance and direction information of the vehicle ahead; and embedding the driving assistance state information into the live-action image and displaying it in real time. In this way, the driving assistance state information output by the ADAS control module can be displayed in the live-action image in real time, improving the accuracy of the displayed vehicle positions and the clarity of the state information, and eliminating potential safety hazards.

Description

ADAS display method and system based on AR live-action navigation
Technical Field
The invention relates to the technical field of automobile auxiliary driving, in particular to an ADAS display method and system based on AR live-action navigation.
Background
In recent years, with the rapid development of automobile manufacturing technology, advanced driver assistance systems (ADAS) have developed quickly, and their installation rate on automobiles keeps increasing.
ADAS is a technology that uses the various sensors installed on an automobile to sense the surrounding environment in real time while the automobile is driving, to identify and track static and dynamic objects and lane lines, and to perform systematic calculation and analysis in combination with the state of the automobile, so that the driver can perceive possible danger in advance and the system can apply a certain degree of lateral and longitudinal driving control to the automobile.
However, the information displayed by existing ADAS functions during driving is not clear or accurate enough, which poses certain potential safety hazards and is not conducive to large-scale adoption.
Disclosure of Invention
Based on this, the invention aims to provide an ADAS display method and system based on AR live-action navigation, so as to solve the problem that the information display of the ADAS function in the prior art is not clear and accurate enough in the vehicle driving process.
An ADAS display method based on AR live-action navigation comprises the following steps:
acquiring a live-action image received by an AR navigation control module in real time, wherein the live-action image comprises a road image in front of a current vehicle and a vehicle image;
acquiring road information and obstacle information received by an ADAS control module in real time, and performing fusion processing and path planning on the road information and the obstacle information through the ADAS control module to output an ADAS control signal and corresponding driving assistance state information, wherein the road information comprises lane lines on two sides of the current vehicle, and the obstacle information comprises the speed, the distance and the direction information of the vehicle ahead;
and embedding the driving assistance state information into the live-action image and displaying the driving assistance state information in real time.
The invention has the following beneficial effects: the live-action image received in real time by the AR navigation control module is acquired; the road information and obstacle information received by the ADAS control module are further acquired; and the ADAS control module performs fusion processing and path planning on them to output an ADAS control signal and the corresponding driving assistance state information. In use, the driving assistance state information is embedded into the live-action image and displayed in real time, so that the lane lines on both sides of the current vehicle and the speed, distance and direction information of the vehicle ahead obtained by the ADAS module can be displayed in the live-action image in real time. This ensures the accuracy of the positions of the vehicle ahead, the lane lines and the current vehicle; at the same time, the live-action display presents the driving assistance state information to the driver more intuitively and clearly, eliminating potential safety hazards and facilitating large-scale adoption.
Preferably, the step of acquiring the live-action image received by the AR navigation control module in real time includes:
the live-action image in front of the current vehicle, acquired in real time by a first acquisition device, is acquired and transmitted to the AR navigation control module, and the first acquisition device is arranged on a rearview mirror of a front windshield of the current vehicle.
Preferably, the step of acquiring the road information and the obstacle information received by the ADAS control module in real time includes:
acquiring the road image and the vehicle image in the live-action image, determining a target vehicle in front of the current vehicle according to the vehicle image, determining lane lines on two sides of the current vehicle according to the road image and generating corresponding road information;
acquiring obstacle information detected in real time by a second acquisition device according to the target vehicle, wherein the obstacle information comprises vehicle speed, vehicle distance and direction information, and the second acquisition device is arranged at the head of the current vehicle;
transmitting the road information and the obstacle information to the ADAS control module.
Preferably, the driving assistance state information includes adaptive cruise control information, lane assistance information, and safety assistance information;
the self-adaptive cruise control information comprises a cruise target and a following distance; the lane assistance information includes a steering assistance state and a vehicle departure state; the safety auxiliary information comprises collision early warning reminding information and system active braking reminding information.
Preferably, after the step of embedding the driving assistance state information in the live view image and displaying the driving assistance state information in real time, the method further includes:
and identifying a front vehicle closest to the current vehicle through the AR navigation control module, and displaying the following distance between the front vehicle and the current vehicle in the live-action image in real time.
Another objective of the present invention is to provide an ADAS display system based on AR live-action navigation, which includes:
the first acquisition module is used for acquiring a live-action image received by the AR navigation control module in real time, wherein the live-action image comprises a road image in front of the current vehicle and a vehicle image;
the second acquisition module is used for acquiring road information and obstacle information received by the ADAS control module in real time, and performing fusion processing and path planning on the road information and the obstacle information through the ADAS control module to output an ADAS control signal and corresponding driving assistance state information, wherein the road information comprises lane lines on two sides of the current vehicle, and the obstacle information comprises the speed, the distance and the direction information of the vehicle ahead;
and the embedding module is used for embedding the driving assistance state information into the live-action image and displaying the driving assistance state information in real time.
Preferably, in the ADAS display system based on AR live-action navigation, the first acquisition module is specifically configured to:
acquire the live-action image in front of the current vehicle, collected in real time by a first acquisition device, and transmit it to the AR navigation control module, wherein the first acquisition device is arranged on a rearview mirror of a front windshield of the current vehicle.
Preferably, in the ADAS display system based on AR live-action navigation, the second acquisition module is specifically configured to:
acquiring the road image and the vehicle image in the live-action image, determining a target vehicle in front of the current vehicle according to the vehicle image, determining lane lines on two sides of the current vehicle according to the road image and generating corresponding road information;
acquiring obstacle information detected in real time by a second acquisition device according to the target vehicle, wherein the obstacle information comprises vehicle speed, vehicle distance and direction information, and the second acquisition device is arranged at the head of the current vehicle;
transmitting the road information and the obstacle information to the ADAS control module.
Preferably, in the ADAS display system based on AR live-action navigation, the driving assistance state information includes adaptive cruise control information, lane assistance information, and safety assistance information.
Preferably, the ADAS display system based on AR live-action navigation is further configured to identify the vehicle ahead closest to the current vehicle through the AR navigation control module, and to display the following distance between that vehicle and the current vehicle in the live-action image in real time.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
Fig. 1 is a flowchart of an ADAS display method based on AR live-action navigation according to a first embodiment of the present invention;
fig. 2 is a flowchart of an ADAS display method based on AR live-action navigation according to a second embodiment of the present invention;
fig. 3 is a connection block diagram of an ADAS display system based on AR live-action navigation according to a third embodiment of the present invention;
fig. 4 is a block diagram of an ADAS display system based on AR live-action navigation according to a third embodiment of the present invention.
The following detailed description will further illustrate the invention in conjunction with the above-described figures.
Detailed Description
To facilitate an understanding of the invention, the invention will now be described more fully with reference to the accompanying drawings. Several embodiments of the invention are presented in the drawings. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
It will be understood that when an element is referred to as being "secured to" another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present. The terms "vertical," "horizontal," "left," "right," and the like as used herein are for illustrative purposes only.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
Referring to fig. 1, an ADAS display method based on AR live-action navigation in a first embodiment of the present invention is shown, which is suitable for a vehicle equipped with an AR navigation control module and an ADAS control module. Specifically:
the ADAS display method based on AR live-action navigation comprises the following steps:
step S10, acquiring a live-action image received by the AR navigation control module in real time, wherein the live-action image comprises a road image in front of the current vehicle and a vehicle image;
specifically, as will be understood by those skilled in the art, the AR navigation control module has an AR live-action navigation function, and when the driver starts the vehicle, the AR navigation control module and the ADAS control module may be started through the start switch.
In the driving process of the automobile, the AR navigation control module can receive the live-action image in front of the current automobile in real time and display the live-action image on a center console in the automobile in real time. The live-action image comprises a road image and a vehicle image in front of the current vehicle, so that a driver can observe the current position of the vehicle and the situation around the vehicle body in real time.
Step S20, acquiring road information and obstacle information received by an ADAS control module in real time, and performing fusion processing and path planning on the road information and the obstacle information through the ADAS control module to output an ADAS control signal and corresponding driving assistance state information, wherein the road information comprises lane lines on two sides of a current vehicle, and the obstacle information comprises the speed, the distance and the direction information of a front vehicle;
specifically, in this embodiment, it should be noted that, after a driver starts the AR navigation control module and the ADAS control module, the AR navigation control module and the ADAS control module can immediately receive information acquired by a current vehicle from the outside, where the ADAS control module can receive road information and obstacle information acquired by the current vehicle.
More specifically, the road information includes lane lines on both sides of the current vehicle, the obstacle information includes vehicle speed, vehicle distance, and direction information of the preceding vehicle, and the road information and the obstacle information are output as corresponding driving assistance state information by using the function of the ADAS control module.
Step S30, the driving assistance state information is embedded in the live view image and displayed in real time.
Specifically, in this embodiment, the AR navigation control module and the ADAS control module are electrically connected, and when the ADAS control module outputs driving assistance state information, that information can be embedded into the acquired live-action image and displayed in real time along with it, so that the driving assistance state information is presented to the driver more intuitively and clearly and potential safety hazards are eliminated.
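As an illustration only (this sketch is not part of the patent), step S30 — embedding the driving assistance state information into the live-action image for real-time display — can be modelled as attaching overlay labels to each frame; all type and field names here are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class AssistState:
    """Driving assistance state as output by the ADAS control module (step S20)."""
    following_distance_m: float   # distance to the cruise target vehicle
    lane_departure: bool          # unintended lane-departure warning active
    collision_warning: bool       # forward-collision warning active

@dataclass
class ArFrame:
    """One live-action frame plus the overlays embedded into it (step S30)."""
    timestamp_ms: int
    overlays: list = field(default_factory=list)

def embed_assist_state(frame: ArFrame, state: AssistState) -> ArFrame:
    """Embed the assistance state into the live-action frame for display."""
    frame.overlays.append(f"following distance: {state.following_distance_m:.0f} m")
    if state.lane_departure:
        frame.overlays.append("LANE DEPARTURE")    # drawn as a flashing red lane line
    if state.collision_warning:
        frame.overlays.append("COLLISION WARNING") # drawn between the two vehicles
    return frame
```

In a real system the overlays would be rendered into the video stream on the center-console display; they are kept as plain labels here only to show the data flow.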
In use, the live-action image received in real time by the AR navigation control module is acquired, and the road information and obstacle information received by the ADAS control module are further acquired; the ADAS control module outputs the road information and the obstacle information as the corresponding driving assistance state information, which is then embedded into the live-action image and displayed in real time. In this way, the lane lines on both sides of the current vehicle and the speed, distance and direction information of the vehicle ahead acquired by the ADAS module can be displayed in the live-action image in real time, ensuring the accuracy of the positions of the vehicle ahead, the lane lines and the current vehicle. At the same time, the real-time live-action display presents the driving assistance state information to the driver more intuitively and clearly, eliminating potential safety hazards and facilitating large-scale adoption.
It should be noted that the above implementation process only illustrates one application of the present application; it does not mean that the ADAS display method based on AR live-action navigation of the present application has only one implementation flow. On the contrary, any workable implementation of the method can be incorporated into the feasible embodiments of the present application.
In summary, the ADAS display method based on AR live-action navigation in the embodiments of the present invention can present the driving assistance status information to the driver more intuitively and clearly, so as to eliminate the potential safety hazard and facilitate wide-scale popularization and use.
Referring to fig. 2, an ADAS display method based on AR live-action navigation in a second embodiment of the present invention is shown, including the following steps:
step S11, acquiring a live-action image in front of the current vehicle acquired by a first acquisition device in real time and transmitting the live-action image to the AR navigation control module, wherein the first acquisition device is arranged on a rearview mirror of a front windshield of the current vehicle;
specifically, in this embodiment, the first collecting device is configured as a camera, and when the first collecting device is installed, the camera is disposed on a rearview mirror of a front windshield of a current vehicle.
In the running process of the vehicle, the first acquisition device can shoot a real-scene image in front of the current vehicle in real time, the real-scene image comprises a road image and a vehicle image in front of the current vehicle, and the road image and the vehicle image are immediately transmitted to the AR navigation control module.
Step S21, acquiring the road image and the vehicle image in the live-action image, determining a target vehicle in front of the current vehicle according to the vehicle image, determining lane lines on two sides of the current vehicle according to the road image and generating corresponding road information;
specifically, when the AR navigation control module receives the road image and the vehicle image, it immediately identifies a target vehicle in front of the current vehicle according to the vehicle image, and determines lane lines on both sides of the current vehicle according to the road image and generates corresponding road information. More specifically, the target vehicle is the vehicle closest to the current vehicle, and the AR navigation control module displays the target vehicle on a display screen of the center console in real time for the driver to observe. Secondly, the road information comprises lane lines on two sides of the current vehicle, and the change of the lane lines is displayed on a display screen in real time, so that a driver can know the conditions on two sides of the vehicle.
Step S31, acquiring obstacle information detected by a second acquisition device in real time according to the target vehicle, wherein the obstacle information comprises vehicle speed, vehicle distance and direction information, and the second acquisition device is arranged at the head of the current vehicle;
specifically, in this embodiment, the second collection system sets up to the millimeter wave radar, and during the installation, this radar setting is on the top of current vehicle locomotive, and during the use, this radar can detect the speed of a motor vehicle, the vehicle distance and the azimuth information of the place ahead target vehicle.
Step S41, transmitting the road information and the obstacle information to the ADAS control module.
Specifically, the second acquisition device may transmit information of the target vehicle detected in real time to the ADAS control module, and the ADAS control module may output the road information and the obstacle information as corresponding driving assistance state information.
More specifically, the driving assistance state information includes adaptive cruise control information, lane assistance information, and safety assistance information, in which:
the lane auxiliary information includes a steering auxiliary state, a vehicle deviation state and the like, and specifically, the AR navigation control module acquires a live-action image shot by the camera, and determines two lane lines of a lane where the vehicle is located according to lane line information shot by the camera in front. When the AR navigation control module receives an alarm signal sent by the ADAS control module and indicating that the driver is unconsciously deviated from the lane, the lane line on the corresponding side of the lane in the AR live-action navigation is defined as red and continuously flickers to remind the driver of safety. When the AR navigation control module receives the information sent by the ADAS control module and indicates that the ADAS system is controlling the electronic power-assisted steering system to provide steering assistance for the driver, two lane lines defining the lane in the AR live-action image are blue to indicate the driver system to provide steering torque
The safety auxiliary information comprises collision early warning reminding information, system active braking reminding information and the like, specifically, the AR navigation control module acquires live-action images shot by the camera, and when collision early warning reminding signals sent by the ADAS control module are received, collision early warning images are added between a vehicle and a front vehicle in the live-action images for warning and reminding; when system active braking prompt information sent by the ADAS control module is received, a system active braking image is added between the vehicle and the front vehicle in the live-action image to prompt a driver.
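Similarly, the choice of safety-assistance warning image to insert between the vehicle and the vehicle ahead can be sketched as follows (illustrative only; the image names are placeholders, not from the patent):

```python
def warning_images(collision_warning: bool, active_braking: bool) -> list:
    """Select the warning images to add between the ego vehicle and the vehicle
    ahead in the live-action image, from the ADAS reminder signals."""
    images = []
    if collision_warning:
        images.append("collision-warning image")
    if active_braking:
        images.append("active-braking image")
    return images
```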
The self-adaptive cruise control information, the lane auxiliary information and the safety auxiliary information can be simultaneously fused into the AR live-action image and finally displayed through the display module, wherein the display module is arranged as a vehicle-mounted display screen on the center console.
Step S51, embedding the driving assistance state information in the live-action image and displaying it in real time;
this step is the same as step 30 provided in the first embodiment, and is not described again here.
And step S61, identifying a front vehicle closest to the current vehicle through the AR navigation control module, and displaying the following distance between the front vehicle and the current vehicle in the live-action image in real time.
Specifically, the AR navigation control module acquires the live-action image shot by the camera, identifies the target vehicle closest to the vehicle in the live-action image, and adds the actual distance between the vehicle and the cruise target to the live-action image according to the received adaptive cruise control state information. Meanwhile, the following distance (distance divided by vehicle speed) between the vehicle and the target vehicle varies; it generally has three gears, and the target vehicle is rendered in a different color in the live-action image according to the following distance: when the following distance is safe enough, the target vehicle is rendered blue, indicating that it is the cruise target vehicle; when the following distance is short, the target vehicle is rendered yellow to remind the driver to control the distance between the vehicles; when the distance is too close, the target vehicle is rendered red to remind the driver to take over in time.
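The three-gear colouring by following distance (distance divided by vehicle speed, i.e. time headway) can be sketched as follows. The patent does not give the gear thresholds, so `safe_s` and `close_s` are illustrative values only:

```python
def target_vehicle_colour(distance_m: float, speed_mps: float,
                          safe_s: float = 2.5, close_s: float = 1.5) -> str:
    """Colour the cruise-target vehicle by time headway (distance / speed)."""
    headway = distance_m / speed_mps if speed_mps > 0 else float("inf")
    if headway >= safe_s:
        return "blue"    # safe following distance: shown as the cruise target
    if headway >= close_s:
        return "yellow"  # distance getting short: mind the gap
    return "red"         # too close: driver should take over in time
```

A stationary ego vehicle yields infinite headway and therefore the safe colour in this sketch; a production system would handle that case explicitly.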
It should be noted that the method provided in the second embodiment of the present invention is implemented on the same principle and produces the same technical effects as the first embodiment; for brevity, where this embodiment does not mention a point, reference may be made to the corresponding content in the first embodiment.
In summary, the ADAS display method based on AR live-action navigation in the embodiments of the present invention can present the driving assistance status information to the driver more intuitively and clearly, so as to eliminate the potential safety hazard and facilitate wide-scale popularization and use.
A third embodiment of the present invention provides an ADAS display system based on AR live-action navigation, including:
the first acquisition module 12 is configured to acquire a live-action image received by the AR navigation control module in real time, where the live-action image includes a road image in front of a current vehicle and a vehicle image;
the second acquisition module 22 is configured to acquire road information and obstacle information received by the ADAS control module in real time, perform fusion processing and path planning on the road information and the obstacle information through the ADAS control module, and output an ADAS control signal and corresponding driving assistance state information, where the road information includes lane lines on both sides of a current vehicle, and the obstacle information includes vehicle speed, vehicle distance, and direction information of a vehicle ahead;
and an embedding module 32, configured to embed the driving assistance state information into the live-action image and perform real-time display.
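A minimal sketch of how the two acquisition modules and the embedding module might hand data to one another; the data shapes and field names are assumptions for illustration, not structures defined by the patent:

```python
# Illustrative pipeline sketch (all names are assumptions):
# the second acquisition module fuses road and obstacle information into a
# driving-assistance state, and the embedding module overlays that state
# onto the live-action image as text labels.
def fuse(road_info: dict, obstacle_info: dict) -> dict:
    """Combine lane-line and obstacle data into an assistance state."""
    return {
        "cruise_distance_m": obstacle_info["distance_m"],
        "lead_speed_mps": obstacle_info["speed_mps"],
        "in_lane": road_info["left_line"] and road_info["right_line"],
    }

def embed(frame_labels: list, state: dict) -> list:
    """Append assistance labels to those already drawn on the live image."""
    labels = list(frame_labels)
    labels.append(f"gap {state['cruise_distance_m']:.0f} m")
    if not state["in_lane"]:
        labels.append("lane departure")
    return labels
```

In a real system the labels would be rasterized onto the camera frame; lists of strings stand in for that rendering step here.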
Further, the first acquisition module 12 is specifically configured to:
acquire the live-action image in front of the current vehicle captured in real time by a first acquisition device and transmit it to the AR navigation control module, wherein the first acquisition device is arranged on the rearview mirror of the front windshield of the current vehicle.
Further, the second acquisition module 22 is specifically configured to:
acquiring the road image and the vehicle image in the live-action image, determining a target vehicle in front of the current vehicle according to the vehicle image, determining lane lines on two sides of the current vehicle according to the road image and generating corresponding road information;
acquiring, in real time, obstacle information of the target vehicle detected by a second acquisition device, wherein the obstacle information comprises vehicle speed, vehicle distance and direction information, and the second acquisition device is arranged at the head of the current vehicle;
transmitting the road information and the obstacle information to the ADAS control module.
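The flow above (determine the target vehicle from the vehicle image, attach the obstacle measurement detected for it, and hand the road and obstacle information to the ADAS control module) can be sketched as follows; the detection and radar data formats are assumptions:

```python
# Hedged sketch of the second acquisition module's data flow.
# lane_lines: (left_px, right_px) image positions of the two lane lines;
# vehicle_boxes: {vehicle_id: horizontal center in pixels};
# radar_readings: {vehicle_id: (speed_mps, distance_m, bearing_deg)}.
def build_sensor_packet(lane_lines, vehicle_boxes, radar_readings):
    left, right = lane_lines
    lane_center = (left + right) / 2.0
    # Target vehicle: the detection closest to the ego lane center.
    target_id = min(vehicle_boxes,
                    key=lambda i: abs(vehicle_boxes[i] - lane_center))
    speed, distance, bearing = radar_readings[target_id]
    road_info = {"left_line": left, "right_line": right}
    obstacle_info = {"target": target_id, "speed_mps": speed,
                     "distance_m": distance, "bearing_deg": bearing}
    # Both packets would then be transmitted to the ADAS control module.
    return road_info, obstacle_info
```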
Further, in the system, the driving assistance state information includes adaptive cruise control information, lane assistance information, and safety assistance information.
Further, the system identifies, through the AR navigation control module, the front vehicle closest to the current vehicle, and displays the following distance between the front vehicle and the current vehicle in the live-action image in real time.
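A short sketch of selecting the front vehicle closest to the current vehicle and formatting the following-distance label shown in the live-action image; the detection tuple format is an assumption:

```python
# Hypothetical helpers: pick the nearest vehicle ahead and build its label.
def nearest_front_vehicle(detections):
    """detections: list of (vehicle_id, longitudinal_distance_m) tuples;
    negative distances denote vehicles behind the current vehicle."""
    ahead = [d for d in detections if d[1] > 0]  # keep vehicles in front only
    if not ahead:
        return None
    return min(ahead, key=lambda d: d[1])        # smallest gap wins

def distance_label(detections):
    """Text overlay for the live-action image, or None if no vehicle ahead."""
    v = nearest_front_vehicle(detections)
    return None if v is None else f"{v[0]}: {v[1]:.1f} m"
```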
In summary, the ADAS display system based on AR live-action navigation in the embodiments of the present invention can present the driving assistance state information to the driver more intuitively and clearly, thereby eliminating potential safety hazards and facilitating wide-scale popularization and use.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The above-mentioned embodiments merely express several embodiments of the present invention, and the descriptions thereof are relatively specific and detailed, but they should not therefore be construed as limiting the scope of the present invention. It should be noted that, for a person of ordinary skill in the art, several variations and improvements can be made without departing from the inventive concept, and these all fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An ADAS display method based on AR live-action navigation is characterized by comprising the following steps:
acquiring a live-action image received by an AR navigation control module in real time, wherein the live-action image comprises a road image in front of a current vehicle and a vehicle image;
acquiring road information and obstacle information received by an ADAS control module in real time, and performing fusion processing and path planning on the road information and the obstacle information through the ADAS control module to output an ADAS control signal and corresponding driving auxiliary state information, wherein the road information comprises lane lines on two sides of a current vehicle, and the obstacle information comprises the speed, the distance and the direction information of a front vehicle;
and embedding the driving assistance state information into the live-action image and displaying the driving assistance state information in real time.
2. The ADAS display method based on AR live-action navigation according to claim 1, wherein the step of acquiring the live-action image received by the AR navigation control module in real time comprises:
acquiring the live-action image in front of the current vehicle captured by a first acquisition device in real time and transmitting it to the AR navigation control module, wherein the first acquisition device is arranged on the rearview mirror of the front windshield of the current vehicle.
3. The ADAS display method based on AR live-action navigation according to claim 1, wherein: the step of acquiring the road information and the obstacle information received by the ADAS control module in real time comprises the following steps:
acquiring the road image and the vehicle image in the live-action image, determining a target vehicle in front of the current vehicle according to the vehicle image, determining lane lines on two sides of the current vehicle according to the road image and generating corresponding road information;
acquiring, in real time, obstacle information of the target vehicle detected by a second acquisition device, wherein the obstacle information comprises vehicle speed, vehicle distance and direction information, and the second acquisition device is arranged at the head of the current vehicle;
transmitting the road information and the obstacle information to the ADAS control module.
4. The ADAS display method based on AR live-action navigation according to claim 1, wherein:
the driving assistance state information includes adaptive cruise control information, lane assistance information, and safety assistance information;
the self-adaptive cruise control information comprises a cruise target and a following distance; the lane assistance information includes a steering assistance state and a vehicle departure state; the safety auxiliary information comprises collision early warning reminding information and system active braking reminding information.
5. The ADAS display method based on AR live-action navigation according to claim 1, wherein: after the step of embedding the driving assistance state information in the live-action image and displaying the driving assistance state information in real time, the method further includes:
and identifying a front vehicle closest to the current vehicle through the AR navigation control module, and displaying the following distance between the front vehicle and the current vehicle in the live-action image in real time.
6. An ADAS display system based on AR live-action navigation, comprising:
a first acquisition module, configured to acquire a live-action image received by the AR navigation control module in real time, wherein the live-action image comprises a road image in front of a current vehicle and a vehicle image;
a second acquisition module, configured to acquire road information and obstacle information received by the ADAS control module in real time, and to perform fusion processing and path planning on the road information and the obstacle information through the ADAS control module to output an ADAS control signal and corresponding driving assistance state information, wherein the road information comprises lane lines on two sides of the current vehicle, and the obstacle information comprises the speed, distance and direction information of a front vehicle;
and the embedding module is used for embedding the driving assistance state information into the live-action image and displaying the driving assistance state information in real time.
7. The ADAS display system based on AR live-action navigation according to claim 6, wherein the first acquisition module is specifically configured to:
acquire the live-action image in front of the current vehicle captured by a first acquisition device in real time and transmit it to the AR navigation control module, wherein the first acquisition device is arranged on the rearview mirror of the front windshield of the current vehicle.
8. The ADAS display system based on AR live-action navigation according to claim 6, wherein the second acquisition module is specifically configured to:
acquiring the road image and the vehicle image in the live-action image, determining a target vehicle in front of the current vehicle according to the vehicle image, determining lane lines on two sides of the current vehicle according to the road image and generating corresponding road information;
acquire, in real time, obstacle information of the target vehicle detected by a second acquisition device, wherein the obstacle information comprises vehicle speed, vehicle distance and direction information, and the second acquisition device is arranged at the head of the current vehicle;
transmitting the road information and the obstacle information to the ADAS control module.
9. The ADAS display system based on AR live-action navigation according to claim 6, wherein the driving assistance state information comprises adaptive cruise control information, lane assistance information, and safety assistance information.
10. The ADAS display system based on AR live-action navigation according to claim 6, wherein the system is further configured to identify, through the AR navigation control module, a front vehicle closest to the current vehicle, and to display the following distance between the front vehicle and the current vehicle in the live-action image in real time.
CN202110326005.3A 2021-03-26 2021-03-26 ADAS display method and system based on AR live-action navigation Pending CN113147748A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110326005.3A CN113147748A (en) 2021-03-26 2021-03-26 ADAS display method and system based on AR live-action navigation


Publications (1)

Publication Number Publication Date
CN113147748A true CN113147748A (en) 2021-07-23

Family

ID=76885116

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110326005.3A Pending CN113147748A (en) 2021-03-26 2021-03-26 ADAS display method and system based on AR live-action navigation

Country Status (1)

Country Link
CN (1) CN113147748A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024001846A1 (en) * 2022-06-28 2024-01-04 深圳市中兴微电子技术有限公司 Information projection method and apparatus, and vehicle control system

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170017203A (en) * 2015-08-05 2017-02-15 엘지전자 주식회사 Driver Assistance Apparatus and Vehicle Having The Same
CN107521411A (en) * 2017-07-18 2017-12-29 吉林大学 A kind of track level navigation augmented reality device for aiding in driver
CN107985310A (en) * 2017-11-17 2018-05-04 浙江吉利汽车研究院有限公司 A kind of adaptive cruise method and system
CN108204822A (en) * 2017-12-19 2018-06-26 武汉极目智能技术有限公司 A kind of vehicle AR navigation system and method based on ADAS
CN109584596A (en) * 2018-12-20 2019-04-05 奇瑞汽车股份有限公司 Vehicle drive reminding method and device
CN110271417A (en) * 2019-06-21 2019-09-24 延锋伟世通电子科技(南京)有限公司 Full liquid crystal instrument system based on ADAS and AR technology
CN110386152A (en) * 2019-06-17 2019-10-29 江铃汽车股份有限公司 The human-computer interaction display control method and system driven based on L2 grades of intelligence navigators
CN110608751A (en) * 2019-08-14 2019-12-24 广汽蔚来新能源汽车科技有限公司 Driving navigation method and device, vehicle-mounted computer equipment and storage medium
CN111348055A (en) * 2018-12-20 2020-06-30 阿里巴巴集团控股有限公司 Driving assistance method, driving assistance system, computing device, and storage medium
US20200255026A1 (en) * 2017-09-18 2020-08-13 Telefonaktiebolaget Lm Ericsson (Publ) System and method for providing precise driving recommendations based on network-assisted scanning of a surrounding environment
CN111959507A (en) * 2020-07-06 2020-11-20 江铃汽车股份有限公司 Lane changing control method and system, readable storage medium and vehicle



Similar Documents

Publication Publication Date Title
US10449971B2 (en) Travel control device
CN102610125B (en) Method for operating a driver assistance system on a motor vehicle outputting a recommendation related to an overtaking manoeuvre and motor vehicle
US7920070B2 (en) Parking guidance device and method thereof
EP2480443B1 (en) Driver assistance system for a vehicle, vehicle with a driver assistance system, and method for aiding a driver when operating a vehicle
US7378947B2 (en) Device and method for the active monitoring of the safety perimeter of a motor vehicle
US10235887B2 (en) Control system and method for assisting motor vehicles in safely pulling in after overtaking
US20100329510A1 (en) Method and device for displaying the surroundings of a vehicle
US20070063874A1 (en) Method and device for determining the position and/or the anticipated position of a vehicle during a parking operation in relation to the oncoming lane of a multi-lane roadway
US20120221236A1 (en) Collision Monitoring for a Motor Vehicle
US10928511B2 (en) Synchronous short range radars for automatic trailer detection
CN110920607A (en) Method and apparatus for facilitating remotely controlled vehicle maneuvering and pedestrian detection
EP3785996A1 (en) Integrated alarm system for vehicles
US20140297107A1 (en) Parking assistance system and method
CN112977428B (en) Parking assist system
EP2859543B1 (en) Warning system
CN112009398A (en) Vehicle control method and device, vehicle and storage medium
CN105774651A (en) Vehicle lane change auxiliary system
US11763681B2 (en) ECU and lane departure warning system
CN114228716A (en) Driving auxiliary lane changing method and system, readable storage medium and vehicle
CN113147748A (en) ADAS display method and system based on AR live-action navigation
CN204354915U (en) A kind of vehicle lane change ancillary system
CN210363587U (en) Auxiliary driving system
CN115527395B (en) Intelligent traffic safety identification device for auxiliary judgment
US11794809B1 (en) Vehicle and trailer wheel path collision detection and alert
CN114523905A (en) System and method for displaying detection and track prediction of targets around vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210723