CN112298040A - Auxiliary driving method based on transparent A column - Google Patents


Info

Publication number
CN112298040A
CN112298040A (application CN202011032979.2A)
Authority
CN
China
Prior art keywords
target object
pillar
lane
distance measurement
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011032979.2A
Other languages
Chinese (zh)
Inventor
邢斌
凌赟
申水文
方运舟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Hozon New Energy Automobile Co Ltd
Original Assignee
Zhejiang Hozon New Energy Automobile Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Hozon New Energy Automobile Co Ltd filed Critical Zhejiang Hozon New Energy Automobile Co Ltd
Priority to CN202011032979.2A priority Critical patent/CN112298040A/en
Priority to PCT/CN2020/121761 priority patent/WO2022062000A1/en
Publication of CN112298040A publication Critical patent/CN112298040A/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/20 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used
    • B60R2300/202 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used displaying a blind spot scene on the vehicle part responsible for the blind spot
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/301 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing combining image information with other obstacle sensor information, e.g. using RADAR/LIDAR/SONAR sensors for estimating risk of collision
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/802 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring and displaying vehicle exterior blind spot views

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

A driving assistance method based on a transparent A-pillar, belonging to the field of driving safety. The method is applied to a control device and comprises the following steps: step S01, receiving an image in front of the vehicle collected by a front-view camera; step S02, judging the spatial position of the target object based on the image in front of the vehicle; step S03, performing distance measurement based on the spatial position of the target object: if the target object is located in the lane, distance measurement is performed using the target object images collected by the two A-pillar cameras; if the target object is located on the left side of the lane, distance measurement is performed using the target object images collected by the left A-pillar camera and the front-view camera; if the target object is located on the right side of the lane, distance measurement is performed using the target object images collected by the right A-pillar camera and the front-view camera; and step S04, performing a driving assistance judgment based on the distance measurement result and displaying the judgment result on the A-pillar flexible display screen. The invention achieves high binocular/multi-view distance measurement accuracy.

Description

Auxiliary driving method based on transparent A column
The invention belongs to the technical field of driving safety, and particularly relates to an auxiliary driving method based on a transparent A column.
Background
The conventional driving assistance system generally has the following implementation modes:
1. a forward millimeter wave radar (77 GHz) is fused with the intelligent camera, wherein the forward millimeter wave radar is used as a sensor for distance detection;
2. a single vision system (forward-looking intelligent camera) is used as a distance detection sensor;
3. a multi-view vision system (binocular or multi-camera); for example, Tesla mainly measures obstacles in different distance ranges with 3 cameras of different focal lengths.
All three solutions described above have certain limitations:
1. Although the forward millimeter wave radar meets the requirement of high ranging accuracy, it costs more than a camera. In addition, because the angular resolution of millimeter wave radar is low, objects at different heights and angles ahead (such as a metal plate on the ground versus a metal plate in the air) are hard to distinguish, so false triggering occurs easily and the reliability of the system is reduced;
2. Although the single vision system has a great cost advantage, its ranging accuracy is low and cannot support functions with high control-accuracy requirements such as ACC (Adaptive Cruise Control) and AEB (Automatic Emergency Braking);
3. The multi-view vision system requires several cameras to be arranged: on one hand, if the cameras are placed too close together, the accuracy requirement of binocular ranging cannot be met; on the other hand, if they are placed too far apart, installation becomes difficult, several brackets have to be mounted on the inner side of the front windshield, and the appearance and even the driver's field of view are affected.
The utility model patent CN204641550U discloses a vehicle vision blind area display system and a vehicle. The system comprises a flexible display device, a first camera device and a processor; the first camera device is arranged outside the cab near the lower end of the vehicle's A-pillar and shoots images of the vehicle's surroundings; the processor receives and processes the images of the vehicle's surroundings shot by the first camera device, extracts the image of the environment outside the vehicle, and transmits it to the flexible display device for display. Because that system relies on only the two A-pillar camera devices to acquire and judge images in all directions in front of the vehicle, enough effective pixels within the viewing angle cannot be guaranteed, the target cannot be identified accurately, and the ranging accuracy differs greatly between vehicles driving directly ahead, ahead on the left, and ahead on the right, so accurate driving assistance prompts cannot be given.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a driving assistance method based on a transparent A-pillar, which achieves higher binocular/multi-view distance measurement accuracy.
The invention is realized by the following technical scheme:
a driving assistance method based on a transparent A column is realized based on a driving system comprising a forward-looking camera, two A column cameras, an A column flexible display screen and a control device, and the method is applied to the control device; the method comprises the following steps:
step S01, receiving the image in front of the vehicle collected by the front-view camera;
step S02, judging the spatial position of the target object based on the image in front of the vehicle;
step S03, performing distance measurement based on the spatial position of the target object:
if the target object is located in the lane, distance measurement is performed using the target object images collected by the two A-pillar cameras;
if the target object is located on the left side of the lane, distance measurement is performed using the target object images collected by the left A-pillar camera and the front-view camera;
if the target object is located on the right side of the lane, distance measurement is performed using the target object images collected by the right A-pillar camera and the front-view camera;
and step S04, performing driving assistance judgment based on the distance measurement result, and displaying the driving assistance judgment result on the A-pillar flexible display screen.
In the invention, 3 cameras collect images in front of the vehicle, and their combined viewing angles cover the entire space in front of the vehicle. The cameras on the two sides of the vehicle are arranged at the A-pillars, so the spacing between the cameras is large enough to meet the ranging accuracy requirement while keeping the front windshield arrangement clean and attractive. The invention judges the spatial position of the target object and selects the appropriate cameras for accurate distance measurement, which provides an accurate basis for the driving assistance judgment.
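Purely as an illustration of the camera-pairing rule described above (the function and camera names below are hypothetical, not taken from the patent), the selection performed in step S03 could be sketched as:

```python
def select_ranging_cameras(position: str):
    """Select which pair of cameras supplies the images for distance measurement (step S03).

    position: "in_lane", "left_of_lane" or "right_of_lane", as judged in step S02.
    Returns a pair of camera identifiers.
    """
    if position == "in_lane":
        # Target ahead in the lane: use the two A-pillar cameras (widest baseline).
        return ("left_a_pillar", "right_a_pillar")
    if position == "left_of_lane":
        # Target ahead on the left: left A-pillar camera plus the front-view camera.
        return ("left_a_pillar", "front_view")
    # Target ahead on the right: right A-pillar camera plus the front-view camera.
    return ("right_a_pillar", "front_view")
```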
Preferably, the two A-pillar cameras are respectively arranged at the top ends of the A-pillars at the two sides.
Preferably, the two A-pillar cameras are respectively arranged on the exterior rearview mirrors on the A-pillar sides.
Preferably, the two A-pillar cameras and the front-view camera are all narrow-view cameras.
Preferably, the step S02 specifically includes: based on the picture area in the image in front of the vehicle collected by the front-view camera, the spatial position of the target object is judged: if the target object is located in the middle area of the picture, the target object is located in the lane; if the target object is located in the left area of the picture, the target object is located on the left side of the lane; and if the target object is positioned in the right area of the picture, the target object is positioned on the right side of the lane.
Preferably, the step S02 specifically includes: judging the spatial position between the target objects based on the lane lines in the images in front of the vehicle collected by the front-view camera: if the target object is located between the two lane lines, the target object is located in the lane; if the target object is positioned on the left side of the left lane line, the target object is positioned on the left side of the lane; and if the target object is positioned on the right side of the right lane line, the target object is positioned on the right side of the lane.
Preferably, the distance measurement in step S03 is implemented by a binocular camera distance measurement method.
Preferably, the step S03 further includes: performing distance measurement by a multi-view ranging method based on the target object images collected by the two A-pillar cameras and the front-view camera.
Preferably, the step S04 includes: performing a forward collision warning judgment based on the distance measurement result, the running speed of the vehicle, and the speed of the target object, and sending a collision warning signal and displaying it on the A-pillar flexible display screen when it is judged that the ratio of the relative distance between the target object and the vehicle to their relative speed does not exceed a safety threshold.
Preferably, the step S04 further includes: judging lane departure based on the distance measurement result, and sending a lane departure signal and displaying it on the A-pillar flexible display screen when it is judged that the relative position of the vehicle's wheels and the lane line has deviated.
The invention has the following beneficial effects:
Compared with traditional driving assistance methods, the driving assistance method based on the transparent A-pillar has the following advantages:
1. the installation positions offered by the transparent A-pillar are fully exploited, achieving higher binocular/multi-view ranging accuracy;
2. the transparent A-pillar displays the road conditions ahead, the working state of the intelligent driving system, and related setting information in real time;
3. alarm prompts can be given to the driver more clearly, bringing a better user experience and enhancing the technological feel of the whole vehicle;
4. the driver does not need to look down at the instrument cluster or central control screen for alarm information and can keep observing the surrounding road traffic head-up, which improves driving safety.
Drawings
FIG. 1 is a flow chart of a driving assistance method based on a transparent A-pillar according to the present invention;
FIG. 2a is a rear view of a first embodiment in which two A-pillar cameras and a front-view camera are mounted on a vehicle;
FIG. 2b is a top view of the first embodiment in which two A-pillar cameras and a front-view camera are mounted on a vehicle;
FIG. 3a is a rear view of a second embodiment in which two A-pillar cameras and a front-view camera are mounted on a vehicle;
FIG. 3b is a top view of the second embodiment in which two A-pillar cameras and a front-view camera are mounted on a vehicle;
FIG. 4 is a specific example of binocular ranging, where P is the target object and L and R are the two cameras.
Detailed Description
The following are specific embodiments of the present invention and are further described with reference to the drawings, but the present invention is not limited to these embodiments.
The driving assistance method based on the transparent A-pillar is realized based on a driving system. The driving system comprises a front-view camera, two A-pillar cameras, an A-pillar flexible display screen and a control device. The front-view camera is arranged at the middle of the vehicle, for example centrally above the front windshield.
In one embodiment, the two A-pillar cameras are respectively arranged at the top ends of the A-pillars on the two sides (see FIGS. 2a and 2b). In another embodiment, the two A-pillar cameras are respectively arranged on the exterior rearview mirrors on the A-pillar sides, in particular on exterior mirrors mounted above the height of the vehicle hood (see FIGS. 3a and 3b). The A-pillar flexible screen is an OLED screen arranged on the A-pillar so that the whole A-pillar is in a transparent display state: the image of the driver's blind area created by the A-pillar during driving is projected onto the screen on the in-cabin side of the A-pillar, improving driving safety. The control device is a driving control device that receives the camera information and sends the information to be displayed on the display screen. Both mounting modes ensure sufficient spacing between the cameras to meet the ranging accuracy requirement while keeping the front windshield arrangement clean and attractive.
The two A-pillar cameras and the front-view camera are all narrow-view cameras, with a viewing angle of about 50°. The images from the three cameras together cover the entire space in front of the vehicle. Choosing cameras with a narrow viewing angle guarantees enough effective pixels within the viewing angle, which in turn guarantees the accuracy of target identification.
Referring to fig. 1, the method of the present invention is applied to a control device; the method comprises the following steps:
step S01, receiving the image in front of the vehicle collected by the front-view camera;
step S02, judging the spatial position of the target object based on the image in front of the vehicle;
step S03, performing distance measurement based on the spatial position of the target object:
if the target object is located in the lane, distance measurement is performed using the target object images collected by the two A-pillar cameras;
if the target object is located on the left side of the lane, distance measurement is performed using the target object images collected by the left A-pillar camera and the front-view camera;
if the target object is located on the right side of the lane, distance measurement is performed using the target object images collected by the right A-pillar camera and the front-view camera;
and step S04, performing driving assistance judgment based on the distance measurement result, and displaying the driving assistance judgment result on the A-pillar flexible display screen.
The step S02 specifically includes: judging the spatial position between the target objects based on the lane lines in the images in front of the vehicle collected by the front-view camera: if the target object is located between the two lane lines, the target object is located in the lane; if the target object is positioned on the left side of the left lane line, the target object is positioned on the left side of the lane; and if the target object is positioned on the right side of the right lane line, the target object is positioned on the right side of the lane.
When no lane line exists in the image received in step S01, or the lane line is blurred, another embodiment of step S02 includes: judging the spatial position of the target object based on the picture region of the image in front of the vehicle collected by the front-view camera: if the target object is located in the middle region of the picture, the target object is located in the lane; if the target object is located in the left region of the picture, the target object is located on the left side of the lane; and if the target object is located in the right region of the picture, the target object is located on the right side of the lane.
Both embodiments may also be provided together, with the image in front of the vehicle received in step S01 checked before the spatial position of the target object is judged. For example, step S01 further includes: identifying whether the image in front of the vehicle collected by the front-view camera contains a lane line and whether that lane line can be recognized; if so, the spatial position of the target object is identified using the lane line; if the image does not contain a lane line or the lane line cannot be recognized, the spatial position of the target object is identified using the picture-region position.
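A minimal sketch of this spatial-position judgment, assuming the target's horizontal pixel coordinate and any detected lane-line positions have already been extracted from the front-view image (the function names, the one-third picture split, and the data layout are illustrative assumptions, not taken from the patent):

```python
from enum import Enum
from typing import Optional, Tuple


class LanePosition(Enum):
    IN_LANE = "in_lane"
    LEFT_OF_LANE = "left_of_lane"
    RIGHT_OF_LANE = "right_of_lane"


def classify_target_position(target_x: float, image_width: int,
                             lane_lines: Optional[Tuple[float, float]] = None) -> LanePosition:
    """Judge the spatial position of the target object (step S02).

    target_x    : horizontal pixel coordinate of the target in the front-view image
    image_width : width of the front-view image in pixels
    lane_lines  : (left_line_x, right_line_x) at the target's image row, or None
                  when no lane line is present or it cannot be recognized
    """
    if lane_lines is not None:
        # Preferred embodiment: judge against the detected lane lines.
        left_x, right_x = lane_lines
        if target_x < left_x:
            return LanePosition.LEFT_OF_LANE
        if target_x > right_x:
            return LanePosition.RIGHT_OF_LANE
        return LanePosition.IN_LANE

    # Fallback embodiment: judge by picture region when no usable lane line exists.
    # Splitting the picture into equal thirds is an illustrative choice.
    if target_x < image_width / 3:
        return LanePosition.LEFT_OF_LANE
    if target_x > 2 * image_width / 3:
        return LanePosition.RIGHT_OF_LANE
    return LanePosition.IN_LANE
```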
When distance measurement is performed in step S03, the images collected by the different cameras are combined into an image covering the view in front of the vehicle, and distance measurement is then performed. FIG. 4 illustrates the binocular camera ranging method. The distance (depth) z of the target object P from the cameras is calculated from the quantities in the figure, which gives the relative distance between the target object and the host vehicle. Specifically:
According to similar triangles:
(b − (x_l − x_r)) / b = (z − f) / z    (1)
Solving equation (1) for z:
z − f = z · (b − (x_l − x_r)) / b
z = f · b / (x_l − x_r) = f · b / d
where f is the focal length of the cameras and b is the baseline between the left and right cameras; both f and b can be obtained from prior information or by camera calibration. The disparity d = x_l − x_r is the difference between the horizontal coordinates of the left camera pixel point (x_l, y_l) and the right camera pixel point (x_r, y_r).
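As a minimal numeric sketch of this calculation (the calibration values and pixel coordinates below are illustrative, not from the patent):

```python
def binocular_depth(f_pixels: float, baseline_m: float, x_left: float, x_right: float) -> float:
    """Depth z of target P from disparity, z = f * b / d (see FIG. 4).

    f_pixels  : focal length expressed in pixels (from camera calibration)
    baseline_m: baseline b between the two camera optical centers, in meters
    x_left    : horizontal pixel coordinate x_l of P in the left image
    x_right   : horizontal pixel coordinate x_r of P in the right image
    """
    disparity = x_left - x_right  # d = x_l - x_r
    if disparity <= 0:
        raise ValueError("non-positive disparity: target at infinity or mismatched points")
    return f_pixels * baseline_m / disparity


# Example: f = 1400 px, A-pillar baseline b = 1.4 m, disparity d = 49 px
# gives z = 1400 * 1.4 / 49, i.e. about 40 m to the target.
print(binocular_depth(1400.0, 1.4, 620.0, 571.0))
```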
Step S03 further includes: performing distance measurement by a multi-view ranging method based on the target object images collected by the two A-pillar cameras and the front-view camera. If the target object is judged to be located in the lane, multi-view ranging can be adopted: the images collected by the different cameras are combined into an image covering the view in front of the vehicle, and distance measurement is then performed.
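The patent does not spell out how the three views are combined; purely as an illustrative assumption (the function names, data layout, and median fusion below are not from the patent), the pairwise binocular estimates could be fused as follows:

```python
from itertools import combinations
from statistics import median


def multiview_depth(observations: dict, focal_pixels: float, baselines: dict) -> float:
    """Fuse pairwise binocular estimates from the three cameras (illustrative only).

    observations: {"left_a_pillar": x, "front_view": x, "right_a_pillar": x},
                  horizontal pixel coordinate of the target in each camera image
    focal_pixels: common focal length in pixels (assumes identical narrow-view cameras)
    baselines   : {frozenset({"left_a_pillar", "front_view"}): b_lf, ...},
                  baseline in meters for each camera pair, from calibration
    """
    estimates = []
    for cam_a, cam_b in combinations(observations, 2):
        disparity = abs(observations[cam_a] - observations[cam_b])
        if disparity > 0:
            b = baselines[frozenset((cam_a, cam_b))]
            estimates.append(focal_pixels * b / disparity)
    # The median of the pairwise estimates is robust to one poor pair.
    return median(estimates)
```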
Step S04 includes: performing a forward collision warning judgment based on the distance measurement result, the running speed of the host vehicle, and the speed of the target object, and sending a collision warning signal and displaying it on the A-pillar flexible display screen when it is judged that the ratio of the relative distance between the target object and the host vehicle to their relative speed does not exceed a safety threshold. The distance measurement result is the relative distance between the target object and the host vehicle measured in step S03. The running speed of the host vehicle and the speed of the target object are acquired by the control device. The running speed of the host vehicle, the speed of the target object, and the distance between them can be shown in real time on the A-pillar flexible display screen. When a collision risk is detected, a collision warning signal, such as alarm information, is displayed on the A-pillar flexible display screen, and the control device simultaneously commands emergency braking of the vehicle. Compared with the traditional approach of displaying alarm information on the instrument panel or the central control screen, the display mode of the invention does not require the driver to look down at the alarm information; the line of sight always stays ahead, so safety is high.
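A minimal sketch of this forward collision warning check, reading the distance-to-relative-speed ratio as a time to collision; the 2.5 s threshold is an assumed value, not from the patent:

```python
def forward_collision_warning(distance_m: float, ego_speed_mps: float,
                              target_speed_mps: float, ttc_threshold_s: float = 2.5) -> bool:
    """Return True when a collision warning should be issued (step S04).

    distance_m      : relative distance to the target object from step S03
    ego_speed_mps   : running speed of the host vehicle
    target_speed_mps: speed of the target object
    ttc_threshold_s : safety threshold on the distance/relative-speed ratio
                      (2.5 s is an illustrative value, not from the patent)
    """
    closing_speed = ego_speed_mps - target_speed_mps
    if closing_speed <= 0:
        return False  # not closing in on the target, no warning
    time_to_collision = distance_m / closing_speed
    return time_to_collision <= ttc_threshold_s


# Example: 40 m gap, host at 25 m/s, lead vehicle at 5 m/s -> TTC = 2.0 s -> warn.
assert forward_collision_warning(40.0, 25.0, 5.0) is True
```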
Step S04 further includes: judging lane departure based on the distance measurement result, and sending a lane departure signal and displaying it on the A-pillar flexible display screen when it is judged that the relative position of the host vehicle's wheels and the lane line has deviated. Here the distance measurement result is the relative position of the host vehicle's wheels and the lane line measured in step S03. When lane departure occurs, lane departure warning information is displayed on the A-pillar flexible display screen, and the control device may also trigger steering wheel vibration, an audible alarm, and the like.
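And a hedged sketch of the lane departure judgment, assuming the lateral distance from each front wheel to its nearest lane line is available from the step S03 measurements (the zero margin is an illustrative default):

```python
def lane_departure_warning(left_wheel_to_line_m: float,
                           right_wheel_to_line_m: float,
                           margin_m: float = 0.0) -> bool:
    """Return True when the vehicle is drifting over a lane line (step S04).

    left_wheel_to_line_m : lateral distance from the left front wheel to the left lane
                           line (negative once the wheel has crossed the line)
    right_wheel_to_line_m: same for the right front wheel and the right lane line
    margin_m             : extra margin before warning; 0.0 warns exactly at the line
    """
    return (left_wheel_to_line_m <= margin_m) or (right_wheel_to_line_m <= margin_m)
```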
Besides the information related to distance measurement, the A-pillar flexible display screen can also display the state of the warning and driving assistance functions in real time, such as traffic sign recognition, traffic light recognition, and a lead-vehicle start prompt; it can also display the traffic state around the vehicle in real time, including the host vehicle's speed and acceleration/deceleration state and the state of the vehicle ahead (speed, distance, etc.). The control device can configure the driving assistance functions, such as switching functions on and off, function sensitivity, the adaptive cruise following time gap, and the maximum set speed, based on the distance measurement result.
It will be appreciated by persons skilled in the art that the embodiments of the invention described above and shown in the drawings are given by way of example only and are not limiting of the invention. The objects of the present invention have been fully and effectively accomplished. The functional and structural principles of the present invention have been shown and described in the examples, and any variations or modifications of the embodiments of the present invention may be made without departing from the principles.

Claims (10)

1. A driving assistance method based on a transparent A-pillar, characterized in that it is realized based on a driving system comprising a front-view camera, two A-pillar cameras, an A-pillar flexible display screen and a control device, and the method is applied to the control device; the method comprises the following steps:
step S01, receiving the image in front of the vehicle collected by the front-view camera;
step S02, judging the spatial position of the target object based on the image in front of the vehicle;
step S03, performing distance measurement based on the spatial position of the target object:
if the target object is located in the lane, distance measurement is performed using the target object images collected by the two A-pillar cameras;
if the target object is located on the left side of the lane, distance measurement is performed using the target object images collected by the left A-pillar camera and the front-view camera;
if the target object is located on the right side of the lane, distance measurement is performed using the target object images collected by the right A-pillar camera and the front-view camera;
and step S04, performing driving assistance judgment based on the distance measurement result, and displaying the driving assistance judgment result on the A-pillar flexible display screen.
2. The driving assistance method according to claim 1, wherein the two A-pillar cameras are respectively arranged at the top ends of the A-pillars at two sides.
3. The driving assistance method based on the transparent A-pillar as claimed in claim 1, wherein the two A-pillar cameras are respectively disposed on the external rearview mirror at the side of the A-pillar.
4. The driving assistance method based on the transparent A-pillar as claimed in claim 1, wherein the two A-pillar cameras and the front-view camera both use narrow-view cameras.
5. The driving assistance method based on the transparent a-pillar as claimed in claim 1, wherein the step S02 specifically comprises: based on the picture area in the image in front of the vehicle collected by the front-view camera, the spatial position of the target object is judged: if the target object is located in the middle area of the picture, the target object is located in the lane; if the target object is located in the left area of the picture, the target object is located on the left side of the lane; and if the target object is positioned in the right area of the picture, the target object is positioned on the right side of the lane.
6. The driving assistance method based on the transparent A-pillar as claimed in claim 1, wherein the step S02 specifically comprises: judging the spatial position of the target object based on the lane lines in the image in front of the vehicle collected by the front-view camera: if the target object is located between the two lane lines, the target object is located in the lane; if the target object is located on the left side of the left lane line, the target object is located on the left side of the lane; and if the target object is located on the right side of the right lane line, the target object is located on the right side of the lane.
7. The driving assistance method based on the transparent A-pillar as claimed in claim 1, wherein the distance estimation of the step S03 is implemented by a binocular camera ranging method.
8. The driving assistance method based on the transparent A-pillar as claimed in claim 1, wherein the step S03 further comprises: performing distance measurement by a multi-view ranging method based on the target object images collected by the two A-pillar cameras and the front-view camera.
9. The driving assistance method based on the transparent A-pillar as claimed in claim 1, wherein the step S04 comprises: performing a forward collision warning judgment based on the distance measurement result, the running speed of the vehicle, and the speed of the target object, and sending a collision warning signal and displaying it on the A-pillar flexible display screen when it is judged that the ratio of the relative distance between the target object and the vehicle to their relative speed does not exceed a safety threshold.
10. The driving assistance method based on the transparent A-pillar as claimed in claim 9, wherein the step S04 further comprises: judging lane departure based on the distance measurement result, and sending a lane departure signal and displaying it on the A-pillar flexible display screen when it is judged that the relative position of the vehicle's wheels and the lane line has deviated.
CN202011032979.2A 2020-09-27 2020-09-27 Auxiliary driving method based on transparent A column Pending CN112298040A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011032979.2A CN112298040A (en) 2020-09-27 2020-09-27 Auxiliary driving method based on transparent A column
PCT/CN2020/121761 WO2022062000A1 (en) 2020-09-27 2020-10-19 Driver assistance method based on transparent a-pillar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011032979.2A CN112298040A (en) 2020-09-27 2020-09-27 Auxiliary driving method based on transparent A column

Publications (1)

Publication Number Publication Date
CN112298040A true CN112298040A (en) 2021-02-02

Family

ID=74488729

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011032979.2A Pending CN112298040A (en) 2020-09-27 2020-09-27 Auxiliary driving method based on transparent A column

Country Status (2)

Country Link
CN (1) CN112298040A (en)
WO (1) WO2022062000A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114655222A (en) * 2022-04-18 2022-06-24 重庆长安汽车股份有限公司 Method and system for displaying target vehicle in real time
TWI777646B (en) * 2021-07-01 2022-09-11 新煒科技有限公司 System, method and vehicle for vheicle warning
CN115626159A (en) * 2021-07-01 2023-01-20 信扬科技(佛山)有限公司 Vehicle warning system and method and automobile

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115139910B (en) * 2022-09-05 2023-01-17 奥提赞光晶(山东)显示科技有限公司 Method for eliminating automobile A column blind area
CN115303295A (en) * 2022-09-06 2022-11-08 中国重汽集团济南动力有限公司 Lane departure early warning method and device, electronic equipment and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101866051A (en) * 2009-04-02 2010-10-20 通用汽车环球科技运作公司 Infotainment on the full-windscreen head-up display shows
CN105620489A (en) * 2015-12-23 2016-06-01 深圳佑驾创新科技有限公司 Driving assistance system and real-time warning and prompting method for vehicle
US20180196133A1 (en) * 2016-07-29 2018-07-12 Faraday&Future Inc. Method and apparatus for detection and ranging fault detection and recovery
CN108860045A (en) * 2018-06-28 2018-11-23 深圳奥尼电子股份有限公司 Driving support method, driving support device, and storage medium
CN108928297A (en) * 2018-06-11 2018-12-04 信利光电股份有限公司 A kind of vehicle assistant drive method and apparatus
CN108973861A (en) * 2018-07-24 2018-12-11 浙江合众新能源汽车有限公司 A kind of intelligence A column driving safety system
CN109080630A (en) * 2018-08-07 2018-12-25 安徽工程大学 A kind of context aware systems and its control method for vehicle
JP2019125920A (en) * 2018-01-17 2019-07-25 株式会社ジャパンディスプレイ Monitor display system and display method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7786898B2 (en) * 2006-05-31 2010-08-31 Mobileye Technologies Ltd. Fusion of far infrared and visible images in enhanced obstacle detection in automotive applications
CN104786933A (en) * 2015-03-04 2015-07-22 江苏大学 Panoramic image driving auxiliary device and panoramic image driving auxiliary method
CN106585623B (en) * 2016-12-21 2023-12-01 驭势科技(北京)有限公司 Detection system for detecting objects around vehicle and application thereof
JP2019046069A (en) * 2017-08-31 2019-03-22 株式会社デンソーテン Driving support device and driving support method
CN109278641A (en) * 2018-05-30 2019-01-29 惠州市德赛西威汽车电子股份有限公司 A kind of high-definition intelligent devices and methods therefor for eliminating vehicle blind zone based on camera
CN211468310U (en) * 2019-12-17 2020-09-11 上汽通用汽车有限公司 Vehicle display system and vehicle

Also Published As

Publication number Publication date
WO2022062000A1 (en) 2022-03-31

Similar Documents

Publication Publication Date Title
CN112298040A (en) Auxiliary driving method based on transparent A column
CN110228416B (en) Early warning system and method based on driver turning visual blind area detection
EP1961613B1 (en) Driving support method and driving support device
CN106573577B (en) Display system and method
EP2955915B1 (en) Around view provision apparatus and vehicle including the same
US8199975B2 (en) System and method for side vision detection of obstacles for vehicles
US8044781B2 (en) System and method for displaying a 3D vehicle surrounding with adjustable point of view including a distance sensor
US7652686B2 (en) Device for image detecting objects, people or similar in the area surrounding a vehicle
EP2045132B1 (en) Driving support device, driving support method, and computer program
EP2579231A1 (en) Image processing apparatus for vehicle
US20170140542A1 (en) Vehicular image processing apparatus and vehicular image processing system
US20110050886A1 (en) System and method for providing guidance information to a driver of a vehicle
US20190100145A1 (en) Three-dimensional image driving assistance device
US20110169957A1 (en) Vehicle Image Processing Method
US10919450B2 (en) Image display device
CN104802710B (en) A kind of intelligent automobile reversing aid system and householder method
JP2008222153A (en) Merging support device
CN108791062B (en) Dynamic information system and method of operation
US20190135342A1 (en) Parking assist device
CN105793909B (en) The method and apparatus for generating warning for two images acquired by video camera by vehicle-periphery
US9849835B2 (en) Operating a head-up display of a vehicle and image determining system for the head-up display
US10836311B2 (en) Information-presenting device
JPH0717328A (en) Circumference recognition auxiliary device for vehicle
JP2000251198A (en) Periphery monitor device for vehicle
KR102473404B1 (en) Apparatus for providing top view

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210202