CA3145746A1 - Advanced display and control system for driving assistance - Google Patents
- Publication number
- CA3145746A1
- Authority
- CA
- Canada
- Prior art keywords
- car
- processing unit
- driver
- camera
- view
- Prior art date
- Legal status
- Abandoned
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/27—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/30—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles providing vision in the non-visible spectrum, e.g. night or infrared vision
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/60—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/60—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
- B60R2300/607—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective from a bird's eye viewpoint
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
Abstract
A new display and control system for driving a car is provided. Presenting a real-time "Third-person" view of the car, similar to the perspective used in video games, lets the driver recognize the road condition more easily and comfortably. Controlling the car with a gamepad or other input device frees the driver from staying in a fixed position in the car. An additional "snap" mode is introduced to avoid collisions when driving on a narrow street.
Description
Advanced display and control system for driving assistance
BACKGROUND
A "Third-person" view is a computer video game terminology refers to a graphical perspective rendered from a fixed distance behind and slightly above the player character. In this invention the "Third-person" view means a virtual camera capture image from a fixed distance and slightly above the car.
Human beings are still superior to computers when it comes to recognizing and classifying objects, so presenting a "Third-person" view to the driver in a more meaningful way makes driving easier and more comfortable. In most cars, the driver is confined to a fixed position, which makes it almost impossible to change position while driving. The steering wheel, brake, and gas pedal are mechanical control devices that are difficult for beginners to learn to coordinate, and it takes a long time for beginners to master driving a car safely.
This invention aims to make this process easier: it not only provides a new way to display and control the car, but also makes driving as easy as playing a video game.
SUMMARY
By presenting a "Third-person" view to the driver during driving which is similar to the third person view in video game could help the driver watch the road condition more intuitively and comfortably.
This invention provides two basic methods to achieve this goal.
One method is to install a pole on the roof of the car with a camera or cameras at its end to capture the road condition, and present this view to the driver in real time. The other method is to install cameras at the sides or top of the car, without poles sticking high above the roof, and use an image processing unit to generate a virtual third-person view in real time, which is then presented on the display terminal.
When driving at night, or in rain, snow, or fog, it is hard for the driver to recognize objects or cars on the road. This invention therefore introduces an enhanced version of the road condition view for such conditions, and can also remove outdoor scene shadows when driving on a sunny day.
Adding a wireframe from a high-precision 3D map to the view helps the driver identify the road more clearly. The image processing unit also generates an estimated depth of the view; combined with the high-precision 3D map, this provides correct depth occlusion of other cars and objects on the road.
Conventional cars usually have a steering wheel, brake, and gas pedal, which are mechanical control devices that take a lot of time for a human to learn to coordinate. Even a proficient driver cannot guarantee correct operation all the time. This invention instead introduces new control devices such as a gamepad, tablet, mobile phone, or other similar custom control device, which
A "Third-person" view is a computer video game terminology refers to a graphical perspective rendered from a fixed distance behind and slightly above the player character. In this invention the "Third-person" view means a virtual camera capture image from a fixed distance and slightly above the car.
Human beings are more superior than computer when it come to recognize and classify objects. So represent a "Third-Person" view during driving to the driver in a more meaningful way will make the driving more easy and comfortable. In most cars,the driver is limited to a fixed position which make it almost impossible to change position of the driver during driving. Steering wheel,brake,gas pedal are mechanical control devices,which make it difficult for beginners to learn how to coordinate it. And it will cost many time for beginners to master how to drive a car safely.
This invention is trying to make this process more easier,which not only give a new way to display and control the car,but also make it as easy as playing a video game.
SUMMARY
By presenting a "Third-person" view to the driver during driving which is similar to the third person view in video game could help the driver watch the road condition more intuitively and comfortably.
This invention basically provide two methods to archive this goal.
One method is set a pole with camera or cameras at the end of it on the roof of the car to capture the road condition and then presenting this view to the driver at realtime. The other method is set cameras at the sides or top of the car without poles stick high above the roof,using image processing unit to generate a virtual third person view at realtime and then presenting this view on the display terminal.
When driving at night or raining or snowing or fog days,it is hard for the driver to recognize objects or cars on the road. This invention also introduced an enhanced vision of the road condition view to remove the outdoor scene shadow when driving at sunny day.
By adding an wireframe from a high precision 3D map to the view could help the driver to identity the road more clearly. The image processing unit also generate estimated depth of the view,using the high precision 3d map could provide correct depth occlusion of other cars and objects on the road.
Conventional cars usually have steering wheel,brake and gas pedal which are mechanical control device,which need a lot of time for human to learn how to coordinate with it. Even a proficient driver couldn't guarantee to operate it correctly all the time. While this invention introduce new control device such as game pad,tablet,mobile phone or other similar custom control device,which
will make it easy to learn. Driving a car thus becomes as easy as playing a video game.
The display terminal can be a tablet, VR headset, mobile phone, or other similar custom display device. Replacing direct human sight with such a terminal liberates the driver, who can stay at any place in the car, whereas in a conventional car the driver is limited to a fixed position. The display terminal and control device also make it easier for another person to take over the car in an emergency situation.
According to one embodiment disclosed herein, a display system for driving assistance is provided. The display system includes a pole with a camera or cameras at its end installed on the roof of the car, an image processing unit, and a display terminal to present a "Third-person" view of the road condition to the driver.
According to one embodiment disclosed herein, a display system for driving assistance is provided. The display system includes cameras installed at the sides and top of the car, without poles sticking high above the roof, an image processing unit, and a display terminal to present a "Third-person" view of the road condition to the driver.
According to one embodiment disclosed herein, a control system for driving assistance is provided. This control system includes an interactive input device and an action processing unit to operate the car.
According to one embodiment disclosed herein, a car with a display system and a control system for driving assistance is provided. The display system includes a pole with a camera or cameras at its end installed on the roof of the car, an image processing unit, and a display terminal to present a "Third-person" view of the road condition to the driver. The control system includes an interactive input device and an action processing unit to operate the car.
According to one embodiment disclosed herein, a car with a display system and a control system for driving assistance is provided. The display system includes cameras installed at the sides and top of the car, without poles sticking high above the roof, an image processing unit, and a display terminal to present a "Third-person" view of the road condition to the driver. The control system includes an interactive input device and an action processing unit to operate the car.
According to one embodiment disclosed herein,a special "snap" mode is include in the action processing unit which will prevent the car collide with other objects when driving on a narrow street.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG.1 is a side view of a car with a pole installed on top of the roof and cameras set at the end of the pole, in accordance with an embodiment of the present technology.
FIG.2 is a side view of the car with cameras installed at the sides of the car, in accordance with an embodiment of the present technology.
FIG.3 is a real-time "Third-person" view generated by the image processing unit on a sunny day, in accordance with an embodiment of the present technology.
FIG.4 is a real-time "Third-person" view generated by the image processing unit at night or in rain, snow, or fog, in accordance with an embodiment of the present technology.
FIG.5 shows examples of the display terminal (a tablet and a VR headset) and of the input device (a gamepad), in accordance with an embodiment of the present technology.
DETAILED DESCRIPTION
Although computer vision has made much progress these days, human beings still have an advantage, especially in object recognition and classification, which is important for the driver watching the road condition while driving. Presenting a special "Third-person" view of the car gives the driver a more meaningful representation of the road condition and lets the driver drive in a more intuitive way.
One method to achieve this goal is to install a pole on the roof of the car with a camera or cameras at its end.
FIG.1 is a side view of a car with a pole 102 installed on the roof of the car and cameras 101 at the end of the pole 102.
An image processing unit generates a "Third-person" view of the car in real time from the images captured by the cameras 101.
Since the cameras 101 might shake during driving, the image processing unit also contains an image stabilizer to register images from frame to frame and remove possible motion blur.
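The frame-to-frame registration step can be sketched with phase correlation, a standard translation-estimation technique. This is an illustrative sketch only, not the patent's actual stabilizer; the function names are hypothetical.

```python
import numpy as np

def estimate_shift(prev: np.ndarray, curr: np.ndarray) -> tuple:
    """Estimate the integer (dy, dx) translation between two grayscale
    frames using phase correlation."""
    cross = np.fft.fft2(prev) * np.conj(np.fft.fft2(curr))
    cross /= np.abs(cross) + 1e-9            # keep phase information only
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = prev.shape
    # unwrap shifts larger than half the frame size to signed values
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

def stabilize(prev: np.ndarray, curr: np.ndarray) -> np.ndarray:
    """Shift the current frame so it registers with the previous one."""
    dy, dx = estimate_shift(prev, curr)
    return np.roll(curr, (dy, dx), axis=(0, 1))
```

A production stabilizer would also handle rotation and sub-pixel motion; this sketch covers pure translation only.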
The pole 102 folds automatically when traveling in a tunnel or where the height of the car is limited. During this period, the image processing unit generates a convertible mode of the view for the driver to operate the car: a computer vision algorithm generates a virtual camera located at the front of the car, just as if the driver were driving a convertible car.
FIG.2 shows another method to achieve the same goal. This method uses only camera 201 and camera 202, without poles sticking high above the roof.
An image processing unit will generate a "Third-person" view of the car at realtime by using computer vision algorithm.
This computer vision algorithm is the technical core of this invention. Most automakers, such as Tesla, use a simple symbolic model to represent the car, while this invention generates a "Third-person" view synthesized from real images, just like special effects in sci-fi movies. The roadside scene and the cars in the "Third-person" view are generated from real images. The virtual view generated by the image processing unit looks just like a real camera capturing the whole scene in real time. All of this depends on an advanced computer vision algorithm.
The image processing unit generate such "Third-person" view of the car at realtime and sent it to the display terminal to present it to the driver. The display terminal could be a mobile phone,tablet,VR-Headset or other similar display device.
When driving on a sunny day, the image processing unit can also remove outdoor scene shadows when generating the "Third-person" view.
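The patent does not specify a shadow-removal algorithm. As one illustrative possibility only, a homomorphic-style normalization can estimate slowly varying illumination with a large blur and divide it out, which flattens hard shadow boundaries (function names are hypothetical):

```python
import numpy as np

def box_blur(img: np.ndarray, k: int) -> np.ndarray:
    """k x k box blur via an integral image (k must be odd)."""
    pad = k // 2
    p = np.pad(img.astype(float), pad, mode="edge")
    ii = np.zeros((p.shape[0] + 1, p.shape[1] + 1))
    ii[1:, 1:] = p.cumsum(axis=0).cumsum(axis=1)
    # window sums from four corners of the integral image
    s = ii[k:, k:] - ii[:-k, k:] - ii[k:, :-k] + ii[:-k, :-k]
    return s / (k * k)

def soften_shadows(gray: np.ndarray, window: int = 15) -> np.ndarray:
    """Divide out a low-frequency illumination estimate, then renormalize."""
    illumination = box_blur(gray, window) + 1e-6
    flat = gray / illumination
    return flat / flat.max()
```

Real systems use far more sophisticated, often learned, intrinsic-image decompositions; this sketch only shows the general idea of separating reflectance from illumination.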
FIG.3 is a sunny-day view generated by the image processing unit in real time. The generated view is registered in a high-precision 3D map. Additional information can optionally be shown in the view, such as road lines, navigation guidance, hints about buildings at the roadside, or even advertisements. The image processing unit also generates an estimated depth of the view; this depth image is used when showing the additional information, to obtain correct depth occlusion. All this information can also be shown to non-driver passengers in the car, or even transmitted over an internet connection for remote viewing or remote-controlled driving.
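The depth-occlusion step described above amounts to a per-pixel depth test when compositing the overlay. A minimal sketch, assuming dense depth maps for both the scene and the overlay (the function name is hypothetical):

```python
import numpy as np

def composite_with_depth(view: np.ndarray, scene_depth: np.ndarray,
                         overlay: np.ndarray,
                         overlay_depth: np.ndarray) -> np.ndarray:
    """Draw overlay pixels (road lines, navigation hints, map wireframe)
    only where the overlay is nearer than the estimated scene depth, so a
    car in front of a painted road line correctly hides it."""
    visible = (overlay > 0) & (overlay_depth < scene_depth)
    out = view.copy()
    out[visible] = overlay[visible]
    return out
```

This is the same visibility rule as a z-buffer in computer graphics, applied between rendered annotations and the estimated depth of the real scene.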
When driving at night, or in rain, snow, or fog, it is hard for the driver to recognize objects on the road. The image processing unit may use an IR image of the scene to generate an enhanced version of the "Third-person" view of the car when driving in these conditions.
Adding a wireframe from the high-precision 3D map to the "Third-person" view helps the driver identify the objects and cars on the road more clearly.
FIG.4 is a "Third-person" view of the car generated by the image processing unit when driving at night or raining or snowing or fog day. A wireframe with distinguishing color to the view is added to the view with other information similar in the FIG.3 can also be showed optionally.
Conventional cars usually have a steering wheel, brake, and gas pedal, which are mechanical control devices that take a lot of time for a human to learn to coordinate. Even a proficient driver cannot guarantee correct operation all the time. This invention instead introduces new control devices such as a gamepad, tablet, mobile phone, or other similar custom control device, which are easy to learn. Driving a car thus becomes as easy as playing a video game.
The display terminal can be a tablet, VR headset, mobile phone, or other similar custom display device. Replacing direct human sight with such a terminal liberates the driver, who can stay at any place in the car, whereas in a conventional car the driver is limited to a fixed position. The display terminal and control device also make it easier for another person to take over the car in an emergency situation.
FIG.5 shows example display terminal devices: a tablet and a VR headset. The control system includes an interactive input device and an action processing unit. The interactive input device could be a gamepad, tablet, mobile phone, or similar input device. The action processing unit translates the input actions into commands, and these commands control the car to execute the driver's operations.
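The translation from raw input actions to drive commands can be sketched as a simple axis mapping. This is an illustrative sketch with hypothetical names and an assumed axis layout (left stick steers, triggers accelerate and brake), not the patent's specified mapping:

```python
from dataclasses import dataclass

@dataclass
class DriveCommand:
    steering: float  # -1.0 (full left) .. +1.0 (full right)
    throttle: float  # 0.0 .. 1.0
    brake: float     # 0.0 .. 1.0

def clamp(value: float, lo: float, hi: float) -> float:
    return max(lo, min(hi, value))

def translate_input(stick_x: float, right_trigger: float,
                    left_trigger: float, deadzone: float = 0.1) -> DriveCommand:
    """Map raw gamepad axes to a drive command. The deadzone keeps the car
    from drifting when the stick merely rests near centre."""
    steering = 0.0 if abs(stick_x) < deadzone else clamp(stick_x, -1.0, 1.0)
    return DriveCommand(steering=steering,
                        throttle=clamp(right_trigger, 0.0, 1.0),
                        brake=clamp(left_trigger, 0.0, 1.0))
```

In a real system the resulting commands would be rate-limited and validated before being sent to the steering and drivetrain actuators.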
The action processing unit also contains a "snap" mode,when the car driving on a narrow road or narrow street. The action processing unit will try to keep a minimum distance to other cars or obstacles automatically. The car will just go through the street without worrying about scratching something.
The subject matter described above is provided by way of illustration only and should not be construed as limiting. Various modifications and changes may be made to the subject matter described herein, without following the example embodiments and applications, and without departing from the true spirit and scope of the present disclosure.
An image processing unit will generate a "Third-person" view of the car at realtime by using computer vision algorithm.
This computer vision algorithm is the technical core of this invention. Most automaker such as tesla use simple symbol model to present the car while this invention generate a "Third-person" view which is synthesized from real image just like special effects in the sci-fi movies. The road side scene and the cars in the "Third-person" view are generated from real image. The virtual view generated from the image processing unit is just like a real camera capture the whole scene at realtime. All of this is depend on an advanced computer vision algorithm to make it work.
The image processing unit generate such "Third-person" view of the car at realtime and sent it to the display terminal to present it to the driver. The display terminal could be a mobile phone,tablet,VR-Headset or other similar display device.
When driving at sunny day, the image processing unit could also remove the outdoor scene shadow when generated the "Third-person"
view.
FIG.3 is a sunny day view generated from the image processing unit at realtime. The generated view will be registered in a high precision 3D map. Additional information could be optionally showed in the view such as road lines,navigation guide info,hint info of the building at road sides or even advertisement info. The image processing unit also generate estimated depth of the view,this depth image will be used when show the additional information to get correct depth occlusion. All this information could also be showed to other non-driver passengers on the car or even transmitting via internet connection for remote viewing or even for remote controlled driving.
When driving at night or raining or snowing or fog day,it is hard for the driver to recognize objects on the road. The image processing unit might use IR image of the scene to generate an enhanced version of the "Third-person" view of the car when driving in the above conditions.
By adding an wireframe from the high precision 3D map to the "Third-person" view could help the driver to identify the objects and cars on the road more clearly.
FIG.4 is a "Third-person" view of the car generated by the image processing unit when driving at night or raining or snowing or fog day. A wireframe with distinguishing color to the view is added to the view with other information similar in the FIG.3 can also be showed optionally.
Conventional cars usually have steering wheel,brake and gas pedal which are mechanical control device,which need a lot of time for human to learn how to coordinate with it. Even a proficient driver couldn't guarantee to operate it correctly all the time. While this invention introduce new control device such as gamepad,tablet,mobile phone or other similar custom control device,which will make it easy to learn. And driving a car is as easy as playing a video game.
The display terminal would be tablet,VR-headset,mobile phone or other similar custom display device instead of a direct human sight will liberate the the driver and the driver could stay at any place in the car while in the conventional car driver is limited at a fixed position. The display terminal and control device will make it easier for other ones to take over the car in emergence situation.
FIG.5 is an example display terminal device which are tablet and a VR-Headset. The control system includes a interactive input device and a action processing unit. The interactive input device could be a gamepad,tablet,mobile phone or similar input device. The action processing unit will translate the input action into commands. These commands control the car to execute the operation of the driver.
The action processing unit also contains a "snap" mode,when the car driving on a narrow road or narrow street. The action processing unit will try to keep a minimum distance to other cars or obstacles automatically. The car will just go through the street without worrying about scratching something.
The subject matter described above is provided by way of illustration only and should not be constructed as limiting various modifications and changes maybe made to the subject matter described herein without following the example embodiments and applications without departing from the true spirit and scope of the present disclosure which is also covered by this invention.
Claims (14)
1. A display system for driving assistance, comprising:
a camera or cameras with supports installed on the roof of a car;
an image processing unit; and
a display terminal for the driver.
2. The camera or cameras in claim 1 could be an omnicamera, 360-degree camera, industrial camera, or other similar camera to capture the road scene.
3. The image processing unit in claim 1 uses the images captured by the cameras in claim 1 to generate a "Third-person" view from the car in real time, and presents this view on the display terminal to the driver.
4. The display terminal in claim 1 could be a mobile phone, tablet, VR headset, or other similar display device.
5. A display system for driving assistance, comprising:
cameras installed at the sides and/or top of the car;
an image processing unit;
a display terminal for the driver.
6. The image processing unit in claim 5 uses the images captured by the cameras in claim 5 to generate a "third-person" view of the car in real time, and presents this view on the display terminal to the driver.
7. The camera or cameras in claim 5 could be an omnicamera, 360-degree camera, industry camera, or other similar camera to capture the road scene.
8. The display terminal in claim 5 could be a mobile phone, tablet, VR headset, or other similar display device.
9. A control system for driving assistance, comprising:
an interactive input device;
an action processing unit.
10. The interactive input device in claim 9 could be a gamepad, tablet, mobile phone, or other similar control device.
11. The action processing unit in claim 9 translates the input actions into commands and then operates the car by these commands.
12. The action processing unit in claim 9 also contains a "snap" mode: when driving on a narrow road or street, the action processing unit automatically keeps a minimum distance to other objects or cars, which prevents the car from scratching anything during driving.
13. A car with a display system for driving assistance as described in claim 1 and a control system as described in claim 9.
14. A car with a display system for driving assistance as described in claim 5 and a control system as described in claim 9.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/IB2019/056626 WO2021024017A1 (en) | 2019-08-03 | 2019-08-03 | Advanced display and control system for driving assistance |
Publications (1)
Publication Number | Publication Date |
---|---|
CA3145746A1 true CA3145746A1 (en) | 2021-02-11 |
Family
ID=74503890
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA3145746A Abandoned CA3145746A1 (en) | 2019-08-03 | 2019-08-03 | Advanced display and control system for driving assistance |
Country Status (5)
Country | Link |
---|---|
JP (1) | JP2022542481A (en) |
CN (1) | CN114206683A (en) |
CA (1) | CA3145746A1 (en) |
GB (1) | GB202202249D0 (en) |
WO (1) | WO2021024017A1 (en) |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3606816B2 (en) * | 2000-04-05 | 2005-01-05 | 松下電器産業株式会社 | Driving assistance device |
JP2003063310A (en) * | 2001-08-23 | 2003-03-05 | Matsushita Electric Ind Co Ltd | Image device for vehicle |
JP2010247621A (en) * | 2009-04-14 | 2010-11-04 | Stanley Electric Co Ltd | Drive support system of automobile |
JP6030317B2 (en) * | 2012-03-13 | 2016-11-24 | 富士通テン株式会社 | Image processing apparatus, image display system, display apparatus, image processing method, and program |
JP2015089733A (en) * | 2013-11-06 | 2015-05-11 | トヨタ自動車株式会社 | Parking support system |
CN203819183U (en) * | 2014-02-08 | 2014-09-10 | 温州文宏科技有限公司 | Intelligent control device for automobiles |
KR101741433B1 (en) * | 2015-06-09 | 2017-05-30 | 엘지전자 주식회사 | Driver assistance apparatus and control method for the same |
KR101942793B1 (en) * | 2015-07-03 | 2019-01-28 | 엘지전자 주식회사 | Driver Assistance Apparatus and Vehicle Having The Same |
US10000217B2 (en) * | 2015-09-03 | 2018-06-19 | Yahoo Japan Corporation | Notification-needed information presenting apparatus, notification-needed information presenting method, and non-transitory computer readable storage medium |
US10144419B2 (en) * | 2015-11-23 | 2018-12-04 | Magna Electronics Inc. | Vehicle dynamic control system for emergency handling |
KR102464484B1 (en) * | 2015-12-08 | 2022-11-07 | 현대모비스 주식회사 | Assistant system and assistant method for backward driving of vehicle |
KR20170109275A (en) * | 2016-03-21 | 2017-09-29 | 현대자동차주식회사 | Vehicle and controlling method of the same |
CN105825713B (en) * | 2016-04-08 | 2018-07-24 | 重庆大学 | The method of operation of vehicle-mounted unmanned aerial vehicle DAS (Driver Assistant System) |
JP6711128B2 (en) * | 2016-05-18 | 2020-06-17 | 株式会社リコー | Image processing device, imaging device, mobile device control system, image processing method, and program |
DE102016114693A1 (en) * | 2016-08-09 | 2018-02-15 | Connaught Electronics Ltd. | A method for assisting a driver of a motor vehicle when driving the motor vehicle, driver assistance system and motor vehicle |
DE102016221273A1 (en) * | 2016-10-28 | 2018-05-03 | Ford Global Technologies, Llc | Method for operating a portable input device for controlling a motor vehicle |
CN107150689A (en) * | 2017-03-20 | 2017-09-12 | 深圳市保千里电子有限公司 | A kind of automobile assistant driving method and system |
US10732625B2 (en) * | 2017-12-04 | 2020-08-04 | GM Global Technology Operations LLC | Autonomous vehicle operations with automated assistance |
CN107963030A (en) * | 2017-12-12 | 2018-04-27 | 成都电科海立科技有限公司 | A kind of image drive assistance device and method |
JP6907920B2 (en) * | 2017-12-15 | 2021-07-21 | 株式会社デンソー | Automatic driving support device |
CN109774604A (en) * | 2019-03-08 | 2019-05-21 | 王桂佼 | A kind of 360 ° of advanced driving assistance systems of panorama |
- 2019-08-03 JP JP2022506686A patent/JP2022542481A/en active Pending
- 2019-08-03 CA CA3145746A patent/CA3145746A1/en not_active Abandoned
- 2019-08-03 GB GBGB2202249.5A patent/GB202202249D0/en not_active Ceased
- 2019-08-03 WO PCT/IB2019/056626 patent/WO2021024017A1/en active Application Filing
- 2019-08-03 CN CN201980098941.4A patent/CN114206683A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
GB202202249D0 (en) | 2022-04-06 |
CN114206683A (en) | 2022-03-18 |
JP2022542481A (en) | 2022-10-03 |
WO2021024017A1 (en) | 2021-02-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7331696B2 (en) | Information processing device, information processing method, program, and mobile object | |
US10953330B2 (en) | Reality vs virtual reality racing | |
CN109636924B (en) | Vehicle-mounted multi-mode augmented reality system based on real road condition information three-dimensional modeling | |
US9308917B2 (en) | Driver assistance apparatus capable of performing distance detection and vehicle including the same | |
US9787946B2 (en) | Picture processing device and method | |
JP2022520544A (en) | Vehicle intelligent driving control methods and devices, electronic devices and storage media | |
KR101413231B1 (en) | Around view monitoring based on augmented reality device, method and vehicle | |
CN111216127A (en) | Robot control method, device, server and medium | |
WO2022017139A1 (en) | Parking display method and vehicle | |
US20200249044A1 (en) | Superimposed-image display device and computer program | |
CN114041175A (en) | Neural network for estimating head pose and gaze using photorealistic synthetic data | |
CN114022565A (en) | Alignment method and alignment device for display equipment and vehicle-mounted display system | |
US11626028B2 (en) | System and method for providing vehicle function guidance and virtual test-driving experience based on augmented reality content | |
US20160203582A1 (en) | Display control apparatus, projection apparatus, display control method, and non-transitory computer readable storage medium | |
US11227494B1 (en) | Providing transit information in an augmented reality environment | |
Kim et al. | Effects on productivity and safety of map and augmented reality navigation paradigms | |
CN112104857A (en) | Image generation system, image generation method, and information storage medium | |
JP2019532540A (en) | Method for supporting a driver of a power vehicle when driving the power vehicle, a driver support system, and the power vehicle | |
CN111016787B (en) | Method and device for preventing visual fatigue in driving, storage medium and electronic equipment | |
CA3145746A1 (en) | Advanced display and control system for driving assistance | |
JP7232613B2 (en) | Track builder for car toy with camera | |
CN113525402B (en) | Advanced assisted driving and unmanned visual field intelligent response method and system | |
KR20200063789A (en) | Ar glass and method of providing augmented reality service using the same | |
JP2015184804A (en) | Image display device, image display method, and image display program | |
CN113345107A (en) | Augmented reality data display method and device, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
EEER | Examination request |
Effective date: 20220125 |
|
FZDE | Discontinued |
Effective date: 20230530 |