EP2447925A1 - Method for providing information assisting a driver of a vehicle and a corresponding system for use in and remote of a vehicle - Google Patents

Method for providing information assisting a driver of a vehicle and a corresponding system for use in and remote of a vehicle

Info

Publication number
EP2447925A1
Authority
EP
European Patent Office
Prior art keywords
image information
display
road
side camera
navigation system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP10189342A
Other languages
German (de)
French (fr)
Other versions
EP2447925B1 (en)
Inventor
Christian Heusch
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
HEUSCH, CHRISTIAN
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to EP10189342.8A
Publication of EP2447925A1
Application granted
Publication of EP2447925B1
Legal status: Not-in-force (current)
Anticipated expiration

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096708Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
    • G08G1/096716Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information does not generate an automatic action on the vehicle control

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

Method and system (200) for providing information assisting a driver of a vehicle (1) which is equipped with a navigation system (2),
- Providing real-time image information (I1) to a remote system (100), said real-time image information (I1) having been obtained from at least one road-side camera (C1),
- Displaying an object (3) on a display (4) of said navigation system (2) which represents at least one road-side camera (C1),
- selecting a corresponding road-side camera (C1) using a user interface of said navigation system (2),
- transmitting image information (I2) based on said real-time image information (I1) provided by the corresponding road-side camera (C1) which has been selected, from said remote system (100) to said navigation system (2),
- displaying an image (I3) based on said image information (I2) on said display (4) in order to provide visual information of said specific section of a route (R1).

Description

    Field of the invention
  • The present invention concerns a method for providing information assisting a driver of a vehicle which is equipped with a navigation system. It also concerns a corresponding system for use in a vehicle.
  • Background of the invention
  • There is a need to improve current navigation systems. It is quite frustrating that even today these systems are not able to deal properly with traffic congestion in urban areas. There is considerable potential to reduce traffic jams and, at the same time, the pollution caused by cars waiting in queues, e.g. in front of traffic lights, tunnel portals, parking decks, shopping centers, and so forth.
  • There is a vehicle-to-vehicle communication system known as "TrafficView" which enables the spreading of traffic related information from car to car using short-range wireless communication. Details can be found in "TrafficView: A Driver Assistant Device for Traffic Monitoring based on Car-to-Car Communication", S. Dashtinezhad et al., cf.
    http://www.cs.rutgers.edu/~iftode/vtcsp04.pdf.
  • There is a long-felt desire to provide real-time information, or even images, to a driver while driving. It is thus not surprising to see that a number of approaches and schemes have been developed which employ road-side cameras and which link the cameras with a display inside the car.
  • The following presentations
    • "Embedded Systems", WSITC FORUM, AN INTERNATIONAL CES 2007 PERSPECTIVE, given by Robert Mitchell, AEEMA, 20 February 2007,
    • "What's Hot in ICT?", WSITC FORUM, AN INTERNATIONAL CES 2007 PERSPECTIVE, given by Angus M Robinson, AEEMA, 20 February 2007
    relate to embedded systems which employ real-time roadside camera images.
  • For details see also:
    • http://interdependent.com.au/wsitc/documents/WSITC_Forum_20Feb07_Progra m_PPTs.pdf
  • The use of road-side cameras near intersections is proposed and discussed in "Visual Assistance for Drivers by Mixed Reality", Y. Kameda et al., 14th World Congress of ITS, 2007, Beijing. The road-side cameras are used in order to provide visual assistance and a better overview for a driver who has a receiver in his car. For details see: http://www.kameda-lab.org/research/publication/2007/200710_ITSWC/ITSWC2007-3210-kameda.pdf
  • Yet another approach, called AHS (Automated Highway System), is discussed in the following paper: "NaviView: Bird's-Eye View for Highway Drivers Using Roadside Cameras," Eitaro Ichihara, Hiroyuki Takao, Yuichi Ohta, icmcs, vol. 2, pp.559, 1999 IEEE International Conference on Multimedia Computing and Systems (ICMCS'99) - Volume 2, 1999. The AHS is believed to increase the traffic potential of highways because it enables drivers to shorten the distance between cars. AHS may make the drivers uncomfortable because it presents them with a reduced field of vision of the car ahead and may induce anxiety over the reliability of the control system. In order to improve the comfort of drivers under the control of AHS, it is necessary to recover the driver's field of vision. In this paper, a NaviView system for the purpose of recovery is proposed. In AHS, the entire traffic load on the highway is visually monitored by many roadside video cameras. NaviView utilizes these video images for the recovery of the driver's visual field. It enables the driver to observe the behavior of other cars by means of a virtual view from a point just above his car. The virtual view presented to the driver is generated by combining the video images of the fixed roadside cameras.
  • Another approach is presented in the following publication: "Car navigation system with image recognition," Kohei Ito, Naohiko Ichihara, Hiroto Inoue, Ryujiro Fujita, Mitsuo Yasushi, icce, pp.1-2, 2009 Digest of Technical Papers International Conference on Consumer Electronics, 2009. The authors state that many cars are now equipped with on-board cameras and that many kinds of driver support systems using image recognition are being developed. The authors themselves claim to have developed a car navigation system with image recognition that enhances drivers' safety, convenience, and entertainment.
  • There are GPS (global positioning system) based navigation systems with backup camera display. The camera and transmitter are powered by the cables that power the vehicle's reverse lights. When the vehicle is put into reverse gear, the reversing lights turn on and power is sent to the camera and the transmitter, sending out a video signal. The GPS unit then automatically switches over to the camera view. When the vehicle is taken out of reverse gear, the reverse lights go out and the camera stops sending an image. At the same time the unit will stop showing the camera's image and return to the previous function. See for instance: http://www.navsgo.com/GO740RV.html
  • All these systems mentioned above confirm that there is a desire for establishing a connection between cameras and cars. It is a disadvantage of the known approaches that they merely provide overview information which is not linked to the capabilities of a navigation system. These systems have the serious disadvantage that their use while driving might distract the driver. Yet another disadvantage of known systems is that they require considerable bandwidth for the transmission of image information. Any system which would be offered to all users of navigation systems in an urban area would lead to a communication or capacity overload.
  • It is an objective of the present invention to provide a camera-assisted or camera-based navigation system which offers real-time or close-to-real-time information.
  • It is another objective to provide a system which is well suited for large-scale operation, e.g. in urban areas.
  • SUMMARY OF THE INVENTION
  • The method, according to the present invention, provides information assisting a driver of a vehicle equipped with a navigation system. Real-time image information is provided to a stationary remote system, said real-time image information having been obtained from at least one road-side camera. An object is displayed on a display of the navigation system, said object representing the at least one road-side camera. A corresponding road-side camera is selected using a user interface of the navigation system. This selection might for instance be done before approaching a specific section of a route. Image information based on the real-time image information, which is provided by the corresponding road-side camera which has been selected, is directly or indirectly (e.g. via an intermediate communication tool) transmitted from the remote system to the navigation system. An image is displayed, based on the image information, on the display in order to provide visual information of the specific section or part of the route.
  • The inventive system is designed for use in a vehicle. The system comprises a navigation system, a display for displaying information provided by the navigation system, and a receiver (e.g. a receiver part of the navigation system or an intermediate communication tool) for receiving image information from a remote system. The system further comprises a software module causing an object representing a road-side camera to be displayed on the display and a user interface for interaction of a driver with the system. The user interface enables the driver to select a road-side camera. The system is enabled to receive the image information from the remote system and to display an image which is based on the image information on the display.
  • The features of advantageous embodiments are presented in the dependent method and apparatus claims. The respective advantages are addressed or become apparent from the following detailed description.
  • Brief description of the drawings
  • For a more complete description of the present invention and for further objects and advantages thereof, reference is made to the following description, taken in conjunction with the accompanying drawings:
  • Fig. 1
    shows a schematic structural diagram of a system in accordance with the present invention;
    Fig. 2
    shows a schematic diagram of a display of a system in accordance with the present invention;
    Fig. 3
    shows a schematic diagram of a display of a system in accordance with the present invention;
    Fig. 4
    shows a schematic diagram of a display of a system in accordance with the present invention;
    Fig. 5A
    shows a schematic diagram of a real traffic situation;
    Fig. 5B
    shows a schematic diagram of a display of a system in accordance with the present invention where an image of the real traffic situation of Fig. 5A is shown;
    Fig. 6
    shows a schematic flow-chart in accordance with the present invention;
    Fig. 7
    shows a schematic flow-chart in accordance with the present invention;
    Fig. 8
    shows a schematic flow-chart in accordance with the present invention;
    Fig. 9
    shows a schematic map with one object representing a road-side camera in accordance with the present invention;
    Fig. 10A
    shows a schematic diagram of a real traffic situation;
    Fig. 10B
    shows a schematic diagram of the relevant static image content of Fig. 10A;
    Fig. 10C
    shows a schematic diagram of the relevant dynamic image content of Fig. 10A;
    Fig. 10D
    shows a schematic diagram of the "reconstructed" image which is based on or derived from the image of Fig. 10A;
    Fig. 11
    shows several different objects which could be used to show on a display the position of a road-side camera;
    Fig. 12
    shows a schematic illustration of an image obtained by a road-side camera to which a filter is attached.
    DESCRIPTION OF PREFERRED EMBODIMENTS
  • Fig. 1 shows a schematic structural diagram of a system 200 in accordance with the present invention. The system 200 comprises a navigation system 2 inside a vehicle 1 (e.g. a car), a remote system 100 and at least one road-side camera C1. The system 200 furthermore comprises communication means or channels, as indicated in Fig. 1 by arrows. The image information which is transmitted from the road-side camera C1 to the remote system 100 is represented by the reference sign I1. The image information I2 is the information which is directly or indirectly transmitted from the remote system 100 to the navigation system 2. According to the invention, the data size of I2 is smaller than the data size of I1 or than the data size of the image recorded by the road-side camera C1. If the compression is carried out inside the road-side camera C1, the data size of I2 might be about the same as the data size of I1.
  • A "navigation system" 2, as herein used, is an electronic, processor-based system which for instance enables or supports the driver of a vehicle 1 in finding a certain route or a location. The navigation system 2 typically has a GPS or other positioning feature which is included or connected. The navigation system 2 may be integrated into the vehicle 1 (pre-installed system), or it may be a portable system (add-on system) which can be used, as needed, in different vehicles. Any portable computer-based system (e.g. a smart phone) in which a navigation application is implemented, or which is able to connect to a server-based or cloud-based navigation system, is also considered to be a navigation system for the purposes of the present invention. A web-based navigation system is also considered to be a navigation system for the purposes of the present invention. In other words, a navigation system 2, as herein used, may be entirely onboard the vehicle 1, or it may be located elsewhere and communicate via radio or other signals with the vehicle, or it may use a combination of these methods.
  • The navigation system 2 can be a system presenting 2-dimensional or 3-dimensional maps. It could also be a system 2 where real images are used to provide a more realistic look. The invention can be used in connection with any of these systems 2.
  • A "remote system 100" is a single server (i.e. a physical computer dedicated to running a special service) or host, a set-up where two or more servers (e.g. a series or network of computers) are connected, or a cloud-computing environment (e.g. an arrangement where shared resources, software, and information are provided to computers and other devices, such as the navigation system 2). The remote system 100 is a stationary system. At least a part of the navigation system 2 (for instance the receiver and the display 4) is mobile.
  • The remote system 100 is configured and programmed to provide essential inventive services across a communication link or network to at least one vehicle 1 comprising a navigation system 2.
  • A "road-side camera C1" is either a camera at a fixed location, e.g. attached to a pole, lamp, tree or building, or a vehicle-based camera which keeps a pre-defined position for a certain period of time, e.g. during rush hours. A camera C1 is herein considered to be a device that records images. The camera C1 preferably comprises or is connected to communication means for establishing a direct or indirect communication link to the remote system 100. The road-side camera C1 preferably provides digital image information.
  • Figures 6, 7, and 8 show exemplary schematic flow-charts of the various aspects of the inventive method.
  • The invention concerns a method which provides information assisting a driver of a vehicle 1 which is equipped with a navigation system 2. The method comprises the following steps:
    • Providing real-time image information I1 to the remote system 100, the real-time image information I1 having been obtained from at least one road-side camera C1 (as illustrated in Fig. 1, for instance).
    • Displaying an object 3 (see for instance Figures 2, 3, 4, 5B, 9) on a display 4 of the navigation system 2 which represents the at least one road-side camera C1.
    • Selecting a corresponding road-side camera C1 using a user interface of the navigation system 2 (e.g. by touching the object 3 on a touch sensitive display 4 or by using a keyboard or by using voice control). This step enables the user of the system 2 to select a particular (corresponding) road-side camera C1 (cf. step S1 in Fig. 6).
    • Transmitting image information I2 which is based on the real-time image information I1 provided by the corresponding road-side camera C1 which has been selected (cf. step S3 in Fig. 6). The transmission takes place between the remote system 100, or a relay unit attached or connected to the remote system 100, and the navigation system 2 (as illustrated in Fig. 1, for instance).
    • Displaying an image I3 based on the image information I2 on the display 4 in order to provide visual information of a specific section or part of the route R1 (cf. step S6 in Fig. 6). In Fig. 5B a situation or embodiment is shown where the display 4 shows an image in an image (window 5). A split-screen solution could also be used. A schematic sketch of this overall sequence is given below.
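  • The sequence of steps S1 through S6 (cf. Fig. 6) can be summarized, purely for illustration, in the following minimal client-side sketch. All class and method names (CameraObject, request_image, etc.) are hypothetical assumptions and not part of the disclosure; the sketch only assumes that the remote system 100 offers some request/response interface for the image information I2.

```python
# Illustrative sketch of the client-side flow of steps S1 - S6 (Fig. 6).
# All class and method names are hypothetical; they are not defined by this document.
from dataclasses import dataclass

@dataclass
class CameraObject:
    """Represents an object 3 shown on the display 4 for one road-side camera."""
    camera_id: str     # e.g. "C1"
    position: tuple    # (latitude, longitude) of the road-side camera

def assist_driver(navigation_system, remote_system, route):
    # S1: the driver selects a camera object 3 via the user interface (touch, keys, voice).
    selected = navigation_system.wait_for_camera_selection(route)

    # S2/S3: the remote system 100 makes the image information I2 available and transmits it.
    image_info_i2 = remote_system.request_image(selected.camera_id)

    # S4/S5: the navigation system 2 receives and processes I2 (e.g. decompression, merging).
    image_i3 = navigation_system.process(image_info_i2)

    # S6: the image I3 is shown on the display 4, e.g. as a picture-in-picture window 5.
    navigation_system.display(image_i3)
```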
  • The road-side camera C1 is typically selected by the driver of the vehicle 1 before a specific section or part of a route R1 is approached.
  • Before the image information I2 is transmitted, it is made available by the remote system 100 (cf. step S2 in Fig. 6). Before displaying the image information I3 on the navigation system's display 4, the image information I2 is directly or indirectly (e.g. via a cellular phone) received (cf. step S4 in Fig. 6) and processed (cf. step S5 in Fig. 6) by the navigation system 2.
  • The driver typically does not know where the road-side cameras C1 are positioned and in which direction they are pointing. The object 3, which is displayed on the display 4, thus indicates one or more of the following:
    • F1: a viewing direction 102 of a corresponding road-side camera C1 (cf. Fig. 3),
    • F2: a line of sight of a corresponding road-side camera C1,
    • F3: field of view 101 of a corresponding road-side camera C1 (cf. Fig. 2),
    • F4: a range 103 of a corresponding road-side camera C1 (cf. Fig. 4).
  • The presence of one or more of the features F1 - F4 is important since the driver should not be distracted when selecting the right camera C1. In addition, it is important that the driver is able to match, relate or connect the information I3 displayed on the display 4 to the real world. Because the object 3 includes one or more of the features F1 - F4, the visual information provided by the road-side camera C1 is easy to interpret. A minimal data-structure sketch of such a camera marker is given below.
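  • Purely as an illustration of how the features F1 - F4 could be attached to an object 3, the following sketch models a camera marker as a small data structure; the field names and the simple sector test are assumptions for this sketch and not part of the disclosure.

```python
# Hypothetical data structure for a camera marker (object 3) carrying the features F1 - F4.
import math
from dataclasses import dataclass

@dataclass
class CameraMarker:
    camera_id: str            # e.g. "C1"
    position: tuple           # (x, y) map coordinates of the road-side camera
    viewing_direction: float  # F1: compass bearing in degrees (0 = north), cf. 102
    field_of_view: float      # F3: opening angle in degrees, cf. 101
    range_m: float            # F4: camera range in metres, cf. 103

    def covers(self, point):
        """Rough test whether a map point lies inside the drawn viewing sector."""
        dx, dy = point[0] - self.position[0], point[1] - self.position[1]
        distance = math.hypot(dx, dy)
        bearing = math.degrees(math.atan2(dx, dy)) % 360.0
        offset = abs((bearing - self.viewing_direction + 180.0) % 360.0 - 180.0)
        return distance <= self.range_m and offset <= self.field_of_view / 2.0
```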
  • The object 3, which is displayed on the display 4, may be temporarily magnified if the driver touches or pre-selects the object 3. This feature can be used in connection with all embodiments.
  • Fig. 11 shows several different objects 3.1 through 3.5 which could be used to show on a display 4 the position of a road-side camera C1. Objects 3 are preferred where the object as such indicates a viewing direction. It is obvious when looking at the different objects 3.1 - 3.5 in Fig. 11 that all of them are pointing to the right hand side. These objects 3.1 - 3.5 are preferred. They can be used in connection with any of the embodiments.
  • All embodiments of the invention may comprise additional features, objects or means which help the driver to "read" an image I3. The following features could be used individually, or two or more of these features can be combined:
    • F5: when reaching the position of the road-side camera C1 a visual or audible cue can be provided by the navigation system 2.
    • F6: when reaching the zone or area which is within the viewing field of a road-side camera C1, the viewing direction 102 and/or the line of sight and/or the field of view 101 and/or the range 103 is/are highlighted or put in the foreground on the display 4.
    • F7: In a 3-D navigation system 2 or in a navigation system 2 using real images, the image information I3 (at least the dynamic content) is mapped onto the 3-D image or real image (called overlay mode).
  • The invention is facilitated by a number of measures which have been taken in order to control the bandwidth requirement or to keep it low. The respective measures are listed below. Actual embodiments of the invention may comprise one or more of these measures.
  • The most important inventive measures M1 through M6, which help to keep the bandwidth requirement low, are listed below:
    • M1: Real-time image information I1 is only offered locally, that is, a driver can only obtain images of cameras C1 which are along his route R1 or within a certain local area. This measure M1 is preferably combined with an active route finding process being carried out by the navigation system 2.
    • M2: In addition to any of the other measures or instead of the other measures, a distinction is made between static (motionless) image content and dynamic or quasi dynamic image content (i.e. changing content). This can be done by means of software-implemented edge detection for instance, which ensures that only limited image data are transmitted for static portions or areas of an image. These data are then used in the vehicle 1 to build or reconstruct an image.
    • M3: In addition to any of the other measures or instead of the other measures, software or hardware-based techniques (such as data compression) are employed which ensure that only limited image data or a reduced data volume are transmitted. These data are then used in the vehicle 1 to build or reconstruct an image;
    • M4: In addition to any of the other measures or instead of the other measures, software or hardware-based techniques are employed which ensure that dynamic or quasi dynamic image content is transmitted on request only.
    • M5: In addition to any of the other measures or instead of the other measures, software or hardware-based techniques are employed which ensure that real-time image information is transmitted on request only. In this case either the user of the inventive system or the system itself requests real-time image information relevant for or related to a certain route R1.
    • M6: A distinction can be made between road-side cameras C2 which currently show important traffic information and road-side cameras C3 where no relevant information is available. This can be achieved either by an automated software-based process run by the remote system 100 or by an operator who distinguishes relevant from irrelevant information. The automated software-based process either uses a pattern recognition scheme, or it processes additional information. The additional information can be provided by radar cameras, induction loops integrated into the road, motion sensors and other auxiliary units. The object 3 could for instance be highlighted using a special color scheme (e.g. a green object versus a red object), or the shape of the object 3 could change. A highlighted object 3 would then indicate that relevant image information is available. An illustrative sketch of this measure is given below.
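  • As one possible, purely illustrative realization of the automated part of measure M6, the remote system 100 could flag a camera as currently showing relevant traffic by simple frame differencing; the threshold values and the colour mapping below are assumptions, not part of the disclosure.

```python
# Illustrative sketch of measure M6: decide whether a road-side camera currently shows
# changing (i.e. potentially relevant) content, so that its object 3 can be highlighted,
# e.g. drawn green instead of red, on the display 4. Thresholds are arbitrary assumptions.
import numpy as np

def shows_relevant_content(previous_frame: np.ndarray,
                           current_frame: np.ndarray,
                           pixel_threshold: int = 25,
                           changed_fraction: float = 0.02) -> bool:
    diff = np.abs(current_frame.astype(np.int16) - previous_frame.astype(np.int16))
    changed_pixels = np.count_nonzero(diff > pixel_threshold)
    return changed_pixels / diff.size > changed_fraction

def object_colour(relevant: bool) -> str:
    # A highlighted (green) object 3 indicates that relevant image information is available.
    return "green" if relevant else "red"
```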
  • The real-time image information I1 can be sent through different communication channels from the remote system 100 to the vehicle 1. In the following, a distinction is made between push approaches and pull approaches. In both cases the remote system 100 is considered to be the source, and the vehicle 1, respectively the inventive system 300 in the vehicle 1, is considered to be the destination or receiver. The real-time image information I1 is not necessarily sent from the remote system 100 directly to the vehicle 1. It is also conceivable that there are systems (relays, routers, switches etc.) in between, such as a computing environment of a mobile phone company or a server of a specialized service provider.
  • The following embodiments are based on a series of assumptions. A typical high-resolution still image provided by a road-side camera C1 has a size of about 1 MB or more. For the purposes of the present invention, the ideal size of a still image is considered to be between 4 kB and 500 kB. If one assumes that in a certain city there are 50 road-side cameras C1 - C50 and that the images are processed (e.g. by means of compression or by a separation of real-time image data or dynamic image content from static image content) so that they have 10 kB each, a total of 50 times 10 kB is to be broadcast if all inventive systems 200, no matter where they are in the city, are to receive the images of all 50 road-side cameras C1 - C50. If pictures are obtained from the cameras C1 - C50 or the remote system 100 once per minute, then in a broadcast push approach 500 kB per minute have to be transmitted to all vehicles 1 in a certain region. In this example every system 200 in this region receives new images of all cameras C1 - C50 once per minute. The corresponding transmission frequency can be higher or lower.
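  • The arithmetic of this broadcast example can be restated in a few lines; the figures (50 cameras, 10 kB per processed image, one image per camera per minute) are simply the assumptions stated above.

```python
# Worked restatement of the broadcast push example above (all figures taken from the text).
cameras = 50               # road-side cameras C1 - C50 in the city
image_size_kb = 10         # size of one processed image after compression/separation
images_per_minute = 1      # one new image per camera per minute

broadcast_load_kb_per_minute = cameras * image_size_kb * images_per_minute
print(broadcast_load_kb_per_minute)       # 500 kB per minute to every vehicle in the region
print(broadcast_load_kb_per_minute / 60)  # roughly 8.3 kB per second on average
```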
  • The road-side camera C1 should have a resolution which is better than 0.2 megapixels and preferably more than 0.5 megapixels. This rule applies to all embodiments.
  • In a preferred embodiment it is possible to compress, filter or process the image data at the camera side or at the remote system 100 so that the size of the images gets smaller (reduced data volume). It is, however, to be kept in mind that the features of an image have to remain visibly resolved when the image is displayed in the vehicle 1. Such a compression scheme can be applied to all embodiments.
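  • A minimal sketch of such a size-bounded compression step, assuming the widely used Pillow imaging library and a JPEG budget at the upper end of the 4 kB - 500 kB range mentioned above, could look as follows; the quality steps and the library choice are assumptions, not part of the disclosure.

```python
# Illustrative sketch: re-encode a camera image so that it stays below a size budget.
# Requires the Pillow library (pip install Pillow); not part of the disclosed method.
import io
from PIL import Image

def compress_to_budget(image: Image.Image, max_bytes: int = 500_000) -> bytes:
    for quality in (85, 70, 55, 40, 25):
        buffer = io.BytesIO()
        image.convert("RGB").save(buffer, format="JPEG", quality=quality)
        if buffer.tell() <= max_bytes:
            return buffer.getvalue()
    # Fall back to halving the resolution if quality reduction alone is not enough.
    smaller = image.resize((max(1, image.width // 2), max(1, image.height // 2)))
    return compress_to_budget(smaller, max_bytes)
```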
  • In a preferred embodiment it is possible to process the image data at the camera side or at the remote system 100 so that license plates or the faces of people are blanked out.
  • Even more preferred is an embodiment where a separation of real-time image data or dynamic image content from static image content is carried out at the camera side and/or at the remote system 100 so that relevant image information is always made visible whereas less important (e.g. static) image information is less well visible on a display 4. Such a scheme can be applied to all embodiments.
  • Even more preferred is an embodiment where the road-side camera carries an optical filter or screen in order to block information which is not considered relevant or which for legal reasons has to be blanked out. A corresponding illustration is provided in Fig. 12. The hatched field 400 represents a filter attached to a road-side camera. This approach also helps to reduce the bandwidth requirements.
  • The dynamic image content is what matters the most in the context of the present invention since the view of a road crossing, for instance, which does not show any moving objects (such as pedestrians or cars) is not as interesting to the driver of the vehicle 1 as an image which shows for instance the stop-and-go of vehicles in front of a traffic light. This is one of the reasons why the measure M2 is considered advantageous.
  • An example of a possible implementation or embodiment of the measure M2 is schematically illustrated in the sequence of Figures 10A through 10D. A corresponding flow chart is presented in Fig. 7. Fig. 10A shows a real traffic situation similar to the situation of Fig. 5A. The corresponding image information I1, including details such as windows, doors, trees, pedestrians and the like, can be transmitted from the camera C1 to the remote system 100. An image compression, e.g. a separation of static and dynamic content (measure M2) or a regular data compression (measure M3), could also be carried out by the camera C1 or by a module attached to the camera C1. More preferred is an embodiment where the image compression is carried out by the remote system 100 or by a special hardware and/or software module of the remote system 100 (steps S1.1 and S1.2, Fig. 7). The principle on which measure M2 is based is schematically illustrated in Figs. 10B and 10C. Fig. 10B shows the static image content, which is here reduced to some very basic shapes and elements, such as road markings, outlines of buildings and landmarks. In the present example, elements and features which are not considered important (such as plants, windows, doors, etc.) are removed. Fig. 10C shows the dynamic image content, such as cars, traffic lights, changing traffic signs, pedestrians, etc.
  • The transmission of the static image content does not require much bandwidth. In a vector based system it would be sufficient to just transmit vector information for lines and edges. In one embodiment of the invention the respective static image content is actually transmitted from the remote system 100 to the navigation system 2. In another embodiment of the invention the respective static image content is stored in the navigation system 2 (e.g. using a CD ROM or another storage medium). In this case the static information is not required to be transmitted.
  • In both embodiments the dynamic content is transmitted to the navigation system 2. The system 2, or a special module attached thereto, maps the dynamic content onto the static content (step S1.3, Fig. 7), no matter whether the static content is locally available or transmitted from the remote system 100. Fig. 10D shows the display 4 with a "reconstructed" image where static and dynamic content has been merged.
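  • The principle of measure M2, as illustrated in Figures 10A through 10D, can be sketched as follows; frame differencing against a stored background is only one of many possible separation techniques and is used here purely as an assumed example.

```python
# Illustrative sketch of measure M2: separate dynamic content from a known static background
# on the remote side, transmit only the dynamic layer, and merge the layers again in the
# vehicle (steps S1.1 - S1.3 in Fig. 7 / S2.1 - S2.4 in Fig. 8). Assumed technique: frame
# differencing; pixels with value 0 are treated as background in this simplified sketch.
import numpy as np

def extract_dynamic_layer(frame: np.ndarray, static_background: np.ndarray,
                          threshold: int = 30) -> np.ndarray:
    """Remote side: keep only pixels that differ from the static background (cf. Fig. 10C)."""
    mask = np.abs(frame.astype(np.int16) - static_background.astype(np.int16)) > threshold
    dynamic = np.zeros_like(frame)
    dynamic[mask] = frame[mask]
    return dynamic   # sparse layer: cars, pedestrians, changing traffic signs, ...

def reconstruct_image(static_background: np.ndarray, dynamic_layer: np.ndarray) -> np.ndarray:
    """Vehicle side: map the dynamic content onto the static content (cf. Fig. 10D)."""
    merged = static_background.copy()
    mask = dynamic_layer > 0
    merged[mask] = dynamic_layer[mask]
    return merged
```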
  • In addition to or instead of any of the other measures M1, M3 - M6, a distinction is made between static (motionless) image content and dynamic or quasi dynamic image content (i.e. changing content), as described above in connection with Figures 10A - 10D.
  • A flow chart is presented in Fig. 7 where the static and dynamic image content are separately handled and transmitted (steps S1.1 and S1.2, Fig. 7). The static and dynamic image content are merged or combined by the navigation system 2, or by a module attached or connected to the system 2 (step S1.3, Fig. 7).
  • A flow chart is presented in Fig. 8 where the static image content is not transmitted because it is available at the navigation system 2 (step S2.1, Fig. 8). The static image content could be retrieved from a local storage medium, for instance. The dynamic image content is separately handled and transmitted (steps S2.2 and S2.3, Fig. 8). The static and dynamic image content are merged or combined by the navigation system 2, or by a module attached or connected to the system 2 (step S2.4, Fig. 8).
  • In a preferred embodiment, instead of broadcasting all images to all systems 300 in a general push approach, one could divide the whole city area into smaller cells (herein called cellular or smart push approach). In this case only images of road-side cameras (e.g. the cameras C1 - C10) within a particular cell are transmitted to systems 300 inside the cell or close to this cell. This helps to reduce the overall bandwidth requirement drastically.
  • In another preferred embodiment, images of road-side cameras (e.g. the cameras C5 - C10) are only transmitted to systems 300 inside a vehicle 1 in which a route passing these cameras C5 - C10 has been defined or programmed in the navigation system 2 (i.e. if a route finding process of the navigation system 2 is active). This helps to reduce the overall bandwidth requirement drastically. This means that a driver who is driving from location A to location B using the navigation system 2 and following the route R1, as shown in Fig. 9, would only be enabled to request and receive image information I2 from the road-side camera C2. Image information I2 from the road-side camera C3 is only requestable by a driver whose route passes by the position of the road-side camera C3. A sketch of such a route-based restriction is given below.
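  • One way to realize this route-based restriction, sketched here purely as an assumption, is to offer image information I2 only for cameras whose position lies within some corridor around the programmed route R1; the corridor width and the flat-earth distance approximation are illustrative choices, not part of the disclosure.

```python
# Illustrative sketch: offer image information I2 only for road-side cameras that lie close
# to the programmed route R1 (cf. Fig. 9, where only camera C2 lies on the driver's route).
import math

def distance_m(p1, p2):
    """Approximate distance in metres between two (lat, lon) points (short distances only)."""
    mean_lat = math.radians((p1[0] + p2[0]) / 2.0)
    dx = (p2[1] - p1[1]) * 111_320.0 * math.cos(mean_lat)
    dy = (p2[0] - p1[0]) * 111_320.0
    return math.hypot(dx, dy)

def cameras_along_route(route_points, cameras, corridor_m=200.0):
    """Return the cameras (e.g. C2) whose position lies within the corridor around route R1."""
    return [camera for camera in cameras
            if any(distance_m(camera["position"], point) <= corridor_m
                   for point in route_points)]
```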
  • In a further preferred embodiment the cell or micro-cell structure of a cellular mobile phone network is used in order to determine whether a system 200 is currently in a certain cell and whether there are any cameras (e.g. the cameras C1 - C10) in the same or in a neighboring cell which provide real-time image information I1 that could be useful to the driver of a certain vehicle 1. This approach could be used together with the cellular or smart broadcast push approach, or it could be used in connection with a pull approach. According to this pull approach, a system 200 inside or close to a certain cell is enabled to request real-time image information I1 from a certain camera C1 - C10 within or close to the same cell, e.g. by using the system's user interface.
  • In another embodiment the pull approach is used without any preselection based on the vehicle's current position inside a cell. This approach is herein called the user-specific pull approach. Here the user is enabled by a software module of the system 300 to request real-time image information I1 provided by a certain camera CX, no matter where he or his vehicle 1 is located. This real-time image information I1 is directly or indirectly requested from the remote system 100 and sent to the user, vehicle 1 or system 300 using a dedicated downlink, e.g. a mobile phone connection.
  • With current radio broadcast methods, traffic information is transmitted in some countries to radio receivers, e.g. using the Traffic Message Channel (TMC). TMC is a specific application of the FM Radio Data System (RDS) used for broadcasting real-time traffic and weather information. The TMC information is used in order to allow a dynamic route calculation in case of traffic jams and the like. Real-time image information I1 could be transmitted together with radio signals in a broadcast fashion, but this service should be limited to local radio transmitters, because it would not make much sense for a vehicle 1 in one city to receive images from cameras in other cities.
  • With the deployment of digital radio, high bandwidth channels become available in particular in urban areas. Digital radio describes radio communications technologies which carry information as a digital signal. Since a digital modulation method is used for the transmission, the vehicle 1 or navigation system 2 in this case comprises a digital demodulator in order to be able to receive and process the digital signals. The digital radio service could be used to transmit real-time image information I1 in a broadcast fashion.
  • It is also possible to use mobile phones and stationary transmitters to transmit the real-time image information I1. For this purpose either a regular phone is programmed to receive the information I1, or a separate communications module (e.g. comprising a SIM card) is implemented inside the inventive system 300 or is connectable to the system 300. Such an approach is preferably used to realize the cellular or smart broadcast push approach or the user-specific pull approach.
  • According to the invention, image data messages (real-time image information I2) are received silently and decoded by a car radio, a mobile phone, a PDA, a smart phone or a navigation system 2, and delivered (e.g. made visible) to the driver in a variety of ways.
  • The system 300 in all embodiments includes a display 4 or other means or indicators which can be dedicated or shared with the ones already existing in the vehicle 1. Besides visual indicators the navigation system 2 may also include audible means or other means to inform the driver.

Claims (14)

  1. Method for providing information assisting a driver of a vehicle (1) which is equipped with a navigation system (2),
    - Providing real-time image information (I1) to a remote system (100), said real-time image information (I1) having been obtained from at least one road-side camera (C1),
    - Displaying an object (3) on a display (4) of said navigation system (2) which represents at least one road-side camera (C1),
    - selecting a corresponding road-side camera (C1) using a user interface of said navigation system (2),
    - transmitting image information (I2) based on said real-time image information (I1) provided by the corresponding road-side camera (C1) which has been selected, from said remote system (100) to said navigation system (2),
    - displaying an image (I3) based on said image information (I2) on said display (4) in order to provide visual information of said specific section of a route (R1).
  2. Method according to claim 1, wherein said visual information is provided if a route finding process of said navigation system (2) is active.
  3. Method according to claim 1 or 2, wherein said object (3) which is displayed on said display (4) indicates or represents one or more of the following:
    - a viewing direction (102) of a corresponding road-side camera (C1),
    - a line of sight of a corresponding road-side camera (C1),
    - a field of view (101) of a corresponding road-side camera (C1),
    - a range (103) of a corresponding road-side camera (C1).
  4. Method according to claim 1, 2 or 3, wherein said display (4) is a touch-sensitive display (4) and wherein said object (3), which is displayed on said display (4), is temporarily magnified if the driver touches or pre-selects the object.
  5. Method according to one of the claims 1 through 4, wherein said image information (I2) is displayed in an orientation which matches the actual orientation of a map of said specific section of the route (R1) shown on said display (4).
  6. Method according to one of the claims 1 through 5, comprising the step
    - displaying said image information (I2) in an overlay mode above an artificial map on said display (4), or,
    - in a dual-screen application displaying said image information (I2) in a separate window (5) or frame of said display (4) while displaying a map in another window or frame of said display (4).
  7. Method according to one of the preceding claims, wherein said image (I3) contains or is built using static image information and dynamic or quasi dynamic image information.
  8. Method according to claim 7, wherein said static image information is provided by said navigation system (2) or by a portable computing system connected to said navigation system (2), and wherein said dynamic or quasi dynamic image information is provided via said remote system (100).
  9. Method according to claim 7, wherein said static image information and said dynamic or quasi dynamic image information are provided via said remote system (100).
  10. Method according to one of the preceding claims, wherein said image information (I2) is obtained by the application of a compression scheme to said real-time image information (I1).
  11. System (200) for use in a vehicle (1), said system comprising:
    - a navigation system (2),
    - a display (4) for displaying information provided by said navigation system (2),
    - a receiver for receiving image information (I2) from a remote system (100),
    - a software module causing an object (3) representing a road-side camera (C1) to be displayed on said display (4),
    - a user interface for interaction of a driver with said system (200), said user interface enabling the driver to select a road-side camera (C1), said system being enabled to receive said image information (I2) from said remote system (100) and to display an image (I3) based on said image information (I2) on said display (4).
  12. System of claim 11, wherein said image information (I2) is requestable by means of a manual interaction with said user interface.
  13. System of claim 11 or 12, wherein said display (4) is a touch-sensitive display (4) and wherein said software module enables a user to request said image information (I2) by activation or selection of said object (3).
  14. A remote system (100) for offering information assisting a driver of a vehicle (1) which is equipped with a navigation system (2), said system (100) comprising
    - a computing server,
    - a storage system,
    - a communication link for receiving image information (I1) from at least one road-side camera,
    - a communication link for sending image information (I2) to a navigation system (2),
    - a software module for processing said image information (I1).
EP10189342.8A 2010-10-29 2010-10-29 Method for providing information assisting a driver of a vehicle and a corresponding system for use in and remote of a vehicle Not-in-force EP2447925B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP10189342.8A EP2447925B1 (en) 2010-10-29 2010-10-29 Method for providing information assisting a driver of a vehicle and a corresponding system for use in and remote of a vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP10189342.8A EP2447925B1 (en) 2010-10-29 2010-10-29 Method for providing information assisting a driver of a vehicle and a corresponding system for use in and remote of a vehicle

Publications (2)

Publication Number Publication Date
EP2447925A1 true EP2447925A1 (en) 2012-05-02
EP2447925B1 EP2447925B1 (en) 2017-05-17

Family

ID=43629057

Family Applications (1)

Application Number Title Priority Date Filing Date
EP10189342.8A Not-in-force EP2447925B1 (en) 2010-10-29 2010-10-29 Method for providing information assisting a driver of a vehicle and a corresponding system for use in and remote of a vehicle

Country Status (1)

Country Link
EP (1) EP2447925B1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11351932B1 (en) 2021-01-22 2022-06-07 Toyota Motor Engineering & Manufacturing North America, Inc. Vehicles and methods for generating and displaying composite images
CN114666382A (en) * 2022-03-17 2022-06-24 北京斯年智驾科技有限公司 Parallel driving system for automatic driving semi-mounted collecting card
CN115731707A (en) * 2022-11-14 2023-03-03 东南大学 Highway vehicle traffic control method and system

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001082261A1 (en) * 2000-04-24 2001-11-01 Kim Sug Bae Vehicle navigation system using live images
WO2007057696A1 (en) * 2005-11-18 2007-05-24 Tomtom International B.V. A navigation device displaying traffic information

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001082261A1 (en) * 2000-04-24 2001-11-01 Kim Sug Bae Vehicle navigation system using live images
WO2007057696A1 (en) * 2005-11-18 2007-05-24 Tomtom International B.V. A navigation device displaying traffic information

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
EITARO ICHIHARA; HIROYUKI TAKAO; YUICHI OHTA: "NaviView: Bird's-Eye View for Highway Drivers Using Roadside Cameras", IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA COMPUTING AND SYSTEMS (ICMCS'99), vol. 2, 1999, pages 559
KOHEI ITO; NAOHIKO ICHIHARA; HIROTO INOUE; RYUJIRO FUJITA; MITSUO YASUSHI: "Car navigation system with image recognition", DIGEST OF TECHNICAL PAPERS INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS, 2009, pages 1 - 2, XP031466912
S. DASHTINEZHAD, TRAFFICVIEW: A DRIVER ASSISTANT DEVICE FOR TRAFFIC MONITORING BASED ON CAR-TO-CAR COMMUNICATION
Y. KAMEDA ET AL.: "Visual Assistance for Drivers by Mixed Reality", 14TH WORLD CONGRESS OF ITS, 2007

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11351932B1 (en) 2021-01-22 2022-06-07 Toyota Motor Engineering & Manufacturing North America, Inc. Vehicles and methods for generating and displaying composite images
EP4032752A1 (en) * 2021-01-22 2022-07-27 Toyota Motor Engineering & Manufacturing North America, Inc. Vehicle and method for generating and displaying composite images
CN114666382A (en) * 2022-03-17 2022-06-24 北京斯年智驾科技有限公司 Parallel driving system for automatic driving semi-mounted collecting card
CN115731707A (en) * 2022-11-14 2023-03-03 东南大学 Highway vehicle traffic control method and system
CN115731707B (en) * 2022-11-14 2024-03-19 东南大学 Highway vehicle traffic control method and system

Also Published As

Publication number Publication date
EP2447925B1 (en) 2017-05-17

Similar Documents

Publication Publication Date Title
US9406225B2 (en) Traffic data services without navigation system
CN103218910B (en) System and method for operation of traffic information
KR101177386B1 (en) Method and apparatus for providing transportation status information and using it
CN103606291B (en) A kind of information processing method, Apparatus and system
US7880645B2 (en) Method and apparatus for providing and using public transportation information containing bus stop-connected information
US8392099B2 (en) Method of providing detail information using multimedia based traffic and travel information message and terminal for executing the same
CN106104566A (en) By system in a vehicle
CN102881157A (en) Individualized traffic guidance method and individualized traffic guidance system on basis of mobile terminal display
JP2009541884A (en) Method and apparatus for transmitting vehicle related information in and from a vehicle
JP2009539173A (en) Method and apparatus for providing traffic information by lane and using the information
CN104820669A (en) System and method for enhanced time-lapse video generation using panoramic imagery
CN104833368A (en) Live-action navigation system and method
JP2007155341A (en) Route guide system and method
WO2011140859A1 (en) Method and system for providing graphical real-time traffic information
JP2006277546A (en) Information providing system and information providing method
WO2016138942A1 (en) A vehicle assistance system
EP2447925B1 (en) Method for providing information assisting a driver of a vehicle and a corresponding system for use in and remote of a vehicle
WO2015001677A1 (en) Safety assistance system and safety assistance device
KR100873191B1 (en) Method and apparatus for providing traffic and travel information by synopsis map
CN105091895A (en) Concern prompt system and method
KR101448895B1 (en) Traffic light control system based on the TPEG information
JP2007057280A (en) Car navigation system
CN113034943A (en) Holographic intersection video display system and method
Mammano et al. Pathfinder status and implementation experience
KR20160112358A (en) Road traffic information providing device through the interlocking of road electric bulletin board and vehicle navigation

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

17P Request for examination filed

Effective date: 20121029

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: HEUSCH, CHRISTIAN

RIN1 Information on inventor provided before grant (corrected)

Inventor name: HEUSCH, CHRISTIAN

17Q First examination report despatched

Effective date: 20160210

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

INTG Intention to grant announced

Effective date: 20161212

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 895104

Country of ref document: AT

Kind code of ref document: T

Effective date: 20170615

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602010042378

Country of ref document: DE

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20170517

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 895104

Country of ref document: AT

Kind code of ref document: T

Effective date: 20170517

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 8

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170817

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170818

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170817

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170917

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602010042378

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20180220

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20171029

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20171031

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20171031

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20171031

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20171031

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20171029

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 9

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20171029

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20101029

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170517

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170517

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20201022

Year of fee payment: 11

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20211022

Year of fee payment: 12

Ref country code: DE

Payment date: 20211020

Year of fee payment: 12

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20211031

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 602010042378

Country of ref document: DE

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20221029

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20230503

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20221029