EP3089136A1 - Apparatus and method for detecting an object in a surveillance area of a vehicle - Google Patents

Apparatus and method for detecting an object in a surveillance area of a vehicle

Info

Publication number
EP3089136A1
Authority
EP
European Patent Office
Prior art keywords
vehicle
camera
radar sensor
control unit
surveillance area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP15165858.0A
Other languages
German (de)
English (en)
Inventor
Huba NÉMETH
Marton GYÖRI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Knorr Bremse Systeme fuer Nutzfahrzeuge GmbH
Original Assignee
Knorr Bremse Systeme fuer Nutzfahrzeuge GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Knorr Bremse Systeme fuer Nutzfahrzeuge GmbH
Priority to EP15165858.0A
Publication of EP3089136A1
Current legal status: Pending

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/165 Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • G08G1/167 Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • G08G1/168 Driving aids for parking, e.g. acoustic or visual feedback on parking space

Definitions

  • The present invention relates to an apparatus and a method for detecting an object in a surveillance area of a vehicle and, in particular, to an advanced turning-assistant system for commercial vehicles.
  • Lane changes or turning maneuvers are potentially critical traffic situations for commercial vehicles, in which other traffic participants are easily overlooked.
  • Blind spot assist systems are used to prevent accidents in such situations by supporting the driver, for example, during lane changes or turns.
  • In passenger cars, a driver warned by such a turn-assist system is in most cases able to confirm the warning by a visual inspection of the respective area, for example by moving or turning her/his head.
  • In commercial vehicles, this is in most cases not possible, because the driver has no view of, for example, the passenger side of the vehicle and thus cannot confirm a warning given by a turning-assist system by visual inspection.
  • DE 10 2010 048 144 A1 discloses a vehicle which has a sensor to detect objects in a surrounding area of the vehicle, wherein the surrounding area extends along one long side of the vehicle.
  • DE 10 2009 041 556 A1 discloses a blind spot assist system based on ultrasonic sensors, which are arranged along a side and a corner region of the vehicle.
  • DE 10 2012 010 876 A1 discloses another drive assist system based on image sensors which are arranged between the front axle and the rear axle in a lateral region of the vehicle main portion.
  • Further driver assist systems are disclosed in WO 2014/037064 A1, with sensors arranged at a vehicle trailer, and in WO 2014/041023, which describes a collision warning system for lane changes.
  • The present invention solves the afore-mentioned problem by providing an apparatus according to claim 1 and a method according to claim 15.
  • The dependent claims refer to specifically advantageous realizations of the subject matter of claim 1.
  • The present invention relates to an apparatus for detecting an object in a surveillance area of a vehicle.
  • The apparatus comprises at least one camera for capturing image data of the surveillance area, at least one radar sensor (or radar unit) for providing information indicative of a distance to the object, and a control unit configured to receive the image data and the information from the radar sensor(s) and to generate a visual representation of the object based on the received image data and/or on the information indicative of the distance.
  • The surveillance area can be any area surrounding the vehicle and may cover, in particular, an adjacent driving lane or a sidewalk or a bicycle lane.
  • Each sensor has its own coverage area from which the sensor can take sensor data; this coverage area is possibly larger than the surveillance area, so that both sensors cover a common area that can be identified as the surveillance area, or of which the surveillance area is a part. Therefore, the surveillance area is part of both coverage areas.
  • The distance may be defined as the distance to the object measured from the respective sensor or from the vehicle (or any particular part of the vehicle).
  • The present invention does not rely on any particular coordinate system. Rather, the position, distance or angle may be defined with respect to any desired coordinate system (e.g. the distance(s) or angle(s) can be measured with respect to the sensor units).
  • Information indicative of a distance shall be interpreted very broadly and shall contain any information from which a distance between the vehicle (or the radar sensor) and the object can be derived.
  • This information may include, in particular, the time delay between a transmitted and a reflected radar signal measured by the radar sensor, or the corresponding moments in time of the emission and reception of the used (RF) signals. Based on the time delay, the two measured moments in time or any other such information, the control unit can then determine or calculate the distance to and/or the position of the object.
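  • For illustration only, this time-of-flight computation can be sketched in a few lines of Python; the function name, the timestamps and the use of the vacuum speed of light as propagation speed are assumptions made for this example, not details taken from the disclosure:

```python
# Illustrative sketch: distance from the round-trip delay of a radar pulse.
C = 299_792_458.0  # assumed propagation speed of the RF signal in m/s

def distance_from_delay(t_emit: float, t_receive: float) -> float:
    """Distance to the reflecting object from emission/reception timestamps.

    The signal travels to the object and back, so the one-way distance
    is half the round trip: d = c * (t_receive - t_emit) / 2.
    """
    return C * (t_receive - t_emit) / 2.0

# Example: a round-trip delay of 167 ns corresponds to roughly 25 m.
print(distance_from_delay(0.0, 167e-9))  # ~25.03
```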
  • The above-mentioned problem is thus solved by the present invention by providing a redundancy of two independent sensor units of different types, which enables a visual feedback to the driver while avoiding false detections (e.g. due to abnormal weather conditions affecting one sensor type).
  • The apparatus may further comprise a display unit that is configured to receive the visual representation of the object from the control unit and to show at least part of the surveillance area together with the object to the driver of the vehicle.
  • The display may be configured to show to the driver the image captured by the at least one camera as one image and, as a second image, to display a radar picture obtained from the information provided by the at least one radar sensor.
  • The at least one radar sensor may be configured to transmit a radio frequency (RF) signal and to receive a reflected RF signal from the object.
  • The at least one radar sensor may further be configured to determine an angle or an angular range from which the RF signal was received.
  • The control unit may be configured to determine a position of the object based on the determined angle or angular range and the information indicative of the distance.
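  • A minimal sketch of such a position computation is given below, assuming a polar-to-Cartesian conversion in a sensor-centered coordinate frame; the frame itself is an assumption, since the invention does not rely on any particular coordinate system:

```python
import math

def object_position(distance_m: float, angle_rad: float) -> tuple[float, float]:
    """Convert a radar measurement (range, bearing) into x/y coordinates.

    Assumed frame: origin at the radar sensor, x along the vehicle side,
    y pointing away from the vehicle.
    """
    return (distance_m * math.cos(angle_rad),
            distance_m * math.sin(angle_rad))

# Example: an object at 10 m, 30 degrees off the sensor axis.
x, y = object_position(10.0, math.radians(30.0))
print(f"x={x:.2f} m, y={y:.2f} m")  # x=8.66 m, y=5.00 m
```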
  • Hence, two visual representations are available: one referring to the image captured by the camera(s) and a second derived from the information received from the radar sensor(s). Both images may either be overlaid on the display or be displayed as separate images, e.g. adjacent to each other.
  • The radar sensor may not distinguish between the object of interest and background objects (e.g. trees or buildings which the driver is aware of). Therefore, in order to detect an object of interest (i.e. to distinguish it from the background), subsequent positioning steps may be implemented.
  • Multiple objects of interest may also be present in the surveillance area.
  • The at least one radar sensor may be configured to repeatedly measure a distance between one or more candidate objects and the vehicle.
  • The control unit may receive the repeatedly measured distances and, based thereon, can determine a relative motion of the detected objects with respect to the vehicle. If the objects move relative to each other, the reflected radar signals can be identified as signals originating from different objects.
  • Background objects can be detected as static objects at a larger distance (e.g. beyond a certain threshold). As a result, the apparatus is able to eliminate background objects and/or to detect one or more further objects of interest.
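  • The following sketch illustrates how repeatedly measured distances could be used to separate static background from objects of interest; the track structure and the threshold values are illustrative assumptions, not values from the disclosure:

```python
def classify_tracks(track_history, background_range_m=25.0, motion_eps_m=0.3):
    """Split radar tracks into objects of interest and background.

    track_history maps a track id to the distances measured in subsequent
    scans. A track whose distance barely changes and which stays beyond
    the range threshold is treated as static background (e.g. trees).
    """
    objects, background = [], []
    for track_id, distances in track_history.items():
        is_static = (max(distances) - min(distances)) < motion_eps_m
        if is_static and min(distances) > background_range_m:
            background.append(track_id)  # static and far away
        else:
            objects.append(track_id)     # moving or close: of interest
    return objects, background

tracks = {"a": [30.1, 30.0, 30.1], "b": [12.0, 11.2, 10.3]}
print(classify_tracks(tracks))  # (['b'], ['a'])
```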
  • When only one camera is available, there may be a problem in identifying an object as being different from the background.
  • For example, the ground or the driving lane may have certain patterns, which are typically not objects of any interest to the driver and thus should be eliminated in the captured images.
  • As with the radar sensor, this can be achieved by taking subsequent images, so that any relative motion of the vehicle with respect to the object and with respect to the background can be identified. Hence, any background object can be identified and thus eliminated from the captured images.
  • For this purpose, the at least one camera may be configured to capture subsequent images.
  • The control unit may further be configured to process the subsequent images received from the at least one camera using a processing algorithm to detect the object in the surveillance area and/or to eliminate background objects and/or to detect one or more further objects.
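  • One conceivable processing algorithm is simple frame differencing between subsequent images, sketched below; this is an illustrative assumption rather than the algorithm prescribed by the disclosure, and a real system would additionally compensate the ego-motion of the vehicle:

```python
import numpy as np

def moving_object_mask(prev_frame: np.ndarray,
                       curr_frame: np.ndarray,
                       threshold: int = 25) -> np.ndarray:
    """Mark pixels that change strongly between two subsequent frames.

    (Almost) unchanged pixels are treated as background, such as
    patterns on the ground or the driving lane; the threshold is an
    illustrative assumption.
    """
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return (diff > threshold).astype(np.uint8)

prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[1:3, 1:3] = 200  # a small bright object entered the view
print(moving_object_mask(prev, curr))
```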
  • The same aim could be achieved if multiple cameras are available, so that at least two pictures can be taken from different angles, which allows objects belonging to the background, or more distant objects, to be identified.
  • The control unit might be configured to fuse (combine) both sensor signals (or sensor images) into one single sensor image. This can, for example, be done by highlighting, in the visual representation of the image captured by the camera, the object detected by the at least one radar sensor. Hence, it is not necessary to detect the object by the camera(s). Instead, the position of an object detected by the radar sensor or the control unit can simply be highlighted in the image taken by the camera(s), so that the driver can decide about the relevance of this object. Therefore, in yet another embodiment, the control unit is configured to combine information received from the at least one radar sensor and the at least one camera and to highlight the object or multiple objects in the image shown to the driver of the vehicle.
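  • As a hedged illustration of such a fusion, the sketch below maps a radar detection into a rough bounding box in the camera image; the crude linear mapping and all parameters are assumptions made for the example, whereas a real system would use a calibrated camera model:

```python
import math

def highlight_radar_object(image_w, image_h, distance_m, angle_rad,
                           fov_rad=math.radians(120), max_range_m=25.0):
    """Return an (x, y, w, h) box marking a radar detection in the image.

    Bearing is mapped linearly to the image column and range to the
    image row (farther objects appear higher up and smaller).
    """
    col = int((angle_rad / fov_rad + 0.5) * image_w)
    row = int((1.0 - distance_m / max_range_m) * image_h)
    size = max(8, int(80 * (1.0 - distance_m / max_range_m)))
    return (col - size // 2, row - size // 2, size, size)

print(highlight_radar_object(1280, 720, 10.0, math.radians(15)))  # (776, 408, 48, 48)
```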
  • The control unit may be configured to predict a path of the vehicle and/or a path of the object, and may further be configured to issue a warning if the path of the vehicle intersects the path of the object. Alternatively, the relative path between the object and the vehicle can be determined.
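  • A minimal sketch of such a path prediction is given below, assuming constant velocities over a short horizon; the horizon, step size and warning distance are illustrative assumptions:

```python
import numpy as np

def paths_intersect(p_veh, v_veh, p_obj, v_obj,
                    horizon_s=5.0, dt=0.1, warn_dist_m=2.0):
    """Return the first time at which the predicted separation between
    vehicle and object drops below warn_dist_m, or None.

    Positions and velocities are 2-D arrays in a common ground frame;
    straight, constant-velocity paths are assumed for this sketch.
    """
    for step in range(int(horizon_s / dt) + 1):
        t = step * dt
        gap = np.linalg.norm((p_veh + v_veh * t) - (p_obj + v_obj * t))
        if gap < warn_dist_m:
            return t  # a warning would be issued here
    return None

t = paths_intersect(np.array([0.0, 0.0]), np.array([5.0, 0.0]),
                    np.array([20.0, 1.0]), np.array([0.0, 0.0]))
print(t)  # ~3.7 s: stationary object ahead, slightly to the side
```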
  • The warning can be issued by a separate warning module (for example a loudspeaker or a warning light), which may or may not be part of the control unit and may or may not already be included in the vehicle, but may be controlled by the control unit.
  • The warning may comprise an acoustic signal and/or a haptic signal and/or an optical signal.
  • The haptic signal may be related to any kind of vibration which the driver is able to recognize.
  • The control unit may further be configured to issue a brake signal to enforce a braking and/or a steer signal to enforce a steering of the vehicle in case the path of the vehicle and the path of the object indicate a collision for a predetermined period of time (e.g. if the driver ignores the warning and a collision is going to happen if the vehicle stays on its path).
  • The predetermined period can be selected such that the remaining time to the collision is as long as the vehicle needs to prevent the collision (i.e. it may define the latest moment).
  • The optional warning module can issue a constant warning signal, upon which the driver can interact to override the enforced braking and/or steering of the vehicle by taking appropriate actions to avoid the collision with the object.
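  • The latest moment mentioned above could, for example, be derived by comparing the remaining time to collision with the stopping time of the vehicle; the following sketch and its numbers are illustrative assumptions only:

```python
def must_brake_now(gap_m: float, closing_speed_mps: float,
                   max_decel_mps2: float = 5.0, margin_s: float = 0.5) -> bool:
    """Decide whether the latest moment for an enforced braking is reached.

    Time to collision at constant closing speed is gap / speed; the time
    needed to stop is speed / max_decel. Braking is enforced once the
    remaining time no longer exceeds the stopping time plus a margin.
    """
    if closing_speed_mps <= 0.0:
        return False  # not closing in on the object
    time_to_collision = gap_m / closing_speed_mps
    stopping_time = closing_speed_mps / max_decel_mps2
    return time_to_collision <= stopping_time + margin_s

print(must_brake_now(gap_m=12.0, closing_speed_mps=8.0))  # True
```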
  • The at least one camera may comprise a first and a second camera, and the at least one radar sensor may comprise a first and a second radar sensor, wherein the first camera and the first radar sensor are installed on one side of the vehicle and the second camera and the second radar sensor are installed on another side of the vehicle, to detect objects on different sides of the vehicle.
  • The at least one camera may comprise a fish-eye lens camera to capture image data of a large area (e.g. extending the viewing angle to more than 90° or up to 180°).
  • The control unit may further be configured to transform the image data received from the at least one camera into a bird's eye view (e.g. by using a conversion method).
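  • Such a conversion can, for example, be realized as a perspective warp; the sketch below uses OpenCV, and the four calibration points of a known ground rectangle are assumed values that would come from a one-time calibration of the installed camera:

```python
import cv2
import numpy as np

def to_birds_eye(frame: np.ndarray) -> np.ndarray:
    """Warp a camera frame (already de-fished) into a bird's eye view."""
    h, w = frame.shape[:2]
    # Assumed pixel positions of a rectangle on the ground plane.
    src = np.float32([[w * 0.2, h * 0.6], [w * 0.8, h * 0.6],
                      [w * 1.0, h * 1.0], [w * 0.0, h * 1.0]])
    dst = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    m = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(frame, m, (w, h))

frame = np.zeros((720, 1280, 3), dtype=np.uint8)
print(to_birds_eye(frame).shape)  # (720, 1280, 3)
```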
  • The present invention also relates to a vehicle with one of the described apparatuses, wherein the surveillance area may include an area at a side of the vehicle that extends over an angular range of more than 90° or up to 170° measured about the side of the vehicle.
  • The present invention also relates to a method for detecting an object in a surveillance area of a vehicle.
  • The method comprises: capturing image data of the surveillance area by at least one camera; providing information indicative of a distance to the object by at least one radar sensor; receiving the image data and the information by a control unit; and generating, by the control unit, a visual representation of the object based on the received image data and/or the information indicative of the distance.
  • This method may also be implemented in software or as a computer program product, and the order of the steps may not be important for achieving the desired effect.
  • All functions described in conjunction with the apparatus may also be implemented in further embodiments of the method.
  • Fig. 1 depicts an apparatus for detecting an object 10 in a surveillance area 30, 50 of a vehicle 70.
  • The apparatus comprises one camera 110 for capturing image data of the surveillance area 30, one radar sensor 120 for providing a distance d to the object 10, and a control unit 130.
  • The control unit 130 is configured to receive the image data and the information from the radar sensor 120 and to generate a visual representation of the object 10 based on the received image data and/or the distance d.
  • The apparatus is installed on an exemplary commercial vehicle 70 with a towing vehicle 71 (e.g. a tractor) and a trailer 72, wherein only one camera 110 and only one radar sensor 120 are installed at a right-hand side of the towing vehicle 71 in the driving direction (to the left in Fig. 1).
  • The control unit 130 and an optional display 140 may further be installed on the towing vehicle 71.
  • The control unit 130 is connected to the camera 110, to the radar sensor 120 and to the display 140 to receive the signals from the camera 110 and the radar sensor 120 and to supply corresponding visual data to the display 140.
  • The radar sensor 120 may provide an angular resolution of the detected object 10, so that the control unit 130 is able to determine a position of the object 10 (e.g. given by the distance d and the angle α).
  • The camera 110 may capture images in the coverage area 30, and the radar sensor 120 has a range indicated by the dotted line 50.
  • Both coverage areas 30, 50 overlap significantly, and the overlap may define the surveillance area.
  • The range 50 of the radar sensor 120 is limited by the radar signal used.
  • The one or more radar sensors 120 may operate with radio frequency signals at exemplary frequencies between 10 and 100 GHz, e.g. at about 24 GHz and/or at about 77/79 GHz.
  • For example, one radar sensor may operate at 24 GHz whereas another may operate at 77/79 GHz.
  • The detection range may go up to 100 meters, although in the target operating mode the range may be around 25 meters.
  • Such a range is advantageous because it allows a whole side of the vehicle 70 to be covered with one radar sensor 120, which can thus be installed on the towing vehicle 71; there is no need to install any sensor on the trailer 72 (hence the system operates trailer-independently).
  • The radar sensor also provides more accurate information about the detected objects.
  • The radar coverage area 50 can also be smaller than the camera coverage area 30.
  • Radar sensors use electromagnetic wave signals which can travel a sufficiently long distance to extend the coverage area 50, so that only one radar sensor 120 suffices to cover the whole side of the vehicle 70 and there is no need to distribute many radar sensors over the whole long side of the vehicle 70 (as in conventional systems).
  • The camera 110 and the radar sensor 120 may be installed at different locations.
  • For example, the camera 110 may be installed at an upper portion of the driver cabin and the radar unit 120 may be installed at a lower portion (e.g. between two axles). Consequently, the viewing direction of the camera 110 is downward, which limits the coverage area 30 of the camera 110.
  • The radar unit 120 may emit signals parallel to the ground, so that the coverage area 50 of the radar unit 120 is merely limited by the range of the used signals.
  • Hence, the range 50 of the radar sensor 120 may be larger than the coverage area 30 covered by the camera 110 (although this area can be changed by changing the orientation of the camera 110).
  • Fig. 2 depicts a further embodiment of the apparatus according to the present invention, which differs from the embodiment shown in Fig. 1 in that the camera 110 is attached near or at a corner of the towing vehicle 71. With the camera 110 installed in the corner region of the towing vehicle 71, it becomes possible to extend the coverage area 30 of the camera 110 to cover also the front side of the vehicle 70. All other features depicted in Fig. 2 are the same as in Fig. 1, so that a repetition of their description is not needed here.
  • Already a single radar sensor 120 can be used to obtain a radar image, which is suitable to be shown to the driver as an additional visual representation of the surveillance area.
  • The radar sensor 120 is configured to measure the distance d between the radar sensor 120 and the object 10.
  • For this purpose, the radar sensor may emit an RF signal, which is reflected by the object 10 and returns to the radar sensor 120.
  • The radar sensor 120 measures the time difference between the emission of the RF signal to the object 10 and the reception of the return signal. From this time difference, taking into account the propagation speed of the wave signal, the control unit 130 (or the radar sensor 120 itself) can determine the distance d from the radar sensor to the object 10.
  • The radar sensor 120 may further be configured to provide an angular resolution of the detected return signals.
  • The radar sensor 120 can scan the coverage area 50, for example, starting on the left-hand side in Fig. 2 and subsequently scanning the area up to the right-hand side of Fig. 2 (or vice versa).
  • Thus, the radar sensor 120 can determine the corresponding angle α associated with a reflecting object.
  • The radar sensor 120 may be configured to emit pulses of RF signals, for example one pulse for each angular value (or angular range), and if a reflected return signal is received, the radar sensor can assign the corresponding distance d and angle α to the detected object 10.
  • In the presence of multiple objects, the radar sensor may receive multiple reflected signals from the multiple objects 10a, 10b. However, if the multiple objects 10 do not move relative to each other, the radar sensor 120 (or the control unit 130) cannot distinguish whether the multiple reflected signals originate from different parts of only one object or whether two objects are present in the coverage area 50. The speed of each object can, however, be determined by performing subsequent scans over the coverage area 50. If the independent objects move relative to each other, they can be identified as different objects, and the radar sensor 120 can assign the corresponding distances d1, d2 and angles α1, α2 to the detected first and second objects 10a, 10b.
  • Hence, multiple objects can be detected or identified (by scanning the angular range) if the relative velocity and/or relative position between the multiple objects exceeds a certain threshold.
  • These objects may be (visually) separated and identified as separate objects.
  • The field of view of these sensors may go up to 170 degrees.
  • Hence, the at least one radar sensor 120 can provide sufficient information to detect the object(s) in the coverage area 50 and to determine the position of the object(s) 10. The determined position may then be used to provide a further visual feedback to the driver.
  • This representation may be combined with the picture captured with the camera 110.
  • For example, the control unit 130 can overlay both pictures and highlight a depicted object 10.
  • In addition, any background object can be eliminated (e.g. simply by ignoring objects beyond a particular threshold).
  • Fig. 3 depicts a further embodiment of the apparatus installed on a vehicle.
  • This embodiment comprises a first radar sensor 121, a second radar sensor 122 and a third radar sensor 123.
  • The first radar sensor 121 is installed on the right-hand side of the towing vehicle 71.
  • The second radar sensor 122 is installed on the left-hand side of the towing vehicle 71.
  • The third radar sensor 123 is installed at a front side of the towing vehicle 71.
  • In addition, the embodiment comprises a first camera 111 and a second camera 112.
  • The first camera 111 is installed at the front right corner of the towing vehicle 71.
  • The second camera 112 is attached at the front left corner of the towing vehicle 71.
  • The first camera 111 and the first radar sensor 121 may cover the right-hand side of the vehicle 70 (as described in conjunction with Fig. 1).
  • The second camera 112 and the second radar sensor 122 cover the left-hand side of the vehicle.
  • This coverage is analogous to the embodiment described with Fig. 1.
  • The first camera 111 has a first coverage area 31 (around the right corner) and the second camera 112 has a second coverage area 32 (around the left corner).
  • The third radar sensor 123 covers the range 53 in front of the vehicle 70, which overlaps with the radar coverage area 51 on the right-hand side and the radar coverage area 52 on the left-hand side.
  • Fig. 3 thus shows an embodiment which provides a maximum extension of the coverage to all sides of the moving vehicle.
  • The display unit 140 can thus depict three pictures (or visual representations) of the three sides of the vehicle 70, wherein each visual representation is obtained in the same way as described in conjunction with Figs. 1, 2.
  • Further cameras and/or further radar sensors may also be installed at the same side of the vehicle, in which case the resolution can be further increased.
  • In this way, any further (hidden) object, e.g. behind the one object as shown in Fig. 1 or Fig. 2, can also be detected.
  • Furthermore, not only the towing vehicle 71 can be used for attaching the at least one camera 110 and the at least one radar sensor 120; further, optional cameras and/or radar sensors may also be attached to the trailer 72. However, installing the camera(s) and the radar sensor(s) merely on the towing vehicle 71 provides the advantage that the trailer can be changed freely while ensuring the correct operation of the turning-assistant system.
  • Fig. 4 shows an example of the visual feedback to the driver, for example shown on the display 140 of the system architecture of Figs. 1 and 2.
  • In this example, the driver can see the object 10 (for example a bicycle rider) as a marked object beside the vehicle.
  • The object marking can either be generated based on the captured image data of the at least one camera 110 or in combination with the at least one radar sensor 120, which also detects an object 10 travelling at the side of the vehicle 70.
  • Hence, both sensor units 110, 120 may identify the same object 10, which is marked in the visual feedback to the driver.
  • This marking may be interpreted by the driver as a confirmation that both systems 110, 120 have detected the same object at the same position.
  • As a result, the driver obtains a high level of confidence in the situation depicted on the display 140.
  • Fig. 4 shows a particular raw camera view, i.e. a view as seen from the position of the camera 110. However, it may be advantageous to manipulate the camera view such that it represents a bird's eye view.
  • Such a converted picture is shown in Fig. 5, which again shows the exemplary bicycle rider as object 10, who uses a bike lane parallel to the traffic direction of the vehicle 70. This bird's eye view can be shown to the driver in combination with the camera view, as an alternative to it, or as a selectable option, so that the driver can choose the way of viewing which s/he prefers.
  • Fig. 5 further shows the area 50 covered by the radar sensor 120 as a dotted line, so that the driver can see whether the detected object is reliably detected by both sensors or whether the object is outside the coverage area 50 of the radar sensor 120.
  • Fig. 6 depicts a method for detecting an object 10 in a surveillance area 30, 50 of a vehicle 70.
  • The method comprises the steps of: capturing S110 image data of said surveillance area 30 by at least one camera 110; providing S120 information indicative of a distance to said object 10 by at least one radar sensor 120; receiving S130 said image data and said information by a control unit 130; and generating S140, by said control unit 130, a visual representation of said object 10 based on said received image data and/or said information indicative of said distance.
  • This method may also be a computer-implemented method.
  • A person of skill in the art would readily recognize that steps of the various above-described methods might be performed by programmed computers.
  • Embodiments are also intended to cover program storage devices, e.g. digital data storage media, which are machine- or computer-readable and encode machine-executable or computer-executable programs of instructions, wherein the instructions perform some or all of the acts of the above-described methods when executed on a computer or processor.
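  • As a hedged illustration, the steps S110 to S140 could be wired together in software as sketched below; the device interfaces are stand-ins invented for this example, not APIs defined by the disclosure:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    distance_m: float
    angle_rad: float

def run_cycle(camera, radar, display) -> None:
    """One cycle of the method: S110 capture, S120 measure, then
    S130/S140 receive both inputs and generate the visual feedback."""
    image = camera.capture()         # S110: capture image data
    detections = radar.measure()     # S120: information indicative of distance
    display.show(image, detections)  # S130/S140: fused visual representation

class FakeCamera:
    def capture(self): return "frame-0"
class FakeRadar:
    def measure(self): return [Detection(10.0, 0.26)]
class FakeDisplay:
    def show(self, image, detections): print(image, detections)

run_cycle(FakeCamera(), FakeRadar(), FakeDisplay())
```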
  • The visual feedback (for example the displayed camera view) to the driver may allow the driver to have a better understanding of the situation and assist the driver in finding the correct action to avoid any accident.
  • With a single camera view, the driver is already able to observe the critical area around the vehicle, even without the object detection feature implemented optionally by the disclosed system.
  • Even when using the system only in a visual mode, the number of accidents caused by collisions related to objects in blind spots around commercial vehicles can already be reduced significantly.
  • The disclosed system comprises, for example, at least one camera 110, at least one radar sensor 120 and a visual feedback unit 140 for the driver.
  • The commercial-vehicle-specific turning assistant gives the driver the possibility of visually observing critical areas, helping the driver to recognize relevant objects. This recognition can, for example, be supported by marking the objects on the camera view and/or by warning of the occurrence of such objects.
  • The system may also initiate a braking and/or steering intervention if the driver does not act accordingly.
  • The robustness of the system is increased by the combination or fusion of two different sensor signals (the radar signal and the camera images) which cover the same surveillance area. This also extends the operational range of the system, because it can be applied under various environmental circumstances, for example at night, during the day, or under severe weather conditions.
  • The redundancy provided by two different sensor units is a particular advantage of embodiments of the present invention.
  • The second, parallel sensor can compensate for insufficiencies of one sensor type.
  • For example, the camera may not capture sufficient pictures in bad weather conditions or at nighttime.
  • Radar sensors, in turn, might be affected by strong rain or snowfall, or falling leaves may generate false detection signals of phantom objects.
  • In combination, however, the resulting apparatus will operate reliably in all weather conditions. This enables the sensors 110, 120 to operate under all circumstances, during the daytime and at night, and in all weather conditions such as, for example, clear weather, fog, rain or snow.
  • Hence, the driver gains confidence about the actual traffic situation, even at the side opposite to the driver's side.
  • The camera as well as the radar sensor may be mounted on the towing vehicle (tractor), whereas the towed vehicle (for example a trailer) is not involved in the system installation.
  • Therefore, the trailer can be freely exchanged and used with any type of towing vehicle without compromising safety.
  • The at least one camera 110 can be mounted high enough, for example around the top of the cabin, to provide a good freedom of view of the relevant area.
  • The radar sensor(s) 120 may, for example, be installed between the front axle and the rear axle of the towing vehicle 71.
  • The radar signal may be transmitted into the surveillance area 50 parallel to the ground, thereby avoiding any false detection originating from reflections from the ground.
  • In general, any type of wave signal can be used for the sensor, for example electromagnetic waves such as RF waves, or ultrasonic waves.
  • In summary, embodiments provide a turning-assist apparatus comprising: at least one camera 110; at least one radar sensor 120 (both installed on the relevant side of the vehicle, i.e. the rider side opposite to the driver side); and a display 140 showing the view of the camera to the driver.
  • The turning-assistant apparatus is characterized in that the relevant objects 10 are indicated to the driver.
  • The turning-assistant apparatus is characterized in that the camera view is processed by image processing algorithms to detect objects 10.
  • The turning-assistant apparatus is characterized in that the objects 10 detected by the radar sensor 120 are indicated to the driver.
  • The turning-assistant apparatus is characterized in that sensor fusion is realized between the two sensor types (camera 110 and radar sensor 120) realizing the object detection.
  • The turning-assistant apparatus is characterized in that the object indication to the driver is realized by marking the objects 10 on the camera view displayed to the driver.
  • The turning-assistant apparatus is characterized in that the system can enable further warning methods (e.g. audible, haptic) to warn the driver of critical situations.
  • The turning-assistant apparatus is characterized in that the camera is a fish-eye lens camera.
  • The turning-assistant apparatus is characterized in that the system is capable of converting the camera view into a bird's eye view using a point-of-view conversion method.
  • The turning-assistant apparatus is characterized in that the system can predict the path of the vehicle.
  • The turning-assistant apparatus is characterized in that the system can distinguish between stationary and moving objects and predict the path of the object.
  • The turning-assistant apparatus is characterized in that the system can predict an intersection of the vehicle and object paths.
  • The turning-assistant apparatus is characterized in that the system can brake and/or steer to avoid the collision.
  • The turning-assistant apparatus is characterized in that the system can optionally be extended to other sides of the vehicle by adding further cameras and radar sensors.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)
EP15165858.0A 2015-04-30 2015-04-30 Apparatus and method for detecting an object in a surveillance area of a vehicle Pending EP3089136A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP15165858.0A 2015-04-30 2015-04-30 Apparatus and method for detecting an object in a surveillance area of a vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP15165858.0A 2015-04-30 2015-04-30 Apparatus and method for detecting an object in a surveillance area of a vehicle

Publications (1)

Publication Number Publication Date
EP3089136A1 (fr) 2016-11-02

Family

ID=53016541

Family Applications (1)

Application Number Title Priority Date Filing Date
EP15165858.0A 2015-04-30 2015-04-30 Apparatus and method for detecting an object in a surveillance area of a vehicle

Country Status (1)

Country Link
EP (1) EP3089136A1 (fr)



Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030025597A1 * 2001-07-31 2003-02-06 Kenneth Schofield Automotive lane change aid
JP5070809B2 * 2006-11-10 2012-11-14 Aisin Seiki Co., Ltd. Driving assistance device, driving assistance method, and program
US20100253918A1 * 2009-04-02 2010-10-07 Gm Global Technology Operations, Inc. Infotainment display on full-windshield head-up display
DE102009041556A1 2009-09-15 2010-06-17 Daimler Ag Vehicle with blind spot assist device
DE102010048144A1 2010-10-11 2011-07-28 Daimler AG, 70327 Motor vehicle with a monitoring device for monitoring the surrounding area of the vehicle
US20130278769A1 * 2012-03-23 2013-10-24 Magna Electronics Inc. Vehicle vision system with accelerated object confirmation
US20130271606A1 * 2012-04-13 2013-10-17 Paul Chiang Method of displaying an assistant screen for improving driving safety of a vehicle
DE102012010876A1 2012-06-01 2012-11-22 Daimler Ag Vehicle and method for assisting a driver in driving a vehicle
WO2014037064A1 2012-09-05 2014-03-13 Wabco Gmbh & System and method for operating a towed vehicle or a road train
WO2014041023A1 2012-09-13 2014-03-20 Volkswagen Ag Method and devices for warning of a risk of collision during a lane change

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111278703A * 2017-10-26 2020-06-12 Bendix Commercial Vehicle Systems Llc System and method for determining when an object detected by a collision avoidance sensor on one member of an articulated vehicle comprises another member of the vehicle
JP2019101573A * 2017-11-29 2019-06-24 Toyota Motor Corporation Object information acquisition device
FR3080590A1 * 2018-04-25 2019-11-01 Psa Automobiles Sa Method and system for detecting an obstacle in the environment of a motor vehicle
CN110456788A * 2019-07-22 2019-11-15 ZTE Intelligent Automobile Co., Ltd. Automated driving bus control system and automated driving bus
CN113850999A * 2021-09-03 2021-12-28 Hangzhou Hikvision Digital Technology Co., Ltd. Parking space detection device and camera device with radar for monitoring parking spaces
CN113850999B 2021-09-03 2022-11-25 Hangzhou Hikvision Digital Technology Co., Ltd. Parking space detection device and camera device with radar for monitoring parking spaces
CN117636671A 2024-01-24 2024-03-01 Sichuan Jundi Energy Technology Co., Ltd. Collaborative scheduling method and system for intelligent vehicle meeting on rural roads
CN117636671B 2024-01-24 2024-04-30 Sichuan Jundi Energy Technology Co., Ltd. Collaborative scheduling method and system for intelligent vehicle meeting on rural roads


Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

17P Request for examination filed

Effective date: 20170502

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20201216

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

APBK Appeal reference recorded

Free format text: ORIGINAL CODE: EPIDOSNREFNE

APBN Date of receipt of notice of appeal recorded

Free format text: ORIGINAL CODE: EPIDOSNNOA2E

APBR Date of receipt of statement of grounds of appeal recorded

Free format text: ORIGINAL CODE: EPIDOSNNOA3E

APAF Appeal reference modified

Free format text: ORIGINAL CODE: EPIDOSCREFNE
