EP4370951A2 - Systems and methods for sensing the environment around vehicles - Google Patents

Systems and methods for sensing the environment around vehicles

Info

Publication number
EP4370951A2
EP4370951A2 EP22841586.5A
Authority
EP
European Patent Office
Prior art keywords
vehicle
objects
radar
unit
angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP22841586.5A
Other languages
English (en)
French (fr)
Other versions
EP4370951A4 (de)
Inventor
Alon KEREN
Noam WAGNER
Tom Harel
Mark POPOV
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vayyar Imaging Ltd
Original Assignee
Vayyar Imaging Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vayyar Imaging Ltd filed Critical Vayyar Imaging Ltd
Publication of EP4370951A2
Publication of EP4370951A4


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06 Systems determining position data of a target
    • G01S13/42 Simultaneous measurement of distance and other co-ordinates
    • G01S13/50 Systems of measurement based on relative movement of target
    • G01S13/58 Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G01S13/60 Velocity or trajectory determination systems; Sense-of-movement determination systems wherein the transmitter and receiver are mounted on the moving object, e.g. for determining ground speed, drift angle, ground track
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/40 Means for monitoring or calibrating
    • G01S7/4004 Means for monitoring or calibrating of parts of a radar system
    • G01S7/4026 Antenna boresight
    • G01S7/4052 Means for monitoring or calibrating by simulation of echoes
    • G01S7/4082 Means for monitoring or calibrating by simulation of echoes using externally generated reference signals, e.g. via remote reflector or transponder
    • G01S7/4091 Means for monitoring or calibrating by simulation of echoes using externally generated reference signals, e.g. via remote reflector or transponder during normal radar operation

Definitions

  • the disclosure herein relates to systems and methods for using radars to sense the environment around vehicles.
  • systems and methods are described for detecting the tilt of vehicles and for providing audio alerts to indicate presence and direction of objects detected in the vehicle’s environment.
  • sensors may be used to sense objects. Indeed, with the increased usage of autonomous vehicles, such as self-driving cars and the like, a plethora of sensors is used to detect objects in the vicinity of travelling vehicles. For example, sensors such as video cameras, ultrasonic sensors, infrared sensors, LIDAR sensors and the like may be used to provide information pertaining to the environment through which the vehicle is travelling.
  • Radar sensors have the advantage of operating in total darkness, fog, mist and rain. Radar is an electronic system with the advantages of low cost, low power consumption and high precision. It can be applied in various fields, including space-shuttle topographic missions, optics, geotechnical mapping, meteorological detection, and so on.
  • the working efficiency of a radar system depends upon a reliable and stable radar signal with wide coverage, high directionality, high gain and a high signal-to-noise ratio.
  • Two-wheeler vehicles are provided with various active driving assist devices which actively assist the drivers. These include various sensors, brake system, engine management system, and HMI (Human Machine Interface). These sensors and systems act as a sensory organ enabling advanced motorcycle assistance and safety functions while providing an accurate picture of the vehicle’s surroundings. As a result, these assistance functions not only increase safety, but also enhance enjoyment and convenience by making life easier for riders.
  • the sensors used may be radar-based sensors providing advanced rider assistance systems.
  • the radar-based sensors may include an Adaptive Cruise Control (ACC) which adjusts the vehicle speed to the flow of traffic and maintains the necessary safe following distance.
  • Another radar-based sensor is a Forward Collision Warning System which detects when another vehicle is dangerously close and warns the rider by way of an acoustic or an optical signal.
  • the radar-based Blind-spot Detection keeps a lookout in all directions to help motorcyclists change lanes safely. All of these and other safety systems protect the rider from collision with the nearby vehicles or objects.
  • the motorcycle riders often tilt or lean to a side while riding, especially during turns.
  • the leaning of the motorcycle poses a significant risk to the rider.
  • the risk is increased when the plane of the road is not horizontal, such as on a hill.
  • the abnormal leaning of the vehicle during riding is a major cause of accidents for two-wheeler vehicles.
  • Proper presentation of the radar image to the driver may depend on the tilt angle.
  • Assessment of hazard posed by objects detected by the radar may depend on the tilt angle as well.
  • a system for detecting the tilt of a two-wheeler vehicle such as a motorcycle using a mounted radar system includes a radar-based sensor unit, a processing unit, a database and a communicator.
  • the radar-based sensor unit may include an array of transmitters and receivers which are configured, respectively, to transmit a beam of electromagnetic radiation towards the objects on the road on which the motorcycle is being driven and to receive the electromagnetic waves reflected by those objects.
  • the information received by the receiver may include a horizontal coordinate (perpendicular to the direction of travel of the vehicle) and a vertical coordinate for each object detected.
  • the sensor unit may also include a pre-processing unit which comprises a motion characteristic extraction module for determining the stationary and non-stationary state of the objects from the received electromagnetic signals.
  • the processing unit receives the horizontal and vertical coordinates along with the motion characteristics of the objects and plots a two-dimensional coordinate of each reflected object in the horizontal-vertical plane perpendicular to the direction of motion of the vehicle.
  • the processing unit is further configured to select a series of candidate tilt or roll angles and to construct a rectangular box corresponding to each roll angle.
  • the processing unit is also configured to count the number of reflections plotted inside each rectangular box and to select the candidate angle having the maximum number of plotted points inside the box as the tilt angle of the vehicle.
  • the tilt angle of the vehicle at different instances of time may be calculated and stored in the database.
  • the communicator may then transmit the tilt angle of the vehicle and the corresponding notification to the concerned parties through a communication network.
  • a method for determining the roll of a moving object by providing a vehicle mounted radar unit comprising a radar transmission unit comprising an array of transmitter antennas connected to an oscillator, and a radar receiving unit comprising at least one receiver antenna, providing a processing unit in communication with the radar receiving unit, transmitting electromagnetic radiation into the region surrounding the vehicle, receiving electromagnetic radiation reflected from objects in the region surrounding the vehicle, transferring received electromagnetic signals to the processing unit, plotting the two-dimensional coordinates of each reflected object in the horizontal-vertical plane perpendicular to the direction of motion of the vehicle, and selecting a series of candidate tilt angles. For each candidate tilt angle, a virtual box parallel to the associated candidate horizon is constructed and the reflections within the virtual box are counted; the candidate tilt angle with the largest number of reflections within its virtual box is then selected.
  • the virtual box comprises a rectangular box.
  • a box extending from 20 centimeters below candidate ground level to 2 meters above candidate ground level.
  • Fig. 1 illustrates a schematic representation of a radar-based system 100 for measuring the tilt angle of a motorcycle according to an aspect of the invention
  • Fig. 2 illustrates a schematic representation of a motorcycle equipped with a radar-based sensor
  • Fig. 3 illustrates roll estimation method on a radar coordinate system
  • Fig. 4 illustrates a flowchart showing method steps for measuring the tilt angle of a motorcycle according to an aspect of the invention
  • Fig. 5 illustrates a 2D graphical representation of the signal reflected from objects on the road
  • Figs. 6A, 6B and 6C illustrate graphical representations of the reflected signals in the rectangular box for various candidate tilt angles
  • Fig. 7A illustrates a travelling motorcycle receiving reflected signals from surrounding objects on an onboard radar oriented away from the direction of travel;
  • Fig. 7B is a typical clutter Doppler distribution profile
  • Fig. 7C illustrates the misalignment between the direction of the motorcycle’s front point and the radar’s boresight.
  • Fig. 8 is a block diagram indicating some key features of a system for providing directional audio alerts
  • Fig. 9 schematically represents a vehicle mounted radar unit configured to sense objects in the region surrounding a vehicle
  • Fig. 10 is a block diagram of selected elements of a possible radar system which may be provided for sensing the surroundings;
  • FIG. 11 schematically represents an example of a vehicle cabin within which a set of audio signal generators are provided.
  • aspects of the present disclosure relate to systems and methods for using radars to sense the environment around vehicles.
  • systems and methods are described for detecting the tilt of vehicles and for providing audio alerts to indicate presence and direction of objects detected in the vehicle’s environment.
  • the tilt of a two-wheeler vehicle such as a motorcycle may be detected using a mounted radar system for receiving electromagnetic signals reflected from the objects on the road and processing the electromagnetic signals to determine the tilt angle of the vehicle.
  • the identified tilt angle may be sent to the concerned parties.
  • an audio signal generation unit may be configured to produce sounds such that a listener perceives the source of the sound as originated from the direction of the detected object.
  • one or more tasks as described herein may be performed by a data processor, such as a computing platform or distributed computing system for executing a plurality of instructions.
  • the data processor includes or accesses a volatile memory for storing instructions, data or the like.
  • the data processor may access a non-volatile storage, for example, a magnetic hard disk, flash-drive, removable media or the like, for storing instructions and/or data.
  • the methods may be performed in an order different from described, and that various steps may be added, omitted or combined.
  • aspects and components described with respect to certain embodiments may be combined in various other embodiments.
  • Fig. 1 a schematic representation of a radar-based system 100 for measuring the tilt angle of a motorcycle according to an aspect of the invention.
  • the system 100 includes a radar-based sensor unit 104, a processing unit 116, a database 118 and a communicator 120.
  • the radar-based sensor unit 104 may be mounted, for example, on a two-wheeler vehicle such as shown in Fig. 2 which illustrates a motorcycle 202 equipped with a radar-based sensor 208.
  • the sensor 208 is shown to be mounted near the handle of the motorcycle 202. However, it should be noted that the sensor 208 may be mounted or attached to any part of the motorcycle 202 without limiting the scope of the invention.
  • the motorcycle 202 is shown to be driven by a rider 204 who is riding the motorcycle 202 in a leaning position with respect to the horizon of a road 206.
  • the radar-based sensor 208 transmits electromagnetic signals in all directions while the motorcycle 202 is being driven.
  • the radar-based sensor unit 104 includes an array of transmitters 106 and an array of receivers 110.
  • the array of transmitters 106 may include an oscillator 108 connected to at least one transmitter antenna or an array of transmitter antennas. Accordingly, the transmitters 106 may be configured to produce a beam of electromagnetic (EM) radiation, such as microwave radiation or the like, directed towards the objects on the road 102 on which the motorcycle 202 is being driven.
  • the road 102 is shown to have stationary as well as non-stationary objects like a standing electricity pole 102a, a moving car 102b, a standing tree 102c and a moving man 102d.
  • the road 102 may include a variety of other stationary and non-stationary objects, like sidewalls, animals, advertisement hoardings, bus stops, etc. towards which the EM signals may be sent. Further, the transmitter 106 may transmit the EM signals towards all directions on the road 102 surrounding the motorcycle 202.
  • the receiver 110 may include an array of receiver antennas configured and operable to receive electromagnetic waves reflected by objects on the road 102.
  • the information received by the receiver 110 may include a horizontal coordinate (perpendicular to the direction of travel of the vehicle) and a vertical coordinate for each object detected.
  • the information may also include the stationary and non-stationary state of the object including its state of walking, running, velocity, acceleration, etc.
  • the electromagnetic signals received by the receiver 110 are sent to a pre-processing unit 112 of the radar-based sensor unit 104.
  • the pre-processing unit 112 comprises a motion characteristic extraction module 114 which is configured to determine the stationary and non-stationary state of the objects 102a, 102b, 102c and 102d from the received electromagnetic signals.
  • the motion characteristics may also include, but not limited to, rate of acceleration and velocity, trajectory of the motion, the state of walking, running, and so on.
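  • The stationary/non-stationary determination described above can be sketched as follows. This is an illustrative reconstruction, not the patented implementation: the detection fields (`azimuth`, `v_radial`) and the tolerance value are assumptions. The idea is that a reflector which is actually stationary should exhibit a radial velocity consistent with the vehicle's own motion, so detections deviating from that expectation are flagged as moving.

```python
import math

def classify_detections(detections, v_ego, tol=0.3):
    """Label each radar detection as 'stationary' or 'moving'.

    A stationary reflector seen at azimuth `az` (radians, relative to
    the direction of travel) should show a radial velocity of about
    -v_ego * cos(az); anything deviating by more than `tol` m/s is
    treated as a moving object.  All names here are illustrative.
    """
    labeled = []
    for det in detections:
        expected = -v_ego * math.cos(det["azimuth"])
        state = "stationary" if abs(det["v_radial"] - expected) <= tol else "moving"
        labeled.append({**det, "state": state})
    return labeled
```

Richer motion characteristics (velocity, acceleration, trajectory) would follow from tracking such labeled detections over successive frames.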
  • the horizontal and vertical coordinates along with the motion characteristics of the objects 102a, 102b, 102c and 102d generated by the module 114 are sent to the processing unit 116.
  • the processing unit 116 is configured to plot two-dimensional coordinates of each reflected object 102a, 102b, 102c and 102d in the horizontal-vertical plane perpendicular to the direction of motion of the vehicle 202.
  • Fig. 5 illustrates a 2D graphical representation of the signal reflected from objects on the road 102.
  • a camera reference 502 is shown with the horizon plotted based on the estimated roll or tilt.
  • the Velocity Consensus Points 504 are the consensus set of “static points” projected on the XY plane. All the points 506 of all the stationary and non-stationary objects 102a, 102b, 102c and 102d are shown to be plotted on the XY plane.
  • the graph 508 shows the roll estimation plotted by the processing unit 116, with the x-axis 510 being a measure of the dx coordinate assumed for a unit vector normal to the road in the radar coordinate system, and the y-axis 512 being, for each dx hypothesis on the x-axis, the percentage of points located in the resulting box.
  • the processing unit 116 further selects a series of candidate tilt or roll angles based on the two-dimensional coordinates of the reflected objects 102a, 102b, 102c and 102d. For each candidate tilt angle, a rectangular box parallel to its horizon is constructed.
  • Figs. 6A, 6B and 6C illustrate exemplary rectangular boxes 602a, 602b and 602c constructed for different roll angles 604a, 604b and 604c, respectively.
  • the number of reflections received from each object is plotted on the graphical representations 600a, 600b and 600c.
  • the number of reflections plotted inside the rectangular boxes 602a, 602b and 602c are counted and the tilt angle having the maximum number of plotted points inside the box is selected as the tilt angle of the vehicle 202.
  • Fig. 6C is shown to have the maximum number of points inside the box 602c and the corresponding roll angle 604c may be selected as the tilt angle of the vehicle 202.
  • Fig. 3 which illustrates a roll estimation method 300 on a radar coordinate system according to an exemplary aspect of the present invention.
  • the roll estimation method may be described according to the following exemplary algorithm:
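  • The algorithm itself is not reproduced in the text. The following sketch implements the box-counting procedure as described (candidate roll angles, a box aligned with each candidate horizon, selection by majority count); the rotation convention and the assumption that candidate ground level sits at z = 0 in the rotated frame are illustrative choices, not taken from the disclosure.

```python
import math

def estimate_roll(points, candidate_angles, below=0.2, above=2.0):
    """Pick the candidate roll angle whose ground-aligned box contains
    the most reflections.

    `points` are (y, z) coordinates in the horizontal-vertical plane
    perpendicular to the direction of travel, in the radar frame.
    For each candidate angle the points are rotated so the candidate
    horizon becomes horizontal, then counted inside a band from
    `below` metres under to `above` metres over candidate ground level
    (0.2 m and 2 m, matching the box dimensions mentioned above).
    """
    best_angle, best_count = None, -1
    for ang in candidate_angles:
        c, s = math.cos(ang), math.sin(ang)
        count = 0
        for y, z in points:
            # rotate the point into the candidate-horizon frame
            z_rot = -s * y + c * z
            if -below <= z_rot <= above:
                count += 1
        if count > best_count:
            best_angle, best_count = ang, count
    return best_angle
```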
  • the tilt angle of the vehicle at different instances of time may be calculated and stored in the database 118.
  • one use for knowing the tilt angle of the vehicle may be to allow the frame of reference of vehicle mounted sensors to be aligned to the real world.
  • the tilt angle of the vehicle 202 and some corresponding notification may further be sent to the concerned parties 124a, 124b and 124c.
  • the concerned party 124a may include the rider 204 itself to whom a danger notification is sent when the motorcycle 202 tilt angle reaches or crosses a safe threshold value.
  • the notification may be provided in audio/visual form to alert the rider 204.
  • the notification may also be transmitted to the traffic or road safety police 124b of dangerous driving by the rider 204 when the tilt angle reaches or crosses a safe threshold value.
  • the tilt angle may also indicate to the police or concerned authorities 124b about the tilt angle of the road to take appropriate actions.
  • the notification may also be sent on the mobile device of the parent or guardians 124c of the rider 204 about the dangerous driving habits of the rider 204.
  • the tilt angle of the vehicle 202 and the corresponding notifications are sent from the database 118 through the communicator 120 which transmits the information through a communication network 122.
  • the communication network 122 may include Internet, a Bluetooth network, a Wired LAN, a Wireless LAN, a WiFi Network, a Zigbee Network, a Z-Wave Network or an Ethernet Network.
  • Fig. 4 which illustrates a flowchart 400 showing method steps for measuring the tilt angle of the motorcycle 202 according to an aspect of the invention. The process starts at step 402 and a radar unit 104 is provided mounted on a two-wheeler vehicle like the motorcycle 202 at step 404.
  • the EM signals from the transmitting antennas 106 of the radar unit 104 are transmitted to the objects 102a, 102b, 102c and 102d on the road 102 on which the motorcycle 202 is being driven.
  • the signals reflected from the objects 102a, 102b, 102c and 102d are received by the receiver antennas 110 at step 408.
  • the information received by the receiver 110 may include a horizontal coordinate (perpendicular to the direction of travel of the vehicle) and a vertical coordinate for each object detected.
  • the information may also include motion characteristics of the objects 102a, 102b, 102c and 102d like the stationary and non-stationary state of the object including its state of walking, running, velocity, acceleration, etc.
  • the horizontal and vertical coordinates along with the motion characteristics of the objects 102a, 102b, 102c and 102d are sent to the processing unit 116.
  • the processing unit 116 plots a two-dimensional coordinate of each reflected object 102a, 102b, 102c and 102d in the horizontal-vertical plane perpendicular to the direction of motion of the vehicle 202.
  • a series of candidate tilt angles 604a, 604b and 604c are selected by the processing unit 116 and for each candidate tilt angle a corresponding rectangular box 602a, 602b and 602c is constructed at step 416.
  • the number of reflections plotted inside the rectangular boxes 602a, 602b and 602c are counted and the tilt angle having the maximum number of plotted points inside the box is selected as the tilt angle of the vehicle 202 at step 420.
  • the term “plot” may refer to a collection of 2D coordinates of points, or to a 2D representation of reflectivity values versus the coordinates, irrespective of whether this collection of data is displayed by graphical means or not.
  • the data generated by the system may be corrupted with statistical noise. Accordingly, the tilt angle may be estimated to lie only within a relatively high tolerance range.
  • a filter such as a Kalman filter or the like, may be applied to the noisy data at step 422.
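  • As an illustration of this filtering step, a minimal scalar Kalman filter modelling the tilt angle as a slow random walk might look as follows; the noise variances are placeholder values, not taken from the disclosure.

```python
class TiltKalman1D:
    """Minimal scalar Kalman filter for smoothing noisy tilt-angle
    estimates (random-walk model; all parameters illustrative)."""

    def __init__(self, q=0.01, r=0.5):
        self.q = q        # process noise variance (rad^2 per step)
        self.r = r        # measurement noise variance (rad^2)
        self.x = 0.0      # filtered tilt angle (rad)
        self.p = 1.0      # estimate variance

    def update(self, z):
        # predict: tilt modelled as a slow random walk
        self.p += self.q
        # correct with the new noisy measurement z
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x
```

Feeding each per-frame tilt estimate through `update` yields a smoothed angle that tolerates the statistical noise mentioned above.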
  • the tilt angle of the vehicle 202 and some corresponding notification may be sent to the concerned parties 124a, 124b and 124c.
  • the process completes at step 426.
  • the systems and methods explained above may estimate the tilt angle of the driven two-wheeler vehicle in an efficient and accurate manner.
  • the tilt estimation system includes an “auto-calibration” procedure, estimating the orientation of the radar relative to the direction of movement of the motorcycle’s front and issuing a warning if the estimated deviation is larger than a certain threshold.
  • preferably, there should be a threshold accuracy of the order of one degree, without recourse to any specific setup or special handling by the rider.
  • the data passed to the application-layer may be slim (tracked targets).
  • the speed data (v_x, v_y) passes through an estimation filter and thus may be less accurate.
  • the orientation error δθ_radar may be found, assuming that the self-speed v_ego is known.
  • the radial velocity (measured by Doppler-processing) towards the sensor is given by:
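  • The expression itself is not reproduced in the text. For a stationary reflector observed at azimuth θ relative to the radar boresight, with boresight orientation error δθ_radar and known self-speed v_ego, a standard form (a reconstruction, not necessarily the exact expression of the disclosure) is:

```latex
v_r(\theta) = -\,v_{\mathrm{ego}} \cos\!\left(\theta + \delta\theta_{\mathrm{radar}}\right)
```

Under this model the clutter Doppler profile is shifted along the angle axis by δθ_radar, consistent with the offset proportional to the orientation error noted in the text.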
  • the clutter Doppler distribution profile is shown in Fig. 7B. It is noted that, looking at the distribution of radial velocities in the radar’s coordinate system (angle against the radar’s boresight), the distribution will have an offset proportional to the orientation error.
  • This method may be further extended to incorporate also the angle against the vertical axis (which somewhat resembles “elevation”).
  • the above described method may work well for a straight ride.
  • as shown in Fig. 7C, in a motorcycle the front wheel and the radar are not rigidly connected; there is a misalignment between the direction of the motorcycle’s front point and the radar’s boresight, resulting in ambiguity between legitimate steering and radar misalignment. Accordingly, filtering by the steering angle per frame, or another such filtering method, is required.
  • motorcycles can impose significant roll-angle variation, which needs to be compensated out.
  • one potential filtering method may account for the steering angle, when it is available, to compensate for the temporary angle.
  • the motorcycle roll angle can be estimated by the method described above. Accordingly, the roll angle can be used to screen for straight-driving frames (as stable driving on a straight-line corresponds with no roll-angle).
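  • The screening for straight-driving frames can be sketched as below; the frame representation and the roll threshold are illustrative assumptions, not values from the disclosure.

```python
def screen_straight_frames(frames, roll_thresh=0.05):
    """Keep only frames suitable for self-calibration: frames whose
    estimated roll angle (rad) is close to zero, indicating stable
    straight-line driving.  Frame fields are illustrative."""
    return [f for f in frames if abs(f["roll"]) < roll_thresh]
```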
  • the radar offset δθ_radar can then be estimated by averaging over the screened frames.
  • the averaging may be based on a simple average, a weighted average, or any other method (e.g. MLE) as required.
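  • The averaging expressions are elided in the text; standard forms of the simple and weighted averages over N per-frame offset estimates δθ_i (with weights w_i, for example inversely proportional to each estimate's variance) would be:

```latex
\widehat{\delta\theta} = \frac{1}{N}\sum_{i=1}^{N}\delta\theta_i
\qquad\text{or}\qquad
\widehat{\delta\theta} = \frac{\sum_{i=1}^{N} w_i\,\delta\theta_i}{\sum_{i=1}^{N} w_i}
```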
  • Further methods may include methods for sufficient differentiation, in which the self-calibration mechanism works only above a certain minimal speed v_ego,min, considered against the radar’s Doppler resolution dv_ego, so as to have enough velocity bins on the v_r(θ) curve. It is further noted that the radar-based roll estimation may suffer from installation errors. Once an external measurement is available (steering angle, or roll angle from an IMU), the roll-angle output can be compared to this data and averaged over a long time, to find consistent errors.
  • aspects of the present disclosure relate to systems and methods for providing sensory indications to drivers regarding detected objects surrounding their vehicles and alerting them to potential hazards without distracting them from their driving.
  • audio signals may be provided to indicate the direction and nature of surrounding objects.
  • the system may include a sensing unit 802, a controller 804, an alert generator 806, and a directional audio signal generation unit 808.
  • the sensing unit may be a radar unit, a lidar unit, a sonar unit, an echolocating unit, an optical image generator or the like as well as combinations thereof.
  • the sensing unit is configured and operable to collect raw data indicative of the presence of objects within the vicinity of the listener.
  • the raw data may also provide information regarding the relative direction of the objects from the listener.
  • the sensing unit may comprise an array of sensors of a variety of modalities, and/or sensors distributed around the vehicle.
  • the controller such as a computing device, may be configured to receive the data from the sensing unit and to execute an identification function thereby ascertaining the direction of detected objects. Where appropriate, the controller may be further operable to identify the nature of the object detected and to communicate control signals to the alert generator.
  • the controller may include an IMU (inertial measurement unit) or an odometric unit operable to gather data such as velocity and direction from a host vehicle.
  • the alert generator may be in communication with the controller and operable to receive signals from the controller.
  • the alert generator may be further operable to select an alert signal appropriate to the identity and direction of the detected object.
  • the audio signal generation unit is configured to produce sounds such that a listener perceives the source of the sound as originated from the direction of the detected object.
  • the audio signal generation unit may be an array of speakers, for example a pair of stereo speakers, a set of four quadraphonic speakers, earphones or the like.
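  • As a minimal illustration of direction-dependent audio generation, constant-power stereo panning distributes a mono alert between two speakers according to the object's azimuth. This is a generic technique standing in for the disclosure's method, and the angle convention is an assumption.

```python
import math

def stereo_gains(azimuth_deg):
    """Constant-power stereo panning gains for a sound that should be
    perceived as coming from `azimuth_deg` (0 = straight ahead,
    +90 = full right, -90 = full left)."""
    # clamp and map [-90, 90] degrees to a pan angle in [0, pi/2]
    az = max(-90.0, min(90.0, azimuth_deg))
    pan = (az + 90.0) / 180.0 * (math.pi / 2.0)
    # cos/sin pair keeps total radiated power constant across the pan
    left, right = math.cos(pan), math.sin(pan)
    return left, right
```

For a quadraphonic set or earphones, the same idea generalizes to per-channel gains or to HRTF filtering so the listener localizes the alert in the detected object's direction.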
  • Fig. 9 schematically represents an example of a sensing unit 900 configured to sense objects in the region 920 surrounding a vehicle to which it is mounted.
  • the sensing unit 900 may be mounted to various vehicles as required, such as road vehicles, for example cars, trucks, bikes, trailers, caravans and the like, work vehicles such as diggers, cranes, and the like, as well as aircraft and watercraft where appropriate.
  • such a sensing unit may be stationary and configured to detect objects entering a surrounding monitored region.
  • Examples of the sensing unit 900 may be mounted to a car and used to detect various objects in its vicinity such as other cars 921, bicycles 922, pedestrians 923, road signs 924, walls 925, curbs 926, trees 927 and the like.
  • the sensing unit 900 may be used to harvest information regarding the environment through which the vehicle is travelling.
  • This disclosure teaches various techniques by which a radar unit may analyze received data such that useful information may be gathered, such as the vehicle’s relative speed and the identification of hazards in its surroundings.
  • the system includes a radar unit 1010 and a controller 1030.
  • the radar unit 1010 may include a radar transmission unit 1012 and a radar receiving unit 1014.
  • the radar transmission unit 1012 includes an array of transmitter antennas TX connected to an oscillator 1016 and configured to transmit electromagnetic waves into a region surrounding the vehicle.
  • the radar receiving unit 1014 includes at least one receiver antenna RX configured to receive electromagnetic waves reflected by objects within the region surrounding the vehicle 1020 and may be operable to generate raw data.
  • the controller 1030 may include various modules such as processor units 1032, direction estimation units 1034, a target classification unit 1036 and the like.
  • a processor unit 1032 may be in communication with the radar receiving unit 1014 so as to receive raw data from it and generate environmental information based upon the received data.
  • a self-velocity calculation module 1034 may be provided to calculate the velocity of the vehicle from the raw data.
  • a wall detection module 1036 may be provided to detect planar surfaces in the region surrounding the vehicle.
  • a dynamic-range enhancement module 1038 may be provided to distinguish weakly reflecting objects such as pedestrians 1023 from strongly reflecting objects such as a wall 1025 or a curb 1026 within the same vicinity.
  • the direction estimation module 1034 may determine the direction towards detected targets and the target classification module 1036 may be operable to identify the nature of the detected target.
  • the controller may provide data to the alert generator 1040, which is provided to select an alert signal appropriate to the identity and direction of the detected object.
  • the radar typically includes at least one array of radio frequency transmitter antennas and at least one array of radio frequency receiver antennas.
  • the radio frequency transmitter antennas are connected to an oscillator (radio frequency signal source) and are configured and operable to transmit electromagnetic waves towards the target region.
  • the radio frequency receiver antennas are configured to receive electromagnetic waves reflected back from objects within the target region.
  • the transmitter may be configured to produce a beam of electromagnetic radiation, such as microwave radiation or the like, directed towards a monitored region such as an enclosed room or the like.
  • the receiver may include at least one receiving antenna or array of receiver antennas configured and operable to receive electromagnetic waves reflected by objects within the monitored region.
  • the raw data generated by the receivers is typically a set of magnitude and phase measurements corresponding to the waves scattered back from the objects in front of the array.
  • Spatial reconstruction processing is applied to the measurements to reconstruct the amplitude (scattering strength) at the three-dimensional coordinates of interest within the target region.
  • each three-dimensional section of the volume within the target region may be represented by a voxel defined by four values corresponding to an x-coordinate, a y-coordinate, a z-coordinate, and an amplitude value.
  • the receivers are connected to a pre-processing unit configured and operable to process the amplitude matrix of raw data generated by the receivers and produce a filtered point cloud suitable for model optimization.
  • a pre-processing unit may include an amplitude filter operable to select voxels having amplitude above a required threshold and a voxel selector operable to reduce the number of voxels in the filtered data, for example by sampling the data or clustering neighboring voxels.
  • the filtered point cloud may be output to a processor. It is noted that the filtered point cloud may be further simplified by setting the amplitude value of each voxel to ONE when the amplitude is above the threshold and to ZERO when the amplitude is below the threshold.
  • Fig. 11 shows an example of a vehicle cabin 1110 within which a set of audio signal generators 1120a, 1120b, 1120c, 1120d are provided to generate audio signals such that they may be perceived by the driver of the vehicle as originating from the direction of an object detected by a vehicle mounted sensing unit 1130.
  • a set of audio signal generators 1120a, 1120b, 1120c, 1120d may be provided within a helmet worn by a motorist, a cyclist or a motorcyclist, for example.
  • the sound may be augmented to provide further information to the listener; for example, volume may be modulated to indicate proximity to the object and the pitch may be altered to simulate Doppler shift, thereby indicating the object’s relative velocity.
  • the audio signal characteristics may be correlated with one or more characteristics of the detected object.
  • amplitude, pitch, amplitude modulation, frequency modulation, pulsing of the audio signal, timbre or multitone content of the signal may be correlated with the size of the object, the distance to it, its velocity, its relative velocity, its direction of movement, or a measure of the hazard presented by the object.
  • the nature of the alert sound may also indicate the classification of the target, for instance an air horn for trucks, a car horn for cars, a bicycle bell for bicycles, or a low-pitched human voice for an adult and a high-pitched voice for a child.
  • the audio output may be incorporated into a headset or earphones including a head rotation monitor operable to measure the angle of the head such that the perceived direction of the generated sound may be adjusted accordingly.
  • the sensing unit may itself be incorporated into the headset or the earphones, for example into a helmet worn by a motorcyclist or the like.
  • tactile feedback may be provided indicating the direction of the detected objects.
  • vibrations in the seats or from the steering wheel may be generated as required.
  • vibration of the steering wheel at the left-hand location versus the right-hand location may indicate the direction, while characteristics such as the magnitude or frequency of vibration may be correlated with one or more characteristics of the detected object.
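The pre-processing stage described above (amplitude thresholding, voxel reduction, and setting surviving amplitudes to ONE) can be sketched as follows. This is a minimal NumPy illustration, not the disclosed implementation: the function name, the uniform-subsampling strategy, and the `max_points` cap are assumptions made for the example.

```python
import numpy as np

def preprocess_point_cloud(amplitudes, coords, threshold, max_points=512):
    """Filter a reconstructed radar volume into a sparse point cloud.

    amplitudes : (N,) scattering strength per voxel
    coords     : (N, 3) x, y, z coordinates of each voxel
    threshold  : minimum amplitude for a voxel to be kept
    max_points : cap on output size (illustrative), enforced by uniform sampling
    """
    # Amplitude filter: keep only voxels scattering above the threshold.
    keep = amplitudes > threshold
    filtered = coords[keep]

    # Voxel selector: reduce the point count, here by uniform sampling
    # (the disclosure also mentions clustering neighboring voxels).
    if len(filtered) > max_points:
        idx = np.linspace(0, len(filtered) - 1, max_points).astype(int)
        filtered = filtered[idx]

    # Binarize: every surviving voxel carries amplitude ONE; voxels below
    # the threshold (amplitude ZERO) were already dropped above.
    ones = np.ones((len(filtered), 1))
    return np.hstack([filtered, ones])
```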
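One common way the self-velocity calculation module described above could derive vehicle speed from raw radar data is a least-squares fit over detections of stationary objects, whose measured Doppler radial speeds are projections of the (negated) ego velocity. The disclosure does not specify this method; the sketch below is an illustrative assumption, including the function name and the planar two-component velocity model.

```python
import numpy as np

def estimate_self_velocity(bearings_rad, radial_speeds):
    """Estimate ego velocity (vx, vy) from radar detections of static objects.

    For a stationary target at bearing theta, the measured radial speed is
        v_r = -(vx * cos(theta) + vy * sin(theta)),
    so stacking all detections gives a linear system A @ [vx, vy] = v_r,
    solved here in the least-squares sense.
    """
    A = -np.column_stack([np.cos(bearings_rad), np.sin(bearings_rad)])
    v, *_ = np.linalg.lstsq(A, radial_speeds, rcond=None)
    return v  # (vx, vy) in the sensor frame
```

In practice such a fit would be wrapped in an outlier-rejection loop (e.g. RANSAC) so that moving targets do not bias the estimate.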
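The mapping from object characteristics to audio alert parameters described above (perceived direction via the speaker array, volume for proximity, pitch for simulated Doppler shift) can be sketched for a simple two-speaker case. Everything here is an illustrative assumption: the function name, the constant-power pan law, the linear volume falloff, and the `max_range_m` parameter are not taken from the disclosure.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, used for the simulated Doppler shift

def alert_audio_params(bearing_deg, distance_m, closing_speed_ms,
                       base_pitch_hz=440.0, max_range_m=50.0):
    """Map detected-object kinematics onto stereo alert parameters.

    bearing_deg      : direction to the object, 0 = ahead, +90 = right
    distance_m       : range to the object
    closing_speed_ms : relative speed toward the listener (positive = approaching)
    Returns (left_gain, right_gain, pitch_hz).
    """
    # Constant-power stereo pan so the sound is perceived as coming from
    # the object's direction (a two-speaker simplification of the array).
    pan = math.sin(math.radians(max(-90.0, min(90.0, bearing_deg))))
    left_gain = math.cos((pan + 1.0) * math.pi / 4.0)
    right_gain = math.sin((pan + 1.0) * math.pi / 4.0)

    # Volume rises as the object gets closer.
    volume = max(0.0, 1.0 - distance_m / max_range_m)

    # Simulated Doppler shift: pitch rises for an approaching object.
    pitch_hz = base_pitch_hz * SPEED_OF_SOUND / (SPEED_OF_SOUND - closing_speed_ms)

    return left_gain * volume, right_gain * volume, pitch_hz
```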

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar Systems Or Details Thereof (AREA)
EP22841586.5A 2021-07-15 2022-07-13 Systeme und verfahren zur erfassung der umgebung um fahrzeuge Pending EP4370951A4 (de)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202163221963P 2021-07-15 2021-07-15
US202163230755P 2021-08-08 2021-08-08
US202163284057P 2021-11-30 2021-11-30
PCT/IB2022/056459 WO2023285988A2 (en) 2021-07-15 2022-07-13 Systems and methods for sensing environment around vehicles

Publications (2)

Publication Number Publication Date
EP4370951A2 true EP4370951A2 (de) 2024-05-22
EP4370951A4 EP4370951A4 (de) 2025-07-16

Family

ID=84919144

Family Applications (1)

Application Number Title Priority Date Filing Date
EP22841586.5A Pending EP4370951A4 (de) 2021-07-15 2022-07-13 Systeme und verfahren zur erfassung der umgebung um fahrzeuge

Country Status (3)

Country Link
EP (1) EP4370951A4 (de)
JP (1) JP2024527619A (de)
WO (1) WO2023285988A2 (de)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102023210330A1 (de) 2023-10-19 2025-04-24 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren und Steuergerät zur Bestimmung einer Schräglage eines Zweirads
DE102024206211A1 (de) * 2024-07-02 2025-05-22 Continental Autonomous Mobility Germany GmbH Sensorsystem für ein einspuriges Fahrzeug sowie einspuriges Fahrzeug

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7209221B2 (en) * 1994-05-23 2007-04-24 Automotive Technologies International, Inc. Method for obtaining and displaying information about objects in a vehicular blind spot
JP3733863B2 (ja) * 2001-02-02 2006-01-11 株式会社日立製作所 レーダ装置
JP4985420B2 (ja) * 2008-01-18 2012-07-25 トヨタ自動車株式会社 床面検出システム、移動ロボット及び床面検出方法
JP4827956B2 (ja) * 2009-09-18 2011-11-30 三菱電機株式会社 車載用レーダ装置
US20130311075A1 (en) * 2012-05-18 2013-11-21 Continental Automotive Systems, Inc. Motorcycle and helmet providing advance driver assistance
JP6221607B2 (ja) * 2013-10-08 2017-11-01 株式会社デンソー 物体検出装置
US9739881B1 (en) * 2016-03-24 2017-08-22 RFNAV, Inc. Low cost 3D radar imaging and 3D association method from low count linear arrays for all weather autonomous vehicle navigation
US10545221B2 (en) * 2017-05-23 2020-01-28 Veoneer, Inc. Apparatus and method for detecting alignment of sensor in an automotive detection system
WO2020041191A1 (en) * 2018-08-20 2020-02-27 Indian Motorcycle International, LLC Wheeled vehicle notification system and method
WO2020060508A1 (en) 2018-09-21 2020-03-26 Thales Electricus Enerji Uretim San. Ve Tic. A.S. Anti-bacterial seat belt
US10768307B2 (en) * 2018-12-31 2020-09-08 Lyft, Inc. Systems and methods for estimating vehicle speed based on radar
JP7339114B2 (ja) * 2019-10-09 2023-09-05 株式会社Soken 軸ずれ推定装置
DE102019216396A1 (de) * 2019-10-24 2021-04-29 Robert Bosch Gmbh Verfahren und Vorrichtung zum Kalibrieren eines Fahrzeugsensors
JP7736681B2 (ja) * 2019-11-08 2025-09-09 バヤー イメージング リミテッド 車両の周囲を検知するためのシステム及び方法

Also Published As

Publication number Publication date
WO2023285988A3 (en) 2023-03-02
JP2024527619A (ja) 2024-07-25
EP4370951A4 (de) 2025-07-16
WO2023285988A2 (en) 2023-01-19

Similar Documents

Publication Publication Date Title
CN110800031B (zh) 检测和响应警报
JP7394582B2 (ja) レーダーデータを処理する装置及び方法
JP7506829B2 (ja) 自律車両用途のための速度推定および物体追跡
JP6722066B2 (ja) 周辺監視装置及び周辺監視方法
JP2023073257A (ja) 出力装置、制御方法、プログラム及び記憶媒体
US20200225343A1 (en) Vehicle radar system for detecting dangerous goods
US9963069B2 (en) Alarm method for reversing a vehicle by sensing obstacles using structured light
US12164058B2 (en) Radar data analysis and concealed object detection
JP2017535008A (ja) 道路交通利用者の移動モデルを形成するための方法及び装置
WO2017177651A1 (en) Systems and methods for side-directed radar from a vehicle
KR101535722B1 (ko) 스테레오 비전을 이용한 과속 방지턱 감지장치 및 그 제어 방법
JP2018101345A (ja) 運転支援装置
US11798417B2 (en) Driving assistance device
EP4370951A2 (de) Systeme und verfahren zur erfassung der umgebung um fahrzeuge
Lim et al. Real-time forward collision warning system using nested Kalman filter for monocular camera
CN108021899A (zh) 基于双目相机的车载智能前车防撞预警方法
CN114286948B (zh) 用于交通工具操作的基于智能设备的雷达系统
CN112606836A (zh) 辅助驾驶方法及系统
GB2624479A (en) Method and system for reverse parking an autonomous vehicle
US20240338950A1 (en) Systems and methods for sensing environment around vehicles
US12528513B2 (en) Vehicle control systems and methods using kinematically stabilized machine-learning model predicted controls
CN120552861A (zh) 一种车辆控制方法和装置
CN121482753A (zh) 一种传感协同的半挂车倒车盲区动态补盲监测方法
KR20170072615A (ko) 후방 감지 센서 장치 및 이를 이용한 후방 감지 방법
TWM618998U (zh) 複眼攝像系統及使用複眼攝像系統的車輛

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20240205

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Free format text: PREVIOUS MAIN CLASS: G01S0013930000

Ipc: G01S0013600000

RIC1 Information provided on ipc code assigned before grant

Ipc: G01S 13/931 20200101ALN20250320BHEP

Ipc: G01S 7/40 20060101ALN20250320BHEP

Ipc: G01S 13/88 20060101ALI20250320BHEP

Ipc: G01S 13/42 20060101ALI20250320BHEP

Ipc: G01S 13/60 20060101AFI20250320BHEP

A4 Supplementary search report drawn up and despatched

Effective date: 20250616

RIC1 Information provided on ipc code assigned before grant

Ipc: G01S 13/60 20060101AFI20250610BHEP

Ipc: G01S 13/42 20060101ALI20250610BHEP

Ipc: G01S 13/88 20060101ALI20250610BHEP

Ipc: G01S 7/40 20060101ALN20250610BHEP

Ipc: G01S 13/931 20200101ALN20250610BHEP