US20180322784A1 - Method and device for selecting and transmitting sensor data from a first motor vehicle to a second motor vehicle - Google Patents


Info

Publication number
US20180322784A1
Authority
US
United States
Prior art keywords
motor vehicle, surroundings, schematic depiction, sensors, applicable
Legal status
Granted
Application number
US15/773,072
Other versions
US10490079B2
Inventor
Bernhard Schild
Current Assignee
Continental Automotive GmbH
Original Assignee
Continental Automotive GmbH
Application filed by Continental Automotive GmbH filed Critical Continental Automotive GmbH
Assigned to CONTINENTAL AUTOMOTIVE GMBH. Assignors: SCHILD, BERNHARD, DR.
Publication of US20180322784A1
Application granted
Publication of US10490079B2
Legal status: Active

Classifications

    • G08G 1/16 Anti-collision systems
    • G08G 1/161 Decentralised systems, e.g. inter-vehicle communication
    • G08G 1/162 Decentralised systems, e.g. inter-vehicle communication, event-triggered
    • G08G 1/163 Decentralised systems, e.g. inter-vehicle communication, involving continuous checking
    • G08G 1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • G01S 13/931 Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
    • G01S 17/931 Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G01S 7/003 Transmission of data between radar, sonar or lidar systems and remote stations
    • G01S 7/006 Transmission of data between radar, sonar or lidar systems and remote stations using shared front-end circuitry, e.g. antennas
    • G01S 7/4026 Antenna boresight (means for monitoring or calibrating parts of a radar system)
    • G01S 2013/9316 Anti-collision radar for land vehicles combined with communication equipment with other vehicles or with base stations
    • G01S 2013/9323 Alternative operation using light waves
    • G01S 2013/9324 Alternative operation using ultrasonic waves
    • G01S 2013/936, G01S 2013/9364, G01S 2013/9367

Definitions

  • the present invention relates to a method for wirelessly transmitting sensor data between vehicles and to an applicable apparatus.
  • Wireless communication networks are employed in a multitude of technical fields today.
  • Vehicles, for example, use what is known as car-to-car communication to exchange information with one another.
  • This communication is a wireless ad hoc network that is set up between physically adjacent vehicles in road traffic and, from a technical point of view, involves an evolved WLAN (wireless local area network) based on the IEEE 802.11 standard.
  • a wireless radio link between vehicles is used to transmit the information ascertained by the sensor system of one vehicle to other vehicles in physical proximity, for example.
  • This allows in particular information regarding hazard spots to be quickly transmitted from one vehicle to other vehicles.
  • However, this method does not allow the vehicle that wirelessly receives this information to specify from which vehicle it is supposed to receive particular information.
  • Moreover, the transmitted data are abstract in nature and contain little detail. The methods known from the prior art are therefore not suitable for discriminatory information transmission from one vehicle to another, and they offer no transmission of detailed data suitable for directly assisting a driver in a traffic situation that is difficult to see.
  • In contrast to the car-to-car communication architectures known to date, the approach shown here permits authentication of the sender that is based inter alia on conventional methods of radio direction finding. This authentication is not possible with comparable precision for other car-to-car communication methods such as WiFi, Bluetooth or DSRC, because their transmitters cannot be oriented as exactly to a point in three-dimensional space as in the case of the approach described here.
  • the sensor functionality of the AESA radar can be used here to perform distance measurements from a receiving second vehicle to a sending first vehicle for monitoring purposes more or less in parallel with the communication process. Therefore, fast error detection is possible for this type of communication link in the single millisecond range. This cannot be achieved in this way by other types of car-to-car communication.
  • DE 10 2006 055 344 A1 shows the use of data about the traffic situation in the surroundings of a first vehicle in a second vehicle that is relatively close to the first vehicle.
  • the data received in the second vehicle are output at least in part via an output means so as to be perceptible to the driver of said vehicle.
  • the data of this vehicle that are directly relevant to the driver of the first vehicle are received in discriminatory fashion.
  • the first vehicle is identified by registration or numberplate recognition by means of a camera or through the interchange of geoposition data.
  • the received data comprise inter alia a video image recorded by a camera provided in the first vehicle or abstracted data such as a distance between the first vehicle and a vehicle traveling ahead or distance and speed of an oncoming vehicle, for example.
  • DE 199 14 906 A1 discloses a communication system for discriminatory communication between vehicles driven independently of one another that allows exact addressing by virtue of orientation of a communication device and matching of a communication range to the position of a vehicle with which it is desirable to begin communication.
  • Within the context of this description, the term position denotes a whereabouts on the earth's surface or in or on a construction provided or suitable for road traffic. A position can also denote the representation of a whereabouts on a map or roadmap.
  • a motor vehicle is a motor-driven vehicle running on the earth's surface or in or on structures connected thereto.
  • The term traffic space denotes a surface on which vehicles, moving or stationary, participate in traffic.
  • the traffic space can also extend in different planes, for example in the case of bridges or underpasses.
  • Traffic space within the context of this description denotes the immediate surroundings of a motor vehicle, the included radius possibly being dependent on the speed of travel of the motor vehicle and the complexity of the traffic space.
  • the traffic space can have an irregular shape with a different extent in different directions, for example a rectangle that has a longer extent in the direction of travel of the motor vehicle than to the sides or behind.
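  • The speed-dependent, direction-dependent extent described above can be sketched as follows; the base distance, headway time and linear growth used here are illustrative assumptions, not values from the description:

```python
def traffic_space_extent(speed_mps, base_m=50.0, headway_s=4.0):
    """Sketch of a speed-dependent traffic space rectangle around a vehicle.

    Returns extents (ahead, behind, sides) in metres; the look-ahead grows
    with speed so that roughly `headway_s` seconds of travel are covered.
    All parameter values are illustrative assumptions.
    """
    ahead = base_m + speed_mps * headway_s   # longer extent in direction of travel
    behind = base_m                          # shorter extent behind the vehicle
    sides = base_m / 2.0                     # shortest extent to the sides
    return ahead, behind, sides
```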
  • The term orientation denotes the alignment of a vehicle in the traffic space, referenced to the front or the rear of the vehicle. The front of a vehicle is also referred to as "at the front" and the rear of a vehicle as "at the rear".
  • azimuth describes a horizontal orientation of a sensor, for example a swivel to the left or right from a zero position determined by the installation location and the installation position of the sensor, or a horizontal swivel oriented to an absolute zero reference, for example to the direction of the north pole of the earth as seen from a current location.
  • elevation describes a vertical orientation of a sensor, for example an angle between the horizon and the direction of the sensor.
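  • The two angle definitions can be illustrated with a small helper that converts a relative position vector into azimuth and elevation; the axis convention used here is an assumption for illustration:

```python
import math

def azimuth_elevation(dx, dy, dz):
    """Convert a relative position vector (metres) from a sensor's zero
    position into azimuth and elevation angles in degrees.

    Convention (an assumption for illustration): x points forward, y to
    the left, z up; azimuth is positive for a swivel to the left, and
    elevation is the angle above the horizon.
    """
    azimuth = math.degrees(math.atan2(dy, dx))
    elevation = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return azimuth, elevation
```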
  • AESA radars may be used to perform three-dimensional scans of spatial sectors by virtue of e.g. electronic beam sweeping being used to vary the scanning radar beam.
  • Radar imaging techniques such as SAR, ISAR, etc. that have already been known for years are not discussed further at this juncture.
  • Lidar is the abbreviation for the English term “light detection and ranging”. Lidar is a method of optical distance and speed measurement related to radar (radio detection and ranging). Instead of radio waves, beams of light, in particular laser beams, are used. To pick up an area, the beam of light is moved in defined fashion over the area, in similar fashion to the line-by-line scanning in a cathode-ray tube television set, the scanning in the case of lidar being effected by means of one or more mobile mirror(s).
  • Using an AESA radar, electronic beam sweeping can also be used to create three-dimensional scans of selected parts of the visible vehicle surroundings, which can then be conditioned in an imaging process to produce a visible depiction. This generally involves a data fusion with other available sensor data, e.g. from optical systems such as cameras, being carried out beforehand, and data cleansing, e.g. by means of Kalman filtering, being performed for collation purposes.
  • a method for depicting sensor data of a first motor vehicle via a man-machine interface of a second motor vehicle involves an image of at least part of surroundings of the second motor vehicle being recorded and being reproduced via the man-machine interface of the second motor vehicle.
  • One or more first motor vehicle(s) is/are mapped at least in part on the image of the surroundings.
  • the display is provided on a screen arranged in the field of view of the driver, for example.
  • the display can also be provided such that respective positions and orientations of the first motor vehicles in the traffic space relative to the second motor vehicle are schematically depicted.
  • the determination of the positions and orientations of the first motor vehicles in the traffic space relative to the second motor vehicle can be carried out particularly easily when a stereo camera system is used.
  • a suitable piece of object-detection software can be used in particular to identify other motor vehicles, which are then assigned applicable coordinates in a three-dimensional space.
  • the object detection also allows silhouettes corresponding to the objects identified as a motor vehicle to be determined.
  • a driver or operator selects one or more of the first motor vehicles from which sensor data are supposed to be received.
  • the selection can be made by touching a respective motor vehicle on a touch-sensitive screen on which the image of the surroundings is reproduced, for example. If gesture detection is available, the selection can also be made by pointing to one or more first motor vehicle(s) depicted on a screen as applicable.
  • Other selection methods, for example placing a cursor on or close to a motor vehicle depicted on the screen, are likewise conceivable, the cursor being controllable in a fundamentally known manner.
  • one or more individual first point-to-point connection(s) from the second to the selected first motor vehicle(s) is/are set up via a first wireless communication interface.
  • the one or more first motor vehicle(s) can be identified using registration recognition, for example.
  • the registration is at the same time an identification for setup of a connection.
  • It is also possible for motor vehicles to periodically emit identification signals that can be received by other motor vehicles and used for setting up a connection.
  • the one or more individual first point-to-point connection(s) can be set up after applicable orientation of a first transmission or reception apparatus in the direction of the applicable one or more first motor vehicle(s), for example.
  • the first transmission or reception apparatus can be oriented on the basis of angles of azimuth and elevation for the selected first motor vehicle(s) that are ascertained from the image of the surroundings of the second motor vehicle beforehand, for example.
  • The applicable angles for orienting the first transmission or reception apparatus can be determined from the known position of the second motor vehicle and the known properties of the camera that was used to record the image of the surroundings of the second motor vehicle.
  • The identity of the vehicle thus selected for the currently measured distance can additionally be checked for plausibility taking into consideration the azimuth and elevation values. If the installation locations for radar systems or lidar systems in motor vehicles are standardised, for example always centrally between the headlamps, then the object detection described above can be used to orient the transmission or reception apparatus correspondingly more accurately.
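  • One way the azimuth and elevation angles could be ascertained from the image of the surroundings is a simple pinhole-camera estimate; a real system would use calibrated camera intrinsics and distortion correction, so the following is only a sketch:

```python
import math

def pixel_to_angles(u, v, width, height, hfov_deg, vfov_deg):
    """Estimate azimuth and elevation of an object detected at pixel (u, v)
    in a camera image, using a simple pinhole model (an assumption for
    illustration; not the patent's own method).
    """
    # focal lengths in pixels, derived from the fields of view
    fx = (width / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)
    fy = (height / 2.0) / math.tan(math.radians(vfov_deg) / 2.0)
    # offsets from the optical centre
    du = u - width / 2.0
    dv = (height / 2.0) - v  # image rows grow downwards
    azimuth = math.degrees(math.atan2(du, fx))
    elevation = math.degrees(math.atan2(dv, fy))
    return azimuth, elevation
```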
  • the setup of point-to-point connections using oriented transmission and reception apparatuses allows the use of other identification features for setup of a connection to be dispensed with.
  • The beamforming used to set up an optimal radio link from the respective vehicles, for example the first vehicles, to the addressed other vehicles, for example the second vehicles, can be oriented such that only the addressed vehicles react to the setup of the point-to-point connection and unaffected vehicles ignore it.
  • A vehicle accepts reception of electromagnetic waves from another vehicle only if they are directed from the sending vehicle to the receiving vehicle and the receiving vehicle also expects a transmission from the sending vehicle. This can be verified using conventional radio direction finding/radio locating means.
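  • The acceptance rule above can be sketched as a geometric plausibility check; the tolerance values used here are illustrative assumptions:

```python
def accept_transmission(measured, expected, tol_angle_deg=2.0, tol_dist_m=5.0):
    """Sketch of the geometric plausibility check: a reception is accepted
    only if the measured direction and distance of the sender match the
    values expected from the earlier selection of the vehicle.

    `measured` and `expected` are (azimuth_deg, elevation_deg, distance_m);
    the tolerances are illustrative, not values from the description.
    """
    az_m, el_m, d_m = measured
    az_e, el_e, d_e = expected
    return (abs(az_m - az_e) <= tol_angle_deg
            and abs(el_m - el_e) <= tol_angle_deg
            and abs(d_m - d_e) <= tol_dist_m)
```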
  • a request to transmit information pertaining to available sensors and the properties thereof to the second motor vehicle is transmitted to the selected first motor vehicle(s).
  • the second motor vehicle receives responses from one or more of the selected first motor vehicles via the respective first point-to-point connections and reproduces a schematic depiction of the surroundings of the second motor vehicle, wherein the schematic depiction shows positions of the second and the one or more first motor vehicle(s) in the surroundings of the second motor vehicle, and wherein the schematic depiction depicts areas within which the sensors of the one or more first motor vehicle(s) and/or of the second motor vehicle can pick up objects. These areas may be denoted in colour or by an applicable texture in the schematic depiction, for example.
  • the schematic depiction of the surroundings can moreover show objects that are located in the surroundings of the second motor vehicle and/or in the respective surroundings of the one or more first motor vehicle(s).
  • the objects include, inter alia, boundaries of roads, houses and the like, but also other objects located in the traffic space that have been picked up by sensors.
  • the man-machine interface of the second motor vehicle receives a user input that corresponds to a selection of one or more of the motor vehicles, or the sensors thereof, shown in the schematic depiction and of an area to be picked up by the selected sensor(s) and sends an applicable request to one or more first motor vehicle(s) whose sensors can pick up the area to be picked up. If an area to be picked up in the schematic depiction can be picked up only by sensors of one of the one or more first motor vehicle(s), the request is also sent only to this one first motor vehicle. Otherwise, the request can be sent to all first motor vehicles whose sensors can pick up the area to be picked up.
  • the request to pick up an area can include information pertaining to the azimuth and elevation of a sensor, either from the point of view of the second motor vehicle or from the point of view of the respective first motor vehicle whose sensor is supposed to pick up the area.
  • the selection of an area to be picked up by a sensor, in particular the narrowing-down of an area to be picked up, causes a decrease in the volume of the data to be transmitted, which allows faster reproduction of the sensor data in the second motor vehicle to take place.
  • the first motor vehicle(s) having received a request to pick up an area perform(s) applicable sensor measurements and transmit(s) the measurement results to the second motor vehicle.
  • the measurement result reported is the presence of an object in the picked-up area and the coordinates thereof or, if an object has not been detected, an applicable report that an object has not been detected.
  • further information about detected objects can be reported, for example the dimensions or contours thereof and, if multiple measurements have been carried out cyclically in succession, whether and in what direction detected objects move.
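  • A possible shape for such a measurement report, including the explicit "object not detected" case, might look as follows; all field names here are assumptions for illustration, not from the description:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class MeasurementReport:
    """Illustrative structure for a measurement result reported by a
    first vehicle to the second vehicle."""
    object_detected: bool
    coordinates: Optional[Tuple[float, float, float]] = None  # metres
    dimensions: Optional[Tuple[float, float]] = None          # width, length (m)
    velocity: Optional[Tuple[float, float]] = None            # m/s, from cyclic scans

def summarize(report: MeasurementReport) -> str:
    # The "nonevent" (no detection) is reported explicitly as well.
    if not report.object_detected:
        return "no object detected"
    return f"object at {report.coordinates}"
```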
  • the corresponding system in the second motor vehicle receives the sensor data sent by the one or more first motor vehicle(s) in response to the request and reproduces them via the man-machine interface of the second motor vehicle. If sensor data have been received from multiple first motor vehicles, they are reproduced jointly, if need be after an applicable data fusion. If a measurement has not been carried out, this “nonevent” is also reproduced as applicable. In addition to a visual reproduction, an audible signal can be reproduced for each measurement, different audible signals also being able to be used for detected objects and non-detection of objects. Similarly, haptic feedback can be used to signal the measurement results.
  • an applicable vibration can signal the presence of an object in the area to be picked up selected by the position of the finger. Dispensing with time-consuming production of a visual depiction from the measurement data of the sensor(s) of the first motor vehicle(s) results in faster feedback in this case.
  • the measurement by the sensor(s) of the first motor vehicle(s) is repeated cyclically and the measurement results are transmitted and updated as applicable on the reproduction by the man-machine interface of the second motor vehicle.
  • the time intervals in which the measurement is repeated may be settable by a user or adjusted automatically in this case. Automatic adjustment is possible on the basis of the current speed of the first or the second motor vehicle, for example.
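  • The automatic adjustment of the repetition interval on the basis of the current speed could, for example, follow a simple linear mapping; the limits and reference speed here are illustrative assumptions:

```python
def measurement_interval_ms(speed_mps, min_ms=50.0, max_ms=500.0, ref_speed=40.0):
    """Sketch of automatic interval adjustment: the faster the vehicles
    move, the shorter the interval between cyclic measurements.
    The linear mapping and all limit values are illustrative assumptions.
    """
    speed = max(0.0, min(speed_mps, ref_speed))  # clamp to the supported range
    return max_ms - (max_ms - min_ms) * (speed / ref_speed)
```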
  • the request is sent to the selected first motor vehicle(s) via respective individual first point-to-point connections that are set up via the first wireless communication interface.
  • the responses are received by a respective second point-to-point connection that is set up via a second wireless communication interface.
  • the first wireless communication interface operates on the basis of the IEEE 802.11 standard, and point-to-point connections set up via it are logical point-to-point connections.
  • the second wireless communication interface uses a radar system or lidar system provided in the first and the one or more second vehicle(s), for example, and point-to-point connections set up via it are physical point-to-point connections that involve transmitter and receiver communicating with one another using a focused radar beam or beam of light oriented between transmitter and receiver.
  • the communication using a focused radar beam oriented between transmitter and receiver is known from military aviation.
  • In this case, radar systems having active electronic beam sweeping and aperture control, also known as active phased-array radar or active electronically scanned array (AESA), are used; these are based on semiconductor chips and can therefore also be used in the automotive sector.
  • the introduction of the AESA in military aviation resulted in one and the same radar appliance being able to be used to carry out different orders in parallel that had previously been handled by separate appliances. Therefore, a single AESA radar can be used to handle different sensor and communication tasks (virtually) in parallel, virtually in parallel here meaning that the tasks alternate within short successive timeslots.
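  • The virtually parallel operation, with sensor and communication tasks alternating within short successive timeslots, amounts to a round-robin schedule; the task names used here are illustrative:

```python
def aesa_schedule(tasks, n_slots):
    """Sketch of virtually parallel AESA operation: the listed tasks
    (e.g. scanning, communication, ranging) alternate round-robin over
    short successive timeslots. Task names are illustrative assumptions.
    """
    n = len(tasks)
    return [tasks[i % n] for i in range(n_slots)]
```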
  • The receiving vehicle can authenticate the source, i.e. the transmitter of the sensor data, geometrically using conventional means of radio direction finding/radio locating, i.e. taking into consideration azimuth, elevation and distance, as well as the visibility thereof at the time of the original selection of the vehicles. Further, with distance measurement conducted virtually in parallel, an interruption in this "radio link" can be picked up immediately, that is to say error detection in the single-millisecond range is ensured.
  • the sensor pickup by the first vehicle can periodically be temporarily directed by the second vehicle to areas of the traffic space that are able to be picked up jointly, in order to check correct functionality. A second vehicle can thus persistently periodically run a diagnosis of the sensors used in a first vehicle.
  • In the foregoing, sensor data are transmitted between the vehicles by means of an AESA radar.
  • Instead of radar data, however, it is also possible for live images from a video camera arranged in the first vehicle to be transmitted to the second vehicle.
  • the high bandwidth that the data transmission via a directional radar beam provides allows almost latency-free transmission of sensor data, e.g. video image transmission of even uncompressed video signals.
  • the transmission of uncompressed video signals has the advantage that latency owing to compression disappears.
  • the direct transmission of sensor data without any preprocessing from a first vehicle to a second vehicle then also allows processing of these data in e.g. the ADAS system of the second vehicle with controllable, low latencies for the capture of said data.
  • the driver of the second vehicle can therefore obtain an overview of the traffic situation from the point of view of the first vehicle in real time.
  • data ascertained by the radar system of the first vehicle pertaining to objects in the coverage area of the radar system of the first vehicle can be transmitted and can be reproduced together with the video image.
  • Additional information comprises inter alia distance information between the object and the second vehicle and also the speed and direction of movement of the object, if present, measured in the first vehicle.
  • the real-time transmission of the image content (visual data or scan results e.g. from radar sensors) and of other sensor data allows a data fusion to be effected in the e.g. ADAS system of a second vehicle for local and remote sensor data.
  • Kalman filtering methods can be used to make improved predictions about the behavior of precisely those objects, for example vehicles, pedestrians, etc., that have left the scan area of the vehicle's own sensors, possibly even only for a short time.
  • Such a prediction can be transmitted to the driver, e.g. visually in the form of augmented reality.
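  • A minimal sketch of such a prediction for an object that has left the scan area, assuming a constant-velocity motion model (a full implementation would also propagate the Kalman covariance):

```python
def predict_constant_velocity(pos, vel, dt):
    """Extrapolate the last fused state of an object that is no longer
    visible to the vehicle's own sensors, assuming constant velocity.

    This shows only the state extrapolation of a Kalman predict step;
    a real system would propagate the estimate's covariance as well.
    """
    x, y = pos
    vx, vy = vel
    return (x + vx * dt, y + vy * dt)
```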
  • the display of the man-machine interface is updated accordingly.
  • the update can be effected directly or after a waiting time has elapsed.
  • An apparatus provided in a second motor vehicle for carrying out the method according to the invention comprises at least one sensor that is configured to detect one or more first motor vehicle(s) in the surroundings.
  • the sensor is a camera arrangement that is connected to an applicable device for object detection.
  • the apparatus moreover comprises a first and optionally a second communication interface via which one or more point-to-point connection(s) to one or more first motor vehicle(s) can be set up.
  • the apparatus comprises a man-machine interface that is configured to reproduce sensor measurement results delivered by vehicle-based sensors and by other motor vehicles. In this case, the man-machine interface is also configured to accept user inputs.
  • the apparatus or single components thereof has/have one or more microprocessor(s) that is/are connected to volatile or nonvolatile memory means.
  • the one or more microprocessor(s) in this case execute(s) computer program instructions that are preferably stored in the nonvolatile memory means, and the execution of which causes the implementation of single or multiple steps of the method according to the invention.
  • An apparatus provided in a first motor vehicle for carrying out the method according to the invention comprises at least one sensor that is configured to pick up objects in the surroundings of the first motor vehicle.
  • the sensor is a camera arrangement that may be connected to an applicable device for object detection.
  • Other suitable sensors are radar sensors, ultrasonic sensors, laser sensors and the like, for example.
  • the apparatus provided in the first motor vehicle moreover comprises a first and optionally a second communication interface via which a point-to-point connection to a second motor vehicle can be set up.
  • the apparatus furthermore comprises a control unit that evaluates and executes demands for sensor measurements coming from a second motor vehicle and triggers the transmission of the sensor measurement results to the second motor vehicle.
  • the apparatus or single components thereof has/have one or more microprocessor(s) that are connected to volatile or nonvolatile memory means.
  • the one or more microprocessor(s) in this case execute(s) computer program instructions that are preferably stored in the nonvolatile memory means, and the execution of which causes the implementation of single or multiple steps of the method according to the invention.
  • the present invention allows a driver of a vehicle to access sensor systems of other vehicles in the surroundings and to obtain information even pertaining to objects that are not in the visual range of the driver or of the sensor systems of his vehicle.
  • the bidirectional, almost latency-free point-to-point communication furthermore allows sensor systems of other vehicles to be selectively accessed and the coverage areas thereof to be controlled such that information pertaining to particular selected areas of the surroundings of the vehicle is obtained in discriminatory fashion.
  • the selective active access to sensor systems of other vehicles means that the driver of the vehicle no longer has to rely on transmission intervals of the other vehicles that they use to broadcast possibly selected information pertaining to their respective surroundings to all receivers in the surroundings, as provided for in car-to-car communication.
  • the method of sending sensor data via an AESA radar from a first vehicle to a second vehicle, using beamforming as a directional radio link, allows sufficiently low latency for the transmission together with error detection and secure authentication and identification, which is indispensable for the use of external sensor data from a first vehicle in a second vehicle. If the second vehicle additionally performs diagnoses of the sensor functionality in the first vehicle with the necessary frequency, e.g. when common parts of the traffic space are picked up, this communication link from a first vehicle to a second vehicle has sufficient reliability for use for ADAS purposes in the second vehicle.
  • FIG. 1 shows a schematic depiction of a traffic space in the surroundings of a second motor vehicle;
  • FIG. 2 depicts areas able to be picked up by the driver of the second motor vehicle or by sensors in the motor vehicle;
  • FIG. 3 shows the visual range of the driver of a first motor vehicle;
  • FIG. 4 shows an exemplary communication using the method according to the invention between a second and a first motor vehicle;
  • FIG. 5 shows an area picked up by the sensors of the first motor vehicle at the request of the second motor vehicle;
  • FIG. 6 shows a schematic exemplary block diagram of an apparatus in a second motor vehicle; and
  • FIG. 7 shows a schematic exemplary block diagram of an apparatus in a first motor vehicle.
  • FIG. 1 shows a schematic depiction of a traffic space 100 in the surroundings of a second motor vehicle 2.
  • the present traffic space is a junction between two roads, with the general right-of-way rule of “right before left” applying.
  • There are several other vehicles in the present traffic space and at the corners of the junction there are buildings 104-107, indicated by the thicker boundary lines, with footpaths in front of them.
  • the buildings 104-107 and a large vehicle 1.1 that has stopped in a road branching off to the right block part of the view of the traffic space for the driver of the second vehicle 2 and sensors mounted in the second vehicle 2.
  • the planned route of the second motor vehicle 2 is indicated by the arrow 102.
  • the driver of the second motor vehicle 2 should yield right of way to the vehicle 1.2 coming from the right before he himself can cross the junction.
  • The areas able to be picked up by the driver of the second motor vehicle 2, or by sensors in the motor vehicle, are depicted in FIG. 2.
  • the coverage areas of the sensors and the visual range of the driver are assumed to be identical in the figures.
  • the visual range of the driver is bounded on the left by the building 107 .
  • To the right, the visual range of the driver is bounded by the building 106 .
  • the vehicle 1.1 blocks a large part of the view to the right.
  • the part that the driver of the motor vehicle 2 cannot see is depicted by shading in the figure. It can clearly be seen that the view of the motor vehicle 1.2 is blocked by the motor vehicle 1.1.
  • FIG. 3 depicts the visual range of the driver of the first motor vehicle 1.3.
  • the visual range of the driver of the motor vehicle 1.3 is bounded to the left by the building 104 and to the right by the building 107.
  • the vehicle 1.1 blocks a narrow area.
  • the area that the driver of the motor vehicle 1.3 cannot see is depicted by shading.
  • the vehicle 1.2 is completely in the visual range of the driver of the vehicle 1.3.
  • FIG. 4 depicts an exemplary communication using the method according to the invention between a second and a first motor vehicle for the traffic situation shown in FIGS. 1 to 3 .
  • the second motor vehicle 2 is equipped with an apparatus that searches the surroundings of the second motor vehicle 2 for external sensors whose measurement results can be displayed via a man-machine interface in the second motor vehicle.
  • by way of example, the apparatus comprises a camera that searches the traffic space in front of the second motor vehicle 2 for other motor vehicles.
  • the first motor vehicle 1.3 is picked up by the camera.
  • an apparatus provided in the second motor vehicle 2 sends a request to the first motor vehicle 1.3.
  • the request can be sent via a first point-to-point connection that is set up via a first communication interface.
  • the first motor vehicle 1.3 receives and processes the request in an applicable apparatus.
  • the first motor vehicle 1.3 is likewise assumed to be equipped with a camera that picks up the traffic space in front of the first motor vehicle 1.3.
  • the apparatus in the first motor vehicle 1.3 sends a response to the second motor vehicle 2 containing information pertaining to available sensors and the characteristics thereof.
  • the characteristics of sensors comprise a coverage area, a vertical and/or horizontal swivel area, information about the spatial and/or temporal resolution and/or the range of the sensor and the like, for example.
  • the response can be sent via the first point-to-point connection set up beforehand.
  • the apparatus in the second motor vehicle 2 displays to the driver the available external sensors and optionally also the arrangement thereof in the traffic space and also the coverage areas of said sensors.
  • the driver selects one or more external sensor(s) from which he would like to have measurement results displayed and optionally indicates which area of the traffic space is supposed to be picked up by means of the sensors.
  • the apparatus in the second motor vehicle 2 sends an applicable request to the first motor vehicle 1.3.
  • the man-machine interface in the second motor vehicle 2 displays the received and possibly further-processed sensor measurement data, so that the driver of the second motor vehicle 2 can obtain an overview of the traffic situation.
  • the communication setup is not shown in this diagram. Further, authentication processes and error monitoring operations, although likewise at least optionally provided, are not shown.
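The exchange of FIG. 4 — query the available sensors and their characteristics, then request a measurement restricted to a selected area — can be sketched in code. The class and function names below (SensorInfo, request_measurement, etc.) are illustrative assumptions and not part of the disclosure; the faked detection merely stands in for a real sensor measurement.

```python
from dataclasses import dataclass

@dataclass
class SensorInfo:
    sensor_type: str     # e.g. "camera", "radar", "lidar"
    coverage_deg: float  # horizontal coverage area in degrees
    range_m: float       # maximum range in metres

@dataclass
class FirstVehicle:
    """Responder side: answers capability queries and measurement requests."""
    sensors: list

    def query_sensors(self):
        # Response to the first request: available sensors and characteristics.
        return self.sensors

    def measure(self, area_azimuth_deg):
        # A real implementation would trigger the sensor for the requested
        # angular area; here a detection is faked for illustration.
        return {"object_detected": True,
                "azimuth_deg": sum(area_azimuth_deg) / 2}

def request_measurement(vehicle, area_azimuth_deg):
    """Requester side (second motor vehicle): query capabilities first, then
    request a measurement limited to the selected angular area, which keeps
    the transmitted data volume small."""
    if not vehicle.query_sensors():
        return {"object_detected": False}
    return vehicle.measure(area_azimuth_deg)

v1 = FirstVehicle([SensorInfo("camera", 120.0, 80.0)])
result = request_measurement(v1, (-30.0, 10.0))
```

In a real system each call would of course travel over the point-to-point connection rather than being a local method call.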
  • FIG. 5 shows the traffic space from FIGS. 1 to 3 and the area picked up by the sensors of the first motor vehicle 1.3 at the request of the second motor vehicle 2.
  • the figure depicts the area picked up by the sensors of the first motor vehicle 1.3 by shading. It can clearly be seen that not all of the possible coverage area of the sensors of the first motor vehicle 1.3 is actually scanned. In particular, the right-hand area from the point of view of the first motor vehicle 1.3 is not scanned, because this area can be seen clearly by the driver of the second motor vehicle 2. In the request for sensor measurement that the apparatus of the second motor vehicle 2 has sent to the first motor vehicle 1.3, the measurement area to be picked up has been specified accordingly.
  • live camera images are sent from the first motor vehicle 1.3 to the second motor vehicle 2.
  • if the sensor in the first motor vehicle 1.3 is not a camera but, for example, a lidar system or a radar system, then instead of the live camera images it is also possible for a piece of information about the existence of an object and the position thereof in the specified measurement area to be transmitted.
  • if the apparatus in the first motor vehicle 1.3 has applicable computing capacity and program code means, then a symbol that represents an object in the measurement area can also be generated from the measurement data. In this case, only the symbol and its position in the traffic space need to be transmitted.
  • the position can be ascertained from images from a camera or from applicable measured values from a lidar system or radar system.
  • the position can be described using a distance and direction relative to the first motor vehicle 1.3 in this case, or using absolute geoposition data ascertained therefrom, for example.
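The conversion from a relative description (distance and direction from the first motor vehicle) to absolute geoposition data might look as follows. This is a flat-earth sketch with an assumed helper name, adequate for ranges of a few hundred metres; it is not the patent's implementation.

```python
import math

def relative_to_geoposition(lat_deg, lon_deg, heading_deg,
                            bearing_deg, distance_m):
    """Convert an object position given as distance and direction relative
    to the sensing vehicle into absolute latitude/longitude.

    bearing_deg is measured clockwise from the vehicle's heading
    (0 deg = straight ahead); heading_deg is the vehicle's heading
    clockwise from north.  A flat-earth approximation is used."""
    EARTH_RADIUS_M = 6_371_000.0
    absolute_bearing = math.radians(heading_deg + bearing_deg)
    dn = distance_m * math.cos(absolute_bearing)   # metres towards north
    de = distance_m * math.sin(absolute_bearing)   # metres towards east
    dlat = math.degrees(dn / EARTH_RADIUS_M)
    dlon = math.degrees(de / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon

# Vehicle heading due north; an object 100 m dead ahead lies ~0.0009 deg north.
lat, lon = relative_to_geoposition(48.0, 11.0, 0.0, 0.0, 100.0)
```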
  • a consideration of these processes in regard to ADASIS concepts is dispensed with at this juncture.
  • FIG. 6 shows a schematic exemplary block diagram of an apparatus 600 for carrying out the method according to the invention in a second motor vehicle.
  • a sensor 602, a communication interface 604, a man-machine interface 606 and a controller 608 having a microprocessor and also volatile and nonvolatile memory means are connected to one another via one or more communication line(s) or communication bus(es) 610.
  • Each of the components 602-606 may likewise have a microprocessor and also volatile and nonvolatile memory means.
  • the microprocessor(s) execute(s) computer program instructions that cause the implementation of single or multiple steps of the method according to the invention.
  • further possible components are: a chip that performs a joint data fusion for local and remote sensor data; further, a chip that uses the data from the sensor fusion chip to produce predictions for the behavior of the picked-up objects, such as vehicles, pedestrians, etc., in particular including those that cannot, or can only intermittently, be picked up using the vehicle's own sensors; or a chip that combines the results of the prediction, also using map data etc., into visually presentable data structures, e.g. for an output by augmented reality.
  • FIG. 7 shows a schematic exemplary block diagram of an apparatus 700 for carrying out the method according to the invention in a first motor vehicle.
  • a sensor 702, a communication interface 704 and a controller 708 having a microprocessor and also volatile and nonvolatile memory means are connected to one another via one or more communication line(s) or communication bus(es) 710.
  • Each of the components 702 and 704 may likewise have a microprocessor and also volatile and nonvolatile memory means.
  • the microprocessor(s) execute(s) computer program instructions that cause the implementation of single or multiple steps of the method according to the invention.

Abstract

A method for depicting sensor data of one or more first motor vehicle(s) via a man-machine interface of a second motor vehicle comprises depicting an image of at least part of surroundings of the second motor vehicle via the man-machine interface of the second motor vehicle. One or more first motor vehicle(s) in the surroundings is/are mapped at least in part. A user selects one or more of the mapped first motor vehicles to which respective point-to-point connections are set up that are used to send a request for information pertaining to available sensors and the properties thereof. Based on the responses, a schematic depiction of the surroundings of the second motor vehicle is created and reproduced that shows positions of the motor vehicles in the surroundings, the sensors available in the motor vehicles and areas within which the available sensors can pick up objects.

Description

    BACKGROUND Field of the Art
  • The present invention relates to a method for wirelessly transmitting sensor data between vehicles and to an applicable apparatus.
  • Background
  • Wireless communication networks are employed in a multiplicity of technical fields of application today. In the area of automotive engineering, it is known that vehicles use what is known as car-to-car communication to exchange information with one another. This communication is a wireless ad hoc network that is set up between physically adjacent vehicles in road traffic and, from a technical point of view, involves an evolved WLAN (wireless local area network) network based on the IEEE 802.11 standard.
  • In the area of car-to-car communication, a wireless radio link between vehicles is used to transmit the information ascertained by the sensor system of one vehicle to other vehicles in physical proximity, for example. This allows in particular information regarding hazard spots to be quickly transmitted from one vehicle to other vehicles. However, this method does not involve the vehicle that wirelessly receives this information specifying which vehicle is supposed to receive particular information. Moreover, the transmitted data are of abstract nature and contain little detail. The methods known from the prior art are therefore not suitable for discriminatory information transmission from one vehicle to the other, and no transmission of detailed data is offered that is suitable for assisting a driver directly in a traffic situation that is difficult to see. In particular, the car-to-car communication architectures known to date, which e.g. also resort to mobile radio networks, cannot guarantee sufficiently low latencies, for example in the single millisecond range, for the telecommunication. Further, the approach shown here permits authentication for the sender that is based inter alia on conventional methods of radio direction finding. This authentication is not possible as precisely for other car-to-car communication methods such as WiFi and Bluetooth or DSRC, because the transmitters thereof cannot be oriented as exactly to a point in three-dimensional space as in the case of the approach described here. Further, the sensor functionality of the AESA radar can be used here to perform distance measurements from a receiving second vehicle to a sending first vehicle for monitoring purposes more or less in parallel with the communication process. Therefore, fast error detection is possible for this type of communication link in the single millisecond range. This cannot be achieved in this way by other types of car-to-car communication.
  • Embedding of single functionalities presented here into an ADASIS architecture would be possible in principle, but is not pursued further in the descriptions that follow. Further, it would also be possible to imagine uses of e.g. DSRC and WLAN ITS-g5. The example solution proposed here for the communication using an AESA radar meets requirements in terms of latency, reliability and authentication, however, that are still under discussion for other methods or cannot be achieved on principle. Therefore, references to these concepts are dispensed with in the depictions that follow.
  • DE 10 2006 055 344 A1 shows the use of data about the traffic situation in the surroundings of a first vehicle in a second vehicle that is relatively close to the first vehicle. In this case, the data received in the second vehicle are output at least in part via an output means so as to be perceptible to the driver of said vehicle. After selective identification of the first vehicle, the data of this vehicle that are directly relevant to the driver of the first vehicle are received in discriminatory fashion. In D1, the first vehicle is identified by registration or numberplate recognition by means of a camera or through the interchange of geoposition data. The received data comprise inter alia a video image recorded by a camera provided in the first vehicle or abstracted data such as a distance between the first vehicle and a vehicle traveling ahead or distance and speed of an oncoming vehicle, for example.
  • DE 199 14 906 A1 discloses a communication system for discriminatory communication between vehicles driven independently of one another that allows exact addressing by virtue of orientation of a communication device and matching of a communication range to the position of a vehicle with which it is desirable to begin communication.
  • Explanations of Terms
  • The term position equates with a whereabouts on the earth's surface or in or on a construction provided or suitable for road traffic. Depending on the context, a position can also denote the representation of a whereabouts on a map or roadmap.
  • Within the context of this description, a motor vehicle is a motor-driven vehicle running on the earth's surface or in or on structures connected thereto.
  • In this description, the term traffic space equates with a surface on which vehicles, moving or unmoving, participate in the traffic. In this case, the traffic space can also extend in different planes, for example in the case of bridges or underpasses. Unless indicated otherwise, the term traffic space within the context of this description denotes immediate surroundings of a motor vehicle, the included radius also being able to be dependent on the speed of travel of the motor vehicle and the complexity of the traffic space. In particular, the traffic space can have an irregular shape with a different extent in different directions, for example a rectangle that has a longer extent in the direction of travel of the motor vehicle than to the sides or behind.
  • The statement “relative to a vehicle” equates in this description with a direction and/or distance in relation to a vehicle. In this case, a standard placement is provided on an area around the vehicle either in degrees, 0 degrees being at the front and 180 degrees being at the rear, or based on a dial of a clock in hours and possibly minutes. In the latter case, 12 o'clock equates with at the front and 6 o'clock equates with at the rear.
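The clock-dial convention above maps directly onto the degree convention (12 o'clock = 0 degrees at the front, 6 o'clock = 180 degrees at the rear). A minimal illustrative conversion, with the helper name being an assumption:

```python
def clock_to_degrees(hours, minutes=0):
    """Map a clock-face direction relative to the vehicle to degrees:
    12 o'clock = 0 deg (at the front), 3 o'clock = 90 deg,
    6 o'clock = 180 deg (at the rear)."""
    total_minutes = (hours % 12) * 60 + minutes
    return total_minutes / 720.0 * 360.0  # full dial = 720 minutes = 360 deg

# An object "at 3 o'clock" thus lies 90 degrees clockwise from the front.
```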
  • The term orientation equates with an orientation of a vehicle in the traffic space, the orientation being oriented to a front or to a rear of a vehicle. In this case, it is assumed that the front of a vehicle is also referenced by “at the front” and the rear of a vehicle is also referenced as “at the rear”.
  • The term “azimuth” describes a horizontal orientation of a sensor, for example a swivel to the left or right from a zero position determined by the installation location and the installation position of the sensor, or a horizontal swivel oriented to an absolute zero reference, for example to the direction of the north pole of the earth as seen from a current location. Accordingly, the term “elevation” describes a vertical orientation of a sensor, for example an angle between the horizon and the direction of the sensor.
  • Together with the determinations of distance that are standard for radar measurements, it is in particular possible for AESA radars to be used to perform three-dimensional scans of spatial sectors by virtue of e.g. electronic beam sweeping being used to vary the scanning radar beam. Radar imaging techniques such as SAR, ISAR, etc. that have already been known for years are not discussed further at this juncture.
  • “Lidar” is the abbreviation for the English term “light detection and ranging”. Lidar is a method of optical distance and speed measurement related to radar (radio detection and ranging). Instead of radio waves, beams of light, in particular laser beams, are used. To pick up an area, the beam of light is moved in defined fashion over the area, in similar fashion to the line-by-line scanning in a cathode-ray tube television set, the scanning in the case of lidar being effected by means of one or more mobile mirror(s).
  • Using an AESA radar, electronic beam sweeping can also be used to create three-dimensional scans of selected parts of the visible vehicle surroundings that can then be conditioned in an imaging process to produce a visible depiction. This involves, in general, a data fusion with other available sensor data, e.g. also from optical systems such as cameras, being carried out beforehand and data cleansing, e.g. by means of Kalman filtering, being performed for collation purposes.
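The data cleansing by means of Kalman filtering mentioned above can be illustrated with a minimal scalar filter that smooths noisy range measurements of a nearly static object. Structure and parameter values are illustrative assumptions only, not the filter actually used in such a system.

```python
def kalman_1d(measurements, process_var=1e-3, meas_var=0.5):
    """Minimal scalar Kalman filter: smooths a series of noisy range
    measurements (in metres) of a nearly static object."""
    estimate, error = measurements[0], 1.0
    smoothed = [estimate]
    for z in measurements[1:]:
        error += process_var               # predict: uncertainty grows
        gain = error / (error + meas_var)  # update: weight of new measurement
        estimate += gain * (z - estimate)
        error *= (1.0 - gain)
        smoothed.append(estimate)
    return smoothed

noisy = [10.2, 9.8, 10.4, 9.9, 10.1]
clean = kalman_1d(noisy)
```

In practice a multidimensional filter over position and velocity would be used for the data fusion of several sensors; the scalar case only shows the predict/update principle.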
  • Technical Problem
  • It is an object of the invention to provide a method and an apparatus for wireless communication between vehicles that offer more efficient assistance to the driver of a motor vehicle in a traffic situation that is difficult to see.
  • BRIEF SUMMARY Technical Solution
  • This object is achieved by the independent patent claims. Refinements of the invention are defined in the dependent claims.
  • A method according to the invention for depicting sensor data of a first motor vehicle via a man-machine interface of a second motor vehicle involves an image of at least part of surroundings of the second motor vehicle being recorded and being reproduced via the man-machine interface of the second motor vehicle. One or more first motor vehicle(s) is/are mapped at least in part on the image of the surroundings. The display is provided on a screen arranged in the field of view of the driver, for example. The display can also be provided such that respective positions and orientations of the first motor vehicles in the traffic space relative to the second motor vehicle are schematically depicted. The determination of the positions and orientations of the first motor vehicles in the traffic space relative to the second motor vehicle can be carried out particularly easily when a stereo camera system is used. A suitable piece of object-detection software can be used in particular to identify other motor vehicles, which are then assigned applicable coordinates in a three-dimensional space. The object detection also allows silhouettes corresponding to the objects identified as a motor vehicle to be determined.
  • A driver or operator selects one or more of the first motor vehicles from which sensor data are supposed to be received. The selection can be made by touching a respective motor vehicle on a touch-sensitive screen on which the image of the surroundings is reproduced, for example. If gesture detection is available, the selection can also be made by pointing to one or more first motor vehicle(s) depicted on a screen as applicable. Other selection methods, for example by placing a cursor on or close to a motor vehicle depicted on the screen, are likewise conceivable, the cursor being able to be controlled in a fundamentally known manner.
  • After a selection has been made, one or more individual first point-to-point connection(s) from the second to the selected first motor vehicle(s) is/are set up via a first wireless communication interface. In this case, the one or more first motor vehicle(s) can be identified using registration recognition, for example. Expediently, in this case the registration is at the same time an identification for setup of a connection. However, it is also possible for motor vehicles to periodically emit identification signals that can be received by other motor vehicles and used for setting up a connection.
  • The one or more individual first point-to-point connection(s) can be set up after applicable orientation of a first transmission or reception apparatus in the direction of the applicable one or more first motor vehicle(s), for example. The first transmission or reception apparatus can be oriented on the basis of angles of azimuth and elevation for the selected first motor vehicle(s) that are ascertained from the image of the surroundings of the second motor vehicle beforehand, for example. As such, if the point that the driver or operator has selected on the screen comes under the silhouette of a motor vehicle, it is possible for the applicable angles for orienting the first transmission or reception apparatus to be determined from the known position of the second motor vehicle and the known properties of the camera that was used to record the image of the surroundings of the second motor vehicle. Optionally, e.g. when radar is used, the identity of the thus selected vehicle for the currently measured distance can additionally be plausibilized taking into consideration the azimuth and elevation values. If the installation locations for radar systems or lidar systems in motor vehicles are standardised, for example always centrally between the headlamps of the motor vehicles, then the object detection described above can be used to set the location to which the transmission or reception apparatus is oriented correspondingly more accurately. The setup of point-to-point connections using oriented transmission and reception apparatuses allows the use of other identification features for setup of a connection to be dispensed with. 
In particular, when an AESA radar is used, the beamforming to set up an optimal radio link from the respective vehicles, for example the first vehicles, to the addressed other vehicles, for example the second vehicles, can be oriented such that only the addressed vehicles react to the setup of the point-to-point connection and those unaffected ignore it. To this end, a vehicle accepts reception of electromagnetic waves from a vehicle only if they are directed from the sending vehicle to the receiving vehicle and the receiving vehicle also expects a transmission from the sending vehicle. This can be verified using conventional radio direction finding/radio locating means.
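The determination of the angles of azimuth and elevation from the selected image point and the known camera properties, as described above, can be sketched with a pinhole model. The linear angle-per-pixel approximation and the assumed field-of-view parameters are simplifications for illustration.

```python
def pixel_to_angles(px, py, image_w, image_h, hfov_deg, vfov_deg):
    """Convert a selected pixel into azimuth/elevation angles relative to
    the camera's optical axis.

    Assumes the optical axis points through the image centre and a linear
    angle-per-pixel relation (adequate for moderate fields of view)."""
    azimuth = (px - image_w / 2) / image_w * hfov_deg
    elevation = -(py - image_h / 2) / image_h * vfov_deg  # image y grows downwards
    return azimuth, elevation

# A pixel at the image centre lies on the optical axis (0 deg, 0 deg).
az, el = pixel_to_angles(960, 540, 1920, 1080, 90.0, 60.0)
```

These angles, together with the known position of the camera in the second motor vehicle, then yield the orientation for the transmission or reception apparatus.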
  • After the one or more individual first point-to-point connection(s) has/have been set up, a request to transmit information pertaining to available sensors and the properties thereof to the second motor vehicle is transmitted to the selected first motor vehicle(s).
  • In response to the request, the second motor vehicle receives responses from one or more of the selected first motor vehicles via the respective first point-to-point connections and reproduces a schematic depiction of the surroundings of the second motor vehicle, wherein the schematic depiction shows positions of the second and the one or more first motor vehicle(s) in the surroundings of the second motor vehicle, and wherein the schematic depiction depicts areas within which the sensors of the one or more first motor vehicle(s) and/or of the second motor vehicle can pick up objects. These areas may be denoted in colour or by an applicable texture in the schematic depiction, for example. The schematic depiction of the surroundings can moreover show objects that are located in the surroundings of the second motor vehicle and/or in the respective surroundings of the one or more first motor vehicle(s). The objects include, inter alia, boundaries of roads, houses and the like, but also other objects located in the traffic space that have been picked up by sensors.
  • The man-machine interface of the second motor vehicle receives a user input that corresponds to a selection of one or more of the motor vehicles, or the sensors thereof, shown in the schematic depiction and of an area to be picked up by the selected sensor(s) and sends an applicable request to one or more first motor vehicle(s) whose sensors can pick up the area to be picked up. If an area to be picked up in the schematic depiction can be picked up only by sensors of one of the one or more first motor vehicle(s), the request is also sent only to this one first motor vehicle. Otherwise, the request can be sent to all first motor vehicles whose sensors can pick up the area to be picked up. The request to pick up an area can include information pertaining to the azimuth and elevation of a sensor, either from the point of view of the second motor vehicle or from the point of view of the respective first motor vehicle whose sensor is supposed to pick up the area. The selection of an area to be picked up by a sensor, in particular the narrowing-down of an area to be picked up, causes a decrease in the volume of the data to be transmitted, which allows faster reproduction of the sensor data in the second motor vehicle to take place.
  • The first motor vehicle(s) having received a request to pick up an area perform(s) applicable sensor measurements and transmit(s) the measurement results to the second motor vehicle. In the simplest case, the measurement result reported is the presence of an object in the picked-up area and the coordinates thereof or, if an object has not been detected, an applicable report that an object has not been detected. In embodiments of the method according to the invention, further information about detected objects can be reported, for example the dimensions or contours thereof and, if multiple measurements have been carried out cyclically in succession, whether and in what direction detected objects move.
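The measurement results described above — presence of an object, its coordinates, and optionally its dimensions and direction of movement from cyclic measurements — could be carried in a message structure of the following kind. All field and function names are assumptions for illustration; the patent describes only the content to be reported.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MeasurementResult:
    """Illustrative measurement report from a first to the second vehicle."""
    object_detected: bool
    position_m: Optional[tuple] = None    # (x, y) relative to sensing vehicle
    dimensions_m: Optional[tuple] = None  # (length, width), if determined
    heading_deg: Optional[float] = None   # movement direction from cyclic scans

def format_report(r):
    """Render a result for the man-machine interface; a 'nonevent'
    (no object detected) is reported explicitly as well."""
    if not r.object_detected:
        return "no object detected in requested area"
    parts = ["object at {}".format(r.position_m)]
    if r.heading_deg is not None:
        parts.append("moving towards {:.0f} deg".format(r.heading_deg))
    return ", ".join(parts)

msg = format_report(MeasurementResult(True, (12.5, -3.0), heading_deg=270.0))
```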
  • The corresponding system in the second motor vehicle receives the sensor data sent by the one or more first motor vehicle(s) in response to the request and reproduces them via the man-machine interface of the second motor vehicle. If sensor data have been received from multiple first motor vehicles, they are reproduced jointly, if need be after an applicable data fusion. If a measurement has not been carried out, this “nonevent” is also reproduced as applicable. In addition to a visual reproduction, an audible signal can be reproduced for each measurement, different audible signals also being able to be used for detected objects and non-detection of objects. Similarly, haptic feedback can be used to signal the measurement results. By way of example, if a finger of an operator or driver of the second motor vehicle remains on a touch-sensitive screen while the measurement is carried out by the first vehicle(s), an applicable vibration can signal the presence of an object in the area to be picked up selected by the position of the finger. Dispensing with time-consuming production of a visual depiction from the measurement data of the sensor(s) of the first motor vehicle(s) results in faster feedback in this case.
  • Instead of the split into a first request to ascertain which sensors are able to pick up which area and a second request with the specific pickup order, it is also possible for a single request with a specific pickup order to be sent directly. If pickup according to the desired specifications is not possible, the pickup order is in this case either not carried out or pickup that complies with the specifications of the order in the best possible way is performed.
  • In one configuration of the invention, the measurement by the sensor(s) of the first motor vehicle(s) is repeated cyclically and the measurement results are transmitted and updated as applicable on the reproduction by the man-machine interface of the second motor vehicle. The time intervals in which the measurement is repeated may be settable by a user or adjusted automatically in this case. Automatic adjustment is possible on the basis of the current speed of the first or the second motor vehicle, for example.
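The automatic adjustment of the measurement interval on the basis of the current speed could, for example, follow a distance-based rule: the faster the vehicle moves, the more often the measurement is repeated. The clamping thresholds and the 5 m refresh distance below are illustrative assumptions.

```python
def update_interval_s(speed_mps, min_interval_s=0.1, max_interval_s=2.0):
    """Adapt the cyclic measurement interval to the current vehicle speed,
    aiming to refresh roughly every 5 m of travelled distance."""
    if speed_mps <= 0:
        return max_interval_s  # stationary: slowest update rate suffices
    interval = 5.0 / speed_mps
    return max(min_interval_s, min(max_interval_s, interval))

# At 25 m/s (90 km/h) the display would refresh every 0.2 s.
```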
  • In one configuration of the method according to the invention, the request is sent to the selected first motor vehicle(s) via respective individual first point-to-point connections that are set up via the first wireless communication interface. The responses are received via a respective second point-to-point connection that is set up via a second wireless communication interface. By way of example, the first wireless communication interface operates on the basis of the IEEE 802.11 standard, and point-to-point connections set up via it are logical point-to-point connections. The second wireless communication interface uses a radar system or lidar system provided in the first and the one or more second vehicle(s), for example, and point-to-point connections set up via it are physical point-to-point connections in which transmitter and receiver communicate with one another using a focused radar beam or beam of light oriented between them. Communication using a focused radar beam oriented between transmitter and receiver is known from military aviation. In this case, radar systems having active electronic beam sweeping and aperture control, also known as active phased-array radar or an active electronically scanned array (AESA), are used; these are based on semiconductor chips and can therefore also be used in the automotive sector. The introduction of the AESA in military aviation meant that one and the same radar appliance could be used to carry out, in parallel, different tasks that had previously been handled by separate appliances. A single AESA radar can therefore be used to handle different sensor and communication tasks (virtually) in parallel, "virtually in parallel" here meaning that the tasks alternate within short successive timeslots.
In the present application in the automotive sector, this means in particular that when an AESA radar is used it is fundamentally possible for externally initiated sensor tasks to be carried out (virtually) in parallel. Further, applicable beamforming allows the functionality of a wideband radio link, directed to a selected communication partner, to be set up on an ad hoc basis. This radio link needs to exist only for a short timeslot in each case, so that the AESA radar is available again for other tasks, e.g. its own or other sensor tasks, in the next timeslot. Applicable communication using a beam of light, in particular using a laser beam, can be effected using an optical scanner provided in a vehicle, which scanner is oriented to an applicable receiver in the other vehicle. It is naturally also possible for the whole communication to be handled via one communication interface.
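The "virtually in parallel" operation described above, i.e. sensor and communication tasks alternating within short successive timeslots on one AESA radar, can be sketched as a round-robin slot scheduler. The class and field names here are hypothetical, chosen only to illustrate the timeslot alternation:

```python
from dataclasses import dataclass
from collections import deque

@dataclass
class AesaTask:
    """One radar task: an own or externally initiated sensor scan ("sensor"),
    or a beamformed data-link slot directed at a communication partner ("comm")."""
    name: str
    kind: str
    beam_azimuth_deg: float

class AesaScheduler:
    """Round-robin timeslot scheduler: each task occupies the radar for one
    short slot, then is requeued, so sensing and directed communication run
    virtually in parallel on a single AESA radar."""
    def __init__(self, slot_ms: float = 1.0):
        self.slot_ms = slot_ms
        self.tasks = deque()

    def add_task(self, task: AesaTask) -> None:
        self.tasks.append(task)

    def run_slots(self, n_slots: int) -> list:
        executed = []
        for _ in range(n_slots):
            if not self.tasks:
                break
            task = self.tasks.popleft()   # steer the beam for this task...
            executed.append(task.name)    # ...and execute it for one slot
            self.tasks.append(task)       # ...then requeue for the next round
        return executed
```

Running six slots over three queued tasks yields two full alternation rounds, which is the sense in which the radio link "needs to exist only for a short timeslot in each case".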
  • If the AESA radar is oriented to the receiving vehicle to transmit sensor data from the sending vehicle using beamforming, as in a radio link, the receiving vehicle can authenticate the source, i.e. the transmitter of the sensor data, using conventional means of radio direction finding/radio locating in geometric fashion, i.e. by checking azimuth, elevation and distance against the visibility established during the original selection of the vehicles. Further, with distance measurement conducted virtually in parallel, an interruption in this "radio link" can be picked up immediately, that is to say error detection in the single-digit millisecond range is ensured. For verification purposes, the sensor pickup by the first vehicle can periodically be directed temporarily by the second vehicle to areas of the traffic space that can be picked up jointly, in order to check correct functionality. A second vehicle can thus continually run a periodic diagnosis of the sensors used in a first vehicle.
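The geometric authentication step can be sketched as a simple tolerance check: the direction-found bearing of the incoming beam must match the position at which the vehicle was originally selected. The tolerance values below are illustrative assumptions, not specified in the disclosure:

```python
def authenticate_source(measured: dict, expected: dict,
                        tol_angle_deg: float = 2.0,
                        tol_dist_m: float = 5.0) -> bool:
    """Geometrically authenticate the transmitter of a directed 'radio link':
    the measured azimuth, elevation and distance of the incoming beam must
    agree, within tolerances, with the position known from the original
    selection of the vehicle."""
    d_az = abs(measured["azimuth_deg"] - expected["azimuth_deg"])
    d_el = abs(measured["elevation_deg"] - expected["elevation_deg"])
    d_r = abs(measured["distance_m"] - expected["distance_m"])
    return d_az <= tol_angle_deg and d_el <= tol_angle_deg and d_r <= tol_dist_m
```

A transmitter whose bearing deviates beyond the tolerances, e.g. a spoofed source at a different azimuth, fails the check and its sensor data would be discarded.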
  • These security characteristics, such as authentication and fast error detection, needed for the transfer of sensor data cannot be provided by other car-to-car communication methods.
  • In one embodiment of the above-described configuration of the method according to the invention, sensor data are transmitted between the vehicles by means of an AESA radar. Instead of radar data, however, it is also possible for live images from a video camera arranged in the first vehicle to be transmitted to the second vehicle. The high bandwidth that data transmission via a directional radar beam provides allows almost latency-free transmission of sensor data, e.g. video image transmission of even uncompressed video signals. In this case, the transmission of uncompressed video signals has the advantage that the latency owing to compression disappears. The direct transmission of sensor data without any preprocessing from a first vehicle to a second vehicle then also allows processing of these data in, e.g., the ADAS system of the second vehicle with controllable, low latencies for the capture of said data. These data are therefore more trustworthy than events reported via car-to-car communication, and a data fusion with the data reported by local sensors can even be effected. The driver of the second vehicle can therefore obtain an overview of the traffic situation from the point of view of the first vehicle in real time. In addition to the video image, data ascertained by the radar system of the first vehicle pertaining to objects in the coverage area of that radar system can be transmitted and reproduced together with the video image. Additional information comprises inter alia distance information between the object and the second vehicle and also the speed and direction of movement of the object, if present, measured in the first vehicle.
  • Otherwise, the real-time transmission of the image content (visual data or scan results, e.g. from radar sensors) and of other sensor data allows a data fusion of local and remote sensor data to be effected in, e.g., the ADAS system of a second vehicle. This means that inter alia, e.g., Kalman filtering methods can be used to make improved predictions about the behavior of precisely those objects, for example vehicles, pedestrians, etc., that have left the scan area of the vehicle's own sensors, possibly only for a short time. Such a prediction can be transmitted to the driver, e.g. visually in the form of augmented reality.
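The prediction idea can be sketched with a fixed-gain (alpha-beta) tracker, a simplified stand-in for the Kalman filtering mentioned above: it fuses position measurements (local or remote) while the object is visible and coasts on the predicted state once the object has left the scan area. Gains, time step and the one-dimensional state are illustrative assumptions:

```python
class AlphaBetaTracker:
    """Fixed-gain (alpha-beta) tracker over a 1-D position/velocity state.
    step(z) fuses a measurement z; step(None) predicts only, e.g. while the
    object is outside the scan area of the vehicle's own sensors."""
    def __init__(self, x0: float, v0: float,
                 alpha: float = 0.85, beta: float = 0.005, dt: float = 0.1):
        self.x, self.v = x0, v0
        self.alpha, self.beta, self.dt = alpha, beta, dt

    def step(self, z=None) -> float:
        x_pred = self.x + self.v * self.dt   # constant-velocity time update
        if z is None:
            self.x = x_pred                  # no measurement: coast on prediction
            return self.x
        r = z - x_pred                       # innovation (measurement residual)
        self.x = x_pred + self.alpha * r     # position update with fixed gain
        self.v = self.v + (self.beta / self.dt) * r  # velocity update
        return self.x
```

A full Kalman filter would additionally carry state covariances and compute the gains from the measurement noise; the structure of predict and update steps is the same.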
  • In one embodiment of the method according to the invention, if a first motor vehicle leaves a coverage area of the second motor vehicle or if communication between the two vehicles is no longer possible, the display of the man-machine interface is updated accordingly. In this case, it is possible for not only a symbol representing the first motor vehicle but also measurement results of the measurements carried out by this motor vehicle to be removed. The update can be effected directly or after a waiting time has elapsed.
  • An apparatus provided in a second motor vehicle for carrying out the method according to the invention comprises at least one sensor that is configured to detect one or more first motor vehicle(s) in the surroundings. By way of example, the sensor is a camera arrangement that is connected to an applicable device for object detection. The apparatus moreover comprises a first and optionally a second communication interface via which one or more point-to-point connection(s) to one or more first motor vehicle(s) can be set up. Furthermore, the apparatus comprises a man-machine interface that is configured to reproduce sensor measurement results delivered by vehicle-based sensors and by other motor vehicles. In this case, the man-machine interface is also configured to accept user inputs. The apparatus, or individual components thereof, has/have one or more microprocessor(s) that is/are connected to volatile or nonvolatile memory means. The one or more microprocessor(s) in this case execute(s) computer program instructions that are preferably stored in the nonvolatile memory means, the execution of which causes the implementation of individual or multiple steps of the method according to the invention.
  • An apparatus provided in a first motor vehicle for carrying out the method according to the invention comprises at least one sensor that is configured to pick up objects in the surroundings of the first motor vehicle. By way of example, the sensor is a camera arrangement that may be connected to an applicable device for object detection. Other suitable sensors are radar sensors, ultrasonic sensors, laser sensors and the like, for example. The apparatus provided in the first motor vehicle moreover comprises a first and optionally a second communication interface via which a point-to-point connection to a second motor vehicle can be set up. The apparatus furthermore comprises a control unit that evaluates and executes demands for sensor measurements coming from a second motor vehicle and triggers the transmission of the sensor measurement results to the second motor vehicle. The apparatus, or individual components thereof, has/have one or more microprocessor(s) that are connected to volatile or nonvolatile memory means. The one or more microprocessor(s) in this case execute(s) computer program instructions that are preferably stored in the nonvolatile memory means, the execution of which causes the implementation of individual or multiple steps of the method according to the invention.
  • The present invention allows a driver of a vehicle to access sensor systems of other vehicles in the surroundings and to obtain information even pertaining to objects that are not in the visual range of the driver or of the sensor systems of his vehicle. The bidirectional, almost latency-free point-to-point communication furthermore allows sensor systems of other vehicles to be selectively accessed and the coverage areas thereof to be controlled such that information pertaining to particular selected areas of the surroundings of the vehicle is obtained in a targeted fashion. The selective active access to sensor systems of other vehicles means that the driver of the vehicle no longer has to rely on the transmission intervals at which the other vehicles broadcast possibly preselected information pertaining to their respective surroundings to all receivers in the surroundings, as provided for in car-to-car communication.
  • Further, the method for sending sensor data via an AESA radar from a first vehicle to a second vehicle using beamforming as directional radio provides sufficiently low latency for the transmission and error detection, together with secure authentication and identification, which is indispensable for the use of external sensor data from a first vehicle in a second vehicle. If the second vehicle now also performs diagnoses of the sensor functionality in the first vehicle, e.g. when picking up common parts of the traffic space, with the necessary frequency, this communication link from a first vehicle to a second vehicle has sufficient reliability for use for ADAS purposes in a second vehicle.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The invention is described below on the basis of the figures in the drawing:
  • FIG. 1 shows a schematic depiction of a traffic space in the surroundings of a second motor vehicle;
  • FIG. 2 depicts areas able to be picked up by the driver of the second motor vehicle or by sensors in the motor vehicle;
  • FIG. 3 shows the visual range of the driver of a first motor vehicle;
  • FIG. 4 shows an exemplary communication using the method according to the invention between a second and a first motor vehicle;
  • FIG. 5 shows an area picked up by the sensors of the first motor vehicle at the request of the second motor vehicle;
  • FIG. 6 shows a schematic exemplary block diagram of an apparatus in a first motor vehicle; and
  • FIG. 7 shows a schematic exemplary block diagram of an apparatus in a second motor vehicle.
  • In the figures, the same or similar elements are provided with the same reference numbers.
  • DETAILED DESCRIPTION OF EMBODIMENTS
  • FIG. 1 shows a schematic depiction of a traffic space 100 in the surroundings of a second motor vehicle 2. The present traffic space is a junction between two roads, with the general right-of-way rule of "right before left" applying. There are several other vehicles in the present traffic space, and at the corners of the junction there are buildings 104-107, indicated by the thicker boundary lines, with footpaths in front of them. The buildings 104-107 and a large vehicle 1.1 that has stopped in a road branching off to the right block part of the view of the traffic space for the driver of the second vehicle 2 and sensors mounted in the second vehicle 2. The planned route of the second motor vehicle 2 is indicated by the arrow 102. The driver of the second motor vehicle 2 must yield right of way to the vehicle 1.2 coming from the right before he himself can cross the junction.
  • The areas able to be picked up by the driver of the second motor vehicle 2 or the areas able to be picked up by sensors in the motor vehicle are depicted in FIG. 2. For reasons of clarity, the coverage areas of the sensors and the visual range of the driver are assumed to be identical in the figures. The visual range of the driver is bounded on the left by the building 107. To the right, the visual range of the driver is bounded by the building 106. Additionally, the vehicle 1.1 blocks a large part of the view to the right. The part that the driver of the motor vehicle 2 cannot see is depicted by shading in the figure. It can clearly be seen that the view of the motor vehicle 1.2 is blocked by the motor vehicle 1.1. Were the driver of the motor vehicle 2 to continue his planned route, there would be the risk of him seeing the motor vehicle 1.2 too late and a collision occurring.
  • FIG. 3 depicts the visual range of the driver of the first motor vehicle 1.3. The visual range of the driver of the motor vehicle 1.3 is bounded to the left by the building 104 and to the right by the building 107. To the front, the vehicle 1.1 blocks a narrow area. As previously in FIG. 2, the area that the driver of the motor vehicle 1.3 cannot see is depicted by shading. The vehicle 1.2 is completely in the visual range of the driver of the vehicle 1.3.
  • FIG. 4 depicts an exemplary communication using the method according to the invention between a second and a first motor vehicle for the traffic situation shown in FIGS. 1 to 3. The second motor vehicle 2 is equipped with an apparatus that searches the surroundings of the second motor vehicle 2 for external sensors whose measurement results can be displayed via a man-machine interface in the second motor vehicle. By way of example, the apparatus is a camera that searches the traffic space in front of the second motor vehicle 2 for other motor vehicles. In the example in FIGS. 1 to 3, the first motor vehicle 1.3 is picked up by the camera. In accordance with the method according to the invention, an apparatus provided in the second motor vehicle 2 sends a request to the first motor vehicle 1.3 concerning whether and if so which sensors can transmit their measurement results to the second motor vehicle 2. The request can be sent via a first point-to-point connection that is set up via a first communication interface. The first motor vehicle 1.3 receives and processes the request in an applicable apparatus. In the present example, the first motor vehicle 1.3 is likewise assumed to be equipped with a camera that picks up the traffic space in front of the first motor vehicle 1.3. The apparatus in the first motor vehicle 1.3 sends a response to the second motor vehicle 2 containing information pertaining to available sensors and the characteristics thereof. The characteristics of sensors comprise a coverage area, a vertical and/or horizontal swivel area, information about the spatial and/or temporal resolution and/or the range of the sensor and the like, for example. The response can be sent via the first point-to-point connection set up beforehand. The apparatus in the second motor vehicle 2 displays to the driver the available external sensors and optionally also the arrangement thereof in the traffic space and also the coverage areas of said sensors.
The driver selects one or more external sensor(s) from which he would like to have measurement results displayed and optionally indicates which area of the traffic space is supposed to be picked up by means of the sensors. The apparatus in the second motor vehicle 2 sends an applicable request to the first motor vehicle 1.3, which performs an applicable sensor measurement and sends the sensor measurement data to the second motor vehicle 2. The man-machine interface in the second motor vehicle 2 displays the received and possibly further-processed sensor measurement data, so that the driver of the second motor vehicle 2 can obtain an overview of the traffic situation. The communication setup is not shown in this diagram. Further, authentication processes and error monitoring operations, although likewise at least optionally provided, are not shown.
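The request/response exchange described above can be sketched as follows; the message fields and the `FirstVehicleEndpoint` class are hypothetical names chosen for illustration and are not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class SensorDescription:
    """Answer to the capability query: an available sensor and its characteristics."""
    sensor_type: str      # e.g. "camera", "radar", "lidar"
    coverage_deg: float   # horizontal coverage area
    swivel_deg: float     # horizontal swivel range
    range_m: float        # sensor range

@dataclass
class MeasurementRequest:
    """Request from the second vehicle: which sensor, and which area to pick up."""
    sensor_type: str
    area_azimuth_deg: tuple  # (from, to), relative to the first vehicle

class FirstVehicleEndpoint:
    """Hypothetical endpoint in the first motor vehicle: answers the
    'which sensors are available?' query and executes a measurement
    request restricted to the specified area of the traffic space."""
    def __init__(self, sensors):
        self.sensors = sensors

    def query_sensors(self) -> list:
        return list(self.sensors)

    def measure(self, req: MeasurementRequest, scan) -> dict:
        # 'scan' stands in for the actual sensor driver; only the
        # requested area is picked up and returned
        return scan(req.area_azimuth_deg)
```

In the scenario of FIGS. 1 to 3, vehicle 2 would query vehicle 1.3, display the returned `SensorDescription` entries via the man-machine interface, and send a `MeasurementRequest` covering the area hidden behind vehicle 1.1.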
  • FIG. 5 shows the traffic space from FIGS. 1 to 3 and shows the area picked up by the sensors of the first motor vehicle 1.3 at the request of the second motor vehicle 2. The figure depicts the area picked up by the sensors of the first motor vehicle 1.3 by shading. It can clearly be seen that not all of the possible coverage area of the sensors of the first motor vehicle 1.3 is actually scanned. In particular the right-hand area from the point of view of the first motor vehicle 1.3 is not scanned, because this area can be seen clearly by the driver of the second motor vehicle 2. In the request for sensor measurement that the apparatus of the second motor vehicle 2 has sent to the first motor vehicle 1.3, the measurement area to be picked up has been specified accordingly.
  • In one embodiment of the present method, live camera images are sent from the first motor vehicle 1.3 to the second motor vehicle 2. If the sensor in the first motor vehicle 1.3 is not a camera, but rather a lidar system or a radar system, for example, then instead of the live camera images it is also possible for a piece of information about the existence of an object and the position thereof in the specified measurement area to be transmitted. If the apparatus in the first motor vehicle 1.3 has applicable computing capacity and program code means, then a symbol that represents an object in the measurement area can also be generated from the measurement data. In this case, only the symbol and its position in the traffic space need to be transmitted. Depending on the sensor equipment, the position can be ascertained from images from a camera or from applicable measured values from a lidar system or radar system. The position can be described using a distance and direction relative to the first motor vehicle 1.3 in this case, or using absolute geoposition data ascertained therefrom, for example. A consideration of these processes in regard to ADASIS concepts is dispensed with at this juncture.
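Converting the relative description (distance and direction from the first motor vehicle) into absolute geoposition data can be sketched with a local flat-earth approximation, which is adequate at sensor ranges of a few hundred meters; the function name and parameters are chosen for illustration:

```python
import math

def object_geoposition(lat_deg: float, lon_deg: float, heading_deg: float,
                       bearing_deg: float, distance_m: float) -> tuple:
    """Convert a sensor measurement (distance and bearing relative to the
    measuring vehicle's heading) into an absolute geoposition, using a
    flat-earth approximation around the vehicle's own position."""
    EARTH_R = 6_371_000.0                         # mean earth radius in meters
    az = math.radians(heading_deg + bearing_deg)  # absolute azimuth, clockwise from north
    d_north = distance_m * math.cos(az)
    d_east = distance_m * math.sin(az)
    d_lat = math.degrees(d_north / EARTH_R)
    d_lon = math.degrees(d_east / (EARTH_R * math.cos(math.radians(lat_deg))))
    return lat_deg + d_lat, lon_deg + d_lon
```

The second motor vehicle can then place the transmitted symbol on its own schematic depiction of the traffic space using the returned coordinates.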
  • FIG. 6 shows a schematic exemplary block diagram of an apparatus 600 for carrying out the method according to the invention in a second motor vehicle. A sensor 602, a communication interface 604, a man-machine interface 606 and a controller 608 having a microprocessor and also volatile and nonvolatile memory means are connected to one another via one or more communication line(s) or communication bus(es) 610. Each of the components 602-606 may likewise have a microprocessor and also volatile and nonvolatile memory means. The microprocessor(s) execute(s) computer program instructions that cause the implementation of individual or multiple steps of the method according to the invention. Optionally, there may be further components (not shown in the figure), for example a chip that performs a joint data fusion for local and remote sensor data; further, a chip that uses the data from the sensor fusion chip to produce predictions for the behavior of the picked-up objects, such as vehicles, pedestrians, etc., in particular including those that cannot, or intermittently cannot, be picked up using the vehicle's own sensors; or a chip that combines the results of the prediction, also using map data etc., into visually presentable data structures, e.g. for an output by augmented reality.
  • FIG. 7 shows a schematic exemplary block diagram with an apparatus 700 for carrying out the method according to the invention in a first motor vehicle. A sensor 702, a communication interface 704 and a controller 708 having a microprocessor and also volatile and nonvolatile memory means are connected to one another via one or more communication line(s) or communication bus(es) 710. Each of the components 702 or 704 may likewise have a microprocessor and also volatile and nonvolatile memory means. The microprocessor(s) execute(s) computer program instructions that cause the implementation of individual or multiple steps of the method according to the invention.

Claims (20)

1. A method for depicting sensor data of one or more first motor vehicle(s) via a man-machine interface of a second motor vehicle, comprising:
depicting an image of at least part of surroundings of the second motor vehicle via the man-machine interface of the second motor vehicle, wherein one or more first motor vehicle(s) is/are mapped at least in part,
receiving a user input that corresponds to a selection of one or more of the first motor vehicles mapped on the image of the surroundings,
setting up one or more individual first point-to-point connection(s) from the second to the selected first motor vehicle(s) via a first wireless communication interface,
sending a request to the selected first motor vehicle(s) via the respective individual first point-to-point connection to transmit information pertaining to available sensors and the properties thereof to the second motor vehicle,
receiving the responses from one or more of the selected first motor vehicles and reproducing a schematic depiction of the surroundings of the second motor vehicle, wherein the schematic depiction shows positions of the second and the one or more first motor vehicle(s) in the surroundings of the second motor vehicle, wherein the schematic depiction shows sensors available in the one or more first motor vehicle(s), and wherein the schematic depiction depicts areas within which the available sensors of the one or more first motor vehicle(s) and/or the sensors of the second motor vehicle can pick up objects,
receiving a user input that corresponds to a selection of one or more of the sensors shown in the schematic depiction and of an area to be picked up by sensor,
sending an applicable request to one or more first motor vehicle(s) whose sensors can pick up the area to be picked up, and
receiving applicable sensor data for a subsequent reproduction via the man-machine interface of the second motor vehicle.
2. The method as claimed in claim 1, wherein the one or more individual first point-to-point connection(s) is/are set up after applicable orientation of a first transmission or reception apparatus in the direction of the applicable selected one or more first motor vehicle(s).
3. The method as claimed in claim 2, wherein the first transmission or reception apparatus is oriented on the basis of angles of azimuth and elevation for the selected first motor vehicle(s) that are ascertained from the image of the surroundings of the second motor vehicle beforehand.
4. The method as claimed in claim 1, wherein the schematic depiction shows objects located in the surroundings of the second motor vehicle and/or in the respective surroundings of the one or more first motor vehicle(s).
5. The method as claimed in claim 1, wherein selected first motor vehicles that provide no response to the request are denoted in applicable fashion in the schematic depiction.
6. The method as claimed in claim 1, wherein the schematic depiction shows a bird's-eye view of the surroundings of the second motor vehicle.
7. The method as claimed in claim 1, wherein sensor data updated in response to an applicable request are received cyclically from one or more first motor vehicle(s) for a previously stipulated period or up until a call to stop the transmission.
8. The method as claimed in claim 7, wherein the sensor data represent information from a radar system, a lidar system, a video camera and/or an ultrasonic system.
9. The method as claimed in claim 1, comprising:
actuating respective radar systems in the second and the one or more first motor vehicle(s) to orient an area picked up by the radar system to the first or second motor vehicle by means of electronically controlled direction and aperture,
sending the request and receiving the response to the request via the respective oriented radar systems.
10. An apparatus for depicting sensor data of one or more first motor vehicle(s) via a man-machine interface of a second motor vehicle, wherein the apparatus has a sensor, a communication interface, a man-machine interface and at least one controller that are connected to one another via one or more communication line(s) or communication bus(es), wherein the at least one controller has volatile and nonvolatile memory means, and wherein the apparatus is configured to, when the at least one controller (608) executes applicable computer program instructions stored in the nonvolatile memory means, carry out a method for depicting the sensor data of the one or more first motor vehicle(s) via the man-machine interface of the second motor vehicle, the method comprising:
depicting an image of at least part of surroundings of the second motor vehicle via the man-machine interface of the second motor vehicle, wherein one or more first motor vehicle(s) is/are mapped at least in part,
receiving a user input that corresponds to a selection of one or more of the first motor vehicles mapped on the image of the surroundings,
setting up one or more individual first point-to-point connection(s) from the second to the selected first motor vehicle(s) via a first wireless communication interface,
sending a request to the selected first motor vehicle(s) via the respective individual first point-to-point connection to transmit information pertaining to available sensors and the properties thereof to the second motor vehicle,
receiving the responses from one or more of the selected first motor vehicles and reproducing a schematic depiction of the surroundings of the second motor vehicle, wherein the schematic depiction shows positions of the second and the one or more first motor vehicle(s) in the surroundings of the second motor vehicle, wherein the schematic depiction shows sensors available in the one or more first motor vehicle(s), and wherein the schematic depiction depicts areas within which the available sensors of the one or more first motor vehicle(s) and/or the sensors of the second motor vehicle can pick up objects,
receiving a user input that corresponds to a selection of one or more of the sensors shown in the schematic depiction and of an area to be picked up by sensor,
sending an applicable request to one or more first motor vehicle(s) whose sensors can pick up the area to be picked up, and
receiving applicable sensor data for a subsequent reproduction via the man-machine interface of the second motor vehicle.
11. A motor vehicle having an apparatus for depicting sensor data of one or more first motor vehicle(s) via a man-machine interface of a second motor vehicle, wherein the apparatus has a sensor, a communication interface, a man-machine interface and at least one controller that are connected to one another via one or more communication line(s) or communication bus(es), wherein the at least one controller has volatile and nonvolatile memory means, and wherein the apparatus is configured to, when the at least one controller (608) executes applicable computer program instructions stored in the nonvolatile memory means, carry out a method for depicting the sensor data of the one or more first motor vehicle(s) via the man-machine interface of the second motor vehicle, the method comprising:
depicting an image of at least part of surroundings of the second motor vehicle via the man-machine interface of the second motor vehicle, wherein one or more first motor vehicle(s) is/are mapped at least in part,
receiving a user input that corresponds to a selection of one or more of the first motor vehicles mapped on the image of the surroundings,
setting up one or more individual first point-to-point connection(s) from the second to the selected first motor vehicle(s) via a first wireless communication interface,
sending a request to the selected first motor vehicle(s) via the respective individual first point-to-point connection to transmit information pertaining to available sensors and the properties thereof to the second motor vehicle,
receiving the responses from one or more of the selected first motor vehicles and reproducing a schematic depiction of the surroundings of the second motor vehicle, wherein the schematic depiction shows positions of the second and the one or more first motor vehicle(s) in the surroundings of the second motor vehicle, wherein the schematic depiction shows sensors available in the one or more first motor vehicle(s), and wherein the schematic depiction depicts areas within which the available sensors of the one or more first motor vehicle(s) and/or the sensors of the second motor vehicle can pick up objects,
receiving a user input that corresponds to a selection of one or more of the sensors shown in the schematic depiction and of an area to be picked up by sensor,
sending an applicable request to one or more first motor vehicle(s) whose sensors can pick up the area to be picked up, and
receiving applicable sensor data for a subsequent reproduction via the man-machine interface of the second motor vehicle.
12. The apparatus as claimed in claim 10, wherein the one or more individual first point-to-point connection(s) is/are set up after applicable orientation of a first transmission or reception apparatus in the direction of the applicable selected one or more first motor vehicle(s).
13. The apparatus as claimed in claim 12, wherein the first transmission or reception apparatus is oriented on the basis of angles of azimuth and elevation for the selected first motor vehicle(s) that are ascertained from the image of the surroundings of the second motor vehicle beforehand.
14. The apparatus as claimed in claim 10, wherein the schematic depiction shows objects located in the surroundings of the second motor vehicle and/or in the respective surroundings of the one or more first motor vehicle(s).
15. The apparatus as claimed in claim 10, wherein selected first motor vehicles that provide no response to the request are denoted in applicable fashion in the schematic depiction.
16. The apparatus as claimed in claim 10, wherein the schematic depiction shows a bird's-eye view of the surroundings of the second motor vehicle.
17. The apparatus as claimed in claim 10, wherein sensor data updated in response to an applicable request are received cyclically from one or more first motor vehicle(s) for a previously stipulated period or up until a call to stop the transmission.
18. The apparatus as claimed in claim 17, wherein the sensor data represent information from a radar system, a lidar system, a video camera and/or an ultrasonic system.
19. The apparatus as claimed in claim 10, comprising:
actuating respective radar systems in the second and the one or more first motor vehicle(s) to orient an area picked up by the radar system to the first or second motor vehicle by means of electronically controlled direction and aperture,
sending the request and receiving the response to the request via the respective oriented radar systems.
20. The motor vehicle as claimed in claim 11, wherein the one or more individual first point-to-point connection(s) is/are set up after applicable orientation of a first transmission or reception apparatus in the direction of the applicable selected one or more first motor vehicle(s).
US15/773,072 2015-11-02 2016-11-01 Method and device for selecting and transmitting sensor data from a first motor vehicle to a second motor vehicle Active US10490079B2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
DE102015221439.7 2015-11-02
DE102015221439 2015-11-02
DE102015221439.7A DE102015221439B3 (en) 2015-11-02 2015-11-02 Method and device for selecting and transmitting sensor data from a first to a second motor vehicle
PCT/EP2016/076288 WO2017076827A1 (en) 2015-11-02 2016-11-01 Method and device for selecting and transmitting sensor data from a first motor vehicle to a second motor vehicle

Publications (2)

Publication Number Publication Date
US20180322784A1 true US20180322784A1 (en) 2018-11-08
US10490079B2 US10490079B2 (en) 2019-11-26

Family

ID=57218902

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/773,072 Active US10490079B2 (en) 2015-11-02 2016-11-01 Method and device for selecting and transmitting sensor data from a first motor vehicle to a second motor vehicle

Country Status (5)

Country Link
US (1) US10490079B2 (en)
EP (1) EP3371800B1 (en)
CN (1) CN108028020B (en)
DE (1) DE102015221439B3 (en)
WO (1) WO2017076827A1 (en)


Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017208286A1 (en) * 2017-05-17 2018-11-22 Audi Ag Method for operating a motor vehicle, motor vehicle and communication system for a motor vehicle
CA3064735C (en) 2017-05-23 2022-06-21 Urthecast Corp. Synthetic aperture radar imaging apparatus and methods
DE102017212227A1 (en) * 2017-07-18 2019-01-24 Ford Global Technologies, Llc Method and system for vehicle data collection and vehicle control in road traffic
DE102017218680A1 (en) * 2017-10-19 2019-04-25 Volkswagen Aktiengesellschaft Apparatus, means of locomotion and method for supporting guidance of a first means of locomotion through a bottleneck by means of externally determined environment information
WO2019226194A2 (en) 2017-11-22 2019-11-28 Urthecast Corp. Synthetic aperture radar apparatus and methods
EP3721260A2 (en) * 2017-12-05 2020-10-14 Sew-Eurodrive GmbH & Co. KG System comprising an installation and mobile part, and method for operating a system
DE102017222878A1 (en) * 2017-12-15 2019-06-19 Zf Friedrichshafen Ag Control of a motor vehicle
DE102017223585A1 (en) * 2017-12-21 2019-06-27 Continental Automotive Gmbh Method and device for selecting and transmitting sensor data from a first to a second motor vehicle
DE102017223575A1 (en) * 2017-12-21 2019-06-27 Continental Automotive Gmbh Method and device
DE102018219376A1 (en) * 2018-11-13 2020-05-14 Robert Bosch Gmbh Procedures for selecting and accelerating action responses
US11693423B2 (en) * 2018-12-19 2023-07-04 Waymo Llc Model for excluding vehicle from sensor field of view
US11003195B2 (en) * 2019-02-28 2021-05-11 GM Global Technology Operations LLC Method to prioritize the process of receiving for cooperative sensor sharing objects
CN113448322A (en) * 2020-03-26 2021-09-28 宝马股份公司 Remote operation method and system for vehicle, storage medium, and electronic device
CN113065691A (en) * 2021-03-22 2021-07-02 中国联合网络通信集团有限公司 Traffic behavior prediction method and system
DE102021205751A1 (en) 2021-06-08 2022-12-08 Robert Bosch Gesellschaft mit beschränkter Haftung Method for communication between at least two participants in a networked transport system
CN113561726A (en) * 2021-09-01 2021-10-29 苏州盖茨电子科技有限公司 Vehicle active obstacle avoidance system

Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6456904B1 (en) * 2001-09-13 2002-09-24 Robert Gay Method and device for gauging distance between race cars
US20020198660A1 (en) * 2001-06-26 2002-12-26 Medius, Inc. Method and apparatus for transferring information between vehicles
US20030098800A1 (en) * 2001-11-26 2003-05-29 Motorola, Inc. Method and apparatus for detecting and responding to an absence of journey-related information
US20030139881A1 (en) * 2002-01-24 2003-07-24 Ford Global Technologies, Inc. Method and apparatus for activating a crash countermeasure
US20080265097A1 (en) * 2007-04-30 2008-10-30 Stecko Stephen M Apparatus for an automated aerial refueling boom using multiple types of sensors
US7466992B1 (en) * 2001-10-18 2008-12-16 Iwao Fujisaki Communication device
US20090231432A1 (en) * 2008-03-17 2009-09-17 International Business Machines Corporation View selection in a vehicle-to-vehicle network
US20140009307A1 (en) * 2012-07-09 2014-01-09 Elwha Llc Systems and methods for coordinating sensor operation for collision detection
US20140152774A1 (en) * 2011-09-27 2014-06-05 Aisin Seiki Kabushiki Kaisha Vehicle periphery monitoring device
US20150016823A1 (en) * 2013-07-15 2015-01-15 Harman Becker Automotive Systems Gmbh Techniques of establishing a wireless data connection
US8939839B2 (en) * 2013-03-14 2015-01-27 Honda Motor Co., Ltd. Interactive vehicle gaming system and method
US20150112800A1 (en) * 2013-10-18 2015-04-23 State Farm Mutual Automobile Insurance Company Targeted advertising using vehicle information
US20150127191A1 (en) * 2013-11-06 2015-05-07 Saswat Misra Vehicular network
US20150145995A1 (en) * 2013-11-22 2015-05-28 At&T Intellectual Property I, L.P. Enhanced view for connected cars
US20150155007A1 (en) * 2013-12-04 2015-06-04 Hti Ip, Llc Method and System for Avatar Replay Based on Mobile Sensor Information
US20150235538A1 (en) * 2014-02-14 2015-08-20 GM Global Technology Operations LLC Methods and systems for processing attention data from a vehicle
US20150317834A1 (en) * 2014-05-01 2015-11-05 Adam G. Poulos Determining coordinate frames in a dynamic environment
US20160232423A1 (en) * 2015-02-11 2016-08-11 Qualcomm Incorporated Environmental scene condition detection
US20160277601A1 (en) * 2015-03-17 2016-09-22 Continental Automotive Systems, Inc. Shared vehicle camera
US20170010174A1 (en) * 2015-07-07 2017-01-12 Toyota Jidosha Kabushiki Kaisha Mobile computer atmospheric barometric pressure system
US20190001987A1 (en) * 2015-09-01 2019-01-03 Lg Electronics Inc. Vehicle and control method thereof

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8255144B2 (en) * 1997-10-22 2012-08-28 Intelligent Technologies International, Inc. Intra-vehicle information conveyance system and method
US7647180B2 (en) * 1997-10-22 2010-01-12 Intelligent Technologies International, Inc. Vehicular intersection management techniques
US8000897B2 (en) * 1997-10-22 2011-08-16 Intelligent Technologies International, Inc. Intersection collision avoidance techniques
DE19914906A1 (en) * 1999-04-01 2000-10-05 Bosch Gmbh Robert System for communicating between vehicles driven independently of each other includes communications devices assigned to each vehicle for exchanging signals and information
EP1327233A2 (en) * 2000-10-13 2003-07-16 Paxgrid Telemetric Systems Inc. Automotive telemetry protocol
JP4539361B2 (en) * 2005-02-16 2010-09-08 アイシン精機株式会社 Mobile communication device
DE102006006850B4 (en) * 2006-02-15 2022-12-29 Bayerische Motoren Werke Aktiengesellschaft Method of aligning a pivotable vehicle sensor
DE102006055344A1 (en) * 2006-11-23 2008-05-29 Vdo Automotive Ag Method for wireless communication between vehicles
JP4345832B2 (en) 2007-03-12 2009-10-14 トヨタ自動車株式会社 Road condition detection system
US8885039B2 (en) * 2008-07-25 2014-11-11 Lg Electronics Inc. Providing vehicle information
US9188980B2 (en) * 2008-09-11 2015-11-17 Deere & Company Vehicle with high integrity perception system
DE102008042565A1 (en) 2008-10-02 2010-04-08 Robert Bosch Gmbh Method for operating a driver assistance device
JP4905512B2 (en) * 2009-07-09 2012-03-28 株式会社デンソー Target information estimation device
DE102010034140A1 (en) * 2010-08-12 2012-02-16 Valeo Schalter Und Sensoren Gmbh Method for displaying images on a display device and driver assistance system
US8886212B2 (en) * 2010-08-24 2014-11-11 Blackberry Limited Mobile tracking
CN102737522A (en) * 2012-06-29 2012-10-17 惠州天缘电子有限公司 Active anti-collision method based on Internet of vehicles
KR102028720B1 (en) * 2012-07-10 2019-11-08 삼성전자주식회사 Transparent display apparatus for displaying an information of danger element and method thereof
DE102012015250A1 (en) 2012-08-01 2014-02-06 Audi Ag Radar sensor for a motor vehicle, motor vehicle and communication method


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10871561B2 (en) 2015-03-25 2020-12-22 Urthecast Corp. Apparatus and methods for synthetic aperture radar with digital beamforming
US10615513B2 (en) 2015-06-16 2020-04-07 Urthecast Corp Efficient planar phased array antenna assembly
EP3380864A4 (en) * 2015-11-25 2019-07-03 Urthecast Corp. Synthetic aperture radar imaging apparatus and methods
US10955546B2 (en) 2015-11-25 2021-03-23 Urthecast Corp. Synthetic aperture radar imaging apparatus and methods
US20180334174A1 (en) * 2017-05-18 2018-11-22 Ford Global Technologies, Llc Method to assist control of a vehicle assistance system
US10899366B2 (en) * 2017-05-18 2021-01-26 Ford Global Technologies, Llc Method to assist control of a vehicle assistance system
US11378682B2 (en) 2017-05-23 2022-07-05 Spacealpha Insights Corp. Synthetic aperture radar imaging apparatus and methods for moving targets
US11354951B2 (en) * 2017-10-20 2022-06-07 Volvo Truck Corporation Methods for diagnosing error of an ego vehicle and/or a surrounding vehicle
JP2022511815A (en) * 2018-12-05 2022-02-01 テレフオンアクチーボラゲット エルエム エリクソン(パブル) Object targeting
US11815587B2 (en) 2018-12-05 2023-11-14 Telefonaktiebolaget Lm Ericsson (Publ) Object targeting
US11447130B2 (en) * 2019-03-28 2022-09-20 Nissan Motor Co., Ltd. Behavior prediction method, behavior prediction apparatus and vehicle control apparatus
WO2023175618A1 (en) * 2022-03-15 2023-09-21 B.G. Negev Technologies And Applications Ltd, At Ben Gurion University Cloud-based sensing and control system using networked sensors for moving or stationary platforms

Also Published As

Publication number Publication date
EP3371800A1 (en) 2018-09-12
US10490079B2 (en) 2019-11-26
CN108028020A (en) 2018-05-11
EP3371800B1 (en) 2020-10-28
WO2017076827A1 (en) 2017-05-11
CN108028020B (en) 2021-01-29
DE102015221439B3 (en) 2017-05-04

Similar Documents

Publication Publication Date Title
US10490079B2 (en) Method and device for selecting and transmitting sensor data from a first motor vehicle to a second motor vehicle
WO2020224375A1 (en) Positioning method, apparatus, and device, and computer-readable storage medium
EP3540464B1 (en) Ranging method based on laser radar system, device and readable storage medium
CN110537109B (en) Sensing assembly for autonomous driving
US9863775B2 (en) Vehicle localization system
EP3742200B1 (en) Detection apparatus and parameter adjustment method thereof
US11544940B2 (en) Hybrid lane estimation using both deep learning and computer vision
CN106291535A (en) A kind of obstacle detector, robot and obstacle avoidance system
CN102866706A (en) Cleaning robot adopting smart phone navigation and navigation cleaning method thereof
US11544868B2 (en) Object location coordinate determination
CN113869231B (en) Method and equipment for acquiring real-time image information of target object
CN111736613A (en) Intelligent driving control method, device and system and storage medium
CN110750153A (en) Dynamic virtualization device of unmanned vehicle
WO2021150689A1 (en) System and methods for calibrating cameras with a fixed focal point
CN113008237A (en) Path planning method and device and aircraft
US11226616B2 (en) Information processing apparatus and computer readable storage medium for remotely driving vehicles
JP7438928B2 (en) Apparatus and method for providing location
US10249056B2 (en) Vehicle position estimation system
US10958846B2 (en) Method, device and system for configuration of a sensor on a moving object
US11557201B2 (en) Apparatus for assisting driving of a host vehicle based on augmented reality and method thereof
EP3223188A1 (en) A vehicle environment mapping system
CN113917875A (en) Open universal intelligent controller, method and storage medium for autonomous unmanned system
CN113467450A (en) Unmanned aerial vehicle control method and device, computer equipment and storage medium
EP4220580A1 (en) Method for vehicle driving assistance within delimited area
CN113312403B (en) Map acquisition method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: CONTINENTAL AUTOMOTIVE GMBH, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SCHILD, BERNHARD, DR.;REEL/FRAME:045697/0110

Effective date: 20180131

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4