WO2012072361A1 - Method for displaying an environment detected by sensors and device for displaying an environment detected by vehicle-mounted sensors - Google Patents


Info

Publication number
WO2012072361A1
WO2012072361A1 (PCT/EP2011/069251; EP2011069251W)
Authority
WO
WIPO (PCT)
Prior art keywords
sensors
vehicle
blind
scanned
section
Prior art date
Application number
PCT/EP2011/069251
Other languages
German (de)
English (en)
French (fr)
Inventor
Michael Schoenherr
Werner Urban
Marcus Schneider
Benno Albrecht
Volker Niemz
Original Assignee
Robert Bosch Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch Gmbh filed Critical Robert Bosch Gmbh
Priority to CN201180057537.6A priority Critical patent/CN103534138B/zh
Priority to EP11790896.2A priority patent/EP2646286A1/de
Publication of WO2012072361A1 publication Critical patent/WO2012072361A1/de


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60TVEHICLE BRAKE CONTROL SYSTEMS OR PARTS THEREOF; BRAKE CONTROL SYSTEMS OR PARTS THEREOF, IN GENERAL; ARRANGEMENT OF BRAKING ELEMENTS ON VEHICLES IN GENERAL; PORTABLE DEVICES FOR PREVENTING UNWANTED MOVEMENT OF VEHICLES; VEHICLE MODIFICATIONS TO FACILITATE COOLING OF BRAKES
    • B60T7/00Brake-action initiating means
    • B60T7/12Brake-action initiating means for automatic initiation; for initiation not subject to will of driver or passenger
    • B60T7/22Brake-action initiating means for automatic initiation; for initiation not subject to will of driver or passenger initiated by contact of vehicle, e.g. bumper, with an external object, e.g. another vehicle, or by means of contactless obstacle detectors mounted on the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/28Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q9/00Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
    • B60Q9/002Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle
    • B60Q9/004Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle using wave sensors
    • B60Q9/006Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle using wave sensors using a distance sensor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16Type of output information
    • B60K2360/173Reversing assist

Definitions

  • Representations based on acoustic cues or on light bars that fill with decreasing distance indicate only a scalar quantity, i.e. the proximity of the nearest object.
  • Representations by means of an optical display are also known which schematically reproduce the vehicle in a bird's-eye view together with a border surrounding the depicted vehicle, in which border the sensor data are displayed according to the location and orientation of the respective sensor.
  • Display systems are known which, for each sensor, represent a field that conforms to the outline of the depicted vehicle.
  • Known approaches such as those described above employ sensor displays that optically represent two basic states: a first state indicating that no object has been detected in the sensor's section, i.e. a vacant field, and a second state indicating that an object has been detected in the sensor's section, i.e. an obstacle in the field.
  • In the second state, the distance to the detected object may additionally be indicated, for example by a bar of variable width.
  • The fields or sections shown are fixed and correspond to the field of view of the built-in sensor. The inventors have recognized that the two states used for the illustration, as described above, do not reproduce all states of sensor detection. The result is an incomplete rendering of the environment.
  • The invention enables a complete representation of the sensor data and more precise statements about the environment. This avoids misrepresentations, especially when the environment is incompletely scanned by the sensors. Sensors at a few critical points of the vehicle suffice to present the driver with a complete sensor image in which all critical areas are also represented as such. Furthermore, problematic situations can be represented more accurately than in the prior art, thereby avoiding situations being misjudged as critical.
  • the concept on which the invention is based is to further distinguish the sensor data by additionally displaying a state of the sensors that was not previously considered in the illustration.
  • The further state is shown according to the invention. In the further state, none of the sensors can make a statement about a surrounding area, because the area in question is not scanned or the scan quality is insufficient.
  • Unscanned, in accordance with one aspect of the invention, also means that the quality of the scan is insufficient. The terms "unsampled" and "unscanned" are therefore used for sections for which no sampling data exist or for which the sampling data are of insufficient quality.
  • A blind section between two sensors, such as a side section of a vehicle which has only rear and front sensors, is according to the invention displayed differently from a section in which one of the sensors has detected at least one object by scanning. Likewise, a blind section is displayed differently from a section in which no object has been detected by any of the sensors.
  • The three different states of sensor detection, i.e. "object detected in section", "no object detected in section" and "section not scanned", are therefore each represented differently.
  • The invention distinguishes scanned from unscanned sections of the environment in their representation. This additional information is perceived by the driver and interpreted intuitively. According to the invention, a section in which no object has been detected because the scan by the associated sensor yielded a negative result is represented differently from a section in which no object is shown merely because the section has not yet been scanned. A distinction is thus made between a verifiably free section (since it was scanned) and a section about which no statement can be made (since it was not scanned).
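The three-state distinction described above can be sketched as a small classification routine. This is an illustrative sketch only, not the patent's implementation; all names are hypothetical:

```python
from enum import Enum

class SectionState(Enum):
    """The three display states distinguished by the method."""
    NOT_SCANNED = "blind"          # no (valid) sampling data for this section
    FREE = "scanned, no object"    # scanned, negative result
    OBJECT = "scanned, object"     # scanned, at least one object detected

def classify(has_samples: bool, object_detected: bool) -> SectionState:
    # A section without sampling data is blind regardless of objects, so
    # "no object shown" is never confused with "verifiably free".
    if not has_samples:
        return SectionState.NOT_SCANNED
    return SectionState.OBJECT if object_detected else SectionState.FREE
```

Each of the three states would then map to its own graphical characteristic in the display.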
  • the invention consequently relates to a method for representing an environment detected by means of sensors.
  • the method provides to scan the surroundings of a vehicle by means of sensors which are mounted on a vehicle.
  • a blind section may be a side section of the vehicle.
  • The blind section, i.e. the section which is not scanned, is given a different graphical characteristic than a scanned section for which the sensors detect an obstacle or no obstacle. Obstacles are objects that would damage the vehicle if the vehicle contacted them.
  • An embodiment of the invention provides that the blind section, which is represented with a different graphical characteristic, corresponds to a surrounding area which has not yet been scanned by the sensors.
  • The at least one scanned section corresponds to a surrounding area for which an obstacle or no obstacle is currently detected.
  • The scanned section may further include a surrounding area that was already scanned at an earlier point in time.
  • This point in time, which determines for a section whether it has the state "blind" or the state "sampled", corresponds to an activation time, in particular a sensor activation time or a vehicle activation time.
  • The point in time can also be defined as lying a predetermined period of time in the past.
  • The time that determines the state of a section may also consist of a logical combination of the activation time, in particular the sensor activation time and/or the vehicle activation time, and a past time or a subset thereof. This ensures that a section is presented as a scanned section, with detected objects or none, only if the scan is current. It also ensures that, between the last scan and the present, the scanning was not interrupted, for example by a restart during which it was reactivated.
  • the present sampling data is up-to-date.
  • The scanning results are considered unreliable once they exceed a certain age, for example 1 minute, 1 hour or 1 day. Classifying results as unreliable causes the section to be classified as a blind, unscanned section and presented as such. With this embodiment, a section is displayed as free, i.e. without an obstacle, only if this is reliably the case.
  • The "not scanned" state is provided for such a section so that uncertain results are not presented as safe. This significantly increases the reliability of the display and significantly reduces the risk of an accident caused by the driver trusting an incorrect depiction.
  • The embodiment described above thus provides that sections whose last scan lies more than a predetermined period of time in the past are considered blind. Such sections are regarded as sampled with insufficient quality, in particular because the low currency makes the associated sampling data insufficiently meaningful or reliable. Correct object determination is not possible for these sections because of the long elapsed time. In general, sections are considered blind if at least one criterion indicates that their sampling is of inadequate quality.
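The age criterion in this embodiment can be sketched as follows. The threshold and all names are hypothetical; the text offers 1 minute, 1 hour, or 1 day merely as example ages:

```python
from dataclasses import dataclass
from typing import Optional

MAX_SAMPLE_AGE_S = 60.0  # hypothetical limit (the text's examples: 1 min, 1 h, 1 day)

@dataclass
class Sample:
    timestamp: float       # seconds on the vehicle clock
    object_detected: bool

def is_blind(sample: Optional[Sample], now: float) -> bool:
    """A section counts as blind if it was never scanned or its scan is stale."""
    if sample is None:
        return True        # never scanned
    return (now - sample.timestamp) > MAX_SAMPLE_AGE_S
```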
  • Another of these criteria is the distance traveled. Since the error of the distance detection accumulates with the route, a section to be displayed is regarded as blind and represented accordingly once a certain distance has been covered since its last sampling. This depends in particular on the direction of movement of the vehicle and on the orientation of the section relative to the vehicle. A section aligned in a first direction is considered blind and displayed as such when, since its last sampling, a distance has been traveled in a second, opposite direction whose length exceeds a certain limit.
  • If the section to be scanned is oriented in the rearward direction of the vehicle, it is regarded as blind and represented accordingly once a distance exceeding a certain limit has been covered in the forward direction since the last scan of this section.
  • the first specific embodiment relates to a variant in which the vehicle has only sensors oriented in the rearward direction and no sensors which are aligned in the forward direction or arranged on the front side of the vehicle.
  • the second specific embodiment relates to a variant in which the vehicle has only sensors oriented in the forward direction and no sensors which are aligned in the rearward direction or are arranged on the rear side of the vehicle.
  • Another criterion may be the precision of the position determination system with which the relative movement is determined. A value output by the position determination system then forms the criterion of whether a section that is no longer scanned due to the relative movement is represented as blind or not.
  • Another criterion may be a signal-to-noise ratio that occurs during a scan. If this is below a limit, the area scanned thereby is regarded as blind and represented accordingly.
  • The criteria may be binary, i.e. evaluated against a limit, and may be used individually or in any combination with one another, in particular in a logical AND or OR combination. Furthermore, the criteria can be evaluated in analog form or in several stages, for example in more than two stages. A value can thus be formed for one, for several or for all criteria which increases as the criterion approaches its predetermined limit, and these values can be combined with one another.
  • the individual values can be combined arithmetically, for example as a weighted sum, which is compared with a total limit value. If the weighted sum exceeds the total limit value, the section in question - or if different sections and criteria are combined: the intersection of the sections or the union of the sections - is assumed to be blind and displayed accordingly.
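The weighted-sum combination against a total limit value might look like this sketch. The weights, the total limit, and the criterion names are illustrative assumptions, not values from the patent:

```python
# Each criterion is normalized to [0, 1]: 0 = fresh/reliable, 1 = at its own limit.
# Weights and the total limit are hypothetical tuning values.
WEIGHTS = {"age": 0.5, "distance": 0.3, "snr": 0.2}
TOTAL_LIMIT = 0.6

def section_is_blind(criteria: dict) -> bool:
    """Combine analog criteria as a weighted sum compared with a total limit."""
    score = sum(WEIGHTS[name] * value for name, value in criteria.items())
    return score > TOTAL_LIMIT
```

A binary AND/OR combination would be the special case where each value is 0 or 1.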
  • Another possible combination is to change the limit for another criterion based on one criterion.
  • For example, the limit for the distance traveled, or the limit for still-permissible gear changes, may be progressively reduced; beyond it, the section concerned is considered blind and presented as such.
  • the limits can therefore be predetermined and constant.
  • Alternatively, the limits may be predetermined and variable as a function of another quantity relevant to a criterion and of its distance to the associated limit.
  • A particularly preferred embodiment provides that the relative movement of the vehicle with respect to the environment is taken into account in the representation of the sections, i.e. in the division into blind and scanned sections.
  • The vehicle and its sensors shift relative to the environment, i.e. also relative to the obstacles or objects. Therefore, the detectable and the undetectable, blind portions of the environment change when the respective sensor, and thus its field of view, moves relative to the environment.
  • the preferred embodiment shifts the illustrated detectable and non-detectable blind portions according to the offset of the vehicle from the environment resulting from the relative movement.
  • the offset can also be considered as a shift resulting from the relative movement.
  • The displacement is realized, for example, by converting a location of the blind section into a location of the scanned section when, as a result of the relative movement of the vehicle, the corresponding location in the environment comes to be scanned by the sensors.
  • the relative movement between the vehicle and the environment is detected by an in-vehicle motion sensor.
  • This motion sensor is a motion sensor which is fixedly arranged on the vehicle and which determines from the vehicle the relative movement of the vehicle relative to the surroundings.
  • the motion sensor is a track or rotary encoder connected to a wheel of the vehicle.
  • A distance sensor already present in the vehicle can be used which also measures the distance traveled, for example the distance sensor of the odometer or a separate odometer.
  • the motion sensor may also be a positioning system with a receiver carried by the vehicle, for example a GPS locating device or a radio-network-based locating device, for example based on the GSM, UMTS protocol or another mobile radio network protocol.
  • For example, a line oriented horizontally with respect to the vehicle can be displaced which delimits a depicted field representing the respective section. Such a spatial adaptation of the depicted environment, including the visually recognizable distinction between previously undetected and detected sections of the environment while taking the vehicle's displacement into account, gives the driver a much better overview.
  • The section shown as blind is thereby reduced, in particular shortened, while the section shown as detectable is enlarged, in particular lengthened.
  • The representation thus couples the display of the sensor data to a motion sensor of the vehicle in order to adapt the depicted scan of the environment to the relative movement of the vehicle with respect to the environment.
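The coupling of the blind/scanned boundary to the odometric offset, as described above, can be sketched in a simplified one-dimensional model; the function and its clamping behavior are assumptions for illustration:

```python
def update_boundary(blind_len: float, scanned_len: float, offset: float):
    """Shift the blind/scanned boundary by the odometric offset:
    the blind subsection shortens and the scanned subsection grows by the
    same amount, clamped once the whole section has been swept."""
    shift = min(offset, blind_len)
    return blind_len - shift, scanned_len + shift
```

Applied repeatedly as the vehicle advances, this reproduces the behavior of subfields 40c' and 40c" described later for FIGS. 2b and 2c.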
  • As the motion sensor, exclusively odometric sensors or position detection sensors are used, which serve to detect a movement of the vehicle, in particular with respect to the environment.
  • the term sensor is used for a scanning sensor that examines the environment for objects from the vehicle.
  • The other graphical characteristic that depicts the blind section, as compared with the graphical characteristic of the scanned section, is at least one of the following: brightness, contrast, sharpness, resolution, color, texture, fade-in, fade-out, flashing, transparency, transparent obscuration, geometric shape, area, and width.
  • the blind portion is therefore represented with such a property, unlike the other portions, or the blind portion and the other portions have differences in this property.
  • the properties can be combined as desired or used individually.
  • The blind section may have a lower level of brightness, contrast, sharpness, resolution, color saturation, transparency, area and/or width than the scanned section or sections.
  • The blind section can be displayed with a texture, a superimposed icon, a superimposed pattern, or a transparent obscuration that differs from the other sections.
  • the blind portion may flash in the illustration, and / or be shown at least partially transparent, as opposed to the other scanned portions.
  • the geometric shape or size of the representation of the blind portion may differ from the geometric shape or size of the representation of the other portions.
  • The blind section may be hatched, unlike the other sections that are not blind.
  • all optically detectable properties are suitable for the distinguishable presentation of the blind section.
  • a blind portion, a detected obstacle portion, and a scanned portion for which no object has been detected differ in appearance from each other in at least one of the above-mentioned graphic properties.
  • the method according to the invention provides that the blind section, which is not detected by any of the sensors, corresponds to a gap that exists between adjacent outer sides of fields of view of mutually adjacent sensors.
  • The blind section can therefore be characterized by a gap in the coverage of the environment by the sensors' fields of view.
  • Alternatively, the blind section may result from a fault in the sensor that is aimed at the blind section.
  • the error may be a recoverable error, in particular an error due to a detected and removable deposit on the sensor, which is aligned with the blind section.
  • the error can also be an error due to a permanent defect.
  • The actual condition may be displayed individually for that sensor, with the indicator integrated into the representation of the scanned and unscanned sections. This display is intuitive for the driver to understand: for example, despite the vehicle passing a clearly visible object, a corresponding blind section is displayed, so the driver can easily infer a blocked or defective sensor from the specific representation.
  • The blind section and/or the scanned section may be displayed as sections or fields that are each uniquely associated with one or more of the sensors.
  • the scanned portion corresponds to the field of view of the associated sensor.
  • The depicted sections are in particular separated by gaps between adjacent sections and are shown spaced apart, in particular in the form of fields.
  • an immediate visual assignment of the sections or fields shown to individual sensors is possible.
  • A specific embodiment of the invention provides that the surroundings of the vehicle are scanned by means of sensors whose fields of view substantially completely cover the front and the rear of the vehicle. The fields of view of the sensors do not completely cover the two sides of the vehicle. The blind section therefore arises at the lateral locations which none of the sensors currently detects or has already detected; as already noted, this is typical when maneuvering a vehicle, in particular a motor vehicle, a truck or a passenger car.
  • the sensors may be mounted on the bumpers and in particular partially cover a front or rear side area of the vehicle. However, due to the side doors, the sensors do not cover the entire side or flank of the vehicle. Particularly in this case, the invention makes possible a precise representation of the entire surrounding situation, including the blind sections whose representation indicates that it can not be assumed that, despite the lack of detection of an object, there is actually no object in this section.
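As a toy illustration of such a lateral coverage gap, the unsampled stretch between the rearmost edge of a front sensor's field of view and the foremost edge of a rear sensor's field of view can be computed from their positions along the vehicle's longitudinal axis. The coordinate convention and function name are hypothetical:

```python
def lateral_gap(front_fov_rear_edge: float, rear_fov_front_edge: float) -> float:
    """Length of the unsampled side section between adjacent fields of view,
    measured along the vehicle's longitudinal axis (coordinates grow toward
    the vehicle front); 0 if the fields of view overlap."""
    return max(0.0, front_fov_rear_edge - rear_fov_front_edge)
```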
  • The invention is further realized by an apparatus for representing an environment detected by vehicle-mounted sensors, wherein a blind section exists that is not covered by any of the sensors.
  • the device does not comprise the sensors themselves, but has a sensor interface which is set up to connect the sensors.
  • the apparatus further comprises an optical display which reproduces at least the blind portion and at least one portion scanned by the sensors.
  • the apparatus is arranged to display the blind portion with a different graphic characteristic than the scanned portion.
  • The device may also have a display interface to which a visual display can be connected, the display interface being arranged to provide corresponding display data that can be shown on the display.
  • the display may be an LCD display or LED display, for example with a dot matrix or preformed, individual display panels whose shape symbolically corresponds to the shape of the vehicle or the extent of the field of view of the sensors.
  • the display is arranged in the vehicle in the driver's field of vision. Further, the display may be a head-up display that provides a visual representation on the windshield.
  • The sensors are in particular ultrasonic or microwave sensors which are operated in a pulse-echo mode.
  • the sensor interface and the display interface are electronic interfaces.
  • An embodiment of the device comprises a rewritable memory in which location information on the scanned and unscanned sections is stored. The device further comprises a reset device which is set up to reset the location information upon an activation or deactivation process of the device, when no sensor data arrive at the sensor interface, or when the location information has reached a predetermined age.
  • Sensor data on the scanned and unscanned sections are thus deleted if it cannot be ruled out that the corresponding location information no longer applies. This is the case, for example, during a sampling pause, in particular a pause exceeding a certain length. This avoids collisions with newly appeared or moving objects which may have been detected correctly before the pause but are no longer represented correctly afterwards.
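The memory-and-reset behavior described here can be sketched as a small store that is cleared on reset and purged of stale entries. The class, its default age, and the section identifiers are illustrative assumptions:

```python
from typing import Dict

class SectionStore:
    """Rewritable memory of per-section scan timestamps; a reset device would
    clear it on (de)activation, on missing sensor data, or when entries
    exceed a maximum age (names and the default age are hypothetical)."""
    def __init__(self, max_age_s: float = 60.0):
        self.max_age_s = max_age_s
        self.samples: Dict[str, float] = {}   # section id -> last scan time

    def reset(self) -> None:
        """Full reset: every section reverts to the blind state."""
        self.samples.clear()

    def purge_stale(self, now: float) -> None:
        """Drop entries older than max_age_s so they count as blind again."""
        self.samples = {sec: t for sec, t in self.samples.items()
                        if now - t <= self.max_age_s}
```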
  • The device further comprises a motion data interface which is adapted to be connected to a motion sensor, in particular a distance sensor, for example the odometer of the vehicle, as described above.
  • the device comprises a displacement unit connected to the motion data interface.
  • the shift unit is arranged to shift display data representing a blind section in favor of display data representing a scanned section in accordance with the data applied to the motion data interface.
  • The presentation data representing the scanned section thereby replace the presentation data of the blind section.
  • The shift unit is set up to displace the presentation data, or precursors thereof, in accordance with the motion sensor data, whereby the blind section or parts thereof are replaced by location information based on sensor data for a previously blind section, or parts thereof, which can now be detected as a result of the shift.
  • The embodiment described in this paragraph is combined with the embodiment of the preceding paragraph so that the shift unit can cooperate with the memory to update the location information stored there according to the movement reported at the motion data interface.
  • For example, as the vehicle moves, a sensor is directed into a section, or a part of a section, which has not yet been scanned, so that this section or part is now scanned. The shift unit updates this part so that the device no longer represents it as a blind subfield, but as a scanned subfield.
  • Figures 1a-c show a traffic situation suitable for explaining the invention.
  • FIG. 3 shows an embodiment of the device according to the invention.

Embodiments of the invention
  • In FIG. 1a, a vehicle 10 with sensors 12a-f is shown in plan view, the sensors being arranged at the front and at the rear of the vehicle 10. While the rear and front outer edges of the vehicle 10 are completely covered by the associated fields of view of the sensors, there is a coverage gap between sensors 12a and 12d and between sensors 12b and 12e on the respective sides of the vehicle 10.
  • One of these fields of view, ie the field of view of the sensor 12a, is exemplary denoted by the reference numeral 16.
  • the fields of view of the sensors 12a-f correspond to a scanned portion, respectively.
  • The lateral coverage gap spans a blind section 20 that is not sensed by the sensors. In the situation illustrated in FIG. 1a, an object 30 is located in the blind section 20 and is therefore not detected.
  • FIG. 2a shows a display according to the invention for the situation of FIG. 1a, with a symbolic representation of the vehicle 10 and a symbolic representation of the blind and scanned sections in the form of the fields 40a-e shown.
  • Each field in FIGS. 2a-2d corresponds to a form of representation of a section according to the invention, while a section relates directly to the environment. Since fields are used in FIGS. 2a-2d to represent the sections according to the invention, the reference numerals 40a-e relate to the sections represented by the fields and may equally relate to the fields themselves. Features and properties described for the sections shown therefore also apply to the fields.
  • the corresponding section is represented as a field having a graphical characteristic different from the graphical characteristic of the other fields.
  • the fields here represent scanned or blind sections.
  • Field 40c, which represents the blind section, is shown in dashed lines only, in contrast to the solid lines of fields 40a, b, d, e, which represent scanned sections. The graphical representation of field 40c therefore differs from that of the scanned fields.
  • Alternatively, field 40c is given a different color than the scanned fields 40a, b, d, e, among which, in turn, fields with a detected object differ from fields without a detected object in the same graphical property, e.g. in color or in the style of the border line, or in another property, for example the width of a bar graph.
  • In FIG. 1b, which represents a later point in time than FIG. 1a, the vehicle 10 has advanced relative to its surroundings according to a relative movement 50. The offset is indicated by a distance 52. The object 30 is still in a non-scanned blind section.
  • The blind section is reduced, since the sensor 12d has been moved by the offset of the vehicle 10 in the direction of the relative movement 50 and part of the section can now be scanned. Even though the object 30 is not yet located in a scanned section, it can be determined by scanning that there is no object in the newly covered subsection.
  • FIG. 2b shows an associated representation at an earlier point of the relative movement 50, while FIG. 2c shows an associated representation at a later point, in particular of the situation of FIG. 1b; FIG. 2b illustrates the situation at a point in time between FIGS. 1a and 1b, in which the offset according to the distance 52 is not yet complete.
  • In FIGS. 2b and 2c, the field 40c is divided by the offset into a subfield 40c', which represents a blind section (dashed lines), and a subfield 40c", which represents a scanned section (solid lines).
  • The subfield 40c" is shown in the same way as the fields 40a, b, d, e. It can be seen that with the offset the subfield 40c" grows, in particular in length, to the same extent as the subfield 40c' shrinks, in particular in length.
  • The offset shifts a boundary between these subfields. The amount of reduction of subfield 40c', the amount of growth of subfield 40c", and the amount of shift of the boundary each correspond to the amount of the offset in the direction of the relative movement 50, and thus to the distance 52.
  • The offset of the vehicle, i.e. the distance 52, is detected, for example, with an odometer and reconciled with the representation.
  • Detected sensor data is therefore shifted with respect to the corresponding location according to the offset to update this data according to the distance 52 of the offset.
  • the representation of FIGS. 2a-d is based on these updated or shifted data.
  • A comparison of FIG. 2b with FIG. 2c shows the progress of the representation according to the course of the relative movement 50, it being noted that the boundary is shifted further as the offset according to distance 52 increases.
  • The field 40c' representing the non-scanned area is progressively shortened, while the field 40c" representing the currently or recently scanned area grows with the offset.
  • FIG. 1 c shows a further advanced offset and the associated distance 52 of the vehicle 10 in the direction of the relative movement 50.
  • the sensor 12 d of the vehicle begins to detect the object 30.
  • the sensor 12d on the left rear side of the vehicle 10 scans the associated field of view and detects the object 30.
  • the detected object 30 is represented by the field 40d, which is shown with a different graphical characteristic than the other fields.
  • the corresponding representation is shown with the field 40d hatched, in contrast to the other fields representing a scanned portion in which no object was detected.
  • The distance 52 of the offset now has a length corresponding to the gap between the laterally oriented sensors 12a and 12d. The entire lateral area has thus been scanned, so that field 40c, which represents this section, i.e. the stretch between sensors 12a and 12d, is shown as fully scanned.
  • Information about the object distance in that section may further be represented, for example, by distance-dependent graphical characteristics such as variable color, shape, width and the like.
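One way to realize such a distance-dependent characteristic is a simple threshold mapping from measured distance to display colour. The thresholds below are illustrative values, not taken from the patent:

```python
def distance_to_color(distance_m, warn_m=0.5, caution_m=1.5):
    """Map a measured object distance (metres) to a display colour.
    warn_m and caution_m are illustrative thresholds."""
    if distance_m < warn_m:
        return "red"      # object very close
    if distance_m < caution_m:
        return "yellow"   # object at medium distance
    return "green"        # object detected, but still far away

print(distance_to_color(0.3))   # -> red
print(distance_to_color(1.0))   # -> yellow
print(distance_to_color(2.0))   # -> green
```

The same pattern extends to other characteristics named above, e.g. by returning a hatch width or field shape instead of a colour.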
  • FIG. 3 shows an embodiment of the device 110 according to the invention with a sensor interface 118, to which sensors 112a-c can be connected. Since the sensors 112a-c are not necessarily part of the device 110, they are shown in dashed lines.
  • The device further comprises a display 160, which symbolically represents the vehicle 10 (see Figures 1a-c, 2a-d) and the fields 140. The fields 140 correspond in part to the fields 40b, d of FIG. 2a. The presentation takes place
  • The device further comprises a memory 170, which communicates with the interface 118.
  • This connection is shown only symbolically. In particular, this connection can be indirect and, for example, via a not shown
  • The device 110 further comprises a reset device 172, preferably at the memory 170, with which location information within the memory 170 can be deleted or marked as unreliable. This prevents incorrect, because outdated, sensing information from being displayed.
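A minimal sketch of such a memory with a reset facility might look as follows; the class and method names are assumptions for illustration, not part of the patent:

```python
class LocationMemory:
    """Sketch of the memory 170 with a reset facility (cf. reset
    device 172): stored location information can be marked as
    unreliable so that outdated sensing data is not displayed."""

    def __init__(self):
        self._data = {}  # location -> (value, reliable flag)

    def store(self, location, value):
        """Store sensing information for a location and mark it reliable."""
        self._data[location] = (value, True)

    def reset(self, location):
        """Mark the entry unreliable instead of displaying stale data."""
        if location in self._data:
            value, _ = self._data[location]
            self._data[location] = (value, False)

    def displayable(self, location):
        """Return the stored value only if it is still marked reliable."""
        entry = self._data.get(location)
        return entry[0] if entry and entry[1] else None
```

A reset would typically be triggered when the stored information can no longer be trusted, e.g. after a long standstill or a large unmeasured displacement.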
  • The memory 170 is also used to store data for presentation. These data may be updated in the memory 170 to blind out data from previously
  • These data may be updated in the memory 170 to account for a location displacement detected by a motion sensor, in particular to account for an offset between the vehicle and the environment when representing the location data concerned.
  • The device therefore comprises a movement data interface 180, to which a motion sensor 182 can be connected, in particular an odometer or a motion sensor which is arranged in the vehicle and detects the distance travelled.
  • In one embodiment, the motion sensor 182 is not part of the device; alternatively, the motion sensor 182 may form part of the device 110. Because of these alternatives, the motion sensor 182 is shown in dashed lines, as are the sensors 112a-c.
  • The device 110 comprises a shift unit 190, which is connected to the movement data interface 180 and takes the route data provided there as input data, in order to manipulate or update the memory 170 connected to the shift unit 190.
  • The apparatus is thus capable of using the route data of the movement data interface 180 to update the location data in the memory 170, the location data being shifted locally in accordance with the route data.
  • In this way, the display is updated when the vehicle makes a relative movement with respect to the environment, as described above.
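The effect of the shift unit 190 on the stored location data can be sketched as a coordinate shift opposite to the vehicle's movement. This is a simplified one-dimensional illustration under assumed names, not the patent's implementation:

```python
def shift_location_data(cells, travelled):
    """Sketch of the shift unit 190: shift stored location data by the
    travelled distance so the representation stays aligned with the
    environment. 'cells' maps a longitudinal position in the vehicle
    frame (metres, positive = ahead) to a stored value."""
    return {pos - travelled: value for pos, value in cells.items()}

# After travelling 1.0 m forwards, an object previously stored 2.0 m
# ahead now lies 1.0 m ahead in the vehicle frame.
print(shift_location_data({2.0: "object"}, 1.0))   # -> {1.0: 'object'}
```

Applying this shift on every odometry update keeps the displayed fields registered to the environment rather than to the moving vehicle.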
  • The device can be realized by means of hardwired circuits, by means of a

Landscapes

  • Engineering & Computer Science (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Human Computer Interaction (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Traffic Control Systems (AREA)
PCT/EP2011/069251 2010-12-01 2011-11-02 Verfahren zur darstellung einer mittels sensoren erfassten umgebung und vorrichtung zur darstellung einer von fahrzeuggestützten sensoren erfassten umgebung WO2012072361A1 (de)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201180057537.6A CN103534138B (zh) 2010-12-01 2011-11-02 用于表示由车辆支持的传感器检测的周围环境的方法和设备
EP11790896.2A EP2646286A1 (de) 2010-12-01 2011-11-02 Verfahren zur darstellung einer mittels sensoren erfassten umgebung und vorrichtung zur darstellung einer von fahrzeuggestützten sensoren erfassten umgebung

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102010062254.0 2010-12-01
DE102010062254.0A DE102010062254B4 (de) 2010-12-01 2010-12-01 Verfahren zur Darstellung einer mittels Sensoren erfassten Umgebung und Vorrichtung zur Darstellung einer von fahrzeuggestützten Sensoren erfassten Umgebung

Publications (1)

Publication Number Publication Date
WO2012072361A1 true WO2012072361A1 (de) 2012-06-07

Family

ID=45093699

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2011/069251 WO2012072361A1 (de) 2010-12-01 2011-11-02 Verfahren zur darstellung einer mittels sensoren erfassten umgebung und vorrichtung zur darstellung einer von fahrzeuggestützten sensoren erfassten umgebung

Country Status (4)

Country Link
EP (1) EP2646286A1 (zh)
CN (1) CN103534138B (zh)
DE (1) DE102010062254B4 (zh)
WO (1) WO2012072361A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116714532A (zh) * 2023-08-07 2023-09-08 福建农林大学 一种车用盲区监测传感器

Families Citing this family (5)

Publication number Priority date Publication date Assignee Title
DE102014205014A1 (de) * 2014-03-18 2015-09-24 Ford Global Technologies, Llc Verfahren und Vorrichtung zum Erfassen von bewegten Objekten in der Umgebung eines Fahrzeugs
US10023120B2 (en) * 2016-03-30 2018-07-17 Delphi Technologies, Inc. Multi-purpose camera device for use on a vehicle
DE102018203440A1 (de) * 2018-03-07 2019-09-12 Robert Bosch Gmbh Verfahren und Lokalisierungssystem zum Erstellen oder Aktualisieren einer Umgebungskarte
CN109795464B (zh) * 2018-12-28 2020-04-28 百度在线网络技术(北京)有限公司 制动方法、装置和存储介质
CN116495004A (zh) * 2023-06-28 2023-07-28 杭州鸿泉物联网技术股份有限公司 车辆环境感知方法、装置、电子设备和存储介质

Citations (5)

Publication number Priority date Publication date Assignee Title
JPH11110697A (ja) * 1997-09-30 1999-04-23 Mazda Motor Corp 車両用障害物検出装置
US20040148063A1 (en) * 2001-03-07 2004-07-29 11138037 Ontari Ltd. ("Alirt") Detecting device and method of using same
EP1803624A1 (en) * 2005-12-29 2007-07-04 Delphi Technologies, Inc. Apparatus and method for thermal side detection in a vehicle
US20070182527A1 (en) * 2006-01-31 2007-08-09 Steve Traylor Collision avoidance display system for vehicles
DE102007011539A1 (de) * 2007-03-09 2008-09-11 Volkswagen Ag Fahrzeug mit einer Monitoreinheit und Verfahren zum Betreiben einer fahrzeugeigenen Monitoreinheit

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
FR2806998B1 (fr) * 2000-03-31 2002-05-17 Intellitech Intelligent Techno Procede et dispositif pour la prise en charge du parcage en creneau de vehicules motorises
WO2005107261A1 (ja) * 2004-04-27 2005-11-10 Matsushita Electric Industrial Co., Ltd. 車両周囲表示装置
JP4321543B2 (ja) * 2006-04-12 2009-08-26 トヨタ自動車株式会社 車両周辺監視装置
DE102008060684B4 (de) * 2008-03-28 2019-05-23 Volkswagen Ag Verfahren und Vorrichtung zum automatischen Einparken eines Kraftfahrzeugs


Non-Patent Citations (1)

Title
See also references of EP2646286A1 *

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN116714532A (zh) * 2023-08-07 2023-09-08 福建农林大学 一种车用盲区监测传感器
CN116714532B (zh) * 2023-08-07 2023-10-20 福建农林大学 一种车用盲区监测传感器

Also Published As

Publication number Publication date
DE102010062254A1 (de) 2012-06-06
EP2646286A1 (de) 2013-10-09
CN103534138B (zh) 2016-09-14
CN103534138A (zh) 2014-01-22
DE102010062254B4 (de) 2024-05-02

Similar Documents

Publication Publication Date Title
DE102010035232B4 (de) Verfahren und System zum Anzeigen von Fahrzeugrückfahrkamerabildern in verschiedenen Modi
EP2646286A1 (de) Verfahren zur darstellung einer mittels sensoren erfassten umgebung und vorrichtung zur darstellung einer von fahrzeuggestützten sensoren erfassten umgebung
WO2009135538A1 (de) Fahrerassistenzverfahren zum bewegen eines kraftfahrzeugs und fahrerassistenzvorrichtung
EP3437929A1 (de) Sichtsystem mit fahrsituationsbedingter sichtfeld-/sichtbereicheinblendung
DE102008028303A1 (de) Anzeigesystem und Programm
DE102007011180A1 (de) Rangierhilfe und Verfahren für Fahrer von Fahrzeugen bzw. Fahrzeuggespannen, welche aus gegeneinander knickbare Fahrzeugelementen bestehen
EP1792776B1 (de) Verfahren zur Detektion und Anzeige einer Parklücke und Einparkassistenzsystem eines Kraftfahrzeugs
EP3227712B1 (de) Verfahren zum konfigurieren wenigstens eines an einer von mehreren einbaupositionen in einem kraftfahrzeug verbauten radarsensors hinsichtlich der einbauposition und kraftfahrzeug
DE102010034853A1 (de) Kraftfahrzeug mit Digitalprojektoren
EP3413085A1 (de) Verfahren zum unterstützen eines fahrers eines gespanns beim manövrieren mit dem gespann, totwinkelsystem sowie gespann
DE102010032411A1 (de) Vorrichtung zur Überwachung der seitlichen und rückwärtigen Umgebung eines Fahrzeugs
DE102009046726A1 (de) Auswahl einer Parklücke aus mehreren erkannten Parklücken
WO2014032664A1 (de) Verfahren zur bestimmung eines fahrspurverlaufs für ein fahrzeug
EP1874611A1 (de) Verfahren und vorrichtung zur auswertung von abstandsmessdaten eines abstandsmesssystems eines kraftfahrzeugs
EP1927017B1 (de) Parksystem für kraftfahrzeuge
EP4102330B1 (de) Verfahren zum bewegen eines fahrzeugs an eine komponente eines hierzu beabstandeten objekts (koordinatentransformation)
WO2020074406A1 (de) Verfahren zur optischen begleitung eines ausparkmanövers eines kraftfahrzeugs
EP0936472B1 (de) Verfahren und Einrichtung zur Prüfung der Funktionsweise einer Abstandsregeleinrichtung eines Kraftfahrzeuges
DE102020004215A1 (de) Verfahren zur Darstellung einer Verkehrssituation in einem Fahrzeug
DE102005059415B4 (de) Spurwechselassistenzsystem für ein Fahrzeug
DE102022116457A1 (de) Anzeigesteuervorrichtung, anzeigeverfahren und speichermedium
EP3032517A1 (de) Vorrichtung und verfahren zur unterstützung eines fahrers eines fahrzeugs, insbesondere eines nutzfahrzeugs
DE102021103680A1 (de) Verfahren zum betreiben eines parkassistenzsystems, computerprogrammprodukt, parkassistenzsystem und fahrzeug mit einem parkassistenzsystem
DE102022130172B4 (de) Verfahren und Fahrerassistenzsystem zur Unterstützung eines Fahrers beim Fahren in einem Proximitätsbereich einer Trajektorie
DE102022129805B3 (de) Verfahren zum Darstellen von symbolischen Fahrbahnmarkierungen auf einer Anzeigeeinheit eines Fahrzeugs sowie Anzeigesystem für ein Fahrzeug

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11790896

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2011790896

Country of ref document: EP