EP2637898A1 - Verfahren zum erzeugen eines bilds einer fahrzeugumgebung und abbildungsvorrichtung - Google Patents

Verfahren zum erzeugen eines bilds einer fahrzeugumgebung und abbildungsvorrichtung

Info

Publication number
EP2637898A1
EP2637898A1
Authority
EP
European Patent Office
Prior art keywords
image
vehicle
camera
parking
driver
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP11785353.1A
Other languages
German (de)
English (en)
French (fr)
Inventor
Vsevolod Vovkushevsky
Jens Bammert
Tobias Geiger
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Valeo Schalter und Sensoren GmbH
Original Assignee
Valeo Schalter und Sensoren GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Valeo Schalter und Sensoren GmbH filed Critical Valeo Schalter und Sensoren GmbH
Publication of EP2637898A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/27Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/301Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing combining image information with other obstacle sensor information, e.g. using RADAR/LIDAR/SONAR sensors for estimating risk of collision
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/303Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using joined images, e.g. multiple camera images
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/304Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images
    • B60R2300/305Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images merging camera image with lines or icons
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/60Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
    • B60R2300/607Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective from a bird's eye viewpoint
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/806Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for aiding parking

Definitions

  • The present invention relates to a method for generating an image of a vehicle environment of a vehicle with a camera by recording a first partial image in a first position or orientation of the camera and recording at least one second partial image in at least one second position or orientation of the camera.
  • The present invention further relates to an imaging device for generating an image of a vehicle environment of a vehicle with a corresponding camera.
  • Vehicles often have driver assistance systems to assist the driver in operating the vehicle.
  • Such assistance systems include, for example, a parking assistance system, an ACC system, a lane keeping system, a lane change system, a high-beam assistant and the like.
  • Cameras built into the vehicle provide the driver with information about the surroundings of the vehicle.
  • Either the current image of one camera or the current images of several cameras are displayed, individually or in combination, or the individual images are processed into a virtual bird's-eye view image.
  • Systems are also known which scan the surroundings of the vehicle with various sensors (usually ultrasound-based) and calculate the distances between the vehicle and obstacles, which are then used to warn the driver.
  • Systems are also known which calculate a trajectory into a recognized parking space and can maneuver the vehicle into it by controlling the steering semi-automatically. If such systems operate separately, it may be difficult for the driver to associate a warning or event with the camera image. In addition, it is not always clear from the image which obstacle is critical and which is not.
  • A known navigation device links image data acquired by an image data acquisition unit to the position at which the image data were recorded and stores them as previous image data in an image memory.
  • A control unit further determines whether the vehicle has entered a parking target area. If it is determined that the vehicle is in the parking target area, an earlier image based on the previous image data of that area and a current image based on the image data obtained at the current vehicle position are shown on a display.
  • US Pat. No. 7,212,653 B2 discloses an image processing system for a vehicle for taking pictures of the surroundings of the vehicle. The images are stored, and a processed image, obtained by processing a previous image taken before reaching the current position, is superimposed on a portion of the current image that cannot be captured by the camera.
  • A further known system has an imaging part for recording, with a single camera during the transition of the vehicle from a first to a second location, first and second images around the vehicle at a first and a second location and at a first and a second time.
  • The system has an object distance calculating part for calculating the distance from the vehicle to a 3D object using the position of the 3D object in each of the first and second images and the transition data of the vehicle.
  • An image generator is provided for generating a third image based on the images and the acquired data.
  • The object of the present invention is to obtain more meaningful images of a vehicle environment and thereby to better assist the driver.
  • According to the invention, in the method for generating an image of a vehicle environment of a vehicle with a camera, each partial image corresponds to the entire detection range of the camera, and the first partial image and the at least one second partial image are assembled into the image of the vehicle environment so that the image shows a larger area than each partial image.
  • The imaging device according to the invention correspondingly comprises a camera for capturing a first partial image in a first position or orientation and for capturing at least one second partial image in at least one second position or orientation of the camera, wherein each partial image corresponds to the entire detection range of the camera, and further comprises an image synthesis device with which the first partial image and the second partial image can be composed into the image of the vehicle environment so that the image shows a larger area than each partial image.
  • The driver can thereby better survey the situation in the vehicle environment.
  • The generated image may have a virtual viewpoint above the vehicle, thus providing a bird's-eye view image. Such an image further makes it easier for the driver to assess the current situation.
  • Preferably, the image shows a parking space in at least one of the partial images.
  • The driver is presented, in a normal mode, with one or more of the partial images and, in an extended mode when the parking space is recognized as free, with the entire composite image.
  • the parking space can be measured by means of an ultrasonic sensor.
  • Ultrasonic sensors are robust and reliable. They can capture the boundaries of a parking space with sufficient accuracy for measurement or visual representation.
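Such an ultrasonic survey of a parking space can be sketched in a few lines; this is a minimal illustration with hypothetical names, not the patent's implementation: while driving past, the lateral echo distance is sampled against the odometer, and the free gap is the stretch where the echo exceeds the range to the parked vehicles.

```python
def parking_space_length(samples, min_gap_depth_m=1.5):
    """Estimate the length of a parking gap from ultrasonic samples.

    samples: list of (odometer_m, lateral_distance_m) pairs recorded while
    driving past. Positions where the echo is farther away than
    min_gap_depth_m (an illustrative threshold) are treated as free space.
    """
    free_positions = [s for s, d in samples if d > min_gap_depth_m]
    if not free_positions:
        return 0.0
    # length of the free stretch along the direction of travel
    return max(free_positions) - min(free_positions)
```

A gap longer than the vehicle length plus a maneuvering margin would then be reported to the driver as a free parking space.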
  • Preferably, an artificial picture element representing a dimension of the parking space, an obstacle or a calculated parking path is superimposed on the image.
  • Obstacles that are outside the field of view of the camera system can thus also be displayed. With this information, the driver can guide the vehicle much more safely.
  • A picture element representing an automatically proposed parking position can be faded into the image, and the parking position can be changed by manually shifting the picture element.
  • The driver can thus better assess and track a parking process of the vehicle carried out semi-automatically or fully automatically.
  • The image may represent multiple parking spaces in the vehicle environment, and the driver may select one of them for parking.
  • The image is then used not only to visualize a parking operation into a given parking space, but also to select one of several parking spaces in a meaningful way.
  • The imaging device can have a plurality of cameras. The images taken in parallel with the cameras can then be used, together with images taken earlier by the cameras, to produce an overall image. It may be sufficient if each individual camera has a detection range corresponding to a radius of less than 5 m. This is usually sufficient to register the critical environment of a vehicle, in particular for a parking operation.
  • a vehicle equipped with such a driver assistance system can be guided by a driver very comfortably.
  • Fig. 1 is a bird's-eye view of a parking situation with parallel parking spaces;
  • Fig. 2 is a block diagram of a control unit with connected peripheral devices;
  • Fig. 3 is a principle block diagram of the control unit of Fig. 2;
  • Fig. 5 is a map of the environment in a schematic representation; and
  • Fig. 6 is a map view as generated by an imaging device according to the invention.
  • In embodiments of the present invention, in contrast to the known systems for generating images of a vehicle environment, the partial images obtained consecutively in different vehicle positions or in different positions or orientations of the camera are put together such that an image of the vehicle environment can be displayed which covers a larger area than the detection range of a single camera.
  • The camera system can be used in particular to take multiple images successively while measuring a parking space, to store them, and then, using the stored images and the travel path determined by means of displacement sensors (wheel sensors on the vehicle, possibly a steering angle sensor), to generate an expanded environment image that covers the parking space and the obstacles limiting it.
  • Partially overlapping images can thus be combined into a new, larger image that merges the image information of the individual images.
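The assembly of successively recorded partial images can be sketched as follows. This is a deliberately simplified illustration under stated assumptions (not the patent's algorithm): grayscale tiles are nested lists, the odometry offsets are already expressed in a common ground-plane frame with non-negative components, and later tiles simply overwrite the overlap.

```python
def stitch(tiles, tile_h, tile_w, px_per_m=20):
    """Compose partial images into one larger image.

    tiles: list of (tile, dx_m, dy_m) where tile is a list of pixel rows and
    (dx_m, dy_m) is the vehicle displacement from odometry (assumed >= 0).
    """
    # convert metric displacements into pixel offsets on the canvas
    offs = [(round(dy * px_per_m), round(dx * px_per_m)) for _, dx, dy in tiles]
    height = max(r for r, _ in offs) + tile_h
    width = max(c for _, c in offs) + tile_w
    canvas = [[0] * width for _ in range(height)]
    for (tile, _, _), (top, left) in zip(tiles, offs):
        for r, row in enumerate(tile):
            for c, px in enumerate(row):
                canvas[top + r][left + c] = px  # overwrite in overlap regions
    return canvas
```

A real system would additionally rotate each tile by the vehicle heading and blend rather than overwrite the overlap, but the canvas-placement principle is the same.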
  • the general approach is to combine the display and environmental sensing functions.
  • the resulting HMI output is particularly understandable for the driver.
  • The environment of the vehicle is scanned by means of suitable sensors and an image of the environment is stored in an environment map (see Fig. 6).
  • the information stored in the area map is further processed so that individual obstacles can be identified and classified.
  • The risk of collision is also determined and, if necessary, values for a distance-to-collision and a time-to-collision are calculated.
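The distance-to-collision and time-to-collision values mentioned here reduce to simple kinematics under a constant-closing-speed assumption; a minimal sketch (function names and the safety margin are illustrative, not from the patent):

```python
def time_to_collision(distance_m, closing_speed_mps):
    """Seconds until impact at constant closing speed; None if not closing."""
    if closing_speed_mps <= 0.0:
        return None  # object is stationary relative to us or receding
    return distance_m / closing_speed_mps

def distance_to_collision(distance_m, safety_margin_m=0.3):
    """Remaining free travel distance after subtracting a safety margin."""
    return max(distance_m - safety_margin_m, 0.0)
```

A warning threshold on either value (e.g. TTC below a couple of seconds) then triggers the acoustic or visual driver warning.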
  • the information obtained using the environment map is used when generating HMI images.
  • the detected obstacles are displayed directly at the right places in the image (obstacle highlighting).
  • the dangerous obstacles are highlighted by a special presentation.
  • distances can also be displayed as bars or numeric values.
  • the predicted or planned trajectory, for example, of a semi-automatic parking process of the vehicle is also displayed. This type of presentation can be used both for images generated from a natural perspective (live view) and for images created as a bird's-eye view.
  • Obstacles 3, 4, 5, the target position 6 in the parking space 2 and the currently planned trajectory are displayed.
  • Such a representation can also be used for an interactive correction of, for example, the target position by the driver.
  • a corresponding input means is then provided to move the picture element, which is shown as a superposition element (target position 6) in the overall picture.
  • After such a shift, the system recalculates the parking trajectory accordingly.
  • the camera or the cameras of the vehicle 1 take a plurality of partial images, from which a current image 7 (bird's-eye view in FIG. 1 or enlarged live-view image) is generated.
  • Image 7 is thus the result of a combination of images taken at partially different times, each of these images individually covering only a part of image 7, possibly with overlap.
  • The generated whole image can be expanded with virtual objects that have not yet been seen by the camera but have, e.g., already been detected by ultrasound.
  • A camera image 7, possibly composed of several partial images, is available up to half of the parking space 2.
  • a curb 4 or a wall could be detected.
  • The curb 4 may then already be displayed beyond the camera image 7 in the image presented to the driver. If necessary, the curb 4 is even displayed behind the parked vehicles 3 and 5 in the image if it was registered by other sensors or interpolated computationally. All these aspects help to better connect the camera image with reality.
  • Ideally, the camera image 7 extends to the limit of the detection range of the camera system. In the example of Fig. 1, however, the camera image 7 does not extend to the curb 4.
  • the area between the border of the camera image 7 and the curb 4 can be shown in color for better orientation of the driver.
  • the parking space 2 or the target position 6 could be marked green, which conveys to the driver that the parking space is sufficiently large.
  • The area from the detection limit of the camera system to the curb could be grayed out. In particular, it can be shown in the color of, for example, the roadway or of the optically detected part of the parking space.
  • The curb 4 outside the detection range of the camera system is detected by the ultrasonic sensors in position and size. From these measured quantities, a virtual object and its position are calculated true to scale in the enlarged image displayed to the driver.
  • the extended image is thus composed of a camera image (possibly several sub-images) and the virtual objects, with intervening areas possibly being suitably colored.
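Placing such a virtual object "true to scale" amounts to mapping its measured world position into bird's-eye image pixels. A minimal sketch, assuming a vehicle-centred top view with a fixed pixels-per-metre scale (all names and default values are illustrative):

```python
def world_to_birdseye(obj_x_m, obj_y_m, px_per_m=20, img_w=400, img_h=300):
    """Map a position in metres (vehicle at the origin, x to the right,
    y forward) to pixel coordinates in a top-view image whose centre is
    the vehicle. Returns (u, v, inside_image)."""
    u = round(img_w / 2 + obj_x_m * px_per_m)   # right of the vehicle
    v = round(img_h / 2 - obj_y_m * px_per_m)   # forward is up in the image
    inside = 0 <= u < img_w and 0 <= v < img_h
    return u, v, inside
```

An ultrasonically measured curb would be drawn at these coordinates even when it lies outside the camera's own image, keeping camera pixels and virtual objects on one consistent scale.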
  • the "camera image" can be calculated from one or more live-view images.
  • This overlay is always dynamically adapted to the current image representation and the current position of the vehicle.
  • The camera images are processed so that objects can be recognized. In doing so, the information from the environment map is used for the calculation of regions of interest.
  • Another display possibility relates to fading the information about the environment into the viewing devices used by the driver, taking into account the respective perspective.
  • Such viewing devices include mirrors (side mirrors, rear-view mirrors), head-up displays on the front and rear windows, or a hologram presentation unit which places the static or dynamic recognized objects in the driver's field of view.
  • If the driver moves in the direction of an obstacle being warned about, he can recognize there the real, marked object or obstacle, and he understands the warning during the parking operation or driving movement.
  • Particularly advantageous is a representation in combination with the electronic, camera-based mirrors, in which the image is generated purely electronically and suggests a mirror image.
  • the map information can be blended into all the mirror images so that the driver, no matter where he looks, can always perceive the system view of the obstacles, trajectory, and so forth.
  • Fig. 2 shows a control unit 10 which is networked with several peripheral devices 11 to 16.
  • This control unit 10 is capable of producing an image according to the present invention. For this purpose it receives corresponding image information or partial images from a camera 11 or from several cameras.
  • If necessary, the control unit 10 can drive the camera 11 to capture images.
  • Optionally, the orientation of the camera 11 can be varied via the control unit 10.
  • The control unit 10 has ultrasonic sensors 12 as further suppliers of input information. These provide, for example, distance data on detected obstacles.
  • An electric power steering is connected to the control unit 10, for example via a CAN bus. It delivers the current steering angle to the control unit 10.
  • An ESP 14 (Electronic Stability Program) is likewise connected to the control unit 10 via the CAN bus.
  • the ESP 14 provides motion data to the controller 10. This includes, for example, the wheel speed, the roll direction, the yaw rate, the acceleration, and the like.
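From such motion data (wheel speeds, steering angle) the travel path used for assembling the partial images can be dead-reckoned. A sketch using a simple bicycle model under the usual flat-ground, no-slip assumptions; the function name and wheelbase value are illustrative, not from the patent:

```python
import math

def dead_reckon(x, y, heading, distance_m, steering_rad, wheelbase_m=2.7):
    """Advance the vehicle pose (x, y, heading) by distance_m travelled
    along the arc defined by a constant front-wheel steering angle."""
    if abs(steering_rad) < 1e-9:  # straight-line motion
        return (x + distance_m * math.cos(heading),
                y + distance_m * math.sin(heading),
                heading)
    radius = wheelbase_m / math.tan(steering_rad)  # turn radius of rear axle
    dtheta = distance_m / radius                   # heading change over the arc
    # rotate the pose around the instantaneous centre of the turn
    cx = x - radius * math.sin(heading)
    cy = y + radius * math.cos(heading)
    new_heading = heading + dtheta
    return (cx + radius * math.sin(new_heading),
            cy - radius * math.cos(new_heading),
            new_heading)
```

Integrating this per wheel-tick gives the displacement between the recording positions of two partial images.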
  • The control unit 10 is here connected to a head unit 15.
  • Instead of the head unit 15, any other optical output device (e.g. a screen) can be connected to the control unit 10. Several optical display devices can also be controlled by the control unit 10.
  • A loudspeaker 16 or several loudspeakers serve as an acoustic output device.
  • The structure and the data flows of a system implementing the solution according to the invention will now be explained with reference to Fig. 3.
  • The camera 11 is embedded in a video processing unit 20. It supplies raw video data 21 to a video control unit 22. As indicated by the rectangle on the right edge of the box of the video control unit 22, this unit runs a continuous processing loop.
  • Furthermore, an ultrasonic processing 30 is provided. It comprises the ultrasonic sensor 12, which supplies raw data 31 to a signal filter 32. The filtered sensor data 33 are supplied to an obstacle detection unit 34.
  • The obstacle detection unit 34 continuously performs a data analysis in a processing loop to detect obstacles.
  • A position detection unit 40 (e.g. odometry, ESP, EPS, etc.) constantly supplies current position data 41 to a memory unit 42. From this memory unit 42, the video control unit 22 and the obstacle detection unit 34 retrieve the current position on request 43.
  • The current position is also made available to a memory unit 44 in which the local map of the immediate vicinity of the vehicle (see Fig. 6) is stored.
  • the surrounding area refers to a radius of 30 m around the vehicle.
  • an object tracking 45 is constantly performed. In this case, it is calculated, for example, in which direction an object moves relative to the vehicle, so that it can also be displayed when it is outside the detection range of the camera 11 or the ultrasonic sensors 12.
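The object tracking 45 can be sketched as a constant-velocity track per obstacle; class and method names here are illustrative, not from the patent:

```python
class TrackedObject:
    """Constant-velocity track of an obstacle in vehicle-relative coordinates."""

    def __init__(self, x, y):
        self.x, self.y = x, y
        self.vx = self.vy = 0.0

    def update(self, x, y, dt):
        # estimate relative velocity from two consecutive measurements
        self.vx = (x - self.x) / dt
        self.vy = (y - self.y) / dt
        self.x, self.y = x, y

    def predict(self, dt):
        # extrapolated position, usable for display even once the object
        # has left the detection range of the camera or ultrasonic sensors
        return self.x + self.vx * dt, self.y + self.vy * dt
```

A production tracker would smooth the velocity estimate (e.g. with a Kalman filter), but the extrapolation that keeps out-of-range objects displayable is exactly this prediction step.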
  • A function logic 50 receives, on request 51, map data or obstacle information from the memory unit 44 for an ultrasonic parking assistant 53, which also computes the desired distances.
  • In a continuous processing loop, the parking assistant constantly checks and calculates the distances and, for example, issues corresponding warnings 55 to the driver via a loudspeaker.
  • Segment data 56 of the ultrasonic parking assistant 53 are stored in a memory unit 57 of the function logic 50. This data relates, for example, to bars that are to be superimposed on an image.
  • A display control unit 58 of the function logic 50 receives video data 25 processed by the video control unit 22. Further, the display control unit 58 obtains map data or obstacle information 46 from the memory unit 44 on request 59, and reads the ultrasonic segment data from the memory unit 57.
  • In the display control unit 58, a fitting and generating process 61 for overlay elements runs constantly.
  • the display control unit 58 makes an output 62, for example to a screen or a head unit, each of which forms an optical interface to the driver.
  • An interaction 63 takes place between the driver and the display control unit 58.
  • The driver can, for example, change the display or move an element in the image (for example, shift the parking position marking).
  • FIG. 4 shows a live view representation, which may be composed of several sub-images.
  • A roadway 70 can be recognized, on the right side of which parking spaces 71 are located.
  • In one of the parking spaces 71 there are detected obstacles 72, 73.
  • Although the obstacle 73 is not in the detection area of the camera, it is displayed as a virtual object because it was detected in a previous image or by another sensor. It was carried over into the current presentation.
  • Furthermore, the system has detected a wall 74 as an obstacle. Between the two vehicles, i.e. the obstacles 72 and 73, there is a parking space 75, which the system has detected and overlays with a rectangular overlay element, in correct perspective, in the live-view image.
  • the system computes a target position 76, which is also marked by a rectangular overlay element (overlay) in the live view image. If necessary, a trajectory for a parking process is displayed in the image.
  • the image shown in Fig. 4 may be composed of a plurality of partial images obtained in succession in different vehicle positions.
  • the image can also be enlarged or widened on the right-hand edge so that the obstacle 73 can then also be seen in order to facilitate orientation for the driver.
  • FIG. 5 first shows a real situation in a schematic representation.
  • a vehicle 80 parks in accordance with a trajectory 81 between two vehicles 82 and 83.
  • Numerous other obstacles are located in the parking area, such as a wall 84, a motorcycle 85, other vehicles 86 and garbage cans 87.
  • The imaging device of the vehicle 80 registers these obstacles and creates a map from them, as shown in Fig. 6.
  • Each obstacle is symbolized by a block 82' to 87', i.e. a virtual object.
  • For example, the motorcycle 85 is symbolized by the block 85'.
  • The image presented to the driver is based on this map view (Fig. 6).
  • Conventional camera systems comprise at least one reversing camera and side cameras (e.g. in the exterior mirrors or corner regions of the bumpers). These camera systems have a detection range of approx. 2 m to 4 m around the vehicle. This detection range is usually not sufficient to represent the parking space over its entire extent along the guideway.
  • The camera images are therefore combined in the present example into a total image from a virtual viewpoint above the vehicle (bird's-eye view).
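Such a bird's-eye transform rests on a flat-ground assumption: each image row of a pitched-down camera maps to a ground distance. A minimal sketch of that geometry using a pinhole model; all calibration values (height, pitch, focal length, principal point) are illustrative, not from the patent:

```python
import math

def row_to_ground_distance(v, cam_height_m=1.0, pitch_rad=math.radians(30),
                           fy_px=800.0, cy_px=240.0):
    """Ground distance imaged at pixel row v for a camera mounted
    cam_height_m above a flat road and pitched down by pitch_rad."""
    ray = pitch_rad + math.atan((v - cy_px) / fy_px)  # angle below horizontal
    if ray <= 0.0:
        return float("inf")  # ray points at or above the horizon
    return cam_height_m / math.tan(ray)
```

Sampling this mapping per row (and an analogous lateral mapping per column) yields the top-view resampling with which the individual camera images can be merged into one bird's-eye image.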
  • the successively recorded images can be used, in particular, as shown here, to measure a parking space.
  • The recorded images are stored and used to generate an expanded environment image on the basis of the travel path determined by means of displacement sensors (wheel sensors on the vehicle, optionally a steering angle sensor). This image represents the parking space and the obstacles limiting it, for example as shown in Fig. 6.
  • The camera system according to the invention in particular has a normal mode and an extended mode.
  • the parking process can then be semi-automatic (steering only) or fully automated (steering and braking).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Processing (AREA)
EP11785353.1A 2010-11-12 2011-10-26 Verfahren zum erzeugen eines bilds einer fahrzeugumgebung und abbildungsvorrichtung Withdrawn EP2637898A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102010051206A DE102010051206A1 (de) 2010-11-12 2010-11-12 Verfahren zum Erzeugen eines Bilds einer Fahrzeugumgebung und Abbildungsvorrichtung
PCT/EP2011/068728 WO2012062573A1 (de) 2010-11-12 2011-10-26 Verfahren zum erzeugen eines bilds einer fahrzeugumgebung und abbildungsvorrichtung

Publications (1)

Publication Number Publication Date
EP2637898A1 true EP2637898A1 (de) 2013-09-18

Family

ID=45001710

Family Applications (1)

Application Number Title Priority Date Filing Date
EP11785353.1A Withdrawn EP2637898A1 (de) 2010-11-12 2011-10-26 Verfahren zum erzeugen eines bilds einer fahrzeugumgebung und abbildungsvorrichtung

Country Status (5)

Country Link
US (1) US9544549B2 (zh)
EP (1) EP2637898A1 (zh)
CN (1) CN103328261A (zh)
DE (1) DE102010051206A1 (zh)
WO (1) WO2012062573A1 (zh)

Families Citing this family (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2447672B (en) 2007-03-21 2011-12-14 Ford Global Tech Llc Vehicle manoeuvring aids
US9926008B2 (en) 2011-04-19 2018-03-27 Ford Global Technologies, Llc Trailer backup assist system with waypoint selection
US9969428B2 (en) * 2011-04-19 2018-05-15 Ford Global Technologies, Llc Trailer backup assist system with waypoint selection
US9854209B2 (en) 2011-04-19 2017-12-26 Ford Global Technologies, Llc Display system utilizing vehicle and trailer dynamics
US9723274B2 (en) 2011-04-19 2017-08-01 Ford Global Technologies, Llc System and method for adjusting an image capture setting
DE102012023706A1 (de) * 2012-12-05 2014-06-05 Daimler Ag Fahrzeugseitiges Verfahren und fahrzeugseitige Vorrichtung zum Erfassen und Anzeigen von Parklücken für ein Fahrzeug
KR20150022436A (ko) * 2013-08-23 2015-03-04 주식회사 만도 주차 제어 장치, 방법 및 시스템
DE102013019374B4 (de) * 2013-11-19 2022-09-08 Audi Ag Verfahren zum Betrieb eines zur vollständig automatisierten Führung eines Kraftfahrzeugs ausgebildeten Fahrzeugsystems und Kraftfahrzeug
KR101478135B1 (ko) * 2013-12-02 2014-12-31 현대모비스(주) 프로젝션 유닛을 이용한 증강현실 차선변경 보조 시스템
US11617623B2 (en) * 2014-01-24 2023-04-04 Koninklijke Philips N.V. Virtual image with optical shape sensing device perspective
JP6096155B2 (ja) * 2014-09-12 2017-03-15 アイシン精機株式会社 運転支援装置及び運転支援システム
KR20160065549A (ko) * 2014-12-01 2016-06-09 현대모비스 주식회사 차량의 자동 출차 제어장치 및 그 방법
WO2016109829A1 (en) * 2014-12-31 2016-07-07 Robert Bosch Gmbh Autonomous maneuver notification for autonomous vehicles
US9592826B2 (en) * 2015-02-13 2017-03-14 Ford Global Technologies, Llc System and method for parallel parking a vehicle
DE102015205377A1 (de) * 2015-03-25 2016-09-29 Volkswagen Aktiengesellschaft Ansicht einer Umgebung eines Fahrzeugs
US9751558B2 (en) 2015-03-25 2017-09-05 Ford Global Technologies, Llc Handwheel obstruction detection and inertia compensation
WO2016168786A1 (en) 2015-04-17 2016-10-20 Tulip Interfaces, Inc. Augmented interface authoring
JP6562709B2 (ja) * 2015-05-14 2019-08-21 株式会社デンソーテン 駐車支援装置および駐車支援方法
DE102015212370B4 (de) 2015-07-02 2023-02-23 Robert Bosch Gmbh Verfahren und Vorrichtung zum Erzeugen einer Darstellung einer Fahrzeugumgebung eines Fahrzeuges
KR101795151B1 (ko) * 2015-10-05 2017-12-01 현대자동차주식회사 주차안내 장치 및 방법
US9981656B2 (en) 2015-10-13 2018-05-29 Ford Global Technologies, Llc Vehicle parking assist system
WO2017068701A1 (ja) * 2015-10-22 2017-04-27 日産自動車株式会社 駐車スペース検出方法および装置
US10366611B2 (en) * 2015-10-22 2019-07-30 Nissan Motor Co., Ltd. Parking support information display method and parking support device
US9783112B2 (en) * 2015-10-27 2017-10-10 Cnh Industrial America Llc Rear windshield implement status heads-up display
US9836060B2 (en) 2015-10-28 2017-12-05 Ford Global Technologies, Llc Trailer backup assist system with target management
US10328933B2 (en) 2015-10-29 2019-06-25 Ford Global Technologies, Llc Cognitive reverse speed limiting
KR101892026B1 (ko) 2015-11-10 2018-08-27 현대자동차주식회사 차량의 원격 주차 제어 방법 및 장치
US10606257B2 (en) * 2015-11-10 2020-03-31 Hyundai Motor Company Automatic parking system and automatic parking method
DE102015223175A1 (de) * 2015-11-24 2017-05-24 Conti Temic Microelectronic Gmbh Fahrerassistenzsystem mit adaptiver Umgebungsbilddatenverarbeitung
US9895945B2 (en) 2015-12-08 2018-02-20 Ford Global Technologies, Llc Trailer backup assist system with hitch assist
JP6545108B2 (ja) * 2016-01-14 2019-07-17 Alpine Electronics, Inc. Parking assistance device and parking assistance method
US9987983B2 (en) 2016-02-11 2018-06-05 GM Global Technology Operations LLC Parking assist system for a vehicle and method of using the same
EP3229172A1 (en) * 2016-04-04 2017-10-11 Conti Temic microelectronic GmbH Driver assistance system with variable image resolution
DE102016208369A1 (de) * 2016-05-17 2017-12-07 Bayerische Motoren Werke Aktiengesellschaft Method for determining data representing a part of the surroundings below the vehicle
JP6642306B2 (ja) * 2016-06-29 2020-02-05 Aisin Seiki Co., Ltd. Surroundings monitoring device
KR101916515B1 (ko) * 2016-07-20 2018-11-07 Hyundai Motor Company Method for guiding a parking mode in a remote fully automatic parking assistance system
JP6743593B2 (ja) * 2016-08-31 2020-08-19 Aisin Seiki Co., Ltd. Parking assistance device
US9829883B1 (en) 2016-10-17 2017-11-28 Ford Global Technologies, Llc Trailer backup assist system having remote control and user sight management
US10501112B2 (en) * 2016-11-14 2019-12-10 Ford Global Technologies, Llc Steering system with active compensation for road disturbances
US10325391B2 (en) * 2016-11-21 2019-06-18 Qualcomm Incorporated Oriented image stitching for spherical image content
DE102017200160A1 (de) * 2017-01-09 2018-07-12 Robert Bosch Gmbh Method and device for monitoring a parked motor vehicle
DE102017218921B4 (de) * 2017-10-24 2024-05-23 Bayerische Motoren Werke Aktiengesellschaft Method, device, computer program and computer program product for operating a display unit of a vehicle
DE102018214875A1 (de) 2018-08-31 2020-03-05 Audi Ag Method and arrangement for generating a representation of the surroundings of a vehicle, and vehicle having such an arrangement
DE102018214874B3 (de) 2018-08-31 2019-12-19 Audi Ag Method and arrangement for generating an environment map of a vehicle textured with image information, and vehicle comprising such an arrangement
DE102018214915A1 (de) * 2018-09-03 2020-03-05 Continental Teves Ag & Co. Ohg System for detecting a parking space in the direction of travel
JP7099914B2 (ja) * 2018-09-07 2022-07-12 Denso Corporation Display control device for an electronic mirror and electronic mirror system comprising the same
US10589677B1 (en) * 2018-10-11 2020-03-17 GM Global Technology Operations LLC System and method to exhibit information after a pedestrian crash incident
DE102018128634A1 (de) * 2018-11-15 2020-05-20 Valeo Schalter Und Sensoren Gmbh Method for providing visual information about at least a part of an environment, computer program product, mobile communication device and communication system
DE102019205707A1 (de) * 2019-04-18 2020-10-22 Volkswagen Ag Method for detecting the accessibility of a charging station for an electric vehicle
CN110239522B (zh) * 2019-05-17 2023-04-14 Zhejiang Geely Holding Group Co., Ltd. Automatic parking method, apparatus and device
DE102019123778A1 (de) * 2019-09-05 2021-03-11 Valeo Schalter Und Sensoren Gmbh Displaying a vehicle environment for moving the vehicle to a target position
DE102019217936A1 (de) * 2019-11-21 2021-05-27 Volkswagen Aktiengesellschaft Method and system for controlling access to a parking position for a vehicle
CN113359692B (zh) * 2020-02-20 2022-11-25 Hangzhou Ezviz Software Co., Ltd. Obstacle avoidance method and movable robot
CN113269998A (zh) * 2021-05-19 2021-08-17 Guangzhou Xiaopeng Motors Technology Co., Ltd. Learning method and device based on the parking function in automated driving
CN113119956B (zh) * 2021-05-19 2023-10-31 Guangzhou Xiaopeng Motors Technology Co., Ltd. Interaction method and device based on automated driving
DE102022203785A1 (de) 2022-04-14 2023-10-19 Volkswagen Aktiengesellschaft Method for calculating a trajectory, calculation unit and motor vehicle
DE102022111724A1 (de) 2022-05-11 2023-11-16 Valeo Schalter Und Sensoren Gmbh Method, computer program product, parking assistance system and vehicle
CN114880064A (zh) * 2022-06-16 2022-08-09 Guangzhou Xiaopeng Motors Technology Co., Ltd. Vehicle display control method, vehicle, and storage medium
CN115042821B (зh) * 2022-08-12 2022-11-04 Xiaomi Automobile Technology Co., Ltd. Vehicle control method and device, vehicle, and storage medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102006051977A1 (de) * 2005-11-04 2007-05-10 Denso Corp., Kariya Parking assistance system

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19816054B4 (de) * 1997-04-18 2007-02-22 Volkswagen Ag Camera system for monitoring an area around a vehicle that is not directly visible
US6483429B1 (en) 1999-10-21 2002-11-19 Matsushita Electric Industrial Co., Ltd. Parking assistance system
EP1465135A1 (en) * 2000-04-05 2004-10-06 Matsushita Electric Industrial Co., Ltd. Driving operation assisting method and system
DE10045616B4 (de) * 2000-09-15 2011-03-17 Volkswagen Ag Method for automatically parking a motor vehicle
JP4156214B2 (ja) * 2001-06-13 2008-09-24 Denso Corporation Vehicle periphery image processing device and recording medium
US7212653B2 (en) 2001-12-12 2007-05-01 Kabushikikaisha Equos Research Image processing system for vehicle
JP4195966B2 (ja) * 2002-03-05 2008-12-17 Panasonic Corporation Image display control device
DE102004018205A1 (de) * 2004-04-15 2005-11-10 Robert Bosch Gmbh Device for assisting a parking maneuver of a vehicle
JP3898709B2 (ja) * 2004-05-19 2007-03-28 Honda Motor Co., Ltd. Lane marking recognition device for vehicles
DE102004050795A1 (de) * 2004-10-19 2006-04-20 Robert Bosch Gmbh Method for assisting a parking maneuver of a vehicle
JP2006268076A (ja) 2005-03-22 2006-10-05 Sanyo Electric Co Ltd Driving support system
JP4561479B2 (ja) 2005-05-26 2010-10-13 Aisin AW Co., Ltd. Parking assistance method and parking assistance device
JP2006341641A (ja) * 2005-06-07 2006-12-21 Nissan Motor Co Ltd Video display device and video display method
DE102005034700A1 (de) * 2005-07-26 2007-02-08 Robert Bosch Gmbh Parking device
JP4682809B2 (ja) * 2005-11-04 2011-05-11 Denso Corporation Parking assistance system
EP2184208A4 (en) * 2007-07-31 2012-11-21 Toyota Jidoshokki Kk PARKING AID DEVICE, VEHICLE SIDE DEVICE FOR PARKING AID DEVICE, PARKING AID METHOD, AND PARKING ASSIST PROGRAM
JP5380941B2 (ja) * 2007-10-01 2014-01-08 Nissan Motor Co., Ltd. Parking assistance device and method
JP2009129001A (ja) * 2007-11-20 2009-06-11 Sanyo Electric Co Ltd Driving support system, vehicle, and three-dimensional object region estimation method
DE102008003662A1 (de) * 2008-01-09 2009-07-16 Robert Bosch Gmbh Method and device for displaying the surroundings of a vehicle
DE102008034594B4 (de) * 2008-07-25 2021-06-24 Bayerische Motoren Werke Aktiengesellschaft Method and information system for informing an occupant of a vehicle

Also Published As

Publication number Publication date
WO2012062573A1 (de) 2012-05-18
US9544549B2 (en) 2017-01-10
US20130229524A1 (en) 2013-09-05
DE102010051206A1 (de) 2012-05-16
CN103328261A (zh) 2013-09-25

Similar Documents

Publication Publication Date Title
EP2637898A1 (de) Method for generating an image of a vehicle environment, and imaging device
EP2805183B1 (de) Method and device for visualizing the surroundings of a vehicle
EP2603413B1 (de) Method for assisting a parking maneuver of a motor vehicle, driver assistance system and motor vehicle
DE102008034594B4 (de) Method and information system for informing an occupant of a vehicle
EP1642768B1 (de) Method for displaying the driving space of a vehicle
EP1147032B1 (de) Device for monitoring the surroundings of a parking vehicle
EP3219533B1 (de) Vision system for a vehicle, in particular a commercial vehicle
DE102010043128B4 (de) Monitoring device for the vehicle surroundings and method for controlling such a monitoring device
DE102010051204A1 (de) Method for displaying an obstacle, and imaging device
EP3512739B1 (de) Method for providing a rear-view mirror view of the surroundings of a vehicle
DE102016106255A1 (de) Vehicle exterior camera systems and methods
DE102017100004A1 (de) Method for providing at least one piece of information from an area surrounding a motor vehicle, display system for a motor vehicle, driver assistance system for a motor vehicle, and motor vehicle
EP3437929A1 (de) Vision system with driving-situation-dependent field-of-view/viewing-area overlay
EP3695395A1 (de) Method for displaying the surroundings of a vehicle
WO2006114309A1 (de) Method for graphically representing the surroundings of a motor vehicle
DE102010000385A1 (de) Vehicle periphery display device
WO2016005232A1 (de) Combining partial images into an image of the surroundings of a means of transport
DE102009035422B4 (de) Method for geometric image transformation
DE102012208288A1 (de) Method for displaying a vehicle environment on a display, and driver assistance system
WO2012003942A2 (de) Method and device for assisting a driver when driving and/or maneuvering a vehicle
WO2012019941A1 (de) Method for assisting a driver in driving a motor vehicle, and driver assistance system
EP2801076B1 (de) Depth image computation with a mono camera by means of a defined trajectory
WO2017198429A1 (de) Determination of vehicle environment data
EP2603403B1 (de) Method for displaying images on a display device in a motor vehicle, driver assistance system and motor vehicle
DE102013010233B4 (de) Method for displaying environment information in a vehicle, and display system for a vehicle

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20130607

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)

17Q First examination report despatched

Effective date: 20140618

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20150106