WO2013107550A1 - Verfahren zum Überwachen eines Fahrzeugumfeldes (Method for monitoring a vehicle environment) - Google Patents

Verfahren zum Überwachen eines Fahrzeugumfeldes (Method for monitoring a vehicle environment)

Info

Publication number
WO2013107550A1
Authority
WO
WIPO (PCT)
Prior art keywords
environment
mobile device
vehicle
assistance system
data
Prior art date
Application number
PCT/EP2012/073981
Other languages
German (de)
English (en)
French (fr)
Inventor
Henning Von Zitzewitz
Karsten Muehlmann
Jan Egelhaaf
Raphael Cano
Leo VEPA
Original Assignee
Robert Bosch Gmbh
Priority date
Filing date
Publication date
Application filed by Robert Bosch Gmbh filed Critical Robert Bosch Gmbh
Priority to CN201280067213.5A priority Critical patent/CN104054033A/zh
Priority to EP12810124.3A priority patent/EP2805210A1/de
Publication of WO2013107550A1 publication Critical patent/WO2013107550A1/de

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60R — VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 — Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20 — Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22 — Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23 — Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/26 — Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the rear of the vehicle
    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05D — SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 — Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011 — Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0038 — Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60R — VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 — Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10 — Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • B60R2300/105 — Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60R — VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 — Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30 — Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/301 — Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing combining image information with other obstacle sensor information, e.g. using RADAR/LIDAR/SONAR sensors for estimating risk of collision
    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60R — VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 — Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80 — Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/806 — Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for aiding parking
    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60R — VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 — Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80 — Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/8093 — Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for obstacle warning

Definitions

  • The invention relates to a method for monitoring a vehicle environment.
  • Furthermore, the invention relates to a driver assistance system and a computer program product for carrying out the method.
  • Driver assistance systems are additional devices in a vehicle that assist the driver in certain driving situations.
  • Typically, a driver assistance system comprises several subsystems, such as a parking assistant, blind-spot monitoring or a lane-change assistant.
  • These subsystems use data from on-board environment sensors that monitor the surroundings of the vehicle, for example optical sensors, ultrasonic sensors or the like.
  • A typical functionality of driver assistance systems is to present the vehicle environment to the driver on a display of a human-machine interface on the basis of the environment sensor data.
  • Obstacles in the vehicle environment can be detected automatically by an object detection device and displayed.
  • The user may be a person who monitors the execution of a maneuver and wirelessly receives information from the vehicle.
  • The display device may be part of an external device outside the vehicle, wherein information from the vehicle can be transmitted to the external device and the user can choose his own perspective or the perspective of an occupant.
  • DE 10 2008 034 606 A1 describes the representation of the environment of a vehicle on a mobile unit by means of at least one image signal that the mobile unit receives wirelessly from the vehicle.
  • From the image signal, a representation is generated on the mobile unit that contains a perspectively arranged virtual plane on which an image captured with a recording means of the vehicle is displayed.
  • The mobile unit is designed as a mobile phone.
  • Known driver assistance systems thus represent image data of the vehicle environment on a mobile unit. High demands are placed on this representation so that the driver can grasp the surroundings intuitively and quickly. Situation-specific features may occur, for example objects in the vicinity of the vehicle, that require an adaptation of the environment representation.
  • In the present context, environment sensor system means any sensor means of the transmitter that characterizes the environment of the transmitter.
  • Preferably, the environment sensor system comprises a camera system with one or more cameras.
  • The environment sensor system may also include a system based on at least one ultrasonic sensor, at least one radar sensor or at least one LIDAR sensor.
  • The environment data can thus include data that is detected by the environment sensor system, in particular by a camera system of the transmitter.
  • The environment data can represent video views based on the camera system and/or visualizations of ultrasonic, LIDAR or radar data.
  • Data transmission can take place, for example, via an Ethernet, Media Oriented Systems Transport (MOST), USB, Firewire, WLAN, Bluetooth, infrared and/or Near Field Communication (NFC) interface. Furthermore, the data can be transferred wirelessly, i.e. transmitted and/or received, via the Universal Mobile Telecommunications System (UMTS), the Global System for Mobile Communications (GSM) and/or Long Term Evolution (LTE).
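The patent leaves the transport details open. Purely as an illustration, the sketch below shows how camera frames could be streamed from the transmitter to the receiver over a plain TCP connection, for example on a WLAN link; the length-prefix framing, function names and OpenCV usage are assumptions of this sketch, not part of the document.

```python
# Hypothetical sketch (not from the patent): streaming JPEG-encoded camera
# frames from a transmitter to a receiver over a TCP connection, e.g. on a
# WLAN link. The 4-byte length-prefix framing and all names are assumptions.
import socket
import struct

import cv2          # OpenCV, used here only to grab and encode frames
import numpy as np


def stream_frames(host: str, port: int, camera_index: int = 0) -> None:
    """Capture frames from a camera and send them as length-prefixed JPEGs."""
    cap = cv2.VideoCapture(camera_index)
    with socket.create_connection((host, port)) as sock:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            ok, buf = cv2.imencode(".jpg", frame)
            if not ok:
                continue
            payload = buf.tobytes()
            sock.sendall(struct.pack(">I", len(payload)) + payload)
    cap.release()


def _recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes from the socket."""
    data = b""
    while len(data) < n:
        chunk = sock.recv(n - len(data))
        if not chunk:
            raise ConnectionError("connection closed")
        data += chunk
    return data


def receive_frame(sock: socket.socket) -> np.ndarray:
    """Read one length-prefixed JPEG from the socket and decode it."""
    (length,) = struct.unpack(">I", _recv_exact(sock, 4))
    jpeg = np.frombuffer(_recv_exact(sock, length), dtype=np.uint8)
    return cv2.imdecode(jpeg, cv2.IMREAD_COLOR)
```

The same framing would work over any of the interfaces listed above; only the underlying socket or link layer changes.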
  • In one embodiment, the transmitter is a driver assistance system that transmits environment data to a mobile device as the receiver.
  • Alternatively, the transmitter may be a mobile device that provides environment data to a driver assistance system as the receiver.
  • A driver assistance system in this context refers to an additional device in the vehicle that supports the driver in different driving situations.
  • Typically, a driver assistance system includes various subsystems, such as input and output devices and an environment sensor system.
  • The environment sensor system provides environment data concerning the vehicle environment.
  • The individual sensors of the environment sensor system can be of any type, for example optical sensors, ultrasonic sensors, radar sensors or LIDAR sensors.
  • The driver assistance system can act as a transmitter and provide environment data that can be transmitted at least partially via an interface to a mobile device as the receiver.
  • The driver assistance system can be assigned a human-machine interface (HMI), which conveys information to the driver in optical, acoustic and/or haptic form.
  • For this purpose, a display device, such as a display, can be provided in the field of vision of the driver, which presents information optically.
  • With a display assigned to the HMI, the driver assistance system can be designed to show environment data received from the mobile device.
  • A mobile device refers in the present context to a portable device that can be used anywhere. Different components can be integrated into the mobile device. For example, environment sensors, such as a camera system with at least one camera, can be integrated into the mobile device. Additional sensors, such as a compass system with at least one compass, a motion sensor system with at least one motion sensor and/or a positioning system with at least one Global Positioning System (GPS) sensor, can further determine the position and orientation of the mobile device.
  • The mobile device can thus capture environment data characterizing its surroundings with its environment sensors and additional sensors; this data can be transmitted at least partially from the mobile device, acting as a transmitter, via an interface to a driver assistance system as the receiver.
  • For interaction with a user, the mobile device may be assigned an HMI that conveys information to the user in optical, acoustic and/or haptic form.
  • The HMI of the mobile device may include, among other things, a display device, such as a display, that is designed to show environment data received from the driver assistance system. Additionally or alternatively, the HMI may be associated with an input device, such as a keyboard and/or a touch-sensitive display, via which the user can communicate with the mobile device.
  • Typical mobile devices are, for example, simple mobile phones or devices with additional functions such as smartphones, personal digital assistants (PDAs), tablet computers or tablet PCs.
  • Furthermore, an infrastructure device, such as a parking-lot monitoring system, can serve as the mobile device.
  • For example, a camera in a garage can be used as the mobile device, installed and oriented so that the video data recorded and transmitted to the vehicle (or the driver assistance system) visually supports the driver during the parking maneuver.
  • Preferably, the section shown on the display device of the receiver substantially coincides with the field of view of the mobile device in its current orientation and comprises at least part of the vehicle environment.
  • At least, the section shown on the display device of the receiver comprises a part of the field of view of the mobile device.
  • The extent of the field of view may vary and can be adjusted depending on the function or by the user. This keeps the displayed information easy to grasp for the user (not too small, too large, too distorted, etc.).
  • The orientation of the mobile device thus defines a field of view, which may, for example, be orthogonal to the display device of the mobile device.
  • In this case, the display device serves as the image plane of a virtual camera, whose imaging properties may also deviate from those of the physical camera integrated in the mobile device.
  • The environment data preferably includes data acquired by a camera system associated with the driver assistance system or with the mobile device. From this data, the section corresponding to the field of view of the mobile device in its current orientation can then be selected and displayed.
  • Alternatively, the field of view of the mobile device is essentially determined by its environment sensors, the field of view substantially corresponding to the detection range of the environment sensor system. In particular, the field of view can then substantially coincide with the detection range of a camera integrated into the mobile device.
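Purely as an illustration of this virtual-camera idea (a minimal sketch under assumptions, not the patent's implementation): a yaw/pitch reading from the device's compass or motion sensor can be turned into a viewing direction orthogonal to the display, and a point of the vehicle environment can be tested against the resulting field-of-view cone. All names, frames of reference and the 60-degree opening angle are invented for the example.

```python
# Illustrative sketch only: a "virtual camera" defined by the mobile device's
# orientation. Angles, coordinate frames and the 60-degree opening angle are
# assumptions made for this example.
import numpy as np


def viewing_direction(yaw_deg: float, pitch_deg: float) -> np.ndarray:
    """Unit vector orthogonal to the device display for a given yaw/pitch."""
    yaw, pitch = np.radians(yaw_deg), np.radians(pitch_deg)
    return np.array([
        np.cos(pitch) * np.cos(yaw),
        np.cos(pitch) * np.sin(yaw),
        np.sin(pitch),
    ])


def in_field_of_view(point: np.ndarray, device_pos: np.ndarray,
                     yaw_deg: float, pitch_deg: float,
                     opening_angle_deg: float = 60.0) -> bool:
    """True if 'point' lies inside the virtual camera's field-of-view cone."""
    direction = viewing_direction(yaw_deg, pitch_deg)
    to_point = point - device_pos
    to_point = to_point / np.linalg.norm(to_point)
    angle = np.degrees(np.arccos(np.clip(direction @ to_point, -1.0, 1.0)))
    return angle <= opening_angle_deg / 2.0


# Example: is an obstacle 3 m behind the vehicle inside the field of view
# when the device is held pointing rearwards and slightly downwards?
print(in_field_of_view(np.array([-3.0, 0.0, 0.0]),
                       np.array([0.0, 0.0, 1.2]),
                       yaw_deg=180.0, pitch_deg=-15.0))
```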
  • The current orientation of the mobile device is determined while the user is using the mobile device, by establishing the orientation of the mobile device relative to the environment.
  • In one embodiment, environment data acquired with environment sensors of the mobile device is transmitted via an interface from the mobile device to the driver assistance system.
  • In particular, the mobile device can provide video views from its camera system with at least one camera to the interface, the video views reproducing a section of the environment of the mobile device.
  • Because the mobile device is location-independent, the reproduced section of its environment can be chosen largely freely.
  • In particular, the reproduced section of the environment of the mobile device may include at least a part of the vehicle environment.
  • Environment data recorded with a camera system of the mobile device can thus be transmitted to a driver assistance system of the vehicle.
  • In this case, the environment data, in particular the video views, can be selected such that they reproduce the vehicle environment.
  • The environment data can furthermore be shown on a display device of the driver assistance system, for example a display assigned to the HMI.
  • Additionally or alternatively, environment data detected with environment sensors of the driver assistance system can be transmitted to the mobile device.
  • For this purpose, the driver assistance system can provide environment data to the interface, the environment data representing a section of the vehicle environment.
  • For example, multiple cameras installed at different locations on the vehicle can provide video views representing different sections of the vehicle environment.
  • Environment data recorded with a camera system of the driver assistance system can thus be transmitted to the mobile device.
  • The environment data can be selected such that it contains the section of the vehicle environment corresponding to the fictitious field of view of the mobile device in its current orientation.
  • The current orientation of the mobile device can be determined, for example, via a motion sensor or compass assigned to the mobile device.
  • The current orientation can be transmitted to the driver assistance system, so that a pre-selection concerning the section to be transmitted can already be made from the environment data in the driver assistance system.
  • Alternatively, the driver assistance system can transmit the environment data to the mobile device, and the selection of the section can take place in the mobile device.
  • In both cases, the environment data can be selected such that the section of the vehicle environment shown on the mobile device corresponds to a fictitious field of view of the mobile device onto the vehicle environment.
  • The environment data can furthermore be shown on a display device of the mobile device, such as a display.
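A non-authoritative sketch of this selection step, assuming the environment data has already been combined into a 360-degree surround-view panorama of the vehicle environment: the horizontal window matching the mobile device's yaw (e.g. from its compass) is cut out of that panorama. The panorama layout, angles and names are assumptions of the example, not of the patent.

```python
# Hypothetical sketch: select the image section that corresponds to the mobile
# device's fictitious field of view from a 360-degree surround-view panorama.
# Panorama layout (0 deg = vehicle front, increasing clockwise) and the
# 60-degree field of view are assumptions.
import numpy as np


def select_section(panorama: np.ndarray, yaw_deg: float,
                   fov_deg: float = 60.0) -> np.ndarray:
    """Return the horizontal window of the panorama centred on yaw_deg."""
    width = panorama.shape[1]
    px_per_deg = width / 360.0
    centre = int((yaw_deg % 360.0) * px_per_deg)
    half = int(fov_deg / 2.0 * px_per_deg)
    # wrap around the panorama edges so a rear-facing window stays contiguous
    cols = [(centre + offset) % width for offset in range(-half, half)]
    return panorama[:, cols]


# Example with a dummy panorama: 720 columns = 0.5 deg per pixel column
pano = np.zeros((100, 720, 3), dtype=np.uint8)
section = select_section(pano, yaw_deg=180.0)   # device pointed rearwards
print(section.shape)                            # (100, 120, 3) for a 60 deg window
```

The same selection could equally run on the driver assistance system (pre-selection before transmission) or on the mobile device after the full data has been received, as described above.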
  • For determining the orientation of the mobile device relative to the vehicle, different methods can be considered.
  • For example, the one or more cameras of the mobile device and of the driver assistance system can capture prominent visual features of the environment. The correspondence of these features then allows the relative orientation of the two systems to be reconstructed. Additionally or alternatively, the one or more cameras of the mobile device could detect prominent visual features in the interior of the vehicle.
  • Examples of such features are mirror or console parts in the interior of the vehicle. Furthermore, markers can be provided in the interior of the vehicle. If the location of such features in the vehicle is known, the orientation of the mobile device relative to the vehicle can be determined.
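One common way to reconstruct such a relative orientation from corresponding visual features is descriptor matching between the two camera images followed by essential-matrix recovery. The sketch below is illustrative only and not taken from the patent; it uses OpenCV, and the camera intrinsics and all names are assumptions.

```python
# Illustrative sketch: estimate the relative rotation between the mobile
# device's camera and a vehicle camera from matched visual features.
# The camera matrix K and all names are assumptions of this example.
import cv2
import numpy as np


def relative_rotation(img_mobile: np.ndarray, img_vehicle: np.ndarray,
                      K: np.ndarray) -> np.ndarray:
    """Return a 3x3 rotation matrix relating the two camera poses."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(img_mobile, None)
    kp2, des2 = orb.detectAndCompute(img_vehicle, None)

    # Brute-force Hamming matching of the binary ORB descriptors
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Essential matrix with RANSAC, then pose recovery gives the rotation R
    E, mask = cv2.findEssentialMat(pts1, pts2, K,
                                   method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R
```

With known marker or interior-feature positions, a simpler perspective-n-point estimate would serve the same purpose; the choice is left open by the document.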
  • Preferably, an application that processes environment data for display on a display device is installed on the mobile device and/or on the driver assistance system. The processing of environment data recorded with environment sensors of the driver assistance system can thus take place either on the mobile device or on the driver assistance system. This processing involves matching the orientation of the mobile device with the resulting section of the vehicle environment that is to be displayed. If the environment data is processed directly on the driver assistance system, the finished representations are transmitted to the mobile device for display. Alternatively, environment data from the driver assistance system can be transmitted to the mobile device and processed there for display on a display device.
  • Likewise, environment data recorded with environment sensors of the mobile device can be processed on the driver assistance system or on the mobile device for display on a display device.
  • During this processing, for example, a selection of the image section and/or a superposition of, for example, camera image data with environment data of other components, such as ultrasonic sensors or the like, can take place.
  • Furthermore, other image-processing methods known to the person skilled in the art can be used for a comprehensible presentation of the environment data.
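As an illustration of such a superposition (a sketch under assumptions, not the patent's implementation), ultrasonic distance readings could be drawn onto the camera image before it is displayed; the sensor names, image positions and warning threshold below are invented for the example.

```python
# Illustrative sketch: overlay ultrasonic distance readings on a camera frame.
# Sensor names, pixel positions and the 0.5 m warning threshold are assumptions.
import cv2
import numpy as np


def overlay_distances(frame: np.ndarray, distances_m: dict) -> np.ndarray:
    """Draw each sensor's distance near an (assumed) fixed image position."""
    positions = {"rear_left": (40, 440), "rear_centre": (300, 460),
                 "rear_right": (560, 440)}  # pixel anchors, purely illustrative
    out = frame.copy()
    for sensor, dist in distances_m.items():
        x, y = positions.get(sensor, (10, 30))
        colour = (0, 0, 255) if dist < 0.5 else (0, 255, 0)  # red below 0.5 m
        cv2.putText(out, f"{sensor}: {dist:.2f} m", (x, y),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, colour, 2)
    return out


# Example usage with a dummy frame and fabricated readings
frame = np.zeros((480, 640, 3), dtype=np.uint8)
annotated = overlay_distances(frame, {"rear_centre": 0.42, "rear_left": 1.3})
```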
  • A computer program is also proposed, according to which one of the methods described herein is performed when the computer program is executed on a programmable computer device.
  • The computer device can be, for example, a module for implementing a driver assistance system, or a subsystem thereof, in a vehicle. Additionally or alternatively, the computer device may be a module for implementing a mobile device.
  • The computer program can be stored on a machine-readable storage medium, for example on a permanent or rewritable storage medium.
  • Additionally or alternatively, the computer program may be provided for download on a computer device, such as a server, for example via a data network such as the Internet or a communication link such as a telephone line or a wireless connection.
  • Furthermore, the invention proposes a driver assistance system that is configured to receive environment data from a component of a mobile device and to display a section of a vehicle environment as a function of an orientation of the mobile device.
  • The driver assistance system is preferably designed to carry out the method described above.
  • Furthermore, a mobile device for monitoring a vehicle environment is proposed, which is adapted to receive environment data from a component of a driver assistance system and to display a section of a vehicle environment as a function of an orientation of the mobile device.
  • The driver assistance system according to the invention and the mobile device according to the invention are preferably designed to carry out the method described above in its respective variants.
  • An advantage of the invention is a clearer presentation of the vehicle environment, since only selected sections of the vehicle environment are presented. By displaying a section of the vehicle environment, the driver can focus on this section and easily identify potential sources of danger. As a result, in particular when entering or leaving a parking space, those sections of the environment in which a collision with an obstacle threatens can be displayed.
  • A further advantage is that sections of the environment recorded with a mobile device can be sent to the display of the driver assistance system for the driver, thus giving him a clear picture of the driving situation.
  • This embodiment is helpful in particular because the user of the mobile device can send video views of the vehicle environment to the driver.
  • For example, a camera in a garage can be used as the mobile device, the camera being installed and oriented so that the video data recorded and transmitted to the vehicle (or the driver assistance system) visually supports the driver during parking.
  • The driver can also, for example, use a mobile device to estimate the distances to obstacles directly from the sections of the vehicle environment displayed on the mobile device.
  • In one embodiment, this results in a kind of "x-ray vision" for the driver, with which he can assess the vehicle environment from the interior of the vehicle, for example through the body.
  • FIG. 1 is a schematic representation of a driving situation in which a driver intends to maneuver his vehicle, equipped with a driver assistance system according to the invention, out of a parking space;
  • FIG. 2 shows a flowchart of an operation of the driver assistance system according to FIG. 1;
  • FIG. 3 is a schematic illustration of a parking situation in which the vehicle, equipped with a driver assistance system according to the invention, parks in a parking space; and
  • FIG. 4 shows, in the form of a flowchart, an operation of the driver assistance system according to FIG. 3.
  • Figure 1 shows an exemplary driving situation in which a vehicle 10 is in an environment with obstacles 12, such as columns. These obstacles 12 are not directly visible to the driver 13 in the situation shown in Figure 1. In particular when reversing, for example when backing out of a parking space, collisions with the obstacles 12 may therefore occur.
  • The vehicle 10 is equipped with a driver assistance system 14.
  • The driver assistance system 14 comprises different subsystems, for example a parking assistant and a monitoring system.
  • Furthermore, the driver assistance system 14 comprises an environment sensor system with a system of optical sensors, in particular a front camera 18, side cameras 20 and a rear-view camera 22.
  • In addition, the environment sensor system can include ultrasonic sensors, radar sensors and/or LIDAR sensors, which are not shown in Figure 1.
  • Furthermore, the driver assistance system 14 comprises a control unit 24, which enables communication with subsystems, such as the parking assistant, and the processing of sensor data, such as video views of the camera system 18, 20, 22.
  • Furthermore, the control unit 24 is configured with an interface 26, which is designed, for example, as a wireless data transmission interface for data transfer via Bluetooth, WLAN or Firewire.
  • The cameras of the camera system 18, 20, 22 capture different sections of the vehicle environment.
  • For example, the rear-view camera 22 provides video views of the vehicle environment behind the vehicle 10.
  • In the situation of Figure 1, there are objects 12 behind the vehicle 10 with which the vehicle 10 could collide when reversing.
  • The video views of the camera system 18, 20, 22 are forwarded to the control unit 24 of the driver assistance system 14 and processed there, for example for display on a display device such as the display of an HMI.
  • Furthermore, the video views of the camera system 18, 20, 22 are transmitted from the control unit 24 of the driver assistance system 14 to a mobile device 16, for example a smartphone.
  • For this purpose, the mobile device 16 has a data transmission interface 28, which is designed, for example, as a wireless data transmission interface for data transfer via Bluetooth, WLAN or Firewire, and a display for showing the video views of the vehicle environment.
  • The mobile device 16 is further equipped with a motion sensor 31. In this way, the orientation of the mobile device 16 can be determined and, from the video views, the section that corresponds to the field of view 17 of the mobile device 16 in its current orientation can be selected.
  • The orientation of the mobile device 16 relative to the vehicle 10 can be identified in different ways. For example, cameras of the mobile device 16 and of the driver assistance system 14 may detect prominent visual features that are then matched.
  • The mobile device 16 thus essentially serves as a virtual camera, which defines a field of view 17 depending on the orientation of the mobile device 16.
  • Applications for processing and displaying the environment data can be installed on the mobile device 16 and/or the driver assistance system 14; a server can also be provided for the applications, on which the applications are installed and from which they are used.
  • In this way, the presentation of the video view can be modified.
  • For example, the video data can be superimposed with information from other environment sensors, such as distance information from ultrasonic sensors. An overlay with, for example, points of interest (POIs), such as sights, or an automatic assignment and display of such information is also conceivable.
  • Furthermore, the video views could also be transmitted over a network, such as the Internet.
  • With the method described, the driver 13 can select sections of the vehicle surroundings by means of the orientation of the mobile device 16 and have them shown on the mobile device 16, as it were through the body of the vehicle. This allows the driver 13 to grasp the vehicle environment more intuitively and to detect any obstacles 12. Thus, the driver 13 can detect obstacles 12 that could cause a collision, for example when reversing, even though they are outside his field of vision.
  • FIG. 2 illustrates the interaction of the components of the driver assistance system from FIG. 1 on the basis of a flowchart 30.
  • In step 32, environment data is acquired with the environment sensor system of the driver assistance system 14 in order to monitor the vehicle environment.
  • In the exemplary embodiment, the environment sensor system includes a camera system with multiple cameras that captures video views of the environment.
  • The environment data accordingly contains video views of the camera system 18, 20, 22.
  • In addition, the environment data can also include ultrasonic, radar and/or LIDAR data.
  • In step 34, the detected environment data is transmitted from the driver assistance system 14 to the mobile device 16.
  • In this case, the driver assistance system 14 functions as a transmitter that transmits the environment data to the mobile device acting as a receiver.
  • In step 36, at least a section of the detected vehicle environment is shown on a display of the mobile device 16.
  • For this purpose, an application installed on the mobile device processes the received environment data for display on the display of the mobile device.
  • The displayed section of the environment data can be selected such that it reproduces the section of the vehicle environment that matches the fictitious field of view of the mobile device in its current orientation.
  • Consequently, the section of the vehicle environment shown on the mobile device corresponds to a fictitious field of view of the mobile device onto the vehicle environment.
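For illustration only, the sketch below shows one way the displayed section could be resolved to a concrete video view: the mobile device's yaw reading is mapped to the vehicle camera whose heading best matches the fictitious field of view. The camera names and headings are assumptions loosely based on the reference numerals 18, 20, 22 in the figures, not a specification from the patent.

```python
# Illustrative sketch: map the mobile device's yaw to the vehicle camera whose
# view best matches the fictitious field of view. Camera headings (front 0 deg,
# right 90 deg, rear 180 deg, left 270 deg) are assumptions of this example.
CAMERA_HEADINGS_DEG = {
    "front_camera_18": 0.0,
    "side_camera_20_right": 90.0,
    "rear_view_camera_22": 180.0,
    "side_camera_20_left": 270.0,
}


def angular_difference(a: float, b: float) -> float:
    """Smallest absolute difference between two headings in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)


def camera_for_orientation(device_yaw_deg: float) -> str:
    """Return the camera whose heading is closest to the device's yaw."""
    return min(CAMERA_HEADINGS_DEG,
               key=lambda cam: angular_difference(CAMERA_HEADINGS_DEG[cam],
                                                  device_yaw_deg))


# Example: the driver points the mobile device roughly towards the rear
print(camera_for_orientation(170.0))   # -> 'rear_view_camera_22'
```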
  • The method according to the invention thus allows the driver 13 a kind of "x-ray vision" with which the driver 13 can monitor the entire vehicle environment from within the vehicle 10.
  • For example, before the maneuver the driver 13 can view the vehicle environment that is not visible from the driver's position and thus detect possible sources of danger.
  • FIG. 3 shows a further exemplary driving situation, in which a driver 13 wants to reverse his vehicle 10 into a narrow parking space 11, for example in a parking garage.
  • The parking space boundaries 12, for example pillars or walls, constitute potential obstacles in the vehicle environment. These obstacles 12 make it more difficult for the driver 13 to park in the situation shown in FIG. 3.
  • The vehicle 10 is equipped with a driver assistance system 14.
  • The driver assistance system 14 comprises different subsystems, for example a parking assistant.
  • Furthermore, the driver assistance system 14 includes an HMI having a display 15 for the optical output of information.
  • For example, video views of the camera system 18, 20, 22 integrated in the driver assistance system 14 can be shown on the display 15.
  • In this embodiment, however, the camera system 18, 20, 22 is optional and not mandatory.
  • Furthermore, the driver assistance system 14 comprises a control unit 24 that enables communication with subsystems, such as the parking assistant and video surveillance, and the processing of sensor data, such as video views of the camera system 18, 20, 22.
  • The control unit 24 is configured with an interface 26, which is designed, for example, as a wireless data transmission interface for data transfer, for example via Bluetooth, WLAN or Firewire.
  • For monitoring the vehicle environment, the mobile device 16 includes a camera 29, which records video views depending on the orientation of the mobile device 16. The user can thus capture different sections of the vehicle environment.
  • The control unit 24 of the driver assistance system 14 receives these video views and processes them, for example, for display on the display 15. With the representation of a section of the vehicle environment as shown in FIG. 3, the user of the mobile device 16 can assist the driver 13 in maneuvering into the parking space 11. Instead of directing the driver 13 into the parking space 11 by hand signals, the user can provide the driver 13 with a video view of the vehicle environment that shows the driver 13 critical points, in particular distances to the parking space boundary 12.
  • FIG. 4 illustrates the interaction of the components of the driver assistance system from FIG. 3 on the basis of a flow diagram 40.
  • In step 42, environment data for monitoring the vehicle environment is acquired with the environment sensor system of a mobile device 16.
  • For this purpose, a camera 29 built into the mobile device 16 is used, which supplies different sections of the vehicle environment depending on the orientation of the mobile device 16.
  • In step 44, the environment data of the mobile device 16 is transmitted to the driver assistance system 14.
  • In this case, the mobile device 16 acts as a transmitter, which transmits the environment data to the driver assistance system 14 functioning as a receiver.
  • For this purpose, both the driver assistance system 14 and the mobile device 16 are equipped with a wireless interface 26, 28 via which the data is transmitted, for example via Bluetooth or WLAN.
  • In step 46, at least a section of the detected vehicle environment is shown on the display 15 of the driver assistance system 14.
  • For this purpose, an application installed on the driver assistance system 14 processes the received environment data for display on the display 15 of the driver assistance system 14.
  • The displayed section of the environment data can be selected such that it reproduces the section of the vehicle environment that coincides with the field of view of the mobile device 16 in its current orientation. Consequently, the section of the vehicle environment shown on the driver assistance system corresponds to the field of view 27 of the mobile device 16 onto the vehicle environment.
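As a hedged illustration of steps 42 to 46 (not the patent's implementation), the mobile device could send each camera frame together with its current orientation, so that the driver assistance system can relate the received view to the vehicle environment; the framing format, field names and orientation source below are assumptions.

```python
# Illustrative sketch for steps 42-46: the mobile device sends each camera
# frame together with its current orientation. Framing format, field names
# and the orientation source are assumptions of this example.
import json
import socket
import struct

import cv2
import numpy as np


def send_frame_with_orientation(sock: socket.socket, frame: np.ndarray,
                                yaw_deg: float, pitch_deg: float) -> None:
    """Send one JPEG frame preceded by a JSON header carrying the orientation."""
    ok, buf = cv2.imencode(".jpg", frame)
    if not ok:
        return
    header = json.dumps({"yaw_deg": yaw_deg, "pitch_deg": pitch_deg,
                         "jpeg_bytes": len(buf)}).encode("utf-8")
    # 2-byte header length, JSON header, then the JPEG payload
    sock.sendall(struct.pack(">H", len(header)) + header + buf.tobytes())
```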
  • The method according to the invention thus allows the user of a mobile device 16 to show the driver 13 sections of the environment recorded with the mobile device 16 on the display 15 of the driver assistance system 14.
  • This embodiment is helpful because the user of the mobile device 16 can send video views of the vehicle environment, and in particular of critical points, to the driver 13.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Mechanical Engineering (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)
  • Closed-Circuit Television Systems (AREA)
PCT/EP2012/073981 2012-01-19 2012-11-29 Verfahren zum überwachen eines fahrzeugumfeldes WO2013107550A1 (de)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201280067213.5A CN104054033A (zh) 2012-01-19 2012-11-29 用于监视车辆环境的方法
EP12810124.3A EP2805210A1 (de) 2012-01-19 2012-11-29 Verfahren zum überwachen eines fahrzeugumfeldes

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102012200721.0 2012-01-19
DE102012200721A DE102012200721A1 (de) 2012-01-19 2012-01-19 Verfahren zum Überwachen eines Fahrzeugumfeldes

Publications (1)

Publication Number Publication Date
WO2013107550A1 true WO2013107550A1 (de) 2013-07-25

Family

ID=47504838

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2012/073981 WO2013107550A1 (de) 2012-01-19 2012-11-29 Verfahren zum überwachen eines fahrzeugumfeldes

Country Status (4)

Country Link
EP (1) EP2805210A1 (zh)
CN (1) CN104054033A (zh)
DE (1) DE102012200721A1 (zh)
WO (1) WO2013107550A1 (zh)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014212819A1 (de) 2014-07-02 2016-01-07 Zf Friedrichshafen Ag Ortspositionsabhängige Darstellung von Fahrzeugumfelddaten auf einer mobilen Einheit
DE102014115521A1 (de) * 2014-10-24 2016-04-28 Deutsche Telekom Ag Fahrerunterstützung bei der Erfassung der Fahrzeugumgebung
DE102014222190A1 (de) * 2014-10-30 2016-05-04 Volkswagen Aktiengesellschaft Vorrichtung und Verfahren zur Anbindung einer mobilen Kameraeinrichtung
DE102014017599B4 (de) * 2014-11-27 2017-01-05 Elektrobit Automotive Gmbh Tragbare Vorrichtung zur Verwendung durch einen Fahrer eines Kraftfahrzeugs sowie Verfahren zur Verwendung der Vorrichtung
DE102015201723A1 (de) * 2015-02-02 2016-08-04 Robert Bosch Gmbh Fahrerassistenzsystem für ein Kraftfahrzeug
DE102015004553B4 (de) * 2015-04-08 2017-02-16 Audi Ag Verfahren zum Betrieb eines Fahrzeugsystems eines Kraftfahrzeugs
DE102015210118B4 (de) * 2015-06-02 2019-03-21 Volkswagen Aktiengesellschaft Verfahren und Vorrichtung zur Unterstützung des Ausparkens eines Fahrzeugs
DE102015216920A1 (de) 2015-09-03 2017-03-23 Robert Bosch Gmbh Verfahren und Vorrichtung zur Überwachung der Umgebung eines Fahrzeugs
DE102016205867A1 (de) * 2016-04-08 2017-10-12 Robert Bosch Gmbh Verfahren zur Bestimmung einer Pose eines wenigstens teilautomatisiert fahrenden Fahrzeugs mittels verschiedener Landmarkentypen
DE102016212181A1 (de) * 2016-07-05 2018-01-11 Volkswagen Aktiengesellschaft Verfahren zur Bestimmung von Längen- und/oder Größenverhältnissen eines Anhängers
DE102016214477A1 (de) * 2016-08-04 2018-02-08 Bayerische Motoren Werke Aktiengesellschaft Anordnung, Fortbewegungsmittel und Verfahren zur Definition einer Farbe in einem Fortbewegungsmittel
EP3293667A1 (en) * 2016-09-08 2018-03-14 KNORR-BREMSE Systeme für Nutzfahrzeuge GmbH An apparatus for providing vehicular environment information
DE102016125319A1 (de) * 2016-12-22 2018-06-28 Vorwerk & Co. Interholding Gmbh Verfahren zum Betrieb eines sich selbsttätig fortbewegenden Fahrzeugs
DE102017209910A1 (de) * 2017-06-13 2018-12-13 Continental Teves Ag & Co. Ohg Fahrzeuglokalisierung mittels eines im Fahrzeug befindlichen Mobiltelefons
FR3070332B1 (fr) * 2017-08-31 2021-01-01 Valeo Vision Procédé d’analyse du comportement de conduite d’un utilisateur d’un véhicule automobile
US11592813B2 (en) * 2018-12-17 2023-02-28 Robert Bosch Gmbh Method and controller for the situational transmission of surroundings information of a vehicle

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004004320A1 (en) * 2002-07-01 2004-01-08 The Regents Of The University Of California Digital processing of video images
EP1510849A1 (en) * 2003-08-29 2005-03-02 C.R.F. Società Consortile per Azioni A virtual display device for use in a vehicle
DE102008034606A1 (de) 2008-07-25 2010-01-28 Bayerische Motoren Werke Aktiengesellschaft Verfahren zur Darstellung der Umgebung eines Fahrzeugs auf einer mobilen Einheit
DE102009020328A1 (de) 2009-05-07 2010-11-11 Bayerische Motoren Werke Aktiengesellschaft Verfahren zur Darstellung von unterschiedlich gut sichtbaren Objekten aus der Umgebung eines Fahrzeugs auf der Anzeige einer Anzeigevorrichtung
US20110221692A1 (en) * 2010-03-11 2011-09-15 Parrot Method and an appliance for remotely controlling a drone, in particular a rotary wing drone
US20110288696A1 (en) * 2008-11-27 2011-11-24 Parrot Device for piloting a drone

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100548420B1 (ko) * 2003-11-13 2006-02-02 엘지전자 주식회사 이동통신 단말기를 이용한 사각지대 감시 시스템 및 방법
US20060232670A1 (en) * 2004-10-19 2006-10-19 Fu-Lai Chu Vehicle monitoring apparatus
JP2009060499A (ja) * 2007-09-03 2009-03-19 Sanyo Electric Co Ltd 運転支援システム及び連結車両
CN201197181Y (zh) * 2008-04-30 2009-02-18 肖宽新 一种汽车监控系统
CN102098497A (zh) * 2010-12-23 2011-06-15 史立锦 多用途安防监控系统

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004004320A1 (en) * 2002-07-01 2004-01-08 The Regents Of The University Of California Digital processing of video images
EP1510849A1 (en) * 2003-08-29 2005-03-02 C.R.F. Società Consortile per Azioni A virtual display device for use in a vehicle
DE102008034606A1 (de) 2008-07-25 2010-01-28 Bayerische Motoren Werke Aktiengesellschaft Verfahren zur Darstellung der Umgebung eines Fahrzeugs auf einer mobilen Einheit
US20110288696A1 (en) * 2008-11-27 2011-11-24 Parrot Device for piloting a drone
DE102009020328A1 (de) 2009-05-07 2010-11-11 Bayerische Motoren Werke Aktiengesellschaft Verfahren zur Darstellung von unterschiedlich gut sichtbaren Objekten aus der Umgebung eines Fahrzeugs auf der Anzeige einer Anzeigevorrichtung
US20110221692A1 (en) * 2010-03-11 2011-09-15 Parrot Method and an appliance for remotely controlling a drone, in particular a rotary wing drone

Also Published As

Publication number Publication date
CN104054033A (zh) 2014-09-17
EP2805210A1 (de) 2014-11-26
DE102012200721A1 (de) 2013-07-25

Similar Documents

Publication Publication Date Title
WO2013107550A1 (de) Verfahren zum überwachen eines fahrzeugumfeldes
EP3380391B1 (de) System zum einparken eines fahrzeugs
EP3512739B1 (de) Verfahren zum bereitstellen einer rückspiegelansicht einer fahrzeugumgebung eines fahrzeugs
EP3517409B1 (de) Verfahren zur detektion von garagenparkplätzen
DE102018114285A1 (de) Einleitung von ferngesteuertem parken eines fahrzeugs über eine mobile vorrichtung
DE102011118253B4 (de) Verfahren zum Betrieb eines elektronischen Rückspiegels eines Kraftfahrzeugs und Kraftfahrzeug
DE102008028303A1 (de) Anzeigesystem und Programm
DE102010030463A1 (de) Verfahren zur Unterstützung eines Fahrers eines Kraftfahrzeugs
WO2016041689A1 (de) Verfahren und vorrichtung zum überwachen eines automatischen parkens eines fahrzeugs
DE102019123250A1 (de) Verfahren und vorrichtung zum unterstützen von ferngesteuerten fahrzeugmanövern und einer fergesteuerten fussgängererfassung
DE102018120517A1 (de) Erfassen von Spurenbedingungen in adaptiven Geschwindigkeitsregelsystemen
DE102012208288A1 (de) Verfahren zur Darstellung einer Fahrzeugumgebung auf einer Anzeige und Fahrassistenzsystem
EP3167427A1 (de) Zusammenfügen von teilbildern zu einem abbild einer umgebung eines fortbewegungsmittels
DE102014222190A1 (de) Vorrichtung und Verfahren zur Anbindung einer mobilen Kameraeinrichtung
DE102016208214A1 (de) Verfahren und Vorrichtung zur Unterstützung eines Manövriervorganges eines Fahrzeuges
DE102015214802A1 (de) Verfahren und Vorrichtung zum Leiten eines Fahrzeugs
EP2637897A1 (de) Verfahren zum anzeigen von bildern auf einer anzeigeeinrichtung, kamerasystem und kraftfahrzeug mit einem kamerasystem
DE112014005007T5 (de) Informationsmeldevorrichtung, Informationsmeldesystem, Informationsmeldeverfahren und Informationsmeldeprogramm
EP3562730A1 (de) Verfahren zum automatisierten parken eines kraftfahrzeugs
EP2528050B1 (de) Verfahren und Vorrichtung zur bildlichen Erfassung und Visualisierung von Umgebungsbedingungen an einem von einem Nutzfahrzeug anzufahrenden Hindernis
DE102013010010A1 (de) Verfahren zum Betrieb eines Fahrerassistenzsystems zum Rangieren und/oder Parken
DE102016002232B4 (de) Verfahren zum Betrieb eines Kraftfahrzeugs
DE102011080720A1 (de) Visualisierung einer Rampenabfahrt
WO2012003945A1 (de) Vorrichtung und verfahren zum erfassen und anzeigen des rückwärtigen und/oder seitlichen umfeldes eines kraftfahrzeugs
DE102016217037B4 (de) Verfahren zum Betreiben einer Anzeigeeinrichtung eines Kraftfahrzeugs mit einer digitalen Anzeigefläche und Kraftfahrzeug

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12810124

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2012810124

Country of ref document: EP