EP4120218B1 - System and method for monitoring an autonomous driving or parking operation

System and method for monitoring an autonomous driving or parking operation

Info

Publication number
EP4120218B1
Authority
EP
European Patent Office
Prior art keywords
view, camera, vehicle, live, highlighted
Prior art date
Legal status
Active
Application number
EP21185811.3A
Other languages
English (en)
French (fr)
Other versions
EP4120218A1 (de)
Inventor
Ahmed Benmimoun
Chenhao Ma
Tony Pak
Hamid M. Golgiri
Current Assignee
Ford Global Technologies LLC
Original Assignee
Ford Global Technologies LLC
Priority date
Filing date
Publication date
Application filed by Ford Global Technologies LLC
Priority to EP21185811.3A
Priority to CN202210809865.7A
Publication of EP4120218A1
Application granted
Publication of EP4120218B1
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/09: Arrangements for giving variable traffic instructions
    • G08G1/0962: Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/16: Anti-collision systems
    • G08G1/165: Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • G08G1/166: Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • G08G1/168: Driving aids for parking, e.g. acoustic or visual feedback on parking space

Definitions

  • the invention relates to the field of known autonomous driver-assistance systems (ADAS) as used for autonomously driving a vehicle.
  • Such advanced driver-assistance systems in vehicles may include Valet Parking Assistance (VaPA) to provide fully automated steering and manoeuvring when parking, for example within a car park or parking structure.
  • Such systems use automated vehicle controls such as GPS (Global Positioning System) or on-board sensors along with camera, lidar, radar proximity and ultrasonic sensors, to navigate, identify valid parking slots, and park the vehicle ("drop-off" manoeuvre).
  • the vehicle is also able to autonomously drive the parked vehicle from a parking slot to a specified pickup location ("summon" manoeuvre) upon request by the user. Within a summon manoeuvre the vehicle drives along a specified route or distance.
  • This digital map of the area could be very simple and consist only of a description of the drivable sections, or more complex such as high-definition maps with additional attributes such as signs, lane widths and the like.
  • the ADAS or VaPA has to consider an actual traffic situation in the area of use, for example the car park or parking structure. Said digital map and said actual traffic situation might permanently be updated by using dedicated databases being connected with the ADAS or the vehicle.
  • said digital map and said actual traffic situation might be updated by GPS data or the use of on-board sensors along with camera, lidar, radar proximity and ultrasonic sensors. Also, data relating to said digital map or said actual traffic situation which might be tracked and shared by other traffic participants might be used for such an update.
  • When using ADAS or VaPA for the first time, a user might not be familiar with the functions of the system. The user might want to learn or check how the system works. One way of doing this is to allow the user to stay inside the vehicle during an automated driving or parking operation (e.g. "drop-off" manoeuvre or "summon" manoeuvre). However, at a certain point in time the driving or parking operation will have to be performed without the user being inside the vehicle. For this purpose it would be beneficial if the user could monitor the driving or parking operation and the respective vehicle behaviour during the driving or parking operation from outside the vehicle, in particular from a position where the vehicle is out of sight of the user.
  • Document US 2021/023992 A1 refers to an apparatus including a plurality of capture devices and a processor.
  • the plurality of capture devices may each be configured to generate video frames corresponding to an area outside of a vehicle.
  • the processor may be configured to receive the video frames from each of the plurality of capture devices, generate video data for a display in response to the video frames, store a view preference for the display corresponding to (i) a location and (ii) a vehicle status, determine a current location and a current status of the vehicle and generate an output signal to select a view for the display.
  • the output signal may be generated in response to the current location matching the location and the current status matching the vehicle status.
  • the view selected may be determined based on the view preference.
  • WO 2014/070276 A2 refers to a camera system installed on the front end of a vehicle, either on the left front, the right front, or both sides.
  • the camera is linked via wired or wireless connection to an onboard computer and a navigation display that is located within the passenger compartment of the vehicle.
  • the driver reviews a visual description on the display of any oncoming traffic in the form of motor vehicles, pedestrians, cyclists, animals and the like on the navigation display via a single screen, split screen or alternating screens.
  • the camera system can include a speed sensor that detects when the vehicle reaches a threshold speed to activate or de-activate the camera.
  • the computer can activate the system when a turn signal is activated, and de-activate the system when the turn signal is no longer activated. This camera system can be retrofitted into older vehicles.
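The activation logic described for this prior-art camera system (a speed threshold plus the turn-signal state) can be sketched as follows; the threshold value and all names are illustrative assumptions, not taken from the cited document.

```python
def camera_active(speed_kmh: float, turn_signal_on: bool,
                  speed_threshold_kmh: float = 30.0) -> bool:
    """Activate the front camera when the turn signal is on, or while the
    vehicle is at or below the threshold speed; de-activate otherwise."""
    return turn_signal_on or speed_kmh <= speed_threshold_kmh
```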
  • US 2018/052457 A1 refers to a stereo camera-based autonomous driving method and apparatus, the method including estimating a driving situation of a vehicle, determining a parameter to control a stereo camera width of a stereo camera based on the estimated driving situation, controlling a capturer configured to control arrangement between two cameras of the stereo camera for a first direction based on the determined parameter, and measuring a depth of an object located in the first direction based on two images respectively captured by the two cameras with the controlled arrangement.
  • a system for monitoring an autonomous driving or parking operation of a vehicle is provided according to claim 1.
  • the portable electronic device may be any portable computer, e.g. a laptop, notebook, tablet computer, telephone, smartphone or the like.
  • a stationary computer could be employed instead of a portable electronic device as the basic idea of the present invention relates to remotely observing a driving or parking operation of the vehicle.
  • the system allows a user of a portable electronic device (the second communication unit of which is wirelessly connected to a first communication unit of the vehicle) to visually observe an autonomous driving or parking operation of the vehicle in a live-mode or live-video (during the autonomous driving or parking operation) from a position external to the vehicle.
  • the cameras installed at different positions of the vehicle may be installed outside or inside the vehicle.
  • Each camera may include one or more lenses.
  • each camera may be operated by a microcontroller, the microcontroller being connected with a main control unit of the vehicle.
  • the mentioned expression of "capturing" videos may be understood in terms of "displaying" moving images (time-resolved image-sequences captured by a camera) to a user.
  • capturing may be understood in terms of recording (and storing) said moving images (time-resolved image-sequences captured by a camera).
  • Data corresponding to said moving images may be stored temporarily or for longer terms.
  • Said data may also be transmitted to an external server or database (e.g. a cloud).
  • the signal connections of the cameras and the first communication unit may be based on cable(s) or may be a wireless signal connection.
  • the first communication unit may be part of a main control unit of the vehicle.
  • the wireless signal connection between the first and second communication unit may be based on digital signal (or data) transmission.
  • Said wireless signal connection may exemplarily be based on Bluetooth, WLAN, ZigBee, NFC, Wibree, WiMAX, IrDA, FSO, LiFi.
  • Said wireless signal connection may also be based on mobile internet connections of the first and second communication units, e.g. mobile internet connections based on 2G, 3G, 4G, 5G or any other known or future standard for mobile internet connections.
  • the wireless connection between said first and second communication unit may be a direct connection (including a direct signal and data transfer) between both units, or may be an indirect connection including one or more intermediate transmission units or servers.
  • the first and second communication units can be understood as communication interfaces, each comprising dedicated means (e.g. antennas) for receiving and transmitting signals and data.
  • a suitable application software (abbrev.: App) may be installed on the portable electronic device to operate a visualization of the data transmitted to the portable electronic device via the wireless connection of the first and second communication unit.
  • the App may be configured to overlay or display specific features/information directly in the video or next to the video.
  • the portable electronic device is configured to show said live-videos to a user of the portable electronic device in a live mode during the driving or parking operation.
  • This enables a user to observe an actual autonomous driving or parking operation via his smartphone, although the user may be located out of sight of the vehicle. The user may thus observe the vehicle behaviour and its vicinity in real time and on demand.
  • the view-management means may comprise hardware and software components, both being part of the portable electronic device or the vehicle. It is also possible that hardware and software components of the vehicle and the portable electronic device define the view-management means and are configured to interact with each other. Hardware components may be understood as a computing unit. Besides the possibility of automatically selecting a camera-view by the view-management means, the latter may be configured such that a user can manually switch between different camera-views.
  • the view-management means may comprise an algorithm (the algorithm may be based on artificial intelligence) that may be operated in a dedicated software (environment), the software being installed on one or both of said computing units. The automated selection of the camera-view of which the corresponding live-video is shown to the user is performed by said algorithm.
  • the view-management means provide a situation-based view management system. Based on the situational context, a camera-view may be changed automatically to show the most interesting/relevant camera-view to the user.
  • the algorithm may consider different criteria when calculating which camera-view (of which camera) is to be shown to the user. Said criteria may relate to the autonomous driving or parking operation as such, to the vicinity of the vehicle (e.g. the traffic situation, traffic participants) or to the needs of the user.
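As an illustration of how such an algorithm might weigh these criteria, the following sketch scores each camera-view and returns the best-scoring one. All names, distances and weights are assumptions for illustration, not the patented implementation.

```python
# Hypothetical scoring of camera-views; thresholds and weights are illustrative.
FIRST_DISTANCE = 2.0   # assumed predefined first distance for static objects (m)
SECOND_DISTANCE = 5.0  # assumed (larger) second distance for moving objects (m)

def select_view(heading_view, static_obj=None, moving_obj=None):
    """Return the camera-view to show as single or highlighted view.

    heading_view: view pointing in the direction of movement ("front", ...).
    static_obj / moving_obj: optional (view_towards_object, distance_m) tuples.
    """
    scores = {"front": 0.0, "rear": 0.0, "left": 0.0, "right": 0.0}
    scores[heading_view] += 1.0                      # cases a./b.: direction of movement
    if static_obj and static_obj[1] < FIRST_DISTANCE:
        scores[static_obj[0]] += 2.0                 # case c.: "static object view"
    if moving_obj and moving_obj[1] < SECOND_DISTANCE:
        scores[moving_obj[0]] += 3.0                 # case d.: approaching object wins
    return max(scores, key=scores.get)
```

With these weights, an approaching moving object outranks a nearby static object, which in turn outranks the plain direction-of-movement view.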
  • the cameras are installed at positions of the vehicle to provide the following camera-views: a front view of the vehicle, a rear view of the vehicle, a left view of the vehicle and a right view of the vehicle.
  • a single camera or a number of cameras may be provided at the relevant positions of the vehicle (the front, the back, the left, the right of the vehicle).
  • the cameras may be mounted on suitable vehicle components. It is to be noted that the system according to the invention may be implemented in newly fabricated vehicles or via retrofitting.
  • a live-video referring to a bird's eye view of the vehicle can be obtained based on the live videos provided by the cameras installed at the vehicle and/or position data of the vehicle.
  • a bird's eye view refers to a view of the vehicle from above, with a perspective as if the observer were a bird.
  • the live-video in bird's eye view may be calculated (extrapolated) based on video data provided from the front, rear, left, and/or right camera of the vehicle.
  • one or more camera(s) may be installed on top of the roof of the vehicle. Said camera (being installed on the roof) may be a 360° camera. It could also be possible to install a drone at the vehicle. In case a bird's eye view would be needed, the drone could rise (fly) to a certain height above the vehicle and provide a bird's eye view.
  • the view-management means are configured to select one or more camera-views of which the corresponding live-video(s) is/are shown to the user on the portable electronic device. It is important to note that the view-management means are not only suitable to select a single camera-view, but also to select multiple camera-views to be shown to a user at the same time. In many driving or parking operations (as well as traffic situations) a parallel observation of several (different) views may be of interest.
  • the live-videos may be shown to the user in a gallery format with multiple videos displayed to the user.
  • the gallery format may include the videos as video-mosaics.
  • the view-management means are configured to select a camera-view of which the corresponding live-video is shown to the user on the portable electronic device in a single camera-view or as highlighted camera-view besides other views.
  • a single camera-view means that only a single live-video (referring to a specific camera-view) is displayed to the user.
  • a highlighted camera-view is to be understood as display mode where a live-video referring to a specific camera-view is prominently displayed to a user besides live-videos of other camera-views (which are not highlighted).
  • the live video referring to the highlighted camera-view is shown enlarged with respect to live-videos of other camera-views shown to the user.
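A minimal sketch of how such a highlighted layout could be computed: the selected view fills the left part of the screen enlarged, while the remaining views are stacked as small tiles on the right. The screen size and the 3:1 split are illustrative assumptions.

```python
def gallery_layout(views, highlighted, width=1280, height=720):
    """Return {view: (x, y, w, h)} pixel rectangles: the highlighted view
    fills the left three quarters of the screen enlarged, the remaining
    views are stacked as smaller tiles on the right."""
    main_w = width * 3 // 4
    rects = {highlighted: (0, 0, main_w, height)}
    others = [v for v in views if v != highlighted]
    tile_h = height // max(len(others), 1)
    for i, view in enumerate(others):
        rects[view] = (main_w, i * tile_h, width - main_w, tile_h)
    return rects
```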
  • the view-management means are configured to automatically select the single or highlighted camera-view as follows:
  • the view-management means may be configured to evaluate (or weigh) which of the situations/aspects given under lit. a. - d is most relevant at a certain point in time. According to the evaluation (weighing) it is then decided which of the camera-views is selected as single or highlighted camera-view.
  • the case of lit c. is directed to a situation where an object is present within a predefined first distance (or range) around the vehicle.
  • the object might be a pedestrian.
  • the system may comprise means for determining the distance between the object and the vehicle. Also, the system may comprise means for determining if the vehicle is moving toward the object (e.g. the distance between object and vehicle decreases).
  • Said means may be one of the cameras as such or additional means (distance measurement means) installed at the vehicle. If both criteria are met, a camera-view directed to the object is selected (shown as single camera-view or highlighted with respect to other camera-views). Said camera-view may be called "static object view".
  • Said predefined first distance may automatically be determined or may be continuously adapted to a situational context (e.g. the traffic situation) of the vehicle.
  • the single or highlighted camera-view may be switched when the object leaves a field of view of a first camera and enters a field of view of a second camera.
  • a moving object might enter different fields of view of different cameras.
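The switch between cameras when a moving object crosses field-of-view boundaries can be sketched as a simple bearing test; the overlapping 100-degree intervals and all names are illustrative assumptions.

```python
# Each camera's horizontal field of view as a bearing interval in degrees,
# measured clockwise from the vehicle's forward axis (illustrative values).
FIELDS_OF_VIEW = {
    "front": (310.0, 50.0),   # wraps around 0 degrees
    "right": (40.0, 140.0),
    "rear": (130.0, 230.0),
    "left": (220.0, 320.0),
}

def _contains(camera, bearing):
    lo, hi = FIELDS_OF_VIEW[camera]
    return lo <= bearing <= hi if lo <= hi else (bearing >= lo or bearing <= hi)

def camera_for_bearing(bearing_deg, current="front"):
    """Return the camera tracking the object; keep the current camera while
    the object is still inside its field of view to avoid needless switching."""
    b = bearing_deg % 360.0
    if _contains(current, b):
        return current
    for camera in FIELDS_OF_VIEW:
        if _contains(camera, b):
            return camera
    return current
```

Keeping the current camera while the object remains inside its (overlapping) field of view avoids rapid back-and-forth switching near the boundaries.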
  • the live-video may be provided with a bounding box to indicate which moving (dynamic) object is actually tracked.
  • the bounding box may be bound to the moving object and may be provided as overlay of the live-video.
  • the view-management means are configured to select the bird's eye view as single or highlighted camera-view in case that multiple movements of the vehicle with an anticipated length of movement below said given threshold value are expected.
  • a bird's eye view does not require fast changes of camera-views; rather, the autonomous driving or parking operation (including multiple changes in the direction of movement) may be observed from a position above the vehicle.
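This selection rule can be sketched as follows; the threshold value and the sign convention (positive lengths forward, negative reverse) are illustrative assumptions.

```python
def should_select_birds_eye(planned_segments, threshold_m=3.0):
    """Select the bird's eye view when multiple short movements including
    direction changes are expected (e.g. a multi-stroke parking manoeuvre).

    planned_segments: signed segment lengths in metres, positive = forward,
    negative = reverse (the sign convention is an illustrative assumption).
    """
    short_moves = [s for s in planned_segments if abs(s) < threshold_m]
    direction_changes = sum(
        1 for a, b in zip(planned_segments, planned_segments[1:]) if a * b < 0
    )
    return len(short_moves) >= 2 and direction_changes >= 1
```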
  • an anticipated final position of the vehicle and the intended path may be projected on top of the camera-view.
  • Such a camera-view may be called "parking view". So one further aspect of the invention enables that an anticipated movement path or end position of the vehicle in the autonomous driving or parking operation is projected into the single or highlighted camera-view.
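Projecting an anticipated end position or path point into a camera-view amounts to mapping a 3-D point into image pixels. A minimal sketch using an ideal pinhole model, with purely illustrative intrinsic parameters:

```python
def project_point(x, y, z, fx=800.0, fy=800.0, cx=640.0, cy=360.0):
    """Project a 3-D point given in camera coordinates (x right, y down,
    z forward, metres) to pixel coordinates (u, v) of the camera-view.
    fx, fy, cx, cy are assumed pinhole intrinsics (focal lengths and
    principal point in pixels)."""
    if z <= 0.0:
        return None  # point lies behind the camera and cannot be drawn
    return (fx * x / z + cx, fy * y / z + cy)
```

Each point of the anticipated path would be transformed into the camera's coordinate frame first and then projected this way to build the overlay.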
  • the view-management means are configured to automatically select the single or highlighted camera-view in predefined situations of an autonomous driving or parking operation according to predefined selection criteria, wherein the predefined situations and predefined selection criteria are as follows:
  • a parking operation might often be better observed from a bird's eye view.
  • an automated switching through accessible or predefined camera-views (e.g. front, rear, left, right) may also be performed.
  • the system may consider path planning data (e.g. based on GPS data) or data referring to the local environment of the vehicle.
  • Path planning data may also refer to a local map.
  • Such data may be provided from an external server to the vehicle or the portable electronic device, so that the view-management means may consider said data.
  • a section of the live-video (of a certain camera-view) where the vehicle is assumed to get very close to a certain object during the driving or parking operation may be marked (e.g. with a bounding box).
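Deciding whether such a marking is needed can be sketched as a closest-approach test between the planned path and the object position; function names and the margin are illustrative assumptions.

```python
import math

def path_comes_close(path, obj, margin_m=0.5):
    """Return True if the planned path passes within margin_m of the object;
    the corresponding section of the live-video could then be marked,
    e.g. with a bounding box overlay."""
    ox, oy = obj
    return min(math.hypot(x - ox, y - oy) for x, y in path) < margin_m
```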
  • the view-management means are configured to automatically select the camera-view shown to the user based on a routine, optionally a routine based on artificial intelligence.
  • the view-management means are configured to show additional information to a user by displaying said information in the live-video corresponding to a selected camera-view shown to the user on the portable electronic device, wherein said information is preferably displayed as video-overlay(s).
  • Said additional information may also be displayed by boxes or illustrative means affixed to objects or positions present in the live-video.
  • the information may relate to anticipated movement paths, vehicle data, data referring to the environment (e.g. an outdoor temperature), traffic signs etc.
  • the system may be configured to include said overlays in the live-video(s) displayed to a user on a screen of the portable electronic device.
  • a function may be implemented in the system where the user may choose to shut off the automated view-selection and to select a camera-view manually. This feature may be implemented in the App operated on the portable electronic device. The user may also choose a hybrid mode where some camera-views may be fixed (as selected by the user) and other views change automatically according to the situational context.
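The three selection modes described here (automatic, manual, hybrid) can be sketched as a small resolution function; all names and the `critical` flag are illustrative assumptions.

```python
def resolve_view(mode, user_fixed, auto_suggestion, critical=False):
    """Resolve the displayed view for the sketched selection modes:
    'auto'   - always follow the automated view-selection;
    'manual' - always show the view fixed by the user;
    'hybrid' - show the user's fixed view, but let the automated selection
               override it when a more relevant (critical) view appears,
               or when no view was fixed at all."""
    if mode == "manual":
        return user_fixed
    if mode == "hybrid" and user_fixed is not None and not critical:
        return user_fixed
    return auto_suggestion
```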
  • One or more camera-views of which the corresponding live-video(s) is/are shown to a user on the portable electronic device in a single camera-view or as highlighted camera-view besides other views is/are automatically selected by the view-management means.
  • the selection may refer to a selection (and display) of a single camera-view or to a selection of a highlighted camera-view (a selected camera-view is displayed enlarged with respect to other camera-views).
  • the automated selection may be based on the same criteria or situations as described before.
  • The system may comprise dedicated units or means for performing any of the method steps described above.
  • the vehicle 1 (e.g. a car) has a front F, a back B as well as a left side L and right side R.
  • a number of cameras 2L, 2R, 2F, 2B are installed at different positions of the vehicle.
  • a camera 2L is installed at the left side L of the vehicle 1
  • a camera 2R is installed at the right side R of the vehicle 1
  • a camera 2F is installed at the front F of the vehicle 1
  • a camera 2B is installed at the back side B of the vehicle 1.
  • the back B may synonymously be expressed as "rear" side of the vehicle 1.
  • the positions of the cameras 2L, 2R, 2F, 2B were only chosen for illustrative purposes.
  • Each of the cameras 2L, 2R, 2F, 2B is configured to capture live-videos of the driving or parking operation from a camera-view corresponding to the position of the camera 2L, 2R, 2F, 2B.
  • the corresponding camera-views are indicated with fields of view 21, 22, 23 and 24, wherein the field-of-view 21 refers to camera 2L, field-of-view 22 refers to camera 2R, field-of-view 23 refers to camera 2F and field-of-view 24 refers to camera 2B.
  • the vehicle comprises a first communication unit 11 which may be part of an on-board computer of the vehicle 1.
  • the cameras 2L, 2R, 2F, 2B are in signal connection with the first communication unit 11.
  • the system according to the invention comprises a number of cameras 2L, 2R, 2F, 2B installed at different positions of the vehicle 1, each of the cameras 2L, 2R, 2F, 2B configured to capture live-videos of the driving or parking operation from a camera-view corresponding to the position of the camera 2L, 2R, 2F, 2B, wherein the cameras 2L, 2R, 2F, 2B are in signal connection (not shown) with a first communication unit 11 being installed in the vehicle 1.
  • the system further comprises a portable electronic device 30 comprising a second communication unit 12.
  • the portable electronic device 30 comprises a display 13.
  • the portable electronic device 30 is used by user 5, wherein the user 5 is located external to the vehicle 1.
  • the first communication unit 11 is configured to transmit the captured live-videos to the second communication unit 12 via a wireless signal connection 15, wherein the second communication unit 12 is configured to receive the transmitted live-videos, and wherein the portable electronic device 30 is configured to show the live-videos to the user 5 of the portable electronic device 30 in a live mode during the driving or parking operation.
  • the system further comprises view-management means (not shown) configured to automatically select a camera-view of which the corresponding live-video is shown to the user 5 on the portable electronic device 30.
  • the view-management means may comprise hardware and software components, both being part of the portable electronic device 30 or the vehicle 1. It is also possible that hardware and software components of the vehicle 1 and the portable electronic device 30 together provide the view-management means and are configured to interact with each other.
  • Hardware components may be understood as a computing unit. Besides the possibility of automatically selecting a camera-view by the view-management means, the latter may be configured such that a user 5 can manually switch between different camera-views.
  • the view-management means may comprise an algorithm (the algorithm may be based on artificial intelligence) that may be operated in a dedicated software (environment), the software being installed on one or both of said computing units. The automated selection of the camera-view of which the corresponding live-video is shown to the user 5 is performed by said algorithm.
  • the view-management means are configured to select a camera-view of which the corresponding live-video is shown to the user 5 on the portable electronic device 30 in a single camera-view 100 or as highlighted camera-view 101 besides other views 102.
  • In a single camera-view 100, only a single live-video is displayed on the display 13 of the portable electronic device 30.
  • In a highlighted camera-view 101, a live-video of a certain camera-view is displayed enlarged when compared to the live-videos of other views 102 (shown smaller).
  • figs. 5a - d illustrate different buttons (provided in an App operated on the portable electronic device 30) which a user 5 of the portable electronic device 30 may activate/deactivate, wherein the buttons are related to different selection options referring to the selection of a camera-view of which a live-video is shown to the user 5.
  • the buttons may be shown in a touch-sensitive manner on the display 13 of the portable electronic device 30. From the right to the left, the buttons illustrated in figs. 5a - d refer to a right view, a left view, a rear view, a front view and a bird's eye view of the vehicle 1. Said buttons may also be displayed on an on-board display of the vehicle 1, so that the user 5 of the vehicle may pre-select a certain selection procedure before leaving the vehicle 1.
  • Fig. 5a refers to an activated button (the left button is activated) referring to an automated (auto) camera selection.
  • the automated camera selection may be selected as default.
  • Figure 5b refers to a hybrid mode of camera selection (second button from the left is activated). However, by selecting the hybrid mode only, without selecting a further camera view, the system performs an automated camera selection as shown in fig. 5a.
  • Figure 5c again refers to an activated hybrid mode of camera selection, but the right view is also activated. In such a case the activated view (the right view in this case) is displayed as single or highlighted view 100, 101 to the user 5 on the portable electronic device 30 until the view-management means decide that there is a more relevant (or critical) view that should be displayed to the user (e.g.
  • Figure 5d refers to a selection of the left view without the buttons of the automated selection or hybrid selection being activated. In such a case only the selected view is displayed to the user 5 on the portable electronic device.


Claims (12)

  1. System zum Überwachen eines autonomen Fahr- oder Parkvorgangs eines Fahrzeugs (1), das System umfassend
    - eine Anzahl von Kameras (2L, 2R, 2F, 2B), die konfiguriert sind, um an verschiedenen Positionen des Fahrzeugs (1) installiert zu sein, wobei jede der Kameras (2L, 2R, 2F, 2B) konfiguriert ist, Live-Videos des Fahr- oder Parkvorgangs von einer Kameraansicht korrespondierend mit der Position der Kamera (2L, 2R, 2F, 2B) zu erfassen, wobei die Kameras (2L, 2R, 2F, 2B) in Signalverbindung mit einer ersten Kommunikationseinheit (11) sind, die konfiguriert ist, um in dem Fahrzeug (1) installiert zu sein;
    - eine tragbare elektronische Vorrichtung (30), umfassend eine zweite Kommunikationseinheit (12);
    wobei die erste Kommunikationseinheit (11) konfiguriert ist, die erfassten Live-Videos an die zweite Kommunikationseinheit (12) über eine drahtlose Signalverbindung (15) zu übertragen, wobei die zweite Kommunikationseinheit (12) konfiguriert ist, die übertragenen Live-Videos zu empfangen, und wobei die tragbare elektronische Vorrichtung (30) konfiguriert ist, einem Benutzer (5) der tragbaren elektronischen Vorrichtung (30) die Live-Videos in einem Live-Modus während des Fahr- oder Parkvorgangs zu zeigen,
    wobei das System Ansichtsverwaltungsmittel umfasst, die konfiguriert sind zum automatischen Auswählen einer Kameraansicht, dessen korrespondierendes Live-Video dem Benutzer (5) auf der tragbaren elektronischen Vorrichtung (30) in einer einzelnen Kameraansicht (100) oder als hervorgehobene Kameraansicht (101) neben anderen Ansichten (102) gezeigt wird, wobei in einer hervorgehobenen Kameraansicht (101) ein Live-Video der ausgewählten Kameraansicht im Vergleich mit den Live-Videos von anderen Kameraansichten vergrößert angezeigt wird,
    wobei die Kameras (2L, 2R, 2F, 2B) konfiguriert sind, an Positionen des Fahrzeugs (1) installiert zu sein, um die folgenden Kameraansichten bereitzustellen: eine Vorderansicht des Fahrzeugs (1), eine Rückansicht des Fahrzeugs (1), eine linke Ansicht des Fahrzeugs (1) und eine rechte Ansicht des Fahrzeugs (1),
    wobei die Ansichtsverwaltungsmittel konfiguriert sind zum Auswählen einer Ansicht aus der Vogelperspektive als die einzelne (100) oder hervorgehobene (101) Kameraansicht im Fall, dass mehrere Bewegungen des Fahrzeugs (1) mit einer antizipierten Länge jeder der mehreren Bewegungen unter einem gegebenen Schwellenwert erwartet werden,
    wobei die mehreren Bewegungen mehrere Änderungen der Bewegungsrichtung enthalten.
  2. System nach Anspruch 1, wobei ein Live-Video in Bezug auf die Ansicht aus der Vogelperspektive des Fahrzeugs (1) basierend auf den durch die an dem Fahrzeug (1) installierten Kameras (2L, 2R, 2F, 2B) bereitgestellten Live-Videos und/oder Positionsdaten des Fahrzeugs (1) erlangt wird.
  3. System nach Anspruch 1 oder 2, wobei die Ansichtsverwaltungsmittel konfiguriert sind zum Auswählen einer oder mehrerer Kameraansichten, dessen korrespondierende(s) Live-Video(s) dem Benutzer (5) auf der tragbaren elektronischen Vorrichtung (30) gezeigt wird/werden.
  4. System according to claim 1,
    wherein the view management means are configured to automatically select the single or highlighted camera view (100, 101) as follows:
    a. in case the vehicle (1) moves straight ahead or straight back: selecting a front view or a rear view as the single (100) or highlighted (101) camera view;
    b. in case the vehicle (1) changes its direction of movement: selecting a camera view pointing in the changed direction of movement as the single (100) or highlighted (101) camera view;
    c. in case an object is monitored within a predefined first distance from the vehicle (1) and the vehicle (1) moves toward the object: selecting a camera view pointing toward the object as the single (100) or highlighted (101) camera view;
    d. in case an object is monitored within a predefined second distance from the vehicle (1) and the object moves toward the vehicle (1): selecting a camera view pointing toward the object as the single (100) or highlighted (101) camera view.
  5. System according to claim 4, wherein the second predefined distance is larger than the first predefined distance.
  6. System according to claim 4, wherein in case d. the single (100) or highlighted (101) camera view is switched when the moving object leaves a field of view (21, 22, 23, 24) of a first camera (2L, 2R, 2F, 2B) and enters a field of view (21, 22, 23, 24) of a second camera (2L, 2R, 2F, 2B).
  7. System according to claim 1, wherein the view management means are configured to take an anticipated movement length of the autonomous driving or parking operation into account for automatically selecting the single (100) or highlighted (101) camera view shown to the user (5), wherein in case an anticipated movement length is below the given threshold length, the selected camera view(s) shown to the user (5) is/are fixed.
  8. System according to claim 1,
    wherein an anticipated movement path or an end position of the vehicle (1) in the autonomous driving or parking operation is projected into the single (100) or highlighted (101) camera view.
  9. System according to claim 1,
    wherein the view management means are configured to automatically select the single (100) or highlighted (101) camera view in predefined situations of an autonomous driving or parking operation according to predefined selection criteria, the predefined situations and predefined selection criteria being as follows:
    a. selecting a bird's-eye view in case a parking space for performing the parking operation of the vehicle (1) has been identified;
    b. selecting a number of camera views and switching through the number of camera views at the beginning or end of the autonomous driving or parking operation.
  10. System according to any one of the preceding claims, wherein the view management means are configured to automatically select the camera view shown to the user (5) based on a routine, optionally an artificial-intelligence-based routine.
  11. System according to any one of the preceding claims, wherein the view management means are configured to show a user (5) additional information by displaying the information in the live video corresponding to a selected camera view shown to the user (5) on the portable electronic device (30), the information preferably being displayed as a video overlay.
  12. Method for monitoring an autonomous driving or parking operation of a vehicle with a system according to any one of claims 1-11, comprising the following steps:
    - capturing a live video of the driving or parking operation;
    - transmitting the captured live video to the portable electronic device (30) of the system;
    - showing the live video to a user (5) of the portable electronic device (30) in a live mode during the driving or parking operation,
    wherein a camera view whose corresponding live video is shown to a user (5) on the portable electronic device (30) is automatically selected by the view management means of the system,
    wherein the view management means automatically select a camera view whose corresponding live video is shown to the user (5) on the portable electronic device (30) in a single camera view (100) or as a highlighted camera view (101) alongside other views (102), wherein in a highlighted camera view (101) a live video of the selected camera view is displayed enlarged compared with the live videos of other camera views,
    wherein the view management means select the bird's-eye view as the single (100) or highlighted (101) camera view in case several movements of the vehicle (1) are expected, each with an anticipated length below a given threshold, the several movements including several changes of the direction of movement.
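The view-selection rules recited in claims 1 and 4 can be illustrated as ordinary code. The following Python sketch is illustrative only: the data structures (`Move`, `TrackedObject`), the rule ordering, and the distance and length thresholds are assumptions of this sketch, not values or an implementation prescribed by the patent.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical view identifiers; the claims only name the views, not their encoding.
BIRDS_EYE = "birds_eye"

@dataclass
class Move:
    direction: str   # "front", "rear", "left" or "right"
    length_m: float  # anticipated length of this movement in metres

@dataclass
class TrackedObject:
    distance_m: float          # current distance from the vehicle
    bearing: str               # camera view facing the object
    approaching_vehicle: bool  # True if the object moves toward the vehicle

def select_view(planned_moves: List[Move],
                current_move: Optional[Move],
                objects: List[TrackedObject],
                vehicle_moving_toward: Optional[str],
                first_distance_m: float = 2.0,    # assumed value
                second_distance_m: float = 5.0,   # claim 5: second > first
                length_threshold_m: float = 3.0) -> str:
    # Claim 1: several short movements including direction changes -> bird's-eye view.
    if (len(planned_moves) > 1
            and all(m.length_m < length_threshold_m for m in planned_moves)
            and len({m.direction for m in planned_moves}) > 1):
        return BIRDS_EYE

    # Claim 4c: object within the first distance while the vehicle moves toward it.
    for obj in objects:
        if obj.distance_m < first_distance_m and vehicle_moving_toward == obj.bearing:
            return obj.bearing

    # Claim 4d: object within the (larger) second distance moving toward the vehicle.
    for obj in objects:
        if obj.distance_m < second_distance_m and obj.approaching_vehicle:
            return obj.bearing

    # Claims 4a/4b: otherwise follow the current (possibly changed) direction of travel.
    if current_move is not None:
        return current_move.direction

    return BIRDS_EYE  # fallback when nothing is moving
```

For example, two planned 1.5 m shunts in opposite directions would select the bird's-eye view, while a single long forward movement would select the front view.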
EP21185811.3A 2021-07-15 2021-07-15 System and method for monitoring an autonomous driving or parking operation Active EP4120218B1 (de)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21185811.3A EP4120218B1 (de) 2021-07-15 2021-07-15 System and method for monitoring an autonomous driving or parking operation
CN202210809865.7A CN115701091A (zh) 2022-07-11 System and method for monitoring an autonomous driving or parking operation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP21185811.3A EP4120218B1 (de) 2021-07-15 2021-07-15 System and method for monitoring an autonomous driving or parking operation

Publications (2)

Publication Number Publication Date
EP4120218A1 EP4120218A1 (de) 2023-01-18
EP4120218B1 true EP4120218B1 (de) 2024-12-04

Family

ID=77071240

Family Applications (1)

Application Number Title Priority Date Filing Date
EP21185811.3A Active EP4120218B1 (de) 2021-07-15 2021-07-15 System und verfahren zur überwachung eines autonomen fahr- oder einparkvorgangs

Country Status (2)

Country Link
EP (1) EP4120218B1 (de)
CN (1) CN115701091A (de)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102024114461A1 (de) 2024-05-23 2025-11-27 Method for operating a vehicle, and vehicle

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2885161B1 (de) * 2012-08-16 2020-07-15 Klear-View Camera, LLC System and method for providing forward-facing visual information to a vehicle driver

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102462502B1 (ko) * 2016-08-16 2022-11-02 Samsung Electronics Co., Ltd. Stereo-camera-based autonomous driving method and apparatus
IT201900012813A1 (it) * 2019-07-24 2021-01-24 Ambarella Int Lp Switchable display during parking maneuvers

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2885161B1 (de) * 2012-08-16 2020-07-15 Klear-View Camera, LLC System and method for providing forward-facing visual information to a vehicle driver

Also Published As

Publication number Publication date
CN115701091A (zh) 2023-02-07
EP4120218A1 (de) 2023-01-18

Similar Documents

Publication Publication Date Title
US20230418307A1 (en) Autonomous Vehicle Collision Mitigation Systems and Methods
EP3272586B1 (de) Commercial vehicle
CN105593641B (zh) Method and device for augmenting a display
CN108140311B (zh) Display method for parking assistance information and parking assistance device
US11486726B2 (en) Overlaying additional information on a display unit
US10692372B2 (en) Appartus and method for road vehicle driver assistance
US20200307616A1 (en) Methods and systems for driver assistance
US20150302259A1 (en) Driving assistance device and image processing program
US10902273B2 (en) Vehicle human machine interface in response to strained eye detection
EP3271207B1 (de) Method for operating a communication device for a motor vehicle during an autonomous driving mode, communication device, and motor vehicle
EP3879857B1 (de) Parking information management server, parking assistance device and parking assistance system
US10488658B2 (en) Dynamic information system capable of providing reference information according to driving scenarios in real time
KR20200043252A (ko) System and method for generating a bird's-eye view image of a vehicle
KR20220156687A (ko) Autonomous parking method for a vehicle and vehicle system performing the same
CN115941883A (zh) Driving image display method and driving image display system
EP4120218B1 (de) System and method for monitoring an autonomous driving or parking operation
CN111746789B (zh) Imaging system, server, control method, and storage medium storing a program
US20230166755A1 (en) Vehicle display control device, vehicle display control system, and vehicle display control method
US20210107515A1 (en) Systems and methods for visualizing a route of a vehicle
US12085933B2 (en) Autonomous vehicle, control system for remotely controlling the same, and method thereof
WO2021132553A1 (ja) Navigation device, control method for navigation device, and control program for navigation device
JP6429069B2 (ja) Driving assistance information display system
JP2022041245A (ja) Vehicle display control device, method, program, and vehicle display system
JP6997006B2 (ja) In-vehicle device, server, and information system
US20250191413A1 (en) Driving information display apparatus and method for providing information related to electric vehicle charging station

Legal Events

PUAI: Public reference made under article 153(3) EPC to a published international application that has entered the European phase (ORIGINAL CODE: 0009012)
STAA: Status: The application has been published
AK: Designated contracting states (kind code of ref document: A1): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
STAA: Status: Request for examination was made
17P: Request for examination filed (effective date: 20230718)
RBV: Designated contracting states (corrected): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
STAA: Status: Examination is in progress
17Q: First examination report despatched (effective date: 20231107)
GRAP: Despatch of communication of intention to grant a patent (ORIGINAL CODE: EPIDOSNIGR1)
STAA: Status: Grant of patent is intended
INTG: Intention to grant announced (effective date: 20240829)
RIN1: Information on inventor provided before grant (corrected); inventors: GOLGIRI, HAMID M.; PAK, TONY; MA, CHENHAO; BENMIMOUN, AHMED
GRAS: Grant fee paid (ORIGINAL CODE: EPIDOSNIGR3)
GRAA: (Expected) grant (ORIGINAL CODE: 0009210)
STAA: Status: The patent has been granted
AK: Designated contracting states (kind code of ref document: B1): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
REG: Reference to a national code: CH, legal event code EP
REG: Reference to a national code: DE, legal event code R096, ref document number 602021022724
REG: Reference to a national code: IE, legal event code FG4D
REG: Reference to a national code: LT, legal event code MG9D
REG: Reference to a national code: NL, legal event code MP (effective date: 20241204)
PG25: Lapsed in a contracting state [announced via postgrant information from national office to EPO], in each case because of failure to submit a translation of the description or to pay the fee within the prescribed time limit: HR, FI, BG, ES (effective 20241204); NO (effective 20250304); LV (effective 20241204); GR (effective 20250305); RS (effective 20250304); NL (effective 20241204)
REG: Reference to a national code: AT, legal event code MK05, ref document number 1749009, kind code T (effective date: 20241204)
PG25: Lapsed in a contracting state (same ground as above): SM, PL (effective 20241204)
PGFP: Annual fee paid to national office: GB (payment date: 20250612, year of fee payment: 5)
PG25: Lapsed in a contracting state (same ground as above): IS, PT (effective 20250404); EE (effective 20241204)
PGFP: Annual fee paid to national office: FR (payment date: 20250612, year of fee payment: 5)
PG25: Lapsed in a contracting state (same ground as above): RO, AT, SK, CZ, IT (effective 20241204)
REG: Reference to a national code: DE, legal event code R097, ref document number 602021022724
PG25: Lapsed in a contracting state (same ground as above): SE, DK (effective 20241204)
PGFP: Annual fee paid to national office: DE (payment date: 20250616, year of fee payment: 5)
PLBE: No opposition filed within time limit (ORIGINAL CODE: 0009261)
STAA: Status: No opposition filed within time limit
26N: No opposition filed (effective date: 20250905)