EP4120218A1 - System and method for monitoring an autonomous driving or parking operation - Google Patents

System and method for monitoring an autonomous driving or parking operation

Info

Publication number
EP4120218A1
EP4120218A1 (application EP21185811.3A)
Authority
EP
European Patent Office
Prior art keywords
view
camera
vehicle
live
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP21185811.3A
Other languages
English (en)
French (fr)
Other versions
EP4120218B1 (de)
Inventor
Ahmed Benmimoun
Chenhao Ma
Tony Pak
Hamid M. Golgiri
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ford Global Technologies LLC
Original Assignee
Ford Global Technologies LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ford Global Technologies LLC filed Critical Ford Global Technologies LLC
Priority to EP21185811.3A priority Critical patent/EP4120218B1/de
Priority to CN202210809865.7A priority patent/CN115701091A/zh
Publication of EP4120218A1 publication Critical patent/EP4120218A1/de
Application granted granted Critical
Publication of EP4120218B1 publication Critical patent/EP4120218B1/de
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G — PHYSICS
    • G08 — SIGNALLING
    • G08G — TRAFFIC CONTROL SYSTEMS
    • G08G1/00 — Traffic control systems for road vehicles
    • G08G1/09 — Arrangements for giving variable traffic instructions
    • G08G1/0962 — Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/16 — Anti-collision systems
    • G08G1/165 — Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • G08G1/166 — Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • G08G1/168 — Driving aids for parking, e.g. acoustic or visual feedback on parking space

Definitions

  • the present invention first is directed to a system for monitoring an autonomous driving or parking operation of a vehicle. Secondly, the invention is directed to a method for monitoring an autonomous driving or parking operation of a vehicle with the presently disclosed system.
  • the invention relates to the field of known advanced driver-assistance systems (ADAS) as used for autonomously driving a vehicle.
  • Such advanced driver-assistance systems in vehicles may include Valet Parking Assistance (VaPA) to provide fully automated steering and manoeuvring when parking, for example within a car park or parking structure.
  • Such systems use automated vehicle controls such as GPS (Global Positioning System) or on-board sensors along with camera, lidar, radar proximity and ultrasonic sensors, to navigate, identify valid parking slots, and park the vehicle ("drop-off" manoeuvre).
  • the vehicle is also able to autonomously drive the parked vehicle from a parking slot to a specified pickup location ("summon" manoeuvre) upon request by the user. Within a summon manoeuvre the vehicle drives along a specified route or distance.
  • the summon manoeuvre is an operation during which the vehicle drives (driving operation).
  • the term "driving operation" is used synonymously with the term "summon manoeuvre", and the term "parking operation" is used synonymously with the term "drop-off manoeuvre".
  • This digital map of the area could be very simple and consist only of a description of the drivable sections, or more complex such as high-definition maps with additional attributes such as signs, lane widths and the like.
  • the ADAS or VaPA has to consider an actual traffic situation in the area of use, for example the car park or parking structure. Said digital map and said actual traffic situation might permanently be updated by using dedicated databases being connected with the ADAS or the vehicle.
  • said digital map and said actual traffic situation might be updated by GPS data or the use of on-board sensors along with camera, lidar, radar proximity and ultrasonic sensors. Also, data relating to said digital map or said actual traffic situation which might be tracked and shared by other traffic participants might be used for such an update.
  • When using ADAS or VaPA for the first time, a user might not be familiar with the functions of the system. The user might want to learn or check how the system works. One way of doing this is to allow the user to stay inside the vehicle during an automated driving or parking operation (e.g. a "drop-off" manoeuvre or a "summon" manoeuvre). However, at a certain point in time the driving or parking operation will have to be performed without the user being inside the vehicle. For this purpose it would be beneficial if the user could monitor the driving or parking operation and the respective vehicle behaviour from outside the vehicle, in particular from a position where the vehicle is out of sight of the user.
  • a system for monitoring an autonomous driving or parking operation of a vehicle comprises a number of cameras installed at different positions of the vehicle, each of the cameras configured to capture live-videos of the driving or parking operation from a camera-view corresponding to the position of the camera, wherein the cameras are in signal connection with a first communication unit being installed in the vehicle.
  • the system further comprises a portable electronic device comprising a second communication unit.
  • the first communication unit is configured to transmit the captured live-videos to the second communication unit via a wireless signal connection, wherein the second communication unit is configured to receive the transmitted live-videos.
  • the portable electronic device is configured to show the live-videos to the user of the portable electronic device in a live mode during the driving or parking operation.
  • the system comprises view-management means configured to automatically select a camera-view of which the corresponding live-video is shown to a user on the portable electronic device.
  • the portable electronic device may be any portable computer, e.g. a laptop, notebook, tablet computer, telephone, smartphone or the like.
  • a stationary computer could be employed instead of a portable electronic device as the basic idea of the present invention relates to remotely observing a driving or parking operation of the vehicle.
  • the system allows a user of a portable electronic device (the second communication unit of which is wirelessly connected to a first communication unit of the vehicle) to visually observe an autonomous driving or parking operation of the vehicle in a live-mode or live-video (during the autonomous driving or parking operation) from a position external to the vehicle.
  • the cameras installed at different positions of the vehicle may be installed outside or inside the vehicle.
  • Each camera may include one or more lenses.
  • each camera may be operated by a microcontroller, the microcontroller being connected with a main control unit of the vehicle.
  • the mentioned expression of "capturing” videos may be understood in terms of "displaying" moving images (time-resolved image-sequences captured by a camera) to a user.
  • capturing may be understood in terms of recording (and storing) said moving images (time-resolved image-sequences captured by a camera).
  • Data corresponding to said moving images may be stored temporarily or for longer terms.
  • Said data may also be transmitted to an external server or database (e.g. a cloud).
  • the signal connections of the cameras and the first communication unit may be based on cable(s) or may be a wireless signal connection.
  • the first communication unit may be part of a main control unit of the vehicle.
  • the wireless signal connection between the first and second communication unit may be based on digital signal (or data) transmission.
  • Said wireless signal connection may exemplarily be based on Bluetooth, WLAN, ZigBee, NFC, Wibree, WiMAX, IrDA, FSO, LiFi.
  • Said wireless signal connection may also be based on mobile internet connections of the first and second communication units, e.g. mobile internet connections based on 2G, 3G, 4G, 5G or any other known or future standard for mobile internet connections.
  • the wireless connection between said first and second communication unit may be a direct connection (including a direct signal and data transfer) between both units, or may be an indirect connection including one or more intermediate transmission units or servers.
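As an illustration of such a direct connection, the live-video transfer between the first and second communication units can be sketched as a length-prefixed frame stream over a plain TCP-style socket. This is only a minimal sketch under assumed conventions (JPEG-encoded frames, a 4-byte big-endian length prefix); the function names are hypothetical and not part of the disclosure:

```python
import socket
import struct

def send_frame(sock: socket.socket, jpeg_bytes: bytes) -> None:
    """Length-prefix each JPEG frame so the receiver can split the stream."""
    sock.sendall(struct.pack(">I", len(jpeg_bytes)) + jpeg_bytes)

def recv_frame(sock: socket.socket) -> bytes:
    """Read one length-prefixed frame from the stream."""
    header = _recv_exact(sock, 4)
    (length,) = struct.unpack(">I", header)
    return _recv_exact(sock, length)

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    """recv() may return partial data; loop until n bytes arrived."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed mid-frame")
        buf += chunk
    return buf
```

In practice the transfer would run over one of the mentioned wireless standards or a mobile internet connection, possibly through intermediate servers, rather than a raw local socket.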
  • the first and second communication units can be understood as communication interfaces, each comprising dedicated means (e.g. antennas) for receiving and transmitting signals and data.
  • a suitable application software (abbrev.: App) may be installed on the portable electronic device to operate a visualization of the data transmitted to the portable electronic device via the wireless connection of the first and second communication unit.
  • the App may be configured to overlay or display specific features/information directly in the video or next to the video.
  • the portable electronic device is configured to show said live-videos to a user of the portable electronic device in a live mode during the driving or parking operation.
  • This enables a user to observe an actual autonomous driving or parking operation via his smartphone, although the user may be located out of sight of the vehicle. The user may thus observe the vehicle behaviour and its vicinity in real time and on demand.
  • the view-management means may comprise hardware and software components, both being part of the portable electronic device or the vehicle. It is also possible that hardware and software components of the vehicle and the portable electronic device define the view-management means and are configured to interact with each other. Hardware components may be understood as a computing unit. Besides the possibility of automatically selecting a camera-view by the view-management means, the latter may be configured such that a user can manually switch between different camera-views.
  • the view-management means may comprise an algorithm (the algorithm may be based on artificial intelligence) that may be operated in a dedicated software (environment), the software being installed on one or both of said computing units. The automated selection of the camera-view of which the corresponding live-video is shown to the user is performed by said algorithm.
  • the view-management means provide a situation-based view management system. Based on the situational context, a camera-view may be changed automatically to show the most interesting/relevant camera-view to the user.
  • the algorithm may consider different criteria when calculating which camera-view (of which camera) is to be shown to the user. Said criteria may relate to the autonomous driving or parking operation as such, to the vicinity of the vehicle (e.g. the traffic situation, traffic participants) or to the needs of the user.
  • the cameras are installed at positions of the vehicle to provide the following camera-views: a front view of the vehicle, a rear view of the vehicle, a left view of the vehicle and a right view of the vehicle.
  • a single camera or a number of cameras may be provided at the relevant positions of the vehicle (the front, the back, the left, the right of the vehicle).
  • the cameras may be mounted on suitable vehicle components. It is to be noted that the system according to the invention may be implemented in newly fabricated vehicles or via retrofitting.
  • a live-video referring to a bird's eye view of the vehicle can be obtained based on the live videos provided by the cameras installed at the vehicle and/or position data of the vehicle.
  • a bird's eye view refers to a view of the vehicle from above, with a perspective as if the observer were a bird.
  • the live-video in bird's eye view may be calculated (extrapolated) based on video data provided from the front, rear, left, and/or right camera of the vehicle.
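A minimal sketch of such an extrapolated bird's eye view, assuming each camera image has already been warped onto the ground plane (e.g. with precomputed homographies): the four top-down tiles are simply placed around an empty vehicle footprint. The function name and tile layout are illustrative assumptions:

```python
import numpy as np

def compose_birds_eye(front, rear, left, right):
    """Place four already ground-projected camera tiles around an empty
    vehicle footprint to form one top-down mosaic.
    All tiles are H x W grayscale arrays of equal shape."""
    h, w = front.shape
    canvas = np.zeros((3 * h, 3 * w), dtype=front.dtype)
    canvas[0:h,     w:2 * w] = front   # ahead of the vehicle
    canvas[2 * h:,  w:2 * w] = rear    # behind the vehicle
    canvas[h:2 * h, 0:w]     = left
    canvas[h:2 * h, 2 * w:]  = right
    return canvas                      # centre block stays empty (vehicle)
```

A production system would additionally blend the overlapping tile borders and draw a vehicle icon into the empty centre block.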
  • one or more camera(s) may be installed on top of the roof of the vehicle. Said camera (being installed on the roof) may be a 360° camera. It could also be possible to install a drone at the vehicle. In case a bird's eye view would be needed, the drone could rise (fly) to a certain height above the vehicle and provide a bird's eye view.
  • the view-management means are configured to select one or more camera-views of which the corresponding live-video(s) is/are shown to the user on the portable electronic device. It is important to note that the view-management means are not only suitable to select a single camera-view, but also to select multiple camera-views to be shown to a user at the same time. In many driving or parking operations (as well as traffic situations) a parallel observation of several (different) views may be of interest.
  • the live-videos may be shown to the user in a gallery format with multiple videos displayed to the user.
  • the gallery format may include the videos as video-mosaics.
  • the view-management means are configured to select a camera-view of which the corresponding live-video is shown to the user on the portable electronic device in a single camera-view or as highlighted camera-view besides other views.
  • a single camera-view means that only a single live-video (referring to a specific camera-view) is displayed to the user.
  • a highlighted camera-view may be understood as display mode where a live-video referring to a specific camera-view is prominently displayed to a user besides live-videos of other camera-views (which are not highlighted).
  • the live video referring to the highlighted camera-view may be shown enlarged with respect to live-videos of other camera-views shown to the user.
  • the view-management means are configured to automatically select the single or highlighted camera-view as follows:
  • the view-management means may be configured to evaluate (or weigh) which of the situations/aspects given under lit. a. - d is most relevant at a certain point in time. According to the evaluation (weighing) it is then decided which of the camera-views is selected as single or highlighted camera-view.
  • a camera directed to the direction of movement may be defined as the most relevant camera.
  • the camera-view may change to a camera-view of a camera directed in the changed direction of movement.
  • a driving tube view may be overlaid on top of the selected camera view to indicate the direction of movement.
  • the case of lit. c. is directed to a situation where an object is present within a predefined first distance (or range) around the vehicle.
  • the object might be a pedestrian.
  • the system may comprise means for determining the distance between the object and the vehicle. Also, the system may comprise means for determining if the vehicle is moving toward the object (e.g. the distance between object and vehicle decreases).
  • Said means may be one of the cameras as such or additional means (distance measurement means) installed at the vehicle. If both criteria are met, a camera-view directed to the object is selected (shown as single camera-view or highlighted with respect to other camera-views). Said camera-view may be called "static object view".
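The two checks for the "static object view" (object within the predefined first distance, and vehicle moving toward it) could be sketched as follows; the noise margin and the helper names are assumptions for illustration, not part of the disclosure:

```python
def is_approaching(distances, eps=0.05):
    """True if consecutive range measurements to the object decrease
    by more than a noise margin eps (metres) each step."""
    return all(b < a - eps for a, b in zip(distances, distances[1:]))

def needs_static_object_view(distances, first_distance):
    """Select the object-facing camera when the latest range is within
    the predefined first distance and the vehicle approaches the object."""
    return distances[-1] <= first_distance and is_approaching(distances)
```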
  • Said predefined first distance may automatically be determined or may be continuously adapted to a situational context (e.g. the traffic situation) of the vehicle.
  • a camera-view is selected which is directed to the object as single or highlighted camera-view.
  • Said predefined second distance may automatically be determined or may be continuously adapted to a situational context (e.g. the traffic situation) of the vehicle.
  • the single or highlighted camera-view may be switched when the object leaves a field of view of a first camera and enters a field of view of a second camera.
  • a moving object might enter the fields of view of different cameras.
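One way to implement the hand-over between cameras is to track the object's bearing relative to the vehicle and pick the camera whose field of view contains it. The centre bearings and the 120° field of view below are illustrative assumptions:

```python
# Hypothetical centre bearings (degrees, clockwise from the vehicle's
# forward axis) and half-angle of each camera's field of view.
CAMERA_BEARINGS = {"front": 0.0, "right": 90.0, "rear": 180.0, "left": 270.0}
HALF_FOV_DEG = 60.0

def camera_for_bearing(bearing_deg):
    """Return the camera whose field of view contains the object's
    bearing, preferring the camera the object is most centred in."""
    best, best_diff = None, 360.0
    for name, centre in CAMERA_BEARINGS.items():
        # signed smallest angular difference, folded into [-180, 180]
        diff = abs((bearing_deg - centre + 180.0) % 360.0 - 180.0)
        if diff <= HALF_FOV_DEG and diff < best_diff:
            best, best_diff = name, diff
    return best
```

As the tracked object's bearing crosses a field-of-view boundary, the returned camera name changes, which triggers the view switch described above.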
  • the live-video may be provided with a bounding box to indicate which moving (dynamic) object is actually tracked.
  • the bounding box may be bound to the moving object and may be provided as overlay of the live-video.
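Taken together, the situation-based selection described above (an object view outranking the plain direction-of-movement view) might be sketched as a small priority routine. The context keys, camera names and the exact priority ordering are assumptions for illustration only:

```python
def select_view(ctx):
    """Choose which camera-view is shown as single/highlighted view.
    An object close to the vehicle (static object view / dynamic object
    tracking) outranks the default direction-of-movement view."""
    obj = ctx.get("object")  # e.g. {"kind": "static", "range": 1.0, "camera": "left"}
    if obj is not None:
        # dynamic objects use the second distance, static ones the first
        limit = (ctx["second_distance"] if obj["kind"] == "dynamic"
                 else ctx["first_distance"])
        if obj["range"] <= limit and ctx.get("approaching", True):
            return obj["camera"]
    # default: the camera pointing in the direction of movement
    return "rear" if ctx.get("direction") == "reverse" else "front"
```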
  • the view-management means are configured to consider an anticipated length of movement of the autonomous driving or parking operation for automatically selecting the single or highlighted camera-view shown to the user, wherein in case that an anticipated length of movement is below a given threshold length, the selected camera view(s) shown to the user are fixed. Said feature avoids flickering of camera-views due to fast changes of the direction of movement (e.g. within parking operations when the vehicle needs to undergo short moves).
  • the length of movement may be anticipated based on path planning information (e.g. GPS data) or a tracked vehicle behaviour. In case of vehicle movements below a given threshold length, the camera-view should not be changed.
  • the given threshold length may be automatically determined or manually defined by a manufacturer of the vehicle, a user of the vehicle or the like.
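The anti-flicker rule can be stated in a few lines: if the anticipated length of the next movement is below the threshold, the currently selected view is simply held. The threshold value used here is an assumed example:

```python
def next_view(current_view, proposed_view, anticipated_move_m,
              threshold_m=2.0):
    """Hold the current view for short anticipated movements so the
    display does not flicker during back-and-forth parking moves.
    threshold_m is an assumed value; the text leaves it configurable."""
    if anticipated_move_m < threshold_m:
        return current_view
    return proposed_view
```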
  • the view-management means are configured to select the bird's eye view as single or highlighted camera-view in case that multiple movements of the vehicle with an anticipated length of movement below said given threshold length are expected.
  • a bird's eye view does not require fast changes of camera-views; rather, the autonomous driving or parking operation (including multiple changes in the direction of movement) may be observed from a position above the vehicle.
  • an anticipated final position of the vehicle and the intended path may be projected on top of the camera-view.
  • Such a camera-view may be called "parking view”. So one further aspect of the invention enables that an anticipated movement path or end position of the vehicle in the autonomous driving or parking operation is projected into the single or highlighted camera-view.
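Projecting the anticipated path or end position into the selected camera-view amounts to a standard pinhole-camera projection of the planned ground points. The intrinsic matrix K and pose (R, t) below are illustrative assumptions:

```python
import numpy as np

def project_path(points_world, K, R, t):
    """Project 3-D path points (N x 3, world/vehicle frame) into pixel
    coordinates using the pinhole model: x ~ K (R X + t)."""
    pts = (R @ np.asarray(points_world, float).T) + t.reshape(3, 1)
    uvw = K @ pts
    return (uvw[:2] / uvw[2]).T   # N x 2 pixel coordinates
```

The returned pixel coordinates would then be drawn as an overlay (the "parking view" or "driving tube") on top of the live-video.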
  • the view-management means are configured to automatically select the single or highlighted camera-view in predefined situations of an autonomous driving or parking operation according to predefined selection criteria, wherein the predefined situations and predefined selection criteria are as follows:
  • a parking operation might often be better observed from a bird's eye view.
  • an automated switching through accessible or predefined camera-views (e.g. front, rear, left, right) may be performed.
  • the system may consider path planning data (e.g. based on GPS data) or data referring to the local environment of the vehicle.
  • Path planning data may also refer to a local map.
  • Such data may be provided from an external server to the vehicle or the portable electronic device, so that the view-management means may consider said data.
  • a section of the live-video (of a certain camera-view) where the vehicle is assumed to get very close to a certain object during the driving or parking operation may be marked (e.g. with a bounding box).
  • the view-management means are configured to automatically select the camera-view shown to the user based on a routine, optionally a routine based on artificial intelligence.
  • the view management means are configured to show additional information to a user by displaying said information in the live-video corresponding to a selected camera-view shown to the user on the portable electronic device, wherein said information is/are preferably displayed as video-overlay(s).
  • Said additional information may also be displayed by boxes or illustrative means affixed to objects or positions present in the live-video.
  • the information may relate to anticipated movement paths, vehicle data, data referring to the environment (e.g. an outdoor temperature), traffic signs etc.
  • the system may be configured to include said overlays to the live-video(s) displayed to a user on a screen of the portable electronic device.
  • a function may be implemented in the system where the user may choose to shut off the automated view-selection and to select a camera-view manually. This feature may be implemented in the App operated on the portable electronic device. The user may also choose a hybrid mode where some camera-views may be fixed (as selected by the user) and other views change automatically according to the situational context.
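The interplay of automated, manual and hybrid selection might be sketched as follows; the mode names and the rule that user-fixed views take precedence are assumptions based on the description above:

```python
def resolve_views(mode, user_fixed, auto_selected):
    """Combine user choice and automatic selection.
    mode: 'auto'   -> automatic selection only,
          'manual' -> user-fixed views only,
          'hybrid' -> user-fixed views kept, remainder filled automatically."""
    if mode == "manual":
        return list(user_fixed)
    if mode == "auto":
        return list(auto_selected)
    # hybrid: fixed views first, then automatic ones not already fixed
    return list(user_fixed) + [v for v in auto_selected if v not in user_fixed]
```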
  • a method for monitoring an autonomous driving or parking operation of a vehicle with a previously described system comprises (at least) the following steps:
  • a camera-view of which the corresponding live-video is shown to a user on the portable electronic device is automatically selected by view-management means.
  • the selection may refer to a selection (and display) of a single camera-view or to a selection of a highlighted camera-view (a selected camera-view is displayed enlarged with respect to other camera-views).
  • the automated selection may be based on the same criteria or situations as described before.
  • the system may comprise dedicated units or means for performing any of the method steps described above.
  • the vehicle 1 (e.g. a car) has a front F, a back B as well as a left side L and right side R.
  • a number of cameras 2L, 2R, 2F, 2B are installed at different positions of the vehicle.
  • a camera 2L is installed at the left side L of the vehicle 1
  • a camera 2R is installed at the right side R of the vehicle 1
  • a camera 2F is installed at the front F of the vehicle 1
  • a camera 2B is installed at the back side B of the vehicle 1.
  • the back B may synonymously be expressed as "rear" side of the vehicle 1.
  • the positions of the cameras 2L, 2R, 2F, 2B were only chosen for illustrative purposes.
  • Each of the cameras 2L, 2R, 2F, 2B is configured to capture live-videos of the driving or parking operation from a camera-view corresponding to the position of the camera 2L, 2R, 2F, 2B.
  • the corresponding camera-views are indicated with fields-of-view 21, 22, 23 and 24, wherein the field-of-view 21 refers to camera 2L, field-of-view 22 refers to camera 2R, field-of-view 23 refers to camera 2F and field-of-view 24 refers to camera 2B.
  • the vehicle comprises a first communication unit 11 which may be part of a board computer of the vehicle 1.
  • the cameras 2L, 2R, 2F, 2B are in signal connection with the first communication unit 11.
  • the system according to the invention comprises a number of cameras 2L, 2R, 2F, 2B installed at different positions of the vehicle 1, each of the cameras 2L, 2R, 2F, 2B configured to capture live-videos of the driving or parking operation from a camera-view corresponding to the position of the camera 2L, 2R, 2F, 2B, wherein the cameras 2L, 2R, 2F, 2B are in signal connection (not shown) with a first communication unit 11 being installed in the vehicle 1.
  • the system further comprises a portable electronic device 30 comprising a second communication unit 12.
  • the portable electronic device 30 comprises a display 13.
  • the portable electronic device 30 is used by user 5, wherein the user 5 is located external to the vehicle 1.
  • the first communication unit 11 is configured to transmit the captured live-videos to the second communication unit 12 via a wireless signal connection 15, wherein the second communication unit 12 is configured to receive the transmitted live-videos, and wherein the portable electronic device 30 is configured to show the live-videos to the user 5 of the portable electronic device 30 in a live mode during the driving or parking operation.
  • the system further comprises view-management means (not shown) configured to automatically select a camera-view of which the corresponding live-video is shown to the user 5 on the portable electronic device 30.
  • the view-management means may comprise hardware and software components, both being part of the portable electronic device 30 or the vehicle 1. It is also possible that hardware and software components of the vehicle 1 and the portable electronic device 30 together provide the view-management means and are configured to interact with each other.
  • Hardware components may be understood as a computing unit. Besides the possibility of automatically selecting a camera-view by the view-management means, the latter may be configured such that a user 5 can manually switch between different camera-views.
  • the view-management means may comprise an algorithm (the algorithm may be based on artificial intelligence) that may be operated in a dedicated software (environment), the software being installed on one or both of said computing units. The automated selection of the camera-view of which the corresponding live-video is shown to the user 5 is performed by said algorithm.
  • the view-management means are configured to select a camera-view of which the corresponding live-video is shown to the user 5 on the portable electronic device 30 in a single camera-view 100 or as highlighted camera-view 101 besides other views 102.
  • In a single camera-view 100, only a single live-video is displayed on the display 13 of the portable electronic device 30.
  • In a highlighted camera-view 101, a live-video of a certain camera-view may be displayed enlarged compared to the live-videos of other views 102 (shown smaller).
  • Figures 5a-d illustrate different buttons (provided in an App operated on the portable electronic device 30) which a user 5 of the portable electronic device 30 may activate/deactivate, wherein the buttons are related to different selection options referring to the selection of a camera-view of which a live-video is shown to the user 5.
  • the buttons may be shown in a touch-sensitive manner on the display 13 of the portable electronic device 30. From the right to the left of the buttons illustrated in figs. 5a-d, the buttons refer to a right view, a left view, a rear view, a front view and a bird's eye view of the vehicle 1. Said buttons may also be displayed in an on-board display of the vehicle 1, so that the user 5 of the vehicle may pre-select a certain selection procedure before leaving the vehicle 1.
  • Fig. 5a refers to an activated button (the left button is activated) referring to an automated (auto) camera selection.
  • the automated camera selection may be selected as default.
  • Figure 5b refers to a hybrid mode of camera selection (second button from the left is activated). However, by selecting the hybrid mode only, without selecting a further camera view, the system performs an automated camera selection as shown in fig. 5a.
  • Figure 5c again refers to an activated hybrid mode of camera selection, but the right view is also activated. In such a case the activated view (the right view in this case) is displayed as single or highlighted view 100, 101 to the user 5 on the portable electronic device 30 until the view-management means decide that there is a more relevant (or critical) view that should be displayed to the user (e.g.
  • Figure 5d refers to a selection of the left view without the buttons of the automated selection or hybrid selection being activated. In such a case only the selected view is displayed to the user 5 on the portable electronic device.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21185811.3A EP4120218B1 (de) 2021-07-15 2021-07-15 System und verfahren zur überwachung eines autonomen fahr- oder einparkvorgangs
CN202210809865.7A CN115701091A (zh) 2021-07-15 2022-07-11 用于监控自动驾驶或停车操作的系统和方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP21185811.3A EP4120218B1 (de) 2021-07-15 2021-07-15 System und verfahren zur überwachung eines autonomen fahr- oder einparkvorgangs

Publications (2)

Publication Number Publication Date
EP4120218A1 true EP4120218A1 (de) 2023-01-18
EP4120218B1 EP4120218B1 (de) 2024-12-04

Family

ID=77071240

Family Applications (1)

Application Number Title Priority Date Filing Date
EP21185811.3A Active EP4120218B1 (de) 2021-07-15 2021-07-15 System und verfahren zur überwachung eines autonomen fahr- oder einparkvorgangs

Country Status (2)

Country Link
EP (1) EP4120218B1 (de)
CN (1) CN115701091A (de)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102024114461A1 (de) 2024-05-23 2025-11-27 Ford Global Technologies, Llc Verfahren zum Betrieb eines Fahrzeugs und Fahrzeug

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2885161A2 (de) * 2012-08-16 2015-06-24 Klear-View Camera, LLC System und verfahren zur bereitstellung von frontorientierten visuellen informationen für einen fahrzeuglenker
US20180052457A1 (en) * 2016-08-16 2018-02-22 Samsung Electronics Co., Ltd. Stereo camera-based autonomous driving method and apparatus
US20210023992A1 (en) * 2019-07-24 2021-01-28 Ambarella International Lp Switchable display during parking maneuvers


Also Published As

Publication number Publication date
EP4120218B1 (de) 2024-12-04
CN115701091A (zh) 2023-02-07

Similar Documents

Publication Publication Date Title
EP3272586B1 (de) Nutzfahrzeug
CN105593641B (zh) 增加显示的方法和装置
US9582907B1 (en) User interface for displaying internal state of autonomous driving system
CN108140311B (zh) Parking assistance information display method and parking assistance device
EP2940427A1 (de) Detailed map format for autonomous driving
KR101251729B1 (ko) Parking control method and apparatus
JP5064313B2 (ja) Portable information terminal
US9987927B2 (en) Method for operating a communication device for a motor vehicle during an autonomous drive mode, communication device as well as motor vehicle
US11021060B2 (en) Method for the automated guiding of a motor vehicle occupied by a driver and for the information of the driver
CN113393697B (zh) Parking information management server, parking assistance device, and parking assistance system
EP2829844A1 (de) Navigation system
US12198238B2 (en) Method and arrangement for producing a surroundings map of a vehicle, textured with image information, and vehicle comprising such an arrangement
JP2008001120A (ja) Display control device for vehicle
EP2963632A1 (de) Maneuver assistance
JP5052003B2 (ja) Information distribution system
KR20220156687A (ko) Autonomous parking method for a vehicle and vehicle system performing the same
JP2022176234A (ja) Information display control device, information display control method, and information display control program
GB2563902A (en) Method and apparatus for use with vehicles having an autonomous driving mode
EP4120218B1 (de) System and method for monitoring an autonomous driving or parking maneuver
US20210107515A1 (en) Systems and methods for visualizing a route of a vehicle
KR20230109807A (ko) Vehicle and control method thereof
US20230166755A1 (en) Vehicle display control device, vehicle display control system, and vehicle display control method
CN111746789B (zh) Imaging system, server, control method, and storage medium storing program
CN119414432B (zh) Silent driving-and-parking vehicle positioning switching method, apparatus, device, and storage medium
WO2021132553A1 (ja) Navigation device, navigation device control method, and navigation device control program

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20230718

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20231107

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20240829

RIN1 Information on inventor provided before grant (corrected)

Inventor name: GOLGIRI, HAMID M.

Inventor name: PAK, TONY

Inventor name: MA, CHENHAO

Inventor name: BENMIMOUN, AHMED

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602021022724

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20241204

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20241204

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20241204

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20241204

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20241204

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20250304

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20241204

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20250305

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20250304

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20241204

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1749009

Country of ref document: AT

Kind code of ref document: T

Effective date: 20241204

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20241204

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20241204

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20250612

Year of fee payment: 5

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20250404

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20250404

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20241204

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20250612

Year of fee payment: 5

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20241204

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20241204

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20241204

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20241204

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20241204

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602021022724

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20241204

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20241204

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20250616

Year of fee payment: 5

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20250905