EP4128160A1 - Procédé permettant de déterminer la pose d'un objet, procédé permettant de commander un véhicule, unité de commande et véhicule - Google Patents

Procédé permettant de déterminer la pose d'un objet, procédé permettant de commander un véhicule, unité de commande et véhicule

Info

Publication number
EP4128160A1
Authority
EP
European Patent Office
Prior art keywords
vehicle
markers
marker
trailer
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP21715163.8A
Other languages
German (de)
English (en)
Inventor
Tobias KLINGER
Dennis Sabelhaus
Oliver WULF
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ZF CV Systems Global GmbH
Original Assignee
ZF CV Systems Global GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ZF CV Systems Global GmbH filed Critical ZF CV Systems Global GmbH
Publication of EP4128160A1
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/70 - Determining position or orientation of objects or cameras
    • G06T7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60D - VEHICLE CONNECTIONS
    • B60D1/00 - Traction couplings; Hitches; Draw-gear; Towing devices
    • B60D1/24 - Traction couplings; Hitches; Draw-gear; Towing devices characterised by arrangements for particular functions
    • B60D1/245 - Traction couplings; Hitches; Draw-gear; Towing devices characterised by arrangements for particular functions for facilitating push back or parking of trailers
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60D - VEHICLE CONNECTIONS
    • B60D1/00 - Traction couplings; Hitches; Draw-gear; Towing devices
    • B60D1/24 - Traction couplings; Hitches; Draw-gear; Towing devices characterised by arrangements for particular functions
    • B60D1/36 - Traction couplings; Hitches; Draw-gear; Towing devices characterised by arrangements for particular functions for facilitating connection, e.g. hitch catchers, visual guide means, signalling aids
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60D - VEHICLE CONNECTIONS
    • B60D1/00 - Traction couplings; Hitches; Draw-gear; Towing devices
    • B60D1/58 - Auxiliary devices
    • B60D1/62 - Auxiliary devices involving supply lines, electric circuits, or the like
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B62 - LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D - MOTOR VEHICLES; TRAILERS
    • B62D13/00 - Steering specially adapted for trailers
    • B62D13/06 - Steering specially adapted for trailers for backing a normally drawn trailer
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B62 - LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D - MOTOR VEHICLES; TRAILERS
    • B62D15/00 - Steering not otherwise provided for
    • B62D15/02 - Steering position indicators; Steering position determination; Steering aids
    • B62D15/027 - Parking aids, e.g. instruction means
    • B62D15/0285 - Parking performed automatically
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30204 - Marker
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30248 - Vehicle exterior or interior
    • G06T2207/30252 - Vehicle exterior; Vicinity of vehicle

Definitions

  • the invention relates to a method for determining a pose of an object, a method for controlling a vehicle, and a control unit and a vehicle for performing the method.
  • a stereo camera can be used, for example, which recognizes the trailer or two-dimensional or flat markers on the trailer or on components of the trailer and derives depth information therefrom.
  • This is described by way of example in DE 10 2016 011 324 A1, DE 10 2018 114 730 A1, WO 2018/210990 A1 or DE 10 2017 119 968 A1.
  • a position and an orientation, i.e. a pose, of the respective object relative to the camera or the towing vehicle can be derived from this.
  • a kink angle or a distance can be determined.
  • the pose can also be determined with a mono camera by localizing at least three markers, which are preferably applied flatly on the object, in the image and, knowing the marker positions on the object, determining a transformation matrix from which the pose of the object on which they are located can be derived.
  • In DE 10 2018 210 340 A1 it is also provided to determine a distance to a swap body by comparing the size of three applied markers with stored sizes for these markers.
  • the markers are coded accordingly so that their size can be determined in advance using a camera.
  • An angular offset can also be estimated from a relative position of individual markers to one another.
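The size comparison described for the swap-body case follows the simple pinhole-camera relation; the sketch below only illustrates that geometric idea (the focal length and marker size values are illustrative assumptions, not taken from the cited document):

```python
def distance_from_size(focal_px, real_size_m, image_size_px):
    """Pinhole-camera relation: a marker of real (stored) size S at distance Z
    appears with image size s = f * S / Z, hence Z = f * S / s."""
    return focal_px * real_size_m / image_size_px

# Illustrative values: 1000 px focal length, 0.2 m marker, imaged at 50 px.
d = distance_from_size(1000.0, 0.2, 50.0)
```

This is why the markers must be coded so that their real size is known in advance: without the stored size S, the apparent size alone fixes no distance.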
  • Detection of a kink angle by means of a camera is also described in US 2014200759 A, a flat marker on the trailer being observed over time from the towing vehicle and the kink angle being estimated therefrom.
  • JP 2002012172 A, US 2014277942 A1, US 2008231701 A and US 2017106796 A also describe articulation angle detection as a function of flat markers.
  • coupling points can carry a marker to facilitate automatic detection.
  • the marker can have a special color, texture or wave reflection property.
  • DE 102004025252 B4 also describes how to determine the kink angle by transmitting radiation from a transmitter onto a semicircular or hemispherical reflector and then detecting the radiation reflected therefrom.
  • DE 10302545 A1 also describes the detection of a coupling using an object recognition algorithm.
  • EP 3 180 769 B1 also describes how to use cameras to record a rear side of a trailer and to recognize traceable features in the image, e.g. an edge or corner. These are then tracked over time, in particular to infer a bend angle between the towing vehicle and the trailer. This means that no specially applied markers are used.
  • US 2018039266 A also describes how to access information from a two-dimensional barcode or QR code.
  • a QR code can also be read out with a camera in order to identify the trailer and to pass on trailer parameters to a reversing assistant.
  • Furthermore, an RFID reader located on a trailer can read out an RFID transponder that is attached to the towing vehicle, for example in the form of a label. This allows the position of the QR code or the RFID transponder to be calculated. An orientation is not determined.
  • a solution using radio-based transponders is also provided in WO 2018/060192 A1.
  • RFID elements on the trailer are also used there for triangulation during the approach.
  • an observation element which has at least three auxiliary points or measuring points that can be recognized by a camera.
  • the coordinates or vectors of the centers of gravity of the auxiliary points are determined from geometric considerations, and from this the coordinates of the measurement points relative to the camera or the image sensor.
  • a kink angle can be determined from this.
  • the object of the invention is therefore to provide a method which enables a simple and reliable determination of the pose of an object and a subsequent processing or use of this pose for controlling the vehicle.
  • the object of the invention is also to provide a control unit and a vehicle for carrying out the method.
  • According to the invention, a pose, i.e. a combination of a position and an orientation, of an object with an object width and an object height is determined relative to a one-part or multi-part vehicle, the object and the vehicle being movable relative to one another, and the object having at least three markers, preferably at least four markers, namely spatially extended markers or markers with a spatial geometry, for example with a spherical or cylindrical or cuboid geometry.
  • the at least three, preferably at least four, markers lie in the same object plane on the object, and no single line can be drawn through all of them. It is assumed here that the object plane described by the markers also lies approximately on the respective object, so that the pose can be estimated from it.
  • the object is captured by at least one camera on the vehicle and at least the following steps are carried out:
  • determining a transformation matrix (homography) as a function of the marker positions and/or the marker vectors and as a function of the marker images, the transformation matrix mapping the spatially extended markers on the object, or the first coordinate system fixed to the markers, into the image of the camera on the vehicle, or into an image coordinate system;
  • At least three markers are required to determine the transformation matrix; in that case - possibly with further additional information - a transformation matrix can at least be estimated, possibly with heuristic methods. However, it is more precise if four or more markers are present on the respective object, since a clearly defined transformation matrix or homography matrix can then be determined using known methods.
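With four (or more) non-collinear marker correspondences, such a homography can be determined with the standard Direct Linear Transform (DLT); the following is a minimal sketch of that known method under idealised, noise-free conditions, not the patent's specific implementation:

```python
import numpy as np

def estimate_homography(obj_pts, img_pts):
    """Estimate the 3x3 homography H mapping object-plane points (x, y)
    to image points (u, v) via the Direct Linear Transform (DLT).
    Requires at least four correspondences, no three of them collinear."""
    A = []
    for (x, y), (u, v) in zip(obj_pts, img_pts):
        A.append([-x, -y, -1.0, 0.0, 0.0, 0.0, u * x, u * y, u])
        A.append([0.0, 0.0, 0.0, -x, -y, -1.0, v * x, v * y, v])
    # H (as a 9-vector) is the null vector of A: the last right-singular vector.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalise so that H[2, 2] == 1

def apply_homography(H, pt):
    """Map a point of the object plane into the image plane."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]
```

With noisy marker detections, more than four markers simply over-determine the system and the SVD returns a least-squares solution, which matches the patent's remark that additional markers make the matrix better defined.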
  • spatially extended markers can still be detected very well by the camera of the vehicle even at large bending angles or at extreme viewing angles, because they protrude from the object plane and can thus be better perceived.
  • This provides a method for determining both the position and the orientation of the object that is almost invariant to the viewing angle.
  • This is also made possible by monocular image processing so that stereo cameras do not necessarily have to be used to obtain depth information on the object from which the pose can be geometrically derived. This simplifies the process and the costs can be kept low.
  • the spatial markers can be imaged on the image plane with high contrast even in poor visibility or ambient conditions, so that sufficient information about the object plane can be determined from the marker images under a wide range of ambient and visual conditions.
  • a coupling assistance function is controlled as a function of a normal distance and/or reference distance, derived from the determined pose of the object, between the towing vehicle as the vehicle and a trailer to be coupled as the object on which the markers are arranged.
  • the towing vehicle is then controlled manually or automatically in such a way that a second coupling point on the towing vehicle, for example a (fifth wheel) coupling, approaches a first coupling point on the trailer to be coupled, for example a kingpin or a drawbar, and the two coupling points at one common pivot point can be coupled together, or
  • a distance assistance function is controlled as a function of a normal distance and/or reference distance, derived from the determined pose, between the towing vehicle as the vehicle and, as the object on which the markers are arranged, a building, for example a loading ramp or a garage door, or a third-party vehicle.
  • the method for determining the pose of the object and / or also the control of the vehicle are carried out on a control unit according to the invention in a vehicle. It is preferably also provided that at least one of the at least three markers
  • - Has a self-luminous coating, for example a fluorescent coating, or the like.
  • the spatially extended markers can advantageously be used to ensure that the object plane or the pose of the object can also be detected with high contrast in the dark. This means that the respective assistance functions can also be reliably carried out in the dark.
  • the self-luminous coating or the fluorescent coating have the advantage that no energy sources are required on the object.
  • the light source in at least one of the at least three markers and/or the ambient light is controlled as a function of a state of motion of the vehicle and/or the object. This saves energy, since the markers are only actively illuminated when this is actually necessary, for example when moving as part of one of the driver assistance functions.
  • the markers can be illuminated in a targeted manner in order to additionally assist the driver, so that the driver can for example use a display device to see how far he can drive backwards before the respective object or the object plane defined by the marker is reached. Even in the case of adjacent objects, it is possible to distinguish between objects in the dark by means of color coding or flashing light coding. Such a color coding can also be achieved by the self-luminous coating, which can be applied to the respective marker in predefined colors depending on the marker position on the object.
  • the light sources in at least one of the at least three markers are supplied with energy from an energy source, for example a solar panel, on the object.
  • the illumination of the markers on the object is independent of the presence of a vehicle, for example in the non-coupled state.
  • the markers on the object for example on the trailer, can thus be individually illuminated when parking in a depot.
  • At least one of the at least three markers is formed by a clearance light or outline lighting on the edges of the object.
  • the ambient light emits visible and / or invisible radiation onto the markers and that the markers have a reflective coating. As a result, good visibility can be achieved in a variable manner.
  • a QR code and / or a barcode and / or an Aruco marker is applied to the surface of at least one of the markers or adjacent to at least one of the markers and the QR code and / or the barcode and / or the Aruco marker is captured by the camera.
  • coded information can be read out without using separate data transmission.
  • the marker positions and / or the marker vectors in the vehicle can be determined in a simple manner, which are necessary for determining the transformation matrix in order to be able to infer the pose from the transformation matrix.
  • an object width and/or an object height can be coded in the respective code or marker in a simple manner, so that this information can also be accessed from the vehicle without further data transmission or any other reading-in of the data being needed.
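As an illustration only, such a coding could carry the object width and height as a small key-value payload read from the QR code; the field names and format below are purely hypothetical assumptions, not defined by the patent:

```python
def parse_marker_payload(payload):
    """Parse a hypothetical key=value payload as it might be coded in a QR
    code next to a marker: object width W and height H in metres plus an
    identifier. Field names and units are illustrative assumptions."""
    fields = dict(item.split("=") for item in payload.split(";"))
    # Convert the dimension fields to numbers, keep everything else as text.
    return {k: float(v) if k in ("W", "H") else v for k, v in fields.items()}
```

The point of such a scheme is exactly what the bullet above states: the vehicle obtains the object dimensions from the camera image alone, with no separate data channel.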
  • the object width of the object and/or the object height of the object and/or a first reference point on the object can, however, preferably also be determined and/or at least estimated as a function of the spatial arrangement of the recognized markers on the object. Accordingly, a specific coding can also be achieved in this way without an additional data transfer being necessary.
  • the object width of the object and/or the object height of the object and/or a first reference point on the object and/or the marker positions and/or the marker vectors are transmitted to the vehicle via a communication unit on the object, for example in the markers, preferably via Bluetooth or RFID.
  • the camera additionally records area markings which are arranged adjacent to at least one of the markers on the object, the position of the object being redundantly determined from the recorded area markings.
  • the area markings can for example also be used for a rough adjustment or rough determination of the pose if, for example, the determination of the pose from the spatial markers is still too imprecise because, for example, the resolution of the camera is too low.
  • the area markings can accordingly be resolved to a greater extent without great effort, the area markings advantageously also being illuminated in the dark due to their proximity to the spatial markers, so that a double function can be achieved through the illumination.
  • the extent of the spatial markers can be kept small, since these are only necessary for fine adjustment, for example during a coupling process or during another approach process.
  • the area markings and / or the markers are at least partially arranged at the edges of the object and an object width of the object and / or an object height of the object is derived from the captured image of the camera. Because the markers or area markings - each for themselves or in combination with one another - are arranged at the edges, a contour can be determined from which the respective object width or the object height can also be estimated.
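If the markers sit at the edges of the object, the object width and height can be estimated from the bounding box of the known marker positions in the object plane; a minimal sketch, assuming marker (x, y) coordinates in metres:

```python
def object_extent(marker_positions):
    """Estimate object width and height from markers arranged at the
    object's edges: the bounding box of the marker (x, y) positions
    in the object plane. Coordinates are assumed to be in metres."""
    xs = [p[0] for p in marker_positions]
    ys = [p[1] for p in marker_positions]
    return max(xs) - min(xs), max(ys) - min(ys)
```

This is only the contour-based estimate the bullet describes; markers placed inside the outline would make the bounding box an underestimate.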
  • the vehicle is in one piece and the object is not connected to the vehicle, the vehicle moving relative to the object and the object being a trailer to be coupled, a building, for example a loading ramp or a garage door, or a third-party vehicle.
  • the vehicle is in several parts, i.e. has at least one towing vehicle and at least one trailer, and the camera is arranged on a towing vehicle and / or on a trailer coupled to it, with
  • the object is connected to the multi-part vehicle in the form of a trailer coupled to the towing vehicle, the trailer at least temporarily moving relative to the towing vehicle or pivoting in relation to it, or
  • the object is not connected to the multi-part vehicle, the object being a building, for example a loading ramp or a garage door, or a third-party vehicle on which the markers are respectively arranged.
  • This object can then be in front of or next to the towing vehicle or behind or next to the coupled trailer, with the camera being aligned accordingly to the part of the environment.
  • the image of the camera is displayed on a display device, with a normal distance and/or a reference distance (as well as their lateral and vertical components) and/or a contour of the object and/or control information and/or an articulation angle between the towing vehicle and the trailer and/or a first coupling point (e.g. king pin, drawbar) and/or a second coupling point (e.g. (fifth wheel) coupling) and/or a trailer identifier and/or a loading space image and/or a trailer center axis determined as a function of the markers and/or a towing vehicle center axis, or axes lying parallel to it, being superimposed on the image.
  • this information, in particular as derived from the pose, can thus also be displayed, so that the driver can perceive it and react to it in a more targeted manner within the framework of the respective assistance functions.
  • the driver can, for example, take countermeasures manually if the indicated articulation angle is too high or observe the coupling process and manually influence it.
  • FIG. 1 shows a two-part vehicle with markers and cameras
  • FIG. 3 shows a detailed view of a front side of a trailer as the object
  • In FIG. 1, a multi-part vehicle 1 consisting of a towing vehicle 2 and a trailer 3 is shown schematically; according to the embodiment shown, a camera 4 with a detection area E is arranged on each of the two vehicle parts 2, 3.
  • a towing vehicle camera 42 with a towing vehicle detection area E2 which is oriented towards the trailer 3
  • a trailer camera 43 with a rearward-facing trailer detection area E3 is arranged on the trailer 3.
  • the cameras 4; 42, 43 each output camera data KD.
  • the vehicle 1 can be designed in several parts, for example as a truck with a drawbar trailer or turntable trailer, or as a tractor-trailer with a tractor unit and semi-trailer (see FIG. 1). It is also possible to provide more than just one trailer 3, for example in the case of a EuroCombi or a road train. In principle, however, the vehicle 1 can also be in one piece.
  • the orientation of the camera 4 is chosen depending on the application. For example, an omnidirectional camera, a fisheye camera or a telephoto lens camera or other types of cameras can be used as the camera 4.
  • the respective camera data KD are generated as a function of an environment U around the vehicle 1, which is located in the respective detection area E and which is imaged on an image sensor 4a of the respective camera 4.
  • An image B of image points BPi with image coordinates xB, yB can therefore be created from the camera data KD, an object point PPi in the vicinity U being assigned to each image point BPi.
  • the object points PPi belong to objects O that are in the vicinity U.
  • the image B can also be displayed to the driver of the vehicle 1 via a display device 8.
  • the camera data KD of the respective camera 4 are transmitted to a control unit 5 in the respective vehicle part 2, 3 or to a higher-level control unit 5 of the vehicle 1, which is designed as a function of the camera data KD from one or more recorded images B to extract marker images M1a, M2a, M3a, M4a, for example by means of edge detection.
  • a marker M1, M2, M3, M4 in the vicinity U is assigned to each marker image M1a, M2a, M3a, M4a.
  • the respective control unit 5 is designed to determine an object plane OE in which the respective markers M1, M2, M3 , M4 are arranged in the vicinity U.
  • According to FIG. 3, at least four markers M1, M2, M3, M4 are provided, which lie in one plane and not on one line on the object O to be recognized in the environment U, for example on a preferably flat front side 3a of the trailer 3. It is assumed that the object plane OE defined by the markers M1, M2, M3, M4 actually describes the object O in the relevant area, i.e. here the flat front side 3a, at least approximately in an abstract way.
  • the four markers M1, M2, M3, M4 can then be detected by the towing vehicle camera 42, for example.
  • more than four markers M1, M2, M3, M4 can also be provided on the respective object O, with at least four markers M1, M2, M3, M4 being necessary to determine the transformation matrix T in order to obtain a clearly determined transformation matrix T.
  • a transformation matrix T can also be estimated with only three markers M1, M2, M3 and possibly further additional information, possibly using heuristic methods, although it may then be less precise.
  • Using the transformation matrix T, starting from the towing vehicle 2, both a position (translational degrees of freedom) and an orientation (rotational degrees of freedom), or a combination thereof, i.e. a pose PO, of the trailer 3 in space relative to the towing vehicle camera 42, and thus also relative to the towing vehicle 2, can be estimated, as explained in more detail below:
  • image coordinates xB, yB are first determined in the at least one recorded image B in an image coordinate system KB. Since the markers M1, M2, M3, M4 have a certain spatial extent and are therefore imaged flat in the image B, i.e. several image points BPi are assigned to one marker M1, M2, M3, M4, the center point of the respective marker image M1a, M2a, M3a, M4a can, for example, be determined and its image coordinates xB, yB further processed.
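Determining the centre of a flat-imaged marker can be sketched as the centroid of its blob in a binary mask; this assumes the marker has already been segmented (e.g. by thresholding or edge detection), a step the description leaves open:

```python
import numpy as np

def marker_center(mask):
    """Centre of a marker image: mean of the pixel coordinates that belong
    to the (roughly circular) marker blob in a binary mask. Returns the
    image coordinates (xB, yB) of the blob centroid."""
    ys, xs = np.nonzero(mask)          # rows are y, columns are x
    return xs.mean(), ys.mean()
```

For a symmetric (e.g. circular) blob the centroid gives a sub-pixel centre estimate, which is what feeds the homography estimation as the marker's image coordinates.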
  • the control unit 5 knows how the four markers M1, M2, M3, M4 are actually positioned on the object O, e.g. on the front side 3a of the trailer 3, relative to one another in the object plane OE. Accordingly, a marker position P1, P2, P3, P4 or a marker vector V1, V2, V3, V4 can be specified for each marker M1, M2, M3, M4 (see FIG. 3).
  • the marker vectors V1, V2, V3, V4 each contain first coordinates x1, y1, z1 of a first coordinate system K1 with a first origin U1, the marker vectors V1, V2, V3, V4 pointing to the respective marker M1, M2, M3, M4.
  • the first coordinate system K1 is fixed to the trailer or to the markers, i.e. it moves with the markers M1, M2, M3, M4.
  • the first origin U1 can lie in a fixed first reference point PB1 on the object O, for example in one of the markers M1, M2, M3, M4.
  • Using the known marker vectors V1, V2, V3, V4 or marker positions P1, P2, P3, P4 of the markers M1, M2, M3, M4 in the first coordinate system K1, the control unit 5 can now derive a transformation matrix T.
  • This transformation matrix T indicates how the individual markers M1, M2, M3, M4 as well as the object points PPi of the entire object plane OE, which according to FIG.
  • the transformation matrix T thus indicates how the object plane OE is mapped onto the image plane BE of the image sensor 4a in the current driving situation.
  • the transformation matrix T also contains the information on how the trailer-fixed or marker-fixed first coordinate system K1 is oriented relative to the image coordinate system KB, with both translational and rotational degrees of freedom being included, so that a pose PO (combination of position and orientation) of the object plane OE can be determined relative to the image plane BE.
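For a calibrated camera, extracting the pose from such a plane-to-image homography is a standard decomposition, since H is proportional to K[r1 r2 t] for points with z1 = 0 in the marker-fixed system K1. The sketch below shows this known textbook method under noise-free assumptions, not the patent's specific procedure:

```python
import numpy as np

def pose_from_homography(K, H):
    """Recover rotation R and translation t of the marker plane (z1 = 0 in
    the marker-fixed system K1) relative to the camera, given the camera
    intrinsics K and a homography H proportional to K @ [r1 r2 t]."""
    A = np.linalg.inv(K) @ H
    s = 1.0 / np.linalg.norm(A[:, 0])      # scale: r1 must be a unit vector
    if (s * A[:, 2])[2] < 0:               # plane must lie in front of the camera
        s = -s
    r1, r2, t = s * A[:, 0], s * A[:, 1], s * A[:, 2]
    r3 = np.cross(r1, r2)                  # complete the right-handed frame
    R = np.column_stack([r1, r2, r3])
    # Re-orthonormalise R via SVD (with noisy data R is only approximately a rotation).
    U, _, Vt = np.linalg.svd(R)
    return U @ Vt, t
```

R and t together are exactly the pose PO: the rotational and translational degrees of freedom of the object plane relative to the camera-fixed coordinate system.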
  • With further information, for example an aspect ratio SV of the image sensor 4a, which is stored, for example, on the control unit 5 or transmitted to it, a series of items of information can be derived from the transformation matrix T that can be used by different assistance functions Fi.
  • Since the transformation matrix T follows from the current driving situation of the vehicle 1, it contains, for example, a dependence on an articulation angle KW between the towing vehicle 2 and the trailer 3 and on a normal distance AN between the camera 4 or the image sensor 4a and the object plane OE.
  • These can be determined when the position of the image sensor 4a or the image plane BE of the image sensor 4a is known in a camera-fixed or here also tractor-fixed second coordinate system K2.
  • both the image plane BE with the individual image points BPi and the object plane OE can be expressed in second coordinates x2, y2, z2 of the coordinate system K2 fixed to the towing vehicle, from which the pose PO of the object O relative to the towing vehicle 2 can be determined.
  • One example is the normal distance AN, which specifies the perpendicular distance between the object plane OE and the image sensor 4a. If the position of the coordinate systems K1, K2 with respect to one another is known from the transformation matrix T, or if the object plane OE can be specified in the second coordinate system K2, the normal distance AN can be determined in a simple manner. Instead of the normal distance AN, however, a reference distance AR can also be determined, which is measured between a first reference point PR1 in the object plane OE, e.g. on the trailer 3 or on a loading ramp 11a, and a second reference point PR2 on the towing vehicle 2 (see FIG. 4). If the reference points PR1, PR2 in the respective coordinate systems K1, K2 are known, the reference distance AR can be determined from them with simple geometric considerations using the transformation matrix T.
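Once the object plane and the reference points are expressed in the camera-fixed system K2, both distances reduce to elementary geometry; a minimal sketch, with the plane represented by one of its points and its normal (a representation assumed here for illustration):

```python
import numpy as np

def normal_distance(plane_point, plane_normal):
    """Perpendicular distance AN between the camera origin (origin of the
    camera-fixed coordinate system K2) and the object plane OE, given one
    point of the plane and its normal vector, both expressed in K2."""
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    return abs(float(np.dot(n, np.asarray(plane_point, dtype=float))))

def reference_distance(pr1, pr2):
    """Euclidean reference distance AR between a reference point PR1 in the
    object plane and a reference point PR2 on the towing vehicle, both in K2."""
    return float(np.linalg.norm(np.asarray(pr1, float) - np.asarray(pr2, float)))
```

The lateral and vertical components mentioned elsewhere in the description are simply the individual coordinate differences of pr1 and pr2 before taking the norm.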
  • a second origin U2 of the second coordinate system K2 can, depending on the application, be at a specified second reference point PB2 on the towing vehicle 2, for example in the image sensor 4a itself or in a second coupling point 7 on the towing vehicle 2, for example in a fifth wheel 7a of a tractor-trailer.
  • the two coupling points 6, 7 thus define a common pivot point DP about which the towing vehicle 2 and the trailer 3 pivot with respect to one another when cornering.
  • In this way, the pose PO of the trailer 3, or the pose PO of any other objects O which carry such markers M1, M2, M3, M4 with known marker vectors V1, V2, V3, V4 or marker positions P1, P2, P3, P4, can be determined.
  • the image coordinate system KB is converted into the first coordinate system K1 by the transformation matrix T, whereby the object plane OE can also be represented in the second coordinate system K2.
  • both translational measured variables, e.g. the distances AN, AR, and rotational measured variables, e.g. angles, between the towing vehicle 2 and the object plane OE of the respective object O, i.e. the pose PO, can be estimated.
  • In order to make the recognition of the markers M1, M2, M3, M4 more reliable, they each have a spatial geometry, preferably a spherical geometry, i.e. they are designed in the form of a sphere.
  • markers M1, M2, M3, M4 with a spherical geometry are always mapped as a circle in the recorded image B regardless of the viewing angle of the respective camera 4, i.e. they form a two-dimensional projection of image points BPi on the image sensor 4a.
  • the markers M1, M2, M3, M4 are therefore invariant to the viewing angle.
  • markers M1, M2, M3, M4 with other spatial geometries can also be used, for example hemispheres, cubes or cylinders, which can also be detected from different angles by the camera 4 with defined two-dimensional projections.
  • fixed clearance lights 21a, or outline lighting radiated by these, can for example be used as markers M1, M2, M3, M4; these may already be present on trailers 3.
  • the markers M1, M2, M3, M4 are preferably illuminated from the inside or the rear, from a marker interior 20, with a corresponding light source 21 (see Figs. 5a, 5b, 5c), for example an LED, which can be controlled via a lighting control 22 with a lighting signal SL.
  • the respective marker M1, M2, M3, M4 is then at least partially transparent or translucent in order to enable the electromagnetic radiation to exit from the inside or from the rear.
  • the markers M1, M2, M3, M4 can be detected with high contrast by the camera 4 from different angles, even in the dark.
  • the markers M1, M2, M3, M4 or their light sources 21 are preferably supplied with energy via an energy source 23 in the trailer 3 or on the respective object O, wherein the energy source 23 can be charged by solar panels 23a.
  • the markers M1, M2, M3, M4 are illuminated by the lighting control 22 as a function of a movement state Z of the vehicle 1, the towing vehicle 2 and / or the trailer 3, or generally the respective object O.
  • a motion sensor 24 can be provided on the trailer 3, or generally on the object O with the markers M1, M2, M3, M4, which can detect the movement state Z of the vehicle 1, the towing vehicle 2 and/or the trailer 3, or of the object O with the markers M1, M2, M3, M4. If the vehicle 1 or towing vehicle 2 moves in relation to the respective object O or the markers M1, M2, M3, M4, the light sources 21 can be supplied with energy by the lighting control 22 and thus made to glow. It can preferably also be provided that the light sources 21 are only activated when certain lighting criteria are met.
  • the lighting criteria can be the normal distance AN and / or the reference distance AR.
  • the lighting control 22 can, for example, set different colors C for the light sources 21 and/or different pulse durations dt (flashing light) by frequency-modulating the lighting signal SL. If the distance AN, AR is large, long pulse durations dt can be provided, and if the distance AN, AR is short, shorter pulse durations dt can be provided.
  • a marker position P1, P2, P3, P4 of the respective marker M1, M2, M3, M4 on the respective object O can be taken into account as a further lighting criterion.
  • the lighting control 22 can illuminate a marker located at the top left in red and a marker located at the bottom right in blue. In this way, the driver can better distinguish between objects O lying next to one another that each have their own markers M1, M2, M3, M4, for example trailers 3 parked parallel to one another.
  • the markers M1, M2, M3, M4 can be visible in other ways in the dark.
  • the markers M1, M2, M3, M4 can be provided on the surface with a self-luminous coating 27, for example a fluorescent coating 27a, or the markers M1, M2, M3, M4 can be made of a self-luminous material.
  • the markers M1, M2, M3, M4 can also be illuminated from the outside, for example by an ambient light 25 on the towing vehicle 2 or on the respective vehicle 1 as it approaches a trailer 3, a loading ramp 11a, or in general the object O with the markers M1, M2, M3, M4, so that the markers M1, M2, M3, M4 can also be made visible in the dark.
  • the ambient light 25 can in this case emit radiation in the visible or invisible spectrum, for example ultraviolet, and illuminate the markers M1, M2, M3, M4 with it.
  • the markers M1, M2, M3, M4 can also be provided with a corresponding reflective coating 26 in order to be able to reflect the radiation in the respective spectrum back to the camera 4 with high intensity.
  • the ambient light 25 can, like the light source 21, be controlled as a function of the movement state Z and other lighting criteria, for example in order to illuminate the markers depending on the normal distance AN and/or the reference distance AR or the movement state Z.
  • the area markings Mf can also be illuminated in the dark by the adjacent light sources 21 of the markers M1, M2, M3, M4, by their self-luminosity, or by the ambient light 25, so that they can still be recognized in the dark.
  • the area marking Mf can also be provided with a self-luminous coating 27 or a reflective coating 26 in order to be better recognizable even in the dark or to be able to reflect the radiation from the ambient light 25 with high intensity to the respective camera 4.
  • the area markings Mf can be used to identify the object O with the markers M1, M2, M3, M4, for example the trailer 3, over a greater distance of up to 10m if the markers M1, M2, M3, M4 cannot be sufficiently resolved by the camera 4 under certain circumstances.
  • a coarse identification based on the area markings Mf and, depending on this, a targeted approach of the camera 4 to the respective object O can take place until the camera 4 can also adequately resolve the markers M1, M2, M3, M4.
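The coarse-to-fine strategy above, i.e. falling back to the larger area markings Mf when the camera 4 cannot yet resolve the markers M1, M2, M3, M4, can be sketched with a simple pinhole projection; the focal length, marker size, and the pixel threshold are illustrative assumptions:

```python
# Sketch: decide whether the markers M1..M4 or the area markings Mf
# should be used for identification, based on the apparent marker size
# in the image. All numeric values are assumptions for illustration.

MIN_MARKER_PIXELS = 12.0  # assumed minimum resolvable extent in pixels

def apparent_size_px(marker_size_m: float, distance_m: float,
                     focal_px: float = 800.0) -> float:
    """Pinhole projection of the marker extent onto the image sensor."""
    return focal_px * marker_size_m / distance_m

def identification_mode(marker_px: float) -> str:
    """Fine identification via markers if resolvable, else coarse via Mf."""
    return "markers" if marker_px >= MIN_MARKER_PIXELS else "area_markings"
```

A 10 cm marker at 10 m then falls below the threshold, matching the roughly 10 m range mentioned for the area markings.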
  • the object plane OE can then be determined using only the markers M1, M2, M3, M4 or using both the markers M1, M2, M3, M4 and the area markings Mf, so that redundant information can be used for the determination.
  • the area markings Mf can also provide additional information that can contribute to determining the position of the object plane OE in space.
  • the area markings Mf can be positioned adjacent to the edges 17 of the front side 3a of the trailer 3 or of the respective object O.
  • An object width OB and an object height OH of the front side 3a or of the respective object O within the determined object plane OE can therefore be estimated if it is assumed that the markers M1, M2, M3, M4, or at least some of them, delimit the object O at its edges.
  • the markers M1, M2, M3, M4 themselves, or at least some of them, can also be arranged on the edges 17 in this way, so that the object width OB and the object height OH of the object O within the determined object plane OE can be estimated from their marker positions P1, P2, P3, P4 or marker vectors V1, V2, V3, V4. If more than four markers M1, M2, M3, M4 and/or area markings Mf are provided, these can also delimit a polygonal object area OF in the object plane OE, so that a shape or a contour K of the object O within the determined object plane OE can be extracted. In addition, conclusions can be drawn about the shape of the entire object O, for example in the case of a trailer 3.
  • the object width OB and/or the object height OH can thus be determined or at least estimated on the basis of different reference points.
  • In the case of a trailer 3, this corresponds to the extension of its front side 3a, and in the case of a loading ramp 11a or a garage door 11b to its areal extension. If it is unknown whether the respective reference points (markers, area markings, etc.) lie on the edge, at least an approximate (minimal) boundary of the respective object O, and thus an approximate (minimal) object width OB and/or an approximate (minimal) object height OH, can be estimated from this.
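The (minimal) extent estimation described above reduces to taking the extreme marker coordinates within the object plane OE; the following sketch assumes edge-delimiting reference points with invented 2D in-plane coordinates:

```python
# Sketch: estimate the (minimal) object width OB and object height OH
# within the object plane OE from reference points (markers M1..M4 or
# area markings Mf) assumed to delimit the object at its edges.
# Coordinates are 2D positions inside the object plane, in meters.

def estimate_extent(points):
    """Return (OB, OH) as the bounding extent of the reference points."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    ob = max(xs) - min(xs)   # (minimal) object width OB
    oh = max(ys) - min(ys)   # (minimal) object height OH
    return ob, oh
```

If the points do not lie exactly on the edges 17, the result is a lower bound on the true extent, as noted above.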
  • the area markings Mf can have QR codes CQ or barcodes CB or Aruco markers CA (see FIG. 5a).
  • In the QR codes CQ or barcodes CB or Aruco markers CA, the object width OB and/or the object height OH of the object O, in particular within the determined object plane OE, and/or the marker vectors V1, V2, V3, V4 and/or the marker positions P1, P2, P3, P4 relative to the first reference point BP1 (for example, the first coupling point 6 on the trailer 3) can be coded.
  • the control unit 5, which is connected to the camera 4, can thus determine the first coordinate system K1 or extract the corresponding information for the transformation matrix T even without previously reading in this data.
  • the QR codes CQ or barcodes CB or Aruco markers CA can be created after calibration of the markers M1, M2, M3, M4 and then applied, as part of the area markings Mf, close to the respective marker on the object O.
  • geometric information of the object O, for example the object width OB and/or the object height OH of the object O, in particular within the determined object plane OE, and/or the marker vectors V1, V2, V3, V4 and/or the marker positions P1, P2, P3, P4 relative to the first reference point BP1 (for example the first coupling point 6 on the trailer 3), can also be encoded via the spatial arrangement of the marker positions P1, P2, P3, P4 on the respective object O.
  • Of the markers M1, M2, M3, M4, for example, one marker each can be arranged at the top left, top center and top right, and another marker at the bottom center. This spatial arrangement is then assigned to a certain fixed position of the first coupling point 6, which is known to the control unit 5. Correspondingly, a different spatial arrangement of the markers M1, M2, M3, M4 can be assigned to a different fixed position of the first coupling point 6, and further markers can be provided for a finer subdivision.
  • the QR codes CQ or barcodes CB or Aruco markers CA with the corresponding coding can also be applied in a correspondingly curved manner on the markers M1, M2, M3, M4 (not shown), so that the respective data for defining the first coordinate system K1 or for identifying the marker positions P1, P2, P3, P4 of the markers M1, M2, M3, M4 can be recognized by the camera 4.
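A possible payload for such codes can be sketched as a simple key-value string; the field names (`OB`, `OH`, `BP1`) and the semicolon layout are purely hypothetical, since the description does not specify a concrete coding format:

```python
# Hypothetical coding of geometric object information into a QR/bar/Aruco
# code payload: object width OB, object height OH, and the first reference
# point BP1 (e.g. the first coupling point 6) in object coordinates.

def encode_payload(ob: float, oh: float, bp1: tuple) -> str:
    """Serialize OB, OH and BP1 into a compact string payload."""
    return f"OB={ob};OH={oh};BP1={bp1[0]},{bp1[1]},{bp1[2]}"

def decode_payload(s: str):
    """Recover (OB, OH, BP1) from the payload, as the control unit 5 might."""
    fields = dict(kv.split("=") for kv in s.split(";"))
    bp1 = tuple(float(v) for v in fields["BP1"].split(","))
    return float(fields["OB"]), float(fields["OH"]), bp1
```

With such a payload, the control unit 5 could derive the first coordinate system K1 without any previously stored trailer data, as the description suggests.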
  • communication units 30 can also be arranged in one or more of the markers M1, M2, M3, M4 (see Fig.).
  • the articulation angle KW between the towing vehicle 2 and the trailer 3 can preferably be determined continuously with the aid of the markers M1, M2, M3, M4.
  • This articulation angle KW can then be used, for example, to control the multi-part vehicle 1 when reversing and / or during a parking process or to estimate the stability during cornering by means of a change in the articulation angle dKW.
  • An assistance function F1 based on the articulation angle can thus be provided.
  • a coupling assistance function F2 can be made possible with the aid of the normal distance AN or the reference distance AR.
  • Coupling points 6, 7, for example, can be selected as reference points PR1, PR2, with a lateral reference distance ARl and a vertical reference distance ARv being determined from the resulting reference distance AR.
  • These also follow from the pose PO of the object plane OE relative to the image sensor 4a, determined via the transformation matrix T, or from the object plane OE in the second coordinate system K2. This enables a targeted approach of the second coupling point 7 on the towing vehicle 2 to the first coupling point 6 on the trailer 3 when the towing vehicle 2 is controlled manually or automatically as a function of the components ARl, ARv of the reference distance.
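The decomposition of the reference distance AR into its lateral and vertical components can be sketched as follows, assuming both coupling points are already given in the vehicle-fixed second coordinate system K2 (the coordinates used are invented):

```python
import math

# Sketch: split the reference distance AR between the coupling points 6, 7
# into a lateral (ground-plane) component ARl and a vertical component ARv.
# Points are (x, y, z) in the vehicle-fixed coordinate system K2, z up.

def reference_distance(p_kingpin, p_coupling):
    """Return (AR, ARl, ARv) for the two coupling points."""
    dx, dy, dz = (a - b for a, b in zip(p_kingpin, p_coupling))
    ar_l = math.hypot(dx, dy)            # lateral component ARl
    ar_v = abs(dz)                       # vertical component ARv
    ar = math.sqrt(ar_l**2 + ar_v**2)    # resulting reference distance AR
    return ar, ar_l, ar_v
```

During a coupling maneuver, ARl would drive the steering while ARv could trigger, for example, a level adjustment.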
  • trailer information AI can also be created and displayed by a display device 8 together with the recorded image B of the camera 4 for the driver to see.
  • a trailer identifier ID, the normal distance AN and / or the reference distance AR, the articulation angle KW, etc. can be displayed by the display device 8 as trailer information AI.
  • the display device 8 can superimpose the trailer information AI on the image B recorded by the respective camera 4, so that both the image of the surroundings U and the trailer information AI are displayed.
  • the trailer information AI can, for example, be visibly scaled as a function of the distance AN, AR and displayed superimposed in the image area of the image B in which the front side 3a of the trailer 3 is shown.
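Such distance-dependent scaling of the overlay can be sketched with a clamped inverse relation; the reference distance and the scale limits are assumptions for illustration:

```python
# Sketch: scale the superimposed trailer information AI inversely with the
# distance AN/AR, so that nearby trailers receive larger overlays.
# The 5 m reference distance and the 0.5..2.0 clamp are assumed values.

def overlay_scale(distance_m: float, ref_distance_m: float = 5.0,
                  min_s: float = 0.5, max_s: float = 2.0) -> float:
    """Return a display scale factor for the overlay at the given distance."""
    s = ref_distance_m / max(distance_m, 0.1)  # avoid division by zero
    return max(min_s, min(max_s, s))
```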
  • the display device 8 displays the contour K, for example of the front side 3a of the trailer 3 or of the entire trailer 3, as trailer information AI.
  • the contour K is in each case modeled from the object plane OE, determined via the markers M1, M2, M3, M4, as a function of the stored or determined object width OB and object height OH of the trailer 3.
  • the marker positions P1, P2, P3, P4 of the markers M1, M2, M3, M4 can also be taken into account if they are arranged, for example, on the edges 17 of the front side 3a.
  • the trailer contour K can be displayed individually or superimposed on the image B. This can, for example, simplify maneuvering or coupling in the dark.
  • the first or second coupling point 6, 7, i.e. the kingpin 6a or the fifth-wheel coupling 7a, can be displayed individually or as an overlay over the image B by the display device 8 as trailer information AI. Since, for example, the kingpin 6a cannot be seen directly in the image B of the camera 4, this superimposition, possibly in combination with the displayed reference distance AR, can facilitate coupling, or a coupling process can be precisely monitored, for example, even in the dark.
  • a trailer center axis 40 of the trailer 3 and a towing-vehicle center axis 41 of the towing vehicle 2 can be superimposed on the image B by the display device 8 as trailer information AI. While the towing-vehicle center axis 41 is known, the trailer center axis 40 can be modeled from the object plane OE determined via the markers M1, M2, M3, M4 as a function of the stored or determined object width OB and/or object height OH of the trailer 3.
  • the marker positions P1, P2, P3, P4 of the markers M1, M2, M3, M4 can also be taken into account if they are arranged, for example, on the edges 17 of the front side 3a.
  • the angle between the trailer center axis 40 of the trailer 3 and the towing-vehicle center axis 41 of the towing vehicle 2 is then the articulation angle KW, which can also be displayed.
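The articulation angle KW as defined above, i.e. the angle between the two center axes, can be sketched for the ground plane as follows; the axes are represented as 2D direction vectors with invented values:

```python
import math

# Sketch: articulation angle KW between the towing-vehicle center axis 41
# and the trailer center axis 40, both given as 2D direction vectors in
# the ground plane of the vehicle-fixed coordinate system K2.

def articulation_angle(axis_towing, axis_trailer) -> float:
    """Signed angle KW in radians, normalized to [-pi, pi)."""
    ang = math.atan2(axis_trailer[1], axis_trailer[0]) - \
          math.atan2(axis_towing[1], axis_towing[0])
    return (ang + math.pi) % (2 * math.pi) - math.pi
```

Tracking KW over time yields the articulation angle change dKW mentioned above as a cornering-stability indicator.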
  • the display or superimposition of the center axes 40, 41 enables the driver to position both lines relative to one another, e.g. to bring them one on top of the other, so that he can be supported during a coupling process, for example.
  • An axis that is parallel to the trailer center axis 40 of the trailer 3 or the towing vehicle center axis 41 of the towing vehicle 2 can also be drawn in each case with the same effect.
  • control information Sl can be superimposed on the image B by the display device 8, which shows the driver, for example by arrows, in which direction he has to steer the towing vehicle 2 in order to bring the second coupling point 7 closer to the first coupling point 6.
  • the control information Sl can, for example, be determined by a corresponding control algorithm on the control unit 5, which determines a suitable trajectory.
  • a cargo space camera 10 can be provided in the cargo space 10a of the trailer 3, which can detect the loaded freight.
  • the cargo space images LB recorded by the cargo space camera 10 can be displayed by the display device 8.
  • the cargo space images LB can be placed over the images B of the camera 4 in such a way that they are displayed in the area of the trailer 3 that has been verified by the markers M1, M2, M3, M4.
  • the driver can also check the freight during operation or recognize what the trailer 3 detected by the camera 4 has loaded. This can be done when approaching or while driving past the trailer 3, e.g. at a loading ramp 11a, in order to visually verify whether a trailer 3 is assigned to the towing vehicle 2.
  • In addition to the internal application in a multi-part vehicle 1 as described, it can be provided as part of a distance assistance function F3 that the markers M1, M2, M3, M4 are positioned, for example, on a building 11, e.g. a loading ramp 11a, a garage door 11b, etc., or on a moving or stationary third-party vehicle 12 or on other obstacles in the environment U, each of which can be recognized by the camera 4 as a potentially relevant object O.
  • the markers M1, M2, M3, M4 can thus also be used in the manner described to determine a relevant object plane OE of these objects O when the marker positions P1, P2, P3, P4 and/or the marker vectors V1, V2, V3, V4 are known. These can then also be combined with area markings Mf and/or light sources 21 and/or a self-luminous coating 27 and/or a reflective coating 26 in the described manner in order to enable the object plane OE to be determined as reliably and simply as possible.
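Once at least three marker positions are known in the vehicle-fixed coordinate system K2, the object plane OE and the normal distance AN to it follow from elementary vector geometry. A minimal sketch, with invented coordinates and without the image-side transformation-matrix step:

```python
# Sketch: determine the object plane OE (unit normal n and offset d, with
# plane equation n . x = d) from three marker positions P1, P2, P3 given
# in the vehicle-fixed coordinate system K2, and the normal distance AN
# from an origin (e.g. the camera 4) to that plane.

def object_plane(p1, p2, p3):
    """Return (n, d) of the plane through the three marker positions."""
    u = [b - a for a, b in zip(p1, p2)]   # first in-plane vector
    v = [b - a for a, b in zip(p1, p3)]   # second in-plane vector
    # plane normal n = u x v, then normalized
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    norm = sum(c * c for c in n) ** 0.5
    n = [c / norm for c in n]
    d = sum(nc * pc for nc, pc in zip(n, p1))
    return n, d

def normal_distance(n, d, origin=(0.0, 0.0, 0.0)) -> float:
    """Normal distance AN from the origin to the plane (n, d)."""
    return abs(sum(nc * oc for nc, oc in zip(n, origin)) - d)
```

A fourth marker, as provided in the description, would make this determination redundant and allow a consistency check.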
  • V1, V2, V3, V4 marker vectors
  • Z movement state
  • xB, yB image coordinates
  • x1, y1, z1 first coordinates in the first coordinate system K1
  • x2, y2, z2 second coordinates in the second coordinate system K2

Landscapes

  • Engineering & Computer Science (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a method for determining a pose (PO) of an object (O) relative to a vehicle (1), the object (O) having at least three markers (M1, M2, M3, M4) and being detected by at least one camera (4) on the vehicle (1), the method comprising the following steps: detecting the markers (M1, M2, M3, M4) with the camera (4) and generating an image (B), marker representations being assigned to the markers (M1, M2, M3, M4) in the image; determining marker positions and/or marker vectors of the markers (M1, M2, M3, M4) on the detected object (O); determining a transformation matrix (T) as a function of the marker positions and/or marker vectors and of the marker representations, the transformation matrix (T) mapping the markers (M1, M2, M3, M4) on the object (O) onto the marker representations in the image (B); determining an object plane (OE), formed on the object (O) by the markers (M1, M2, M3, M4), in a second coordinate system (K2) as a function of the determined transformation matrix (T), the second coordinate system (K2) being fixed to the vehicle, in order to determine the pose (PO) of the object (O) relative to the vehicle (1). According to the invention, the at least three markers (M1, M2, M3, M4) are spatially extended on the object (O) and are assigned planar marker representations in the image.
EP21715163.8A 2020-03-26 2021-03-22 Procédé permettant de déterminer la pose d'un objet, procédé permettant de commander un véhicule, unité de commande et véhicule Pending EP4128160A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102020108416.1A DE102020108416A1 (de) 2020-03-26 2020-03-26 Verfahren zum Ermitteln einer Pose eines Objektes, Verfahren zum Steuern eines Fahrzeuges, Steuereinheit und Fahrzeug
PCT/EP2021/057174 WO2021191099A1 (fr) 2020-03-26 2021-03-22 Procédé permettant de déterminer la pose d'un objet, procédé permettant de commander un véhicule, unité de commande et véhicule

Publications (1)

Publication Number Publication Date
EP4128160A1 true EP4128160A1 (fr) 2023-02-08

Family

ID=75277976

Family Applications (1)

Application Number Title Priority Date Filing Date
EP21715163.8A Pending EP4128160A1 (fr) 2020-03-26 2021-03-22 Procédé permettant de déterminer la pose d'un objet, procédé permettant de commander un véhicule, unité de commande et véhicule

Country Status (4)

Country Link
US (1) US20230196609A1 (fr)
EP (1) EP4128160A1 (fr)
DE (1) DE102020108416A1 (fr)
WO (1) WO2021191099A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102022111110A1 (de) 2022-05-05 2023-11-09 Bayerische Motoren Werke Aktiengesellschaft Markervorrichtung zur Positionsbestimmung sowie Verfahren zum Installieren einer Markervorrichtung
CN117622322B (zh) * 2024-01-26 2024-04-26 杭州海康威视数字技术股份有限公司 转角检测方法、装置、设备及存储介质

Family Cites Families (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002012172A (ja) 2000-06-30 2002-01-15 Isuzu Motors Ltd トレーラ連結角検出装置
DE10302545A1 (de) 2003-01-23 2004-07-29 Conti Temic Microelectronic Gmbh Automatisches Ankuppel bzw. Andocken mittels 2D- und 3D-Bildsensorik
DE102004008928A1 (de) 2004-02-24 2005-09-08 Bayerische Motoren Werke Ag Verfahren zum Ankuppeln eines Anhängers unter Einsatz einer Fahrzeugniveauregulierung
DE102004025252B4 (de) 2004-05-22 2009-07-09 Daimler Ag Anordnung zur Bestimmung des Gespannwinkels eines Gliederzugs
US20070065004A1 (en) * 2005-08-01 2007-03-22 Topcon Corporation Three-dimensional measurement system and method of the same, and color-coded mark
DE102006040879B4 (de) 2006-08-31 2019-04-11 Bayerische Motoren Werke Aktiengesellschaft Einpark- und Rückfahrhilfe
DE102006056408B4 (de) 2006-11-29 2013-04-18 Universität Koblenz-Landau Verfahren zum Bestimmen einer Position, Vorrichtung und Computerprogrammprodukt
GB2447672B (en) 2007-03-21 2011-12-14 Ford Global Tech Llc Vehicle manoeuvring aids
RU2009113008A (ru) * 2009-04-08 2010-10-20 Михаил Юрьевич Воробьев (RU) Способ определения позиции и ориентации прицепа транспортного средства и устройство для его осуществления
US9139977B2 (en) * 2010-01-12 2015-09-22 Topcon Positioning Systems, Inc. System and method for orienting an implement on a vehicle
US9085261B2 (en) 2011-01-26 2015-07-21 Magna Electronics Inc. Rear vision system with trailer angle detection
US9335162B2 (en) 2011-04-19 2016-05-10 Ford Global Technologies, Llc Trailer length estimation in hitch angle applications
DE102012003992A1 (de) * 2012-02-28 2013-08-29 Wabco Gmbh Zielführungssystem für Kraftfahrzeuge
GB2513392B (en) * 2013-04-26 2016-06-08 Jaguar Land Rover Ltd System for a towing vehicle
US9464887B2 (en) * 2013-11-21 2016-10-11 Ford Global Technologies, Llc Illuminated hitch angle detection component
US9437055B2 (en) 2014-08-13 2016-09-06 Bendix Commercial Vehicle Systems Llc Cabin and trailer body movement determination with camera at the back of the cabin
US10384607B2 (en) 2015-10-19 2019-08-20 Ford Global Technologies, Llc Trailer backup assist system with hitch angle offset estimation
DE102016209418A1 (de) 2016-05-31 2017-11-30 Bayerische Motoren Werke Aktiengesellschaft Betreiben eines Gespanns mittels Vermessung der relativen Lage eines Informationsträgers über eine Ausleseeinrichtung
US10073451B2 (en) 2016-08-02 2018-09-11 Denso International America, Inc. Safety verifying system and method for verifying tractor-trailer combination
DE102016011324A1 (de) 2016-09-21 2018-03-22 Wabco Gmbh Verfahren zur Steuerung eines Zugfahrzeugs bei dessen Heranfahren und Ankuppeln an ein Anhängerfahrzeug
AU2017334261B2 (en) * 2016-09-27 2022-12-01 Towteknik Pty Ltd Device, method, and system for assisting with trailer reversing
DE102016218603A1 (de) 2016-09-27 2018-03-29 Jost-Werke Deutschland Gmbh Vorrichtung zur Positionserkennung eines ersten oder zweiten miteinander zu kuppelnden Fahrzeugs
DE102017208055A1 (de) * 2017-05-12 2018-11-15 Robert Bosch Gmbh Verfahren und Vorrichtung zur Bestimmung einer Neigung eines kippbaren Anbaugerätes eines Fahrzeugs
IT201700054083A1 (it) 2017-05-18 2018-11-18 Cnh Ind Italia Spa Sistema e metodo di collegamento automatico tra trattore ed attrezzo
US10346705B2 (en) 2017-06-20 2019-07-09 GM Global Technology Operations LLC Method and apparatus for estimating articulation angle
DE102017119968B4 (de) 2017-08-31 2020-06-18 Saf-Holland Gmbh Anhänger und System zur Identifikation eines Anhängers und zur Unterstützung eines Ankupplungsprozesses an eine Zugmaschine
DE102017119969B4 (de) * 2017-08-31 2023-01-05 Saf-Holland Gmbh Anhänger mit einer Anhänger-Steuervorrichtung, Ankupplungssystem und Verfahren zur Durchführung eines Kupplungsprozesses
DE102018203152A1 (de) * 2018-03-02 2019-09-05 Continental Automotive Gmbh Anhängerwinkelbestimmungssystem für ein Fahrzeug
DE102018210340B4 (de) 2018-06-26 2024-05-16 Zf Friedrichshafen Ag Verfahren und System zum Ermitteln einer Relativpose zwischen einem Zielobjekt und einem Fahrzeug

Also Published As

Publication number Publication date
WO2021191099A1 (fr) 2021-09-30
US20230196609A1 (en) 2023-06-22
DE102020108416A1 (de) 2021-09-30


Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20221026

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)