WO2008064892A1 - Method for determining a position, device and computer program product

Method for determining a position, device and computer program product

Info

Publication number
WO2008064892A1
Authority
WO
WIPO (PCT)
Prior art keywords
trailer
points
point
longitudinal axis
image data
Prior art date
Application number
PCT/EP2007/010387
Other languages
German (de)
English (en)
Inventor
Elisabeth Balcerak
Dieter ZÖBEL
Thorsten Weidenfeller
Original Assignee
Universität Koblenz-Landau
Priority date
Filing date
Publication date
Application filed by Universität Koblenz-Landau filed Critical Universität Koblenz-Landau
Publication of WO2008064892A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B62 LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D MOTOR VEHICLES; TRAILERS
    • B62D 13/00 Steering specially adapted for trailers
    • B62D 13/06 Steering specially adapted for trailers for backing a normally drawn trailer
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60D VEHICLE CONNECTIONS
    • B60D 1/00 Traction couplings; Hitches; Draw-gear; Towing devices
    • B60D 1/24 Traction couplings; Hitches; Draw-gear; Towing devices characterised by arrangements for particular functions
    • B60D 1/245 Traction couplings; Hitches; Draw-gear; Towing devices characterised by arrangements for particular functions for facilitating push back or parking of trailers
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60D VEHICLE CONNECTIONS
    • B60D 1/00 Traction couplings; Hitches; Draw-gear; Towing devices
    • B60D 1/58 Auxiliary devices
    • B60D 1/62 Auxiliary devices involving supply lines, electric circuits, or the like
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B62 LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D MOTOR VEHICLES; TRAILERS
    • B62D 15/00 Steering not otherwise provided for
    • B62D 15/02 Steering position indicators; Steering position determination; Steering aids
    • B62D 15/027 Parking aids, e.g. instruction means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 2520/00 Input parameters relating to overall vehicle dynamics
    • B60W 2520/22 Articulation angle, e.g. between tractor and trailer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30248 Vehicle exterior or interior

Definitions

  • the present invention relates to a method for determining a position of a biaxial trailer having at least one movable axle, a device for determining a position of a biaxial trailer having at least one movable axle and a computer program product.
  • G3T vehicles: Today, road freight transport is dominated by tractor units with semi-trailers. Kinematically, such vehicles are classified in the category "general-2-trailer", or G2T vehicles for short. Compared to G2T vehicles, trucks with biaxial trailers, in particular with at least one steering axle, kinematically referred to as G3T vehicles, offer several advantages. These include:
  • One aspect of the present invention relates to a method for determining a position of a trailer having at least one movable axle relative to a towing vehicle, comprising the steps of:
  • the alignment of all three vehicle links is determined in real time.
  • the three vehicle links here are preferably the vehicle longitudinal axis of the towing vehicle, the steering axle of the trailer and the trailer longitudinal axis of the trailer.
  • the vehicle longitudinal axis of the towing vehicle is in particular an axis which, when driving straight ahead, is perpendicular to the wheel axles of the towing vehicle, i.e. for example to the front axle and the one or more rear axles of the towing vehicle.
  • the vehicle longitudinal axis of the towing vehicle is in particular arranged centrally between the opposite wheels of a (each) axle. The vehicle longitudinal axis therefore bisects the front axle(s) and the rear axle(s).
  • the steering axle of the trailer is preferably an axle which is perpendicular to the at least one movably arranged or rotatably arranged wheel axle of the trailer.
  • the movably arranged wheel axle is preferably the front axle or the frontmost wheel axle of the trailer.
  • the steering axis is arranged centrally between the corresponding, opposite wheels of the movable axle of the trailer.
  • the steering axle of the trailer is thus arranged movable relative to the trailer, wherein, in the frame of reference of the earth, the steering axle of the trailer is horizontally and / or vertically movable.
  • the trailer can have more than one moving axle.
  • Each of these movable axles may have a steering axle.
  • the steering axle may for example be parallel or identical to a drawbar of the trailer.
  • the trailer longitudinal axis of the trailer is, analogously to the vehicle longitudinal axis of the towing vehicle, perpendicular to the front axle and the rear axle of the trailer when the trailer travels straight ahead, i.e. when the front axle of the trailer and the rear axle of the trailer are parallel. Furthermore, in this state the trailer longitudinal axis is centered between opposite wheels, i.e. arranged centrally between opposite front wheels and centrally between opposite rear wheels. In other words, the trailer longitudinal axis intersects the center of the front axle and the center of the rear axle of the trailer when the trailer is traveling straight ahead.
  • the trailer longitudinal axis is fixed relative to the trailer, i.e. the orientation of the trailer longitudinal axis relative to the front axle and/or the rear axle is variable.
  • the vehicle longitudinal axis of the towing vehicle and the trailer longitudinal axis of the trailer are parallel.
  • the vehicle longitudinal axis and the trailer longitudinal axis can be moved vertically against each other due to different ground clearance of towing vehicle and trailer.
  • when the team travels straight ahead, the steering axle of the trailer is, in a plan view of the team, parallel to the vehicle longitudinal axis and the trailer longitudinal axis.
  • in this case the vehicle longitudinal axis, the steering axle of the trailer and the trailer longitudinal axis form a single line in plan view.
  • the vehicle longitudinal axis and the trailer longitudinal axis are parallel, but if the axles of the towing vehicle and the trailer are at different heights relative to the road, the steering axle of the trailer is not parallel to the vehicle longitudinal axis and the trailer longitudinal axis. Rather, the steering axle of the trailer connects the vehicle longitudinal axis and the trailer longitudinal axis.
  • the vehicle longitudinal axis, the steering axle and the trailer longitudinal axis need not be physical axles of the team. Rather, the aforementioned axes are geometric axes that are used to describe the kinematics. However, the aforementioned axes can also coincide at least partially with physical axles of the team.
  • the steering axle may be at least partially identical to a longitudinal axis of a drawbar of the trailer.
  • the vehicle longitudinal axis and the trailer longitudinal axis can also be vertically displaceable relative to one another.
  • the vehicle longitudinal axis is not limited to the plane which is formed on the basis of the wheel axles of the towing vehicle.
  • the trailer longitudinal axis is not limited to the plane formed by the wheel axles of the trailer. Rather, the vehicle longitudinal axis may be any straight line which is parallel to the plane formed from the wheel axles of the front and rear wheels of the towing vehicle and which is parallel to a plane perpendicular to the wheel axles of the towing vehicle (in straight ahead travel) and the centers of the wheel axles with respect to the distance of opposing wheels comprises. The same applies mutatis mutandis to the position of the trailer longitudinal axis of the trailer.
  • the position of the steering axle of the trailer is preferably determined by a point of the towing vehicle and a point of the trailer.
  • the point of the towing vehicle is, for example, the coupling point or hitch at which the trailer is coupled to the towing vehicle.
  • the corresponding point on the trailer is, for example, the pivot point or the pivot bearing of the front axle on the trailer.
  • the term "determining" includes in particular "calculating" a position, for example the position of the observation element, in particular its absolute position, an angle, in particular an arrangement angle, etc. of the observation element. Additionally or alternatively, the term "determining" may also include an approximation method, a partial or complete read-out of a table, etc.
  • An "arrangement angle" in the sense of this invention is in particular an angle in three-dimensional space.
  • an arrangement angle may also be limited to a plane in three-dimensional space, i.e. it may be a two-dimensional quantity.
  • An "absolute position" of a measuring point, of an auxiliary point, etc. is preferably the position of the measuring point in a predetermined coordinate system, such as a Cartesian coordinate system, for example with the image recording device at the origin or with a coupling point at the origin of the coordinate system.
  • the absolute position relative to this origin is given, and the unit used is, for example, a unit in the metric system, such as meters, centimeters, and so forth.
  • the absolute position of a measuring point or an auxiliary point therefore differs from the position of the measuring point or the auxiliary point in the image data, although both positions in the coordinate system of the image recording device can be determined.
  • the image data does not have complete spatial information in three-dimensional space, since the measurement points or auxiliary points are all reproduced in the image plane.
  • the absolute position of a point is a (true) three-dimensional position specification, in which, in particular, a distance of each measuring point or auxiliary point from the coordinate origin is indicated. For example, this can be done in Cartesian coordinates and the position can be specified with respect to each of the three axes of the Cartesian coordinate system.
  • the Cartesian coordinate system can be embedded in a coordinate system of the towing vehicle and/or the trailer, wherein the vehicle longitudinal axis can be one axis of the coordinate system and the two other axes can be determined, for example, by a plane parallel to the road surface or parallel to a loading area of the towing vehicle and by a plane perpendicular to this first plane, both planes having to contain the vehicle longitudinal axis.
  • the coordinate system can also be formed with regard to the trailer.
  • the method according to the invention makes it possible to precisely determine the orientation of the three vehicle members, i.e. the vehicle longitudinal axis of the towing vehicle, the steering axle of the trailer and the trailer longitudinal axis of the trailer. This determination is preferably possible in real time.
  • the position angle ⊝2 of the steering axle of the trailer is determined on the basis of the absolute positions of the measuring points relative to the image recording device, and
  • a position angle is preferably a quantity in three-dimensional space which in particular contains a yaw angle, a pitch angle and a roll angle of the respective vehicle member; the position angle of an axis in particular describes the orientation of the axis in three-dimensional space and a rotation angle about the axis (a position angle of an axis is shown schematically in Figure 1).
  • K 1 is the intersection of the steering axle with the trailer longitudinal axis of the trailer
  • R s D (R, - arctznl-YS)
  • m is the length of the vector SR, and b, h are given dimensions of the at least one observation element.
  • the vector SR is defined by two, preferably opposite, measuring points.
  • the parameters b and h are dimensions of the observation element, in particular parameters of the depth and the width of the observation element.
  • the symbols Θ and ⊝ may have a similar, in particular identical, meaning, in particular denote equal angles; the symbol Θ serves in particular to describe a single angle, and the symbol ⊝ preferably denotes an angle triple.
  • the angle ⊝1,2 may include the angles Θg1,2, Θn1,2 and Θw1,2.
  • the angle ⊝1,2 includes, for example, only the angle Θg1,2, which is then referred to as Θ1,2.
  • the angle ⊝i may include the angles Θgi, Θni and Θwi.
  • the above conditions are exemplary conditions of a planar movement of the towing vehicle and trailer combination.
  • the team moves in a plane forward or is arranged in a plane.
  • the towing vehicle may be disposed in a plane different from that of the trailer, for example when one or more tires of the trailer are located on or above an obstacle, such as a curb, pothole, etc., while the towing vehicle is arranged normally on the roadway.
  • the roll angle of the vehicle longitudinal axis and the roll angle of the trailer longitudinal axis may differ and the yaw and pitch angle of the two axes, for example, be substantially equal.
  • the roll angle of the two axes may be the same, but the yaw and / or pitch angles differ from each other.
  • colored image data in particular RGB image data, are generated on the basis of the image recording device.
  • the colored image data is converted into black and white image data.
  • all pixels of the black-and-white image data are checked in order to detect the measurement points, and at least three auxiliary points are identified on the basis of a minimum number and a maximum number of adjacent white pixels.
  • in this way, auxiliary points are identified.
  • alternatively, all pixels of the black-and-white image data are checked in order to detect the measurement points, and at least three auxiliary points are identified on the basis of a minimum number and a maximum number of adjacent black pixels.
  • the minimum number and maximum number of adjacent pixels depends on a nominal number N of pixels per measurement point.
  • the measuring point may in this case correspond to an image of a display device, in particular an infrared diode.
  • the nominal number N of pixels can be determined in an initial step, for example a calibration step, in particular the calibration step described below, or in the calibration position; for example, it can be counted automatically and/or manually from an image, and/or estimated or calculated on the basis of the given geometry and/or the diode properties or arrangement and/or the properties of the image recording device, such as its resolution or arrangement.
  • the centroid (center of gravity) is preferably calculated for each of the auxiliary points detected on the basis of the adjacent white and/or adjacent black pixels; the position of the respective centroid is the mean of the coordinates of the corresponding pixels.
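As a minimal sketch of this centroid computation (not code from the patent; it assumes the white-pixel region of one detected diode has already been collected as a list of pixel coordinates), the centroid is simply the mean of the pixel coordinates of the region:

```cpp
#include <vector>

struct Pixel { int x; int y; };          // pixel coordinates in the image
struct Centroid { double x; double y; }; // sub-pixel centroid ("center of gravity")

// Centroid of one detected blob (e.g. the white pixels belonging to one infrared diode).
// The caller must ensure that the blob is non-empty.
Centroid blobCentroid(const std::vector<Pixel>& blob) {
    double sx = 0.0, sy = 0.0;
    for (const Pixel& p : blob) { sx += p.x; sy += p.y; }
    const double n = static_cast<double>(blob.size());
    return Centroid{ sx / n, sy / n };
}
```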
  • adjacent centroids are preferably detected, and the neighboring centroids are identified as lying on a common line segment when the sum of the distances between the individual centroids is equal to the greatest distance between two of the centroids.
  • the detected auxiliary points, in particular the centroids of the detected auxiliary points, are processed one after the other.
  • This check is accomplished by comparing the distance from the first to the third centroid with the distance from the first via the second to the third centroid. If both distances, i.e. the distance between the first and the third centroid and the sum of the distances between the first and the second centroid and between the second and the third centroid, are equal, the three centroids must necessarily lie on a straight line. Otherwise, an error is output and/or the centroids are discarded; in particular, the centroids are redetermined in this case. Alternatively, an error can be output and the centroids can then be assigned in another way so as to possibly form a leg.
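The collinearity test described above can be sketched as follows (an illustration only; the pixel tolerance eps is an assumed parameter, not a value from the patent):

```cpp
#include <cmath>

struct Centroid { double x; double y; };

static double dist(const Centroid& a, const Centroid& b) {
    return std::hypot(a.x - b.x, a.y - b.y);
}

// True if c2 lies (approximately) on the segment from c1 to c3:
// the detour via c2 must not be longer than the direct distance.
bool collinear(const Centroid& c1, const Centroid& c2, const Centroid& c3,
               double eps = 0.5 /* pixels, assumed tolerance */) {
    const double direct = dist(c1, c3);
    const double viaMid = dist(c1, c2) + dist(c2, c3);
    return std::fabs(viaMid - direct) <= eps;
}
```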
  • the ratios of the lengths between the individual points correspond to the given geometric relationships from the (actually given) geometry data, in particular the geometric data of the at least one observation element.
  • the actual geometric relationships of the auxiliary points can be predefined, for example, due to the construction and compared with the detected geometric relationships of the auxiliary points.
  • At least one leg is identified when four adjacent centroids lie on a common line segment.
  • a leg is identified only when the maximum distance between two centroids on the segment is greater than a predetermined minimum length.
  • a ramp is identified when
  • the two legs have a common point and the common point is the end point of both legs.
  • the validity of the ramp may further preferably be checked by determining the length of the legs forming the ramp. The farther a leg is turned away from the image recording device, the smaller it appears in the image data. Conversely, a leg facing the image recording device appears larger than it actually is. It follows that preferably at least one of the two legs must not fall below a certain minimum length. If both legs of a detected ramp are smaller than the predetermined minimum length, the ramp is discarded again. The above method is then preferably repeated, that is, a ramp is repeatedly identified until an identified ramp is accepted.
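A hedged sketch of this ramp test (illustrative; the fixed four-point leg representation and the minimum-length parameter are assumptions): two legs form a valid ramp when they share exactly one common end point and at least one of them reaches the minimum length.

```cpp
#include <array>
#include <cmath>

struct Centroid { double x; double y; };
// A leg: at least four collinear centroids, ordered from one end to the other
// (exactly four are used here for simplicity).
struct Leg { std::array<Centroid, 4> pts; };

static double dist(const Centroid& a, const Centroid& b) {
    return std::hypot(a.x - b.x, a.y - b.y);
}
static bool samePoint(const Centroid& a, const Centroid& b, double eps = 0.5) {
    return dist(a, b) <= eps;
}
static double legLength(const Leg& l) { return dist(l.pts.front(), l.pts.back()); }

// Two legs form a ramp if they share exactly one end point (the tip S)
// and at least one of them reaches the minimum length (minLen is an assumed parameter).
bool isRamp(const Leg& a, const Leg& b, double minLen) {
    int shared = 0;
    for (const Centroid& ea : {a.pts.front(), a.pts.back()})
        for (const Centroid& eb : {b.pts.front(), b.pts.back()})
            if (samePoint(ea, eb)) ++shared;
    if (shared != 1) return false;                           // exactly one common end point
    return legLength(a) >= minLen || legLength(b) >= minLen; // at least one leg long enough
}
```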
  • the at least three measuring points R, S, V are determined based on the position of the ramp, wherein one of the measuring points is assigned to a center of the ramp and the other two measuring points are assigned to outer ends of the ramp.
  • the measuring point R is preferably assigned to the auxiliary point arranged in the image data at the left end of the ramp.
  • the measuring point S is preferably assigned to the auxiliary point arranged in the middle of the ramp, and the measuring point V is preferably assigned to the auxiliary point arranged at the right end of the ramp in the image data.
  • more auxiliary points can also be determined, and 3, 4, 5, 10, etc. measuring points can be determined on the basis of the auxiliary points.
  • the plurality of auxiliary points can be specified by arranging a corresponding number of light-emitting diodes (see below).
  • two or more ramps can be identified, wherein, for example, the individual ramps can be arranged at a predetermined angle to one another or must be so arranged for construction reasons.
  • a first ramp may be perpendicular to a second ramp. This is particularly advantageous if the arrangement angles are determined in three-dimensional space.
  • the legs of the ramp are determined, in particular calculated, with the aid of the aforementioned steps, preferably on the basis of the detected centroids, a leg having the following properties:
  • a leg consists of at least four centroids or auxiliary points;
  • the legs forming a ramp have exactly one centroid or auxiliary point in common;
  • the ramp consists of exactly two such legs;
  • at least one of the two legs that form the ramp must not fall below a minimum length.
  • the at least one ramp is detected in an initial step; after detection of the ramp, a change in the position of the at least one auxiliary point, in particular of at least one of the measuring points, is detected, and the position of the ramp is determined again on the basis of this detection.
  • the change in the position of the at least one auxiliary point, in particular of at least one of the measuring points, is detected by checking (adjacent) pixels in an environment of the at least one auxiliary point, in particular of at least one of the measuring points, and the position of the centroid of the at least one auxiliary point, in particular of the at least one measuring point, is identified.
  • for this purpose the initially detected coordinates of at least the points R, S, V are used, and centroids are searched for again within a certain radius around them. If all three measuring points are found again in this way, the coordinates of these points are returned as R, S, V. It is also possible to consider all auxiliary points, not only the preferably three measuring points R, S, V, and to detect a change in the position of the auxiliary points. On the basis of the changed positions of the auxiliary points, the positions of the measuring points can then be redetermined.
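The re-detection of the measuring points can be sketched as follows (an illustration under assumptions; the search radius and the local blob search are left abstract and are not prescribed by the patent):

```cpp
#include <array>
#include <optional>

struct Centroid { double x; double y; };

// Callback that tries to find one blob centroid inside a window of half-size
// 'radius' (in pixels) around 'seed'; returns nothing if no blob is found there.
// The actual blob search is the same white-pixel search as in the initial detection.
using LocalSearch = std::optional<Centroid> (*)(const Centroid& seed, int radius);

// Update the previously found measuring points R, S, V. Returns false if any
// point is lost, in which case the full detection has to be run again.
bool trackPoints(std::array<Centroid, 3>& rsv, LocalSearch search, int radius) {
    for (Centroid& p : rsv) {
        const std::optional<Centroid> hit = search(p, radius);
        if (!hit) return false;   // shift too large: fall back to the full detection
        p = *hit;
    }
    return true;
}
```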
  • the auxiliary points are preferably identical to the centroids.
  • the measurement points R, S, V are thus determined by identifying auxiliary points in the image data by detecting centroids, identifying legs from these, identifying one or more ramps from the legs, and finally identifying the measurement points R, S, V.
  • for each auxiliary point, the position of the auxiliary point in pixels of the image data is determined in a calibration position of the image recording device relative to the observation element; the actual position of each auxiliary point is determined by the arrangement of a corresponding infrared diode on the observation element, and the position of each infrared diode with respect to the image recording device in the calibration position is predetermined.
  • an image data record can be generated in a calibration position, for example in a position for straight-ahead driving of the combination, on the basis of the image recording device.
  • the auxiliary points can, for example, be predetermined manually or be predefined by the construction; the actual position of each auxiliary point relative to the image recording device is known in the calibration position.
  • for the aforementioned reason(s), the position of each measuring point is likewise known.
  • the position of the auxiliary points to each other or the position of the measuring points to each other is known.
  • the distances of the auxiliary points to each other can be measured.
  • the distance between two auxiliary points in pixels or all auxiliary points to each other in pixels can be determined on the basis of the image data.
  • a position indication in pixels can be converted into an actual position specification, for example in meters, centimeters, etc., and vice versa.
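A minimal sketch of such a pixel-to-metric conversion (illustrative; it assumes a single constant scale factor obtained in the calibration position, which is a simplification of the table-based interpolation described further below):

```cpp
// Calibration: two auxiliary points whose real distance is known by construction
// are located in the calibration image; their pixel distance gives a scale factor.
struct Scale {
    double pixelsPerCm;
    double toCm(double pixels) const { return pixels / pixelsPerCm; }
    double toPixels(double cm) const { return cm * pixelsPerCm; }
};

// realDistanceCm: e.g. the 10 cm lateral spacing of neighbouring diodes mentioned
// in the text; pixelDistance: the distance of the same two diodes in the calibration image.
Scale calibrate(double pixelDistance, double realDistanceCm) {
    return Scale{ pixelDistance / realDistanceCm };
}
```

Such a constant factor is only valid at the calibration distance and plane; the later description of the invention uses a lookup table instead.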
  • auxiliary points correspond to positions of infrared diodes.
  • the auxiliary points can be determined in any other possible way.
  • the auxiliary points can be determined by attaching colored or black-and-white or patterned stickers, etc.
  • such a sticker may, for example, be a conventionally known saddle-point pattern with opposing black and white areas.
  • the position of each measuring point relative to the auxiliary points can be determined in the image data, and a position of each measuring point relative to the image recording device can be determined by calibrating the image data.
  • the distance of measuring points from one another or from measuring points to auxiliary points in pixels can be determined and, based on the calibrated or already predetermined distances of pixels, this distance can be converted into actual distances, for example in meters, centimeters, etc.
  • Particularly preferably, a trajectory of the team consisting of towing vehicle and trailer is determined, in particular automatically, on the basis of the arrangement angle Θ1,2 between the vehicle longitudinal axis of the towing vehicle and the steering axle of the trailer and of the arrangement angle Θ2,3 between the steering axle and the trailer longitudinal axis of the trailer.
  • the trajectory can be displayed to a driver during reversing, so that the driver can monitor the reversing manoeuvre, in particular the position of the trailer, and react accordingly.
  • automatic reversing of the team can also be made possible, with the driver, for example, only controlling the speed.
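As an illustration of how the two arrangement angles could feed such a trajectory or display function (a hedged sketch, not the patent's algorithm; the sign conventions and the use of the link lengths as a simple planar chain are assumptions), the planar pose of the trailer relative to the towing vehicle can be chained together from the angles:

```cpp
#include <cmath>

struct Pose2D { double x, y, heading; };   // planar pose in the towing-vehicle frame

// Chain the vehicle links in the plane: towing vehicle -> steering axle link -> trailer.
// l2, l3 are the lengths of the second and third vehicle link (cf. the static data L_i);
// theta12, theta23 are the arrangement angles between consecutive links (radians).
Pose2D trailerPose(double l2, double l3, double theta12, double theta23) {
    // Coupling point K1 is taken as the origin of the towing-vehicle frame (assumption).
    const double h2 = theta12;              // heading of the steering-axle/drawbar link
    const double k2x = -l2 * std::cos(h2);  // K2 lies behind K1 along that link
    const double k2y = -l2 * std::sin(h2);
    const double h3 = theta12 + theta23;    // heading of the trailer longitudinal axis
    return Pose2D{ k2x - l3 * std::cos(h3), k2y - l3 * std::sin(h3), h3 };
}
```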
  • Another aspect of the present invention relates to an apparatus for determining a position of a trailer having at least one movable axle relative to a towing vehicle
  • an image recording device which is designed to generate image data of the at least one observation element
  • a detection device which is designed to detect at least three predetermined measurement points in the image data
  • the observation element comprises a plurality of display devices.
  • such a display device may be a diode, in particular an infrared diode, a laser diode or a conventional semiconductor diode which emits yellow and/or green and/or red and/or blue and/or white light.
  • the display devices are preferably infrared diodes, in particular 3, 7, 10, etc., infrared diodes.
  • the presentation means may also be a sticker, a color dot, a color area, a conventional incandescent lamp, an acoustic signal transmitter, an RFID transmitter, etc.
  • the image pickup device comprises a conventional digital camera and / or a conventional analog camera.
  • Another aspect of the present invention relates to a computer program product, particularly stored or signalized on a computer readable medium, which, when loaded into the memory of a computer and executed by a computer, causes the computer to perform a method according to the invention.
  • Figure 1 a schematic representation of an axis in three-dimensional space
  • Figure 2 a schematic arrangement of objects and axes
  • Figure 3 is a schematic view of a team
  • FIG. 4 shows a schematic illustration of an observation element
  • FIG. 5a shows a schematic view of an exemplary arrangement
  • FIG. 5b shows a schematic illustration of an exemplary arrangement
  • FIG. 6 is a schematic detail view according to FIG. 3;
  • FIG. 7 shows a schematic view of an exemplary geometry of a trailer
  • FIG. 8 shows a schematic view of an exemplary recording on the basis of FIG.
  • FIG. 9 is a schematic view of a special recording
  • FIG. 10 a schematic view of partial elements
  • Figure 11a is a flow chart
  • Figure 11b is a flow chart
  • FIG. 12 is a schematic view of a geometry
  • Table 1 an overview of model values
  • Table 3 a visual representation of actual measured values
  • Table 4 a visual representation of actual measured values
  • G3T vehicle a team of towing vehicle and trailer with at least one movable axle
  • Static data: the static specification can be a length Li, i ∈ {1, ..., 3}, of each vehicle link.
  • Dynamic data: the yaw, pitch and roll angles of each vehicle link can be used as dynamic data.
  • the yaw, pitch and roll angles are usually represented as follows, for i ∈ {1, ..., 3}:
  • Θxg(Gi) denotes the yaw angle,
  • Θxn(Gi) denotes the pitch angle,
  • Θxw(Gi) denotes the roll angle.
  • the angle Θx(Gi), or this angle triple, is here a preferred position angle.
  • Optical measuring systems as described below by way of example as a preferred embodiment (s) of one or more constituents or as a preferred embodiment (s) of the device according to the invention, include, for example, a device C for optical
  • Line G1 may be an exemplary representation of a vehicle longitudinal axis of the towing vehicle (shown in Figure 3).
  • Line G3 may be an exemplary representation of a trailer longitudinal axis (shown in Figure 3).
  • the line G1 may also be an exemplary representation of the trailer longitudinal axis.
  • the line G3 may then be an exemplary representation of the vehicle longitudinal axis of the towing vehicle.
  • Line G2 may be an exemplary representation of a steering axle of the trailer (shown in Figure 3).
  • the arrangement angles between adjacent vehicle links are preferably defined as follows:
  • ⊝i,i+1 = ⊝i - ⊝i+1, i ∈ {1, ..., 2}
  • m distinguished points, in particular auxiliary points, are sought on the observation object.
  • (O1, ..., On) points are found and identified as preferred measurement points, where n ≤ m.
  • the measuring points are a subset of the auxiliary points.
  • the points (O1, ..., On) can be converted into angles, among which the
  • the lens plane is preferably a plane which in the position of use of the camera is parallel to a plane which is spanned by two axes of the towing vehicle.
  • the vehicle longitudinal axis, the steering axis and the trailer longitudinal axis are preferably arranged in a plane.
  • the lens plane is preferably parallel to this plane.
  • the output of the recognition of the observation object is a tuple (C1, ..., Cn) of angle pairs defining a family of n straight lines, each passing through the center of the lens and through the point Oi, i ∈ {1, ..., n}.
  • the points Oi may represent one or more light-emitting diodes.
  • Second process step: calculation of the angles Θ12 and Θ23.
  • the position of the object of observation, i.e. the position of the totality of the points (O1, ..., On) relative to the lens plane and the center of the lens, is given by the Cartesian coordinates
  • the position of the points (O1, ..., On) in relation to the camera C can be determined from the angle pairs (C1, ..., Cn):
  • the two angles Θ12 and Θ23 of the two-axle trailer can be determined by means of a calculation rule f2 (see below):
  • the two-axle trailer preferably moves only planar.
  • the pitch angles (Θn12, Θn23) and roll angles (Θw12, Θw23) are preferably neglected.
  • the pitch angles (Θn12, Θn23) and roll angles (Θw12, Θw23) can, however, be calculated analogously to the yaw angles (Θg12, Θg23).
  • the yaw angles (Θg12, Θg23) will hereinafter be referred to as Θ12, Θ23.
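To make the planar simplification concrete, the conversion of an angle pair into a viewing ray can be sketched as follows (illustrative; the parametrization of the angle pairs Ci as azimuth and elevation is an assumption):

```cpp
#include <cmath>

struct AnglePair { double azimuth; double elevation; };  // assumed parametrization of C_i
struct Ray { double dx, dy, dz; };                        // unit direction from the lens centre

// Full 3-D ray through the lens centre and the observed point O_i.
Ray rayFromAngles(const AnglePair& c) {
    const double ce = std::cos(c.elevation);
    return Ray{ ce * std::cos(c.azimuth), ce * std::sin(c.azimuth), std::sin(c.elevation) };
}

// Planar case (pitch and roll neglected): only the azimuth, i.e. the yaw component, is used.
double planarRayAngle(const AnglePair& c) { return c.azimuth; }
```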
  • FIG. 3 shows a team 10 with a towing vehicle 12 and a trailer 14.
  • the basis of the optical surveying system according to the preferred embodiment of FIG. 3 is a camera 16, which is preferably mounted on the rear of the tractor 12.
  • An observation object 18 is arranged on the trailer 14.
  • the observation object 18 may be mounted in front of or behind a towing device 20; in this example, the towing device 20 is identical to a steering axle 20 of the trailer 14.
  • the camera 16 is preferably mounted on the longitudinal axis 22 and in the center of the towing vehicle 12.
  • the camera 16 may also be arranged on the trailer 14 and the observation object 18 on the towing vehicle 12.
  • the camera 16 is preferably equipped with an infrared filter, which filters out the incoming light except for its infrared component. This ensures that less stray light reaches the camera 16 and that the system still works well even in bad weather conditions.
  • the camera 16 may also be an optical element which detects electromagnetic radiation of a predetermined wavelength, e.g. UV radiation, etc.
  • the optical element may also be a sensor or detector of laser radiation, in particular a laser scanner.
  • the laser light may have any predetermined wavelength, e.g. red, blue, green light, etc.
  • the laser may be, in particular, an infrared laser, a UV laser, etc.
  • the laser is a conventional diode laser.
  • the counterpart to the camera 16, the observation object 18 or the Observation element 18, is preferably a pyramid-like ramp 18, which is preferably located on a front side of the trailer 14 substantially at the height of the camera 16.
  • On the ramp 18, infrared diodes 24 are arranged as preferred auxiliary points, which can be recognized at least partially on the basis of the image data of the camera 16. More or fewer infrared diodes 24 can be used; for example, 3, 4, 5, 6, 8, 10, 12, 14, 16, 18, etc. infrared diodes 24 can be used. Preferably, the infrared diodes 24 are arranged at substantially the same distance from and symmetrically to one another.
  • auxiliary points are predefined, e.g. by arrangement of laser diodes and / or acoustic signal generators, in particular in the ultrasonic range, one or more colored stickers, colored surfaces, etc.
  • for determining the yaw angles, of these preferably seven infrared diodes 24, particularly preferably only the two outer auxiliary points (R, V) and the tip (S) are required as preferred measuring points.
  • the angle at which the camera 16 sees the point S is calculated and passed on to the algorithm.
  • the measuring points R, S, V are shown in Figure 3 by the reference numeral 26.
  • a rear axle 28 of the trailer 14 is also shown in Figure 3, on which further light-emitting diodes 30 are arranged.
  • the light-emitting diodes 30 can be imaged, for example, by means of a camera 32.
  • the rear axle 28 may, for example, be a steerable axle, and on the basis of the LEDs 30, analogously to the light-emitting diodes 24, the orientation of the rear axle 28 relative to other axes, such as a trailer longitudinal axis 34, the steering axle 20 and the vehicle longitudinal axis 22, can be determined.
  • a ramp (not shown) similar or identical to ramp 18 may be arranged on the rear axle 28.
  • a preferred ramp 18 is shown by way of example as a preferred observation element 18.
  • Ramp 18 has a plurality of diodes 24 as preferred auxiliary points. Three of the diodes 24 are used as preferred measurement points to determine the aforementioned angles. Instead of the ramp 18, the diodes 24 etc. can be arranged directly on the trailer 14 and the towing vehicle 12, respectively. For example, seven infrared diodes 24 may be disposed on the trailer 14. The infrared diodes 24 thus together form an observation element 18 and in each case an auxiliary point. If, for example, only three infrared diodes 24 are arranged, the three infrared diodes 24 together represent an observation element 18 and in each case an auxiliary point and also a measuring point. However, the three diodes 24 can also each represent an observation element 18.
  • Wooden panels were attached. At the back, an 80 cm x 80 cm black wooden panel was placed vertically, which is intended to simulate the measuring area at the front of the trailer. On this plate
  • there are preferably seven infrared diodes 24 on this ramp 18, which are mounted at a lateral distance of 10 cm and 7 cm apart in height. This results in the diodes 24
  • FIG. 6 is a detailed view of a section of FIG. 3. Identical components of Figure 3 and Figure 6 are therefore provided with identical reference numerals.
  • FIG. 6 shows geometric relations of the observation device 18, for example in relation to the towing vehicle 12, in particular the vehicle longitudinal axis 22 of the towing vehicle 12 and / or the trailer 14, in particular the steering axle 20 and / or the trailer longitudinal axis 34.
  • the ramp 18 in particular has a three-dimensional shape
  • FIG. 6 shows the projection of the ramp 18 into the aforementioned plane.
  • the ramp 18 thus has a depth h and a width 2b.
  • the depth h is preferably defined by the distance of the foremost diode 24, ie the diode 24 closest to the towing vehicle 12 and the rearmost diode 24. In the example shown, the depth h equals the distance of the points S and P.
  • the width 2b is equal to the distance of the two outermost diodes 24. In this example, the width 2b is equal to the distance of the points V and R from each other.
  • FIG. 6 also shows the coupling points 40, 42, which preferably determine the position of the steering axle 20. Furthermore, the arrangement angles ⁇ i, 2 and ⁇ 2 , 3 are shown, which are determined by the position of the vehicle longitudinal axis 22, the steering axis 20 and the trailer longitudinal axis 34.
  • the combination can be identified with its orthogonal projection onto this plane.
  • the points R, S, V, K1, K2 are consequently points in the (predefined) plane, or projections of the points into this plane.
  • the plane can be:
  • the term "movement” does not mean a movement in the sense of forward and / or reverse travel, but the change of the position of the trailer in relation to the vehicle.
  • the point K1 is preferably defined as the reference point 40 of the movement and is assigned fixed coordinates in the freely chosen coordinate system (x, y).
  • the point K1 is therefore a fixed point in the chosen coordinate system.
  • The connecting straight line from K1 to the camera 16 defines the x-axis.
  • the camera 16 also has fixed coordinates in the exemplary coordinate system. It is also possible that the camera 16 defines the origin of the coordinate system.
  • the angles at which the camera 16 sees the points, that is, at which the points are imaged in the image data, are determined. These are referred to below as recording angles and are denoted by CR, CS, CV.
  • the recording angles are preferably angles in spherical coordinates or in polar coordinates in the coordinate system of the camera 16 or the reference point K 1 .
  • the two yaw angles Θg12, Θg23 are determined.
  • the yaw angles Θ12, Θ23 serve to represent the relative position of the towing vehicle 12 and the trailer 14.
  • the recording angles CR, CS, CV correspond to exactly one position of R, S, V, i.e. the triangle (S, V, R) can be placed in only a single way between the rays such that its corners lie on the rays; this is proved in the following.
  • the calculation rule f1 is formulated in the first step and the calculation rule f2 in the second step.
  • the following description of the first step is a definition of the calculation rule f1, and the following description of the second step is a definition of the calculation rule f2.
  • First step The coordinates of R, S, V are determined from C R , C S , C V.
  • R, S, V are on rays emanating from C and through the
  • the corresponding angles in FIG. 8 are determined by the recording angles:
  • V0 = -R0 (15)
  • the triangle (VCS) and the triangle (SCR) have the angle C 1 and C 2 at point C, respectively.
  • both triangles have a side of equal length m and a common unknown side of length CS.
  • m: the side of equal length
  • CS: the length of the common unknown side
  • V D (R -2 -p, S) (28)
  • V D (R s , -S 2 , S) (29)
  • since the ramp is preferably made of wood, this case cannot occur in the embodiment described above.
  • the tip S of the ramp would obscure the point R, which would then no longer be perceptible to the camera (as shown by way of example in Figure 10).
  • the above embodiment is limited by the geometry of the ramp to a certain angle.
  • the ramp can at least partially consist of a partially transparent or completely transparent material. This may preferably be limited to the light of the wavelength of the diodes.
  • the ramp may be made of a material that is at least partially transparent to infrared light.
  • the desired quantity CS could be determined, in particular calculated, either from the formula for R0 or from the formula for V0. In order for this to be possible in this case as well, the following must apply:
  • the numerator is negative, i.e.
  • the desired quantity CS can then be calculated with the help of R0.
  • the subsequent calculation of the points R, S, V is analogous to Case 1.
  • the point K2 is calculated by rotating the vector SR, which has the length m, by the fixed angle
  • Θ23 is calculated as the angle between the vectors (K2K1) and (K2P).
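The two operations just mentioned, rotating about a point and measuring the angle at K2, can be sketched directly (illustrative C++; the fixed rotation angle phi0 and the sign conventions are assumptions):

```cpp
#include <cmath>

struct Vec2 { double x, y; };

static Vec2 sub(const Vec2& a, const Vec2& b) { return Vec2{ a.x - b.x, a.y - b.y }; }

// Rotate point p about center c by angle phi (counter-clockwise, radians).
Vec2 rotateAbout(const Vec2& p, const Vec2& c, double phi) {
    const Vec2 d = sub(p, c);
    return Vec2{ c.x + d.x * std::cos(phi) - d.y * std::sin(phi),
                 c.y + d.x * std::sin(phi) + d.y * std::cos(phi) };
}

// Signed angle at vertex k2 between the rays k2->k1 and k2->p (radians).
double angleBetween(const Vec2& k2, const Vec2& k1, const Vec2& p) {
    const Vec2 u = sub(k1, k2), v = sub(p, k2);
    return std::atan2(u.x * v.y - u.y * v.x, u.x * v.x + u.y * v.y);
}

// Usage sketch (names R, S, K1, P and the fixed angle phi0 are placeholders):
//   Vec2 K2 = rotateAbout(R, S, phi0);
//   double theta23 = angleBetween(K2, K1, P);
```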
  • conversely, the points R, S, V and the angles CS, CR, CV can be calculated from the angles Θ12, Θ23.
  • the point P, with coordinates (x, y), is obtained as a rotation of P02 by the angle Θg23 about
  • R 012 D ((K ⁇ K 2 -PK 2 , b), ⁇ n , ⁇ , Q )) (72)
  • V 011 Di (K 1 K 2 - PK 2 ⁇ b) A 2 MO)) (74)
  • V D (V ⁇ i , ⁇ 23 , K 2 ) (75)
  • FIG. 11a shows a flow chart for an overview of the process steps that are taking place.
  • In step S1, image data are generated by the image recording device 16 (shown in Figure 6 above); this step is also referred to as "fetch image".
  • In step S2, the measurement points necessary for the further determination of the arrangement angles are detected on the basis of the image data.
  • On the basis of the position of the measurement points in the image data, that is to say their actual position in pixels in the image data, the actual positions of the measurement points, i.e. for example of the infrared diodes 24, are calculated in step S3, in particular in the frame of reference of the camera.
  • the recording angles are preferably given as cylindrical coordinates or spherical coordinates, in particular polar or spherical coordinates.
  • In step S4, the arrangement angle of the vehicle longitudinal axis relative to the steering axle and the arrangement angle of the steering axle relative to the trailer longitudinal axis are determined, in particular calculated, on the basis of the recording angles.
  • These angles can be output by means of a display, a computer interface, a radio transmission, etc., for example.
  • These angles can also be used to determine a trajectory of the team.
  • a change in the trajectory can be determined and in particular displayed.
  • the arrangement angle or change thereof can also be used to automatically control the reversing of a trailer.
  • the image data is generated on the basis of the image recording device.
  • These image data are preferably color image data, in particular RGB image data.
  • the preferred embodiment of the present invention described below is implemented as a computer program product under Linux, for example SuSE Linux 10, in the C++ language, for example with the GCC 4.0 compiler.
  • This is preferably realized with the help of the QT library, in particular with QT 3.3.
  • the Linux standard library Video4Linux (referred to as v4l) is used.
  • This driver preferably reads the images in step S1, as shown in FIG. 11a, from a frame grabber card as an exemplary component of the image recording device, to which the camera is connected as an exemplary, further component of the image recording device.
  • the synchronization of the software with the camera is preferably also done via this driver, which finally gives a complete RGB image from the camera.
  • Since step S2 only has to detect infrared diodes as exemplary auxiliary points or measuring points, it is not necessary to work with RGB images throughout. Therefore, in a first substep of step S2 (shown in FIG. 11a), in particular in step S20, a grayscale image is obtained from the RGB image.
  • the black-and-white image is then calculated by deciding for each pixel i:
  • This threshold value is preferably chosen high enough that as many noise pixels as possible disappear. On the other hand, the value must be low enough that the points of the ramp, i.e. the auxiliary points or the measuring points, are still clearly recognizable.
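A minimal sketch of this conversion (illustrative; the equal-weight grey conversion and the threshold value are assumptions, not values from the patent):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

struct RGB { std::uint8_t r, g, b; };

// Grey value of one pixel (equal channel weights are an assumption).
inline std::uint8_t toGray(const RGB& p) {
    return static_cast<std::uint8_t>((p.r + p.g + p.b) / 3);
}

// Binarize the whole image against a threshold:
// true = white (candidate diode pixel), false = black (background).
std::vector<bool> toBlackWhite(const std::vector<RGB>& image, std::uint8_t threshold) {
    std::vector<bool> bw(image.size());
    for (std::size_t i = 0; i < image.size(); ++i)
        bw[i] = toGray(image[i]) >= threshold;
    return bw;
}
```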
  • a recursive white point search function is applied to the black and white image.
  • the image is preferably traversed pixel by pixel, and it is checked whether the current pixel is white.
  • the contiguous white pixels are preferably counted. Since the infrared diodes to be detected have a maximum and a minimum size, two parameters enter into the function here. With the aid of these it is then decided whether the detected area is an infrared diode, i.e. an auxiliary point or a measuring point, or not:
  • Each diode is assigned a nominal number of N pixels, which nominal number of pixels may depend on the following constraints:
  • the value N is determined or fixed in advance. This can be done by means of a calibration routine or calibration method: for example, before the first use of the method according to the invention or of the device according to the invention, when the diodes and the camera are in the operating position, an image in which the position of the diodes is known can be evaluated. On the basis of this image, the nominal number N of pixels per diode can be determined. Alternatively or additionally, the nominal number N of pixels per diode can also be determined, in particular calculated, on the basis of geometrical and optical considerations. In this case, N represents a theoretical value.
  • the value of min is at least about 2, at least about 9, at least about 16, at least about 36, further preferably at least between at least about 16 and at least about 36 pixels, more preferably at least about 4 pixels.
  • the value of max is at least about 36, at least about 49, more preferably between about 30 and about 50 pixels, most preferably at most about 250 pixels.
  • the value of N is between about 25 and about 100, more preferably between about 30 and about 50 pixels, more preferably between about 36 and about 49 pixels. Further preferably, the value of N is at most about 250 pixels, more preferably less than about 150 pixels.
  • an algorithm searches for contiguous sets of pixels of size "size", where (approximately) min <= size <= max applies.
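The white-point search with the min/max acceptance test can be sketched as follows (illustrative; the flat image representation and the explicit stack instead of recursion are implementation choices not prescribed by the patent). The caller accepts the returned region as one diode, i.e. one auxiliary point, only if min <= region.size() <= max.

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Collect the 4-connected white region containing (x0, y0) and mark it as visited.
std::vector<std::pair<int,int>> whiteRegion(std::vector<bool>& white, int width, int height,
                                            int x0, int y0) {
    std::vector<std::pair<int,int>> region;
    std::vector<std::pair<int,int>> stack{{x0, y0}};
    while (!stack.empty()) {
        auto [x, y] = stack.back();
        stack.pop_back();
        if (x < 0 || y < 0 || x >= width || y >= height) continue;
        const std::size_t idx = static_cast<std::size_t>(y) * width + x;
        if (!white[idx]) continue;
        white[idx] = false;                  // mark as visited
        region.push_back({x, y});
        stack.push_back({x + 1, y});
        stack.push_back({x - 1, y});
        stack.push_back({x, y + 1});
        stack.push_back({x, y - 1});
    }
    return region;
}
```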
  • the method may further preferably comprise an initial step in which the nominal number N pixels per diode is determined.
  • the nominal number N can in this case be measured in a calibration step and/or determined by means of geometric and/or optical considerations on the basis of
  • the calibration step may be identical or part of the above-described calibration step.
  • In step S22 it is selected whether a so-called "large method" or a so-called "small method" is to be performed.
  • the large method includes in particular the steps S23, S24, S25.
  • the small method includes steps S26, S27, S25.
  • a leg preferably has the following properties:
  • the ramp consists of exactly two such legs
  • at least one of the two legs which are to form the ramp must not fall below a minimum length.
  • the leg recognition preferably proceeds as follows.
  • the found centroids are processed one after the other. For each point, the nearest point is searched for; for these points, in turn, the nearest points are searched for. It is then checked whether the points lie on a straight line.
  • in step S24 this ramp is checked by measuring the length of the legs, i.e. it is checked whether a detected leg is actually a valid leg. To do this, the length of the leg is checked. The farther a leg is turned away from the camera as preferred image recording device, the smaller it appears in the camera image as preferred image data. Conversely, a leg turned towards the camera appears larger than it actually is. It follows that at least one of the two legs must not fall below a certain minimum length.
  • If both legs of a detected ramp are shorter than the minimum length, the detected ramp is discarded and another one is searched for, i.e. the method described above is at least partially performed again.
  • the aforementioned method can be carried out repeatedly or partially completely until one or more ramps are detected or determined.
  • In step S25, the coordinates of the right (V), left (R) and middle (S) infrared diodes, viewed from the camera, are preferably returned in Cartesian coordinates, cylindrical coordinates or spherical coordinates. Subsequently, steps S3 and S4 can be performed. Steps S20 to S25 are preferably substeps of step S2.
  • the preferred embodiment of the method according to the invention described below, i.e. the "small method", makes use of the fact that in reality the movement of the trailer from image to image is only very small due to the high frame rate of the camera. The small method is therefore carried out in step S22 if the large method has already been carried out beforehand and/or measuring points, in particular the three measuring points R, S, V, have already been found.
  • In step S27, the coordinates of the points R, S and V recognized by the large method are used, and centroids are searched for again within a certain radius around them. If all the measuring points, i.e. according to this embodiment the three points R, S, V, are found again in this way (step S27), then the coordinates of these (new) points are returned as the coordinates for R, S, V (step S25). If it is determined in step S27 that one or more of the points cannot be found, the large method is preferably performed, since the shift was too large to be able to say clearly that these are the old points. Consequently, after step S27, i.e. after the failed detection of all points, step S23 is executed.
  • For the conversion of the found R, S, V points into the relative angles from the point of view of the camera, the pixel coordinates must preferably first be converted into distances from the perpendicular of the camera. According to a preferred embodiment, this is done with the aid of a table which is created beforehand, for example in an initial step in which the preferred device is adapted to the team.
  • The pixel values are then interpolated and converted, for example, into centimeters. This is shown by way of example in FIG. After the pixel value has been converted into the corresponding distance from the perpendicular, the angle of the point can be calculated using the tangent function of the recording:
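A sketch of this last conversion (illustrative; the layout of the calibration table, the linear interpolation and the known distance along the camera perpendicular are assumptions):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// One calibration entry: a pixel column and the measured lateral distance (cm)
// of that column from the camera perpendicular, recorded in the initial step.
struct CalibEntry { double pixel; double lateralCm; };

// Linear interpolation in the calibration table (sorted by ascending pixel value).
double pixelToLateralCm(const std::vector<CalibEntry>& table, double pixel) {
    auto hi = std::lower_bound(table.begin(), table.end(), pixel,
                               [](const CalibEntry& e, double p) { return e.pixel < p; });
    if (hi == table.begin()) return table.front().lateralCm;
    if (hi == table.end())   return table.back().lateralCm;
    auto lo = hi - 1;
    const double t = (pixel - lo->pixel) / (hi->pixel - lo->pixel);
    return lo->lateralCm + t * (hi->lateralCm - lo->lateralCm);
}

// Recording angle of the point: tan(angle) = lateral distance / distance along the
// perpendicular (depthCm is assumed to be known, e.g. from the calibration position).
double recordingAngle(const std::vector<CalibEntry>& table, double pixel, double depthCm) {
    return std::atan2(pixelToLateralCm(table, pixel), depthCm);
}
```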

Landscapes

  • Engineering & Computer Science (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a method for determining the position of a two-axle trailer (14), having at least one movable axle, relative to a towing vehicle (12). The method comprises the steps of generating image data associated with at least one measuring point (R, S, V), detecting at least three predetermined measuring points (R, S, V) in the image data, determining the position of each measuring point (R, S, V) relative to the image recording device (16, 32) in reference coordinates of a predetermined coordinate system, determining the absolute position of each measuring point (R, S, V) relative to the image recording device (16, 32), and deriving therefrom the arrangement angle (Θ1,2) between the longitudinal axis (22) of the towing vehicle (12) and the steering axle (20) of the trailer (14) and the arrangement angle (Θ2,3) between the steering axle (20) and the longitudinal axis (34) of the trailer (14).
PCT/EP2007/010387 2006-11-29 2007-11-29 Procédé de détermination d'une position, dispositif et produit de programme informatique WO2008064892A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102006056408A DE102006056408B4 (de) 2006-11-29 2006-11-29 Verfahren zum Bestimmen einer Position, Vorrichtung und Computerprogrammprodukt
DE102006056408.1 2006-11-29

Publications (1)

Publication Number Publication Date
WO2008064892A1 true WO2008064892A1 (fr) 2008-06-05

Family

ID=39047835

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2007/010387 WO2008064892A1 (fr) 2006-11-29 2007-11-29 Procédé de détermination d'une position, dispositif et produit de programme informatique

Country Status (2)

Country Link
DE (1) DE102006056408B4 (fr)
WO (1) WO2008064892A1 (fr)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2441320A3 (fr) * 2010-10-14 2014-10-29 Deere & Company Système de guidage de véhicule
EP2950040A1 (fr) * 2014-05-27 2015-12-02 MAN Truck & Bus AG Procédé et système d'assistance du conducteur pour l'assistance d'un ensemble attelé de véhicule utilitaire
CN105270408A (zh) * 2014-05-27 2016-01-27 曼卡车和巴士股份公司 用于测定商用车的行驶动态状态的方法和驾驶员辅助系统
CN105291980A (zh) * 2014-05-27 2016-02-03 曼卡车和巴士股份公司 用于协助商用车的驾驶员的方法以及驾驶员辅助系统
CN106225723A (zh) * 2016-07-25 2016-12-14 浙江零跑科技有限公司 一种基于后视双目相机的多列车铰接角测量方法
CN108732590A (zh) * 2018-05-16 2018-11-02 重庆邮电大学 一种双足机器人及一种斜坡角度测量方法
US10163033B2 (en) 2016-12-13 2018-12-25 Caterpillar Inc. Vehicle classification and vehicle pose estimation
CN109927716A (zh) * 2019-03-11 2019-06-25 武汉环宇智行科技有限公司 基于高精度地图的自主垂直泊车方法
WO2019120918A1 (fr) * 2017-12-18 2019-06-27 Robert Bosch Gmbh Procédé et dispositif permettant de déterminer un angle relatif entre deux véhicules
CN111572633A (zh) * 2019-02-18 2020-08-25 上海汽车集团股份有限公司 转向角度检测方法、装置及系统
US20210019904A1 (en) * 2018-03-02 2021-01-21 Continental Automotive Gmbh Trailer angle determination system for a vehicle
WO2021191099A1 (fr) 2020-03-26 2021-09-30 Zf Cv Systems Global Gmbh Procédé permettant de déterminer la pose d'un objet, procédé permettant de commander un véhicule, unité de commande et véhicule
WO2022156630A1 (fr) * 2021-01-19 2022-07-28 北京九曜智能科技有限公司 Procédé d'attelage et système d'attelage pour atteler un véhicule à un camion tracteur
US11511800B2 (en) 2017-12-27 2022-11-29 Robert Bosch Gmbh Determining an angle of a movement path of a trailer
DE102021121869A1 (de) 2021-08-24 2023-03-02 Schaeffler Technologies AG & Co. KG Verfahren zur Kalibrierung und/oder Linearisierung eines Positionssensors; Positionssensor; Hinterachslenkung; Fahrzeug; Computerprogramm
WO2023061732A1 (fr) 2021-10-15 2023-04-20 Zf Cv Systems Global Gmbh Procédé de localisation d'une remorque, unité de traitement et véhicule
DE102021126816A1 (de) 2021-10-15 2023-04-20 Zf Cv Systems Global Gmbh Verfahren zum Ermitteln eines Knickwinkels, Verarbeitungseinheit und Fahrzeug
EP4174776A1 (fr) * 2021-10-29 2023-05-03 Volvo Truck Corporation Procédé permettant d'estimer un angle relatif entre un dispositif d'obtention d'images et un véhicule

Families Citing this family (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2447672B (en) 2007-03-21 2011-12-14 Ford Global Tech Llc Vehicle manoeuvring aids
DE102008045436A1 (de) * 2008-09-02 2010-03-04 Volkswagen Ag Verfahren und Vorrichtung zum Bestimmen eines Knickwinkels zwischen einem Zugfahrzeug und einem Anhänger
ATE545039T1 (de) * 2008-09-05 2012-02-15 Fiat Ricerche Bestimmung der relativen position von zwei relativ beweglichen elementen
DE102009007990A1 (de) * 2009-02-07 2010-08-12 Hella Kgaa Hueck & Co. Verfahren zur Bestimmung von Anhängerdaten
DE102010008324A1 (de) * 2010-02-17 2011-08-18 ZF Lenksysteme GmbH, 73527 Erfassung und Auswertung einer Lagebeziehung zwischen einem Kraftfahrzeug und einem Anhänger
US9335163B2 (en) 2011-04-19 2016-05-10 Ford Global Technologies, Llc Trailer length estimation in hitch angle applications
US9290203B2 (en) 2011-04-19 2016-03-22 Ford Global Technologies, Llc Trailer length estimation in hitch angle applications
US8930140B2 (en) 2011-04-19 2015-01-06 Ford Global Technologies, Llc Trailer target placement assist system and method
US9290202B2 (en) 2011-04-19 2016-03-22 Ford Global Technologies, Llc System and method of calibrating a trailer backup assist system
US9296422B2 (en) 2011-04-19 2016-03-29 Ford Global Technologies, Llc Trailer angle detection target plausibility
US9248858B2 (en) 2011-04-19 2016-02-02 Ford Global Technologies Trailer backup assist system
US9374562B2 (en) 2011-04-19 2016-06-21 Ford Global Technologies, Llc System and method for calculating a horizontal camera to target distance
US10196088B2 (en) 2011-04-19 2019-02-05 Ford Global Technologies, Llc Target monitoring system and method
US9283892B2 (en) 2011-04-19 2016-03-15 Ford Global Technologies, Llc Method and system for monitoring placement of a target on a trailer
US9513103B2 (en) 2011-04-19 2016-12-06 Ford Global Technologies, Llc Hitch angle sensor assembly
US9102271B2 (en) 2011-04-19 2015-08-11 Ford Global Technologies, Llc Trailer monitoring system and method
US9102272B2 (en) 2011-04-19 2015-08-11 Ford Global Technologies, Llc Trailer target monitoring system and method
US9723274B2 (en) 2011-04-19 2017-08-01 Ford Global Technologies, Llc System and method for adjusting an image capture setting
US9937953B2 (en) 2011-04-19 2018-04-10 Ford Global Technologies, Llc Trailer backup offset determination
US9434414B2 (en) 2011-04-19 2016-09-06 Ford Global Technologies, Llc System and method for determining a hitch angle offset
US9683848B2 (en) 2011-04-19 2017-06-20 Ford Global Technologies, Llc System for determining hitch angle
DE102012006207A1 (de) 2012-03-27 2013-10-02 Volkswagen Aktiengesellschaft Assistance devices and method for operating an assistance device for controlling the driving of a towing vehicle with trailer
US9464887B2 (en) 2013-11-21 2016-10-11 Ford Global Technologies, Llc Illuminated hitch angle detection component
US9464886B2 (en) 2013-11-21 2016-10-11 Ford Global Technologies, Llc Luminescent hitch angle detection component
DE102014000978A1 (de) 2014-01-25 2015-07-30 Audi Ag Method and device for steering a vehicle-trailer combination into a parking space
US9296421B2 (en) 2014-03-06 2016-03-29 Ford Global Technologies, Llc Vehicle target identification using human gesture recognition
DE102014212821A1 (de) * 2014-07-02 2016-01-07 Zf Friedrichshafen Ag Device and method for detecting a pivot angle between a vehicle and a trailer device
US9963004B2 (en) 2014-07-28 2018-05-08 Ford Global Technologies, Llc Trailer sway warning system and method
US9517668B2 (en) 2014-07-28 2016-12-13 Ford Global Technologies, Llc Hitch angle warning system and method
US10112537B2 (en) 2014-09-03 2018-10-30 Ford Global Technologies, Llc Trailer angle detection target fade warning
US9315212B1 (en) 2014-10-13 2016-04-19 Ford Global Technologies, Llc Trailer sensor module and associated method of wireless trailer identification and motion estimation
US9340228B2 (en) 2014-10-13 2016-05-17 Ford Global Technologies, Llc Trailer motion and parameter estimation system
US9533683B2 (en) 2014-12-05 2017-01-03 Ford Global Technologies, Llc Sensor failure mitigation system and mode management
US9607242B2 (en) 2015-01-16 2017-03-28 Ford Global Technologies, Llc Target monitoring system with lens cleaning device
US9522699B2 (en) 2015-02-05 2016-12-20 Ford Global Technologies, Llc Trailer backup assist system with adaptive steering angle limits
US9616923B2 (en) 2015-03-03 2017-04-11 Ford Global Technologies, Llc Topographical integration for trailer backup assist system
US9804022B2 (en) 2015-03-24 2017-10-31 Ford Global Technologies, Llc System and method for hitch angle detection
US10384607B2 (en) 2015-10-19 2019-08-20 Ford Global Technologies, Llc Trailer backup assist system with hitch angle offset estimation
US10611407B2 (en) 2015-10-19 2020-04-07 Ford Global Technologies, Llc Speed control for motor vehicles
US9836060B2 (en) 2015-10-28 2017-12-05 Ford Global Technologies, Llc Trailer backup assist system with target management
US10017115B2 (en) 2015-11-11 2018-07-10 Ford Global Technologies, Llc Trailer monitoring system and method
US10155478B2 (en) 2015-12-17 2018-12-18 Ford Global Technologies, Llc Centerline method for trailer hitch angle detection
US9610975B1 (en) 2015-12-17 2017-04-04 Ford Global Technologies, Llc Hitch angle detection for trailer backup assist system
US10011228B2 (en) 2015-12-17 2018-07-03 Ford Global Technologies, Llc Hitch angle detection for trailer backup assist system using multiple imaging devices
US9934572B2 (en) 2015-12-17 2018-04-03 Ford Global Technologies, Llc Drawbar scan solution for locating trailer hitch point
US9796228B2 (en) 2015-12-17 2017-10-24 Ford Global Technologies, Llc Hitch angle detection for trailer backup assist system
US9798953B2 (en) 2015-12-17 2017-10-24 Ford Global Technologies, Llc Template matching solution for locating trailer hitch point
US9827818B2 (en) 2015-12-17 2017-11-28 Ford Global Technologies, Llc Multi-stage solution for trailer hitch angle initialization
US10005492B2 (en) 2016-02-18 2018-06-26 Ford Global Technologies, Llc Trailer length and hitch angle bias estimation
US10106193B2 (en) 2016-07-01 2018-10-23 Ford Global Technologies, Llc Enhanced yaw rate trailer angle detection initialization
DE102016114060A1 (de) 2016-07-29 2018-02-01 Connaught Electronics Ltd. Method for determining a geometric parameter of a trailer of a vehicle-trailer combination comprising a motor vehicle and the trailer, detection system, driver assistance system and motor vehicle
US10046800B2 (en) 2016-08-10 2018-08-14 Ford Global Technologies, Llc Trailer wheel targetless trailer angle detection
US10222804B2 (en) 2016-10-21 2019-03-05 Ford Global Technologies, Llc Inertial reference for TBA speed limiting
US10710585B2 (en) 2017-09-01 2020-07-14 Ford Global Technologies, Llc Trailer backup assist system with predictive hitch angle functionality
DE102018117199A1 (de) * 2018-07-17 2020-01-23 Connaught Electronics Ltd. Method for determining at least one relative orientation angle of a vehicle-trailer combination by means of an inertial measurement unit, computer program product, electronic computing device and driver assistance system
US11077795B2 (en) 2018-11-26 2021-08-03 Ford Global Technologies, Llc Trailer angle detection using end-to-end learning
US10829046B2 (en) 2019-03-06 2020-11-10 Ford Global Technologies, Llc Trailer angle detection using end-to-end learning
DE102019205447A1 (de) * 2019-04-16 2020-10-22 Zf Friedrichshafen Ag Device and method for bringing a towing vehicle closer to a trailer, as well as towing vehicle and trailer
CN112356626B (zh) * 2020-12-04 2022-04-12 中科领航智能科技(苏州)有限公司 Method for automatically docking an autonomous tractor with its trailer

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19806655A1 (de) * 1998-02-18 1999-08-26 Owerfeldt Electronic maneuvering aid for a truck with trailer
DE10142457B4 (de) * 2001-08-31 2006-05-04 Daimlerchrysler Ag Digital image measurement of retroreflective markers
DE10322829A1 (de) * 2003-05-19 2004-12-09 Daimlerchrysler Ag Control system for a vehicle
DE102004050149A1 (de) * 2004-10-15 2006-04-20 Daimlerchrysler Ag Method for determining drawbar and trailer angles
DE102004059596B4 (de) * 2004-12-09 2007-11-08 Daimlerchrysler Ag Method for determining an articulation angle of a vehicle combination and corresponding device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102004025252A1 (de) * 2004-05-22 2005-12-15 Daimlerchrysler Ag Arrangement and method for determining the articulation angle of an articulated vehicle train
WO2006042665A1 (fr) * 2004-10-15 2006-04-27 Daimlerchrysler Ag Method for determining drawbar and trailer angles

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2441320A3 (fr) * 2010-10-14 2014-10-29 Deere & Company Vehicle guidance system
CN105270408B (zh) * 2014-05-27 2020-01-21 曼卡车和巴士股份公司 Method and driver assistance system for determining the driving-dynamics state of a commercial vehicle
RU2693422C2 (ru) * 2014-05-27 2019-07-02 Ман Трак Унд Бас Аг System and method for assisting the driver of a truck, and truck
CN111469851A (zh) * 2014-05-27 2020-07-31 曼卡车和巴士欧洲股份公司 Method and driver assistance system for determining the driving-dynamics state of a commercial vehicle
CN105371767A (zh) * 2014-05-27 2016-03-02 曼卡车和巴士股份公司 Method and driver assistance system for assisting a commercial vehicle-trailer combination
EP2949532A3 (fr) * 2014-05-27 2017-04-26 MAN Truck & Bus AG Method and driver assistance system for assisting a commercial vehicle driver
EP3351450A1 (fr) * 2014-05-27 2018-07-25 MAN Truck & Bus AG Method and driver assistance system for detecting driving-dynamics states of a commercial vehicle
CN105291980B (zh) * 2014-05-27 2020-06-05 曼卡车和巴士股份公司 Method and driver assistance system for assisting the driver of a commercial vehicle
CN105270408A (zh) * 2014-05-27 2016-01-27 曼卡车和巴士股份公司 Method and driver assistance system for determining the driving-dynamics state of a commercial vehicle
CN105291980A (zh) * 2014-05-27 2016-02-03 曼卡车和巴士股份公司 Method and driver assistance system for assisting the driver of a commercial vehicle
EP2950040A1 (fr) * 2014-05-27 2015-12-02 MAN Truck & Bus AG Method and driver assistance system for assisting a commercial vehicle-trailer combination
RU2693126C2 (ru) * 2014-05-27 2019-07-01 Ман Трак Унд Бас Аг Driver assistance method and system for assisting the driver of a truck-trailer combination
CN106225723A (zh) * 2016-07-25 2016-12-14 浙江零跑科技有限公司 Method for measuring the articulation angles of a multi-unit vehicle train based on a rear-view binocular camera
US10163033B2 (en) 2016-12-13 2018-12-25 Caterpillar Inc. Vehicle classification and vehicle pose estimation
WO2019120918A1 (fr) * 2017-12-18 2019-06-27 Robert Bosch Gmbh Method and device for determining a relative angle between two vehicles
US11511800B2 (en) 2017-12-27 2022-11-29 Robert Bosch Gmbh Determining an angle of a movement path of a trailer
US11941834B2 (en) * 2018-03-02 2024-03-26 Continental Autonomous Mobility Germany GmbH Trailer angle determination system for a vehicle
US20210019904A1 (en) * 2018-03-02 2021-01-21 Continental Automotive Gmbh Trailer angle determination system for a vehicle
CN108732590A (zh) * 2018-05-16 2018-11-02 重庆邮电大学 Biped robot and method for measuring a slope angle
CN111572633A (zh) * 2019-02-18 2020-08-25 上海汽车集团股份有限公司 Steering angle detection method, device and system
CN111572633B (zh) * 2019-02-18 2021-09-24 上海汽车集团股份有限公司 Steering angle detection method, device and system
CN109927716A (zh) * 2019-03-11 2019-06-25 武汉环宇智行科技有限公司 Autonomous perpendicular parking method based on a high-precision map
WO2021191099A1 (fr) 2020-03-26 2021-09-30 Zf Cv Systems Global Gmbh Method for determining the pose of an object, method for controlling a vehicle, control unit and vehicle
DE102020108416A1 (de) 2020-03-26 2021-09-30 Zf Cv Systems Global Gmbh Method for determining the pose of an object, method for controlling a vehicle, control unit and vehicle
WO2022156630A1 (fr) * 2021-01-19 2022-07-28 北京九曜智能科技有限公司 Hitching method and hitching system for hitching a vehicle to a tractor truck
DE102021121869A1 (de) 2021-08-24 2023-03-02 Schaeffler Technologies AG & Co. KG Method for calibrating and/or linearizing a position sensor; position sensor; rear-axle steering; vehicle; computer program
WO2023061732A1 (fr) 2021-10-15 2023-04-20 Zf Cv Systems Global Gmbh Method for locating a trailer, processing unit and vehicle
DE102021126816A1 (de) 2021-10-15 2023-04-20 Zf Cv Systems Global Gmbh Method for determining an articulation angle, processing unit and vehicle
WO2023061731A1 (fr) 2021-10-15 2023-04-20 Zf Cv Systems Global Gmbh Method for determining an articulation angle, processing unit and vehicle
DE102021126814A1 (de) 2021-10-15 2023-04-20 Zf Cv Systems Global Gmbh Method for locating a trailer, processing unit and vehicle
EP4174776A1 (fr) * 2021-10-29 2023-05-03 Volvo Truck Corporation Method for estimating a relative angle between an image-acquisition device and a vehicle

Also Published As

Publication number Publication date
DE102006056408A1 (de) 2008-06-19
DE102006056408B4 (de) 2013-04-18

Similar Documents

Publication Publication Date Title
DE102006056408B4 (de) Method for determining a position, device and computer program product
DE102015104453B4 (de) Stereo image processing device for a vehicle
EP3510463B1 (fr) Sensor assembly for an autonomously operating commercial vehicle and method for capturing images of the surroundings
DE102015107677B4 (de) Surround-view camera system (VPM) online calibration
DE102014222617B4 (de) Vehicle detection method and vehicle detection system
EP3328686A1 (fr) Method and device for displaying the surrounding scene of a vehicle-trailer combination
DE102008031784A1 (de) Method and device for distortion correction and image enhancement of a vehicle rear-view system
DE102019114355A1 (de) Auto-calibration for vehicle cameras
DE102017109445A1 (de) Calibration of a vehicle camera device in the vehicle longitudinal or lateral direction
DE102020109279A1 (de) System and method for trailer alignment
DE102017106152A1 (de) Determining an angular position of a trailer using an optimized template
DE102018124979A1 (de) Driver assistance system for determining a distance between two vehicles using a camera
DE102017223098A1 (de) Method and device for determining a relative angle between two vehicles
DE102008000837A1 (de) Chassis measurement system and method for determining the positional parameters of measuring heads of a chassis measurement system
DE102019132019A1 (de) Vehicle and control method therefor
DE102016224904A1 (de) Three-dimensional surround-view system
DE102018202753A1 (de) Method for determining a distance between a motor vehicle and an object
DE112021006799T5 (de) Signal processing device, signal processing method and signal processing system
DE102020129455A1 (de) Method for operating a driver assistance system, computer program product, driver assistance system, vehicle-trailer combination and retrofit kit
DE102021133091A1 (de) Method for determining an orientation of a camera unit of a detection device, detection device and vehicle
DE102020201000B3 (de) Computer-implemented method and system for obtaining an environment model, and control unit for a vehicle operable in automated mode
DE102022002767A1 (de) Method for three-dimensional reconstruction of a vehicle's surroundings
WO2024032971A1 (fr) Device and method for measuring three-dimensional virtual images and objects on a head-up display
DE102020122908A1 (de) Method for displaying the surroundings of a vehicle on a display device, processing unit and vehicle
DE102022126080A1 (de) Space monitoring system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07846912

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07846912

Country of ref document: EP

Kind code of ref document: A1