US20150077557A1 - Vehicle auto-motion control system - Google Patents

Vehicle auto-motion control system

Info

Publication number
US20150077557A1
US20150077557A1
Authority
US
United States
Prior art keywords
vehicle
target
camera
control system
implement
Prior art date
Legal status
Granted
Application number
US14/027,996
Other versions
US9880560B2
Inventor
Shufeng Han
Christopher D. Turner
Bryan K. Buerkle
Steven A. Duppong
Current Assignee
Deere and Co
Original Assignee
Deere and Co
Priority date
Filing date
Publication date
Application filed by Deere and Co
Priority application: US14/027,996 (granted as US9880560B2)
Assigned to Deere & Company. Assignors: Shufeng Han, Christopher D. Turner, Bryan K. Buerkle, Steven A. Duppong
Related foreign applications: BR102014021838, DE102014217746A1, CN104460361B
Publication of US20150077557A1; application granted; publication of US9880560B2
Legal status: Active (adjusted expiration)

Classifications

    • G05D 1/0251 — Control of position or course in two dimensions, specially adapted to land vehicles, using a video camera in combination with image processing means, extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • G05D 1/0246 — Control of position or course in two dimensions, specially adapted to land vehicles, using a video camera in combination with image processing means
    • G05D 1/0234 — Control of position or course in two dimensions, specially adapted to land vehicles, using optical markers or beacons
    • G05D 1/0225 — Control of position or course in two dimensions, specially adapted to land vehicles, with means for defining a desired trajectory, involving docking at a fixed facility, e.g. base station or loading bay
    • A01B 59/062 — Devices for connection between tractors and agricultural machines or implements, enabling connection or disconnection controlled from the driver's seat, the connection comprising a rigid interface frame on the tractor
    • A01B 69/00 — Steering of agricultural machines or implements; guiding agricultural machines or implements on a desired track
    • A01B 69/001 — Steering by means of optical assistance, e.g. television cameras
    • A01B 71/063 — Coupling means between power take-off and transmission shaft to the implement, enabling transmission coupling and connection between tractor and implements to be controlled from the driver's seat
    • B60D 1/36 — Traction couplings, hitches, draw-gear or towing devices characterised by arrangements for facilitating connection, e.g. hitch catchers, visual guide means, signalling aids
    • B60D 1/62 — Auxiliary hitch devices involving supply lines, electric circuits, or the like
    • B60R 11/04 — Mounting of cameras operative during drive; arrangement of controls thereof relative to the vehicle

Definitions

  • the cameras 20 and 22 may be commercially available components which are wired to the ECU 30 , or they could be application specific and integrated into an ECU enclosure (not shown). Preferably, the cameras should have the speed and resolution of HDTV.
  • the far-view camera 22 would see out further to identify the implement and begin the automated coupling operation from longer distances and lateral offsets to the implement.
  • the near-view camera 20 will provide higher localization accuracy when the tractor 10 is near the implement 14 .
  • perception engine 32 includes image processing software which is either actively trained or programmed to recognize the size, geometry, and/or color of the target 16 located on the implement 14 .
  • the image processing software may also be capable of normalizing the image error and distortion due to camera optics and manufacturing tolerance.
  • the perception engine 32 includes an image capture process 50 which receives images from the cameras 20 and 22 .
  • the images are undistorted by image undistortion process 52 .
  • This undistortion process 52 finds the relationship between the image pixel coordinates and the corresponding coordinates in the camera reference frame. This process also corrects the geometric distortion introduced by the optics.
  • Two sets of parameters are used: intrinsic camera parameters and distortion coefficients. These parameters are obtained by a standard camera calibration process using a known image pattern such as a classical black-and-white chessboard.
  • Intrinsic camera parameters are related only to the camera (focal length f, lens distortion). Intrinsic camera parameters are stored in the storage device and loaded at the start of the image processing program.
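As a concrete illustration of how the intrinsic parameters and distortion coefficients are used, the sketch below maps a pixel to normalized camera-frame coordinates and then inverts a simple two-coefficient radial distortion model. All numeric values (fx, fy, cx, cy, k1, k2) are invented for the example; the patent does not specify them.

```python
# Hypothetical intrinsic parameters and a two-coefficient radial
# distortion model; values are illustrative, not from the patent.

def pixel_to_normalized(u, v, fx, fy, cx, cy):
    """Map pixel coordinates to normalized camera-frame coordinates
    using the focal lengths (fx, fy) and principal point (cx, cy)."""
    return (u - cx) / fx, (v - cy) / fy

def undistort_radial(xd, yd, k1, k2, iterations=8):
    """Iteratively invert the radial model x_d = x_u * (1 + k1*r^2 + k2*r^4),
    where r is the radius of the undistorted point."""
    xu, yu = xd, yd
    for _ in range(iterations):
        r2 = xu * xu + yu * yu
        scale = 1.0 + k1 * r2 + k2 * r2 * r2
        xu, yu = xd / scale, yd / scale
    return xu, yu

# Example: undistort a pixel near the corner of a 1280x720 image.
xd, yd = pixel_to_normalized(1200.0, 700.0, fx=900.0, fy=900.0, cx=640.0, cy=360.0)
xu, yu = undistort_radial(xd, yd, k1=-0.12, k2=0.02)
```

Re-applying the forward distortion model to (xu, yu) recovers the original coordinates, which is a convenient self-check for calibration code.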
  • Transform data is obtained and calculated by transform data calculation process 54 .
  • This process finds the relationship between the coordinates in the camera reference frame and the corresponding coordinates in the world reference frame (i.e., location and orientation of the target object). This relationship is defined by extrinsic camera parameters (translation and rotation matrices).
  • Extrinsic camera parameters can be stored in the storage device, or they can be obtained during the field operation using a calibration panel. Extrinsic camera parameters are related to the camera mounting position and the chosen world reference frame. When the camera is moved during the operation, it must be recalibrated to obtain a new set of extrinsic camera parameters.
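The camera-to-world relationship can be sketched as a rigid transform p_world = R·p_cam + t, where R and t are the extrinsic rotation and translation matrices. The mounting geometry below is an invented example, not taken from the patent.

```python
import numpy as np

def camera_to_world(p_cam, R, t):
    """Apply the extrinsic transform: p_world = R @ p_cam + t."""
    return R @ p_cam + t

# Hypothetical extrinsics: camera yawed 10 degrees about the vertical
# axis and mounted 1.2 m behind and 0.5 m above the world origin.
yaw = np.deg2rad(10.0)
R = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
              [np.sin(yaw),  np.cos(yaw), 0.0],
              [0.0,          0.0,         1.0]])
t = np.array([-1.2, 0.0, 0.5])

# A target corner seen 3 m ahead of the camera.
p_world = camera_to_world(np.array([3.0, 0.2, -0.5]), R, t)
```

Recalibrating after the camera is moved amounts to replacing R and t with a freshly estimated pair.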
  • a process 56 determines an initial region of interest (ROI) in the image.
  • Process 58 searches for the target 16 in the ROI. This process calculates the translation and rotation matrices from the current (relative) position between the vehicle 10 and the target 16 . It creates a sub-image based on the region of interest (ROI), converts the sub-image to black-and-white, finds the pixel coordinates of the corners of the target pattern, refines the pixel coordinates of the corners, and estimates extrinsic camera parameters (translation and rotation matrices). Since the target object has a chessboard pattern similar to the one used for calibration, the functions for finding the target are very similar to the camera calibration routines in the previous step.
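Two of the steps in process 58, cropping the sub-image to the ROI and binarizing it, can be sketched on a synthetic image as below; corner finding and pose estimation would normally be delegated to a vision library and are omitted. The array sizes and threshold value are illustrative assumptions.

```python
import numpy as np

def crop_roi(image, roi):
    """Extract the sub-image for roi = (x, y, width, height) in pixels."""
    x, y, w, h = roi
    return image[y:y + h, x:x + w]

def to_black_and_white(gray, threshold=128):
    """Binarize a grayscale sub-image: True = light square, False = dark."""
    return gray >= threshold

# Synthetic stand-in: an 8x8-pixel chessboard embedded in a gray image.
image = np.full((100, 100), 90, dtype=np.uint8)
board = (np.indices((8, 8)).sum(axis=0) % 2) * 255
image[40:48, 50:58] = board

sub = crop_roi(image, (50, 40, 8, 8))
bw = to_black_and_white(sub)
```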
  • If the target is not found, step 60 directs the process back to the image capture process 50 . If the target is found, step 60 directs the process in parallel to both processes 70 and 72 .
  • Process 70 updates the ROI and directs the process back to image capture process 50 .
  • the system does not process the entire image from the camera(s). Instead, the system processes only the ROI, which only includes the view of the target 16 . Since the vehicle 10 is moving relative to the target 16 , the field of view (FOV) is constantly changing. Thus, the ROI is determined by an update algorithm in which c is a constant based on experiment, such as 0.8.
  • Process 72 recovers the position and orientation of the target 16 (i.e., the x, y position, rotation angle, and the pitch, roll, and yaw angles).
  • Process 74 outputs this information to the controller 34 .
  • the position and orientation of the target 16 are sent to the vehicle steering controller 44 and propulsion controller 46 for automated navigation.
  • the perception engine 32 updates the current ROI based on a previous target position. This improves the output rate to the controller 34 to meet real-time requirements. There is a balance between selecting a large ROI and selecting a smaller ROI around the target: a smaller ROI can improve the image processing speed, but a larger ROI makes it easier to find the next target position. An ROI size which is 120% larger than the target in the horizontal and vertical directions is preferred.
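A minimal sketch of that ROI update, assuming the next ROI is centered on the previous target position and sized 120% larger than the target in each direction (the clamping to image bounds and the function signature are assumptions):

```python
def update_roi(target_cx, target_cy, target_w, target_h,
               image_w, image_h, margin=1.2):
    """Return the next ROI as (x, y, w, h), centered on the previous
    target position and enlarged by `margin` (1.2 = 120% larger),
    clamped to the image bounds."""
    w = target_w * (1.0 + margin)
    h = target_h * (1.0 + margin)
    x = max(0.0, target_cx - w / 2.0)
    y = max(0.0, target_cy - h / 2.0)
    w = min(w, image_w - x)
    h = min(h, image_h - y)
    return x, y, w, h

# Target last seen centered at (640, 360), 100x100 pixels, in a 1280x720 image.
roi = update_roi(640.0, 360.0, 100.0, 100.0, 1280.0, 720.0)
```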
  • the controller 34 calculates the distance between the hitch 11 and implement 14 , the off track, and heading errors of the tractor 10 . This information is processed by the controller 34 to generate the steering and speed command to operate the tractor 10 . The distance calculation is used to control the speed and stop the tractor 10 when the hitch 11 is aligned with the implement 14 . Off track and heading errors are processed to either position the tractor 10 so that the center axis of the tractor 10 is aligned with the center axis of the implement, while sequentially or simultaneously controlling the convergence of the hitch points for aligning multi-contact point hitches (e.g. ag. tractor three point hitches), or to simply drive the tractor 10 the shortest path to converge the distance between the hitch points for single contact point hitches (drawbar—tongue applications).
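The distance, off-track, and heading-error calculation can be sketched in a common ground-plane frame as follows; the frame and sign conventions are assumptions for illustration, not the patent's definitions.

```python
import math

def alignment_errors(hitch_xy, hitch_heading, implement_xy, implement_heading):
    """Positions in metres in a shared ground-plane frame, headings in
    radians. Returns (distance, off_track, heading_error)."""
    dx = implement_xy[0] - hitch_xy[0]
    dy = implement_xy[1] - hitch_xy[1]
    distance = math.hypot(dx, dy)
    # Lateral offset of the implement hitch point in the tractor frame.
    off_track = -dx * math.sin(hitch_heading) + dy * math.cos(hitch_heading)
    # Heading error wrapped into (-pi, pi].
    heading_error = (implement_heading - hitch_heading + math.pi) % (2 * math.pi) - math.pi
    return distance, off_track, heading_error

# Implement tongue 5 m ahead and 1 m to the left, rotated 0.1 rad.
d, off, he = alignment_errors((0.0, 0.0), 0.0, (5.0, 1.0), 0.1)
```

The distance output would drive the speed and stop logic, while off-track and heading error would drive steering, as described above.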
  • the controller 34 receives the following data: vehicle database information including hitch transformation, hitch height, and inertial transformation; implement database information including hitch transformation and hitch height; perception engine information including X, Y, Z position, roll, pitch and yaw; Inertial sensor information including 3D accelerations and rotation rates; and vehicle telemetry information including wheel curvature and wheel speed, and Inertial transformation.
  • the controller 34 processes this information and generates these outputs: guidance controller information including off track error, normal error, heading rate and heading error; and velocity limiter information including heading error, desired speed and desired direction (forward or reverse).
  • the vehicle database resides on a controller of the tractor 10 on which the auto-motion control system is installed.
  • the database can contain the following information (and more) about the tractor.
  • Hitch transformation comprises the Euclidean translation and rotation transformation matrices from the focal point of the camera(s) to the pin hole on the drawbar.
  • Hitch height is the magnitude of the vector normal to the world reference ground plane to the center of the hitch.
  • Inertial sensor transformation is the Euclidean translation between the inertial sensor 38 and either the hitch or the camera.
  • the implement database resides on a controller of the tractor and contains information about all the implements in the fleet. It can contain the following information (and more) about each implement.
  • Hitch transformation comprises the Euclidean translation and rotation transformation matrices from a point on the target to the center of the implement hitch.
  • Hitch height is a magnitude of the vector normal to the world reference ground plane to the center of the hitch.
  • the perception engine supplies the vision system data. It contains the x, y, z, roll, pitch, and yaw of the vehicle relative to the target 16 on the implement 14 .
  • An optional inertial sensor 38 provides vehicle attitude information, including but not limited to compass heading, 3-dimensional acceleration, and 3-axis rotation rates (roll, pitch, and yaw).
  • the forward or reverse speed of the wheels on the tractor 10 , and the left or right steering curvature of the tractor 10 are provided by existing sensors on the vehicle 10 .
  • One or more inertial sensor(s) may be employed to improve the stability of the controller 34 during rapid movement and for interpolation between perception epochs.
  • the inertial information may also be used in lieu of perception for intermittent loss of perception.
  • the controller 34 performs 3D translations and rotations using calibration data and the output from the perception engine 32 and projects the location of the implement hitch point onto the plane the tractor is driving on to generate a lateral offset and a heading error.
  • the controller 34 preferably generates a steering command based on simultaneously converging offset and heading error to align the hitch points.
  • the controller 34 preferably generates a speed control signal based on a lookup table using distance and heading error as inputs.
  • the controller 34 preferably also is programmed to prevent generation of a speed control signal which could make the tractor "run away" in the event of a loss of or error in perception information, or which could increase speed while controlling the vehicle.
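The speed command described above can be sketched as a distance-keyed lookup table, derated by heading error and capped so the command can never exceed a fixed limit. The breakpoints, speeds, and derating rule below are illustrative assumptions, not values from the patent.

```python
import bisect

# Hypothetical lookup table: slower as the hitch point gets closer.
DISTANCE_BREAKPOINTS = [0.2, 1.0, 3.0, 10.0]   # metres
SPEED_TABLE = [0.0, 0.3, 0.8, 1.5]             # metres/second
MAX_SPEED = 1.5                                # hard cap against "run away"

def speed_command(distance, heading_error):
    """Desired speed from distance (m) and heading error (rad)."""
    i = bisect.bisect_left(DISTANCE_BREAKPOINTS, distance)
    base = SPEED_TABLE[min(i, len(SPEED_TABLE) - 1)]
    # Derate as heading error grows, never below zero.
    derated = base * max(0.0, 1.0 - abs(heading_error))
    return min(derated, MAX_SPEED)
```

The first table entry being zero makes the vehicle stop once the hitch points are within the alignment tolerance, and the final `min` enforces the cap regardless of table contents.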
  • the interface 36 receives (from the controller 34 ) the output information described previously, and processes that information according to the process steps shown in FIG. 6 . If the vehicle is on in step 100 , then step 102 disables the hitch and alignment controls. Step 104 prevents further action unless the operator enables the hitch alignment process.
  • step 106 prompts the operator to drive the vehicle 10 towards the implement 14 .
  • If step 108 detects the implement, it directs control to step 110 ; otherwise control returns to step 106 .
  • Step 110 searches a database 112 for calibration data. If a calibration is found, step 114 directs control to step 116 , else back to step 102 .
  • Step 116 sends the calibration data to the controller 34 .
  • step 118 prompts the operator to actuate automated hitching.
  • Step 120 directs control to step 122 if the operator actuates automated hitching, else back to step 118 .
  • Step 122 actuates automated steering controls and an automated speed limit.
  • Step 124 directs control to step 126 if the auto alignment is active and the operator provides a manual input to a control (turns the steering wheel, depresses the clutch, shifts the transmission, or applies the brakes), thus disabling the automated control.
  • the operator has priority so that the operator can take control of the machine.
  • Step 126 exits the automation, returns the tractor 10 to manual control and directs control back to step 102 .
  • Step 128 directs control step 130 if the tractor hitch 12 and the hitch of the implement 14 are aligned, else back to step 102 .
  • Step 130 exits the automation, prompts the operator to place the tractor 10 in park, and directs control back to step 102 .
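The FIG. 6 flow can be sketched as a small state machine over the step numbers above; the decision steps (114, 120, 124, 128) are folded into their preceding steps, and the event names are assumptions.

```python
def next_step(step, events):
    """Advance the operator-interface flow one step, given a set of
    event flags observed since the last step."""
    if step == 102:                  # hitch/alignment controls disabled
        return 104
    if step == 104:                  # wait for operator enable
        return 106 if "operator_enabled" in events else 104
    if step == 106:                  # prompt: drive toward the implement
        return 110 if "implement_detected" in events else 106
    if step == 110:                  # search calibration database (114 folded in)
        return 116 if "calibration_found" in events else 102
    if step == 116:                  # send calibration data to the controller
        return 118
    if step == 118:                  # prompt: actuate automated hitching (120 folded in)
        return 122 if "hitching_actuated" in events else 118
    if step == 122:                  # automation active (124 and 128 folded in)
        if "manual_override" in events:
            return 126               # operator takes priority
        if "hitches_aligned" in events:
            return 130
        return 122
    if step in (126, 130):           # exit automation, back to start
        return 102
    raise ValueError(f"unknown step {step}")
```

Note how any manual input (steering wheel, clutch, transmission, brakes) routes through step 126, reflecting the text's rule that the operator always has priority.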
  • the interface 36 implements a human interface using standard virtual terminal technology, and implements a Class 3 implement vehicle control. Both are defined in standard ISO11783. Alternatively, there are other means of implementing the human interface.
  • the target 16 may include a code for automatic identification of the implement 14 so that calibration parameters can be automatically loaded from a storage device.
  • the perception system described herein provides the position of the implement relative to the tractor.
  • the movement of the vehicle detected by the vision system can be combined with existing on-tractor speed and acceleration sensors, including calculations from GPS. Such a combined signal may be used to improve control of the path the vehicle traverses.
  • This perception system could also be used by auto-guided tractor-implement pairs to perform direction reversals using a reverse motion turn to achieve automated three point turns.
  • the guidance control system can control the path of the vehicle and implement when pushing the implement in reverse.
  • the perception system also provides three dimensional information about the implement in a tractor-implement pair. Ride height of implements or penetration depth of ground engaging implements can be measured by referencing the implement's height relative to the tractor operating on the surface.
  • This invention may also be used with an automated hitching system so that no manual effort is required to hitch the tractor when it is aligned with the implement.
  • This invention also applies to both single point (e.g. drawbar) and multiple point (e.g. three point) hitches for implements that hitch anywhere on the vehicle (front, rear, over and under).


Abstract

An automatic control system moves a vehicle so that it can be coupled to an implement. The vehicle has propulsion and steering controls which respond to electronic input signals. A visual target is mounted on the implement. Vehicle mounted cameras generate images of the target. A motion control unit receives and processes the images, and generates vehicle movement commands as a function of the images which are communicated to the propulsion and steering controls. The motion control unit includes a perception engine connected to the cameras, a controller connected to the perception engine, and a vehicle/operator interface which is connected to the controller, and to the propulsion and steering controls. The perception engine captures images from the cameras, un-distorts the images, and searches for the target. If the target is found, the perception engine recovers vehicle posture information and transmits said information to the controller.

Description

    FIELD OF THE INVENTION
  • The present disclosure relates to a system which senses a target on a towed implement and automatically controls the steering and movement of a vehicle to align the vehicle with the towed implement which is to be coupled to the vehicle, such as a wagon or trailer.
  • BACKGROUND OF THE INVENTION
  • In the operation of work vehicles, it is often necessary to connect a towed implement to a hitch located at the rear of the vehicle. Or, it may be necessary to position a combine relative to a header unit. The amount of time and effort required to manually align the vehicle to the implement may be significant, depending on the skill of the operator in controlling the vehicle and the ability of the operator to see both the implement tongue and the vehicle drawbar. Particularly with large off-road vehicles, such as an agricultural tractor, the operator's field of view may be obstructed such that the operator may not be able to see either or both the tongue and/or the drawbar. It would be desirable to have a system which has electronic controls which automate the process based on images obtained by a camera or cameras.
  • SUMMARY
  • According to an aspect of the present disclosure, a system includes a camera or cameras which are mounted on a vehicle. The cameras produce images of a target on the implement to which the vehicle is to be coupled. An image processor unit processes the images and other information and a control system automatically controls the steering and movement of the vehicle to align the vehicle with the implement, so that the implement can be coupled to the vehicle. The system aligns the hitch of the vehicle with a towed implement, such as a wagon or trailer to be connected. The vehicle automatically stops when the hitch is aligned with the tongue of the towed device. This reduces the skill required of an operator and overcomes field of view limitations.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a simplified schematic side view of a tractor and implement to be coupled to the tractor;
  • FIG. 2 is a front view of a target which is mounted on the implement of FIG. 1;
  • FIG. 3 is a schematic block diagram of a control system for controlling the movement of the tractor of FIG. 1;
  • FIG. 4 is a schematic block diagram of the perception engine of FIG. 3;
  • FIG. 5 is a schematic block diagram of the controller of FIG. 3; and
  • FIG. 6 is a flow chart representation of the operator interface of FIG. 3.
  • DETAILED DESCRIPTION OF THE DRAWINGS
  • Referring to FIG. 1, a towing vehicle 10, such as an agricultural tractor, has a conventional hitch 11 and/or a drawbar 12 for coupling to an implement 14. A target 16 is mounted on the implement 14 so as to be viewable from the direction of the vehicle 10. The target 16 is preferably pivotally mounted on the implement 14 about a pivot 18 having a pivot axis which is transverse to a fore-and-aft axis of the implement 14. A pair of cameras 20 and 22 are mounted on a rear upper portion of the towing vehicle. Preferably, the cameras include a near-view camera 20 and a far-view camera 22. Preferably, the target 16 has a chessboard pattern for easy identification of corner positions.
  • Various types of cameras can be used, such as CCD (charge-coupled device) or CMOS (complementary metal-oxide-semiconductor) cameras. Camera output can be digital or analog, in color or in monochrome. The cameras should have a high dynamic range so they can successfully capture the target image in the lightest and darkest operating conditions. The field of view of the camera should be large enough to identify the target at the farthest and closest distances from the vehicle to the implement. If a single camera's field of view cannot cover the entire distance range, a second camera could be used. Preferably, the camera is a high-definition color digital camera.
  • The output signal from the camera is converted into digital format by a hardware device and associated software for further image processing. Depending on the output signal of the camera, the image-capturing hardware can take different forms, such as an ADC (analog-to-digital converter) or a direct physical interface using one of the standard protocols (e.g., Camera Link, FireWire, USB, etc.). A digital signal processor (DSP) or other digital processor is used for image processing. As best seen in FIG. 2, the target has a specific design or pattern which can be perceived by the cameras 20 and 22. For example, the pattern may be a checkerboard pattern as shown in FIG. 2.
  • Referring now to FIG. 3, the near-view camera 20 and the far-view camera 22 are connected to an electronic control unit (ECU) 30. The ECU processes the images from the cameras 20 and 22 and generates tractor movement commands that cause the tractor to move to a position which permits the implement 14 to be coupled to the tractor 10. The ECU 30 is programmed to implement a perception engine 32, a controller 34, and a machine/vehicle and operator interface 36. The perception engine 32 processes the images from the cameras 20 and 22 and generates relative (tractor-implement) attitude information, including three-dimensional position (x, y, z) and three-dimensional rotation (yaw, pitch, and roll). The relative attitude information is then processed by the controller 34. The output of the controller 34 is further processed by the machine/vehicle and operator interface 36. An optional inertial navigation sensor 38 may provide yaw rate, acceleration, and pitch rate information to the controller 34. The inertial sensor 38 may be either internal or external to the ECU 30. A digital data communication bus, such as an ISO11783 or SAE J1939 bus 40, connects the interface 36 to an operator display 42, to a conventional vehicle steering control system 44 which controls the steering of the tractor 10, and to a conventional vehicle propulsion control system 46 which controls the movement of the tractor 10.
  • The cameras 20 and 22 may be commercially available components which are wired to the ECU 30, or they may be application-specific and integrated into an ECU enclosure (not shown). Preferably, the cameras have the speed and resolution of HDTV. The far-view camera 22 sees farther out to identify the implement and begin the automated coupling operation from longer distances and larger lateral offsets to the implement. The near-view camera 20 provides higher localization accuracy when the tractor 10 is near the implement 14.
  • Referring now to FIG. 4, perception engine 32 includes image processing software which is either actively trained or programmed to recognize the size, geometry, and/or color of the target 16 located on the implement 14. The image processing software may also be capable of normalizing the image error and distortion due to camera optics and manufacturing tolerances.
  • The perception engine 32 includes an image capture process 50 which receives images from the cameras 20 and 22. The images are undistorted by image undistortion process 52. This undistortion process 52 finds the relationship between the image pixel coordinates and the corresponding coordinates in the camera reference frame. This process also corrects the geometric distortion introduced by the optics. Two sets of parameters are used: intrinsic camera parameters and distortion coefficients. These parameters are obtained by a standard camera calibration process using a known image pattern, such as a classical black-and-white chessboard. Intrinsic camera parameters are related only to the camera itself (focal length f, lens distortion). Intrinsic camera parameters are stored in the storage device and loaded at the start of the image processing program.
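As a rough illustration of what undistortion process 52 computes per pixel, the sketch below inverts a one-coefficient radial distortion model by fixed-point iteration. The parameter names (fx, fy, cx, cy, k1) are conventional pinhole-camera notation, and the single-coefficient model is a simplifying assumption; a real calibration uses the full intrinsic and distortion parameter sets described above.

```python
def undistort_point(u, v, fx, fy, cx, cy, k1, iterations=5):
    """Map a distorted pixel (u, v) to normalized camera coordinates,
    inverting x_d = x * (1 + k1 * r^2) by fixed-point iteration.
    Illustrative sketch only: real distortion models carry more coefficients."""
    # Back-project the pixel to normalized (still distorted) coordinates.
    xd = (u - cx) / fx
    yd = (v - cy) / fy
    # Iterate x = x_d / (1 + k1 * r^2) until it settles.
    x, y = xd, yd
    for _ in range(iterations):
        r2 = x * x + y * y
        x = xd / (1 + k1 * r2)
        y = yd / (1 + k1 * r2)
    return x, y
```

For the mild distortion typical of calibrated lenses, a handful of iterations converges well below pixel accuracy.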
  • Transform data is obtained and calculated by transform data calculation process 54. This process finds the relationship between the coordinates in the camera reference frame and the corresponding coordinates in the world reference frame (i.e., the location and orientation of the target object). This relationship is defined by extrinsic camera parameters (translation and rotation matrices). Extrinsic camera parameters can be stored in the storage device, or they can be obtained during the field operation using a calibration panel. Extrinsic camera parameters are related to the camera mounting position and the chosen world reference frame. If the camera is moved during the operation, it must be recalibrated to obtain a new set of extrinsic camera parameters.
  • A process 56 determines an initial region of interest (ROI) in the image. Process 58 then searches for the target 16 in the ROI. This process calculates the translation and rotation matrices from the current (relative) position between the vehicle 10 and the target 16. It creates a sub-image based on the region of interest (ROI), converts the sub-image to black-and-white, finds the pixel coordinates of the corners of the target pattern, refines those pixel coordinates, and estimates the extrinsic camera parameters (translation and rotation matrices). Since the target object has a chessboard pattern similar to the one used for calibration, the functions for finding the target are very similar to the camera calibration routines in the previous step.
  • If the target is not found, then step 60 directs the process back to the image capture process 50. If the target is found, then step 60 directs the process in parallel to both processes 70 and 72.
  • Process 70 updates the ROI and directs the process back to image capture process 50. In order to speed up image processing, the system does not process the entire image from the camera(s). Instead, the system processes only the ROI, which includes just the view of the target 16. Since the vehicle 10 is moving relative to the target 16, the field of view (FOV) is constantly changing. Thus, the ROI is updated by the following algorithm:
  • Xmin_new=Xmin_old−dX
  • Xmax_new=Xmax_old+dX
  • Ymin_new=Ymin_old−dY
  • Ymax_new=Ymax_old+dY
  • dX=c*(Xmax_old−Xmin_old)
  • dY=c*(Ymax_old−Ymin_old),
  • where c is a constant determined by experiment, such as 0.8.
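The update rule above transcribes directly into a few lines. The clamping to image bounds and the 1920x1080 defaults below are added assumptions, since the disclosure does not say what happens at the image edge.

```python
def expand_roi(xmin, xmax, ymin, ymax, c=0.8, width=1920, height=1080):
    """Grow the previous ROI by a fraction c of its width/height on each
    side (c = 0.8 per the disclosure's example), clamped to the image."""
    dx = c * (xmax - xmin)
    dy = c * (ymax - ymin)
    return (max(0, xmin - dx), min(width, xmax + dx),
            max(0, ymin - dy), min(height, ymax + dy))
```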
  • Process 72 analyzes the image which contains the target and generates the tractor position and attitude information. This process finds the position and orientation of the target 16 in the world reference frame. The position and orientation of the target 16 is calculated from the differences between the current translation and rotation and the initial translation and rotation. The position of the target 16 is the 3D world coordinates (with z=0). The orientation of the target 16 includes the rotation angle, the pitch, roll, and yaw angles.
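The rotation-angle recovery in process 72 can be illustrated with plain Euler-angle extraction from a rotation matrix. The Z-Y-X convention below is an assumption for illustration; the disclosure does not fix an angle convention.

```python
import math

def rot_zyx(yaw, pitch, roll):
    """Rotation matrix (nested lists) for Z-Y-X Euler angles: R = Rz @ Ry @ Rx."""
    cz, sz = math.cos(yaw), math.sin(yaw)
    cy, sy = math.cos(pitch), math.sin(pitch)
    cx, sx = math.cos(roll), math.sin(roll)
    return [
        [cz * cy, cz * sy * sx - sz * cx, cz * sy * cx + sz * sx],
        [sz * cy, sz * sy * sx + cz * cx, sz * sy * cx - cz * sx],
        [-sy,     cy * sx,                cy * cx],
    ]

def euler_from_rotation(R):
    """Recover (yaw, pitch, roll) from a Z-Y-X rotation matrix; valid away
    from the gimbal-lock case |pitch| = 90 degrees."""
    pitch = math.asin(-R[2][0])
    yaw = math.atan2(R[1][0], R[0][0])
    roll = math.atan2(R[2][1], R[2][2])
    return yaw, pitch, roll
```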
  • Process 74 outputs this information to the controller 34. The position and orientation of the target 16 (i.e., x, y, rotation angle, and the pitch, roll, and yaw angles) are sent to the vehicle steering controller 44 and propulsion controller 46 for automated navigation.
  • Thus, the perception engine 32 updates the current ROI based on the previous target position. This improves the output rate to the controller 34 to meet real-time requirements. There is a balance between selecting a large ROI and selecting a smaller ROI around the target. A smaller ROI can improve the image processing speed, but a larger ROI makes it easier to find the next target position. An ROI which is 120% larger than the target in the horizontal and vertical directions is preferred.
  • Referring now to FIG. 5, the relative position and orientation of the hitch 11 and the implement 14 are read by the controller 34. The controller 34 calculates the distance between the hitch 11 and the implement 14, as well as the off-track and heading errors of the tractor 10. This information is processed by the controller 34 to generate the steering and speed commands to operate the tractor 10. The distance calculation is used to control the speed and stop the tractor 10 when the hitch 11 is aligned with the implement 14. Off-track and heading errors are processed either to position the tractor 10 so that the center axis of the tractor 10 is aligned with the center axis of the implement, while sequentially or simultaneously controlling the convergence of the hitch points for aligning multi-contact-point hitches (e.g., agricultural tractor three-point hitches), or simply to drive the tractor 10 along the shortest path to converge the distance between the hitch points for single-contact-point hitches (drawbar-tongue applications).
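The distance, off-track, and heading-error calculations described above can be sketched in the ground plane as follows. The frame conventions and argument names are illustrative assumptions, not quantities defined by the disclosure.

```python
import math

def alignment_errors(hitch_xy, implement_xy, implement_yaw, vehicle_yaw=0.0):
    """Distance, off-track (lateral) error, and heading error between the
    vehicle hitch and the implement hitch in the ground plane. The frame
    conventions and names here are illustrative assumptions."""
    dx = implement_xy[0] - hitch_xy[0]
    dy = implement_xy[1] - hitch_xy[1]
    distance = math.hypot(dx, dy)
    # Off-track error: offset component perpendicular to the vehicle heading.
    off_track = -dx * math.sin(vehicle_yaw) + dy * math.cos(vehicle_yaw)
    # Heading error, wrapped into (-pi, pi].
    diff = implement_yaw - vehicle_yaw
    heading_error = math.atan2(math.sin(diff), math.cos(diff))
    return distance, off_track, heading_error
```

The distance output drives the stop condition; the off-track and heading errors drive the steering convergence.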
  • The controller 34 receives the following data: vehicle database information, including the hitch transformation, hitch height, and inertial transformation; implement database information, including the hitch transformation and hitch height; perception engine information, including the X, Y, Z position and the roll, pitch, and yaw; inertial sensor information, including 3D accelerations and rotation rates; and vehicle telemetry information, including wheel curvature, wheel speed, and the inertial transformation. The controller 34 processes this information and generates these outputs: guidance controller information, including off-track error, normal error, heading rate, and heading error; and velocity limiter information, including heading error, desired speed, and desired direction (forward or reverse).
  • The vehicle database resides on a controller of the tractor 10 on which the auto-motion control system is installed. The database can contain the following information (and more) about the tractor. The hitch transformation comprises the Euclidean translation and rotation transformation matrices from the focal point of the camera(s) to the pin hole on the drawbar. The hitch height is the magnitude of the vector normal to the world reference ground plane extending to the center of the hitch. The inertial sensor transformation is the Euclidean translation between the inertial sensor 38 and either the hitch or the camera.
  • The implement database resides on a controller of the tractor and contains information about all the implements in the fleet. It can contain the following information (and more) about each implement in the fleet. The hitch transformation comprises the Euclidean translation and rotation transformation matrices from a point on the target to the center of the implement hitch. The hitch height is the magnitude of the vector normal to the world reference ground plane extending to the center of the hitch. The perception engine supplies the vision system data: the x, y, z position and the roll, pitch, and yaw of the vehicle relative to the target 16 on the implement 14.
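The "Euclidean translation and rotation transformation matrices" stored in these databases act on points as rigid-body transforms, p' = R p + t. A minimal sketch, with the nested-list representation chosen purely for illustration:

```python
def transform_point(R, t, p):
    """Apply a rigid (Euclidean) transform p' = R @ p + t, with R a 3x3
    rotation given as nested lists and t, p given as 3-tuples."""
    return tuple(
        sum(R[i][j] * p[j] for j in range(3)) + t[i]
        for i in range(3)
    )
```

Chaining such transforms (camera to hitch, target to implement hitch) yields the hitch-to-hitch relationship the controller needs.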
  • An optional inertial sensor 38 provides vehicle attitude information, including but not limited to compass heading, three-dimensional acceleration, and 3-axis rotation rates (roll, pitch, and yaw). The forward or reverse speed of the wheels on the tractor 10 and the left or right steering curvature of the tractor 10 are provided by existing sensors on the vehicle 10. One or more inertial sensors may be employed to improve the stability of the controller 34 during rapid movement and to interpolate between perception epochs. The inertial information may also be used in lieu of perception during intermittent loss of perception.
  • As a result, the controller 34 performs 3D translations and rotations using calibration data and the output from the perception engine 32, and projects the location of the implement hitch point onto the plane the tractor is driving on to generate a lateral offset and a heading error. The controller 34 preferably generates a steering command based on simultaneously converging the offset and heading error to align the hitch points. The controller 34 preferably generates a speed control signal based on a lookup table using distance and heading error as inputs. The controller 34 is preferably also programmed to prevent generation of a speed control signal which could make the tractor "run away" in case of a loss of or error in perception information, or which could increase speed while controlling the vehicle.
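The lookup-table speed control and the run-away guard might be sketched as below. The breakpoint distances, speeds, and heading-error cutoff are illustrative assumptions, not values from the disclosure.

```python
import bisect

# Illustrative lookup table mapping distance (m) to commanded speed (m/s).
# These numbers are assumptions, not values taken from the disclosure.
BREAKPOINTS = [0.2, 1.0, 3.0, 10.0]
SPEEDS = [0.0, 0.2, 0.5, 1.0]

def speed_command(distance, heading_error, max_heading_error=0.5):
    """Distance/heading-error speed lookup with a conservative guard:
    an out-of-range heading error (e.g., from a perception fault) always
    commands zero speed, so the commanded speed can never increase on
    bad data -- the 'run away' prevention described above."""
    if abs(heading_error) > max_heading_error:
        return 0.0
    i = min(bisect.bisect_left(BREAKPOINTS, distance), len(SPEEDS) - 1)
    return SPEEDS[i]
```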
  • Referring now to FIG. 6, the interface 36 receives (from the controller 34) the output information described previously, and processes that information according to the process steps shown in FIG. 6. If the vehicle is on in step 100, then step 102 disables the hitch and alignment controls. Step 104 prevents further action unless the operator enables the hitch alignment process.
  • If the operator enables the hitch alignment process, then step 106 prompts the operator to drive the vehicle 10 towards the implement 14.
  • If step 108 detects the implement, it directs control to step 110, else back to step 106.
  • Step 110 searches a database 112 for calibration data. If a calibration is found, step 114 directs control to step 116, else back to step 102.
  • Step 116 sends the calibration data to the controller 34.
  • Then step 118 prompts the operator to actuate automated hitching.
  • Step 120 directs control to step 122 if the operator actuates automated hitching, else back to step 118.
  • Step 122 actuates automated steering controls and an automated speed limit.
  • Step 124 directs control to step 126 if the auto alignment is active and the operator provides a manual input to a control (turns the steering wheel, depresses the clutch, shifts the transmission, or applies the brakes), thus disabling the automated control. As a result, the operator has priority so that the operator can take control of the machine.
  • Step 126 exits the automation, returns the tractor 10 to manual control and directs control back to step 102.
  • Step 128 directs control to step 130 if the tractor hitch 11 and the hitch of the implement 14 are aligned, else back to step 102.
  • Step 130 exits the automation, prompts the operator to place the tractor 10 in park, and directs control back to step 102.
  • The interface 36 implements a human interface using standard virtual terminal technology, and implements a Class 3 implement vehicle control. Both are defined in standard ISO11783. Alternatively, there are other means of implementing the human interface.
  • The conversion of the above flow chart into a standard language for implementing the described algorithm in a digital computer or microprocessor will be evident to one of ordinary skill in the art.
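One such conversion might encode FIG. 6's branches as a table-driven state machine. The state and event names below are illustrative labels for the numbered steps, not identifiers from the disclosure.

```python
# Each (state, event) pair maps to the next state; unknown events leave
# the machine where it is (illustrative encoding of FIG. 6's branches).
TRANSITIONS = {
    ("disabled", "operator_enables"): "drive_toward_implement",            # steps 102-106
    ("drive_toward_implement", "implement_detected"): "load_calibration",  # step 108
    ("load_calibration", "calibration_found"): "await_actuation",          # steps 110-118
    ("load_calibration", "calibration_missing"): "disabled",               # step 114
    ("await_actuation", "operator_actuates"): "auto_align",                # steps 120-122
    ("auto_align", "manual_override"): "disabled",                         # steps 124-126
    ("auto_align", "hitches_aligned"): "park_prompt",                      # steps 128-130
    ("park_prompt", "parked"): "disabled",
}

def step(state, event):
    """Return the next state, or stay put if the event does not apply."""
    return TRANSITIONS.get((state, event), state)
```

Note that any manual input during auto-alignment drops straight back to "disabled", preserving the operator-priority behavior of steps 124-126.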
  • Many other video camera arrangements could be used, such as stereo imaging, automatic tracking, auto focus PTZ (Pan Tilt Zoom) if the feedback from the mechanical lens position and pan tilt had sufficient accuracy. The target 16 may include a code for automatic identification of the implement 14 so that calibration parameters can be automatically loaded from a storage device.
  • The perception system described herein provides the position of the implement relative to the tractor. The movement of the vehicle detected by the vision system can be combined with existing on-tractor speed and acceleration sensors, including calculations from GPS. Such a combined signal may be used to improve control of the path the vehicle traverses. This perception system could also be used by auto-guided tractor-implement pairs to perform direction reversals using a reverse-motion turn to achieve automated three-point turns. The guidance control system can control the path of the vehicle and implement when pushing the implement in reverse. The perception system also provides three-dimensional information about the implement in a tractor-implement pair. Ride height of implements or penetration depth of ground-engaging implements can be measured by referencing the implement's height relative to the tractor operating on the surface. This invention may also be used with an automated hitching system so that no manual effort is required to hitch the tractor when it is aligned with the implement. This invention also applies to both single-point (e.g., drawbar) and multiple-point (e.g., three-point) hitches for implements that hitch anywhere on the vehicle (front, rear, over, and under).
  • While the disclosure has been illustrated and described in detail in the drawings and foregoing description, such illustration and description is to be considered as exemplary and not restrictive in character, it being understood that illustrative embodiments have been shown and described and that all changes and modifications that come within the spirit of the disclosure are desired to be protected. It will be noted that alternative embodiments of the present disclosure may not include all of the features described yet still benefit from at least some of the advantages of such features. Those of ordinary skill in the art may readily devise their own implementations that incorporate one or more of the features of the present disclosure and fall within the spirit and scope of the present invention as defined by the appended claims.

Claims (12)

We claim:
1. An automatic control system for moving a vehicle with respect to an implement to be attached to the vehicle, the vehicle having propulsion and steering controls which respond to electronic input signals, the automatic control system comprising:
a visual target mounted on the implement;
a camera mounted on the vehicle, the camera generating images of the target; and
a motion control unit which receives the images from the camera, the motion control unit processing said images and generating vehicle movement command signals as a function of the images, and the motion control unit communicating the command signals to the propulsion and steering controls, and in response to the command signals, the propulsion and steering controls moving the vehicle to a coupling position so that the implement can be coupled to the vehicle.
2. The control system of claim 1, wherein the control unit comprises:
a perception engine connected to the camera;
a controller connected to the perception engine; and
a vehicle/operator interface, the vehicle/operator interface being connected to the controller, and to the propulsion and steering controls.
3. The control system of claim 2, wherein:
the perception engine determines a region of interest and periodically updates the region of interest based on a target position.
4. The control system of claim 3, wherein:
the region of interest is a portion of the image received from the camera.
5. The control system of claim 3, wherein:
the region of interest is approximately 120% larger than the target in each direction.
6. The control system of claim 2, wherein:
the perception engine captures an image from the camera, un-distorts the image, and searches for the target.
7. The control system of claim 6, wherein:
if the target is found, the perception engine recovers vehicle posture information and transmits this information to the controller.
8. The control system of claim 1, wherein:
the target displays a chessboard pattern.
9. The control system of claim 1, further comprising:
a near-view camera and a far-view camera.
10. A control system for moving a vehicle with respect to an implement to be attached to the vehicle, the vehicle having propulsion and steering controls which respond to electronic input signals, the control system comprising:
a visual target mounted on the implement;
a camera mounted on the vehicle, the camera generating images of the target; and
a motion control unit which receives the images from the camera, the motion control unit processing said images and generating vehicle movement command signals as a function of the images, and the motion control unit communicating the command signals to the propulsion and steering controls, and in response to the command signals, the propulsion and steering controls moving the vehicle to a coupling position so that the implement can be coupled to the vehicle, the motion control unit comprising:
a perception engine connected to the camera;
a controller connected to the perception engine; and
a vehicle/operator interface, the vehicle/operator interface being connected to the controller, and to the propulsion and steering controls, the perception engine capturing an image from the camera, un-distorting the image, and searching for the target, and if the target is found, the perception engine recovering vehicle posture information and transmitting said information to the controller.
11. The control system of claim 10, wherein:
the target displays a chessboard pattern.
12. An automatic control system for moving a vehicle with respect to an implement to be attached to the vehicle, the vehicle having propulsion and steering controls which respond to electronic input signals, the automatic control system comprising:
a visual target mounted on the implement;
a near-view camera and a far-view camera mounted on the vehicle, the cameras generating images of the target; and
a motion control unit which receives the images from the cameras, the motion control unit processing said images and generating vehicle movement command signals as a function of the images, and the motion control unit communicating the command signals to the propulsion and steering controls, and in response to the command signals, the propulsion and steering controls moving the vehicle to a coupling position so that the implement can be coupled to the vehicle.
US14/027,996 2013-09-16 2013-09-16 Vehicle auto-motion control system Active 2034-06-12 US9880560B2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US14/027,996 US9880560B2 (en) 2013-09-16 2013-09-16 Vehicle auto-motion control system
BR102014021838A BR102014021838B8 (en) 2013-09-16 2014-09-03 automatic control system
DE201410217746 DE102014217746A1 (en) 2013-09-16 2014-09-04 A control system for moving a vehicle relative to an implement to be mounted on the vehicle
CN201410471869.4A CN104460361B (en) 2013-09-16 2014-09-16 Automatic motion control system for vehicle


Publications (2)

Publication Number Publication Date
US20150077557A1 true US20150077557A1 (en) 2015-03-19
US9880560B2 US9880560B2 (en) 2018-01-30


Country Status (4)

Country Link
US (1) US9880560B2 (en)
CN (1) CN104460361B (en)
BR (1) BR102014021838B8 (en)
DE (1) DE102014217746A1 (en)


Families Citing this family (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10104824B2 (en) 2013-10-14 2018-10-23 Kinze Manufacturing, Inc. Autonomous systems, methods, and apparatus for AG based operations
US10572141B2 (en) * 2016-06-10 2020-02-25 Cnh Industrial America Llc Autonomous agricultural system user interface interlock
DE102016011324A1 (en) 2016-09-21 2018-03-22 Wabco Gmbh A method of controlling a towing vehicle as it approaches and hitches to a trailer vehicle
EP3379222B1 (en) 2017-03-22 2020-12-30 Methode Electronics Malta Ltd. Magnetoelastic based sensor assembly
US10332002B2 (en) * 2017-03-27 2019-06-25 GM Global Technology Operations LLC Method and apparatus for providing trailer information
US10386856B2 (en) * 2017-06-29 2019-08-20 Uber Technologies, Inc. Autonomous vehicle collision mitigation systems and methods
US10065638B1 (en) 2017-08-03 2018-09-04 Uber Technologies, Inc. Multi-model switching on a collision mitigation system
US10509413B2 (en) * 2017-09-07 2019-12-17 GM Global Technology Operations LLC Ground reference determination for autonomous vehicle operations
US10633023B2 (en) * 2017-10-23 2020-04-28 Ford Global Technologies, Llc System and method for trailer-car initial hitching angle detection by force-based sensing system
US10670479B2 (en) 2018-02-27 2020-06-02 Methode Electronics, Inc. Towing systems and methods using magnetic field sensing
US11491832B2 (en) 2018-02-27 2022-11-08 Methode Electronics, Inc. Towing systems and methods using magnetic field sensing
US11135882B2 (en) 2018-02-27 2021-10-05 Methode Electronics, Inc. Towing systems and methods using magnetic field sensing
US11221262B2 (en) 2018-02-27 2022-01-11 Methode Electronics, Inc. Towing systems and methods using magnetic field sensing
US11014417B2 (en) 2018-02-27 2021-05-25 Methode Electronics, Inc. Towing systems and methods using magnetic field sensing
US11084342B2 (en) 2018-02-27 2021-08-10 Methode Electronics, Inc. Towing systems and methods using magnetic field sensing
CN111886627B (en) * 2018-03-19 2024-09-20 捷豹路虎有限公司 Controller for vehicle
US11050933B2 (en) * 2018-04-27 2021-06-29 Continenal Automotive Systems, Inc. Device and method for determining a center of a trailer tow coupler
WO2020008419A2 (en) * 2018-07-04 2020-01-09 Corephotonics Ltd. Cameras with scanning optical path folding elements for automotive or surveillance applications
DE102018215982A1 (en) * 2018-09-19 2020-03-19 Zf Friedrichshafen Ag Device and method for controlling a vehicle for a swap body
EP3874926B1 (en) * 2018-10-30 2022-12-21 Tokyo Keiki Inc. Error correction apparatus
EP3670761B1 (en) * 2018-12-21 2021-10-20 Hiab AB A vehicle provided with a control system, and a method for the vehicle
US11124943B2 (en) * 2019-02-22 2021-09-21 Deere & Company Apparatuses and methods for measuring saddle linkage position of a motor grader
US10780752B1 (en) 2019-04-09 2020-09-22 Ford Global Technologies, Llc Vehicle hitch ball detection system
DE102019205447A1 (en) * 2019-04-16 2020-10-22 Zf Friedrichshafen Ag Device and method for approaching a towing vehicle to a trailer as well as towing vehicle and trailer
CN110244724A (en) * 2019-06-12 2019-09-17 广州晒帝智能科技有限公司 A kind of direction of travel control method and device and equipment based on caterpillar type robot
JP7016115B2 (en) * 2019-08-09 2022-02-04 Kobe University Shaft joint characteristic evaluation device and characteristic evaluation method
US12016257B2 (en) 2020-02-19 2024-06-25 Sabanto, Inc. Methods for detecting and clearing debris from planter gauge wheels, closing wheels and seed tubes
US11697414B2 (en) * 2020-03-17 2023-07-11 Ford Global Technologies, Llc System and method for remote automated vehicle alignment with a trailer for hitching
DE102021002955B3 (en) 2021-06-10 2022-08-11 Jost-Werke Deutschland Gmbh Method of moving a vehicle to a component of an object spaced therefrom (preposition point)
DE102021002956B3 (en) 2021-06-10 2022-08-11 Jost-Werke Deutschland Gmbh Method for moving a vehicle to a component of an object spaced therefrom (coordinate transformation)
US12114589B2 (en) 2021-06-30 2024-10-15 Deere & Company Pre-charging technique for dispensing product from tractor tool prior to reaching plant site
US11822339B2 (en) 2021-06-30 2023-11-21 Deere & Company Predictive technique for dispensing product from tractor tool
US11870973B2 (en) * 2021-07-27 2024-01-09 Deere & Company Camera calibration tool
DE102021119686A1 (en) * 2021-07-29 2023-02-02 Deere & Company Method for supporting a coupling process to be performed at an agricultural implement interface
US20230340758A1 (en) * 2022-04-21 2023-10-26 Deere & Company Work vehicle having enhanced visibility throughout implement movement
US20230339402A1 (en) * 2022-04-21 2023-10-26 Deere & Company Selectively utilizing multiple imaging devices to maintain a view of an area of interest proximate a work vehicle
DE102023100586A1 (en) 2023-01-12 2024-07-18 Connaught Electronics Ltd. Method for attaching a vehicle to a trailer

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050074143A1 (en) * 2003-10-02 2005-04-07 Nissan Motor Co., Ltd. Vehicle backing assist apparatus and vehicle backing assist method
DE102004043761A1 (en) * 2004-09-10 2006-03-16 Daimlerchrysler Ag Tow coupling monitoring method for towing vehicle, involves comparing graphic data with allowed samples and threshold values for producing assistance signals, and displaying data and signals for monitoring the coupling process
US20120013742A1 (en) * 2010-07-16 2012-01-19 Delphi Technologies, Inc. Vision system and method for displaying a field of view dependent upon detecting an object
WO2012139275A1 (en) * 2011-04-11 2012-10-18 Intel Corporation Object of interest based image processing
US20140176713A1 (en) * 2012-12-22 2014-06-26 Continental Automotive Systems, Inc. Vehicular bifocal rear camera system
US20140270343A1 (en) * 2013-03-12 2014-09-18 Abu Shaher Sanaullah Efficient 360 degree video processing

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10323915A1 (en) 2003-05-23 2005-02-03 Daimlerchrysler Ag Camera-based position detection for a road vehicle
US7171769B2 (en) 2004-02-12 2007-02-06 Sno-Way International, Inc. Hitch system using a video camera to facilitate hitch alignment
EP1828862A2 (en) * 2004-12-14 2007-09-05 Sky-Trax Incorporated Method and apparatus for determining position and rotational orientation of an object
US7568716B2 (en) 2005-05-13 2009-08-04 Dietz Dan L Method and apparatus for alignment of trailer hitch
DE102006025698A1 (en) 2006-06-01 2007-12-06 Alois Pöttinger Maschinenfabrik Gmbh Method and device for controlling the hitching of an agricultural machine to a tractor
FR2913938B1 (en) * 2007-03-19 2009-05-08 Lohr Ind Total immobilization and self-centering assembly of a pivot for hitching a semi-trailer on a railway unit
CN100555142C (en) * 2008-01-24 2009-10-28 浙江大学 Distributed measurement and control device for an automated guided vehicle and parking position control method
US7777615B2 (en) 2008-03-20 2010-08-17 Toyota Motor Engineering & Manufacturing North America, Inc. System for assisting the attachment of a trailer to a vehicle
CN201503547U (en) * 2009-07-03 2010-06-09 陕西昕宇表面工程有限公司 Control system for automatically overturning and positioning a conveying trolley
US8469385B2 (en) * 2010-04-02 2013-06-25 F & S Manufacturing LLC Rotary pivot hitch extension system
CN102117066A (en) * 2011-02-28 2011-07-06 三一重机有限公司 Remote control system for excavator
DE102012001380A1 (en) 2012-01-24 2012-08-02 Daimler Ag Assistance method for a coupling maneuver of a motor car with a trailer coupling at a trailer hitch pan using a driver assistance system, involves overlaying the screen display of the image with a trajectory between the trailer coupling and the hitch pan
JP5852471B2 (en) * 2012-02-28 2016-02-03 Toyota Central R&D Labs., Inc. Vehicle control device, steering simulation device, and program

Cited By (70)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9971943B2 (en) 2007-03-21 2018-05-15 Ford Global Technologies, Llc Vehicle trailer angle detection system and method
US9854209B2 (en) 2011-04-19 2017-12-26 Ford Global Technologies, Llc Display system utilizing vehicle and trailer dynamics
US10609340B2 (en) 2011-04-19 2020-03-31 Ford Global Technologies, Llc Display system utilizing vehicle and trailer dynamics
US10196088B2 (en) 2011-04-19 2019-02-05 Ford Global Technologies, Llc Target monitoring system and method
US9723274B2 (en) 2011-04-19 2017-08-01 Ford Global Technologies, Llc System and method for adjusting an image capture setting
US9926008B2 (en) 2011-04-19 2018-03-27 Ford Global Technologies, Llc Trailer backup assist system with waypoint selection
US20160059780A1 (en) * 2014-09-03 2016-03-03 Ford Global Technologies, Llc Trailer angle detection target fade warning
US10112537B2 (en) * 2014-09-03 2018-10-30 Ford Global Technologies, Llc Trailer angle detection target fade warning
US12072709B2 (en) 2014-10-24 2024-08-27 Rockwell Automation Technologies, Inc. Systems and methods for executing a task with an unmanned vehicle
US11586208B2 (en) * 2014-10-24 2023-02-21 Clearpath Robotics Inc. Systems and methods for executing a task with an unmanned vehicle
US9751558B2 (en) 2015-03-25 2017-09-05 Ford Global Technologies, Llc Handwheel obstruction detection and inertia compensation
US10421490B2 (en) 2015-03-25 2019-09-24 Ford Global Technologies, Llc Handwheel obstruction detection and inertia compensation
US20160312432A1 (en) * 2015-04-23 2016-10-27 Caterpillar Inc. Computer Vision Assisted Work Tool Recognition and Installation
EP3143854A1 (en) * 2015-09-17 2017-03-22 CLAAS Tractor S.A.S. Agricultural working machine
US9981656B2 (en) 2015-10-13 2018-05-29 Ford Global Technologies, Llc Vehicle parking assist system
US10496101B2 (en) 2015-10-28 2019-12-03 Ford Global Technologies, Llc Trailer backup assist system with multi-purpose camera in a side mirror assembly of a vehicle
US9836060B2 (en) 2015-10-28 2017-12-05 Ford Global Technologies, Llc Trailer backup assist system with target management
US10328933B2 (en) 2015-10-29 2019-06-25 Ford Global Technologies, Llc Cognitive reverse speed limiting
US9895945B2 (en) 2015-12-08 2018-02-20 Ford Global Technologies, Llc Trailer backup assist system with hitch assist
US10306821B2 (en) 2015-12-10 2019-06-04 Robert Bosch Gmbh Device and method for controlling the operation of a towed implement, which can be activated hydraulically, on a vehicle
EP3178302A1 (en) * 2015-12-10 2017-06-14 Robert Bosch Gmbh Device and method for governing the operation of a hydraulically actuated towing device on a vehicle
US10127459B2 (en) 2015-12-17 2018-11-13 Ford Global Technologies, Llc Trailer type identification system
US10011228B2 (en) 2015-12-17 2018-07-03 Ford Global Technologies, Llc Hitch angle detection for trailer backup assist system using multiple imaging devices
US10768505B2 (en) * 2016-04-07 2020-09-08 Lg Electronics Inc. Driver assistance apparatus and vehicle
US20170293198A1 (en) * 2016-04-07 2017-10-12 Lg Electronics Inc. Driver assistance apparatus and vehicle
WO2017220257A1 (en) * 2016-06-24 2017-12-28 Zf Friedrichshafen Ag Autonomous coupling of a trailer
US10106193B2 (en) 2016-07-01 2018-10-23 Ford Global Technologies, Llc Enhanced yaw rate trailer angle detection initialization
US12078988B2 (en) * 2016-07-19 2024-09-03 Raven Industries, Inc. System and method for autonomous control of agricultural machinery and equipment
US20230152797A1 (en) * 2016-07-19 2023-05-18 Raven Industries, Inc. System and method for autonomous control of agricultural machinery and equipment
US10106951B2 (en) 2016-09-21 2018-10-23 Deere & Company System and method for automatic dump control
AU2017332524B2 (en) * 2016-09-24 2019-05-23 Glic Ltd Automatic car trailer hitching and unhitching system
GB2554523A (en) * 2016-09-24 2018-04-04 Roderick Webster Gary Automatic Car trailer hitching and unhitching system
US11179982B2 (en) 2016-09-24 2021-11-23 Gary Webster Automatic car trailer hitching and unhitching system
GB2554523B (en) * 2016-09-24 2018-12-19 Roderick Webster Gary Automatic Car trailer hitching and unhitching system
US9829883B1 (en) 2016-10-17 2017-11-28 Ford Global Technologies, Llc Trailer backup assist system having remote control and user sight management
WO2018087623A1 (en) * 2016-11-10 2018-05-17 Corsetti Claudio Movement system for any farming machinery
IT201600113433A1 (en) * 2016-11-10 2018-05-10 Claudio Corsetti HANDLING SYSTEM FOR ANY AGRICULTURAL MACHINERY.
US20180284808A1 (en) * 2017-03-31 2018-10-04 Komatsu Ltd. Control system of transporter vehicle, transporter vehicle, and control method of transporter vehicle
US10802503B2 (en) * 2017-03-31 2020-10-13 Komatsu Ltd. Control system of transporter vehicle, transporter vehicle, and control method of transporter vehicle
US20180316554A1 (en) * 2017-04-27 2018-11-01 Autoliv Asp, Inc. System and method for configuring at least one sensor system of a vehicle
US10742494B2 (en) * 2017-04-27 2020-08-11 Veoneer Us, Inc. System and method for configuring at least one sensor system of a vehicle
WO2018210990A1 (en) 2017-05-18 2018-11-22 Cnh Industrial Italia S.P.A. System and method for automatic connection between a tractor and an implement
IT201700054083A1 (en) * 2017-05-18 2018-11-18 Cnh Ind Italia Spa SYSTEM AND METHOD OF AUTOMATIC CONNECTION BETWEEN TRACTOR AND TOOL
US10710585B2 (en) 2017-09-01 2020-07-14 Ford Global Technologies, Llc Trailer backup assist system with predictive hitch angle functionality
WO2019156623A1 (en) * 2018-02-07 2019-08-15 Red Dot Robotics Pte. Ltd. Towing vehicle and towed vehicle connection assembly
US10351061B1 (en) * 2018-03-09 2019-07-16 Deere & Company Implement attachment guidance system
JP7060707B2 (en) 2018-04-04 2022-04-26 コンチネンタル オートモーティブ システムズ インコーポレイテッド Devices and methods for detecting the distance between the vehicle and the trailer
WO2019195067A1 (en) * 2018-04-04 2019-10-10 Continental Automotive Systems, Inc. Vehicle trailer distance detection device and method
US11633994B2 (en) 2018-04-04 2023-04-25 Continental Autonomous Mobility Us, Llc. Vehicle-trailer distance detection device and method
CN112424001A (en) * 2018-04-04 2021-02-26 大陆汽车系统公司 Vehicle-trailer distance detection apparatus and method
JP2021519242A (en) * 2018-04-04 2021-08-10 Continental Automotive Systems, Inc. Devices and methods for detecting the distance between the vehicle and the trailer
WO2019212868A1 (en) * 2018-05-01 2019-11-07 Continental Automotive Systems, Inc. Trailer detection and autonomous hitching
US11518204B2 (en) 2018-05-01 2022-12-06 Continental Autonomous Mobility US, LLC Trailer detection and autonomous hitching
WO2019213287A1 (en) * 2018-05-01 2019-11-07 Continental Automotive Systems, Inc. Trailer detection and autonomous hitching
US11198341B2 (en) * 2018-05-01 2021-12-14 Continental Automotive Systems, Inc. Trailer detection and autonomous hitching
CN112424002A (en) * 2018-05-08 2021-02-26 大陆汽车系统公司 Visual object tracker
US11991942B2 (en) * 2018-06-29 2024-05-28 Dcentralized Systems, Inc. Implement attachment system for autonomous modular ground utility robot system
US10773665B2 (en) 2018-10-16 2020-09-15 Cnh Industrial America Llc System and method for detecting a damage condition associated with an agricultural machine
JP7444605B2 (en) 2019-01-07 2024-03-06 コンチネンタル オートモーティヴ ゲゼルシャフト ミット ベシュレンクテル ハフツング How to calculate the location of the tow hitch
JP2020113268A (en) * 2019-01-07 2020-07-27 コンチネンタル オートモーティヴ ゲゼルシャフト ミット ベシュレンクテル ハフツングContinental Automotive GmbH Method for calculating tow hitch position
CN109799828A (en) * 2019-02-28 2019-05-24 北京智行者科技有限公司 Lateral and longitudinal coordinated control method for an autonomous vehicle
US11904778B2 (en) * 2019-05-14 2024-02-20 Magna Electronics Inc. Vehicular rear vision system with trailer connection verification
US20200361397A1 (en) * 2019-05-14 2020-11-19 Magna Electronics Inc. Vehicular rear vision system with trailer connection verification
US11429114B2 (en) * 2020-02-14 2022-08-30 Deere & Company Implement control of vehicle and implement combination
EP3864944A1 (en) * 2020-02-14 2021-08-18 Deere & Company Implement control of vehicle and implement combination
JP7434127B2 (en) 2020-09-16 2024-02-20 株式会社クボタ work equipment
JP2022049379A (en) * 2020-09-16 2022-03-29 株式会社クボタ Work machine
WO2022123347A1 (en) * 2020-12-10 2022-06-16 Agco International Gmbh Methods and systems for controlling motion of agricultural vehicles
EP4295657A1 (en) * 2022-06-23 2023-12-27 Agco Corporation Methods of locating agricultural implements
CN115760989A (en) * 2023-01-10 2023-03-07 西安华创马科智能控制系统有限公司 Hydraulic support robot track alignment method and device

Also Published As

Publication number Publication date
CN104460361A (en) 2015-03-25
BR102014021838A2 (en) 2017-09-19
US9880560B2 (en) 2018-01-30
BR102014021838B8 (en) 2020-04-28
CN104460361B (en) 2020-06-05
DE102014217746A1 (en) 2015-03-19
BR102014021838B1 (en) 2020-04-07

Similar Documents

Publication Publication Date Title
US9880560B2 (en) Vehicle auto-motion control system
US11109018B2 (en) Targetless vehicular camera misalignment correction method
EP3787911B1 (en) Trailer detection and autonomous hitching
EP3787909B1 (en) Coupler and tow-bar detection for automated trailer hitching via cloud points
US11417116B2 (en) Vehicular trailer angle detection system
EP3624578B1 (en) System and method for automatic connection between a tractor and an implement
US11148667B2 (en) Automated reversing by choice of target location
US20210078634A1 (en) Vehicular trailering assist system
US8908037B2 (en) Calibration device, method, and program for on-board camera
JP5561566B2 (en) Driving assistance device
WO2015043507A1 (en) Image processing method and apparatus for cars, method for generating car surround view image, and car surround view system
US10453217B2 (en) Targetless vehicle camera calibration system
WO2014054753A1 (en) Image processing device and device for monitoring area in front of vehicle
WO2010070920A1 (en) Device for generating image of surroundings of vehicle
WO2016129552A1 (en) Camera parameter adjustment device
US9667875B2 (en) Vision system and method of monitoring surroundings of machine
JP5951785B2 (en) Image processing apparatus and vehicle forward monitoring apparatus
EP3761262B1 (en) Image processing device and image processing method
WO2024095802A1 (en) Driving control system, work vehicle, and driving control method
JP7236352B2 (en) Posture estimation device and posture estimation method
US20230394847A1 (en) Image processing system, image processing method, and storage medium
CN118665346A (en) Removable apparatus, image processing apparatus, storage medium, and method of mounting image pickup apparatus
JP2020145548A (en) Image adjustment device and image adjustment program
JPH05173629A (en) Visual sense device for unmanned traveling vehicle

Legal Events

Date Code Title Description
AS Assignment

Owner name: DEERE & COMPANY, ILLINOIS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HAN, SHUFENG;BUERKLE, BRYAN K.;TURNER, CHRISTOPHER D.;AND OTHERS;SIGNING DATES FROM 20130903 TO 20130905;REEL/FRAME:031214/0418

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4