CN111267564A - Hitch assist system - Google Patents


Info

Publication number
CN111267564A
Authority
CN
China
Prior art keywords
hitch
vehicle
trailer
controller
coupler
Prior art date
Legal status
Pending
Application number
CN201910849720.8A
Other languages
Chinese (zh)
Inventor
Luke Niewiadomski
Bruno Sielly Jales Costa
Anjali Krishnamachar
Douglas Rogan
Current Assignee
Ford Global Technologies LLC
Original Assignee
Ford Global Technologies LLC
Priority date
Filing date
Publication date
Application filed by Ford Global Technologies LLC filed Critical Ford Global Technologies LLC
Publication of CN111267564A


Classifications

    • B60D1/06 Ball-and-socket hitches, e.g. constructional details, auxiliary devices, their arrangement on the vehicle
    • B60D1/245 Traction couplings; Hitches; Draw-gear; Towing devices characterised by arrangements for particular functions for facilitating push back or parking of trailers
    • B60D1/36 Traction couplings; Hitches; Draw-gear; Towing devices characterised by arrangements for particular functions for facilitating connection, e.g. hitch catchers, visual guide means, signalling aids
    • B60D1/62 Auxiliary devices involving supply lines, electric circuits, or the like
    • B62D15/027 Parking aids, e.g. instruction means
    • B62D15/0285 Parking performed automatically
    • G05D1/0225 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving docking at a fixed facility, e.g. base station or loading bay
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0255 Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultrasonic signals
    • G05D1/0257 Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • B62D13/06 Steering specially adapted for trailers for backing a normally drawn trailer

Abstract

A hitch assist system is provided herein. The hitch assist system includes a sensing system having an imager and a proximity sensor. The hitch assist system further includes a controller configured to receive signals from the proximity sensor and generate a feature map; determine a coupler position based on detected features; and maneuver the vehicle along a path to align a hitch ball of the vehicle with the coupler of the trailer.

Description

Hitch assist system
Technical Field
The present disclosure relates generally to autonomous and semi-autonomous vehicle systems and, more particularly, to hitch-assist systems that facilitate hitching a vehicle to a trailer.
Background
The process of hitching a vehicle to a trailer can be difficult, especially for the inexperienced. There is therefore a need for a system that simplifies the process by assisting the user in a simple yet intuitive way.
Disclosure of Invention
According to some aspects of the present disclosure, a hitch assistance system is provided herein. The hitch assistance system includes a sensing system having an imager and a proximity sensor. The hitch assistance system further includes a controller for receiving signals from the proximity sensor and generating a feature map; determining a coupler position based on detected features; and maneuvering the vehicle along a path to align a hitch ball of the vehicle with the coupler of the trailer.
According to some aspects of the present disclosure, a hitch assistance method is provided herein. The method includes generating a grid map of features proximate a vehicle from one or more sensors disposed on the vehicle. The method also includes locating and mapping, relative to one another, two or more features indicative of the trailer. The method further includes controlling the vehicle along a path to align a hitch ball of the vehicle with a coupler of the trailer.
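To make the mapping step concrete, two mapped trailer features can be related to a coupler estimate with simple plane geometry. The sketch below is illustrative only and is not taken from the patent: it assumes, hypothetically, that the two features sit symmetrically about the trailer centerline and that the coupler lies on the perpendicular bisector of the segment joining them, a known tongue length toward the vehicle. All function and parameter names are made up for this example.

```python
import math

def estimate_coupler_pose(feature_a, feature_b, tongue_length):
    """Estimate coupler position and trailer heading from two mapped
    features (e.g., the front corners of the trailer box) given in
    vehicle coordinates.  Illustrative geometry, not the patent's method.
    """
    ax, ay = feature_a
    bx, by = feature_b
    # Midpoint of the segment joining the two features
    mx, my = (ax + bx) / 2.0, (ay + by) / 2.0
    # Unit vector along the line joining the features
    dx, dy = bx - ax, by - ay
    norm = math.hypot(dx, dy)
    ux, uy = dx / norm, dy / norm
    # Perpendicular toward the vehicle; sign convention assumes the
    # features are ordered left-to-right as seen from the vehicle.
    px, py = uy, -ux
    coupler = (mx + px * tongue_length, my + py * tongue_length)
    # Trailer longitudinal axis points from the coupler back to the body
    heading = math.atan2(-py, -px)
    return coupler, heading
```

With features at (-1, 5) and (1, 5) and a 1 m tongue, the estimated coupler lands at (0, 4), directly between the vehicle origin and the feature pair.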
According to some aspects of the present disclosure, a hitch assistance system is provided herein. The hitch assistance system includes an imager for capturing images rearward of the vehicle. The hitch assistance system further includes a controller for creating an image patch of a scene behind the vehicle based on the image provided by the imager; applying a parametric circle function to locate a circular structure within the image patch; comparing an input value of the hitch ball diameter with the number of pixels spanned by the circular structure to form a reference length; and using the reference length to determine a drawbar length or a hitch ball height.
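The "reference length" idea reduces to a scale factor: a hitch ball of known physical diameter that spans a known number of pixels fixes a meters-per-pixel conversion for other lengths measured in roughly the same image plane. A minimal sketch of that arithmetic (function names are illustrative, not from the patent):

```python
def meters_per_pixel(ball_diameter_m, ball_diameter_px):
    """Scale factor from the user-entered hitch ball diameter and its
    apparent diameter in pixels within the image patch."""
    return ball_diameter_m / ball_diameter_px

def pixels_to_meters(length_px, ball_diameter_m, ball_diameter_px):
    """Convert any other pixel measurement (e.g., an apparent drawbar
    length) using the same scale, assuming it lies at roughly the same
    distance from the imager as the ball."""
    return length_px * meters_per_pixel(ball_diameter_m, ball_diameter_px)
```

For example, a 50 mm ball spanning 50 pixels gives 1 mm per pixel, so a 300-pixel measurement corresponds to 0.3 m.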
These and other aspects, objects, and features of the present invention will be understood and appreciated by those skilled in the art upon studying the following specification, claims, and appended drawings.
Drawings
In the drawings:
FIG. 1 is a top perspective view of a vehicle and trailer equipped with a hitch assist system (also referred to as a "hitch assist" system), according to some examples;
FIG. 2 is a block diagram illustrating various components of a hitch assistance system, according to some examples;
FIG. 3 is a schematic top view of a vehicle during a step of an alignment sequence with a trailer, according to some examples;
FIG. 4 is a schematic top view of a vehicle during a subsequent step of the alignment sequence with the trailer, according to some examples;
FIG. 5 is a schematic top view of a vehicle during a subsequent step of the alignment sequence with the trailer, according to some examples;
FIG. 6 is a schematic top view of a vehicle during a subsequent step of the alignment sequence with the trailer and showing the position of a hitch ball of the vehicle at the end of a derived alignment path, according to some examples;
FIG. 7 is a schematic top view of a vehicle with a proximity sensor attached, according to some examples;
FIG. 8 is an exemplary grid map of an area proximate to a vehicle generated by a proximity sensor, according to some examples;
FIG. 9 is an enlarged view of region IX of FIG. 8;
FIG. 10 is a flow diagram of a method of operating a hitch assist system, according to some examples;
FIG. 11 is a rear perspective view of a vehicle having an imager disposed in a rear portion thereof, according to some examples;
FIG. 12 is a representative image block generated by an imager, according to some examples;
FIG. 13 is a rear side plan view of a vehicle having a hitch assembly operably coupled thereto, according to some examples;
FIG. 14 is a flow chart illustrating a method of determining various hitch assembly characteristics, according to some examples;
FIG. 15 is an exemplary graph illustrating a relationship between hitch ball diameter and ball-to-imager distance, according to some examples; and
FIG. 16 is an exemplary graph illustrating a relationship between drawbar length and imager view angle, according to some examples.
Detailed Description
For purposes of the description herein, the terms "upper," "lower," "right," "left," "rear," "front," "vertical," "horizontal," and derivatives thereof shall relate to the invention as oriented in fig. 1. It is to be understood, however, that the invention may assume various alternative orientations, except where expressly specified to the contrary. It is also to be understood that the specific devices and processes illustrated in the attached drawings, and described in the following specification are simply exemplary of the inventive concepts defined in the appended claims. Hence, specific dimensions and other physical characteristics relating to the examples disclosed herein are not to be considered as limiting, unless the claims expressly state otherwise.
As required, detailed examples of the present invention are disclosed herein. However, it is to be understood that the disclosed examples are merely exemplary of the invention, which may be embodied in various and alternative forms. The drawings are not necessarily to scale, and some features may be exaggerated or minimized to show details of particular components. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ the present invention.
In this document, relational terms such as first and second, top and bottom, and the like are used solely to distinguish one entity or action from another entity or action, without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by "comprises a" does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises the element.
As used herein, the term "and/or," when used in reference to a list of two or more items, means that any one of the listed items can be employed alone, or any combination of two or more of the listed items can be employed. For example, if a composition is described as containing components A, B and/or C, the composition may contain: only A; only B; only C; a combination of A and B; a combination of A and C; a combination of B and C; or a combination of A, B and C.
As used herein, "visibility" is a measure of the distance at which an object or light can be clearly discerned. Thus, a low visibility condition may exist whenever an object or light is difficult to discern from a threshold distance, and a high visibility condition may exist whenever an object or light is discernable from a threshold distance. Objects may be difficult to discern due to night-like conditions (i.e., low light level conditions) and/or atmospheric disturbances such as fog, rain, or any other suspended particles reducing the ability to discern the object from the threshold distance.
The following disclosure describes a hitch assist system for a vehicle. The hitch assistance system may include a sensing system having an imager and a proximity sensor. The hitch assistance system may further include a controller for receiving a signal from the proximity sensor and generating a feature map; determining a coupler position based on detected features; and maneuvering the vehicle along a path to align a hitch ball of the vehicle with the coupler of the trailer. In some examples, the hitch assistance system may additionally and/or alternatively include an imager for capturing images rearward of the vehicle. The controller may be configured to create an image patch of a scene rearward of the vehicle based on the image provided by the imager; apply a parametric circle function to locate a circular structure within the image patch; compare an input value of the hitch ball diameter with the number of pixels spanned by the circular structure to form a reference length; and use the reference length to determine a drawbar length or a hitch ball height. It should be understood that an image patch may be formed of any number of pixels, and the pixels may each have a uniform size. However, in alternative examples, the pixel size may vary without departing from the scope of this disclosure.
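The parametric circle search can be illustrated with a toy Hough-style detector: every edge pixel votes for candidate centres at each trial radius, and the best-supported (cx, cy, r) triple is taken as the hitch ball outline. This is only a sketch of the general technique, not the patent's implementation; a production system would use an optimized detector rather than this brute-force accumulator.

```python
import numpy as np

def find_circle(edge_img, radii):
    """Brute-force parametric (Hough-style) circle search.

    Each edge pixel votes for candidate centres at every trial radius;
    the (radius, cy, cx) accumulator cell collecting the most votes
    wins.  Toy stand-in for a parametric circle function.
    """
    h, w = edge_img.shape
    acc = np.zeros((len(radii), h, w), dtype=np.int32)
    ys, xs = np.nonzero(edge_img)
    thetas = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
    for ri, r in enumerate(radii):
        for y, x in zip(ys, xs):
            # Candidate centres lie on a circle of radius r around the pixel
            cy = np.round(y - r * np.sin(thetas)).astype(int)
            cx = np.round(x - r * np.cos(thetas)).astype(int)
            ok = (cy >= 0) & (cy < h) & (cx >= 0) & (cx < w)
            np.add.at(acc[ri], (cy[ok], cx[ok]), 1)
    ri, cy, cx = np.unravel_index(np.argmax(acc), acc.shape)
    return cx, cy, radii[ri]
```

On a synthetic 64×64 edge image containing a circle of radius 10, the search recovers the centre and radius to within a pixel, after which the radius in pixels can feed the reference-length comparison described above.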
Referring to fig. 1 and 2, reference numeral 10 designates a hitch assist system for a vehicle 12. Specifically, the hitch assistance system 10 includes a controller 14 that acquires position data of a coupler 16 of a trailer 18 and derives a vehicle path 20 (fig. 3) to align a hitch assembly 22 of the vehicle 12 with the coupler 16. In some examples, hitch assembly 22 may include a ball seat 24 supporting a hitch ball 26. Hitch ball 26 may be secured to a ball seat 24 extending from vehicle 12 and/or hitch ball 26 may be secured to a portion of vehicle 12, such as a bumper of vehicle 12. In some examples, ball seat 24 may be coupled with a receiver 28 secured to vehicle 12.
As shown in fig. 1, the vehicle 12 is illustratively embodied as a pick-up truck having a truck cargo bed 30, the truck cargo bed 30 being accessible via a rotatable tailgate 32. Hitch ball 26 may be received by hitch coupler 16 in the form of a coupler ball socket 34, which coupler ball socket 34 is disposed at an end portion of trailer coupler 16. The trailer 18 is illustratively embodied as a single axle trailer with the coupler 16 extending longitudinally therefrom. It should be appreciated that additional examples of the trailer 18 may alternatively be coupled with the vehicle 12 to provide a pivotal connection, such as by connecting with a fifth wheel connector. It is also contemplated that additional examples of the trailer 18 may include more than one axle, and may have various shapes and sizes configured for different loads and items, such as boat trailers or flatbed trailers, without departing from the teachings provided herein.
With respect to the general operation of the hitch assistance system 10, as shown in FIG. 2, the hitch assistance system 10 includes a sensing system 46 that includes various sensors and devices that obtain or otherwise provide information related to the state of the vehicle. For example, in some cases, the sensing system 46 incorporates an imaging system 36 that includes one or more external imagers 38, 40, 42, 44 or any other vision-based device. The one or more imagers 38, 40, 42, 44 each include an area-type image sensor, such as a CCD or CMOS image sensor, and image capture optics that capture an image of an imaging field of view (e.g., fields of view 48, 50, 52a, 52b of fig. 5) defined by the image capture optics. In some cases, the one or more imagers 38, 40, 42, 44 may derive an image patch 54 from a plurality of image frames that may be displayed on the display 118. In various examples, the hitch assistance system 10 may include any one or more of a Center High Mounted Stop Lamp (CHMSL) imager 38, a rear imager 40, a left side-view imager 42, and/or a right side-view imager 44, although other arrangements including additional or alternative imagers are possible without departing from the scope of this disclosure.
In some examples, the imaging system 36 may include the rear imager 40 alone, or may be configured such that the hitch assistance system 10 utilizes only the rear imager 40 in a vehicle 12 having multiple external imagers 38, 40, 42, 44. In some cases, the various imagers 38, 40, 42, 44 included in the imaging system 36 may be positioned so that their respective fields of view substantially overlap; in the arrangement depicted in fig. 5, these include fields of view 48, 50, 52a, 52b corresponding to the CHMSL imager 38, the rear imager 40, and the side imagers 42 and 44, respectively. In this manner, image data 56 from two or more of the imagers 38, 40, 42, 44 may be combined into a single image or image patch 54 in the image/signal processing program 58 or in another dedicated image/signal processor within the imaging system 36. In an extension of such examples, the image data 56 may be used to derive stereoscopic image data 56, which may be used to reconstruct a three-dimensional scene of one or more regions within the overlapping portions of the respective fields of view 48, 50, 52a, 52b, including any objects therein (e.g., obstacles or the coupler 16).
In some examples, given the known spatial relationship between the imagers 38, 40, 42, 44, two images containing the same object may be used to determine the position of the object relative to the two imagers 38, 40, 42 and/or 44 through the projection geometry of the imagers 38, 40, 42, 44. In this regard, the image/signal processing program 58 may use known programming and/or functionality to identify objects within the image data 56 from the various imagers 38, 40, 42, 44 within the imaging system 36. The image/signal processing program 58 may include information related to the location of any imagers 38, 40, 42, 44 present on the vehicle 12 or used by the hitch assistance system 10, including the location relative to the center 62 (fig. 1) of the vehicle 12. For example, the position of the imagers 38, 40, 42, 44 relative to the center 62 of the vehicle 12 and/or each other may be used for object location calculations and to generate object location data relative to, for example, the center 62 of the vehicle 12 or other features of the vehicle 12, such as the hitch ball 26 (fig. 1), whose position relative to the center 62 of the vehicle 12 is known, in a manner similar to that described in commonly assigned U.S. Patent Application No. 15/708,427, filed September 19, 2017 and entitled "Hitch Assist System with Hitch Coupler Identification Feature and Hitch Coupler Height Estimation," the entire disclosure of which is incorporated herein by reference.
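For a pair of imagers with a known spatial relationship, the projection-geometry step reduces, in the idealized case of a rectified stereo pair, to triangulating depth from disparity. A minimal sketch under that assumption (the rectified-pair model is an illustration of the general principle, not the patent's stated method):

```python
def stereo_depth(focal_px, baseline_m, x_left_px, x_right_px):
    """Depth of a point seen by two horizontally offset, rectified
    imagers: Z = f * B / disparity, where f is the focal length in
    pixels, B the baseline in meters, and disparity the horizontal
    pixel offset of the same feature between the two images."""
    disparity = x_left_px - x_right_px
    return focal_px * baseline_m / disparity
```

With an 800-pixel focal length, a 0.5 m baseline, and a 40-pixel disparity, the feature (e.g., a coupler) lies about 10 m from the imagers.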
With further reference to fig. 1 and 2, the proximity sensor 60, or an array thereof, and/or other vehicle sensors 70 may provide sensor signals that the controller 14 of the hitch assistance system 10 processes using various routines to detect objects proximate the vehicle 12, the trailer 18, and/or the coupler 16 of the trailer 18. The proximity sensor 60 may also be used to determine the height and position of the coupler 16. The proximity sensor 60 may be configured as any type of sensor, such as an ultrasonic sensor, a radio detection and ranging (RADAR) sensor, a sound navigation and ranging (SONAR) sensor, a light detection and ranging (LIDAR) sensor, a vision-based sensor, and/or any other type of sensor known in the art.
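A minimal sketch of how range returns from such a proximity sensor could populate a grid map of nearby features, assuming a single sensor at the grid centre and noiseless (bearing, range) pairs; real systems fuse many scans over time and model sensor uncertainty. Names and conventions here are illustrative, not from the patent.

```python
import math

def range_scan_to_grid(scan, cell_size_m, grid_dim):
    """Mark the grid cell hit by each (bearing_rad, range_m) return
    from a sensor located at the grid centre.  bearing 0 points along
    the +x axis; rows index y, columns index x."""
    grid = [[0] * grid_dim for _ in range(grid_dim)]
    half = grid_dim // 2
    for bearing, rng in scan:
        col = half + int(round(rng * math.cos(bearing) / cell_size_m))
        row = half + int(round(rng * math.sin(bearing) / cell_size_m))
        if 0 <= row < grid_dim and 0 <= col < grid_dim:
            grid[row][col] = 1  # occupied / feature cell
    return grid
```

A return at bearing 0 and range 1 m, with 0.5 m cells on a 10×10 grid, marks the cell two columns to the right of centre; clusters of marked cells can then be treated as candidate trailer features.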
Still referring to fig. 1 and 2, the positioning system 66 may include a dead reckoning device 68, or additionally or alternatively, a Global Positioning System (GPS) that determines the coordinate position of the vehicle 12. For example, the dead reckoning device 68 may establish and track the coordinate position of the vehicle 12 within the local coordinate system based at least on the vehicle speed and/or the steering angle δ (FIG. 3). The controller 14 may also be operatively coupled with various vehicle sensors 70, such as a speed sensor 72 and a yaw rate sensor 74. Additionally, the controller 14 may communicate with one or more gyroscopes 76 and accelerometers 78 to measure the position, orientation, direction, and/or velocity of the vehicle 12.
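Dead reckoning from vehicle speed and steering angle is commonly modeled with the kinematic bicycle model. The sketch below shows one integration step of that model; it illustrates the general idea of a dead reckoning device and is not the patent's implementation (parameter names are made up).

```python
import math

def dead_reckon(pose, speed, steering_angle, wheelbase, dt):
    """One kinematic-bicycle-model step: propagate (x, y, heading)
    from wheel speed and steering angle over a timestep dt."""
    x, y, theta = pose
    x += speed * math.cos(theta) * dt
    y += speed * math.sin(theta) * dt
    theta += (speed / wheelbase) * math.tan(steering_angle) * dt
    return (x, y, theta)
```

Driving straight (zero steering angle) at 1 m/s for 1 s advances the tracked position by 1 m along the current heading while leaving the heading unchanged.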
To enable autonomous or semi-autonomous control of the vehicle 12, the controller 14 of the hitch assistance system 10 may also be configured to communicate with a variety of vehicle systems. According to some examples, the controller 14 of the hitch assistance system 10 may control a power assisted steering system 80 of the vehicle 12 to operate the steered wheels 82 of the vehicle 12 as the vehicle 12 moves along the vehicle path 20. The power assisted steering system 80 may be an Electric Power Assisted Steering (EPAS) system that includes an electric steering motor 84 for turning the steered wheels 82 to a steering angle δ based on steering commands generated by the controller 14, whereby the steering angle δ may be sensed by a steering angle sensor 86 of the power assisted steering system 80 and provided to the controller 14. As described herein, steering commands may be provided for autonomously steering the vehicle 12 during maneuvering, and may alternatively be provided manually via a rotational position (e.g., steering wheel angle) of a steering wheel 88 (fig. 3) or via a steering input device 90, which may be provided to enable the driver to control or otherwise modify the desired curvature of the path 20 of the vehicle 12. The steering input device 90 may be communicatively coupled to the controller 14 in a wired or wireless manner and provide the controller 14 with information defining the desired curvature of the path 20 of the vehicle 12. In response, the controller 14 processes the information and generates a corresponding steering command that is supplied to the power assisted steering system 80 of the vehicle 12. In some examples, the steering input device 90 includes a rotatable knob 92 operable between a plurality of rotational positions, each rotational position providing an incremental change in the desired curvature of the path 20 of the vehicle 12.
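A desired-curvature request, such as one derived from the knob's rotational position, maps to a steering angle through the standard kinematic bicycle relation κ = tan(δ)/L, where L is the wheelbase. A one-line sketch of that conversion (symbols and names are illustrative, not from the patent):

```python
import math

def steering_angle_for_curvature(curvature, wheelbase):
    """Invert the kinematic bicycle relation curvature = tan(delta) / L
    to get the steering angle command delta for a requested path
    curvature (1/m) and wheelbase L (m)."""
    return math.atan(curvature * wheelbase)
```

Requesting the curvature that a 0.3 rad steering angle produces on a 3 m wheelbase returns exactly 0.3 rad, confirming the two relations are inverses.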
In some examples, the steering wheel 88 of the vehicle 12 may be mechanically coupled with the steered wheels 82 of the vehicle 12 such that the steering wheel 88 moves in unison with the steered wheels 82 via internal torque during autonomous steering of the vehicle 12. In such instances, the power assisted steering system 80 may include a torque sensor 94 that senses torque (e.g., from gripping and/or turning the wheel) applied to the steering wheel 88 that is not expected during autonomous control of the steering wheel 88 and therefore indicates manual intervention by the driver. In some examples, external torque applied to the steering wheel 88 may serve as a signal to the controller 14 that the driver has taken manual control, causing the hitch assistance system 10 to interrupt the autonomous steering function. However, as provided in more detail below, the hitch assistance system 10 may continue one or more functions/operations while autonomous steering of the vehicle is interrupted.
The controller 14 of the hitch assistance system 10 may also communicate with a vehicle brake control system 96 of the vehicle 12 to receive vehicle speed information, such as the speeds of the individual wheels of the vehicle 12. Additionally or alternatively, vehicle speed information may be provided to the controller 14 by a powertrain control system 98 and/or the vehicle speed sensor 72, among other conceivable devices. The powertrain control system 98 may include a throttle 100 and a transmission system 102. A gear selector 104 may be disposed within the transmission system 102 to control the operating mode of the transmission system 102 through one or more gears of the transmission system 102. In some examples, the controller 14 may provide a brake command to the vehicle brake control system 96, allowing the hitch assistance system 10 to regulate the speed of the vehicle 12 during maneuvering of the vehicle 12. It should be appreciated that the controller 14 may additionally or alternatively regulate the speed of the vehicle 12 via interaction with the powertrain control system 98.
By interacting with the power assisted steering system 80, the vehicle brake control system 96, and/or the powertrain control system 98 of the vehicle 12, the likelihood of unacceptable conditions may be reduced as the vehicle 12 moves along the path 20. Examples of unacceptable conditions include, but are not limited to, a vehicle overspeed condition, a sensor failure, and the like. In such circumstances, the driver may be unaware of the failure until an unacceptable backing condition is imminent or already occurring. Accordingly, it is disclosed herein that the controller 14 of the hitch assist system 10 may generate an alert signal corresponding to a notification of an actual, impending, and/or anticipated unacceptable backing condition, and may initiate countermeasures prior to driver intervention to prevent such an unacceptable backing condition from occurring.
According to some examples, the controller 14 may communicate with one or more devices, including a vehicle notification system 106, that may provide visual, audible, and tactile notifications and/or warnings. For example, vehicle brake lights 108 and/or vehicle emergency flashers may provide a visual alert, and a vehicle horn 110 and/or speaker 112 may provide an audible alert. Additionally, the controller 14 and/or the vehicle notification system 106 may communicate with a user input device, such as a human-machine interface (HMI) 114 of the vehicle 12. The HMI 114 may include a touch screen 116 or other user input device, such as a navigation and/or entertainment display 118 mounted within the cockpit module, the instrument cluster, and/or any other location within the vehicle 12 capable of displaying images indicative of the alert.
In some cases, HMI 114 also includes an input device that may be implemented by configuring display 118 as part of touch screen 116 with circuitry 120 to receive input corresponding to a location on display 118. Other forms of input, including one or more joysticks, digital input pads, etc., may be used in place of or in addition to the touch screen 116.
Further, the hitch assistance system 10 may communicate with some instances of the HMI 114 and/or with one or more handheld or portable devices 122 (fig. 1) via wired and/or wireless communication, which handheld or portable devices 122 may additionally and/or alternatively be configured as user input devices. The network can be one or more of a variety of wired or wireless communication mechanisms, including any desired combination of wired (e.g., cable and fiber) and/or wireless (e.g., cellular, wireless, satellite, microwave, and radio frequency) communication mechanisms, as well as any desired network topology (or topologies when multiple communication mechanisms are utilized). Exemplary wireless communication networks include wireless transceivers (e.g., bluetooth modules, ZIGBEE transceivers, Wi-Fi transceivers, IrDA transceivers, RFID transceivers, etc.), Local Area Networks (LANs), and/or Wide Area Networks (WANs), including the internet, that provide data communication services.
The portable device 122 may also include a display 118 to display one or more images and other information to the user U. For example, the portable device 122 may display one or more images of the trailer 18 on the display 118, and may also be capable of receiving remote user input via the touch screen circuitry 120. In addition, the portable device 122 may provide feedback information, such as visual, audible, and tactile alerts. It should be understood that the portable device 122 may be any of a variety of computing devices and may include a processor and memory. For example, the portable device 122 may be a cell phone, a mobile communication device, a key fob, a wearable device (e.g., a workout bracelet, a watch, glasses, jewelry, a purse), a garment (e.g., a T-shirt, a glove, a shoe, or other accessory), a personal digital assistant, a headset, and/or other devices that include wireless communication protocol and/or any wired communication protocol capability.
The controller 14 is configured with a microprocessor 124 and/or other analog and/or digital circuitry for processing one or more logic routines stored in a memory 126. The logic routines may include one or more programs, including the image/signal processing program 58, the hitch detection program, the path derivation program 128, and the operating program 130. Information from the imager 40 or other components of the sensing system 46 may be supplied to the controller 14 via a communication network of the vehicle 12, which may include a Controller Area Network (CAN), a Local Interconnect Network (LIN), or other protocols used in the automotive industry. It should be appreciated that the controller 14 may be a stand-alone dedicated controller or may be a shared controller integrated with the imager 40 or other components of the hitch assist system 10, in addition to any other conceivable on-board or off-board vehicle control system.
The controller 14 may include any combination of software and/or processing circuitry suitable for controlling the various components of the hitch assistance system 10 described herein, including but not limited to microprocessors, microcontrollers, application specific integrated circuits, programmable gate arrays, and any other digital and/or analog components, as well as combinations of the foregoing, as well as inputs and outputs for transceiving control signals, drive signals, power signals, sensor signals, and the like. All such computing devices and environments are intended to fall within the meaning of the terms "controller" or "processor" as used herein unless a different meaning is explicitly provided or otherwise clear from the context.
With further reference to fig. 2-6, the controller 14 may generate vehicle steering information and commands based on all or a portion of the received information. Thereafter, the vehicle steering information and commands may be provided to the power assisted steering system 80 for affecting steering of the vehicle 12 to achieve the commanded travel path 20 for alignment with the coupling 16 of the trailer 18. It should also be understood that the image/signal processing routine 58 may be executed by a dedicated processor, for example, within the vehicle's 12 independent imaging system 36, which may output the results of its image/signal processing to other components and systems of the vehicle 12, including the microprocessor 124. Moreover, any system, computer, processor, etc. that performs image/signal processing functionality such as that described herein may be referred to herein as an "image/signal processor," regardless of other functionality that may also be implemented (including functionality that is implemented concurrently with execution of image/signal processing program 58).
In some examples, the image/signal processing program 58 may be programmed or otherwise configured to locate the coupler 16 within the image data 56. In some cases, the image/signal processing program 58 may identify the coupler 16 within the image data 56 based on stored or otherwise known visual characteristics of couplers 16 or hitches in general. In some cases, a marker in the form of a sticker or the like may be affixed to the trailer 18 in a designated location relative to the coupler 16, in a manner similar to that described in commonly assigned U.S. patent No. 9,102,271, entitled "TRAILER MONITORING SYSTEM AND METHOD," the entire disclosure of which is incorporated herein by reference. In such examples, the image/signal processing program 58 may be programmed to identify the characteristics of the marker within the image data 56, as well as the positioning of the coupler 16 relative to such a marker, such that the position of the coupler 16 may be determined based on the marker location. Additionally or alternatively, the controller 14 may seek confirmation of the coupler 16 via a prompt on the touch screen 116 and/or the portable device 122. If the determination of the coupler 16 is not confirmed, further image/signal processing may be provided, or the touch screen 116 or another input may be used to facilitate user adjustment of the position 134 of the coupler 16: the user U may move the depicted position 134 of the coupler 16 on the touch screen 116, which the controller 14 uses to adjust its determination of the position 134 of the coupler 16 relative to the vehicle 12 based on the above-described use of the image data 56. Alternatively, the user U may visually determine the position 134 of the coupler 16 within the image presented on the HMI 114 and may provide a touch input in a manner similar to that described in co-pending, commonly assigned U.S. patent application No. 
15/583,014, filed on 5/1/2017 and entitled "SYSTEM TO AUTOMATE HITCHING A TRAILER," the entire disclosure of which is incorporated herein by reference. The image/signal processing program 58 may then associate the location of the touch input with the coordinate system applied to the image patch 54.
As shown in fig. 3-6, in some illustrative examples of the hitch assistance system 10, the image/signal processing routine 58 and the operating routine 130 may be used in conjunction with one another to determine the path 20 along which the hitch assistance system 10 may guide the vehicle 12 to align the hitch ball 26 with the coupler 16 of the trailer 18. In the illustrated example, the initial position of the vehicle 12 relative to the trailer 18 may be such that the coupler 16 is located in the field of view 52a of the side imager 42, with the vehicle 12 positioned laterally relative to the trailer 18 but with the coupler 16 nearly longitudinally aligned with the hitch ball 26. In this manner, upon activation of the hitch assistance system 10, such as through user input on the touch screen 116, the image/signal processing program 58 may identify the coupler 16 within the image data 56 of the imager 42 and estimate the position 134 of the coupler 16 relative to the hitch ball 26 using the image data 56, in accordance with the examples above or by other known means, including by receiving focus information within the image data 56, to determine the distance Dc to the coupler 16 and the offset angle αc between the coupler 16 and the longitudinal axis of the vehicle 12. Once the positioning Dc, αc of the coupler 16 has been determined, and optionally confirmed by the user U, the controller 14 may control at least the vehicle steering system 80 to control movement of the vehicle 12 along the desired path 20 to align the vehicle hitch ball 26 with the coupler 16.
With continued reference to FIG. 3, having estimated the positioning Dc, αc of the coupler 16 as discussed above, the controller 14 (FIG. 2) may in some examples execute the path derivation routine 128 to determine the vehicle path 20 to align the vehicle hitch ball 26 with the coupler 16. The controller 14 may store various characteristics of the vehicle 12, including the wheelbase W, the distance D from the rear axle to the hitch ball 26 (referred to herein as the drawbar length), and the maximum angle δmax to which the steerable wheels 82 may be turned. As shown, the wheelbase W and the current steering angle δ may be used to determine a corresponding turning radius ρ of the vehicle 12 according to the following equation:
ρ = W / tan δ
wherein the wheelbase W is fixed and the steering angle δ may be controlled by the controller 14 in communication with the steering system 80, as discussed above. In this way, with the maximum steering angle δmax known, the minimum possible turning radius ρmin is determined as:
ρmin = W / tan δmax
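The two relationships above follow the standard kinematic bicycle model and can be sketched directly; the wheelbase and angle values below are illustrative, not taken from the patent:

```python
import math

def turning_radius(wheelbase_m, steering_angle_rad):
    # rho = W / tan(delta), per the bicycle-model equation above
    return wheelbase_m / math.tan(steering_angle_rad)

W = 3.0                                 # wheelbase W (assumed value, metres)
delta_max = math.radians(32.0)          # assumed maximum steerable-wheel angle
rho = turning_radius(W, math.radians(15.0))   # radius at the current angle
rho_min = turning_radius(W, delta_max)        # minimum possible radius
print(rho, rho_min)
```

A larger steering angle always yields a smaller radius, which is why the minimum radius ρmin occurs at the full-lock angle δmax.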
the path derivation program 128 may be programmed to derive the vehicle path 20 to align the known position of the vehicle hitch ball 26 with the estimated position 134 of the coupler 16, which takes into account the determined minimum turning radius ρminThis may allow path 20 to use a minimum amount of space and steering. In this manner, the path derivation program 128 can use the position of the vehicle 12 (which can be based on the center 62 of the vehicle 12, the position along the rear axle, the position of the dead reckoning device 68, or another known position on the coordinate system) to determine the lateral distance from the coupling 16 and the distance to the coupling 16, and derive a path 20 that enables lateral and/or fore-aft movement of the vehicle 12 within the limits of the steering system 80, by the path 20. The derivation of the path 20 also takes into account the location of the hitch ball 26 relative to the tracked position of the vehicle 12 (which may correspond to the center of mass 62 of the vehicle 12, the location of a GPS receiver, or another designated known area) to determine the desired location of the vehicle 12 to align the hitch ball 26 with the coupler 16.
Once the projected path 20, including the end point 132, has been determined, the controller 14 may control at least the steering system 80 of the vehicle 12, while the powertrain control system 98 and the brake control system 96 (whether operated by the driver or by the controller 14) control the speed (forward or reverse) of the vehicle 12. In this manner, the controller 14 may receive data from the positioning system 66 regarding the position of the vehicle 12 during its movement while controlling the steering system 80 to maintain the vehicle 12 along the path 20. The path 20, having been determined based on the geometry of the vehicle 12 and the steering system 80, may dictate the steering angle δ according to the position of the vehicle 12 along the path 20.
As shown in fig. 3, the initial positioning of the trailer 18 relative to the vehicle 12 may be such that forward movement of the vehicle 12 is required for the desired vehicle path 20, such as when the trailer 18 is laterally offset to the side of the vehicle 12. In this manner, the path 20 may include various segments 136 of forward and/or rearward travel of the vehicle 12, the segments 136 separated by inflection points 138 at which the vehicle 12 transitions between forward and rearward movement. As used herein, an "inflection point" is any point along the vehicle path 20 where a vehicle condition changes. Vehicle conditions include, but are not limited to, changes in speed, changes in steering angle δ, changes in vehicle direction, and/or any other possible vehicle condition that may be adjusted. For example, if the vehicle speed is changed, the inflection point 138 may be located at the position where the speed changes. In some examples, the path derivation program 128 may be configured to include a straight reversing segment 136 of a defined distance before reaching the point at which the hitch ball 26 is aligned with the position 134 of the coupler 16. The remaining segments 136 may be determined to effect the needed lateral and forward/rearward movement within the smallest possible area and/or with the minimum total number of segments 136 or inflection points 138. In the illustrated example of fig. 3, the path 20 may include two segments 136 that collectively traverse the needed lateral movement of the vehicle 12 while providing a final straight reversing segment 136 to bring the hitch ball 26 to the offset position 134 of the coupler 16: one segment includes forward travel at the maximum steering angle δmax in the right-turn direction, and the other segment includes forward travel at the maximum steering angle δmax in the left-turn direction.
An inflection point 138 is then included, at which the vehicle 12 transitions from forward travel to rearward travel, followed by the straight reversing segment 136 mentioned previously. It should be noted that variations of the depicted path 20 may be used, including a single forward-driving segment 136 at a right steering angle δ less than the maximum steering angle δmax, followed by an inflection point 138 and a rearward-driving segment 136 at the maximum left steering angle δmax ending with a shorter straight reversing segment 136; still other paths 20 are possible.
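The notion of an inflection point defined above — any sample along the path where a vehicle condition changes — can be sketched as a simple scan over per-sample states. The state encoding (direction of travel plus steering angle) and the values below are illustrative only:

```python
def find_inflection_points(states):
    """states: list of (direction, steering_angle) samples along the path.
    direction is +1 for forward travel, -1 for reverse.
    Returns the indices at which any vehicle condition changes."""
    inflections = []
    for i in range(1, len(states)):
        if states[i] != states[i - 1]:      # any condition change
            inflections.append(i)
    return inflections

# Forward at full right lock, forward at full left lock, then reverse straight:
path = [(+1, 0.55), (+1, 0.55), (+1, -0.55), (-1, 0.0), (-1, 0.0)]
print(find_inflection_points(path))  # indices marking the segment boundaries
```

Here a steering-angle change and a forward/reverse transition each produce an inflection point, matching the definition in the text.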
In some cases, the hitch assistance system 10 may be configured to operate only with the vehicle 12 in reverse, in which case the hitch assistance system 10 may prompt the driver to drive the vehicle 12 as needed to position the trailer 18 in a designated area relative to the vehicle 12, including to the rear thereof, so that the path derivation program 128 may determine a vehicle path 20 consisting of rearward driving. Such instructions may further prompt the driver to position the vehicle 12 relative to the trailer 18 to compensate for other limitations of the hitch assistance system 10, including a specific distance required to identify the coupler 16, a minimum offset angle αc, and the like. It should also be noted that the estimates of the positioning Dc, αc of the coupler 16 may become more accurate as the vehicle 12 traverses the path 20, including as the vehicle 12 moves forward of the trailer 18 and as the vehicle 12 approaches the coupler 16. Accordingly, such estimates may be derived and used, when needed, to update the path derivation program 128 in determining an adjusted end point 132 of the path 20.
Referring to fig. 5 and 6, a strategy for determining the initial end point 132 of the vehicle path 20, which places the hitch ball 26 in a projected position to align with the coupler 16, includes calculating the actual or approximate trajectory along which the coupler 16 moves onto the hitch ball 26 as the coupler 16 is lowered. The initial end point 132 is then derived, as described above or otherwise, to place the hitch ball 26 at the desired location 140 on that trajectory. In practice, this is done by determining the difference between the height Hc of the coupler 16 and the height Hhb of the hitch ball 26, which represents the vertical distance through which the coupler 16 will be lowered to engage the hitch ball 26. The determined trajectory is then used to correlate that vertical distance with a corresponding horizontal distance Δx that the coupler 16 moves in the direction of travel as a result of the vertical drop. This horizontal distance Δx may be input to the path derivation program 128 as its desired initial end point 132, or may be used as an offset to the initial end point 132 derived from the initially determined position 134 of the coupler 16 at the end of a straight reversing segment 136 of the path 20, as shown in fig. 3.
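As a rough sketch of the trajectory step described above, assume (this geometry is an illustrative simplification, not stated in the text) that the coupler swings on a circular arc of radius R about a ground-level pivot at the trailer wheels as the tongue is lowered. The vertical drop from Hc to Hhb then maps to a horizontal advance Δx:

```python
import math

def horizontal_offset(R, H_c, H_hb):
    """Horizontal distance the coupler advances while being lowered from
    height H_c to height H_hb, assuming it swings on an arc of radius R
    about a ground-level pivot (all values in metres; R is hypothetical)."""
    theta0 = math.asin(H_c / R)       # tongue angle at the raised height
    theta1 = math.asin(H_hb / R)      # tongue angle at the hitch ball height
    return R * (math.cos(theta1) - math.cos(theta0))

dx = horizontal_offset(R=3.0, H_c=0.55, H_hb=0.45)
print(dx)  # metres the coupler advances as it is lowered
```

The offset is small for shallow drops, which is consistent with treating Δx as a correction to the end point 132 rather than a new path.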
Referring again to fig. 5 and 6, the operating program 130 may continue to guide the vehicle 12 until the hitch ball 26 is at the desired final end point 140 relative to the coupler 16, such that the coupler 16 engages the hitch ball 26 when the coupler 16 is lowered into alignment and/or engagement with it. In the examples discussed above, the image/signal processing routine 58 monitors the positioning Dc, αc of the coupler 16 during execution of the operating routine 130, including as the coupler 16 comes into the clearer view of the rear imager 40 while the vehicle 12 continues to move along the path 20. As described above, the position of the vehicle 12 may also be monitored by the dead reckoning device 68, and the path 20 and/or the initial end point 132 may be refined or updated (due to, for example, improved height Hc, distance Dc, or offset angle αc information from closer-range resolution or additional image data 56), including updating the position 134 of the coupler 16 as the vehicle 12 moves closer to the trailer 18 and feeding it into the path derivation program 128. In some cases, the coupler 16 may be assumed to be static, such that the position of the vehicle 12 may be tracked by continuing to track the coupler 16, eliminating the need for the dead reckoning device 68. In a similar manner, a modified variation of the operating program 130 may perform a predetermined sequence of maneuvers, involving steering the vehicle 12 at or below the maximum steering angle δmax, while tracking the position Dc, αc of the coupler 16 to converge the known position of the hitch ball 26 toward its desired final end point 140 relative to the tracked position 134 of the coupler 16.
Referring to fig. 7-9, in some environments, snow, rain, and/or other obstructions may reduce the accuracy of vehicle sensors (such as the imagers 38, 40, 42, 44) operating at wavelengths in the 400 nm to 900 nm range, as the waves received by these sensors may be at least partially blocked by the obstructions. Thus, in some examples, the trailer assist system 10 may utilize a proximity sensor, such as the radar sensor 64, that may operate successfully through most snow, rain, or dust without substantial impact on its ability to detect the trailer 18 and/or the coupler 16. The proximity sensor may also be used to detect various other objects in proximity to the vehicle 12 before and/or during any trailer assist operation. It should be appreciated that any other sensor capable of providing information to the hitch assistance system 10 under high and/or low visibility conditions may be used in conjunction with or in place of the radar sensor 64.
Generally, the radar sensor 64 operates by emitting radio signals and detecting reflections of objects. In some examples, the radar sensor 64 may be used to detect physical objects, such as the trailer 18 (or portions of the trailer 18), the coupler 16, other vehicles, landscapes (such as trees, cliffs, rocks, hills, etc.), road edges, signs, buildings, or other objects. The radar sensor 64 may use the reflected radio waves to determine size, shape, distance, surface texture, or other information about the physical object or material.
With further reference to fig. 7-9, the radar sensor 64 may scan an area to obtain data regarding objects within a field of view 64a, 64b, 64c, 64d of the radar sensor 64 having a predefined range and angle of view. In some examples, the radar sensor 64 is configured to generate perception information from an area proximate the vehicle 12 (such as one or more areas near or around the rear of the vehicle 12). In some examples, the radar sensor 64 may provide sensory data, including a two- or three-dimensional map or model, to the hitch assistance system 10 for reference or processing. Further, the radar sensor 64 may operate in some of the most severe weather and/or nighttime conditions with little or no degradation in the quality or accuracy of the sensory data. For example, wet surfaces, snow, and fog may have little effect on the ability of the radar sensor 64 to accurately locate and range objects. Thus, in some cases, the radar sensor 64 may function as a secondary detection system in high visibility environments and as a primary detection system when the vehicle 12 is operating in low visibility environments.
In some cases, as exemplarily shown in fig. 8, an environment occupancy grid map 142 may be generated from the received proximity sensor signals, the environment occupancy grid map 142 being formed by environment occupancy grid abstractions, which may be defined in cartesian coordinates with respect to the orientation of the vehicle such that, for example, the X-axis is vehicle left-right, the Y-axis is vehicle forward/reverse, and the Z-axis is upward. It should be understood that the coordinate system may be cylindrical coordinates having a range, angle, and height relative to the current orientation of the vehicle, and/or that the occupancy grid 142 may be converted to other coordinate systems for use by an operator without departing from the teachings provided herein.
The occupancy grid map 142 may be formed by dividing the environment into a grid of discrete occupancy cells and assigning to each cell a probability indicating whether the cell is occupied by an object. Initially, the occupancy grid may be set such that each occupancy cell is assigned an initial probability. As the vehicle 12 scans the environment through the sensing system 46, the range data formed from the scan may be used to update the occupancy grid. For example, based on the range data, the vehicle 12 may detect an object at a particular orientation and range from the vehicle 12. The range data may be converted to a different coordinate system (e.g., local or world Cartesian coordinates). As a result of this detection, the vehicle 12 may increase the probability that the particular occupancy cell is occupied and decrease the probability that the occupancy cells between the vehicle 12 and the detected object are occupied. As the vehicle 12 moves through its environment, new areas may be exposed to the vehicle sensors, which enables the occupancy grid to be expanded and enhanced.
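A minimal log-odds sketch of the update rule just described — raise the probability of the cell holding the detection, lower the cells traversed on the way to it. The cell indexing, the log-odds increments, and the simple ray-sampling scheme are all illustrative assumptions, not the patent's implementation:

```python
import math

L_OCC, L_FREE = 0.85, -0.4   # assumed log-odds increments (tuning values)

def cells_between(x0, y0, x1, y1):
    """Integer cells crossed between sensor origin and hit, excluding the hit."""
    n = max(abs(x1 - x0), abs(y1 - y0))
    if n == 0:
        return set()
    cells = set()
    for step in range(n):                  # sample along the ray
        t = step / n
        cells.add((round(x0 + t * (x1 - x0)), round(y0 + t * (y1 - y0))))
    cells.discard((x1, y1))
    return cells

def update(grid, origin, hit):
    # Cells between sensor and object become more likely free...
    for c in cells_between(*origin, *hit):
        grid[c] = grid.get(c, 0.0) + L_FREE
    # ...and the cell holding the detection becomes more likely occupied.
    grid[hit] = grid.get(hit, 0.0) + L_OCC

def probability(grid, cell):
    # Unvisited cells sit at log-odds 0, i.e. the initial probability 0.5.
    return 1.0 / (1.0 + math.exp(-grid.get(cell, 0.0)))

grid = {}
update(grid, (0, 0), (0, 6))   # e.g. an object detected 6 cells behind
print(probability(grid, (0, 6)), probability(grid, (0, 3)))
```

Repeated detections of the same cell accumulate in log-odds space, which is what lets the grid be "expanded and enhanced" as the vehicle moves.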
In some examples, the controller 14 monitors the environment near the vehicle 12 as the proximity sensor signals are provided. Next, areas in the occupancy grid map 142, or in the image patch 54 in examples that additionally and/or alternatively use the imagers 38, 40, 42, 44, are analyzed, and features 144 or patterns in the data that indicate objects in the grid map 142 and/or the image patch 54 are extracted. The extracted features 144 are then classified according to any number of classifiers. Exemplary classifications may include classification as a trailer 18, a coupler 16, a moving object (such as another vehicle), and/or a stationary object (such as a street sign). The data, including the classifications, is then analyzed according to data correlations to form a feature extraction database 146 (FIG. 2). The data of the feature extraction database 146 is then stored for iterative comparison with new data and to predict the likelihood that a trailer 18 is proximate the vehicle 12. The controller 14 may calculate the features 144 of the feature extraction database 146 using transforms such as edges, Histogram of Oriented Gradients (HOG), Scale-Invariant Feature Transform (SIFT), the Harris corner detector, blocks projected onto a linear subspace, and/or any other feasible transform or detector algorithm. In some examples, machine learning algorithms may be programmed to adaptively assign weights and emphasis to alternative calculations based on the nature of the feedback. In addition, fuzzy logic may be used to adjust the input to the system according to a scalability factor based on the feedback. In this way, the accuracy of the system may improve over time and based on the operator's particular driving habits.
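Of the transforms listed, the gradient-orientation histogram is easy to sketch on a tiny intensity patch. This is a toy illustration of the idea, not the patent's feature pipeline: finite-difference gradients are binned by unsigned orientation and weighted by magnitude.

```python
import math

def orientation_histogram(patch, bins=8):
    """Histogram of gradient orientations over a 2D list of intensities."""
    h = [0.0] * bins
    rows, cols = len(patch), len(patch[0])
    for y in range(1, rows - 1):
        for x in range(1, cols - 1):
            gx = patch[y][x + 1] - patch[y][x - 1]   # horizontal gradient
            gy = patch[y + 1][x] - patch[y - 1][x]   # vertical gradient
            mag = math.hypot(gx, gy)
            if mag == 0.0:
                continue
            ang = math.atan2(gy, gx) % math.pi        # unsigned orientation
            h[min(int(ang / math.pi * bins), bins - 1)] += mag
    return h

# A patch with a purely vertical edge: all gradient energy lands in bin 0.
patch = [[0, 0, 9, 9]] * 4
print(orientation_histogram(patch))
```

A trailer's strong straight edges produce distinctive peaks in such a histogram, which is what makes it usable as an input feature for the classifiers mentioned above.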
Still referring to fig. 7-9, in some examples, by using the sensing system 46, the hitch assistance system 10 may be configured to perform simultaneous localization and mapping (SLAM) on the signals from the sensors to determine the position and alignment of the vehicle 12 relative to the trailer 18 and/or the coupler 16. SLAM is understood in this disclosure as a problem in which the position and alignment of the vehicle 12 relative to the trailer 18 and/or any other obstacle is initially unknown. In solving the SLAM problem, the position and alignment of the vehicle 12 and the position of the trailer 18 and/or the coupler 16 are determined simultaneously.
In some examples, the various proximity sensors included in the sensing system 46 may be positioned so as to substantially overlap in their respective fields of view, which in the arrangement depicted in fig. 7 include the fields of view 64a, 64b, 64c, 64d. In this manner, sensor signals from two or more proximity sensors 60 may be combined into a global frame in the image/signal processing program 58 or in another dedicated image/signal processor within the sensing system 46. In an extension of such examples, the sensor signals may be used to derive stereo data that may be used to reconstruct a three-dimensional scene of one or more regions within the overlapping portions of the respective fields of view 64a, 64b, 64c, 64d, including any objects therein (e.g., obstructions or the coupler 16).
In some cases, the trailer 18 may include a pair of points 148, 150 corresponding to the front outer corners or other features of the trailer 18. The hitch coupler 16 may be centrally disposed between the pair of points 148, 150 or outer corners at the front of the trailer 18. Thus, the hitch assistance system 10 may detect these points 148, 150, or any other desired points that may be found on the trailer 18 and therefore identified within the SLAM problem. Once these points 148, 150, 152 are determined, identified, located, and/or mapped relative to a global frame, which may be based on the center 62 of the vehicle 12 or any other coordinate system, the length L1 from the coupler 16 to the first point 148 or corner, the length L2 from the coupler 16 to the second point 150 or corner, and the length L3 between the first point 148 and the second point 150 may be used to determine the shape of the trailer 18, the coupler position, and/or the heading of the trailer 18 relative to the vehicle 12 according to the following equations:
cos θ148 = (L1² + L3² − L2²) / (2 L1 L3)

cos θ150 = (L2² + L3² − L1²) / (2 L2 L3)

cos θ152 = (L1² + L2² − L3²) / (2 L1 L2)
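Under the arrangement described above, in which the coupler sits centrally between the two mapped corners (so L1 = L2), the coupler position and a trailer heading estimate can be recovered from the corner coordinates and L1 alone. A sketch, in which the coordinates, the choice of which side of the corner line faces the vehicle, and all numeric values are illustrative assumptions:

```python
import math

def coupler_from_corners(p148, p150, L1):
    """Estimate the coupler position and trailer heading in the global frame
    from the two mapped front corners and the corner-to-coupler length L1
    (assumes L1 == L2, i.e. a coupler centered between the corners)."""
    (x1, y1), (x2, y2) = p148, p150
    L3 = math.hypot(x2 - x1, y2 - y1)             # corner-to-corner length
    mx, my = (x1 + x2) / 2.0, (y1 + y2) / 2.0     # midpoint of the corners
    h = math.sqrt(L1**2 - (L3 / 2.0)**2)          # perpendicular offset
    ux, uy = (x2 - x1) / L3, (y2 - y1) / L3       # unit vector along the front
    nx, ny = uy, -ux                              # normal (assumed to point
                                                  # toward the vehicle)
    coupler = (mx + h * nx, my + h * ny)
    heading = math.atan2(ny, nx)                  # tongue direction estimate
    return coupler, heading

coupler, heading = coupler_from_corners((-0.6, 5.0), (0.6, 5.0), 1.0)
print(coupler, heading)
```

The triangle formed by the three points is what makes the pattern distinguishable as a trailer; the perpendicular offset h exists only when L1 ≥ L3/2, i.e. when the measured lengths are geometrically consistent.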
referring to fig. 10, a method 154 of aligning hitch assembly 22 with coupler 16 is shown, according to some examples. Specifically, at step 156, the hitch assistance system 10 is started. Upon activation of the hitch assistance system 10, the method continues to step 158, where one or more proximity sensors on the vehicle 12 generate sensor signals that may be related to the position of an object in the field of view 64a, 64b, 64c, 64d (fig. 7) of the proximity sensor based on detection points 148, 150, 152 in the sensor signals. The sensor signals are provided to the controller 14 at step 160 to generate position data, map construction, and/or positioning/navigation (e.g., grid map 142 (fig. 8)) of objects proximate to the vehicle 12. The map construction uses measurements from the proximity sensors to measure and estimate the location of objects in the field of view 64a, 64b, 64c, 64d (FIG. 7) of the proximity sensors using techniques known to those of ordinary skill in the art. Localization/navigation uses techniques known to those skilled in the art to estimate the state of motion of the vehicle 12 and the location of objects in a global frame. For example, in some examples, an extended kalman filter is used to blend measurements from different sensors to estimate a state of motion of the vehicle 12 and/or position data of objects proximate to the vehicle 12. The different sensors may include, but are not limited to, different types of radar sensors 64 as described above that provide measurements of the state of motion of the vehicle 12. As used herein, the state of motion of the vehicle 12 refers to the position, speed, attitude (three-dimensional orientation), angular velocity, and/or position data of objects proximate to the vehicle 12 of the vehicle. The global frame is based on the frame of reference of the center 62 of the vehicle 12.
Once an object proximate the vehicle 12 is detected and mapped, the hitch assistance system 10 attempts to distinguish the points 148, 150, 152 within the collected data indicative of the trailer 18 and/or the coupler 16 by comparison to a database, as provided herein, through the SLAM process or any other feasible method. For example, as discussed herein, the pair of points 148, 150 or corners of the trailer 18 may be equally spaced laterally from the coupler 16, forming a triangular pattern. The pattern may indicate the trailer 18 and thus be distinguished by the hitch assist system 10. Such a pattern may be used to calculate one or more characteristics of the trailer 18, such as the trailer heading and/or the position of the coupler 16.
At step 162, projection vectors are formed between the detected positions or points 148, 150, 152 in the global frame. For example, projection vectors are formed between the positions of at least three points 148, 150, 152 indicative of the trailer 18 (and the coupler 16). The projection vectors may indicate the lengths to the coupler 16, the position of the coupler 16, and/or the heading direction of the coupler 16.
At step 164, the positions of the points 148, 150, 152 are calculated in the global frame by resolving the estimated positions of the objects relative to the vehicle 12 into the global frame. At step 166, the positions of the trailer 18 and/or coupler 16 and of the vehicle 12 are used to determine an offset between the hitch assembly 22 and the coupler 16. Once the offset is determined at step 166, the vehicle path 20 may be determined at step 168 using the path derivation program 128 to align the hitch ball 26 with the coupler 16. In this manner, the controller 14 uses the path derivation program 128 to determine the path 20 that brings the hitch ball 26 into a position overlapped by the coupler 16. Once the path 20 has been derived, the hitch assistance system 10 may ask the user U to relinquish control of at least the steering wheel 88 of the vehicle 12 (and, in various embodiments of the hitch assistance system 10 in which the controller 14 assumes control of the powertrain control system 98 and the brake control system 96 during execution of the operating program 130, optionally the throttle 100 and brake as well) while the vehicle 12 performs the automatic hitch operation at step 170. When it has been confirmed that the user U is not attempting to control the steering system 80 (e.g., using the torque sensor 94), the controller 14 begins to move the vehicle 12 along the determined path 20. Additionally, the hitch assist system 10 may determine whether the transmission system 102 is in the correct gear and may shift to the desired gear or prompt the user U to shift to the desired gear. The hitch assistance system 10 may then control the steering system 80 to maintain the vehicle 12 along the path 20 as the user U or the controller 14 controls the speed of the vehicle 12 using the powertrain control system 98 and the brake control system 96. Once the hitch ball 26 is aligned with the coupler 16, the method 154 ends at step 172.
Referring to fig. 11-13, in some examples, the rear imager 40 may be disposed within the tailgate 32 or any other rear portion of the vehicle 12 and configured to provide image data 56 behind the vehicle 12. The imager 40 may be capable of imaging a top view of the hitch ball 26 and may provide image data 56 to the controller 14 for use by the image processing program 58 (through the process described above or through other available processes) to determine the height Hhb of the hitch ball 26 and/or the length Lbm of the ball seat 24. Due to the proximity of the hitch assembly 22 to the rear imager 40, this is possible even in low visibility conditions. Once the height of the hitch ball 26 is determined, the hitch assist system 10 may store the heights Hhb of different hitch balls 26 for future use, so that it may not be necessary to measure the various characteristics of the ball seat 24 and hitch ball 26 during subsequent hitch assist operations.
Because of the wide variety of ball seats 24 and hitch balls 26 or connectors that may be used, the hitch assist system 10 may utilize one or more of the imagers 38, 40, 42, 44 to determine various characteristics of the hitch assembly 22, including the ball seat length Lbm and/or the hitch ball height Hhb. In some examples, during an initial setup procedure for the hitch assist system 10, the user U may be prompted to install the hitch assembly 22 on the vehicle 12 by assembling the ball seat 24, including the hitch ball 26, within the receiver 28 positioned on the rear of the vehicle 12.
If no hitch assembly is stored in the memory 126, or if a hitch assembly 22 attached to the vehicle 12 is not recognized when compared to any previously attached and recognized hitch assemblies, the user may be asked to enter the diameter of the hitch ball 26. The diameter Dhb of the hitch ball 26 is stored in the memory 126 and may be used as a reference length for determining various other measurements about the vehicle 12. In some examples, the imager 40 may capture a series of still images near the rear of the vehicle 12. As previously described, the images include portions of the vehicle 12, objects secured to the vehicle 12 (e.g., the hitch assembly 22), and/or noise (e.g., ground debris, animals, etc.).
The number of images captured and/or the time elapsed between capturing the images may be predetermined. As described below, the predetermined number of captured images may depend on pixel or intensity values of the averaged or combined image. The controller 14 may average or combine the captured images into an averaged or combined image patch 54 using the processor 124. Typically, the averaging process includes averaging the pixels of a series of captured images after optionally stabilizing the images (i.e., aligning them to account for slight vibrations of the imager 40). The controller 14 may find the outer edges of objects within the averaged or combined image by analyzing it. Generally, the analysis involves determining the interface between an item statically affixed to the vehicle 12 and the blurred background noise. The controller 14 associates particular pixels on the averaged image with real-world spatial locations using a known reference length, such as the diameter of the hitch ball 26, and may thus determine the size of an object defined by those pixels. For example, the length Lbm of ball seat 24 may be determined by comparing its pixel count to the pixel count spanning the diameter Dhb of hitch ball 26. However, it should be understood that the length Lbm of ball seat 24 may also be calculated by other sensors or modules in the vehicle 12 using any distance measurement technique.
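The averaging-and-edge-finding step described above can be sketched in a few lines. This is an illustrative approximation using NumPy; the array sizes, frame count, and the threshold-free gradient edge map are assumptions for the demo, not the patent's implementation:

```python
import numpy as np

def average_frames(frames):
    """Average a stack of grayscale frames so that static structure
    (vehicle body, hitch assembly) stays sharp while moving background
    noise blurs out."""
    stack = np.stack([f.astype(np.float64) for f in frames], axis=0)
    return stack.mean(axis=0)

def edge_strength(image):
    """Simple gradient-magnitude edge map of the averaged image, used
    to find the outline of statically affixed objects."""
    gy, gx = np.gradient(image)
    return np.hypot(gx, gy)

# Synthetic demo: a static bright square over random background noise.
rng = np.random.default_rng(0)
frames = []
for _ in range(20):
    f = rng.uniform(0, 50, size=(40, 40))   # background noise
    f[10:20, 10:20] = 200.0                 # static object (e.g., hitch)
    frames.append(f)

avg = average_frames(frames)
edges = edge_strength(avg)
# The static object's interior stays at 200 while the noise averages
# toward its mean (~25), so the object/noise interface stands out.
print(round(avg[15, 15]), round(avg[0, 0]))
```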
Referring to fig. 13, an imager model 174 is shown, generally representing the rear imager 40 associated with hitch assembly 22, wherein the projected geometry of imager 40 may be used to determine the height Hhb of hitch ball 26. As depicted, αi denotes the hitch ball viewing angle of the rear imager 40, dib denotes the distance between the rear imager 40 and hitch ball 26, H denotes the height of the rear imager 40 relative to the ground, Hhb denotes the height of the hitch ball 26 relative to the ground, Hi denotes the height of the rear imager 40 relative to the hitch ball 26, and Lbm is the distance between hitch ball 26 and vehicle 12.
Based on the imager model 174, the distance dib between the rear imager 40 and hitch ball 26 is provided by the following equation:
dib = C * (Dhb(pixel))^(-1)    (6)
In equation 6, Dhb(pixel) is the diameter of hitch ball 26 measured in pixels, which is known from the image processing described above, and C corresponds to a known constant. The constant C varies for different vehicle platforms (due to different camera positions, resolutions, etc.) and may be determined based on a calibration analysis of the particular vehicle setup and stored in the memory 126. For example, FIG. 15 associates the diameter Dhb(pixel) of hitch ball 26, measured in pixels, with the indicated distance dib between the rear imager 40 and hitch ball 26. Once the distance dib between the rear imager 40 and hitch ball 26 is determined, the height Hi between the hitch ball 26 and the rear imager 40 may be determined by the following equation:
Hi = sqrt(dib^2 - Lbm^2)    (7)
In equation 7, dib and Lbm are assumed to be known. Since the height H of the rear imager 40 relative to the ground is known, the height Hhb of the hitch ball 26 may be obtained by subtracting Hi from H. In some cases, an additional payload may be disposed within the vehicle 12, causing a change in the height H of the rear imager 40 relative to the ground. This may be addressed by any feasible method, including calculating the new height H of the rear imager 40 relative to the ground through various image processing techniques and/or through any other sensors that may be disposed on any portion of the vehicle 12.
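Under a right-triangle reading of imager model 174 (dib as the line-of-sight hypotenuse, Lbm as the horizontal leg), equation 7 and the subtraction Hhb = H - Hi can be sketched as follows; the reconstruction of equation 7 and all numeric values are assumptions for illustration:

```python
import math

def imager_to_ball_height(d_ib, L_bm):
    """Equation 7, reconstructed as Hi = sqrt(dib^2 - Lbm^2): dib is
    the line-of-sight distance (hypotenuse), Lbm the horizontal leg."""
    return math.sqrt(d_ib ** 2 - L_bm ** 2)

def hitch_ball_height(H, d_ib, L_bm):
    """Hhb = H - Hi: subtract the imager-to-ball height difference
    from the imager's known height above the ground."""
    return H - imager_to_ball_height(d_ib, L_bm)

# Hypothetical numbers: imager 100 cm above ground, 50 cm to the ball,
# 40 cm horizontal offset -> ball sits 100 - 30 = 70 cm above ground.
print(hitch_ball_height(H=100.0, d_ib=50.0, L_bm=40.0))  # 70.0
```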
Further, the viewing angle αi of the hitch ball 26 may be calculated based on the following equation:
αi = arcsin(Lbm / dib)    (8)
In equation 8, dib and Lbm are assumed to be known; thus, the viewing angle of the hitch ball may be calculated using the law of sines. Additionally and/or alternatively, a look-up table and/or graph 198 may be used, as exemplarily shown in fig. 16, to associate the hitch-ball-to-imager distance dib (which may be measured in pixel length) with the hitch ball viewing angle αi.
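Both routes to the viewing angle can be sketched together: the closed form (under the same assumed right-triangle reading of equation 8) and a look-up table with interpolation in the spirit of graph 198. The ball seat length of 24 cm and the table spacing are made-up values:

```python
import math
import numpy as np

def viewing_angle(d_ib, L_bm):
    """Equation 8 under an assumed right-triangle reading of imager
    model 174: alpha_i = arcsin(Lbm / dib), returned in degrees."""
    return math.degrees(math.asin(L_bm / d_ib))

# A look-up table in the spirit of graph 198: distances paired with
# pre-computed viewing angles for a fixed ball seat length of 24 cm.
table_d = np.array([30.0, 40.0, 50.0, 60.0])
table_alpha = np.array([viewing_angle(d, 24.0) for d in table_d])

# Linear interpolation stands in for reading the graph at 45 cm.
alpha_45 = float(np.interp(45.0, table_d, table_alpha))
print(round(viewing_angle(50.0, 40.0), 2))  # 53.13
```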
Referring now to fig. 14, a method 176 is shown illustrating the steps of aligning a vehicle hitch ball 26 with a trailer coupler 16 using the hitch assist system 10, according to some examples. Specifically, in step 178, the hitch assist system 10 is initiated. Once hitch assist system 10 is activated, controller 14 may use imaging system 36 to scan a visual scene using any or all of the available imagers 38, 40, 42, 44 at step 180. The scene scan at step 180 may create an image patch 54 (FIG. 12), which may then be used at step 182 to identify the hitch assembly 22. As provided herein, memory 126 of controller 14 may store various characteristics of the identified hitch assembly, including the length of ball seat 24 and/or the height Hhb of hitch ball 26. Once the imaging system 36 detects the hitch assembly 22, the hitch assist system 10 determines at step 184 whether the hitch assembly 22 is recognized, meaning its characteristics are already stored in the memory 126, or whether the hitch assembly 22 is newly mounted on the vehicle 12 or otherwise not recognized.
If various characteristics of hitch assembly 22 are not stored in memory 126, the user may be asked at step 186 to provide the diameter Dhb of hitch ball 26. The hitch ball diameter Dhb may be entered into the hitch assist system 10 by any feasible means, such as the HMI 114 (FIG. 2) and/or the portable device 122 (FIG. 1). The pixel diameter of hitch ball 26 and the pixel length of ball seat 24 are measured at steps 188 and 190, respectively. To measure the pixel diameter of the hitch ball 26, the processor 124 applies image distortion correction and a homography transform to generate an overhead view of the image captured by the imager 40. The processor 124 may then apply a Hough circle transform using a parametric circle function to locate circular structures. In doing so, the hitch ball 26, having a circular shape, may be more easily identified and distinguished from other structures proximate the vehicle 12. Upon identifying a circular structure, the controller 14 applies a filter (e.g., a Kalman filter) to the circular structure via the processor 124. When a circular structure is detected, the number of pixels forming the diameter of the structure may be measured. Based on the number of pixels measured and the diameter Dhb entered by the user at step 186, the diameter of the hitch ball 26 may be related to a number of pixels within the image patch 54 (fig. 12). Likewise, ball seat 24 may be identified through image processing, and the number of pixels along the length of ball seat 24 may also be measured.
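The pixel-to-real-world scale step can be sketched as follows. The Hough and Kalman stages of the patent are omitted here: the circle is assumed to be segmented already into a binary mask, and the reference origin row is a made-up value:

```python
import numpy as np

def pixel_diameter(mask):
    """Widest row of a binary circle mask, counted in pixels."""
    return int(mask.sum(axis=1).max())

# Synthetic overhead patch: a circular "hitch ball" of radius 10 px
# centred at (row 70, col 50), which spans 21 pixels at its widest.
h, w = 100, 100
yy, xx = np.mgrid[0:h, 0:w]
ball = (xx - 50) ** 2 + (yy - 70) ** 2 <= 10 ** 2

d_px = pixel_diameter(ball)        # measured pixel diameter
D_hb = 50.8                        # user-entered diameter in mm (2-inch ball)
mm_per_px = D_hb / d_px            # reference scale derived from the ball

# Ball seat length: pixels between a reference origin near the bumper
# (hypothetically row 20) and the ball centre (row 70), times the scale.
L_bm_px = 70 - 20
L_bm_mm = L_bm_px * mm_per_px
print(d_px, round(L_bm_mm, 1))
```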
At step 192, utilizing the known diameter Dhb of the hitch ball 26 and the measured pixel diameter, hitch assist system 10 may be able to calculate the distance dib between rear imager 40 and hitch ball 26, or the focal length. In some cases, equations or look-up tables relating the distance dib between the rear imager 40 and hitch ball 26 to camera resolution, camera position, and/or hitch ball 26 pixel width may be stored in memory 126. In other examples, a look-up table relating hitch ball 26 position to hitch ball 26 pixel width for various discrete hitch balls 26 may be stored in the memory 126, forming a data-driven formula. For example, fig. 15 shows that the relationship between the pixel width of hitch ball 26 and the distance dib from hitch ball 26 to imager 40 may be exponential and/or asymptotic.
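Once the per-platform constant C of equation 6 is calibrated, the pixel-diameter-to-distance relation of step 192 is a one-line computation. The value of C below is made up for illustration, not a real calibration:

```python
def distance_from_pixel_diameter(pixel_diameter, C):
    """Equation 6: dib = C / Dhb_pixel. C is a per-platform calibration
    constant folding in camera height, tilt, resolution, and lens."""
    return C / pixel_diameter

# Hypothetical calibration: C = 6000 pixel*cm means a ball spanning
# 60 pixels sits about 100 cm from the rear imager.
C = 6000.0
print(distance_from_pixel_diameter(60, C))    # 100.0
print(distance_from_pixel_diameter(120, C))   # 50.0 (closer balls look bigger)
```

Note the inverse relation reproduces the asymptotic shape of fig. 15: pixel width grows without bound as the ball approaches the imager and decays toward zero as it recedes.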
At step 190, the hitch assist system 10 may also store the number of pixels spanning the ball seat, which may be identified by various image processing techniques. In some cases, a fixed point on the image patch 54 may be used as a reference origin, which may be horizontally centered on the image. For example, the reference origin may be a pixel on ball seat 24 near the bumper. The system then measures the number of pixels between that point and the center of the hitch ball 26.
At step 196, the length Lbm of ball seat 24 may be used to determine the angle of the hitch ball 26 relative to the rear imager 40, which may be referred to as the viewing angle. As provided herein, data-driven formulas and/or look-up tables may be used to determine the viewing angle. For example, the look-up table may relate various hitch ball diameters Dhb to various distances between rear imager 40 and hitch ball 26 and/or various viewing angles. In some cases, the viewing angle may be predicted based on the pixel width of ball seat 24. Further, a look-up table and/or graph 198 may be used, as exemplarily shown in FIG. 16, to associate the hitch-ball-to-imager distance dib (which may be measured in pixel length) with the hitch ball viewing angle αi. However, it should be understood that the length Lbm of ball seat 24 and the viewing angle may be determined by any other process without departing from the teachings provided herein.
At steps 200 and 202, the height Hhb of hitch ball 26 relative to the ground and the length Lbm of ball seat 24 may be determined using the available image data 56 as described above, including using the image processing program 58. The new hitch assembly data may then be stored in memory 126 of controller 14 at step 204 for subsequent hitch assist operations with the same hitch assembly 22.
At step 206, the path derivation program 128 may be used to determine the vehicle path 20 to align the hitch ball 26 with the coupler 16. In this manner, the controller 14 uses the path derivation program 128 to determine the path 20 that brings the hitch ball 26 into an overlapping position beneath the coupler 16. Once the path 20 has been derived, the hitch assist system 10 may, at step 208, ask the user U to relinquish control of at least the steering wheel 88 of the vehicle 12 (and, in various embodiments of the hitch assist system 10 in which the controller 14 assumes control of the powertrain 98 and brake 96 systems during execution of the operating program 130, optionally the throttle 100 and brake as well) while the vehicle 12 performs the automatic hitch operation. When it has been confirmed that the user U is not attempting to control the steering system 80 (e.g., using the torque sensor 94), the controller 14 begins to move the vehicle 12 along the determined path 20. Additionally, the hitch assist system 10 may determine whether the transmission system 102 is in the correct gear and may shift to the desired gear or prompt the user U to shift to the desired gear. Then, while the user U or controller 14 controls the speed of the vehicle 12 using the powertrain control system 98 and the brake control system 96, the hitch assist system 10 may control the steering system 80 to maintain the vehicle 12 along the path 20. As discussed herein, the controller 14 or user U may control at least the steering system 80 while tracking the position of the coupler 16 until the vehicle 12 reaches the endpoint 132, where the vehicle hitch ball 26 reaches the desired position 140 in the desired alignment with the coupler 16, at which point the method may end at step 210.
Various advantages can be obtained by using the present disclosure. For example, use of the disclosed hitch assist system provides a system for determining hitch ball height and/or position for aligning a hitch ball with a coupler of a trailer. Furthermore, the hitch assistance system may utilize any type of sensor to generate the object detection grid map. In response to the grid map, the hitch assistance system may be able to identify the trailer and/or the coupler proximate the vehicle. With a known hitch ball position, the hitch assist system may be able to align the hitch ball with the detected coupler.
According to some examples, a hitch assistance system is provided herein. The hitch assistance system includes a sensing system having an imager and a proximity sensor. The hitch assistance system further includes a controller for receiving signals from the proximity sensor and generating a feature map; determining a coupler position based on the detected features; and maneuvering the vehicle along a path to align the hitch ball with the coupler of the trailer. Examples of hitch assistance systems may include any one or combination of the following features:
● the controller is further configured to generate an image patch proximate the vehicle and determine hitch ball height;
● the proximity sensor is a radio detection and ranging (radar) sensor;
● the controller is further configured to apply a parametric circle function to locate circular structures within the image block;
● the controller is further configured to compare the input value of hitch ball diameter to the number of pixels within the circular structure to form a reference length;
● the controller is further configured to utilize the reference length to determine a tee length based on a number of pixels along a longitudinal axis of the tee compared to a number of pixels within a circular structure forming the reference length;
● the controller uses sensor signals from the proximity sensors to perform a simultaneous localization and mapping (SLAM) process of an area proximate to the vehicle;
● the SLAM process is configured to locate one or more points on the trailer and the one or more points are used to determine a characteristic of the trailer;
● the one or more points include a first point indicative of a coupler, a second point indicative of a first corner of the trailer, and a third point indicative of a second corner of the trailer; and/or
● calculate the length of the coupling based on the relationship between the first, second and third points.
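One plausible way to realize the last feature above, computing coupler length from the three SLAM points, is the distance from the coupler point to the midpoint of the two corner points. This geometry and the coordinates below are assumptions for illustration; the patent does not spell out the formula:

```python
import math

def coupler_length(coupler, corner_a, corner_b):
    """Distance from the coupler tip to the midpoint of the trailer's
    two front corners, one way to relate the three SLAM points."""
    mid_x = (corner_a[0] + corner_b[0]) / 2.0
    mid_y = (corner_a[1] + corner_b[1]) / 2.0
    return math.hypot(coupler[0] - mid_x, coupler[1] - mid_y)

# Hypothetical map coordinates in metres: corners 2 m apart on the
# trailer's front face, coupler 1.2 m ahead of that face.
print(coupler_length((0.0, 1.2), (-1.0, 0.0), (1.0, 0.0)))  # 1.2
```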
Accordingly, a hitch assist method is provided herein. The method includes generating a grid map of features proximate to a vehicle from one or more sensors disposed on the vehicle. The method also includes positioning and mapping, relative to each other, two or more features indicative of the trailer. The method further includes controlling the vehicle along a path to align a hitch ball with a coupler of the trailer. Examples of hitch assistance methods may include any one or combination of the following features and/or steps:
● collecting and storing feature locations for classifying two or more features; and analyzing the two or more features to form a feature extraction database;
● the feature extraction database storing the two or more features for iterative comparison with new data to predict the presence of a predefined object based on the detected features;
● calculating features of the feature extraction database using Scale Invariant Feature Transform (SIFT) or harris corner detector;
● use a Harris corner detector to compute features of a feature extraction database;
● creating image blocks of a scene behind the vehicle;
● applying a parametric circle function to locate circular structures within the image block;
● comparing the input value of hitch ball diameter with the number of pixels within the circular structure to form a reference length; and/or
● use the reference length to determine tee length or hitch ball height.
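As a sketch of the Harris-corner option listed above (not the patent's implementation), the classic response R = det(M) - k * trace(M)^2 over a box-filtered structure tensor scores corner-like features, such as trailer corners, above edges and flat regions:

```python
import numpy as np

def harris_response(img, k=0.04):
    """Minimal Harris corner response: structure tensor from image
    gradients, box-filtered over 3x3 neighbourhoods, then
    R = det(M) - k * trace(M)^2. A sketch, not a production detector."""
    gy, gx = np.gradient(img.astype(np.float64))

    def box(a):
        # 3x3 box filter via nine shifted copies of a zero-padded array.
        p = np.pad(a, 1)
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3))

    ixx, iyy, ixy = box(gx * gx), box(gy * gy), box(gx * gy)
    det = ixx * iyy - ixy ** 2
    trace = ixx + iyy
    return det - k * trace ** 2

# A white square on black: its four corners should score highest,
# edges go negative, and the flat interior stays at zero.
img = np.zeros((30, 30))
img[10:20, 10:20] = 1.0
R = harris_response(img)
peak = np.unravel_index(np.argmax(R), R.shape)
print(peak)
```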
According to various examples, a hitch assistance system is provided herein. The hitch assistance system includes an imager for capturing images rearward of the vehicle. The hitch assist system further includes a controller for creating an image patch of a scene behind the vehicle based on the images provided by the imager; applying a parametric circle function to locate circular structures within the image patch; comparing an input value of hitch ball diameter with the number of pixels within the circular structure to form a reference length; and using the reference length to determine tee length or hitch ball height. Examples of hitch assistance systems may include any one or combination of the following features:
● a proximity sensor configured to generate a grid map of features proximate to the vehicle from one or more sensors disposed on the vehicle;
● the controller recognizes the circular structure as representing a hitch ball and applies a filter to the circular structure; and/or
● the controller identifies a center point within the circular structure and measures the pixel length from the bumper to the center point to calculate the tee length.
It will be understood by those of ordinary skill in the art that the construction of the described invention and its other components is not limited to any specific material. Other exemplary examples of the invention disclosed herein may be formed from a wide variety of materials, unless otherwise described herein.
For the purposes of this disclosure, the term "coupled" (in all its forms, coupled, etc.) generally means that two components (electrical or mechanical) are connected to each other either directly or indirectly. Such a connection may be fixed in nature or movable in nature. Such joining may be achieved with the two components (electrical or mechanical) and any additional intermediate members being integrally formed as a single unitary body with one another or with the two components. Unless otherwise specified, such connections may be permanent in nature, or may be removable or releasable in nature.
In addition, any arrangement of components to achieve the same functionality is effectively "associated" such that the desired functionality is achieved. Hence, any two components herein combined to achieve a particular functionality can be seen as "associated with" each other such that the desired functionality is achieved, irrespective of architectures or intermedial components. Likewise, any two components so associated can also be viewed as being "operably connected," or "operably coupled," to each other to achieve the desired functionality, and any two components capable of being so associated can also be viewed as being "operably couplable," to each other to achieve the desired functionality. Some examples of operably couplable include, but are not limited to, physically mateable and/or physically interacting components and/or wirelessly interactable and/or wirelessly interacting components and/or logically interacting components. Further, it should be appreciated that components preceding the term "… …" may be disposed in any feasible location (e.g., on, within, and/or outside of a vehicle) such that the components may function in any manner described herein.
Embodiments of the systems, apparatus, devices, and methods disclosed herein may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory as discussed herein. Implementations within the scope of the present disclosure may also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media storing computer-executable instructions are computer storage media (devices). Computer-readable media bearing computer-executable instructions are transmission media. Thus, by way of example, and not limitation, embodiments of the present disclosure can include at least two distinct computer-readable media: computer storage media (devices) and transmission media.
Computer storage media (devices) include RAM, ROM, EEPROM, CD-ROM, solid state drives ("SSDs") (e.g., based on RAM), flash memory, phase change memory ("PCM"), other types of memory, other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
Embodiments of the apparatus, systems, and methods disclosed herein may communicate over a computer network. A "network" is defined as one or more data links that support the transfer of electronic data between computer systems and/or modules and/or other portable devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or any combination of hardwired or wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the features and acts are disclosed as exemplary forms of implementing the claims.
Those skilled in the art will appreciate that the disclosure may be practiced in network computing environments with many types of computer system configurations, including internal vehicle computers, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablet computers, pagers, routers, switches, various storage devices, and the like. The present disclosure may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by any combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
Further, where appropriate, the functions described herein may be performed in one or more of the following: hardware, software, firmware, digital components, or analog components. For example, one or more Application Specific Integrated Circuits (ASICs) may be programmed to perform one or more of the systems and processes described herein. Certain terms are used throughout the description and claims to refer to particular system components. As one skilled in the art will appreciate, components may be referred to by different names. This document does not intend to distinguish between components that differ in name but not function.
It should be noted that the sensor and/or switch examples discussed above may include computer hardware, software, firmware, or any combination thereof to perform at least a portion of their functions. For example, the sensor may include computer code configured to be executed in one or more processors, and may include hardware logic/circuitry controlled by the computer code. These exemplary devices are provided herein for purposes of illustration, and not for limitation. Examples of the disclosure may be implemented in other types of devices, as known to those skilled in the relevant art.
At least some examples of the disclosure relate to computer program products that include such logic (e.g., in software) stored on any computer-usable medium. Such software, when executed in one or more data processing devices, causes the devices to operate as described herein.
It is also important to note that the construction and arrangement of the elements of the invention as shown in the illustrative examples is illustrative only. Although only a few examples of the present inventions have been described in detail in this disclosure, those skilled in the art who review this disclosure will readily appreciate that many modifications are possible (e.g., variations in sizes, dimensions, structures, shapes and proportions of the various elements, values of parameters, mounting arrangements, use of materials, colors, orientations, etc.) without materially departing from the novel teachings and advantages of the subject matter recited. For example, elements shown as integrally formed may be constructed of multiple parts or elements shown as multiple parts may be integrally formed, the operation of the interfaces may be reversed or otherwise varied, the length or width of the structures and/or members or connectors or other elements of the system may be varied, the nature or number of adjustment positions provided between the elements may be varied. It should be noted that the elements and/or assemblies of the system may be constructed of any of a variety of materials that provide sufficient strength or durability, in any of a variety of colors, textures, and combinations. Accordingly, all such modifications are intended to be included within the scope of this innovation. Other substitutions, modifications, changes and omissions may be made in the design, operating conditions and arrangement of the desired and other exemplary embodiments without departing from the spirit of the present innovations.
It is to be understood that any of the described processes or steps can be combined with other disclosed processes or steps to form structures within the scope of the present invention. The exemplary structures and processes disclosed herein are for illustrative purposes and should not be construed as limiting.
It should also be understood that variations and modifications can be made on the aforementioned structures and methods without departing from the concepts of the present disclosure, and further it should be understood that such concepts are intended to be covered by the following claims unless these claims by their language expressly state otherwise.
According to one embodiment, the invention is further characterized by creating an image block of a scene behind the vehicle; applying a parametric circle function to locate circular structures within the image block; comparing an input value of the hitch ball diameter to the number of pixels within the circular structure to form a reference length; and using the reference length to determine the tee length or hitch ball height.
According to an embodiment of the present invention, there is provided a hitch assist system having: an imager for capturing images rearward of the vehicle; and a controller for creating an image block of a scene behind the vehicle based on the images provided by the imager, applying a parametric circle function to locate a circular structure within the image block, comparing an input value of hitch ball diameter to a number of pixels within the circular structure to form a reference length, and determining a tee length or hitch ball height using the reference length.
According to one embodiment, the invention also features a proximity sensor configured to generate a grid map of features proximate to a vehicle from one or more sensors disposed on the vehicle.
According to one embodiment, the controller identifies the circular structure as representing a hitch ball and applies a filter to the circular structure.
According to one embodiment, the controller identifies a center point within the circular structure and measures a pixel length from the bumper to the center point to calculate a tee length.

Claims (15)

1. A hitch assist system, comprising:
a sensing system having an imager and a proximity sensor; and
a controller to:
receiving a signal from the proximity sensor and generating a feature map;
determining a coupler position based on the detected features; and
maneuvering the vehicle along a path to align a hitch ball with the coupler of the trailer.
2. The hitch assist system of claim 1, wherein the controller is further configured to generate an image patch proximate the vehicle and determine hitch ball height.
3. The hitch assistance system of claim 1, wherein the proximity sensor is a radio detection and ranging (radar) sensor.
4. The hitch assist system of claim 2, wherein the controller is further configured to apply a parametric circle function to locate circular structures within the image block.
5. The hitch assist system of claim 4, wherein the controller is further configured to compare an input value of hitch ball diameter to a number of pixels within the circular structure to form a reference length.
6. The hitch assist system of claim 5, wherein the controller is further configured to utilize the reference length to determine the tee length based on a number of pixels along a longitudinal axis of a tee compared to the number of pixels within the circular structure forming the reference length.
7. The hitch assist system of any one of claims 1-6, wherein the controller uses sensor signals from the proximity sensor to conduct a simultaneous localization and mapping (SLAM) process of an area proximate the vehicle.
8. The hitch assist system of claim 7, wherein the SLAM process is configured to locate one or more points on a trailer, and the one or more points are used to determine characteristics of the trailer.
9. The hitch assist system of claim 8, wherein the one or more points include a first point indicative of the coupler, a second point indicative of a first corner of the trailer, and a third point indicative of a second corner of the trailer.
10. The hitch assist system of claim 8, wherein the length of the coupler is calculated based on a relationship between the first, second, and third points.
11. A hitch assist method, comprising:
generating a grid map of features proximate to a vehicle from one or more sensors disposed on the vehicle;
positioning and mapping, relative to each other, two or more features indicative of a trailer; and
the vehicle is controlled along a path to align a hitch ball with a hitch of the trailer.
12. The hitch assist method of claim 11, further comprising:
collecting and storing feature locations for classifying the two or more features; and
the two or more features are analyzed to form a feature extraction database.
13. The hitch assistance method of claim 12, wherein the feature extraction database stores the two or more features for iterative comparison with new data to predict the presence of a predefined object based on the detected features.
14. The hitch assist method of any one of claims 12 or 13, further comprising:
computing features of the feature extraction database using a Scale Invariant Feature Transform (SIFT) or a Harris corner detector.
15. The hitch assist method of claim 12, further comprising:
computing features of the feature extraction database using a Harris corner detector.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US16/126,235 2018-09-10
US16/126,235 US20200079165A1 (en) 2018-09-10 2018-09-10 Hitch assist system


Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3945467B2 (en) * 2003-10-02 2007-07-18 日産自動車株式会社 Vehicle retraction support apparatus and method
US20130226390A1 (en) * 2012-02-29 2013-08-29 Robert Bosch Gmbh Hitch alignment assistance
US9446713B2 (en) * 2012-09-26 2016-09-20 Magna Electronics Inc. Trailer angle detection system
CN105082910B (en) * 2014-05-07 2018-01-26 通用汽车环球科技运作有限责任公司 Aid in the system and method that delivery vehicle is attached to trailer
US9499018B2 (en) * 2015-04-01 2016-11-22 Robert Bosch Gmbh Trailer coupling assistance system with vehicle video camera
US10044988B2 (en) * 2015-05-19 2018-08-07 Conduent Business Services, Llc Multi-stage vehicle detection in side-by-side drive-thru configurations
US10391939B2 (en) * 2016-09-01 2019-08-27 GM Global Technology Operations LLC Method and apparatus to determine trailer pose
US10761534B2 (en) * 2018-01-30 2020-09-01 Uatc, Llc Fused sensor view for self-driving truck
US11198340B2 (en) * 2018-05-01 2021-12-14 Continental Automotive Systems, Inc. Coupler and tow-bar detection for automated trailer hitching via cloud points

Also Published As

Publication number Publication date
US20200079165A1 (en) 2020-03-12
DE102019124152A1 (en) 2020-03-12

Similar Documents

Publication Publication Date Title
CN111267564A (en) Hitching auxiliary system
JP7124117B2 (en) Trailer detection and autonomous hitching
EP3787909B1 (en) Coupler and tow-bar detection for automated trailer hitching via cloud points
US11050933B2 (en) Device and method for determining a center of a trailer tow coupler
JP6938793B2 (en) Automatic trailer concatenation using image coordinates
US11633994B2 (en) Vehicle-trailer distance detection device and method
JP7167185B2 (en) visual object tracker
US10632919B2 (en) Vehicle hitch assist system
US10953711B2 (en) Hitch assist system
US10800217B2 (en) Hitch assist system
US10632803B2 (en) Hitch assist system
US10744943B1 (en) System and method for trailer alignment
CN111055832A (en) Hitching auxiliary system
CN112638668A (en) System and method for calibrating motion estimation algorithms using vehicle cameras
US20210155238A1 (en) 3d position estimation system for trailer coupler
US20230331160A1 (en) Tow Ball Position Detection System

Legal Events

Date Code Title Description
PB01 Publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200612