CN112313592A - System and method for autonomously receiving a target object by a vehicle


Info

Publication number
CN112313592A
CN112313592A (application CN201980042812.3A)
Authority
CN
China
Prior art keywords
vehicle
orientation
information
target object
engagement
Prior art date
Legal status
Pending
Application number
CN201980042812.3A
Other languages
Chinese (zh)
Inventor
P·克尼斯
A·班纳吉
A·哈菲勒
C·夏尔
M·弗雷德尔
S·比尔
T·杰格
Current Assignee
ZF Friedrichshafen AG
Original Assignee
ZF Friedrichshafen AG
Priority date
Filing date
Publication date
Application filed by ZF Friedrichshafen AG filed Critical ZF Friedrichshafen AG
Publication of CN112313592A

Classifications

    • G05D 1/0246: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means, using a video camera in combination with image processing means
    • B62D 15/0285: Steering position indicators; steering position determination; steering aids; parking aids; parking performed automatically
    • G05D 1/0225: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory, involving docking at a fixed facility, e.g. base station or loading bay
    • G05D 1/024: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means, using obstacle or wall sensors in combination with a laser

Abstract

A system for autonomously receiving a target object by a vehicle (1), the system having: a position determination means (3a) configured for determining the position of the vehicle (1) with respect to a fixed-position coordinate system; and orientation determination means (4a, 4b, 4c, 4d, 4e) configured for determining an orientation of the vehicle (1) relative to the target object. Furthermore, a control device (5) is provided, which is configured to receive position information from the position determination means (3a) and orientation information from the orientation determination means (4a, 4b, 4c, 4d, 4e) and to autonomously operate the vehicle (1). The control device (5) controls the autonomous operation of the vehicle (1) on the basis of position determination information in a section between the initial position and the intermediate position of the vehicle (1) and additionally or alternatively on the basis of orientation information in a section between the intermediate position and the receiving position.

Description

System and method for autonomously receiving a target object by a vehicle
Technical Field
The invention relates to a system and a method for autonomously receiving a target object by a vehicle. The invention relates in particular to a system for autonomously receiving, by a vehicle, a target object designed as an exchangeable goods carrier, wherein a plurality of detection devices are used to determine position and orientation.
Background
In the case of vehicles which are provided for receiving a swap body container (Wechselbrücke) as an exchangeable goods carrier, systems for supporting the process of engaging the vehicle under the swap body container are known from the prior art. For example, document DE 102006057610 A1 discloses a system in which the process of engaging a vehicle under a swap body container is supported by an image-based sensor system. With such a system, distance information between the vehicle and the exchange container can be determined, and the joining process can thus be interrupted in a suitable manner.
When a vehicle drives under an exchange container in order to receive it, only small deviations from the travel path are permissible, in particular during the actual engagement. It is therefore desirable to provide a system which determines position and orientation with very high accuracy, in particular while a vehicle, especially in autonomous operation, engages under a swap body container.
Disclosure of Invention
A system for autonomously receiving a target object by a vehicle, the system having: a position determination means configured to determine a position of the vehicle with respect to a fixed-position coordinate system; and an orientation determination means configured to determine an orientation of the vehicle relative to the target object. Here, the system may have a control device configured to receive position information from the position determination means and orientation information from the orientation determination means and to operate the vehicle autonomously. The control device may control the autonomous operation of the vehicle based on position determination information in a section between an initial position and an intermediate position of the vehicle, and additionally or alternatively based on orientation information in a section between the intermediate position and a receiving position of the vehicle.
Here, the position of the vehicle with respect to the fixed-position coordinate system may include the position of a reference point of the vehicle and the spatial orientation of the vehicle with respect to the fixed-position coordinate system. In this case, the coordinate system with a fixed position can be selected freely, as long as a unique determination of the position of the vehicle can be achieved by means of the position determination means. The orientation of the vehicle relative to the target object relates to the vehicle position relative to the target object position and can be determined dynamically.
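This two-stage control scheme can be pictured roughly as in the following Python sketch, which is purely illustrative: the class and function names, gains and speeds are assumptions and are not taken from the disclosure. Between the initial position and the intermediate position the controller is fed by position information in the fixed-position coordinate system; from the intermediate position onwards it is fed by the relative orientation information.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Pose:
    x: float    # position in the fixed-position coordinate system [m]
    y: float
    yaw: float  # heading [rad]


@dataclass
class RelativeOrientation:
    longitudinal_gap: float  # spacing to the target along the vehicle axis [m]
    lateral_offset: float    # sideways offset of the target [m]
    yaw_offset: float        # relative rotation about the vertical axis [rad]


def guide_along_trajectory(pose: Pose) -> dict:
    # Placeholder for the position-based phase: follow the precomputed
    # trajectory using the pose in the fixed-position coordinate system.
    return {"mode": "position", "steer": 0.0, "speed": -0.5}


def guide_towards_target(rel: RelativeOrientation) -> dict:
    # Placeholder for the orientation-based phase: simple proportional
    # correction of lateral and yaw offset while reversing slowly.
    steer = 0.8 * rel.lateral_offset + 1.5 * rel.yaw_offset
    return {"mode": "orientation", "steer": steer, "speed": -0.3}


def control_step(vehicle_pose: Pose,
                 relative: Optional[RelativeOrientation],
                 intermediate_reached: bool) -> dict:
    """Select the guidance source for the current control cycle."""
    if not intermediate_reached or relative is None:
        return guide_along_trajectory(vehicle_pose)   # initial -> intermediate
    return guide_towards_target(relative)             # intermediate -> receiving
```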
The system may have a GPS-based position determination system as the position determination means. In this case, a correspondingly configured device which provides the position of the vehicle as position information using the GPS system may be installed in the system. Various systems available today and in the future may be considered as the GPS-based position determination system.
In the system, the position determination means may have a sensor system of a guidance system to detect reference elements arranged in the region of the travel path of the vehicle. If reference elements are provided whose positions are known in relation to the fixed-position coordinate system, the position information relative to that coordinate system can be determined by means of the guidance system. In particular, RFID elements can be used as the reference elements provided in the region of the travel path. In this case, the RFID elements can be arranged at known positions in the region of the travel path. The sensor system may have a detection element for detecting the position of an RFID element, so that the position of the vehicle with respect to the fixed-position coordinate system can be provided as position information on the basis of the known position of the RFID element. Elements other than RFID elements may also be used as reference elements, as long as the position of the vehicle with respect to the fixed-position coordinate system can be provided as position information.
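For the guidance-system variant, a position fix could for example be derived from a detected reference element whose position in the fixed-position coordinate system is known. The following sketch assumes a hypothetical tag map and measurement geometry; none of the names or values come from the disclosure.

```python
import math

# Hypothetical map of reference elements (e.g. RFID tags) along the travel
# path: tag id -> (x, y) in the fixed-position coordinate system.
REFERENCE_MAP = {
    "tag_17": (120.0, 45.5),
    "tag_18": (125.0, 45.5),
}


def vehicle_position_from_reference(tag_id: str,
                                    offset_forward: float,
                                    offset_left: float,
                                    vehicle_yaw: float) -> tuple:
    """Estimate the vehicle reference point in the fixed-position frame.

    offset_forward / offset_left give where the detected element lies
    relative to the vehicle (measured by the detection element); vehicle_yaw
    is the vehicle heading, e.g. from odometry.
    """
    tag_x, tag_y = REFERENCE_MAP[tag_id]
    # tag position = vehicle position + rotated offset, hence
    # vehicle position = tag position - rotated offset.
    dx = math.cos(vehicle_yaw) * offset_forward - math.sin(vehicle_yaw) * offset_left
    dy = math.sin(vehicle_yaw) * offset_forward + math.cos(vehicle_yaw) * offset_left
    return tag_x - dx, tag_y - dy
```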
In the system, the orientation determination means may have a laser-based sensor system. The laser-based sensor system may be mountable on the vehicle and may be configured to detect an orientation of the vehicle relative to the target object. In this case, any type of laser-based sensor system can be used as long as it is possible to quantitatively detect the geometric variable, in particular the distance between the at least one reference position of the vehicle and the at least one reference position of the target object. In the case of a laser-based sensor system, a plurality of sensor elements may be provided, so that a plurality of information may be provided.
In the system, the laser-based sensor system may interact with an element of the target object and generate a spacing signal indicative of a spacing of the target object from the vehicle in a longitudinal direction. In this case, the sensor system can be arranged on the vehicle in such a way that it can detect the elements of the target object.
In the system, the laser-based sensor system may interact with a plurality of elements of the target object and generate orientation signals indicating the orientation between the target object and the vehicle in terms of relative rotation, in particular relative rotation about a vertical axis of the vehicle, and/or in terms of relative lateral offset. For this purpose, a plurality of laser sensors may be provided, each of which generates a spacing signal. The alignment between the target object and the vehicle may then be determined from the plurality of spacing signals based on knowledge of the geometry of the target object.
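As a simple geometric illustration of how two spacing signals can be turned into an orientation signal, assume two laser sensors mounted a known distance apart on the vehicle, both measuring the distance to the same flat face of the target object; the baseline value and the example readings below are invented.

```python
import math


def relative_yaw_from_two_ranges(d_left: float, d_right: float,
                                 sensor_baseline: float) -> float:
    """Estimate the relative rotation about the vertical axis.

    If the vehicle stands square to the target face, both distances are
    equal; otherwise the difference over the sensor baseline encodes the
    yaw offset between vehicle and target object.
    """
    return math.atan2(d_right - d_left, sensor_baseline)


# Example: right sensor 2.08 m, left sensor 2.00 m, sensors 1.6 m apart
# -> roughly 0.05 rad (about 2.9 degrees) of relative rotation.
yaw_offset = relative_yaw_from_two_ranges(2.00, 2.08, 1.6)
```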
In the system, the orientation determination means may have a camera system. The camera system may be directed backwards with respect to the direction of travel of the vehicle. Furthermore, the camera system may generate at least one spacing signal by optically scanning its field of view, the at least one spacing signal indicating the spacing of the target object from the vehicle in the longitudinal direction. The camera system can also generate an orientation signal which indicates the orientation between the target object and the vehicle in terms of a relative rotation, in particular a relative rotation about a vertical axis of the vehicle, and/or in terms of a relative lateral offset. For this purpose, the image information of the camera system can be analyzed and information about the alignment between the target object and the vehicle can be generated from it by means of a preset algorithm.
In particular, a camera system with two camera elements can be used, the cameras being arranged on the vehicle offset from one another along the main viewing direction, i.e. in the longitudinal direction of the vehicle. In particular, one camera element is arranged in a front section of the vehicle and the other camera element in a rear section of the vehicle. The information generated by the camera elements is combined by a predetermined algorithm to generate an orientation signal.
In the system, a levelness detection system may be provided, which generates a levelness signal. The levelness signal is indicative of a spacing of a reference element of the vehicle relative to a ground surface on which the vehicle is located. Here, the levelness detection system may have a laser-based sensor system. The reference element of the vehicle may be a frame or a section of a frame of the vehicle. Furthermore, the reference element of the vehicle may have an element for receiving the target object, for example a coupling element.
In the system, in autonomous operation of the vehicle, the arrival of the vehicle at the intermediate position may be detected by the orientation determination means. The orientation determination means detects that the vehicle has reached the intermediate position when it detects an element of the target object. Among the alternatives discussed above for the orientation determination means, camera-based systems and laser-based sensor systems in particular were mentioned. Their detection ranges differ depending on the technology of the respective system. In particular, a technically determined maximum detection distance may be specified in advance for each of the systems used. The intermediate position of the vehicle is thus defined as the position of the vehicle relative to the target object at which at least one of the detection systems of the orientation determination means can perform a detection by interacting with an element of the target object. The orientation determination means may then generate a signal indicating that the intermediate position has been reached. Alternatively or additionally, the reaching of the intermediate position may be determined by the position determination means.
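A minimal sketch of this detection logic follows, with invented sensor names and maximum detection distances:

```python
from typing import Mapping, Optional

# Assumed, technology-dependent maximum detection distances [m].
MAX_RANGE = {"camera_rear": 15.0, "laser_spacing": 8.0, "laser_engagement": 2.5}


def intermediate_position_reached(readings: Mapping[str, Optional[float]]) -> bool:
    """Report the intermediate position as reached as soon as any sensor of
    the orientation determination means delivers a valid reading.

    `readings` maps a sensor name to its current distance measurement, or to
    None if that sensor does not currently see an element of the target.
    """
    return any(value is not None and value <= MAX_RANGE.get(name, float("inf"))
               for name, value in readings.items())
```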
In the system, the control device may be configured to fuse the available orientation information and/or position information with each other and to generate an overall orientation signal. The information from the detection devices is combined and analyzed in the control device by means of a predetermined algorithm. Because the output signals of the detection systems differ in accuracy and/or reliability, corrections and/or plausibility checks can be performed as part of the fusion. A reliable and accurate overall orientation signal can thus be generated from the entirety of the information.
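The fusion algorithm itself is left open; a weighted mean over the redundant estimates is one conceivable, deliberately simple realization. The weights and values below are invented for illustration.

```python
def fuse_orientation(estimates):
    """Fuse redundant estimates of the same quantity into one overall value.

    `estimates` is a list of (value, weight) pairs, e.g. the lateral offset
    from the camera system and from the laser-based sensor system, weighted
    by an assumed accuracy of the respective detection device.
    """
    total_weight = sum(weight for _, weight in estimates)
    if total_weight == 0:
        raise ValueError("no usable orientation estimate available")
    return sum(value * weight for value, weight in estimates) / total_weight


# Camera reports 0.12 m lateral offset (weight 1), laser reports 0.10 m (weight 3).
overall_lateral_offset = fuse_orientation([(0.12, 1.0), (0.10, 3.0)])  # 0.105 m
```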
A vehicle for receiving a target object designed as a swap body container may be provided with an engagement element configured for engagement into an engagement channel provided on an exchangeable goods carrier. Furthermore, the vehicle may have a control device which may be configured to receive position information from a position determining means of the vehicle and orientation information from an orientation determining means of the vehicle and to operate the vehicle autonomously. The vehicle may also be equipped with a system having one or more of the foregoing features.
In this vehicle, an engagement element can be provided in such a way that engagement into the engagement channel of the exchangeable goods carrier can be achieved by driving the vehicle in reverse below the exchangeable goods carrier. Here, the travel at the time of engagement is substantially straight travel. The vehicle may also be equipped with a system for adjusting the levelness, by means of which the spacing of the engaging elements relative to the ground can be adjusted. In particular, a levelness control system can be provided on the vehicle for this purpose.
A method for autonomously receiving a target object by a vehicle has the following steps:
determining position information at an initial position of the vehicle;
obtaining, based on the determined position information, a trajectory for driving to an intermediate position in which the vehicle is positioned at the target object in a preset orientation; and controlling autonomous operation of the vehicle to move the vehicle to the intermediate position;
determining orientation information after the vehicle has reached the intermediate location;
establishing a receiving state of the vehicle based on the orientation information;
controlling autonomous operation of the vehicle to move the vehicle from the intermediate position to the receiving position in which the target object is receivable by the vehicle.
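Read as pseudocode, the sequence of steps above could be driven as follows; the `vehicle` facade and its method names are purely illustrative assumptions and do not stem from the disclosure.

```python
def plan_trajectory(start_pose, goal_pose):
    # Placeholder: a real planner would produce a drivable curve; here the
    # two poses simply stand in for the trajectory.
    return [start_pose, goal_pose]


def receive_target_object(vehicle):
    """Top-level sequence corresponding to the steps listed above."""
    # Determine position information at the initial position.
    pose = vehicle.position_means.current_pose()

    # Obtain a trajectory to the intermediate position and drive it autonomously.
    trajectory = plan_trajectory(pose, vehicle.intermediate_pose_estimate())
    vehicle.drive_autonomously(trajectory)

    # After reaching the intermediate position, determine orientation information.
    orientation = vehicle.orientation_means.measure()

    # Establish the receiving state (e.g. adjust the frame level).
    vehicle.adjust_level_for_engagement(orientation)

    # Reverse from the intermediate position into the receiving position.
    while not vehicle.receiving_position_reached():
        orientation = vehicle.orientation_means.measure()
        vehicle.reverse_step(orientation)
```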
The method may be applied to a vehicle having the features of the vehicle described above. In particular, the method may be applied to a vehicle having one or more of the features of the system described above. In particular, the method for receiving an exchangeable goods carrier as target object is used by a vehicle provided with engaging elements. The engaging element is configured for engaging into an engaging channel provided on the exchangeable goods carrier.
In the method, the position information is used to obtain a trajectory for driving to the intermediate position. The position of the target object with respect to the fixed-position coordinate system is taken into account in this case. The intermediate position may be a position in which the vehicle has approached the target object. For the case in which an exchangeable goods carrier is used as the target object, the intermediate position is a position of the vehicle in which the longitudinal axis of the vehicle approximately coincides with the longitudinal axis of the exchangeable goods carrier. The distance between the vehicle and the exchangeable goods carrier can vary, as long as the alignment of the vehicle relative to the exchangeable goods carrier can be determined by the orientation determination means when the intermediate position is reached. The intermediate position can thus be estimated when the trajectory is obtained, and arrival at the intermediate position can be confirmed by corresponding measures.
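One way to estimate such an intermediate position from the known position of the carrier is to place it on the carrier's longitudinal axis at a stand-off distance in front of it. The concrete stand-off distance below is an assumption.

```python
import math


def intermediate_pose(carrier_x: float, carrier_y: float, carrier_yaw: float,
                      standoff: float = 5.0) -> tuple:
    """Place the intermediate position on the longitudinal axis of the carrier.

    The vehicle should end up in front of the exchangeable goods carrier,
    aligned with its longitudinal axis and facing away from it, so that it
    can subsequently reverse underneath; `standoff` only has to be small
    enough that the orientation determination means can see the carrier.
    """
    x = carrier_x + standoff * math.cos(carrier_yaw)
    y = carrier_y + standoff * math.sin(carrier_yaw)
    return x, y, carrier_yaw
```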
In the method, the position information is determined by determining the position and orientation of the vehicle in the fixed-position coordinate system, and the autonomous operation of the vehicle is controlled using the position information to move the vehicle to the intermediate position. In the method it may be assumed that the intermediate position is reached if the vehicle is positioned almost in longitudinal alignment with the exchangeable goods carrier and orientation information can be determined by the orientation determination means through an interaction between the orientation determination means provided on the vehicle and elements of the exchangeable goods carrier. The orientation determination means can then generate and emit a corresponding signal, so that it can be concluded for further operation that the intermediate position has been reached.
In the method it may be assumed that the receiving state is reached when the engaging elements of the vehicle are fully engaged into the engaging channel of the exchangeable goods carrier. This state is reached when the vehicle has driven under the exchangeable goods carrier in reverse, and the exchangeable goods carrier can be received in this state by subsequently lifting the engaging elements of the vehicle. Receiving here means that the exchangeable goods carrier, which rests on its support elements while the engaging elements of the vehicle engage into its engaging channel, is lifted in such a way that the support elements are raised off the ground and the exchangeable goods carrier is carried entirely by the vehicle.
In the method, one or more of the following detection devices provided on the vehicle may be used in determining the orientation information:
a laser-based spacing detection device for detecting a longitudinal orientation and/or a transverse orientation of the vehicle relative to the exchangeable goods carrier;
a camera system for detecting image information for determining orientation information;
a laser-based levelness detection device for detecting the spacing of the engagement element relative to the ground.
Correspondingly, the systems mentioned in connection with the system described above may be used as the detection devices. In the method, starting from the intermediate position, the orientation information is obtained by fusing the signals of the detection devices provided on the vehicle while the engaging element engages into the engaging channel, until the receiving position is reached. The measuring devices can be used in the following manner:
using the camera system while the vehicle is driving in reverse from the intermediate position, at least until the engagement element is engaged into the engagement channel;
using the laser-based spacing detection device to detect a longitudinal orientation of the vehicle relative to the exchangeable goods carrier from the intermediate position until the receiving position;
using the laser-based levelness detection device prior to the joining of the joining element into the joining channel;
using the laser-based spacing detection device during engagement of the engagement element into the engagement channel to detect a lateral orientation of the vehicle relative to the exchangeable goods carrier. The individual detection devices can thus be used in the method wherever this is technically feasible and useful.
In the method, the position information determined by the position determination means may also be used for obtaining the orientation information. It should be noted here that the position information relates to the position of the vehicle relative to a fixed-position coordinate system, while the orientation information relates to the relative orientation between the vehicle and the exchangeable goods carrier. Because of the different nature of the position information and the orientation information, this processing yields an extremely accurate determination of the alignment between the vehicle and the exchangeable goods carrier. It is also conceivable that in this method the position information is only used if, for example, the GPS-based system used is sufficiently accurate.
In the method, when the signals of the detection devices arranged on the vehicle are fused, a plausibility check can be carried out on these signals and, if necessary, a corresponding correction can be applied to them when the orientation information is obtained. Fusion involves analyzing the available signals by means of a predetermined algorithm. Since the available signals partly relate to the same variable, redundant results can be obtained. For example, a spacing signal may be generated both by the camera system and by the laser-based sensor system. In this case, the signals can be used to correct one another. If one of the several redundant signals deviates excessively, the plausibility check can determine whether this signal is too inaccurate. Signals classified as inaccurate in this way may be disregarded in further processing.
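A deliberately simple form of such a plausibility check is to compare each redundant signal against the median of all of them; the threshold and the example values below are invented.

```python
from statistics import median


def plausible_signals(signals: dict, max_deviation: float) -> dict:
    """Drop redundant signals that deviate too far from the consensus.

    `signals` maps a source name to its current value of the same variable
    (e.g. the longitudinal spacing from the camera system and from the
    laser-based sensor system).  A signal whose deviation from the median
    exceeds `max_deviation` is classified as implausible and not considered
    for further processing.
    """
    consensus = median(signals.values())
    return {name: value for name, value in signals.items()
            if abs(value - consensus) <= max_deviation}


# The camera-based spacing disagrees strongly here and is discarded.
kept = plausible_signals({"camera": 3.9, "laser_front": 3.1, "laser_rear": 3.0},
                         max_deviation=0.3)
```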
In the method, the position information can be used in the plausibility check of the signals. This position information is detected by means of the position determination means, which determines the position of the vehicle relative to a fixed-position coordinate system. This prevents error sources (for example, positional deviations of reference elements on the exchangeable goods carrier) from influencing all of the redundant signals alike, in which case the deviation would not be recognized by the plausibility check.
Drawings
FIG. 1 illustrates a vehicle and exchange container applicable to embodiments of the present invention;
FIG. 2 shows in schematic top view the arrangement of sensors on a vehicle applicable to an embodiment of the present invention;
FIG. 3 shows a bottom region of a swap body container that may be received by a vehicle;
FIG. 4 shows schematically different relative positions of a vehicle with respect to a swap body container to illustrate the inventive concept;
fig. 5 shows a schematic representation of the arrangement of sensors on a vehicle according to a further embodiment;
fig. 6 shows a vehicle and a swap body container in a schematic view from the rear according to the embodiment shown in fig. 5;
fig. 7 shows a schematic flow chart for illustrating a method according to an embodiment of the invention.
Detailed Description
Embodiments of the present invention are described below with reference to the drawings.
Fig. 1 shows a vehicle 1, which is suitable for receiving a target object. Fig. 1 furthermore shows a swap body container 2, which is embodied as an exchangeable goods carrier and is defined as target object in the present description. The exchange container 2 shown in fig. 1 can be received by a vehicle 1. The exchange container 2 is designed for receiving a load. For this purpose, an optional container 10 for receiving the load is provided on the exchange container 2. The vehicle 1 has a cab in a front section and a frame 8 in a rear section. In the illustration a vehicle 1 is shown with one front axle and two rear axles, however different arrangements with more or fewer axles are conceivable.
In the illustration of fig. 1, the exchange container 2 is placed on support elements 9 behind the vehicle 1 in the longitudinal direction. In the configuration shown in fig. 1, the vehicle 1 is ready to receive the exchange container 2. After the vehicle 1 has driven under the exchange container 2, the exchange container 2 is received by lifting the vehicle 1. The process for receiving a swap body container is discussed below.
Fig. 2 shows a schematic top view of the arrangement of the engagement elements 6a, 6b, 6c, 6d on the frame 8 of the vehicle 1. On a rear section of the frame 8 of the vehicle 1, engagement elements 6a, 6b are arranged, which are spaced apart from one another in the transverse direction of the vehicle 1. In the front region of the frame 8 of the vehicle 1, the engagement elements 6c, 6d are shown, which are likewise spaced apart in the transverse direction of the vehicle 1. The engagement elements 6a, 6b fitted at the rear region of the frame 8 form a first pair of engagement elements. The engagement elements 6c, 6d arranged at the front region of the frame 8 form a second pair of engagement elements. The first pair of engagement elements 6a, 6b is spaced apart from the second pair of engagement elements 6c, 6d in the longitudinal direction. In the present embodiment, four engagement elements 6a, 6b, 6c, 6d are therefore provided on the vehicle frame 8, fitted at the four corners of an imaginary rectangle.
Fig. 3 shows a design of the swap body container 2. The view of fig. 3 shows in particular the area of the floor of the exchange container 2. In this embodiment, the exchange container 2 has an optional container 10 provided for receiving a load. A joining channel 7 is provided on the bottom surface of the exchange container 2. The joining channel 7 is bounded at the top by the bottom surface of the exchange container 2. The joining channel 7 is bounded on the left by the right guide element 7a and on the right by the left guide element 7b. The right guide element 7a and the left guide element 7b are each designed as guide rails arranged on the bottom surface of the exchange container 2. The distance between the right guide element 7a and the left guide element 7b is dimensioned such that the engagement elements 6a, 6b, 6c, 6d can engage into the joining channel 7 formed in this way.
Furthermore, the exchange container has support elements 9; in the present embodiment, four support elements 9 are provided. The support elements 9 support the exchange container on the ground so that the exchange container 2 is kept at a predetermined distance from the ground. The support elements 9 are locked in the state shown in fig. 3 and, after the vehicle 1 has received the exchange container 2, in particular after the exchange container 2 has been lifted off the ground, they can be folded up and locked in the folded position.
Furthermore, the vehicle 1 shown in fig. 2 has a plurality of sensors and measuring devices. The following describes a sensor and a measuring device mounted on the vehicle 1 in the present embodiment.
A spacing detection device 4a is provided in a front region of the vehicle frame 8. The spacing detection device 4a has a laser-based sensor system. The spacing detection device 4a directs a laser beam backwards with respect to the direction of travel of the vehicle. The laser beam directed backwards from the spacing detection device 4a can impinge on an element of the exchange container 2. In the present exemplary embodiment, the element on which the backward-directed laser beam can impinge is a beam of the swap body container 2 in the front region of the swap body container 2. The spacing detection device 4a has a measuring device by means of which the distance between the spacing detection device 4a and the element of the exchange container 2 on which the laser beam impinges can be determined by known methods. The spacing detection device 4a can thus determine the relative position between the vehicle 1 and the exchange container 2 in the longitudinal direction. The spacing detection device 4a emits corresponding signals for further processing.
Furthermore, a camera system is provided on the vehicle shown in fig. 2, comprising two cameras 4c, 4d oriented rearwards with respect to the direction of travel of the vehicle. The front camera 4c is arranged on a front section of the frame 8, while the rear camera 4d is arranged on a rear section of the frame 8. The two cameras are configured to detect regions within their respective fields of view and to transmit the detected information for processing. In the present embodiment, the information detected by the cameras 4c, 4d is in particular converted into a signal from which the orientation of the exchange container 2 located behind the vehicle 1 can be inferred.
In the present embodiment, the vehicle shown in fig. 2 is provided with a GPS-based position detection system 3a as the position determination means 3. The GPS-based position detection system 3a is configured to determine the position and orientation of the vehicle 1 with respect to a fixed-position coordinate system. The GPS-based position determination system 3a emits corresponding signals for further processing.
The vehicle in fig. 2 is also provided with a levelness detection device 4e. In the present embodiment, the levelness detection device 4e has a laser-based sensor that detects the distance between a reference position on the vehicle frame 8 and the ground on which the vehicle is located. The levelness detection device 4e emits corresponding signals for further processing.
Furthermore, the vehicle shown in fig. 2 has a laser-based engagement sensor system 4b configured to detect the orientation of the vehicle 1 relative to the exchange container 2 in a laser-based manner. The laser-based engagement sensor system 4b has four laser-based sensors 4b1, 4b2, 4b3, 4b4, which are arranged in pairs on the frame 8 of the vehicle 1. A first pair of laser-based sensors 4b1, 4b2 is arranged on the rear section of the frame 8. In particular, the first pair of laser-based sensors 4b1, 4b2 is arranged behind the first pair of engagement elements 6a, 6b. Furthermore, the laser-based engagement sensor system 4b has a second pair of laser-based sensors 4b3, 4b4, which are arranged in the middle region of the frame 8 of the vehicle 1 with respect to the longitudinal direction. In particular, the second pair of laser-based sensors 4b3, 4b4 is arranged between the first pair of engagement elements 6a, 6b and the second pair of engagement elements 6c, 6d. The laser-based sensors 4b1, 4b2, 4b3, 4b4 are configured to detect the orientation and position of the swap body container 2 relative to the vehicle 1.
The laser-based engagement sensor system 4b is shown in detail in figs. 5 and 6. As can be seen in fig. 5, the laser beams of the laser-based sensors 4b1, 4b2 arranged on the rear section of the frame 8 are directed such that they impinge on elements of the exchange container 2. The laser beams of the laser-based sensors 4b1, 4b2 are oriented such that they cross one another. In this regard, fig. 6 shows in cross section the relative orientation of the laser-based sensors 4b1, 4b2 with respect to the right guide element 7a and the left guide element 7b of the joining channel 7.
By detecting the spacing between the laser-based sensor 4b1 on the right side of the vehicle and elements of the exchange container 2 on the left side, or the spacing between the laser-based sensor 4b2 on the left side of the vehicle 1 and elements of the exchange container 2 on the right side, the orientation of the vehicle 1 relative to the exchange container 2 can already be detected before the first pair of engagement elements 6a, 6b engages. After the vehicle 1 has moved further under the exchange container 2, the laser-based sensors 4b3, 4b4 arranged in the middle region of the frame 8 can detect the spacing to elements of the exchange container 2, in particular to the left guide rail 7b and the right guide rail 7a. This detection also allows an alignment between the vehicle 1 and the exchange container 2 to be determined. By combining the measurement results of the four laser-based sensors 4b1, 4b2, 4b3, 4b4 provided in the present embodiment while the vehicle 1 drives under the exchange container 2, both the lateral offset of the vehicle 1 relative to the exchange container 2 and the angular offset of the vehicle 1 relative to the exchange container 2 can be detected. A corresponding signal is generated by the laser-based engagement sensor system 4b for further processing.
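The combination of the two sensor pairs can be pictured as measuring the lateral position of the joining channel's centreline at two longitudinal stations of the frame; lateral and angular offset then follow from simple geometry. The sign conventions, the way the centreline is derived from the rail distances, and all numbers below are assumptions for illustration only.

```python
import math


def centreline_from_pair(dist_to_left_rail: float, dist_to_right_rail: float) -> float:
    # Positive result: the channel centreline lies to the left of the vehicle axis.
    return 0.5 * (dist_to_left_rail - dist_to_right_rail)


def channel_offsets(centre_rear: float, centre_mid: float,
                    station_spacing: float) -> tuple:
    """Lateral and angular offset of the vehicle relative to the joining channel.

    `centre_rear` and `centre_mid` are the lateral positions of the channel
    centreline relative to the vehicle centreline at the rear sensor pair
    (4b1, 4b2) and at the middle sensor pair (4b3, 4b4); `station_spacing`
    is the longitudinal distance between the two pairs.
    """
    lateral_offset = 0.5 * (centre_rear + centre_mid)
    angular_offset = math.atan2(centre_rear - centre_mid, station_spacing)
    return lateral_offset, angular_offset


lat, ang = channel_offsets(centreline_from_pair(0.32, 0.28),
                           centreline_from_pair(0.30, 0.30),
                           station_spacing=3.5)
```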
The process of the vehicle 1 engaging under and receiving the exchange container 2 is schematically illustrated in fig. 4. In the view indicated with A in fig. 4, the vehicle 1 is located in front of the exchange container 2 in the longitudinal orientation. Here, the vehicle 1 is still spaced apart from the exchange container 2 in the longitudinal direction. This position may be defined as an intermediate position, as discussed below.
In the view indicated with B, the vehicle 1 has driven with its rear section under the exchange container 2. Here, the first pair of engagement elements 6a, 6b has engaged into the joining channel 7 between the right guide element 7a and the left guide element 7b.
In the view indicated with C, the vehicle 1 has driven completely under the exchange container 2. Here, the first pair of engagement elements 6a, 6b and the second pair of engagement elements 6c, 6d engage into the joining channel between the right guide element 7a and the left guide element 7b. In this position, the vehicle 1 is in the receiving position, which is discussed below.
Information about the position and orientation of the vehicle 1 with respect to a fixed-position coordinate system is thus generated and provided by the position determination means 3. Furthermore, information about the orientation of the vehicle 1 relative to the exchange container 2 is provided by the orientation determination means 4 as soon as detection by the corresponding detection devices is possible. In the present embodiment, the orientation determination means 4 comprises the spacing detection device 4a, the laser-based engagement sensor system 4b with the laser-based sensors 4b1, 4b2, 4b3, 4b4, the camera system 4c, 4d with front camera 4c and rear camera 4d, and the levelness detection device 4e. The detection devices combined into the orientation determination means 4 thus generate signals which can be used for further processing.
The vehicle of the present embodiment is equipped with a control device 5 configured to perform, in particular, autonomous operation of the vehicle 1. For this purpose, the control device 5 has corresponding devices which are provided for receiving sensor signals, for emitting control signals, etc. The control device also has a processing unit in which a preset program can be executed. The control device 5 receives, in particular, position information from the position determination means 3 and orientation information from the orientation determination means 4. In addition, other signals required for autonomous operation are provided to the control device 5.
Furthermore, the control device 5 is configured to actuate actuators by means of control outputs, for example actuators for adjusting the steering angle of steerable wheels, actuators for actuating the drive in a predetermined operating state, actuators for actuating the brake device, and actuators for actuating the levelness adjusting system of the vehicle 1.
A method for autonomously receiving a target object by a vehicle according to one embodiment is set forth below. In the present embodiment, the method for receiving a swap body container 2 is performed by a vehicle. In this case, the structure and function of the vehicle 1 and the interchange container 2 according to the above description are assumed.
In the initial situation, the vehicle 1 is in the initial position. Furthermore, the exchange container 2 to be received is in its receiving position. The initial position and the receiving position are defined by information comprising the position and orientation of the vehicle 1 and of the exchange container 2, respectively.
The receiving position of the exchange container 2 may be known from a previous placement process of the exchange container 2, while the initial position of the vehicle 1 is determined by the GPS-based position determination system 3a and can be made available.
Starting from this situation, and given the request for the vehicle 1 to receive the exchange container 2 in autonomous operation, the sequence shown schematically in the flow chart of fig. 7 begins at START.
After the process of initialization is performed, the position information at the initial position of the vehicle 1 is determined in step S1. In this regard, as described above, the GPS-based position determining system 3a is used in the present embodiment.
The trajectory T is obtained in step S2. The trajectory T leads to the intermediate position. The intermediate position is defined such that, at this position, the vehicle 1 is positioned at the swap body container 2 in a predetermined orientation. According to the present embodiment, the intermediate position is close to the swap body container 2 in the receiving position, and in the intermediate position the vehicle 1 and the swap body container 2 are already almost aligned in the longitudinal direction.
Based on the obtained trajectory T, the vehicle 1 in the present embodiment travels from the initial position to the intermediate position by autonomous operation, taking the position information into account. In the present embodiment, the position information continuously available from the GPS-based position determination system is used for the autonomous operation of the vehicle 1.
In step S3, after the vehicle 1 has reached the intermediate position, the orientation information is determined. In this embodiment, the orientation information may be generated by one or more of the detection devices of the orientation determination means 4. Once, for example, the rear camera 4d detects the front section of the exchange container 2 and information about the alignment between the vehicle 1 and the exchange container 2 can be generated, it can be determined that the vehicle 1 has reached the intermediate position. From this point on, all detection elements of the orientation determination means 4 are used to continuously generate orientation information.
In step S4, the receiving state of the vehicle 1 is established on the basis of the orientation information. In the present embodiment, the levelness of the vehicle frame 8, and in particular of the engagement elements 6a, 6b, 6c, 6d, relative to the ground is adjusted on the basis of the signal of the levelness detection device 4e and using a levelness adjustment system provided on the vehicle 1, such that the engagement elements 6a, 6b, 6c, 6d can engage into the joining channel 7 of the swap body container 2. The presence of the receiving state is confirmed by the camera system 4c, 4d. Furthermore, in the present embodiment, previously obtained information about the geometry of the exchange container 2 and of the vehicle 1 is also used.
In step S5, autonomous operation of the vehicle 1 is carried out to move the vehicle 1 from the intermediate position to the receiving position. In the receiving position, the exchange container 2 can be received by the vehicle 1. In particular, in the receiving position the vehicle 1 has driven completely under the exchange container 2 and all engagement elements 6a, 6b, 6c, 6d engage into the joining channel 7. With this, the method illustrated in fig. 7 essentially ends.
In this embodiment, after the receiving position has been reached, the exchange container 2 is lifted via the vehicle frame by raising the vehicle by means of the levelness adjustment system, so that the support elements 9 are lifted off the ground. The support elements 9 are then folded upwards and can be locked in this position. In this embodiment, this process can likewise be carried out automatically by means of corresponding actuating means. The vehicle 1 and the exchange container 2 are then ready to drive off and can travel to a predetermined destination autonomously or with driver support.
In the present embodiment, the signals provided by the detection elements of the orientation determination means 4 are analyzed in a fused manner. When the vehicle 1 drives off from the intermediate position, the camera system 4c, 4d is used first in order to determine the alignment between the vehicle 1 and the exchange container 2. As the reverse drive continues, and while the camera system 4c, 4d is detecting the alignment between the vehicle 1 and the exchange container 2, the spacing detection device 4a is used to accurately determine the spacing between the vehicle 1 and the exchange container 2. Before the engagement elements 6a, 6b, 6c, 6d begin to engage into the joining channel 7, the spacing of the vehicle frame 8 relative to the ground is additionally determined by the levelness detection device 4e. After the vehicle 1 has driven further in reverse towards the exchange container 2, the laser-based sensors 4b1, 4b2 fitted on the rear end of the frame 8 detect the guide elements 7a, 7b of the exchange container 2. As the reverse drive continues further, the laser-based sensors 4b3, 4b4 arranged in the middle region of the vehicle frame 8 detect the guide elements 7a, 7b of the joining channel 7.
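This staged use of the detection elements can be summarized as a simple schedule. The phase names below are invented, and the assignment is only one possible reading of the sequence described above.

```python
def active_detectors(phase: str) -> list:
    """Detection elements evaluated in each phase of the approach (assumed)."""
    schedule = {
        "drive_to_intermediate":   ["camera_4c", "camera_4d"],
        "reverse_towards_carrier": ["camera_4c", "camera_4d", "spacing_4a"],
        "before_engagement":       ["spacing_4a", "levelness_4e"],
        "rear_pair_engaging":      ["spacing_4a", "laser_4b1", "laser_4b2"],
        "fully_under_carrier":     ["spacing_4a", "laser_4b1", "laser_4b2",
                                    "laser_4b3", "laser_4b4"],
    }
    return schedule[phase]
```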
Depending on how far the vehicle 1 has driven under the exchange container 2, the detection elements of the orientation determination means 4 are thus used in a logical sequence to control the engagement through autonomous operation of the vehicle 1.
In the present embodiment, the GPS-based position determination system 3a is used as the position determination means. In a modified embodiment, sensors of a guidance system 3b are alternatively or additionally provided on the vehicle 1, which can detect elements (for example RFID elements) embedded in the roadway whose positions are known. In this way, correspondingly autonomous control of the vehicle 1 between the initial position and the intermediate position is likewise possible.
List of reference numerals
1 vehicle
2 target object (exchange container)
3 position determination means
3a GPS-based position determination system
3b sensor system of the guidance system
3c reference element
4 orientation determination means
4a spacing detection device
4b engagement sensor system
4b1, 4b2, 4b3, 4b4 laser-based sensors
4c, 4d camera system
4e levelness detection device
5 control device
6a, 6b, 6c, 6d engaging element
7 joining channel
7a, 7b guide element
8 vehicle frame
9 support element
10 container
T track
S1 determining position information
S2 obtaining a trajectory
S3 determining orientation information
S4 establishing the receiving state
S5 control autonomous operation

Claims (21)

1. A system for autonomously receiving a target object (2) by a vehicle (1), wherein the system has: a position determination means (3) configured for determining the position of the vehicle (1) with respect to a fixed-position coordinate system; and an orientation determination tool (4) configured for determining an orientation of the vehicle (1) relative to the target object (2), the system further having a control device (5) configured for receiving position information from the position determination means (3) and orientation information from the orientation determination tool (4) and for autonomously running the vehicle (1), characterized in that the control device (5) controls the autonomous running of the vehicle (1) based on position determination information in a section between an initial position and an intermediate position of the vehicle (1) and additionally or alternatively based on orientation information in a section between the intermediate position and a receiving position.
2. The system according to claim 1, characterized in that the position determination means (3) has a GPS-based position determination system (3 a).
3. The system according to claim 1, characterized in that the position determination means (3) has a sensor system (3b) for detecting a reference element (3c) arranged in the region of the travel path.
4. The system according to one of claims 1 to 3, characterized in that the orientation determination means (4) has a laser-based sensor system which can be fitted on the vehicle (1) and is configured for detecting the orientation of the vehicle (1) relative to the target object (2).
5. The system according to claim 4, characterized in that the laser-based sensor system has an engagement sensor system (4b) which interacts with elements of the target object (2) and generates a spacing signal which is indicative of the spacing of the target object (2) from the vehicle (1) in the longitudinal direction.
6. The system according to claim 4 or 5, characterized in that the laser-based sensor system has a spacing detection device (4a) which interacts with a plurality of elements of the target object (2) and generates an orientation signal which is indicative of the orientation between the target object (2) and the vehicle (1) in terms of relative rotation and/or relative lateral offset.
7. The system according to one of claims 1 to 6, characterized in that the orientation determination means (4) have a camera system (4c, 4d) which is directed backwards with respect to the direction of travel of the vehicle (1) and which generates a spacing signal by imaging scanning a field of view, which spacing signal indicates the spacing of the target object (2) from the vehicle (1) in the longitudinal direction; and the camera system generates an orientation signal indicative of an orientation between the target object (2) and the vehicle (1) in terms of relative rotation and/or relative lateral offset.
8. System according to one of claims 1 to 6, characterized in that it has a levelness detection device (4e) which generates a levelness signal indicating the spacing of a reference element of the vehicle (1) with respect to the ground on which the vehicle (1) is located.
9. The system according to one of the preceding claims, characterized in that during autonomous operation of the vehicle (1), the arrival of the vehicle (1) at the intermediate position is detected by the orientation determination means (4), wherein the arrival of the vehicle (1) at the intermediate position is detected by the orientation determination means (4) when the orientation determination means (4) detects an element of the target object (2).
10. System according to one of the preceding claims, characterized in that the control device (5) is configured for fusing the orientation information and/or position information available for use with each other and generating an overall orientation signal.
11. A vehicle (1) for receiving a target object (2) designed as a swap body container, wherein the vehicle (1) is provided with an engagement element (6a, 6b, 6c, 6d) configured for engagement into an engagement channel (7) provided on the swap body container (2), the vehicle further having a control device (5) configured for receiving position information from a position determining means (3) of the vehicle (1) and orientation information from an orientation determining means (4) of the vehicle (1) and for autonomous operation of the vehicle (1), the vehicle further having a system according to one of the preceding claims.
12. A method for autonomously receiving a target object (2) by a vehicle (1) according to claim 11, wherein the method has the steps of:
(S1) determining position information at an initial position of the vehicle (1);
(S2) acquiring a trajectory (T) for driving to an intermediate position where the vehicle (1) is positioned at the target object (2) in a preset orientation, based on the determined position information; and controlling the autonomous operation of the vehicle (1) to move the vehicle (1) to the intermediate position;
(S3) determining orientation information after the vehicle (1) has reached the intermediate position;
(S4) establishing a receiving state of the vehicle (1) based on the orientation information;
(S5) controlling autonomous operation of the vehicle (1) to move the vehicle (1) from the intermediate position to the receiving position in which the target object (2) is receivable by the vehicle (1).
13. Method according to claim 12, characterized in that the method for receiving a swap body container as a target object (2) is applied by a vehicle (1) provided with engagement elements (6a, 6b, 6c, 6d), wherein the engagement elements (6a, 6b, 6c, 6d) are configured for engagement into engagement channels (7) provided on the swap body container (2).
14. The method according to one of claims 12 or 13, characterized in that the position information is determined by determining the position and orientation of the vehicle (1) in a fixed-position coordinate system and that the position information is used to control the autonomous operation of the vehicle (1) to move the vehicle (1) to the intermediate position.
15. The method according to claim 13, characterized in that the intermediate position is reached if the vehicle (1) is positioned in an almost longitudinally aligned manner with respect to the swap body container (2) and orientation information can be determined by the orientation determination means (4) by an interaction between orientation determination means (4) provided on the vehicle (1) and elements of the swap body container (2).
16. Method according to one of claims 13 to 15, characterized in that the receiving state is reached if the engaging elements (6a, 6b, 6c, 6d) of the vehicle (1) are fully engaged into the engaging channel (7) of the swap body container (2).
17. The method according to one of claims 13 to 16, characterized in that one or more of the following orientation detection means provided on the vehicle (1) are used in the determination of the orientation information:
-an engagement sensor system (4b) for detecting a spacing of the vehicle (1) in a longitudinal direction with respect to the swap body container (2);
-a spacing detection device (4a) for detecting a longitudinal orientation and/or a transverse orientation of the vehicle (1) relative to the interchange container (2);
-a camera system (4c, 4d) for detecting image information to determine orientation information;
-levelness detection means (4e) for detecting the levelness of said engagement elements (6a, 6b, 6c, 6d) with respect to the ground.
18. Method according to one of claims 13 to 17, characterized in that, starting from the intermediate position, the orientation information is acquired by fusing the signals of a detection device provided on the vehicle (1) when the engagement element (6a, 6b, 6c, 6d) is engaged into the engagement channel (7) until the receiving position is reached, wherein the orientation device is used in the following manner:
-using the camera system (4c, 4d) when the vehicle (1) is driven in reverse from the intermediate position at least until the engagement element (6a, 6b, 6c, 6d) is engaged into the engagement channel (7);
-using the engagement sensor system (4b) to detect the longitudinal orientation of the vehicle (1) relative to the swap body container (2) from the intermediate position until the receiving position;
-using said levelness detection means (4e) before engagement of said engagement elements (6a, 6b, 6c, 6d) in said engagement channels (7);
-using the spacing detection device (4a) to detect a longitudinal orientation and/or a transverse orientation of the vehicle (1) relative to the swap body container (2).
19. The method according to claim 18, characterized in that the method further uses the position information determined by the position determination means (3) for obtaining the orientation information.
20. Method according to one of claims 18 to 19, characterized in that, in the fusion of the signals of orientation detection means provided on the vehicle (1), a plausibility check is performed on the signals of orientation detection means provided on the vehicle (1) and, if necessary, a corresponding correction is performed on the signals when the orientation information is acquired.
21. Method according to claim 20, characterized in that the position information of the position determination means (3) is used in the plausibility check of the signal.
CN201980042812.3A 2018-06-26 2019-05-27 System and method for autonomously receiving a target object by a vehicle Pending CN112313592A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE102018210341.0 2018-06-26
DE102018210341.0A DE102018210341A1 (en) 2018-06-26 2018-06-26 System and method for autonomously picking up a target object by a vehicle
PCT/EP2019/063572 WO2020001884A1 (en) 2018-06-26 2019-05-27 System and method for the autonomous receiving of a target object by a vehicle

Publications (1)

Publication Number Publication Date
CN112313592A 2021-02-02

Family

ID=66752062

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980042812.3A Pending CN112313592A (en) 2018-06-26 2019-05-27 System and method for autonomously receiving a target object by a vehicle

Country Status (3)

Country Link
CN (1) CN112313592A (en)
DE (1) DE102018210341A1 (en)
WO (1) WO2020001884A1 (en)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102012022336A1 (en) * 2012-11-14 2014-05-15 Valeo Schalter Und Sensoren Gmbh Method for carrying out an at least semi-autonomous parking operation of a motor vehicle in a garage, parking assistance system and motor vehicle
DE102016011324A1 (en) * 2016-09-21 2018-03-22 Wabco Gmbh A method of controlling a towing vehicle as it approaches and hitches to a trailer vehicle

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102006057610A1 (en) * 2006-09-29 2008-04-03 Daimler Ag Image-supported docking method for load-based ranking, involves determining vehicle height, consulting difference to vehicle height for docking in target object for vehicle treble control, and determining space and height data with sensor
CN103124994A (en) * 2010-04-06 2013-05-29 丰田自动车株式会社 Vehicle control apparatus, target lead-vehicle designating apparatus, and vehicle control method
CN104276227A (en) * 2013-07-09 2015-01-14 现代自动车株式会社 Joint guarantee system for vehicle assembly and control method of the same
US20150347840A1 (en) * 2014-05-27 2015-12-03 Murata Machinery, Ltd. Autonomous vehicle, and object recognizing method in autonomous vehicle
US20170369101A1 (en) * 2016-06-24 2017-12-28 Deutsche Post Ag Vehicle comprising a manoeuvring system
CN106990781A (en) * 2017-03-31 2017-07-28 清华大学 Automatic dock AGV localization methods based on laser radar and image information
CN107918388A (en) * 2017-11-08 2018-04-17 深圳市招科智控科技有限公司 Harbour container carries unmanned vehicle method for controlling trajectory and device

Also Published As

Publication number Publication date
DE102018210341A1 (en) 2020-01-02
WO2020001884A1 (en) 2020-01-02

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination