WO2023211967A1 - System and method for connecting service lines to the fronts of trailers by automated trucks
- Publication number
- WO2023211967A1 (PCT/US2023/019845)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensor
- gladhand
- robotic arm
- set forth
- trailer
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1615—Programme controls characterised by special kind of manipulator, e.g. planar, scara, gantry, cantilever, space, closed chain, passive/active joints and tendon driven manipulators
- B25J9/162—Mobile manipulator, movable base with manipulator arm mounted on it
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60D—VEHICLE CONNECTIONS
- B60D1/00—Traction couplings; Hitches; Draw-gear; Towing devices
- B60D1/58—Auxiliary devices
- B60D1/62—Auxiliary devices involving supply lines, electric circuits, or the like
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/31—From computer integrated manufacturing till monitoring
- G05B2219/31006—Monitoring of vehicle
Definitions
- This invention relates to autonomous vehicles and more particularly to autonomous trucks and trailers therefor, for example, as used to haul cargo around a shipping facility, a production facility or yard, or to transport cargo to and from a shipping facility, a production facility or yard.
- Trucks are an essential part of modern commerce. These trucks transport materials and finished goods across the continent within their large interior spaces. Such goods are loaded and unloaded at various facilities that can include manufacturers, ports, distributors, retailers, and end users.
- Large over-the-road (OTR) trucks typically consist of a tractor or cab unit and a separate detachable trailer that is interconnected removably to the cab via a hitching system that consists of a so-called fifth wheel and a kingpin. More particularly, the trailer contains a kingpin along its bottom front, and the cab contains a fifth wheel, consisting of a pad and a receiving slot for the kingpin.
- The cab provides power (through (e.g.) a generator, pneumatic pressure source, etc.) used to operate both itself and the attached trailer.
- A plurality of removable connections are made between the cab and trailer to deliver both electric power and pneumatic pressure.
- The pressure is used to operate emergency and service brakes, typically in conjunction with the cab's own (respective) brake system.
- The electrical power is used to power (e.g.) interior lighting, exterior signal and running lights, lift gate motors, landing gear motors (if fitted), etc.
- The attachment and detachment are performed using a rotational motion between confronting gladhands to lock flanges together in a manner that compresses opposing annular seals contained in each gladhand body.
- The above-referenced Published Application describes end effectors and robotic hands that facilitate the attachment of a gladhand adapter to the native trailer front-mounted gladhand.
- The native gladhand should first be identified.
- Machine vision, employing pattern recognition based upon acquired images of the trailer front, can be used (at least in part) to identify and locate the trailer gladhand.
- This invention overcomes disadvantages of the prior art by providing a system and method that guides the motion of a robotic manipulator on an AV truck when connecting to a native gladhand on a trailer front, representing and constructing a model of the free space on-the-fly, in the manner of an Obstacle Detection and Obstacle Avoidance (OD/OA) system and process.
- A system and method is provided for guiding a robotic arm on an AV (e.g. yard) truck adapted to connect a pneumatic line to a gladhand on the front of a trailer hitched thereto.
- A first 3D sensor generates pointclouds at different stages of motion, and is located adjacent to an end of the robotic arm, the end carrying a tool for interacting with the gladhand.
- An occlusion map of the trailer front is generated, and a map update process, based upon the pointclouds of the first 3D sensor, updates the occlusion map to add and remove voxels therefrom.
- A robot arm control process guides the robotic arm based upon the updated occlusion map.
- A second 3D sensor generates a pointcloud, and is located on the truck to image the trailer front.
- An occlusion mapping process, based upon the pointcloud of the second 3D sensor, generates the occlusion map of the trailer front.
- The second 3D sensor can be located at an elevated position on the truck.
- The robot arm control process can be adapted to initially move the robotic arm to image, with the first 3D sensor, a region of interest subject to update by the update process.
- The first 3D sensor generates images used to locate predetermined features in the region of interest.
- The predetermined features of the system and method can include the gladhand.
- The gladhand can be a rotating gladhand, and the tool is adapted to extend the rotating gladhand upon recognizing it as one of the predetermined features.
- The second 3D sensor can comprise a combination of a rotating 2D LiDAR and a moving pan-tilt unit.
- The first 3D sensor can comprise a stereoscopic camera arrangement.
- A map expansion process can change an occlusion probability of each of the voxels in the updated occlusion map based upon the occlusion state of neighboring voxels in the updated occlusion map.
- At least one of the first 3D sensor and the second 3D sensor can be adapted to perform self-calibration during runtime operation based upon features within an imaged scene.
- A path of motion of the robotic arm can be guided based, in part, on at least one of (a) moving the robotic arm along a trajectory until a rising or falling edge on an external switch is sensed, (b) moving the robotic arm along a trajectory whose speed is controlled by wrench readings from an end-of-arm force-torque sensor, (c) moving the robotic arm along a predetermined trajectory while monitoring end-effector wrenches and stopping the arm if it is determined that there is a risk of causing a controller of the robotic arm to fault, (d) moving the robotic arm along a predetermined trajectory to produce a target end-effector wrench, and (e) stopping the motion for any of (a)-(d) if a motion trajectory of the robotic arm has exceeded distance thresholds.
- The predetermined feature can comprise a gladhand, and the occlusion on the trailer front can be caused by a protrusion from the trailer front that overhangs the gladhand.
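The guarded-motion strategies (a)-(e) enumerated above can be sketched as a supervisory loop that steps along a trajectory while watching for a switch edge, an excessive end-effector wrench, or an exceeded distance threshold. This is a minimal, hypothetical sketch; the function names, default thresholds, and the 1D distance stand-in are illustrative and not taken from the disclosure.

```python
# Hypothetical guarded-motion supervisor: step along a trajectory and stop
# on a switch edge, an end-effector wrench that risks a controller fault,
# or an exceeded travel-distance threshold. Names/values are illustrative.

def guarded_move(waypoints, read_switch, read_wrench_norm,
                 wrench_limit=40.0, max_distance=0.5):
    """Return (waypoint index, reason) where motion stopped."""
    prev_switch = read_switch()
    travelled, last = 0.0, waypoints[0]
    for i, p in enumerate(waypoints):
        travelled += abs(p - last)    # 1D stand-in for Cartesian distance
        last = p
        if travelled > max_distance:  # strategy (e): distance threshold
            return i, "distance threshold exceeded"
        sw = read_switch()
        if sw != prev_switch:         # strategy (a): rising/falling edge
            return i, "switch edge"
        prev_switch = sw
        if read_wrench_norm() > wrench_limit:  # strategies (b)-(c)
            return i, "wrench limit"
    return len(waypoints) - 1, "trajectory complete"
```

In a real controller the wrench readings would come from the end-of-arm force-torque sensor and the distance would be measured along the Cartesian path of the end effector.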
- FIG. 1 is a diagram showing an aerial view of an exemplary shipping facility with locations for storing, loading and unloading trailers used in conjunction with the AV yard truck arrangements provided according to a system and method for handling trailers within a yard;
- FIG. 2 is a front-oriented perspective view of an exemplary AV yard truck for use in association with the system and method herein;
- FIG. 3 is a rear-oriented perspective view of the AV yard truck of Fig. 2;
- FIG. 4 is a side view of the AV yard truck of Fig. 2;
- FIG. 5 is a front view of a front end of an exemplary trailer having a refrigeration unit that overhangs the gladhand area;
- FIG. 6 is a schematic diagram of an AV yard truck robotic pneumatic line connection system and environment scanning arrangement to assist in navigating the connection system to and from a trailer-front-mounted pneumatic gladhand;
- FIG. 7 is a diagram of an image showing an occupancy map of voxels generated by the scanning system of Fig. 6 for the trailer front of Fig. 5;
- FIG. 8 is a diagram of an image showing the occupancy map of Fig. 7 including voxels defining an occluded region;
- FIG. 9 is a diagram of an image showing the occupancy map of Fig. 8 including a cleared region of voxels in which the AV-truck-mounted robotic arm for connecting pneumatic lines with the trailer can safely operate based upon pointclouds derived by an end-of-arm environment 3D sensor;
- FIG. 10 is a flow diagram of an overall procedure for generating an updated occupancy map in accordance with Fig. 9;
- FIG. 11 is a flow diagram of an occlusion map expansion procedure for increasing a margin of operational safety within the updated occlusion map of Fig. 9.
- FIG. 1 shows an aerial view of an exemplary shipping facility 100, in which over-the-road (OTR) trucks (tractor trailers) deliver goods-laden trailers from remote locations and retrieve trailers for return to such locations (or elsewhere — such as a storage depot).
- The OTR transporter arrives with a trailer at a destination's guard shack (or similar facility entrance checkpoint) 110.
- The guard/attendant enters the trailer information (trailer number or QR (ID) code scan; embedded information already in the system would typically include trailer make/model/year/service connection location, etc.) into the facility software system, which is part of a server or other computing system 120, located offsite, or fully or partially within the facility building complex 122 and 124.
- The complex 122, 124 includes perimeter loading docks (located on one or more sides of the building), associated (typically elevated) cargo portals and doors, and floor storage, all arranged in a manner familiar to those of skill in shipping, logistics, and the like.
- The guard/attendant would then direct the driver to deliver the trailer to a specific numbered parking space in a designated staging area 130, shown herein as containing a large array of parked, side-by-side trailers 132, arranged as appropriate for the facility's overall layout.
- The trailer's data and parked status are generally updated in the company's integrated yard management system (YMS), which can reside on the server 120 or elsewhere.
- The (i.e. loaded) trailer in the staging area 130 is hitched to a yard truck/tractor, which, in the present application, is arranged as an autonomous vehicle (AV).
- When the trailer is designated to be unloaded, the AV yard truck is dispatched to its marked parking space in order to retrieve the trailer.
- As the yard truck backs down to the trailer, it uses one or multiple mounted cameras (e.g. a standard or custom, 2D grayscale or color-pixel, image sensor-based camera) and/or other associated (typically 3D/range-determining) sensors, such as GPS receiver(s), radar, LiDAR, stereo vision, time-of-flight cameras, ultrasonic/laser range finders, etc., to assist in: (i) confirming the identity of the trailer through reading the trailer number or scanning a QR, bar, or other type of coded identifier; and (ii) aligning the truck's connectors with the corresponding trailer receptacles.
- Such connectors include, but are not limited to, the cab fifth (5th) wheel-to-trailer kingpin, pneumatic lines, and electrical leads.
- The cameras mounted on the yard truck can also be used to perform a trailer inspection, such as checking for damage, confirming tire inflation levels, and verifying other safety criteria.
- The hitched trailer is hauled by the AV yard truck to an unloading area 140 of the facility 100. It is backed into a loading bay in this area, and the opened rear is brought into close proximity with the portal and cargo doors of the facility. Manual and automated techniques are then employed to offload the cargo from the trailer for placement within the facility 100. During unloading, the AV yard truck can remain hitched to the trailer or can be unhitched so the yard truck is available to perform other tasks.
- After unloading, the AV yard truck eventually removes the trailer from the unloading area 140 and either returns it to the staging area 130 or delivers it to a loading area 150 in the facility 100.
- The trailer, with rear swing (or other type of) door(s) open, is backed into a loading bay and loaded with goods from the facility 100 using manual and/or automated techniques.
- The AV yard truck can again hitch to, and haul, the loaded trailer back to the staging area 130 from the loading area 150 for eventual pickup by an OTR truck.
- Appropriate data tracking and management is undertaken at each step in the process using sensors (described below) on the AV yard truck and/or other manual or automated data collection devices — for example, terrestrial and/or aerial camera drones.
- Figs. 2-4 show an exemplary AV yard truck 200 for use herein.
- The yard truck 200 is powered by diesel or another internal combustion fuel, or (more typically) electricity, using an appropriate rechargeable battery assembly that can operate in a manner known to those of skill.
- The AV yard truck is powered by rechargeable batteries, but it is contemplated that any other motive power source (or a combination thereof) can be used to provide mobility to the unit.
- The yard truck 200 includes at least a driver's cab section 210 (which can be omitted in a fully autonomous version) with a steering wheel (along with other manual controls), and a chassis 220 containing front steerable wheels 222 and at least one pair of rear, driven wheels 224 (shown herein as a double-wheel arrangement for greater load-bearing capacity).
- The respective chassis 220 also includes a so-called fifth (5th) wheel 240 that is arranged as a horseshoe-shaped pad with a rear-facing slot 244, which is sized and arranged to receive the kingpin hitch located at the bottom front (510 in Fig. 5) of a standard trailer 500.
- Various fifth wheel-lifting mechanisms can be provided, which employ appropriate hydraulic lifting actuators/mechanisms known to those of skill so that the hitched trailer is raised at its front end. In this raised orientation, the hitch between the truck and trailer is secured.
- The AV yard truck can include a variety of custom or commercially available remote sensors and/or autonomous driving sensing arrangements (e.g., those available from vendors such as Velodyne Lidar, Inc. of San Jose, CA), including, but not limited to, GPS, LiDAR, radar, image-based (e.g. machine vision), inertial guidance, and ultrasonic sensors, that allow it to navigate through the yard and hitch-to/unhitch-from a trailer in an autonomous manner that is substantially or completely free of human intervention.
- Such lack of human intervention can be with the exception, possibly, of issuing an order to retrieve or unload a trailer, although such an order can also be provided by the YMS via the server 120 using a wireless data transmission 160 (Fig. 1).
- The exemplary AV yard truck 200 includes a novel top-mounted bar 250 that carries various sensors (e.g. visual imaging sensors and LiDAR) in a manner that affords a desirable line of sight.
- Visual sensors 252 are provided on ends of the bar 250, along with a rear visual sensor 310 (Fig. 3).
- The processing components 410 (also termed "processor") for various sensing telemetry can be housed in the cab roof cap 420, which is wedge-shaped in this embodiment. It can include a cooling (e.g. fan) unit 430 and appropriate heat sinks to remove excess heat generated by data processing, storage and transceiver components. As also shown, the processor(s) 410 receive and transmit data and commands 440 via an RF link 450 as described above.
- The AV yard truck 200 includes an emergency brake pneumatic hose (typically red) 340 (shown in phantom in Fig. 3), a service brake pneumatic hose (typically blue, not shown) and an electrical line (often black, not shown), that extend from the rear of the cab 210.
- Control of the truck 200 can be implemented in a self-contained manner, entirely within the processor 410, which receives mission plans and decides on appropriate maneuvers (e.g. start, stop, turn, accelerate, brake, move forward, reverse, etc.).
- Control decisions/functions can be distributed between the processor 410 and a remote-control computer, e.g. the server 120, that computes control operations for the truck and transmits them back as data to be operated upon by the truck's local control system.
- Control of the truck's operation, based on a desired outcome, can be distributed appropriately between the local processor 410 and the facility system server 120.
- The AV truck chassis 220, rearward of the cab 210, includes an area in front of the fifth wheel 240 that supports a multi-axis robotic manipulator arm assembly 270 that moves in three dimensions (e.g., with 7 degrees of freedom (DOF)) in a programmed path according to conventional robotic behavior.
- The arm assembly 270 is mounted on a track 450 that enables powered, lateral motion across the width of the chassis 220.
- The arm assembly 270 can be based upon a conventional robot, such as the GP7, available from Yaskawa America, Inc. of Waukegan, IL.
- The end of the arm assembly can include a customized end effector assembly that is arranged to selectively grasp a native gladhand 520 (Fig. 5), either directly with a corresponding gladhand (i.e. an adapterless implementation) or via a gladhand-engaging adapter on the end of the hose 340.
- Other connections can be made by the robotic arm, e.g. between the service brake lines and/or the electrical connections, using appropriate motion control and adapters. More generally, the attachment of AV truck pneumatic lines to various types of native gladhands is shown and described in the above-incorporated, commonly-assigned U.S. Patent Application Serial No. 17/009,620, now U.S. Published Application No. US-2021-0053407-A1.
- The end effector 274 can define a variety of shapes and functions, depending upon the nature of the task and the type of adapter used to connect the AV truck pneumatic line to the native gladhand on the trailer front.
- The number of axes and motion capability of the arm 270 is highly variable, depending upon the nature of the task and the relative location of the robot versus the trailer gladhand.
- The robot 270 is positioned on the chassis 220 in such a manner that it can be stowed without (free of) interfering with normal turning of the trailer on its kingpin when hitched to the AV yard truck 200.
- The track 450 can be angled rearwardly from one side to the other to help facilitate forward stowage of the robot 270 when not in use (as shown).
- The robot arm 270 moves under the control of a processor arrangement 460 that can be contained within the robot housing or (in whole or in part) provided as part of the overall processing arrangement 410. Note that any of the processing functions herein can be performed in a stand-alone fashion on the AV yard truck 200, or can be partially performed remotely by the server 120 for the yard.
- FIG. 6 shows a sensing system, which consists of two primary sensors, an external environment scanner 610 (which includes the above-described rear cab visual sensor 310) and an end-of-arm (270) environmental scanner 620. Both scanners 610 and 620 are capable of generating 3D pointclouds of an imaged scene within a respective 3D field of view/region of interest FOV1 and FOV2. These pointclouds are transmitted to the processor arrangement, and more particularly to an environmental sensing process(or) 630, that can be instantiated, in whole or part, in the processor arrangement 410 and/or on the remote server.
- Pointcloud data in the external environment processor 630 is handled by various functional modules that can be implemented using hardware, software (comprising a computer-readable medium of non-transitory program instructions), or a combination of hardware and software.
- The functional processor/process modules 632, 634 and 636 depicted in the processor 630 are exemplary of a variety of organizations of processors and/or processing tasks that can be implemented according to skill in the art.
- The processor includes various 2D and 3D vision system tools 632, adapted to derive information from 3D pointclouds. These can include surface contour tools, edge finders, 3D blob tools, trained deep learning tools, etc. Appropriate setup and calibration functions can also be included.
- A data synchronization processor 634 coordinates data from each scanner 610, 620 as described below, and a motion processor 636 coordinates movement of the robot end effector 274 relative to the visual environment.
- Data in the form of motion commands 640 and position feedback 642 are exchanged with the robot’s motion control processor 460 to guide and track robot arm motion between a stowed position and various engaged positions relative to the trailer gladhand.
- The exemplary trailer front 530 is part of a reefer unit that includes an overhanging bulge or blister housing 550 that terminates at a bottom edge 552, just above the gladhand 520.
- The robot arm assembly 270 and end effector 274 must navigate this restricted space to appropriately engage the gladhand 520.
- The cab-mounted scanner 610 can be constructed using any acceptable 3D sensing arrangement.
- The cab-mounted scanner 610 consists of a SICK TiM-561 2D LiDAR and a FLIR PTU-5 (movable) pan-tilt unit (PTU).
- The end-of-arm environment scanner 620 can be a conventional depth camera based on (e.g.) active stereoscopic vision.
- The end-of-arm scanner can comprise a RealSense™ depth camera module commercially available from Intel.
- The process herein can utilize an "occupancy map" to decompose the space of interest into a set of finite-sized (3D) voxels. For each voxel there is a parameter that describes the probability of that voxel being occupied or not.
- A probabilistic update rule and probabilistic sensor model are used to update each individual voxel's occupancy probability as successive pointclouds are added to the occupancy map. In general, if a pointcloud added to the occupancy map has no points within a given voxel, that will lower the voxel's probability of occupation; points within a voxel will increase its probability of occupation.
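The probabilistic update rule can be illustrated with the standard log-odds formulation used by octree-based occupancy maps: each observation adds a fixed hit or miss increment to the voxel's log-odds, which maps back to a probability. The hit/miss sensor-model probabilities below (0.7 and 0.4) are illustrative assumptions, not values from the disclosure.

```python
# Log-odds occupancy update for a single voxel (OctoMap-style sketch).
# Sensor-model probabilities are illustrative placeholders.
import math

L_HIT = math.log(0.7 / 0.3)   # increment when a scan reports a point in the voxel
L_MISS = math.log(0.4 / 0.6)  # decrement when a scan sees through the voxel

def update_voxel(log_odds, hit):
    """Fold one pointcloud observation into the voxel's log-odds."""
    return log_odds + (L_HIT if hit else L_MISS)

def occupancy_probability(log_odds):
    """Convert log-odds back to a probability of occupation."""
    return 1.0 - 1.0 / (1.0 + math.exp(log_odds))
```

Starting from the uninformed prior (log-odds 0, probability 0.5), a few consecutive hits drive the probability well above 0.9, while misses drive it back down, which is exactly the behavior described in the bullet above.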
- The data structures and algorithms for storing, querying, and updating the occupancy map are well known, and many papers have been published on the topic.
- The exemplary implementation herein utilizes the well-known open-source library OctoMap, released along with the paper, Armin Hornung, et al., OctoMap: An Efficient Probabilistic 3D Mapping Framework Based on Octrees, Autonomous Robots (2013), which is incorporated herein by reference as useful background information.
- FIG. 7 shows an image 700 of the reefer trailer 500 of Fig. 5, along with the occupancy map produced by the external environment scanning system when the robot arm 270 attempts to connect to that trailer. Note the large area 710 of voxels below the refrigeration unit 550 that appear unoccupied even though they are occupied in real life. They appear as unoccupied in the map because they are occluded by the refrigeration unit 550 itself.
- The illustrative embodiment herein provides a technique to modify the raw occupancy map 700 of Fig. 7 in order to define a more conservative motion plan for the robot arm 270 in response to these occlusions. If the point-of-view of the sensor that produced a pointcloud is known, it is possible to compute which voxels in the occupancy map were occluded when that pointcloud was generated. In order to be safe with respect to these occluded voxels, the probability of occlusion is maximized in the computation. Not all voxels are considered occluded by the occupancy mapping process; constraints are thus applied to the set, as otherwise the occupancy map could extend into infinite space.
- A radius of analysis is defined: occluded voxels within that radius are assumed to be occupied, and all occluded voxels beyond that radius are unchanged.
- The process assumes that any voxel that is not visible to the scan within the radius is occupied, to prevent the motion planning algorithms from attempting to treat unknown space as free space.
- This capability can be handled on-demand.
- Given the current state of the occupancy map it is possible to fill all occluded voxels given a sensor point-of-view and a radius of analysis. A visualization of this operation is shown in the respective images 800 and 900 of Figs. 8 and 9.
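The occlusion-filling operation can be sketched on a simplified 2D grid: every free cell within the radius of analysis whose line of sight from the sensor passes through an occupied cell is conservatively marked occupied, while cells beyond the radius are left unchanged. This is a hypothetical sketch; a real implementation would raycast through the octree rather than sample points along each line.

```python
# Conservative occlusion filling on a 2D grid (illustrative sketch).
# Cells are integer (x, y) tuples; `occupied` is a set of such cells.
import math

def fill_occlusions(occupied, sensor, radius, grid):
    """Return `occupied` plus every grid cell occluded from `sensor`."""
    filled = set(occupied)
    for cell in grid:
        if cell in occupied:
            continue
        dx, dy = cell[0] - sensor[0], cell[1] - sensor[1]
        dist = math.hypot(dx, dy)
        if dist > radius or dist == 0:
            continue  # beyond the analysis radius: leave unchanged
        steps = int(dist * 10)
        for s in range(1, steps):  # sample along the sensor-to-cell ray
            t = s / steps
            probe = (round(sensor[0] + t * dx), round(sensor[1] + t * dy))
            if probe != cell and probe in occupied:
                filled.add(cell)   # line of sight blocked: assume occupied
                break
    return filled
```

For example, with the sensor at the origin and an occupied cell at (2, 0), the cell at (4, 0) is shadowed and becomes occupied, while (0, 2), which the sensor sees directly, stays free.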
- The occlusion-filling process fills the occupancy map shown in the image 700 of Fig. 7 with candidate occlusions (lower region 810) from the point-of-view of the external environment scanner 610 on top of the cab (defining an XYZ coordinate system, with X being perpendicular to the page) up to a defined radius to produce the overall occupancy map.
- This occlusion-filled occupancy map is essentially too conservative, in the sense that there are voxels considered to be occupied that are actually free, and this condition will unduly restrain the initial motion of the robot arm 270 and end effector 274 with the associated end-of-arm scanner 620.
- The selective updating process adds pointclouds derived from the end-of-arm environment scanner 620 and thereby updates the occupancy probability.
- The end-of-arm scanner 620 moves into a position to image the region (810 in Fig. 8) that was filled with occlusion candidates.
- As voxels that were previously occluded and filled become visible to the end-of-arm scanner 620, their occlusion probabilities are updated accordingly and will eventually reflect unoccupied voxels.
- An area 910 of voxels below the overhanging refrigeration unit 550 that were previously marked occupied is now cleared through the addition of end-of-arm pointclouds.
- In step 1010, the procedure 1000 generates and stores an initial occupancy map with voxels from the cab-mounted environment scanner (PTU) 610. Then, in step 1020, the robot arm is moved, based upon the location of the occluded region and under control of the robot control processor, so as to image that occluded region, moving the arm from its stowed position. The arm initially moves conservatively based upon the initial occluded occupancy map and uses its end-of-arm scanner to create 3D pointclouds of the occluded region.
- The pointclouds generated by the end-of-arm scanner 620 allow for continual update of the occupancy map (step 1040) by adding pointclouds from the end-of-arm environmental scanner in regions of interest around the target objects.
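Adding a pointcloud to the occupancy map begins by binning each 3D point into a voxel index. A minimal sketch is below; the 0.05 m voxel size is chosen arbitrarily for illustration and is not a value from the disclosure.

```python
# Bin metric 3D points into integer voxel indices (illustrative sketch).
def pointcloud_to_voxels(points, voxel_size=0.05):
    """Map 3D points (in metres) to the set of voxel indices they occupy."""
    return {
        (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        for (x, y, z) in points
    }
```

Each resulting index would then receive a "hit" in the probabilistic update, while voxels traversed by the ray from the sensor to the point would receive "misses."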
- the occupancy map is defined to have a minimum voxel size.
- in the real world, due to a variety of factors such as sensor noise or imperfect calibration, it is possible to end up with voxels that are not occupied even though they should be; for example, if a boundary of a particular voxel happens to be very close to a real-world object. If the scanning sensor exhibits a small amount of noise, then it is possible that, from one scan to the next, a point in the pointcloud from that real-world object could jump from one side of the voxel boundary to the other.
- the updated occupancy map is provided to the expansion process (step 1110), and is modified to expand all occupied voxels (decision step 1120 — if unoccupied, then search for next voxel in step 1130) by a given number of neighboring voxels (step 1140).
- if a particular voxel is considered to be occupied (via decision step 1120), the process takes into account neighboring cells within a radius and updates their occupancy probability to match that of the starting cell. It runs until all voxels of interest have been expanded (step 1150). In this manner, the expansion process can inflate the occupancy map in all directions to provide a degree of padding around detected obstacles (step 1160). The process also tracks which voxels were modified by this operation, and thus can undo the expansion in part or in its entirety.
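The expansion and undo behavior described in steps 1120-1160 can be sketched as follows (a simplified, assumed implementation over a dictionary-based voxel map; the function names are hypothetical):

```python
def expand(occupancy, radius):
    """occupancy maps (x, y, z) voxel indices to occupancy probability.
    Returns the inflated map plus an undo log of the modified voxels."""
    inflated = dict(occupancy)
    undo_log = {}  # voxel -> probability before inflation
    occupied = [(v, p) for v, p in occupancy.items() if p >= 0.5]
    for (x, y, z), p in occupied:
        # Copy the occupied voxel's probability onto all neighbors within
        # the given radius, providing padding around detected obstacles.
        for dx in range(-radius, radius + 1):
            for dy in range(-radius, radius + 1):
                for dz in range(-radius, radius + 1):
                    n = (x + dx, y + dy, z + dz)
                    if inflated.get(n, 0.0) < p:  # only ever raise probabilities
                        undo_log.setdefault(n, occupancy.get(n, 0.0))
                        inflated[n] = p
    return inflated, undo_log

def undo(inflated, undo_log):
    """Reverse the expansion (in whole) using the tracked modifications."""
    for v, p in undo_log.items():
        inflated[v] = p
    return inflated
```

Because each modified voxel's prior value is recorded in the undo log, the inflation can be reversed selectively (a subset of the log) or entirely, as the description notes.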
- the external environmental scanning system 610 consists of a controllable pan-tilt unit (PTU) and a 2D LiDAR.
- since the process is expanding and occlusion-filling this map to provide robustness, it can move the PTU relatively quickly and yield a less-dense pointcloud. This saves execution time and processing power. However, in situations where a highly dense pointcloud is desired, it can choose to move the PTU much more slowly and in a different region. For example, if the goal is to build an accurate 3D representation of a particular feature on the trailer (e.g. for planning motions very close to the feature or for 6D pose estimation of the feature), the process can choose to move the scanner 610 slowly over a narrow region to build a high-density, narrow-field-of-view pointcloud.
- the hose connection system and method dictates several extrinsic calibrations that should be undertaken for each vehicle and provided to the controlling processes herein. These calibrations include: (a) the 3D pose of the end-of-arm environmental scanner relative to the arm's tool-center point, which can be termed the camera calibration; and (b) the pose of the external environmental scanner relative to a fixed frame on the vehicle, which can be termed the PTU calibration.
- the above calibrations are desired for obstacle detection and obstacle avoidance (OD/OA). They also form a critical component of the perception systems that determine the 6D poses of various target objects in the world (tools, gladhands, etc.).
- the process can access a toolbox that utilizes an arm motion generation algorithm, a perception system, and an optimization procedure.
- Such procedures are known in the art as 3D hand-eye calibration based upon a global/world coordinate space.
- Such techniques can be largely conventional in application, and/or custom techniques can be employed to supplement conventional calibration techniques.
- the system and method employs a plurality of fiducial sets with well-known geometry. These are described generally in the above-incorporated U.S. Patent applications.
- the process leverages these fiducial sets and the vehicle’s sensor systems in such a way that it can continuously monitor camera calibration.
- tool variants include a set of ArUco markers with specified relative poses.
- the process should determine the pose of the tool on the face of a trailer.
- the robot arm 270 is moved using an algorithm informed by the current best-estimate of the tool pose, and as the arm moves, the process generates and stores new images and updates this best estimate.
- this motion and estimation loop terminates and the process achieves an improved knowledge of the pose of the tool (within some uncertainty bounds).
- This operation requires the camera calibration.
- the collected data from this operation is stored, and is used in parallel during runtime operation of the arm, to compute statistical assessments of the camera calibration itself.
- this continuous calibration update process allows the system to determine if the camera calibration has drifted and requires a re-calibration procedure. Such re-calibration can be carried out by a user based upon an alert, or can be automatically performed using certain known-position/ orientation fiducials.
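The statistical assessment that drives this drift alert can be sketched, under assumptions, as a running-window monitor over calibration residuals (the class name, window size, and threshold are illustrative, not from the disclosure):

```python
from statistics import mean

class CalibrationMonitor:
    """Illustrative drift monitor: residuals between predicted and observed
    fiducial poses are accumulated, and a re-calibration alert is raised when
    the running mean exceeds a threshold."""

    def __init__(self, window=20, threshold=0.01):
        self.window = window          # number of recent residuals to consider
        self.threshold = threshold    # e.g. metres of pose/reprojection error
        self.residuals = []

    def add_residual(self, r):
        """Record one residual; return True if re-calibration is recommended."""
        self.residuals.append(r)
        self.residuals = self.residuals[-self.window:]
        return (len(self.residuals) == self.window
                and mean(self.residuals) > self.threshold)
```

In operation, such a monitor could run in parallel with normal arm motions, consuming the stored fiducial observations and triggering either a user alert or an automatic re-calibration pass when it returns True.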
- the arm can be moved to a region where the external environment scanner 610 is able to build high resolution scans of a known portion of the arm.
- the arm has internal encoders/steppers that provide feedback for such known position.
- a calibrated corner cube grasped by the grippers on the end effector 274 can be used.
- the actual gripper finger structure can be located for calibration free of a separate calibration object.
- A. Force Switch Servoing The following arm guidance operations can be undertaken by the system and method employing an external switch: (a) moving the arm along a trajectory until a rising or falling edge on the external switch is sensed; (b) moving the arm along a trajectory whose speed is controlled by wrench readings from an end-of-arm force-torque sensor; (c) moving the arm along a predetermined trajectory while monitoring end-effector wrenches and stopping the arm if it is determined that there is a risk of causing the robot controller to fault; (d) moving the arm along a predetermined trajectory to produce a target end-effector wrench; and/or (e) stopping the motion for any of procedures (a)-(d), above, if the arm's trajectory has exceeded distance thresholds.
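Guidance mode (a), combined with the distance threshold of mode (e), can be sketched as follows (the callbacks standing in for the real arm and switch interfaces are hypothetical):

```python
def servo_until_rising_edge(waypoints, read_switch, max_distance, step_length=1.0):
    """Step along a trajectory until a rising edge on the external switch is
    sensed. Returns the waypoint index of the edge, or None if the travelled
    distance exceeds the threshold or the trajectory ends without an edge."""
    previous = read_switch()
    travelled = 0.0
    for i, _ in enumerate(waypoints):
        travelled += step_length
        if travelled > max_distance:
            return None  # mode (e): distance threshold exceeded, stop the motion
        current = read_switch()
        if current and not previous:  # rising edge detected
            return i
        previous = current
    return None
```

The same loop structure extends naturally to modes (b)-(d) by replacing the switch read with force-torque wrench readings and adjusting step speed or stop conditions accordingly.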
- the adapterless tool can include a pivoting mechanism that allows grabbing a spring-loaded gladhand wedge from the back, as described generally in the above-incorporated U.S. Patent Application Serial No. 17/009,620, now U.S. Published Application No. US-2021-0053407-A1.
- As the arm is used to rotate the spring-loaded gladhand away from a trailer face to expose its sealing gland, it is also used to rotate the tool’s pivoting mechanism to change the tool’s state.
- This set of rotations (exposing the gladhand and rotating the tool’s pivot) can be accomplished sequentially in either order or they can be accomplished at the same time.
- the motion of the arm, specifically the instantaneous center of rotation of the gripper fingers projected onto the plane defined by the tool's pivot axis and the gladhand's rotation axis, defines how much rotation occurs around the gladhand versus the tool pivot.
- Setting the instantaneous center to be coincident with the gladhand rotation axis will only extract/expose the gladhand, and setting it to be coincident with the tool pivot point will only rotate the tool. Setting the center elsewhere will allow rotation of both the retractable gladhand and the tool.
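This behavior can be illustrated with a simplified planar two-hinge model in which the instantaneous center, gladhand rotation axis, and tool pivot are expressed as 1D coordinates along the line joining the two axes (an assumed lever-rule simplification of the full kinematics; the function name is hypothetical):

```python
def rotation_split(center, gladhand_axis, tool_pivot, total_rate):
    """Allocate a commanded rotation rate between the gladhand hinge and the
    tool pivot, based on where the instantaneous center lies along the line
    joining the two axes. Placing the center on the gladhand axis rotates only
    the gladhand; placing it on the tool pivot rotates only the tool."""
    span = tool_pivot - gladhand_axis
    gladhand_rate = total_rate * (tool_pivot - center) / span
    tool_rate = total_rate * (center - gladhand_axis) / span
    return gladhand_rate, tool_rate
```

For a center placed between the two axes, both rotation rates are nonzero and sum to the commanded total, matching the description of simultaneously rotating the retractable gladhand and the tool.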
- the center should not be set arbitrarily; it should adhere to the constraints imposed by the now-closed system. In other words, if the arm is moved along constraint-incompatible directions, such motion can cause undesirable reaction forces in the gladhand and/or tool.
- a motion planning technique is employed by the system and method that develops collision-free paths that avoid singularities and joint limits by exploring and exploiting the following freedoms:
- the (e.g.) 7-DOF arm system has an infinite number of inverse kinematics (IK) solutions for aligning the tool's fingers with the gladhand wedge — some of these solutions may be infeasible for accomplishing the subsequent rotations;
- the gladhand expose angle only needs to expose the gladhand enough to allow the switched state tool to be able to clamp the face — for various gladhand poses and corresponding IK solutions, increasing or decreasing the gladhand exposure angle (as long as it is above its minimum value) may help to avoid collisions, singularities, etc.;
- the set of constraint-compatible motions between the initial gladhand and tool angles and the final tool and gladhand angles is infinite. Freedom of choice in this set may help to avoid collisions, singularities, and joint limits.
- arm tools can be bistable, and thereby allow the tool’s wedge capture fingers to be positioned at two separate orientations relative to the robot’s standard gripper fingers where they grasp the tool.
- the capture fingers also allow the capture of the wedge from two different approach angles.
- a process can determine if the trailer employs a fixed gladhand type or a rotational gladhand type through any of the following procedures: (a) probabilistic classification and/or vector machine logic, including deep learning-trained gladhand classifiers; (b) simple perception algorithms that project detected gladhand wedge poses into truck fixed frames and compute whether the gladhand is close to the trailer face (indicating a rotational gladhand) or projecting out from the trailer front (indicating a fixed gladhand); and/or (c) remote assist mechanism(s) that allow a remote operator to classify a gladhand as fixed or rotational.
- the above techniques (a) - (c) can be used in conjunction with each other, or selectively as fallback techniques if any particular technique fails to yield a desired outcome. For example, if a deep learning model is able to classify with high confidence, such can be employed. If the model cannot successfully classify (which sometimes occurs when the same gladhand body can be rotational or fixed) then the system process can fallback to geometric perception techniques. If those techniques produce an ambiguous result, then the system process can utilize a remote assist request as a final fallback.
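The fallback ordering described above can be sketched as a simple chain of classifier stages, each returning a confident label or None when inconclusive (the stage callables are hypothetical placeholders for the deep learning model, the geometric check, and the remote assist request):

```python
def classify_gladhand(deep_model, geometric_check, remote_assist):
    """Run classifier stages in order; later stages act only as fallbacks.
    Each stage returns "fixed", "rotational", or None if inconclusive."""
    for stage in (deep_model, geometric_check, remote_assist):
        label = stage()
        if label is not None:
            return label
    raise RuntimeError("gladhand type could not be classified")
```

Because each stage is an opaque callable, the same chain accommodates a high-confidence deep model result, an ambiguous geometric projection, or a final remote-operator decision without changing the control flow.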
- whether the gladhand is rotational or fixed can dictate the subsequent steps in the line connection procedure. This includes modifications to how the robotic arm and tool are unstowed, which approach angles are used for travel to the gladhand, and which motion planning and execution algorithms are used to actually accomplish the connection sequence.
- various directional and orientational terms such as “vertical”, “horizontal”, “up”, “down”, “bottom”, “top”, “side”, “front”, “rear”, “left”, “right”, “forward”, “rearward”, and the like, are used only as relative conventions and not as absolute orientations with respect to a fixed coordinate system, such as the acting direction of gravity.
- a depicted process or processor can be combined with other processes and/or processors or divided into various sub-processes or processors. Such sub-processes and/or sub-processors can be variously combined according to embodiments herein.
- any function, process and/or processor herein can be implemented using electronic hardware, software consisting of a non-transitory computer-readable medium of program instructions, or a combination of hardware and software.
- qualifying terms such as “substantially” and “approximately” are contemplated to allow for a reasonable variation from a stated measurement or value, and can be employed in a manner such that the element remains functional as contemplated herein (for example, a 1-5 percent variation). Accordingly, this description is meant to be taken only by way of example, and not to otherwise limit the scope of this invention.
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Robotics (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Orthopedic Medicine & Surgery (AREA)
- Transportation (AREA)
- Manipulator (AREA)
Abstract
A system and method for enabling movement of a robotic manipulator on an AV truck relative to a native gladhand on the front of a trailer, which represents and builds a model of that free space on the fly, in the manner of an obstacle detection and obstacle avoidance (OD/OA) system and process. A robotic arm on an AV truck is arranged to connect a pneumatic line to a gladhand on the front of the trailer. A first 3D sensor generates a pointcloud, and is located at an elevated position on the truck to image the front of the trailer. A second 3D sensor, located adjacent to an end of the robotic arm, also generates pointclouds during robot motion. An occlusion mapping process generates an occlusion map of the trailer front, and a map update process updates the occlusion map to add and remove voxels therefrom so as to allow safe guidance of the robot.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/729,305 US20220371199A1 (en) | 2018-02-21 | 2022-04-26 | System and method for connection of service lines to trailer fronts by automated trucks |
US17/729,305 | 2022-04-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023211967A1 true WO2023211967A1 (fr) | 2023-11-02 |
Family
ID=88519652
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2023/019845 WO2023211967A1 (fr) | 2022-04-26 | 2023-04-25 | System and method for connection of service lines to trailer fronts by automated trucks
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2023211967A1 (fr) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170165839A1 (en) * | 2015-12-11 | 2017-06-15 | General Electric Company | Control system and method for brake bleeding |
US20170305694A1 (en) * | 2014-10-03 | 2017-10-26 | Wynright Corporation | Perception-Based Robotic Manipulation System and Method for Automated Truck Unloader that Unloads/Unpacks Product from Trailers and Containers |
US20210053407A1 (en) * | 2018-02-21 | 2021-02-25 | Outrider Technologies, Inc. | Systems and methods for automated operation and handling of autonomous trucks and trailers hauled thereby |
US20210192784A1 (en) * | 2018-09-04 | 2021-06-24 | Fastbrick Ip Pty Ltd. | Vision system for a robotic machine |
US20220080584A1 (en) * | 2020-09-14 | 2022-03-17 | Intelligrated Headquarters, Llc | Machine learning based decision making for robotic item handling |
US20220371199A1 (en) * | 2018-02-21 | 2022-11-24 | Outrider Technologies, Inc. | System and method for connection of service lines to trailer fronts by automated trucks |
- 2023
- 2023-04-25 WO PCT/US2023/019845 patent/WO2023211967A1/fr unknown
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170305694A1 (en) * | 2014-10-03 | 2017-10-26 | Wynright Corporation | Perception-Based Robotic Manipulation System and Method for Automated Truck Unloader that Unloads/Unpacks Product from Trailers and Containers |
US20170165839A1 (en) * | 2015-12-11 | 2017-06-15 | General Electric Company | Control system and method for brake bleeding |
US20210053407A1 (en) * | 2018-02-21 | 2021-02-25 | Outrider Technologies, Inc. | Systems and methods for automated operation and handling of autonomous trucks and trailers hauled thereby |
US20220371199A1 (en) * | 2018-02-21 | 2022-11-24 | Outrider Technologies, Inc. | System and method for connection of service lines to trailer fronts by automated trucks |
US20210192784A1 (en) * | 2018-09-04 | 2021-06-24 | Fastbrick Ip Pty Ltd. | Vision system for a robotic machine |
US20220080584A1 (en) * | 2020-09-14 | 2022-03-17 | Intelligrated Headquarters, Llc | Machine learning based decision making for robotic item handling |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220371199A1 (en) | System and method for connection of service lines to trailer fronts by automated trucks | |
US11707955B2 (en) | Systems and methods for automated operation and handling of autonomous trucks and trailers hauled thereby | |
US11782436B2 (en) | Systems and methods for automated operation and handling of autonomous trucks and trailers hauled thereby | |
US10875448B2 (en) | Visually indicating vehicle caution regions | |
US11865726B2 (en) | Control system with task manager | |
US10754350B2 (en) | Sensor trajectory planning for a vehicle | |
US11312018B2 (en) | Control system with task manager | |
US10108194B1 (en) | Object placement verification | |
US20220241975A1 (en) | Control system with task manager | |
CA3193473A1 (fr) | Systems and methods for automated operation and handling of autonomous trucks and trailers hauled thereby | |
WO2023211967A1 (fr) | System and method for connection of service lines to trailer fronts by automated trucks |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23797163 Country of ref document: EP Kind code of ref document: A1 |