US9383752B2 - Railway maintenance device - Google Patents
Railway maintenance device
- Publication number
- US9383752B2 (application US14/074,945)
- Authority
- US
- United States
- Prior art keywords
- vehicle
- component
- rail
- location
- image data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B61—RAILWAYS
- B61G—COUPLINGS; DRAUGHT AND BUFFING APPLIANCES
- B61G7/00—Details or accessories
- B61G7/04—Coupling or uncoupling by means of trackside apparatus
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J5/00—Manipulators mounted on wheels or on carriages
- B25J5/005—Manipulators mounted on wheels or on carriages mounted on endless tracks or belts
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J5/00—Manipulators mounted on wheels or on carriages
- B25J5/007—Manipulators mounted on wheels or on carriages mounted on wheels
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0227—Control of position or course in two dimensions specially adapted to land vehicles using mechanical sensing means, e.g. for sensing treated area
- G05D1/0229—Control of position or course in two dimensions specially adapted to land vehicles using mechanical sensing means, e.g. for sensing treated area in combination with fixed guiding means
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B61—RAILWAYS
- B61C—LOCOMOTIVES; MOTOR RAILCARS
- B61C17/00—Arrangement or disposition of parts; Details or accessories not otherwise provided for; Use of control gear and control systems
- B61C17/12—Control gear; Arrangements for controlling locomotives from remote points in the train or when operating in multiple units
Definitions
- the disclosure relates generally to a robotic system, and more particularly to processing objects, such as rail vehicles, using a robotic system.
- railroad wheels are subjected to high, long-term stresses. Despite being made of high-quality steel, the stresses cause the wheels to become worn over a long period of operation. Without maintenance, a wheel can become too thin or otherwise no longer of the correct geometry. Further, the wheels may develop other defects, such as, for example, a “slid flat” or “flat spot”, which is caused by locking the wheels with the brakes in an attempt to stop.
- the wheels of railroad cars and locomotives cannot turn differentially since they are affixed to solid axles. As a result, any difference between the shape and/or size of the wheels on either side of a car/locomotive can cause a tendency to turn, leading to an increased possibility of derailment. Therefore, it is important to periodically inspect the wheels on railroad cars and locomotives to ensure that they remain safe to operate, both as an individual wheel and as a pair of wheels on the same axle.
- a J-shaped, steel wheel gauge is a common approach to inspecting rail wheels.
- an inspector manually places the gauge on the wheel, ensures contact with all relevant portions of the wheel, reads the measurements from marked scales on the gauge, and manually enters the data.
- an electronic wheel gauge can be used, which performs some of the functions automatically, thereby improving accuracy and reducing the overall time spent measuring the wheels.
- Various illustrative embodiments of handheld electronic wheel gauges are shown and described in U.S. Pat. No. 4,904,939, U.S. Patent Application Publication No. 2005/0259273, and U.S. Patent Application Publication No. 2007/0075192, each of which is incorporated by reference.
- rail wheels are inspected at a classification yard (e.g., hump yard, flat-shunted yard, gravity yard, and/or the like).
- an incoming train may be halted while one or more cars are manually inspected, e.g., by a cursory visual inspection.
- the cars on the incoming train are classified and routed to corresponding tracks for inclusion on an outgoing train.
- the classification is performed based on a destination for each car. Once an outgoing train is assembled, one or more cars may be manually (e.g., visually) inspected along with an inspection of the brakes for the train. Subsequently, the train will leave the classification yard for the next destination.
- this approach does not provide a solution capable of performing other operations that may be necessary to completely decouple/couple rail vehicles (e.g., disassembly/assembly of brake hose) and/or may be desired (e.g., security, cleaning, inspection, and/or the like).
- an embodiment provides a robotic vehicle capable of performing routing-related functions of rail vehicles, evaluation of rail vehicles, security, and/or the like, in a rail yard.
- the robotic vehicle can be configured to: perform operations on moving rail vehicles; receive a set of tasks and execute the tasks without further operator intervention; successfully operate despite variations in object geometry, locations, and field conditions (e.g., snow/rain, cluttered scene, etc.); consume low power while operating; and/or the like.
- aspects of the invention provide a robotic vehicle configured for autonomous or semi-autonomous operation in a rail environment.
- the vehicle can process image data to move about the rail environment and perform one or more actions in the rail environment.
- the actions can include one or more actions related to decoupling and/or attaching rail vehicles, and can be implemented by performing three-dimensional image processing.
- the vehicle can be configured to move with any movement of a rail vehicle on which one or more actions are being performed.
- the various components configured to perform the action are implemented at a stationary location with respect to a rail line.
- a first aspect of the invention provides a vehicle configured for autonomous operation, the vehicle comprising: a first imaging device configured to acquire location image data; a transport component configured to enable the vehicle to move independently; a processing component configured to process the location image data acquired by the first imaging device and operate the transport component to move the vehicle using the location image data; a set of action components, the set of action components including at least one component configured to be temporarily secured with respect to a target object external to the vehicle; means for moving the at least one component to temporarily secure the at least one component with respect to the target object; and means for enabling the set of action components to move freely with the target object after the at least one component is temporarily secured.
- a second aspect of the invention provides a railway maintenance device configured for autonomous operation, the device comprising: a multi-link arm, each link capable of independent movement in at least one direction; a set of visualization components configured to capture work region image data for a region including at least one target object; an action component located on a link of the multi-link arm; and a processing component configured to process the work region image data to generate a three-dimensional representation of the region, move at least one of the links to place the component near a target object in the region using the three-dimensional representation, and operate at least one of the arm or the action component to perform a railway maintenance operation.
- a third aspect of the invention provides a rail yard comprising: a railway maintenance device configured for autonomous operation, the device comprising: a multi-link arm, each link capable of independent movement in at least one direction; a set of visualization components configured to capture work region image data for a region including at least one target object; an action component located on a link of the multi-link arm; and a processing component configured to process the work region image data to generate a three-dimensional representation of the region, move at least one of the links to place the component near a target object in the region using the three-dimensional representation, and operate at least one of the arm or the action component to perform a railway maintenance operation, wherein the railway maintenance operation comprises at least one of: decoupling a pair of rail vehicles, detaching a pair of brake hoses, attaching a pair of brake hoses, or acquiring inspection data for a rail vehicle.
- a fourth aspect of the invention provides a method of performing an operation on at least one rail vehicle, the method comprising: locating a vehicle adjacent to the at least one rail vehicle using location image data acquired by an imaging device located on the vehicle and a motor turning at least one of a set of wheels on the vehicle; securing at least one component of the vehicle with respect to a target object on the at least one rail vehicle in response to the locating; disengaging the motor from the at least one of the set of wheels using a clutch in response to the securing; and performing the operation on the at least one rail vehicle subsequent to the disengaging.
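The sequence recited in this fourth aspect (locate the vehicle, secure a component to the target, disengage the drive via the clutch so the wheels roll freely, then perform the operation) can be sketched as a simple controller. The class and method names below are hypothetical illustrations, not part of the patent disclosure:

```python
class MaintenanceSequence:
    """Hypothetical sketch of the claimed sequence: position the vehicle,
    secure a component to the target object, disengage the drive clutch so
    the wheels can move freely with the rail vehicle, then do the work."""

    def __init__(self, drive, clutch, manipulator):
        self.drive = drive          # moves the vehicle (assumed interface)
        self.clutch = clutch        # couples the motor to the wheels
        self.manipulator = manipulator
        self.log = []

    def run(self, target):
        self.drive.move_adjacent_to(target)   # guided by location image data
        self.log.append("located")
        self.manipulator.secure_to(target)    # temporarily grip the target
        self.log.append("secured")
        self.clutch.disengage()               # wheels now roll freely
        self.log.append("clutch_disengaged")
        self.manipulator.perform_operation(target)
        self.log.append("operation_done")
```

The ordering matters: the clutch is released only after the component is secured, so any subsequent motion of the rail vehicle tows the maintenance vehicle rather than fighting its drivetrain.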
- the illustrative aspects of the invention are designed to solve one or more of the problems herein described and/or one or more other problems not discussed.
- FIG. 1 shows a side view of an illustrative robotic vehicle according to an embodiment.
- FIG. 2 shows side and top views of an alternative robotic vehicle according to an embodiment.
- FIG. 3 shows a schematic diagram of an illustrative processing component for a robotic vehicle according to an embodiment.
- FIG. 5 shows an illustrative coupling mechanism, of a type commonly used to couple rail vehicles, and the operation thereof according to an embodiment.
- FIGS. 6A-6D show operation of an illustrative manipulator according to an embodiment.
- FIG. 8 shows illustrative use of the manipulator of FIG. 7 to rotate a brake wheel located on a front/back side of a railway vehicle according to an embodiment.
- FIG. 9 shows a side view of another illustrative robotic vehicle according to an embodiment.
- FIG. 11 shows a top view of a portion of a rail yard according to an embodiment.
- FIG. 12 shows an illustrative user interface panel for a remote user according to an embodiment.
- aspects of the invention provide a robotic vehicle configured for autonomous or semi-autonomous operation in a rail environment.
- the vehicle can process image data to move about the rail environment and perform one or more actions in the rail environment.
- the actions can include one or more actions related to decoupling and/or attaching rail vehicles, and can be implemented by performing three-dimensional image processing.
- the vehicle can be configured to move with any movement of a rail vehicle on which one or more actions are being performed.
- the various components configured to perform the action are implemented at a stationary location with respect to a rail line.
- the term “set” means one or more (i.e., at least one) and the phrase “any solution” means any now known or later developed solution.
- An embodiment provides a robotic vehicle capable of fully autonomous and/or semi-autonomous operation for performing one or more rail-related actions including, but not limited to, rail yard maintenance/inspection, hump track maintenance, hazardous material response, perimeter security, and/or the like.
- Railroads are under constant pressure to reduce costs and provide safer operation. It is desirable to reduce workers' exposure to slip hazards within a rail yard, the need for workers to perform continuous monitoring, the need for workers to perform actions in dangerous conditions, etc. Therefore, a railway solution that incorporates a robotic vehicle capable of fully autonomous and/or semi-autonomous operation can be beneficial to the railway industry.
- FIG. 1 shows a side view of an illustrative robotic vehicle 30 according to an embodiment.
- Vehicle 30 comprises a housing 32 , which holds a power component 34 , a processing component 36 , a positioning component 38 , and a transport component 40 therein and can protect the various components 34 , 36 , 38 , 40 from exposure to elements within an operating environment of vehicle 30 .
- FIG. 1 includes only schematic representations of components 34 , 36 , 38 , 40 and therefore no relative size or location of the various components 34 , 36 , 38 , 40 should be inferred from FIG. 1 .
- Housing 32 can comprise any type of rugged casing that is designed to withstand the rigors of use in an outdoor environment, such as a rail yard.
- housing 32 can include various additional components, such as an environmental monitoring and/or control component, which are not shown and described herein for clarity.
- Power component 34 can comprise a power source, such as a battery, which is capable of providing and distributing sufficient power to operate the various components of vehicle 30 without connection to an external power supply.
- Power component 34 can comprise a power source selected to provide a desired amount of peak operating power and provide such power for a desired amount of operating time. In this case, vehicle 30 can move freely within its operating environment.
- power component 34 comprises a set of lithium polymer batteries capable of providing a peak operating power of approximately 200 Watts for a period of approximately eight hours.
- Power component 34 can be recharged via a battery charger 33 and/or power cord 35 connected to an external power source 37 .
- vehicle 30 can operate while power component 34 acquires power from an external power source 37 , e.g., via power cord 35 .
- power component 34 can acquire power from an external power source 37 using any alternative power mechanism (e.g., a conductor rail, an overhead wire, and/or the like).
- vehicle 30 can comprise a tracked vehicle, in which transport component 40 includes a motor that drives a set of tracks 42 .
- vehicle 30 can move without restriction in an environment susceptible to various adverse travel conditions (e.g., mud, ice, large stone ballast, etc.) and/or including rough terrain (e.g., railroad tracks, steep inclines, etc.).
- Each track 42 is rotated about a set of wheels 44 A- 44 C, at least one of which (e.g., wheel 44 B) is driven by a motor within transport component 40 .
- the motor comprises a pulse width modulated (PWM) direct current (DC) electrical motor, which will enable vehicle 30 to operate for prolonged durations (e.g., approximately eight hours) without requiring recharging.
- transport component 40 can include a clutch/transmission, which attaches the motor to the wheel(s). Inclusion of a clutch will allow processing component 36 to disengage the clutch and allow the wheels 44 A- 44 C and tracks 42 to move freely without using transport component 40 .
- a direction of travel for vehicle 30 can be changed by rotating two parallel tracks 42 at different speeds and/or in different directions.
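The differential-speed steering described above is conventional skid steering. A minimal sketch follows; the function name and the track-separation value are illustrative assumptions, not taken from the patent:

```python
def track_speeds(linear_mps, angular_radps, track_separation_m=0.6):
    """Convert a desired body velocity (forward speed and turn rate) into
    left/right track speeds. Rotating the two parallel tracks at different
    speeds produces a turn; opposite speeds produce a turn in place."""
    half = track_separation_m / 2.0
    left = linear_mps - angular_radps * half
    right = linear_mps + angular_radps * half
    return left, right

# Straight travel commands equal track speeds; a pure rotation command
# counter-rotates the tracks about the vehicle center.
```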
- each track 42 comprises a rubber track, however, alternative tracks, such as steel tracks, can be utilized.
- vehicle 30 can comprise a wheeled vehicle, which can move unrestricted over a terrain, move along a rail, move over pathways, and/or the like.
- FIG. 2 shows side and top views, respectively, of an alternative robotic vehicle 30 A, which comprises a rail-based vehicle traveling over a set of rails according to an embodiment.
- the set of rails can run parallel to a rail line on which rail vehicles 4 A- 4 B are moving and the set of rails can be situated a sufficient distance from rail vehicles 4 A- 4 B so that vehicle 30 A does not interfere with other operations.
- vehicle 30 A can include the same components as shown and described herein with respect to vehicle 30 ( FIG. 1 ).
- vehicle 30 can include an ability to operate autonomously.
- processing component 36 and/or positioning component 38 can be configured to enable vehicle 30 to identify its location and identify objects within its surrounding area.
- vehicle 30 can include an omni-directional imaging device 46 , such as a low-light 360 degree omni-directional camera.
- Imaging device 46 can be configured to capture the omni-directional image data using moving parts (e.g., a pan/tilt/zoom imaging device) or no moving parts (e.g., using mirrors or the like to reflect the omni-directional image data onto imaging device 46 ).
- Imaging device 46 can have a resolution sufficiently high to enable determination of desired details of targets of interest within a desired operating range in the image data, thereby enabling a desired level of analysis.
- imaging device 46 comprises a two megapixel imaging device.
- Imaging device 46 can acquire images of the general surroundings of vehicle 30 , which can be provided for processing by processing component 36 to acquire an overall situational awareness for vehicle 30 .
- vehicle 30 can include a set of microphones, such as an acoustic microphone array 47 , which can acquire acoustic data from the environment around vehicle 30 .
- Processing component 36 can process the acoustic data to detect and compute directional information for a source of a sound (e.g., by comparing phase information from the array 47), identify the type of sound, and/or the like.
- vehicle 30 can include a set of collision avoidance sensors, such as sensor 48 , each of which can be mounted on housing 32 .
- a sensor 48 can comprise any type of sensor, such as an ultrasonic sensor, a laser scanning sensor, and/or the like, which is capable of acquiring sufficient data to enable processing component 36 to detect objects relatively close to vehicle 30 .
- Processing component 36 can process the data acquired by sensor(s) 48 to determine a distance, a relative speed, a relative direction of travel, and/or the like, of the closest objects.
- the compiled data from the set of sensors 48 can provide a complete distance map and potential trajectory of the nearest objects to the vehicle 30 .
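A distance map with per-object trajectories can feed a simple "nearest threat" test: for each sensed object, the time until it is closest follows from its distance and closing speed. An illustrative sketch (the function names and horizon value are assumptions, not from the patent):

```python
def time_to_closest_approach(distance_m, relative_speed_mps):
    """Given a sensed object's distance and closing speed (positive when
    approaching), return seconds until it is nearest, or None if receding."""
    if relative_speed_mps <= 0.0:
        return None
    return distance_m / relative_speed_mps

def nearest_threat(readings, horizon_s=5.0):
    """Pick the object that arrives soonest within the horizon.
    `readings` is a list of (object_id, distance_m, relative_speed_mps)."""
    threats = []
    for obj_id, dist, speed in readings:
        t = time_to_closest_approach(dist, speed)
        if t is not None and t <= horizon_s:
            threats.append((t, obj_id))
    return min(threats)[1] if threats else None
```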
- Positioning component 38 also can include a geographic location radio signal receiver, such as a global positioning system (GPS) receiver, which can acquire geographic location information for vehicle 30 from a wireless geographic location transmitter, such as a GPS transmitter.
- the GPS receiver is further configured to receive a differential GPS (DGPS) signal from a differential GPS correction transmitter, which can be utilized to further refine an accuracy of the GPS-based location.
- positioning component 38 can further include one or more sensors that can provide dead-reckoning data.
- positioning component 38 can include a three-axis accelerometer sensor, a fiber optic gyro, or the like, which track very slow movements, a distance sensor, e.g., an inertial navigation unit, and/or the like.
- positioning component 38 can receive signals from a set of beacons to derive location information using triangulation. Alternatively, positioning component 38 can derive the location information from an intersection point for a set of vectors derived from the relative field strength, time of arrival (TOA), time difference of arrival (TDOA), phase difference between signals, and/or the like. It is understood that various types of beacons, such as ultra-wideband location beacons, cell phone towers, pager towers, distance measuring equipment (DME), VHF Omni-directional Radio Range (VOR), LORAN, and/or the like can be utilized.
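Triangulation from beacons amounts to intersecting bearing rays taken from known positions. A minimal 2-D sketch of the two-beacon case (illustrative, not the patent's implementation):

```python
import math

def triangulate(beacon_a, bearing_a, beacon_b, bearing_b):
    """Estimate a 2-D position as the intersection of two bearing rays
    (radians from the +x axis) observed from beacons of known position."""
    ax, ay = beacon_a
    bx, by = beacon_b
    dax, day = math.cos(bearing_a), math.sin(bearing_a)
    dbx, dby = math.cos(bearing_b), math.sin(bearing_b)
    det = dbx * day - dax * dby
    if abs(det) < 1e-12:
        raise ValueError("bearings are parallel; no unique intersection")
    # Solve beacon_a + t * dir_a == beacon_b + s * dir_b for t.
    t = (dbx * (by - ay) - dby * (bx - ax)) / det
    return ax + t * dax, ay + t * day
```

TDOA-based methods replace the bearing rays with hyperbolae, but the idea is the same: each measurement constrains the position to a curve, and the estimate is the intersection of the curves.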
- processing component 36 can send data to and receive data from other systems that are remote from vehicle 30 .
- vehicle 30 can include an antenna 52 , such as a radio frequency (RF) antenna, to send and receive data via a wireless communications solution.
- vehicle 30 sends/receives data to/from other systems using antenna 52 via a wireless fidelity (Wi-Fi) link under the 802.11b protocol, over a secure link that is not susceptible to jamming or other interference.
- other types of wireless communications links, operating frequencies, communications protocols, and/or the like can be used including, but not limited to, a microwave link, an infrared link, 802.11g, TCP/IP protocol, and/or the like.
- an embodiment of the wireless communications solution enables multiple vehicles 30 to operate within an area and communicate with the same system(s) without the communications conflicting.
- a group of vehicles 30 can be implemented within a work area, and cooperatively address various maintenance-related, dangerous, and/or routine tasks for the work area.
- multiple vehicles 30 can be deployed in a rail yard to perform inspections, monitor the area, decouple rail vehicles, and/or the like.
- the vehicles 30 can be configured to communicate with a central system and/or with one another to request assistance or perform some action, when necessary.
- Processing component 36 can be selected and configured to perform processing related to the corresponding application for vehicle 30 .
- processing component 36 comprises a distributed, low power, processing engine.
- processing component 36 can be configured to process machine vision-related algorithms to enable vehicle 30 to move about an environment without communicating with an external system, which enables a reduction in an amount of bandwidth required and enables real time processing to be performed by vehicle 30 .
- processing component 36 is shown including a low power digital signal processor (DSP) 54 and an on-board field programmable gate array (FPGA) 56 .
- DSP 54 can comprise, for example, a current state-of-the-art DSP, such as the DM642 from Texas Instruments.
- FPGA 56 can comprise, for example, a Virtex series FPGA from Xilinx, Inc. In this case, by utilizing FPGA 56 to perform repetitive computations, the processing component 36 can compute at a rate of 500 million instructions per second (MIPS) while consuming only a total of approximately five watts.
- Processing component 36 can operate the various interface components, such as the motor 40 for moving vehicle 30 , the microphone array 47 , and/or various application-specific components 100 , in a manner that enables additional power savings. For example, processing component 36 can power down components that are not actively performing any processing. Similarly, some or all of processing component 36 can be powered down when not actively being used for processing.
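The selective power-down policy described above can be sketched as an idle-timeout sweep over registered components. The class below is a hypothetical illustration of the policy, not the patent's implementation; gating an actual supply rail would replace the boolean flag:

```python
import time

class ComponentPowerManager:
    """Track last-use times for peripherals and power down any that have
    been idle longer than a threshold, freeing power for active work."""

    def __init__(self, idle_timeout_s=30.0):
        self.idle_timeout_s = idle_timeout_s
        self._last_used = {}
        self.powered = {}

    def register(self, name):
        self.powered[name] = True
        self._last_used[name] = time.monotonic()

    def touch(self, name):
        """Record activity on a component, powering it back up if needed."""
        self._last_used[name] = time.monotonic()
        self.powered[name] = True

    def sweep(self, now=None):
        """Power down idle components; return the names switched off."""
        now = time.monotonic() if now is None else now
        for name, last in self._last_used.items():
            if now - last > self.idle_timeout_s:
                self.powered[name] = False  # a real system gates the supply
        return [n for n, on in self.powered.items() if not on]
```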
- Robotic vehicle 30 comprises an arm 102 , which includes a plurality of links 104 A- 104 D, each of which can be moved independent of the other links 104 A- 104 D.
- link 104 A can rotate around a vertical axis extending from an attachment point 106 A.
- Links 104 B, 104 C are rotatable about attachment points 106 B, 106 C, respectively, and link 104 D can be extended from/retracted into link 104 C.
- the rotation of links 104 A- 104 C about attachment points 106 A- 106 C, respectively, can be implemented using, for example, a harmonic drive located at each attachment point 106 A- 106 C, each of which can be independently controlled by processing component 36 .
- Link 104 D can be extended/retracted using, for example, a linear actuator located within link 104 C and attached to link 104 D, which also can be independently controlled by processing component 36 . Similarly, one or more of the attachment points 106 A- 106 D can be configured to enable processing component 36 to disengage the corresponding device used to move the link 104 A- 104 D, thereby enabling the link 104 A- 104 D to move freely (e.g., due to the movement of another object).
- harmonic drives and a linear actuator are only illustrative of various devices that can be utilized.
- the harmonic drives and linear actuator can be interchanged and/or alternative devices, such as servo motors, stepper motors, muscle wire, feedback encoders, electronic motor drives, feedback torque sensors, etc., can be used to move the various links 104 A- 104 D.
- while vehicle 30 is shown including a single arm 102 , it is understood that vehicle 30 can include any number of arms 102 .
- Vehicle 30 can include a set of components that are utilized for performing various operations.
- link 104 C is shown including a set of visualization components 108 and link 104 D is shown including a set of action components 110 .
- the set of visualization components 108 includes a light source 112 and a set of imaging devices 114 A, 114 B.
- light source 112 can comprise a source of a type of diffused light, which can enable the set of imaging devices 114 A, 114 B to capture image data that is capable of being processed using two-dimensional machine vision techniques, such as image segmentation, thresholding, pattern recognition, or the like, to locate one or more objects within the field of view.
- a single imaging device can be utilized to capture the image data.
- imaging devices 114 A, 114 B each have a resolution of approximately two megapixels. It is understood that various alternative configurations of the set of visualization components 108 can be implemented depending on the required imaging and/or the operating environment.
- the set of imaging devices can include one or more imaging devices that capture color, infrared, multi-spectral, and other types of image data.
- the set of action components 110 can be selected and configured to perform any combination of various actions including, but not limited to, inspection of an object (e.g., using an imaging device, one or more sensors such as infrared sensors, chemical sensors, etc.), repair/repair assistance, coupling/decoupling, and/or other types of manipulations of objects.
- the set of action components 110 can be operated and controlled by processing component 36 and/or can be operated under the direction of one or more remote users.
- vehicle 30 can be configured to perform operations that require a highly precise machine vision system to locate object(s) that are to be manipulated using the set of action components 110 in various outdoor operating environments.
- a two-dimensional machine vision system can be fooled by, for example, falling snow partially covering a part, plain scene clutter with too many similar looking objects, many variations in two-dimensional shapes, other objects which appear similar when viewed in a two-dimensional plane, and/or the like.
- the set of visualization components 108 can include a pair of imaging devices 114 A, 114 B, which capture image data that can be used to create a stereo image, e.g., an image including a three-dimensional set of coordinates, of an object on which vehicle 30 is configured to perform one or more manipulations.
- the set of visualization components 108 can include a light source 112 that emits a pattern of light, such as a sheet (e.g., line) of light, and one or more imaging devices 114 A, 114 B that capture images of an area as the light reflects off of different surfaces within the area.
- Light source 112 can be configured to move the pattern of light across the area, or light source 112 can be moved with respect to the area.
- processing component 36 can process the multiple images to generate a complete profile for the object(s) within the field of view, e.g., by reconstructing structural information from motion (SFM), shape from shadows, and/or the like.
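The sheet-of-light approach described above recovers 3D surface points by intersecting each illuminated camera ray with the calibrated plane of the light sheet. A minimal sketch of that triangulation geometry (the calibration values in the usage below are assumptions, not from the patent):

```python
def ray_plane_intersect(ray_origin, ray_dir, plane_point, plane_normal):
    """Intersect a camera ray with the known plane of the light sheet.

    In sheet-of-light triangulation, each illuminated pixel defines a ray
    from the camera; intersecting that ray with the calibrated laser plane
    yields a 3D surface point. All vectors are (x, y, z) tuples.
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    sub = lambda a, b: tuple(x - y for x, y in zip(a, b))
    denom = dot(ray_dir, plane_normal)
    if abs(denom) < 1e-12:
        return None  # ray parallel to the light plane: no intersection
    t = dot(sub(plane_point, ray_origin), plane_normal) / denom
    if t < 0:
        return None  # intersection behind the camera
    return tuple(o + t * d for o, d in zip(ray_origin, ray_dir))
```

For example, with the camera at the origin and the light sheet forming the plane x = 1, a ray through direction (1, 0.5, 0.25) lands on the surface point (1, 0.5, 0.25); sweeping the sheet across the object accumulates such points into a profile.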
- processing component 36 can use arm 102 to capture multiple images of an object for which three-dimensional information is desired.
- processing component 36 can use image geometry and the multiple images to compute the three-dimensional information.
- vehicle 30 can include any combination of two or more light sources 112 .
- light source 112 can comprise a large number of light emitting diodes (LEDs) with a diffuser plate in front of the LED array.
- additional light sources 112 can be mounted around imaging devices 114 A, 114 B to reduce shadows on the object(s) within the field of view caused by the illumination.
- processing component 36 can plan movement of links 104 A- 104 D in a manner that ensures the links 104 A- 104 D travel in a direction that moves closer to the object without spending time searching the three dimensional travel space.
- processing component 36 implements an inverse kinematics-based solution to position the links 104 A- 104 D with respect to a target object.
- Inverse kinematics comprises a class of computational solutions that can plan a path for the links 104 A- 104 D given a destination.
- Processing component 36 can use linear equations, Jacobian models, and optimized 3D space searches to quickly calculate an optimal path for the links 104 A- 104 D until the links 104 A- 104 D arrive at a desired location with respect to the object.
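The inverse-kinematics positioning described above can be illustrated with the closed-form solution for a simplified two-link planar arm. This reduced case is an assumption for illustration (arm 102 has four links and the patent's solver also uses Jacobian models and 3D space searches); it shows only the class of closed-form solutions involved:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Closed-form inverse kinematics for a planar two-link arm.

    Given a target (x, y) for the arm tip and link lengths l1, l2, return
    (shoulder, elbow) joint angles in radians, or None when the target is
    out of reach.
    """
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle directly.
    c2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if c2 < -1.0 or c2 > 1.0:
        return None  # target outside the reachable workspace
    elbow = math.acos(c2)  # "elbow-down" branch of the two solutions
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def forward(shoulder, elbow, l1, l2):
    """Forward kinematics, used here to verify an IK solution."""
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y
```

Running the forward kinematics on the returned angles reproduces the target, which is how such solvers are typically validated before commanding the drives.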
- Vehicle 30 can be configured to capture data that enables processing component 36 to attempt to uniquely identify one or more objects, such as rail vehicles.
- imaging devices 114 A, 114 B and/or imaging device 46 can capture image data of a unique identifier, such as a license plate, a vehicle number, object attributes (e.g., color, construction, etc.), and/or the like.
- Processing component 36 can process the image data, e.g., using optical character recognition (OCR) software, pattern recognition software, or the like, to determine a unique identifier for the vehicle.
- vehicle 30 can include a wireless identification device 116 , such as a radio frequency identification (RFID) tag reader, which can acquire, when available, identification data transmitted by a corresponding wireless identification transmitter placed in conjunction with an object.
- a classification yard includes a system for evaluating and/or performing routine maintenance on rolling stock in each of many consists (e.g., one or more connected rail vehicles) of rolling stock using vehicle 30 .
- the system can route any rolling stock that is evaluated as including one or more designated defects to a maintenance area, which can address the defect(s) before allowing the rolling stock to be included on a train that is sent out to various destinations for delivery.
- the system can improve: safety by reducing a likelihood of an accident in which one or more of the defects is a contributing cause; efficiency by removing defects that can lead to increased energy expenditure during operation; and/or the like.
- Robotic vehicle 30 can be implemented within the classification yard, e.g., as part of the evaluation and/or maintenance system, and be configured to perform one or more operations that are typically manually implemented in conjunction with maintenance and/or routing rail vehicles in the classification yard.
- an illustrative robotic vehicle can be configured to operate a coupler release handle, disconnect/reconnect railway brake hoses, and/or the like, between two connected rail vehicles to decouple/attach the rail vehicles. In this manner, it is not necessary for any personnel to get between two rail vehicles to detach/attach the rail vehicles from/to one another, thereby improving the safety and efficiency of processing the rail vehicles within the classification yard.
- FIG. 4 shows an illustrative simplified diagram of a classification yard 10 according to an embodiment.
- Classification yard 10 includes a number of consist assembly tracks 12 that feed into a single rail line 14 . All rail traffic passing through classification yard 10 , apart from through traffic, passes along rail line 14 . Rail line 14 then diverges into multiple outbound tracks 16 . Rolling stock evaluated as having defect(s) that require service is routed to a dedicated set of maintenance tracks 18 . Additionally, freight traffic classification can occur within classification yard 10 . During the classification of freight traffic, freight-carrying rail vehicles that do not require servicing are decoupled according to their destinations and routed to one of the various outbound tracks 16 in classification yard 10 for re-coupling with other rail vehicles as part of an outbound train. Rail line 14 includes a set of retarder devices 13 , which are configured to reduce the speed of rail vehicles passing thereby.
- Classification yard 10 includes a processing system 20 , which can evaluate the rolling stock for the presence of one or more defects and route the rolling stock based on the defect(s) and/or its destination using vehicle 30 ( FIG. 1 ).
- processing system 20 is shown including an evaluation component 22 that automatically acquires measurement data and evaluates various aspects of the rolling stock as it travels along rail line 14 .
- Evaluation component 22 can provide measurement and/or evaluation data to a management component 24 , which can route the rolling stock accordingly.
- Management component 24 can include a computer system that aids in routing the rolling stock (e.g., by designating a track, operating switches to route the rolling stock to the track, and/or the like).
- management component 24 can comprise a routing component 25 , which provides for the routing of rail vehicles through classification yard 10 , and an inspection component 27 , which implements the inspection actions described herein. Routing component 25 and inspection component 27 can interact with one another and one or more vehicles 30 to process vehicles through classification yard 10 .
- a user can be located in a control tower or the like, which can assist the user in overseeing the operations of classification yard 10 while utilizing management component 24 in moving rolling stock through classification yard 10 . In this manner, classification yard 10 permits a real-time assignment of good order or bad order evaluation to all passing rolling stock, which further enables more efficient processing of the rolling stock through classification yard 10 .
- evaluation component 22 can acquire measurement data for performing an inspection of the various couplings between two connected rail vehicles, which can be provided for processing by inspection component 27 .
- inspection component 27 determines that a rail vehicle does not pass the inspection (e.g., includes one or more defects)
- processing system 20 can determine whether the defect(s) is (are) of the type that can be repaired locally (e.g., in-situ or on a local track).
- a local repair can include a repair of one or more minor defects associated with the couplings between two rail vehicles (e.g., reattachment of a brake line).
- Vehicle 30 can be directed to travel to a location between two rail vehicles and detach/attach various connectors (coupling mechanisms, brake hoses, etc.) when assembling or disassembling a consist, performing a repair, detaching a bad-ordered vehicle, and/or the like.
- Vehicle 30 can be configured to attempt to fix minor defects such as loose/hanging equipment (e.g., a hose) or material, over-height/over-width loads, and/or the like, in an automated or semi-automated manner without re-routing the rail vehicle and/or detaching and routing the vehicle locally, e.g., via a loop-back rail line 15 .
- one or more personnel can direct and/or assist vehicle 30 in performing the repair, and indicate to processing system 20 one or more details of the repair (e.g., time, materials, etc.) as well as whether the repair was successful or not. If the repair is successful, processing system 20 can route the rail vehicle for re-inspection, e.g., via a loop-back rail line 15 .
- when processing system 20 determines that one or more defects on a vehicle cannot be repaired in-situ or that one or more repairs were unsuccessful, the vehicle can be detached from a consist using vehicle 30 and processing system 20 can route the vehicle to one of a set of maintenance tracks 18 .
- Processing system 20 can route the vehicle to the maintenance track 18 via rail line 14 or via rail line 17 , e.g., when a repair was first attempted on loop-back rail line 15 .
- one or more maintenance personnel at a maintenance shop 19 can perform the repair, and indicate to processing system 20 one or more details of the repair (e.g., time, materials, etc.).
- processing system 20 can route the rail vehicle for re-inspection, e.g., via rail lines 15 , 17 .
- processing system 20 can utilize a set of robotic vehicles to perform one or more operations required to process the rail vehicles as they pass through classification yard 10 .
- robotic vehicle 30 includes a set of application-specific components 100 that enable robotic vehicle 30 to perform various operations required within classification yard 10 .
- robotic vehicle 30 is configured to operate a coupler release handle for decoupling two rail vehicles, disconnect/reconnect railway brake hoses, identify, evaluate, and/or repair the rail vehicles, operate external systems, and/or the like.
- processing component 36 can interact with one or more external systems, such as management component 24 , to receive operating instructions corresponding to a consist being processed through classification yard 10 and/or obtain assistance.
- management component 24 can identify the particular couplings in a consist of rail vehicles that require decoupling.
- processing component 36 can receive a set of instructions for a consist as it approaches, and can receive updates to the instructions in response to evaluations performed on the various rail vehicles.
- the initial set of instructions can indicate the various locations at which the consist must be decoupled according to the corresponding destinations for the rail vehicles, while the updated instructions can indicate a rail vehicle that must be decoupled in order to further evaluate and/or perform maintenance on the vehicle before it is moved through the classification yard 10 .
- the instructions can uniquely identify the coupling to be decoupled using any solution, e.g., based on a location of vehicle 30 , location of coupling within the consist, identification of front rail vehicle, and/or the like.
- Processing component 36 can communicate the status of completion of one or more of the assigned tasks with an external system, such as management component 24 . For example, when two rail vehicles are successfully decoupled, processing component 36 can send an indication to management component 24 , which can perform the routing accordingly.
- when an error occurs, such as an object being detected within an unsafe operation perimeter of vehicle 30 or a rail vehicle that cannot be identified (e.g., due to a missing RFID tag, a smeared or covered vehicle number, or the like), processing component 36 can send data to and request assistance from an external system, such as management component 24 .
- processing component 36 can transmit image data corresponding to the detected object or unidentifiable rail vehicle, which can be evaluated by personnel to resolve the error (e.g., identify the vehicle) and/or determine further actions to be performed by vehicle 30 .
- vehicle 30 can obtain assistance from an external source.
- when processing component 36 cannot locate an object, such as a coupler release handle or brake hoses, processing component 36 can provide image data corresponding to an area in which the object should be present for evaluation by an individual.
- processing component 36 can request assistance for a particular task when a task cannot be performed within a predetermined amount of time.
- processing component 36 can control one or more retarder devices 13 to slow rail vehicles passing thereby, e.g., in order to reduce the tension between two rail vehicles to allow them to separate.
- vehicle 30 transmits RF signals directly to the set of retarder devices 13 .
- processing component 36 can signal an external system, such as management component 24 , which in turn operates the set of retarder devices 13 accordingly.
- the set of action components 110 can be configured to manipulate one or more objects, such as in decoupling a pair of rail vehicles.
- the set of action components 110 can be configured to operate a coupler release handle, which can be located on one or both sides of the coupler mechanism of a rail vehicle. When operation of the coupler release handle on one side of the rail vehicle fails, the coupler release handle located on the other side of the rail vehicle can be utilized.
- processing component 36 can acquire image data from imaging devices 114 A, 114 B and process the image data to locate an object to be manipulated, such as the coupler release handle, and operate the set of action components 110 to perform the manipulation.
- the set of imaging devices 114 A, 114 B provide a detailed view of a particular object to be manipulated, which enables processing component 36 to implement machine vision-related processes locally and perform refined object recognition.
- FIG. 5 shows an illustrative coupling mechanism 120 , which is commonly incorporated to couple rail vehicles 4 A, 4 B, according to an embodiment.
- Rail vehicles 4 A, 4 B are connected at two points: a main coupler assembly 122 and the brake hoses 126 A, 126 B.
- coupler assembly 122 automatically engages when a forward component 124 A of a rail vehicle 4 B encounters a rear component 124 B of another rail vehicle 4 A.
- although coupling mechanism 120 provides for automatic coupling (brake hoses 126 A, 126 B are coupled manually), decoupling continues to be performed manually, which can be dangerous.
- a human must go between rail vehicles 4 A, 4 B, which can be moving and/or move at any time, to perform the decoupling.
- the decoupling often relies on the movement of one of the rail vehicles 4 A, 4 B once coupler assembly 122 has been decoupled (e.g., due to gravity when rail vehicles 4 A, 4 B are on an incline, such as at a classification yard).
- a coupler release handle 128 is moved a relatively small amount in a direction perpendicular to the handle's axis to operate a coupler release mechanism 130 . That is, release handle 128 is moved vertically when release handle 128 extends horizontally or horizontally when release handle 128 is substantially vertical in orientation, as shown. Operation of release handle 128 in this manner causes coupler release mechanism 130 to release the components 124 A, 124 B of coupler assembly 122 . An operator will determine whether rail vehicles 4 A, 4 B have separated, and if not, may need to operate release handle 128 again.
- Brake hoses 126 A, 126 B are generally connected to a corresponding portion of coupler assembly 122 by a wire or chain harness 132 . As rail vehicles 4 A, 4 B separate, harness 132 exerts angular force upon the brake hose connection, causing the brake hoses 126 A, 126 B to separate. It is understood that coupling mechanism 120 can include mirrored components of release handle 128 and brake hoses 126 A, 126 B on an opposite side, which are not shown for clarity.
- the set of imaging devices 114 A, 114 B can acquire image data having a field of view approximately that shown in FIG. 5 .
- Processing component 36 can identify certain features/objects within the field of view and locate a bounding area 134 within which coupler release handle 128 should be located. For example, processing component 36 can identify the start of rail vehicle 4 B, ground level, and the closest side of vehicle 4 B to vehicle 30 .
- Processing component 36 can define the bounding area 134 as a cube having an X-axis that starts one foot in front of vehicle 4 B and ends two feet in front of vehicle 4 B, a Y-axis that starts one foot above ground level and ends two and a half feet above ground level, and a Z-axis that extends from one foot beyond the vehicle 30 to a half foot beyond the closest side of vehicle 4 B.
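The bounding area 134 defined above can be expressed as a simple containment test. The coordinate-frame conventions below are assumptions for illustration (units in feet, X growing away from the front face of vehicle 4B, Y as height above ground, Z running from vehicle 30 toward vehicle 4B); the extents themselves come from the text:

```python
def in_handle_search_box(point, front_x, ground_y, robot_z, near_side_z):
    """Test whether a 3D point (x, y, z) lies inside bounding area 134.

    front_x: X coordinate of the front face of vehicle 4B.
    ground_y: Y coordinate of ground level.
    robot_z: Z coordinate of vehicle 30.
    near_side_z: Z coordinate of the side of vehicle 4B closest to vehicle 30.
    Extents per the text: X from 1 to 2 ft in front of vehicle 4B, Y from
    1 to 2.5 ft above ground, Z from 1 ft beyond vehicle 30 to 0.5 ft
    beyond the closest side of vehicle 4B.
    """
    x, y, z = point
    return (front_x + 1.0 <= x <= front_x + 2.0 and
            ground_y + 1.0 <= y <= ground_y + 2.5 and
            robot_z + 1.0 <= z <= near_side_z + 0.5)
```

Only 3D points passing this test would then be handed to the handle-recognition step, which keeps the recognition search space small.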
- processing component 36 can process image data corresponding to the bounding area 134 to recognize coupler release handle 128 , which can be a lever, a hook, an inverted “C” lever, and/or the like.
- Processing component 36 can apply a set of three-dimensional machine vision algorithms to the image data to generate a representation of the geometry of various objects within the field of view in three dimensions. For example, processing component 36 can identify various three-dimensional data points by evaluating the stereoscopic image data acquired by imaging devices 114 A, 114 B, e.g., using the disparity between the image data captured by the two imaging devices 114 A, 114 B. Similarly, processing component 36 can identify the three-dimensional points by analyzing a set of images acquired with a pattern of light (e.g., moire light pattern, binary light pattern, and/or the like) reflecting off of the various surfaces within the field of view, and/or the like.
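For the stereoscopic case, the disparity between imaging devices 114A, 114B maps to depth through the standard relation Z = f·B/d. A hedged sketch of that conversion (the calibration values in the test below are assumptions, not the patent's):

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Recover depth from stereo disparity using Z = f * B / d.

    focal_px is the focal length in pixels, baseline_m the distance between
    the two imaging devices, and disparity_px the pixel offset of the same
    feature between the left and right images. Larger disparity means a
    closer surface.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

Applying this per matched pixel turns a disparity map into the three-dimensional data points that the recognition steps below operate on.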
- alternative three-dimensional imaging devices can be implemented on vehicle 30 , such as a 3D profile scanner from Keyence, an LMI laser, a Sick laser scanner, a three-dimensional range camera, light detection and ranging (LIDAR), scanning sheet of light systems, and/or the like, to generate a 3D representation including three-dimensional data points.
- An alternative 3D representation can use voxels, i.e., small volume elements that together represent a 3D rendering of an object constructed from the 3D data points.
- processing component 36 can process the data points to identify the various objects.
- processing component 36 uses invariant features representing the objects to match against the 3D points. For example, a circular, donut-like object can be described by a set of radii and the thickness of the donut.
- processing component 36 can fit the acquired 3D data points using a curve fitting algorithm to determine whether a match against a previously stored invariant template occurs.
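The invariant-template matching described above can be sketched by fitting a radius to the acquired 3D points and comparing it against a stored "donut" template. This deliberately simplified fit is illustrative only; a real matcher would also fit the supporting plane and the tube thickness:

```python
import math

def matches_donut_template(points, template_radius, tol):
    """Check a set of 3D points against a stored invariant "donut" template.

    Simplified fit: center the points on their centroid, take the mean
    distance to the centroid as the fitted ring radius, and accept when it
    is within tol of the template radius. Assumes the points lie roughly
    on a planar ring.
    """
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    cz = sum(p[2] for p in points) / n
    radii = [math.dist(p, (cx, cy, cz)) for p in points]
    fitted = sum(radii) / n
    return abs(fitted - template_radius) <= tol
```

Templates for other invariant shapes (levers, hooks, an inverted "C") would replace the single-radius comparison with their own descriptive parameters.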
- processing component 36 can use more general-purpose 3D volume fitting algorithms, e.g., the open source Visualization Toolkit (VTK), or the like, to recognize the object(s).
- processing component 36 can analyze other features of the object, such as color, texture, approximate location, etc., in addition to its shape to accurately identify the object. Further, it is understood that processing component 36 can identify the various objects by being trained to recognize certain stored patterns, shapes, parts, etc.
- the VOLTS-IQ visual intelligence software suite offered by Braintech comprises an approach to train processing component 36 to identify parts and shapes for later recall and usage.
- processing component 36 can operate one or more components in the set of action components 110 to move coupler release handle 128 to decouple vehicles 4 A, 4 B.
- FIGS. 6A-6D show operation of an illustrative manipulator 140 , which can be located at an end of link 104 D ( FIG. 1 ), according to an embodiment.
- FIGS. 6A, 6B show a front view and top view, respectively, of manipulator 140 .
- Manipulator 140 includes a manipulator attachment 142 , a rotation mechanism 144 , a gripping mechanism 146 , and a plurality of fingers 148 A- 148 C.
- Each finger 148 A- 148 C can be equipped with a set of sensors, which can, for example, provide information on an amount of force being exerted on the finger 148 A- 148 C.
- the set of sensors can include, for example, tactile sensor(s), pressure sensor(s), force sensor(s), torque sensor(s), and/or the like.
- Gripping mechanism 146 includes a plurality of tracks 150 A-C along which fingers 148 A-C can move. Further, rotation mechanism 144 can enable gripping mechanism 146 and fingers 148 A-C to be rotated about its axis. Still further, as illustrated in FIG. 6C , manipulator 140 can be attached to link 104 D, which can provide horizontal and/or vertical movement of manipulator 140 .
- FIGS. 5, 6C, 6D illustrate use of manipulator 140 to operate release handle 128 in order to detach rail vehicles 4 A-B.
- under the direction of processing component 36 ( FIG. 1 ), release handle 128 can be positioned such that multiple fingers, such as fingers 148 A, 148 C, are on one side of release handle 128 , while at least one finger, such as finger 148 B, is on the other side of release handle 128 .
- Processing component 36 can determine that fingers 148 A- 148 C are properly aligned using, for example, data acquired from a sensor on each finger 148 A- 148 C that measures an amount of force being exerted.
- processing component 36 can operate arm 102 to move (e.g., shake) manipulator 140 in the direction/distance required to release components 124 A, 124 B.
- processing component 36 can determine whether rail vehicles 4 A, 4 B have been successfully released from one another. If so, fingers 148 A- 148 C can disengage from release handle 128 . Otherwise, processing component 36 can move manipulator 140 again to seek to release components 124 A, 124 B.
- processing component 36 can use data from the force sensors on fingers 148 A- 148 C to determine, for example, whether any unusual/abnormal resistance or lack of resistance occurs while the release handle 128 is being moved. Further, processing component 36 can determine various other faults using any solution. For example, processing component 36 can determine a fault due to a failure to release rail vehicles 4 A, 4 B after a predetermined number of tries, a broken (e.g., stuck or missing) component in coupling mechanism 120 , and/or the like. In this case, processing component 36 can generate an alarm, which can be presented to a user for action.
- Processing component 36 can determine whether rail vehicles 4 A, 4 B have been successfully released using any solution. For example, processing component 36 can process image data captured by one or more imaging devices 46 , 114 A, 114 B to determine whether the rail vehicles 4 A, 4 B are moving away from one another, e.g., whether the distance between the vehicles 4 A, 4 B is increasing.
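The success check described above, verifying that the distance between vehicles 4A, 4B is increasing, can be sketched over a series of distance measurements (e.g., from image data or a laser distance sensor). The threshold and units are assumptions for illustration:

```python
def decoupling_succeeded(distances, min_separation=0.1):
    """Decide whether two rail vehicles are moving apart.

    distances is a time-ordered series of measured distances between the
    two vehicles. Success requires the readings to be non-decreasing and
    the total separation gained to exceed min_separation (same units as
    the measurements).
    """
    if len(distances) < 2:
        return False
    non_decreasing = all(b >= a for a, b in zip(distances, distances[1:]))
    return non_decreasing and (distances[-1] - distances[0]) > min_separation
```

A failed check would trigger the retry path: moving manipulator 140 again, and raising an alarm after a predetermined number of attempts.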
- vehicle 30 can include a laser distance sensor, or the like, which processing component 36 can operate to point at one or both vehicles 4 A, 4 B and measure the distance to determine whether the decoupling was successful. Regardless, once a successful decoupling operation is performed, processing component 36 can transmit its success to another system, such as management component 24 ( FIG. 4 ), and wait for instructions or commence its next action.
- processing component 36 can utilize the same or similar processes in order to identify various other types of objects for other applications, and that the set of action components 110 can be configured to perform various other types of actions on these objects for other applications.
- the set of action components 110 can comprise one or more additional components and/or alternative components.
- vehicle 30 can comprise multiple arms 102 , each having a different set of action components 110 .
- processing component 36 can identify brake hoses 126 A, 126 B ( FIG. 5 ) for subsequent connection or separation using a three-dimensional machine vision solution discussed herein.
- vehicle 30 can be configured to use active machine vision metrology, such as imaging reflections of a sheet of light generated by light source 112 , to identify the brake hoses 126 A, 126 B from other background objects by gauging the dimensions of the brake hoses 126 A, 126 B.
- the set of action components 110 can comprise a manipulator configured to decouple railway brake hoses 126 A, 126 B.
- FIGS. 7A-7D show operation of another illustrative manipulator 160 , which can be located at an end of link 104 D, according to an embodiment.
- FIGS. 7A, 7B show a top view and front view, respectively, of manipulator 160 .
- Manipulator 160 includes a pair of restraining rods 162 A, 162 B and a contact component 164 .
- Contact component 164 can be moved up/down with respect to restraining rods 162 A, 162 B via a piston 166 or the like.
- Restraining rods 162 A, 162 B are positioned above a low point of contact component 164 via a pair of vertical supports 168 A, 168 B that are spaced apart using a spacer 170 , which is attached to link 104 D using any solution.
- Each component of manipulator 160 that contacts one or more components of a rail vehicle 4 A, 4 B ( FIG. 5 ) can have a smooth rounded cross section to reduce the risk of wear or damage to one or more components that are manipulated using manipulator 160 .
- Processing component 36 can operate manipulator 160 to detach a pair of connectors 8 A, 8 B for a standard rail brake hose 126 A, 126 B ( FIG. 5 ) on rail vehicles 4 A, 4 B.
- FIGS. 7C, 7D show manipulator 160 being used to detach connectors 8 A, 8 B.
- processing component 36 can operate arm 102 ( FIG. 1 ) to locate manipulator 160 so that each restraining rod 162 A, 162 B is located above the brake hose adjacent to a corresponding connector 8 A, 8 B, respectively, while contact component 164 is located below connectors 8 A, 8 B.
- the spacing between restraining rods 162 A, 162 B can be selected such that each restraining rod 162 A, 162 B can be located near where connectors 8 A, 8 B meet the brake hose. Further, processing component 36 can adjust a width of spacer 170 using any solution to enable the corresponding locations of restraining rods 162 A, 162 B to be adjusted. Similarly, a distance between a top of contact component 164 and a bottom of restraining rods 162 A, 162 B can be selected such that connectors 8 A, 8 B will readily fit between. Further, processing component 36 can adjust the distance by adjusting a length of vertical supports 168 A, 168 B and/or a height of contact component 164 using any solution (e.g., via piston 166 ).
- processing component 36 can move contact component 164 upward toward connectors 8 A, 8 B using piston 166 .
- Contact component 164 will force connectors 8 A, 8 B to move upward, while the brake hose 126 A, 126 B ( FIG. 5 ) is prevented from moving upward by restraining rods 162 A, 162 B.
- connectors 8 A, 8 B will swivel away from one another, resulting in the hoses 126 A, 126 B becoming separated.
- Processing component 36 can confirm that the brake hoses have been decoupled using any solution (e.g., by analyzing image data of the work region, data from one or more pressure sensors located on manipulator 160 , or the like). Once a successful decoupling operation is performed, processing component 36 can transmit its success to another system, such as management component 24 ( FIG. 4 ), and wait for instructions or commence its next action.
- manipulators 140 , 160 and the functionality described therewith are only illustrative of numerous types of manipulation devices and functions, which can be implemented on vehicle 30 and utilized to perform a wide variety of tasks.
- processing component 36 can utilize the same/similar components and processes described herein to perform additional actions in an operating environment, such as one or more repairs or maintenance tasks in a classification yard.
- FIG. 8 shows illustrative use of manipulator 160 to rotate a brake wheel 6 located on a front/back side of a rail vehicle 4 according to an embodiment. Brake wheel 6 can be rotated to release stuck brakes on rail vehicle 4 .
- manipulator 160 can be utilized to bleed the brake system of a rail vehicle 4 by pulling an air system release lever located on the side of a rail vehicle 4 .
- manipulator 160 can grasp the release lever and the arm can be operated to pull the lever down to bleed off the air pressure for a few seconds.
- vehicle 30 can operate a hose to clean a portion of the work area, such as a maintenance pit, between tracks, rail vehicles 4 , and/or the like.
- manipulator 160 can be implemented with restraining rods 162 A, 162 B capable of movement similar to human fingers.
- processing component 36 can implement grasp planning prior to moving the restraining rods 162 A, 162 B.
- the grasp planning can comprise, for example, one or more grasp planning optimization algorithms, such as grasp analysis, grasp workspace determination, grasp solution computation within the workspace, and/or the like.
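The grasp-planning steps named above (grasp analysis, workspace determination, solution computation within the workspace) can be sketched as a simple filter-and-select pipeline. The data structures and the quality metric here are assumptions; the patent does not specify a particular algorithm.

```python
# Illustrative grasp-planning sketch: analyze candidate grasps, restrict to
# the reachable workspace, then select the best-scoring solution.

def plan_grasp(candidates, in_workspace, score):
    """candidates: list of (x, y) grasp points; in_workspace: predicate for
    reachability; score: grasp-quality metric to maximize."""
    reachable = [g for g in candidates if in_workspace(g)]  # workspace determination
    if not reachable:
        return None                                         # no solution in workspace
    return max(reachable, key=score)                        # grasp solution computation

# Toy example: reachable region is a unit box; prefer grasps near the origin
in_box = lambda g: abs(g[0]) <= 1 and abs(g[1]) <= 1
closeness = lambda g: -(g[0] ** 2 + g[1] ** 2)
best = plan_grasp([(0.2, 0.1), (2.0, 0.0), (0.9, 0.9)], in_box, closeness)
print(best)
```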
- vehicle 30 can comprise a pair of manipulators 160 , each of which can be used to grasp one of a pair of disconnected brake hoses between two rail vehicles and attach the brake hoses, e.g., as part of assembling a consist for an outbound train.
- the set of action components 110 on vehicle 30 can be configured to acquire data for inspecting one or more aspects of an object.
- processing component 36 can be configured to acquire measurement data for and/or perform maintenance on various components of a rail vehicle.
- the set of action components 110 can comprise a non-destructive testing head, which can be applied to or placed in proximity with an object and used to acquire data regarding the object.
- the testing head can comprise: an electromagnetic acoustic (EMAT)-based testing head, such as shown and described in U.S. Pat. No. 6,523,411; a handheld electronic gauge, such as shown and described in U.S. Pat. No.
- Processing component 36 can operate arm 102 to apply the testing head to various locations of a rail wheel and probe for flaws within the rail wheel (e.g., a crack), gather wheel stress measurements, gather dimensional measurements (e.g., diameter), and/or the like.
- the set of action components 110 can comprise a set of data acquisition devices, such as an imaging device, a chemical/biological sensor, an infrared imaging device (e.g., active illumination infrared camera or a passive infrared camera), a multi-spectral imaging device, or the like, which processing component 36 can locate accordingly to acquire measurement data, such as image data, chemical/biological levels, heat data, and/or the like, which processing component 36 can analyze to determine the presence of one or more unsafe conditions (e.g., a leak, hot brakes, uneven wear, hidden compartments, etc.).
- a vehicle 30 can respond to an accident that may have resulted in a leak of hazardous material to determine a level and severity of the spill.
- imaging device 46 and/or the set of action components 110 can be configured to acquire various types of image data for the surrounding area to perform security-related actions.
- the imaging device can acquire infrared image data, which processing component 36 can evaluate to determine the presence of unauthorized individuals in the work region regardless of scene clutter and/or weather conditions.
- vehicle 30 A ( FIG. 2 ) can be configured to perform track-related maintenance and/or inspection as it moves along the tracks.
- vehicle 30 A can be configured to visually inspect the track for defects, make track-based measurements (e.g., gauge, profile, etc.), make minor repairs to railway tie spikes or other track-related infrastructure, etc.
- embodiments of vehicle 30 can be configured to perform inspections of road-based vehicles, water-based vehicles, explosive ordnance disposal, remote inspection and/or monitoring, and/or the like.
- the rail vehicles may move unexpectedly, change direction, or be continuously moving.
- robotic vehicle 30 can maneuver itself and align itself with a coupling or corresponding work region of a rail vehicle and secure itself with respect to the rail vehicle (e.g., by latching on to the release handle). Additionally, robotic vehicle 30 can align its tracks 42 to be parallel with the rail on which the rail vehicle is located. Alternatively, when robotic vehicle 30 is implemented as a rail-based vehicle 30 A ( FIG. 2 ), the corresponding rails can be aligned accordingly. Regardless, processing component 36 can disengage the clutch within transport component 40 to allow tracks 42 to move freely.
- arm 102 and the components thereon can remain stationary with respect to the work region of a rail vehicle even if the rail vehicle is moving or suddenly starts/stops.
- processing component 36 can unsecure vehicle 30 from the rail vehicle and re-engage the clutch to enable vehicle 30 to move on its own.
- processing component 36 can extend arm 102 within the gap between a pair of connected rail vehicles.
- processing component 36 can implement a mechanical tracking solution, such as a stabilization platform, which stabilizes a position of the arm 102 with respect to the rail vehicles.
- vehicle 30 can include a sensor to acquire the speed of the rail vehicles and processing component 36 can adjust the speed of vehicle 30 to match the speed of the rail vehicles and move in-step with them (e.g., track the movement of the rail vehicles).
- Processing component 36 can track the vehicle movement using image-based invariant feature tracking, particle filters, or perform binary correlation to match features in a spatio-temporal image stream with features of a template image of the target. Further, processing component 36 can implement finer adjustments by moving arm 102 in response to determining a difference in location for an object within a field of view of an imaging device, or the like.
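The binary-correlation idea referenced above can be illustrated in one dimension: slide a template of features over an incoming feature stream and report the offset with the highest match count. Real use would operate in two dimensions over image features; this toy version and its data are ours, not the patent's.

```python
# Minimal sketch of binary correlation for feature tracking: find the
# offset at which a binary template best matches a 1-D feature stream.

def best_match_offset(stream, template):
    """Return the offset where the binary template best matches the stream."""
    scores = []
    for off in range(len(stream) - len(template) + 1):
        window = stream[off:off + len(template)]
        scores.append(sum(1 for a, b in zip(window, template) if a == b))
    return max(range(len(scores)), key=scores.__getitem__)

stream = [0, 0, 1, 1, 0, 1, 0, 0]
template = [1, 1, 0, 1]
print(best_match_offset(stream, template))
```

Tracking the change in the best-match offset between frames gives the apparent motion of the target, which the finer arm adjustments described above could then compensate.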
- arm 102 and the various components for performing actions and/or acquiring data on an object can be implemented in a stationary location.
- arm 102 can be permanently mounted adjacent to a rail line, such as rail line 14 ( FIG. 4 ).
- arm 102 can be permanently mounted between a pair of rails.
- arm 102 and the components thereon can be configured to acquire data/perform manipulations of various components located on the underside of the rail vehicles (e.g., brakes, suspension, axle, etc.).
- arm 102 can access and manipulate the release handle 128 ( FIG. 5 ) and/or brake hoses 126 A, 126 B ( FIG. 5 ) of the coupling mechanism 120 ( FIG. 5 ).
- FIG. 9 shows a side view of another illustrative robotic vehicle 30 B according to an embodiment.
- Robotic vehicle 30 B is configured to operate between a pair of rails.
- vehicle 30 B can comprise a profile that is approximately six inches high and a width that is approximately two feet or less.
- arm 102 can comprise a three-link arm, in which the first and second links rotate around connection points, and the third link, with the set of action components 110 , can extend from the second link.
- processing component 36 can raise and extend the set of action components 110 to a desired location to perform the corresponding action(s).
- the set of action components 110 can be configured to decouple the rail vehicles.
- FIG. 10 shows an illustrative configuration in which vehicle 30 B can image a coupling mechanism 120 from below using a set of visualization components 108 according to an embodiment. As illustrated, vehicle 30 B ( FIG. 9 ) can position the set of visualization components 108 to enable the capture of cross-sectional image data of the coupling mechanism 120 .
- FIG. 11 shows a top view of a portion of a rail yard 10 A according to an embodiment.
- the rail yard 10 A comprises a large number of tracks that are configured in a structured manner.
- the various tracks can be as close as approximately ten feet to each other, which will require vehicle 30 to be capable of determining its location with a high degree of accuracy to avoid collision with rail vehicles moving on the tracks.
- Rail yard 10 A comprises numerous salient features that can be used to accurately locate vehicle 30 .
- the rail tracks comprise recognizable patterns, and various other features, such as sign posts, roads, structures, and/or the like, can be identified on a highly accurate digital map of rail yard 10 A.
- Processing component 36 can identify some of these features within image data captured by imaging device 46 and use their location to make periodic course corrections to the movement of vehicle 30 . Additionally, processing component 36 can implement an algorithm, such as Kalman filters, to reduce location estimation errors.
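A one-dimensional Kalman update illustrates the periodic course-correction idea above: a dead-reckoned position estimate is blended with a noisy landmark fix, and the estimate's uncertainty shrinks after each correction. The noise values below are illustrative assumptions.

```python
# One-dimensional Kalman-filter update: blend a predicted position with a
# landmark-based measurement to reduce location estimation error.

def kalman_update(x, p, z, r):
    """x, p: prior estimate and variance; z, r: measurement and its variance.
    Returns the corrected estimate and reduced variance."""
    k = p / (p + r)              # Kalman gain: trust measurement more when p >> r
    x_new = x + k * (z - x)      # correct the prediction toward the landmark fix
    p_new = (1 - k) * p          # uncertainty shrinks after the correction
    return x_new, p_new

x, p = 10.0, 4.0                 # dead-reckoned position and its variance
z, r = 12.0, 1.0                 # landmark-based fix and its variance
x, p = kalman_update(x, p, z, r)
print(x, p)
```

Repeating this update at each recognized feature keeps the accumulated dead-reckoning error bounded.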
- rail yard 10 A can comprise several virtual cells 204 A- 204 C.
- Each virtual cell 204 A- 204 C can have a corresponding size, risk probability, volume, action list, maximum speed, and/or the like, associated therewith.
- processing component 36 can operate vehicle 30 accordingly. For example, in a virtual cell in which rail vehicles infrequently travel and are slow moving, processing component 36 can move vehicle 30 alongside a rail. However, in a virtual cell that comprises a high volume of and/or fast moving rail vehicles, processing component 36 can carefully analyze the rails before approaching a rail.
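The cell-dependent behavior above could be organized as a simple lookup from cell attributes to an approach policy. The field names, cell identifiers, and values below are assumptions chosen to mirror the two examples in the text.

```python
# Illustrative virtual-cell policy table: each cell carries a traffic level
# and speed cap the processing component consults before approaching a rail.

CELLS = {
    "204A": {"traffic": "low",  "max_speed": 8.0},   # infrequent, slow rail traffic
    "204B": {"traffic": "high", "max_speed": 3.0},   # busy, fast-moving rail vehicles
}

def approach_policy(cell_id):
    """Return (speed limit, whether to analyze the rails before approaching)."""
    cell = CELLS[cell_id]
    return cell["max_speed"], cell["traffic"] == "high"

speed, must_analyze = approach_policy("204B")
print(speed, must_analyze)
```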
- processing component 36 can direct vehicle 30 to a location of the first coupling mechanism to be decoupled based on the action list. It is understood that virtual cells 204 A- 204 C are only illustrative, and rail yard 10 A can comprise numerous virtual cells.
- vehicle 30 also can include a positioning component 38 ( FIG. 1 ), which can comprise a geographic location radio signal receiver, which acquires geographic location information for vehicle 30 from a geographic location transmitter.
- rail yard 10 A is shown including a set of location transmitters 200 A- 200 C, such as ultra-wideband location beacons, according to an embodiment.
- Each location transmitter 200 A- 200 C can transmit a signal that is received by positioning component 38 .
- Processing component 36 can process one or more attributes of the respective signals to determine an area 202 corresponding to a location of vehicle 30 .
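One common way to turn beacon-signal attributes into a position is two-dimensional trilateration: given ranges to three fixed transmitters (e.g., derived from signal time-of-flight), the circle equations are linearized and solved for the receiver position. The beacon layout and ranges below are our example, not values from the patent.

```python
# Illustrative 2-D trilateration from three beacon ranges, solved by
# subtracting the circle equations pairwise and applying Cramer's rule.

def trilaterate(b1, b2, b3, r1, r2, r3):
    """Beacons b* as (x, y); r* the measured ranges. Returns (x, y)."""
    # Pairwise subtraction yields two linear equations A*x + B*y = C.
    A1, B1 = 2 * (b2[0] - b1[0]), 2 * (b2[1] - b1[1])
    C1 = r1**2 - r2**2 + b2[0]**2 - b1[0]**2 + b2[1]**2 - b1[1]**2
    A2, B2 = 2 * (b3[0] - b2[0]), 2 * (b3[1] - b2[1])
    C2 = r2**2 - r3**2 + b3[0]**2 - b2[0]**2 + b3[1]**2 - b2[1]**2
    d = A1 * B2 - A2 * B1
    return ((C1 * B2 - C2 * B1) / d, (A1 * C2 - A2 * C1) / d)

# Receiver at (3, 4) with beacons at (0,0), (10,0), (0,10)
pos = trilaterate((0, 0), (10, 0), (0, 10),
                  5.0, (49 + 16) ** 0.5, (9 + 36) ** 0.5)
print(pos)
```

With noisy range measurements the result is an area rather than a point, consistent with the area 202 described above.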
- a work environment can be supplemented with one or more alternative markers for assisting vehicle 30 and processing component 36 in moving throughout the work environment.
- a line can be painted on a surface along which vehicle 30 travels, which processing component 36 can utilize to route itself accordingly.
- an emitting source such as a buried wire emitting low levels of coded RF signal, a laser light emitter, or the like, can be used to assist vehicle 30 in guiding itself along a predetermined path.
- other markers such as an edge of maintenance pit, one or a pair of side by side tracks, an overhead sentry, and/or the like, can be utilized by processing component 36 to determine a path of travel.
- processing component 36 can locate the area for the work location, e.g., the location of a coupling mechanism to be decoupled. To this extent, processing component 36 can determine the correct rail vehicles to be decoupled in a stream of connected rail vehicles using any solution. For example, processing component 36 can identify a particular rail vehicle in a consist to determine whether the rail vehicle is one to be decoupled or to determine a relative location of the rail vehicle with respect to the decoupling location.
- processing component 36 can detect the end of rail vehicles to determine a location of the coupling mechanism.
- processing component 36 can process data received from the set of collision avoidance sensors 48 to determine the start/end of each rail vehicle.
- processing component 36 can receive data from an external vehicle sensing system, which is implemented within rail yard 10 A. In this case, the external vehicle sensing system can provide an output signal indicating the end of a rail vehicle and the beginning of the next rail vehicle, which processing component 36 can use to determine the area between the rail vehicles.
- Processing component 36 also can process image data acquired by imaging device 46 to identify the work location, e.g., by identifying a loop figure, which indicates connected brake hoses for a coupling mechanism.
- vehicle 30 can be provided with a set of actions, which can subsequently be carried out autonomously by vehicle 30 .
- an off the shelf robot programming environment such as one provided by Mobilerobots, Inc., Robotics Developer Studio from Microsoft, Inc., and/or the like, can be utilized to program the actions for processing component 36 .
- an illustrative set of instructions provided by a user for processing component 36 to implement using the various components of vehicle 30 can comprise: “go to track 5 left side”; “wait until end of car detected”; “wait for decoupling order”; “confirm vehicle identity”; “look for pin puller lever”; “grab lever”; “release car”; “confirm release”; “repeat until end-of-train”.
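The quoted instruction list could be executed by a simple dispatcher that maps each textual command to an operation on the vehicle. The class and handler below are hypothetical stand-ins; the patent does not specify a programming interface.

```python
# Hypothetical dispatcher for a user-supplied instruction script. Real
# handlers would drive, wait on sensors, grab levers, etc.; here each
# command is simply recorded in order.

class TaskRunner:
    def __init__(self):
        self.log = []

    def _handle(self, command):
        # Stand-in for dispatching to a real vehicle operation
        self.log.append(command)

    def run(self, script):
        for command in script:
            self._handle(command)
        return self.log

script = ["go to track 5 left side", "wait until end of car detected",
          "grab lever", "release car", "confirm release"]
runner = TaskRunner()
done = runner.run(script)
print(len(done))
```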
- vehicle 30 is generally described herein as being configured for autonomous operation in response to receiving a set of tasks from another system, it is understood that vehicle 30 can operate semi-autonomously, during which a remote user can control vehicle 30 to perform one or more tasks and/or assist vehicle 30 in performing one or more assigned tasks.
- processing component 36 can transmit image data captured by imaging devices 46 , 114 A, and/or 114 B for presentation to the remote user. Additionally, processing component 36 can transmit data acquired by other sensors, such as microphone array 47 for presentation to the remote user.
- Data from imaging device 46 and microphone array 47 can provide situational awareness for the remote user, and processing component 36 and/or a remote system can analyze and supplement the data with analysis information (e.g., identified objects within the field of view, directional information of a source of a sound, and/or the like).
- Data from imaging devices 114 A, 114 B can be utilized by the remote user to perform the task(s) and/or assist processing component 36 with performing the task(s).
- FIG. 12 shows an illustrative user interface panel 210 for a remote user according to an embodiment.
- Panel 210 includes five display areas 212 A- 212 D, 214 .
- Display areas 212 A- 212 D provide different views acquired by imaging device 46 , and can provide the remote user with situational awareness regarding the environment around vehicle 30 .
- Each display area 212 A- 212 D corresponds to a different portion of an area around vehicle 30 , e.g., every ninety degrees of a circle around the vehicle 30 .
- Display area 214 provides display information for the work region, which the remote user can utilize to perform one or more actions.
- Processing component 36 can generate and transmit data for the display areas 212 A- 212 D, 214 using any solution.
- processing component 36 can compress the image data acquired from the imaging devices using, for example, the JPEG 2000 compression algorithm enhanced for low latency operation, and can transmit the image data for presentation on panel 210 using a high speed 802.11 wireless network.
- the remote user will be presented with a near real time view of the various locations to enable precise control of one or more components of vehicle 30 .
- the update periods and image quality for display areas 212 A- 212 D, and therefore the corresponding algorithms for generating and transmitting the image data, can differ from those of display area 214 , which can be most important for performing actions using vehicle 30 .
- a latency of approximately ten milliseconds or less is generally adequate for slow-speed remote operation.
- panel 210 can include one or more input devices for enabling the remote user to remotely control the operation of one or more components of vehicle 30 .
- panel 210 is shown including a pair of joysticks 216 A, 216 B, which can enable the remote user to operate any combination of the various components on vehicle 30 using any solution.
- any type of input device can be utilized, including a touch screen or the like, which a remote user can utilize to point out an object within a field of view of vehicle 30 , the bounding area 134 , and/or the like.
- processing component 36 can determine the object's coordinates with respect to vehicle 30 by employing a range sensor, laser range finder, and/or the like, and combining the range information with the imaging device's field of view using camera calibration solutions.
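The touch-to-coordinates step above can be sketched with the standard pinhole camera model: the pixel the remote user selected is back-projected at the measured range into camera-frame coordinates. The calibration values below (focal lengths, principal point) are assumed for illustration.

```python
# Pinhole back-projection: combine a selected pixel with a range
# measurement to recover the object's camera-frame position.

def pixel_to_vehicle_coords(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) at the measured range into camera-frame
    (x, y, z) coordinates using the pinhole model."""
    x = (u - cx) * depth / fx    # horizontal offset scaled by depth
    y = (v - cy) * depth / fy    # vertical offset scaled by depth
    return (x, y, depth)

# Assumed calibration: 500-pixel focal lengths, principal point (320, 240)
point = pixel_to_vehicle_coords(u=420, v=240, depth=5.0,
                                fx=500, fy=500, cx=320, cy=240)
print(point)
```

A final rigid transform from the camera frame to the vehicle frame (from the camera calibration mentioned above) would yield coordinates with respect to vehicle 30.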
- panel 210 can include alternative and/or additional input/output devices, such as a speaker, which can present audible data acquired by vehicle 30 , a microphone, and/or the like.
- the remote user can designate one or more operations, e.g., using speech recognition software or the like, which can be interpreted into commands transmitted for processing by processing component 36 .
- the various display areas can be presented on a standard monitor using a general purpose computing device executing computer program code configured to manage data for the various display areas.
- the input devices can comprise input devices for the general purpose computing device, with the user's interactions converted into a set of actions to be performed by component(s) located on vehicle 30 by computer program code executing on the computing device that is configured to interpret the interactions.
Abstract
A railway maintenance device configured for autonomous or semi-autonomous operation in a rail environment is provided. The device can process image data to move about the rail environment and perform one or more actions in the rail environment. The actions can include one or more actions related to decoupling and/or attaching rail vehicles, and can be implemented by performing three-dimensional image processing. The device can be configured to move with any movement of a rail vehicle on which one or more actions are being performed. In an alternative embodiment, the various components configured to perform the action are implemented at a stationary location with respect to a rail line.
Description
The current application is a continuation application of U.S. Utility patent application Ser. No. 12/563,577, titled “Robotic Vehicle for Performing Rail-Related Actions,” which was filed on 21 Sep. 2009 and issued as U.S. Pat. No. 8,583,313 on 12 Nov. 2013, and which claims the benefit of co-pending U.S. Provisional Application No. 61/136,624, titled “Robotic vehicle to perform railway maintenance”, which was filed on 19 Sep. 2008, both of which are hereby incorporated by reference. Aspects of the disclosure are related to U.S. Utility patent application Ser. No. 12/043,357, titled “Rail vehicle Identification and processing”, which was filed on 6 Mar. 2008, and U.S. Utility patent application Ser. No. 12/171,438, titled “Rail vehicle identification and processing”, which was filed on 11 Jul. 2008, each of which is hereby incorporated by reference.
The disclosure relates generally to a robotic system, and more particularly to processing objects, such as rail vehicles, using a robotic system.
During use, railroad wheels are subjected to high, long-term stresses. Despite being made of high-quality steel, the stresses cause the wheels to become worn over a long period of operation. Without maintenance, a wheel can become too thin or otherwise no longer of the correct geometry. Further, the wheels may develop other defects, such as, for example, a “slid flat” or “flat spot”, which is caused by locking the wheels with the brakes in an attempt to stop.
The wheels of railroad cars and locomotives cannot turn differentially since they are affixed to solid axles. As a result, any difference between the shape and/or size of the wheels on either side of a car/locomotive can cause a tendency to turn, leading to an increased possibility of derailment. Therefore, it is important to periodically inspect the wheels on railroad cars and locomotives to ensure that they remain safe to operate, both as an individual wheel and as a pair of wheels on the same axle.
The use of a J-shaped, steel wheel gauge is a common approach to inspecting rail wheels. In this approach, an inspector manually places the gauge on the wheel, ensures contact with all relevant portions of the wheel, reads the measurements from marked scales on the gauge, and manually enters the data. Similarly, an electronic wheel gauge can be used, which performs some of the functions automatically, thereby improving accuracy and reducing the overall time spent measuring the wheels. Various illustrative embodiments of handheld electronic wheel gauges are shown and described in U.S. Pat. No. 4,904,939, U.S. Patent Application Publication No. 2005/0259273, and U.S. Patent Application Publication No. 2007/0075192, each of which is incorporated by reference. In both approaches, the inspection is carried out by hand, on one wheel at a time, on a stationary train. To address this limitation, a number of approaches seek to measure rolling stock wheels while they are in motion, detect various defects through the measurements, and record the associated data in an automated fashion. Various illustrative embodiments of such measurement solutions are shown and described in U.S. Pat. No. 5,636,026, U.S. Pat. No. 6,768,551, U.S. Pat. No. 6,523,411, and U.S. Patent Application Publication No. 2007/0064244, each of which is incorporated by reference.
Frequently, rail wheels are inspected at a classification yard (e.g., hump yard, flat-shunted yard, gravity yard, and/or the like). For example, an incoming train may be halted while one or more cars are manually inspected. Often, due to time constraints, only a few cars are actually inspected and/or the inspection is only cursory (e.g., visual inspection). Subsequently, the cars on the incoming train are classified and routed to corresponding tracks for inclusion on an outgoing train. The classification is performed based on a destination for each car. Once an outgoing train is assembled, one or more cars may be manually (e.g., visually) inspected along with an inspection of the brakes for the train. Subsequently, the train will leave the classification yard for the next destination.
Most attempts to automate the routing of incoming and outgoing rail vehicles for a train require the installation of automation equipment on the individual rail vehicles. For example, in order to automate the decoupling of two rail vehicles, several approaches seek to install an additional mechanical assembly on the coupler. Another approach provides a sentinel-like structure overhanging a train, which includes a stationary multi-link robotic arm to perform decoupling (e.g., pin-pulling) of rail vehicles. However, this requires that the rail vehicles be stationary and placed in a relatively precise location and that a pin washer be present and recognizable in the coupling in order to successfully decouple the rail vehicles. Additionally, this approach does not provide a solution capable of performing other operations that may be necessary to completely decouple/couple rail vehicles (e.g., disassembly/assembly of brake hose) and/or may be desired (e.g., security, cleaning, inspection, and/or the like).
The inventors recognize, among other things, a need for a cost effective automated or semi-automated solution that improves and/or provides for various operations relating to the processing of rail vehicles. To this extent, an embodiment provides a robotic vehicle capable of performing routing-related functions of rail vehicles, evaluation of rail vehicles, security, and/or the like, in a rail yard. The robotic vehicle can be configured to: perform operations on moving rail vehicles; receive a set of tasks and execute the tasks without further operator intervention; successfully operate despite variations in object geometry, locations, and field conditions (e.g., snow/rain, cluttered scene, etc.); consume low power while operating; and/or the like.
Aspects of the invention provide a robotic vehicle configured for autonomous or semi-autonomous operation in a rail environment. The vehicle can process image data to move about the rail environment and perform one or more actions in the rail environment. The actions can include one or more actions related to decoupling and/or attaching rail vehicles, and can be implemented by performing three-dimensional image processing. The vehicle can be configured to move with any movement of a rail vehicle on which one or more actions are being performed. In an alternative embodiment, the various components configured to perform the action are implemented at a stationary location with respect to a rail line.
A first aspect of the invention provides a vehicle configured for autonomous operation, the vehicle comprising: a first imaging device configured to acquire location image data; a transport component configured to enable the vehicle to move independently; a processing component configured to process the location image data acquired by the first imaging device and operate the transport component to move the vehicle using the location image data; a set of action components, the set of action components including at least one component configured to be temporarily secured with respect to a target object external to the vehicle; means for moving the at least one component to temporarily secure the at least one component with respect to the target object; and means for enabling the set of action components to move freely with the target object after the at least one component is temporarily secured.
A second aspect of the invention provides a railway maintenance device configured for autonomous operation, the device comprising: a multi-link arm, each link capable of independent movement in at least one direction; a set of visualization components configured to capture work region image data for a region including at least one target object; an action component located on a link of the multi-link arm; and a processing component configured to process the work region image data to generate a three-dimensional representation of the region, move at least one of the links to place the component near a target object in the region using the three-dimensional representation, and operate at least one of the arm or the action component to perform a railway maintenance operation.
A third aspect of the invention provides a rail yard comprising: a railway maintenance device configured for autonomous operation, the device comprising: a multi-link arm, each link capable of independent movement in at least one direction; a set of visualization components configured to capture work region image data for a region including at least one target object; an action component located on a link of the multi-link arm; and a processing component configured to process the work region image data to generate a three-dimensional representation of the region, move at least one of the links to place the component near a target object in the region using the three-dimensional representation, and operate at least one of the arm or the action component to perform a railway maintenance operation, wherein the railway maintenance operation comprises at least one of: decoupling a pair of rail vehicles, detaching a pair of brake hoses, attaching a pair of brake hoses, or acquiring inspection data for a rail vehicle.
A fourth aspect of the invention provides a method of performing an operation on at least one rail vehicle, the method comprising: locating a vehicle adjacent to the at least one rail vehicle using location image data acquired by an imaging device located on the vehicle and a motor turning at least one of a set of wheels on the vehicle; securing at least one component of the vehicle with respect to a target object on the at least one rail vehicle in response to the locating; disengaging the motor from the at least one of the set of wheels using a clutch in response to the securing; and performing the operation on the at least one rail vehicle subsequent to the disengaging.
The illustrative aspects of the invention are designed to solve one or more of the problems herein described and/or one or more other problems not discussed.
These and other features of the disclosure will be more readily understood from the following detailed description of the various aspects of the invention taken in conjunction with the accompanying drawings that depict various aspects of the invention.
It is noted that the drawings may not be to scale. The drawings are intended to depict only typical aspects of the invention, and therefore should not be considered as limiting the scope of the invention. In the drawings, like numbering represents like elements between the drawings.
As indicated above, aspects of the invention provide a robotic vehicle configured for autonomous or semi-autonomous operation in a rail environment. The vehicle can process image data to move about the rail environment and perform one or more actions in the rail environment. The actions can include one or more actions related to decoupling and/or attaching rail vehicles, and can be implemented by performing three-dimensional image processing. The vehicle can be configured to move with any movement of a rail vehicle on which one or more actions are being performed. In an alternative embodiment, the various components configured to perform the action are implemented at a stationary location with respect to a rail line. As used herein, unless otherwise noted, the term “set” means one or more (i.e., at least one) and the phrase “any solution” means any now known or later developed solution.
An embodiment provides a robotic vehicle capable of fully autonomous and/or semi-autonomous operation for performing one or more rail-related actions including, but not limited to, rail yard maintenance/inspection, hump track maintenance, hazardous material response, perimeter security, and/or the like. Railways are under constant pressures to reduce cost and provide safer operation. It is desirable to reduce the exposure of workers to slipping within a rail yard, the requirement of workers to perform continuous monitoring, the requirement of workers to perform actions in dangerous conditions, etc. Therefore, a railway solution that incorporates a robotic vehicle capable of fully autonomous and/or semi-autonomous operation can be beneficial to the railway industry.
Turning to the drawings, FIG. 1 shows a side view of an illustrative robotic vehicle 30 according to an embodiment. Vehicle 30 comprises a housing 32, which holds a power component 34, a processing component 36, a positioning component 38, and a transport component 40 therein and can protect the various components 34, 36, 38, 40 from exposure to elements within an operating environment of vehicle 30. It is understood that FIG. 1 includes only schematic representations of components 34, 36, 38, 40 and therefore no relative size or location of the various components 34, 36, 38, 40 should be inferred from FIG. 1 . Housing 32 can comprise any type of rugged casing that is designed to withstand the rigors of use in an outdoor environment, such as a rail yard. Additionally, it is understood that the various components 34, 36, 38, 40 can be held in place within housing 32 using any solution, e.g., via bolts. Still further, it is understood that housing 32 can include various additional components, such as an environmental monitoring and/or control component, which are not shown and described herein for clarity.
As illustrated, vehicle 30 can comprise a tracked vehicle, in which transport component 40 includes a motor that drives a set of tracks 42. In this case, vehicle 30 can move without restriction in an environment susceptible to various adverse travel conditions (e.g., mud, ice, large stone ballast, etc.) and/or including rough terrain (e.g., railroad tracks, steep inclines, etc.). Each track 42 is rotated about a set of wheels 44A-44C, at least one of which (e.g., wheel 44B) is driven by a motor within transport component 40. In an embodiment, the motor comprises a pulse width modulated (PWM) direct current (DC) electrical motor, which enables vehicle 30 to operate for prolonged durations (e.g., approximately eight hours) without requiring recharging. Further, transport component 40 can include a clutch/transmission, which attaches the motor to the wheel(s). Inclusion of a clutch allows processing component 36 to disengage the clutch and allow the wheels 44A-44C and tracks 42 to move freely without using transport component 40. A direction of travel for vehicle 30 can be changed by rotating two parallel tracks 42 at different speeds and/or in different directions. In an embodiment, each track 42 comprises a rubber track; however, alternative tracks, such as steel tracks, can be utilized.
Additionally, it is understood that for various applications, vehicle 30 can comprise a wheeled vehicle, which can move unrestricted over a terrain, move along a rail, move over pathways, and/or the like. To this extent, FIG. 2 shows side and top views, respectively, of an alternative robotic vehicle 30A, which comprises a rail-based vehicle traveling over a set of rails according to an embodiment. In this case, the set of rails can run parallel to a rail line on which rail vehicles 4A-4B are moving and the set of rails can be situated a sufficient distance from rail vehicles 4A-4B so that vehicle 30A does not interfere with other operations. It is understood that, with the exception of modifications to a portion of transport component 40 (FIG. 1 ), vehicle 30A can include the same components as shown and described herein with respect to vehicle 30 (FIG. 1 ).
Returning to FIG. 1 , vehicle 30 can include an ability to operate autonomously. To this extent, processing component 36 and/or positioning component 38 can be configured to enable vehicle 30 to identify its location and identify objects within its surrounding area. For example, vehicle 30 can include an omni-directional imaging device 46, such as a low-light 360 degree omni-directional camera. Imaging device 46 can be configured to capture the omni-directional image data using moving parts (e.g., a pan/tilt/zoom imaging device) or no moving parts (e.g., using mirrors or the like to reflect the omni-directional image data onto imaging device 46). Imaging device 46 can have a resolution sufficiently high to resolve the desired details of targets of interest within a desired operating range, enabling a desired level of analysis of the image data. The level of detail, operating range, and level of analysis can be selected and vary based on the application in which imaging device 46 is used. In an illustrative embodiment, imaging device 46 comprises a two megapixel imaging device. Imaging device 46 can acquire images of the general surroundings of vehicle 30, which can be provided for processing by processing component 36 to acquire an overall situational awareness for vehicle 30. Similarly, vehicle 30 can include a set of microphones, such as an acoustic microphone array 47, which can acquire acoustic data from the environment around vehicle 30. Processing component 36 can process the acoustic data to detect a source of a sound and compute directional information for the source (e.g., by comparing phase information from the array 47), identify the type of sound, and/or the like.
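The phase/arrival-time comparison described above can be sketched for a simplified two-microphone case. The following fragment is purely illustrative (the function name, microphone spacing, and sound-speed constant are assumptions, not part of the disclosure):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def bearing_from_tdoa(delta_t, mic_spacing):
    """Estimate the bearing of a sound source, in radians from broadside,
    given the arrival-time difference between two microphones (seconds)
    and their spacing (meters) -- the far-field plane-wave approximation."""
    ratio = SPEED_OF_SOUND * delta_t / mic_spacing
    ratio = max(-1.0, min(1.0, ratio))  # clamp against measurement noise
    return math.asin(ratio)
```

A full array 47 would combine many such pairwise estimates, but the geometry of each pair reduces to this arcsine relation.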
Additionally, vehicle 30 can include a set of collision avoidance sensors, such as sensor 48, each of which can be mounted on housing 32. A sensor 48 can comprise any type of sensor, such as an ultrasonic sensor, a laser scanning sensor, and/or the like, which is capable of acquiring sufficient data to enable processing component 36 to detect objects relatively close to vehicle 30. Processing component 36 can process the data acquired by sensor(s) 48 to determine a distance, a relative speed, a relative direction of travel, and/or the like, of the closest objects. The compiled data from the set of sensors 48 can provide a complete distance map and potential trajectory of the nearest objects to the vehicle 30.
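The reduction of sensor 48 data to a distance map and threat estimate can be illustrated as follows; the reading format and function name are assumptions made for the sketch, not the actual interface of sensor 48:

```python
def nearest_threat(readings):
    """Reduce a list of (distance_m, closing_speed_mps) readings from the
    collision avoidance sensors to the closest object and an estimated
    time-to-contact in seconds (infinite when the object is not closing)."""
    dist, speed = min(readings)  # min() orders by distance first
    ttc = dist / speed if speed > 0 else float("inf")
    return dist, ttc
```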
It is understood that while DGPS and GPS signals are discussed herein, any type of geographic location signals, such as LORAN (Long Range Navigation), GLONASS (Global Navigation Satellite System), and/or the like, can be utilized. Similarly, an embodiment of positioning component 38 can receive signals from a set of beacons to derive location information using triangulation. Alternatively, positioning component 38 can derive the location information from an intersection point for a set of vectors derived from the relative field strength, time of arrival (TOA), time difference of arrival (TDOA), phase difference between signals, and/or the like. It is understood that various types of beacons, such as ultra-wideband location beacons, cell phone towers, pager towers, distance measuring equipment (DME), VHF Omni-directional Radio Range (VOR), LORAN, and/or the like can be utilized.
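The beacon-based location derivation can be sketched for the simplest case of time-of-arrival ranges to three fixed beacons in a plane; the function below is an illustrative two-dimensional trilateration, not the implementation of positioning component 38:

```python
def trilaterate(beacons, ranges):
    """Locate a receiver from ranges to three 2-D beacons. Subtracting the
    circle equations pairwise yields a linear system in (x, y)."""
    (x0, y0), (x1, y1), (x2, y2) = beacons
    r0, r1, r2 = ranges
    a1, b1 = 2 * (x1 - x0), 2 * (y1 - y0)
    c1 = r0**2 - r1**2 + x1**2 - x0**2 + y1**2 - y0**2
    a2, b2 = 2 * (x2 - x0), 2 * (y2 - y0)
    c2 = r0**2 - r2**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a1 * b2 - a2 * b1  # zero when the beacons are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

TDOA- or field-strength-based variants replace the ranges with range differences or attenuation models, but reduce to a similar intersection computation.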
In addition to geographic location information, processing component 36 can send data to and receive data from other systems that are remote from vehicle 30. To this extent, vehicle 30 can include an antenna 52, such as a radio frequency (RF) antenna, to send and receive data via a wireless communications solution. In an embodiment, vehicle 30 sends/receives data to/from other systems using antenna 52 via a wireless fidelity (Wi-Fi) link under the 802.11b protocol and a secure link that is not susceptible to jamming or other interference. However, it is understood that other types of wireless communications links, operating frequencies, communications protocols, and/or the like, can be used including, but not limited to, a microwave link, an infrared link, 802.11g, TCP/IP protocol, and/or the like.
Regardless, an embodiment of the wireless communications solution enables multiple vehicles 30 to operate within an area and communicate with the same system(s) without the communications conflicting. To this extent, a group of vehicles 30 can be implemented within a work area, and cooperatively address various maintenance-related, dangerous, and/or routine tasks for the work area. For example, multiple vehicles 30 can be deployed in a rail yard to perform inspections, monitor the area, decouple rail vehicles, and/or the like. The vehicles 30 can be configured to communicate with a central system and/or with one another to request assistance or perform some action, when necessary.
The set of action components 110 can be selected and configured to perform any combination of various actions including, but not limited to, inspection of an object (e.g., using an imaging device, one or more sensors such as infrared sensors, chemical sensors, etc.), repair/repair assistance, coupling/decoupling, and/or other types of manipulations of objects. The set of action components 110 can be operated and controlled by processing component 36 and/or can be operated under the direction of one or more remote users.
To this extent, vehicle 30 can be configured to perform operations that require a highly precise machine vision system to locate object(s) that are to be manipulated using the set of action components 110 in various outdoor operating environments. In this case, a two-dimensional machine vision system can be fooled by, for example, falling snow partially covering a part, plain scene clutter with too many similar-looking objects, many variations in two-dimensional shapes, other objects which appear similar when viewed in a two-dimensional plane, and/or the like. In an alternative more particular embodiment, the set of visualization components 108 can include a pair of imaging devices 114A, 114B, which capture image data that can be used to create a stereo image, e.g., an image including a three-dimensional set of coordinates, of an object on which vehicle 30 is configured to perform one or more manipulations. In still another alternative more particular embodiment, the set of visualization components 108 can include a light source 112 that emits a pattern of light, such as a sheet (e.g., line) of light, and one or more imaging devices 114A, 114B that capture images of an area as the light reflects off of different surfaces within the area. Light source 112 can be configured to move the pattern of light across the area, or light source 112 can be moved with respect to the area. In any event, processing component 36 can process the multiple images to generate a complete profile for the object(s) within the field of view, e.g., by reconstructing structure from motion (SFM), shape from shadows, and/or the like. For example, processing component 36 can use arm 102 to capture multiple images of an object for which three-dimensional information is desired. Processing component 36 can use image geometry and the multiple images to compute the three-dimensional information.
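The stereo geometry underlying imaging devices 114A, 114B can be sketched with the standard depth-from-disparity relation for a rectified camera pair; the function, focal length, and baseline below are illustrative assumptions rather than parameters of the actual visualization components 108:

```python
def stereo_depth(x_left, x_right, focal_px, baseline_m):
    """Depth (meters) of a scene point from its horizontal pixel
    coordinates in a rectified stereo pair: Z = f * B / disparity."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    return focal_px * baseline_m / disparity
```

With the depth known, the X and Y coordinates follow from similar triangles, giving the three-dimensional set of coordinates mentioned above.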
It is understood that while only a single light source 112 is shown, vehicle 30 can include any combination of two or more light sources 112. For example, light source 112 can comprise a large number of light emitting diodes (LEDs) with a diffuser plate in front of the LED array. Further, additional light sources 112 can be mounted around imaging devices 114A, 114B to reduce shadows on the object(s) within the field of view caused by the illumination.
In order for vehicle 30 to perform a task on an object whose three-dimensional coordinates are known, processing component 36 can plan movement of links 104A-104D in a manner that ensures the links 104A-104D travel in a direction that moves closer to the object without spending time searching the three-dimensional travel space. In an embodiment, processing component 36 implements an inverse kinematics-based solution to position the links 104A-104D with respect to a target object. Inverse kinematics comprise a class of computer solutions that can plan the path for the links 104A-104D given a destination for the links 104A-104D. Processing component 36 can use linear equations, Jacobian models, and optimized 3D space searches to quickly calculate an optimal path for the links 104A-104D until the links 104A-104D arrive at a desired location with respect to the object.
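The Jacobian-based planning mentioned above can be illustrated with a planar two-link arm solved by iterating the Jacobian-transpose rule; the link lengths, gains, and function names are illustrative assumptions, and the actual links 104A-104D would use a higher-dimensional version of the same computation:

```python
import math

L1, L2 = 0.5, 0.4  # link lengths in meters (illustrative values)

def fk(q1, q2):
    """Forward kinematics: end-effector (x, y) for joint angles (q1, q2)."""
    return (L1 * math.cos(q1) + L2 * math.cos(q1 + q2),
            L1 * math.sin(q1) + L2 * math.sin(q1 + q2))

def ik_step(q1, q2, target, gain=0.5):
    """One Jacobian-transpose update: q += gain * J^T * (target - fk(q))."""
    x, y = fk(q1, q2)
    ex, ey = target[0] - x, target[1] - y
    s12, c12 = math.sin(q1 + q2), math.cos(q1 + q2)
    # Partial derivatives of (x, y) with respect to (q1, q2).
    j11, j12 = -L1 * math.sin(q1) - L2 * s12, -L2 * s12
    j21, j22 = L1 * math.cos(q1) + L2 * c12, L2 * c12
    return (q1 + gain * (j11 * ex + j21 * ey),
            q2 + gain * (j12 * ex + j22 * ey))

def solve_ik(target, iters=1000):
    """Iterate toward joint angles that place the end effector at target."""
    q = (0.3, 0.3)
    for _ in range(iters):
        q = ik_step(*q, target)
    return q
```

Each iteration moves the joints in a direction that reduces the remaining distance to the target, which is the "moves closer to the object without searching" property the passage describes.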
In an illustrative application, which is used to describe aspects of the invention herein, a classification yard includes a system for evaluating and/or performing routine maintenance on rolling stock in each of many consists (e.g., one or more connected rail vehicles) of rolling stock using vehicle 30. The system can route any rolling stock that is evaluated as including one or more designated defects to a maintenance area, which can address the defect(s) before allowing the rolling stock to be included on a train that is sent out to various destinations for delivery. In this manner, the system can improve: safety by reducing a likelihood of an accident in which one or more of the defects is a contributing cause; efficiency by removing defects that can lead to increased energy expenditure during operation; and/or the like.
Any combination of various components of rail vehicles can be inspected using evaluation component 22 and/or inspection component 27. For example, evaluation component 22 can acquire measurement data for performing an inspection of the various couplings between two connected rail vehicles, which can be provided for processing by inspection component 27. When inspection component 27 determines that a rail vehicle does not pass the inspection (e.g., includes one or more defects), processing system 20 can determine whether the defect(s) is (are) of the type that can be repaired locally (e.g., in-situ or on a local track).
Diverting a vehicle from a consist to another track, e.g., due to the presence of one or more defects, requires that the vehicle be detached from the other vehicles in the consist. However, a local repair can include a repair of one or more minor defects (e.g., reattachment of a brake line) associated with the couplings between two rail vehicles. Vehicle 30 can be directed to travel to a location between two rail vehicles and detach/attach various connectors (coupling mechanisms, brake hoses, etc.) when assembling or disassembling a consist, performing a repair, detaching a bad-ordered vehicle, and/or the like. These tasks are inherently dangerous, as they previously required that a worker get between two vehicles in a consist, which may be moving constantly or may suddenly stop or start without warning in classification yard 10. Even a very small movement by a 300,000-pound rail vehicle can be potentially lethal for a worker between two vehicles at that moment.
When processing system 20 determines that one or more defects on a vehicle cannot be repaired in-situ or that one or more repairs were unsuccessful, the vehicle can be detached from a consist using vehicle 30 and processing system 20 can route the vehicle to one of a set of maintenance tracks 18. Processing system 20 can route the vehicle to the maintenance track 18 via rail line 14 or via rail line 17, e.g., when a repair was first attempted on loop-back rail line 15. In any event, one or more maintenance personnel at a maintenance shop 19 can perform the repair, and indicate to processing system 20 one or more details of the repair (e.g., time, materials, etc.). Subsequently, processing system 20 can route the rail vehicle for re-inspection, e.g., via rail lines 15, 17.
As discussed herein, processing system 20 can utilize a set of robotic vehicles to perform one or more operations required to process the rail vehicles as they pass through classification yard 10. To this extent, referring to FIGS. 1 and 4 , robotic vehicle 30 includes a set of application-specific components 100 that enable robotic vehicle 30 to perform various operations required within classification yard 10. In an illustrative application described further herein, robotic vehicle 30 is configured to operate a coupler release handle for decoupling two rail vehicles, disconnect/reconnect railway brake hoses, identify, evaluate, and/or repair the rail vehicles, operate external systems, and/or the like.
In any event, processing component 36 can interact with one or more external systems, such as management component 24, to receive operating instructions corresponding to a consist being processed through classification yard 10 and/or obtain assistance. For example, management component 24 can identify the particular couplings in a consist of rail vehicles that require decoupling. In an embodiment, processing component 36 can receive a set of instructions for a consist as it approaches, and can receive updates to the instructions in response to evaluations performed on the various rail vehicles. For example, the initial set of instructions can indicate the various locations at which the consist must be decoupled according to the corresponding destinations for the rail vehicles, while the updated instructions can indicate a rail vehicle that must be decoupled in order to further evaluate and/or perform maintenance on the vehicle before it is moved through the classification yard 10. The instructions can uniquely identify the coupling to be decoupled using any solution, e.g., based on a location of vehicle 30, location of coupling within the consist, identification of front rail vehicle, and/or the like.
When vehicle 30 is unable to complete a task independently, such as decoupling two rail vehicles, vehicle 30 can obtain assistance from an external source. For example, when processing component 36 cannot locate an object, such as a coupler release handle or brake hoses, processing component 36 can provide image data corresponding to an area in which the object should be present for evaluation by an individual. Further, processing component 36 can request assistance for a particular task when a task cannot be performed within a predetermined amount of time. Still further, processing component 36 can control one or more retarder devices 13 to slow rail vehicles passing thereby, e.g., in order to reduce the tension between two rail vehicles to allow them to separate. In an embodiment, vehicle 30 transmits RF signals directly to the set of retarder devices 13. Alternatively, processing component 36 can signal an external system, such as management component 24, which in turn operates the set of retarder devices 13 accordingly.
As discussed herein, the set of action components 110 can be configured to manipulate one or more objects, such as in decoupling a pair of rail vehicles. To this extent, the set of action components 110 can be configured to operate a coupler release handle, which can be located on one or both sides of the coupler mechanism of a rail vehicle. When operation of the coupler release handle on one side of the rail vehicle fails, the coupler release handle located on the other side of the rail vehicle can be utilized.
Regardless, once vehicle 30 is located adjacent to a work region, e.g., adjacent to the region between a pair of rail vehicles to be decoupled, processing component 36 can acquire image data from imaging devices 114A, 114B and process the image data to locate an object to be manipulated, such as the coupler release handle, and operate the set of action components 110 to perform the manipulation. The set of imaging devices 114A, 114B provide a detailed view of a particular object to be manipulated, which enables processing component 36 to implement machine vision-related processes locally and perform refined object recognition.
To decouple rail vehicles 4A, 4B, a coupler release handle 128 is moved a relatively small amount in a direction perpendicular to the handle's axis to operate a coupler release mechanism 130. That is, release handle 128 is moved vertically when release handle 128 extends horizontally or horizontally when release handle 128 is substantially vertical in orientation, as shown. Operation of release handle 128 in this manner causes coupler release mechanism 130 to release the components 124A, 124B of coupler assembly 122. An operator will determine whether rail vehicles 4A, 4B have separated, and if not, may need to operate release handle 128 again. Brake hoses 126A, 126B are generally connected to a corresponding portion of coupler assembly 122 by a wire or chain harness 132. As rail vehicles 4A, 4B separate, harness 132 exerts angular force upon the brake hose connection, causing the brake hoses 126A, 126B to separate. It is understood that coupling mechanism 120 can include mirrored components of release handle 128 and brake hoses 126A, 126B on an opposite side, which are not shown for clarity.
Referring to FIGS. 1 and 5 , the set of imaging devices 114A, 114B can acquire image data having a field of view approximately that shown in FIG. 5 . Processing component 36 can identify certain features/objects within the field of view and locate a bounding area 134 within which coupler release handle 128 should be located. For example, processing component 36 can identify the start of rail vehicle 4B, ground level, and the closest side of vehicle 4B to vehicle 30. Processing component 36 can define the bounding area 134 as a rectangular volume having an X-axis that starts one foot in front of vehicle 4B and ends two feet in front of vehicle 4B, a Y-axis that starts one foot above ground level and ends two and a half feet above ground level, and a Z-axis that extends from one foot beyond the vehicle 30 to a half foot beyond the closest side of vehicle 4B. Subsequently, processing component 36 can process image data corresponding to the bounding area 134 to recognize coupler release handle 128, which can be a lever, a hook, an inverted "C" lever, and/or the like.
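The construction of bounding area 134 from the recognized reference features can be sketched as follows. The offsets follow the dimensions given above (in feet); the function signature and coordinate conventions are illustrative assumptions:

```python
def coupler_search_box(front_x, ground_y, near_z, far_z):
    """Bounding volume for the coupler release handle, built from
    recognized reference features: front_x is the start of rail vehicle 4B,
    ground_y is ground level, and near_z/far_z bound the depth of the
    search region between the robot and the far side of vehicle 4B."""
    return ((front_x + 1.0, front_x + 2.0),    # X: 1 to 2 ft in front of 4B
            (ground_y + 1.0, ground_y + 2.5),  # Y: 1 to 2.5 ft above ground
            (near_z, far_z + 0.5))             # Z: to 0.5 ft past far side

def inside(point, box):
    """True when a 3-D point lies within the bounding volume."""
    return all(lo <= c <= hi for c, (lo, hi) in zip(point, box))
```

Only the 3D points that fall inside the returned volume need be passed to the handle-recognition stage, which prunes scene clutter before any template matching occurs.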
Once a 3D representation is available, processing component 36 can process the data points to identify the various objects. In an embodiment, processing component 36 uses invariant features representing the objects to match against the 3D points. For example, a circular "donut like" object can be described by a set of radii and thickness of the donut. In an illustrative approach, processing component 36 can fit the acquired 3D data points through a curve fitting algorithm to determine whether a match against a previously stored invariant template occurs. In another approach, processing component 36 can use more general purpose 3D volume fitting algorithms, e.g., the open-source Visualization Toolkit (VTK), or the like, to recognize the object(s). It should be noted that with abundant processing power currently available, 3D volume fitting and template matching have become practical approaches. Determination and evaluation of the three-dimensional points provide improved object recognition when work piece surface conditions, work piece color, and/or the like, can vary substantially, as occurs in outdoor railway conditions where rust is frequently present. Regardless, it is understood that processing component 36 can analyze other features of the object, such as color, texture, approximate location, etc., in addition to its shape to accurately identify the object. Further, it is understood that processing component 36 can identify the various objects by being trained to recognize certain stored patterns, shapes, parts, etc. For example, the VOLTS-IQ visual intelligence software suite, offered by Braintech, comprises an approach to train processing component 36 to identify parts and shapes for later recall and usage.
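The curve-fit/invariant-template idea can be illustrated in two dimensions: fit a radius (an invariant feature) to a cluster of points and compare it against stored templates. The template name, dimensions, and tolerances below are invented for the sketch:

```python
import math

# Illustrative template library: invariant features with match tolerances.
TEMPLATES = {"ring_part": {"radius": 0.08, "tol": 0.01}}  # meters

def match_ring(points, templates=TEMPLATES):
    """Fit a circle radius to a cluster of 2-D points (mean distance from
    the centroid) and return the name of the first matching template,
    or None when no stored invariant template matches."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    radius = sum(math.hypot(x - cx, y - cy) for x, y in points) / len(points)
    for name, t in templates.items():
        if abs(radius - t["radius"]) <= t["tol"]:
            return name
    return None
```

Because the fitted radius does not depend on the object's position, orientation, color, or surface rust, the match survives exactly the outdoor variability the passage describes; the 3D volume-fitting approaches generalize the same idea to richer shape descriptors.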
Once identified, processing component 36 can operate one or more components in the set of action components 110 to move coupler release handle 128 to decouple vehicles 4A, 4B. To this extent, FIGS. 6A-6D show operation of an illustrative manipulator 140, which can be located at an end of link 104D (FIG. 1 ), according to an embodiment. FIGS. 6A, 6B show a front view and top view, respectively, of manipulator 140. Manipulator 140 includes a manipulator attachment 142, a rotation mechanism 144, a gripping mechanism 146, and a plurality of fingers 148A-148C. Each finger 148A-148C can be equipped with a set of sensors, which can, for example, provide information on an amount of force being exerted on the finger 148A-148C. The set of sensors can include, for example, tactile sensor(s), pressure sensor(s), force sensor(s), torque sensor(s), and/or the like. Gripping mechanism 146 includes a plurality of tracks 150A-C along which fingers 148A-C can move. Further, rotation mechanism 144 can enable gripping mechanism 146 and fingers 148A-C to be rotated about its axis. Still further, as illustrated in FIG. 6C , manipulator 140 can be attached to link 104D, which can provide horizontal and/or vertical movement of manipulator 140. It is understood that the various movements described herein can be implemented using any combination of one or more types of motion control components including, but not limited to, servo motors, stepper motors, muscle wire, harmonic drives, feedback encoders, electronic motor drives, feedback torque sensors, and/or the like.
During operation of release handle 128, processing component 36 can use data from the force sensors on fingers 148A-148C to determine, for example, whether any unusual/abnormal resistance or lack of resistance occurs while the release handle 128 is being moved. Further, processing component 36 can determine various other faults using any solution. For example, processing component 36 can determine a fault due to a failure to release rail vehicles 4A, 4B after a predetermined number of tries, a broken (e.g., stuck or missing) component in coupling mechanism 120, and/or the like. In this case, processing component 36 can generate an alarm, which can be presented to a user for action.
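The resistance-based fault check can be sketched as a simple range test over the force profile sampled while the handle moves; the expected-force calibration range and function name are illustrative assumptions:

```python
def handle_faults(force_profile, expected_range=(40.0, 160.0)):
    """Scan a sequence of force readings (newtons) taken while the release
    handle is moved, and flag samples outside the expected range: too low
    suggests a broken/missing linkage, too high suggests a stuck component."""
    lo, hi = expected_range
    faults = []
    for i, force in enumerate(force_profile):
        if force < lo:
            faults.append((i, "lack of resistance"))
        elif force > hi:
            faults.append((i, "abnormal resistance"))
    return faults
```

A non-empty result would trigger the alarm generation described above, e.g., after the predetermined number of retries is exhausted.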
Returning to FIG. 1 , it is understood that processing component 36 can utilize the same or similar processes in order to identify various other types of objects for other applications, and that the set of action components 110 can be configured to perform various other types of actions on these objects for other applications. To this extent, the set of action components 110 can comprise one or more additional components and/or alternative components. Further, as discussed herein, vehicle 30 can comprise multiple arms 102, each having a different set of action components 110.
Within the railroad environment, processing component 36 can identify brake hoses 126A, 126B (FIG. 5 ) for subsequent connection or separation using a three-dimensional machine vision solution discussed herein. In an embodiment, since brake hoses 126A, 126B must meet dimensional tolerances specified by the Association of American Railroads (AAR), vehicle 30 can be configured to use active machine vision metrology, such as imaging reflections of a sheet of light generated by light source 112, to identify the brake hoses 126A, 126B from other background objects by gauging the dimensions of the brake hoses 126A, 126B. In this case, the set of action components 110 can comprise a manipulator configured to decouple railway brake hoses 126A, 126B.
Processing component 36 (FIG. 1 ) can operate manipulator 160 to detach a pair of connectors 8A, 8B for a standard rail brake hose 126A, 126B (FIG. 5 ) on rail vehicles 4A, 4B. To this extent, FIGS. 7C, 7D show manipulator 160 being used to detach connectors 8A, 8B. Initially, processing component 36 can operate arm 102 (FIG. 1 ) to locate manipulator 160 so that each restraining rod 162A, 162B is located above the brake hose adjacent to a corresponding connector 8A, 8B, respectively, while contact component 164 is located below connectors 8A, 8B. The spacing between restraining rods 162A, 162B can be selected such that each restraining rod 162A, 162B can be located near where connectors 8A, 8B meet the brake hose. Further, processing component 36 can adjust a width of spacer 170 using any solution to enable the corresponding locations of restraining rods 162A, 162B to be adjusted. Similarly, a distance between a top of contact component 164 and a bottom of restraining rods 162A, 162B can be selected such that connectors 8A, 8B will readily fit between. Further, processing component 36 can adjust the distance by adjusting a length of vertical supports 168A, 168B and/or a height of contact component 164 using any solution (e.g., via piston 166).
Once manipulator 160 is positioned appropriately, processing component 36 can move contact component 164 upward toward connectors 8A, 8B using piston 166. Contact component 164 will force connectors 8A, 8B to move upward, while the brake hose 126A, 126B (FIG. 5 ) is prevented from moving upward by restraining rods 162A, 162B. As a result, connectors 8A, 8B will swivel away from one another, resulting in the hoses 126A, 126B becoming separated. Processing component 36 can confirm that the brake hoses have been decoupled using any solution (e.g., by analyzing image data of the work region, data from one or more pressure sensors located on manipulator 160, or the like). Once a successful decoupling operation is performed, processing component 36 can transmit its success to another system, such as management component 24 (FIG. 4 ), and wait for instructions or commence its next action.
It is understood that manipulators 140, 160 and the functionality described therewith are only illustrative of numerous types of manipulation devices and functions, which can be implemented on vehicle 30 and utilized to perform a wide variety of tasks. To this extent, processing component 36 can utilize the same/similar components and processes described herein to perform additional actions in an operating environment, such as one or more repairs or maintenance tasks in a classification yard. For example, FIG. 8 shows illustrative use of manipulator 160 to rotate a brake wheel 6 located on a front/back side of a rail vehicle 4 according to an embodiment. Brake wheel 6 can be rotated to release stuck brakes on rail vehicle 4. Similarly, manipulator 160 can be utilized to bleed the brake system of a rail vehicle 4 by pulling an air system release lever located on the side of a rail vehicle 4. For example, manipulator 160 can grasp the release lever and the arm can be operated to pull the lever down to bleed off the air pressure for a few seconds. Additionally, vehicle 30 can operate a hose to clean a portion of the work area, such as a maintenance pit, between tracks, rail vehicles 4, and/or the like.
Further, manipulator 160 can be implemented with restraining rods 162A, 162B capable of movement similar to human fingers. In this case, in order to grasp an object, processing component 36 can implement grasp planning prior to moving the restraining rods 162A, 162B. The grasp planning can comprise, for example, one or more grasp planning optimization algorithms, such as grasp analysis, grasp workspace determination, grasp solution computation within the workspace, and/or the like. In an embodiment, vehicle 30 can comprise a pair of manipulators 160, each of which can be used to grasp one of a pair of disconnected brake hoses between two rail vehicles and attach the brake hoses, e.g., as part of assembling a consist for an outbound train.
Returning to FIG. 1 , the set of action components 110 on vehicle 30 can be configured to acquire data for inspecting one or more aspects of an object. To this extent, processing component 36 can be configured to acquire measurement data for and/or perform maintenance on various components of a rail vehicle. For example, the set of action components 110 can comprise a non-destructive testing head, which can be applied to or placed in proximity with an object and used to acquire data regarding the object. In various embodiments, the testing head can comprise: an electromagnetic acoustic transducer (EMAT)-based testing head, such as shown and described in U.S. Pat. No. 6,523,411; a handheld electronic gauge, such as shown and described in U.S. Pat. No. 7,525,667; a handheld measurement device, such as shown and described in U.S. Pat. No. 7,478,570; and/or the like. Processing component 36 can operate arm 102 to apply the testing head to various locations of a rail wheel and probe for flaws within the rail wheel (e.g., a crack), gather wheel stress measurements, gather dimensional measurements (e.g., diameter), and/or the like.
Similarly, the set of action components 110 can comprise a set of data acquisition devices, such as an imaging device, a chemical/biological sensor, an infrared imaging device (e.g., an active illumination infrared camera or a passive infrared camera), a multi-spectral imaging device, and/or the like, which processing component 36 can position accordingly to acquire measurement data, such as image data, chemical/biological levels, heat data, and/or the like. Processing component 36 can analyze the measurement data to determine the presence of one or more unsafe conditions (e.g., a leak, hot brakes, uneven wear, hidden compartments, etc.). In an embodiment, a vehicle 30 can respond to an accident that may have resulted in a leak of hazardous material to determine a level and severity of the spill. Likewise, imaging device 46 and/or the set of action components 110 can be configured to acquire various types of image data for the surrounding area to perform security-related actions. For example, imaging device 46 can acquire infrared image data, which processing component 36 can evaluate to determine the presence of unauthorized individuals in the work region regardless of scene clutter and/or weather conditions.
Moreover, vehicle 30A (FIG. 2 ) can be configured to perform track-related maintenance and/or inspection as it moves along the tracks. To this extent, vehicle 30A can be configured to visually inspect the track for defects, make track-based measurements (e.g., gauge, profile, etc.), make minor repairs to railway tie spikes or other track-related infrastructure, etc. Outside of the railway industry, embodiments of vehicle 30 can be configured to perform inspections of road-based vehicles, water-based vehicles, explosive ordnance disposal, remote inspection and/or monitoring, and/or the like.
For some applications, such as within classification yard 10 (FIG. 4 ), the rail vehicles may move unexpectedly, change direction, or be continuously moving. To this extent, referring to FIG. 1 , robotic vehicle 30 can maneuver itself and align itself with a coupling or corresponding work region of a rail vehicle and secure itself with respect to the rail vehicle (e.g., by latching on to the release handle). Additionally, robotic vehicle 30 can align its tracks 42 to be parallel with the rail on which the rail vehicle is located. Alternatively, when robotic vehicle 30 is implemented as a rail-based vehicle 30A (FIG. 2 ), the corresponding rails can be aligned accordingly. Regardless, processing component 36 can disengage the clutch within transport component 40 to allow tracks 42 to move freely. In this manner, arm 102 and the components thereon can remain stationary with respect to the work region of a rail vehicle even if the rail vehicle is moving or suddenly starts/stops. Once vehicle 30 has completed the task, processing component 36 can unsecure vehicle 30 from the rail vehicle and re-engage the clutch to enable vehicle 30 to move on its own.
In an embodiment, processing component 36 can extend arm 102 within the gap between a pair of connected rail vehicles. In this case, processing component 36 can implement a mechanical tracking solution, such as a stabilization platform, which stabilizes a position of the arm 102 with respect to the rail vehicles. Further, vehicle 30 can include a sensor to acquire the speed of the rail vehicles, and processing component 36 can adjust the speed of vehicle 30 to match the speed of the rail vehicles and move in-step with them (e.g., track the movement of the rail vehicles). Processing component 36 can track the vehicle movement using image-based invariant feature tracking, particle filters, or binary correlation that matches features in a spatio-temporal image stream with features of a template image of the target. Still further, processing component 36 can implement finer adjustments by moving arm 102 in response to determining a difference in location for an object within a field of view of an imaging device, or the like.
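The speed-matching behavior described above can be sketched as a simple proportional control loop. The gain, step limit, and function names are hypothetical, not values disclosed in the embodiment:

```python
def match_speed(own_speed, target_speed, gain=0.5, max_step=0.2):
    """One control update (illustrative): move the vehicle's speed toward
    the sensed rail-vehicle speed, limited to max_step per update so the
    arm platform is not jerked while tracking."""
    error = target_speed - own_speed
    step = max(-max_step, min(max_step, gain * error))
    return own_speed + step

def track(own_speed, target_speed, steps):
    """Run repeated updates, converging on the rail vehicles' speed."""
    for _ in range(steps):
        own_speed = match_speed(own_speed, target_speed)
    return own_speed
```

The image-based tracking and arm adjustments mentioned above would then correct the residual error this coarse loop leaves.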
It is understood that arm 102 and the various components for performing actions and/or acquiring data on an object can be implemented in a stationary location. For example, arm 102 can be permanently mounted adjacent to a rail line, such as rail line 14 (FIG. 4 ). Further, arm 102 can be permanently mounted between a pair of rails. In this manner, arm 102 and the components thereon, can be configured to acquire data/perform manipulations of various components located on the underside of the rail vehicles (e.g., brakes, suspension, axle, etc.). Further, arm 102 can access and manipulate the release handle 128 (FIG. 5 ) and/or brake hoses 126A, 126B (FIG. 5 ) of the coupling mechanism 120 (FIG. 5 ).
However, due to potential movement of rail vehicles, rather than an arm 102 at a fixed location, another embodiment provides a vehicle 30 configured to operate between a pair of rails on which rail vehicles travel. To this extent, FIG. 9 shows a side view of another illustrative robotic vehicle 30B according to an embodiment. Robotic vehicle 30B is configured to operate between a pair of rails. In this case, when arm 102 is not extended, vehicle 30B can comprise a profile that is approximately six inches high and a width that is approximately two feet or less. As illustrated, arm 102 can comprise a three-link arm, in which the first and second links rotate around connection points, and the third link, with the set of action components 110, can extend from the second link.
When vehicle 30B is properly positioned, e.g., underneath the spacing between a pair of rail vehicles, processing component 36 can raise and extend the set of action components 110 to a desired location to perform the corresponding action(s). For example, the set of action components 110 can be configured to decouple the rail vehicles. To this extent, FIG. 10 shows an illustrative configuration in which vehicle 30B can image a coupling mechanism 120 from below using a set of visualization components 108 according to an embodiment. As illustrated, vehicle 30B (FIG. 9 ) can position the set of visualization components 108 to enable the capture of cross-sectional image data of the coupling mechanism 120.
Returning to FIGS. 1 and 4 , an embodiment of vehicle 30 is configured to move throughout a rail yard, such as a classification yard 10, without its movement being limited to tracks, within tracks, or the like. To this extent, processing component 36 can be configured to determine a location of vehicle 30 within the classification yard 10. FIG. 11 shows a top view of a portion of a rail yard 10A according to an embodiment. As illustrated, the rail yard 10A comprises a large number of tracks that are configured in a structured manner. The various tracks can be as close as approximately ten feet to each other, which requires vehicle 30 to be capable of determining its location with a high degree of accuracy to avoid collision with rail vehicles moving on the tracks.
Additionally, the map of rail yard 10A can be supplemented with information utilized by vehicle 30. For example, rail yard 10A can comprise several virtual cells 204A-204C. Each virtual cell 204A-204C can have a corresponding size, risk probability, volume, action list, maximum speed, and/or the like, associated therewith. Once within a virtual cell 204A-204C, processing component 36 can operate vehicle 30 accordingly. For example, in a virtual cell in which rail vehicles infrequently travel and are slow moving, processing component 36 can move vehicle 30 alongside a rail. However, in a virtual cell that comprises a high volume of and/or fast moving rail vehicles, processing component 36 can carefully analyze the rails before approaching a rail. In the case of decoupling rail vehicles, processing component 36 can direct vehicle 30 to a location of the first coupling mechanism to be decoupled based on the action list. It is understood that virtual cells 204A-204C are only illustrative, and rail yard 10A can comprise numerous virtual cells.
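One possible representation of such a virtual cell and its associated operating rules is sketched below; the attribute names, risk threshold, and speed values are illustrative assumptions, not disclosed parameters:

```python
from dataclasses import dataclass, field

@dataclass
class VirtualCell:
    """Hypothetical virtual-cell record: each cell carries the attributes
    the description lists (risk probability, speed cap, action list)."""
    name: str
    risk: float                # probability of encountering moving rail vehicles
    max_speed: float           # speed cap (m/s) while inside the cell
    actions: list = field(default_factory=list)

def speed_limit(cell, cruise=2.0):
    """Clamp the vehicle's cruise speed by the cell's cap, slowing
    further in high-risk (high-volume / fast-moving) cells."""
    cap = cell.max_speed
    if cell.risk > 0.5:
        cap *= 0.5             # be more cautious where traffic is heavy
    return min(cruise, cap)
```

With this structure, processing component 36's behavior on entering a cell reduces to a lookup of the cell's attributes.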
As discussed herein, vehicle 30 also can include a positioning component 38 (FIG. 1 ), which can comprise a geographic location radio signal receiver, which acquires geographic location information for vehicle 30 from a geographic location transmitter. To this extent, rail yard 10A is shown including a set of location transmitters 200A-200C, such as ultra-wideband location beacons, according to an embodiment. Each location transmitter 200A-200C can transmit a signal that is received by positioning component 38. Processing component 36 can process one or more attributes of the respective signals to determine an area 202 corresponding to a location of vehicle 30.
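When the processed signal attributes yield range estimates to the transmitters, the area 202 can be estimated by standard two-dimensional trilateration. The sketch below assumes three non-collinear transmitters and noiseless ranges, which is an idealization of the disclosed arrangement:

```python
def trilaterate(beacons, distances):
    """Estimate (x, y) from three beacon positions and range estimates by
    subtracting circle equations to obtain a linear 2x2 system."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = distances
    # Subtract circle 1 from circles 2 and 3: each difference is linear in x, y.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-9:
        return None  # collinear beacons: position is ambiguous
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

In practice the ranges are noisy, so the result defines an area (such as area 202) rather than an exact point.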
In another embodiment, a work environment, such as rail yard 10A, can be supplemented with one or more alternative markers for assisting vehicle 30 and processing component 36 in moving throughout the work environment. For example, a line can be painted on a surface along which vehicle 30 travels, which processing component 36 can utilize to route itself accordingly. Similarly, an emitting source, such as a buried wire emitting a low-level coded RF signal, a laser light emitter, or the like, can be used to assist vehicle 30 in guiding itself along a predetermined path. Still further, other markers, such as an edge of a maintenance pit, one or a pair of side-by-side tracks, an overhead sentry, and/or the like, can be utilized by processing component 36 to determine a path of travel.
Once vehicle 30 is located in a roughly desirable location, e.g., within a virtual cell 204A-204C, vehicle 30 must be positioned sufficiently close to the location of the required task, such as decoupling rail vehicles. Initially, processing component 36 can locate the area for the work location, e.g., the location of a coupling mechanism to be decoupled. To this extent, processing component 36 can determine the correct rail vehicles to be decoupled in a stream of connected rail vehicles using any solution. For example, processing component 36 can identify a particular rail vehicle in a consist to determine whether the rail vehicle is one to be decoupled or to determine a relative location of the rail vehicle with respect to the decoupling location.
Further, processing component 36 can detect the end of rail vehicles to determine a location of the coupling mechanism. In an embodiment, processing component 36 can process data received from the set of collision avoidance sensors 48 to determine the start/end of each rail vehicle. Additionally, processing component 36 can receive data from an external vehicle sensing system, which is implemented within rail yard 10A. In this case, the external vehicle sensing system can provide an output signal indicating the end of a rail vehicle and the beginning of the next rail vehicle, which processing component 36 can use to determine the area between the rail vehicles. Processing component 36 also can process image data acquired by imaging device 46 to identify the work location, e.g., by identifying a loop figure, which indicates connected brake hoses for a coupling mechanism.
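Detecting the start/end of each rail vehicle from side-looking range data reduces to finding sustained jumps in the distance readings. The following sketch is a minimal illustration; the threshold, minimum span, and names are assumptions rather than disclosed parameters:

```python
def find_gaps(ranges, gap_threshold, min_len=3):
    """Scan a stream of range readings and report index spans where the
    distance exceeds gap_threshold for at least min_len samples, i.e.
    the open space between the end of one rail vehicle and the next."""
    gaps, start = [], None
    for i, r in enumerate(ranges):
        if r > gap_threshold:
            if start is None:
                start = i          # possible gap begins
        elif start is not None:
            if i - start >= min_len:
                gaps.append((start, i - 1))
            start = None           # too short: sensor noise, not a gap
    if start is not None and len(ranges) - start >= min_len:
        gaps.append((start, len(ranges) - 1))
    return gaps
```

Each reported span corresponds to the area between rail vehicles, in which the coupling mechanism can then be sought in the image data.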
As discussed herein, vehicle 30 can be provided with a set of actions, which can subsequently be carried out autonomously by vehicle 30. In an embodiment, an off-the-shelf robot programming environment, such as one provided by Mobilerobots, Inc., Robotics Developer Studio from Microsoft, Inc., and/or the like, can be utilized to program the actions for processing component 36. For example, an illustrative set of instructions provided by a user (e.g., via voice input, menu selection, or the like) for processing component 36 to implement using the various components of vehicle 30 can comprise: "go to track 5 left side"; "wait until end of car detected"; "wait for decoupling order"; "confirm vehicle identity"; "look for pin puller lever"; "grab lever"; "release car"; "confirm release"; "repeat until end-of-train".
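A set of instructions like the one above can be mapped to vehicle behaviors with a simple prefix dispatcher. The handlers below are placeholders for illustration only, not the API of any actual robot programming environment:

```python
def run_script(commands, handlers):
    """Dispatch each plain-language command to the first handler whose
    registered prefix matches; unknown commands are collected so an
    operator can review them instead of the vehicle guessing."""
    log, unknown = [], []
    for cmd in commands:
        for prefix, fn in handlers.items():
            if cmd.startswith(prefix):
                log.append(fn(cmd))
                break
        else:
            unknown.append(cmd)
    return log, unknown
```

For example, registering hypothetical handlers for "go to" and "grab" routes those commands while flagging anything unrecognized.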
While vehicle 30 is generally described herein as being configured for autonomous operation in response to receiving a set of tasks from another system, it is understood that vehicle 30 can operate semi-autonomously, during which a remote user can control vehicle 30 to perform one or more tasks and/or assist vehicle 30 in performing one or more assigned tasks. For example, processing component 36 can transmit image data captured by imaging devices 46, 114A, and/or 114B for presentation to the remote user. Additionally, processing component 36 can transmit data acquired by other sensors, such as microphone array 47, for presentation to the remote user. Data from imaging device 46 and microphone array 47 can provide situational awareness for the remote user, and processing component 36 and/or a remote system can analyze and supplement the data with analysis information (e.g., identified objects within the field of view, directional information of a source of a sound, and/or the like). Data from imaging devices 114A, 114B can be utilized by the remote user to perform the task(s) and/or assist processing component 36 with performing the task(s).
In an embodiment, processing component 36 compresses the image data acquired from the imaging devices using, for example, the JPEG 2000 compression algorithm enhanced for low latency operation, and processing component 36 can transmit the image data for presentation on panel 210 using a high speed 802.11 wireless network. In this manner, the remote user will be presented with a near real time view of the various locations to enable precise control of one or more components of vehicle 30. It is understood that the update periods and image quality for display areas 212A-212D, and therefore the corresponding algorithms for generating and transmitting the image data, can differ from those of display area 214, the latter of which can be most important for performing actions using vehicle 30. For display area 214, a latency of approximately ten milliseconds or less is generally adequate for slow speed remote operation.
Additionally, panel 210 can include one or more input devices for enabling the remote user to remotely control the operation of one or more components of vehicle 30. For example, panel 210 is shown including a pair of joysticks 216A, 216B, which can enable the remote user to operate any combination of the various components on vehicle 30 using any solution. It is understood that any type of input device can be utilized, including a touch screen or the like, which a remote user can utilize to point out an object within a field of view of vehicle 30, the bounding area 134, and/or the like. Using coordinate geometry, processing component 36 can determine the object's coordinates with respect to vehicle 30 by employing a range sensor, laser range finder, and/or the like, and combining the range information with the imaging device's field of view using camera calibration solutions. Further, panel 210 can include alternative and/or additional input/output devices, such as a speaker, which can present audible data acquired by vehicle 30, a microphone, and/or the like. The remote user can designate one or more operations, e.g., using speech recognition software or the like, which can be interpreted into commands transmitted for processing by processing component 36. While a separate panel 210 is illustrated, it is understood that the various display areas can be presented on a standard monitor by a general purpose computing device executing computer program code configured to manage data for the various display areas. In this case, the input devices can comprise input devices for the general purpose computing device, and computer program code executing on the computing device can be configured to interpret the user's interactions into a set of actions to be performed by component(s) located on the vehicle 30.
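The coordinate-geometry step described above, combining a touched pixel with a range reading, can be sketched with a simple pinhole camera model. The field-of-view values, frame conventions, and function name are illustrative assumptions, not parameters of the disclosed system:

```python
import math

def pixel_to_vehicle_coords(px, py, width, height, fov_h, fov_v, range_m):
    """Convert a touched pixel plus a measured range into a 3-D point in
    the camera frame (illustrative pinhole model): derive azimuth and
    elevation angles from the pixel offset, then scale by the range."""
    az = (px - width / 2) / (width / 2) * (fov_h / 2)   # right of center
    el = (height / 2 - py) / (height / 2) * (fov_v / 2) # above center
    x = range_m * math.cos(el) * math.sin(az)   # lateral offset
    y = range_m * math.sin(el)                  # vertical offset
    z = range_m * math.cos(el) * math.cos(az)   # depth along camera axis
    return x, y, z
```

In a real system, camera calibration (intrinsics and lens distortion) would replace this idealized linear angle mapping.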
The foregoing description of various aspects of the invention has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed, and obviously, many modifications and variations are possible. Such modifications and variations that may be apparent to an individual skilled in the art are included within the scope of the invention as defined by the accompanying claims.
Claims (20)
1. A vehicle comprising:
a transport component including:
a motor;
a set of wheels; and
a clutch configured to selectively engage the set of wheels with the motor;
a set of action components, the set of action components including at least one action component configured to be temporarily secured with respect to a target object external to the vehicle;
means for moving the at least one action component to temporarily secure the at least one action component with respect to the target object, the means for moving comprising a multi-link arm including a plurality of links, wherein the set of action components are located on the arm, the vehicle further comprising means for disengaging at least one link of the multi-link arm in response to the at least one action component being temporarily secured with respect to the target object; and
means for operating the clutch to disengage the motor from the set of wheels in response to the at least one action component being temporarily secured with respect to the target object.
2. The vehicle of claim 1 , wherein the target object is a moving object.
3. The vehicle of claim 1 , wherein the target object comprises a component of a rail vehicle.
4. The vehicle of claim 1 , wherein the set of action components includes a manipulator configured to decouple a pair of rail vehicles.
5. The vehicle of claim 1 , further comprising:
a first imaging device configured to acquire location image data for the vehicle;
a set of visualization components configured to capture work region image data of a region including the target object; and
a processing component configured to process the location image data and operate the transport component to move the vehicle using the location image data and process the work region image data and create a three-dimensional representation of objects in the work region based on the work region image data.
6. The vehicle of claim 5 , wherein the means for moving the at least one action component includes the processing component, and wherein the processing component is configured to operate the transport component and move the at least one action component autonomously.
7. The vehicle of claim 1 , wherein the motor is configured to drive at least one wheel that rotates a set of tracks.
8. A railway maintenance device comprising:
a multi-link arm, each link capable of independent movement in at least one direction;
a set of visualization components configured to capture work region image data for a region including at least one target object;
an action component located on a link of the multi-link arm; and
a processing component configured to: move at least one of the links to place the action component near a target object in the region using the work region image data; operate at least one of: the arm or the action component, to perform a railway maintenance operation; and disengage at least one link of the multi-link arm in response to the action component being temporarily secured with respect to the target object;
a location imaging device configured to acquire location image data having a second field of view distinct from the work region image data; and
a transport component configured to enable the device to move independently, wherein the processing component is further configured to process the location image data and operate the transport component to move the vehicle using the location image data.
9. The device of claim 8 ,
wherein the processing component disengages the at least one link to allow the at least one link to move freely while the railway maintenance device is performing the railway maintenance operation.
10. The device of claim 8 , further comprising a positioning component configured to receive a set of location signals, wherein the processing component is further configured to process the set of location signals to determine a location of the device and use the location of the device to move the device.
11. The device of claim 10 , wherein the positioning component is further configured to acquire dead-reckoning data, and wherein the processing component further uses the dead-reckoning data to determine the location of the device.
12. The device of claim 8 , wherein the processing component is configured to identify at least one feature in the location image data included in a digital map of a rail yard and adjust movement of the device based on a location of the at least one feature with respect to the device.
13. The device of claim 8 , wherein the railway maintenance operation comprises at least one of: decoupling a pair of rail vehicles, detaching a pair of brake hoses, attaching a pair of brake hoses, or acquiring inspection data for a rail vehicle.
14. A rail yard comprising:
a railway maintenance device configured to perform each of a plurality of railway maintenance operations including: decoupling a pair of rail vehicles; detaching a pair of brake hoses; attaching a pair of brake hoses; and acquiring inspection data for a rail vehicle, the device comprising:
a multi-link arm, each link capable of independent movement in at least one direction;
a set of visualization components configured to capture work region image data for a region including at least one target object;
an action component located on a link of the multi-link arm; and
a processing component configured to: move at least one of the links to place the action component near a target object in the region using the work region image data; operate at least one of: the arm or the action component, to perform at least one of the plurality of railway maintenance operations; and disengage at least one link of the multi-link arm in response to the action component being temporarily secured with respect to the target object.
15. The rail yard of claim 14 , wherein the railway maintenance device is located between a pair of rail tracks for a rail line in the rail yard.
16. The rail yard of claim 14 , the railway maintenance device further comprising:
a location imaging device configured to acquire location image data having a second field of view distinct from the work region image data; and
a transport component configured to enable the device to move independently, wherein the processing component is further configured to: process the location image data; operate the transport component to move the device using the location image data autonomously; and disengage a motor in the transport component from a set of wheels in the transport component in response to the action component being temporarily secured with respect to the target object.
17. The rail yard of claim 16 , further comprising means for assisting the device in moving within the rail yard.
18. The rail yard of claim 17 , wherein the means for assisting comprises a plurality of wireless geographic location transmitters, each location transmitter configured to transmit location information, and wherein the processing component is further configured to move the device using the location information.
19. The rail yard of claim 17 , wherein the means for assisting comprises a set of rail tracks, wherein the device comprises a rail-based vehicle.
20. The rail yard of claim 17 , wherein the means for assisting comprises a set of markers, and wherein the processing component is further configured to acquire data on a location of a marker in the set of markers and move the device using the location of the marker.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/074,945 US9383752B2 (en) | 2008-09-19 | 2013-11-08 | Railway maintenance device |
US15/201,336 US10471976B2 (en) | 2008-09-19 | 2016-07-01 | Railway maintenance device |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13662408P | 2008-09-19 | 2008-09-19 | |
US12/563,577 US8583313B2 (en) | 2008-09-19 | 2009-09-21 | Robotic vehicle for performing rail-related actions |
US14/074,945 US9383752B2 (en) | 2008-09-19 | 2013-11-08 | Railway maintenance device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/563,577 Continuation US8583313B2 (en) | 2008-09-19 | 2009-09-21 | Robotic vehicle for performing rail-related actions |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/201,336 Continuation US10471976B2 (en) | 2008-09-19 | 2016-07-01 | Railway maintenance device |
Publications (2)
Publication Number | Publication Date |
---|---|
US20140067188A1 US20140067188A1 (en) | 2014-03-06 |
US9383752B2 true US9383752B2 (en) | 2016-07-05 |
Family
ID=42038486
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/563,577 Active 2032-07-09 US8583313B2 (en) | 2008-09-19 | 2009-09-21 | Robotic vehicle for performing rail-related actions |
US14/074,945 Active 2030-06-22 US9383752B2 (en) | 2008-09-19 | 2013-11-08 | Railway maintenance device |
US15/201,336 Active 2030-06-01 US10471976B2 (en) | 2008-09-19 | 2016-07-01 | Railway maintenance device |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/563,577 Active 2032-07-09 US8583313B2 (en) | 2008-09-19 | 2009-09-21 | Robotic vehicle for performing rail-related actions |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/201,336 Active 2030-06-01 US10471976B2 (en) | 2008-09-19 | 2016-07-01 | Railway maintenance device |
Country Status (1)
Country | Link |
---|---|
US (3) | US8583313B2 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10338597B2 (en) * | 2014-12-26 | 2019-07-02 | Kawasaki Jukogyo Kabushiki Kaisha | Self-traveling articulated robot |
US10471976B2 (en) | 2008-09-19 | 2019-11-12 | International Electronic Machines Corp. | Railway maintenance device |
US20200231082A1 (en) * | 2019-01-21 | 2020-07-23 | Kevin Arnold Morran | Remote controlled lighting apparatus |
US10752268B2 (en) * | 2016-04-19 | 2020-08-25 | Voith Patent Gmbh | Device for data and/or signal transmission |
US11958183B2 (en) | 2019-09-19 | 2024-04-16 | The Research Foundation For The State University Of New York | Negotiation-based human-robot collaboration via augmented reality |
Families Citing this family (100)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10105844B2 (en) * | 2016-06-16 | 2018-10-23 | General Electric Company | System and method for controlling robotic machine assemblies to perform tasks on vehicles |
US10618168B2 (en) | 2016-05-13 | 2020-04-14 | General Electric Company | Robot system path planning for asset health management |
US10493629B2 (en) * | 2016-05-27 | 2019-12-03 | Ge Global Sourcing Llc | Multisensory data fusion system and method for autonomous robotic operation |
US10065317B2 (en) * | 2016-06-30 | 2018-09-04 | General Electric Company | Control system for coordinating robotic machines to collaborate on tasks |
US10300601B2 (en) * | 2014-11-14 | 2019-05-28 | Ge Global Sourcing Llc | Vehicle control system with task manager |
US8478480B2 (en) | 2006-10-27 | 2013-07-02 | International Electronic Machines Corp. | Vehicle evaluation using infrared data |
WO2010048453A2 (en) * | 2008-10-22 | 2010-04-29 | International Electronic Machines Corp. | Thermal imaging-based vehicle analysis |
AU2010242544B2 (en) * | 2009-05-01 | 2015-07-16 | Technological Resources Pty. Limited | Integrated automation system with picture compilation system |
PE20121018A1 (en) | 2009-05-01 | 2012-08-18 | Univ Sydney | INTEGRATED AUTOMATION SYSTEM |
US8177115B1 (en) | 2010-03-12 | 2012-05-15 | Craig Mercier | Method and system for retreading track wheel |
US8662375B2 (en) | 2010-03-12 | 2014-03-04 | Craig Mercier | Method and system for retreading track wheel |
US20120192756A1 (en) * | 2011-01-31 | 2012-08-02 | Harsco Corporation | Rail vision system |
US9258975B2 (en) * | 2011-04-28 | 2016-02-16 | Technologies Holdings Corp. | Milking box with robotic attacher and vision system |
US8447863B1 (en) | 2011-05-06 | 2013-05-21 | Google Inc. | Systems and methods for object recognition |
US8794386B2 (en) * | 2011-07-01 | 2014-08-05 | Cardinal Gibbons High School | Folding forklift |
US9250073B2 (en) * | 2011-09-02 | 2016-02-02 | Trimble Navigation Limited | Method and system for position rail trolley using RFID devices |
EP2798574A4 (en) * | 2011-12-29 | 2016-03-02 | Intel Corp | Systems, methods, and apparatus for obtaining information from an object attached to a vehicle |
US9036025B2 (en) | 2012-01-11 | 2015-05-19 | International Business Macines Corporation | System and method for inexpensive railroad track imaging for inspection |
US8818031B1 (en) * | 2012-03-02 | 2014-08-26 | Google Inc. | Utility pole geotagger |
WO2013177393A1 (en) * | 2012-05-24 | 2013-11-28 | International Electronic Machines Corporation | Wayside measurement of railcar wheel to rail geometry |
BR112014031922B1 (en) | 2012-06-18 | 2022-03-15 | Technological Resources Pty. Limited | Systems and methods for processing geophysical data |
DE102012211151B4 (en) * | 2012-06-28 | 2021-01-28 | Siemens Aktiengesellschaft | Charging arrangement and method for inductive charging of an electrical energy store |
US9036865B2 (en) | 2012-09-12 | 2015-05-19 | International Business Machines Corporation | Location determination for an object using visual data |
US20140142868A1 (en) * | 2012-11-18 | 2014-05-22 | Andian Technologies Ltd. | Apparatus and method for inspecting track in railroad |
WO2014089316A1 (en) * | 2012-12-06 | 2014-06-12 | International Electronic Machines Corporation | Human augmentation of robotic work |
US8781504B1 (en) * | 2012-12-24 | 2014-07-15 | Yi-Phone Inc. | System for monitoring in real-time movement or location and method thereof |
US9036892B2 (en) * | 2012-12-31 | 2015-05-19 | General Electric Company | Systems and methods for data entry in a non-destructive testing system |
US8914162B2 (en) * | 2013-03-12 | 2014-12-16 | Wabtec Holding Corp. | System, method, and apparatus to detect and report track structure defects |
US20140267793A1 (en) * | 2013-03-15 | 2014-09-18 | Delphi Display Systems, Inc. | System and method for vehicle recognition in a dynamic setting |
WO2014204928A1 (en) * | 2013-06-17 | 2014-12-24 | International Electronic Machines Corporation | Pre-screening for robotic work |
US9107513B2 (en) * | 2013-07-16 | 2015-08-18 | Amirmasood Asfa | Baby walker system with a braking mechanism for movement control |
US8989985B2 (en) | 2013-08-14 | 2015-03-24 | Thales Canada Inc. | Vehicle-based positioning system and method of using the same |
JOP20200120A1 (en) * | 2013-10-21 | 2017-06-16 | Esco Group Llc | Wear assembly removal and installation |
CA2928645C (en) * | 2013-10-25 | 2021-10-26 | Aleksandar VAKANSKI | Image-based robot trajectory planning approach |
DE102013019368A1 (en) * | 2013-11-18 | 2015-05-21 | Grenzebach Maschinenbau Gmbh | Method and device for the largely automated assembly of goods deliveries in warehouses |
AU2014262221C1 (en) | 2013-11-25 | 2021-06-10 | Esco Group Llc | Wear part monitoring |
US8825226B1 (en) | 2013-12-17 | 2014-09-02 | Amazon Technologies, Inc. | Deployment of mobile automated vehicles |
US20150369593A1 (en) * | 2014-06-19 | 2015-12-24 | Kari MYLLYKOSKI | Orthographic image capture system |
JP6131385B2 (en) * | 2014-07-30 | 2017-05-17 | ヤンマー株式会社 | Remote control device |
US9415513B2 (en) | 2014-08-29 | 2016-08-16 | General Electric Company | Systems and methods for railyard robotics |
US9625912B2 (en) * | 2014-09-03 | 2017-04-18 | Sharp Laboratories Of America, Inc. | Methods and systems for mobile-agent navigation |
US9157757B1 (en) * | 2014-09-03 | 2015-10-13 | Sharp Laboratories Of America, Inc. | Methods and systems for mobile-agent navigation |
US9969337B2 (en) * | 2014-09-03 | 2018-05-15 | Sharp Laboratories Of America, Inc. | Methods and systems for mobile-agent navigation |
US9536311B2 (en) | 2014-09-29 | 2017-01-03 | General Electric Company | System and method for component detection |
EP3256650B1 (en) | 2015-02-13 | 2023-06-28 | ESCO Group LLC | Monitoring ground-engaging products for earth working equipment |
US10175040B2 (en) | 2015-03-20 | 2019-01-08 | Process Metrix | Characterization of refractory lining of metallurgical vessels using autonomous scanners |
DE102015004087B3 (en) * | 2015-03-31 | 2016-12-29 | gomtec GmbH | Mobile robot with collision detection |
US11020859B2 (en) * | 2015-05-01 | 2021-06-01 | Transportation Ip Holdings, Llc | Integrated robotic system and method for autonomous vehicle maintenance |
US10272573B2 (en) * | 2015-12-18 | 2019-04-30 | Ge Global Sourcing Llc | Control system and method for applying force to grasp a brake lever |
US20170341235A1 (en) | 2016-05-27 | 2017-11-30 | General Electric Company | Control System And Method For Robotic Motion Planning And Control |
US10286930B2 (en) | 2015-06-16 | 2019-05-14 | The Johns Hopkins University | Instrumented rail system |
JP6610665B2 (en) * | 2015-06-23 | 2019-11-27 | 日本電気株式会社 | Detection system, detection method, and program |
CN105487507A (en) * | 2015-11-26 | 2016-04-13 | 深圳市施罗德工业测控设备有限公司 | Intelligent pipe network system based on railway robot |
CN105397795B (en) * | 2015-12-10 | 2018-01-16 | 深圳市施罗德工业测控设备有限公司 | A rail-mounted inspection robot |
US10029372B2 (en) | 2015-12-11 | 2018-07-24 | General Electric Company | Control system and method for brake bleeding |
US9799198B2 (en) * | 2015-12-18 | 2017-10-24 | General Electric Company | System and method for communicating with an operator of the system |
US10471595B2 (en) * | 2016-05-31 | 2019-11-12 | Ge Global Sourcing Llc | Systems and methods for control of robotic manipulation |
US9996083B2 (en) | 2016-04-28 | 2018-06-12 | Sharp Laboratories Of America, Inc. | System and method for navigation assistance |
US10152891B2 (en) | 2016-05-02 | 2018-12-11 | Cnh Industrial America Llc | System for avoiding collisions between autonomous vehicles conducting agricultural operations |
EA202191816A1 (en) | 2016-06-13 | 2022-03-31 | ЭСКО ГРУП ЛЛСи | MANIPULATION SYSTEM FOR WEAR GROUND ENGAGING ELEMENTS ATTACHED TO EARTH-MOVING EQUIPMENT |
WO2018018075A1 (en) * | 2016-07-25 | 2018-02-01 | Hegel Industrial Solutions Pty Ltd | Vessel inspection system |
CN106184275A (en) * | 2016-08-18 | 2016-12-07 | 华南理工大学 | An improved SCARA-type coupler removal robot |
US10191014B2 (en) * | 2016-08-23 | 2019-01-29 | The Boeing Company | System and method for nondestructive evaluation of a test object |
CN106522628B (en) * | 2016-12-29 | 2022-03-04 | 同方威视技术股份有限公司 | Automatic vehicle inspection system and method and intelligent garage |
US10796192B2 (en) | 2017-03-23 | 2020-10-06 | Harsco Technologies LLC | Track feature detection using machine vision |
US10286564B2 (en) | 2017-05-01 | 2019-05-14 | Lincoln Global, Inc. | System for locally generating electricity on a robotic device |
US20180348792A1 (en) * | 2017-06-06 | 2018-12-06 | Walmart Apollo, Llc | Systems and methods for coupling autonomous ground vehicles delivering merchandise |
CN111164531A (en) * | 2017-08-07 | 2020-05-15 | 欧姆尼消费品有限责任公司 | System, method and apparatus for surveillance drone |
WO2019032736A1 (en) | 2017-08-08 | 2019-02-14 | Smart Picture Technologies, Inc. | Method for measuring and modeling spaces using markerless augmented reality |
US10640159B2 (en) * | 2017-11-13 | 2020-05-05 | Eric Bliss | Stair-climbing remote control utility wagon |
EP3714231B1 (en) * | 2017-11-24 | 2024-07-03 | ABB Schweiz AG | System and method for characterizing a coating such as a paint film by radiation, and painting facility with such a system |
US11383679B2 (en) * | 2017-12-01 | 2022-07-12 | Volvo Truck Corporation | Method for maintenance of a vehicle |
US11465659B2 (en) * | 2018-02-19 | 2022-10-11 | Claudio Filippone | Autonomous scouting rail vehicle |
BR112020017031A2 (en) * | 2018-02-21 | 2021-02-23 | Outrider Technologies, Inc. | systems for operating an autonomous vehicle terminal tractor, to control the loading of an electric truck in a facility, to maneuver a trailer with respect to a truck, to control access by a user to an autonomous truck, to allow movement of a trailer, to identify and orient with respect to container wells on railway cars, to transport a long distance trailer, to robotically open swinging rear doors of a trailer, to operate a truck, to retain swinging open doors on a trailer, to assist in reverse operations on a trailer hitched to an autonomous truck, to automatically apply a stationary lift bracket to a trailer, for automatic support of a trailer, method for operating an autonomous vehicle terminal tractor, and a rolling bridge (overhead crane) system |
US10916026B2 (en) | 2018-05-16 | 2021-02-09 | Toyota Research Institute, Inc. | Systems and methods of determining stereo depth of an object using object class information |
US10778943B2 (en) | 2018-07-17 | 2020-09-15 | C-Tonomy, LLC | Autonomous surveillance duo |
US10919412B2 (en) * | 2018-09-24 | 2021-02-16 | Transportation Ip Holdings, Llc | Method and systems for an auxiliary power unit |
CN111103863B (en) * | 2018-10-29 | 2022-08-30 | 株洲中车时代电气股份有限公司 | Intelligent maintenance robot, system and method for rail transit vehicle |
EA202191637A1 (en) | 2018-12-10 | 2021-09-21 | Эско Груп Ллк | SYSTEM AND METHOD FOR WORKING IN THE FIELD CONDITIONS |
WO2020191398A1 (en) * | 2019-03-21 | 2020-09-24 | Rethink Technologies, Llc | Inspecting railroad track and key track components using a visual information system |
US11138757B2 (en) | 2019-05-10 | 2021-10-05 | Smart Picture Technologies, Inc. | Methods and systems for measuring and modeling spaces using markerless photo-based augmented reality process |
US11279384B2 (en) | 2019-05-10 | 2022-03-22 | Reliabotics, LLC | Robotic system for installing equipment on vertical surfaces of railway tunnels |
CN112068540A (en) * | 2019-05-24 | 2020-12-11 | 北京海益同展信息科技有限公司 | Track inspection robot |
RU2732676C1 (en) * | 2020-03-25 | 2020-09-21 | Федеральное государственное автономное образовательное учреждение высшего образования "Российский университет транспорта" (ФГАОУ ВО РУТ (МИИТ), РУТ (МИИТ) | Railway car disengagement device |
CN111390925A (en) * | 2020-04-07 | 2020-07-10 | 青岛黄海学院 | An inspection robot for a hazardous materials warehouse |
CN111702809B (en) * | 2020-06-27 | 2021-10-08 | 上海工程技术大学 | Robot track self-checking device and method thereof |
US11423639B2 (en) * | 2020-07-31 | 2022-08-23 | Ford Global Technologies, Llc | Hidden camera detection systems and methods |
EP4244098A4 (en) * | 2020-11-11 | 2024-05-08 | Abb Schweiz Ag | Apparatus and method for handling twistlocks |
CN113146628B (en) * | 2021-04-13 | 2023-03-31 | 中国铁道科学研究院集团有限公司通信信号研究所 | Brake hose picking robot system suitable for marshalling station |
CN113942539B (en) * | 2021-05-07 | 2022-12-16 | 北京汇力智能科技有限公司 | Unhooking and re-hooking robot and unhooking operation method thereof |
CN113448333B (en) * | 2021-06-25 | 2024-02-06 | 北京铁道工程机电技术研究所股份有限公司 | Bottom inspection positioning method and device based on sensor combination and electronic equipment |
RU208417U1 (en) * | 2021-07-21 | 2021-12-17 | Руслан Рамзанович Садуев | Industrial robot for automatic uncoupling of moving freight wagons |
CN113894765A (en) * | 2021-09-27 | 2022-01-07 | 中国科学院自动化研究所 | Automatic unhooking robot and system for train |
TWI800102B (en) * | 2021-11-16 | 2023-04-21 | 財團法人工業技術研究院 | Method and system for vehicle head compensation |
WO2023150786A1 (en) * | 2022-02-07 | 2023-08-10 | Trackmobile Llc | Mechanical coupling in automated gladhand system |
DE102022000701A1 (en) | 2022-02-25 | 2023-08-31 | Visevi Robotics GmbH | Autonomous manipulation system for maintenance and inspection work on track systems |
US11628869B1 (en) | 2022-03-04 | 2023-04-18 | Bnsf Railway Company | Automated tie marking |
US11565730B1 (en) | 2022-03-04 | 2023-01-31 | Bnsf Railway Company | Automated tie marking |
CN114802335B (en) * | 2022-03-25 | 2023-08-22 | 北京汇力智能科技有限公司 | Robot for pulling the pin and disconnecting the air hose of a train coupler |
CN118238114B (en) * | 2024-05-20 | 2024-08-30 | 安徽泰新物联科技有限公司 | Manipulator device for handcart switch maintenance |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8655540B2 (en) | 2007-08-20 | 2014-02-18 | International Electronic Machines Corp. | Rail vehicle identification and processing |
- 2009-09-21: US application 12/563,577 granted as US8583313B2 (Active)
- 2013-11-08: US application 14/074,945 granted as US9383752B2 (Active)
- 2016-07-01: US application 15/201,336 granted as US10471976B2 (Active)
Patent Citations (71)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US1028831A (en) | 1911-03-31 | 1912-06-04 | John W Stagg | Portable car-unloading chute. |
US1824108A (en) | 1930-11-01 | 1931-09-22 | Gen Railway Signal Co | Track instrument |
US3211907A (en) | 1958-12-26 | 1965-10-12 | Gen Signal Corp | Car routing system for railroads |
US3132749A (en) | 1961-02-14 | 1964-05-12 | English Steel Corp Ltd | Devices for automatically operating the locking members of automatic couplers for rail vehicles |
US3247509A (en) | 1963-10-04 | 1966-04-19 | American Brake Shoe Co | Microwave identification of railroad cars |
US3558876A (en) | 1968-10-16 | 1971-01-26 | Servo Corp Of America | Train wheel defect detector |
US3682325A (en) | 1970-07-02 | 1972-08-08 | Kennecott Copper Corp | Apparatus for uncoupling railroad cars |
US3721821A (en) | 1970-12-14 | 1973-03-20 | Abex Corp | Railway wheel sensor |
US3750897A (en) | 1970-12-26 | 1973-08-07 | Japan National Railway | Automatic releasing apparatus for couplers of railway vehicles |
US3736420A (en) | 1971-09-13 | 1973-05-29 | Westinghouse Air Brake Co | Switch control arrangement for railroad classification yards |
US3854598A (en) | 1973-01-30 | 1974-12-17 | Japan National Railway | Automatic unlocking device for rolling stock couplers |
US4288689A (en) | 1979-10-12 | 1981-09-08 | Lemelson Jerome H | Automatic vehicle identification system and method |
US4532511A (en) | 1979-10-12 | 1985-07-30 | Lemelson Jerome H | Automatic vehicle identification system and method |
US4610206A (en) | 1984-04-09 | 1986-09-09 | General Signal Corporation | Micro controlled classification yard |
US4779203A (en) | 1985-08-30 | 1988-10-18 | Texas Instruments Incorporated | Visual navigation system for mobile robots |
US4973215A (en) * | 1986-02-18 | 1990-11-27 | Robotics Research Corporation | Industrial robot with servo |
US4947353A (en) | 1988-09-12 | 1990-08-07 | Automatic Toll Systems, Inc. | Automatic vehicle detecting system |
US4904939A (en) | 1988-09-16 | 1990-02-27 | International Electronic Machines Corp. | Portable electronic wheel wear gauge |
US5062673A (en) | 1988-12-28 | 1991-11-05 | Kabushiki Kaisha Toyota Chuo Kenkyusho | Articulated hand |
US5181472A (en) | 1990-07-13 | 1993-01-26 | Les Fils D'auguste Scheuchzer S.A. | Device for the substitution of the rails of railway tracks |
US5139161A (en) | 1991-04-25 | 1992-08-18 | National Castings, Inc. | Automatic actuator for coupler knuckle-assembly of a railway passenger car |
US5737500A (en) | 1992-03-11 | 1998-04-07 | California Institute Of Technology | Mobile dexterous siren degree of freedom robot arm with real-time control system |
US5550953A (en) | 1994-04-20 | 1996-08-27 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | On-line method and apparatus for coordinated mobility and manipulation of mobile robots |
US5433111A (en) | 1994-05-05 | 1995-07-18 | General Electric Company | Apparatus and method for detecting defective conditions in railway vehicle wheels and railtracks |
US5828979A (en) | 1994-09-01 | 1998-10-27 | Harris Corporation | Automatic train control system and method |
US5636026A (en) | 1995-03-16 | 1997-06-03 | International Electronic Machines Corporation | Method and system for contactless measurement of railroad wheel characteristics |
US5531337A (en) * | 1995-05-30 | 1996-07-02 | Inco Limited | Automated decoupler for rail cars |
US5678789A (en) | 1995-12-05 | 1997-10-21 | Pipich; Robert B. | Model railroad car position indicator |
US6125311A (en) | 1997-12-31 | 2000-09-26 | Maryland Technology Corporation | Railway operation monitoring and diagnosing systems |
US6416020B1 (en) | 1998-07-10 | 2002-07-09 | Leif Gronskov | Method and apparatus for detecting defective track wheels |
US20050226201A1 (en) | 1999-05-28 | 2005-10-13 | Afx Technology Group International, Inc. | Node-to node messaging transceiver network with dynamec routing and configuring |
US6484074B1 (en) | 1999-06-11 | 2002-11-19 | Alstom | Method of and device for controlling controlled elements of a rail vehicle |
US6681160B2 (en) | 1999-06-15 | 2004-01-20 | Andian Technologies Ltd. | Geometric track and track/vehicle analyzers and methods for controlling railroad systems |
GB2352486A (en) | 1999-07-23 | 2001-01-31 | Notetry Ltd | Robotic vacuum cleaner with overload protection clutch |
US6636814B1 (en) | 1999-11-05 | 2003-10-21 | Bombardier Transportation Gmbh | Light rail vehicle having predictive diagnostic system for motor driven automated doors |
US6523411B1 (en) | 2000-03-21 | 2003-02-25 | International Electronic Machines Corp. | Wheel inspection system |
US6397130B1 (en) | 2000-04-13 | 2002-05-28 | Ensco, Ltd. | Multi-sensor route detector for rail vehicle navigation |
US6655502B2 (en) | 2000-06-14 | 2003-12-02 | Robert Bosch Gmbh | Method for monitoring the thickness of the brake linings of a vehicle braking system |
JP2002019606A (en) | 2000-07-03 | 2002-01-23 | Mitsubishi Heavy Ind Ltd | Railroad maintenance traveling device with work robot |
US20020007289A1 (en) | 2000-07-11 | 2002-01-17 | Malin Mark Elliott | Method and apparatus for processing automobile repair data and statistics |
US20020101361A1 (en) | 2000-11-29 | 2002-08-01 | Barich David J. | Railcar maintenance management system |
US20040194549A1 (en) | 2001-08-10 | 2004-10-07 | Rene Noel | Sound pollution surveillance system and method |
US20030072001A1 (en) | 2001-10-17 | 2003-04-17 | Mian Zahid F. | Contactless wheel measurement system and method |
US6768551B2 (en) | 2001-10-17 | 2004-07-27 | International Electronic Machines Corp. | Contactless wheel measurement system and method |
US6957780B2 (en) | 2002-02-01 | 2005-10-25 | Andy Rosa | Fluid application system for a vehicle |
US7593795B2 (en) | 2002-05-31 | 2009-09-22 | Quantum Engineering, Inc. | Method and system for compensating for wheel wear on a train |
US20040068361A1 (en) | 2002-06-04 | 2004-04-08 | Bombardier Transportation (Technology) Germany Gmbh | Automated manipulation system and method in a transit system |
US6922632B2 (en) | 2002-08-09 | 2005-07-26 | Intersense, Inc. | Tracking, auto-calibration, and map-building system |
US20040049327A1 (en) | 2002-09-10 | 2004-03-11 | Kondratenko Robert Allen | Radio based automatic train control system using universal code |
US20040181321A1 (en) | 2003-02-13 | 2004-09-16 | General Electric Company | Digital train system for automatically detecting trains approaching a crossing |
US7213789B1 (en) | 2003-04-29 | 2007-05-08 | Eugene Matzan | System for detection of defects in railroad car wheels |
US20070061041A1 (en) * | 2003-09-02 | 2007-03-15 | Zweig Stephen E | Mobile robot with wireless location sensing apparatus |
US20050258943A1 (en) | 2004-05-21 | 2005-11-24 | Mian Zahid F | System and method for monitoring an area |
US20050259273A1 (en) | 2004-05-24 | 2005-11-24 | Mian Zahid F | Portable electronic measurement |
US20080304065A1 (en) | 2004-09-11 | 2008-12-11 | General Electric Company | Rail Sensing Apparatus Method |
US7438075B1 (en) | 2004-12-30 | 2008-10-21 | Washworld, Inc. | Spray arch controller for a carwash |
US20060231685A1 (en) | 2005-04-14 | 2006-10-19 | Mace Stephen E | Railroad car coupler gap analyzer |
US7328871B2 (en) | 2005-04-14 | 2008-02-12 | Progressive Rail Technologies, Inc. | Railroad car coupler gap analyzer |
US20070061043A1 (en) | 2005-09-02 | 2007-03-15 | Vladimir Ermakov | Localization and mapping system and method for a robotic device |
US20070064244A1 (en) | 2005-09-16 | 2007-03-22 | Mian Zahid F | Optical wheel evaluation |
US20070075192A1 (en) | 2005-10-05 | 2007-04-05 | Mian Zahid F | Wheel measurement systems and methods |
US20070233333A1 (en) | 2006-03-29 | 2007-10-04 | Moffett Jeffrey P | Rail wheel servicing management |
US7748900B2 (en) | 2006-07-11 | 2010-07-06 | Siemens Aktiengesellschaft | X-ray system with an industrial robot |
US20100263948A1 (en) | 2006-10-06 | 2010-10-21 | Couture Adam P | Robotic vehicle |
US8002365B2 (en) | 2006-11-13 | 2011-08-23 | Raytheon Company | Conformable track assembly for a robotic crawler |
US20080149782A1 (en) | 2006-12-20 | 2008-06-26 | General Electric Company | Wheel detection and classification system for railroad data network |
US7974736B2 (en) | 2007-04-05 | 2011-07-05 | Foster-Miller, Inc. | Robot deployed weapon system and safing method |
US20080297590A1 (en) | 2007-05-31 | 2008-12-04 | Barber Fred | 3-d robotic vision and vision control system |
US20080306705A1 (en) | 2007-06-06 | 2008-12-11 | Huageng Luo | Apparatus and method for identifying a defect and/or operating characteristic of a system |
US20100068024A1 (en) | 2008-09-18 | 2010-03-18 | Agens Michael W | Remotely controlled robots having improved tool deployment systems |
US8583313B2 (en) | 2008-09-19 | 2013-11-12 | International Electronic Machines Corp. | Robotic vehicle for performing rail-related actions |
Non-Patent Citations (21)
Title |
---|
Edwards et al., "Improving the Efficiency and Effectiveness of Railcar Safety Appliance Inspection Using Machine Vision Technology," In Proc. Joint Rail Conference, Apr. 2006, pp. 81-89. |
Ikeda et al., "Asymptotic stable Guidance Control of PWS Mobile Manipulator and Dynamical Influence of Slipping Carrying Object to Stability," In Proc. IEEE/RSJ Intl. Conference on Intelligent Robots and Systems, Oct. 2003, pp. 2917-2202. |
Katz et al., "The UMass Mobile Manipulator UMan: An Experimental Platform for Autonomous Mobile Manipulation," In Proc. RSS Workshop Manipulation for Human Environments, Philadelphia, PA, Aug. 2006. |
Michael F. Whalen, PTO Office Action, U.S. Appl. No. 12/043,357, Notification Date Jun. 21, 2011, 21 pages. |
Michael F. Whalen, USPTO Final Office Action, U.S. Appl. No. 12/043,357, Notification Date Jan. 12, 2012, 25 pages. |
Peter D Nolan, Notice of Allowance and Fee(s) Due, U.S. Appl. No. 12/563,577, Jul. 9, 2013, 13 pages. |
Peter D Nolan, USPTO Office Action, U.S. Appl. No. 12/563,577, Jan. 18, 2013, 25 pages. |
Peter D Nolan, USPTO Office Action, U.S. Appl. No. 12/563,577, Jul. 20, 2012, 39 pages. |
S. Hirose, T. Shirasu, and E. Fukushima, "Proposal for cooperative robot "Gunryu" composed of autonomous segments," Robotics and Autonomous Systems 17, 1996, pp. 107-118. * |
Sasha Varghese, PTO Office Action, U.S. Appl. No. 12/171,438, Notification Date Apr. 27, 2011, 22 pages. |
Shughart et al., "A Comprehensive Decision Support System for Hump Yard Management Using Simulation and Optimization," Innovative Scheduling, Inc., Gainesville, Florida, Aug. 1, 2006, 44 pages. |
Sukumar et al., "Robotic Three-Dimensional Imaging System for Under-Vehicle Inspection", Journal of Electronic Imaging, vol. 15, No. 3, 2006, 11 pages. |
Unknown, "Point Grey Research Inc.," accessed from http://www.ptgrey.com, date unknown, printed on Jan. 30, 2008, 1 page. |
Unknown, "Selkirk Yard," accessed from http://www.trainweb.org/railnuts/yard.html, date unknown, printed on Dec. 14, 2007, 6 pages. |
Unknown, "Sick IVP-A New Dimension in Vision," accessed from http://www.sickivp.se/sickivp/en.html, date unknown, printed on Jan. 30, 2008, 1 page. |
Unknown, "Videre Design," accessed from http://www.videredesign.com, date unknown, printed on Jan. 30, 2008, 1 page. |
Varghese, U.S. Appl. No. 12/171,438, Notice of Allowance & Fees Due, Nov. 10, 2011, 11 pages. |
Whalen, U.S. Appl. No. 12/043,357, Notice of Allowance and Fee(s) Due, Oct. 10, 2013, 15 pages. |
Whalen, U.S. Appl. No. 12/043,357, Office Action Communication, Jan. 18, 2013, 22 pages. |
Whalen, U.S. Appl. No. 12/043,357, Office Action Communication, Jun. 21, 2013, 20 pages. |
Yamamoto et al., "Optical Sensing for Robot Perception and Localization," 2005 IEEE Workshop on Advanced Robotics and its Social Impacts, Nagoya, Japan, Jun. 12-15, 2005. |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10471976B2 (en) | 2008-09-19 | 2019-11-12 | International Electronic Machines Corp. | Railway maintenance device |
US10338597B2 (en) * | 2014-12-26 | 2019-07-02 | Kawasaki Jukogyo Kabushiki Kaisha | Self-traveling articulated robot |
US10752268B2 (en) * | 2016-04-19 | 2020-08-25 | Voith Patent Gmbh | Device for data and/or signal transmission |
US20200231082A1 (en) * | 2019-01-21 | 2020-07-23 | Kevin Arnold Morran | Remote controlled lighting apparatus |
US20210170934A1 (en) * | 2019-01-21 | 2021-06-10 | Kevin Arnold Morran | Mobile Lighting System |
US11958183B2 (en) | 2019-09-19 | 2024-04-16 | The Research Foundation For The State University Of New York | Negotiation-based human-robot collaboration via augmented reality |
Also Published As
Publication number | Publication date |
---|---|
US20140067188A1 (en) | 2014-03-06 |
US8583313B2 (en) | 2013-11-12 |
US20160313739A1 (en) | 2016-10-27 |
US20100076631A1 (en) | 2010-03-25 |
US10471976B2 (en) | 2019-11-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10471976B2 (en) | Railway maintenance device | |
US11865726B2 (en) | Control system with task manager | |
US11892853B2 (en) | Vehicle guidance systems and associated methods of use at logistics yards and other locations | |
CN110221623B (en) | Air-ground collaborative operation system and positioning method thereof | |
US11312018B2 (en) | Control system with task manager | |
US9753461B1 (en) | Autonomous aerial cable inspection system | |
KR101812088B1 (en) | Remote control based Stereo Vision guided vehicle system for the next generation smart factory | |
EP3333043B1 (en) | Rail inspection system and method | |
CN214520204U (en) | Port area intelligent inspection robot based on depth camera and laser radar | |
CN210819569U (en) | Vehicle bottom inspection robot and inspection system based on two-dimensional code positioning | |
WO2021141723A1 (en) | Directing secondary delivery vehicles using primary delivery vehicles | |
KR20100048414A (en) | Method, system, and operation method for providing surveillance to power plant facilities using track-type mobile robot system | |
KR101805423B1 (en) | ICT based Stereo Vision guided vehicle system for the next generation smart factory | |
CN210377164U (en) | Air-ground cooperative operation system | |
US20190263430A1 (en) | System and method for determining vehicle orientation in a vehicle consist | |
KR20180065760A (en) | Autonomous Driving System and Autonomous Driving Vehicle Apparatus Using Unmanned Aerial Vehicle | |
US20220241975A1 (en) | Control system with task manager | |
CN116583382A (en) | System and method for automatic operation and manipulation of autonomous trucks and trailers towed by same | |
KR102433595B1 (en) | Unmanned transportation apparatus based on autonomous driving for smart maintenance of railroad vehicles | |
JP2008252643A (en) | Mobile monitoring system and monitoring method thereof | |
CN116088499A (en) | Unmanned aerial vehicle auxiliary system of live working robot system | |
CN114973747A (en) | Intelligent guiding parking system | |
WO2023233815A1 (en) | State estimation device for articulated vehicle | |
US20200122528A1 (en) | Crawler | |
CN116652973B (en) | Analog traffic director system with V2X function |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
CC | Certificate of correction | ||
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |