US20170341235A1 - Control System And Method For Robotic Motion Planning And Control - Google Patents
- Publication number: US20170341235A1 (application US 15/282,102)
- Authority: United States (US)
- Prior art keywords: robotic, movement, vehicle, robotic system, robotic vehicle
- Prior art date: 2016-05-27
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- B25J 9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors; perception control; multi-sensor controlled systems; sensor fusion
- B25J 9/1697: Vision controlled systems
- B25J 9/162: Mobile manipulator, movable base with manipulator arm mounted on it
- B25J 9/1666: Motion, path, trajectory planning; avoiding collision or forbidden zones
- G05D 1/0061: Safety arrangements for transition from automatic pilot to manual pilot and vice versa
- G05D 1/0246: Control of position or course in two dimensions, specially adapted to land vehicles, using a video camera in combination with image processing means
- G05D 1/0248: As G05D 1/0246, in combination with a laser
- G05D 1/0274: Control of position or course in two dimensions using mapping information stored in a memory device
- G05D 1/243: Determining position or orientation from signals occurring naturally in the environment, e.g. ambient optical, acoustic, gravitational or magnetic signals
- G05D 1/246: Determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
- G05D 1/249: Determining position or orientation using signals from positioning sensors located off-board the vehicle, e.g. cameras
- G05D 1/81: Handing over between on-board automatic and on-board manual control
- G05B 2219/39082: Collision, real-time collision avoidance
- G05B 2219/40282: Vehicle supports manipulator and other controlled devices
- G05B 2219/40424: Online motion planning, in real time, use vision to detect workspace changes
- G05B 2219/45084: Service robot
- Y10S 901/01: Mobile robot
- Y10S 901/47: Optical sensing device
Definitions
- FIG. 1 illustrates one embodiment of a robotic system 100 .
- the robotic system 100 may be used to autonomously move toward, grasp, and actuate (e.g., move) a brake lever or rod on a vehicle in order to change a state of a brake system of the vehicle.
- the robotic system 100 may autonomously move toward, grasp, and move a brake rod of an air brake system on a rail car in order to bleed air out of the brake system.
- the robotic system 100 includes a robotic vehicle 102 having a propulsion system 104 that operates to move the robotic system 100 .
- the propulsion system 104 may include one or more motors, power sources (e.g., batteries, alternators, generators, etc.), or the like, for moving the robotic system 100 .
- a controller 106 of the robotic system 100 includes hardware circuitry that includes and/or is connected with one or more processors (e.g., microprocessors, field programmable gate arrays, and/or integrated circuits) that direct operations of the robotic system 100 .
- the robotic system 100 also includes several sensors 108 , 109 , 110 , 111 , 112 that measure or detect various conditions used by the robotic system 100 to move toward, grasp, and actuate brake levers.
- the sensors 108-111 are optical sensors, such as cameras, infrared projectors, and/or detectors. While four optical sensors 108-111 are shown, the robotic system 100 may alternatively have a single optical sensor, fewer than four optical sensors, or more than four optical sensors.
- in one embodiment, the sensors 109, 111 are RGB cameras and the sensors 108, 110 are structured-light three-dimensional (3-D) cameras, but alternatively another type of camera may be used.
- the sensor 112 is a touch sensor that detects when a manipulator arm 114 of the robotic system 100 contacts or otherwise engages a surface or object.
- the touch sensor 112 may be one or more of a variety of touch-sensitive devices, such as a switch (e.g., that is closed upon touch or contact), a capacitive element (e.g., that is charged or discharged upon touch or contact), or the like.
- alternatively, one or more of the sensors 108-112 may be another type of sensor, such as a radar sensor, LIDAR sensor, etc.
- the manipulator arm 114 is an elongated body of the robotic system 100 that can move in a variety of directions, grasp, and pull and/or push a brake rod.
- the controller 106 may be operably connected with the propulsion system 104 and the manipulator arm 114 to control movement of the robotic system 100 and/or the arm 114 , such as by one or more wired and/or wireless connections.
- the controller 106 may be operably connected with the sensors 108 - 112 to receive data obtained, detected, or measured by the sensors 108 - 112 .
- the robotic system 100 can include a communication device 116 that communicates with an off-board control unit 118 .
- the communication device 116 can represent one or more antennas and associated transceiving circuitry, such as one or more modems, transceivers, receivers, transmitters, etc.
- the control unit 118 can represent hardware circuitry that includes and/or is connected with one or more processors (e.g., microprocessors, field programmable gate arrays, or integrated circuits) that receives user input to remotely control movement and other operation of the robotic system 100 .
- the control unit 118 also represents one or more input devices, such as joysticks, touchscreens, styluses, keyboards, etc., to allow a user to remotely control movement and other operations of the robotic system 100 .
- the control unit 118 also can include one or more antennas and associated transceiving circuitry to allow wireless communication with the communication device 116 of the robotic system 100 .
- the communication device 116 of the robotic system 100 may be connected with the control unit 118 by one or more wired connections to allow for remote control of the robotic system 100 via the wired connection(s).
- FIG. 2 illustrates a state diagram 200 of operation of the controller 106 in directing movement of the robotic system 100 shown in FIG. 1 according to one embodiment.
- the state diagram 200 can represent a flowchart of a method for controlling movement of the robotic system 100 , and may represent or be used to create software that directs operation of the controller 106 .
- the controller 106 may operate using the method represented by the state diagram 200 to move the robotic system 100 between or among different locations (e.g., in vehicle yards or other locations) to perform tasks, such as maintenance, inspection, repair, etc., of the vehicles.
- the controller 106 may operate in different operational modes. One mode can be referred to as an autonomous navigation mode and another mode can be referred to as tele-operation mode. Operations performed or controlled by the controller 106 can be referred to herein as modules.
- the modules can represent different sets of functions performed by the same or different processors of the controller 106 , and/or can represent different hardware components (e.g., processors and associated circuitry) performing the functions associated with the respective modules.
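Before walking through the individual states, the mode split can be pictured in miniature. The Python sketch below is purely illustrative: the `Mode` enum and `select_mode` helper are hypothetical names, not taken from the patent, but they mirror the ready / tele-operation / autonomous branching that FIG. 2 describes.

```python
from enum import Enum, auto

class Mode(Enum):
    READY = auto()       # state 202: stationary, awaiting an input signal
    TELEOP = auto()      # states 203/204/206: remote control, gated by a permissive signal
    AUTONOMOUS = auto()  # states 201/208+: waypoint-based autonomous navigation

def select_mode(input_signal: dict) -> Mode:
    """Map an input signal from the control unit to an operating mode.

    A joystick deflection implies tele-operation; an explicit
    'autonomous' selection implies autonomous navigation.
    """
    if input_signal.get("joystick_moved"):
        return Mode.TELEOP
    if input_signal.get("autonomous_selected"):
        return Mode.AUTONOMOUS
    return Mode.READY
```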
- at 202, the robotic system 100 is in a ready state.
- the ready state may involve the robotic system 100 being stationary and prepared to begin movement.
- the controller 106 may monitor the communication device 116 (or wait for a signal from the communication device 116 ) to indicate whether the robotic system 100 is to begin movement.
- the controller 106 may receive an input signal from the control unit 118 via the communication device 116 and/or from an input device of the robotic system 100 (e.g., one or more buttons, knobs, switches, touchscreens, keyboards, etc.). Responsive to receiving the input signal, the controller 106 may determine whether the input signal indicates that the robotic system 100 is to operate in the autonomous navigation mode (also referred to as "Autonomous NAV" in FIG. 2; e.g., the operations or states shown in connection with 201 in FIG. 2) or the tele-operation (or manual navigation or remote control) mode (also referred to as "Manual NAV" in FIG. 2; e.g., the operations or states shown in connection with 203 in FIG. 2).
- the input signal may indicate that the robotic system 100 is to operate in the tele-operation mode if the input signal indicates movement of the input device of the control unit 118 , such as movement of a joystick or other input.
- the input signal may indicate that the robotic system 100 is to operate in the autonomous navigation mode if the input signal indicates other actuation of the input device of the control unit 118 , such as selection of an input that indicates autonomous operation.
- if the controller 106 determines that the robotic system 100 is to operate in the tele-operation mode 203, then flow of the method or state diagram 200 may proceed toward 204. If the controller 106 determines that the robotic system 100 is to operate in the autonomous navigation mode 201, then flow of the method or state diagram 200 may proceed toward 208.
- at 204, the robotic system 100 determines if a permissive signal to move has been generated or provided.
- the permissive signal may be generated or provided by a deliberation module of the controller 106 .
- the deliberation module receives input from the control unit 118 , such as movement of a joystick or other input that indicates a direction of movement, speed, and/or acceleration of the robotic system 100 .
- the deliberation module of the controller 106 also examines data or other information provided by one or more of the sensors 108 - 112 to determine whether movement, as requested or dictated by the input received from the control unit 118 , is feasible and/or safe.
- the deliberation module of the controller 106 can obtain two dimensional (2D) image data (e.g., 2D images or video) from the sensors 109 and/or 111 , three dimensional (3D) image data (e.g., 3D images or video, point clouds, etc.) from the sensors 108 and/or 110 , and/or detection of engagement or touch of an object from the sensor 112 .
- the deliberation module can examine this data to determine if the movement requested by the control unit 118 can be performed without the robotic system 100 colliding with another object or operating in another unsafe manner.
- the deliberation module can examine the 2D and/or 3D image data to determine if one or more obstacles remain in the movement path requested by the input.
- the image data provided by one or more of the sensors 108 - 111 can be used to determine whether any objects are in the path of the robotic system 100 .
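As a rough illustration of this permissive check, the sketch below projects the requested motion forward over a short horizon and grants permission only if no sensed obstacle point falls within a clearance radius of the predicted path. The helper name, horizon, and clearance values are assumptions for illustration; the patent does not specify an algorithm.

```python
import numpy as np

def permissive_to_move(position, velocity_cmd, obstacle_points,
                       horizon_s=2.0, clearance_m=0.25, dt=0.1):
    """Grant permission only if the requested motion stays clear of obstacles.

    position, velocity_cmd: (2,) planar location (m) and requested velocity (m/s).
    obstacle_points:        (N, 2) sensed obstacle locations in the same frame.
    """
    position = np.asarray(position, dtype=float)
    velocity_cmd = np.asarray(velocity_cmd, dtype=float)
    obstacle_points = np.asarray(obstacle_points, dtype=float).reshape(-1, 2)
    for t in np.arange(0.0, horizon_s, dt):
        predicted = position + velocity_cmd * t  # straight-line motion prediction
        if obstacle_points.size and np.min(
                np.linalg.norm(obstacle_points - predicted, axis=1)) < clearance_m:
            return False  # predicted path intrudes on an obstacle's clearance zone
    return True
```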
- if the controller 106 (e.g., the deliberation module) determines that the robotic system 100 can move according to the input provided by the control unit 118 at 202, then flow of the method or state diagram 200 continues toward 206. Otherwise, the method or state diagram 200 may remain at 204 until permission to move is received from or otherwise provided by the deliberation module of the controller 106.
- at 206, the robotic system 100 moves according to the input provided by or otherwise received from the control unit 118.
- the controller 106 may generate control signals that are communicated to the propulsion system 104 of the robotic system 100 to move the robotic system 100 according to the input.
- upon completion of the movement, the propulsion system 104 may stop moving the robotic system 100 and flow of the method or state diagram 200 may return toward 204.
- if, at 204, it is determined that the robotic system 100 is to operate in the autonomous navigation mode 201, then flow of the method or state diagram 200 may proceed toward 208. For example, if the input received by the controller 106 from the control unit 118 indicates that the robotic system 100 is to autonomously move, then flow may proceed toward 208.
- at 208, a navigation module of the controller 106 informs the deliberation module that autonomous movement of the robotic system 100 has been initiated. This can involve the controller 106 switching from the manual navigation mode to the autonomous navigation mode.
- the robotic system 100 may remain stationary and optionally prohibit movement of the robotic system 100 until confirmation of the change from the manual to autonomous navigation mode has been received. This confirmation may be provided from the deliberation module of the controller 106 .
- responsive to receiving confirmation that the autonomous movement of the robotic system 100 has been initiated, the controller 106 can obtain information such as a current location of the robotic system 100 (e.g., via a global positioning system receiver or data), locations of vehicles in the vehicle yard, numbers of vehicles in a vehicle consist that the robotic system 100 is to move alongside, known or designated locations of objects in or around the robotic system 100, etc.
- a perception module of the controller 106 can examine the sensor data and/or other data to determine how to autonomously move the robotic system 100 .
- the perception module can examine this data to determine how to safely and efficiently move the robotic system 100 without intervention (or at least additional intervention) from a human operator.
- FIG. 3 illustrates one example of sensor data 300 that can be examined by the controller 106 to determine how to autonomously move the robotic system 100 .
- the sensor data 300 is a point cloud that represents locations of different points in 3D space.
- the sensor data 300 may be obtained from a structured light sensor, such as a Microsoft KINECT camera device or other structured light sensor.
- the point cloud indicates where different objects are located relative to the sensor 108 , 110 that provided the data used to create the point cloud.
- the perception module of the controller 106 can examine the point cloud to determine if there are any objects that the robotic system 100 could collide with. Based on the locations of the points in the point cloud, the perception module of the controller 106 can determine how far the object is from the sensor that provided the data used to generate the point cloud.
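A minimal sketch of that distance computation follows, under the assumption that the point cloud arrives as an N×3 NumPy array in the sensor frame (real structured-light drivers will differ):

```python
import numpy as np

def nearest_obstacle_distance(point_cloud, min_range_m=0.05):
    """Distance from the sensor origin to the closest valid point in a cloud.

    point_cloud: (N, 3) array of x, y, z returns in the sensor frame.
    min_range_m: returns closer than this are discarded as sensor noise.
    """
    points = np.asarray(point_cloud, dtype=float).reshape(-1, 3)
    distances = np.linalg.norm(points, axis=1)
    distances = distances[distances > min_range_m]
    return float(distances.min()) if distances.size else float("inf")
```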
- the controller 106 may examine other data, such as 2D or 3D images obtained by the sensors 108 - 111 , detection of touch as determined by the sensor 112 , radar data provided by one or more sensors, or other data, to determine the presence, distance to, relative location, etc., of other object(s) around the robotic system 100 .
- the perception module examines the data to determine whether the robotic system 100 can move without colliding with another object (stationary or moving) and, if the robotic system 100 can move without a collision, where the robotic system 100 can move.
- the controller 106 can examine the data to determine allowable limits on where the robotic system 100 can move. These limits can include restrictions on how far the robotic system 100 can move in one or more directions, how fast the robotic system 100 can move in one or more directions, and/or how quickly the robotic system 100 can accelerate or decelerate in one or more directions.
- the controller 106 may hold off on moving the robotic system 100 until the determination is made as to whether the robotic system 100 can move and what limitations apply to that movement.
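Concretely, such limits can be represented as per-quantity bounds that any candidate command is clamped against. The `MotionLimits` container and the values below are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

import numpy as np

@dataclass
class MotionLimits:
    max_distance_m: float   # farthest the vehicle may travel before re-planning
    max_speed_mps: float    # upper bound on speed
    max_accel_mps2: float   # upper bound on acceleration magnitude

def clamp_command(velocity_cmd, prev_velocity, limits: MotionLimits, dt=0.1):
    """Scale a candidate velocity so speed and acceleration stay within limits."""
    velocity_cmd = np.asarray(velocity_cmd, dtype=float)
    prev_velocity = np.asarray(prev_velocity, dtype=float)
    speed = np.linalg.norm(velocity_cmd)
    if speed > limits.max_speed_mps:
        velocity_cmd = velocity_cmd * (limits.max_speed_mps / speed)
    accel = np.linalg.norm(velocity_cmd - prev_velocity) / dt
    if accel > limits.max_accel_mps2:
        # blend toward the previous velocity so the change stays achievable
        velocity_cmd = prev_velocity + (velocity_cmd - prev_velocity) * (
            limits.max_accel_mps2 / accel)
    return velocity_cmd
```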
- the robotic system 100 autonomously moves.
- the navigation module of the controller 106 determines a waypoint location for the robotic system 100 to move toward.
- the waypoint location may be a geographic location that is between a current location of the robotic system 100 and a final, destination, or goal location that the robotic system 100 is moving toward. For example, if the robotic system 100 is to move five meters to a brake lever of a vehicle in order to grasp and pull the brake lever (e.g., to bleed an air brake of the vehicle), the navigation module may generate control signals to cause the robotic system 100 to move to a waypoint that is fifty centimeters (or another distance) toward the brake lever from the current location of the robotic system 100 , but that is not at the location of the brake lever.
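As one hypothetical waypoint rule consistent with the fifty-centimeter example, the sketch below steps a bounded distance along the straight line toward the goal. The patent only requires that the waypoint lie between the current location and the destination, so this is one of many possible choices:

```python
import numpy as np

def next_waypoint(current, goal, step_m=0.5):
    """Pick a waypoint at most step_m along the straight line toward the goal."""
    current = np.asarray(current, dtype=float)
    goal = np.asarray(goal, dtype=float)
    offset = goal - current
    distance = np.linalg.norm(offset)
    if distance <= step_m:
        return goal  # close enough that the goal itself is the next target
    return current + offset * (step_m / distance)
```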
- FIG. 4 illustrates one example of a waypoint location that can be determined for the robotic system 100 .
- the navigation module can use the sensor data (and/or other data described herein) and determine locations of other objects (“Plane of Railcar” in FIG. 4 ), the surface on which the robotic system 100 is moving (“Ground Plane” in FIG. 4 ), and/or the waypoint location to which the robotic system 100 is moving (“Waypoint” in FIG. 4 ).
- the point cloud obtained from one or more of the sensors 108 , 110 can be examined to determine locations of the other objects and/or surface shown in FIG. 4 .
- the controller 106 can determine the locations of objects using the data with simultaneous localization and mapping (SLAM). For example, the controller 106 can use real-time appearance-based mapping (RTAB-Map) to identify the locations of objects.
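RTAB-Map is an existing SLAM package, so no attempt is made here to reproduce its API. The fragment below only illustrates the generic mapping idea behind such systems (projecting 3-D points into a 2-D occupancy grid that a planner can query), with grid parameters chosen arbitrarily for illustration:

```python
import numpy as np

def update_occupancy_grid(grid, points_xyz, resolution_m=0.05,
                          origin=(-5.0, -5.0), ground_height_m=0.05):
    """Mark grid cells occupied for each above-ground 3-D point.

    grid:       (H, W) uint8 array, 1 = occupied
    points_xyz: (N, 3) points in the map frame, z up
    """
    points = np.asarray(points_xyz, dtype=float).reshape(-1, 3)
    obstacles = points[points[:, 2] > ground_height_m]  # drop ground-plane returns
    cols = ((obstacles[:, 0] - origin[0]) / resolution_m).astype(int)
    rows = ((obstacles[:, 1] - origin[1]) / resolution_m).astype(int)
    inside = (rows >= 0) & (rows < grid.shape[0]) & (cols >= 0) & (cols < grid.shape[1])
    grid[rows[inside], cols[inside]] = 1
    return grid
```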
- the navigation module of the controller 106 can generate control signals to dictate how the robotic system 100 moves toward the waypoint location. These control signals may designate the direction of movement, the distance that the robotic system 100 is to move, the moving speed, and/or acceleration based on the current location of the robotic system 100 , the waypoint location, and/or limitations determined by the perception module of the controller 106 (described below).
- the navigation module generates control signals that are communicated to the propulsion system 104 of the robotic system 100 . These control signals direct the motors and other components of the propulsion system 104 how to operate to move the robotic system 100 .
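For a concrete (and assumed) example of turning a waypoint into propulsion commands, the sketch below uses a simple proportional law for a differential-drive base; the patent does not specify the drive type, gains, or control law:

```python
import numpy as np

def drive_toward(pose, waypoint, k_lin=0.5, k_ang=1.5, v_max=0.5):
    """Proportional (linear, angular) velocity commands toward a waypoint.

    pose:     (x, y, heading_rad) of the robotic vehicle
    waypoint: (x, y) target location in the same frame
    """
    dx, dy = waypoint[0] - pose[0], waypoint[1] - pose[1]
    heading_error = np.arctan2(dy, dx) - pose[2]
    heading_error = np.arctan2(np.sin(heading_error), np.cos(heading_error))  # wrap
    # slow down when pointed away from the waypoint; cap forward speed at v_max
    linear = min(v_max, k_lin * np.hypot(dx, dy)) * max(0.0, np.cos(heading_error))
    angular = k_ang * heading_error
    return linear, angular
```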
- movement of the robotic system 100 can be monitored to determine whether the movement has violated or will violate one or more predefined or previously designated limits. In one example, the robotic system 100 is not allowed to move more than forty inches (e.g., 102 centimeters). Alternatively, another distance limitation or other limitation (e.g., a limitation on an upper or lower speed, a limitation on an upper or lower acceleration, a limitation on a direction of movement, etc.) may be used. If the movement of the robotic system 100 reaches or violates one or more of these limitations, flow of the method or state diagram 200 can proceed toward 214.
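The forty-inch watchdog reduces to a few lines; the 1.02 m bound comes from the example above, while the helper name and frame conventions are assumptions:

```python
import numpy as np

MAX_LEG_DISTANCE_M = 1.02  # forty inches, per the example limit above

def limit_violated(leg_start_position, current_position,
                   max_distance_m=MAX_LEG_DISTANCE_M):
    """True once the vehicle has moved farther than allowed since the last waypoint."""
    travelled = np.linalg.norm(np.asarray(current_position, dtype=float)
                               - np.asarray(leg_start_position, dtype=float))
    return travelled > max_distance_m
```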
- the controller 106 determines the movements of the robotic system 100 to try and achieve different goals.
- One goal is to move the robotic system 100 so as to minimize or reduce the distance between the robotic system 100 and the desired location, such as the next waypoint (relative to moving the robotic system 100 along one or more, or all, other feasible paths to the next waypoint).
- Another goal is to keep at least a designated safe distance between the robotic system 100 and one or more other objects, such as rail tracks on which the vehicles are disposed.
- the controller 106 can determine commands for the propulsion system 104 that drive the robotic system 100 toward the next waypoint, and fuse these commands with commands that keep the robotic system 100 away from the vehicles (or other objects) by at least a designated, non-zero distance (e.g., four inches or ten centimeters). These commands are combined by the controller 106 into a velocity command that controls how the propulsion system 104 moves the robotic system 100.
- the fusion can be a weighted sum of the commands:

  cmd_vel = α · cmd_goal + β · cmd_safety    (1)

  where cmd_vel represents the velocity command, cmd_goal and cmd_safety are generated using an artificial potential field algorithm, and α and β are parameters that are tuned or set based on the task-relevant situation.
- at 214, movement of the robotic system 100 is stopped.
- the navigation module of the controller 106 can generate and communicate an alarm signal to the propulsion system 104 that stops movement of the robotic system 100 .
- This signal can direct motors to stop rotating wheels of the vehicle 102 of the robotic system 100 and/or direct a brake of the vehicle 102 to stop movement of the robotic system 100 .
- Flow of the method or state diagram 200 may then return toward 202 .
- the robotic system 100 may continue autonomously moving toward the waypoint location.
- the subsequent motion is determined through a decision-making process (e.g., motion and energy optimization).
- the navigation module may direct the propulsion system 104 to move the robotic system 100 to move toward, but not all the way to, a destination or goal location. Instead, the navigation module can direct the propulsion system 104 to move the robotic system 100 part of the way to the destination or goal location.
- the robotic system 100 moves toward the waypoint location subject to the limitations described above.
- flow of the method or state diagram 200 can return toward 210 from 212 .
- the perception module can again determine whether the robotic system 100 can move based on the sensor data and/or other data, as described above.
- At least a portion of the method or state diagram 200 may repeat for one or more iterations: perceiving the surroundings, determining a subsequent waypoint, determining limitations on movement toward that waypoint, and moving to the waypoint.
- the final waypoint may be at or near the final destination of the robotic system 100 .
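Tying the pieces together, the loop below sketches the perceive / plan-waypoint / bound / move cycle repeating until the final destination is reached. It reuses the hypothetical helpers sketched earlier (next_waypoint, potential_field_command, permissive_to_move) and stands in for the described flow, not the patent's actual implementation:

```python
import time

import numpy as np

def navigate_to(goal, get_position, get_obstacle_points, send_velocity,
                arrive_tol_m=0.05, dt=0.1):
    """Repeat the perceive / waypoint / limit / move cycle until the goal is reached.

    get_position() -> (2,) location; get_obstacle_points() -> (N, 2) points;
    send_velocity(cmd) forwards a (2,) planar velocity to the propulsion system.
    """
    goal = np.asarray(goal, dtype=float)
    while True:
        position = np.asarray(get_position(), dtype=float)
        if np.linalg.norm(goal - position) < arrive_tol_m:
            send_velocity(np.zeros(2))   # final destination reached: stop
            return
        obstacles = get_obstacle_points()            # perceive the surroundings
        waypoint = next_waypoint(position, goal)     # pick an intermediate target
        cmd = potential_field_command(position, waypoint, obstacles)
        if permissive_to_move(position, cmd, obstacles):
            send_velocity(cmd)
        else:
            send_velocity(np.zeros(2))   # hold, as when flow stops at 214
        time.sleep(dt)
```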
- a robotic system includes a robotic vehicle having a propulsion system configured to propel the robotic vehicle, one or more sensors configured to be disposed onboard the robotic vehicle and to obtain image data representative of an external environment, and a controller configured to be disposed onboard the robotic vehicle and to determine a waypoint for the robotic vehicle to move toward.
- the waypoint is located between a current location of the robotic vehicle and a final destination of the robotic vehicle.
- the controller also is configured to determine limitations on movement of the robotic vehicle toward the waypoint. The limitations are based on the image data.
- the controller is configured to control the propulsion system to move the robotic vehicle to the waypoint subject to the limitations on the movement to avoid colliding with one or more objects.
- the controller also is configured to determine one or more additional waypoints subsequent to the robotic vehicle reaching the waypoint, determine one or more additional limitations on the movement of the robotic vehicle toward each of the respective additional waypoints, and control the propulsion system of the robotic vehicle to sequentially move the robotic vehicle to the one or more additional waypoints.
- the one or more sensors are configured to obtain a point cloud of the external environment using one or more structured light sensors.
- the controller is configured to determine the limitations on the movement of the robotic vehicle by determining relative locations of one or more objects in the external environment based on the image data and restricting movement of the robotic vehicle to avoid colliding with the one or more objects.
- the controller is configured to determine the limitations using simultaneous localization and mapping to restrict the movement of the robotic vehicle.
- the controller also is configured to stop movement of the robotic vehicle responsive to the robotic vehicle moving farther than a designated, non-zero distance toward the waypoint.
- the final destination of the robotic vehicle is a brake lever of a vehicle.
- the controller also is configured to switch between manual control of the movement of the robotic vehicle and autonomous movement of the robotic vehicle based on input received from a control unit disposed off-board the robotic vehicle.
- a method includes obtaining image data representative of an environment external to a robotic system, determining a waypoint for the robotic system to move toward, the waypoint located between a current location of the robotic system and a final destination of the robotic system, determining limitations on movement of the robotic system toward the waypoint.
- the limitations are based on the image data, controlling a propulsion system of the robotic system to move the robotic system to the waypoint subject to the limitations on the movement to avoid colliding with one or more objects, determining one or more additional waypoints subsequent to the robotic system reaching the waypoint, determining one or more additional limitations on the movement of the robotic system toward each of the respective additional waypoints, and controlling the propulsion system of the robotic system to sequentially move the robotic system to the one or more additional waypoints.
- obtaining the image data includes obtaining a point cloud using one or more structured light sensors.
- determining the limitations on the movement of the robotic system include determining relative locations of one or more objects in the environment based on the image data and restricting movement of the robotic system to avoid colliding with the one or more objects.
- determining the limitations includes using simultaneous localization and mapping to restrict the movement of the robotic system.
- the method also includes stopping movement of the robotic system responsive to the robotic system moving farther than a designated, non-zero distance toward the waypoint.
- the final destination of the robotic system is a brake lever of a vehicle.
- the method also includes switching between manual control of the movement of the robotic system and autonomous movement of the robotic system based on input received from a control unit disposed off-board the robotic system.
- a robotic system includes a robotic vehicle having a propulsion system configured to propel the robotic vehicle, one or more structured light sensors configured to be disposed onboard the robotic vehicle and to obtain point cloud data representative of an external environment, and a controller configured to be disposed onboard the robotic vehicle and to determine a waypoint for the robotic vehicle to move toward.
- the waypoint is located between a current location of the robotic vehicle and a brake lever of a rail vehicle.
- the controller also is configured to determine limitations on movement of the robotic vehicle toward the waypoint. The limitations are based on the point cloud data.
- the controller is configured to control the propulsion system to move the robotic vehicle to the waypoint subject to the limitations on the movement to avoid colliding with one or more objects.
- the controller also is configured to determine one or more additional waypoints subsequent to the robotic vehicle reaching the waypoint, determine one or more additional limitations on the movement of the robotic vehicle toward each of the respective additional waypoints, and control the propulsion system of the robotic vehicle to sequentially move the robotic vehicle to the one or more additional waypoints.
- the controller is configured to determine the limitations on the movement of the robotic vehicle by determining relative locations of one or more objects in the external environment based on the point cloud data and restricting movement of the robotic vehicle to avoid colliding with the one or more objects.
- the controller is configured to determine the limitations using simultaneous localization and mapping to restrict the movement of the robotic vehicle.
- the controller also is configured to stop movement of the robotic vehicle responsive to the robotic vehicle moving farther than a designated, non-zero distance toward the waypoint.
- the controller also is configured to switch between manual control of the movement of the robotic vehicle and autonomous movement of the robotic vehicle based on input received from a control unit disposed off-board the robotic vehicle.
Abstract
- A robotic system includes a robotic vehicle having a propulsion system, one or more sensors disposed onboard the robotic vehicle to obtain image data representative of an external environment, and an onboard controller. The controller determines a waypoint located between a current location of the robotic vehicle and a final destination, determines limitations on movement toward the waypoint based on the image data, and controls the propulsion system to move the robotic vehicle to the waypoint subject to the limitations so as to avoid colliding with one or more objects. Subsequent to the robotic vehicle reaching the waypoint, the controller determines one or more additional waypoints and limitations and sequentially moves the robotic vehicle to the additional waypoints.
Description
- This application claims priority to U.S. Provisional Application No. 62/342,448, filed 27 May 2016, the entire disclosure of which is incorporated herein by reference.
- The subject matter described herein relates to systems and methods for autonomously controlling movement of a device.
- The challenges in modern vehicle yards are vast and diverse. Classification yards, or hump yards, play an important role as consolidation nodes in vehicle freight networks. At classification yards, inbound vehicle systems (e.g., trains) are disassembled and the cargo-carrying vehicles (e.g., railcars) are sorted by next common destination (or block). The efficiency of the yards in part drives the efficiency of the entire transportation network.
- The hump yard is generally divided into three main areas: the receiving yard, where inbound vehicle systems arrive and are prepared for sorting; the class yard, where cargo-carrying vehicles in the vehicle systems are sorted into blocks; and the departure yard, where blocks of vehicles are assembled into outbound vehicle systems, inspected, and then depart.
- Current solutions for field service operations are labor-intensive, dangerous, and limited by the ability of human operators to make critical decisions in the presence of incomplete or incorrect information. Furthermore, efficient system-level operations require integrated, system-wide solutions, not just point solutions to key challenges. The nature of these missions dictates that the tasks and environments cannot always be fully anticipated or specified at design time, yet an autonomous solution may need the essential capabilities and tools to carry out the mission even if it encounters situations that were not expected.
- Solutions for typical vehicle yard problems, such as brake bleeding, brake line lacing, coupling cars, etc., can require combining mobility, perception, and manipulation into a tightly integrated autonomous solution. When robots are placed in an outdoor environment, the technical challenges increase substantially, but field robotic applications benefit both technically and economically.
- One challenge in using automated robotic systems to perform maintenance of the vehicles in the yard is ensuring that the robotic systems safely move through the yard. For example, safeguards are needed to ensure that the robotic systems do not collide with other objects (stationary or moving) and that the robotic systems are able to respond to a dynamically changing environment (e.g., where an object moves into the path of a moving robotic system), while also attempting to ensure that the robotic systems move toward locations for performing the vehicle maintenance along efficient paths (e.g., the shortest possible path or the path that is shorter than one or more other paths, but not all paths).
- In one embodiment, a robotic system includes a robotic vehicle having a propulsion system configured to propel the robotic vehicle, one or more sensors configured to be disposed onboard the robotic vehicle and to obtain image data representative of an external environment, and a controller configured to be disposed onboard the robotic vehicle and to determine a waypoint for the robotic vehicle to move toward. The waypoint is located between a current location of the robotic vehicle and a final destination of the robotic vehicle. The controller also is configured to determine limitations on movement of the robotic vehicle toward the waypoint. The limitations are based on the image data. The controller is configured to control the propulsion system to move the robotic vehicle to the waypoint subject to the limitations on the movement to avoid colliding with one or more objects. The controller also is configured to determine one or more additional waypoints subsequent to the robotic vehicle reaching the waypoint, determine one or more additional limitations on the movement of the robotic vehicle toward each of the respective additional waypoints, and control the propulsion system of the robotic vehicle to sequentially move the robotic vehicle to the one or more additional waypoints.
- In one embodiment, a method includes obtaining image data representative of an environment external to a robotic system, determining a waypoint for the robotic system to move toward, the waypoint located between a current location of the robotic system and a final destination of the robotic system, determining limitations on movement of the robotic system toward the waypoint. The limitations are based on the image data, controlling a propulsion system of the robotic system to move the robotic system to the waypoint subject to the limitations on the movement to avoid colliding with one or more objects, determining one or more additional waypoints subsequent to the robotic system reaching the waypoint, determining one or more additional limitations on the movement of the robotic system toward each of the respective additional waypoints, and controlling the propulsion system of the robotic system to sequentially move the robotic system to the one or more additional waypoints.
- In one embodiment, a robotic system includes a robotic vehicle having a propulsion system configured to propel the robotic vehicle, one or more structured light sensors configured to be disposed onboard the robotic vehicle and to obtain point cloud data representative of an external environment, and a controller configured to be disposed onboard the robotic vehicle and to determine a waypoint for the robotic vehicle to move toward. The waypoint is located between a current location of the robotic vehicle and a brake lever of a vehicle. The controller also is configured to determine limitations on movement of the robotic vehicle toward the waypoint. The limitations are based on the point cloud data. The controller is configured to control the propulsion system to move the robotic vehicle to the waypoint subject to the limitations on the movement to avoid colliding with one or more objects.
- The present inventive subject matter will be better understood from reading the following description of non-limiting embodiments, with reference to the attached drawings, wherein below:
- FIG. 1 illustrates one embodiment of a robotic system;
- FIG. 2 illustrates a flowchart of a method or state diagram of operation of a controller of the robotic system shown in FIG. 1 in directing movement of the robotic system according to one embodiment;
- FIG. 3 illustrates one example of sensor data that can be examined by the controller shown in FIG. 1 to determine how to autonomously move the robotic system also shown in FIG. 1; and
- FIG. 4 illustrates one example of a waypoint location that can be determined for the robotic system shown in FIG. 1.
FIG. 1 illustrates one embodiment of arobotic system 100. Therobotic system 100 may be used to autonomously move toward, grasp, and actuate (e.g., move) a brake lever or rod on a vehicle in order to change a state of a brake system of the vehicle. For example, therobotic system 100 may autonomously move toward, grasp, and move a brake rod of an air brake system on a rail car in order to bleed air out of the brake system. Therobotic system 100 includes arobotic vehicle 102 having apropulsion system 104 that operates to move therobotic system 100. Thepropulsion system 104 may include one or more motors, power sources (e.g., batteries, alternators, generators, etc.), or the like, for moving therobotic system 100. Acontroller 106 of therobotic system 100 includes hardware circuitry that includes and/or is connected with one or more processors (e.g., microprocessors, field programmable gate arrays, and/or integrated circuits) that direct operations of therobotic system 100. - The
robotic system 100 also includesseveral sensors robotic system 100 to move toward, grasp, and actuate brake levers. The sensors 108-111 are optical sensors, such as cameras, infrared projectors and/or detectors. While fouroptical sensors robotic system 100 may have a single optical sensor, less than four optical sensors, or more than four optical sensors. In one embodiment, thesensors sensors - The
sensor 112 is a touch sensor that detects when amanipulator arm 114 of therobotic system 100 contacts or otherwise engages a surface or object. Thetouch sensor 112 may be one or more of a variety of touch-sensitive devices, such as a switch (e.g., that is closed upon touch or contact), a capacitive element (e.g., that is charged or discharged upon touch or contact), or the like. Alternatively, one or more of the sensors 108-112 may be another type of sensor, such as a radar sensor, LIDAR sensor, etc. - The
manipulator arm 114 is an elongated body of therobotic system 100 that can move in a variety of directions, grasp, and pull and/or push a brake rod. Thecontroller 106 may be operably connected with thepropulsion system 104 and themanipulator arm 114 to control movement of therobotic system 100 and/or thearm 114, such as by one or more wired and/or wireless connections. Thecontroller 106 may be operably connected with the sensors 108-112 to receive data obtained, detected, or measured by the sensors 108-112. - The
robotic system 100 can include acommunication device 116 that communicates with an off-board control unit 118. Thecommunication device 116 can represent one or more antennas and associated transceiving circuitry, such as one or more modems, transceivers, receivers, transmitters, etc. Thecontrol unit 118 can represent hardware circuitry that includes and/or is connected with one or more processors (e.g., microprocessors, field programmable gate arrays, or integrated circuits) that receives user input to remotely control movement and other operation of therobotic system 100. In one embodiment, thecontrol unit 118 also represents one or more input devices, such as joysticks, touchscreens, styluses, keyboards, etc., to allow a user to remotely control movement and other operations of therobotic system 100. Thecontrol unit 118 also can include one or more antennas and associated transceiving circuitry to allow wireless communication with thecommunication device 116 of therobotic system 100. Alternatively or additionally, thecommunication device 116 of therobotic system 100 may be connected with thecontrol unit 118 by one or more wired connections to allow for remote control of therobotic system 100 via the wired connection(s). -
FIG. 2 illustrates a state diagram 200 of operation of thecontroller 106 in directing movement of therobotic system 100 shown inFIG. 1 according to one embodiment. The state diagram 200 can represent a flowchart of a method for controlling movement of therobotic system 100, and may represent or be used to create software that directs operation of thecontroller 106. - The
controller 106 may operate using the method represented by the state diagram 200 to move therobotic system 100 between or among different locations (e.g., in vehicle yards or other locations) to perform tasks, such as maintenance, inspection, repair, etc., of the vehicles. Thecontroller 106 may operate in different operational modes. One mode can be referred to as an autonomous navigation mode and another mode can be referred to as tele-operation mode. Operations performed or controlled by thecontroller 106 can be referred to herein as modules. The modules can represent different sets of functions performed by the same or different processors of thecontroller 106, and/or can represent different hardware components (e.g., processors and associated circuitry) performing the functions associated with the respective modules. - At 202, the
robotic system 100 is in a ready state. The ready state may involve therobotic system 100 being stationary and prepared to begin movement. Thecontroller 106 may monitor the communication device 116 (or wait for a signal from the communication device 116) to indicate whether therobotic system 100 is to begin movement. Thecontroller 106 may receive an input signal from thecontrol unit 118 via thecommunication device 116 and/or from an input device of the robotic system 100 (e.g., one or more buttons, knobs, switches, touchscreens, keyboards, etc.). Responsive to receiving the input signal, thecontroller 106 may determine whether the input signal indicates that therobotic system 100 is to operate in the autonomous navigation mode (also referred to as “Autonomous NAV” inFIG. 2 ; e.g., the operations or states shown in connection with 201 inFIG. 2 ) or the tele-operation (or manual navigation or remote control) mode (also referred to as “Manual NAV” inFIG. 2 ; e.g., the operations or states shown in connection with 203 inFIG. 2 ). The input signal may indicate that therobotic system 100 is to operate in the tele-operation mode if the input signal indicates movement of the input device of thecontrol unit 118, such as movement of a joystick or other input. The input signal may indicate that therobotic system 100 is to operate in the autonomous navigation mode if the input signal indicates other actuation of the input device of thecontrol unit 118, such as selection of an input that indicates autonomous operation. - If the
- If the controller 106 determines that the robotic system 100 is to operate in the tele-operation mode 203, then flow of the method or state diagram 200 may proceed toward 204. If the controller 106 determines that the robotic system 100 is to operate in the autonomous navigation mode 201, then flow of the method or state diagram 200 may proceed toward 208.
- At 204, the robotic system 100 determines if a permissive signal to move has been generated or provided. The permissive signal may be generated or provided by a deliberation module of the controller 106. The deliberation module receives input from the control unit 118, such as movement of a joystick or other input that indicates a direction of movement, speed, and/or acceleration of the robotic system 100. The deliberation module of the controller 106 also examines data or other information provided by one or more of the sensors 108-112 to determine whether movement, as requested or dictated by the input received from the control unit 118, is feasible and/or safe. For example, the deliberation module of the controller 106 can obtain two-dimensional (2D) image data (e.g., 2D images or video) from the sensors 109 and/or 111, three-dimensional (3D) image data (e.g., 3D images or video, point clouds, etc.) from the sensors 108 and/or 110, and/or detection of engagement or touch of an object from the sensor 112. The deliberation module can examine this data to determine if the movement requested by the control unit 118 can be performed without the robotic system 100 colliding with another object or operating in another unsafe manner. For example, the deliberation module can examine the 2D and/or 3D image data to determine if one or more obstacles are present in the movement path requested by the input. As described in more detail below, the image data provided by one or more of the sensors 108-111 can be used to determine whether any objects are in the path of the robotic system 100.
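One possible form of such a permissive check is sketched below; the function name, the (range, bearing) obstacle representation, and the clearance and cone thresholds are illustrative assumptions rather than details of this disclosure.

```python
def motion_is_permitted(requested_heading_deg: float,
                        obstacles: list[tuple[float, float]],
                        clearance_m: float = 0.5,
                        cone_half_angle_deg: float = 15.0) -> bool:
    """Grant the permissive signal only if no detected obstacle lies
    within clearance_m of the robot inside a cone centered on the
    requested heading. Each obstacle is a (range_m, bearing_deg)
    detection fused from the 2D/3D image data and touch sensing."""
    for range_m, bearing_deg in obstacles:
        # Smallest absolute angle between the obstacle bearing and the heading.
        bearing_error = abs((bearing_deg - requested_heading_deg + 180.0) % 360.0 - 180.0)
        if range_m <= clearance_m and bearing_error <= cone_half_angle_deg:
            return False  # the requested movement would approach an obstacle
    return True
```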
- If the controller 106 (e.g., the deliberation module) determines that the robotic system 100 can move according to the input provided by the control unit 118 at 202, then flow of the method or state diagram 200 continues toward 206. Otherwise, the method or state diagram 200 may remain at 204 until permission to move is received from or otherwise provided by the deliberation module of the controller 106.
- At 206, the robotic system 100 moves according to the input provided by or otherwise received from the control unit 118. For example, responsive to receiving permission to move the robotic system 100 according to the input provided by the control unit 118, the controller 106 may generate control signals that are communicated to the propulsion system 104 of the robotic system 100 to move the robotic system 100 according to the input. Upon completion of the movement, the propulsion system 104 may stop moving the robotic system 100 and flow of the method or state diagram 200 may return toward 204.
- If, at 204, it is determined that the robotic system 100 is to operate in the autonomous navigation mode 201, then flow of the method or state diagram 200 may proceed toward 208. For example, if the input received by the controller 106 from the control unit 118 indicates that the robotic system 100 is to autonomously move, then flow may proceed toward 208.
- At 208, a navigation module of the controller 106 informs the deliberation module that autonomous movement of the robotic system 100 has been initiated. This can involve the controller 106 switching from the manual navigation mode to the autonomous navigation mode. The robotic system 100 may remain stationary, and movement of the robotic system 100 may optionally be prohibited, until confirmation of the change from the manual to the autonomous navigation mode has been received. This confirmation may be provided from the deliberation module of the controller 106.
- Responsive to receiving confirmation that the autonomous movement of the robotic system 100 has been initiated, at 210, a determination is made as to whether the robotic system 100 can move. This determination may involve examining data provided by one or more of the sensors 108-112, in addition to or exclusive of other data provided to or accessible by the controller 106. For example, in addition to the image data provided by one or more of the sensors 108-111, the controller 106 may access a memory or database (not shown) onboard or off-board the robotic system 100 (e.g., via the communication device 116). The controller 106 can obtain information such as a current location of the robotic system 100 (e.g., via a global positioning system receiver or data), locations of vehicles in the vehicle yard, numbers of vehicles in a vehicle consist that the robotic system 100 is to move alongside, known or designated locations of objects in or around the robotic system 100, etc.
- A perception module of the controller 106 can examine the sensor data and/or other data to determine how to autonomously move the robotic system 100. The perception module can examine this data to determine how to safely and efficiently move the robotic system 100 without intervention (or at least additional intervention) from a human operator.
- FIG. 3 illustrates one example of sensor data 300 that can be examined by the controller 106 to determine how to autonomously move the robotic system 100. The sensor data 300 is a point cloud that represents locations of different points in 3D space. The sensor data 300 may be obtained from a structured light sensor, such as a Microsoft KINECT camera device or other structured light sensor. The point cloud indicates where different objects are located relative to the sensor. The controller 106 can examine the point cloud to determine if there are any objects that the robotic system 100 could collide with. Based on the locations of the points in the point cloud, the perception module of the controller 106 can determine how far the object is from the sensor that provided the data used to generate the point cloud. Optionally, the controller 106 may examine other data, such as 2D or 3D images obtained by the sensors 108-111, detection of touch as determined by the sensor 112, radar data provided by one or more sensors, or other data, to determine the presence of, distance to, relative location of, etc., other object(s) around the robotic system 100.
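A minimal sketch of how a point cloud can yield an obstacle distance is given below, assuming a z-up sensor frame in which points near zero height belong to the ground plane; the function and parameter names are hypothetical.

```python
import numpy as np

def nearest_obstacle_distance(point_cloud: np.ndarray,
                              ground_z_m: float = 0.05) -> float:
    """Distance from the sensor origin to the closest non-ground point.

    point_cloud is an (N, 3) array of XYZ points in the sensor frame,
    such as the structured light data described above; points within
    ground_z_m of the ground plane are ignored so the floor itself is
    not treated as an obstacle."""
    above_ground = point_cloud[point_cloud[:, 2] > ground_z_m]
    if above_ground.size == 0:
        return float("inf")  # nothing but ground in view
    return float(np.min(np.linalg.norm(above_ground, axis=1)))
```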
- Returning to the description of the method or state diagram 200 shown in FIG. 2, at 210, the perception module examines the data to determine whether the robotic system 100 can move without colliding with another object (stationary or moving) and, if the robotic system 100 can move without a collision, where the robotic system 100 can move. For example, the controller 106 can examine the data to determine allowable limits on where the robotic system 100 can move. These limits can include restrictions on how far the robotic system 100 can move in one or more directions, how fast the robotic system 100 can move in one or more directions, and/or how quickly the robotic system 100 can accelerate or decelerate in one or more directions. The controller 106 may hold off on moving the robotic system 100 until the determination is made as to whether the robotic system 100 can move and what limitations apply to that movement.
- At 212, the robotic system 100 autonomously moves. In one embodiment, the navigation module of the controller 106 determines a waypoint location for the robotic system 100 to move toward. The waypoint location may be a geographic location that is between a current location of the robotic system 100 and a final, destination, or goal location that the robotic system 100 is moving toward. For example, if the robotic system 100 is to move five meters to a brake lever of a vehicle in order to grasp and pull the brake lever (e.g., to bleed an air brake of the vehicle), the navigation module may generate control signals to cause the robotic system 100 to move to a waypoint that is fifty centimeters (or another distance) toward the brake lever from the current location of the robotic system 100, but that is not at the location of the brake lever.
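The waypoint selection in this example amounts to stepping a bounded distance along the line to the goal; a sketch under that assumption (the disclosure does not prescribe this particular construction) follows.

```python
import numpy as np

def next_waypoint(current: np.ndarray,
                  goal: np.ndarray,
                  step_m: float = 0.5) -> np.ndarray:
    """Return a waypoint at most step_m along the straight line from the
    current location toward the goal; if the goal is closer than one
    step, return the goal itself. The 0.5 m default echoes the
    fifty-centimeter example above."""
    offset = goal - current
    distance = float(np.linalg.norm(offset))
    if distance <= step_m:
        return goal
    return current + offset * (step_m / distance)

# Example: starting five meters from the brake lever, the first waypoint
# lies fifty centimeters along the way.
# next_waypoint(np.array([0.0, 0.0]), np.array([5.0, 0.0]))  # -> [0.5, 0.0]
```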
- FIG. 4 illustrates one example of a waypoint location that can be determined for the robotic system 100. The navigation module can use the sensor data (and/or other data described herein) to determine locations of other objects (“Plane of Railcar” in FIG. 4), the surface on which the robotic system 100 is moving (“Ground Plane” in FIG. 4), and/or the waypoint location to which the robotic system 100 is moving (“Waypoint” in FIG. 4). For example, the point cloud obtained from one or more of the sensors can be used to identify the railcar plane, the ground plane, and the waypoint location shown in FIG. 4.
- In one embodiment, the controller 106 can determine the locations of objects using the data with simultaneous localization and mapping (SLAM). For example, the controller 106 can use real-time appearance-based mapping (RTAB-Map) to identify the locations of objects.
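For orientation only, the toy update below marks obstacle locations in an occupancy grid, standing in for the mapping half of SLAM; pose estimation (the localization half, which a package such as RTAB-Map provides) is assumed to have already expressed the points in the map frame.

```python
import numpy as np

def update_occupancy_grid(grid: np.ndarray,
                          origin_xy: tuple[float, float],
                          obstacle_points_xy: np.ndarray,
                          resolution_m: float = 0.05) -> np.ndarray:
    """Mark the grid cell containing each (x, y) obstacle point as
    occupied. grid is a 2D array of 0/1 cells whose lower-left corner
    sits at origin_xy in the map frame."""
    for x, y in obstacle_points_xy:
        col = int((x - origin_xy[0]) / resolution_m)
        row = int((y - origin_xy[1]) / resolution_m)
        if 0 <= row < grid.shape[0] and 0 <= col < grid.shape[1]:
            grid[row, col] = 1  # cell contains an obstacle
    return grid
```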
- The navigation module of the controller 106 can generate control signals to dictate how the robotic system 100 moves toward the waypoint location. These control signals may designate the direction of movement, the distance that the robotic system 100 is to move, the moving speed, and/or the acceleration based on the current location of the robotic system 100, the waypoint location, and/or limitations determined by the perception module of the controller 106 (described below). The navigation module generates control signals that are communicated to the propulsion system 104 of the robotic system 100. These control signals direct how the motors and other components of the propulsion system 104 operate to move the robotic system 100.
- Movement of the robotic system 100 can be monitored to determine whether the movement of the robotic system 100 has violated or will violate one or more predefined or previously designated limits. In the illustrated example, the robotic system 100 is not allowed to move more than forty inches (e.g., approximately 102 centimeters). Optionally, another distance limitation or other limitation (e.g., a limitation on an upper or lower speed, a limitation on an upper or lower acceleration, a limitation on a direction of movement, etc.) may be used. If the movement of the robotic system 100 reaches or violates one or more of these limitations, flow of the method or state diagram 200 can proceed toward 214.
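A limit monitor of this kind reduces to a comparison against the designated bounds; in the sketch below, the 1.02 m default reflects the forty-inch example above, while the speed ceiling is an assumed value included only to show how additional limits slot in.

```python
def violates_limits(distance_moved_m: float,
                    speed_mps: float,
                    max_distance_m: float = 1.02,
                    max_speed_mps: float = 0.5) -> bool:
    """Return True if the monitored movement reaches or exceeds a
    limitation, which triggers the stop state at 214."""
    return distance_moved_m >= max_distance_m or speed_mps > max_speed_mps
```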
- In one embodiment, the controller 106 determines the movements of the robotic system 100 to try to achieve different goals. One goal is to move the robotic system 100 so as to minimize or reduce the distance between the robotic system 100 and the desired location, such as the next waypoint (relative to moving the robotic system 100 along one or more, or all, other feasible paths to the next waypoint). Another goal is to keep at least a designated safe distance between the robotic system 100 and one or more other objects, such as rail tracks on which the vehicles are disposed. The controller 106 can determine commands for the propulsion system 104 that drive the robotic system 100 toward the next waypoint and fuse these commands with commands that keep the robotic system 100 away from the vehicles (or other objects) by at least a designated, non-zero distance (e.g., four inches or ten centimeters). These commands are combined by the controller 106 to determine a velocity command that controls how the propulsion system 104 moves the robotic system 100. The fusion can be a weighted sum of the commands:
cmd_vel = α · cmd_goal + β · cmd_safety (1)

α + β = 1 (2)

where cmd_vel represents the velocity command, cmd_goal and cmd_safety are the goal-seeking and safety commands generated using the artificial potential field algorithm, and α and β are weighting parameters that are tuned or set based on the task-relevant situations.
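The weighted fusion of equations (1) and (2) can be sketched as follows. The unit-vector attraction and distance-scaled repulsion below are one simple way to generate the two artificial potential field commands; the disclosure does not spell out their construction, so that part is an assumption.

```python
import numpy as np

def fused_velocity_command(position: np.ndarray,
                           waypoint: np.ndarray,
                           nearest_obstacle: np.ndarray,
                           alpha: float = 0.7,
                           safe_dist_m: float = 0.10) -> np.ndarray:
    """Combine a goal-seeking command and a safety command as the
    weighted sum of equation (1), with beta = 1 - alpha enforcing
    equation (2)."""
    beta = 1.0 - alpha

    to_goal = waypoint - position
    cmd_goal = to_goal / (np.linalg.norm(to_goal) + 1e-9)  # attract toward the waypoint

    away = position - nearest_obstacle
    gap = float(np.linalg.norm(away))
    # Repulsion ramps up as the robot closes to within twice the
    # designated safe distance (e.g., ten centimeters).
    strength = max(0.0, (2.0 * safe_dist_m - gap) / (2.0 * safe_dist_m))
    cmd_safety = (away / (gap + 1e-9)) * strength

    return alpha * cmd_goal + beta * cmd_safety
```

In practice, α and β would be retuned per task: a larger α favors progress toward the waypoint, while a larger β favors standoff from the vehicles or other objects.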
- At 214, movement of the
robotic system 100 is stopped. The navigation module of the controller 106 can generate and communicate an alarm signal to the propulsion system 104 that stops movement of the robotic system 100. This signal can direct motors to stop rotating wheels of the vehicle 102 of the robotic system 100 and/or direct a brake of the vehicle 102 to stop movement of the robotic system 100. Flow of the method or state diagram 200 may then return toward 202.
- But, if movement of the robotic system 100 does not reach or violate the limitation(s), then the robotic system 100 may continue autonomously moving toward the waypoint location. The subsequent motion is determined through a decision-making process (e.g., motion and energy optimization). As described above, the navigation module may direct the propulsion system 104 to move the robotic system 100 toward, but not all the way to, a destination or goal location. Instead, the navigation module can direct the propulsion system 104 to move the robotic system 100 part of the way to the destination or goal location.
- The robotic system 100 moves toward the waypoint location subject to the limitations described above. Upon reaching the waypoint location, flow of the method or state diagram 200 can return toward 210 from 212. For example, the perception module can again determine whether the robotic system 100 can move based on the sensor data and/or other data, as described above. At least a portion of the method or state diagram 200 may repeat one or more times or iterations between perceiving the surroundings, determining a subsequent waypoint, determining limitations on movement toward the waypoint, and moving to the waypoint, as sketched below. Eventually, the final waypoint may be at or near the final destination of the robotic system 100.
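The repeated perceive/plan/move cycle can be summarized as in the sketch below, where robot is a hypothetical interface bundling the sensing, waypoint-planning, limit-checking, and propulsion behaviors sketched earlier.

```python
def navigate_to_goal(robot, goal, step_m: float = 0.5, tol_m: float = 0.05) -> bool:
    """Iterate the perceive / plan / move cycle of the state diagram 200
    until the robot is within tol_m of its final destination."""
    while robot.distance_to(goal) > tol_m:
        if not robot.can_move():       # perception check at 210
            robot.stop()               # stop state at 214
            return False
        waypoint = robot.plan_waypoint(goal, step_m)  # navigation module
        robot.move_to(waypoint)        # propulsion commands at 212
    return True                        # final waypoint reached at or near the goal
```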
- In one embodiment, a robotic system includes a robotic vehicle having a propulsion system configured to propel the robotic vehicle, one or more sensors configured to be disposed onboard the robotic vehicle and to obtain image data representative of an external environment, and a controller configured to be disposed onboard the robotic vehicle and to determine a waypoint for the robotic vehicle to move toward. The waypoint is located between a current location of the robotic vehicle and a final destination of the robotic vehicle. The controller also is configured to determine limitations on movement of the robotic vehicle toward the waypoint. The limitations are based on the image data. The controller is configured to control the propulsion system to move the robotic vehicle to the waypoint subject to the limitations on the movement to avoid colliding with one or more objects. The controller also is configured to determine one or more additional waypoints subsequent to the robotic vehicle reaching the waypoint, determine one or more additional limitations on the movement of the robotic vehicle toward each of the respective additional waypoints, and control the propulsion system of the robotic vehicle to sequentially move the robotic vehicle to the one or more additional waypoints.
- In one example, the one or more sensors are configured to obtain a point cloud of the external environment using one or more structured light sensors.
- In one example, the controller is configured to determine the limitations on the movement of the robotic vehicle by determining relative locations of one or more objects in the external environment based on the image data and restricting movement of the robotic vehicle to avoid colliding with the one or more objects.
- In one example, the controller is configured to determine the limitations using simultaneous localization and mapping to restrict the movement of the robotic vehicle.
- In one example, the controller also is configured to stop movement of the robotic vehicle responsive to the robotic vehicle moving farther than a designated, non-zero distance toward the waypoint.
- In one example, the final destination of the robotic vehicle is a brake lever of a vehicle.
- In one example, the controller also is configured to switch between manual control of the movement of the robotic vehicle and autonomous movement of the robotic vehicle based on input received from a control unit disposed off-board the robotic vehicle.
- In one embodiment, a method includes obtaining image data representative of an environment external to a robotic system, determining a waypoint for the robotic system to move toward, the waypoint located between a current location of the robotic system and a final destination of the robotic system, determining limitations on movement of the robotic system toward the waypoint, the limitations being based on the image data, controlling a propulsion system of the robotic system to move the robotic system to the waypoint subject to the limitations on the movement to avoid colliding with one or more objects, determining one or more additional waypoints subsequent to the robotic system reaching the waypoint, determining one or more additional limitations on the movement of the robotic system toward each of the respective additional waypoints, and controlling the propulsion system of the robotic system to sequentially move the robotic system to the one or more additional waypoints.
- In one example, obtaining the image data includes obtaining a point cloud using one or more structured light sensors.
- In one example, determining the limitations on the movement of the robotic system includes determining relative locations of one or more objects in the environment based on the image data and restricting movement of the robotic system to avoid colliding with the one or more objects.
- In one example, determining the limitations includes using simultaneous localization and mapping to restrict the movement of the robotic system.
- In one example, the method also includes stopping movement of the robotic system responsive to the robotic system moving farther than a designated, non-zero distance toward the waypoint.
- In one example, the final destination of the robotic system is a brake lever of a vehicle.
- In one example, the method also includes switching between manual control of the movement of the robotic system and autonomous movement of the robotic system based on input received from a control unit disposed off-board the robotic system.
- In one embodiment, a robotic system includes a robotic vehicle having a propulsion system configured to propel the robotic vehicle, one or more structured light sensors configured to be disposed onboard the robotic vehicle and to obtain point cloud data representative of an external environment, and a controller configured to be disposed onboard the robotic vehicle and to determine a waypoint for the robotic vehicle to move toward. The waypoint is located between a current location of the robotic vehicle and a brake lever of a rail vehicle. The controller also is configured to determine limitations on movement of the robotic vehicle toward the waypoint. The limitations are based on the point cloud data. The controller is configured to control the propulsion system to move the robotic vehicle to the waypoint subject to the limitations on the movement to avoid colliding with one or more objects.
- In one example, the controller also is configured to determine one or more additional waypoints subsequent to the robotic vehicle reaching the waypoint, determine one or more additional limitations on the movement of the robotic vehicle toward each of the respective additional waypoints, and control the propulsion system of the robotic vehicle to sequentially move the robotic vehicle to the one or more additional waypoints.
- In one example, the controller is configured to determine the limitations on the movement of the robotic vehicle by determining relative locations of one or more objects in the external environment based on the point cloud data and restricting movement of the robotic vehicle to avoid colliding with the one or more objects.
- In one example, the controller is configured to determine the limitations using simultaneous localization and mapping to restrict the movement of the robotic vehicle.
- In one example, the controller also is configured to stop movement of the robotic vehicle responsive to the robotic vehicle moving farther than a designated, non-zero distance toward the waypoint.
- In one example, the controller also is configured to switch between manual control of the movement of the robotic vehicle and autonomous movement of the robotic vehicle based on input received from a control unit disposed off-board the robotic vehicle.
- As used herein, an element or step recited in the singular and preceded with the word “a” or “an” should be understood as not excluding plural of said elements or steps, unless such exclusion is explicitly stated. Furthermore, references to “one embodiment” of the presently described subject matter are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Moreover, unless explicitly stated to the contrary, embodiments “comprising” or “having” an element or a plurality of elements having a particular property may include additional such elements not having that property.
- It is to be understood that the above description is intended to be illustrative, and not restrictive. For example, the above-described embodiments (and/or aspects thereof) may be used in combination with each other. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the subject matter set forth herein without departing from its scope. While the dimensions and types of materials described herein are intended to define the parameters of the disclosed subject matter, they are by no means limiting and are exemplary embodiments. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. The scope of the subject matter described herein should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.” Moreover, in the following claims, the terms “first,” “second,” and “third,” etc. are used merely as labels, and are not intended to impose numerical requirements on their objects. Further, the limitations of the following claims are not written in means-plus-function format and are not intended to be interpreted based on 35 U.S.C. §112(f), unless and until such claim limitations expressly use the phrase “means for” followed by a statement of function void of further structure.
- This written description uses examples to disclose several embodiments of the subject matter set forth herein, including the best mode, and also to enable a person of ordinary skill in the art to practice the embodiments of disclosed subject matter, including making and using the devices or systems and performing the methods. The patentable scope of the subject matter described herein is defined by the claims, and may include other examples that occur to those of ordinary skill in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal languages of the claims.
Claims (20)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/282,102 US20170341235A1 (en) | 2016-05-27 | 2016-09-30 | Control System And Method For Robotic Motion Planning And Control |
US16/934,046 US11927969B2 (en) | 2015-05-01 | 2020-07-21 | Control system and method for robotic motion planning and control |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662342448P | 2016-05-27 | 2016-05-27 | |
US15/282,102 US20170341235A1 (en) | 2016-05-27 | 2016-09-30 | Control System And Method For Robotic Motion Planning And Control |
Related Parent Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/240,237 Continuation-In-Part US11020859B2 (en) | 2015-05-01 | 2019-01-04 | Integrated robotic system and method for autonomous vehicle maintenance |
Related Child Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/934,046 Continuation US11927969B2 (en) | 2015-05-01 | 2020-07-21 | Control system and method for robotic motion planning and control |
Publications (1)
Publication Number | Publication Date |
---|---|
US20170341235A1 (en) | 2017-11-30 |
Family
ID=60420876
Family Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/282,102 Abandoned US20170341235A1 (en) | 2015-05-01 | 2016-09-30 | Control System And Method For Robotic Motion Planning And Control |
US16/934,046 Active 2036-11-05 US11927969B2 (en) | 2015-05-01 | 2020-07-21 | Control system and method for robotic motion planning and control |
Family Applications After (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/934,046 Active 2036-11-05 US11927969B2 (en) | 2015-05-01 | 2020-07-21 | Control system and method for robotic motion planning and control |
Country Status (1)
Country | Link |
---|---|
US (2) | US20170341235A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113110453B (en) * | 2021-04-15 | 2022-06-21 | 哈尔滨工业大学 | Artificial potential field obstacle avoidance method based on graph transformation |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6374155B1 (en) | 1999-11-24 | 2002-04-16 | Personal Robotics, Inc. | Autonomous multi-platform robot system |
EP2363774B1 (en) * | 2000-05-01 | 2017-06-21 | iRobot Corporation | Method and system for remote control of mobile robot |
US6836701B2 (en) | 2002-05-10 | 2004-12-28 | Royal Appliance Mfg. Co. | Autonomous multi-platform robotic system |
US8843244B2 (en) * | 2006-10-06 | 2014-09-23 | Irobot Corporation | Autonomous behaviors for a remove vehicle |
US8583313B2 (en) | 2008-09-19 | 2013-11-12 | International Electronic Machines Corp. | Robotic vehicle for performing rail-related actions |
US8395378B2 (en) | 2010-04-29 | 2013-03-12 | General Electric Company | Nondestructive robotic inspection method and system therefor |
JP5560979B2 (en) * | 2010-07-13 | 2014-07-30 | 村田機械株式会社 | Autonomous mobile |
ES2812568T3 (en) | 2012-01-25 | 2021-03-17 | Omron Tateisi Electronics Co | Autonomous mobile robot to execute work assignments in a physical environment in which there are stationary and non-stationary obstacles |
US9085080B2 (en) | 2012-12-06 | 2015-07-21 | International Business Machines Corp. | Human augmentation of robotic work |
CN104122843A (en) | 2013-04-24 | 2014-10-29 | 山东轻工业学院 | Concentrated control system for city underground railway intelligent detection robots and realization method |
Patent Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020049530A1 (en) * | 1998-04-15 | 2002-04-25 | George Poropat | Method of tracking and sensing position of objects |
US20040093122A1 (en) * | 2002-11-07 | 2004-05-13 | John Galibraith | Vision-based obstacle avoidance |
US20050216182A1 (en) * | 2004-03-24 | 2005-09-29 | Hussain Talib S | Vehicle routing and path planning |
US20070156286A1 (en) * | 2005-12-30 | 2007-07-05 | Irobot Corporation | Autonomous Mobile Robot |
US20100241289A1 (en) * | 2006-06-22 | 2010-09-23 | Roy Sandberg | Method and apparatus for path planning, selection, and visualization |
US20080009964A1 (en) * | 2006-07-05 | 2008-01-10 | Battelle Energy Alliance, Llc | Robotics Virtual Rail System and Method |
US20080009967A1 (en) * | 2006-07-05 | 2008-01-10 | Battelle Energy Alliance, Llc | Robotic Intelligence Kernel |
US20080027591A1 (en) * | 2006-07-14 | 2008-01-31 | Scott Lenser | Method and system for controlling a remote vehicle |
US20100263948A1 (en) * | 2006-10-06 | 2010-10-21 | Couture Adam P | Robotic vehicle |
US20090037033A1 (en) * | 2007-05-14 | 2009-02-05 | Emilie Phillips | Autonomous Behaviors for a Remote Vehicle |
US20090088916A1 (en) * | 2007-09-28 | 2009-04-02 | Honeywell International Inc. | Method and system for automatic path planning and obstacle/collision avoidance of autonomous vehicles |
US20110035087A1 (en) * | 2009-08-10 | 2011-02-10 | Samsung Electronics Co., Ltd. | Method and apparatus to plan motion path of robot |
US20110054689A1 (en) * | 2009-09-03 | 2011-03-03 | Battelle Energy Alliance, Llc | Robots, systems, and methods for hazard evaluation and visualization |
US9146558B2 (en) * | 2010-11-30 | 2015-09-29 | Irobot Corporation | Mobile robot and method of operating thereof |
US9008840B1 (en) * | 2013-04-19 | 2015-04-14 | Brain Corporation | Apparatus and methods for reinforcement-guided supervised learning |
US20150199458A1 (en) * | 2014-01-14 | 2015-07-16 | Energid Technologies Corporation | Digital proxy simulation of robotic hardware |
US20150251315A1 (en) * | 2014-03-10 | 2015-09-10 | Tecan Trading Ag | Process for Finding A Path in an Automated Handling System, and Handling System with Corresponding Control Module for Finding A Path |
US20160059416A1 (en) * | 2014-08-29 | 2016-03-03 | General Electric Company | Systems and methods for railyard robotics |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
RU2716035C1 (en) * | 2017-12-20 | 2020-03-05 | СиТиАй Системс С.а.р.л. | Collision avoidance assist system for movable working platforms |
US10829354B2 (en) | 2017-12-20 | 2020-11-10 | Cti Systems S.a.r.l. | Collision avoidance assistance system for movable work platforms |
LU100766B1 (en) * | 2018-04-16 | 2019-02-06 | Cti Systems S A R L | Collision avoidance assistance system for movable work platforms |
US20190321977A1 (en) * | 2018-04-23 | 2019-10-24 | General Electric Company | Architecture and methods for robotic mobile manipluation system |
US10759051B2 (en) * | 2018-04-23 | 2020-09-01 | General Electric Company | Architecture and methods for robotic mobile manipulation system |
US20200023523A1 (en) * | 2018-07-17 | 2020-01-23 | Fuji Xerox Co., Ltd. | Robot control system, robot apparatus, and non-transitory computer readable medium |
US11829147B2 (en) | 2018-10-10 | 2023-11-28 | Dyson Technology Limited | Path planning |
CN110209171A (en) * | 2019-06-24 | 2019-09-06 | 深圳物控智联科技有限公司 | A kind of paths planning method based on Artificial Potential Field Method |
US11729509B2 (en) * | 2020-05-22 | 2023-08-15 | Magic Control Technology Corp. | 360-degree panoramic image selective displaying camera and method |
Also Published As
Publication number | Publication date |
---|---|
US20200348686A1 (en) | 2020-11-05 |
US11927969B2 (en) | 2024-03-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11927969B2 (en) | Control system and method for robotic motion planning and control | |
CN109933064B (en) | Multi-sensor safety path system for autonomous vehicles | |
US11865726B2 (en) | Control system with task manager | |
US11079754B2 (en) | Multi-stage operation of autonomous vehicles | |
Beul et al. | Fast autonomous flight in warehouses for inventory applications | |
US10802505B2 (en) | Driverless transport system | |
US10065314B2 (en) | System and method for manipulation platform | |
US11020859B2 (en) | Integrated robotic system and method for autonomous vehicle maintenance | |
KR101946110B1 (en) | Vehicle combination and method for forming and operating a vehicle combination | |
JP6278539B2 (en) | Flight mode selection based on situation | |
US9085080B2 (en) | Human augmentation of robotic work | |
US11312018B2 (en) | Control system with task manager | |
US20170341236A1 (en) | Integrated robotic system and method for autonomous vehicle maintenance | |
US20200039076A1 (en) | Robotic system and method for control and manipulation | |
US11822334B2 (en) | Information processing apparatus, information processing method, and program for control of a moving body capable of autonomous movement | |
US20180288372A1 (en) | Apparatus and method for treating containers and packages with flying machine for monitoring | |
ES2932553T3 (en) | Method and system for the autonomous driving of a vehicle | |
JP2022522284A (en) | Safety Rating Multicell Workspace Mapping and Monitoring | |
US20220241975A1 (en) | Control system with task manager | |
Liu et al. | Autonomous vehicle planning system design under perception limitation in pedestrian environment | |
KR102433595B1 (en) | Unmanned transportation apparatus based on autonomous driving for smart maintenance of railroad vehicles | |
US10466703B2 (en) | Method for controlling at least one vehicle, which moves at least partially autonomously within an operating environment, and structure | |
US11762390B1 (en) | Autonomous machine safety management in a dynamic environment | |
US20240091953A1 (en) | Integrated robotic system and method for autonomous vehicle maintenance |
Legal Events

Date | Code | Title | Description
---|---|---|---
| AS | Assignment | Owner name: GENERAL ELECTRIC COMPANY, NEW YORK. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: BALOCH, GHULAM ALI; TAN, HUAN; KANNAN, BALAJEE; AND OTHERS; SIGNING DATES FROM 20160929 TO 20161020; REEL/FRAME: 040161/0486
| AS | Assignment | Owner name: GE GLOBAL SOURCING LLC, CONNECTICUT. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: GENERAL ELECTRIC COMPANY; REEL/FRAME: 047952/0689. Effective date: 20181101
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
| STCV | Information on status: appeal procedure | Free format text: NOTICE OF APPEAL FILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE