WO2022106007A1 - Method of controlling mobile robot, control system and mobile robot - Google Patents

Method of controlling mobile robot, control system and mobile robot

Info

Publication number
WO2022106007A1
Authority
WO
WIPO (PCT)
Prior art keywords
mobile robot
handle
human input
human
input
Prior art date
Application number
PCT/EP2020/082733
Other languages
French (fr)
Inventor
Simon LINGE
Saad AZHAR
Original Assignee
Abb Schweiz Ag
Priority date
Filing date
Publication date
Application filed by Abb Schweiz Ag filed Critical Abb Schweiz Ag
Priority to PCT/EP2020/082733 priority Critical patent/WO2022106007A1/en
Publication of WO2022106007A1 publication Critical patent/WO2022106007A1/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1682Dual arm manipulator; Coordination of several manipulators
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/02Hand grip control means
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00Manipulators mounted on wheels or on carriages
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1615Programme controls characterised by special kind of manipulator, e.g. planar, scara, gantry, cantilever, space, closed chain, passive/active joints and tendon driven manipulators
    • B25J9/162Mobile manipulator, movable base with manipulator arm mounted on it
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39439Joystick, handle, lever controls manipulator directly, manually by operator

Definitions

  • Fig. 1a schematically represents a rear view of a mobile robot 10a
  • Fig. 1b schematically represents a side view of the mobile robot 10a
  • Fig. 1c schematically represents a top view of the mobile robot 10a
  • Fig. 1d schematically represents a perspective view of the mobile robot 10a.
  • the mobile robot 10a comprises a base structure 12, a traction arrangement 14 and a handle 16a.
  • the traction arrangement 14 is configured to drive the mobile robot 10a on a horizontal floor 18.
  • the floor 18 is one example of a surface according to the present disclosure.
  • the traction arrangement 14 may be omnidirectional.
  • the traction arrangement 14 comprises four wheels 20, where each wheel 20 is a steerable traction wheel.
  • the base structure 12 of this example comprises a platform 22 and a pillar 24 positioned on the platform 22.
  • the pillar 24 is rigidly supported on the platform 22.
  • the pillar 24 comprises a vertical section erected from the platform 22 and an inclined section extending at an angle from the vertical section.
  • the mobile robot 10a of this example further comprises two manipulators 26.
  • Each manipulator 26 comprises a gripper 28.
  • the grippers 28 are examples of end effectors according to the present disclosure. By means of the grippers 28, the mobile robot 10a can perform various tasks, such as picking, placing and various handling of items.
  • each manipulator 26 of this example has six degrees of freedom.
  • Each gripper 28 can thereby move along, and rotate about, three orthogonal axes relative to the base structure 12.
  • Each manipulator 26 is also movable relative to the handle 16a.
  • the handle 16a serves as a physical interaction point for humans in the vicinity of the mobile robot 10a.
  • the handle 16a of this example is connected to the pillar 24.
  • the handle 16a is positioned at a height of approximately 1 m above the floor 18.
  • the handle 16a in Figs. 1a-1d is U-shaped and lies in a horizontal plane.
  • a right end of the handle 16a is connected to a right side of the pillar 24 and a left end of the handle 16a is connected to a left side of the pillar 24.
  • the handle 16a of this example thereby partly surrounds the pillar 24.
  • the handle 16a is generally positioned on a rear side of the pillar 24 opposite to a forward working side of the pillar 24 for the manipulators 26.
  • the handle 16a of this example is rigid and slightly movable relative to the pillar 24 from a neutral position.
  • the handle 16a shown in Figs. 1a-1d is one of numerous possible handles according to the present disclosure.
  • the mobile robot 10a of this example further comprises a rotary knob 30 and three buttons 32 provided on the handle 16a, and two buttons 34 provided directly on the vertical section of the pillar 24.
  • the buttons 32 and 34 and the rotary knob 30 can be manipulated to command performance of common tasks for the mobile robot 10a, such as various key functions, or to provide shortcuts.
  • the three buttons 32 may for example be used to command the mobile robot 10a to perform different tasks, such as filling a box with items, moving the box from a first location to a second location, and collecting items to be filled in a box.
  • the mobile robot 10a is thereby easier to control for humans not trained in operating it through a teach pendant unit (not shown).
  • by means of the buttons 32 and 34 and the rotary knob 30, an operator can also program the mobile robot 10a to carry out various tasks, e.g. define a preferred movement route, select an operation mode, perform lead-through programming, perform adjustments to its current task, and perform basic troubleshooting.
  • the mobile robot 10a further comprises a display 36.
  • the display 36 is here positioned on the inclined section of the pillar 24.
  • Various information related to the mobile robot 10a can be displayed on the display 36.
  • the mobile robot 10a of this example is a truly collaborative mobile robot. This means that the mobile robot 10a is constructed to not be capable of injuring humans. To this end, the manipulators 26 may have a total mass of 100 kg or less. Alternatively, or in addition, the manipulators 26 may be driven at a power less than 80 W. The traction arrangement 14 is dimensioned for relatively slow speeds and relatively weak propulsion.
  • a truly collaborative robot differs from an originally non-collaborative industrial robot that is retrofitted with sensors to be made collaborative.
  • the mobile robot 10a further comprises one or more navigation sensors (not shown) for sensing its surroundings.
  • examples of navigation sensors include a LIDAR (light detection and ranging) sensor and a camera.
  • the mobile robot 10a can operate autonomously.
  • the handle 16a of this example enables several functions.
  • the primary function is to enable humans in the vicinity of the mobile robot 10a to quickly, easily and intuitively freeze movements of the mobile robot 10a. This is valuable in various emergency situations.
  • One example of such emergency situation is when the mobile robot 10a is about to enter an area it should not enter.
  • Another example of such emergency situation is when some object has been dropped on the floor 18, and the mobile robot 10a should be prevented from driving over the object which it has not detected.
  • the object may for example be spilled oil.
  • the second function provided by the handle 16a is to enable humans to easily and intuitively take control of the mobile robot 10a and to move it from a first location to a second location.
  • the handle 16a provides a natural focal point for interaction between a human and the mobile robot 10a.
  • Fig. 2 schematically represents a block diagram of the mobile robot 10a.
  • the mobile robot 10a further comprises a control system 38, here provided in the base structure 12.
  • the control system 38 of this example comprises a data processing device 40 and a memory 42.
  • the memory 42 comprises a computer program stored thereon.
  • the computer program comprises program code which, when executed by the data processing device 40, causes the data processing device 40 to perform, or command performance of, various steps as described herein.
  • the handle 16a can receive and sense a human input 44.
  • the human input 44 may for example be a push of the handle 16a, a pull of the handle 16a, a rotation of the handle 16a (e.g. in the horizontal plane), various touches on the handle 16a, and/or a grip of the handle 16a by one or more hands of a human.
  • the handle 16a may comprise position sensors 46 for sensing movements, and magnitudes of such movements of the handle 16a relative to the pillar 24, such as a push, a pull and a rotation of the handle 16a.
  • the handle 16a may comprise capacitive sensors 48 for sensing touches, positions of such touches on the handle 16a, and magnitudes of such touches on the handle 16a.
  • the capacitive sensors 48 are configured to detect when a capacitive object, such as a human hand, is brought into contact with the handle 16a. In case only capacitive sensors 48 are provided in the handle 16a, the handle 16a does not have to be movable relative to the base structure 12.
  • the handle 16a may comprise pressure sensors 50 for sensing grips, positions of such grips on the handle 16a, and magnitudes of such grips on the handle 16a.
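  • as a hedged illustration of how these three sensor groups might be fused into a single reading, the sketch below combines position, capacitive and pressure sensors; the sensor interfaces and the 2 N grip threshold are assumptions for illustration, not from the patent:

```python
from dataclasses import dataclass

@dataclass
class HandleReading:
    """One reading of the handle sensors; field names are assumptions."""
    touched: bool      # any capacitive sensor 48 reports contact
    gripped: bool      # any pressure sensor 50 exceeds a grip threshold
    dx: float          # push/pull displacement from the neutral position [m]
    rotation: float    # rotation in the horizontal plane [rad]

def read_handle(position, capacitive, pressure, grip_threshold=2.0) -> HandleReading:
    """Combine the three sensor groups of the handle into one reading."""
    dx, rotation = position.displacement()  # movement relative to neutral position
    return HandleReading(
        touched=any(s.in_contact() for s in capacitive),
        gripped=any(s.force() > grip_threshold for s in pressure),
        dx=dx,
        rotation=rotation,
    )
```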
  • the handle 16a issues a human input signal 52 to the control system 38.
  • the control system 38 is configured to interpret the human input signal 52, e.g. to determine a magnitude and a direction of the human input 44, and to control the traction arrangement 14 based on the human input signal 52.
  • the manipulators 26 may be controlled by the control system 38 based on the human input signal 52.
  • the control system 38 is configured to interpret the human input signal 52 and to translate the human input signal 52 to movement instructions for the mobile robot 10a, e.g. movement instructions for the traction arrangement 14 and for the manipulators 26.
  • the physical interaction between the mobile robot 10a and the handle 16a in combination with the software implementation adds unique interaction possibilities that are not present in mobile robots of the prior art. If a movement of the traction arrangement 14 as commanded by the human input 44 is possible, e.g. if the sensors of the mobile robot 10a do not recognize any obstacles in a desired path, the control system 38 commands the traction arrangement 14 to execute the movement.
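  • a minimal sketch of this interpretation step is given below: a handle reading is translated into a traction command, and the navigation sensors may veto it; the gains, signs and interfaces are illustrative assumptions, not the patent's implementation:

```python
def execute_human_command(reading, navigation, traction):
    """Translate a handle reading into a traction command, checking the
    navigation sensors first (cf. obstacle avoidance above)."""
    v = 0.4 * reading.dx           # linear speed proportional to displacement
    omega = 0.8 * reading.rotation # handle rotation maps to the same turning direction
    if navigation.path_is_clear(v, omega):
        traction.drive(v, omega)   # execute the commanded movement
    else:
        traction.stop()            # e.g. an obstacle in the desired path
```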
  • Fig. 3a schematically represents a rear view of a mobile robot 10b
  • Fig. 3b schematically represents a side view of the mobile robot 10b in Fig. 3a
  • Fig. 3c schematically represents a top view of the mobile robot 10b in Figs. 3a and 3b
  • Fig. 3d schematically represents a perspective view of the mobile robot 10b in Figs. 3a-3c.
  • the mobile robot 10b comprises a further example of a handle 16b.
  • the handle 16b differs from the handle 16a in that the handle 16b has a somewhat simpler design. As shown in Figs. 3a-3d, the handle 16b does not comprise any buttons or rotary knob. The handle 16b is however configured to sense movements, touches and grips by a human in the same way as the handle 16a. Also the handle 16b lies in a horizontal plane.
  • Fig. 4a schematically represents one example of a human input 44 to the handle 16b of the mobile robot 10b in Figs. 3a-3d.
  • the human input 44 in Fig. 4a comprises a directional input 54, namely a forward push by both hands of a human. Since one hand pushes on the left side of the handle 16b and one hand pushes on the right side of the handle 16b, the handle 16b moves straight forward from a neutral position relative to the base structure 12. When the human releases the handle 16b, the handle 16b returns to its neutral position, e.g. by means of one or more springs (not shown).
  • a human input 44 corresponding to Fig. 4a may comprise a touch to each of a rear left side and a rear right side of the handle 16b.
  • a rear side of the handle 16b faces away from the base structure 12 and a front side of the handle 16b faces towards the base structure 12.
  • a human input 44 corresponding to Fig. 4a can be provided by gripping a left side of the handle 16b and a right side of the handle 16b with equal gripping forces.
  • Fig. 4b schematically represents a movement of the mobile robot 10b in response to the human input 44 in Fig. 4a.
  • the mobile robot 10b travels straight forward in response to the directional input 54 in Fig. 4a.
  • the wheels 20 are controlled by the control system 38 to move the mobile robot 10b in a direction corresponding to the human input 44.
  • the mobile robot 10b can thereby be moved easily over the floor 18 by the user.
  • the human input signal 52 also contains information regarding a magnitude of the force currently exerted on the handle 16b.
  • the control system 38 also controls a speed of the traction arrangement 14 based on the magnitude of the force.
  • Fig. 5a schematically represents a further example of a human input 44 to the handle 16b of the mobile robot 10b in Figs. 3a-3d.
  • the human input 44 in Fig. 5a comprises a further example of a directional input 54, namely a forward push on only a left side of the handle 16b.
  • the handle 16b is thereby pushed forward and rotated clockwise from the neutral position relative to the base structure 12.
  • a human input 44 corresponding to Fig. 5a may comprise a touch to a rear left side of the handle 16b.
  • a human input 44 corresponding to Fig. 5a can be provided by gripping a left side of the handle 16b, or by gripping the left side of the handle 16b more than a right side of the handle 16b.
  • Fig. 5b schematically represents a movement of the mobile robot 10b in response to the human input 44 in Fig. 5a. As shown in Fig. 5b, the mobile robot 10b travels forward and turns right in response to the directional input 54 in Fig. 5a.
  • Fig. 6a schematically represents a further example of a human input 44 to the handle 16b of the mobile robot 10b in Figs. 3a-3d.
  • the human input 44 in Fig. 6a comprises a further example of a directional input 54, namely a forward push on the right side of the handle 16b and a rearward pull on the left side of the handle 16b.
  • the handle 16b is thereby rotated counterclockwise from the neutral position relative to the base structure 12.
  • a human input 44 corresponding to Fig. 6a may comprise a touch to a rear right side and a front left side of the handle 16b.
  • Fig. 6b schematically represents a movement of the mobile robot 10b in response to the human input 44 in Fig. 6a. As shown in Fig. 6b, the mobile robot 10b rotates on the spot in a counterclockwise direction in response to the directional input 54 in Fig. 6a.
  • Fig. 7a schematically represents a further example of a human input 44 to the handle 16b of the mobile robot 10b in Figs. 3a-3d.
  • the human input 44 in Fig. 7a comprises a further example of a directional input 54, namely a rearward pull by both hands of a human. Since one hand pulls on the left side of the handle 16b and one hand pulls on the right side of the handle 16b, the handle 16b moves straight rearward from the neutral position relative to the base structure 12.
  • a human input 44 corresponding to Fig. 7a may comprise a touch to a front right side and a front left side of the handle 16b.
  • Fig. 7b schematically represents a movement of the mobile robot 10b in response to the human input 44 in Fig. 7a. As shown in Fig. 7b, the mobile robot 10b travels straight rearward in response to the human input 44 in Fig. 7a.
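  • the directional behaviors of Figs. 4a-7b can be summarized by a simple mixing rule: the common-mode force on the two handle sides drives translation, and the differential force drives rotation. The sketch below is an illustrative assumption of such a rule, not the patent's implementation:

```python
def mix_handle_forces(f_left: float, f_right: float) -> tuple[float, float]:
    """Mix the forces on the two handle sides into a motion command
    (v forward, omega counterclockwise); the gains are assumptions.
    Positive force is a forward push, negative a rearward pull."""
    v = 0.5 * (f_left + f_right)      # common-mode force drives translation
    omega = 0.5 * (f_right - f_left)  # differential force drives rotation
    return v, omega

assert mix_handle_forces(1.0, 1.0) == (1.0, 0.0)     # straight forward (Figs. 4a/4b)
assert mix_handle_forces(-1.0, 1.0) == (0.0, 1.0)    # rotate on the spot CCW (Figs. 6a/6b)
assert mix_handle_forces(-1.0, -1.0) == (-1.0, 0.0)  # straight rearward (Figs. 7a/7b)
```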
  • Fig. 8a schematically represents a further example of a human input 44 to the handle 16b of the mobile robot 10b in Figs. 3a-3d.
  • the human input 44 in Fig. 8a is a directional input 54 similar to Fig. 4a in that the handle 16b is pushed forward by both hands of a human such that the handle 16b moves straight forward a relatively small distance from the neutral position relative to the base structure 12.
  • the human input 44 in Fig. 8a also comprises a magnitudinal input 56.
  • the handle 16b is only pushed forward with a small force below a threshold value 58.
  • Fig. 8b schematically represents a behavior of the mobile robot 10b in response to the human input 44 in Fig. 8a.
  • in response to the magnitudinal input 56 being below the threshold value 58, the control system 38 only commands the mobile robot 10b to stop. The mobile robot 10b can thereby be stopped quickly by means of the handle 16b.
  • Fig. 9a schematically represents a further example of a human input 44 to the handle 16b of the mobile robot 10b in Figs. 3a-3d.
  • the human input 44 in Fig. 9a is a directional input 54 and a magnitudinal input 56 similar to Fig. 8a in that the handle 16b is pushed forward by both hands of a human such that the handle 16b moves straight forward from the neutral position relative to the base structure 12. However, in Fig. 9a, the handle 16b is pushed forward with a force above the threshold value 58.
  • Fig. 9b schematically represents a movement of the mobile robot 10b in response to the human input 44 in Fig. 9a. Since the magnitudinal input 56 in Fig. 9a is above the threshold value 58, the control system 38 controls the traction arrangement 14 to drive the mobile robot 10b forward with a speed corresponding to the magnitudinal input 56, here a relatively low speed.
  • Fig. 10a schematically represents a further example of a human input 44 to the handle 16b of the mobile robot 10b in Figs. 3a-3d.
  • the human input 44 in Fig. 10a is a directional input 54 and a magnitudinal input 56 similar to Fig. 9a in that the handle 16b is pushed forward by both hands of a human such that the handle 16b moves straight forward a relatively large distance from the neutral position relative to the base structure 12.
  • the handle 16b is pushed forward with a larger force than in Fig. 9a.
  • Fig. 10b schematically represents a movement of the mobile robot 10b in response to the human input 44 in Fig. 10a. Since the magnitudinal input 56 in Fig. 10a has a magnitude above the threshold value 58 and above the magnitude of the human input 44 in Fig. 9a, the control system 38 controls the traction arrangement 14 to drive the mobile robot 10b forward at a relatively high speed.
  • the human inputs 44 in Figs. 8a, 9a and 10a may alternatively be provided by different forces of a touch of the handle 16b, or by different gripping forces of the handle 16b.
  • Fig. 11 schematically represents a top view of the mobile robot 10b in Figs. 3a-3d and a human 60.
  • the control system 38 controls the mobile robot 10b in an automatic mode 62.
  • the mobile robot 10b operates autonomously.
  • the mobile robot 10b may be driven between various different locations at a worksite by means of the traction arrangement 14 and the manipulators 26 may perform various tasks.
  • Fig. 12 schematically represents a top view of the mobile robot 10b and the human 60.
  • the human 60 grips the handle 16b with one hand as a human input 44.
  • the human input 44 is detected by the handle 16b.
  • the handle 16b sends a human input signal 52 corresponding to the human input 44 to the control system 38.
  • the control system 38 stops the control of the mobile robot 10b in the automatic mode 62 and starts a control of the mobile robot 10b in a priority mode 64.
  • in the priority mode 64, the mobile robot 10b initially freezes, i.e. upon detecting the grip of the handle 16b, the mobile robot 10b is commanded to pause all its activities and to stop if it is moving.
  • the handle 16b, when gripped, enables the human 60 to take control over the mobile robot 10b.
  • the automatic mode 62 of the mobile robot 10b is overridden by the priority mode 64 when a human input 44 is detected by the handle 16b.
  • the priority mode 64 may comprise an initial time period during which the mobile robot 10b is frozen.
  • the mobile robot 10b may for example be frozen for 1 second after receiving the first human input 44.
  • the mobile robot 10b can be controlled in the priority mode 64 based on the human input 44 provided to the handle 16b.
  • the human 60 can thus easily and intuitively take control of the mobile robot 10b to move it in any desired direction.
  • Fig. 13 schematically represents a top view of the mobile robot 10b and the human 60 when the human 60 provides a further human input 44 to the mobile robot 10b.
  • the human 60 has in this example first performed a rotation of the mobile robot 10b (in the counterclockwise direction relative to Fig. 12), for example by means of a human input 44 corresponding to Fig. 6a.
  • the human 60 then provides a human input 44 corresponding to Fig. 4a such that the traction arrangement 14 is commanded to propel the mobile robot 10b in a forward direction.
  • the mobile robot 10b stops when the handle 16b is released, but remains in the priority mode 64.
  • the mobile robot 10b may be commanded to resume the automatic mode 62 by the human 60 providing a short tap on the handle 16b.
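  • a compact way to picture the mode switching of Figs. 11-13, including the initial freeze and the tap-to-resume behavior, is the state machine sketched below; the class and method names are assumptions for illustration:

```python
import time

FREEZE_SECONDS = 1.0  # initial freeze period after the first human input

class ModeController:
    """Sketch of the automatic/priority mode switching described above."""
    def __init__(self):
        self.mode = "automatic"
        self.entered_priority = 0.0

    def on_human_input(self, short_tap: bool = False):
        if self.mode == "automatic":
            self.mode = "priority"                     # human input overrides automatic mode
            self.entered_priority = time.monotonic()   # all movements freeze from here
        elif short_tap:
            self.mode = "automatic"                    # a short tap resumes automatic mode

    def movement_allowed(self) -> bool:
        """In the priority mode, movement resumes only after the freeze."""
        return (self.mode == "priority"
                and time.monotonic() - self.entered_priority >= FREEZE_SECONDS)
```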

Abstract

A method of controlling a mobile robot (10a, 10b), the method comprising providing a mobile robot (10a, 10b) having a base structure (12), a handle (16a, 16b) attached to the base structure (12) and a traction arrangement (14) for moving the mobile robot (10a, 10b) over a surface (18); controlling the mobile robot (10a, 10b) in an automatic mode (62); detecting a human input (44) to the handle (16a, 16b); and stopping the control of the mobile robot (10a, 10b) in the automatic mode (62) and controlling the mobile robot (10a, 10b) in a priority mode (64) upon detection of the human input (44). A control system (38) and a mobile robot (10a, 10b) are also provided.

Description

METHOD OF CONTROLLING MOBILE ROBOT, CONTROL SYSTEM AND MOBILE ROBOT
Technical Field
The present disclosure generally relates to control of a mobile robot. In particular, a method of controlling a mobile robot, a control system for controlling a mobile robot, and a mobile robot comprising such control system, are provided.
Background
Mobile robots are used in a wide range of applications. Mobile robots may be used to deliver material between different locations in various facilities, such as hospitals and factories. A mobile robot may additionally be provided with a manipulator for carrying out various repetitive tasks.
Most mobile robots are controlled through a digital interface, either integrated into the mobile robot or separated from the mobile robot. One example of a separated digital interface is a teach pendant unit (TPU). Most mobile robots are also provided with an emergency stop button, by means of which a human can bring the mobile robot to a full stop.
CN 107024934 A discloses a hospital service robot. The service robot is provided with a plurality of cabinets for containing different medicine materials. The service robot updates a planned walking route in real time according to a designated walking route, and in combination with data of an ultrasonic sensor, delivers the medicine materials to designated destinations.
Summary
In addition to a digital interface and an emergency stop button as mentioned above, mobile robots typically have no interaction points for humans. This makes it unclear if or how a human may physically interact with the mobile robot. Most mobile robots have a considerable weight and are not possible to manually move due to a lack of tangible grasping or maneuvering elements. The quality of the interaction possibilities between humans and mobile robots is thus often poor, which may lead to skepticism to introduction of mobile robots to workspaces like factories and hospitals.
One object of the present disclosure is to provide a method of controlling a mobile robot, which method improves safety.
A further object of the present disclosure is to provide a method of controlling a mobile robot, which method enables a simple and/or intuitive interaction between a human and the mobile robot.
A still further object of the present disclosure is to provide a method of controlling a mobile robot, which method enables a human to take over control of the mobile robot in a simple, clear, reliable and/or intuitive manner.
A still further object of the present disclosure is to provide a method of controlling a mobile robot, which solves several or all of the foregoing objects in combination.
A still further object of the present disclosure is to provide a control system for controlling a mobile robot, which control system solves one, several or all of the foregoing objects.
A still further object of the present disclosure is to provide a mobile robot solving one, several or all of the foregoing objects.
According to one aspect, there is provided a method of controlling a mobile robot, the method comprising providing a mobile robot having a base structure, a handle attached to the base structure and a traction arrangement for moving the mobile robot over a surface; controlling the mobile robot in an automatic mode; detecting a human input to the handle; and stopping the control of the mobile robot in the automatic mode and controlling the mobile robot in a priority mode upon detection of the human input. The traction arrangement may be automatically controlled in the automatic mode, e.g. autonomously or according to a predefined program. In the automatic mode, the entire mobile robot may operate autonomously. The mobile robot may perform various tasks and move between different locations in the automatic mode.
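As a minimal sketch of this control flow, the loop below runs the robot automatically until the handle reports a human input and then hands control to the priority mode. The interface names (robot.handle.poll and so on) are assumptions for illustration, since the disclosure does not prescribe any API:

```python
def control_loop(robot):
    """Run the claimed method: automatic mode until a human input to the
    handle is detected, then stop the automatic mode and run the priority mode."""
    while robot.powered_on():
        human_input = robot.handle.poll()   # returns None while the handle is untouched
        if human_input is None:
            robot.run_automatic_step()      # autonomous or programmed operation
        else:
            robot.stop_automatic_mode()     # the automatic mode is overridden
            robot.run_priority_mode(human_input)
```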
By means of the human input to the handle, the control of the mobile robot switches to the priority mode. The automatic mode is thereby overridden. In the priority mode, the traction arrangement may be stopped and/or may be controlled in response to the human input. According to one example, the traction arrangement is stopped by a first human input to the handle, and is controlled in response to any subsequent human input. That is, once the mobile robot has stopped, the mobile robot may be controlled in accordance with the human input.
The handle provides a clear, immediate and intuitive interaction point for humans in the vicinity of the mobile robot, apart from digital interfaces. In particular in non-industrial environments, humans working in the vicinity of the mobile robot may not have the experience or competence to program the mobile robot, and they should not need to. The interaction point provided by the handle is clear, immediate and intuitive also to humans having no experience of robotics. This improves safety in an environment where the mobile robot operates, and where humans work next to and/or in collaboration with the mobile robot.
The handle can be operated by means of a hand of a human to provide the human input. The semantics of the handle invite humans to grab the handle, while the rest of the mobile robot typically does not. By means of the human input to the handle, a human can control the mobile robot in the priority mode.
The human input to the handle is a clear indication to the mobile robot that a human takes control over the operations of the mobile robot. The human control of the mobile robot has higher priority than the automatic control of the mobile robot in the automatic mode, hence the term priority mode. The priority mode may alternatively be referred to as a guided mode or manual mode although in one variant, the mobile robot is merely stopped in the priority mode.
In case the mobile robot comprises a manipulator, the manipulator may be automatically controlled in the automatic mode, e.g. autonomously or according to a predefined program. In the priority mode, the manipulator may be stopped or may be controlled in response to the human input or another input (e.g. by means of one or more buttons on the handle). Thus, the method may comprise an initial freezing of all movements of the mobile robot in response to the human input.
The handle can be used by a human to bring the mobile robot to a stop on the surface. Alternatively, the handle can be used by a human to control a path of movement of the mobile robot. There might be an emergency that the mobile robot does not recognize or there may be an accident that can be avoided if control of the mobile robot can be quickly taken over by a human. The mobile robot may also be docked to a workstation under the control of a human via the handle. Although the handle can be used by humans with little to no experience of robotics, the handle can also be used by robot programmers for programming various tasks by the mobile robot, e.g. by means of lead-through programming. The programming may for example be an initial programming of the mobile robot during setup or for re-programming purposes.
The method enables a safe collaboration between a human and the mobile robot, and a safe control of the mobile robot. The method may be carried out in various service robotics applications. The mobile robot of the method may be of any type according to the present disclosure.
The method may further comprise providing feedback to the human by means of the mobile robot in response to the human input. The feedback may be provided by means of the handle, e.g. by means of a vibration of the handle. Alternatively, or in addition, the feedback may be a light feedback or an audible feedback.
The human input may comprise a directional input. In this case, the method may comprise initially stopping the mobile robot (e.g. both the traction arrangement and any manipulator thereof) in response to detecting a directional input in any direction. In order to detect a directional input, the handle may comprise one or more sensors for sensing a direction of the human input. Examples of such sensors are capacitive sensors, movement sensors and pressure sensors.
The directional input may comprise a movement of the handle. In this case, the handle may be movable relative to the base structure. For example, the handle can be moved forward to provide a forward input, be moved to the left or be rotated in a counterclockwise direction to provide a left input, be moved to the right or be rotated in a clockwise direction to provide a right input, and be moved rearward to provide a rearward input. According to one variant, the handle is movable in a plane substantially parallel with, or parallel with, the surface.
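A hedged sketch of how such directional inputs might be classified from a handle movement is given below; the axes, units and deadband are illustrative assumptions, not from the disclosure:

```python
from typing import Optional

def classify_direction(dx: float, dy: float, rotation: float,
                       deadband: float = 0.005) -> Optional[str]:
    """Classify a handle movement into the four directional inputs named
    above. dx > 0 is forward, dy > 0 is left, rotation > 0 is counterclockwise."""
    if dx > deadband:
        return "forward"
    if dx < -deadband:
        return "rearward"
    if dy > deadband or rotation > deadband:
        return "left"    # moved left or rotated counterclockwise
    if dy < -deadband or rotation < -deadband:
        return "right"   # moved right or rotated clockwise
    return None          # within the deadband: no directional input
```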
The human input may comprise a touch. In this case, the method may comprise initially stopping the mobile robot (e.g. both the traction arrangement and any manipulator thereof) in response to detecting a human input in the form of a touch. The mobile robot may then be controlled to move in the priority mode based on touch, e.g. where on the handle the touch is provided.
As one example of a handle configured to sense a touch, the handle may be configured to sense when it is gripped, e.g. by means of one or more capacitive sensors or pressure sensors. When the handle is gripped, the control of the mobile robot is automatically switched from the automatic mode to the priority mode. The human input may comprise a magnitudinal input. The handle may thus be configured to sense a magnitude of the human input. This can be implemented in various ways.
According to one example, the handle is displaceable and configured to issue a human input signal representative of a magnitude of the displacement. In this case, a first human input of a first magnitude can be provided by a first displacement of the handle, and a second human input of a second magnitude, larger than the first magnitude, can be provided by a second displacement of the handle, larger than the first displacement. The first displacement and the second displacement thereby constitute examples of magnitudinal inputs.
According to a further example, the handle is configured to sense a gripping force and to issue a human input signal representative of a magnitude of the gripping force. In this case, a first human input of a first magnitude can be provided by a first gripping force, and a second human input of a second magnitude, larger than the first magnitude, can be provided by a second gripping force, larger than the first gripping force. The first gripping force and the second gripping force thereby constitute further examples of magnitudinal inputs.
According to a further example, the handle is configured to sense a touching force and to issue a human input signal representative of a magnitude of the touching force. In this case, a first human input of a first magnitude can be provided by a first touching force, and a second human input of a second magnitude, larger than the first magnitude, can be provided by a second touching force, larger than the first touching force. The first touching force and the second touching force thereby constitute further examples of magnitudinal inputs.
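The three variants may be pictured as feeding one common magnitude scale. The sketch below is an illustrative normalization under assumed full-scale constants; the function name and values are not from the disclosure:

```python
def magnitudinal_input(displacement: float = 0.0, grip_force: float = 0.0,
                       touch_force: float = 0.0) -> float:
    """Normalize the three magnitudinal input variants (handle displacement,
    gripping force, touching force) onto a common 0..1 command scale."""
    FULL_DISPLACEMENT = 0.02   # handle travel at full command [m] (assumption)
    FULL_FORCE = 20.0          # gripping/touching force at full command [N] (assumption)
    raw = max(displacement / FULL_DISPLACEMENT,
              grip_force / FULL_FORCE,
              touch_force / FULL_FORCE)
    return min(raw, 1.0)       # clamp so harder inputs saturate rather than overflow
```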
The priority mode may comprise controlling the traction arrangement based on the human input. The mobile robot can thereby be easily moved by a human grabbing the handle. The mobile robot may comprise one or more navigation sensors for assisting in movement of the mobile robot by means of the traction arrangement. Prior to controlling the traction arrangement to move the mobile robot based on the human input, the mobile robot may check whether the desired movement is possible, e.g. by means of its one or more navigation sensors. The mobile robot may for example be programmed to avoid collisions with objects.
The priority mode may comprise stopping the traction arrangement when the magnitudinal input is below a threshold value. For example, for magnitudinal inputs below the threshold value, the control of the mobile robot in the automatic mode is interrupted and the mobile robot is controlled in the priority mode by stopping the traction arrangement, and for magnitudinal inputs above the threshold value, the control of the mobile robot in the automatic mode is interrupted and the mobile robot is controlled in the priority mode by controlling the traction arrangement to move the mobile robot over the surface in response to the magnitudinal input. For a relatively small magnitudinal input above the threshold value, the traction arrangement may be controlled to move the mobile robot at a relatively low speed, and for a relatively large magnitudinal input, larger than the relatively small magnitudinal input, the traction arrangement may be controlled to move the mobile robot at a relatively high speed, faster than the relatively low speed.
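This threshold behavior can be sketched as follows; the threshold, speed values and function name are illustrative assumptions:

```python
def priority_speed(magnitude: float, threshold: float = 0.1,
                   max_speed: float = 0.5) -> float:
    """Below the threshold the traction arrangement is stopped; above it,
    a larger magnitudinal input gives a faster movement."""
    if magnitude < threshold:
        return 0.0                    # priority mode: stop the traction arrangement
    return max_speed * magnitude      # priority mode: speed grows with the input

assert priority_speed(0.05) == 0.0                 # small input: robot only stops
assert priority_speed(0.3) < priority_speed(0.9)   # larger input, faster movement
```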
According to a further aspect, there is provided a control system for controlling a mobile robot having a base structure, a handle attached to the base structure and a traction arrangement for moving the mobile robot over a surface, the control system comprising at least one data processing device and at least one memory having a computer program stored thereon, the computer program comprising program code which, when executed by the at least one data processing device, causes the at least one data processing device to perform the steps of controlling the mobile robot in an automatic mode; receiving a human input signal indicative of a detection of a human input to the handle; and stopping the control of the mobile robot in the automatic mode and controlling the mobile robot in a priority mode in response to receiving the human input signal.
According to a further aspect, there is provided a mobile robot comprising the base structure, the handle, the traction arrangement and the control system according to the present disclosure. The mobile robot may be of any type according to the present disclosure, in particular according to the first aspect.
The traction arrangement may be configured to move the mobile robot over the surface. To this end, the traction arrangement may comprise one or more wheels. At least one of the wheels may be a traction wheel. One or more of the wheels may be steerable.
The handle may be configured to issue the human input signal in response to being subjected to the human input. The handle may be oriented substantially horizontally, or horizontally, when the mobile robot is supported on a horizontal surface.
The mobile robot may further comprise a manipulator movable relative to the base structure. The mobile robot may comprise one or several manipulators. Each manipulator may be movable in three or more axes. The manipulator may be movable relative to the base structure. Each manipulator may be provided with an end effector, such as a gripper.
The mobile robot may be a truly collaborative mobile robot, i.e. a collaborative mobile robot that is constructed to not be capable of injuring humans. A truly collaborative mobile robot differs from an originally non-collaborative mobile robot that is retrofitted with sensors to be made collaborative.
The handle may optionally be provided with functionality for controlling the manipulator. Such functionality may for example be implemented by means of one or more buttons, a knob and/or a joystick on the handle. The base structure may comprise a platform and a pillar positioned on the platform. In this case, the platform and the traction arrangement may form an automated guided vehicle, AGV, and the pillar and the one or more manipulators may form a collaborative robot.
The manipulator may be movable relative to the handle.
The base structure may be positioned between the manipulator and the handle. Thus, the manipulator may be provided on a front side of the base structure and the handle may be provided on a rear side of the base structure, opposite to the front side. In this way, humans outside the working area (where the manipulator works) are provided with a clear interaction point.
The handle may be movable relative to the base structure.
The handle may be positioned at a height of 0.5 m to 1.5 m. When the mobile robot drives over a surface, such as a floor, the handle is thus positioned at a height of 0.5 m to 1.5 m above the floor.
Brief Description of the Drawings
Further details, advantages and aspects of the present disclosure will become apparent from the following description taken in conjunction with the drawings, wherein:
Fig. 1a: schematically represents a rear view of a mobile robot comprising a handle;
Fig. 1b: schematically represents a side view of the mobile robot;
Fig. 1c: schematically represents a top view of the mobile robot;
Fig. 1d: schematically represents a perspective view of the mobile robot;
Fig. 2: schematically represents a block diagram of the mobile robot;
Fig. 3a: schematically represents a rear view of a mobile robot comprising a further example of a handle;
Fig. 3b: schematically represents a side view of the mobile robot in Fig. 3a;
Fig. 3c: schematically represents a top view of the mobile robot in Figs. 3a and 3b;
Fig. 3d: schematically represents a perspective view of the mobile robot in Figs. 3a-3c;
Fig. 4a: schematically represents one example of a human input to the handle of the mobile robot in Figs. 3a-3d;
Fig. 4b: schematically represents a movement of the mobile robot in response to the human input in Fig. 4a;
Fig. 5a: schematically represents a further example of a human input to the handle of the mobile robot in Figs. 3a-3d;
Fig. 5b: schematically represents a movement of the mobile robot in response to the human input in Fig. 5a;
Fig. 6a: schematically represents a further example of a human input to the handle of the mobile robot in Figs. 3a-3d;
Fig. 6b: schematically represents a movement of the mobile robot in response to the human input in Fig. 6a;
Fig. 7a: schematically represents a further example of a human input to the handle of the mobile robot in Figs. 3a-3d;
Fig. 7b: schematically represents a movement of the mobile robot in response to the human input in Fig. 7a;
Fig. 8a: schematically represents a further example of a human input to the handle of the mobile robot in Figs. 3a-3d;
Fig. 8b: schematically represents a behavior of the mobile robot in response to the human input in Fig. 8a;
Fig. 9a: schematically represents a further example of a human input to the handle of the mobile robot in Figs. 3a-3d;
Fig. 9b: schematically represents a movement of the mobile robot in response to the human input in Fig. 9a;
Fig. 10a: schematically represents a further example of a human input to the handle of the mobile robot in Figs. 3a-3d;
Fig. 10b: schematically represents a movement of the mobile robot in response to the human input in Fig. 10a;
Fig. 11: schematically represents a top view of the mobile robot in Figs. 3a-3d and a human;
Fig. 12: schematically represents a top view of the mobile robot and the human when the human provides a human input to the mobile robot; and
Fig. 13: schematically represents a top view of the mobile robot and the human when the human provides a further human input to the mobile robot.
Detailed Description
In the following, a method of controlling a mobile robot, a control system for controlling a mobile robot, and a mobile robot comprising such control system, will be described. The same or similar reference numerals will be used to denote the same or similar structural features.
Fig. 1a schematically represents a rear view of a mobile robot 10a, Fig. 1b schematically represents a side view of the mobile robot 10a, Fig. 1c schematically represents a top view of the mobile robot 10a, and Fig. 1d schematically represents a perspective view of the mobile robot 10a. With collective reference to Figs. 1a-1d, the mobile robot 10a comprises a base structure 12, a traction arrangement 14 and a handle 16a.
The traction arrangement 14 is configured to drive the mobile robot 10a on a horizontal floor 18. The floor 18 is one example of a surface according to the present disclosure. The traction arrangement 14 may be omnidirectional. In this example, the traction arrangement 14 comprises four wheels 20, where each wheel 20 is a steerable traction wheel.
The base structure 12 of this example comprises a platform 22 and a pillar 24 positioned on the platform 22. The pillar 24 is rigidly supported on the platform 22. The pillar 24 comprises a vertical section erected from the platform 22 and an inclined section extending at an angle from the vertical section.
The mobile robot 10a of this example further comprises two manipulators 26. Each manipulator 26 comprises a gripper 28. The grippers 28 are examples of end effectors according to the present disclosure. By means of the grippers 28, the mobile robot 10a can perform various tasks, such as picking, placing and various handling of items. Although only schematically illustrated, each manipulator 26 of this example has six degrees of freedom. Each gripper 28 can thereby move along, and rotate about, three orthogonal axes relative to the base structure 12. Each manipulator 26 is also movable relative to the handle 16a.
The handle 16a serves as a physical interaction point for humans in the vicinity of the mobile robot 10a. The handle 16a of this example is connected to the pillar 24. The handle 16a is positioned at a height of approximately 1 m above the floor 18. The handle 16a in Figs. 1a-1d is U-shaped and lies in a horizontal plane. A right end of the handle 16a is connected to a right side of the pillar 24 and a left end of the handle 16a is connected to a left side of the pillar 24. The handle 16a of this example thereby partly surrounds the pillar 24. The handle 16a is generally positioned on a rear side of the pillar 24 opposite to a forward working side of the pillar 24 for the manipulators 26. The handle 16a of this example is rigid and slightly movable relative to the pillar 24 from a neutral position. The handle 16a shown in Figs. 1a-1d is one of numerous possible handles according to the present disclosure.
The mobile robot 10a of this example further comprises a rotary knob 30 and three buttons 32 provided on the handle 16a, and two buttons 34 provided directly on the vertical section of the pillar 24. The buttons 32 and 34 and the rotary knob 30 can be manipulated to command performance of common tasks for the mobile robot 10a, such as various key functions, or to provide shortcuts. The three buttons 32 may for example be used to command the mobile robot 10a to perform different tasks, such as filling a box with items, moving the box from a first location to a second location, and collecting items to be filled in a box. By means of the buttons 32 and 34 and the rotary knob 30, the mobile robot 10a becomes easier to control for humans not trained in operating the mobile robot 10a through a teach pendant unit (not shown). Using the handle 16a, the buttons 32 and 34 and the rotary knob 30, an operator can also program the mobile robot 10a to carry out various tasks, e.g. define a preferred movement route, select an operation mode, perform lead-through programming, perform adjustments to its current task, and perform basic troubleshooting.
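A hypothetical sketch of how the three buttons 32 could be mapped to such tasks is given below; the task names follow the examples above, while the dictionary and the robot.start_task interface are assumptions for illustration only:

# Hypothetical mapping from the three buttons 32 to commanded tasks.
BUTTON_TASKS = {
    1: "fill_box_with_items",
    2: "move_box_between_locations",
    3: "collect_items_for_box",
}

def on_button_pressed(button_id: int, robot) -> None:
    task = BUTTON_TASKS.get(button_id)
    if task is not None:
        robot.start_task(task)  # robot.start_task is an assumed interface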
The mobile robot 10a further comprises a display 36. The display 36 is here positioned on the inclined section of the pillar 24. Various information related to the mobile robot 10a can be displayed on the display 36.
The mobile robot 10a of this example is a truly collaborative mobile robot. This means that the mobile robot 10a is constructed to not be capable of injuring humans. To this end, the manipulators 26 may have a total mass of 100 kg or less. Alternatively, or in addition, the manipulators 26 may be driven at a power less than 80 W. The traction arrangement 14 is dimensioned for relatively slow speeds and relatively weak propulsion. A truly collaborative robot differs from an originally non-collaborative industrial robot that is retrofitted with sensors to be made collaborative.
The mobile robot 10a further comprises one or more navigation sensors (not shown) for sensing its surroundings. Examples of such navigation sensors include a LIDAR (light detection and ranging) and a camera. By means of the navigation sensors, the mobile robot 10a can operate autonomously.
The handle 16a of this example enables several functions. The primary function is to enable humans in the vicinity of the mobile robot 10a to easily, intuitively and quickly freeze movements of the mobile robot 10a. This is valuable in various emergency situations. One example of such an emergency situation is when the mobile robot 10a is about to enter an area it should not enter. Another example of such an emergency situation is when some object has been dropped on the floor 18, and the mobile robot 10a should be prevented from driving over the object, which it has not detected. The object may for example be spilled oil.
The second function provided by the handle 16a is to enable humans to easily and intuitively take control of the mobile robot 10a and to move it from a first location to a second location. The handle 16a provides a natural focal point for interaction between a human and the mobile robot 10a.

Fig. 2 schematically represents a block diagram of the mobile robot 10a. As shown in Fig. 2, the mobile robot 10a further comprises a control system 38, here provided in the base structure 12. The control system 38 of this example comprises a data processing device 40 and a memory 42. The memory 42 comprises a computer program stored thereon. The computer program comprises program code which, when executed by the data processing device 40, causes the data processing device 40 to perform, or command performance of, various steps as described herein.
The handle 16a can receive and sense a human input 44. The human input 44 may for example be a push of the handle 16a, a pull of the handle 16a, a rotation of the handle 16a (e.g. in the horizontal plane), various touches on the handle 16a, and/or a grip of the handle 16a by one or more hands of a human. As shown in Fig. 2, the handle 16a may comprise position sensors 46 for sensing movements of the handle 16a relative to the pillar 24, such as a push, a pull and a rotation of the handle 16a, and magnitudes of such movements.
Alternatively, or in addition, the handle 16a may comprise capacitive sensors 48 for sensing touches, positions of such touches on the handle 16a, and magnitudes of such touches on the handle 16a. The capacitive sensors 48 are configured to detect when a capacitive object, such as a human hand, is brought into contact with the handle 16a. In case only capacitive sensors 48 are provided in the handle 16a, the handle 16a does not have to be movable relative to the base structure 12.
Alternatively, or in addition, the handle 16a may comprise pressure sensors 50 for sensing grips, positions of such grips on the handle 16a, and magnitudes of such grips on the handle 16a.
In response to readings from one or more of the position sensors 46, the capacitive sensors 48 and the pressure sensors 50, the handle 16a issues a human input signal 52 to the control system 38. The control system 38 is configured to interpret the human input signal 52, e.g. to determine a magnitude and a direction of the human input 44, and to control the traction arrangement 14 based on the human input signal 52. Also the manipulators 26 may be controlled by the control system 38 based on the human input signal 52. Thus, the control system 38 is configured to interpret the human input signal 52 and to translate the human input signal 52 to movement instructions for the mobile robot 10a, e.g. movement instructions for the traction arrangement 14 and for the manipulators 26.
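One way, among many, to picture this translation is the following Python sketch, which assumes that the handle delivers separate left and right force readings; the names, the sign conventions and the proportional gains are assumptions for illustration and not part of the disclosure:

# Illustrative sketch: derive a direction and magnitude from assumed
# left/right handle force readings and translate them into traction commands.
def interpret_human_input(force_left: float, force_right: float):
    """Positive forces correspond to a push forward, negative to a pull rearward."""
    forward = (force_left + force_right) / 2.0  # common-mode part -> translation
    turn = (force_right - force_left) / 2.0     # differential part -> rotation
    magnitude = max(abs(force_left), abs(force_right))
    return forward, turn, magnitude

def traction_command(force_left: float, force_right: float,
                     gain_v: float = 0.01, gain_w: float = 0.02):
    forward, turn, _ = interpret_human_input(force_left, force_right)
    linear_speed = gain_v * forward   # m/s; proportional control is assumed
    angular_speed = gain_w * turn     # rad/s; positive = counterclockwise
    return linear_speed, angular_speed

Under these assumptions, a push on both sides yields pure forward motion, a push on one side only yields forward motion combined with a turn, and a push on one side combined with a pull on the other yields rotation on the spot, corresponding to the examples described below with reference to Figs. 4a-7b.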
The physical interaction between the mobile robot 10a and the handle 16a in combination with the software implementation adds unique interaction possibilities that are not present in mobile robots of the prior art. If a movement of the traction arrangement 14 as commanded by the human input 44 is possible, e.g. if the sensors of the mobile robot 10a do not recognize any obstacles in a desired path, the control system 38 commands the traction arrangement 14 to execute the movement.
Fig. 3a schematically represents a rear view of a mobile robot 10b, Fig. 3b schematically represents a side view of the mobile robot 10b in Fig. 3a, Fig. 3c schematically represents a top view of the mobile robot 10b in Figs. 3a and 3b, and Fig. 3d schematically represents a perspective view of the mobile robot 10b in Figs. 3a-3c. With collective reference to Figs. 3a-3d, the mobile robot 10b comprises a further example of a handle 16b.
The handle 16b differs from the handle 16a in that the handle 16b has a somewhat simpler design. As shown in Figs. 3a-3d, the handle 16b does not comprise any buttons or rotary knob. The handle 16b is however configured to sense movements, touches and grips by a human in the same way as the handle 16a. Also the handle 16b lies in a horizontal plane.
Fig. 4a schematically represents one example of a human input 44 to the handle 16b of the mobile robot 10b in Figs. 3a-3d. The human input 44 in Fig. 4a comprises a directional input 54, namely a forward push by both hands of a human. Since one hand pushes on the left side of the handle 16b and one hand pushes on the right side of the handle 16b, the handle 16b moves straight forward from a neutral position relative to the base structure 12. When the human releases the handle 16b, the handle 16b returns to its neutral position, e.g. by means of one or more springs (not shown).
Instead of the forward push by both hands, a human input 44 corresponding to Fig. 4a may comprise a touch to each of a rear left side and a rear right side of the handle 16b. A rear side of the handle 16b faces away from the base structure 12 and a front side of the handle 16b faces towards the base structure 12. As a further alternative, a human input 44 corresponding to Fig. 4a can be provided by gripping a left side of the handle 16b and a right side of the handle 16b with equal gripping forces.
Fig. 4b schematically represents a movement of the mobile robot 10b in response to the human input 44 in Fig. 4a. As shown in Fig. 4b, the mobile robot 10b travels straight forward in response to the directional input 54 in Fig. 4a. Thus, the wheels 20 are controlled by the control system 38 to move the mobile robot 10b in a direction corresponding to the human input 44. The mobile robot 10b can thereby be moved easily over the floor 18 by the user.
The human input signal 52 also contains information regarding a magnitude of the force currently exerted on the handle 16b. The control system 38 also controls a speed of the traction arrangement 14 based on the magnitude of the force.
Fig. 5a schematically represents a further example of a human input 44 to the handle 16b of the mobile robot 10b in Figs. 3a-3d. The human input 44 in Fig. 5a comprises a further example of a directional input 54, namely a forward push on only a left side of the handle 16b. The handle 16b is thereby pushed forward and rotated clockwise from the neutral position relative to the base structure 12.
Instead of the forward push on only the left side of the handle 16b, a human input 44 corresponding to Fig. 5a may comprise a touch to a rear left side of the handle 16b. As a further alternative, a human input 44 corresponding to Fig. 5a can be provided by gripping a left side of the handle 16b, or by gripping the left side of the handle 16b more than a right side of the handle 16b.
Fig. 5b schematically represents a movement of the mobile robot 10b in response to the human input 44 in Fig. 5a. As shown in Fig. 5b, the mobile robot 10b travels forward and turns right in response to the directional input 54 in Fig. 5a.
Fig. 6a schematically represents a further example of a human input 44 to the handle 16b of the mobile robot 10b in Figs. 3a-3d. The human input 44 in Fig. 6a comprises a further example of a directional input 54, namely a forward push on the right side of the handle 16b and a rearward pull on the left side of the handle 16b. The handle 16b is thereby rotated counterclockwise from the neutral position relative to the base structure 12.

Instead of the forward push on the right side of the handle 16b and the rearward pull on the left side of the handle 16b, a human input 44 corresponding to Fig. 6a may comprise a touch to a rear right side and a front left side of the handle 16b.
Fig. 6b schematically represents a movement of the mobile robot 10b in response to the human input 44 in Fig. 6a. As shown in Fig. 6b, the mobile robot 10b rotates on the spot in a counterclockwise direction in response to the directional input 54 in Fig. 6a.
Fig. 7a schematically represents a further example of a human input 44 to the handle 16b of the mobile robot 10b in Figs. 3a-3d. The human input 44 in Fig. 7a comprises a further example of a directional input 54, namely a rearward pull by both hands of a human. Since one hand pulls on the left side of the handle 16b and one hand pulls on the right side of the handle 16b, the handle 16b moves straight rearward from the neutral position relative to the base structure 12.

Instead of the rearward pull of the handle 16b, a human input 44 corresponding to Fig. 7a may comprise a touch to a front right side and a front left side of the handle 16b.
Fig. 7b schematically represents a movement of the mobile robot 10b in response to the human input 44 in Fig. 7a. As shown in Fig. 7b, the mobile robot 10b travels straight rearward in response to the human input 44 in Fig. 7a.
Fig. 8a schematically represents a further example of a human input 44 to the handle 16b of the mobile robot 10b in Figs. 3a-3d. The human input 44 in Fig. 8a is a directional input 54 similar to Fig. 4a in that the handle 16b is pushed forward by both hands of a human such that the handle 16b moves straight forward a relatively small distance from the neutral position relative to the base structure 12. However, the human input 44 in Fig. 8a also comprises a magnitudinal input 56. As illustrated in Fig. 8a, the handle 16b is only pushed forward with a small force below a threshold value 58.
Fig. 8b schematically represents a behavior of the mobile robot 10b in response to the human input 44 in Fig. 8a. In response to the magnitudinal input 56 being below the threshold value 58, the control system 38 only commands the mobile robot 10b to stop. The mobile robot 10b can thereby be stopped quickly by the handle 16b.
Fig. 9a schematically represents a further example of a human input 44 to the handle 16b of the mobile robot 10b in Figs. 3a-3d. The human input 44 in Fig. 9a is a directional input 54 and a magnitudinal input 56 similar to Fig. 8a in that the handle 16b is pushed forward by both hands of a human such that the handle 16b moves straight forward from the neutral position relative to the base structure 12. However, in Fig. 9a, the handle 16b is pushed forward with a force above the threshold value 58.
Fig. 9b schematically represents a movement of the mobile robot 10b in response to the human input 44 in Fig. 9a. Since the magnitudinal input 56 in Fig. 9a is above the threshold value 58, the control system 38 controls the traction arrangement 14 to drive the mobile robot 10b forward with a speed corresponding to the magnitudinal input 56, here a relatively low speed.
Fig. 10a schematically represents a further example of a human input 44 to the handle 16b of the mobile robot 10b in Figs. 3a-3d. The human input 44 in Fig. 10a is a directional input 54 and a magnitudinal input 56 similar to Fig. 9a in that the handle 16b is pushed forward by both hands of a human such that the handle 16b moves straight forward a relatively large distance from the neutral position relative to the base structure 12. In Fig. 10a, the handle 16b is pushed forward with a larger force than in Fig. 9a.
Fig. 10b schematically represents a movement of the mobile robot 10b in response to the human input 44 in Fig. 10a. Since the magnitudinal input 56 in Fig. 10a has a magnitude above the threshold value 58 and above the magnitude of the human input 44 in Fig. 9a, the control system 38 controls the traction arrangement 14 to drive the mobile robot 10b forward at a relatively high speed.

The human inputs 44 in Figs. 8a, 9a and 10a may alternatively be provided by different forces of a touch of the handle 16b, or by different gripping forces of the handle 16b.
Fig. 11 schematically represents a top view of the mobile robot 10b in Figs. 3a- 3d and a human 60. In Fig. 11, the control system 38 controls the mobile robot 10b in an automatic mode 62. In the automatic mode 62, the mobile robot 10b operates autonomously. During the autonomous operation, the mobile robot 10b may be driven between various different locations at a worksite by means of the traction arrangement 14 and the manipulators 26 may perform various tasks.
Fig. 12 schematically represents a top view of the mobile robot 10b and the human 60. In some situations, there is a need for the human 60 to easily and intuitively take control of the actions of the mobile robot 10b. As shown in Fig. 12, the human 60 grips the handle 16b with one hand as a human input 44. The human input 44 is detected by the handle 16b. The handle 16b sends a human input signal 52 corresponding to the human input 44 to the control system 38. Upon receiving the human input signal 52, the control system 38 stops the control of the mobile robot 10b in the automatic mode 62 and starts a control of the mobile robot 10b in a priority mode 64. In the priority mode 64, the mobile robot 10b initially freezes, i.e. stops any movements of the traction arrangement 14 and the manipulators 26. Thus, upon detecting the grip of the handle 16b, the mobile robot 10b is commanded to pause all its activities and to stop if it is moving. The handle 16b, when gripped, enables the human 60 to take control over the mobile robot 10b. The automatic mode 62 of the mobile robot 10b is overridden by the priority mode 64 when a human input 44 is detected by the handle 16b.
The priority mode 64 may comprise an initial time period during which the mobile robot 10b is frozen. The mobile robot 10b may for example be frozen for 1 second after receiving the first human input 44. When the initial time period has lapsed, the mobile robot 10b can be controlled in the priority mode 64 based on the human input 44 provided to the handle 16b. The human 60 can thus easily and intuitively take control of the mobile robot 10b to move it in any desired direction.
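By way of illustration only, the freeze behavior and the initial time period could be sketched as follows in Python; the 1-second value comes from the example above, while the class and the robot interface are assumptions introduced here:

import time

# Sketch of the priority mode 64 with an initial freeze period; hypothetical names.
class PriorityMode:
    FREEZE_PERIOD_S = 1.0  # example value from the description

    def __init__(self):
        self.entered_at = time.monotonic()

    def frozen(self) -> bool:
        # During the initial time period, all movements stay stopped.
        return time.monotonic() - self.entered_at < self.FREEZE_PERIOD_S

    def update(self, robot, human_input):
        if self.frozen() or human_input is None:
            robot.stop_all_motion()           # assumed interface
        else:
            robot.follow_handle(human_input)  # assumed interface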
Fig. 13 schematically represents a top view of the mobile robot 10b and the human 60 when the human 60 provides a further human input 44 to the mobile robot 10b. The human 60 has in this example first performed a rotation of the mobile robot 10b (in the counterclockwise direction relative to Fig. 12), for example by means of a human input 44 corresponding to Fig. 6a. In Fig. 13, the human 60 provides a human input 44 corresponding to Fig. 4a such that the traction arrangement 14 is commanded to propel the mobile robot 10b in a forward direction.
According to one example, the mobile robot 10b stops when the handle 16b is released, i.e. remains in the priority mode 64. The mobile robot 10b may be commanded to resume the automatic mode 62 by providing a short tap on the handle 16b by the hand of the human 60.
While the present disclosure has been described with reference to exemplary embodiments, it will be appreciated that the present invention is not limited to what has been described above. For example, it will be appreciated that the dimensions of the parts may be varied as needed. Accordingly, it is intended that the present invention may be limited only by the scope of the claims appended hereto.

Claims

1. A method of controlling a mobile robot (10a, 10b), the method comprising:
- providing a mobile robot (10a, 10b) having a base structure (12), a handle (16a, 16b) attached to the base structure (12) and a traction arrangement (14) for moving the mobile robot (10a, 10b) over a surface (18);
- controlling the mobile robot (10a, 10b) in an automatic mode (62);
- detecting a human input (44) to the handle (16a, 16b); and
- stopping the control of the mobile robot (10a, 10b) in the automatic mode (62) and controlling the mobile robot (10a, 10b) in a priority mode (64) upon detection of the human input (44).
2. The method according to claim 1, wherein the human input (44) comprises a directional input (54).
3. The method according to claim 2, wherein the directional input (54) comprises a movement of the handle (16a, 16b).
4. The method according to any of the preceding claims, wherein the human input (44) comprises a touch.
5. The method according to any of the preceding claims, wherein the human input (44) comprises a magnitudinal input (56).
6. The method according to any of the preceding claims, wherein the priority mode (64) comprises controlling the traction arrangement (14) based on the human input (44).
7. The method according to claims 5 and 6, wherein the priority mode (64) comprises stopping the traction arrangement (14) when the magnitudinal input (56) is below a threshold value (58).
8. A control system (38) for controlling a mobile robot (10a, 10b) having a base structure (12), a handle (16a, 16b) attached to the base structure (12) and a traction arrangement (14) for moving the mobile robot (10a, 10b) over a surface (18), the control system (38) comprising at least one data processing device (40) and at least one memory (42) having a computer program stored thereon, the computer program comprising program code which, when executed by the at least one data processing device (40), causes the at least one data processing device (40) to perform the steps of:
- controlling the mobile robot (10a, 10b) in an automatic mode (62);
- receiving a human input signal (52) indicative of a detection of a human input (44) to the handle (16a, 16b); and
- stopping the control of the mobile robot (10a, 10b) in the automatic mode (62) and controlling the mobile robot (10a, 10b) in a priority mode (64) in response to receiving the human input signal (52).
9. A mobile robot (10a, 10b) comprising the base structure (12), the handle (16a, 16b), the traction arrangement (14) and the control system (38) according to claim 8.
10. The mobile robot (10a, 10b) according to claim 9, wherein the mobile robot (10a, 10b) further comprises a manipulator (26) movable relative to the base structure (12).
11. The mobile robot (10a, 10b) according to claim 10, wherein the manipulator (26) is movable relative to the handle (16a, 16b).
12. The mobile robot (10a, 10b) according to claim 10 or 11, wherein the base structure (12) is positioned between the manipulator (26) and the handle (16a, 16b).
13. The mobile robot (10a, 10b) according to any of claims 9 to 12, wherein the handle (16a, 16b) is movable relative to the base structure (12).
14. The mobile robot (10a, 10b) according to any of claims 9 to 13, wherein the handle (16a, 16b) is positioned at a height of 0.5 m to 1.5 m.