WO2009063318A1 - Mobile robot and mobile robot danger zone indicating method - Google Patents
- Publication number
- WO2009063318A1 (PCT/IB2008/003401)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- mobile robot
- danger zone
- movement
- indicating
- abnormality
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1615—Programme controls characterised by special kind of manipulator, e.g. planar, scara, gantry, cantilever, space, closed chain, passive/active joints and tendon driven manipulators
- B25J9/162—Mobile manipulator, movable base with manipulator arm mounted on it
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1674—Programme controls characterised by safety, monitoring, diagnostic
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F16—ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
- F16P—SAFETY DEVICES IN GENERAL; SAFETY DEVICES FOR PRESSES
- F16P3/00—Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body
- F16P3/12—Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine
- F16P3/14—Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine the means being photocells or other devices sensitive without mechanical contact
- F16P3/142—Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine the means being photocells or other devices sensitive without mechanical contact using image capturing devices
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40202—Human robot coexistence
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40203—Detect position of operator, create non material barrier to protect operator
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0272—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising means for registering the travel distance, e.g. revolutions of wheels
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
Definitions
- a first aspect of the invention relates to a mobile robot that includes movement controlling means for controlling movement of the mobile robot, and indicating means for visually indicating a danger zone that is created around the mobile robot as the mobile robot moves, while changing the shape of the danger zone according to a change in movement of the mobile robot based on control by the movement controlling means.
- a second aspect of the invention relates to a danger zone indicating method for visually indicating a danger zone of a mobile robot.
- This method includes visually indicating a danger zone that is created around the mobile robot as the mobile robot moves, while changing the shape of the danger zone according to a change in movement of the mobile robot.
- the mobile robot according to the first aspect and the method according to the second aspect enable a person to be made visually aware of a danger zone that changes as the movement of the mobile robot changes, thereby minimizing unnecessary interference between the mobile robot and a person.
- the indicating means may extend the indicated shape of the danger zone out in the direction of movement of the mobile robot as the moving velocity of the mobile robot increases.
- when a person is in front of the mobile robot (i.e., in the direction of movement of the mobile robot), the danger zone in the direction of movement can be extended out as the moving velocity of the mobile robot increases. Therefore, a person can be effectively made aware of a change in the danger zone that follows a change in the moving velocity of the mobile robot.
- the indicating means may change the indicated shape of the danger zone to include the position of the movable portion according to displacement of the movable portion.
- the movable portion may be, for example, a multiple-jointed arm in which a plurality of links are connected together.
- when the mobile robot moves the movable portion, e.g., extends the arm, that movable portion may interfere with a person in the area.
- the indicated range of the danger zone can be changed according to that movement, i.e., the displacement of the movable portion. Therefore, a person can be effectively made aware of a change in the danger zone that follows a change in the movement of the movable arm of the mobile robot.
- the indicating means may have a projector that projects the danger zone on a travel surface where the mobile robot is located.
- the mobile robot may also include abnormality detecting means for detecting an abnormality in the indication of the danger zone by the projector, and the movement controlling means may execute a safety operation in the mobile robot when the abnormality detecting means detects the abnormality.
- the safety operation may be, for example, stopping the mobile robot or sounding an alarm or the like.
- the abnormality detecting means may also include a camera that captures an image of the danger zone projected by the projector, and an abnormality detecting portion that detects the abnormality based on the image captured by the camera.
- a mobile robot that makes a person visually aware of a danger zone that changes as the movement of the mobile robot changes, as well as a method for indicating a danger zone of a mobile robot, are able to be provided.
- FIG 1 is a block diagram of a control system of a mobile robot according to a first example embodiment of the invention
- FIGS. 2A and 2B are external views of the mobile robot according to the first example embodiment of the invention.
- FIG 3 is a flowchart illustrating a control routine for indicating a danger zone around the mobile robot according to the first example embodiment of the invention
- FIG 4 is another flowchart illustrating another control routine for indicating the danger zone around the mobile robot according to the first example embodiment of the invention
- FIG 5A is a view illustrating the control to indicate the danger zone around the mobile robot according to the first example embodiment of the invention
- FIG 5B is another view illustrating the control to indicate the danger zone around the mobile robot according to the first example embodiment of the invention
- FIG 5C is yet another view illustrating the control to indicate the danger zone around the mobile robot according to the first example embodiment of the invention
- FIG 5D is still another view illustrating the control to indicate the danger zone around the mobile robot according to the first example embodiment of the invention.
- FIG 6 is a block diagram of a control system of a mobile robot according to a second example embodiment of the invention.
- FIG 7 is a schematic diagram illustrating an operation to detect an abnormality in the indication of the danger zone of the mobile robot according to the second example embodiment of the invention.
- FIG 8 is a flowchart illustrating a routine that is executed when it is detected that the danger zone of the mobile robot is not being indicated properly according to the second example embodiment of the invention
- a mobile robot 1 is a mobile robot which has multi-jointed arms 104, each of which is made of a plurality of links connected together, and which travels on a surface using wheels 103.
- the mobile robot 1 has a plurality of arms but it may also have only one arm. Further, the mobile robot 1 runs on wheels but it may also walk on legs.
- FIG 1 is a block diagram of the structure of the control system of the mobile robot 1 that relates to danger zone indication.
- a movement planning portion 100 determines the movement of the mobile robot 1 based on, for example, environmental map information stored beforehand, movement path information that has been determined beforehand, and external information obtained from visual sensors, not shown. More specifically, the movement planning portion 100 generates a movement path, target moving velocity, and target acceleration of the mobile robot 1, as well as target angular trajectories and the like of the joints of the arms 104, and the like.
- a movement controlling portion 101 receives a target control value of a target moving velocity or target acceleration or the like determined by the movement planning portion 100, as well as a rotation amount of the wheels 103 measured by an encoder 105A. The movement controlling portion 101 then performs feedback control and calculates the torque control value for driving the wheels 103.
- the movement planning portion 100 and the movement controlling portion 101 in this example embodiment may be regarded as movement controlling means of the invention.
- the movement controlling portion 101 receives the target angular trajectory of each joint calculated by the movement planning portion 100 and the actual angle of each joint measured by an encoder 105B. The movement controlling portion 101 then executes feedback control and calculates the torque control value for driving the joints.
- the arms 104 of the mobile robot 1 are moved by driving the joints of the arms 104 with a driving portion 102B according to the torque control value calculated by the movement controlling portion 101.
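The feedback loop described above for the movement controlling portion 101 can be sketched as follows. The patent does not specify the control law, so a plain PI controller with illustrative gains is assumed here; the class and parameter names are hypothetical.

```python
class WheelVelocityController:
    """Minimal sketch of the feedback control described for the movement
    controlling portion 101: compare the target moving velocity from the
    movement planning portion with the velocity measured via the wheel
    encoder, and output a torque control value. A PI law is an assumption;
    the gains are illustrative, not from the patent."""

    def __init__(self, kp=2.0, ki=0.5):
        self.kp = kp          # proportional gain on the velocity error
        self.ki = ki          # integral gain on the accumulated error
        self.integral = 0.0

    def torque(self, target_velocity, measured_velocity, dt):
        # Velocity error between the planner's target and the encoder reading.
        error = target_velocity - measured_velocity
        self.integral += error * dt
        return self.kp * error + self.ki * self.integral
```

The same loop structure applies to the joint torque control of the arms 104, with joint angles from the encoder 105B in place of wheel velocity.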
- An indication controlling portion 106 receives the target control values of, for example, the target angular trajectories of the joints of the arms 104 and the target moving velocity determined by the movement planning portion 100 and the like, and determines the indicated range of the danger zone using these target control values.
- a projector 107 visually indicates the danger zone by projecting the danger zone determined by the indication controlling portion 106 on the surface on which the mobile robot 1 is traveling.
- a plurality of the projectors 107 may be provided on the mobile robot 1. That is, the number and arrangement of the projectors 107 may be determined appropriately according to how the danger zone around the mobile robot 1 is to be indicated.
- the indication controlling portion 106 and the projector 107 in this example embodiment may be regarded as indicating means of the invention.
- FIGS. 2A and 2B show an example of the positioning of the projectors 107.
- FIG 2A is a left side view of the mobile robot 1 and FIG 2B is a top view of the mobile robot 1.
- the mobile robot 1 shown in FIGS. 2A and 2B has a right arm 104R, a left arm 104L, and a head 11 all connected to a trunk 10.
- the mobile robot 1 shown in FIGS. 2A and 2B runs on wheels 103A and 103B, which are rotationally driven.
- the lower portion of the trunk 10 that is near the travel surface 50 is a skirt portion 12 to which four projectors 107F, 107B, 107R, and 107L are fixed.
- FIG 2B shows an example of a danger zone 60 projected on the travel surface 50 by the four projectors 107F, 107B, 107R, and 107L.
- the range (i.e., area) of the danger zone 60 changes according to the rate (i.e., velocity) and direction of movement of the mobile robot 1 and the movement of the arms 104 (i.e., 104R and 104L), as will be described later.
- the mobile robot 1 may also have fewer (i.e., three or less) projectors or more (i.e., five or more) projectors.
- the danger zone is not indicated in the areas that are diagonally to the front and rear of the mobile robot 1. However, the entire danger zone 360 degrees around the mobile robot 1 may be projected by adjusting the projection angles of the projectors 107 or increasing the number of projectors 107 arranged on the mobile robot 1.
- FIG 3 is a flowchart of a routine executed by the indication controlling portion 106 related to changing the danger zone according to the target moving velocity and direction of movement of the mobile robot 1.
- In step S11, it is determined whether a target moving velocity V of the mobile robot 1 is zero, i.e., whether the mobile robot 1 is stopped. If the target moving velocity V is zero, the indication controlling portion 106 supplies indication data to the projectors 107 so that the projectors 107 will indicate the area that has been preset as the danger zone when the mobile robot 1 stops (step S13).
- the danger zone 60 when the mobile robot 1 stops may be circular, centered about the axis of the mobile robot 1, as shown in FIG 5A.
- If the target moving velocity V is not zero, the indication controlling portion 106 supplies indication data to the projectors 107 so that the projectors 107 will indicate a danger zone that extends out in the direction of movement (step S12).
- FIG 5B shows an example of the danger zone 60 when the mobile robot 1 is moving forward. As shown in FIG 5B, when the mobile robot 1 moves, the danger zone extends out in that direction. Also, the danger zone may also be extended out further in the direction of movement as the moving velocity of the mobile robot 1 increases, as shown in FIG 5C, in order to more reliably avoid the danger of a collision between a person and the mobile robot 1.
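The zone-shaping rule of FIGS. 5A to 5C can be sketched as follows: a circle when the robot is stopped, stretched in the direction of movement by an amount that grows with the moving velocity. The polygon representation, function name, and all numeric values are illustrative assumptions, not taken from the patent.

```python
import math

def danger_zone_polygon(v, heading, base_radius=0.6, gain=0.8, n=24):
    """Return the danger zone boundary as a list of (x, y) points around
    the robot. With v == 0 the zone is a circle of base_radius (FIG 5A);
    with v > 0 the half of the boundary facing the direction of movement
    is pushed outward in proportion to v (FIGS. 5B and 5C)."""
    points = []
    for i in range(n):
        a = 2.0 * math.pi * i / n
        r = base_radius
        # Extend only the boundary points facing the direction of movement.
        if math.cos(a - heading) > 0.0:
            r += gain * v * math.cos(a - heading)
        points.append((r * math.cos(a), r * math.sin(a)))
    return points
```

The indication controlling portion 106 would pass such a boundary to the projectors 107 as indication data each control cycle.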
- FIG 4 is a flowchart of a routine executed by the indication controlling portion 106 related to changing the danger zone according to the movement of the arms 104 (i.e., 104R and 104L) of the mobile robot 1.
- In step S21, it is determined whether an arm 104R or 104L is moving according to the target angular trajectory of the joints of the arm 104R or 104L. If an arm 104R or 104L is moving, the indication controlling portion 106 changes the indication of the danger zone by the projectors 107 to cover the position where the arm 104R or 104L will be after it moves.
- FIG 5D is a view showing the danger zone 60 after it has been changed when the arm 104L extends out to the left of the mobile robot 1. More specifically, an additional danger zone 60A is indicated according to the displacement of the arm 104L so the position of the arm 104L is covered by the danger zone 60A.
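The arm-coverage rule behind FIG 5D can be sketched as follows: compute where the arm will be from the target joint angles (planar forward kinematics over the links), and require the indicated zone to cover those positions plus a margin. The planar 2-D simplification, function name, and margin value are assumptions for illustration.

```python
import math

def arm_cover_zone(shoulder_xy, link_lengths, joint_angles, margin=0.2):
    """Return discs (x, y, radius) that the additional danger zone must
    cover when an arm moves: the shoulder, each joint, and the hand
    position predicted from the target angular trajectory, each padded
    by a safety margin."""
    x, y = shoulder_xy
    a = 0.0
    points = [(x, y)]
    for L, q in zip(link_lengths, joint_angles):
        a += q                  # accumulate joint angles along the chain
        x += L * math.cos(a)
        y += L * math.sin(a)
        points.append((x, y))   # joint/hand position to be covered
    # The projectors would indicate at least the union of these discs
    # in addition to the base danger zone 60.
    return [(px, py, margin) for (px, py) in points]
```

This matches the behavior described above: as the arm 104L extends out to the left, the covered discs shift left, and the indicated zone 60A follows.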
- the mobile robot 1 visually indicates the danger zone 60 using the projectors 107 and dynamically changes the indicated shape and range of the danger zone 60 according to changes in movement of the mobile robot 1, such as changes in the moving velocity and displacement of the arms 104.
- the mobile robot 1 can make a person visually aware of the danger zone 60 that changes as the movement of the mobile robot 1 changes, thereby minimizing unnecessary interference between the mobile robot 1 and a person.
- the indicated range of the danger zone 60 changes according to the displacement of an arm 104R or 104L.
- the mobile robot 1 may also be provided with another movable portion (such as legs for walking) instead of or in addition to the arms 104R and 104L.
- the indicated shape and range of the danger zone may be changed according to the displacement of the other movable portion other than the arms 104R and 104L.
- the indication of the danger zone changes according to changes in both the moving velocity of the mobile robot 1 and movement of the movable portion of the mobile robot 1.
- the mobile robot may be designed such that the danger zone changes according to either only the moving velocity of the mobile robot 1 or only movement of the movable portion of the mobile robot 1.
- a mobile robot 2 according to the second example embodiment is similar to the mobile robot 1 described above with the addition of a self-diagnostic function for diagnosing whether the projectors 107 are projecting the danger zone properly.
- FIG 6 shows the structure of a control system of the mobile robot 2 related to indicating the danger zone.
- FIG 6 differs from FIG 1 described above in that a camera 208 and an abnormality detecting portion 209 are provided, and a movement controlling portion 201 is provided instead of the movement controlling portion 101.
- the other constituent elements in FIG 6 are the same as those shown in FIG 1 so they will be denoted by the same reference characters as they are in FIG 1, and detailed descriptions related to those elements will be omitted.
- the camera 208 is a camera for capturing an image of the danger zone projected on the travel surface 50 by the projectors 107.
- the abnormality detecting portion 209 detects an abnormality in the indication of the danger zone by the projectors 107 by referencing the image captured by the camera 208.
- the abnormality detecting portion 209 may detect an abnormality in an actual projected image 71 by comparing an image that should be projected by the projectors 107 (i.e., a planned projected image 70) with a captured image 72 obtained by capturing the actual projected image 71 with the camera 208.
- the mobile robot 2 can detect an abnormality in the indication by the projectors 107. That is, the mobile robot 2 can detect, for example, if the image is not yet projected or if the shape of the danger zone that is projected differs from the planned projection.
- the flowchart shown in FIG 8 illustrates a routine that is executed by the movement controlling portion 201 when it is detected that the danger zone of the mobile robot 2 is not being indicated properly.
- the movement controlling portion 201 executes a safety operation in the mobile robot 2 (i.e., steps S31 and S32).
- This safety operation may be, for example, stopping the mobile robot 2, reducing the moving velocity of the mobile robot 2, sounding an alarm using a speaker, not shown, or the like.
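The self-diagnostic check of FIG 7 and the fallback of steps S31 and S32 can be sketched as follows, reducing the planned projected image 70 and the captured image 72 to aligned binary masks. The mismatch threshold and the `robot.stop()` / `robot.alarm()` interface are hypothetical, introduced only for illustration.

```python
def indication_abnormal(planned, captured, tolerance=0.05):
    """Sketch of the abnormality detecting portion 209: compare the image
    that should be projected (planned projected image 70) with the
    camera's view of the actual projection (captured image 72), here as
    aligned binary masks, and flag an abnormality when the fraction of
    mismatched pixels exceeds a tolerance (an assumed value)."""
    total = mismatched = 0
    for row_p, row_c in zip(planned, captured):
        for p, c in zip(row_p, row_c):
            total += 1
            if p != c:
                mismatched += 1
    return (mismatched / total) > tolerance

def on_indication_check(planned, captured, robot):
    """Steps S31 and S32: on an abnormality, the movement controlling
    portion executes a safety operation such as stopping and sounding an
    alarm. `robot` is a hypothetical interface, not from the patent."""
    if indication_abnormal(planned, captured):
        robot.stop()
        robot.alarm()
```

This catches both failure cases named above: an image that is not projected at all and a projected shape that differs from the planned one.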
- the mobile robot 2 is able to minimize unnecessary interference between the mobile robot 2 and a person by executing a safety operation as an alternative to the danger zone indicating function. This makes the mobile robot 2 even safer.
- a mobile robot that travels on wheels is described in detail.
- the invention may also be applied to a variety of mobile robots, not only those that travel on wheels.
- the invention may also be applied to a mobile robot that travels by rotatably driving a spherical rotating body, or a two-legged or multiple-legged mobile robot that travels using legs.
Abstract
A mobile robot (1) has a movement controlling portion (100, 101) that controls movement of the mobile robot (1), and an indicating portion (106, 107) that visually indicates a danger zone (60) that is created around the mobile robot as the mobile robot moves, while changing the shape of the danger zone according to a change in movement of the mobile robot based on control by the movement controlling portion.
Description
MOBILE ROBOT AND MOBILE ROBOT DANGER ZONE INDICATING METHOD
BACKGROUND OF THE INVENTION
1. Field of the Invention [0001] The invention relates to a mobile robot, and more particularly to technology for visually indicating a danger zone that is created around a mobile robot as the mobile robot moves.
2. Description of the Related Art
[0002] Robots capable of moving (hereinafter referred to as "mobile robots"), such as humanoid robots capable of bipedal locomotion (i.e., walking on two legs) and traveling robots that travel on wheels, are known. These kinds of mobile robots show promise in applications such as guiding people to destinations and searching for objects according to a command by a human or in cooperation with a human, and holding and transporting objects. Such mobile robots are referred to as symbiotic robots or partner robots or the like.
[0003] Symbiotic robots move and operate in the same areas that people do. Therefore, with a mobile robot that is a symbiotic robot, it is not possible to physically separate the areas where the robots operate from the areas where people operate with safety fences or the like, as is done with industrial robots that are fixed in place. As a result, a person may accidentally enter an area where a mobile robot is operating, i.e., a danger zone created around the mobile robot as it moves, while the symbiotic mobile robot is moving.
[0004] International Publication No. WO 2005/015466 describes technology that indicates the movement path of a mobile robot using a projector arranged in the area where the mobile robot is operating or on the mobile robot itself. According to the publication, indicating the movement path beforehand makes it possible for a person to avoid getting in the way of the mobile robot and getting hurt (see FIGS. 18A to 18D, FIG 12A, and paragraphs 107 to 109, and 152, for example).
[0005] Japanese Patent Application Publication No. 2007-102488
(JP-A-2007-102488) describes a mobile robot that follows a person who is its master and executes an operation to indicate danger to the master (in JP-A-2007-102488 this movement is referred to as a danger avoidance operation). For example, if an automobile, an obstacle, or a vehicle or the like is approaching the master and the master is in danger of colliding with that automobile, obstacle, or vehicle or the like, the mobile robot described in JP-A-2007-102488 projects a mark indicating the direction in which the master should move to avoid the collision, or a mark indicating that the master should stop, or the like onto the travel surface using a projector (see FIGS. 8 and 9, and paragraph 41, for example). [0006] Japanese Patent Application Publication No. 2003-222295
(JP-A-2003-222295) describes technology that uses an optoelectronic sensor to detect when a person has entered an area which is established in advance as a machine safety area, and that activates a safety-related function such as stopping operation of the machine. Furthermore, according to the publication, the range and size of the safety area where the safety-related function activates are set according to the position, direction of movement, and speed of movement of the person to be protected (see FIGS. 1 to 3, and paragraphs 29, 30, 32, and 33, for example).
[0007] Japanese Patent Application Publication No. 05-229784 (JP-A-05-229784) describes technology that creates a cone-shaped curtain of colored laser light in a danger zone (referred to as "dangerous zone" in JP-A-05-229784) below a member that is being carried by a crane, which enables a person to visually see the danger zone (see FIGS. 1 and 2, and paragraph 20, for example).
[0008] As described above, a person may accidentally enter the danger zone created around the mobile robot as it moves (hereinafter, simply referred to as the "danger zone of the mobile robot"), while the symbiotic mobile robot is moving. Also, the danger zone of the mobile robot constantly changes as the movement (e.g., the rate of movement (also referred to as the "moving velocity"), direction of movement, etc.) of the mobile robot changes, so it is highly likely that the symbiotic mobile robot will interfere with a person.
[0009] Incidentally, in the technology described in International Publication WO 2005/015466, the movement path of the mobile robot is fixed and the indicated zone does not change according to a change in the movement (e.g., rate or direction of movement, etc.) of the mobile robot. Thus, the zone projected by the projector is fixed. Therefore, the technology described in International Publication WO 2005/015466 is not suitable for a mobile robot that moves autonomously, whose movement path is not determined in advance. Also, even if the movement of the mobile robot changes dynamically, the technology described in International Publication WO 2005/015466 is not able to effectively indicate to a person the danger zone that constantly changes as the movement of the mobile robot changes.
[0010] Furthermore, the mobile robot described in JP-A-2007-102488 informs the master of the danger that the operating area where the master and a following robot are both located poses to the master. That is, the mobile robot described in JP-A-2007-102488 does not inform the master of a danger zone where interference between the mobile robot itself and the master is potentially dangerous for the master. Therefore, JP-A-2007-102488 naturally also does not mention changing the indicated shape of the danger zone according to a change in movement (e.g., rate, direction, etc. of movement) of the mobile robot.
[0011] Also, JP-A-2003-222295 does not assume a mobile robot and therefore does not mention dynamically changing the danger zone of a mobile robot (the safety area in JP-A-2003-222295) according to the movement (e.g., rate, direction, etc. of movement) of the mobile robot. That is, the technologies described in JP-A-2007-102488 and JP-A-2003-222295 do not visually make a person aware of a danger zone where there is potential danger to the person from interference between the mobile robot and the person.
SUMMARY OF THE INVENTION
[0012] This invention provides a mobile robot that makes a person visually aware of a danger zone that changes as the movement of the mobile robot changes, as well as a method for indicating a danger zone of a mobile robot.
[0013] A first aspect of the invention relates to a mobile robot that includes movement controlling means for controlling movement of the mobile robot, and indicating means for visually indicating a danger zone that is created around the mobile robot as the mobile robot moves, while changing the shape of the danger zone according to a change in movement of the mobile robot based on control by the movement controlling means.
[0014] Also, a second aspect of the invention relates to a danger zone indicating method for visually indicating a danger zone of a mobile robot. This method includes visually indicating a danger zone that is created around the mobile robot as the mobile robot moves, while changing the shape of the danger zone according to a change in movement of the mobile robot.
[0015] The mobile robot according to the first aspect and the method according to the second aspect enable a person to be made visually aware of a danger zone that changes as the movement of the mobile robot changes, thereby minimizing unnecessary interference between the mobile robot and a person.
[0016] Incidentally, the indicating means may extend the indicated shape of the danger zone out in the direction of movement of the mobile robot as the moving velocity of the mobile robot increases. For example, a person in front of the mobile robot (i.e., in the direction of movement of the mobile robot) is in greater danger of being hit or the like by the mobile robot due to a delay in an avoidance operation of the mobile robot when the mobile robot is moving at a greater speed (i.e., velocity). According to the structure described above, the danger zone in the direction of movement can be extended out as the moving velocity of the mobile robot increases. Therefore, a person can be effectively made aware of a change in the danger zone that follows a change in moving velocity of the mobile robot.
[0017] Also, when the mobile robot has a movable portion that is driven based on control by the movement controlling means, the indicating means may change the indicated shape of the danger zone to include the position of the movable portion according to displacement of the movable portion. In this case, the movable portion
may be, for example, a multiple-jointed arm in which a plurality of links are connected together. For example, if the mobile robot moves the movable portion, e.g., extends the arm, that movable arm may interfere with a person in the area. According to the structure described above, when the mobile robot moves the movable portion, the indicated range of the danger zone can be changed according to that movement, i.e., the displacement of the movable portion. Therefore, a person can be effectively made aware of a change in the danger zone that follows a change in the movement of the movable arm of the mobile robot.
[0018] Also, the indicating means may have a projector that projects the danger zone on a travel surface where the mobile robot is located. Furthermore, the mobile robot may also include abnormality detecting means for detecting an abnormality in the indication of the danger zone by the projector, and the movement controlling means may execute a safety operation in the mobile robot when the abnormality detecting means detects the abnormality. In this case, the safety operation may be, for example, stopping the mobile robot or sounding an alarm or the like. This kind of structure makes it possible to minimize unnecessary interference between the mobile robot and a person by executing the safety operation as an alternative to the danger zone indicating function when the danger zone indicating function is not operating properly. Incidentally, the abnormality detecting means may also include a camera that captures an image of the danger zone projected by the projector, and an abnormality detecting portion that detects the abnormality based on the image captured by the camera.
[0019] Thus, as described above, a mobile robot that makes a person visually aware of a danger zone that changes as the movement of the mobile robot changes, as well as a method for indicating a danger zone of a mobile robot, are able to be provided.
BRIEF DESCRIPTION OF THE DRAWINGS
[0020] The foregoing and further objects, features and advantages of the invention will become apparent from the following description of example embodiments with reference to the accompanying drawings, wherein like numerals are used to
represent like elements and wherein:
FIG 1 is a block diagram of a control system of a mobile robot according to a first example embodiment of the invention;
FIGs. 2A and 2B are external views of the mobile robot according to the first example embodiment of the invention;
FIG 3 is a flowchart illustrating a control routine for indicating a danger zone around the mobile robot according to the first example embodiment of the invention;
FIG 4 is another flowchart illustrating another control routine for indicating the danger zone around the mobile robot according to the first example embodiment of the invention;
FIG 5A is a view illustrating the control to indicate the danger zone around the mobile robot according to the first example embodiment of the invention;
FIG 5B is another view illustrating the control to indicate the danger zone around the mobile robot according to the first example embodiment of the invention;
FIG 5C is yet another view illustrating the control to indicate the danger zone around the mobile robot according to the first example embodiment of the invention;
FIG 5D is still another view illustrating the control to indicate the danger zone around the mobile robot according to the first example embodiment of the invention;
FIG 6 is a block diagram of a control system of a mobile robot according to a second example embodiment of the invention;
FIG 7 is a schematic diagram illustrating an operation to detect an abnormality in the indication of the danger zone of the mobile robot according to the second example embodiment of the invention; and
FIG 8 is a flowchart illustrating a routine that is executed when it is detected that the danger zone of the mobile robot is not being indicated properly according to the second example embodiment of the invention.
DETAILED DESCRIPTION OF EMBODIMENTS
[0021] Example embodiments of the present invention will be described in greater detail below with reference to the accompanying drawings. Like elements in the drawings will be denoted by like reference characters, and redundant descriptions will be omitted when necessary to make the description clearer.
[0022] A mobile robot 1 according to a first example embodiment is a mobile robot which has multi-jointed arms 104, each of which is made of a plurality of links connected together, and which travels on a surface using wheels 103. Incidentally, in this example embodiment, the mobile robot 1 has a plurality of arms, but it may also have only one arm. Further, the mobile robot 1 runs on wheels, but it may also walk on legs.
[0023] FIG 1 is a block diagram of the structure of the control system of the mobile robot 1 that relates to danger zone indication. In the drawing, a movement planning portion 100 determines the movement of the mobile robot 1 based on, for example, environmental map information stored beforehand, movement path information that has been determined beforehand, and external information obtained from visual sensors, not shown. More specifically, the movement planning portion 100 generates a movement path, target moving velocity, and target acceleration of the mobile robot 1, as well as target angular trajectories and the like of the joints of the arms 104, and the like.
[0024] A movement controlling portion 101 receives a target control value of a target moving velocity or target acceleration or the like determined by the movement planning portion 100, as well as a rotation amount of the wheels 103 measured by an encoder 105A. The movement controlling portion 101 then performs feedback control and calculates the torque control value for driving the wheels 103. Incidentally, the movement planning portion 100 and the movement controlling portion 101 in this example embodiment, for example, may be regarded as movement controlling means of the invention.
[0025] Also, the movement controlling portion 101 receives the target angular trajectory of each joint calculated by the movement planning portion 100 and the actual angle of each joint measured by an encoder 105B. The movement controlling portion 101 then executes feedback control and calculates the torque control value for driving the joints. The arms 104 of the mobile robot 1 are moved by driving the joints of the arms 104 with a driving portion 102B according to the torque control value calculated by the movement controlling portion 101.
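The feedback control described above can be illustrated with a minimal sketch. The proportional control law, the gain value, and the function name below are assumptions of this illustration; the publication does not disclose the actual control law used by the movement controlling portion 101.

```python
# Hypothetical sketch of the wheel-velocity feedback loop: the torque
# control value is derived from the error between the target moving
# velocity and the velocity measured by the encoder 105A.
# The proportional law and gain kp are illustrative assumptions only.

def wheel_torque_control(target_velocity, encoder_velocity, kp=2.0):
    """Return a torque command computed from the velocity error."""
    error = target_velocity - encoder_velocity
    return kp * error

# Example: commanded to 0.5 m/s while the encoder reports 0.3 m/s.
torque = wheel_torque_control(0.5, 0.3)
```

An analogous loop, driven by the target angular trajectory and the joint angles measured by the encoder 105B, would compute the torque control values for the arm joints.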
[0026] An indication controlling portion 106 receives the target control values of, for example, the target angular trajectories of the joints of the arms 104 and the target moving velocity determined by the movement planning portion 100 and the like, and determines the indicated range of the danger zone using these target control values.
[0027] A projector 107 visually indicates the danger zone by projecting the danger zone determined by the indication controlling portion 106 on the surface on which the mobile robot 1 is traveling. Incidentally, a plurality of the projectors 107 may be provided on the mobile robot 1. That is, the number and arrangement of the projectors 107 may be determined appropriately according to how the danger zone around the mobile robot 1 is to be indicated. Incidentally, for example, the indication controlling portion 106 and the projector 107 in this example embodiment may be regarded as indicating means of the invention.
[0028] FIGS. 2A and 2B show an example of the positioning of the projectors 107. FIG 2A is a left side view of the mobile robot 1 and FIG 2B is a top view of the mobile robot 1. The mobile robot 1 shown in FIGS. 2A and 2B has a right arm 104R, a left arm 104L, and a head 11, all connected to a trunk 10. The mobile robot 1 shown in FIGS. 2A and 2B runs on wheels 103A and 103B, which are rotationally driven. Also, the lower portion of the trunk 10 that is near the travel surface 50 is a skirt portion 12 to which four projectors 107F, 107B, 107R, and 107L are fixed. The projector 107F is arranged on the front of the mobile robot 1, the projector 107B is arranged on the back of the mobile robot 1, the projector 107R is arranged on the right side of the mobile robot 1, and the projector 107L is arranged on the left side of the mobile robot 1.
[0029] FIG 2B shows an example of a danger zone 60 projected on the travel surface 50 by the four projectors 107F, 107B, 107R, and 107L. Incidentally, the range (i.e., area) of the danger zone 60 changes according to the rate (i.e., velocity) and direction of movement of the mobile robot 1 and the movement of the arms 104 (i.e., 104R and 104L), as will be described later. Also, in this example, four projectors are used, i.e., 107F, 107B, 107R, and 107L, but the mobile robot 1 may also have fewer (i.e., three or fewer) projectors or more (i.e., five or more) projectors. Further, in the example shown in FIG 2B, the danger zone is not indicated in the areas that are diagonally to the front and rear of the mobile robot 1. However, the entire danger zone 360 degrees around the mobile robot 1 may be projected by adjusting the projection angles of the projectors 107 or increasing the number of projectors 107 arranged on the mobile robot 1.
[0030] Continuing on, an operation for changing the indicated range of the danger zone 60 will hereinafter be described. FIG 3 is a flowchart of a routine executed by the indication controlling portion 106 related to changing the danger zone according to the target moving velocity and direction of movement of the mobile robot 1. In step S11, it is determined whether a target moving velocity V of the mobile robot 1 is zero, i.e., whether the mobile robot 1 is stopped. If the target moving velocity V is zero, the indication controlling portion 106 supplies indication data to the projectors 107 so that the projectors 107 will indicate the area that has been preset as the danger zone when the mobile robot 1 stops (step S13). For example, the danger zone 60 when the mobile robot 1 stops may be circular, centered about the axis of the mobile robot 1, as shown in FIG 5A.
[0031] If, on the other hand, the target moving velocity V is not zero and the mobile robot 1 is going to move, the indication controlling portion 106 supplies indication data to the projectors 107 so that the projectors 107 will indicate a danger zone that extends out in the direction of movement (step S12). FIG 5B shows an example of the danger zone 60 when the mobile robot 1 is moving forward. As shown in FIG 5B, when the mobile robot 1 moves, the danger zone extends out in that direction. Also, the danger zone may also be extended out further in the direction of movement as the moving velocity of the mobile robot 1 increases, as shown in FIG 5C, in order to more reliably avoid the danger of a collision between a person and the mobile robot 1.
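The velocity-dependent zone shaping just described can be sketched as follows. The circular-zone radius, the velocity-to-extension gain, and the polygon approximation are illustrative assumptions of this sketch; the publication discloses only the qualitative behavior (circular when stopped, stretched in the direction of movement otherwise).

```python
import math

def danger_zone_polygon(target_velocity, heading, base_radius=0.6, k=0.8, n=16):
    """Return (x, y) vertices approximating the indicated danger zone.

    When the robot is stopped (target velocity zero), the zone is a
    circle of base_radius about the robot axis; when moving, the zone
    is stretched in the heading direction by an amount that grows with
    the target moving velocity.  base_radius, k, and n are assumed
    values for illustration only.
    """
    extension = k * target_velocity  # zero when the robot is stopped
    vertices = []
    for i in range(n):
        a = 2 * math.pi * i / n
        # Stretch only the half of the circle facing the heading.
        r = base_radius + extension * max(0.0, math.cos(a - heading))
        vertices.append((r * math.cos(a), r * math.sin(a)))
    return vertices
```

With a target velocity of zero, every vertex lies on the stationary circle; at 1 m/s and heading 0, the front vertex moves out to 1.4 m while the rear vertex stays at 0.6 m, mirroring the behavior of FIGS. 5A to 5C.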
[0032] FIG 4 is a flowchart of a routine executed by the indication controlling portion 106 related to changing the danger zone according to the movement of the arms 104 (i.e., 104R and 104L) of the mobile robot 1. In step S21, it is determined whether
an arm 104R or 104L is moving according to the target angular trajectory of the joints of the arm 104R or 104L. If an arm 104R or 104L is moving, the indication controlling portion 106 changes the indication of the danger zone by the projectors 107 to cover the position where the arm 104R or 104L will be after it moves. FIG 5D is a view showing the danger zone 60 after it has been changed when the arm 104L extends out to the left of the mobile robot 1. More specifically, an additional danger zone 60A is indicated according to the displacement of the arm 104L so that the position of the arm 104L is covered by the danger zone 60A.
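One way to cover the arm position, sketched below, is to compute the joint positions by planar forward kinematics from the target angular trajectory and to indicate a margin around each of them. The disc shape, the margin value, and the function interface are assumptions of this sketch; the publication does not specify how the additional danger zone 60A is computed.

```python
import math

def arm_coverage_zone(joint_angles, link_lengths, margin=0.2):
    """Return (x, y, radius) discs covering each joint endpoint of a
    planar multi-jointed arm, computed by forward kinematics from the
    target joint angles.  A disc of `margin` metres around every joint
    position is an illustrative stand-in for the additional danger
    zone 60A."""
    x = y = 0.0
    angle = 0.0
    discs = []
    for theta, length in zip(joint_angles, link_lengths):
        angle += theta          # accumulate relative joint angles
        x += length * math.cos(angle)
        y += length * math.sin(angle)
        discs.append((x, y, margin))
    return discs
```

For an arm with two 0.3 m links stretched straight out, the farthest disc is centered 0.6 m from the shoulder, so the indicated zone follows the extended arm as in FIG 5D.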
[0033] As described above, the mobile robot 1 according to this example embodiment visually indicates the danger zone 60 using the projectors 107 and dynamically changes the indicated shape and range of the danger zone 60 according to changes in movement of the mobile robot 1, such as changes in the moving velocity and displacement of the arms 104. As a result, the mobile robot 1 can make a person visually aware of the danger zone 60 that changes as the movement of the mobile robot 1 changes, thereby minimizing unnecessary interference between the mobile robot 1 and a person.
[0034] Incidentally, in this example embodiment, the indicated range of the danger zone 60 changes according to the displacement of an arm 104R or 104L. However, the mobile robot 1 may also be provided with another movable portion (such as legs for walking) instead of or in addition to the arms 104R and 104L. In this case, the indicated shape and range of the danger zone may be changed according to the displacement of the other movable portion other than the arms 104R and 104L.
[0035] Incidentally, in this example embodiment, the indication of the danger zone changes according to changes in both the moving velocity of the mobile robot 1 and movement of the movable portion of the mobile robot 1. However, the mobile robot may be designed such that the danger zone changes according to either only the moving velocity of the mobile robot 1 or only movement of the movable portion of the mobile robot 1.
[0036] A mobile robot 2 according to the second example embodiment is
similar to the mobile robot 1 described above, with the addition of a self-diagnostic function for diagnosing whether the projectors 107 are projecting the danger zone properly. FIG 6 shows the structure of a control system of the mobile robot 2 related to indicating the danger zone. FIG 6 differs from FIG 1 described above in that a camera 208 and an abnormality detecting portion 209 are provided, and a movement controlling portion 201 is provided instead of the movement controlling portion 101. The other constituent elements in FIG 6 are the same as those shown in FIG 1, so they will be denoted by the same reference characters as they are in FIG 1, and detailed descriptions related to those elements will be omitted.
[0037] The camera 208 is a camera for capturing an image of the danger zone projected on the travel surface 50 by the projectors 107. The abnormality detecting portion 209 detects an abnormality in the indication of the danger zone by the projectors 107 by referencing the image captured by the camera 208. For example, as shown in the schematic diagram in FIG 7, the abnormality detecting portion 209 may detect an abnormality in an actual projected image 71 by comparing an image that should be projected by the projectors 107 (i.e., a planned projected image 70) with a captured image 72 obtained by capturing the actual projected image 71 with the camera 208. Accordingly, the mobile robot 2 can detect an abnormality in the indication by the projectors 107. That is, the mobile robot 2 can detect, for example, if the image is not being projected or if the shape of the danger zone that is projected differs from the planned projection.
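The comparison between the planned projected image 70 and the captured image 72 can be sketched as a simple pixel-mismatch check. The binary-grid representation, the mismatch-ratio criterion, and the 10% threshold are assumptions of this sketch; the publication states only that the two images are compared.

```python
def indication_abnormal(planned, captured, threshold=0.1):
    """Compare the planned projected image with the camera image.

    `planned` and `captured` are equally sized binary grids
    (1 = lit pixel).  Returns True when the fraction of mismatched
    pixels exceeds `threshold`, which covers both a missing projection
    and a projected shape that differs from the planned one.  The
    threshold value is an illustrative assumption.
    """
    total = sum(len(row) for row in planned)
    mismatched = sum(
        1
        for prow, crow in zip(planned, captured)
        for p, c in zip(prow, crow)
        if p != c
    )
    return mismatched / total > threshold
```

A projector that stops projecting entirely yields an all-dark captured grid, which mismatches every lit pixel of the planned image and is flagged as abnormal.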
[0038] The flowchart shown in FIG 8 illustrates a routine that is executed by the movement controlling portion 201 when it is detected that the danger zone of the mobile robot 2 is not being indicated properly. As shown in FIG 8, when the abnormality detecting portion 209 detects an abnormality in the indication by the projectors 107, the movement controlling portion 201 executes a safety operation in the mobile robot 2 (i.e., steps S31 and S32). This safety operation may be, for example, stopping the mobile robot 2, reducing the moving velocity of the mobile robot 2, sounding an alarm using a speaker, not shown, or the like.
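The routine of steps S31 and S32 can be sketched as a small supervisor that triggers the safety operation on an abnormality. The callback interface and function names are assumptions of this sketch; stopping the robot and sounding an alarm are the examples given in the text.

```python
def safety_supervisor(abnormality_detected, stop_robot, sound_alarm):
    """Hypothetical sketch of steps S31/S32: when an abnormality in
    the danger zone indication is detected (S31), execute a safety
    operation (S32) such as stopping the robot and sounding an alarm.
    The two callbacks stand in for the actual actuator commands."""
    if abnormality_detected:
        stop_robot()
        sound_alarm()
        return True   # safety operation was executed
    return False      # normal indication, continue operating
```

In this sketch the safety operation acts as a substitute for the indicating function whenever the self-diagnosis reports a fault, matching the behavior described for the movement controlling portion 201.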
[0039] The mobile robot 2 according to this example embodiment is able to minimize unnecessary interference between the mobile robot 2 and a person by executing a safety operation as an alternative to the danger zone indicating function. This makes the mobile robot 2 even safer.
[0040] Incidentally, in the first and second example embodiments described above, a mobile robot that travels on wheels is described in detail. However, the invention may also be applied to a variety of mobile robots, not only those that travel on wheels. For example, the invention may also be applied to a mobile robot that travels by rotatably driving a spherical rotating body, or to a two-legged or multiple-legged mobile robot that travels using legs.
[0041] While some of the embodiments of the invention have been illustrated above, it is to be understood that the invention is not limited to the details of the illustrated embodiments, but may be embodied with various changes, modifications or improvements, which may occur to those skilled in the art, without departing from the spirit and scope of the invention.
Claims
1. A mobile robot characterized by comprising: movement controlling means for controlling movement of the mobile robot; and indicating means for visually indicating a danger zone that is created around the mobile robot as the mobile robot moves, while changing the shape of the danger zone according to a change in movement of the mobile robot based on control by the movement controlling means.
2. The mobile robot according to claim 1, wherein the indicating means extends the indicated shape of the danger zone out in the direction of movement of the mobile robot as the moving velocity of the mobile robot increases.
3. The mobile robot according to claim 1 or 2, further comprising: a movable portion that is driven based on control by the movement controlling means, wherein the indicating means changes the indicated shape of the danger zone to include the position of the movable portion according to displacement of the movable portion.
4. The mobile robot according to any one of claims 1 to 3, wherein the indicating means has a projector that projects the danger zone on a travel surface where the mobile robot is located.
5. The mobile robot according to claim 4, further comprising: abnormality detecting means for detecting an abnormality in the indication of the danger zone by the projector, wherein the movement controlling means executes a safety operation in the mobile robot when the abnormality detecting means detects the abnormality.
6. The mobile robot according to claim 5, wherein the abnormality detecting means includes a camera that captures an image of the danger zone projected by the projector, and an abnormality detecting portion that detects the abnormality based on the image captured by the camera.
7. A danger zone indicating method for visually indicating a danger zone of a mobile robot, comprising: visually indicating a danger zone that is created around the mobile robot as the mobile robot moves, while changing the shape of the danger zone according to a change in movement of the mobile robot.
8. The danger zone indicating method according to claim 7, wherein the indicated shape of the danger zone is extended in the direction of movement of the mobile robot as the moving velocity of the mobile robot increases.
9. The danger zone indicating method according to claim 7 or 8, wherein the indicated shape of the danger zone is changed, according to displacement of a movable portion provided on the mobile robot, to include the position of the movable portion.
10. The danger zone indicating method according to any one of claims 7 to 9, wherein the danger zone is visually indicated by being projected on a travel surface where the mobile robot is located using a projector provided on the mobile robot.
11. The danger zone indicating method according to claim 10, further comprising: capturing an image of the danger zone projected by the projector using a camera; detecting an abnormality in the indication of the danger zone based on the image captured by the camera; and executing a safety operation in the mobile robot when the abnormality is detected.
12. A mobile robot comprising: a movement controlling portion that controls movement of the mobile robot; and an indicating portion that visually indicates a danger zone that is created around the mobile robot as the mobile robot moves, while changing the shape of the danger zone according to a change in movement of the mobile robot based on control by the movement controlling portion.
13. The mobile robot according to claim 12, wherein the indicating portion extends the indicated shape of the danger zone out in the direction of movement of the mobile robot as the moving velocity of the mobile robot increases.
14. The mobile robot according to claim 12 or 13, further comprising: an abnormality detecting portion that detects an abnormality in the indication of the danger zone by the indicating portion, wherein the movement controlling portion executes a safety operation in the mobile robot when the abnormality detecting portion detects the abnormality.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2007-297538 | 2007-11-16 | ||
JP2007297538A JP2009123045A (en) | 2007-11-16 | 2007-11-16 | Traveling robot and method for displaying dangerous range of traveling robot |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2009063318A1 true WO2009063318A1 (en) | 2009-05-22 |
Family
ID=40379049
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2008/003401 WO2009063318A1 (en) | 2007-11-16 | 2008-11-13 | Mobile robot and mobile robot danger zone indicating method |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP2009123045A (en) |
WO (1) | WO2009063318A1 (en) |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102448681A (en) * | 2009-12-28 | 2012-05-09 | 松下电器产业株式会社 | Operating space presentation device, operating space presentation method, and program |
CN102686371A (en) * | 2010-01-25 | 2012-09-19 | 松下电器产业株式会社 | Danger presentation device,danger presentation system,danger presentation method, and program |
DE102013215409A1 (en) * | 2013-08-06 | 2015-02-12 | Robert Bosch Gmbh | Projection unit for a self-contained mobile platform, transport robot and method for operating a self-contained mobile platform |
US8983662B2 (en) | 2012-08-03 | 2015-03-17 | Toyota Motor Engineering & Manufacturing North America, Inc. | Robots comprising projectors for projecting images on identified projection surfaces |
DE102013222137A1 (en) * | 2013-10-30 | 2015-04-30 | Continental Automotive Gmbh | Camera arrangement for a motor vehicle, motor vehicle and method |
WO2016000770A1 (en) | 2014-07-02 | 2016-01-07 | Siemens Aktiengesellschaft | Warning method and robot system |
WO2016034843A1 (en) * | 2014-09-03 | 2016-03-10 | Dyson Technology Limited | A mobile robot |
DE102014219538A1 (en) * | 2014-09-26 | 2016-03-31 | Robert Bosch Gmbh | Method for operating mobile platforms and mobile platform |
CN106313040A (en) * | 2015-07-03 | 2017-01-11 | 电装波动株式会社 | Robot system |
DE102015224309A1 (en) * | 2015-12-04 | 2017-06-08 | Kuka Roboter Gmbh | Representation of variable protective fields |
ITUA20164576A1 (en) * | 2016-06-21 | 2017-12-21 | Alumotion S R L | Collaborative robot, reporting system and process of reporting a movement of a collaborative robot |
WO2018068537A1 (en) * | 2016-10-14 | 2018-04-19 | 平安科技(深圳)有限公司 | Tour guide robot and moving area calibration method, computer readable storage medium |
DE102014011811B4 (en) | 2014-08-09 | 2018-08-09 | Audi Ag | Informing a road user about an autopilot-controlled journey |
US10112302B2 (en) | 2014-09-03 | 2018-10-30 | Dyson Technology Limited | Mobile robot |
US10144342B2 (en) | 2014-09-03 | 2018-12-04 | Dyson Technology Limited | Mobile robot |
DE102017117545A1 (en) * | 2017-08-02 | 2019-02-07 | Jungheinrich Aktiengesellschaft | Method for monitoring the travel path of a truck and an industrial truck |
EP3546136A1 (en) * | 2018-03-29 | 2019-10-02 | Sick AG | Augmented reality system |
CN111246978A (en) * | 2017-10-17 | 2020-06-05 | 库卡德国有限公司 | Method and system for operating a robot arm |
CN111566582A (en) * | 2018-01-08 | 2020-08-21 | 三星电子株式会社 | Electronic device and control method thereof |
US10896543B2 (en) | 2014-08-25 | 2021-01-19 | X Development Llc | Methods and systems for augmented reality to display virtual representations of robotic device actions |
US11498587B1 (en) * | 2019-01-25 | 2022-11-15 | Amazon Technologies, Inc. | Autonomous machine motion planning in a dynamic environment |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101089735B1 (en) | 2009-07-08 | 2011-12-07 | 한국과학기술원 | Projection that display image inserting a beam in doughnut shape on underneath |
EP2810748A4 (en) | 2012-02-03 | 2016-09-07 | Nec Corp | Communication draw-in system, communication draw-in method, and communication draw-in program |
US10109223B2 (en) * | 2014-07-02 | 2018-10-23 | Sony Corporation | Image display apparatus |
JP6464945B2 (en) * | 2015-07-03 | 2019-02-06 | 株式会社デンソーウェーブ | Robot system |
DE102015119501A1 (en) | 2015-11-11 | 2017-05-11 | RobArt GmbH | Subdivision of maps for robot navigation |
JP7172039B2 (en) * | 2015-12-28 | 2022-11-16 | 日本電気株式会社 | Management system, mobile unit, management device, position notification method, management method and program |
JP2017148905A (en) * | 2016-02-25 | 2017-08-31 | ファナック株式会社 | Robot system and robot control unit |
JP6852447B2 (en) * | 2016-05-16 | 2021-03-31 | セイコーエプソン株式会社 | Robot system |
US11709489B2 (en) * | 2017-03-02 | 2023-07-25 | RobArt GmbH | Method for controlling an autonomous, mobile robot |
WO2018213931A1 (en) * | 2017-05-25 | 2018-11-29 | Clearpath Robotics Inc. | Systems and methods for process tending with a robot arm |
JP2019010704A (en) * | 2017-06-30 | 2019-01-24 | Idec株式会社 | Illumination light display device |
KR102340852B1 (en) * | 2018-12-13 | 2021-12-20 | 키넷 주식회사 | Oxygen generator |
JP7162564B2 (en) * | 2019-04-16 | 2022-10-28 | 清水建設株式会社 | Information display system and information display method |
WO2020222392A1 (en) * | 2019-04-29 | 2020-11-05 | 경희대학교산학협력단 | Method for robot safety evaluation on basis of collision force bigdata that enables real-time robot collision risk monitoring using graphic information |
KR102289375B1 (en) * | 2019-04-29 | 2021-08-13 | 경희대학교 산학협력단 | Real-time safety evaluation method of robot based on the big data of collision physical force using graphic information |
WO2020234938A1 (en) * | 2019-05-17 | 2020-11-26 | 三菱電機株式会社 | Robot movement assist system |
JP7192748B2 (en) * | 2019-11-25 | 2022-12-20 | トヨタ自動車株式会社 | Conveyance system, learned model generation method, learned model, control method and program |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1334871A2 (en) * | 2002-02-07 | 2003-08-13 | Toyota Jidosha Kabushiki Kaisha | Movable body safety system and movable body operation support method |
DE10240227A1 (en) * | 2002-08-28 | 2004-03-11 | Daimlerchrysler Ag | Method for operating display device on production machine, uses optical lighting unit to project working zone on surface located in production machine region |
US20050207618A1 (en) * | 2002-09-24 | 2005-09-22 | Christian Wohler | Method and device for safeguarding a hazardous area |
US20060195226A1 (en) * | 2003-08-07 | 2006-08-31 | Matsushita Electric Industrial Co., Ltd. | Mobile robot system and program for controlling the same |
DE202006020026U1 (en) * | 2006-01-21 | 2007-09-13 | Linde Material Handling Gmbh | Industrial truck with an optical warning device |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6016386A (en) * | 1983-07-04 | 1985-01-28 | 松下電器産業株式会社 | Monitor device for operation |
JP3830305B2 (en) * | 1999-06-02 | 2006-10-04 | 俊弘 津村 | Peripheral surface display device for moving object |
JP4126291B2 (en) * | 2004-06-23 | 2008-07-30 | 三菱重工業株式会社 | Robot control program updating method and system |
- 2007
- 2007-11-16 JP JP2007297538A patent/JP2009123045A/en active Pending
- 2008
- 2008-11-13 WO PCT/IB2008/003401 patent/WO2009063318A1/en active Application Filing
Non-Patent Citations (1)
Title |
---|
WAKITA Y ET AL: "Knowledge projection on robot task environment", ROBOT AND HUMAN COMMUNICATION, 1997. RO-MAN '97. PROCEEDINGS, 6TH IEEE INTERNATIONAL WORKSHOP, SENDAI, JAPAN, 29 SEPT.-1 OCT. 1997, NEW YORK, NY, USA, IEEE, US, 29 September 1997 (1997-09-29), pages 136 - 141, XP010263188, ISBN: 978-0-7803-4076-3 * |
Cited By (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8731276B2 (en) | 2009-12-28 | 2014-05-20 | Panasonic Corporation | Motion space presentation device and motion space presentation method |
CN102448681B (en) * | 2009-12-28 | 2014-09-10 | 松下电器产业株式会社 | Operating space presentation device, operating space presentation method, and program |
CN102448681A (en) * | 2009-12-28 | 2012-05-09 | 松下电器产业株式会社 | Operating space presentation device, operating space presentation method, and program |
CN102686371A (en) * | 2010-01-25 | 2012-09-19 | 松下电器产业株式会社 | Danger presentation device,danger presentation system,danger presentation method, and program |
US8816874B2 (en) | 2010-01-25 | 2014-08-26 | Panasonic Corporation | Danger presentation device, danger presentation system, danger presentation method and program |
US8983662B2 (en) | 2012-08-03 | 2015-03-17 | Toyota Motor Engineering & Manufacturing North America, Inc. | Robots comprising projectors for projecting images on identified projection surfaces |
US9943965B2 (en) | 2012-08-03 | 2018-04-17 | Toyota Motor Engineering & Manufacturing North America, Inc. | Robots comprising projectors for projecting images on identified projection surfaces |
US9508235B2 (en) | 2013-08-06 | 2016-11-29 | Robert Bosch Gmbh | Projection unit for a self-directing mobile platform, transport robot and method for operating a self-directing mobile platform |
DE102013215409A1 (en) * | 2013-08-06 | 2015-02-12 | Robert Bosch Gmbh | Projection unit for a self-contained mobile platform, transport robot and method for operating a self-contained mobile platform |
EP2837473A3 (en) * | 2013-08-06 | 2015-12-02 | Robert Bosch Gmbh | Projection unit for an automatically mobile platform, transport robot and method for operating an automatically mobile platform |
DE102013222137A1 (en) * | 2013-10-30 | 2015-04-30 | Continental Automotive Gmbh | Camera arrangement for a motor vehicle, motor vehicle and method |
CN106660215A (en) * | 2014-07-02 | 2017-05-10 | 西门子公司 | Warning system and robot system |
US9908244B2 (en) | 2014-07-02 | 2018-03-06 | Siemens Aktiengesellschaft | Warning method and robot system |
WO2016000770A1 (en) | 2014-07-02 | 2016-01-07 | Siemens Aktiengesellschaft | Warning method and robot system |
DE102014011811B4 (en) | 2014-08-09 | 2018-08-09 | Audi Ag | Informing a road user about an autopilot-controlled journey |
US10896543B2 (en) | 2014-08-25 | 2021-01-19 | X Development Llc | Methods and systems for augmented reality to display virtual representations of robotic device actions |
CN106662877A (en) * | 2014-09-03 | 2017-05-10 | 戴森技术有限公司 | A mobile robot |
CN106662877B (en) * | 2014-09-03 | 2020-11-17 | 戴森技术有限公司 | Mobile robot |
WO2016034843A1 (en) * | 2014-09-03 | 2016-03-10 | Dyson Technology Limited | A mobile robot |
US10144342B2 (en) | 2014-09-03 | 2018-12-04 | Dyson Technology Limited | Mobile robot |
US10112302B2 (en) | 2014-09-03 | 2018-10-30 | Dyson Technology Limited | Mobile robot |
DE102014219538A1 (en) * | 2014-09-26 | 2016-03-31 | Robert Bosch Gmbh | Method for operating mobile platforms and mobile platform |
US10434666B2 (en) | 2015-07-03 | 2019-10-08 | Denso Wave Incorporated | Industrial robot system optically indicating motion area of robot |
CN106313040A (en) * | 2015-07-03 | 2017-01-11 | 电装波动株式会社 | Robot system |
EP3383595B1 (en) | 2015-12-04 | 2021-09-01 | KUKA Deutschland GmbH | Displaying of variable safety zones |
DE102015224309A1 (en) * | 2015-12-04 | 2017-06-08 | Kuka Roboter Gmbh | Representation of variable protective fields |
WO2017221171A1 (en) * | 2016-06-21 | 2017-12-28 | Alumotion S.R.L. | Collaborative robot, signalling system and process of signalling a displacement of a collaborative robot |
ITUA20164576A1 (en) * | 2016-06-21 | 2017-12-21 | Alumotion S R L | Collaborative robot, reporting system and process of reporting a movement of a collaborative robot |
WO2018068537A1 (en) * | 2016-10-14 | 2018-04-19 | 平安科技(深圳)有限公司 | Tour guide robot and moving area calibration method, computer readable storage medium |
US11009889B2 (en) | 2016-10-14 | 2021-05-18 | Ping An Technology (Shenzhen) Co., Ltd. | Guide robot and method of calibrating moving region thereof, and computer readable storage medium |
DE102017117545A1 (en) * | 2017-08-02 | 2019-02-07 | Jungheinrich Aktiengesellschaft | Method for monitoring the travel path of a truck and an industrial truck |
US11027953B2 (en) | 2017-08-02 | 2021-06-08 | Jungheinrich Aktiengesellschaft | Method for monitoring the road path of a truck and a floor conveyor |
CN111246978B (en) * | 2017-10-17 | 2024-03-08 | 库卡德国有限公司 | Method and system for operating a robotic arm |
CN111246978A (en) * | 2017-10-17 | 2020-06-05 | 库卡德国有限公司 | Method and system for operating a robot arm |
CN111566582A (en) * | 2018-01-08 | 2020-08-21 | 三星电子株式会社 | Electronic device and control method thereof |
EP3546136A1 (en) * | 2018-03-29 | 2019-10-02 | Sick AG | Augmented reality system |
US11498587B1 (en) * | 2019-01-25 | 2022-11-15 | Amazon Technologies, Inc. | Autonomous machine motion planning in a dynamic environment |
Also Published As
Publication number | Publication date |
---|---|
JP2009123045A (en) | 2009-06-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2009063318A1 (en) | Mobile robot and mobile robot danger zone indicating method | |
CN107428003B (en) | Mobile robot with collision recognition system | |
KR102162756B1 (en) | Mobile robot platform system for process and production management | |
US7649331B2 (en) | Mobile robot | |
JP5337408B2 (en) | Autonomous mobile body and its movement control method | |
JP6588624B2 (en) | Humanoid robot | |
WO2009098927A1 (en) | Autonomous mobile body, and method and system for controlling the same | |
WO2016199312A1 (en) | Autonomous movement system | |
Endo et al. | Path following control for tracked vehicles based on slip-compensating odometry | |
US10353392B2 (en) | Autonomous moving body and movement control method of autonomous moving body | |
KR101664575B1 (en) | Method to obstacle avoidance for wheeled mobile robots | |
JP2014161991A (en) | Robot movement mechanism and robot comprising the same | |
JP2008142841A (en) | Mobile robot | |
JP2022542216A (en) | Mobile robot sensor configuration | |
JP2019514103A (en) | Autonomous Robot with Guidance in Push Mode | |
JP2015084129A (en) | Guidance robot | |
Mathew et al. | Trajectory tracking and control of differential drive robot for predefined regular geometrical path | |
Li et al. | Self-balancing two-wheeled robot featuring intelligent end-to-end deep visual-steering | |
JP6962027B2 (en) | Mobile vehicle | |
EP2236251B1 (en) | Mobile robot controller | |
JP5761152B2 (en) | Traveling device | |
JP7000378B2 (en) | Automated guided vehicle system | |
JP5304143B2 (en) | Autonomous traveling control device, autonomous traveling control method, and self-propelled vehicle | |
Aref et al. | Position-based visual servoing for pallet picking by an articulated-frame-steering hydraulic mobile machine | |
JP2010079697A (en) | Obstacle avoiding device, obstacle avoiding method and self-propelling vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 08850418 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 08850418 Country of ref document: EP Kind code of ref document: A1 |