WO2002023297A1 - Mobile body movement control system - Google Patents

Mobile body movement control system

Info

Publication number
WO2002023297A1
WO2002023297A1
Authority
WO
WIPO (PCT)
Prior art keywords
moving
route
movement
space
moving body
Prior art date
Application number
PCT/JP2001/007878
Other languages
English (en)
Japanese (ja)
Inventor
Kunikatsu Takase
Yoshiro Hada
Original Assignee
Kunikatsu Takase
Yoshiro Hada
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kunikatsu Takase, Yoshiro Hada filed Critical Kunikatsu Takase
Priority to JP2002527882A priority Critical patent/JPWO2002023297A1/ja
Priority to AU2001284520A priority patent/AU2001284520A1/en
Publication of WO2002023297A1 publication Critical patent/WO2002023297A1/fr

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0287Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling
    • G05D1/0291Fleet control
    • G05D1/0297Fleet control by controlling means in a control room

Definitions

  • The present invention relates to a moving body movement control system for moving a moving body such as a robot to a desired point, and particularly to a moving body movement control system that enables a plurality of moving bodies to move smoothly to desired points.
  • Various moving bodies that move autonomously in a predetermined space such as a room, for example "HelpMate" robots, have been proposed.
  • The HelpMate is a mobile robot that is equipped with wheels and travels using them, and it holds an electronic map showing the locations of obstacles and the like in the space.
  • The robot is also equipped with an on-board sensor that detects obstacles as the robot advances, so that it can reach its destination (goal) while detecting and avoiding obstacles.
  • Such moving bodies are designed to detect obstacles using the electronic map and the sensors, set a route that avoids them, and move to the goal.
  • To avoid collisions, for example when a human crosses just in front of the moving body, the moving body reflexively stops when the sensor detects such an obstacle on the route. When the sensor detects, after a predetermined time has elapsed, that the obstacle has disappeared, movement is resumed from that point.
  • As robots evolve, various proposals have been made not only for robots performing fixed work at fixed positions, like industrial robots, but also for robots that move around a predetermined space such as a room to clean, carry documents, or care for the sick.
  • As the latter proposals become more concrete, situations may arise in which multiple moving bodies (hereafter simply referred to as autonomous robots) coexist in one space: for example, an autonomous robot carrying food to a sick person may pass close to another autonomous robot that is cleaning the same space. In such a case, when the autonomous robots try to avoid collision while moving, one or both of them may stop traveling.
  • Figure 29 illustrates this phenomenon. Assume that the first autonomous robot 101 plans to travel in a predetermined space on a route 102 shown by a solid line, and the second autonomous robot 103 plans to travel in the same space on a route 104 shown by a broken line. Assume that the two routes 102 and 104 intersect, for example, at a first point 105 and a second point 106. Suppose that the first autonomous robot 101 arrives at the first point 105 first. If the second autonomous robot 103 attempts to start traveling at this moment, it recognizes that an obstacle is blocking its route 104.
  • The second autonomous robot 103 therefore cannot depart from its starting point as long as it intends to travel on the route 104. This is because the second autonomous robot 103 cannot determine whether a moving body such as the first autonomous robot 101 merely happens to be at the first point 105, or whether a permanent obstacle exists at that position.
  • A similar problem occurs at the second point 106. That is, when one of the first and second autonomous robots 101 and 103 arrives at the second point 106 first, the other autonomous robot cannot start moving, and remains stopped, as long as the first arrival is blocking the passage.
  • Ideally, each robot would predict the movement state of the other autonomous robot: if the other robot is on the route 102 or 104 on which it is traveling, it would proceed to the vicinity of the first or second point 105, 106 and wait there until the blocking condition is cleared, and if, while it is waiting, the autonomous robot causing the blockage leaves the route 102 or 104, it would resume traveling. With such behavior, efficient movement would be possible without needless stops. However, it is difficult for the autonomous robots 101 and 103 to make such predictions. Therefore, once a route 102, 104 has been set and such a blocking situation occurs on it, the robot cannot move at all unless the obstacle is removed or moves autonomously to another location.
  • Alternatively, each autonomous robot would need advanced image recognition technology and predictive control on the robot side to identify other robots and obstacles and, where necessary, track their movements.
  • FIG. 29 shows a case where only one of the two autonomous robots temporarily stops; however, if both of them enter this state at the same time, the stopped state can never be released.
  • Figure 30 shows an example in which a deadlock phenomenon occurs, in which neither autonomous robot can move.
  • Assume that, in a space containing obstacles 111 and 112, the first autonomous robot 113 moves toward its goal 114 on a route avoiding them, and the second autonomous robot 115 likewise proceeds toward its goal 116 while avoiding the obstacles 111 and 112.
  • Assume that both autonomous robots 113 and 115 stop at the positions shown in the figure, because each recognizes the other on its route and tries to avoid collision. In this case, the deadlock is never released, because the positional relationship between the two autonomous robots 113 and 115 remains fixed over time.
  • In this way, the problem of deadlock, a permanent stop, occurs in situations where multiple autonomous robots are present.
  • Accordingly, an object of the present invention is to provide a moving body movement control system that can smoothly control the movement of a plurality of moving bodies without performing the enormous computations that would hinder movement control.

Disclosure of the Invention
  • (a) Environment-side imaging means comprising one or a plurality of imaging cameras; and (b) target specifying means for specifying each moving body and the other objects present on a passage from the image data captured by the environment-side imaging means.
  • The moving body movement control system further includes traveling control means for setting the route again by the route setting means and repeating the traveling control toward the goal by the moving-body-specific traveling control means.
  • That is, one or a plurality of imaging cameras, each covering a part or the whole of the space in which the plurality of moving bodies move, are prepared. From the image data captured by this environment-side imaging means, the target specifying means specifies each moving body and the other objects existing on its passage, and the position specifying means specifies the positions of the moving bodies and of the other objects on the passage. Then, for each moving body specified by the target specifying means, the route setting means sets a route for traveling from the current position specified by the position specifying means to its goal position.
  • Along the set route, the moving-body-specific traveling control means controls the traveling of each moving body toward the goal by a predetermined unit from the current position.
  • Here, the predetermined unit means that the traveling control may be performed in units of a predetermined time or in units of a predetermined amount of movement.
  • Until the plurality of moving bodies reach their respective goals, the traveling control means sets the traveling route from the current position to the goal position again for each moving body by the route setting means, and the traveling control toward the goal is repeated by the moving-body-specific traveling control means.
  • Therefore, each moving body may be temporarily stopped by the movement of another moving body, but can reach its goal in a finite time except in special cases.
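The plan-then-step cycle described above, in which each moving body's route is re-planned from its camera-measured current position and the body then advances by one predetermined unit until every body reaches its goal, might be sketched as follows. This is a minimal Python sketch; the data layout and the `plan_route` callback are illustrative assumptions, not part of the patent.

```python
import math

def replan_and_step(robots, plan_route, step_size=0.1):
    """One control cycle: re-plan a route for every robot from its
    current position, then advance each robot by one predetermined
    unit (here a fixed distance) along its freshly planned route."""
    for robot in robots:
        if robot["pos"] == robot["goal"]:
            continue  # this robot has already arrived
        route = plan_route(robot["pos"], robot["goal"])
        nxt = route[1] if len(route) > 1 else robot["goal"]
        dx, dy = nxt[0] - robot["pos"][0], nxt[1] - robot["pos"][1]
        dist = math.hypot(dx, dy)
        if dist <= step_size:
            robot["pos"] = nxt  # close enough: snap to the waypoint
        else:
            robot["pos"] = (robot["pos"][0] + dx / dist * step_size,
                            robot["pos"][1] + dy / dist * step_size)

def run_until_goals(robots, plan_route, max_cycles=10000):
    """Repeat plan-then-step cycles until every robot reaches its goal."""
    for _ in range(max_cycles):
        if all(r["pos"] == r["goal"] for r in robots):
            return True
        replan_and_step(robots, plan_route)
    return False
```

Because the route is recomputed every cycle, a robot that was temporarily blocked simply receives a new route once the blockage clears, which is the behavior the passage above describes.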
  • (a) Environment-side imaging means comprising one or more imaging cameras for imaging marks attached in advance to those moving bodies and movable objects that require them; and (b) target specifying means operating on each mark imaged by the environment-side imaging means.
  • Traveling control means for performing travel control that moves each moving body toward its goal by a predetermined unit from the current position, along its route, within a movement range in which the moving body does not collide with obstacles such as other moving bodies; and traveling control means that, until each of the moving bodies reaches its goal, sets the route for traveling from the current position to the goal position again for each moving body by the route setting means, and repeats the traveling control toward the goal by the moving-body-specific traveling control means, are provided in the moving body movement control system.
  • That is, one or a plurality of imaging cameras are prepared, each covering a part or the whole of the space in which the plurality of moving bodies move. Marks are attached in advance to the moving bodies that move by themselves in this space and to those movable objects, movable by force applied from outside, that require them. Each moving body and each movable object is specified by the target specifying means from the unique pattern of each mark imaged by the environment-side imaging means, and the positions in the space of the moving bodies and the marked movable objects are specified by the position specifying means from the positions of the marks imaged by the environment-side imaging means.
  • Then, for each moving body specified by the target specifying means, the route setting means sets a route for traveling from the current position specified by the position specifying means to its goal position.
  • Along the set route, the traveling control means for each moving body performs traveling control that moves the moving body from the current position toward the goal by a predetermined unit.
  • Here, the predetermined unit means that the travel control may be performed in units of a predetermined time or in units of a predetermined amount of movement.
  • Until the plurality of moving bodies reach their respective goals, the traveling control means sets the route for traveling from the current position to the goal position again for each moving body by the route setting means, and the traveling control toward the goal is repeated by the traveling control means for each moving body.
  • The invention according to claim 3 is characterized in that, at predetermined locations where the routes of a plurality of moving bodies may intersect or run closely in parallel, the route setting means restricts the traveling order so that these moving bodies travel without collision, or places a restriction passage that regulates their traveling directions with respect to each other.
  • That is, at predetermined positions where the paths of the plurality of moving bodies may intersect or run close and parallel to each other, the route setting means regulates the traveling order of the moving bodies, or provides a restriction passage that regulates their traveling directions with respect to each other, so that they travel without collision.
  • The invention according to claim 4 is characterized in that the route setting means comprises temporary evacuation point setting means for setting a route such that, near predetermined locations where the paths of the plurality of moving bodies may intersect or run closely in parallel, at a place that is not on the paths of these moving bodies, at least one of the potentially colliding moving bodies is temporarily retracted to avoid collision with the other moving bodies.
  • That is, when a route that may cause a collision between moving bodies would otherwise be set, the route setting means uses the temporary evacuation point setting means to set a temporary evacuation point to which at least one of the potentially colliding moving bodies is temporarily retracted to avoid collision with the other moving bodies.
  • The invention according to claim 5 is characterized in that the route setting means divides the space in which a plurality of moving bodies can travel simultaneously into a plurality of small areas, a checkpoint for moving bodies moving from one small area to another is placed at each point where small areas connect, and the traveling control means for each moving body performs travel control without considering obstacles in the small area beyond a checkpoint until the moving body passes that checkpoint.
  • That is, the route setting means divides the space in which a plurality of moving bodies can run at the same time into a plurality of small regions, and connects the divided small regions through checkpoints.
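The checkpoint rule above (planning only against obstacles in the current small area, and ignoring those beyond the checkpoint until the moving body actually passes it) could be sketched as follows. The rectangular regions and all names are illustrative assumptions, not taken from the patent.

```python
def region_of(point, regions):
    """Return the index of the small region containing `point`.
    `regions` maps index -> (xmin, ymin, xmax, ymax); this axis-aligned
    rectangle layout is an illustrative assumption."""
    for idx, (x0, y0, x1, y1) in regions.items():
        if x0 <= point[0] <= x1 and y0 <= point[1] <= y1:
            return idx
    raise ValueError("point outside all regions")

def obstacles_to_consider(robot_pos, obstacles, regions):
    """Per the checkpoint rule: until the moving body passes the
    checkpoint into the next small region, it plans only against
    obstacles in its current region, ignoring those beyond."""
    here = region_of(robot_pos, regions)
    return [ob for ob in obstacles if region_of(ob, regions) == here]
```

Filtering obstacles this way keeps each planning problem local to one small region, which is what makes the region division pay off in planning cost.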
  • the invention according to claim 6 is characterized in that the mark emits infrared light, and the environment-side imaging means is a means responsive to the infrared light.
  • Therefore, the movement of the moving bodies can be controlled using marks that are unobtrusive to the human eye.
  • The invention according to claim 7 is characterized in that the restriction passage is a passage that moves the plurality of moving bodies at a constant speed in a predetermined common direction, at intervals such that the moving bodies do not contact each other, at least within the range of the area where the moving paths of the plurality of moving bodies can be regarded as common.
  • That is, the restriction passage moves a plurality of moving bodies at a constant speed in a predetermined common direction, at predetermined intervals such that they do not contact each other. As a result, each moving body can be moved without stopping, as if riding on an escalator.
  • The invention according to claim 8 is characterized by comprising moving body rotating means for switching each moving body that has moved to the intersection of the moving paths of the plurality of moving bodies to a desired moving direction.
  • The invention described in claim 8 deals with another aspect of the restriction passage described in claim 3.
  • That is, a common intersection is provided for a plurality of moving bodies, and the moving direction is switched there so that each moving body that has arrived can move off in its desired direction. If the intersection is composed of a ring-shaped passage and a plurality of radial passages connected to it, then even when a plurality of moving bodies use the intersection, each of them can be moved efficiently through the ring-shaped passage and sent out along whichever of the radial passages leads in its desired direction.
  • The invention according to claim 9 is characterized in that the space in which the moving bodies move is mapped onto a layout space, the layout space is further divided into cells of a predetermined unit, and the route setting means sets routes in units of cells.
  • Route planning is facilitated by setting the routes of the moving bodies in cell units.
  • If the route initially set in cell units does not form a smooth curve or straight line, it can be corrected into a smooth path.
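Route setting in cell units, as described above, reduces route planning to a graph search over cells. Below is a minimal sketch using A* search on a boolean occupancy grid; the choice of A* and all names are illustrative assumptions, since the patent does not prescribe a particular search algorithm.

```python
import heapq

def plan_cells(grid, start, goal):
    """A* route planning on a cell grid: `grid` is a 2-D list where True
    marks a cell occupied by an obstacle; `start` and `goal` are
    (row, col) cells. Returns a list of cells from start to goal,
    or None if no route exists."""
    rows, cols = len(grid), len(grid[0])

    def h(c):  # Manhattan-distance heuristic (admissible on a 4-grid)
        return abs(c[0] - goal[0]) + abs(c[1] - goal[1])

    frontier = [(h(start), start)]
    came, cost = {start: None}, {start: 0}
    while frontier:
        _, cur = heapq.heappop(frontier)
        if cur == goal:
            path = []
            while cur is not None:  # walk back through predecessors
                path.append(cur)
                cur = came[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and not grid[nxt[0]][nxt[1]]):
                g = cost[cur] + 1
                if g < cost.get(nxt, float("inf")):
                    cost[nxt], came[nxt] = g, cur
                    heapq.heappush(frontier, (g + h(nxt), nxt))
    return None
```

The returned cell sequence is exactly the kind of stair-stepped route the passage above says can afterwards be corrected into a smooth curve or straight line.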
  • The invention according to claim 10 is the moving body movement control system according to the first or second aspect, wherein the moving body includes an on-board sensor for detecting surrounding obstacles, and further comprises corrected route setting means for independently setting a corrected route for avoiding an obstacle when the on-board sensor detects an avoidable obstacle on the route set by the route setting means.
  • That is, the moving body itself has an on-board sensor for detecting surrounding obstacles and runs on a route set on the basis of imaging by the environment-side imaging means; when it detects an obstacle that the environment side has not captured, it can independently set a corrected route to avoid it. Even when a person suddenly appears on the set route, collision can be avoided by setting a corrected route with the corrected route setting means.
  • The invention according to claim 11 is the moving body movement control system according to claim 10, wherein, when the corrected route setting means sets a corrected route, the correction result of the route is notified to the route setting means.
  • That is, when the corrected route setting means on the moving body side sets a corrected route, the correction result is notified to the route setting means, so that it can be used as a reference when creating future routes for that moving body, and also as a reference for setting the travel routes of other moving bodies that travel using this corrected route.
  • The invention is further characterized by comprising obstacle specifying means for specifying the obstacle that hinders movement, by sequentially erasing the other moving bodies and other objects and determining whether a route can then be set.
  • That is, when the route setting means cannot set a route to the goal for a specific moving body, the other moving bodies and other objects are sequentially erased and it is determined whether route setting then becomes possible, thereby specifying the obstacle that hinders the movement.
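The erase-and-retry procedure just described can be sketched directly: remove the other objects one at a time and re-run route setting until planning succeeds. The `plan` callback and all names here are illustrative assumptions, not the patent's implementation.

```python
def find_blocking_obstacle(plan, start, goal, objects):
    """When no route to the goal exists, erase the other objects one at
    a time and re-plan; the first removal that makes planning succeed
    identifies the obstacle blocking the movement.
    `plan(start, goal, objects)` returns a route or None."""
    if plan(start, goal, objects) is not None:
        return None  # a route already exists; nothing is blocking
    for i, obj in enumerate(objects):
        remaining = objects[:i] + objects[i + 1:]
        if plan(start, goal, remaining) is not None:
            return obj  # erasing this object made a route possible
    return None  # no single object accounts for the blockage
```

The same erase-and-retest idea underlies the deadlock detection of claim 15, where the objects erased are the mutually blocking moving bodies themselves.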
  • The invention according to claim 14 is characterized in that part or all of the environment-side imaging means is stereo imaging means that outputs stereo image data from which the three-dimensional position of each mark can be measured.
  • That is, the invention according to claim 14 indicates that if part or all of the environment-side imaging means is stereo imaging means, the three-dimensional position of each mark can be grasped.
  • The invention according to claim 15 is characterized by comprising deadlock detecting means for detecting that moving bodies are mutually present on each other's routes and are therefore in a deadlock state in which neither can move, by erasing each other's moving bodies on the routes and determining whether movement then becomes possible.
  • That is, the invention of claim 15 shows that the deadlock detecting means detects the deadlock by erasing the moving bodies on the routes and determining whether movement becomes possible.
  • The invention according to claim 16 is the moving body movement control system according to claim 2, wherein the environment-side imaging means detects the rotation angle of the moving body based on the directionality of the pattern forming the mark.
  • That is, the invention according to claim 16 indicates that the environment-side imaging means can detect not only the position of the mark but also the rotation angle of the moving body.
  • The invention according to claim 17 is characterized in that the traveling control means for each moving body issues, one after another, commands for controlling the movement of the moving body based on the position of each object specified by the position specifying means, and the moving body controls its own movement using the received commands. That is, in the invention according to claim 17, the traveling control means for each moving body sequentially issues commands for controlling the movement of each moving body based on the positions of the objects specified by the position specifying means on the environment side. As a result, each moving body can receive these commands and use its own drive mechanism to carry out the control needed for movement, relieving it of complicated movement control.
  • That is, a route along which a specific moving body moves in the space is set by the route setting means, and along this route, instructions regarding movement are given by the movement instructing means to each of the destinations to which the specific moving body moves one after another.
  • With the environment-side imaging means, target specifying means, and position specifying means existing on the environment side, the position of the specific moving body during or after movement can be grasped, and the position during or after the movement can be fed back into the instruction content of the movement instructing means. Therefore, more accurate movement can easily be realized than in the case where the moving body itself moves along the route set by the route setting means while capturing images with imaging means on its own side.
  • FIG. 1 is a schematic configuration diagram showing an outline of a mobile object movement control system according to an embodiment of the present invention.
  • FIG. 2 is an explanatory diagram showing an example of an arrangement pattern of the first to third light emitting elements in the present embodiment.
  • FIG. 3 is a plan view showing a first example of the overall surface pattern that can be obtained when it is assumed that three set patterns are arranged on one surface of an object.
  • FIG. 4 is a plan view showing a second example of the overall surface pattern under the same assumption.
  • FIG. 5 is a plan view showing a third example of the overall surface pattern under the same assumption.
  • FIG. 6 is a plan view showing a fourth example of the overall surface pattern under the same assumption.
  • FIG. 7 is a plan view showing a fifth example of the overall surface pattern under the same assumption.
  • FIG. 8 is a plan view showing a sixth example of the overall surface pattern under the same assumption.
  • FIG. 9 is a plan view showing a case where one light emitting element is configured as a set of a plurality of light emitting diodes.
  • FIG. 10 is a perspective view showing an example of the autonomous robot used in the present embodiment.
  • FIG. 11 is a perspective view showing a container attachment as another example of the attachment attached to the robot body.
  • FIG. 12 is an explanatory diagram showing a method of generating a route without causing an autonomous robot to collide with an obstacle.
  • FIG. 13 is an explanatory diagram showing the layout space when the space shown in FIG. 12 is divided into cells, with cells overlapping an obstacle regarded as part of that obstacle.
  • FIG. 14 is a flowchart showing a flow of a route generation process for a plurality of autonomous robots.
  • FIG. 15 is an explanatory diagram showing a state in which deadlock occurs when no evacuation point is used.
  • FIG. 16 is an explanatory diagram showing a state where the first autonomous robot has retreated to the evacuation point and the deadlock has been resolved.
  • FIG. 17 is a plan view illustrating an example of a closed state of a route due to a moving obstacle.
  • FIG. 18 is an explanatory diagram showing the first stage of obstacle avoidance in which a route is changed by an obstacle that cannot be detected by the global information sensing system.
  • FIG. 19 is an explanatory diagram showing the second stage of the obstacle avoidance in the example shown in FIG. 18.
  • FIG. 20 is an explanatory diagram showing the third stage of the obstacle avoidance in the example shown in FIG. 18.
  • FIG. 21 is an explanatory diagram showing the concept of a relay point as a first concept for speeding up traveling.
  • FIG. 22 is an explanatory diagram showing the concept of direction regulation as a second concept for speeding up traveling.
  • FIG. 23 is an explanatory diagram showing the concept of a rotary as a third concept for speeding up traveling.
  • FIG. 24 is an explanatory diagram showing a concept of a checkpoint as a fourth concept for speeding up traveling.
  • FIG. 25 is a plan view showing an actual arrangement example of the space.
  • FIG. 26 is a perspective view showing an example of an arrangement of a checkpoint in a three-dimensional space.
  • FIG. 27 is a flowchart showing the basics of the movement control of the autonomous robot of the present embodiment.
  • FIG. 28 is a characteristic diagram showing the result of moving the autonomous robot along an L-shaped trajectory by the control shown in FIG. 27.
  • FIG. 29 is an explanatory diagram showing an example where one of the conventional autonomous robots cannot move temporarily.
  • FIG. 30 is an explanatory diagram showing an example in which both autonomous robots in the related art cause a deadlock.
  • FIG. 31 is an explanatory diagram for explaining a modification of the present embodiment, illustrating the grouping of autonomous robots.
  • FIG. 32 is an explanatory diagram for explaining a modified example of the present embodiment, and is a diagram showing a moving state of the grouped autonomous robots.
  • Figures (a) to (c) show how the robot position changes over time.
  • FIG. 33 is an explanatory diagram for describing a modification of the present embodiment, illustrating the moving state of autonomous robots that are not grouped.
  • Figures (a) to (c) show changes in the robot position over time.
  • FIG. 1 shows an outline of a mobile object movement control system according to an embodiment of the present invention.
  • In a space 201, first through Nth autonomous robots 202_1, ..., 202_N are movably arranged. Although the first through Nth autonomous robots 202_1 to 202_N currently exist in this space 201, other autonomous robots may enter the space 201 over time, and robots may also leave it.
  • A plurality of TV cameras 203_1 to 203_M are mounted on the ceiling of the space 201. These TV cameras 203_1 to 203_M are connected to a common image transmission cable 204, and their image data are input to the environment-side computer 205.
  • The plurality of television cameras 203_1 to 203_M are provided to capture images of the moving states of the autonomous robots 202_1 to 202_N in the space 201, each camera covering an area that partially overlaps with those of the others. Therefore, in a small space or a space with a relatively good view, the number of these television cameras 203_1 to 203_M can be reduced appropriately; in an extreme case, a single television camera can be used.
  • Each of the plurality of television cameras 203_1 to 203_M is provided with an infrared transmission filter 211, so that only infrared-emitting images are captured.
  • A robot identification mark 212 or an object identification mark 213 is attached to each of the plurality of autonomous robots 202_1, ..., 202_N (moving bodies) and to obstacles such as the desk 207 and the chair 208 (movable objects).
  • Although the figure shows an example in which one robot identification mark 212 and one object identification mark 213 are attached to the autonomous robot 202 and to the other objects 207 and 208, respectively, a plurality of each may be arranged freely.
  • The object identification marks 213 are attached to the desk 207 and the chair 208 because, as movable objects, they may be moved within the space 201 by people; in this way, more reliable information on the positions of obstacles can be obtained than when control is performed based only on a fixed map.
  • The robot identification mark 212 and the object identification mark 213 are formed of a light-emitting diode array plate or an infrared reflecting plate that produces a temporally changing or temporally constant infrared pattern.
  • The light-emitting diode array plate is a plate in which a plurality of light-emitting diodes emitting infrared rays are arranged in a line at predetermined intervals, as described later, and the infrared reflecting plate is a plate that absorbs visible light and reflects infrared light in a predetermined pattern.
  • Building a general-purpose recognition system with both reliability and real-time performance is still difficult at present, and is also impractical in terms of cost. Therefore, in this embodiment, special recognition marks such as the robot identification mark 212 and the object identification mark 213 are attached to objects to facilitate their recognition; furthermore, by using infrared light, the difficulty of processing visible-light noise on the TV cameras 203_1 to 203_M side is eliminated, ensuring the speed and reliability of the image processing. In addition, even when people are present in the space 201, the use of infrared light keeps the robot identification mark 212 and the object identification mark 213 unobtrusive.
  • The environment-side computer 205 processes the image data of the robot identification marks 212 and the object identification marks 213 captured by the TV cameras 203_1 to 203_M to determine the coordinates indicating their positions in the space 201 (hereinafter referred to as world coordinates) and their postures.
  • Determining the posture means, for example, determining in which direction the autonomous robot 202 faces in the plane, based on its rotation angle.
  • a change in three-dimensional position such as a tilted arm of the mouth pot, may be determined as a change in posture. .
• The environment-side computer 205 has an antenna 216 for outputting the processing result via a wireless LAN (local area network). The coordinate data may be expressed in the environment-side coordinates, or may be data converted into the coordinates of the individual autonomous robots 202₁, ..., 202ₙ (hereinafter referred to as local coordinates). These local coordinates are sent to the autonomous robots 202₁, ..., 202ₙ, which are controlled independently of each other.
• A personal computer 221 serving as a user interface terminal is installed on a predetermined desk 207. This personal computer 221 also has an antenna 222 for configuring the wireless LAN, and is connected to the Internet by a predetermined cable 223.
  • FIG. 2 shows the basic configuration of the robot identification mark used in the present embodiment.
• The robot identification mark 212 uses the first light emitting element 231 as the origin of the local coordinates on the autonomous robot 202 side; the line segment 233 connecting the first light emitting element 231 to the second light emitting element 232 is defined as the X axis, and the line segment 235 connecting the first light emitting element 231 to the third light emitting element 234 is defined as the Y axis. That is, the first to third light emitting elements 231, 232, and 234 are placed in a predetermined area on the surface of the autonomous robot 202 so that the two line segments 233 and 235 always intersect at a right angle.
• The length L₁ of the X-axis segment 233 is a known fixed length, while the length L₂ of the Y-axis segment 235 is unique to each combination of light emitting elements 231, 232, and 234. For example, the length L₁ is always 2 cm, but the length L₂ differs from mark to mark, being 1 cm, 2 cm, or 3 cm.
• If a plurality of sets of the first to third light emitting elements 231, 232, and 234 are arranged at different places on one autonomous robot 202 and these are analyzed by the environment-side computer 205, it is possible to determine movement in the Z-axis direction or a change in the posture of the autonomous robot 202.
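The mark geometry just described admits a simple planar pose computation: the first element gives the position, the X-axis segment gives the heading, and the length L₂ identifies the mark. The sketch below is illustrative only; the function name and the assumption that the element positions are already available in world coordinates are mine, not from the patent.

```python
import math

def mark_pose(p1, p2, p3):
    """Estimate a robot's planar pose from the three light emitting
    elements of its identification mark (a hypothetical sketch).

    p1 -- first element 231, the local-coordinate origin
    p2 -- second element 232, the end of the X-axis segment 233
    p3 -- third element 234, the end of the Y-axis segment 235
    Returns (x, y, theta, L2): position = p1, heading = direction of
    the X-axis segment, and L2 = |p1 p3|, which identifies the mark.
    """
    x, y = p1
    theta = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
    L2 = math.hypot(p3[0] - p1[0], p3[1] - p1[1])
    return x, y, theta, L2
```

Because L₁ is fixed and known, the apparent length of the X-axis segment in the image could additionally serve as a scale reference, though that step is omitted here.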
• FIGS. 3 to 8 show cases where three set patterns (a set pattern being one arrangement of the first to third light emitting elements) are placed on one surface of an object; such a combination of three set patterns is referred to as a surface overall pattern. In FIG. 3, the three set patterns A, B, and C are arranged on a given surface 241 in the positional relationship shown in this figure.
• In pattern A the length L₂ is 1 cm, in pattern B the length L₂ is 2 cm, and in pattern C the length L₂ is 3 cm.
• In FIG. 4, the three set patterns A, B, and D are arranged on the surface 242 in the positional relationship shown in this figure. In pattern D the length L₂ is 4 cm.
• Although the set patterns A and B are common to the surface 241 and the surface 242, the surface overall patterns differ because of the set patterns C and D, which are not common to both. That is, individual identification of these objects becomes possible.
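This identification scheme can be sketched as a table lookup: the multiset of L₂ lengths measured on a surface, which is independent of the order in which the patterns are observed, indexes the surface overall pattern. The table contents and names below are hypothetical, chosen only to match the example of FIGS. 3 and 4.

```python
# Hypothetical table: each surface is identified by the multiset of
# L2 lengths (in cm) of its three set patterns, as in FIGS. 3 and 4.
SURFACE_TABLE = {
    (1, 2, 3): "surface 241",  # set patterns A, B, C
    (1, 2, 4): "surface 242",  # set patterns A, B, D
}

def identify_surface(l2_lengths):
    """Look up a surface by the sorted tuple of measured L2 lengths;
    returns None for an unknown combination."""
    return SURFACE_TABLE.get(tuple(sorted(l2_lengths)))
```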
• FIG. 9 shows an example of the configuration of one light emitting element when each light emitting element is configured as a set of a plurality of light emitting diodes. In FIGS. 2 to 8, each of the first to third light emitting elements 231, 232, and 234 is depicted as being composed of a single light emitting diode.
• However, when the space 201 is relatively wide, as in a large entertainment venue, and the TV cameras 203₁ to 203ₘ are far away, the light emission amount of a single diode may be insufficient.
  • FIG. 10 shows an example of the autonomous robot used in the present embodiment.
• The autonomous robot 202 used in this embodiment has a cylindrical body portion 261, beneath which an omnidirectional moving platform 263 using a plurality of wheels 262 is arranged. As a result, any translation with two degrees of freedom and rotation with one degree of freedom are possible.
• PSD sensors 265₁ to 265₈ are mounted on the top of the robot body portion 261 at equal intervals along the circumference. These PSD sensors 265₁ to 265₈ detect the presence of obstacles in the vicinity of the robot body portion 261. By inputting the outputs of these PSD sensors 265₁ to 265₈ to a built-in personal computer (not shown), the autonomous robot 202 is controlled so as to move while avoiding the individual obstacles present in its vicinity. As described above, the setting of the movement path of the autonomous robot 202 is performed based on the world coordinates of the position obtained by recognizing the robot identification mark 212 with the TV cameras 203₁ to 203ₘ shown in FIG. 1.
• A relatively large hollow portion 267 is arranged in the robot body portion 261. Various attachments can be attached to this hollow portion 267, and by changing the attachment, robots for various purposes can be realized. FIG. 10 shows an example in which a cleaning attachment 268 is attached to the hollow portion 267 so that the robot is used as a cleaning robot.
• A battery (not shown) is stored in the robot body portion 261 and supplies the power required by the attachment as well as by the body. Although the autonomous robot 202 is assisted by the environment-side computer 205 for its control, this battery secures its energy independence. The battery can be charged by the robot itself at a charging station (not shown) in the space 201.
• When the attachment is replaced, the function of the autonomous robot 202 changes. It is therefore preferable that the autonomous robot 202 determines this and that part or all of the robot identification mark 212 is changed in correspondence with the change in function. For example, the individual identification pattern corresponding to the individual identification information of the autonomous robot 202 itself may be left as it is, while the pattern representing the function, such as cleaning robot or nursing-care robot, is changed.
• FIG. 11 shows a container attachment as another example of the attachment. The container attachment 271 is composed of a container body 273 with a lid that can hold various articles 272, and a container body push-out mechanism (not shown). Various robots, such as a human-carrying robot with a stool attachment, a garbage collection robot, and an AGV (Automatically Guided Vehicle) robot, can thus be realized.
• In the present embodiment, a plurality of autonomous robots 202₁, ..., 202ₙ can move in parallel in a controlled environment. The autonomous robots 202 do not move to the target point all at once; each moves to an intermediate point (subgoal) on the way to the final point, judges the situation of the surrounding obstacles there, and repeats this control as it proceeds to the next subgoal.
• Such control is achieved by the CPU (central processing unit) in the environment-side computer 205 shown in FIG. 1 executing a control program stored in a storage medium (not shown). Each autonomous robot 202₁, ..., 202ₙ also has a built-in personal computer, but for movement control it sequentially receives movement control data calculated by the environment-side computer 205 and, according to these instructions (commands), drives and controls the omnidirectional moving platform 263 shown in FIG. 10, thereby being freed from complicated movement control.
  • Figure 12 illustrates how an autonomous robot generates a route without colliding with an obstacle.
• The autonomous robot 202 has a slightly thick L-shaped form. A coordinate system C_R is fixed to the robot with its origin O_r placed at a representative point serving as the center of control. The position of the autonomous robot 202 is expressed by the X and Y coordinates of the origin O_r viewed from the world coordinate system C_XY (not shown), and its orientation by the rotation angle of the coordinate system C_R with respect to C_XY. The arrangement (configuration) is thus represented by the x, y coordinates and the rotation angle θ of the representative point of the autonomous robot 202.
• FIG. 12 shows the trajectory 322 drawn when the autonomous robot 202 makes one round of the obstacle 321 while staying as close to it as possible, keeping the rotation angle θ constant. The closed region (the inner region indicated by hatching) bounded by the trajectory 322 is an obstacle region that the representative point of the autonomous robot 202 cannot enter. This obstacle region is referred to as the C obstacle 323.
• By repeating this for every rotation angle θ, a complete three-dimensional configuration space is constructed. In this way, the path search problem for robots having various shapes in the real space can be replaced with the path search problem for points in the configuration space.
• FIG. 13 shows the arrangement space when the space shown in FIG. 12 is divided into cells, with the cells touching the C obstacle regarded as part of the C obstacle. Each cell 331 on the trajectory 322 indicating the boundary of the C obstacle 323 shown in FIG. 12 is a portion regarded as part of the C obstacle. The path search of the autonomous robot 202 then means searching for a path formed one cell at a time from the cell at the departure position (hereinafter, the departure cell) to the goal cell, that is, finding a sequence of cells none of which belongs to the cells 331 regarded as part of the C obstacle 323.
• In this embodiment, a horizontal (breadth-first) search in which the search range is expanded concentrically from the departure cell is used as the route search in the cell-divided arrangement space. The search time is roughly proportional to the number of cells that make up the arrangement space.
• The arrangement space for describing the arrangement of N autonomous robots 202₁, ..., 202ₙ is 3N-dimensional. Deciding one point in the arrangement space determines the coordinates (x, y, θ) of every autonomous robot 202₁, ..., 202ₙ, and vice versa. The description of the C obstacle in the arrangement space in this case is performed as follows: the arrangement space is divided into cells, and it is checked whether the arrangement corresponding to each cell can be taken in the real space. The horizontal search time is proportional to the total number of cells, and the total number of cells is K^(3N), where K is the number of divisions in each axis direction.
• Therefore, in this embodiment, the path plan of each robot is created by regarding the other robots as obstacles; after moving a short distance along the path generated by one path plan, the overall situation is re-recognized and the route is planned again. Route planning is performed in this way for each autonomous robot 202₁, ..., 202ₙ.
  • An example of specific operation control is shown below.
  • Figure 14 shows the flow of route generation processing for multiple autonomous robots.
• In this processing, the N autonomous robots 202₁, ..., 202ₙ are processed sequentially, one by one. The autonomous robot 202 currently being processed is referred to as the i-th robot. First, the variable i is set to "1" (step S401). Next, an arrangement space for the i-th robot is created; at this time, all robots other than the i-th robot among the autonomous robots 202₁, ..., 202ₙ are treated as obstacles (step S402). The sweep areas (cells) due to the movement of autonomous robots 202 whose path planning has already been completed are also treated as obstacles.
• Next, the arrangement space for the i-th robot is searched to generate a route to the goal cell (step S403). The current travel route is then defined as the route from the current position to a position advanced by a predetermined distance along this route (step S404).
• When a path of a predetermined length has been generated for each of the autonomous robots 202₁, ..., 202ₙ (step S406: Y), it is checked whether the routes of all of these autonomous robots 202₁, ..., 202ₙ have reached their goal cells (step S407). If they have not (N), the process returns to step S401, the variable i is initialized to "1" again, and the remaining routes are generated sequentially (steps S401 to S407).
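The loop of steps S401 to S407 amounts to prioritized, short-horizon planning. The sketch below is a schematic reconstruction: the one-dimensional corridor planner `line_route`, the dictionary-based robot records, and all names are my own inventions, used only to make the loop runnable.

```python
def line_route(start, goal, obstacles):
    """Toy single-axis planner standing in for the arrangement-space
    search of step S403: step toward the goal until blocked."""
    step = 1 if goal >= start else -1
    route = [start]
    while route[-1] != goal and route[-1] + step not in obstacles:
        route.append(route[-1] + step)
    return route

def plan_all(robots, plan_route, horizon, max_rounds=100):
    """Steps S401-S407: each robot i plans with every other robot's
    committed cells treated as obstacles (S402), searches a route
    (S403), and commits only a short prefix of it (S404); the outer
    loop repeats until every robot's route reaches its goal (S407)."""
    committed = [[r["cell"]] for r in robots]
    for _ in range(max_rounds):
        if all(committed[i][-1] == r["goal"] for i, r in enumerate(robots)):
            break                                    # S407: all goals reached
        for i, robot in enumerate(robots):           # S401: i = 1, 2, ..., N
            obstacles = {c for j in range(len(robots)) if j != i
                         for c in committed[j]}      # S402: others are obstacles
            route = plan_route(committed[i][-1], robot["goal"], obstacles)  # S403
            committed[i] += route[1:horizon + 1]     # S404: commit a short prefix
    return committed
```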
• Deadlock may occur even with the autonomous robots 202 of the present embodiment, for example when the autonomous robots 115 and 113 move in a situation where they must pass each other near a movable obstacle 111.
• FIG. 15 shows a method of introducing an evacuation point as one method that can eliminate deadlock in the present invention. It is assumed that two obstacles 341 and 342, such as stands, are arranged at a predetermined interval in the space 201, and a narrow passage 343 is formed between them.
• Suppose that the first autonomous robot 202₁ plans a route through the passage 343 aiming at the goal cell 344, and that the second autonomous robot 202₂ plans a route through this passage 343 in the opposite direction, aiming at the goal cell 345. In the present embodiment, an evacuation point 347 is set in advance relatively near the passage 343. If the possibility of deadlock is detected in the path planning of the two autonomous robots 202₁ and 202₂, it is determined which of the autonomous robots 202₁ and 202₂ is located nearer the evacuation point 347. The route is then changed so that the nearer robot, in this case the first autonomous robot 202₁, is temporarily evacuated to the evacuation point 347.
• FIG. 16 shows the state where the first autonomous robot has retreated to the evacuation point and the deadlock has been resolved. Once the first autonomous robot 202₁ retreats to the evacuation point 347, the autonomous robots 202₁ and 202₂ passing through the passage 343 no longer interfere with each other, and the goal cells 344 and 345 can be reached.
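The selection rule of FIGS. 15 and 16 can be sketched in a few lines: when a head-on deadlock is predicted, the robot nearer the pre-set evacuation point retreats there first. The function name and the tie-breaking rule below are my assumptions.

```python
import math

def choose_evacuee(robot_a, robot_b, evac_point):
    """When path planning predicts a head-on deadlock in a narrow
    passage, pick which robot retreats to the evacuation point:
    the one currently nearer to it. Positions are (x, y) tuples."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    if dist(robot_a, evac_point) <= dist(robot_b, evac_point):
        return robot_a
    return robot_b
```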
  • Figure 17 shows the path being closed by a moving obstacle.
• An obstacle such as a desk may be moved by human operation or the like. If such a moving obstacle 351 is placed in the vicinity of the narrow passage 354 connecting two rooms 352 and 353, the path of the autonomous robot 202 trying to advance from one room 353 toward the goal cell 355 in the other room 352 is closed.
• In the present embodiment, the global information sensing system using the plurality of television cameras 203₁ to 203ₘ shown in FIG. 1 can, when planning a path, identify obstacles that make the path impassable. If the moving obstacle 351 is not a mobile robot such as the autonomous robot 202, the routes cannot be changed to secure each other's course as described with reference to FIGS. 15 and 16. Therefore, in such a case, the environment-side computer 205, for example, informs the system administrator so as to have the moving obstacle 351 removed from the route.
  • FIG. 18 to FIG. 20 illustrate how a route once set is changed by an obstacle that cannot be detected by the global information sensing system.
  • Figure 18 shows the first stage of obstacle avoidance.
• The autonomous robot 202 sets the subgoal 374, a short distance along the path 372 toward the goal cell 371 that avoids the obstacle 373, as the destination of the first stage, and sets the first path 375 as the path up to that point.
• FIG. 19 shows the second stage of obstacle avoidance. While the autonomous robot 202 is moving to the subgoal 374, a moving obstacle 381 that was not detected by the global information sensing system is detected on the first path 375 by the detection operation of the PSD sensors 265₁ to 265₈ shown in FIG. 10.
• A human, for example, may be such a moving obstacle 381. When the autonomous robot 202 determines by this detection operation that it cannot travel the first path 375, it switches to an obstacle avoidance movement control mode based on the detection by the sensors 265₁ to 265₈. A second path 382 that avoids the moving obstacle 381 is then set up to the subgoal 374.
• By transmitting such information, the environment-side computer 205 obtains information not detected by the TV cameras 203₁ to 203ₘ, which can be useful for determining the final path of the autonomous robot 202 to the goal cell 371.
  • FIG. 20 shows the third stage of the obstacle avoidance in this example.
• While traveling the second path 382, the autonomous robot 202 detects another obstacle 384, such as a human, and when it determines that it cannot reach the subgoal 374, it switches to a third path 385 on the way. In this case as well, such information is transmitted to the environment-side computer 205 and used for subsequent route setting.
• FIG. 21 explains the concept of a relay point as a first concept for speeding up traveling. It is assumed that there is an L-shaped obstacle 422 that narrows the passage 421 in the space 201, and that the goal 425 of the first autonomous robot 202₁ exists in the narrow space 423 partitioned off by the obstacle 422. It is also assumed that the second autonomous robot 202₂ performs predetermined work while repeatedly moving back and forth, through the passage 421, between a position 426 in the narrow space 423 and another predetermined position on the wide space side 427.
• While the second autonomous robot 202₂ is at a position closing the passage 421, the first autonomous robot 202₁ stops moving until that movement ends. Although it is guaranteed that the first autonomous robot 202₁ reaches the goal 425 in finite time, its behavior becomes intermittent and slow if it stops every time the second autonomous robot 202₂ comes to a position blocking the passage 421. To solve this problem, in the present embodiment a relay point 428 is set at a position relatively close to the passage 421 where it does not interfere with the travel of the second autonomous robot 202₂.
• The first autonomous robot 202₁ first generates a route so as to stop at the relay point 428 and then aim for the goal 425. The first autonomous robot 202₁ can reach the relay point 428 entirely independently of the travel of the second autonomous robot 202₂. From the relay point 428 to the goal 425, it then travels while adjusting its movement so as to coexist with the travel of the second autonomous robot 202₂.
• In FIG. 22, as a second concept for speeding up traveling, a median strip 443 is conceptually provided. Then, with this as a boundary, the space is set as a first direction passage 445 and a second direction passage 446.
• The first direction passage 445 is a passage that allows all the autonomous robots 202₁, ..., 202ₙ to travel only in the first direction, and the second direction passage 446 is a passage that allows all the autonomous robots 202₁, ..., 202ₙ to travel only in the direction opposite to that of the first direction passage 445.
• At the entrance of the first direction passage 445, a second-direction no-entry wall 447 is conceptually arranged to prevent an autonomous robot 202 trying to travel in the second direction from entering the first direction passage 445. Similarly, a first-direction no-entry wall 448 is conceptually arranged to prevent an autonomous robot 202 trying to travel in the first direction from accidentally entering the second direction passage 446.
• In this way, a specific space area is set as an area where traveling in only one direction is possible. The figure shows a state in which the first to third autonomous robots 202₁, ..., 202₃ travel sequentially in the first direction passage 445, and the fourth to sixth autonomous robots 202₄, ..., 202₆ travel sequentially in the second direction passage 446. If each autonomous robot 202₁, ..., 202₆ travels at a constant speed, closely packed and efficient travel becomes possible, as if the autonomous robots 202 were spaced as on an escalator.
• FIG. 23 explains the concept of a rotary as a third concept for speeding up traveling. In the space 201, there is provided a one-way annular passage 451 and a plurality of radial bidirectional passages 452₁, ..., 452ₙ connected to it. Each bidirectional passage 452₁, ..., 452ₙ is a combination of the first direction passage 445 and the second direction passage 446 shown in FIG. 22.
• By using the one-way annular passage 451, an autonomous robot 202 (not shown) can be guided into the one-way annular passage 451 from a desired direction and sent out in another desired direction. Although in FIG. 23 each bidirectional passage 452₁, ..., 452ₙ connected to the one-way annular passage 451 enables parallel traffic in two directions, it is also possible to provide passages only wide enough for a single autonomous robot to travel and determine the traveling direction for each passage, or to switch one passage to travel in both directions as appropriate.
• FIG. 24 explains the concept of a checkpoint as a fourth concept for speeding up traveling. If the space 201 is relatively large, or if it can be divided into a plurality of small areas 471 and 472 as shown in FIG. 24, the concept of dividing the space 201 in this way is adopted. A physical separation such as a wall need not exist between the small areas 471 and 472. A passage control concept called a checkpoint 473 is placed at the connecting part of the division; if the area is divided into two small areas 471 and 472 as shown in the figure, a checkpoint 473 is placed between them. The autonomous robot 202 then does not consider the obstacles in the following small area 471 (472) until it reaches the checkpoint 473.
• It thus becomes possible, for example, for the second autonomous robot 202₂ to reach the checkpoint 473 without worrying about the obstacles in the small area 471, so that the travel of each autonomous robot 202 can be performed smoothly.
• A plurality of checkpoints 473 may be arranged in one space 201.
• FIG. 25 shows an example of an actual spatial arrangement. The space 201 in this example consists of a first movable space 505₁ serving as the field of view of the first TV camera 203₁, a second movable space 505₂ serving as the field of view of the second TV camera 203₂, and a passage-like third movable space 505₃ serving as the field of view of the third TV camera 203₃. The third movable space 505₃ partly overlaps the first and second movable spaces 505₁ and 505₂.
• FIG. 26 shows an example of checkpoints in a multi-story building. The first space 201₁ forms the first floor of the building, and the second space 201₂ forms the second floor. Both spaces 201₁ and 201₂ are connected by an elevator 521. If a plurality of autonomous robots 202 (not shown) move in such a space 201, setting the part with the elevator 521 as a checkpoint 522 allows an autonomous robot 202 to ride the elevator 521 and reach a desired floor without having to consider any robot 202 in the space 201₁ or 201₂ of that floor. As described earlier, the spaces 201₁ and 201₂ constituting the floors can each be further divided into a plurality of areas, with checkpoints also provided between them.
  • Fig. 27 shows the flow of the trajectory control of the autonomous robot according to the present embodiment.
• In this embodiment, the trajectory control moves the robot a small distance along the route set in the route plan. To perform this movement control, the CPU in the environment-side computer 205 first sets the trajectories X(t), Y(t), and Θ(t) of the autonomous robot 202 (step S301 in FIG. 27). The symbols X(t) and Y(t) indicate the two-dimensional coordinate position on the world coordinate system side, and the symbol Θ(t) indicates the rotation angle of the robot body portion 261 shown in FIG. 10.
• The symbol t denotes the current time. The parameter n is initially set to "0" (step S302). Next, the target values X(tₙ), Y(tₙ), and Θ(tₙ) at the time tₙ are calculated (step S304); the symbol Δ is the time interval required for one movement control cycle.
• The CPU then obtains the current position and posture (x, y, θ) of the autonomous robot 202 as grasped by the environment-side computer 205 through the global information sensing system using the multiple TV cameras 203₁ to 203ₘ shown in FIG. 1 (step S305). The velocity set values (vx, vy, ω) with which the autonomous robot 202 will assume the position and posture at the next time are calculated (step S306), and the set velocity is converted into the local coordinate system of the autonomous robot 202 (step S307).
• The converted velocity is notified to the autonomous robot 202, and movement control is performed until the next time (step S308). The actual state of movement of the autonomous robot 202 can be checked by the environment-side computer 205 using the television cameras 203₁ to 203ₘ. That is, the movement state during or after the movement can be controlled by feedback control.
• FIG. 28 shows the result of moving the autonomous robot along an L-shaped trajectory under the control shown in FIG. 27. The movement of the autonomous robot 202 is controlled by the environment-side computer 205 using the television cameras 203₁ to 203ₘ shown in FIG. 1. It can be seen that, for this reason, the autonomous robot 202 can move with much higher precision using simple movement control than in the conventional case where the autonomous robot itself controls its movement with its own TV camera.
• In the embodiment described above, the robot identification mark 212 and the object identification mark 213 emit a fixed pattern that does not change over time; however, a pattern that changes over time may be emitted. For example, the robot identification mark 212 or the object identification mark 213 may first emit a light emission pattern that maximizes the amount of light, so that the image can easily be recognized, and then emit individual patterns to enable individual recognition. If the patterns of these identification marks 212 and 213 can be changed over time by controlling the lighting of a plurality of light-emitting diodes, it becomes possible not only to transmit information for distinguishing robots and the like, but also to transmit other information to the environment-side computer 205 by changing the pattern.
• In the embodiment, the television cameras 203₁ to 203ₘ detect infrared light; however, the present invention is not limited to this, and light in another predetermined wavelength region, such as visible light, may be detected.
• The route plan to the goal is first carried out with all the other robots deleted, and the checkpoint that allowed passage at that time is adopted. Therefore, for example, in the situation shown in FIG. 21, a checkpoint can also be used instead of a relay point. The major difference between the checkpoint and the relay point is that the former is selected automatically by the system, while the latter is specified by the user (programmer).
• The details of checking whether a robot is in the deadlock state have not been described. To perform such a check, it suffices to constantly monitor whether there are a plurality of robots that have been unable to move for a certain period of time; if a plurality of robots have indeed become unmovable for a certain time, it is determined in the manner described above whether a deadlock has occurred. If it is not a deadlock and yet a robot has been unable to move for a certain period of time, a movable obstacle is identified as the cause.
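The monitoring just described can be sketched as a small bookkeeping class: record when each robot last stopped making progress, and trigger a deadlock check when several robots have been immobile past a threshold at the same time. The class and its API are my assumptions, not part of the patent.

```python
class ImmobilityMonitor:
    """Tracks how long each robot has been unable to move; if several
    robots are immobile past the threshold at once, a deadlock check
    is triggered (an illustrative sketch)."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.stuck_since = {}   # robot id -> time it first stopped moving

    def update(self, robot_id, moved, now):
        if moved:
            self.stuck_since.pop(robot_id, None)   # progress: clear record
        else:
            self.stuck_since.setdefault(robot_id, now)

    def suspects(self, now):
        """Robots immobile for at least the threshold time."""
        return [r for r, t0 in self.stuck_since.items()
                if now - t0 >= self.threshold]

    def deadlock_suspected(self, now):
        """Two or more long-immobile robots warrant a deadlock check."""
        return len(self.suspects(now)) >= 2
```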
• If the direction of robot traffic in a certain area is to be regulated in accordance with the traffic volume, the number of robots in that area may be monitored, and direction-restricted traffic may be implemented when the number of robots exceeds a certain value.
• The continuous running of a large number of robots in the direction-regulated passage can be controlled by switching from wide-area operation control to track control at the entrance and giving each robot a trajectory that keeps the distance between robots at or above a certain level. Each robot is switched back to wide-area operation control at the exit of the direction-regulated passage. When a robot outside the direction-regulated passage plans a route to its goal, the robots inside the regulated passage can be ignored, which prevents them from becoming obstacles to the movement.
• The goal, the evacuation point, and the relay point are, of course, set at locations where robots located there do not prevent the movement of other robots.
• As shown in FIG. 31, the autonomous robots 202₁ to 202₆ are divided into groups A to C. Group C consists of a single moving body, 202₆; a group may thus be composed of one moving body. The movement control of the moving bodies described above is performed in units of these groups.
• That is, for each group, routes are generated for the moving bodies belonging to that group to reach their goals; the moving bodies in a group generate their routes together. In the earlier example, the route of one moving body was generated while the other moving bodies were stationary, whereas in this example the routes of the moving bodies belonging to one group are generated while the other groups are set in a stationary state.
• FIG. 33 shows an example without grouping. In this case, while the robot 202₁ passes through the narrow space (passage), the robot 202₂ is stationary. Therefore, the time until the robot 202₂ arrives at the goal G₂ becomes longer.
• According to the invention, routes for traveling from the current positions specified by the position specifying means to the respective goals are set by the route setting means, and travel control of each moving body is performed by the per-moving-body travel control means so that each moving body moves from its current position toward its goal by a predetermined unit along its route; thereafter, the route is set again by the route setting means, and the control of moving toward the goal by a predetermined unit is repeated until the final goal is reached.
• Since marks are attached in advance both to the moving bodies that move by themselves in the space and to the movable bodies that can be moved by force applied from outside, the environment-side imaging means can recognize these objects without the need for sophisticated recognition technology for recognizing the individual moving and movable bodies. This also makes it possible to identify whether an object is a moving body that moves on its own or a movable body that can be moved by force applied from outside; in the case of a moving body that moves on its own, even if it blocks the path of one's own moving body, the blockage may be only temporary, and route planning can take this into account.
• Further, since the routes of the moving bodies moving in the space are set using the environment-side imaging means, it is easy to adjust the movement of these moving bodies, and it is much easier to control the movement of a plurality of moving bodies in the same space than in the case where each moving body controls its travel using only its own camera.
• In this invention, the route setting means may handle a plurality of moving bodies whose paths intersect or run closely parallel. When these moving bodies cannot pass a predetermined location without colliding, there is a possibility of deadlocks in which the moving bodies cannot move because of each other's presence in relatively narrow passages. In some cases, restrictions on the environment can eliminate such risks and shorten the time to reach each goal.
  • The invention described in claim 4 shows one form of the restriction passage: at least one of the moving bodies at risk of collision temporarily retreats to an evacuation area to avoid colliding with another moving body.
  • The route setting means divides the space in which a plurality of moving bodies can travel simultaneously into a plurality of small areas, and a checkpoint for checking a moving body passing from one small area to the next is placed where the divided small areas connect. Within a small area a moving body is not affected by the behavior of moving bodies in other areas, so not only can control be simplified, but the time required for traveling control up to the checkpoint can also be shortened.
  • The marks emit infrared light and the environment-side imaging means is sensitive to infrared light, so marks that are unobtrusive to humans can be used, realizing a human-friendly environment in a space where humans and moving objects coexist.
  • In the regulating passage, at least within the range where the moving paths of a plurality of moving bodies can be regarded as common, the bodies move at a constant speed, at constant intervals, in a predetermined common direction. As when riding an escalator, both moving bodies can therefore be moved efficiently without stopping.
  • Each moving body that has reached an intersection of the moving paths of a plurality of moving bodies is sent out in its desired direction. When the intersection consists of a ring-shaped passage and a plurality of radial passages connected to it, and a moving body rotating means for switching the moving direction is provided, a plurality of moving bodies can share the intersection: they can be moved efficiently around the ring-shaped passage and sent out in the desired direction through one of the radial passages.
  • The space in which the moving bodies move is mapped to a layout space divided into cells. Since the route setting means sets routes in units of cells, planning each moving body's route is easier than setting routes with fine coordinates.
  • In the moving body movement control system according to the first or second aspect, each moving body includes an on-board sensor for detecting surrounding obstacles and a correction route setting unit. When the sensor detects an avoidable obstacle on the route set by the route setting unit, the correction route setting unit independently sets a correction route for avoiding it. Even when an obstacle that should not exist appears on the route set from the environment-side images, the moving body can set a correction route on its own and minimize the disturbance. This makes it possible to cope with obstacles that the environment-side imaging means could not detect, and to respond flexibly when an obstacle such as a person suddenly moves into and blocks the route after it has been set.
  • When the correction route setting means sets a correction route, the result of the correction is reported to the route setting means. It can therefore serve not only as a reference when creating future routes for that moving body, but also as a reference for setting the traveling routes of other moving bodies that travel along the corrected route.
  • When an obstacle cannot be avoided, an instruction to remove it is issued by the obstacle removal instructing means. For example, if a moving object cannot move because a person has placed a chair on its path, the chair can be removed from the route to ensure movement.
  • Part or all of the environment-side imaging means is a stereo imaging means that outputs image data from which the three-dimensional position of each mark can be measured, so a three-dimensional path around an obstacle can be set even when, for example, the moving object travels under a desk to perform cleaning.
  • The traveling control means for each moving body sequentially issues commands for controlling the movement of each moving body based on the positions of the objects specified by the environment-side position specifying means. Each moving body only has to receive these commands and drive its own mechanism, and is relieved of the complicated control needed for movement. The control circuit can therefore be greatly simplified, and high-precision movement is possible even with a small moving body.
  • The environment-side imaging means, object specifying means, and position specifying means on the environment side can grasp the position of a specific moving body each time during or after its movement, and the feedback control means feeds the position during or after movement back to the movement instructing means. High-precision movement is therefore possible without the moving body carrying imaging means of its own.
  • In the moving object movement control system according to claim 19, the term "moving object" in any one of claims 1 to 18 includes "a group having a plurality of moving objects". This has the effect that efficient route generation is possible.
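The plan / move-a-predetermined-unit / re-plan cycle described above can be sketched in a few lines of Python. This is a minimal illustration, not the patent's implementation: the `Body` class, the greedy Manhattan planner, and `control_cycle` are all invented stand-ins for the position specifying, route setting, and traveling control means.

```python
from dataclasses import dataclass, field

@dataclass
class Body:
    pos: tuple            # current position, as observed by the environment side
    goal: tuple           # final goal
    trace: list = field(default_factory=list)

def plan_route(start, goal):
    """Greedy Manhattan route on an open grid (stand-in for the route setting means)."""
    route, (x, y) = [], start
    while (x, y) != goal:
        if x != goal[0]:
            x += 1 if goal[0] > x else -1
        else:
            y += 1 if goal[1] > y else -1
        route.append((x, y))
    return route

def control_cycle(bodies, step_unit=1):
    """Advance each body a predetermined unit, then re-plan, until all goals are reached."""
    while any(b.pos != b.goal for b in bodies):
        for b in bodies:
            if b.pos == b.goal:
                continue
            route = plan_route(b.pos, b.goal)   # route is set again from the current position
            for cell in route[:step_unit]:      # move only a predetermined unit per cycle
                b.pos = cell
                b.trace.append(cell)

bodies = [Body((0, 0), (2, 1)), Body((3, 0), (0, 0))]
control_cycle(bodies)
# both bodies end at their goals after repeated unit moves and re-plans
```

Re-planning after every unit move is what lets the system react to newly observed positions of other bodies before committing to the rest of a route.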
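The cell-unit route setting described above (mapping the space to a layout space of cells) admits simple graph search. The sketch below uses breadth-first search on an occupancy grid; the grid encoding and function name are illustrative assumptions, not taken from the patent.

```python
from collections import deque

def bfs_route(grid, start, goal):
    """Shortest cell-unit route on an occupancy grid (0 = free, 1 = occupied).
    Returns the list of cells after `start`, or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cur = queue.popleft()
        if cur == goal:
            route = []
            while cur != start:          # walk predecessors back to the start
                route.append(cur)
                cur = prev[cur]
            return route[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nxt
            if 0 <= nx < rows and 0 <= ny < cols and grid[nx][ny] == 0 and nxt not in prev:
                prev[nxt] = cur
                queue.append(nxt)
    return None

grid = [[0, 0, 0],
        [1, 1, 0],     # a wall of occupied cells with one opening
        [0, 0, 0]]
route = bfs_route(grid, (0, 0), (2, 0))
```

Planning in whole cells keeps the search space small and sidesteps the fine-coordinate geometry the bullet above contrasts it with.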
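The checkpoint between small areas described above behaves like mutual exclusion: a body may cross into the next area only if no other body currently holds the checkpoint. A minimal sketch, with the `Checkpoint` class and its method names invented for illustration:

```python
class Checkpoint:
    """Boundary between two small areas; at most one body may hold it at a time."""
    def __init__(self):
        self.holder = None

    def try_acquire(self, body_id):
        """Grant passage if the checkpoint is free (or already held by this body)."""
        if self.holder is None:
            self.holder = body_id
        return self.holder == body_id

    def release(self, body_id):
        """Free the checkpoint once the body has fully entered the next area."""
        if self.holder == body_id:
            self.holder = None

cp = Checkpoint()
granted_1 = cp.try_acquire("robot-1")   # first body may cross
granted_2 = cp.try_acquire("robot-2")   # second body must wait at the boundary
cp.release("robot-1")
granted_3 = cp.try_acquire("robot-2")   # now the second body may cross
```

Because interactions are confined to these boundary points, travel inside a small area needs no knowledge of bodies in other areas, which is the simplification the bullet claims.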
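The correction-route behavior described above (an on-board sensor finds an obstacle on the set route, the body detours locally, and the correction is reported back to the route setting means) can be sketched as follows. The `correct_route` function, the blocked-cell set, and the detour callback are hypothetical names for illustration only.

```python
def correct_route(route, blocked, detour_for):
    """Replace each blocked cell with a local detour and collect a correction report
    for the route setting means (stand-in for the correction route setting unit)."""
    corrected, report = [], []
    for cell in route:
        if cell in blocked:
            detour = detour_for(cell)            # local avoidance chosen by the body
            corrected.extend(detour)
            report.append((cell, detour))        # what was replaced, and with what
        else:
            corrected.append(cell)
    return corrected, report

route = [(0, 0), (1, 0), (2, 0)]
corrected, report = correct_route(
    route,
    blocked={(1, 0)},                            # obstacle seen by the on-board sensor
    detour_for=lambda cell: [(1, 1), (2, 1)],    # step around the blocked cell
)
# corrected == [(0, 0), (1, 1), (2, 1), (2, 0)]
```

The returned `report` models the notification to the route setting means, so the correction can inform future routes for this body and for others that pass the same spot.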

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The invention concerns a mobile body movement control system capable of flexibly controlling the movements of a plurality of mobile bodies with a small amount of computation. A plurality of free autonomous robots (2021, ..., 202N) move on the floor of a space (201) such as a room, a plurality of television cameras (2031-203M) are fixed to the ceiling, and desks (207) and chairs (218) represent movable objects located in the space. An environment analysis computer (205) processes the images from the respective television cameras (203) and identifies robot identification marks (212) attached to the respective autonomous robots (202) and object identification marks (213) in order to specify them and detect their positions. The system also determines the traveling routes of the respective autonomous robots and repeats route control operations after each travel of a specified distance in order to avoid collisions and allow escape from deadlocks. In addition, various traffic controls ensure smooth movement.
PCT/JP2001/007878 2000-09-11 2001-09-11 Systeme de commande de mouvement de corps mobiles WO2002023297A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2002527882A JPWO2002023297A1 (ja) 2000-09-11 2001-09-11 移動体移動制御システム
AU2001284520A AU2001284520A1 (en) 2000-09-11 2001-09-11 Mobile body movement control system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2000275319 2000-09-11
JP2000-275319 2000-09-11

Publications (1)

Publication Number Publication Date
WO2002023297A1 true WO2002023297A1 (fr) 2002-03-21

Family

ID=18760962

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2001/007878 WO2002023297A1 (fr) 2000-09-11 2001-09-11 Systeme de commande de mouvement de corps mobiles

Country Status (3)

Country Link
JP (1) JPWO2002023297A1 (fr)
AU (1) AU2001284520A1 (fr)
WO (1) WO2002023297A1 (fr)

Cited By (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004133882A (ja) * 2002-05-10 2004-04-30 Royal Appliance Mfg Co 自律性マルチプラットフォーム・ロボットシステム
JP2007148527A (ja) * 2005-11-24 2007-06-14 Denso Wave Inc ロボットの干渉回避方法およびロボット
JP2007196301A (ja) * 2006-01-24 2007-08-09 Denso Corp 画像を用いた自動運転装置及び自動運転方法
JP2007237334A (ja) * 2006-03-08 2007-09-20 Toyota Motor Corp ロボットハンドによる把持制御方法
JP2008134744A (ja) * 2006-11-27 2008-06-12 Matsushita Electric Works Ltd 自律移動装置群制御システム
JP2009116790A (ja) * 2007-11-09 2009-05-28 Nissan Motor Co Ltd 車両用運転支援装置および運転支援方法
JP2009140251A (ja) * 2007-12-06 2009-06-25 Mori Seiki Co Ltd 干渉確認装置
JP2009184059A (ja) * 2008-02-05 2009-08-20 Nec Corp ランドマーク検出装置および方法ならびにプログラム
KR20100099489A (ko) * 2009-03-03 2010-09-13 삼성전자주식회사 충돌 예측 기반 주행 제어장치 및 그 방법
JP2010231698A (ja) * 2009-03-30 2010-10-14 Advanced Telecommunication Research Institute International ネットワークロボットシステム、ロボット制御装置、ロボット制御方法およびロボット制御プログラム
US8000837B2 (en) 2004-10-05 2011-08-16 J&L Group International, Llc Programmable load forming system, components thereof, and methods of use
JP2011221631A (ja) * 2010-04-06 2011-11-04 Advanced Telecommunication Research Institute International ロボット自己位置同定システム
WO2012008084A1 (fr) * 2010-07-13 2012-01-19 村田機械株式会社 Corps de locomotion autonome
JP2014145629A (ja) * 2013-01-28 2014-08-14 Tohoku Electric Power Co Inc 三次元表示可能な地中レーダシステム
KR101480774B1 (ko) * 2013-09-30 2015-01-13 전자부품연구원 Cctv를 이용한 이동로봇의 위치 인식 장치 및 방법
WO2015068229A1 (fr) * 2013-11-07 2015-05-14 富士機械製造株式会社 Système de pilotage automatique et machine de transport automatique
JP2016024630A (ja) * 2014-07-18 2016-02-08 シャダイ株式会社 移動体プラットフォームシステム
WO2016067467A1 (fr) * 2014-10-31 2016-05-06 三菱電機株式会社 Dispositif de commande de robot, système de robot, procédé de commande de robot, et programme
JP2016206876A (ja) * 2015-04-21 2016-12-08 Cyberdyne株式会社 自律移動体の走行経路教示システムおよび走行経路教示方法
JP2016224992A (ja) * 2016-10-03 2016-12-28 シャダイ株式会社 移動体プラットフォームシステム
JP2017117353A (ja) * 2015-12-25 2017-06-29 シャダイ株式会社 移動体プラットフォームシステム
JP2018116359A (ja) * 2017-01-16 2018-07-26 本田技研工業株式会社 自律移動型ロボット運行管理システム
EP3499334A1 (fr) * 2017-12-18 2019-06-19 The Boeing Company Système de trajet sûr à capteurs multiples pour véhicules autonomes
KR20190082674A (ko) * 2017-12-31 2019-07-10 사르코스 코퍼레이션 로봇이 볼 수 있는 비밀 식별 태그 및 로봇 장치
CN110235079A (zh) * 2017-01-27 2019-09-13 威欧.艾姆伊有限公司 滚动装置和利用集成式滚动装置使设备自主重新定位的方法
WO2019183859A1 (fr) * 2018-03-28 2019-10-03 Abb Schweiz Ag Procédé et dispositif pour commande de robot
CN110609540A (zh) * 2018-06-15 2019-12-24 丰田自动车株式会社 自主移动体和用于自主移动体的控制程序
CN110632933A (zh) * 2019-10-18 2019-12-31 鱼越号机器人科技(上海)有限公司 一种路径移动方法、机器人、计算机可读存储介质
JP6644212B1 (ja) * 2019-05-30 2020-02-12 三菱電機株式会社 作業移動システム
WO2020060091A1 (fr) * 2018-09-20 2020-03-26 삼성전자주식회사 Robot de nettoyage et procédé de réalisation de tâche associé
CN111065981A (zh) * 2017-09-25 2020-04-24 日本电产新宝株式会社 移动体和移动体系统
WO2020105189A1 (fr) * 2018-11-22 2020-05-28 日本電気株式会社 Dispositif de planification d'itinéraire, procédé de planification d'itinéraire, et support d'enregistrement lisible par ordinateur
JP2020099953A (ja) * 2018-12-20 2020-07-02 前田建設工業株式会社 椅子の自動配置システム
WO2020217976A1 (fr) * 2019-04-23 2020-10-29 日本電気株式会社 Système de commande de robot, dispositif de gestion, robot mobile, procédé de commande de robot et programme
CN112214013A (zh) * 2020-08-07 2021-01-12 上海海得控制系统股份有限公司 直线往复式多rgv死锁避免和冲突实时控制方法、系统、介质、终端
KR20210033147A (ko) * 2019-09-18 2021-03-26 정동화 물류 이송 로봇 시스템
CN112650227A (zh) * 2020-12-14 2021-04-13 锐捷网络股份有限公司 一种自动导引车agv的调度方法、装置、设备及介质
CN113075923A (zh) * 2019-12-18 2021-07-06 财团法人工业技术研究院 移动载具及其状态估测与感测融合切换方法
CN113985880A (zh) * 2021-10-29 2022-01-28 深圳优地科技有限公司 多机器人路径规划方法、多机器人系统及机器人
CN114407929A (zh) * 2022-01-29 2022-04-29 上海木蚁机器人科技有限公司 无人驾驶绕障处理方法、装置、电子设备及存储介质
WO2022153923A1 (fr) * 2021-01-15 2022-07-21 川崎重工業株式会社 Système de robot et procédé de commande de robot
JP2022535686A (ja) * 2019-05-31 2022-08-10 フラバ ベー.フェー. 少なくとも1つの車両を自動制御するための車両制御アセンブリおよびその制御方法
WO2022196909A1 (fr) * 2021-03-19 2022-09-22 네이버랩스 주식회사 Procédé et système de commande à distance de robots, et bâtiment ayant des robots mobiles répondant de manière flexible à des obstacles
KR102462500B1 (ko) * 2021-05-06 2022-11-03 네이버랩스 주식회사 건물의 협소 구역을 자율주행하는 멀티 로봇이 배치되는 건물
WO2022249486A1 (fr) * 2021-05-28 2022-12-01 株式会社やまびこ Système de traitement d'informations, machine de travail et programme
US11590970B2 (en) 2018-11-06 2023-02-28 Kabushiki Kaisha Toshiba Deadlock detection device, information processing device, deadlock detection method, and non-transitory computer readable medium
IT202100024962A1 (it) * 2021-09-29 2023-03-29 Centro Di Ricerca Sviluppo E Studi Superiori In Sardegna Crs4 Srl Uninominale Sistema elettronico di controllo della navigazione indoor di uno o più robot
DE112021004289T5 (de) 2020-11-27 2023-09-07 Hitachi, Ltd. Bewegungssteuerungs-unterstützungsvorrichtung und -verfahren

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS59106634A (ja) * 1982-12-11 1984-06-20 Caterpillar Mitsubishi Ltd 建設機械の運行制御システム
EP0221643A2 (fr) * 1985-08-30 1987-05-13 Texas Instruments Incorporated Système de navigation à télévision pour un robot mobile rôdant
EP0367527A2 (fr) * 1988-10-31 1990-05-09 Texas Instruments Incorporated Méthode de contrôle du mouvement d'un robot mobile dans une usine à repères multiples
JPH08123863A (ja) * 1994-10-25 1996-05-17 Mitsubishi Heavy Ind Ltd 工程管理ルール設計装置
JPH09185412A (ja) * 1995-12-28 1997-07-15 Yaskawa Electric Corp 自律移動装置
JPH09230933A (ja) * 1996-02-27 1997-09-05 Mitsubishi Electric Corp 自動搬送装置
JPH10275017A (ja) * 1997-03-31 1998-10-13 Hitachi Ltd 搬送制御方法および搬送装置
JPH10312217A (ja) * 1997-05-12 1998-11-24 Shinko Electric Co Ltd 運行管理制御装置および運行管理制御方法
JPH11242520A (ja) * 1997-12-08 1999-09-07 Caterpillar Inc 障害物検出応答式代替通路決定方法及びその装置
JPH11259131A (ja) * 1998-03-13 1999-09-24 Nippon Steel Corp 無人搬送台車の干渉防止制御システムおよび方法、記録媒体


Cited By (78)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004133882A (ja) * 2002-05-10 2004-04-30 Royal Appliance Mfg Co 自律性マルチプラットフォーム・ロボットシステム
US8000837B2 (en) 2004-10-05 2011-08-16 J&L Group International, Llc Programmable load forming system, components thereof, and methods of use
US7765031B2 (en) 2005-11-24 2010-07-27 Denso Wave Incorporated Robot and multi-robot interference avoidance method
JP2007148527A (ja) * 2005-11-24 2007-06-14 Denso Wave Inc ロボットの干渉回避方法およびロボット
JP4544145B2 (ja) * 2005-11-24 2010-09-15 株式会社デンソーウェーブ ロボットの干渉回避方法およびロボット
JP2007196301A (ja) * 2006-01-24 2007-08-09 Denso Corp 画像を用いた自動運転装置及び自動運転方法
JP2007237334A (ja) * 2006-03-08 2007-09-20 Toyota Motor Corp ロボットハンドによる把持制御方法
JP2008134744A (ja) * 2006-11-27 2008-06-12 Matsushita Electric Works Ltd 自律移動装置群制御システム
JP2009116790A (ja) * 2007-11-09 2009-05-28 Nissan Motor Co Ltd 車両用運転支援装置および運転支援方法
JP2009140251A (ja) * 2007-12-06 2009-06-25 Mori Seiki Co Ltd 干渉確認装置
JP2009184059A (ja) * 2008-02-05 2009-08-20 Nec Corp ランドマーク検出装置および方法ならびにプログラム
KR20100099489A (ko) * 2009-03-03 2010-09-13 삼성전자주식회사 충돌 예측 기반 주행 제어장치 및 그 방법
KR101633890B1 (ko) * 2009-03-03 2016-06-28 삼성전자주식회사 충돌 예측 기반 주행 제어장치 및 그 방법
JP2010231698A (ja) * 2009-03-30 2010-10-14 Advanced Telecommunication Research Institute International ネットワークロボットシステム、ロボット制御装置、ロボット制御方法およびロボット制御プログラム
JP2011221631A (ja) * 2010-04-06 2011-11-04 Advanced Telecommunication Research Institute International ロボット自己位置同定システム
WO2012008084A1 (fr) * 2010-07-13 2012-01-19 村田機械株式会社 Corps de locomotion autonome
JP2012022467A (ja) * 2010-07-13 2012-02-02 Murata Mach Ltd 自律移動体
US9020682B2 (en) 2010-07-13 2015-04-28 Murata Machinery, Ltd. Autonomous mobile body
JP2014145629A (ja) * 2013-01-28 2014-08-14 Tohoku Electric Power Co Inc 三次元表示可能な地中レーダシステム
KR101480774B1 (ko) * 2013-09-30 2015-01-13 전자부품연구원 Cctv를 이용한 이동로봇의 위치 인식 장치 및 방법
CN105706011A (zh) * 2013-11-07 2016-06-22 富士机械制造株式会社 自动运转系统及自动行走机
WO2015068229A1 (fr) * 2013-11-07 2015-05-14 富士機械製造株式会社 Système de pilotage automatique et machine de transport automatique
EP3067770A4 (fr) * 2013-11-07 2017-06-28 Fuji Machine Mfg. Co., Ltd. Système de pilotage automatique et machine de transport automatique
JPWO2015068229A1 (ja) * 2013-11-07 2017-03-09 富士機械製造株式会社 自動運転システム及び自動走行機
JP2016024630A (ja) * 2014-07-18 2016-02-08 シャダイ株式会社 移動体プラットフォームシステム
JPWO2016067467A1 (ja) * 2014-10-31 2017-04-27 三菱電機株式会社 ロボット制御装置、ロボットシステム、ロボット制御方法及びプログラム
WO2016067467A1 (fr) * 2014-10-31 2016-05-06 三菱電機株式会社 Dispositif de commande de robot, système de robot, procédé de commande de robot, et programme
JP2016206876A (ja) * 2015-04-21 2016-12-08 Cyberdyne株式会社 自律移動体の走行経路教示システムおよび走行経路教示方法
JP2017117353A (ja) * 2015-12-25 2017-06-29 シャダイ株式会社 移動体プラットフォームシステム
JP2016224992A (ja) * 2016-10-03 2016-12-28 シャダイ株式会社 移動体プラットフォームシステム
JP2018116359A (ja) * 2017-01-16 2018-07-26 本田技研工業株式会社 自律移動型ロボット運行管理システム
US10649467B2 (en) 2017-01-16 2020-05-12 Honda Motor Co., Ltd. Operation management system for autonomous mobile robots
CN110235079B (zh) * 2017-01-27 2022-09-27 威欧.艾姆伊有限公司 滚动装置和利用集成式滚动装置使设备自主重新定位的方法
CN110235079A (zh) * 2017-01-27 2019-09-13 威欧.艾姆伊有限公司 滚动装置和利用集成式滚动装置使设备自主重新定位的方法
CN111065981A (zh) * 2017-09-25 2020-04-24 日本电产新宝株式会社 移动体和移动体系统
JP2019109879A (ja) * 2017-12-18 2019-07-04 ザ・ボーイング・カンパニーThe Boeing Company 自律車両のためのマルチセンサ安全経路システム
US10394234B2 (en) 2017-12-18 2019-08-27 The Boeing Company Multi-sensor safe path system for autonomous vehicles
CN109933064B (zh) * 2017-12-18 2024-04-30 波音公司 用于自主车辆的多传感器安全路径系统
CN109933064A (zh) * 2017-12-18 2019-06-25 波音公司 用于自主车辆的多传感器安全路径系统
EP3499334A1 (fr) * 2017-12-18 2019-06-19 The Boeing Company Système de trajet sûr à capteurs multiples pour véhicules autonomes
US10942515B2 (en) 2017-12-18 2021-03-09 The Boeing Company Multi-sensor safe path system for autonomous vehicles
KR20190082674A (ko) * 2017-12-31 2019-07-10 사르코스 코퍼레이션 로봇이 볼 수 있는 비밀 식별 태그 및 로봇 장치
US11413755B2 (en) 2017-12-31 2022-08-16 Sarcos Corp. Covert identification tags viewable by robots and robotic devices
KR102244975B1 (ko) * 2017-12-31 2021-04-27 사르코스 코퍼레이션 로봇이 볼 수 있는 비밀 식별 태그 및 로봇 장치
WO2019183859A1 (fr) * 2018-03-28 2019-10-03 Abb Schweiz Ag Procédé et dispositif pour commande de robot
CN110609540A (zh) * 2018-06-15 2019-12-24 丰田自动车株式会社 自主移动体和用于自主移动体的控制程序
WO2020060091A1 (fr) * 2018-09-20 2020-03-26 삼성전자주식회사 Robot de nettoyage et procédé de réalisation de tâche associé
US11590970B2 (en) 2018-11-06 2023-02-28 Kabushiki Kaisha Toshiba Deadlock detection device, information processing device, deadlock detection method, and non-transitory computer readable medium
US11782446B2 (en) 2018-11-22 2023-10-10 Nec Corporation Route planning apparatus, route planning method, and computer-readable recording medium
WO2020105189A1 (fr) * 2018-11-22 2020-05-28 日本電気株式会社 Dispositif de planification d'itinéraire, procédé de planification d'itinéraire, et support d'enregistrement lisible par ordinateur
JP7160110B2 (ja) 2018-11-22 2022-10-25 日本電気株式会社 経路計画装置、経路計画方法、及びプログラム
JPWO2020105189A1 (ja) * 2018-11-22 2021-09-27 日本電気株式会社 経路計画装置、経路計画方法、及びプログラム
JP2020099953A (ja) * 2018-12-20 2020-07-02 前田建設工業株式会社 椅子の自動配置システム
JP7208783B2 (ja) 2018-12-20 2023-01-19 前田建設工業株式会社 椅子の自動配置システム
WO2020217976A1 (fr) * 2019-04-23 2020-10-29 日本電気株式会社 Système de commande de robot, dispositif de gestion, robot mobile, procédé de commande de robot et programme
WO2020240792A1 (fr) * 2019-05-30 2020-12-03 三菱電機株式会社 Robot mobile
JP6644212B1 (ja) * 2019-05-30 2020-02-12 三菱電機株式会社 作業移動システム
JP2022535686A (ja) * 2019-05-31 2022-08-10 フラバ ベー.フェー. 少なくとも1つの車両を自動制御するための車両制御アセンブリおよびその制御方法
KR102274541B1 (ko) * 2019-09-18 2021-07-06 정동화 물류 이송 로봇 시스템
KR20210033147A (ko) * 2019-09-18 2021-03-26 정동화 물류 이송 로봇 시스템
CN110632933B (zh) * 2019-10-18 2022-05-20 鱼越号机器人科技(上海)有限公司 一种路径移动方法、机器人、计算机可读存储介质
CN110632933A (zh) * 2019-10-18 2019-12-31 鱼越号机器人科技(上海)有限公司 一种路径移动方法、机器人、计算机可读存储介质
CN113075923B (zh) * 2019-12-18 2024-04-12 财团法人工业技术研究院 移动载具及其状态估测与感测融合切换方法
CN113075923A (zh) * 2019-12-18 2021-07-06 财团法人工业技术研究院 移动载具及其状态估测与感测融合切换方法
CN112214013A (zh) * 2020-08-07 2021-01-12 上海海得控制系统股份有限公司 直线往复式多rgv死锁避免和冲突实时控制方法、系统、介质、终端
DE112021004289T5 (de) 2020-11-27 2023-09-07 Hitachi, Ltd. Bewegungssteuerungs-unterstützungsvorrichtung und -verfahren
CN112650227A (zh) * 2020-12-14 2021-04-13 锐捷网络股份有限公司 一种自动导引车agv的调度方法、装置、设备及介质
WO2022153923A1 (fr) * 2021-01-15 2022-07-21 川崎重工業株式会社 Système de robot et procédé de commande de robot
WO2022196909A1 (fr) * 2021-03-19 2022-09-22 네이버랩스 주식회사 Procédé et système de commande à distance de robots, et bâtiment ayant des robots mobiles répondant de manière flexible à des obstacles
KR102462500B1 (ko) * 2021-05-06 2022-11-03 네이버랩스 주식회사 건물의 협소 구역을 자율주행하는 멀티 로봇이 배치되는 건물
WO2022234925A1 (fr) * 2021-05-06 2022-11-10 네이버랩스 주식회사 Procédé et système de commande d'une pluralité de robots se déplaçant à travers une région désignée, et bâtiment dans lequel des robots sont disposés
KR102462491B1 (ko) * 2021-05-06 2022-11-03 네이버랩스 주식회사 지정 구역을 주행하는 다수의 로봇들을 제어하는 방법 및 시스템
WO2022249486A1 (fr) * 2021-05-28 2022-12-01 株式会社やまびこ Système de traitement d'informations, machine de travail et programme
IT202100024962A1 (it) * 2021-09-29 2023-03-29 Centro Di Ricerca Sviluppo E Studi Superiori In Sardegna Crs4 Srl Uninominale Sistema elettronico di controllo della navigazione indoor di uno o più robot
WO2023053048A1 (fr) * 2021-09-29 2023-04-06 CENTRO DI RICERCA, SVILUPPO E STUDI SUPERIORI IN SARDEGNA - CRS4 Srl Uninominale Système électronique de commande de navigation en intérieur d'un ou plusieurs robots
CN113985880A (zh) * 2021-10-29 2022-01-28 深圳优地科技有限公司 多机器人路径规划方法、多机器人系统及机器人
CN114407929B (zh) * 2022-01-29 2023-12-12 上海木蚁机器人科技有限公司 无人驾驶绕障处理方法、装置、电子设备及存储介质
CN114407929A (zh) * 2022-01-29 2022-04-29 上海木蚁机器人科技有限公司 无人驾驶绕障处理方法、装置、电子设备及存储介质

Also Published As

Publication number Publication date
AU2001284520A1 (en) 2002-03-26
JPWO2002023297A1 (ja) 2004-01-22

Similar Documents

Publication Publication Date Title
WO2002023297A1 (fr) Systeme de commande de mouvement de corps mobiles
US11673269B2 (en) Method of identifying dynamic obstacle and robot implementing same
JP7352318B2 (ja) ロボット汎用の充電台帰還充電制御方法、チップ及びロボット
US7539557B2 (en) Autonomous mobile robot
KR102090590B1 (ko) 장애물 회피에 기반하여 경유 지점을 주행하는 로봇 및 주행하는 방법
JP5747191B2 (ja) 移動体遠隔操縦システムおよびそのための制御プログラム
Harapanahalli et al. Autonomous Navigation of mobile robots in factory environment
KR20210068446A (ko) 지형 인식 스텝 플래닝 시스템
KR20140039275A (ko) 자율 이동 장치 및 그 제어 방법
US11768499B2 (en) Method for generating intersection point pattern recognition model using sensor data of mobile robot and intersection point pattern recognition system
CN114072255A (zh) 移动机器人传感器配置
US20220291685A1 (en) Method and system to improve autonomous robotic systems responsive behavior
CN114740849B (zh) 基于行人步行决策规则的移动机器人自主导航方法及装置
CN113219999B (zh) 一种机器人自动回充路径规划方法及系统
KR20210026595A (ko) 로봇이 관리자 모드로 이동하는 방법 및 이를 구현하는 로봇
JP7061474B2 (ja) 走行装置
Lidoris et al. The autonomous city explorer project: Aims and system overview
CN115655261B (zh) 地图生成方法、装置、机器人以及存储介质
JP7317436B2 (ja) ロボット、ロボット制御プログラムおよびロボット制御方法
US20170325400A1 (en) Method for navigation and joint coordination of automated devices
Negishi et al. Adaptive robot speed control by considering map and localization uncertainty
US20240316762A1 (en) Environmental feature-specific actions for robot navigation
CN115790606B (zh) 轨迹预测方法、装置、机器人及存储介质
Jiang et al. Design of a universal self-driving system for urban scenarios—BIT-III in the 2011 Intelligent Vehicle Future Challenge
Gupta Autonomous robots and agents

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PH PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase