WO2022190514A1 - Conveyance system - Google Patents

Conveyance system

Info

Publication number
WO2022190514A1
Authority
WO
WIPO (PCT)
Prior art keywords
transport robot
transport
information
image information
distance image
Prior art date
Application number
PCT/JP2021/046856
Other languages
French (fr)
Japanese (ja)
Inventor
Yuji Kawamoto
Takahiro Inoue
Original Assignee
OMRON Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by OMRON Corporation
Publication of WO2022190514A1 publication Critical patent/WO2022190514A1/en

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions

Definitions

  • The present invention relates to a transport system equipped with a transport robot.
  • Self-propelled transport robots such as Automated Guided Vehicles (AGV) and Automated Guided Forklifts (AGF) have been proposed for use in factories and warehouses.
  • A plurality of such transport robots can be controlled by a control device through wireless communication to constitute a transport system that realizes automation of transport in a factory or the like.
  • A transport robot that grasps its own position based on distance information obtained from distance sensors such as LiDAR (Light Detection and Ranging) sensors and stereo cameras is also under consideration.
  • Such a transport robot autonomously determines a travel route without installing a guide indicating a predetermined travel route on the floor of a factory, warehouse, or the like.
  • The present invention has been made in view of such circumstances, and aims to improve the accuracy with which the position of a transport robot is grasped in a transport system that employs transport robots that autonomously determine their travel routes.
  • In order to solve the above problem, a transport system according to one aspect of the present invention includes a plurality of self-propelled transport robots and a control device that communicates with the plurality of transport robots. Each transport robot includes: a distance image information acquisition unit that detects the distance to surrounding objects and the signal strength from those objects and acquires them as distance image information; a self-position calculation unit that calculates self-position information regarding the robot's position, including its orientation; and an outer periphery position specifying unit added to the outer periphery of the transport robot.
  • The control device includes: a state acquisition unit that acquires the self-position information and the distance image information from each of the plurality of transport robots; an estimated position calculation unit that calculates estimated position information regarding the position, including orientation, of a given transport robot using the position of that robot included in its self-position information, the distance image information acquired from the other transport robots, and the information about the signal strength reflecting the outer periphery position specifying unit of that robot; and an estimated position transmission unit that notifies each transport robot of its estimated position.
  • According to one aspect of the present invention, the accuracy of grasping the position of the transport robot is enhanced.
  • FIG. 1 is a block diagram showing the configuration of the main parts of a transport system according to an embodiment of the present invention.
  • FIG. 2 is a diagram showing an example of the external shape of the transport robot of the transport system according to the embodiment of the present invention.
  • FIG. 3 is a floor map schematically showing an example of a factory to which the transport system according to the embodiment of the present invention is applied.
  • FIG. 4 is a diagram for explaining a method by which a transport robot detects surrounding objects using a distance sensor.
  • FIG. 5 is a graph showing the distance image information acquired and output by the distance sensor of the transport robot in the example shown in FIG. 4.
  • FIG. 6 is a diagram for explaining the same detection method in a situation where obstacles exist within the monitored area.
  • FIG. 7 is a diagram showing the situation of FIG. 4 when another transport robot is present within the monitoring range of the transport robot.
  • FIG. 8 is a graph showing the distance image information acquired and output by the distance sensor of the transport robot in the example shown in FIG. 7.
  • FIG. 9 is an image diagram of the distance image information of the outer periphery position specifying unit.
  • FIG. 10 is a diagram showing a situation in which a specific transport robot exists within the monitoring ranges of a plurality of other transport robots.
  • FIG. 11 is a flowchart showing characteristic operations performed by the transport system according to the embodiment of the present invention.
  • FIG. 12 is a diagram showing an example of the external shape of the transport robot of a transport system according to a modification of the present invention.
  • FIG. 1 is a block diagram showing the configuration of a transport system 1 according to Embodiment 1.
  • The transport system 1 includes a control device 10 and a plurality of self-propelled transport robots 30.
  • The control device 10 communicates with the plurality of transport robots 30 and gives instructions to each of the transport robots 30.
  • The transport robot 30 includes a distance sensor 36 (distance image information acquisition unit) that detects the distance to surrounding objects and the signal strength from those objects and acquires them as distance image information, and a self-position calculation unit 33 that calculates self-position information regarding its position, including orientation. Further, the transport robot 30 has an outer periphery position specifying unit 40 attached to its outer periphery.
  • The control device 10 has a unique state acquisition unit 13 that acquires the self-position information and the distance image information from each transport robot 30.
  • The control device 10 also has an estimated position calculation unit 15 that calculates estimated position information regarding the position of the transport robot 30 using the following information (1) to (3): (1) the position of the transport robot 30 included in the self-position information acquired from that robot; (2) the distance image information acquired from the transport robots 30 other than that robot; and (3) the signal strength information, included in that distance image information, that reflects the outer periphery position specifying unit 40 of each transport robot 30.
  • The control device 10 has an estimated position transmission unit 16 that notifies each transport robot 30 of its estimated position.
  • In the control device 10, the estimated position information, which is information about the position of a self-propelled transport robot 30, is calculated based on the self-position information calculated by that transport robot 30 (hereinafter, the own machine) and on the distance image information from the transport robots other than that robot (hereinafter, other machines).
  • The estimated position information is reported from the control device 10 to the transport robot 30 itself. Therefore, in the transport system 1, the position of the transport robot 30 can be estimated based on estimated position information whose accuracy is higher than that of the self-position information alone. It is thus possible to realize a transport system in which the position estimation accuracy of the self-propelled transport robots is improved.
  • FIG. 2 is a diagram showing an appearance example of the transport robot 30 of the transport system 1 according to this application example.
  • FIG. 3 is a diagram schematically showing a floor map of a factory 100, which is an example of an area, such as a factory or warehouse, to which the transport system 1 according to this application example can be applied.
  • The side of the transport robot 30 on which the distance sensor 36 is arranged is defined as the front of the transport robot 30, the opposite side as the rear, and the direction perpendicular to the front-rear direction as the left-right direction.
  • The transport robot 30 is shown with the outer periphery position specifying unit 40 omitted.
  • In the factory 100, self-propelled transport robots 30 that transport objects such as products, semi-finished products, parts, raw materials, tools, jigs, packaging materials, and the cassettes that store them are deployed. Furthermore, a self-propelled transport robot 30M, in which a robot arm (manipulator) for gripping an object to be transported is mounted on a cart-type transport robot, may be arranged as part of the transport system 1.
  • The transport robot 30M is an example of the transport robot 30 and is included in the transport robot 30 in the following description.
  • The form of the transport robot 30 may be an AGV, an AGF, or another form of self-propelled transport device.
  • a shelf 110 on which objects to be transported can be placed is installed in the factory 100 . Further, in the factory 100, there are also installed production facilities for processing, assembling, treating, inspecting, etc. on the objects to be conveyed or by using the objects to be conveyed.
  • the transport robot 30 of the transport system 1 can transport objects to be transported between these facilities.
  • The position of each transport robot 30 is represented by XY coordinates defined on the floor of the factory 100.
  • The orientation of each transport robot 30 is represented by an angle θ defined with respect to the XY coordinates.
  • Accordingly, the position and orientation of each transport robot 30 can be expressed as (X, Y, θ).
  • In the following description, information about the position of a transport robot means information about its position and orientation (X, Y, θ).
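As a rough, runnable illustration of the (X, Y, θ) pose representation described above (the `Pose` class and all numeric values are invented for this sketch, not part of the patent):

```python
import math
from dataclasses import dataclass

# Sketch of a pose on the factory floor: X, Y in the floor's coordinate
# frame and theta as the heading angle measured against the XY axes.
@dataclass
class Pose:
    x: float      # X coordinate on the floor map
    y: float      # Y coordinate on the floor map
    theta: float  # orientation angle (radians)

    def normalized(self) -> "Pose":
        # Wrap theta into (-pi, pi] so two poses can be compared directly.
        t = math.atan2(math.sin(self.theta), math.cos(self.theta))
        return Pose(self.x, self.y, t)

p = Pose(2.0, 3.0, 3 * math.pi)  # 3*pi is the same heading as pi
print(p.normalized())            # theta comes back as approximately pi
```

Normalizing the angle matters whenever poses from different sources are compared or averaged, as in the position-estimate fusion described later.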
  • The transport system 1 includes a control device 10 and a plurality of self-propelled transport robots 30.
  • The control device 10 is an information processing system responsible for managing transportation, which is sometimes called a transport system server (AMHS server: Automated Material Handling System Server).
  • The control device 10 transmits more specific transport instructions to the transport robots 30 in the transport system 1 based on commands from an upper information processing system or the like.
  • The control device 10 may be any information processing system capable of executing such processing, and does not need to be a device physically housed in one housing.
  • The upper information processing system that manages the production of products in a production factory may be called a manufacturing execution system server (MES server).
  • Likewise, an upper information processing system that manages a warehouse may be called a warehouse management system server (WMS server).
  • The transport robot 30 has functional blocks of an instruction reception unit 31, a travel control unit 32, a self-position calculation unit 33, a unique state notification unit 34, and an estimated position acquisition unit 35.
  • The transport robot 30 also has a distance sensor 36, a slave storage unit 37, a traveling mechanism unit 38, a slave communication unit 39, and an outer periphery position specifying unit 40.
  • the distance sensor 36 is arranged on the front side of the transport robot 30 and monitors the front of the transport robot 30 in the running direction.
  • Here, the distance sensor 36 consists of two LiDARs and can obtain distance image information showing the distance to objects present in the monitoring area and the signal strength from those objects.
  • the distance sensor 36 irradiates light on an object existing within the monitoring area and detects the distance to the object based on the light reflected from the object.
  • the distance image information is not limited to a two-dimensional image, and may be a one-dimensional (straight line) image.
  • The area to be monitored can include a range of up to about 120 degrees to the left and right of straight ahead in the traveling direction of the transport robot 30.
  • An example of such distance image information will be described later. Note that the distance image information is also generally called a distance image.
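The monitoring range described above (up to about 120 degrees left and right of straight ahead) can be sketched as a simple sector test; the function below is an invented illustration, with the distance limit omitted for brevity:

```python
def in_monitoring_sector(bearing_deg, half_angle_deg=120.0):
    """True if a bearing (degrees from straight ahead) falls in the sector."""
    # Wrap the bearing into [-180, 180) before comparing against the limit.
    b = (bearing_deg + 180.0) % 360.0 - 180.0
    return abs(b) <= half_angle_deg

print(in_monitoring_sector(90))    # True: within 120 deg of straight ahead
print(in_monitoring_sector(-130))  # False: outside the sector
print(in_monitoring_sector(350))   # True: wraps to -10 deg
```

A real implementation would also test the measured distance against the sensor's range, but the angular wrap-around is the part that is easy to get wrong.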
  • the number of LiDARs arranged on the transport robot 30 may be singular or plural, and if plural, they may be arranged so that the rear can also be monitored. Further, the type of the distance sensor 36 is not limited to LiDAR, and may be a sensor that acquires a distance image such as a stereo camera or a ToF (Time-of-Flight) camera, or other methods.
  • the slave storage unit 37 is a recording device provided in the transport robot 30 .
  • The slave storage unit 37 stores identification information of the transport robot 30 and map information of the floor of the factory 100, and also holds, as appropriate, various information necessary for the transport robot 30 to travel, its travel history, a control program for the transport robot 30, and the like.
  • the travel mechanism unit 38 is a mechanism unit that operates under the control of the travel control unit 32 and allows the transport robot 30 to travel on the floor surface. In the external view of the transport robot 30 in FIG. 2, wheels 38A that are part of the traveling mechanism section 38 are shown.
  • The slave communication unit 39 is a communication interface for the transport robot 30 to communicate with the control device 10. Since communication with the transport robot 30 through the slave communication unit 39 carries real-time distance image information, high speed, low latency, and support for many simultaneous connections are preferable. Therefore, the slave communication unit 39 preferably performs 5G (5th Generation) communication or Wi-Fi 6 communication (Wi-Fi: registered trademark) with the master communication unit 19 of the control device 10. An antenna 39A that is part of the slave communication unit 39 is shown in the external view of the transport robot 30 in FIG. 2.
  • the outer circumference position specifying part 40 is attached to the outer circumference along the side surface of the transport robot 30 at a position that does not interfere with the irradiation of light from the distance sensor 36 .
  • The outer periphery position specifying unit 40 is added substantially parallel to the upper surface of the transport robot 30, at a position (height) where it is irradiated by the light emitted from the distance sensors 36 of the other transport robots 30.
  • the outer circumference position specifying unit 40 has different light luminance reflectances depending on the position on the outer circumference of the transport robot 30 .
  • the outer circumference position specifying part 40 may be, for example, a reflector.
  • the outer peripheral position specifying unit 40 is added, for example, so that the luminance reflectance increases toward the front of the transport robot 30 (where the distance sensor 36 is installed).
  • The outer periphery position specifying unit 40 includes side front specifying units 41 added to the front of the left and right side surfaces of the transport robot 30, side rear specifying units 42 added to the rear of those side surfaces, and a rear surface specifying unit 43 added to the rear surface of the transport robot 30.
  • Among these, the side front specifying unit 41 has the highest luminance reflectance; the reflectance decreases in the order of the side front specifying unit 41, the side rear specifying unit 42, and the rear surface specifying unit 43.
  • Accordingly, in the distance image information acquired by the distance sensor 36 of another transport robot 30, the signal strength from the side front specifying unit 41 is the highest, and the signal strength decreases in the same order.
  • The signal strength acquired from the rear surface specifying unit 43 is still higher than the signal strength from portions where the outer periphery position specifying unit 40 is not added.
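Since the three marker sections return distinct reflectance levels, a received signal strength can in principle be bucketed into the section that produced it. The threshold values below are invented for illustration; only the ordering (side front 41 > side rear 42 > rear surface 43 > bare body) comes from the text:

```python
# Signal-strength bands, strongest first. The numeric floors are assumed
# example values; a real system would calibrate them to its reflectors.
BANDS = [
    (80, "side_front_41"),
    (50, "side_rear_42"),
    (20, "rear_43"),
    (0,  "body"),
]

def classify(strength):
    """Map a returned signal strength to the marker section that produced it."""
    for floor_value, label in BANDS:
        if strength >= floor_value:
            return label
    return "body"

print(classify(90))  # side_front_41
print(classify(60))  # side_rear_42
print(classify(30))  # rear_43
print(classify(5))   # body
```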
  • the instruction reception unit 31 is a functional block that receives instructions from the control device 10 via the slave communication unit 39 .
  • the travel control unit 32 is a functional block that controls the travel mechanism unit 38 and causes the transport robot 30 to travel.
  • the travel control unit 32 also calculates odometry data based on the operation information of each mechanism from the travel mechanism unit 38, specifically, the rotary encoder output of the motor and the like.
  • the odometry data is information that indicates the relative position of the transport robot 30 during or after running with respect to the position of the transport robot 30 at a certain point in time.
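As a minimal sketch of how odometry data of this kind can be produced, the function below performs a standard differential-drive update from left/right wheel travel distances (as derived from rotary-encoder counts); the wheel track value is an assumed parameter, not taken from the patent:

```python
import math

def odometry_step(x, y, theta, d_left, d_right, track=0.5):
    """Advance a pose by the distances traveled by the left/right wheels.

    track: distance between the wheels in meters (assumed value).
    """
    d_center = (d_left + d_right) / 2.0   # forward travel of the body
    d_theta = (d_right - d_left) / track  # change in heading
    # Integrate along the average heading over the step.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta

# Drive straight 1 m, then pivot a quarter turn in place.
pose = (0.0, 0.0, 0.0)
pose = odometry_step(*pose, d_left=1.0, d_right=1.0)
pose = odometry_step(*pose, d_left=-math.pi / 8, d_right=math.pi / 8)
print(pose)  # roughly (1.0, 0.0, pi/2)
```

Because each step integrates wheel motion, small errors accumulate over time, which is exactly why the self-position calculation unit 33 corrects the odometry estimate against the map, as described next.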
  • The self-position calculation unit 33 is a functional block that calculates the approximate position of the transport robot 30 from the odometry data and compares the distance image information with the map information around that approximate position to calculate self-position information regarding the position of the transport robot 30, including its orientation.
  • The unique state notification unit 34 is a functional block that notifies the control device 10 of the unique information of the transport robot 30 via the slave communication unit 39.
  • the estimated position acquisition unit 35 is a functional block that acquires estimated position information from the control device 10 via the slave communication unit 39 . The estimated position information will be described later.
  • the unique information refers to information about the unique state related to the individual transport robot 30 itself, and includes distance image information and self-position information. Furthermore, for example, it may include information related to the operation of the transport robot 30, such as the state of operation of the transport robot 30, the state of loading of objects to be transported, the remaining battery level, and other internal states.
  • the transport robot 30 performs the required transport in the following manner in accordance with instructions from the control device 10.
  • the instruction receiving unit 31 receives a transport instruction from the control device 10 via the slave communication unit 39 .
  • the travel control unit 32 controls the travel mechanism unit 38 based on the information about the self position and the position information of the transport destination included in the transport instruction, and causes the transport robot 30 to travel to the transport destination.
  • the self-position calculator 33 continues to update the self-position information.
  • The control device 10 has functional blocks of an upper-level command receiving unit 11, an instruction issuing unit 12, a unique state acquisition unit 13, a positional relationship calculation unit 14, an estimated position calculation unit 15, and an estimated position transmission unit 16.
  • the control device 10 also has a master storage unit 17 , an upper communication unit 18 and a master communication unit 19 .
  • the master storage unit 17 is a recording device provided in the control device 10 .
  • the master storage unit 17 holds the map information of the floor of the factory 100, the control program of the control device 10, and also the unique information of each transfer robot 30, the operation log, and the like as appropriate.
  • the host communication unit 18 is a communication interface for the control device 10 to communicate with the host information processing system.
  • the master communication unit 19 is a communication interface for the control device 10 to communicate with the transport robot 30 .
  • the higher-level command receiving unit 11 is a functional block that receives commands from the higher-level information processing system via the higher-level communication unit 18 .
  • the instruction issuing unit 12 issues instructions to individual transport robots 30 based on instructions from the host information processing system, and transmits instructions to individual transport robots 30 through the master communication unit 19 .
  • The instruction issuing unit 12 refers to the unique state of each transport robot 30 acquired by the unique state acquisition unit 13, and issues instructions to the appropriate individual transport robots 30 for executing the commands from the host information processing system. The functions of the positional relationship calculation unit 14, the estimated position calculation unit 15, and the estimated position transmission unit 16 will be described later.
  • FIG. 4 is a diagram showing a situation in which the transport robot 30 uses the distance sensor 36 to measure the distance to surrounding objects on the floor of the factory 100.
  • FIG. 5 is a graph showing the distance among the distance image information acquired and output by the distance sensor 36 at that time.
  • The map information includes information on the shelves placed on the floor of the factory 100 and the positions of the production equipment. For the production facility 120a shown in FIG. 4, information about the positions of the four corners of its housing is registered in the map information as point landmarks MC1 to MC4.
  • the monitoring range R of the distance sensor 36 of the transport robot 30 is indicated by a dotted line.
  • In the graph of FIG. 5, the horizontal axis represents the angle from the front of the transport robot 30, that is, the azimuth with respect to the transport robot 30, and the vertical axis represents the distance from the reference point of the transport robot 30.
  • That is, the graph represents the distance component of the distance image information.
  • the monitoring range R is also shown in FIG.
  • the distance image information acquired and output by the distance sensor 36 includes information about the distance to an object positioned within the monitoring range R.
  • The self-position calculation unit 33 calculates the approximate position, including orientation, of the transport robot 30 based on past self-position information and the odometry data, and compares the map information around the area covered by the monitoring range R assumed from that position with the distance image information. Since the approximate position calculated from the odometry data contains an error, a discrepancy arises between the map information and the distance image information.
  • the self-position calculator 33 calculates self-position information, which is information about the position of the transport robot 30 itself, by correcting the deviation from the approximate position.
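The correction step can be illustrated with a toy example: the approximate pose predicts where the mapped landmarks should appear, and the average offset between predicted and measured landmark positions gives the pose correction. This sketch handles only the translational part and assumes the landmark pairing is already known; all names and numbers are invented:

```python
def correct_position(approx_xy, predicted_rel, measured_rel):
    """Correct an approximate (odometry-based) position using landmarks.

    predicted_rel: landmark offsets expected from the map at the approx pose.
    measured_rel:  landmark offsets actually observed in the distance image.
    If every landmark appears shifted, the robot itself is off by the
    opposite amount, so the mean difference is added back to the pose.
    """
    n = len(predicted_rel)
    ex = sum(p[0] - m[0] for p, m in zip(predicted_rel, measured_rel)) / n
    ey = sum(p[1] - m[1] for p, m in zip(predicted_rel, measured_rel)) / n
    return (approx_xy[0] + ex, approx_xy[1] + ey)

# The robot believes it is at (5, 5) but is really at (5.2, 4.9), so each
# landmark is observed 0.2 m closer in x and 0.1 m farther in y than predicted.
predicted = [(2.0, 1.0), (3.0, -1.0)]
measured = [(1.8, 1.1), (2.8, -0.9)]
print(correct_position((5.0, 5.0), predicted, measured))  # about (5.2, 4.9)
```

A full scan matcher would also solve for the orientation correction and for the landmark association itself; the averaging above only shows the core idea of correcting the odometry deviation.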
  • FIG. 6 shows a situation in which a worker W has entered the monitoring range R of the distance sensor 36 of the transport robot 30 and a temporary article Ob is placed. Due to the occurrence of occlusion by the worker W and the temporarily placed article Ob, a shielded area Da is generated in which the distance sensor 36 cannot grasp the distance to the object registered in the map information.
  • The self-position calculation unit 33 also calculates accuracy information for the self-position information according to how well the map information and the distance image information match, for example, how many point landmarks and line landmarks were matched.
  • FIG. 7 shows a situation in which another transport robot 30b exists within the monitoring range R of the distance sensor 36 of the transport robot 30a in the situation of FIG.
  • FIG. 8 shows graphs of the distance image information acquired and output by the distance sensor 36 of the transport robot 30a in the example shown in FIG. 7. More specifically, graph 1001 in FIG. 8 represents the distance component of the distance image information in the situation of FIG. 7, with the horizontal axis representing the angle from the front of the transport robot 30a and the vertical axis representing the distance.
  • a graph 1002 in FIG. 8 is a graph showing the signal intensity of the distance image information, with the horizontal axis representing the angle from the front of the transport robot 30a and the vertical axis representing the signal intensity in the situation of FIG.
  • the distance image information acquired by the transport robot 30a includes information about the position of the other transport robot 30b and information about the signal intensity reflecting the outer circumference position specifying part 40 of the transport robot 30b. Specifically, as shown in a graph 1002, the transport robot 30a acquires a high signal strength in the distance image information obtained by the side front specifying unit 41 of the outer peripheral position specifying unit 40 of the transport robot 30b. Further, in the distance image information obtained by the side rear specifying unit 42 , the transport robot 30 a obtains a signal strength lower than that of the side front specifying unit 41 and higher than that of the rear surface specifying unit 43 .
  • In the distance image information obtained from the rear surface specifying unit 43, the transport robot 30a obtains a signal strength lower than that from the side rear specifying unit 42 and higher than that from portions other than the outer periphery position specifying unit 40. That is, the acquired signal strength increases toward the front of the transport robot 30b.
  • A range QV in FIG. 9 indicates the range of the distance image information of the transport robot 30b acquired by the transport robot 30a when the transport robot 30a senses the transport robot 30b in FIG. 7.
  • As shown in FIG. 9, in the point cloud of the distance image information of the transport robot 30b acquired by the transport robot 30a, the brightness differs for each point cloud location.
  • The acquired signal strength is higher for the distance image information toward the front of the transport robot 30b.
  • the position including the orientation of the transport robot 30b can be accurately calculated from the self-position information of the transport robot 30a and the distance image information acquired by the transport robot 30a.
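Because the acquired signal strength rises toward the front of the observed robot, the direction of the intensity trend along the detected segment reveals which end is the front. The least-squares slope test below is an invented sketch of that idea, not the patent's actual computation:

```python
def front_end(points):
    """Decide which end of a scanned side segment is the robot's front.

    points: list of (position_along_side, signal_strength) samples.
    Returns "far_end" if strength rises with position, else "near_end".
    """
    n = len(points)
    mean_s = sum(p[0] for p in points) / n
    mean_i = sum(p[1] for p in points) / n
    # Sign of the least-squares slope of strength versus position.
    num = sum((s - mean_s) * (i - mean_i) for s, i in points)
    return "far_end" if num > 0 else "near_end"

scan = [(0.0, 30), (0.2, 45), (0.4, 60), (0.6, 80)]  # strength rising
print(front_end(scan))  # far_end
```

Combined with the observer's own pose and the measured distances, knowing which end is the front fixes the observed robot's orientation as well as its position.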
  • FIG. 10 shows a situation where the transport robot 30a exists within the monitoring range Rb of the transport robot 30b and within the monitoring range Rc of the transport robot 30c.
  • In this situation, there are the following three types of information regarding the position of the transport robot 30a: (i) the self-position information calculated by the transport robot 30a itself; (ii) the distance image information regarding the transport robot 30a acquired by the transport robot 30b; and (iii) the distance image information regarding the transport robot 30a acquired by the transport robot 30c.
  • For the coordinate X, estimated values X1 to X3 are obtained from (i) to (iii) above, respectively.
  • a simple average of the estimates X1-X3 can be the combined position estimate.
  • a weighted average using weighting coefficients determined according to the accuracy information of each of the estimates X1 to X3 can be used as the integrated position estimate.
  • Alternatively, probability distribution functions, such as normal distributions peaking at each estimated value with a variance determined according to the respective accuracy information, can be calculated and summed, and the peak of the resulting distribution can be taken as the integrated position estimate. The same applies to the coordinate Y and the orientation θ.
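The simple-average and weighted-average variants of the integrated position estimate can be written in a few lines; the estimates and weights below are invented example values:

```python
def integrated_estimate(estimates, weights):
    """Weighted average of per-source estimates for one coordinate."""
    total = sum(weights)
    return sum(x * w for x, w in zip(estimates, weights)) / total

# (i) own self-position and (ii), (iii) estimates from two observing robots.
x1, x2, x3 = 10.0, 10.4, 9.8
simple = integrated_estimate([x1, x2, x3], [1.0, 1.0, 1.0])    # plain mean
weighted = integrated_estimate([x1, x2, x3], [0.2, 0.6, 0.2])  # trusts (ii) most
print(simple, weighted)
```

The probability-distribution variant replaces the average with the peak of a sum of accuracy-weighted distributions, which reduces to a similar weighted combination when the distributions are Gaussian.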
  • the weighting coefficients may be determined as follows. That is, the weighting factor of each distance image information may be determined according to the positions and orientations of the transport robots 30b and 30c other than the transport robot 30a with respect to the transport robot 30a.
  • the estimated position calculator 15 determines the orientation of the transport robot 30b and the transport robot 30c with respect to the transport robot 30a from the distance image information of the transport robot 30a. Based on this, the estimated position calculator 15 determines weighting coefficients for the distance image information from the transport robot 30b and the distance image information from the transport robot 30c, which are employed when calculating the estimated position of the transport robot 30a.
  • For example, since the transport robot 30a is in the straight-ahead direction of the transport robot 30b, the accuracy of the distance image information regarding the transport robot 30a acquired by the transport robot 30b can be judged to be high. On the other hand, since the transport robot 30a is positioned at an angle to the straight-ahead direction of the transport robot 30c, the accuracy of the distance image information regarding the transport robot 30a acquired by the transport robot 30c can be judged to be low. Therefore, when calculating the estimated position of the transport robot 30a, the estimated position calculation unit 15 increases the weighting coefficient of the distance image information from the transport robot 30b and decreases that of the distance image information from the transport robot 30c.
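One way to realize such a weighting rule is to let the weight decay with the bearing angle at which the observer sees the target; the cosine form below is an invented example, since the text specifies only that head-on observations are weighted more heavily:

```python
import math

def observation_weight(bearing_rad):
    """Weight for an observation, by bearing from the observer's forward axis."""
    return max(0.0, math.cos(bearing_rad))

w_straight = observation_weight(0.0)              # target dead ahead
w_oblique = observation_weight(math.radians(60))  # target 60 degrees off-axis
print(w_straight, w_oblique)  # 1.0 and about 0.5
```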
  • the transport robot 30b exists within the monitoring range Ra of the transport robot 30a and within the monitoring range Rc of the transport robot 30c. Therefore, information on the position of the transport robot 30b can also be calculated with improved accuracy based on information from the other transport robots 30a and 30c. In this way, based on mutual information of the transport robots 30, it is possible to obtain information on the position of the transport robot 30 with higher accuracy.
  • Step S1 In each transport robot 30 of the transport system 1, the distance sensor 36 acquires distance image information. Further, the travel control unit 32 calculates odometry data.
  • Step S2 In each transport robot 30 of the transport system 1, the self-position calculator 33 calculates self-position information based on the distance image information and the odometry data.
  • Step S3 In each transport robot 30 of the transport system 1, the unique state notification unit 34 transmits the unique information, including the distance image information and the self-position information, together with the identification information of the transport robot 30 itself stored in the slave storage unit 37, to the control device 10 via the slave communication unit 39.
  • Step S4 The unique state acquisition unit 13 of the control device 10 acquires unique information with identification information from each transport robot 30 through the master communication unit 19 .
  • Step S5 The positional relationship calculator 14 of the control device 10 calculates the monitoring range R of each transport robot 30 from the self-position information of each transport robot 30 included in the unique information. At that time, each transport robot 30 is identified based on the identification information of the transport robot 30 attached to the unique information. Further, the positional relationship calculation unit 14 calculates, based on each self-position information and each calculated monitoring range R, which transport robot 30 is within the monitoring range R of the other transport robot 30 .
  • Step S6 The estimated position calculation unit 15 of the control device 10 calculates estimated position information regarding the position of each transport robot 30. At that time, the estimated position calculation unit 15 uses the position of the transport robot included in the self-position information acquired from that transport robot 30, the distance image information acquired from the transport robots 30 other than that robot, and the information about the signal strength reflecting the outer periphery position specifying unit, as well as the self-position information acquired from the transport robots 30 other than that robot. The estimated position information is calculated by the method of computing the "integrated position estimate" described above under the principle of estimated position calculation.
  • Step S7 The estimated position transmission unit 16 of the control device 10 transmits the estimated position information of each transport robot 30, calculated by the estimated position calculation unit 15, to each transport robot 30 via the master communication unit 19. At that time, the estimated position transmission unit 16 identifies each transport robot 30 based on the acquired identification information of the transport robot 30.
  • Step S8 In each transport robot 30 of the transport system 1, the estimated position acquisition unit 35 acquires the robot's own estimated position information from the control device 10 through the slave communication unit 39. The estimated position acquisition unit 35 updates the information regarding the robot's own position based on the acquired estimated position information.
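The control-device side of steps S4 to S7 above can be sketched as follows. This is a minimal illustration only, not the patented implementation: the `UniqueInfo` container, the fixed `MONITORING_RADIUS`, and the plain averaging in `integrated_estimates` are all assumptions standing in for the unique information, the monitoring range R, and the "integrated position estimate" described in the principle section.

```python
import math
from dataclasses import dataclass, field

@dataclass
class UniqueInfo:
    """Hypothetical container for one robot's 'unique information' (steps S3-S4)."""
    robot_id: str
    self_pose: tuple  # (x, y, theta) from the robot's own self-position calculation
    distance_image: dict = field(default_factory=dict)  # {observed_id: (x, y, theta) seen in the scan}

MONITORING_RADIUS = 5.0  # assumed monitoring range R, in metres

def robots_in_range(infos):
    """Step S5 (sketch): decide which robots lie within each robot's monitoring range R."""
    visible = {i.robot_id: [] for i in infos}
    for a in infos:
        for b in infos:
            if a.robot_id != b.robot_id and math.hypot(
                b.self_pose[0] - a.self_pose[0], b.self_pose[1] - a.self_pose[1]
            ) <= MONITORING_RADIUS:
                visible[a.robot_id].append(b.robot_id)
    return visible

def integrated_estimates(infos):
    """Step S6 (sketch): fuse each robot's own pose with poses of it observed by others."""
    estimates = {}
    for target in infos:
        poses = [target.self_pose]
        for other in infos:
            if other.robot_id != target.robot_id and target.robot_id in other.distance_image:
                poses.append(other.distance_image[target.robot_id])
        # Plain averaging stands in for the weighted "integrated position estimate".
        estimates[target.robot_id] = tuple(sum(p[k] for p in poses) / len(poses) for k in range(3))
    return estimates
```

In the actual system the unique information would arrive over the master communication unit rather than being constructed locally, and step S7 would send each entry of the result back to the corresponding robot.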
  • In the transport system 1, the position estimation of a transport robot 30 is performed based not only on the distance image information from the distance sensor 36 of the robot itself but also on the distance image information from the distance sensors 36 of the other robots. Therefore, the position estimation accuracy of the transport robot 30 can be further improved compared with the case where position estimation is performed using only the distance image information from the robot's own distance sensor 36.
  • The outer circumference position specifying unit 40 is attached at a position irradiated by the light from the distance sensors 36 of the other transport robots 30, and its luminance reflectance differs depending on the position on the outer circumference.
  • The distance sensor 36 may detect the distance to an object from an image including that object, as with a stereo camera; however, it is not limited to this example and may be selected as appropriate according to the embodiment.
  • A transport robot 30Z, which is a modified example of the transport robot 30, may have an outer peripheral position specifying unit 40Z instead of the outer peripheral position specifying unit 40.
  • FIG. 12 is a diagram showing an example of the outer shape of the transport robot 30Z of the transport system according to the modification of the present invention.
  • the outer peripheral position specifying part 40Z has different patterns along the outer periphery in the upper part of the transport robot 30Z depending on the position of the outer periphery.
  • the outer peripheral position specifying portion 40Z has a center specifying portion 41Z at the central portion in the length direction along the outer periphery of the outer peripheral position specifying portion 40Z on each side.
  • The center identifying portion 41Z has a different pattern on each side, and by determining the pattern it is possible to determine which side of the transport robot 30Z is being observed.
  • the orientation of the transport robot 30Z can be accurately determined, so information regarding the position of the transport robot 30Z can be obtained with higher accuracy based on the mutual information of the transport robots 30Z.
  • the transport robot 30Z may have a light-emitting diode at a predetermined location on the outer circumference as the outer circumference position specifying section 40Z.
  • a light-emitting diode may be provided on the front surface of the transport robot 30Z, and may emit light at the timing (sensing timing) when the other transport robot 30Z detects the distance image information of the transport robot 30Z.
  • In this way, the intensity of the signal from the front surface of the transport robot 30Z in the distance image information is increased, so the orientation of the transport robot 30Z can be accurately grasped by analyzing the distance image information.
  • the positional relationship calculation unit 14 of the control device 10 may perform the following processing. That is, the positional relationship calculation unit 14 estimates the timing at which another transport robot 30Z having the monitoring range R where the transport robot 30Z is located senses the transport robot 30Z based on the unique information. Then, the positional relationship calculation unit 14 instructs the transport robot 30Z to cause the light emitting diode to emit light at the timing when the estimated other transport robot 30Z senses the transport robot 30Z.
  • the transport robot 30Z may emit light at the timing when another transport robot 30Z detects the distance image information of the transport robot 30Z.
  • the transport robot 30Z may be provided with light emitting diodes at each vertex of the transport robot 30Z so as to illuminate the entire transport robot 30Z.
  • A light-emitting diode may be placed so as to illuminate the transport robot 30Z in this manner. As a result, when another transport robot 30Z acquires an image of the transport robot 30Z, the surroundings or the outer peripheral position specifying part 40Z of the transport robot 30Z become bright, and the accuracy of the distance image information acquired by the other transport robot 30Z improves.
  • Each functional block of the control device 10 (in particular, the upper command receiving unit 11, the instruction issuing unit 12, the unique state acquisition unit 13, the positional relationship calculation unit 14, the estimated position calculation unit 15, and the estimated position transmission unit 16) and of the transport robot 30 may be realized by a logic circuit (hardware) or by software.
  • In the latter case, the control device 10 or the transport robot 30 is equipped with a computer that executes the instructions of a program, which is software that implements each function.
  • This computer includes, for example, one or more processors, and a computer-readable recording medium storing the program.
  • the processor reads the program from the recording medium and executes it, thereby achieving the object of the present invention.
  • As the processor, for example, a CPU (Central Processing Unit) can be used.
  • As the recording medium, a "non-transitory tangible medium" such as a ROM (Read Only Memory), a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used.
  • The computer may further include a RAM (Random Access Memory) into which the program is loaded.
  • the program may be supplied to the computer via any transmission medium (communication network, broadcast waves, etc.) capable of transmitting the program.
  • One aspect of the present invention can also be implemented in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
  • A transport system according to one aspect of the present invention includes a plurality of self-propelled transport robots and a control device that communicates with the plurality of transport robots. Each transport robot has a distance image information acquisition unit that detects the distance to surrounding objects and the signal intensity from those objects and acquires them as distance image information, a self-position calculation unit that calculates self-position information regarding its own position, including orientation, using the distance image information, and an outer circumference position specifying unit attached to the outer circumference of the transport robot.
  • The control device has a unique state acquisition unit that acquires the self-position information and the distance image information from each of the plurality of transport robots, an estimated position calculation unit that calculates estimated position information regarding the position, including orientation, of a given transport robot using the self-position information acquired from that transport robot and, from the distance image information acquired from the other transport robots, the position of that transport robot and information about the signal intensity that reflects its outer circumference position specifying unit, and an estimated position transmission unit that notifies the transport robot of its estimated position.
  • The transport robot of the transport system detects the distance to surrounding objects and the signal intensity from those objects, and acquires them as distance image information. The self-position calculation unit calculates self-position information regarding the robot's own position, including orientation, using the distance image information. Thereby, the transport robot can specify its position relative to the surrounding objects, including its orientation.
  • Further, the estimated position calculation unit of the control device of the transport system calculates the estimated position information of a transport robot (hereinafter, the own robot) based not only on (1) but also on (2) below: (1) the self-position information calculated by the self-position calculation unit of the own robot; (2) the distance image information acquired by transport robots other than the own robot (hereinafter, other robots), including information about the position of the own robot and the signal intensity reflecting its outer circumference position specifying unit.
  • The estimated position calculation unit can specify, from (2), the distance from an other robot to each position on the outer circumference of the own robot, so the orientation of the own robot with respect to the other robot can be accurately estimated. Thereby, the estimated position information of the own robot can be calculated in consideration of the orientation of the own robot with respect to the other robots.
  • The estimated position information of the own robot is then reported to the own robot from the control device.
  • the position and angle of the transport robot can be estimated based on the estimated position information whose accuracy is higher than that of the self-position information.
  • The estimated position calculation unit may calculate the estimated position information as a weighted average of the distance image information, and may determine the weighting factors of the weighted average according to the positions and orientations, relative to the transport robot concerned, of the other transport robots.
  • the estimated position calculation unit can calculate the estimated position information using more accurate distance image information, so the accuracy of the estimated position information is improved.
  • In the case where the distance image information acquisition unit irradiates an object with light and detects the distance to the object based on the light reflected from it, the luminance reflectance of the outer circumference position specifying unit may differ depending on the position on the outer circumference.
  • In the case where the distance image information acquisition unit detects the distance to an object from an image including that object, the outer circumference position specifying unit may have a pattern that varies depending on the position on the outer circumference.
  • the transport robot may emit light at the timing when the distance image information acquiring section detects the distance image information.
  • Thereby, the distance image information acquisition unit can reliably detect the distance image information related to the transport robot.
  • The control device may be realized by a computer.
  • In this case, the control device is realized by the computer operating as each unit (software element) included in the control device.
  • A control program for the control device, and a computer-readable recording medium on which it is recorded, are also included in the scope of the present invention.


Abstract

The present invention improves the accuracy of estimating the position of a conveyance robot. A control device (10) in this conveyance system (1) calculates estimated position information pertaining to the orientation-inclusive position of a conveyance robot (30) by using: self-position information acquired from the conveyance robot (30); and, from distance image information acquired from conveyance robots other than that conveyance robot, information pertaining to the position of that conveyance robot and to the signal intensity from it, the signal intensity reflecting an outer peripheral position specifying unit (40).

Description

Conveyance system
The present invention relates to a transport system equipped with transport robots.
Self-propelled transport robots, such as automated guided vehicles (AGVs) and automated guided forklifts (AGFs), have been proposed for use in factories and warehouses. A plurality of such transport robots are controlled by a control device through wireless communication, constituting a transport system that automates transport in a factory or similar site.
Japanese Patent Application Laid-Open No. 2019-48689
Transport robots that grasp their own position based on distance information obtained from a distance sensor, such as a LiDAR (Light Detection and Ranging) sensor or a stereo camera, are also under consideration. Such a transport robot autonomously determines its travel route without requiring guides indicating predetermined travel routes to be installed on the floor of a factory, warehouse, or the like.
Applying such transport robots therefore has the advantage of enabling a flexible transport system that can handle a wide variety of transport tasks in factories and warehouses and respond flexibly to line changes. However, in a transport system using transport robots that autonomously determine their travel routes, it is more difficult to improve the accuracy of grasping the robots' positions than in a transport system using transport robots that travel only along routes determined by guides.
In one aspect, the present invention has been made in view of these circumstances, and aims to improve the accuracy of grasping the position of a transport robot in a transport system to which transport robots that autonomously determine their travel routes are applied.
To solve the above problem, the following configuration is adopted. A transport system according to one aspect of the present invention includes a plurality of self-propelled transport robots and a control device that communicates with the plurality of transport robots. Each transport robot has a distance image information acquisition unit that detects the distance to surrounding objects and the signal intensity from those objects and acquires them as distance image information, a self-position calculation unit that calculates self-position information regarding its own position, including orientation, using the distance image information, and an outer circumference position specifying unit attached to the outer circumference of the transport robot. The control device has a unique state acquisition unit that acquires the self-position information and the distance image information from each of the plurality of transport robots; an estimated position calculation unit that calculates estimated position information regarding the position, including orientation, of a given transport robot using the self-position information acquired from that transport robot and, from the distance image information acquired from the other transport robots, the position of that transport robot and information about the signal intensity that reflects its outer circumference position specifying unit; and an estimated position transmission unit that notifies the transport robot of its estimated position.
According to one aspect of the present invention, in a transport system to which transport robots that autonomously determine their travel routes are applied, the accuracy of grasping the positions of the transport robots is enhanced.
FIG. 1 is a block diagram showing the configuration of the main parts of a transport system according to an embodiment of the present invention. FIG. 2 is a diagram showing an example of the outer shape of the transport robot of the transport system according to the embodiment of the present invention. FIG. 3 is a floor map schematically showing an example of a factory to which the transport system according to the embodiment of the present invention is applied. FIG. 4 is a diagram for explaining a method by which a transport robot detects surrounding objects using a distance sensor. FIG. 5 is a graph representing the distance image information acquired and output by the distance sensor of the transport robot in the case shown in FIG. 4. FIG. 6 is a diagram for explaining a method by which a transport robot detects surrounding objects using a distance sensor, showing a situation in which an obstacle exists within the monitoring area. FIG. 7 is a diagram showing, for the case shown in FIG. 4, a situation in which yet another transport robot exists within the monitoring range of the transport robot. FIG. 8 is a graph representation of the distance image information acquired and output by the distance sensor of the transport robot in the case shown in FIG. 7, in which graph 1001 plots angle against distance and graph 1002 plots angle against signal intensity. FIG. 9 is an image diagram of the distance image information of the outer circumference position specifying unit. FIG. 10 is a diagram showing a situation in which a specific transport robot exists within the monitoring ranges of a plurality of other transport robots. FIG. 11 is a flowchart showing characteristic operations performed by the transport system according to the embodiment of the present invention. FIG. 12 is a diagram showing an example of the outer shape of the transport robot of a transport system according to a modification of the present invention.
[Embodiment]
Hereinafter, an embodiment according to one aspect of the present invention will be described based on the drawings.
§1 Application Example
An example of a scene to which the present invention is applied will be described with reference to FIG. 1. FIG. 1 is a block diagram showing the configuration of a transport system 1 according to Embodiment 1. The transport system 1 includes a control device 10 and a plurality of self-propelled transport robots 30. The control device 10 communicates with the plurality of transport robots 30 and gives instructions to each transport robot 30.
In FIG. 1, the internal configuration is shown in detail for one transport robot 30, but the other transport robots 30 have the same internal configuration. The transport robot 30 has a distance sensor 36 (distance image information acquisition unit) that detects the distance to surrounding objects and the signal intensity from those objects and acquires them as distance image information, and a self-position calculation unit 33 that calculates self-position information regarding its own position, including orientation, using the distance image information. The transport robot 30 also has an outer circumference position specifying unit 40 attached to its outer circumference.
The control device 10 has a unique state acquisition unit 13 that acquires the self-position information and the distance image information from each transport robot 30. The control device 10 also has an estimated position calculation unit 15 that calculates estimated position information regarding the position of a given transport robot 30 using the following information: (1) the self-position information of each transport robot 30; (2) information about the positions of other transport robots 30 included in the distance image information acquired from each transport robot 30; and (3) the signal intensity information, included in the distance image information, that reflects the outer circumference position specifying unit of each transport robot 30.
The control device 10 further has an estimated position transmission unit 16 that notifies each transport robot 30 of its estimated position.
In the transport system 1 according to Embodiment 1, the estimated position information, which is information about the position of a self-propelled transport robot 30, is calculated in the control device 10 based not only on the self-position information calculated by that transport robot 30 (hereinafter, the own robot) but also on the distance image information from transport robots other than that transport robot 30 (hereinafter, other robots).
The estimated position information is then reported from the control device 10 to the own robot, i.e., the transport robot 30 concerned. Accordingly, in the transport system 1, each transport robot 30 can estimate its own position based on estimated position information whose accuracy is higher than that of its self-position information alone. A transport system with improved position estimation accuracy for self-propelled transport robots can therefore be realized.
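As a rough illustration of how an estimate more accurate than the self-position information alone can be obtained, the following sketch fuses an own-robot pose with poses of the same robot reported by other robots. The `fuse_pose` helper and its weighting scheme are assumptions for illustration, not the method claimed in this document; the orientation component is averaged via sine and cosine so that headings near ±π fuse correctly.

```python
import math

def fuse_pose(self_pose, observations):
    """
    Fuse an own-robot pose (x, y, theta) with weighted pose observations of the
    same robot made by other robots. 'observations' is a list of (pose, weight)
    pairs; the own pose is given a fixed weight of 1.0 here for simplicity.
    """
    entries = [(self_pose, 1.0)] + list(observations)
    total = sum(w for _, w in entries)
    x = sum(p[0] * w for p, w in entries) / total
    y = sum(p[1] * w for p, w in entries) / total
    # Average the heading via sine/cosine to handle angle wraparound.
    s = sum(math.sin(p[2]) * w for p, w in entries)
    c = sum(math.cos(p[2]) * w for p, w in entries)
    return (x, y, math.atan2(s, c))
```

Higher weights would naturally be given to higher-quality observations, for example a close, front-facing view, in line with the weighted-average variant described later.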
§2 Configuration Example
<Application of the transport system>
FIG. 2 is a diagram showing an appearance example of the transport robot 30 of the transport system 1 according to this application example. FIG. 3 is a diagram schematically showing a floor map of a factory 100, which is an example of an area, such as a factory or a warehouse, to which the transport system 1 according to this application example can be applied. In the following description, as shown in FIG. 2, the side of the transport robot 30 on which the distance sensor 36 is arranged is referred to as the front of the transport robot 30, the opposite side as the rear, and the direction perpendicular to the front-rear direction as the left-right direction. In FIGS. 3, 4, and 6, the transport robot 30 is shown with the outer circumference position specifying unit 40 omitted.
In the factory 100, self-propelled transport robots 30 are deployed to transport objects such as products, semi-finished products, parts, raw materials, tools, jigs, packaging materials, and the cassettes that store them. Furthermore, a self-propelled transport robot 30M, in which a robot arm (manipulator) for gripping an object to be transported is mounted on a cart-type transport robot, may be deployed as part of the transport system 1.
The transport robot 30M is an example of the transport robot 30 and is included in the transport robot 30 in the following description. The transport robot 30 may take the form of an automated guided vehicle, an AGF, or another form of self-propelled transport device.
Shelves 110 on which objects to be transported can be placed are installed in the factory 100. Production equipment for performing machining, assembly, processing, inspection, and the like on, or using, the objects to be transported is also installed in the factory 100. The transport robots 30 of the transport system 1 can transport objects between these pieces of equipment.
As shown in FIG. 3, the position of each transport robot 30 is represented by X-Y coordinates defined on the floor of the factory 100, and the orientation of each transport robot 30 is represented by an angle θ defined with respect to the X-Y coordinates. The position and orientation of each transport robot 30 can thus be expressed as (X, Y, θ). Hereinafter, in this specification, information regarding the position of a transport robot refers to information regarding the position and orientation (X, Y, θ) of that transport robot.
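The (X, Y, θ) pose convention above implies a standard rigid-body transform between a robot's own frame and the floor's X-Y coordinates, which is how an object detected in a scan can be located on the floor map. A minimal sketch, with the robot-frame axis conventions (x forward, y to the left) as assumptions:

```python
import math

def robot_to_floor(pose, point):
    """
    Transform a point observed in the robot's own frame (x forward, y to the
    left; both assumptions) into the floor's X-Y coordinates, given the robot's
    pose (X, Y, theta) on the floor.
    """
    X, Y, theta = pose
    px, py = point
    # Rotate the observed point by theta, then translate by the robot's position.
    return (X + px * math.cos(theta) - py * math.sin(theta),
            Y + px * math.sin(theta) + py * math.cos(theta))
```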
<Configuration overview of the transport system>
A more specific configuration example and the operation of the transport system 1 are described below. As shown in FIG. 1, the transport system 1 includes a control device 10 and a plurality of self-propelled transport robots 30. The control device 10 is an information processing system responsible for managing transport, sometimes referred to as an Automated Material Handling System (AMHS) server.
Based on commands from a higher-level information processing system or the like, the control device 10 transmits more specific transport instructions to the transport robots 30 in the transport system 1. The control device 10 may be any information processing system capable of executing such processing and need not be a device physically housed in a single enclosure.
When the transport system 1 is applied in a production factory, the higher-level information processing system that manages product production in the factory may be called a Manufacturing Execution System (MES) server. When the transport system 1 is applied in a distribution warehouse, the higher-level information processing system that manages the receipt and shipment of stored items may be called a Warehouse Management System (WMS) server.
<Configuration of the transport robot>
As shown in FIGS. 1 and 2, the transport robot 30 has the functional blocks of an instruction reception unit 31, a travel control unit 32, a self-position calculation unit 33, a unique state notification unit 34, and an estimated position acquisition unit 35. The transport robot 30 also has a distance sensor 36, a slave storage unit 37, a travel mechanism unit 38, a slave communication unit 39, and an outer circumference position specifying unit 40.
The distance sensor 36 is arranged on the front side of the transport robot 30 and monitors the area ahead in the travel direction. As shown in FIG. 2, in the specific example of Embodiment 1, the distance sensor 36 consists of two LiDAR units and can acquire distance image information indicating the distance to an object present in its monitoring area and the signal intensity from that object. Specifically, the distance sensor 36 irradiates an object within the monitoring area with light and detects the distance to the object based on the light reflected from it. Note that the distance image information is not limited to a two-dimensional image and may be a one-dimensional (linear) image.
As an example, the monitored region can include a range of up to about 120° to each of the left and right of the front of the transport robot 30 in the travel direction. An example of such distance image information is described later. Distance image information is also commonly called a distance image.
The number of LiDAR units arranged on the transport robot 30 may be one or more, and when there are several they may be arranged so that the rear can also be monitored. The distance sensor 36 is not limited to LiDAR and may be a sensor that acquires distance images, such as a stereo camera or a ToF (Time-of-Flight) camera, or a sensor based on another method.
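A distance image from such a sensor can be pictured as a set of per-angle samples, each pairing a distance with a signal intensity. The tuple layout and the ±120° field of view below are illustrative assumptions based on the description above:

```python
def within_fov(scan, half_fov_deg=120.0):
    """Keep only the samples inside the monitored field of view (±120° assumed)."""
    return [s for s in scan if abs(s[0]) <= half_fov_deg]

def nearest_return(scan):
    """Return the (angle_deg, distance_m, intensity) sample of the closest detected object."""
    return min(scan, key=lambda s: s[1])
```

A real LiDAR driver would deliver many thousands of such samples per sweep; the helpers above merely show how distance and intensity travel together in one record.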
The slave storage unit 37 is a recording device provided in the transport robot 30. The slave storage unit 37 holds the identification information of the transport robot 30 and map information of the floor of the factory 100, and also holds, as appropriate, various information necessary for the transport robot 30 to travel, its travel history, the control program of the transport robot 30, and the like.
 The travel mechanism unit 38 is the mechanism, operated under the control of the travel control unit 32, by which the transport robot 30 travels over the floor surface. In the external view of the transport robot 30 in FIG. 2, wheels 38A, which are part of the travel mechanism unit 38, are shown.
 The slave communication unit 39 is a communication interface through which the transport robot 30 communicates with the control device 10. Since the communication performed with the transport robot 30 through the slave communication unit 39 includes real-time distance image information, it is preferably high-speed, low-latency, and capable of many simultaneous connections. The slave communication unit 39 therefore preferably performs 5G (5th Generation) or Wi-Fi 6 communication with the master communication unit 19 of the control device 10 (Wi-Fi: registered trademark). In the external view of the transport robot 30 in FIG. 2, an antenna 39A, which is part of the slave communication unit 39, is shown.
 As shown in FIG. 2, the outer circumference position specifying unit 40 is attached to the outer circumference along the side surfaces of the transport robot 30, at a position that does not obstruct the light emitted by the distance sensor 36. The outer circumference position specifying unit 40 is also attached substantially parallel to the top surface of the transport robot 30, at the position (height) struck by the light emitted from the distance sensor 36 of another transport robot 30. The luminance reflectance of the outer circumference position specifying unit 40 differs depending on the position on the outer circumference of the transport robot 30. The outer circumference position specifying unit 40 may be, for example, a reflector. For example, the outer circumference position specifying unit 40 is attached so that its luminance reflectance increases toward the front of the transport robot 30 (the side on which the distance sensor 36 is installed).
 In the present embodiment, the outer circumference position specifying unit 40 has a side-front specifying portion 41 attached to the front part of each of the left and right side surfaces of the transport robot 30, a side-rear specifying portion 42 attached to the rear part of each side surface, and a rear-face specifying portion 43 attached to the rear face of the transport robot 30. Within the outer circumference position specifying unit 40, the side-front specifying portion 41 has the highest luminance reflectance, and the luminance reflectance decreases in the order of the side-front specifying portion 41, the side-rear specifying portion 42, and the rear-face specifying portion 43. In other words, in the distance image information of the outer circumference position specifying unit 40 acquired by the distance sensor 36, the signal strength from the side-front specifying portion 41 is the highest, and the signal strength decreases in the order of the side-front specifying portion 41, the side-rear specifying portion 42, and the rear-face specifying portion 43. Note that the signal strength obtained from the rear-face specifying portion 43 is still higher than the signal strength from portions to which the outer circumference position specifying unit 40 is not attached.
 The instruction receiving unit 31 is a functional block that receives instructions from the control device 10 via the slave communication unit 39. The travel control unit 32 is a functional block that controls the travel mechanism unit 38 and causes the transport robot 30 to travel. The travel control unit 32 also calculates odometry data based on the operation information of each mechanism obtained from the travel mechanism unit 38, specifically the rotary encoder outputs of the motors and the like. Odometry data is information indicating the position of the transport robot 30 during or after travel, relative to its position at some reference point in time.
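As an illustration of how odometry data can be accumulated from rotary encoder outputs, the following sketch assumes a two-wheeled differential-drive platform; the function name and parameters are assumptions for illustration, not part of the embodiment:

```python
import math

def update_odometry(pose, d_left, d_right, track_width):
    """Accumulate a relative pose (x, y, theta) from the distances
    travelled by the left and right wheels since the last update.

    pose           -- (x, y, theta) relative to the reference point in time
    d_left/d_right -- wheel travel [m], e.g. encoder ticks * metres per tick
    track_width    -- distance between the two wheels [m]
    """
    x, y, theta = pose
    d_center = (d_left + d_right) / 2.0          # forward motion of the body
    d_theta = (d_right - d_left) / track_width   # change of heading
    # Integrate along the heading at the midpoint of the motion.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return (x, y, theta + d_theta)

# Both wheels advance 0.10 m: the robot moves 0.10 m straight ahead.
pose = update_odometry((0.0, 0.0, 0.0), 0.10, 0.10, track_width=0.40)
```

Calling this once per encoder sampling period yields the relative position described above.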
 The self-position calculation unit 33 is a functional block that calculates the approximate position of the transport robot 30 from the odometry data and then, by comparing the distance image information with the map information in the vicinity of the approximate position, calculates self-position information, i.e., information on the position of the transport robot 30 including its own orientation. The unique-state notification unit 34 is a functional block that reports the unique information of the transport robot 30 to the control device 10 via the slave communication unit 39. The estimated position acquisition unit 35 is a functional block that acquires estimated position information from the control device 10 via the slave communication unit 39. The estimated position information is described later.
 Here, unique information means information about the unique state of an individual transport robot 30 itself, and includes the distance image information and the self-position information. It may further include information on the operation and other internal states of the transport robot 30, for example its operating status, the loading status of the object being transported, and the remaining battery level.
 As its basic operation, the transport robot 30 carries out a required transport task in accordance with instructions from the control device 10 as follows. The instruction receiving unit 31 receives a transport instruction from the control device 10 via the slave communication unit 39. The travel control unit 32 controls the travel mechanism unit 38 based on the self-position information and on the position information of the transport destination included in the transport instruction, and causes the transport robot 30 to travel to the destination. During travel, the self-position calculation unit 33 continuously updates the self-position information based on the odometry data from the travel control unit 32 and the distance image information from the distance sensor 36.
 <Configuration of the control device>
 As shown in FIG. 1, the control device 10 has the functional blocks of a higher-level command receiving unit 11, an instruction issuing unit 12, a unique-state acquisition unit 13, a positional relationship calculation unit 14, an estimated position calculation unit 15, and an estimated position transmission unit 16. The control device 10 also has a master storage unit 17, a higher-level communication unit 18, and a master communication unit 19.
 The master storage unit 17 is a recording device provided in the control device 10. The master storage unit 17 holds the map information of the floor of the factory 100 and the control program of the control device 10, and also holds, as appropriate, the unique information and operation logs of each transport robot 30 and the like. The higher-level communication unit 18 is a communication interface through which the control device 10 communicates with the higher-level information processing system. The master communication unit 19 is a communication interface through which the control device 10 communicates with the transport robots 30.
 The higher-level command receiving unit 11 is a functional block that receives commands from the higher-level information processing system via the higher-level communication unit 18. The instruction issuing unit 12 issues instructions to the individual transport robots 30 based on the commands from the higher-level information processing system and transmits those instructions to the individual transport robots 30 through the master communication unit 19.
 In doing so, the instruction issuing unit 12 refers to the unique state of each transport robot 30 acquired by the unique-state acquisition unit 13 and issues to the individual transport robots 30 the instructions appropriate for executing the command from the higher-level information processing system. The functions of the positional relationship calculation unit 14, the estimated position calculation unit 15, and the estimated position transmission unit 16 are described later.
 <Principle of estimated position calculation>
 The principle by which the estimated position of a transport robot 30 is calculated in the transport system 1 according to the configuration example is described below with reference to FIGS. 4 to 10. FIG. 4 shows a situation in which a transport robot 30 uses the distance sensor 36 to measure the distances to surrounding objects on the floor of the factory 100. FIG. 5 is a graph of the distance component of the distance image information that the distance sensor 36 acquires and outputs in that situation.
 The map information includes information on the positions of the shelves placed on the floor of the factory 100 and of the production equipment. For the production equipment 120a shown in FIG. 4, information on the positions of the four corners of its housing is registered in the map information as point landmarks MC1 to MC4.
 Information on the positions of the lines (sides) connecting the corners of the production equipment 120a is likewise registered in the map information, as line landmarks ML1 to ML4. Similarly, point landmarks MC5 to MC8 and line landmarks ML5 to ML8 are registered in the map information for the production equipment 120b.
 In FIG. 4, the monitoring range R of the distance sensor 36 of the transport robot 30 is indicated by a dotted line. FIG. 5 plots the distance component of the distance image information, with the horizontal axis representing the angle from the front of the transport robot 30, i.e., the azimuth relative to the transport robot 30, and the vertical axis representing the distance from the reference point of the transport robot 30. The monitoring range R is also shown in FIG. 5. As shown in FIG. 5, the distance image information that the distance sensor 36 acquires and outputs includes information on the distances to the objects located within the monitoring range R.
 The self-position calculation unit 33 calculates the approximate position of the transport robot 30, including its orientation, from the past self-position information and the odometry data, and matches the distance image information against the map information of the area in which the monitoring range R is expected to lie. Because the approximate position calculated from the odometry data contains error, a discrepancy arises between that map information and the distance image information. By correcting the approximate position for this discrepancy, the self-position calculation unit 33 calculates the self-position information, i.e., the information on the transport robot 30's own position.
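A heavily simplified sketch of this correction step is given below. It corrects only the translation, using matched point landmarks, whereas the embodiment matches full distance image information against point and line landmarks including orientation; the function and its arguments are assumptions for illustration:

```python
def correct_position(approx_pos, predicted_landmarks, observed_landmarks):
    """Correct an approximate (x, y) position using matched landmarks.

    predicted_landmarks -- landmark positions relative to the robot as
                           predicted from the map at the approximate pose
    observed_landmarks  -- the same landmarks as actually observed in the
                           distance image information (same order)
    """
    n = len(predicted_landmarks)
    # Mean discrepancy between observation and prediction. Since
    # observed - predicted = approx_pose - true_pose, the true position
    # is recovered by shifting the approximate position the opposite way.
    dx = sum(o[0] - p[0] for p, o in zip(predicted_landmarks, observed_landmarks)) / n
    dy = sum(o[1] - p[1] for p, o in zip(predicted_landmarks, observed_landmarks)) / n
    return (approx_pos[0] - dx, approx_pos[1] - dy)
```

For instance, if a landmark predicted at a relative position of (4.0, 0.0) is observed at (4.2, 0.0), the approximate position is shifted back by 0.2 along x.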
 FIG. 6 shows a situation in which a worker W has entered the monitoring range R of the distance sensor 36 of the transport robot 30 and a temporarily placed article Ob is also present. Occlusion caused by the worker W and the temporarily placed article Ob creates a shielded area Da, within which the distance sensor 36 cannot determine the distances to objects registered in the map information.
 It is readily understood that in such a case the accuracy of the self-position information calculated by the self-position calculation unit 33 decreases. The self-position calculation unit 33 therefore also calculates accuracy information indicating how reliable the self-position information is, according to how well the map information and the distance image information matched, for example how many point landmarks and line landmarks could be matched.
 FIG. 7 shows the situation of FIG. 4 with another transport robot 30b now present within the monitoring range R of the distance sensor 36 of the transport robot 30a. FIG. 8 shows, as graphs, the distance image information acquired and output by the distance sensor 36 of the transport robot 30a in the case shown in FIG. 7. More specifically, graph 1001 of FIG. 8 plots the distance component of the distance image information in the situation of FIG. 7, with the horizontal axis representing the angle from the front of the transport robot 30a, i.e., the azimuth relative to the transport robot 30a, and the vertical axis representing the distance from the reference point of the transport robot 30a. Graph 1002 of FIG. 8 plots the signal strength component of the distance image information in the situation of FIG. 7, with the horizontal axis representing the angle from the front of the transport robot 30a and the vertical axis representing the signal strength.
 As shown in FIG. 8, the distance image information acquired by the transport robot 30a includes information on the position of the other transport robot 30b and signal strength information reflecting the outer circumference position specifying unit 40 of the transport robot 30b. Specifically, as shown in graph 1002, the transport robot 30a obtains a high signal strength from the side-front specifying portion 41 of the outer circumference position specifying unit 40 of the transport robot 30b. From the side-rear specifying portion 42, the transport robot 30a obtains a signal strength lower than from the side-front specifying portion 41 and higher than from the rear-face specifying portion 43. From the rear-face specifying portion 43, the transport robot 30a obtains a signal strength lower than from the side-rear specifying portion 42 and higher than from the portions other than the outer circumference position specifying unit 40. In other words, the acquired signal strength increases toward the front of the transport robot 30b.
 FIG. 9 shows an image of the distance image information of the outer circumference position specifying unit 40. In FIG. 9, lighter hatching indicates higher luminance, that is, higher signal strength. The range QV in FIG. 9 is the range of the distance image information of the transport robot 30b that the transport robot 30a acquires when, in FIG. 7, the transport robot 30a senses the transport robot 30b.
 Because the luminance reflectance differs between the various portions of the outer circumference position specifying unit 40 and the portions where it is not attached, the point cloud of the distance image information of the transport robot 30b acquired by the transport robot 30a has a different luminance for each group of points, as shown in FIG. 9. Thus FIG. 9 also shows that the acquired signal strength is higher for the distance image information toward the front of the transport robot 30b.
 Accordingly, from the distance image information of the transport robot 30b acquired by the transport robot 30a, the side showing the higher signal strength can be judged to be the front of the transport robot 30b, so the orientation of the transport robot 30b can be determined accurately.
 That is, by analyzing the distance information included in the distance image information acquired by the transport robot 30a, information on the distance and azimuth of the other transport robot 30b from the transport robot 30a can be obtained. In addition, by analyzing the signal strength included in the distance image information, information on the orientation of the transport robot 30b can be obtained accurately. It is therefore understood that the position of the transport robot 30b, including its orientation, can be calculated accurately from the self-position information of the transport robot 30a and the distance image information acquired by the transport robot 30a.
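How the distance and signal-strength analysis described above could be combined can be sketched as follows; the (angle, distance, intensity) data layout and all names are assumptions for illustration, not the embodiment's actual processing:

```python
import math

def locate_other_robot(self_pose, hits):
    """Estimate another robot's position and facing from scan returns.

    self_pose -- (x, y, theta) of the observing robot in world coordinates
    hits      -- list of (angle, distance, intensity) returns belonging to
                 the other robot; angle is relative to the observer's front
    Returns ((x, y), front_bearing): the other robot's position and the
    world bearing from that position toward its front, judged from where
    the return is strongest (the side-front specifying portion).
    """
    x0, y0, th0 = self_pose
    # Distance analysis: convert each return to world coordinates and take
    # the centroid as the other robot's position.
    pts = [(x0 + d * math.cos(th0 + a), y0 + d * math.sin(th0 + a))
           for a, d, _ in hits]
    cx = sum(p[0] for p in pts) / len(pts)
    cy = sum(p[1] for p in pts) / len(pts)
    # Signal-strength analysis: the highest-intensity return lies toward
    # the other robot's front, giving its orientation.
    strongest = max(zip(pts, (i for _, _, i in hits)), key=lambda t: t[1])[0]
    front_bearing = math.atan2(strongest[1] - cy, strongest[0] - cx)
    return (cx, cy), front_bearing
```

In a real distance image the returns on the other robot would first have to be segmented out, which this sketch assumes has already been done.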
 FIG. 10 shows a situation in which the transport robot 30a is within the monitoring range Rb of the transport robot 30b and within the monitoring range Rc of the transport robot 30c. The following three kinds of information on the position of the transport robot 30a exist: (i) the self-position information calculated by the transport robot 30a itself; (ii) information on the position of the transport robot 30a derived from the self-position information calculated by the transport robot 30b and the distance image information of the transport robot 30b; and (iii) information on the position of the transport robot 30a derived from the self-position information calculated by the transport robot 30c and the distance image information of the transport robot 30c.
 By integrating these pieces of information to calculate the position of the transport robot 30a, the accuracy can be made higher than by relying only on (i), the self-position information calculated by the transport robot 30a itself. More specifically, these pieces of information can be integrated into a position estimate as follows.
 For example, suppose that for the coordinate X, estimates X1 to X3 are obtained by (i) to (iii) above, respectively. The simple average of the estimates X1 to X3 can be taken as the integrated position estimate. Alternatively, a weighted average using weighting coefficients determined according to the accuracy information of each of the estimates X1 to X3 can be taken as the integrated position estimate.
 As yet another alternative, for each of the estimates X1 to X3 a probability distribution function, such as a normal distribution peaking at the estimate and having a variance determined according to its accuracy information, can be calculated, and the peak of the sum of these probability distribution functions can be taken as the integrated position estimate. The same applies to the coordinate Y and the orientation θ.
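The three integration methods above (simple average, weighted average, and peak of summed probability distributions) can be sketched for a single coordinate as follows; using variances as the concrete form of the accuracy information, and the brute-force peak search, are assumptions for illustration:

```python
import math

def simple_average(estimates):
    return sum(estimates) / len(estimates)

def weighted_average(estimates, weights):
    # Weights derived from the accuracy information of each estimate.
    return sum(x * w for x, w in zip(estimates, weights)) / sum(weights)

def gaussian_sum_peak(estimates, variances, step=0.001):
    """Peak of the sum of normal distributions, one per estimate, each
    centred on the estimate with an accuracy-derived variance. Found by
    a brute-force grid scan, which is adequate for a sketch."""
    lo, hi = min(estimates) - 1.0, max(estimates) + 1.0
    def density(x):
        return sum(math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)
                   for m, v in zip(estimates, variances))
    n = int((hi - lo) / step)
    return max((lo + i * step for i in range(n + 1)), key=density)

# Estimates X1..X3 of coordinate X obtained by (i) to (iii).
X = [2.00, 2.10, 2.40]
x_simple = simple_average(X)                        # about 2.167
x_weighted = weighted_average(X, [3.0, 2.0, 1.0])   # 2.1, favouring accurate ones
```

The same functions apply unchanged to the coordinate Y and, with care for angle wrap-around, to the orientation θ.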
 Here, when the estimated position of a given transport robot 30 is calculated as the integrated position estimate by the weighted average using the weighting coefficients described above, the weighting coefficients may be determined as follows. Namely, the weighting coefficient of each piece of distance image information may be determined according to the positions and orientations, relative to the transport robot 30a, of the transport robots 30b and 30c other than the transport robot 30a.
 For example, consider the case where, in the situation of FIG. 10, the estimated position calculation unit 15 calculates the estimated position of the transport robot 30a using the weighted average. First, the estimated position calculation unit 15 determines, from the distance image information of the transport robot 30a, the orientations of the transport robots 30b and 30c relative to the transport robot 30a. Based on these, the estimated position calculation unit 15 determines the weighting coefficients for the distance image information from the transport robot 30b and from the transport robot 30c to be used when calculating the estimated position of the transport robot 30a.
 Specifically, since the transport robot 30a lies in the straight-ahead (frontal) direction of the transport robot 30b, the distance image information on the transport robot 30a acquired by the transport robot 30b can be judged to be highly accurate. In contrast, since the transport robot 30a lies at an angle offset from the straight-ahead direction of the transport robot 30c, the distance image information on the transport robot 30a acquired by the transport robot 30c can be judged to be less accurate. Therefore, when calculating the estimated position of the transport robot 30a, the estimated position calculation unit 15 increases the weighting coefficient of the distance image information of the transport robot 30b and decreases the weighting coefficient of the distance image information of the transport robot 30c. This makes it possible to calculate more accurate estimated position information on the position of the transport robot 30a. In other words, in the present embodiment, since the orientation of each transport robot 30 can be determined accurately, information on the position of each transport robot 30 with improved accuracy can be obtained based on the robots' mutual information.
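A minimal sketch of a weighting rule of this kind is given below: an observer's distance image information counts for more the closer the observed robot lies to the observer's straight-ahead direction. The cosine falloff is an assumption chosen for illustration; the embodiment only requires that the weight decrease with the offset angle:

```python
import math

def viewing_angle_weight(observer_pose, target_pos):
    """Weighting coefficient for an observer's distance image information:
    1.0 when the target is dead ahead of the observer, falling toward 0.0
    as the offset from the straight-ahead direction approaches 90 degrees.

    observer_pose -- (x, y, theta) of the observing robot
    target_pos    -- (x, y) of the robot whose position is being estimated
    """
    ox, oy, otheta = observer_pose
    bearing = math.atan2(target_pos[1] - oy, target_pos[0] - ox)
    # Wrap the offset angle into (-pi, pi] before taking its magnitude.
    offset = abs(math.atan2(math.sin(bearing - otheta), math.cos(bearing - otheta)))
    return max(0.0, math.cos(offset))

# 30b sees 30a dead ahead; 30c sees it 45 degrees off-axis.
w_b = viewing_angle_weight((0.0, -3.0, math.pi / 2), (0.0, 0.0))  # 1.0
w_c = viewing_angle_weight((-2.0, 2.0, 0.0), (0.0, 0.0))          # about 0.71
```

These coefficients would then be passed to the weighted average described above.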
 Moreover, in the case shown in FIG. 10, the transport robot 30b is itself within the monitoring range Ra of the transport robot 30a and within the monitoring range Rc of the transport robot 30c. Information on the position of the transport robot 30b can therefore likewise be calculated with improved accuracy based on the information from the other transport robots 30a and 30c. In this way, based on the mutual information of the transport robots 30, information on the positions of the transport robots 30 can be obtained with higher accuracy.
 <Operation of the control device>
 A characteristic operation of the transport system 1 based on the above principle of estimated position calculation is described below along the flowchart of FIG. 11. In the transport system 1, the flow shown in FIG. 11 is repeatedly executed in real time while the transport robots 30 are carrying out transport operations.
 Step S1: In each transport robot 30 of the transport system 1, the distance sensor 36 acquires distance image information, and the travel control unit 32 calculates odometry data.
 Step S2: In each transport robot 30 of the transport system 1, the self-position calculation unit 33 calculates self-position information based on the distance image information and the odometry data.
 Step S3: In each transport robot 30 of the transport system 1, the unique-state notification unit 34 transmits the unique information, which includes the distance image information and the self-position information, together with the transport robot 30's own identification information stored in the slave storage unit 37, to the control device 10 via the slave communication unit 39.
 Step S4: The unique-state acquisition unit 13 of the control device 10 acquires, through the master communication unit 19, the unique information with its attached identification information from each transport robot 30.
 Step S5: The positional relationship calculation unit 14 of the control device 10 calculates the monitoring range R of each transport robot 30 from the self-position information of each transport robot 30 included in the unique information. In doing so, it identifies each transport robot 30 based on the identification information attached to the unique information. The positional relationship calculation unit 14 then calculates, from each piece of self-position information and each calculated monitoring range R, which other transport robots' monitoring ranges R each transport robot 30 lies within.
 Step S6: The estimated position calculation unit 15 of the control device 10 calculates, for each transport robot 30, estimated position information on the position of that transport robot. In doing so, the estimated position calculation unit 15 uses the self-position information acquired from that transport robot 30; the position of that transport robot and the signal strength information reflecting its outer circumference position specifying unit, both included in the distance image information acquired from the transport robots 30 other than that transport robot 30; and the self-position information acquired from those other transport robots 30. The estimated position information is calculated by the method of obtaining the "integrated position estimate" described above under the principle of estimated position calculation.
 Step S7: The estimated position transmission unit 16 of the control device 10 transmits the estimated position information of each transport robot 30 calculated by the estimated position calculation unit 15 to the corresponding transport robot 30 via the master communication unit 19. In doing so, the estimated position transmission unit 16 identifies each transport robot 30 based on the acquired identification information of the transport robot 30.
ステップS8：搬送システム1のそれぞれの搬送ロボット30において、推定位置取得部35が、スレーブ通信部39を通じて、制御装置10から、自機の推定位置情報を取得する。推定位置取得部35は、自機の位置に関する情報を、取得した推定位置情報に基づいて更新する。 Step S8: In each transport robot 30 of the transport system 1, the estimated position acquisition unit 35 acquires the robot's own estimated position information from the control device 10 through the slave communication unit 39. The estimated position acquisition unit 35 updates the information regarding the robot's own position based on the acquired estimated position information.
構成例に係る搬送システム1によれば、搬送ロボット30の位置推定が、自機の距離センサ36からの距離画像情報からのみならず、他機の距離センサ36からの距離画像情報にも依拠して実行される。そのため、自機の距離センサ36からの距離画像情報のみによって位置推定が実行される場合と比較して、搬送ロボット30の位置推定の精度をより高めることができる。 According to the transport system 1 of this configuration example, the position estimation of a transport robot 30 relies not only on the distance image information from its own distance sensor 36 but also on the distance image information from the distance sensors 36 of the other robots. Therefore, compared with the case where position estimation is performed only from the distance image information of the robot's own distance sensor 36, the accuracy of the position estimation of the transport robot 30 can be further improved.
 §3 変形例
 以上、本発明の実施の形態を詳細に説明してきたが、前述までの説明はあらゆる点において本発明の例示に過ぎない。本発明の範囲を逸脱することなく種々の改良や変形を行うことができることは言うまでもない。例えば、以下のような変更が可能である。なお、以下では、前記実施形態と同様の構成要素に関しては同様の符号を用い、前記実施形態と同様の点については、適宜説明を省略した。以下の変形例は適宜組み合わせ可能である。
§3 Modifications Although the embodiment of the present invention has been described in detail above, the foregoing description is in all respects merely an illustration of the present invention. It goes without saying that various improvements and modifications can be made without departing from the scope of the invention. For example, the following changes are possible. In the following, the same reference numerals are used for components similar to those of the above embodiment, and description of points similar to those of the above embodiment is omitted as appropriate. The following modifications may be combined as appropriate.
例えば、前記実施形態では、図2に示されるとおり、外周位置特定部40は、他の搬送ロボット30の距離センサ36からの光が照射される位置に付加され、外周の位置により光の輝度反射率が異なる。しかしながら、距離センサ36が、ステレオカメラ等の物体を含む画像により前記物体への距離を検知するものである場合、このような例に限定されなくてもよく、実施の形態に応じて適宜選択されてよい。 For example, in the above embodiment, as shown in FIG. 2, the outer peripheral position specifying unit 40 is provided at positions irradiated by the light from the distance sensors 36 of the other transport robots 30, and its luminance reflectance for that light differs depending on the position on the outer periphery. However, when the distance sensor 36 detects the distance to an object from an image containing the object, as with a stereo camera, the configuration need not be limited to this example and may be selected as appropriate according to the embodiment.
例えば、図12に示すように、搬送ロボット30の変形例である搬送ロボット30Zは、外周位置特定部40にかえて外周位置特定部40Zを有していてもよい。図12は、本発明の変形例に係る搬送システムの搬送ロボット30Zの外形例を示す図である。外周位置特定部40Zは、搬送ロボット30Zの上部において、外周に沿って、外周の位置により異なる模様を有する。 For example, as shown in FIG. 12, a transport robot 30Z, which is a modification of the transport robot 30, may have an outer peripheral position specifying unit 40Z instead of the outer peripheral position specifying unit 40. FIG. 12 is a diagram showing an example of the outer shape of the transport robot 30Z of a transport system according to a modification of the present invention. The outer peripheral position specifying unit 40Z, provided on the upper part of the transport robot 30Z along its outer periphery, has a pattern that differs depending on the position on the outer periphery.
本変形例では、外周位置特定部40Zは、各側面の外周位置特定部40Zの外周に沿った長さ方向の中央部に、中心特定部41Zを有する。中心特定部41Zは各側面により異なる模様を有し、その模様を判別することで、搬送ロボット30Zのどの側面であるかを判定できる。これにより、搬送ロボット30Zの向きが正確に判断できるため、各搬送ロボット30Zの相互の情報に基づいて、精度をより高めた搬送ロボット30Zの位置に関する情報を得ることができるようになる。 In this modification, the outer peripheral position specifying unit 40Z has, on each side face, a center specifying unit 41Z at the center in the lengthwise direction along the outer periphery. The center specifying unit 41Z has a different pattern on each side face, and by distinguishing the pattern, it can be determined which side face of the transport robot 30Z is being observed. This allows the orientation of the transport robot 30Z to be determined accurately, so information on the positions of the transport robots 30Z can be obtained with higher accuracy based on the robots' mutual information.
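Determining which side faces the observer from the center specifying unit 41Z could be sketched as a simple pattern-to-side lookup. The pattern names, the angle convention, and the recovery formula below are purely hypothetical illustrations of the idea, not the disclosed method.

```python
# Hypothetical: each side face of the robot carries a distinct centre
# pattern; the observer recognises the pattern, learns which side it is
# seeing, and from that recovers the observed robot's heading.
SIDE_PATTERNS = {"stripes": "front", "dots": "right",
                 "grid": "rear", "checker": "left"}
SIDE_TO_OFFSET_DEG = {"front": 0, "right": 90, "rear": 180, "left": 270}

def heading_from_pattern(observed_pattern, bearing_to_observer_deg):
    """Recover the observed robot's heading (degrees) from the side
    pattern seen and the bearing from that robot to the observer."""
    side = SIDE_PATTERNS[observed_pattern]
    # The seen side faces the observer, so the robot's front points at
    # the observer's bearing minus that side's offset from the front.
    return (bearing_to_observer_deg - SIDE_TO_OFFSET_DEG[side]) % 360

print(heading_from_pattern("grid", bearing_to_observer_deg=90))  # 270
```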
さらに、例えば、搬送ロボット30Zは、外周位置特定部40Zとして、外周の所定の箇所に発光ダイオードを有していてもよい。具体的には、搬送ロボット30Zの前面に発光ダイオードを有し、他の搬送ロボット30Zが当該搬送ロボット30Zの距離画像情報を検知するタイミング（センシングするタイミング）で発光してもよい。これにより、距離画像情報において搬送ロボット30Zの前面からの信号強度が高くなるため、距離画像情報を分析することで、搬送ロボット30Zの向きが正確に把握できる。 Further, for example, the transport robot 30Z may have a light-emitting diode at a predetermined location on its outer periphery as the outer peripheral position specifying unit 40Z. Specifically, a light-emitting diode may be provided on the front surface of the transport robot 30Z and may emit light at the timing when another transport robot 30Z detects (senses) the distance image information of that transport robot 30Z. Since this raises the signal intensity from the front surface of the transport robot 30Z in the distance image information, the orientation of the transport robot 30Z can be accurately grasped by analyzing the distance image information.
その際、例えば、制御装置10の位置関係算出部14が、下記の処理を行ってもよい。すなわち、位置関係算出部14は、搬送ロボット30Zが位置する監視レンジRを有する他の搬送ロボット30Zが当該搬送ロボット30Zをセンシングするタイミングを、固有情報を基に推定する。そして、位置関係算出部14は、推定した他の搬送ロボット30Zが当該搬送ロボット30Zをセンシングするタイミングに合わせて、発光ダイオードが発光するように当該搬送ロボット30Zに指示する。 In that case, for example, the positional relationship calculation unit 14 of the control device 10 may perform the following processing. That is, the positional relationship calculation unit 14 estimates, based on the unique information, the timing at which another transport robot 30Z whose monitoring range R contains the transport robot 30Z senses that transport robot 30Z. The positional relationship calculation unit 14 then instructs the transport robot 30Z to cause its light-emitting diode to emit light in time with the estimated timing at which the other transport robot 30Z senses it.
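The timing coordination described in this paragraph could be sketched roughly as follows. The fixed sensing period and the data layout are assumptions made for illustration; the disclosure does not specify how the controller models the observers' scan cycles.

```python
# Hypothetical sketch: the controller predicts when each observing robot
# will next capture a distance image of a given robot, and tells that
# robot to flash its LED at those instants.
SENSING_PERIOD = 0.1  # seconds between distance-image scans (assumed)

def next_sensing_time(last_scan_time, now):
    """Next scan instant of an observer whose last scan was at last_scan_time."""
    elapsed = now - last_scan_time
    cycles = int(elapsed // SENSING_PERIOD) + 1
    return last_scan_time + cycles * SENSING_PERIOD

def flash_commands(observers, now):
    """observers: {robot_id: last_scan_time}. Returns the LED flash time
    the observed robot should use for each observer about to sense it."""
    return {rid: next_sensing_time(t, now) for rid, t in observers.items()}

cmds = flash_commands({"30a": 0.0, "30b": 0.05}, now=0.12)
print({rid: round(t, 3) for rid, t in cmds.items()})  # {'30a': 0.2, '30b': 0.15}
```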
また、距離センサ36が物体を含む画像により前記物体への距離を検知するものである場合、画像の精度が、画像を取得する物体の周囲の明るさに依存する。つまり、前記物体の周囲が暗い場合、距離画像情報の精度が悪くなってしまう。そこで、搬送ロボット30Zは、他の搬送ロボット30Zが当該搬送ロボット30Zの距離画像情報を検知するタイミングで発光してもよい。例えば、搬送ロボット30Zには、搬送ロボット30Z全体を照らすように、搬送ロボット30Zの各頂点に発光ダイオードが配置されていてもよく、外周位置特定部40Zを照らすように外周位置特定部40Zの裏側に発光ダイオードが配置されていてもよい。これにより、他の搬送ロボット30Zが当該搬送ロボット30Zの画像を取得する際、当該搬送ロボット30Zの周囲または外周位置特定部40Zが明るくなり、他の搬送ロボット30Zが取得する距離画像情報の精度が向上する。 Also, when the distance sensor 36 detects the distance to an object from an image containing the object, the accuracy of the image depends on the brightness around the object being imaged. That is, when the surroundings of the object are dark, the accuracy of the distance image information deteriorates. Therefore, the transport robot 30Z may emit light at the timing when another transport robot 30Z detects the distance image information of that transport robot 30Z. For example, the transport robot 30Z may have light-emitting diodes arranged at each of its vertices so as to illuminate the entire transport robot 30Z, or may have light-emitting diodes arranged behind the outer peripheral position specifying unit 40Z so as to illuminate the outer peripheral position specifying unit 40Z. As a result, when another transport robot 30Z captures an image of the transport robot 30Z, the surroundings of the transport robot 30Z or its outer peripheral position specifying unit 40Z are brightened, improving the accuracy of the distance image information acquired by the other transport robot 30Z.
 〔ソフトウェアによる実現例〕
 制御装置10の各機能ブロック(特に、上位指令受付部11、指示発行部12、固有状態取得部13、位置関係算出部14、推定位置算出部15、推定位置送信部16)あるいは、搬送ロボット30の機能ブロック(特に、指示受付部31、走行制御部32、自己位置算出部33、固有状態報知部34、推定位置取得部35)は、集積回路(ICチップ)等に形成された論理回路(ハードウェア)によって実現されてもよいし、ソフトウェアによって実現されてもよい。
[Example of realization by software]
Each functional block of the control device 10 (in particular, the upper command reception unit 11, the instruction issuing unit 12, the unique state acquisition unit 13, the positional relationship calculation unit 14, the estimated position calculation unit 15, and the estimated position transmission unit 16) and each functional block of the transport robot 30 (in particular, the instruction reception unit 31, the travel control unit 32, the self-position calculation unit 33, the unique state notification unit 34, and the estimated position acquisition unit 35) may be realized by a logic circuit (hardware) formed on an integrated circuit (IC chip) or the like, or may be realized by software.
 後者の場合、制御装置10あるいは搬送ロボット30は、各機能を実現するソフトウェアであるプログラムの命令を実行するコンピュータを備えている。このコンピュータは、例えば1つ以上のプロセッサを備えていると共に、前記プログラムを記憶したコンピュータ読み取り可能な記録媒体を備えている。そして、前記コンピュータにおいて、前記プロセッサが前記プログラムを前記記録媒体から読み取って実行することにより、本発明の目的が達成される。 In the latter case, the control device 10 or the transport robot 30 is equipped with a computer that executes program instructions, which are software that implements each function. This computer includes, for example, one or more processors, and a computer-readable recording medium storing the program. In the computer, the processor reads the program from the recording medium and executes it, thereby achieving the object of the present invention.
前記プロセッサとしては、例えばCPU（Central Processing Unit）を用いることができる。前記記録媒体としては、「一時的でない有形の媒体」、例えば、ROM（Read Only Memory）等の他、テープ、ディスク、カード、半導体メモリ、プログラマブルな論理回路などを用いることができる。また、前記プログラムを展開するRAM（Random Access Memory）などを更に備えていてもよい。 As the processor, for example, a CPU (Central Processing Unit) can be used. As the recording medium, a "non-transitory tangible medium" such as a ROM (Read Only Memory), a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used. A RAM (Random Access Memory) into which the program is loaded may further be provided.
 また、前記プログラムは、該プログラムを伝送可能な任意の伝送媒体(通信ネットワークや放送波等)を介して前記コンピュータに供給されてもよい。なお、本発明の一態様は、前記プログラムが電子的な伝送によって具現化された、搬送波に埋め込まれたデータ信号の形態でも実現され得る。 Also, the program may be supplied to the computer via any transmission medium (communication network, broadcast waves, etc.) capable of transmitting the program. Note that one aspect of the present invention can also be implemented in the form of a data signal embedded in a carrier wave in which the program is embodied by electronic transmission.
 (まとめ)
 前記の課題を解決するために、以下の構成を採用する。本発明の一側面に係る搬送システムは、自走式の複数の搬送ロボットと、複数の前記搬送ロボットとの間で通信を行う制御装置とを備え、前記搬送ロボットは、周囲の物体への距離と当該物体からの信号強度とを検知し、距離画像情報として取得する距離画像情報取得部と、前記距離画像情報を用いて自己の向きを含めた位置に関する自己位置情報を算出する自己位置算出部と、前記搬送ロボットの外周に付加された外周位置特定部と、を有し、前記制御装置は、複数の前記搬送ロボットから、それぞれの前記自己位置情報と、前記距離画像情報とを取得する固有状態取得部と、前記搬送ロボットから取得した前記自己位置情報と、当該搬送ロボット以外の前記搬送ロボットから取得した前記距離画像情報に含まれる当該搬送ロボットの位置及び当該搬送ロボットからの前記外周位置特定部を反映した信号強度に関する情報とを用いて、当該搬送ロボットの向きを含めた位置に関する推定位置情報を算出する推定位置算出部と、当該搬送ロボットの推定位置を、当該搬送ロボットに報知する推定位置送信部と、を有する。
(summary)
(Summary) In order to solve the above problems, the following configuration is adopted. A transport system according to one aspect of the present invention includes a plurality of self-propelled transport robots and a control device that communicates with the plurality of transport robots. Each transport robot has a distance image information acquisition unit that detects the distance to a surrounding object and the signal intensity from the object and acquires them as distance image information, a self-position calculation unit that calculates, using the distance image information, self-position information regarding its position including its orientation, and an outer peripheral position specifying unit added to the outer periphery of the transport robot. The control device has a unique state acquisition unit that acquires the self-position information and the distance image information from each of the plurality of transport robots; an estimated position calculation unit that calculates estimated position information regarding the position, including the orientation, of a given transport robot, using the self-position information acquired from that transport robot and the position of that transport robot and the information on the signal intensity reflecting its outer peripheral position specifying unit contained in the distance image information acquired from the transport robots other than that transport robot; and an estimated position transmission unit that notifies that transport robot of its estimated position.
前記構成によれば、搬送システムの搬送ロボットは、周囲の物体への距離と当該物体からの信号強度とを検知し、距離画像情報として取得する。また、自己位置算出部は、距離画像情報を用いて自己の向きを含めた位置に関する自己位置情報を算出する。これにより、搬送ロボットは、周囲の物体に対する当該搬送ロボットの位置を、当該搬送ロボットの向きを含めて特定することができる。 According to the above configuration, a transport robot of the transport system detects the distance to a surrounding object and the signal intensity from the object, and acquires them as distance image information. The self-position calculation unit then uses the distance image information to calculate self-position information regarding the robot's position, including its orientation. This allows the transport robot to identify its position relative to surrounding objects, including its own orientation.
また、搬送システムの制御装置の推定位置算出部では、下記の（1）のみならず（2）に基づいて、搬送ロボット（以下、自機）の推定位置情報が算出される。（1）自己位置算出部により算出された自己位置情報、（2）自機以外の搬送ロボット（以下、他機）の自己位置情報及び他機から自機の外周位置特定部を反映した信号強度に関する情報を含む距離画像情報。 In addition, the estimated position calculation unit of the control device of the transport system calculates the estimated position information of a transport robot (hereinafter, "own robot") based not only on (1) below but also on (2): (1) the self-position information calculated by the self-position calculation unit; (2) the self-position information of transport robots other than the own robot (hereinafter, "other robots") and the distance image information from the other robots, which contains information on the signal intensity reflecting the own robot's outer peripheral position specifying unit.
そのため、推定位置算出部は、（2）により、他機から自機の外周上の位置に対する距離を特定することができるため、他機に対する自機の向きを正確に推定できる。これにより、他機に対する自機の向きを考慮して自機の推定位置情報を算出することができる。 Therefore, by virtue of (2), the estimated position calculation unit can determine the distance from another robot to positions on the own robot's outer periphery, and can thus accurately estimate the own robot's orientation relative to the other robot. The estimated position information of the own robot can thereby be calculated taking its orientation relative to the other robots into account.
また、自機の推定位置情報が、制御装置から自機に報知される。これにより、搬送システムでは、搬送ロボットについての自機の位置および角度に関する推定が、自己位置情報よりも精度が高められた推定位置情報に基づいて行うことができるようになる。その結果、自律的に走行ルートを判断する搬送ロボットが適用された搬送システムにおいて、搬送ロボットの位置の把握の精度を高めることができる。 In addition, the estimated position information of the own robot is reported from the control device to the own robot. As a result, in the transport system, each transport robot can estimate its own position and orientation based on estimated position information that is more accurate than its self-position information alone. Consequently, in a transport system employing transport robots that autonomously determine their travel routes, the accuracy with which the positions of the transport robots are grasped can be improved.
前記一側面に係る搬送システムにおいて、前記推定位置算出部は、前記推定位置情報を、前記距離画像情報の加重平均により算出し、前記当該搬送ロボットに対する、当該搬送ロボット以外の前記搬送ロボットの位置及び向きにより前記加重平均の重み付け係数を決定してもよい。 In the transport system according to the one aspect, the estimated position calculation unit may calculate the estimated position information by a weighted average of the distance image information, and may determine the weighting coefficients of the weighted average according to the positions and orientations, relative to the transport robot in question, of the transport robots other than that transport robot.
 前記構成によれば、推定位置算出部は、より精度の高い距離画像情報を採用して推定位置情報を算出することができるので、推定位置情報の精度が向上する。 According to the above configuration, the estimated position calculation unit can calculate the estimated position information using more accurate distance image information, so the accuracy of the estimated position information is improved.
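As one hypothetical way to realize such weighting, a coefficient could fall off with the observer's distance and with how obliquely the observer views the robot. None of the constants or the functional form below come from the disclosure; they are illustrative assumptions only.

```python
import math

def observation_weight(observer_pos, target_pos, observer_heading,
                       max_range=10.0):
    """Hypothetical weight for one observer's distance-image measurement.

    Closer observers, and observers facing the target more directly,
    get larger weights; beyond max_range the weight is zero.
    """
    dx = target_pos[0] - observer_pos[0]
    dy = target_pos[1] - observer_pos[1]
    dist = math.hypot(dx, dy)
    if dist > max_range or dist == 0.0:
        return 0.0
    # Angle between the observer's heading and its line of sight to the target.
    bearing = math.atan2(dy, dx)
    off_axis = abs(math.atan2(math.sin(bearing - observer_heading),
                              math.cos(bearing - observer_heading)))
    distance_term = 1.0 - dist / max_range   # nearer observers weigh more
    angle_term = max(0.0, math.cos(off_axis))  # 1 when facing the target
    return distance_term * angle_term

# An observer 2 m away, looking straight at the target, gets a high weight.
w = observation_weight((0.0, 0.0), (2.0, 0.0), observer_heading=0.0)
print(round(w, 2))  # 0.8
```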
前記一側面に係る搬送システムにおいて、前記距離画像情報取得部は、前記物体に光を照射して前記物体から反射した前記光に基づき前記物体への距離を検知し、前記外周位置特定部は、前記外周の位置により前記光の輝度反射率が異なっていてもよい。 In the transport system according to the one aspect, the distance image information acquisition unit may irradiate the object with light and detect the distance to the object based on the light reflected from the object, and the outer peripheral position specifying unit may have a luminance reflectance for the light that differs depending on the position on the outer periphery.
前記構成によれば、外周位置特定部から反射した光の輝度の強度を判定することにより、搬送ロボットの外周の位置と、距離画像情報取得部から前記位置までの距離とを検知することができる。 According to the above configuration, by judging the intensity of the luminance of the light reflected from the outer peripheral position specifying unit, the position on the outer periphery of the transport robot and the distance from the distance image information acquisition unit to that position can be detected.
前記一側面に係る搬送システムにおいて、前記距離画像情報取得部は、前記物体を含む画像により前記物体への距離を検知し、前記外周位置特定部は、前記外周の位置により異なる模様を有していてもよい。 In the transport system according to the one aspect, the distance image information acquisition unit may detect the distance to the object from an image containing the object, and the outer peripheral position specifying unit may have a pattern that differs depending on the position on the outer periphery.
 前記構成によれば、模様を判定することにより、搬送ロボットの外周の位置と、距離画像情報取得部から前記位置までの距離とを検知することができる。 According to the above configuration, by determining the pattern, it is possible to detect the position of the outer periphery of the transport robot and the distance from the distance image information acquisition unit to the above position.
 前記一側面に係る搬送システムにおいて、前記搬送ロボットは、前記距離画像情報取得部が前記距離画像情報を検知するタイミングで発光してもよい。 In the transport system according to the one aspect, the transport robot may emit light at the timing when the distance image information acquiring section detects the distance image information.
前記構成によれば、搬送ロボットの周囲が暗い場合であっても、距離画像情報取得が当該搬送ロボットに関する距離画像情報を確実に検知することができる。 According to the above configuration, even when the surroundings of the transport robot are dark, the distance image information acquisition unit can reliably detect the distance image information related to the transport robot.
本発明の各態様に係る制御装置は、コンピュータによって実現してもよく、この場合には、コンピュータを前記制御装置が備える各部（ソフトウェア要素）として動作させることにより前記制御装置をコンピュータにて実現させる制御装置の制御プログラム、およびそれを記録したコンピュータ読み取り可能な記録媒体も、本発明の範疇に入る。 The control device according to each aspect of the present invention may be realized by a computer. In this case, a control program for the control device that realizes the control device on the computer by causing the computer to operate as each unit (software element) of the control device, and a computer-readable recording medium on which the program is recorded, also fall within the scope of the present invention.
本発明は上述した各実施形態に限定されるものではなく、請求項に示した範囲で種々の変更が可能であり、開示された技術的手段を適宜組み合わせて得られる実施形態についても本発明の技術的範囲に含まれる。 The present invention is not limited to the above-described embodiments; various modifications are possible within the scope indicated by the claims, and embodiments obtained by appropriately combining the disclosed technical means are also included in the technical scope of the present invention.
 1 搬送システム
 10 制御装置
 11 上位指令受付部
 12 指示発行部
 13 固有状態取得部
 14 位置関係算出部
 15 推定位置算出部
 16 推定位置送信部
 17 マスタ記憶部
 18 上位通信部
 19 マスタ通信部
 30、30a~30c、30M、30Z 搬送ロボット
 31 指示受付部
 32 走行制御部
 33 自己位置算出部
 34 固有状態報知部
 35 推定位置取得部
 36 距離センサ(距離画像情報取得部)
 37 スレーブ記憶部
 38 走行機構部
 39 スレーブ通信部
 40、40Z 外周位置特定部
 41 側面前方特定部(外周位置特定部)
 42 側面後方特定部(外周位置特定部)
 43 後面特定部(外周位置特定部)
 41Z 中心特定部(外周位置特定部)
 R、Ra~Rc 監視レンジ
Reference Signs List
1 transport system
10 control device
11 upper command reception unit
12 instruction issuing unit
13 unique state acquisition unit
14 positional relationship calculation unit
15 estimated position calculation unit
16 estimated position transmission unit
17 master storage unit
18 upper communication unit
19 master communication unit
30, 30a to 30c, 30M, 30Z transport robot
31 instruction reception unit
32 travel control unit
33 self-position calculation unit
34 unique state notification unit
35 estimated position acquisition unit
36 distance sensor (distance image information acquisition unit)
37 slave storage unit
38 travel mechanism unit
39 slave communication unit
40, 40Z outer peripheral position specifying unit
41 side front specifying unit (outer peripheral position specifying unit)
42 side rear specifying unit (outer peripheral position specifying unit)
43 rear surface specifying unit (outer peripheral position specifying unit)
41Z center specifying unit (outer peripheral position specifying unit)
R, Ra to Rc monitoring range

Claims (5)

  1.  自走式の複数の搬送ロボットと、
     複数の前記搬送ロボットとの間で通信を行う制御装置とを備え、
     前記搬送ロボットは、周囲の物体への距離と当該物体からの信号強度とを検知し、距離画像情報として取得する距離画像情報取得部と、前記距離画像情報を用いて自己の向きを含めた位置に関する自己位置情報を算出する自己位置算出部と、前記搬送ロボットの外周に付加された外周位置特定部と、を有し、
     前記制御装置は、
     複数の前記搬送ロボットから、それぞれの前記自己位置情報と、前記距離画像情報とを取得する固有状態取得部と、
     前記搬送ロボットから取得した前記自己位置情報と、当該搬送ロボット以外の前記搬送ロボットから取得した前記距離画像情報に含まれる当該搬送ロボットの位置及び当該搬送ロボットからの前記外周位置特定部を反映した信号強度に関する情報とを用いて、当該搬送ロボットの向きを含めた位置に関する推定位置情報を算出する推定位置算出部と、
     当該搬送ロボットの推定位置を、当該搬送ロボットに報知する推定位置送信部と、を有する、搬送システム。
    1. A transport system comprising:
    a plurality of self-propelled transport robots; and
    a control device that communicates with the plurality of transport robots, wherein
    each transport robot has a distance image information acquisition unit that detects a distance to a surrounding object and a signal intensity from the object and acquires them as distance image information, a self-position calculation unit that calculates, using the distance image information, self-position information regarding its position including its orientation, and an outer peripheral position specifying unit added to the outer periphery of the transport robot, and
    the control device has:
    a unique state acquisition unit that acquires the self-position information and the distance image information from each of the plurality of transport robots;
    an estimated position calculation unit that calculates estimated position information regarding the position, including the orientation, of a given transport robot, using the self-position information acquired from that transport robot and the position of that transport robot and the information on the signal intensity reflecting its outer peripheral position specifying unit contained in the distance image information acquired from the transport robots other than that transport robot; and
    an estimated position transmission unit that notifies that transport robot of its estimated position.
2.  前記推定位置算出部は、前記推定位置情報を、前記距離画像情報の加重平均により算出し、前記当該搬送ロボットに対する、当該搬送ロボット以外の前記搬送ロボットの位置及び向きにより前記加重平均の重み付け係数を決定する請求項1に記載の搬送システム。 2. The transport system according to claim 1, wherein the estimated position calculation unit calculates the estimated position information by a weighted average of the distance image information, and determines weighting coefficients of the weighted average according to the positions and orientations, relative to the transport robot in question, of the transport robots other than that transport robot.
  3.  前記距離画像情報取得部は、前記物体に光を照射して前記物体から反射した前記光に基づき前記物体への距離を検知し、
     前記外周位置特定部は、前記外周の位置により前記光の輝度反射率が異なる、請求項1または2に記載の搬送システム。
    3. The transport system according to claim 1 or 2, wherein the distance image information acquisition unit irradiates the object with light and detects the distance to the object based on the light reflected from the object, and the outer peripheral position specifying unit has a luminance reflectance for the light that differs depending on the position on the outer periphery.
  4.  前記距離画像情報取得部は、前記物体を含む画像により前記物体への距離を検知し、
     前記外周位置特定部は、前記外周の位置により異なる模様を有する、請求項1または2に記載の搬送システム。
    4. The transport system according to claim 1 or 2, wherein the distance image information acquisition unit detects the distance to the object from an image containing the object, and the outer peripheral position specifying unit has a pattern that differs depending on the position on the outer periphery.
  5.  前記搬送ロボットは、前記距離画像情報取得部が前記距離画像情報を検知するタイミングで発光する、請求項4に記載の搬送システム。 The transport system according to claim 4, wherein the transport robot emits light at the timing when the distance image information acquisition unit detects the distance image information.
PCT/JP2021/046856 2021-03-11 2021-12-17 Conveyance system WO2022190514A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-039271 2021-03-11
JP2021039271A JP2022139054A (en) 2021-03-11 2021-03-11 Carrier system

Publications (1)

Publication Number Publication Date
WO2022190514A1 true WO2022190514A1 (en) 2022-09-15

Family

ID=83227540

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/046856 WO2022190514A1 (en) 2021-03-11 2021-12-17 Conveyance system

Country Status (2)

Country Link
JP (1) JP2022139054A (en)
WO (1) WO2022190514A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07129237A (en) * 1993-11-01 1995-05-19 Nippon Telegr & Teleph Corp <Ntt> Method for recognizing in-environment position of mobile robot
JP2013057541A (en) * 2011-09-07 2013-03-28 Ihi Corp Method and device for measuring relative position to object
WO2017090108A1 (en) * 2015-11-25 2017-06-01 株式会社日立製作所 Shelf arrangement system, conveyance robot, and shelf arrangement method
JP2019091273A (en) * 2017-11-15 2019-06-13 株式会社日立製作所 Carrier system, carrier control system and carrier control method
JP2021026637A (en) * 2019-08-08 2021-02-22 三菱重工業株式会社 Position calculation system, position calculation method, and unmanned carrier


Also Published As

Publication number Publication date
JP2022139054A (en) 2022-09-26


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21930359

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21930359

Country of ref document: EP

Kind code of ref document: A1