WO2022190514A1 - Transport System - Google Patents

Transport System

Info

Publication number
WO2022190514A1
Authority
WO
WIPO (PCT)
Prior art keywords
transport robot
transport
information
image information
distance image
Application number
PCT/JP2021/046856
Other languages
English (en)
Japanese (ja)
Inventor
Yuji Kawamoto
Takahiro Inoue
Original Assignee
OMRON Corporation
Application filed by OMRON Corporation
Publication of WO2022190514A1

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02: Control of position or course in two dimensions

Definitions

  • the present invention relates to a transport system equipped with a transport robot.
  • Self-propelled transport robots such as Automated Guided Vehicles (AGV) and Automated Guided Forklifts (AGF) have been proposed for use in factories and warehouses.
  • a plurality of such transport robots are controlled by a control device through wireless communication to constitute a transport system that realizes automation of transport in a factory or the like.
  • a transport robot that grasps its own position based on distance information obtained from distance sensors such as LiDAR (Light Detection And Ranging) and stereo cameras is also under consideration.
  • Such a transport robot autonomously determines a travel route without installing a guide indicating a predetermined travel route on the floor of a factory, warehouse, or the like.
  • The present invention has been made in view of such circumstances, and aims to improve the accuracy with which the position of a transport robot is grasped in a transport system that uses transport robots that autonomously determine their travel routes.
  • A transport system includes a plurality of self-propelled transport robots and a control device that communicates with the plurality of transport robots. Each transport robot includes a distance image information acquisition unit that detects the distance to surrounding objects and the signal strength from those objects and acquires them as distance image information, a self-position calculation unit that calculates self-position information regarding the position of the robot, including its orientation, and an outer circumference position specifying unit attached to the outer circumference of the transport robot.
  • The control device includes a state acquisition unit that acquires the self-position information and the distance image information from each of the plurality of transport robots; an estimated position calculation unit that calculates estimated position information regarding the position, including orientation, of a transport robot using the position of that robot included in its self-position information, the distance image information acquired from the other transport robots, and the information about the signal strength that reflects that robot's outer circumference position specifying unit; and an estimated position transmission unit that notifies each transport robot of its estimated position.
  • the accuracy of grasping the position of the transport robot is enhanced.
  • FIG. 1 is a block diagram showing the configuration of the main parts of a transport system according to an embodiment of the present invention.
  • FIG. 2 is a diagram showing an example of the external shape of a transport robot of the transport system according to the embodiment of the present invention.
  • FIG. 3 is a floor map schematically showing an example of a factory to which the transport system according to the embodiment of the present invention is applied.
  • FIG. 4 is a diagram for explaining a method by which a transport robot detects surrounding objects using a distance sensor.
  • FIG. 5 is a graph showing the distance image information acquired and output by the distance sensor of the transport robot in the example shown in FIG. 4.
  • FIG. 6 is a diagram for explaining the same detection method in a situation where obstacles exist within the monitored area.
  • FIG. 7 is a diagram showing a situation in which, in the case shown in FIG. 4, another transport robot is within the monitoring range of the transport robot.
  • FIG. 8 is a graph showing the distance image information acquired and output by the distance sensor of the transport robot in the example shown in FIG. 7.
  • FIG. 9 is an image diagram of the distance image information of the outer circumference position specifying unit.
  • FIG. 10 is a diagram showing a situation in which a specific transport robot exists within the monitoring ranges of a plurality of other transport robots.
  • FIG. 11 is a flowchart showing characteristic operations performed by the transport system according to the embodiment of the present invention.
  • FIG. 12 is a diagram showing an example of the external form of a transport robot of a transport system according to a modification of the present invention.
  • FIG. 1 is a block diagram showing the configuration of a transport system 1 according to Embodiment 1.
  • the transport system 1 includes a control device 10 and a plurality of self-propelled transport robots 30 .
  • the control device 10 communicates with a plurality of transport robots 30 and gives instructions to each of the transport robots 30 .
  • The transport robot 30 has a distance sensor 36 (distance image information acquisition unit) that detects the distance to surrounding objects and the signal strength from those objects and acquires them as distance image information, and a self-position calculation unit 33 that calculates self-position information regarding the position of the robot, including its orientation. The transport robot 30 further has an outer circumference position specifying unit 40 attached to its outer circumference.
  • The control device 10 has a unique state acquisition unit 13 that acquires the self-position information and the distance image information from each transport robot 30.
  • The control device 10 also has an estimated position calculation unit 15 that calculates estimated position information regarding the position of the transport robot 30 using the following information: (1) the position of the transport robot 30 included in the self-position information acquired from that robot; (2) the distance image information acquired from transport robots other than that robot; and (3) the signal strength information that reflects the outer circumference position specifying unit of each transport robot 30 included in the distance image information.
  • control device 10 has an estimated position transmitting section 16 that notifies the transport robot 30 of the estimated position of the transport robot 30 .
  • The estimated position information, which is information about the position of the self-propelled transport robot 30, is calculated in the control device 10 based on the self-position information calculated by the transport robot 30 itself (hereinafter, the own robot) and the distance image information from transport robots other than the transport robot 30 (hereinafter, other robots).
  • The estimated position information is notified from the control device 10 to the transport robot 30 itself. In the transport system 1, the transport robot 30 can therefore grasp its position based on estimated position information whose accuracy is higher than that of the self-position information alone. A transport system with improved position estimation accuracy for self-propelled transport robots can thus be realized.
  • FIG. 2 is a diagram showing an appearance example of the transport robot 30 of the transport system 1 according to this application example.
  • FIG. 3 is a diagram schematically showing a floor map of a factory 100, which is an example of areas such as factories and warehouses to which the transport system 1 according to this application example can be applied.
  • the side of the transport robot 30 where the distance sensor 36 is arranged is the front of the transport robot 30, the opposite side is the rear, and the direction perpendicular to the front-rear direction is the left-right direction.
  • the transport robot 30 is shown with the outer peripheral position specifying unit 40 omitted.
  • Self-propelled transport robots 30 that transport objects such as products, semi-finished products, parts, raw materials, tools, jigs, packaging materials, and the cassettes that store them are deployed in the factory 100. Furthermore, a self-propelled transport robot 30M, in which a robot arm (manipulator) for gripping objects to be transported is mounted on a cart-type transport robot, may be arranged as part of the transport system 1.
  • the transport robot 30M is an example of the transport robot 30, and shall be included in the transport robot 30 in the following description.
  • The form of the transport robot 30 may be an automated guided vehicle (AGV), an AGF, or another form of self-propelled transport device.
  • Shelves 110 on which objects to be transported can be placed are installed in the factory 100. The factory 100 also contains production facilities for processing, assembling, treating, or inspecting the objects to be transported, or for performing such work using them.
  • the transport robot 30 of the transport system 1 can transport objects to be transported between these facilities.
  • The position of each transport robot 30 is represented by XY coordinates defined on the floor of the factory 100.
  • the orientation of each transfer robot 30 is represented by an angle ⁇ defined with respect to the XY coordinates.
  • the position and orientation of each transfer robot 30 can be expressed as (X, Y, ⁇ ).
  • information about the position of the transport robot means information about the position and orientation (X, Y, ⁇ ) of the transport robot.
  • the transport system 1 includes a control device 10 and a plurality of self-propelled transport robots 30 .
  • the control device 10 is an information processing system responsible for managing transportation, which is sometimes called a transportation system server (AMHS server: Automated Material Handling System Server).
  • the control device 10 transmits a more specific transport instruction to the transport robot 30 in the transport system 1 based on a command from the upper information processing system or the like.
  • the control device 10 may be any information processing system capable of executing such processing, and does not need to be a device physically housed in one housing.
  • the upper information processing system that manages the production of products in the production factory may be called a manufacturing execution system server (MES server).
  • Similarly, an upper information processing system that manages a warehouse may be called a warehouse management system server (WMS server).
  • The transport robot 30 has functional blocks of an instruction reception unit 31, a travel control unit 32, a self-position calculation unit 33, a unique state notification unit 34, and an estimated position acquisition unit 35.
  • The transport robot 30 also has a distance sensor 36, a slave storage unit 37, a traveling mechanism unit 38, a slave communication unit 39, and an outer circumference position specifying unit 40.
  • the distance sensor 36 is arranged on the front side of the transport robot 30 and monitors the front of the transport robot 30 in the running direction.
  • The distance sensor 36 consists of two LiDARs and can acquire distance image information showing the distance to objects present in the monitoring area and the signal strength from those objects.
  • the distance sensor 36 irradiates light on an object existing within the monitoring area and detects the distance to the object based on the light reflected from the object.
  • the distance image information is not limited to a two-dimensional image, and may be a one-dimensional (straight line) image.
  • The monitored area can include a range of up to about 120 degrees to the left and right of straight ahead in the traveling direction of the transport robot 30.
  • An example of such distance image information will be described later. Note that the distance image information is also generally called a distance image.
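As a concrete illustration (a sketch not taken from the patent; the field names, values, and helper function are invented), distance image information of this kind can be modeled as parallel arrays of azimuth, distance, and signal strength, one entry per beam:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DistanceImage:
    angles_deg: List[float]   # azimuth from the robot's front, e.g. -120..+120
    distances_m: List[float]  # measured range per beam
    intensities: List[float]  # reflected signal strength per beam

def beams_within(image: DistanceImage, max_range_m: float) -> List[int]:
    """Indices of beams that hit an object inside the given range."""
    return [i for i, d in enumerate(image.distances_m) if d <= max_range_m]

scan = DistanceImage(
    angles_deg=[-120.0, -60.0, 0.0, 60.0, 120.0],
    distances_m=[5.2, 2.1, 9.9, 3.3, 8.7],
    intensities=[0.2, 0.8, 0.1, 0.6, 0.3],
)
print(beams_within(scan, max_range_m=4.0))  # [1, 3]
```

Plotting `distances_m` against `angles_deg` would yield a graph of the kind shown in FIG. 5, and `intensities` the kind shown in graph 1002 of FIG. 8.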
  • the number of LiDARs arranged on the transport robot 30 may be singular or plural, and if plural, they may be arranged so that the rear can also be monitored. Further, the type of the distance sensor 36 is not limited to LiDAR, and may be a sensor that acquires a distance image such as a stereo camera or a ToF (Time-of-Flight) camera, or other methods.
  • the slave storage unit 37 is a recording device provided in the transport robot 30 .
  • The slave storage unit 37 holds the identification information of the transport robot 30 and the map information of the floor of the factory 100, and also holds, as appropriate, various information necessary for the transport robot 30 to travel, its travel history, its control program, and the like.
  • the travel mechanism unit 38 is a mechanism unit that operates under the control of the travel control unit 32 and allows the transport robot 30 to travel on the floor surface. In the external view of the transport robot 30 in FIG. 2, wheels 38A that are part of the traveling mechanism section 38 are shown.
  • The slave communication unit 39 is a communication interface for the transport robot 30 to communicate with the control device 10. Since communication through the slave communication unit 39 includes real-time distance image information, high speed, low latency, and support for many simultaneous connections are preferable. The slave communication unit 39 therefore preferably performs 5G (5th Generation) communication or Wi-Fi 6 communication (Wi-Fi: registered trademark) with the master communication unit 19 of the control device 10. An antenna 39A that is part of the slave communication unit 39 is shown in the external view of the transport robot 30 in FIG. 2.
  • the outer circumference position specifying part 40 is attached to the outer circumference along the side surface of the transport robot 30 at a position that does not interfere with the irradiation of light from the distance sensor 36 .
  • The outer circumference position specifying unit 40 is attached substantially parallel to the upper surface of the transport robot 30, at a height irradiated with light from the distance sensors 36 of the other transport robots 30.
  • the outer circumference position specifying unit 40 has different light luminance reflectances depending on the position on the outer circumference of the transport robot 30 .
  • the outer circumference position specifying part 40 may be, for example, a reflector.
  • the outer peripheral position specifying unit 40 is added, for example, so that the luminance reflectance increases toward the front of the transport robot 30 (where the distance sensor 36 is installed).
  • The outer circumference position specifying unit 40 includes a side front specifying unit 41 attached to the front of the left and right side surfaces of the transport robot 30, a side rear specifying unit 42 attached to the rear of those side surfaces, and a rear surface specifying unit 43 attached to the rear surface of the transport robot 30.
  • The side front specifying unit 41 has the highest luminance reflectance, and the reflectance decreases in the order of the side front specifying unit 41, the side rear specifying unit 42, and the rear surface specifying unit 43.
  • Accordingly, the signal strength acquired from the side front specifying unit 41 is the highest, and the signal strength decreases in the same order.
  • The signal strength acquired from the rear surface specifying unit 43 is still higher than the signal strength from portions where the outer circumference position specifying unit 40 is not attached.
  • the instruction reception unit 31 is a functional block that receives instructions from the control device 10 via the slave communication unit 39 .
  • the travel control unit 32 is a functional block that controls the travel mechanism unit 38 and causes the transport robot 30 to travel.
  • the travel control unit 32 also calculates odometry data based on the operation information of each mechanism from the travel mechanism unit 38, specifically, the rotary encoder output of the motor and the like.
  • the odometry data is information that indicates the relative position of the transport robot 30 during or after running with respect to the position of the transport robot 30 at a certain point in time.
  • The self-position calculation unit 33 is a functional block that calculates the approximate position of the transport robot 30 from the odometry data and compares the distance image information with the map information around the approximate position to calculate self-position information regarding the position of the transport robot 30, including its orientation.
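The two-stage calculation described above, prediction from odometry followed by a correction obtained from map matching, can be sketched as follows. This is a hypothetical simplification: in reality the scan-to-map matching would itself compute the correction offset, which is passed in here as a given value.

```python
import math

def predict_pose(pose, d_forward, d_theta):
    """Advance a pose (x, y, theta) by an odometry increment in the robot frame."""
    x, y, th = pose
    return (x + d_forward * math.cos(th),
            y + d_forward * math.sin(th),
            th + d_theta)

def correct_pose(pose, offset):
    """Apply the (dx, dy, dtheta) correction found by scan-to-map matching."""
    x, y, th = pose
    dx, dy, dth = offset
    return (x + dx, y + dy, th + dth)

# Drive 1 m straight ahead, then correct the accumulated odometry error.
approx = predict_pose((0.0, 0.0, 0.0), d_forward=1.0, d_theta=0.0)
self_pos = correct_pose(approx, offset=(-0.05, 0.02, 0.01))
print(self_pos)  # (0.95, 0.02, 0.01)
```

The odometry-only pose drifts over time, which is why the correction step (and, in the transport system 1, the estimated position information from the control device) is needed.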
  • The unique state notification unit 34 is a functional block that notifies the control device 10 of the unique information of the transport robot 30 via the slave communication unit 39.
  • the estimated position acquisition unit 35 is a functional block that acquires estimated position information from the control device 10 via the slave communication unit 39 . The estimated position information will be described later.
  • the unique information refers to information about the unique state related to the individual transport robot 30 itself, and includes distance image information and self-position information. Furthermore, for example, it may include information related to the operation of the transport robot 30, such as the state of operation of the transport robot 30, the state of loading of objects to be transported, the remaining battery level, and other internal states.
  • the transport robot 30 performs the required transport in the following manner in accordance with instructions from the control device 10.
  • the instruction receiving unit 31 receives a transport instruction from the control device 10 via the slave communication unit 39 .
  • the travel control unit 32 controls the travel mechanism unit 38 based on the information about the self position and the position information of the transport destination included in the transport instruction, and causes the transport robot 30 to travel to the transport destination.
  • the self-position calculator 33 continues to update the self-position information.
  • The control device 10 has functional blocks of an upper-level command receiving unit 11, an instruction issuing unit 12, a unique state acquisition unit 13, a positional relationship calculation unit 14, an estimated position calculation unit 15, and an estimated position transmission unit 16.
  • the control device 10 also has a master storage unit 17 , an upper communication unit 18 and a master communication unit 19 .
  • the master storage unit 17 is a recording device provided in the control device 10 .
  • the master storage unit 17 holds the map information of the floor of the factory 100, the control program of the control device 10, and also the unique information of each transfer robot 30, the operation log, and the like as appropriate.
  • the host communication unit 18 is a communication interface for the control device 10 to communicate with the host information processing system.
  • the master communication unit 19 is a communication interface for the control device 10 to communicate with the transport robot 30 .
  • the higher-level command receiving unit 11 is a functional block that receives commands from the higher-level information processing system via the higher-level communication unit 18 .
  • the instruction issuing unit 12 issues instructions to individual transport robots 30 based on instructions from the host information processing system, and transmits instructions to individual transport robots 30 through the master communication unit 19 .
  • The instruction issuing unit 12 refers to the unique state of each transport robot 30 acquired by the unique state acquisition unit 13 and issues instructions to the appropriate individual transport robots 30 to execute the command from the host information processing system. The functions of the positional relationship calculation unit 14, the estimated position calculation unit 15, and the estimated position transmission unit 16 will be described later.
  • FIG. 4 is a diagram showing a situation in which the transport robot 30 uses the distance sensor 36 to measure the distance to surrounding objects on the floor of the factory 100.
  • FIG. 5 is a graph showing the distance among the distance image information acquired and output by the distance sensor 36 at that time.
  • The map information includes information on the positions of the shelves and production facilities placed on the floor of the factory 100. For the production facility 120a shown in FIG. 4, information about the positions of the four corners of its housing is registered in the map information as point landmarks MC1 to MC4.
  • the monitoring range R of the distance sensor 36 of the transport robot 30 is indicated by a dotted line.
  • In FIG. 5, the horizontal axis represents the angle from the front of the transport robot 30, that is, the azimuth with respect to the transport robot 30, and the vertical axis represents the distance from the reference point of the transport robot 30.
  • FIG. 5 thus represents, as a graph, the distance component of the distance image information.
  • The monitoring range R is also shown in FIG. 5.
  • the distance image information acquired and output by the distance sensor 36 includes information about the distance to an object positioned within the monitoring range R.
  • The self-position calculation unit 33 calculates the approximate position, including orientation, of the transport robot 30 based on past self-position information and the odometry data, and compares the distance image information with the map information around the area where the monitoring range R assumed from that position is located. Since the approximate position calculated from the odometry data contains an error, a discrepancy occurs between the map information and the distance image information.
  • the self-position calculator 33 calculates self-position information, which is information about the position of the transport robot 30 itself, by correcting the deviation from the approximate position.
  • FIG. 6 shows a situation in which a worker W has entered the monitoring range R of the distance sensor 36 of the transport robot 30 and a temporary article Ob is placed. Due to the occurrence of occlusion by the worker W and the temporarily placed article Ob, a shielded area Da is generated in which the distance sensor 36 cannot grasp the distance to the object registered in the map information.
  • The self-position calculation unit 33 also calculates accuracy information for the self-position information according to the matching situation between the map information and the distance image information, for example, how many point landmarks and line landmarks were matched.
  • FIG. 7 shows a situation in which another transport robot 30b exists within the monitoring range R of the distance sensor 36 of the transport robot 30a in the situation of FIG.
  • FIG. 8 is a graph showing the distance image information acquired and output by the distance sensor 36 of the transport robot 30a in the example shown in FIG. 7. More specifically, graph 1001 in FIG. 8 represents, in the situation of FIG. 7, the distance component of the distance image information, with the horizontal axis representing the angle from the front of the transport robot 30a and the vertical axis representing the distance.
  • a graph 1002 in FIG. 8 is a graph showing the signal intensity of the distance image information, with the horizontal axis representing the angle from the front of the transport robot 30a and the vertical axis representing the signal intensity in the situation of FIG.
  • the distance image information acquired by the transport robot 30a includes information about the position of the other transport robot 30b and information about the signal intensity reflecting the outer circumference position specifying part 40 of the transport robot 30b. Specifically, as shown in a graph 1002, the transport robot 30a acquires a high signal strength in the distance image information obtained by the side front specifying unit 41 of the outer peripheral position specifying unit 40 of the transport robot 30b. Further, in the distance image information obtained by the side rear specifying unit 42 , the transport robot 30 a obtains a signal strength lower than that of the side front specifying unit 41 and higher than that of the rear surface specifying unit 43 .
  • In the distance image information obtained from the rear surface specifying unit 43, the transport robot 30a obtains a signal strength lower than that of the side rear specifying unit 42 but higher than that from portions other than the outer circumference position specifying unit 40. That is, the acquired signal strength increases toward the front of the transport robot 30b.
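The ordering of signal strengths described above suggests a simple decoding rule. The sketch below is illustrative only: the threshold values are invented, since the patent does not specify numeric reflectances.

```python
# Hypothetical sketch: infer which part of another robot's outer
# circumference position specifying unit produced a return, from the
# acquired signal strength. Thresholds are invented for illustration.
def classify_marker(intensity: float) -> str:
    if intensity >= 0.8:
        return "side_front"  # side front specifying unit 41: highest reflectance
    if intensity >= 0.5:
        return "side_rear"   # side rear specifying unit 42
    if intensity >= 0.3:
        return "rear"        # rear surface specifying unit 43
    return "no_marker"       # bare body or unrelated object

print([classify_marker(v) for v in (0.9, 0.6, 0.35, 0.1)])
# ['side_front', 'side_rear', 'rear', 'no_marker']
```

Because the intensity pattern encodes which side of the robot is being seen, the observing robot can recover not just the position but also the orientation of the observed robot.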
  • A range QV in FIG. 9 is the range of the distance image information of the transport robot 30b acquired by the transport robot 30a when the transport robot 30a senses the transport robot 30b in the situation of FIG. 7.
  • In the point cloud of the distance image information of the transport robot 30b acquired by the transport robot 30a, shown in FIG. 9, the brightness differs for the point cloud at each location.
  • the acquired signal strength is higher for the distance image information in front of the transport robot 30b.
  • the position including the orientation of the transport robot 30b can be accurately calculated from the self-position information of the transport robot 30a and the distance image information acquired by the transport robot 30a.
  • FIG. 10 shows a situation where the transport robot 30a exists within the monitoring range Rb of the transport robot 30b and within the monitoring range Rc of the transport robot 30c.
  • Regarding the position of the transport robot 30a, there are the following three types of information: (i) the self-position information calculated by the transport robot 30a itself; (ii) the position of the transport robot 30a calculated from the distance image information acquired by the transport robot 30b; and (iii) the position of the transport robot 30a calculated from the distance image information acquired by the transport robot 30c.
  • estimated values X1 to X3 are obtained from (i) to (iii) above for the coordinate X, respectively.
  • a simple average of the estimates X1-X3 can be the combined position estimate.
  • a weighted average using weighting coefficients determined according to the accuracy information of each of the estimates X1 to X3 can be used as the integrated position estimate.
  • Alternatively, for each estimate, a probability distribution function such as a normal distribution is calculated, having a variance determined according to the respective accuracy information and peaking at the estimated value, and these probability distribution functions are added together.
  • The peak of the resulting distribution function can then be taken as the integrated position estimate. The same applies to the coordinate Y and the orientation θ.
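The three integration strategies above can be sketched for a single coordinate as follows. All estimates, accuracies, and the exact shape of the distribution are invented for illustration; this is not the patent's implementation.

```python
import math

def simple_average(estimates):
    return sum(estimates) / len(estimates)

def weighted_average(estimates, accuracies):
    # Weighting coefficients determined from the accuracy information.
    return sum(e * a for e, a in zip(estimates, accuracies)) / sum(accuracies)

def mixture_peak(estimates, accuracies):
    # Sum of normal-like distributions, each peaking at an estimate with a
    # spread set by its accuracy; the overall peak is found by grid search.
    def density(x):
        return sum(a * math.exp(-a * (x - e) ** 2)
                   for e, a in zip(estimates, accuracies))
    grid = [9.0 + i * 0.001 for i in range(2001)]  # search 9.0 .. 11.0
    return max(grid, key=density)

x_est = [10.0, 10.4, 9.8]  # estimates X1..X3 from sources (i)-(iii)
acc = [0.9, 0.6, 0.3]      # accuracy information per source

print(round(simple_average(x_est), 3))         # 10.067
print(round(weighted_average(x_est, acc), 3))  # 10.1
```

Both weighted variants pull the result toward X1, the estimate with the highest accuracy information, which is the intended effect of the weighting.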
  • the weighting coefficients may be determined as follows. That is, the weighting factor of each distance image information may be determined according to the positions and orientations of the transport robots 30b and 30c other than the transport robot 30a with respect to the transport robot 30a.
  • the estimated position calculator 15 determines the orientation of the transport robot 30b and the transport robot 30c with respect to the transport robot 30a from the distance image information of the transport robot 30a. Based on this, the estimated position calculator 15 determines weighting coefficients for the distance image information from the transport robot 30b and the distance image information from the transport robot 30c, which are employed when calculating the estimated position of the transport robot 30a.
  • Since the transport robot 30a is in the straight-ahead direction of the transport robot 30b, it can be determined that the accuracy of the distance image information regarding the transport robot 30a acquired by the transport robot 30b is high. On the other hand, since the transport robot 30a is positioned at an angle from the straight-ahead direction of the transport robot 30c, it can be determined that the accuracy of the distance image information regarding the transport robot 30a acquired by the transport robot 30c is low. Therefore, when calculating the estimated position of the transport robot 30a, the estimated position calculation unit 15 increases the weighting coefficient of the distance image information of the transport robot 30b and decreases that of the distance image information of the transport robot 30c.
  • the transport robot 30b exists within the monitoring range Ra of the transport robot 30a and within the monitoring range Rc of the transport robot 30c. Therefore, information on the position of the transport robot 30b can also be calculated with improved accuracy based on information from the other transport robots 30a and 30c. In this way, based on mutual information of the transport robots 30, it is possible to obtain information on the position of the transport robot 30 with higher accuracy.
  • Step S1 In each transport robot 30 of the transport system 1, the distance sensor 36 acquires distance image information. Further, the travel control unit 32 calculates odometry data.
  • Step S2 In each transport robot 30 of the transport system 1, the self-position calculator 33 calculates self-position information based on the distance image information and the odometry data.
  • Step S3 In each transport robot 30 of the transport system 1, the unique state notification unit 34 transmits the unique information, including the distance image information and the self-position information, together with the identification information of the transport robot 30 stored in the slave storage unit 37, to the control device 10 via the slave communication unit 39.
  • Step S4 The unique state acquisition unit 13 of the control device 10 acquires unique information with identification information from each transport robot 30 through the master communication unit 19 .
  • Step S5 The positional relationship calculator 14 of the control device 10 calculates the monitoring range R of each transport robot 30 from the self-position information of each transport robot 30 included in the unique information. At that time, each transport robot 30 is identified based on the identification information of the transport robot 30 attached to the unique information. Further, the positional relationship calculation unit 14 calculates, based on each self-position information and each calculated monitoring range R, which transport robot 30 is within the monitoring range R of the other transport robot 30 .
  • Step S6 The estimated position calculation unit 15 of the control device 10 calculates estimated position information regarding the position of each transport robot 30. In doing so, the estimated position calculation unit 15 uses the position of the transport robot included in the self-position information acquired from that transport robot 30, the distance image information acquired from the other transport robots 30, the information about the signal strength reflecting the outer circumference position specifying unit, and also the self-position information acquired from the other transport robots 30. The estimated position information is calculated by the method of computing the integrated position estimate described above under the principle of estimated position calculation.
  • Step S7 The estimated position transmission unit 16 of the control device 10 transmits the estimated position information of each transport robot 30 calculated by the estimated position calculation unit 15 to each transport robot 30 via the master communication unit 19. do. At that time, the estimated position transmitting unit 16 identifies each transport robot 30 based on the acquired identification information of the transport robot 30 .
  • Step S8 In each transport robot 30 of the transport system 1, the estimated position acquisition unit 35 acquires its own estimated position information from the control device 10 through the slave communication unit 39. The estimated position acquiring unit 35 updates the information regarding the position of the own aircraft based on the acquired estimated position information.
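As a concrete illustration of the range check in step S5, the sketch below computes, from each robot's self-position, which robots fall within the monitoring range R of which others. The function name, data layout, and radius value are illustrative assumptions, not the claimed implementation:

```python
import math

def robots_in_monitoring_range(poses, radius):
    """Given {robot_id: (x, y)} self-positions and a common monitoring
    radius R, return {observer_id: [ids of robots inside its range R]}."""
    result = {}
    for observer, (ox, oy) in poses.items():
        inside = [
            rid for rid, (x, y) in poses.items()
            if rid != observer and math.hypot(x - ox, y - oy) <= radius
        ]
        result[observer] = sorted(inside)
    return result

# Robots 30A and 30B are 5 units apart; 30C is off on its own.
poses = {"30A": (0.0, 0.0), "30B": (3.0, 4.0), "30C": (20.0, 0.0)}
print(robots_in_monitoring_range(poses, radius=10.0))
# → {'30A': ['30B'], '30B': ['30A'], '30C': []}
```

In a real system the monitoring range need not be a circle of fixed radius; the same membership map is what step S6 consults to know whose distance image information covers whom.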
  • In this way, the position estimation of the transport robot 30 is performed based not only on the distance image information from the distance sensor 36 of the robot itself but also on the distance image information from the distance sensors 36 of the other robots. Therefore, the position estimation accuracy of the transport robot 30 can be further improved compared to the case where the position estimation is performed using only the distance image information from the robot's own distance sensor 36.
  • The outer peripheral position specifying unit 40 is applied to positions irradiated by the light from the distance sensors 36 of the other transport robots 30, and its luminance reflectance differs depending on the position on the outer periphery.
  • Although the distance sensor 36 may detect the distance to an object from an image including the object, as a stereo camera does, it is not limited to such an example and may be selected as appropriate according to the embodiment.
  • A transport robot 30Z, which is a modified example of the transport robot 30, may have an outer peripheral position specifying unit 40Z instead of the outer peripheral position specifying unit 40.
  • FIG. 12 is a diagram showing an example of the outer shape of the transport robot 30Z of the transport system according to the modification of the present invention.
  • The outer peripheral position specifying unit 40Z has, in the upper part of the transport robot 30Z, patterns along the outer periphery that differ depending on the position on the outer periphery.
  • The outer peripheral position specifying unit 40Z has a center specifying portion 41Z at the central portion, in the length direction along the outer periphery, of the outer peripheral position specifying unit 40Z on each side.
  • The center specifying portion 41Z has a different pattern on each side, and by discriminating the pattern it is possible to determine which side of the transport robot 30Z is being observed.
  • Since the orientation of the transport robot 30Z can thus be accurately determined, information regarding the position of the transport robot 30Z can be obtained with higher accuracy based on the mutual information of the transport robots 30Z.
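One way such a per-side pattern could be used by an observer is sketched below. The pattern strings, side labels, and angle conventions are hypothetical values chosen only for illustration; they are not the patterned markings of the embodiment:

```python
# Assumed: each side of robot 30Z carries a distinct pattern, so recognizing
# the pattern in the distance image tells the observer which side it sees,
# and hence the robot's heading. Angles are degrees, measured CCW.
SIDE_BY_PATTERN = {"||||": "front", "||.|": "left", "|.||": "rear", "|..|": "right"}

# Direction each side faces, relative to the robot's heading (assumed CCW).
SIDE_FACING_OFFSET = {"front": 0.0, "left": 90.0, "rear": 180.0, "right": 270.0}

def estimate_heading(observed_pattern, bearing_to_robot_deg):
    """Heading of the observed robot, given the pattern seen on the side
    facing the observer and the observer-to-robot bearing."""
    side = SIDE_BY_PATTERN[observed_pattern]
    # The side facing the observer points back along the line of sight.
    facing_dir = (bearing_to_robot_deg + 180.0) % 360.0
    return (facing_dir - SIDE_FACING_OFFSET[side]) % 360.0

print(estimate_heading("||.|", bearing_to_robot_deg=0.0))  # left side seen → 90.0
```

The point of the center specifying portion 41Z is precisely that this lookup becomes unambiguous: without a per-side pattern, the observer could recover the outline of the robot but not which side of it is in view.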
  • The transport robot 30Z may instead have a light-emitting diode at a predetermined location on the outer periphery as the outer peripheral position specifying unit 40Z.
  • For example, a light-emitting diode may be provided on the front surface of the transport robot 30Z and made to emit light at the timing (sensing timing) at which another transport robot 30Z detects the distance image information of the transport robot 30Z.
  • In this case, the intensity of the signal from the front surface of the transport robot 30Z in the distance image information is increased, so the orientation of the transport robot 30Z can be accurately grasped by analyzing the distance image information.
  • To realize this, the positional relationship calculation unit 14 of the control device 10 may perform the following processing. That is, the positional relationship calculation unit 14 estimates, based on the unique information, the timing at which another transport robot 30Z, within whose monitoring range R the transport robot 30Z is located, senses the transport robot 30Z. Then, the positional relationship calculation unit 14 instructs the transport robot 30Z to cause its light-emitting diode to emit light at that estimated timing.
  • In this way, the transport robot 30Z can emit light at the timing when another transport robot 30Z detects the distance image information of the transport robot 30Z.
  • The transport robot 30Z may also be provided with light-emitting diodes at each vertex of the transport robot 30Z so as to illuminate the entire transport robot 30Z, or a light-emitting diode may be placed so as to illuminate its surroundings. As a result, when another transport robot 30Z acquires an image of the transport robot 30Z, the surroundings or the outer peripheral position specifying unit 40Z of the transport robot 30Z appear bright, and the accuracy of the distance image information acquired by the other transport robot 30Z improves.
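The sensing-timing estimation attributed above to the positional relationship calculation unit 14 can be sketched as follows, under the assumption (not stated in the source) that the other robot's sensor performs a constant-rate 360° sweep whose period and start time are known from the unique information:

```python
import math

def next_sensing_time(scan_start, period, sensor_pos, target_pos):
    """Time at which a constant-rate 360° sweep, started at `scan_start`
    from angle 0° (+x axis, CCW), points at `target_pos`. All names and the
    sweep model are illustrative assumptions for this sketch."""
    dx = target_pos[0] - sensor_pos[0]
    dy = target_pos[1] - sensor_pos[1]
    bearing = math.degrees(math.atan2(dy, dx)) % 360.0
    return scan_start + period * bearing / 360.0

# Other robot at the origin scans every 0.1 s starting at t = 2.0 s;
# robot 30Z sits due north (bearing 90°), so it is swept a quarter-period in.
t = next_sensing_time(scan_start=2.0, period=0.1, sensor_pos=(0, 0), target_pos=(0, 5))
print(t)  # → 2.025
```

The control device would then instruct robot 30Z to flash its light-emitting diode at (or briefly around) the returned time, so that the flash coincides with the moment the observing sensor's beam crosses it.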
  • Each functional block of the control device 10 (particularly the upper command receiving unit 11, the instruction issuing unit 12, the unique state acquisition unit 13, the positional relationship calculation unit 14, the estimated position calculation unit 15, and the estimated position transmission unit 16) or of the transport robot 30 may be realized by a logic circuit (hardware) or by software.
  • In the latter case, the control device 10 or the transport robot 30 is equipped with a computer that executes the instructions of a program, which is the software implementing each function.
  • This computer includes, for example, one or more processors and a computer-readable recording medium storing the program.
  • The processor reads the program from the recording medium and executes it, thereby achieving the object of the present invention.
  • As the processor, for example, a CPU (Central Processing Unit) can be used.
  • As the recording medium, a "non-transitory tangible medium" such as a ROM (Read Only Memory), a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used.
  • A RAM (Random Access Memory) or the like for loading the program may further be provided.
  • The program may be supplied to the computer via any transmission medium (a communication network, broadcast waves, etc.) capable of transmitting the program.
  • Note that one aspect of the present invention can also be realized in the form of a data signal, embedded in a carrier wave, in which the program is embodied by electronic transmission.
  • A transport system according to one aspect of the present invention includes a plurality of self-propelled transport robots and a control device that communicates with the plurality of transport robots. Each transport robot includes a distance image information acquisition unit that detects the distance to surrounding objects and the signal intensity from the objects and acquires them as distance image information, a self-position calculation unit that calculates self-position information regarding its own position, including its orientation, using the distance image information, and an outer peripheral position specifying unit added to the outer periphery of the transport robot.
  • The control device includes a unique state acquisition unit that acquires the self-position information and the distance image information from each of the plurality of transport robots, an estimated position calculation unit that calculates estimated position information regarding the position, including the orientation, of a transport robot using the position of that transport robot included in the self-position information acquired from it and the information, included in the distance image information acquired from the other transport robots, about the position of that transport robot and the signal intensity from it reflecting the outer peripheral position specifying unit, and an estimated position transmission unit that notifies each transport robot of its estimated position.
  • With the above configuration, the transport robot of the transport system detects the distance to surrounding objects and the signal intensity from the objects and acquires them as distance image information. The self-position calculation unit then calculates self-position information regarding its own position, including its orientation, using the distance image information. Thereby, the transport robot can specify its position relative to the surrounding objects, including its orientation.
  • Furthermore, the estimated position calculation unit of the control device of the transport system calculates the estimated position information of a transport robot (hereinafter, the own robot) based on (2) as well as (1) below:
  (1) the self-position information calculated by the self-position calculation unit of the own robot;
  (2) the distance image information acquired by the transport robots other than the own robot (hereinafter, other robots), including information about the position of the own robot and the signal intensity from it.
  • With this configuration, the estimated position calculation unit can specify from (2) the distance from an other robot to positions on the outer periphery of the own robot, so the orientation of the own robot with respect to that other robot can be accurately estimated. Thereby, the estimated position information of the own robot can be calculated taking into account the orientation of the own robot with respect to the other robots.
  • The estimated position information of the own robot is then notified to it by the control device.
  • As a result, the transport robot can estimate its position and angle based on estimated position information whose accuracy is higher than that of the self-position information alone.
  • In the transport system, the estimated position calculation unit may calculate the estimated position information by a weighted average of the distance image information, and may determine the weighting factors of the weighted average according to the positions and orientations, relative to the transport robot, of the transport robots other than that transport robot.
  • With this configuration, the estimated position calculation unit can calculate the estimated position information giving greater weight to more accurate distance image information, so the accuracy of the estimated position information improves.
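A minimal sketch of such a weighted-average fusion is shown below. The weight law (decaying with observer distance) is one plausible way to "determine the weighting factor" from relative position; it, the function name, and the data layout are assumptions for illustration, not the claimed computation:

```python
def fuse_positions(self_estimate, observations):
    """self_estimate: (x, y) from the robot's own self-position information.
    observations: list of ((x, y) estimate, distance of the observing robot).
    Nearer observers get larger weights; the self estimate has weight 1."""
    w_sum, x_sum, y_sum = 1.0, self_estimate[0], self_estimate[1]
    for (x, y), dist in observations:
        w = 1.0 / (1.0 + dist)  # assumed: closer observation -> higher weight
        w_sum += w
        x_sum += w * x
        y_sum += w * y
    return (x_sum / w_sum, y_sum / w_sum)

# The robot believes it is at (2, 0); a nearby observer (1 unit away,
# weight 0.5) places it at (0, 0). The fused estimate lands in between,
# closer to the more heavily weighted self estimate.
fused = fuse_positions((2.0, 0.0), [((0.0, 0.0), 1.0)])
print(fused)
```

An orientation-dependent weighting, as the claim allows, would simply make `w` depend on the observing angle as well, e.g. penalizing observations that see only a narrow edge of the outer peripheral position specifying unit.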
  • In the transport system, the distance image information acquisition unit may irradiate the object with light and detect the distance to the object based on the light reflected from the object, and the luminance reflectance of the light at the outer peripheral position specifying unit may differ depending on the position on the outer periphery.
  • In the transport system, the distance image information acquisition unit may detect the distance to the object from an image including the object, and the outer peripheral position specifying unit may have a pattern that varies depending on the position on the outer periphery.
  • In the transport system, the transport robot may emit light at the timing when the distance image information acquisition unit of another transport robot detects the distance image information.
  • With this configuration, the distance image information acquisition unit can reliably detect the distance image information related to that transport robot.
  • The control device of the transport system may be realized by a computer. In this case, a control program of the control device that realizes the control device by the computer, by operating the computer as each unit (software element) included in the control device, and a computer-readable recording medium recording it also fall within the scope of the present invention.


Abstract

The present invention improves the accuracy of estimating the position of a transport robot. A control device (10) in this transport system (1) calculates estimated position information relating to a position, including an orientation, of a transport robot (30) by using: self-position information acquired from the transport robot (30); and information concerning the position of said transport robot, that position being included in distance image information acquired from a transport robot other than said transport robot, and the signal intensity from said transport robot, the signal intensity reflecting an outer peripheral position specifying unit (40).
PCT/JP2021/046856 2021-03-11 2021-12-17 Transport system WO2022190514A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-039271 2021-03-11
JP2021039271A JP2022139054A (ja) 2021-03-11 2021-03-11 Transport system

Publications (1)

Publication Number Publication Date
WO2022190514A1 true WO2022190514A1 (fr) 2022-09-15

Family

ID=83227540

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/046856 WO2022190514A1 (fr) 2021-12-17 Transport system

Country Status (2)

Country Link
JP (1) JP2022139054A (fr)
WO (1) WO2022190514A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024111547A1 (fr) * 2022-11-21 2024-05-30 興和株式会社 Système de commande de robot de transport à déplacement autonome

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07129237A (ja) * 1993-11-01 1995-05-19 Nippon Telegr & Teleph Corp <Ntt> 移動ロボットの環境内位置認識方法
JP2013057541A (ja) * 2011-09-07 2013-03-28 Ihi Corp 対象物との相対位置計測方法と装置
WO2017090108A1 (fr) * 2015-11-25 2017-06-01 株式会社日立製作所 Système d'agencement d'étagère, robot de transport, et procédé d'agencement d'étagère
JP2019091273A (ja) * 2017-11-15 2019-06-13 株式会社日立製作所 搬送車システム、搬送車制御システム及び搬送車制御方法
JP2021026637A (ja) * 2019-08-08 2021-02-22 三菱重工業株式会社 位置演算システム、位置演算方法および無人搬送車



Also Published As

Publication number Publication date
JP2022139054A (ja) 2022-09-26


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21930359; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 21930359; Country of ref document: EP; Kind code of ref document: A1)