WO2019026761A1 - Corps mobile et programme informatique - Google Patents

Corps mobile et programme informatique (Mobile body and computer program)

Info

Publication number
WO2019026761A1
Authority
WO
WIPO (PCT)
Prior art keywords
positioning device
mobile
estimation result
agv
estimation
Prior art date
Application number
PCT/JP2018/028092
Other languages
English (en)
Japanese (ja)
Inventor
伊知朗 宮崎
知好 横山
清水 仁
Original Assignee
日本電産シンポ株式会社
Application filed by 日本電産シンポ株式会社
Priority to CN201880050090.1A (published as CN110998472A)
Priority to JP2019534452A (published as JPWO2019026761A1)
Publication of WO2019026761A1

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions

Definitions

  • The present disclosure relates to a moving body and a computer program for controlling the movement of the moving body.
  • A moving body that performs self-position estimation includes an external sensor such as a laser range finder, for example, and senses the surrounding space while moving to acquire sensor data. The self-location on an environment map can be identified, for example, by matching local map data created from the sensor data around the moving body against environment map data covering a wider area.
  • Japanese Laid-Open Patent Publication No. 2016-224680 discloses a self-position estimation apparatus that includes a first self-position estimation unit and a second self-position estimation unit and executes estimation processing step by step.
  • The first self-position estimation unit obtains a probability distribution of the latest position of the moving body from the sensor data and the environment map, and estimates a first self-position based on that probability distribution.
  • The second self-position estimation unit estimates a second self-position by adding the moving distance and moving direction from the previous step to the current step, obtained by odometry, to the final self-position estimated in the previous step.
  • The weighted average of the first self-position and the second self-position is taken as the final self-position in the current step.
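  • As a rough illustration only (not the apparatus of the cited publication), that step-by-step fusion can be sketched as follows; the weight w, the odometry delta, and the probability-based first estimate used here are illustrative placeholders.

```python
# Minimal sketch of the step-wise fusion described for JP 2016-224680.
# All names and values (w, first_estimate, odometry_delta) are illustrative assumptions.

def fuse_step(prev_final, first_estimate, odometry_delta, w=0.5):
    """Return the final self-position for the current step.

    prev_final      -- (x, y) final self-position estimated in the previous step
    first_estimate  -- (x, y) first self-position taken from the probability distribution
    odometry_delta  -- (dx, dy) movement from the previous step obtained by odometry
    w               -- weight given to the first self-position (0..1)
    """
    # Second self-position: previous final position plus the odometry movement.
    second_estimate = (prev_final[0] + odometry_delta[0],
                       prev_final[1] + odometry_delta[1])
    # The weighted average of the two estimates becomes the final self-position.
    return (w * first_estimate[0] + (1.0 - w) * second_estimate[0],
            w * first_estimate[1] + (1.0 - w) * second_estimate[1])

print(fuse_step((0.0, 0.0), (0.9, 0.1), (1.0, 0.0)))  # -> (0.95, 0.05)
```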
  • WO 2013/002067 discloses a position and attitude estimation system for an autonomous mobile robot that uses a particle filter.
  • This system estimates the position and orientation of the robot using measurement data from a distance sensor, map data, and odometry data from an encoder. An evaluation value of the reliability of the position and orientation estimate is calculated based on the dispersion of the particles. With this system, it can be determined whether the position and orientation estimation of the mobile robot is being performed normally; if it is not, the robot can be decelerated or stopped in an emergency, or a signal indicating that estimation is not being performed normally can be output.
  • The present disclosure provides a technique for further stabilizing the traveling of a moving body that includes two positioning devices based on different sensing methods.
  • A moving body, in an exemplary embodiment, includes: at least one motor; a drive device that controls the at least one motor to move the moving body; a first sensor that outputs first sensor data indicating a sensing result acquired in accordance with the movement of the moving body by a first sensing method; a second sensor that outputs second sensor data indicating a sensing result acquired in accordance with the movement of the moving body by a second sensing method different from the first sensing method; a first positioning device that performs a first estimation operation using the first sensor data to estimate the position of the moving body; a second positioning device that performs a second estimation operation, different from the first estimation operation, using the second sensor data to estimate the position of the moving body; and an arithmetic circuit that selects one of the estimation result of the first positioning device and the estimation result of the second positioning device as the position of the moving body, depending on whether reliability data indicating the likelihood of the estimation result of the first positioning device matches a predetermined condition.
  • According to this configuration, one of the estimation result of the first positioning device and the estimation result of the second positioning device is selected as the position of the moving body, depending on whether the reliability data indicating the likelihood of the estimation result of the first positioning device matches the predetermined condition.
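  • A minimal sketch of that selection follows, assuming (only for illustration) that the reliability data is a single scalar and that the "predetermined condition" is a simple threshold comparison.

```python
# Hypothetical sketch: select the position estimate depending on whether the
# reliability data matches a predetermined condition (assumed here to be a threshold test).

def select_position(first_estimate, second_estimate, reliability, threshold=60.0):
    """Return the estimate used as the position of the moving body.

    first_estimate  -- estimation result of the first positioning device, e.g. (x, y, theta)
    second_estimate -- estimation result of the second positioning device
    reliability     -- reliability data for the first positioning device (assumed scalar, %)
    threshold       -- assumed concrete form of the predetermined condition
    """
    if reliability >= threshold:
        return first_estimate   # condition matched: use the first positioning device
    return second_estimate      # otherwise fall back to the second positioning device
```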
  • FIG. 1 is a block diagram showing an example of a basic configuration of a mobile unit in an exemplary embodiment of the present disclosure.
  • FIG. 2 is a diagram showing an outline of a control system that controls traveling of each AGV according to the present disclosure.
  • FIG. 3 is a view showing an example of a moving space S in which an AGV is present.
  • FIG. 4A shows the AGV and tow truck before being connected.
  • FIG. 4B shows the connected AGV and tow truck.
  • FIG. 5 is an external view of an exemplary AGV according to the present embodiment.
  • FIG. 6A is a diagram showing an example of a first hardware configuration of an AGV.
  • FIG. 6B is a diagram showing a second hardware configuration example of the AGV.
  • FIG. 7A shows an AGV that generates a map while moving.
  • FIG. 7B is a diagram showing an AGV that generates a map while moving.
  • FIG. 7C is a diagram showing an AGV that generates a map while moving.
  • FIG. 7D is a diagram showing an AGV that generates a map while moving.
  • FIG. 7E is a diagram showing an AGV that generates a map while moving.
  • FIG. 7F is a view schematically showing a part of the completed map.
  • FIG. 8 is a diagram showing an example of a hardware configuration of the operation management device.
  • FIG. 9 is a diagram schematically showing an example of the AGV movement route determined by the operation management device.
  • FIG. 10 is a block diagram showing a configuration example of the mobile unit 10.
  • FIG. 11 is a view schematically showing the flow of signals between components in the present embodiment.
  • FIG. 12 is a flowchart showing an example of the operation of the mobile unit 10.
  • FIG. 13 is a diagram for explaining the map switching area.
  • FIG. 14 is a diagram schematically showing an example of the operation of the mobile unit 10.
  • FIG. 15 is a diagram showing temporal changes in velocity of the mobile unit 10 in an embodiment.
  • FIG. 16 is a flowchart illustrating an operation of traveling using encoder coordinates in an embodiment.
  • FIG. 17 is a diagram schematically showing map switching processing in a normal state in which the reliability of LRF coordinates is high.
  • FIG. 18 is a diagram schematically showing map switching processing in the case where the reliability of LRF coordinates decreases while traveling in the map switching area.
  • FIG. 19 is a diagram illustrating an example of a change in LRF coordinates and a change in encoder coordinates.
  • An "unmanned transport vehicle" means a trackless vehicle on which a load is loaded manually or automatically, which travels automatically to a designated location, and from which the load is unloaded manually or automatically.
  • The "unmanned transport vehicle" includes unmanned tow trucks and unmanned forklifts.
  • "Unmanned" means that steering of the vehicle does not require a person; it does not exclude the unmanned transport vehicle carrying a "person (for example, a person who unloads a package)".
  • An "unmanned tow truck" is a trackless vehicle that automatically travels to a designated location while towing a cart onto which a load is loaded and from which it is unloaded manually or automatically.
  • An "unmanned forklift" is a trackless vehicle equipped with a mast that raises and lowers a fork or the like for transferring a load, automatically transfers the load onto the fork, automatically travels to a designated location, and performs automatic cargo handling.
  • a “trackless vehicle” is a vehicle that includes a wheel and an electric motor or engine that rotates the wheel.
  • A "moving body" is a device that moves while carrying a person or a load, and includes a driving device, such as wheels, a biped or multi-legged walking device, or a propeller, that generates traction for movement.
  • The term "moving body" in the present disclosure includes not only unmanned transport vehicles in a narrow sense but also mobile robots and drones.
  • "Automatic traveling" includes traveling based on an instruction from a computer-based operation management system to which the automated guided vehicle is connected by communication, and autonomous traveling by a control device provided in the automated guided vehicle.
  • the autonomous traveling includes not only traveling by the automated guided vehicle toward a destination along a predetermined route, but also traveling by following a tracking target.
  • the automatic guided vehicle may perform manual traveling temporarily based on the instruction of the worker.
  • Although "automatic traveling" generally includes both "guided" traveling and "guideless" traveling, in the present disclosure it means "guideless" traveling.
  • The "guided type" is a method in which guides are installed continuously or intermittently and the automated guided vehicle is guided by using those guides.
  • The "guideless type" is a method of guidance that does not require such guides to be installed.
  • The unmanned transport vehicle in the embodiment of the present disclosure includes a self-position estimation device and travels in a guideless manner.
  • the “self-position estimation device” is a device that estimates the self-location on the environment map based on sensor data acquired by an external sensor such as a laser range finder.
  • SLAM: Simultaneous Localization and Mapping
  • FIG. 1 is a block diagram showing an example of a basic configuration of a mobile unit in an exemplary embodiment of the present disclosure.
  • The moving body 10 in this example includes a first sensor 101, a second sensor 102, a first positioning device 103, a second positioning device 104, an arithmetic circuit 105, at least one electric motor (hereinafter simply referred to as a "motor") 106, and a drive device 107.
  • the first positioning device 103 is connected between the first sensor 101 and the arithmetic circuit 105.
  • the second positioning device 104 is connected between the second sensor 102 and the arithmetic circuit 105.
  • the drive device 107 controls the at least one motor 106 to move the moving body 10.
  • a typical example of the moving body 10 has at least one drive wheel (not shown) mechanically coupled to the motor 106, and can travel on the ground by the traction of the drive wheel.
  • first sensor 101 and the second sensor 102 acquire information corresponding to the movement of the mobile unit 10 by different sensing methods. Each sensing result is used for position estimation of the mobile unit 10.
  • the data output from the first sensor 101 and the second sensor 102 will be referred to as first sensor data and second sensor data, respectively.
  • the “external sensor” is a sensor that senses the external state of the mobile object 10.
  • the external sensors include, for example, a laser range finder, a camera (or an imaging device), a light detection and ranging (LIDAR), a millimeter wave radar, and a magnetic sensor.
  • the “internal sensor” is a sensor that senses the internal state of the mobile object 10.
  • the internal sensors include, for example, a rotary encoder (hereinafter, may be simply referred to as an "encoder”), an acceleration sensor, and an angular acceleration sensor (for example, a gyro sensor).
  • the first sensor 101 and the second sensor 102 are different types of sensors.
  • the first sensor 101 may be an external sensor, and the second sensor 102 may be an internal sensor.
  • the first sensor 101 includes a laser range finder (hereinafter also referred to as LRF), and the second sensor 102 includes at least one rotary encoder.
  • the present disclosure is not limited to such a form.
  • Each of the first sensor 101 and the second sensor 102 is not limited to a specific type of sensor as long as it outputs data used to estimate the position of the mobile object 10.
  • Each of the first sensor 101 and the second sensor 102 may be, for example, a device such as a camera, an imaging device, a magnetic sensor, a LIDAR, a millimeter wave radar, an angular velocity sensor, or an acceleration sensor.
  • While the moving body 10 is moving, the first positioning device 103 performs the first estimation operation using the first sensor data output from the first sensor 101 to estimate the current position of the moving body 10. For example, when the first sensor 101 is a laser range finder, the first positioning device 103 collates map data prepared in advance with data acquired by the laser range finder and estimates at which position on the map the moving body is located. The first positioning device 103 may estimate not only the position but also the orientation (attitude) of the moving body. The first positioning device 103 outputs data indicating the estimation result as first position information.
  • Similarly, the second positioning device 104 performs the second estimation operation using the second sensor data output from the second sensor 102 to estimate the current position of the moving body 10. For example, when the second sensor 102 includes at least one rotary encoder, the second positioning device 104 can estimate the current position from information on an initial position previously recorded on a recording medium such as a memory and information indicating the rotational state of the wheels output from the rotary encoder. The second positioning device 104 may also estimate the orientation as well as the position of the moving body. The second positioning device 104 outputs data indicating the estimation result as second position information.
  • the mobile unit 10 may further include a storage device for storing map data created in advance based on sensor data periodically output from the laser range finder.
  • the laser range finder (LRF) used to create map data may be an LRF (first sensor 101) mounted on the moving object 10 or another LRF.
  • the first positioning device 103 collates the first sensor data with the map data to estimate the position of the mobile object 10. This operation is called "self-position estimation".
  • Self-position estimation may include not only coordinates but also estimation of an angle from a reference axis.
  • the moving body 10 may be a vehicle provided with a plurality of wheels including a first wheel and a second wheel.
  • the at least one motor 106 may include a first motor mechanically connected to the first wheel and a second motor mechanically connected to the second wheel.
  • The moving body 10 may have a first rotary encoder that measures rotation at an arbitrary position in the power transmission mechanism from the first motor to the first wheel, and a second rotary encoder that measures rotation at an arbitrary position in the power transmission mechanism from the second motor to the second wheel.
  • "To measure rotation" means to measure at least the "rotational direction" and the "rotational position (taking the number of rotations into account)".
  • the first and second rotary encoders measure the rotation of the first and second wheels, respectively.
  • The second positioning device 104 measures the relative displacement from a given initial position using the second sensor data output from each of the first rotary encoder and the second rotary encoder, and can estimate the position reached by moving from the initial position by that displacement as the position of the moving body 10.
  • the initial position may be updated periodically or irregularly while the mobile object 10 is traveling.
  • the arithmetic circuit 105 may update the value of the initial position described above with the value of the position (coordinates) estimated by the first positioning device 103. While the moving object 10 is traveling, the arithmetic circuit 105 may perform the update of the initial position at a predetermined cycle or aperiodically.
  • In general, the first position information acquired using the LRF tends to be more reliable than the second position information acquired using the encoders. This is because the odometry data output from the encoders is prone to error, and the error tends to accumulate, due to wheel slippage caused by road surface conditions or displacement at steps.
  • Such a difference in reliability may occur not only in the combination of an LRF and encoders but also in combinations of two other types of sensors (for example, a camera and a gyro sensor). Therefore, of the first position information and the second position information, the one with higher reliability is mainly used, and the other is used in an auxiliary manner.
  • In some situations, however, the first position information may be less reliable than the second position information.
  • For example, the first positioning device 103 may erroneously output coordinates that are completely different from the actual ones. This is likely to occur, for example, when the route contains multiple locations with similar feature points, or when there is an object (in particular, one that is easily confused with a wall or the like) that did not exist at the time of mapping. In such a case, if traveling is continued using the first position information, the vehicle cannot follow an accurate route. As a result, not only may the destination not be reached, but there is also a risk of overruns or collisions.
  • Therefore, the arithmetic circuit 105 selects, from the estimation results of the first positioning device 103 and the second positioning device 104, the one estimated to be more accurate, and controls traveling using it. Accordingly, for example, while the vehicle normally travels using the position information estimated by the first positioning device 103, when it is determined that the reliability of that position information is low, it can switch to traveling using the position information estimated by the second positioning device 104.
  • the arithmetic circuit 105 determines the current position of the moving body 10 based on the first position information and the second position information, and controls the drive device 107.
  • the arithmetic circuit 105 acquires reliability data indicating the likelihood of the estimation result by the first positioning device 103 in addition to the first and second position information.
  • The arithmetic circuit 105 selects one of the estimation result of the first positioning device 103 and the estimation result of the second positioning device 104 as the position of the moving body 10, according to whether or not the reliability data conforms to a predetermined condition.
  • the arithmetic circuit 105 receives an instruction of a destination from, for example, an external device, controls the drive device 107 using the position of the selected mobile unit 10, and moves the mobile unit 10 toward the destination.
  • the reliability data may be output from the first positioning device 103 or may be generated by the arithmetic circuit 105 itself.
  • the first positioning device 103 may output data indicating the degree of coincidence between the first sensor data and the map data as the first reliability data.
  • While the arithmetic circuit 105 is selecting the estimation result of the first positioning device 103 as the position of the moving body 10, it can switch to selecting the estimation result of the second positioning device 104 as the position of the moving body 10 when the value of the first reliability data becomes equal to or less than a switching threshold.
  • Conversely, while the arithmetic circuit 105 is selecting the estimation result of the second positioning device 104 as the position of the moving body 10, it can return to selecting the estimation result of the first positioning device 103 as the position of the moving body 10 when the value of the first reliability data becomes equal to or greater than a recovery threshold.
  • The recovery threshold may be the same value as the switching threshold, or may be a value larger than the switching threshold.
  • For example, the operation can be made more stable by setting the recovery threshold to a value several percent to 30% higher than the switching threshold, as sketched below.
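  • The following is a hedged sketch of that hysteresis; the threshold values and the state variable are assumptions made for illustration, not values taken from the disclosure.

```python
# Illustrative hysteresis between the two positioning devices.
# switching_threshold and recovery_threshold are example values only; the text above
# suggests setting the recovery threshold a few percent to 30% above the switching threshold.

class PositioningSelector:
    def __init__(self, switching_threshold=50.0, recovery_threshold=60.0):
        self.switching_threshold = switching_threshold
        self.recovery_threshold = recovery_threshold
        self.using_first = True  # start by trusting the first positioning device (LRF)

    def update(self, reliability):
        """Update the selection state from the first reliability data (in %)."""
        if self.using_first and reliability <= self.switching_threshold:
            self.using_first = False   # switch to the second positioning device
        elif not self.using_first and reliability >= self.recovery_threshold:
            self.using_first = True    # return to the first positioning device
        return self.using_first

sel = PositioningSelector()
print([sel.update(r) for r in (80, 45, 55, 65)])  # -> [True, False, False, True]
```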
  • When the movement of the coordinates indicated by the first position information does not approximately match the movement of the coordinates indicated by the second position information, the position indicated by the second position information may be selected as the position of the moving body 10. That is, the estimation result of the second positioning device 104 may be selected as the position of the moving body 10 and operation may be continued.
  • The reliability data may include data (referred to as "second reliability data") indicating the difference between the coordinates indicated by the first position information and the coordinates indicated by the second position information.
  • The arithmetic circuit 105 outputs the difference between the position obtained as the estimation result of the first positioning device 103 and the position obtained as the estimation result of the second positioning device 104 as the second reliability data.
  • When the first position information indicates coordinates (x1, y1) and the second position information indicates coordinates (x2, y2), the second reliability data may be, for example, data indicating the absolute value of (x1 - x2), the absolute value of (y1 - y2), or the value of (x1 - x2)² + (y1 - y2)² or its square root.
  • While the arithmetic circuit 105 is selecting the estimation result of the first positioning device 103 as the position of the moving body 10, it can switch to selecting the estimation result of the second positioning device 104 as the position of the moving body 10 when the value of the second reliability data becomes equal to or greater than a predetermined allowable value.
  • Conversely, while the arithmetic circuit 105 is selecting the estimation result of the second positioning device 104 as the position of the moving body 10, it can return to selecting the estimation result of the first positioning device 103 as the position of the moving body 10 when the value of the second reliability data becomes less than the allowable value, or less than a value smaller than the allowable value.
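  • A minimal sketch of how the second reliability data could be computed and compared against the allowable value is given below; the Euclidean form and the tolerance value are assumptions chosen for illustration.

```python
import math

# Hypothetical computation of the second reliability data: the difference between the
# coordinates estimated by the first positioning device and by the second positioning device.

def second_reliability(first_xy, second_xy):
    """Return the Euclidean distance between the two estimates (one of the forms named above)."""
    dx = first_xy[0] - second_xy[0]
    dy = first_xy[1] - second_xy[1]
    return math.hypot(dx, dy)  # sqrt((x1 - x2)**2 + (y1 - y2)**2)

ALLOWABLE = 0.30  # metres; example tolerance, not a value from the disclosure

diff = second_reliability((10.00, 5.00), (10.20, 5.10))
use_second = diff >= ALLOWABLE  # switch to the second positioning device when exceeded
print(round(diff, 3), use_second)  # -> 0.224 False
```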
  • The present disclosure also includes a computer program executed by the arithmetic circuit in the moving body. Such a program is stored in a memory of the moving body.
  • The computer program causes the arithmetic circuit to select one of the estimation result of the first positioning device and the estimation result of the second positioning device as the position of the moving body, depending on whether reliability data indicating the likelihood of the estimation result of the first positioning device matches a predetermined condition.
  • the present embodiment relates to a system provided with an unmanned transport vehicle as an example of a mobile object.
  • Hereinafter, the unmanned transport vehicle is referred to by the abbreviation "AGV".
  • In the present embodiment, the first sensor 101 includes a laser range finder, and the second sensor 102 includes two rotary encoders that measure the rotational speeds (rotations per unit time) of the two wheels.
  • FIG. 2 shows an example of the basic configuration of an exemplary mobile management system 100 according to the present disclosure.
  • the mobile management system 100 includes at least one AGV 10, a terminal device 20 operated by the user 1, and an operation management device 50 that performs operation management of the AGV 10.
  • The AGV 10 is an unmanned transport carriage capable of "guideless" traveling, which does not require a guide such as magnetic tape for traveling.
  • the AGV 10 can perform self-position estimation, and can transmit the result of estimation to the terminal device 20 and the operation management device 50.
  • the AGV 10 can automatically travel in the moving space S in accordance with a command from the operation management device 50.
  • the operation management device 50 is a computer system that tracks the position of each AGV 10 and manages traveling of each AGV 10.
  • the operation management device 50 may be a desktop PC, a laptop PC, and / or a server computer.
  • the operation management apparatus 50 communicates with each AGV 10 via the plurality of access points 2. For example, the operation management device 50 transmits, to each AGV 10, data of coordinates of a position to which each AGV 10 should go next.
  • Each AGV 10 periodically transmits data indicating its position and attitude to the operation management device 50, for example, every 100 milliseconds.
  • the operation management device 50 transmits data of coordinates of a position to be further advanced.
  • the AGV 10 can also travel in the moving space S in accordance with the operation of the user 1 input to the terminal device 20.
  • An example of the terminal device 20 is a tablet computer.
  • travel of the AGV 10 using the terminal device 20 is performed at the time of map creation, and travel of the AGV 10 using the operation management device 50 is performed after the map creation.
  • FIG. 3 shows an example of a moving space S in which three AGVs 10a, 10b and 10c exist. All AGVs are assumed to travel in the depth direction in the figure. The AGVs 10a and 10b are carrying the load placed on the top plate. The AGV 10 c runs following the front AGV 10 b.
  • In FIG. 3, reference marks 10a, 10b, and 10c are attached to distinguish the individual AGVs; hereinafter, they are simply referred to as the AGV 10 when no distinction is necessary.
  • the AGV 10 can also transfer a load using a tow truck connected to itself, in addition to the method of transferring the load placed on the top plate.
  • FIG. 4A shows the AGV 10 and the tow truck 5 before being connected. Each leg of the tow truck 5 is provided with a caster. The AGV 10 is mechanically connected to the tow truck 5.
  • FIG. 4B shows the connected AGV 10 and tow truck 5. When the AGV 10 travels, the tow truck 5 is pulled by the AGV 10. By pulling the tow truck 5, the AGV 10 can transport the load placed on the tow truck 5.
  • The method of connecting the AGV 10 and the tow truck 5 is arbitrary; one example is described here.
  • a plate 6 is fixed to the top plate of the AGV 10.
  • the tow truck 5 is provided with a guide 7 having a slit.
  • the AGV 10 approaches the tow truck 5 and inserts the plate 6 into the slit of the guide 7.
  • The AGV 10 passes an electromagnetic-lock pin (not shown) through the plate 6 and the guide 7 and engages the electromagnetic lock.
  • Thereby, the AGV 10 and the tow truck 5 are physically connected.
  • Each AGV 10 and the terminal device 20 can be connected, for example, on a one-to-one basis to perform communication conforming to the Bluetooth (registered trademark) standard.
  • Each AGV 10 and the terminal device 20 can also perform communication conforming to Wi-Fi (registered trademark) using one or more access points 2.
  • the plurality of access points 2 are connected to one another via, for example, a switching hub 3. Two access points 2a and 2b are shown in FIG.
  • the AGV 10 is wirelessly connected to the access point 2a.
  • the terminal device 20 is wirelessly connected to the access point 2b.
  • the data transmitted by the AGV 10 is received by the access point 2 a, transferred to the access point 2 b via the switching hub 3, and transmitted from the access point 2 b to the terminal device 20.
  • the data transmitted by the terminal device 20 is received by the access point 2 b, transferred to the access point 2 a via the switching hub 3, and transmitted from the access point 2 a to the AGV 10. Thereby, bi-directional communication between the AGV 10 and the terminal device 20 is realized.
  • the plurality of access points 2 are also connected to the operation management device 50 via the switching hub 3. Thereby, bidirectional communication is realized also between the operation management device 50 and each of the AGVs 10.
  • the AGV 10 transitions to the data acquisition mode by the operation of the user.
  • the AGV 10 starts acquiring sensor data using a laser range finder.
  • the laser range finder periodically scans the surrounding space S by emitting a laser beam of, for example, infrared or visible light to the surroundings.
  • the laser beam is reflected by, for example, a surface such as a wall, a structure such as a pillar, or an object placed on the floor.
  • the laser range finder receives the reflected light of the laser beam, calculates the distance to each reflection point, and outputs measurement data indicating the position of each reflection point.
  • the direction of arrival of reflected light and the distance are reflected in the position of each reflection point.
  • Data of measurement results may be referred to as "measurement data" or "sensor data”.
  • the positioning device stores sensor data in a storage device.
  • the sensor data accumulated in the storage device is transmitted to the external device.
  • the external device is, for example, a computer that has a signal processor and has a mapping program installed.
  • the signal processor of the external device superimposes sensor data obtained for each scan.
  • A map of the moving space S can be created by the signal processor repeatedly performing this superimposing process.
  • the external device transmits the created map data to the AGV 10.
  • the AGV 10 stores the created map data in an internal storage device.
  • the external device may be the operation management device 50 or another device.
  • the AGV 10 may create the map instead of the external device.
  • the processing performed by the signal processing processor of the external device described above may be performed by a circuit such as a microcontroller unit (microcomputer) of the AGV 10.
  • a microcontroller unit microcomputer
  • The data volume of sensor data is generally large. When the AGV 10 itself creates the map, the sensor data does not need to be transmitted to an external device, so occupation of the communication line can be avoided.
  • Movement within the moving space S for acquiring sensor data can be realized by the AGV 10 traveling in accordance with the user's operation.
  • the AGV 10 wirelessly receives a traveling instruction instructing movement in each of the front, rear, left, and right directions from the user via the terminal device 20.
  • the AGV 10 travels back and forth and left and right in the moving space S in accordance with a travel command to create a map.
  • Alternatively, the map may be created by having the AGV travel forward, backward, left, and right in the moving space S in accordance with control signals from an operating device.
  • The sensor data may also be acquired by a person pushing a measurement cart on which the laser range finder is mounted.
  • Although a plurality of AGVs 10 are shown in FIGS. 2 and 3, there may be only one AGV. When there are a plurality of AGVs 10, the user 1 can use the terminal device 20 to select one AGV 10 from the plurality of registered AGVs and create a map of the moving space S.
  • each AGV 10 can automatically travel while estimating its own position using the map.
  • The process of estimating the self-position will be described later.
  • FIG. 5 is an external view of an exemplary AGV 10 according to the present embodiment.
  • the AGV 10 has two drive wheels 11a and 11b, four casters 11c, 11d, 11e and 11f, a frame 12, a transport table 13, a travel control device 14, and a laser range finder 15.
  • the two drive wheels 11a and 11b are provided on the right and left sides of the AGV 10, respectively.
  • the four casters 11 c, 11 d, 11 e and 11 f are disposed at the four corners of the AGV 10.
  • The AGV 10 also has a plurality of motors connected to the two drive wheels 11a and 11b, which are not shown in FIG. 5. FIG. 5 shows one drive wheel 11a and two casters 11c and 11e located on the right side of the AGV 10 and a caster 11f located on the left rear; the left drive wheel 11b and the left front caster 11d are not shown because they are hidden by the frame 12.
  • the four casters 11c, 11d, 11e and 11f can pivot freely.
  • the drive wheel 11a and the drive wheel 11b are also referred to as a wheel 11a and a wheel 11b, respectively.
  • the travel control device 14 is a device that controls the operation of the AGV 10, and mainly includes an integrated circuit including a microcomputer (described later), an electronic component, and a substrate on which the components are mounted.
  • the traveling control device 14 performs transmission and reception of data with the terminal device 20 and the pre-processing calculation described above.
  • the laser range finder 15 is an optical device that measures, for example, the distance to a reflection point by emitting an infrared laser beam 15a and detecting the reflected light of the laser beam 15a.
  • The laser range finder 15 of the AGV 10 emits, for example, a pulsed laser beam 15a while changing its direction in 0.25-degree steps over a range of 135 degrees to each side of the front of the AGV 10 (270 degrees in total), and detects the reflected light of each laser beam 15a. This makes it possible to obtain distance data to the reflection point for the direction determined by each angle, at 0.25-degree intervals, for a total of 1081 directions (270 / 0.25 = 1080 intervals, hence 1081 measurement directions).
  • the scan of the surrounding space performed by the laser range finder 15 is substantially parallel to the floor surface and planar (two-dimensional). However, scanning in the height direction may be performed.
  • the AGV 10 can create a map of the space S based on the position and attitude of the AGV 10 and the scan result of the laser range finder 15.
  • the map may reflect the surrounding walls of the AGV, structures such as columns, and the placement of objects placed on the floor. Map data is stored in a storage device provided in the AGV 10.
  • the position and posture of a mobile are called a pose.
  • The position and orientation of the moving body in a two-dimensional plane are represented by position coordinates (x, y) in an XY orthogonal coordinate system and an angle θ with respect to the X axis.
  • The position and posture of the AGV 10, that is, the pose (x, y, θ), may hereinafter be referred to simply as the "position".
  • the position of the reflection point viewed from the emission position of the laser beam 15a can be expressed using polar coordinates determined by the angle and the distance.
  • the laser range finder 15 outputs sensor data represented by polar coordinates.
  • the laser range finder 15 may convert the position expressed in polar coordinates into orthogonal coordinates and output it.
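  • As a sketch of the coordinate conversion just mentioned, the function below converts one scan from polar form (angle, distance) into Cartesian points in the sensor frame, using the 270-degree / 0.25-degree geometry from the example above; the function and parameter names are assumptions.

```python
import math

# Convert one LRF scan, given as range readings ordered by angle, into Cartesian points.
# 270 / 0.25 = 1080 intervals, hence 1081 readings per scan in the example above.

def scan_to_points(distances_m, fov_deg=270.0, step_deg=0.25):
    """distances_m: list of 1081 range readings, ordered from -135 deg to +135 deg."""
    points = []
    start = -fov_deg / 2.0
    for i, r in enumerate(distances_m):
        angle = math.radians(start + i * step_deg)
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

dummy_scan = [2.0] * 1081               # a flat 2 m reading in every direction
print(len(scan_to_points(dummy_scan)))  # -> 1081
```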
  • the structure and the operating principle of the laser range finder are known, so a further detailed description will be omitted herein.
  • Examples of objects that can be detected by the laser range finder 15 are people, luggage, shelves, and walls.
  • the laser range finder 15 is an example of an external sensor for sensing surrounding space and acquiring sensor data.
  • Other conceivable external sensors include an image sensor and an ultrasonic sensor.
  • the traveling control device 14 can estimate the current position of itself by comparing the measurement result of the laser range finder 15 with the map data held by itself.
  • The map data held by the AGV 10 may be map data created by another AGV 10.
  • FIG. 6A shows a first hardware configuration example of the AGV 10.
  • FIG. 6A also shows a specific configuration of the traveling control device 14.
  • The AGV 10 includes a travel control device 14, a laser range finder 15, two motors 16a and 16b, a drive device 17, wheels 11a and 11b, and two rotary encoders 18a and 18b (hereinafter simply referred to as the "encoder 18a" and the "encoder 18b").
  • the traveling control device 14 includes a microcomputer 14a, a memory 14b, a storage device 14c, a communication circuit 14d, and a positioning device 14e.
  • the microcomputer 14a, the memory 14b, the storage device 14c, the communication circuit 14d, and the positioning device 14e are connected by a communication bus 14f and can exchange data with each other.
  • the laser range finder 15 is also connected to the communication bus 14f via a communication interface (not shown), and transmits measurement data as a measurement result to the microcomputer 14a, the positioning device 14e and / or the memory 14b.
  • the microcomputer 14 a is a processor or control circuit (computer) that performs calculations for controlling the entire AGV 10 including the traveling control device 14.
  • the microcomputer 14a is a semiconductor integrated circuit.
  • the microcomputer 14a transmits a PWM (Pulse Width Modulation) signal, which is a control signal, to the drive unit 17 to control the drive unit 17 to adjust the voltage applied to the motor. This causes each of the motors 16a and 16b to rotate at a desired rotational speed.
  • Control circuits (for example, microcomputers) for controlling the drive of the left and right motors 16a and 16b may be provided independently of the microcomputer 14a. For example, the motor drive device 17 may be provided with two microcomputers that control the drive of the motors 16a and 16b, respectively. Those two microcomputers may each perform coordinate calculation using the encoder information output from the encoders 18a and 18b to estimate the movement distance of the AGV 10 from a given initial position. The two microcomputers may also control the motor drive circuits 17a and 17b using the encoder information.
  • the memory 14 b is a volatile storage device that stores a computer program executed by the microcomputer 14 a.
  • the memory 14b can also be used as a work memory when the microcomputer 14a and the positioning device 14e perform calculations.
  • the storage device 14 c is a non-volatile semiconductor memory device.
  • the storage device 14 c may be a magnetic recording medium represented by a hard disk, or an optical recording medium represented by an optical disk.
  • the storage device 14 c may include a head device for writing and / or reading data on any recording medium and a control device of the head device.
  • the storage device 14c stores map data M of the space S in which the vehicle travels and data (traveling route data) R of one or more traveling routes.
  • the map data M is created by the AGV 10 operating in the mapping mode and stored in the storage device 14c.
  • the travel route data R is transmitted from the outside after the map data M is created.
  • the map data M and the traveling route data R are stored in the same storage device 14c, but may be stored in different storage devices.
  • the AGV 10 receives traveling route data R indicating a traveling route from the tablet computer.
  • the travel route data R at this time includes marker data indicating the positions of a plurality of markers. “Marker” indicates the passing position (passing point) of the traveling AGV 10.
  • the travel route data R includes at least position information of a start marker indicating a travel start position and an end marker indicating a travel end position.
  • the travel route data R may further include positional information of markers at one or more intermediate waypoints. When the travel route includes one or more intermediate via points, a route from the start marker to the end marker via the travel via points in order is defined as the travel route.
  • The data of each marker may include, in addition to the coordinate data of the marker, data on the orientation (angle) and traveling speed of the AGV 10 until it moves to the next marker, and may further include data on the acceleration time required to reach that traveling speed and/or the deceleration time required to decelerate from that traveling speed and stop at the position of the next marker. A possible representation is sketched below.
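  • One possible, purely illustrative representation of the travel route data R and its markers is shown below; the field names are assumptions and are not defined by the disclosure.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical container for the travel route data R and its markers.

@dataclass
class Marker:
    x: float                            # coordinates of the passing point
    y: float
    angle: float                        # orientation of the AGV until the next marker (deg)
    speed: float                        # traveling speed toward the next marker (m/s)
    accel_time: Optional[float] = None  # time needed to reach the traveling speed (s)
    decel_time: Optional[float] = None  # time needed to stop at the next marker (s)

@dataclass
class TravelRoute:
    markers: List[Marker]               # start marker, intermediate waypoints, end marker

route = TravelRoute(markers=[
    Marker(0.0, 0.0, 0.0, 0.8),                    # start marker
    Marker(5.0, 0.0, 90.0, 0.8),                   # intermediate waypoint
    Marker(5.0, 3.0, 90.0, 0.4, decel_time=1.5),   # end marker
])
print(len(route.markers))  # -> 3
```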
  • the operation management device 50 may control the movement of the AGV 10.
  • the operation management apparatus 50 may instruct the AGV 10 to move to the next marker each time the AGV 10 reaches the marker.
  • the AGV 10 receives, from the operation management apparatus 50, coordinate data of a target position to be headed to next, or data of a distance to the target position and data of an angle to be traveled as travel route data R indicating a travel route.
  • the AGV 10 can travel along the stored travel path while estimating its own position using the created map and the sensor data output from the laser range finder 15 acquired during travel.
  • the communication circuit 14d is, for example, a wireless communication circuit that performs wireless communication compliant with the Bluetooth (registered trademark) and / or the Wi-Fi (registered trademark) standard. Both standards include wireless communication standards using frequencies in the 2.4 GHz band. For example, in the mode in which the AGV 10 is run to create a map, the communication circuit 14d performs wireless communication conforming to the Bluetooth (registered trademark) standard, and communicates with the terminal device 20 on a one-to-one basis.
  • the positioning device 14 e performs map creation processing and estimation processing of the self position when traveling.
  • the positioning device 14e creates a map of the moving space S based on the position and attitude of the AGV 10 and the scanning result of the laser range finder.
  • the positioning device 14e receives sensor data from the laser range finder 15, and reads out the map data M stored in the storage device 14c.
  • The positioning device 14e identifies the self-location (x, y, θ) on the map data M by matching the local map data (sensor data) created from the scan results of the laser range finder 15 against the map data M covering a wider area.
  • the positioning device 14 e generates “reliability” data indicating the degree to which the local map data matches the map data M.
  • The data of the self-position (x, y, θ) and the reliability can be transmitted from the AGV 10 to the terminal device 20 or the operation management device 50.
  • The terminal device 20 or the operation management device 50 can receive the data of the self-position (x, y, θ) and the reliability and display them on a built-in or connected display device.
  • Although the microcomputer 14a and the positioning device 14e are separate components in this embodiment, this is only an example. They may be implemented as a single chip circuit or semiconductor integrated circuit capable of independently performing the operations of both the microcomputer 14a and the positioning device 14e.
  • FIG. 6A shows a chip circuit 14g including the microcomputer 14a and the positioning device 14e.
  • Hereinafter, the case where the microcomputer 14a and the positioning device 14e are provided separately and independently will be described.
  • Two motors 16a and 16b are attached to two wheels 11a and 11b, respectively, to rotate each wheel. That is, the two wheels 11a and 11b are respectively drive wheels.
  • the motor 16a and the motor 16b are described as being motors for driving the right and left wheels of the AGV 10, respectively.
  • the AGV 10 further includes an encoder unit 18 that measures the rotational position or rotational speed of the wheels 11a and 11b.
  • the encoder unit 18 includes a first rotary encoder 18a and a second rotary encoder 18b.
  • the first rotary encoder 18a measures the rotation at any position of the power transmission mechanism from the motor 16a to the wheel 11a.
  • the second rotary encoder 18 b measures the rotation at any position of the power transmission mechanism from the motor 16 b to the wheel 11 b.
  • the encoder unit 18 transmits the signals acquired by the rotary encoders 18a and 18b to the microcomputer 14a.
  • the microcomputer 14a can control movement of the AGV 10 using not only the signal received from the positioning device 14e but also the signal received from the encoder unit 18.
  • the drive device 17 has motor drive circuits 17a and 17b for adjusting the voltage applied to each of the two motors 16a and 16b.
  • Each of motor drive circuits 17a and 17b includes a so-called inverter circuit.
  • the motor drive circuits 17a and 17b turn on or off the current flowing to each motor by the PWM signal transmitted from the microcomputer 14a or the microcomputer in the motor drive circuit 17a, thereby adjusting the voltage applied to the motor.
  • FIG. 6B shows a second hardware configuration example of the AGV 10.
  • The second hardware configuration example differs from the first hardware configuration example (FIG. 6A) in that it has a laser positioning system 14h and in that the microcomputer 14a is connected to each component on a one-to-one basis.
  • the laser positioning system 14 h includes a positioning device 14 e and a laser range finder 15.
  • the positioning device 14e and the laser range finder 15 are connected by, for example, an Ethernet (registered trademark) cable.
  • the operations of the positioning device 14e and the laser range finder 15 are as described above.
  • The laser positioning system 14h outputs information indicating the pose (x, y, θ) of the AGV 10 to the microcomputer 14a.
  • the microcomputer 14a has various general purpose I / O interfaces or general purpose input / output ports (not shown).
  • the microcomputer 14a is directly connected to other components in the travel control device 14, such as the communication circuit 14d and the laser positioning system 14h, via the general-purpose input / output port.
  • the AGV 10 in the embodiment of the present disclosure may be equipped with a safety sensor such as an obstacle detection sensor and a bumper switch which are not shown.
  • the AGV 10 may include an inertial measurement device such as a gyro sensor.
  • FIGS. 7A to 7F schematically show the AGV 10 moving while acquiring sensor data.
  • the user 1 may move the AGV 10 manually while operating the terminal device 20.
  • the unit provided with the travel control device 14 shown in FIGS. 6A and 6B, or the AGV 10 itself may be mounted on a carriage, and sensor data may be acquired by the user 1 manually pushing or holding the carriage.
  • FIG. 7A shows an AGV 10 that scans the surrounding space using a laser range finder 15. A laser beam is emitted for each predetermined step angle and scanning is performed.
  • the illustrated scan range is an example schematically shown, and is different from the total scan range of 270 degrees described above.
  • The position of the reflection point of the laser beam is schematically shown using a plurality of black points 4, each represented by the symbol "●".
  • the scanning of the laser beam is performed at short intervals while the position and attitude of the laser range finder 15 change. Therefore, the number of actual reflection points is much larger than the number of reflection points 4 shown.
  • the positioning device 14e stores, for example, in the memory 14b, the position of the black point 4 obtained as the vehicle travels.
  • the map data is gradually completed as the AGV 10 continues to scan while traveling.
  • In FIGS. 7B to 7E, only the scan range is shown for simplicity.
  • the scan range is an example, and is different from the above-described example of 270 degrees in total.
  • the map may be created using the microcomputer 14a in the AGV 10 or an external computer based on the sensor data after acquiring the sensor data of the amount necessary for creating the map. Alternatively, a map may be created in real time based on sensor data acquired by the moving AGV 10.
  • FIG. 7F schematically shows a part of the completed map 40.
  • free space is partitioned by a point cloud (Point Cloud) corresponding to a collection of reflection points of the laser beam.
  • Another example of the map is an occupied grid map that distinguishes space occupied by an object from free space in grid units.
  • the positioning device 14e stores map data (map data M) in the memory 14b or the storage device 14c.
  • The number and density of black points in the map data M are not limited to those illustrated.
  • the map data thus obtained may be shared by multiple AGVs 10.
  • a typical example of an algorithm in which the AGV 10 estimates its own position based on map data is ICP (Iterative Closest Point) matching.
  • By matching the local map data (sensor data) created from the scan results of the laser range finder 15 against the map data M covering a wider area, the self-location (x, y, θ) on the map data M can be estimated, as sketched below.
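  • ICP has many variants; the following is a compact, simplified 2D sketch (brute-force nearest neighbours and an SVD-based rigid alignment) intended only to illustrate the matching idea, not the actual implementation used by the positioning device 14e. The mean residual it returns could play the role of a match-quality score.

```python
import numpy as np

# Simplified 2D ICP sketch: align a local scan (Nx2 array) to map points (Mx2 array).

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (both Nx2)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def icp(scan, map_pts, iterations=20):
    """Return (R, t, mean_residual) aligning the scan to the map points."""
    src = scan.copy()
    R_total, t_total = np.eye(2), np.zeros(2)
    for _ in range(iterations):
        # Nearest map point for every scan point (brute force).
        d2 = ((src[:, None, :] - map_pts[None, :, :]) ** 2).sum(axis=2)
        nearest = map_pts[d2.argmin(axis=1)]
        R, t = best_rigid_transform(src, nearest)
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    d2 = ((src[:, None, :] - map_pts[None, :, :]) ** 2).sum(axis=2)
    residual = np.sqrt(d2.min(axis=1)).mean()   # smaller residual = better match
    return R_total, t_total, residual
```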
  • FIG. 8 shows a hardware configuration example of the operation management device 50.
  • the operation management apparatus 50 includes a CPU 51, a memory 52, a position database (position DB) 53, a communication circuit 54, a map database (map DB) 55, and an image processing circuit 56.
  • the CPU 51, the memory 52, the position DB 53, the communication circuit 54, the map DB 55, and the image processing circuit 56 are connected by a communication bus 57 and can exchange data with each other.
  • the CPU 51 is a signal processing circuit (computer) that controls the operation of the operation management device 50.
  • the CPU 51 is a semiconductor integrated circuit.
  • the memory 52 is a volatile storage device that stores a computer program that the CPU 51 executes.
  • the memory 52 can also be used as a work memory when the CPU 51 performs an operation.
  • the position DB 53 stores position data indicating each position that can be a destination of each AGV 10.
  • the position data may be represented, for example, by coordinates virtually set in the factory by the administrator. Location data is determined by the administrator.
  • the communication circuit 54 performs wired communication conforming to, for example, the Ethernet (registered trademark) standard.
  • the communication circuit 54 is connected to the access point 2 (FIG. 1) by wire, and can communicate with the AGV 10 via the access point 2.
  • the communication circuit 54 receives data to be transmitted to the AGV 10 from the CPU 51 via the bus 57.
  • the communication circuit 54 also transmits data (notification) received from the AGV 10 to the CPU 51 and / or the memory 52 via the bus 57.
  • the map DB 55 stores data of an internal map of a factory or the like on which the AGV 10 travels.
  • the map may be the same as or different from the map 40 (FIG. 7F).
  • the data format is not limited as long as the map has a one-to-one correspondence with the position of each AGV 10.
  • the map stored in the map DB 55 may be a map created by CAD.
  • the position DB 53 and the map DB 55 may be constructed on a non-volatile semiconductor memory, or may be constructed on a magnetic recording medium represented by a hard disk or an optical recording medium represented by an optical disc.
  • the image processing circuit 56 is a circuit that generates data of an image displayed on the monitor 58.
  • the image processing circuit 56 operates only when the administrator operates the operation management device 50. In the present embodiment, particularly the detailed description is omitted.
  • The monitor 58 may be integrated with the operation management device 50. Further, the CPU 51 may perform the processing of the image processing circuit 56.
  • FIG. 9 is a view schematically showing an example of the movement route of the AGV 10 determined by the operation management device 50.
  • Here, the case where the AGV 10 moves from a position M1 through several intermediate positions to a final destination position Mn+1 (n: a positive integer of 1 or more) will be described.
  • In the position DB 53, coordinate data indicating positions such as the position M2 to be passed after the position M1 and the position M3 to be passed after the position M2 are recorded.
  • The CPU 51 of the operation management device 50 reads the coordinate data of the position M2 with reference to the position DB 53 and generates a travel command directed toward the position M2.
  • the communication circuit 54 transmits a traveling command to the AGV 10 via the access point 2.
  • the CPU 51 periodically receives data indicating the current position and attitude from the AGV 10 via the access point 2.
  • the operation management device 50 can track the position of each AGV 10.
  • When the CPU 51 determines that the current position of the AGV 10 matches the position M2, it reads the coordinate data of the position M3, generates a travel command directed toward the position M3, and transmits it to the AGV 10. That is, when it determines that the AGV 10 has reached a certain position, the operation management device 50 transmits a travel command directed toward the next passing position.
  • By repeating this, the AGV 10 can reach the final target position Mn+1. A sketch of this marker-to-marker dispatch loop follows.
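  • The marker-to-marker dispatch described above might look like the following sketch; send_travel_command(), report_position(), and the arrival tolerance are hypothetical helpers, not an API of the operation management device 50.

```python
import math
import time

ARRIVAL_TOLERANCE = 0.10  # metres (example value only)

def reached(current_xy, target_xy, tol=ARRIVAL_TOLERANCE):
    return math.dist(current_xy, target_xy) <= tol

def manage(agv, marker_positions):
    """Send the AGV toward each marker position in turn, advancing on arrival.

    `agv` is assumed to expose send_travel_command(xy) and report_position();
    both are hypothetical methods used only for this sketch.
    """
    for target in marker_positions:              # positions M1, M2, ..., Mn+1
        agv.send_travel_command(target)          # travel command toward the next marker
        while not reached(agv.report_position(), target):
            time.sleep(0.1)                      # poses arrive e.g. every 100 milliseconds
```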
  • the passing position and the target position of the AGV 10 described above may be referred to as a “marker”.
  • FIG. 10 is a block diagram showing a configuration example of the AGV 10.
  • the configuration of FIG. 10 is the same as the configuration of FIG. 6B except that the second positioning device 19 and the display 30 are provided.
  • the second positioning device 19 is connected between the encoder unit 18 and the microcomputer 14a.
  • the display 30 is connected to the microcomputer 14a.
  • the positioning device 14 e will be referred to as “first positioning device 14 e” in order to distinguish it from the second positioning device 19.
  • the laser range finder 15 and the encoder unit 18 have functions as the first sensor 101 and the second sensor 102 in FIG. 1, respectively.
  • the microcomputer 14a corresponds to the arithmetic circuit 105 in FIG.
  • the second positioning device 19 includes, for example, a processing circuit such as a processor and a memory.
  • The second positioning device 19 acquires the data output from the rotary encoders 18a and 18b, generates data (x, y, θ) indicating the position and attitude of the AGV 10, and outputs the data to the microcomputer 14a.
  • the functions of the second positioning device 19 may be integrated into the microcomputer 14a. In that case, the same configuration as the configuration shown in FIG. 6A or 6B is used.
  • the control circuit in the drive device 17 may have the function of the second positioning device 19.
  • FIG. 11 is a view schematically showing the flow of signals between components in the present embodiment.
  • the first positioning device 14 e performs a first estimation operation using data (first sensor data) output from the LRF 15 to estimate the position and orientation of the AGV 10.
  • The first estimation operation in the present embodiment is a process of collating the first sensor data with the map data to generate data indicating the coordinates (x, y), the angle θ, and the reliability (unit: %).
  • The first positioning device 14e sends the data indicating the coordinates (x, y), the angle θ, and the reliability to the microcomputer (arithmetic circuit) 14a.
  • the second positioning device 19 performs a second estimation operation using data (second sensor data) output from the two encoders 18 a and 18 b to estimate the position and orientation of the AGV 10.
  • the second sensor data includes information on the rotational state or rotational speed of the motor or the wheel. The travel distance of the wheel per unit time can be estimated from the rotational speed and the diameter of the wheel.
  • The second estimation operation includes a process of accumulating the changes in the coordinates and the angle, calculated based on the outputs of the two encoders 18a and 18b, onto the initial values of the coordinates and the angle of the AGV 10, as sketched below. The initial values of the coordinates and the angle may be periodically updated with, for example, the values of the coordinates and the angle calculated by the first positioning device 14e.
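  • As an illustration only (not part of the original disclosure), the following Python sketch shows a minimal dead-reckoning calculation of this kind. The wheel diameter, track width, encoder resolution, and function names are assumed values chosen for the example; the travel distance of one wheel is computed as d = π × D × (encoder ticks / ticks per revolution).

      import math

      # Assumed parameters for illustration (not values from the disclosure)
      WHEEL_DIAMETER_M = 0.15   # wheel diameter D
      TRACK_WIDTH_M = 0.40      # distance between the two drive wheels
      TICKS_PER_REV = 2048      # rotary encoder resolution

      def ticks_to_distance(delta_ticks):
          """Travel distance of one wheel: d = pi * D * (delta_ticks / TICKS_PER_REV)."""
          return math.pi * WHEEL_DIAMETER_M * delta_ticks / TICKS_PER_REV

      def integrate_pose(x, y, theta, delta_ticks_left, delta_ticks_right):
          """Accumulate encoder increments onto the pose (x, y, theta) of the AGV."""
          d_left = ticks_to_distance(delta_ticks_left)
          d_right = ticks_to_distance(delta_ticks_right)
          d_center = (d_left + d_right) / 2.0           # forward travel of the body center
          d_theta = (d_right - d_left) / TRACK_WIDTH_M  # change of the heading angle
          x += d_center * math.cos(theta + d_theta / 2.0)
          y += d_center * math.sin(theta + d_theta / 2.0)
          theta = (theta + d_theta + math.pi) % (2.0 * math.pi) - math.pi
          return x, y, theta

      # Example: start from an initial pose supplied by the first positioning device
      pose = (0.0, 0.0, 0.0)
      pose = integrate_pose(*pose, delta_ticks_left=120, delta_ticks_right=130)
      print(pose)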
  • the second positioning device 19 sends data indicating the coordinates (x, y) and the angle ⁇ to the microcomputer 14 a.
  • In the following description, the coordinates and angle estimated by the first positioning device 14e may be collectively referred to as "LRF coordinates", and the coordinates and angle estimated by the second positioning device 19 may be collectively referred to as "encoder coordinates".
  • Depending on whether reliability data indicating the likelihood of the estimation result by the first positioning device 14e matches a predetermined condition, the microcomputer 14a selects either the estimation result by the first positioning device 14e or the estimation result by the second positioning device 19 as the coordinates and angle of the AGV 10.
  • the microcomputer 14a notifies the drive unit 17 of the selected coordinates and angle.
  • The drive device 17 determines command values of the rotational speeds of the motors 16a and 16b from the difference between the current coordinates and angle and the coordinates and angle of the destination, and controls the motors 16a and 16b based on the determined command values.
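  • The disclosure does not specify how the drive device 17 derives these command values, so the following Python sketch is only one plausible illustration: a simple proportional go-to-goal rule that converts the pose difference into left and right wheel speed commands. The gains, the maximum speed, and the track width are assumptions.

      import math

      TRACK_WIDTH_M = 0.40  # assumed distance between the drive wheels

      def wheel_speed_commands(x, y, theta, goal_x, goal_y,
                               k_lin=0.8, k_ang=1.5, v_max=0.8):
          """Return (v_left, v_right) in m/s from the pose difference to the destination."""
          dx, dy = goal_x - x, goal_y - y
          distance = math.hypot(dx, dy)
          heading_error = math.atan2(dy, dx) - theta
          heading_error = (heading_error + math.pi) % (2.0 * math.pi) - math.pi
          v = min(k_lin * distance, v_max)   # forward speed command
          w = k_ang * heading_error          # turning speed command
          v_left = v - w * TRACK_WIDTH_M / 2.0
          v_right = v + w * TRACK_WIDTH_M / 2.0
          return v_left, v_right

      # Dividing these wheel speeds by the wheel radius would give motor speed commands.
      print(wheel_speed_commands(0.0, 0.0, 0.0, goal_x=2.0, goal_y=1.0))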
  • The "reliability data" in the present embodiment includes the reliability data (first reliability data) output from the first positioning device 14e, and data (second reliability data) indicating the difference between the coordinates and angle estimated by the first positioning device 14e and the coordinates and angle estimated by the second positioning device 19.
  • the microcomputer 14a basically controls the drive unit 17 to travel using LRF coordinates that are considered to be relatively reliable. At this time, encoder coordinates held by the second positioning device 19 are periodically overwritten with LRF coordinates. Thus, the coordinates of both are synchronized periodically.
  • When the reliability of the LRF coordinates decreases, the microcomputer 14a stops this synchronization of the coordinates and continues the traveling of the AGV 10 using the encoder coordinates. In this case, the microcomputer 14a issues a command to cause the first positioning device 14e to execute initial position identification, and attempts to restore the reliability. In other words, when the microcomputer 14a selects the estimation result by the second positioning device 19, the first positioning device 14e performs initial position identification (the first estimation operation) using the first sensor data and the estimation result by the second positioning device 19.
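  • A minimal sketch of this periodic synchronization, and of the behavior after switching to the encoder coordinates, is shown below for illustration only. The 100 ms period follows the example given later in the text, while the device interfaces (estimate, overwrite_pose, request_initial_position_identification) are hypothetical names, not APIs defined in the disclosure.

      SYNC_PERIOD_S = 0.1  # assumed update period (e.g. 100 ms)

      def control_step(mode, first_dev, second_dev):
          """One control step: keep the encoder coordinates synchronized while the LRF
          coordinates are trusted; otherwise travel on the encoder coordinates and keep
          asking the first positioning device to re-identify the position."""
          lrf_pose, reliability = first_dev.estimate()      # hypothetical interface
          enc_pose = second_dev.estimate()                  # hypothetical interface
          if mode == "LRF":
              second_dev.overwrite_pose(lrf_pose)           # periodic synchronization
              return lrf_pose
          first_dev.request_initial_position_identification(hint=enc_pose)
          return enc_pose

      # while True:
      #     pose = control_step(mode, first_positioning_device, second_positioning_device)
      #     ...wait for SYNC_PERIOD_S before the next step...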
  • the “initial position identification” refers to a process of searching where on the map the AGV 10 is located.
  • In the initial position identification, matching between the map data and the data from the LRF 15 is performed over the entire area of the map or over a partial area (for example, an area of about 1 m × 1 m to 50 m × 50 m).
  • initial position identification is performed after the AGV 10 is powered on or after the map is switched.
  • Once the position has been identified, subsequent position identification is performed by searching a narrower range around that position (for example, within a range of several tens of centimeters from the position).
  • This position identification may be performed, for example, every fixed time (for example, 100 milliseconds) while the AGV 10 is moving.
  • the position identification has a narrower search range and shorter execution time than the initial position identification.
  • both “initial position identification” and “position identification” correspond to the above-mentioned “first estimation operation”.
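  • The disclosure does not describe the matching algorithm itself, so the following Python sketch merely illustrates the difference between the two operations as a brute-force grid search: a wide, coarse search for initial position identification and a narrow, fine search for position identification. The grid-counting score, the cell size, and all numeric ranges are assumptions made for the example.

      import math
      from itertools import product

      def score(pose, scan_points, occupied_cells, cell=0.05):
          """Count scan points that, transformed by the candidate pose, land on occupied map cells."""
          x, y, theta = pose
          hits = 0
          for px, py in scan_points:
              gx = x + px * math.cos(theta) - py * math.sin(theta)
              gy = y + px * math.sin(theta) + py * math.cos(theta)
              if (round(gx / cell), round(gy / cell)) in occupied_cells:
                  hits += 1
          return hits

      def search(center, half_range, step, theta_half_deg, theta_step_deg,
                 scan_points, occupied_cells):
          """Brute-force search for the best pose within +/- half_range of 'center'."""
          cx, cy, ct = center
          n = int(half_range / step)
          xs = [cx + i * step for i in range(-n, n + 1)]
          ys = [cy + i * step for i in range(-n, n + 1)]
          thetas = [ct + math.radians(a)
                    for a in range(-theta_half_deg, theta_half_deg + 1, theta_step_deg)]
          return max(product(xs, ys, thetas),
                     key=lambda p: score(p, scan_points, occupied_cells))

      # Initial position identification: wide, coarse search (e.g. tens of metres)
      # pose = search((0, 0, 0), 25.0, 0.5, 180, 10, scan_points=scan, occupied_cells=grid)
      # Position identification: narrow, fine search around the last pose (tens of centimetres)
      # pose = search(pose, 0.3, 0.05, 10, 2, scan_points=scan, occupied_cells=grid)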
  • the microcomputer 14a switches between a mode of traveling using LRF coordinates and a mode of traveling using encoder coordinates based on the reliability data.
  • the microcomputer 14a switches these two modes, for example, in accordance with the conditions shown in Table 1 below.
  • The mode using LRF coordinates is switched to the mode using encoder coordinates not only when the reliability output from the first positioning device 14e is lowered, but also when the difference in the X axis component or the Y axis component between the LRF coordinates and the encoder coordinates exceeds an allowable value. The reason for imposing these two conditions is that, even when the first positioning device 14e outputs a high reliability, a position significantly different from the actual position may be estimated as the current position. In the example of Table 1, in order to stabilize the operation, the return threshold of the reliability is set higher than the switching threshold.
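  • Because Table 1 itself is not reproduced in this text, the following Python sketch only illustrates the general shape of such a rule: a switching condition (A) with a reliability threshold and a coordinate-difference check, and a return condition (B) with a higher return threshold. All threshold values are placeholders, not values taken from Table 1.

      # Placeholder thresholds for illustration only (Table 1 itself is not reproduced here)
      SWITCH_THRESHOLD = 0.3   # switch to encoder coordinates below this reliability
      RETURN_THRESHOLD = 0.5   # return to LRF coordinates only at or above this reliability
      ALLOWED_DIFF_M = 0.3     # allowable X / Y difference between LRF and encoder coordinates

      def select_mode(current_mode, reliability, lrf_xy, enc_xy):
          """Hysteresis switching between the 'LRF' and 'ENCODER' travel modes."""
          dx = abs(lrf_xy[0] - enc_xy[0])
          dy = abs(lrf_xy[1] - enc_xy[1])
          if current_mode == "LRF":
              # Condition (A): low reliability OR a large coordinate difference
              if reliability < SWITCH_THRESHOLD or dx > ALLOWED_DIFF_M or dy > ALLOWED_DIFF_M:
                  return "ENCODER"
              return "LRF"
          # Condition (B): return only once the reliability has clearly recovered
          if reliability >= RETURN_THRESHOLD and dx <= ALLOWED_DIFF_M and dy <= ALLOWED_DIFF_M:
              return "LRF"
          return "ENCODER"

      print(select_mode("LRF", reliability=0.2, lrf_xy=(1.0, 2.0), enc_xy=(1.05, 2.02)))  # -> ENCODER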
  • FIG. 12 is a flowchart showing an example of the operation of the AGV 10.
  • the microcomputer 14a causes the first positioning device 14e to execute initial position identification (step S101).
  • the first positioning device 14e performs a search over the whole area or a part of the map (for example, a range of about 1 m ⁇ 1 m to 50 m ⁇ 50 m), and specifies the initial position of the AGV 10.
  • Next, the microcomputer 14a causes the first positioning device 14e to perform position identification in a narrower area centered on that position (for example, within a range of several tens of centimeters from the position) (step S102).
  • the microcomputer 14a determines whether or not the vehicle is traveling in the map switching area (step S103).
  • The map switching area refers to an area of the map in use that overlaps with an adjacent map.
  • FIG. 13 is a diagram for explaining the map switching area.
  • FIG. 13 shows an example in which one map data covers a 50 m ⁇ 50 m area, and four map data M1, M2, M3 and M4 cover the entire area of one floor of one factory.
  • a rectangular overlapping area of 5 m in width is provided at the boundary between two adjacent maps. This overlapping area is the map switching area.
  • the size of the map data and the width of the overlapping area are not limited to this example, and may be set arbitrarily.
  • When the microcomputer 14a determines that the AGV 10 is traveling in the map switching area, it performs processing for switching the map in use to the adjacent map (step S121).
  • the map switching process will be described later with reference to FIGS. 17 and 18.
  • the microcomputer 14a determines whether the condition (A) in Table 1 described above is satisfied (step S104).
  • When the condition (A) is not satisfied in step S104, the microcomputer 14a overwrites the encoder coordinates held by the second positioning device 19 with the LRF coordinates. Thereafter, after a predetermined time (for example, 100 milliseconds) has elapsed, the process returns to step S102 and the same operation is repeated.
  • When one of the conditions under (A) in Table 1 is satisfied in step S104, the microcomputer 14a switches from traveling using LRF coordinates to traveling using encoder coordinates (step S111). Thereafter, the microcomputer 14a performs initial position identification every predetermined time (step S112). Based on the LRF coordinates estimated by this initial position identification, the microcomputer 14a determines whether the condition (B) in Table 1 is satisfied (step S113). When the condition (B) is satisfied, it is determined that the reliability of the LRF coordinates has been restored. In this case, the microcomputer 14a returns from traveling using encoder coordinates to traveling using LRF coordinates (step S114). Thereafter, the process returns to step S102 and the same operation is repeated.
  • As described above, while the value of the first reliability data is equal to or greater than a predetermined threshold and the second reliability data is within the allowable value, the microcomputer 14a selects the estimation result by the first positioning device 14e as the position of the AGV 10.
  • When the value of the first reliability data falls below the threshold, or when the second reliability data, that is, the difference between the LRF coordinates and the encoder coordinates, exceeds the allowable value, the estimation result by the second positioning device 19 is selected as the position of the AGV 10.
  • While the estimation result by the second positioning device 19 is selected, the first positioning device 14e performs initial position identification (the first estimation operation) using the first sensor data and the estimation result (coordinates and angle) by the second positioning device 19.
  • When the value of the first reliability data then becomes equal to or greater than a predetermined return threshold and the second reliability data falls within the allowable value again, the estimation result by the first positioning device 14e is once more selected as the position of the AGV 10.
  • the mode can be switched according to the reliability of the LRF coordinates, and stable traveling can be realized.
  • The microcomputer 14a may also control the speed of the AGV 10 according to the reliability of the LRF coordinates. For example, when the microcomputer 14a selects the estimation result by the second positioning device 19, it may instruct the driving device 17 to move the AGV 10 at a slower speed than when the estimation result by the first positioning device 14e is selected. Furthermore, when the value of the first reliability data (reliability) output from the first positioning device 14e after performing the first estimation operation (initial position identification) becomes equal to or greater than a predetermined return threshold, the microcomputer 14a may instruct the driving device 17 to further reduce the speed of the AGV 10, and may cause the first positioning device 14e to perform the first estimation operation again.
  • When the value of the first reliability data (reliability) remains at or above the return threshold after the first estimation operation is performed again, the microcomputer 14a may instruct the driving device 17 to increase the speed of the AGV 10.
  • When the value of the first reliability data is not maintained at or above the return threshold, the first positioning device 14e may perform the first estimation operation yet again. The case where the value is "not maintained at or above the return threshold" includes the case where the value of the first reliability data continues to fall short of the return threshold even though the first estimation calculation (in the present embodiment, initial position identification) is performed a plurality of times.
  • FIG. 14 schematically shows an example of the operation of the AGV 10.
  • FIG. 15 is a diagram showing the time change of the speed of the AGV 10 in this example.
  • FIG. 16 is a flowchart showing an operation of traveling using encoder coordinates in this example.
  • the AGV 10 travels while performing position identification at a first speed (for example, 50 m / min) using the LRF coordinates.
  • When the reliability of the LRF coordinates decreases, the microcomputer 14a switches to traveling using the encoder coordinates.
  • the microcomputer 14a reduces the speed of the AGV 10 to a second speed (eg, 20 m / min) lower than the first speed (step S201). This is because when traveling at high speed using encoder coordinates, a collision or overrun is more likely to occur.
  • If the second speed is set too low, it may take a long time to get out of the section where the LRF coordinates are unreliable. Therefore, the second speed is set to a moderate value that is neither too low nor too high.
  • the microcomputer 14a instructs the first positioning device 14e to repeat the initial position identification until the reliability of the LRF coordinates is recovered.
  • The first positioning device 14e repeats the initial position identification until the reliability recovers to the return threshold or more (steps S202 to S204).
  • If the reliability does not recover even after the initial position identification has been repeated up to an upper limit number of repetitions (for example, 20 times), the microcomputer 14a stops the AGV 10 and transmits an error signal to the operation management device 50 or the terminal device 20 (step S205).
  • When the reliability of the LRF coordinates recovers to the return threshold or more by the initial position identification in step S202, the microcomputer 14a reduces the speed of the AGV 10 to a third speed (for example, 7.5 m/min), which is lower than the second speed (step S211). Then, the first positioning device 14e is made to execute initial position identification again (step S212). When the reliability in this initial position identification is also equal to or greater than the return threshold (Yes in step S213), the microcomputer 14a determines whether the differences in the X axis component and the Y axis component between the LRF coordinates and the encoder coordinates are less than an allowable value (for example, 30 cm) (step S221).
  • When the determination in step S213 or step S221 is No, the microcomputer 14a repeats the initial position identification (from step S211) until the determination becomes Yes. In this example, the upper limit number of repetitions is set to five.
  • When the determination does not become Yes even after the upper limit number of repetitions, the microcomputer 14a accelerates the AGV 10 to a fourth speed (for example, 20 m/min) (step S215). In this example, the fourth speed is the same as the second speed, but it may be different. After step S215, the process returns to step S201 again. By accelerating to the fourth speed, there is a high possibility that the AGV can exit the low-reliability section early.
  • Instead of proceeding to step S215, the process may transition to step S205 to stop the AGV 10 and transmit an error signal to the operation management device 50 or the terminal device 20.
  • When the determination in step S221 is Yes, the microcomputer 14a causes the first positioning device 14e to execute position identification processing (step S230).
  • The first positioning device 14e performs matching with the LRF data in a relatively narrow area around the position determined by the initial position identification (for example, within a range of several tens of centimeters from that position), and determines the coordinates and angle of the AGV 10.
  • Then, the microcomputer 14a overwrites the encoder coordinates with the LRF coordinates (step S231) and switches to traveling using LRF coordinates (step S232). The speed is then increased to the first speed (for example, 50 m/min) (step S233). After that, the AGV returns to the normal operation.
  • In this example, after the reliability recovers, the speed is first reduced to the third speed (7.5 m/min) rather than immediately returned to a higher speed. This is because the search range of the position identification performed after the initial position identification is narrow. It may take about several seconds from when the microcomputer 14a instructs the first positioning device 14e to perform initial position identification until the subsequent position identification is performed, and at 20 m/min (about 33 cm per second) the AGV 10 may move beyond a search range of several tens of centimeters within those few seconds, so that position identification could fail. Therefore, in the present embodiment, the speed is reduced to the third speed of 7.5 m/min (about 12.5 cm per second).
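  • For illustration only, the following Python sketch condenses the recovery flow of FIG. 16 (steps S201 to S233) into a single function. The speed values and repetition limits follow the examples given in the text, while the interfaces of the positioning devices and the drive device (initial_position_identification, position_identification, estimate, overwrite_pose, set_speed, stop) are hypothetical names, not APIs defined in the disclosure.

      # Speeds in m/min, following the examples in the text
      FIRST_SPEED, SECOND_SPEED, THIRD_SPEED, FOURTH_SPEED = 50.0, 20.0, 7.5, 20.0
      RETURN_THRESHOLD = 0.4   # placeholder return threshold
      ALLOWED_DIFF_M = 0.3     # allowable X / Y difference (30 cm)
      MAX_INITIAL_ID = 20      # repetition limit for steps S202-S204
      MAX_VERIFY = 5           # repetition limit for steps S211-S221

      def recover_lrf_travel(first_dev, second_dev, drive):
          """Sketch of the recovery flow after switching to travel on encoder coordinates."""
          drive.set_speed(SECOND_SPEED)                                  # S201
          for _ in range(MAX_INITIAL_ID):                                # S202-S204
              reliability, lrf_pose = first_dev.initial_position_identification()
              if reliability >= RETURN_THRESHOLD:
                  break
          else:
              drive.stop()                                               # S205: stop and report an error
              return "error"
          drive.set_speed(THIRD_SPEED)                                   # S211
          for _ in range(MAX_VERIFY):                                    # S212, S213, S221
              reliability, lrf_pose = first_dev.initial_position_identification()
              enc_pose = second_dev.estimate()
              dx = abs(lrf_pose[0] - enc_pose[0])
              dy = abs(lrf_pose[1] - enc_pose[1])
              if reliability >= RETURN_THRESHOLD and dx < ALLOWED_DIFF_M and dy < ALLOWED_DIFF_M:
                  lrf_pose = first_dev.position_identification()         # S230: fine matching nearby
                  second_dev.overwrite_pose(lrf_pose)                    # S231
                  drive.set_speed(FIRST_SPEED)                           # S232, S233: back to LRF travel
                  return "lrf"
          drive.set_speed(FOURTH_SPEED)                                  # S215: try to escape the section
          return "retry"                                                 # the caller returns to S201 or stops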
  • FIG. 17 is a diagram schematically showing map switching processing in a normal state in which the reliability of LRF coordinates is high.
  • FIG. 18 is a diagram schematically showing map switching processing in the case where the reliability of LRF coordinates decreases while traveling in the map switching area.
  • one map covers an area of 50 m ⁇ 50 m, and two adjacent maps have an overlapping area of 5 m in width.
  • the center of each map is the origin of the coordinates, and the area 20 m to 25 m away from the origin is the map switching area in each of the horizontal direction (X direction) and the vertical direction (Y direction).
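  • A minimal sketch of this geometric check is shown below, assuming the 50 m × 50 m maps and the 20 m to 25 m band described here; the function name and the use of the map center as the origin of x and y are stated assumptions.

      def in_map_switching_area(x, y, inner=20.0, outer=25.0):
          """True if (x, y), measured from the center (origin) of the current 50 m x 50 m map,
          lies in the 20 m - 25 m band in the X or Y direction (the 5 m overlap with a neighbor)."""
          return inner <= abs(x) <= outer or inner <= abs(y) <= outer

      print(in_map_switching_area(22.0, 3.0))   # True: inside the overlap band in the X direction
      print(in_map_switching_area(10.0, 5.0))   # False: well inside the current map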
  • Suppose that the AGV 10 enters the map switching area in the X direction at the first speed (50 m/min in this example).
  • If the reliability of the LRF coordinates obtained by this position identification is sufficiently high and the difference from the encoder coordinates is sufficiently small, the microcomputer 14a updates the values of the encoder coordinates with the values of the LRF coordinates, returns the moving speed of the AGV 10 to the first speed of 50 m/min, and switches back to traveling using LRF coordinates.
  • the first positioning device 14e performs initial position identification after map switching using encoder coordinates instead of LRF coordinates. Thereby, the success rate of the initial position identification after map switching can be improved.
  • When the reliability of the LRF coordinates decreases while the AGV is traveling in the map switching area, the microcomputer 14a continues traveling using encoder coordinates until the reliability is recovered, as shown in FIG. 18.
  • the map to be used is switched, and the first positioning device 14 e repeats the initial position identification and tries to restore the reliability of LRF coordinates.
  • When the reliability is restored, the microcomputer 14a overwrites the encoder coordinates with the LRF coordinates, increases the speed of the AGV 10 from the third speed to the first speed, and switches to traveling using the LRF coordinates.
  • As described above, in the present embodiment, when the reliability of the LRF coordinates decreases, traveling is switched to using the encoder coordinates, and after the reliability is restored, traveling returns to using the LRF coordinates. Furthermore, by controlling the speed as well, more stable traveling becomes possible.
  • The microcomputer 14a may be configured to output a signal indicating which of the estimation result by the first positioning device 14e and the estimation result by the second positioning device 19 is selected when the AGV 10 is moved.
  • the signal may be output to the display 30 shown in FIG.
  • the display 30 can display information indicating which positioning method of the first positioning device 14e and the second positioning device 19 is selected.
  • the microcomputer 14a may transmit the signal to a device outside the AGV 10.
  • the external device may be, for example, the operation management device 50 or the terminal device 20.
  • the external device may be a device such as a light source or a speaker mounted on the AGV 10.
  • The external device can receive the signal and present, as light, sound, or character information, which of the first positioning device 14e and the second positioning device 19 is selected. Thereby, the user can know with which positioning method the AGV 10 is currently operating.
  • the switching conditions in Table 1 above are an example, and other conditions can be applied.
  • For example, regardless of the conditions in Table 1, the microcomputer 14a may select the estimation result by the second positioning device 19 as the position of the AGV 10 in either of the cases (1) and (2) described below.
  • the above determination is a determination as to whether or not the movement of LRF coordinates output from the first positioning device 14 e is similar to the movement of encoder coordinates.
  • This determination will be referred to as "reliability determination of LRF coordinates".
  • The difference between the movement distance per unit time calculated based on the coordinates output by the first positioning device 14e and the movement distance per unit time calculated based on the coordinates output by the second positioning device 19 is 20% or less.
  • The difference between the amount of change in the angle per unit time (for example, 1 second) calculated based on the output of the first positioning device 14e and the amount of change in the angle calculated based on the output of the second positioning device 19 is 10% or less.
  • The absolute value of the difference between the angle currently output from the first positioning device 14e and the angle currently output from the second positioning device 19 is 45 degrees or less.
  • The specific calculation formulas are, for example, as follows, where:
    - (Xr1, Yr1, θr1): coordinates and angle output by the first positioning device 14e one second before;
    - (Xr2, Yr2, θr2): coordinates and angle currently output by the first positioning device 14e;
    - (Xe1, Ye1, θe1): coordinates and angle output by the second positioning device 19 one second before;
    - (Xe2, Ye2, θe2): coordinates and angle currently output by the second positioning device 19.
  • FIG. 19 shows an example of these coordinates and angles. During normal driving, it is determined that the LRF coordinates are correct (pass) when all the following three inequalities (1) to (3) are satisfied.
  • Alternatively, the check may be made based on whether the distance calculated from the coordinates output by the first positioning device 14e, (Xr2 − Xr1)² + (Yr2 − Yr1)², is within the normal fluctuation range of the coordinate values of the first positioning device 14e. For example, assuming that the square of the normal fluctuation of the coordinate values of the first positioning device 14e is 500 (mm), the following expression (4) may be used instead of the above expression (1): (Xr2 − Xr1)² + (Yr2 − Yr1)² ≤ 500 (mm) … (4)
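  • Since the inequalities (1) to (3) themselves are not reproduced above, the following Python sketch is only a hedged reconstruction of the reliability determination based on the three textual conditions (distance difference within 20%, angle-change difference within 10%, current angle difference within 45 degrees) and the alternative check (4). The interpretation of the percentages as relative to the encoder values, the use of radians for the angles, the simplification of (4) to a plain distance threshold, and the handling of the stationary case are all assumptions; the exact formulas in the original may differ.

      import math

      def lrf_reliability_check(xr1, yr1, tr1, xr2, yr2, tr2,
                                xe1, ye1, te1, xe2, ye2, te2,
                                normal_swing_mm=500.0):
          """Pass/fail check of whether the movement of the LRF coordinates over the last
          second is similar to the movement of the encoder coordinates (hedged reconstruction)."""
          d_lrf = math.hypot(xr2 - xr1, yr2 - yr1)   # distance moved per the LRF coordinates (m)
          d_enc = math.hypot(xe2 - xe1, ye2 - ye1)   # distance moved per the encoder coordinates (m)
          dt_lrf = abs(tr2 - tr1)                    # angle change per the LRF coordinates (rad)
          dt_enc = abs(te2 - te1)                    # angle change per the encoder coordinates (rad)
          if d_enc > 0.0:
              cond1 = abs(d_lrf - d_enc) / d_enc <= 0.20          # (1) distance difference within 20 %
          else:
              cond1 = d_lrf * 1000.0 <= normal_swing_mm           # (4) simplified: normal swing when stopped
          cond2 = dt_enc == 0.0 or abs(dt_lrf - dt_enc) / dt_enc <= 0.10   # (2) angle change within 10 %
          cond3 = abs(tr2 - te2) <= math.radians(45.0)                     # (3) angle difference within 45 deg
          return cond1 and cond2 and cond3

      # Example: nearly identical motion over the last second -> pass
      # print(lrf_reliability_check(0, 0, 0.0, 0.50, 0.0, 0.02, 0, 0, 0.0, 0.49, 0.0, 0.02))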
  • While traveling using encoder coordinates, the microcomputer 14a may overwrite the encoder coordinates with the LRF coordinates and return to traveling by LRF coordinates if all of the following conditions are satisfied:
    - The above-mentioned reliability determination of the LRF coordinates passes.
    - The reliability of the current LRF coordinates is equal to or greater than a return threshold (for example, 40%).
    - The average of the reliability over the last predetermined number of position identifications (for example, 5) is equal to or greater than a threshold (for example, 40%).
    - The differences in the X component and the Y component between the LRF coordinates and the encoder coordinates are less than a threshold (for example, 2 m).
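  • For illustration only, the following Python sketch checks the four return conditions listed above, using the example values given in the text; the way the reliability history is kept is an assumption.

      from collections import deque

      RETURN_THRESHOLD = 0.40     # 40 %
      HISTORY_LEN = 5             # reliability of the last 5 position identifications
      MAX_XY_DIFF_M = 2.0         # threshold for the X / Y difference

      reliability_history = deque(maxlen=HISTORY_LEN)

      def may_return_to_lrf(reliability_pass, current_reliability, lrf_xy, enc_xy):
          """All four conditions must hold before overwriting the encoder coordinates with
          the LRF coordinates and returning to travel by LRF coordinates."""
          reliability_history.append(current_reliability)
          avg_ok = (len(reliability_history) == HISTORY_LEN and
                    sum(reliability_history) / HISTORY_LEN >= RETURN_THRESHOLD)
          diff_ok = (abs(lrf_xy[0] - enc_xy[0]) < MAX_XY_DIFF_M and
                     abs(lrf_xy[1] - enc_xy[1]) < MAX_XY_DIFF_M)
          return (reliability_pass and
                  current_reliability >= RETURN_THRESHOLD and
                  avg_ok and diff_ok)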
  • In this example, even if the difference in the X component or the Y component between the LRF coordinates and the encoder coordinates is equal to or greater than the allowable value (for example, 30 cm), the AGV returns to traveling by LRF coordinates as long as the above-mentioned reliability determination passes. For this reason, the AGV is more likely to return to traveling using LRF coordinates than when the conditions in Table 1 above are applied.
  • In the embodiment described above, an example has been illustrated in which the moving body is a guideless AGV, the first sensor is a laser range finder, and the second sensor includes two rotary encoders. However, the present disclosure is not limited to such an embodiment.
  • For example, the moving body may be a "guided" moving body that moves along a guide such as a magnetic tape provided on a road surface or a white line.
  • the first sensor or the second sensor may be a magnetic sensor that reads a magnetic tape or a camera that reads a white line by image recognition.
  • In that case, the positioning device may generate various pieces of information as reliability data, such as the degree of damage to the magnetic tape, the degree of staining of the white line, or the degree of matching in image processing.
  • An acceleration sensor or an angular acceleration sensor may be used as the first sensor or the second sensor.
  • the positioning device may generate various pieces of information as reliability data, such as the distribution of data output from these sensors, or the proportion of rapidly changing data.
  • In the above embodiments, an AGV traveling in a two-dimensional space is taken as an example.
  • the present disclosure can also be applied to a mobile object moving in three-dimensional space, such as a flying object (drone).
  • For example, when a drone creates a map of three-dimensional space while flying, the two-dimensional space described above can be expanded to a three-dimensional space.
  • the processing executed by the arithmetic circuit or the microcomputer in each of the above-described embodiments may be implemented by a computer program (software) or a dedicated circuit (hardware).
  • the mobile body and mobile body management system of the present disclosure can be suitably used for moving and transporting objects such as luggage, parts, finished products, etc. in factories, warehouses, construction sites, logistics, hospitals and the like.
  • Mobile computer, 21 CPU, 22 memory, 23 communication circuit, 24 image processing circuit, 25 display, 26 touch screen sensor, 30 display, 50 operation control device, 51 CPU, 52 memory, 53 location database (location DB), 54 communication circuit, 55 map database (map DB), 56 image processing circuit, 100 mobile management system, 101 first sensor, 102 second sensor, 103 first positioning device, 104 second positioning device, 105 arithmetic circuit, 106 motor, 107 drive device

Abstract

An object of the present invention is to stabilize the movement of a mobile body provided with two types of positioning devices using different detection methods. This mobile body comprises: a motor; a drive device that controls the motor to move the mobile body; a first sensor and a second sensor that respectively output first sensor data and second sensor data indicating detection results acquired in accordance with the movement of the mobile body, using mutually different detection methods; a first positioning device that performs a first estimation calculation using the first sensor data to estimate the position of the mobile body; a second positioning device that performs a second estimation calculation using the second sensor data to estimate the position of the mobile body; and an arithmetic circuit that, depending on whether or not reliability data indicating a degree of certainty of the estimation result obtained by the first positioning device matches certain criteria, selects either the estimation result obtained by the first positioning device or the estimation result obtained by the second positioning device as the position of the mobile body.
PCT/JP2018/028092 2017-08-03 2018-07-26 Corps mobile et programme informatique WO2019026761A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201880050090.1A CN110998472A (zh) 2017-08-03 2018-07-26 移动体以及计算机程序
JP2019534452A JPWO2019026761A1 (ja) 2017-08-03 2018-07-26 移動体およびコンピュータプログラム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-150567 2017-08-03
JP2017150567 2017-08-03

Publications (1)

Publication Number Publication Date
WO2019026761A1 true WO2019026761A1 (fr) 2019-02-07

Family

ID=65233750

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/028092 WO2019026761A1 (fr) 2017-08-03 2018-07-26 Corps mobile et programme informatique

Country Status (3)

Country Link
JP (1) JPWO2019026761A1 (fr)
CN (1) CN110998472A (fr)
WO (1) WO2019026761A1 (fr)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020039656A1 (fr) * 2018-08-23 2020-02-27 日本精工株式会社 Dispositif automoteur, et procédé de commande de déplacement et programme de commande de déplacement pour dispositif automoteur
JP2020140490A (ja) * 2019-02-28 2020-09-03 三菱ロジスネクスト株式会社 搬送システム、領域決定装置、および、領域決定方法
CN112578789A (zh) * 2019-09-30 2021-03-30 日本电产株式会社 移动体
WO2021147008A1 (fr) * 2020-01-22 2021-07-29 深圳市大疆创新科技有限公司 Procédé de commande de robot sans pilote et robot sans pilote
WO2022075083A1 (fr) * 2020-10-09 2022-04-14 ソニーグループ株式会社 Dispositif de mouvement autonome, procédé de commande et programme
JP2022079303A (ja) * 2020-11-16 2022-05-26 株式会社豊田自動織機 無人搬送車用制御装置
WO2022130476A1 (fr) * 2020-12-15 2022-06-23 日本電気株式会社 Dispositif de traitement d'informations, système de commande de corps mobile, procédé de commande et support non transitoire lisible par ordinateur
US11471806B2 (en) 2019-03-19 2022-10-18 Lg Electronics Inc. Air purifier and purifying system
JP7274707B1 (ja) 2021-12-13 2023-05-17 アイサンテクノロジー株式会社 評価システム、コンピュータプログラム、及び評価方法

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114577200A (zh) * 2022-03-08 2022-06-03 尚匠威亚智能装备(重庆)有限公司 一种用于移动搬运装置的路径数据交换系统

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007322138A (ja) * 2006-05-30 2007-12-13 Toyota Motor Corp 移動装置及び移動装置の自己位置推定方法
JP2014219722A (ja) * 2013-05-01 2014-11-20 村田機械株式会社 自律移動体
JP2016024598A (ja) * 2014-07-18 2016-02-08 パナソニックIpマネジメント株式会社 自律移動装置の制御方法
JP2016122278A (ja) * 2014-12-24 2016-07-07 ヤマハ発動機株式会社 操作装置および自律移動システム

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5438517A (en) * 1990-02-05 1995-08-01 Caterpillar Inc. Vehicle position determination system and method
BE1016001A3 (nl) * 2004-04-30 2006-01-10 Egemin Nv Automatisch geleid voertuig met verbeterde navigatie.
KR101483549B1 (ko) * 2013-12-03 2015-01-16 전자부품연구원 입자 생성 및 선별을 통한 카메라 위치 추정 방법 및 이동 시스템
JP6011562B2 (ja) * 2014-02-27 2016-10-19 Jfeスチール株式会社 自走式検査装置及び検査システム
WO2015156821A1 (fr) * 2014-04-11 2015-10-15 Nissan North America, Inc. Système de localisation de véhicule
CN106501833A (zh) * 2015-09-07 2017-03-15 石立公 一种基于多源定位的检测车辆所在道路区域的系统和方法
CN105424030B (zh) * 2015-11-24 2018-11-09 东南大学 基于无线指纹和mems传感器的融合导航装置和方法
CN105445776A (zh) * 2015-12-28 2016-03-30 天津大学 一种室内外无缝定位系统
CN106908822B (zh) * 2017-03-14 2020-06-30 北京京东尚科信息技术有限公司 无人机定位切换方法、装置和无人机

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007322138A (ja) * 2006-05-30 2007-12-13 Toyota Motor Corp 移動装置及び移動装置の自己位置推定方法
JP2014219722A (ja) * 2013-05-01 2014-11-20 村田機械株式会社 自律移動体
JP2016024598A (ja) * 2014-07-18 2016-02-08 パナソニックIpマネジメント株式会社 自律移動装置の制御方法
JP2016122278A (ja) * 2014-12-24 2016-07-07 ヤマハ発動機株式会社 操作装置および自律移動システム

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2020039656A1 (ja) * 2018-08-23 2020-08-27 日本精工株式会社 自走装置、自走装置の走行制御方法及び走行制御プログラム
WO2020039656A1 (fr) * 2018-08-23 2020-02-27 日本精工株式会社 Dispositif automoteur, et procédé de commande de déplacement et programme de commande de déplacement pour dispositif automoteur
US11531344B2 (en) 2018-08-23 2022-12-20 Nsk Ltd. Autonomous running device, running control method for autonomous running device, and running control program of autonomous running device
JP2020140490A (ja) * 2019-02-28 2020-09-03 三菱ロジスネクスト株式会社 搬送システム、領域決定装置、および、領域決定方法
US11471806B2 (en) 2019-03-19 2022-10-18 Lg Electronics Inc. Air purifier and purifying system
CN112578789A (zh) * 2019-09-30 2021-03-30 日本电产株式会社 移动体
CN113661454A (zh) * 2020-01-22 2021-11-16 深圳市大疆创新科技有限公司 一种无人控制机器人的控制方法及无人控制机器人
WO2021147008A1 (fr) * 2020-01-22 2021-07-29 深圳市大疆创新科技有限公司 Procédé de commande de robot sans pilote et robot sans pilote
WO2022075083A1 (fr) * 2020-10-09 2022-04-14 ソニーグループ株式会社 Dispositif de mouvement autonome, procédé de commande et programme
JP2022079303A (ja) * 2020-11-16 2022-05-26 株式会社豊田自動織機 無人搬送車用制御装置
TWI784786B (zh) * 2020-11-16 2022-11-21 日商豐田自動織機股份有限公司 無人搬運車用控制裝置
JP7338612B2 (ja) 2020-11-16 2023-09-05 株式会社豊田自動織機 無人搬送車用制御装置
WO2022130476A1 (fr) * 2020-12-15 2022-06-23 日本電気株式会社 Dispositif de traitement d'informations, système de commande de corps mobile, procédé de commande et support non transitoire lisible par ordinateur
JP7274707B1 (ja) 2021-12-13 2023-05-17 アイサンテクノロジー株式会社 評価システム、コンピュータプログラム、及び評価方法
JP2023087496A (ja) * 2021-12-13 2023-06-23 アイサンテクノロジー株式会社 評価システム、コンピュータプログラム、及び評価方法

Also Published As

Publication number Publication date
JPWO2019026761A1 (ja) 2020-07-27
CN110998472A (zh) 2020-04-10

Similar Documents

Publication Publication Date Title
WO2019026761A1 (fr) Corps mobile et programme informatique
JP7168211B2 (ja) 障害物の回避動作を行う移動体およびそのコンピュータプログラム
JP2019168942A (ja) 移動体、管理装置および移動体システム
US20200264616A1 (en) Location estimation system and mobile body comprising location estimation system
WO2019187816A1 (fr) Corps mobile et système de corps mobile
JP7081881B2 (ja) 移動体および移動体システム
US20200363212A1 (en) Mobile body, location estimation device, and computer program
JP7136426B2 (ja) 管理装置および移動体システム
WO2019054209A1 (fr) Système et dispositif de création de carte
JP2019148870A (ja) 移動体管理システム
JP2019053391A (ja) 移動体
US11537140B2 (en) Mobile body, location estimation device, and computer program
JP2019175137A (ja) 移動体および移動体システム
JP7243014B2 (ja) 移動体
WO2019194079A1 (fr) Système d'estimation de position, corps mobile comprenant ledit système d'estimation de position, et programme informatique
JP2019179497A (ja) 移動体および移動体システム
JP2019079171A (ja) 移動体
JP2019067001A (ja) 移動体
JP2019165374A (ja) 移動体および移動体システム
WO2020213645A1 (fr) Système de création de carte, circuit de traitement de signal, corps mobile et procédé de création de carte
JP2020166702A (ja) 移動体システム、地図作成システム、経路作成プログラムおよび地図作成プログラム
JPWO2019069921A1 (ja) 移動体
JP2019148871A (ja) 移動体および移動体システム
JP2019175138A (ja) 移動体および管理装置
WO2019059299A1 (fr) Dispositif de gestion opérationnelle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18841448

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019534452

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18841448

Country of ref document: EP

Kind code of ref document: A1