WO2019033712A1 - Map creation method for a mobile robot and path planning method based on the map - Google Patents

Map creation method for a mobile robot and path planning method based on the map

Info

Publication number
WO2019033712A1
Authority
WO
WIPO (PCT)
Prior art keywords
point
mobile robot
coordinates
axis
yes
Prior art date
Application number
PCT/CN2018/073930
Other languages
English (en)
French (fr)
Inventor
吴俍儒
杨楷
郑卓斌
丁璜
Original Assignee
广东宝乐机器人股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 广东宝乐机器人股份有限公司
Priority to EP18846071.1A (EP3671121A4)
Priority to JP2019569913A (JP6915209B2)
Publication of WO2019033712A1
Priority to US16/712,977 (US11385062B2)
Priority to US17/533,087 (US20220082391A1)

Classifications

    • G01C21/3848 Creation or updating of map data characterised by the source of data: data obtained from both position sensors and additional sensors
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/188 Compensation of inertial measurements for accumulated errors, e.g. by coupling inertial systems with absolute positioning systems
    • G01C21/206 Instruments for performing navigational calculations specially adapted for indoor navigation
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles, with means for defining a desired trajectory
    • G05D1/0219 Trajectory control ensuring the processing of the whole working surface
    • G05D1/0246 Optical position detecting means using a video camera in combination with image processing means
    • G05D1/027 Internal positioning means comprising inertial navigation means, e.g. azimuth detector
    • G05D1/0274 Internal positioning means using mapping information stored in a memory device
    • G06F18/25 Pattern recognition: fusion techniques
    • G06V10/80 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V20/10 Scenes: terrestrial scenes

Definitions

  • the present invention relates to the field of mobile robots, and more particularly to map creation and path planning of mobile robots based on inertial navigation systems.
  • service robots on the market can automatically perform specific tasks in specific areas under unattended conditions, which brings convenience to people and has a promising market prospect.
  • In addition, the complexity of the sweeping robot's working environment, variations in the friction coefficient between the travelling mechanism and the working surface, and errors caused by the travelling mechanism slipping (or becoming trapped) also affect the accuracy of the inertial navigation system.
  • One of the objects of the present invention is to overcome the deficiencies in the background art and provide a map creation method for a mobile robot.
  • the specific scheme is as follows:
  • A map creation method for a mobile robot, the mobile robot comprising an odometer, a gyroscope and a camera, the method comprising the steps of: S1: establishing a Cartesian coordinate system in the working area of the mobile robot; S2: the mobile robot walking in a boustrophedon (弓-shaped) pattern in the working area; S3: acquiring a key frame picture of the mobile robot at a point Pi and saving the picture and the coordinates of the point Pi; S4: acquiring a picture of the mobile robot at a point P'i, wherein the abscissa or the ordinate of the point P'i is the same as that of the point Pi; S5: performing feature extraction and matching on the pictures acquired at the point Pi and the point P'i according to the ORB algorithm; S6: correcting the coordinates of the mobile robot at the point P'i and the data of the odometer and/or the gyroscope according to the matching result, and saving them; S7: repeating steps S3 to S6 until map creation in the working area is completed.
  • The step S2 includes the following sub-steps: S201: the mobile robot walks straight along the positive direction of the X-axis; S202: determine whether the mobile robot encounters an obstacle; if yes, proceed to step S203, otherwise proceed to step S206; S203: the mobile robot walks along the obstacle in the positive direction of the Y-axis; S204: determine whether the change of the mobile robot's coordinate on the Y-axis is greater than a threshold M; if yes, proceed to step S208, otherwise proceed to step S205; S205: determine whether the Y-axis coordinate of the mobile robot is the same as in step S201; if yes, return to step S201, otherwise return to step S203; S206: determine whether the straight-line walking distance of the mobile robot is greater than a threshold N; if yes, proceed to step S207, otherwise return to step S201; S207: the mobile robot translates in a straight line by M in the positive direction of the Y-axis and then proceeds to step S208; S208: the mobile robot walks straight along the negative direction of the X-axis; S209: determine whether the mobile robot encounters an obstacle; if yes, proceed to step S210, otherwise proceed to step S213; S210: the mobile robot walks along the obstacle in the positive direction of the Y-axis; S211: determine whether the change of the mobile robot's coordinate on the Y-axis is greater than the threshold M; if yes, return to step S201, otherwise proceed to step S212; S212: determine whether the Y-axis coordinate of the mobile robot is the same as in step S208; if yes, return to step S208, otherwise return to step S210; S213: determine whether the straight-line walking distance of the mobile robot is greater than the threshold N; if yes, proceed to step S214, otherwise return to step S208.
  • Preferably, the mobile robot is a coverage-type mobile robot.
  • M ranges from 0.2 meters to 0.3 meters.
  • N ranges from 6 meters to 8 meters.
  • Preferably, the mobile robot saves a key frame picture and the coordinates of its corresponding point Pi at predetermined intervals.
  • the minimum distance between the adjacent key frame picture collection points Pi is 0.6 meters.
  • the distance between the point Pi having the same X coordinate or Y coordinate and the point P'i is 0.3 m to 0.6 m.
  • the picture is preferably a ceiling picture or a ground picture of a mobile robot working environment.
  • the second object of the present invention is to overcome the defects in the background art and provide a path planning method for a mobile robot.
  • the specific solution is as follows:
  • A path planning method for a mobile robot, comprising the following steps: S'1: the mobile robot moves from an initial position to a point Pi or a point P'i; S'2: correcting the data of the odometer and/or the gyroscope at the point Pi or the point P'i; S'3: repeating steps S'1 to S'2 until the mobile robot moves to the final target point.
  • The step S'1 includes the following sub-steps: S'11: acquiring the coordinates of the initial position of the mobile robot and the coordinates of the final target point; S'12: determining whether there is a point Pi or a point P'i in the rectangular area formed by the coordinates of the initial position and the coordinates of the final target point; if yes, proceed to step S'13, otherwise proceed to step S'14; S'13: the mobile robot moves to the point Pi or point P'i in the rectangular area that is closest to the initial position, and then proceeds to step S'15; S'14: the mobile robot moves to the point Pi or point P'i in the working area that is closest to the initial position, and then proceeds to step S'15; S'15: determining whether there is a point Pi or a point P'i in the rectangular area formed by the coordinates of the current position and the coordinates of the final target point; if yes, return to step S'13, otherwise move to the final target point.
  • The step S'2 includes the following sub-steps: S'21: acquiring a picture of the mobile robot at the point Pi or the point P'i; S'22: performing feature extraction on the picture according to the ORB algorithm and feature matching with the picture saved at the point Pi or the point P'i; S'23: correcting the data of the odometer and/or the gyroscope of the mobile robot at the point Pi or the point P'i according to the matching result.
  • The final target point includes an area the mobile robot has not passed through, the charging stand position, and the mobile robot's starting point.
  • The invention corrects the coordinates of the mobile robot and the data of the odometer and/or the gyroscope during map creation through image matching technology (the ORB algorithm), reducing the cumulative errors of coordinates and angles and improving the accuracy of the map.
  • FIG. 1 is an overall flow chart of a method for creating a map of a mobile robot according to the present invention
  • FIG. 2 is a flow chart of an optimization scheme of step S2 in FIG. 1;
  • FIG. 3 is a flow chart of a path planning method for a mobile robot according to the present invention.
  • FIG. 4 is a flow chart of an optimization scheme of step S'1 in FIG. 3;
  • FIG. 5 is a flow chart of an optimization scheme of step S'2 in Figure 3;
  • FIG. 6 is a schematic diagram of map creation according to Embodiment 1 of the present invention.
  • FIG. 7 is a path planning diagram of the mobile robot based on the map in FIG. 6;
  • FIG. 8 is a schematic diagram of map creation according to Embodiment 2 of the present invention.
  • FIG. 9 is a path planning diagram of the mobile robot based on the map in FIG. 8.
  • the mobile robot of the present invention is based on an inertial navigation system including an odometer, a gyroscope, and a camera.
  • the odometer is used to obtain the walking distance of the mobile robot
  • the gyroscope is used to acquire the walking angle of the mobile robot
  • the camera is installed at the bottom or the top of the mobile robot for acquiring picture information of the environment.
  • The mobile robot in the present invention is, in particular, a coverage-type robot operating in a designated closed area, such as a security patrol robot or a sweeping robot operating in an indoor environment, a mowing robot working in a specific lawn area, or a cleaning robot cleaning a swimming pool.
  • the mobile robot further includes a power module, a walking module, a sensor module, a control module, a storage module, and a function module.
  • the power module is a rechargeable battery for providing power for the mobile robot during operation;
  • the walking module includes a driving motor and a walking wheel for moving the mobile robot in the working area; and
  • the sensor module is configured to receive environmental information of the working area.
  • the control module is used to control the movement of the mobile robot according to the information fed back by the sensor module;
  • the storage module is used to store the control program of the mobile robot and the sensor information acquired from the outside; and
  • the function module refers to the module that performs the specific function of the mobile robot, such as the cleaning module of a sweeping robot or the rotating tool holder of a mowing robot.
  • Referring to FIG. 1, a flow chart for creating a map of a mobile robot of the present invention is shown.
  • A rectangular coordinate system consisting of an abscissa (X) and an ordinate (Y) is established in the working area of the mobile robot, with the starting point of work, a corner of the working area, or the charging stand preferably taken as the origin.
  • The mobile robot starts to walk straight in a certain direction until it meets an obstacle or the boundary of the working area, or has walked a predetermined distance; it then shifts in a direction perpendicular to that direction and walks straight in the reverse direction, repeating the above process until coverage of the working area is complete.
  • Specifically, the position at which the mobile robot starts working is taken as the origin, the direction in which the mobile robot starts to travel is the positive direction of the X-axis, and the direction perpendicular to the travelling direction, on the robot's left side, is the positive direction of the Y-axis; with this rectangular coordinate system established, the following steps are performed to complete the boustrophedon walk in the working area: S201: the mobile robot walks straight along the positive direction of the X-axis; S202: determine whether the mobile robot encounters an obstacle; if yes, proceed to step S203, otherwise proceed to step S206; S203: the mobile robot walks along the obstacle in the positive direction of the Y-axis. Here, walking along the obstacle in the positive direction of the Y-axis means that, on encountering an obstacle, the mobile robot follows it in the direction in which its Y-axis coordinate increases; if the obstacle is curved or convex, the mobile robot may, after following the obstacle for some time, walk in the negative direction of the Y-axis (the direction in which the Y-axis coordinate decreases). S204: determine whether the change of the mobile robot's coordinate on the Y-axis is greater than the threshold M; if yes, proceed to step S208, otherwise proceed to step S205; S205: determine whether the Y-axis coordinate of the mobile robot is the same as in step S201; if yes, return to step S201, otherwise return to step S203; S206: determine whether the straight-line walking distance of the mobile robot is greater than the threshold N; if yes, proceed to step S207, otherwise return to step S201.
  • S3: acquire a key frame picture of the mobile robot at the point Pi and save the picture and the coordinates of the point Pi.
  • The mobile robot starts to move and, while walking, continuously captures pictures of the ceiling or the ground through the camera and performs feature matching. At a point Pi (i is a positive integer), a key frame picture with obvious features is selected, and the key frame picture is saved together with the coordinates of the point Pi in the Cartesian coordinate system. Since the mobile robot has only just started moving, the data of the odometer and the gyroscope are still accurate, so the coordinates of the point Pi can be taken directly from the odometer and the gyroscope and saved. The mobile robot then continues to walk, continuously capturing pictures of the ceiling or the ground through the camera and performing feature matching.
  • When a new key frame is found, it is determined whether the distance (in the X-axis or Y-axis direction) between the point where the key frame is located and the coordinates Pi of an already stored key frame picture is greater than a predetermined value (such as 0.6 meters); if yes, the key frame picture and the coordinates of that point are saved, otherwise the point is discarded (if too many key frames are saved, frequent calibration affects the working efficiency of the mobile robot). By analogy, the key frame pictures in the working area and the coordinates of their acquisition points Pi are saved.
  • Since the mobile robot moves in a boustrophedon pattern, after meeting an obstacle or the boundary of the working area, or after walking a predetermined distance, it shifts sideways and walks back, and will pass a point P'i whose abscissa or ordinate is the same as that of a point Pi; a picture of the mobile robot at the point P'i is then acquired.
  • the distance between the point P'i and the point Pi is within a certain range, such as between 0.3 meters and 0.6 meters, in order to balance the working efficiency and matching accuracy of the mobile robot.
  • Because the abscissa or the ordinate of the point P'i is the same as that of the point Pi, the two pictures can be matched at a known relative angle, which improves matching efficiency and reduces mismatches.
  • Features are extracted from the pictures acquired at the points Pi and P'i by the ORB algorithm, and the features extracted from the two pictures are then matched.
  • Steps S3 to S6 are repeated until the map creation in the work area is completed.
  • Referring to FIG. 3, a flow chart of a mobile robot path planning method according to the present invention is shown; the path planning method of the mobile robot is based on the map created by the above map creation method.
  • The step S'1 includes the following sub-steps: S'11: acquiring the coordinates of the initial position of the mobile robot and the coordinates of the final target point; S'12: determining whether there is a point Pi or a point P'i in the rectangular area formed by the coordinates of the initial position and the coordinates of the final target point; if yes, proceed to step S'13, otherwise proceed to step S'14; S'13: the mobile robot moves to the point Pi or point P'i in the rectangular area that is closest to the initial position, and then proceeds to step S'15; S'14: the mobile robot moves to the point Pi or point P'i in the working area that is closest to the initial position, and then proceeds to step S'15; S'15: determining whether there is a point Pi or a point P'i in the rectangular area formed by the coordinates of the current position and the coordinates of the final target point; if yes, return to step S'13, otherwise move to the final target point.
  • The step S'2 includes the following sub-steps: S'21: acquiring a picture of the mobile robot at the point Pi or the point P'i; S'22: performing feature extraction on the picture according to the ORB algorithm and feature matching with the picture saved at the point Pi or the point P'i; S'23: correcting the data of the odometer and/or the gyroscope of the mobile robot at the point Pi or the point P'i according to the matching result.
  • Steps S'1 to S'2 are repeated until the mobile robot moves to the final target point.
  • In this way, the mobile robot corrects the odometer and the gyroscope at least once at a point Pi or a point P'i before moving to the final target point, thereby eliminating the accumulated error of the odometer and the gyroscope to some extent and improving the success rate of the mobile robot moving to the final target point.
  • The final target points in the method include areas the mobile robot has not passed through, the charging stand position, and the mobile robot's starting point.
  • FIG. 6 and FIG. 7 are, respectively, a map creation schematic diagram and a path planning diagram of the sweeping robot in a rectangular, obstacle-free space.
  • the starting point A1 of the sweeping robot is taken as the origin
  • the forward direction of the sweeping robot is the positive direction of the X-axis
  • the Cartesian coordinate system is established with the direction perpendicular to the left side of the sweeping robot as the positive direction of the Y-axis.
  • The sweeping robot moves forward to the point P1 and the point P2, selects key frame pictures with obvious features, and saves the pictures together with the coordinates of the point P1 and the point P2 (the coordinates of the point P1 and the point P2 are determined by the odometer and the gyroscope).
  • During the subsequent movement, key frame pictures with obvious features are likewise screened out; if the distance between the current point P' and the already saved points is less than the predetermined value, the point P' is discarded, otherwise the key frame picture and the coordinates of the point P' are saved.
  • The sweeping robot continues to move, saving the key frame picture and the coordinates of point P3 at point P3; it then moves to point a, where it meets the boundary of the working area, and walks along the positive direction of the Y-axis for the predetermined distance M to point b, where M is the body width of the sweeping robot (0.25 m in this embodiment). It then travels in the negative direction of the X-axis, passing through the points with the same abscissa X as point P3, point P2 and point P1; since the distance between these points (on line b-c) and the corresponding points (on line A1-a) is less than the preset value (0.4 m in this embodiment), there is no need to save or correct the coordinates or pictures of these points (too dense picture saving and correction would affect the working efficiency of the cleaning robot).
  • The sweeping robot continues to move to point c, where it meets the boundary of the working area, then walks along the positive direction of the Y-axis for the predetermined distance M to reach point d, then walks in the positive direction of the X-axis, passing through the points having the same abscissa X as point P1, point P2 and point P3; having now translated a distance greater than the preset value, it performs picture acquisition and correction at these corresponding points.
  • The sweeping robot continues to move along the positive direction of the X-axis to the edge of the working area, screens out the key frame pictures with obvious features and their corresponding points (P4, P5, P6, P7) and saves them, and then performs coordinate correction and correction of the odometer and gyroscope data at the corresponding points (P'4, P'5, P'6, P'7).
  • When the cleaning robot moves to the point A2, map creation of the rectangular area is completed, and the cleaning robot starts to search for the final target point to return to, which in this embodiment is its starting working point A1. Since there is no point Pi or point P'i in the rectangular area formed by the coordinates of point A1 and point A2, the sweeping robot moves to point P'7 and corrects the data of the odometer and/or the gyroscope there; the cleaning robot then moves from the point P'7 toward the final target point A1. Since the rectangular area formed by the coordinates of the point P'7 and the point A1 (the rectangular frame drawn with a broken line) contains the point P7, the cleaning robot moves to the point P7, performs correction there, and then moves to the final target point A1.
  • FIG. 8 and FIG. 9 are, respectively, a map creation schematic diagram and a path planning diagram of the sweeping robot in a rectangular working area containing a rectangular obstacle B1 and an elliptical obstacle B2.
  • the starting point A1 of the sweeping robot is taken as the origin
  • the forward direction of the sweeping robot is the positive direction of the X-axis
  • the Cartesian coordinate system is established with the direction perpendicular to the left side of the sweeping robot as the positive direction of the Y-axis.
  • The sweeping robot moves forward to the point P1 and the point P2, selects key frame pictures with obvious features, and saves the pictures together with the coordinates of the point P1 and the point P2 (the coordinates of the point P1 and the point P2 are determined by the odometer and the gyroscope).
  • When the sweeping robot moves to point a, it encounters the obstacle B1; it then walks M (0.25 m in this embodiment) in the positive direction of the Y-axis to reach the point b, and then walks in the negative direction along the X-axis, passing through the point with the same abscissa as point P2; since the translation is less than the preset value, the coordinates and pictures of this point are not saved or corrected (too dense picture saving and correction would affect the efficiency of the cleaning robot).
  • The sweeping robot continues to move to point c, where it meets the boundary of the working area, then walks along the positive direction of the Y-axis for the predetermined distance M to reach point d, and then walks in the positive direction of the X-axis, passing through the points with the same abscissa X as point P1 and point P2. Because the sweeping robot has now translated by two M, that is 0.5 meters, which is greater than the preset value (0.4 meters), it acquires pictures at the corresponding points P'1 and P'2, performs feature extraction and matching against the key frame pictures saved at the points with the same abscissa X (point P1, point P2) using the ORB algorithm, and then, according to the matching result, corrects the coordinates of the points P'1 and P'2 and the odometer and/or gyroscope data at those points.
  • When the sweeping robot moves to the point e, it encounters the obstacle B1 again and moves around it to the point f; since the change in the Y-axis direction between the point e and the point f is less than M (0.25 m), the sweeping robot does not turn to travel in the negative direction of the X-axis but continues to move along the obstacle B1 until the point h. At that point, because the ordinate of the point h is the same as that of the point d, the sweeping robot stops moving along the obstacle B1 and moves in the positive direction of the X-axis. When the robot moves to the point i, since the distance between the point d and the point i reaches 6 meters (N is 6 meters in this embodiment), the sweeping robot walks in the positive direction of the Y-axis for the distance M to reach the point j, and then travels in the negative direction along the X-axis.
  • Map creation of the area being traversed proceeds in the same manner as in Embodiment 1 until the robot moves to the point A2. The sweeping robot then moves to the untraversed area and creates its map as described above, until map creation of the entire working area is completed.
  • The differences between this embodiment and Embodiment 1 are as follows:
  • After the sweeping robot has walked a predetermined distance in the X-axis direction, it automatically translates by the distance M along the positive direction of the Y-axis and then walks back along the X-axis;
  • The sweeping robot saves the key frame picture at a point Pi and its corresponding coordinates after every predetermined running distance, that is, adjacent points Pi are equally spaced along the abscissa.
  • The sweeping robot passes through point P11, point P10, point P9 and point P8 in turn, and corrects the data of the odometer and/or the gyroscope at these points.
  • When the sweeping robot reaches the point P8, since there is no point Pi or point P'i in the rectangular area formed by the point P8 and the final target point A3, the sweeping robot moves directly from the point P8 to the final target point A3; its walking path is shown by the arrows in the figure.
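The target-point search of steps S'11 to S'15, as exercised in the two embodiments above, can be sketched in Python. This is an illustrative reading of the claims, not the patented implementation; all function and variable names are hypothetical:

```python
# Illustrative reading of steps S'11-S'15: hop between saved correction
# points (Pi / P'i) lying in the rectangle spanned by the current position
# and the goal, always taking the nearest one; if the very first rectangle
# contains no such point, go to the nearest correction point anywhere in
# the working area (S'14).
import math

def in_rect(p, a, b):
    """True if p lies in the axis-aligned rectangle with corners a and b."""
    return (min(a[0], b[0]) <= p[0] <= max(a[0], b[0])
            and min(a[1], b[1]) <= p[1] <= max(a[1], b[1]))

def plan_path(start, goal, correction_pts):
    path, pos, first = [start], start, True
    remaining = list(correction_pts)
    while remaining:
        inside = [p for p in remaining if in_rect(p, pos, goal)]
        if inside:                                   # S'12 / S'15 -> S'13
            nxt = min(inside, key=lambda p: math.dist(pos, p))
        elif first:                                  # S'14: nearest anywhere
            nxt = min(remaining, key=lambda p: math.dist(pos, p))
        else:                                        # no useful point left
            break
        path.append(nxt)
        remaining.remove(nxt)
        pos, first = nxt, False
    path.append(goal)                                # final leg to the target
    return path
```

With start (0, 0), goal (4, 4) and correction points (1, 1), (3, 3), (5, 5), the sketch visits (1, 1) and (3, 3) before heading to the goal, mirroring how the cleaning robot in Embodiment 1 detours via P'7 and P7 on its way back to A1.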


Abstract

A map creation method for a mobile robot and a path planning method based on the map. The map creation method comprises: S1: establishing a rectangular coordinate system in the working area of the mobile robot; S2: the mobile robot walks in a boustrophedon (zigzag) pattern within the working area; S3: acquiring a key-frame picture of the mobile robot at point Pi and saving the picture and the coordinates of point Pi; S4: acquiring a picture of the mobile robot at point P'i, where the abscissa or ordinate of point P'i is the same as that of point Pi; S5: performing feature extraction and matching on the pictures acquired at points Pi and P'i according to the ORB algorithm; S6: correcting the coordinates of the mobile robot at point P'i and the odometer and/or gyroscope data according to the matching result and saving them; S7: repeating steps S3 to S6 until map creation of the working area is complete. Correcting the map coordinates and the odometer and/or gyroscope data of the mobile robot through the ORB algorithm reduces accumulated coordinate and angle errors and improves map accuracy.

Description

Map creation method for a mobile robot and path planning method based on the map
Technical Field
The present invention relates to the field of mobile robots, and in particular to map creation and path planning for mobile robots based on inertial navigation systems.
Background Art
Service robots currently on the market can automatically complete specific tasks in specific areas without human involvement, bringing convenience to people; the market prospects are considerable.
As a representative service robot, the sweeping robot is developing rapidly and market demand is huge. Most existing sweeping robots perform positioning, map creation, and navigation based on an inertial navigation system. However, errors of the components of the inertial navigation system (gyroscope and odometer) accumulate gradually as working time increases and lead to large errors, making the sweeping robot's positioning inaccurate, causing the map information to differ significantly from reality, and preventing path planning from being executed.
In addition, the complexity of the sweeping robot's working environment, changes in the friction coefficient between the traveling mechanism and the work surface, and errors caused by the traveling mechanism slipping (or getting stuck) also affect the accuracy of the inertial navigation system.
Summary of the Invention
One object of the present invention is to overcome the defects of the background art and provide a map creation method for a mobile robot. The specific scheme is as follows:
A map creation method for a mobile robot, the mobile robot comprising an odometer, a gyroscope, and a camera, the method comprising the following steps: S1: establishing a rectangular coordinate system in the working area of the mobile robot; S2: the mobile robot walks in a boustrophedon pattern within the working area; S3: acquiring a key-frame picture of the mobile robot at point Pi and saving the picture and the coordinates of point Pi; S4: acquiring a picture of the mobile robot at point P'i, where the abscissa or ordinate of point P'i is the same as that of point Pi; S5: performing feature extraction and matching on the pictures acquired at points Pi and P'i according to the ORB algorithm; S6: correcting the coordinates of the mobile robot at point P'i and the odometer and/or gyroscope data according to the matching result and saving them; S7: repeating steps S3 to S6 until map creation of the working area is complete.
In an optimized scheme of the present invention, step S2 comprises the following sub-steps: S201: the mobile robot walks straight in the positive X direction; S202: judging whether the mobile robot encounters an obstacle; if yes, proceed to step S203, otherwise proceed to step S206; S203: the mobile robot walks along the obstacle toward the positive Y direction; S204: judging whether the change of the mobile robot's coordinate on the Y axis is greater than threshold M; if yes, proceed to step S208, otherwise proceed to step S205; S205: judging whether the Y coordinate of the mobile robot is the same as that in step S201; if yes, return to step S201, otherwise return to step S203; S206: judging whether the straight-line walking distance of the mobile robot is greater than threshold N; if yes, proceed to step S207, otherwise return to step S201; S207: the mobile robot translates in a straight line by M in the positive Y direction and then proceeds to step S208; S208: the mobile robot walks straight in the negative X direction; S209: judging whether the mobile robot encounters an obstacle; if yes, proceed to step S210, otherwise proceed to step S213; S210: the mobile robot walks along the obstacle toward the positive Y direction; S211: judging whether the change of the mobile robot's coordinate on the Y axis is greater than threshold M; if yes, return to step S201, otherwise proceed to step S212; S212: judging whether the Y coordinate of the mobile robot is the same as that in step S208; if yes, return to step S208, otherwise return to step S210; S213: judging whether the straight-line walking distance of the mobile robot is greater than threshold N; if yes, proceed to step S214, otherwise return to step S208; S214: the mobile robot translates in a straight line by M in the positive Y direction and then returns to step S201.
Further, in a specific embodiment of the present invention, the mobile robot is a coverage-type mobile robot, M ranges from 0.2 m to 0.3 m, and N ranges from 6 m to 8 m.
Further, in the optimized scheme of the present invention, in step S3 the mobile robot saves a key-frame picture and the coordinates of its corresponding point Pi every predetermined distance.
Further, the minimum distance between adjacent key-frame picture acquisition points Pi is 0.6 m.
Further, the distance between point Pi and point P'i having the same X or Y coordinate is 0.3 m to 0.6 m.
Further, the picture is preferably a picture of the ceiling or the ground of the working environment of the mobile robot.
A second object of the present invention is to overcome the defects of the background art and provide a path planning method for a mobile robot. The specific scheme is as follows:
A path planning method for a mobile robot, based on a map created by the above map creation method, comprising the following steps: S'1: the mobile robot moves from an initial position to a point Pi or point P'i; S'2: correcting the odometer and/or gyroscope data at point Pi or point P'i; S'3: repeating steps S'1 to S'2 until the robot moves to a final target point.
Further, in a specific embodiment of the present invention, step S'1 comprises the following sub-steps: S'11: acquiring the coordinates of the initial position of the mobile robot and the coordinates of the final target point; S'12: judging whether there is a point Pi or point P'i in the rectangular area formed by the coordinates of the initial position and the coordinates of the final target point; if yes, proceed to step S'13, otherwise proceed to step S'14; S'13: the mobile robot moves to the point Pi or point P'i in the rectangular area closest to the initial position, and then proceeds to step S'15; S'14: the mobile robot moves to the point Pi or point P'i in the working area closest to the initial position, and then proceeds to step S'15; S'15: judging whether there is a point Pi or point P'i in the rectangular area formed by the coordinates of the current position and the coordinates of the final target point; if yes, return to step S'13, otherwise move to the final target point.
Further, in a specific embodiment of the present invention, step S'2 comprises the following sub-steps: S'21: acquiring a picture of the mobile robot at point Pi or point P'i; S'22: performing feature extraction on the picture according to the ORB algorithm and matching the features against the picture saved at point Pi or point P'i; S'23: correcting the odometer and/or gyroscope data of the mobile robot at point Pi or point P'i according to the matching result.
Further, the final target point includes an area not yet traversed by the mobile robot, the position of the charging dock, and the starting point of the mobile robot.
Compared with the prior art, the technical scheme of the present invention has the following beneficial effects:
The present invention corrects the coordinates of the mobile robot during map creation and the odometer and/or gyroscope data through an image matching technique (the ORB algorithm), reducing accumulated coordinate and angle errors and improving map accuracy.
Brief Description of the Drawings
To explain the technical schemes of the embodiments of the present invention more clearly, the drawings required in the embodiments are briefly introduced below. Obviously, the drawings described below illustrate only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is an overall flowchart of the map creation method for a mobile robot of the present invention;
Fig. 2 is a flowchart of an optimized scheme of step S2 in Fig. 1;
Fig. 3 is a flowchart of the path planning method for a mobile robot of the present invention;
Fig. 4 is a flowchart of an optimized scheme of step S'1 in Fig. 3;
Fig. 5 is a flowchart of an optimized scheme of step S'2 in Fig. 3;
Fig. 6 is a schematic diagram of map creation in Embodiment 1 of the present invention;
Fig. 7 is a path planning diagram of the mobile robot based on the map in Fig. 6;
Fig. 8 is a schematic diagram of map creation in Embodiment 2 of the present invention;
Fig. 9 is a path planning diagram of the mobile robot based on the map in Fig. 8.
Detailed Description of the Embodiments
The technical schemes in the embodiments of the present invention will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
The mobile robot of the present invention is based on an inertial navigation system and comprises an odometer, a gyroscope, and a camera. The odometer is used to acquire the walking distance of the mobile robot, the gyroscope is used to acquire the walking angle of the mobile robot, and the camera is mounted on the bottom or the top of the mobile robot to acquire picture information of the environment.
The mobile robot of the present invention is in particular a coverage-type robot working in a designated enclosed area, such as a security patrol robot or sweeping robot working indoors, a mowing robot working on a specific lawn area, or a cleaning robot for swimming pools.
In addition, the mobile robot further comprises a power module, a traveling module, a sensor module, a control module, a storage module, and a function module. The power module is a rechargeable battery that supplies power to the mobile robot during operation; the traveling module comprises drive motors and traveling wheels and moves the mobile robot within the working area; the sensor module receives environmental information of the working area and feeds it back to the control module; the control module controls the motion of the mobile robot according to the information fed back by the sensor module; the storage module stores the control program of the mobile robot and the sensor information acquired from outside; the function module refers to the specific function of the mobile robot, such as the cleaning module of a sweeping robot or the rotating blade holder of a mowing robot.
As shown in Fig. 1, the map creation method for the mobile robot of the present invention proceeds as follows.
S1: Establish a rectangular coordinate system in the working area of the mobile robot.
That is, a rectangular coordinate system consisting of an abscissa (X axis) and an ordinate (Y axis) is established in the working area of the mobile robot, preferably taking the working start point, a corner of the working area, or the charging dock as the origin.
S2: The mobile robot walks in a boustrophedon pattern within the working area.
That is, the mobile robot first walks straight in a certain direction until it encounters an obstacle or the boundary of the working area or has walked a predetermined distance, then walks straight for a certain distance in the perpendicular direction, and then walks back in the reverse direction; this process is repeated until coverage of the working area is complete.
As shown in Fig. 2, in an optimized scheme of this step, a rectangular coordinate system is established with the position where the mobile robot starts working as the origin, the direction in which the mobile robot starts walking as the positive X direction, and the left side perpendicular to the robot's heading as the positive Y direction; the boustrophedon walk of the working area is then completed by the following steps in sequence: S201: the mobile robot walks straight in the positive X direction; S202: judging whether the mobile robot encounters an obstacle; if yes, proceed to step S203, otherwise proceed to step S206; S203: the mobile robot walks along the obstacle toward the positive Y direction; it should be noted that walking along the obstacle toward the positive Y direction here means that when the mobile robot encounters an obstacle it walks in the positive Y direction (i.e., the direction of increasing Y coordinate); if the obstacle is arc-shaped or convex, the mobile robot may, after following the obstacle for some time, walk in the negative Y direction (i.e., the direction of decreasing Y coordinate); S204: judging whether the change of the mobile robot's coordinate on the Y axis is greater than threshold M; if yes, proceed to step S208, otherwise proceed to step S205; S205: judging whether the Y coordinate of the mobile robot is the same as that in step S201; if yes, return to step S201, otherwise return to step S203; S206: judging whether the straight-line walking distance of the mobile robot is greater than threshold N; if yes, proceed to step S207, otherwise return to step S201; S207: the mobile robot translates in a straight line by M in the positive Y direction and then proceeds to step S208; S208: the mobile robot walks straight in the negative X direction; S209: judging whether the mobile robot encounters an obstacle; if yes, proceed to step S210, otherwise proceed to step S213; S210: the mobile robot walks along the obstacle toward the positive Y direction; S211: judging whether the change of the mobile robot's coordinate on the Y axis is greater than threshold M; if yes, return to step S201, otherwise proceed to step S212; S212: judging whether the Y coordinate of the mobile robot is the same as that in step S208; if yes, return to step S208, otherwise return to step S210; S213: judging whether the straight-line walking distance of the mobile robot is greater than threshold N; if yes, proceed to step S214, otherwise return to step S208; S214: the mobile robot translates in a straight line by M in the positive Y direction and then returns to step S201.
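Sub-steps S201 to S214 amount to a small state machine. As a rough illustration only (the function name, the obstacle-free assumption, and the boundary handling are ours, not the patent's), the waypoints of such a boustrophedon sweep over an empty rectangular area can be sketched in Python:

```python
def boustrophedon_waypoints(width, height, M, N):
    """Waypoints of a zigzag sweep over an empty width x height rectangle:
    straight runs along X capped at N metres (S206/S213) and lane shifts
    of M metres in the positive Y direction (S207/S214).  Obstacle
    handling (S202-S205, S209-S212) is omitted in this sketch."""
    waypoints = [(0.0, 0.0)]
    x, y, direction = 0.0, 0.0, 1   # direction +1: S201, -1: S208
    while y <= height:
        target_x = width if direction > 0 else 0.0
        while abs(x - target_x) > 1e-9:
            x += direction * min(N, abs(target_x - x))
            waypoints.append((x, y))
            if abs(x - target_x) > 1e-9:   # ran N metres mid-lane:
                y += M                     # shift lane and reverse
                waypoints.append((x, y))
                direction = -direction
                target_x = width if direction > 0 else 0.0
        y += M                             # boundary reached: shift lane
        if y <= height:
            waypoints.append((x, y))
        direction = -direction
    return waypoints
```

For example, a 1 m by 0.5 m area with M = 0.25 m produces the familiar S-shaped lane sequence, while a long 10 m lane with N = 6 m forces a lane shift mid-run.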
S3: Acquire a key-frame picture of the mobile robot at point Pi and save the picture and the coordinates of point Pi.
The mobile robot starts to move and, while walking, continuously takes pictures of the ceiling or the ground through the camera and performs feature matching. Suppose that at point Pi (i is a positive integer) it selects a key-frame picture with distinct features; it then saves the key-frame picture and stores the coordinates of point Pi in the rectangular coordinate system. At this time, since the mobile robot has only just started to move, the odometer and gyroscope data are accurate, so the coordinates of point Pi can be read directly from the odometer and gyroscope and saved. The mobile robot then continues walking, continuously taking pictures of the ceiling or the ground through the camera and performing feature matching. If another key frame with distinct features is selected, it is judged whether the distance (along the X or Y axis) between the coordinates of the point where that key frame was taken and the coordinates Pi of the already stored key-frame picture is greater than a predetermined value (for example, 0.6 m); if yes, the key-frame picture of that point and the corresponding coordinates are saved; otherwise the point is not saved (because saving too many key frames and calibrating too frequently would affect the working efficiency of the mobile robot). By analogy, the saving of the key-frame pictures and the coordinates of the picture acquisition points Pi in the working area is completed.
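The admission rule above (save a new key frame only if it is far enough from those already saved) can be sketched as follows. The 0.6 m figure comes from the text; the function name and the interpretation of "distance along the X or Y axis" as "not too close on both axes" are our assumptions:

```python
def should_save_keyframe(candidate_xy, saved_points, min_spacing=0.6):
    """Key-frame admission rule sketched from step S3: a new distinct
    key frame is saved only if it is at least min_spacing metres away,
    along the X or the Y axis, from every key frame already saved;
    otherwise it is discarded to avoid overly frequent calibration."""
    cx, cy = candidate_xy
    for px, py in saved_points:
        # Too close on both axes means too close overall: reject.
        if abs(cx - px) < min_spacing and abs(cy - py) < min_spacing:
            return False
    return True
```

A candidate 0.2 m past a saved point would be rejected, while one 0.8 m away on either axis would be kept.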
S4: Acquire a picture of the mobile robot at point P'i, where the abscissa or ordinate of point P'i is the same as that of point Pi.
Since the mobile robot moves in a boustrophedon pattern, after encountering an obstacle or the boundary of the working area or walking the predetermined distance, it translates a certain distance and then moves in the reverse direction, and will pass through a point P'i having the same abscissa or ordinate as point Pi; the picture of the mobile robot at point P'i is acquired there. Preferably, the distance between point P'i and point Pi lies within a certain range, such as 0.3 m to 0.6 m, to balance the working efficiency of the mobile robot against the matching accuracy. Because point P'i and point Pi have the same abscissa or ordinate, the two pictures can be matched well at a certain angle, which improves matching efficiency and reduces false matches.
S5: Perform feature extraction and matching on the pictures acquired at points Pi and P'i according to the ORB algorithm.
Feature extraction is performed on the pictures acquired at points Pi and P'i through the ORB algorithm, and feature matching is then performed on the features extracted from the two pictures.
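In practice a library implementation would be used for this step (for example OpenCV's ORB detector together with a brute-force Hamming matcher). Purely to illustrate the matching stage, the following self-contained sketch cross-checks toy binary descriptors by Hamming distance, which is how ORB's 256-bit descriptors are compared; the descriptors here are small integers, not real ORB output:

```python
def hamming(a, b):
    """Hamming distance between two equal-length bit strings stored as ints."""
    return bin(a ^ b).count("1")

def match_descriptors(desc_a, desc_b):
    """Brute-force matcher with cross-check, as commonly used for ORB
    binary descriptors: the pair (i, j) is kept only if j is the nearest
    neighbour of descriptor i in desc_b AND i is the nearest neighbour
    of descriptor j in desc_a.  Returns (i, j, distance) triples."""
    def nearest(d, pool):
        return min(range(len(pool)), key=lambda k: hamming(d, pool[k]))
    matches = []
    for i, d in enumerate(desc_a):
        j = nearest(d, desc_b)
        if nearest(desc_b[j], desc_a) == i:
            matches.append((i, j, hamming(d, desc_b[j])))
    return matches
```

The cross-check discards one-sided matches, which is the main defence against the false matches mentioned above.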
S6: Correct the coordinates of the mobile robot at point P'i and the odometer and/or gyroscope data according to the matching result, and save them.
The actual physical offset and the angular offset are calculated back from the pixel offset between the pictures acquired at points Pi and P'i, and the result is fed back to the mobile robot; the mobile robot then corrects the coordinates of point P'i and the odometer and gyroscope measurement data according to this result.
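The patent does not give the conversion formula. With a ceiling- or ground-facing camera at a known height, a fixed metres-per-pixel scale is a common simplification; under that assumption (the scale, argument names, and degree units are ours), the correction step might look like:

```python
def correct_pose(pose, pixel_shift, angle_offset_deg, metres_per_pixel):
    """Sketch of step S6 under an assumed fixed image scale (camera at a
    known height): convert the pixel translation between the Pi and P'i
    pictures into a physical offset and apply it, together with the
    angular offset, to the odometer/gyroscope pose (x, y, heading_deg)."""
    x, y, heading = pose
    dx_px, dy_px = pixel_shift
    return (x + dx_px * metres_per_pixel,
            y + dy_px * metres_per_pixel,
            (heading + angle_offset_deg) % 360.0)
```

The corrected pose would then be written back so the odometer and gyroscope integrate from the corrected values rather than the drifted ones.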
S7: Repeat steps S3 to S6 until map creation of the working area is complete.
As shown in Fig. 3, the path planning method for the mobile robot of the present invention is based on the map created by the above map creation method.
S'1: The mobile robot moves from the initial position to a point Pi or point P'i.
As shown in Fig. 4, in a specific embodiment of the method of the present invention, step S'1 comprises the following sub-steps: S'11: acquiring the coordinates of the initial position of the mobile robot and the coordinates of the final target point; S'12: judging whether there is a point Pi or point P'i in the rectangular area formed by the coordinates of the initial position and the coordinates of the final target point; if yes, proceed to step S'13, otherwise proceed to step S'14; S'13: the mobile robot moves to the point Pi or point P'i in the rectangular area closest to the initial position, and then proceeds to step S'15; S'14: the mobile robot moves to the point Pi or point P'i in the working area closest to the initial position, and then proceeds to step S'15; S'15: judging whether there is a point Pi or point P'i in the rectangular area formed by the coordinates of the current position and the coordinates of the final target point; if yes, return to step S'13, otherwise move to the final target point.
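Sub-steps S'11 to S'15 describe hopping between saved calibration points on the way to the goal. A minimal sketch under our own naming (points are (x, y) tuples, and the robot is assumed to calibrate at each visited point) is:

```python
def plan_via_points(start, goal, saved_points):
    """Sketch of sub-steps S'11-S'15: hop between saved calibration
    points Pi / P'i on the way to the goal.  At each hop, pick the point
    nearest the current position inside the axis-aligned rectangle
    spanned by the current position and the goal; if the very first
    rectangle is empty, fall back to the nearest saved point anywhere
    in the working area (S'14).  Returns the visited points plus goal."""
    def in_rect(p, a, b):
        (px, py), (ax, ay), (bx, by) = p, a, b
        return min(ax, bx) <= px <= max(ax, bx) and min(ay, by) <= py <= max(ay, by)

    def nearest(p, pool):
        return min(pool, key=lambda q: (q[0] - p[0]) ** 2 + (q[1] - p[1]) ** 2)

    route, current = [], start
    remaining = list(saved_points)
    inside = [p for p in remaining if in_rect(p, current, goal)]
    if not inside and remaining:        # S'12 failed: S'14 fallback
        inside = remaining
    while inside:
        current = nearest(current, inside)   # S'13: move and calibrate here
        route.append(current)
        remaining.remove(current)
        inside = [p for p in remaining if in_rect(p, current, goal)]
    route.append(goal)                  # S'15: rectangle empty, go direct
    return route
```

This reproduces the behaviour of the embodiments below, where the robot visits a chain of Pi / P'i points and finally moves straight to the target once no saved point lies in the remaining rectangle.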
S'2: Correct the odometer and/or gyroscope data at point Pi or point P'i.
As shown in Fig. 5, in a specific embodiment of the present invention, step S'2 comprises the following sub-steps: S'21: acquiring a picture of the mobile robot at point Pi or point P'i; S'22: performing feature extraction on the picture according to the ORB algorithm and matching the features against the picture saved at point Pi or point P'i; S'23: correcting the odometer and/or gyroscope data of the mobile robot at point Pi or point P'i according to the matching result.
S'3: Repeat steps S'1 to S'2 until the robot moves to the final target point.
In the above manner, before moving to the final target point the mobile robot corrects the odometer and gyroscope at least once at a point Pi or point P'i, which eliminates the accumulated errors of the odometer and gyroscope to a certain extent and improves the success rate of the mobile robot reaching the final target point. The final target point in this method includes an area not yet traversed by the mobile robot, the position of the charging dock, and the starting point of the mobile robot.
The following description is given with reference to specific embodiments.
Embodiment 1
The mobile robot in this embodiment is described taking a sweeping robot as an example. Fig. 6 and Fig. 7 are, respectively, a map creation diagram and a path planning diagram of the sweeping robot in a rectangular obstacle-free space.
As shown in Fig. 6, a rectangular coordinate system is established with the working start point A1 of the sweeping robot as the origin, the forward direction of the sweeping robot as the positive X direction, and the direction perpendicular to the left side of the sweeping robot as the positive Y direction. The sweeping robot walks forward to point P1 and then point P2, selects key-frame pictures with distinct features, and saves the pictures and the coordinates of points P1 and P2 (the coordinates of points P1 and P2 are acquired directly from the odometer and gyroscope; since the sweeping robot has only just started walking, the odometer and gyroscope data are accurate), where the distance between the abscissas of points P1 and P2 is greater than a predetermined value (e.g., 0.6 m).
When the sweeping robot moves to point P', it also selects a key-frame picture with distinct features; however, since the distance between point P' and the already saved point P2 is less than 0.6 m, the key-frame picture at point P' and the coordinates of point P' are not saved.
The sweeping robot continues to move, saves a key-frame picture and the coordinates of point P3 at point P3, then moves to point a where it meets the boundary of the working area, then walks a predetermined distance M in the positive Y direction to reach point b, where M is the body width of the sweeping robot (0.25 m in this embodiment), and then walks in the negative X direction, passing in turn through the points having the same abscissa X as points P3, P2, and P1. Since the distance between these points (on line b-c) and the corresponding points (on line A1-a) is less than the preset value (0.4 m in this embodiment), the coordinates or pictures of these points need not be saved or corrected (overly dense picture saving and correction would affect the working efficiency of the sweeping robot).
The sweeping robot continues to move to point c where it meets the boundary of the working area, then walks the predetermined distance M in the positive Y direction to reach point d, and then walks in the positive X direction, passing in turn through the points having the same abscissa X as points P1, P2, and P3. Since the sweeping robot has now translated by two M, i.e., 0.5 m, which is greater than the preset value (0.4 m), the sweeping robot acquires pictures at the corresponding points (points P'1, P'2, P'3), performs feature extraction and matching through the ORB algorithm against the key-frame pictures saved at the points with the same abscissa X (points P1, P2, P3), and then corrects the coordinates of points P'1, P'2, and P'3 and the odometer and/or gyroscope data at points P'1, P'2, and P'3 according to the matching results. It should be noted that, to improve efficiency, while the sweeping robot runs along the same straight line, once the coordinates and the odometer and/or gyroscope data have been corrected successfully at one point, no correction is needed at the other points on that line; that is, once correction succeeds at point P'1, no correction is needed at points P'2 and P'3, and once correction succeeds at point P'2, no correction is needed at point P'3.
The sweeping robot then continues to move in the positive X direction to the edge of the working area, successively selects key-frame pictures with distinct features at their corresponding points (P4, P5, P6, P7) and saves them, and then performs coordinate correction and odometer and gyroscope data correction at the corresponding points (P'4, P'5, P'6, P'7).
And so on, until map creation of the rectangular area is complete.
As shown in Fig. 7, when the sweeping robot moves to point A2, map creation of the rectangular area is complete, and the sweeping robot begins to look for the final target point to return to, which in this embodiment is the starting work point A1 of the sweeping robot. Since there is no point Pi or point P'i in the rectangular area formed by the coordinates of points A1 and A2, the sweeping robot moves to point P'7 and then corrects the odometer and/or gyroscope data at point P'7. Next, the sweeping robot moves from point P'7 toward the final target point A1; since point P7 lies in the rectangular area formed by the coordinates of point P'7 and point A1 (the rectangle drawn in dashed lines), the sweeping robot moves to point P7 and then corrects the odometer and/or gyroscope data at point P7. The sweeping robot then moves from point P7 toward the final target point A1; since there is no point Pi or point P'i in the rectangular area formed by the coordinates of points P7 and A1, the sweeping robot moves directly from point P7 to the final target point A1. Its walking path is shown by the arrows in the figure.
Embodiment 2
The mobile robot in this embodiment is also described taking a sweeping robot as an example. Fig. 8 and Fig. 9 are, respectively, a map creation diagram and a path planning diagram of the sweeping robot in a rectangular working area containing a rectangular obstacle B1 and an elliptical obstacle B2.
As shown in Fig. 8, a rectangular coordinate system is established with the working start point A1 of the sweeping robot as the origin, the forward direction of the sweeping robot as the positive X direction, and the direction perpendicular to the left side of the sweeping robot as the positive Y direction. The sweeping robot walks forward to point P1 and then point P2, selects key-frame pictures with distinct features, and saves the pictures and the coordinates of points P1 and P2 (the coordinates of points P1 and P2 are acquired directly from the odometer and gyroscope; since the sweeping robot has only just started walking, the odometer and gyroscope data are accurate), where the distance between the abscissas of points P1 and P2 is greater than a predetermined value (e.g., 0.6 m).
When the sweeping robot moves to point a, it encounters obstacle B1, then walks a distance M (0.25 m in this embodiment) in the positive Y direction to reach point b, and then walks in the negative X direction, passing in turn through the points having the same abscissa X as points P2 and P1. Since the distance between these points (on line b-c) and the corresponding points (on line A1-a) is less than the preset value (0.4 m in this embodiment), the coordinates or pictures of these points need not be saved or corrected (overly dense picture saving and correction would affect the working efficiency of the sweeping robot).
The sweeping robot continues to move to point c where it meets the boundary of the working area, then walks the predetermined distance M in the positive Y direction to reach point d, and then walks in the positive X direction, passing in turn through the points having the same abscissa X as points P1 and P2. Since the sweeping robot has now translated by two M, i.e., 0.5 m, which is greater than the preset value (0.4 m), the sweeping robot acquires pictures at the corresponding points (points P'1, P'2), performs feature extraction and matching through the ORB algorithm against the key-frame pictures saved at the points with the same abscissa X (points P1, P2), and then corrects the coordinates of points P'1 and P'2 and the odometer and/or gyroscope data at points P'1 and P'2 according to the matching results. When the sweeping robot moves to point e, it encounters obstacle B1 again; after moving around obstacle B1 to point f, since the change in the Y direction between points e and f is less than M (0.25 m), the sweeping robot does not turn to walk in the negative X direction but continues to move along obstacle B1 until point h. At this time, since the ordinate of point h is the same as that of point d, the sweeping robot stops moving along obstacle B1 and moves in the positive X direction. When the sweeping robot moves to point i, since the distance between points d and i has reached 6 m (N is 6 m in this embodiment), the sweeping robot walks a distance M in the positive Y direction to reach point j and then moves in the negative X direction.
And so on: until the robot moves to point A2, map creation of the traversed area is performed in the manner of Embodiment 1. The sweeping robot then moves to the untraversed area and performs map creation in the above manner until map creation of the entire working area is complete.
This embodiment differs from Embodiment 1 in the following respects:
1. After the sweeping robot runs a predetermined distance in the X direction, it automatically translates by distance M in the positive Y direction and then walks back along the X axis in the opposite direction;
2. The sweeping robot saves the key-frame picture at point Pi and the corresponding coordinates after running the predetermined distance; that is, adjacent points Pi are equally spaced along the abscissa.
As shown in Fig. 9, when the sweeping robot moves to point A2, partial map creation of the rectangular area is complete, and the sweeping robot begins to look for the final target point to return to, which in this embodiment is point A3 in the uncovered area. Since the rectangular area formed by the coordinates of points A1 and A3 (the dashed box in the figure) contains points P8, P9, P10, P11, and P12, the sweeping robot moves to point P12, the point closest to point A2, and then corrects the odometer and/or gyroscope data at point P12.
By analogy, the sweeping robot passes in turn through points P11, P10, P9, and P8, correcting the odometer and/or gyroscope data at each. After the sweeping robot reaches point P8, since there is no point Pi or point P'i in the rectangular area formed by point P8 and the final target point A3, the sweeping robot moves directly from point P8 to the final target point A3. Its walking path is shown by the arrows in the figure.
The above discloses only preferred embodiments of the present invention, which of course cannot be used to limit the scope of rights of the present invention; equivalent changes made according to the claims of the present invention therefore still fall within the scope covered by the present invention.

Claims (14)

  1. A map creation method for a mobile robot, the mobile robot comprising an odometer, a gyroscope, and a camera, characterized by comprising the following steps:
    S1: establishing a rectangular coordinate system in the working area of the mobile robot;
    S2: the mobile robot walks in a boustrophedon pattern within the working area;
    S3: acquiring a key-frame picture of the mobile robot at point Pi and saving the picture and the coordinates of point Pi;
    S4: acquiring a picture of the mobile robot at point P'i, wherein the abscissa or ordinate of point P'i is the same as that of point Pi;
    S5: performing feature extraction and matching on the pictures acquired at points Pi and P'i according to the ORB algorithm;
    S6: correcting the coordinates of the mobile robot at point P'i and the odometer and/or gyroscope data according to the matching result and saving them;
    S7: repeating steps S3 to S6 until map creation of the working area is complete.
  2. The map creation method for a mobile robot according to claim 1, characterized in that step S2 comprises the following sub-steps:
    S201: the mobile robot walks straight in the positive X direction;
    S202: judging whether the mobile robot encounters an obstacle; if yes, proceed to step S203, otherwise proceed to step S206;
    S203: the mobile robot walks along the obstacle toward the positive Y direction;
    S204: judging whether the change of the mobile robot's coordinate on the Y axis is greater than threshold M; if yes, proceed to step S208, otherwise proceed to step S205;
    S205: judging whether the Y coordinate of the mobile robot is the same as that in step S201; if yes, return to step S201, otherwise return to step S203;
    S206: judging whether the straight-line walking distance of the mobile robot is greater than threshold N; if yes, proceed to step S207, otherwise return to step S201;
    S207: the mobile robot translates in a straight line by M in the positive Y direction and then proceeds to step S208;
    S208: the mobile robot walks straight in the negative X direction;
    S209: judging whether the mobile robot encounters an obstacle; if yes, proceed to step S210, otherwise proceed to step S213;
    S210: the mobile robot walks along the obstacle toward the positive Y direction;
    S211: judging whether the change of the mobile robot's coordinate on the Y axis is greater than threshold M; if yes, return to step S201, otherwise proceed to step S212;
    S212: judging whether the Y coordinate of the mobile robot is the same as that in step S208; if yes, return to step S208, otherwise return to step S210;
    S213: judging whether the straight-line walking distance of the mobile robot is greater than threshold N; if yes, proceed to step S214, otherwise return to step S208;
    S214: the mobile robot translates in a straight line by M in the positive Y direction and then returns to step S201.
  3. The map creation method for a mobile robot according to claim 2, characterized in that the mobile robot is a coverage-type mobile robot, M ranges from 0.2 m to 0.3 m, and N ranges from 6 m to 8 m.
  4. The map creation method for a mobile robot according to claim 1, characterized in that in step S3 the mobile robot saves a key-frame picture and the coordinates of its corresponding point Pi every predetermined distance.
  5. The map creation method for a mobile robot according to claim 1, characterized in that the minimum distance between adjacent key-frame picture acquisition points Pi is 0.6 m.
  6. The map creation method for a mobile robot according to claim 1, characterized in that the distance between point Pi and point P'i having the same X or Y coordinate is 0.3 m to 0.6 m.
  7. The map creation method for a mobile robot according to claim 1, characterized in that the picture is a picture of the ceiling or the ground of the working environment of the mobile robot.
  8. A path planning method for a mobile robot, characterized in that the method is based on a map created by the method of any one of claims 1 to 7 and comprises the following steps:
    S'1: the mobile robot moves from an initial position to a point Pi or point P'i;
    S'2: correcting the odometer and/or gyroscope data at point Pi or point P'i;
    S'3: repeating steps S'1 to S'2 until the robot moves to a final target point.
  9. The path planning method for a mobile robot according to claim 8, characterized in that step S'1 comprises the following sub-steps:
    S'11: acquiring the coordinates of the initial position of the mobile robot and the coordinates of the final target point;
    S'12: judging whether there is a point Pi or point P'i in the rectangular area formed by the coordinates of the initial position and the coordinates of the final target point; if yes, proceed to step S'13, otherwise proceed to step S'14;
    S'13: the mobile robot moves to the point Pi or point P'i in the rectangular area closest to the initial position, and then proceeds to step S'15;
    S'14: the mobile robot moves to the point Pi or point P'i in the working area closest to the initial position, and then proceeds to step S'15;
    S'15: judging whether there is a point Pi or point P'i in the rectangular area formed by the coordinates of the current position and the coordinates of the final target point; if yes, return to step S'13, otherwise move to the final target point.
  10. The path planning method for a mobile robot according to claim 8, characterized in that step S'2 comprises the following sub-steps:
    S'21: acquiring a picture of the mobile robot at point Pi or point P'i;
    S'22: performing feature extraction on the picture according to the ORB algorithm and matching the features against the picture saved at point Pi or point P'i;
    S'23: correcting the odometer and/or gyroscope data of the mobile robot at point Pi or point P'i according to the matching result.
  11. The path planning method for a mobile robot according to claim 8 or 9, characterized in that the final target point includes an area not yet traversed by the mobile robot, the position of the charging dock, and the starting point of the mobile robot.
  12. A map creation method for a mobile robot, the mobile robot comprising an odometer, a gyroscope, and a camera, characterized by comprising the following steps:
    S1: establishing a rectangular coordinate system in the working area of the mobile robot;
    S2: the mobile robot walks in a boustrophedon pattern within the working area;
    S3: acquiring a key-frame picture of the mobile robot at point Pi and saving the picture and the coordinates of point Pi;
    S4: acquiring a picture of the mobile robot at point P'i, wherein the abscissa or ordinate of point P'i is the same as that of point Pi;
    S5: performing feature extraction and matching on the pictures acquired at points Pi and P'i according to the ORB algorithm;
    S6: correcting the coordinates of the mobile robot at point P'i and the odometer and/or gyroscope data according to the matching result and saving them;
    S7: repeating steps S3 to S6 until map creation of the working area is complete;
    wherein step S2 comprises the following sub-steps:
    S201: the mobile robot walks straight in the positive X direction;
    S202: judging whether the mobile robot encounters an obstacle; if yes, proceed to step S203, otherwise proceed to step S206;
    S203: the mobile robot walks along the obstacle toward the positive Y direction;
    S204: judging whether the change of the mobile robot's coordinate on the Y axis is greater than threshold M; if yes, proceed to step S208, otherwise proceed to step S205;
    S205: judging whether the Y coordinate of the mobile robot is the same as that in step S201; if yes, return to step S201, otherwise return to step S203;
    S206: judging whether the straight-line walking distance of the mobile robot is greater than threshold N; if yes, proceed to step S207, otherwise return to step S201;
    S207: the mobile robot translates in a straight line by M in the positive Y direction and then proceeds to step S208;
    S208: the mobile robot walks straight in the negative X direction;
    S209: judging whether the mobile robot encounters an obstacle; if yes, proceed to step S210, otherwise proceed to step S213;
    S210: the mobile robot walks along the obstacle toward the positive Y direction;
    S211: judging whether the change of the mobile robot's coordinate on the Y axis is greater than threshold M; if yes, return to step S201, otherwise proceed to step S212;
    S212: judging whether the Y coordinate of the mobile robot is the same as that in step S208; if yes, return to step S208, otherwise return to step S210;
    S213: judging whether the straight-line walking distance of the mobile robot is greater than threshold N; if yes, proceed to step S214, otherwise return to step S208;
    S214: the mobile robot translates in a straight line by M in the positive Y direction and then returns to step S201;
    the mobile robot is a coverage-type mobile robot, M ranges from 0.2 m to 0.3 m, and N ranges from 6 m to 8 m;
    in step S3, the mobile robot saves a key-frame picture and the coordinates of its corresponding point Pi every predetermined distance;
    the picture is a picture of the ceiling or the ground of the working environment of the mobile robot;
    wherein step S6 comprises calculating the actual physical offset and the angular offset back from the pixel offset between the pictures acquired at points Pi and P'i, feeding the result back to the mobile robot, and the mobile robot then correcting the coordinates of point P'i and the odometer and gyroscope measurement data according to this result.
  13. A path planning method for a mobile robot, characterized in that the method is based on a map created by the method of claim 12 and comprises the following steps:
    S'1: the mobile robot moves from an initial position to a point Pi or point P'i;
    S'2: correcting the odometer and/or gyroscope data at point Pi or point P'i;
    S'3: repeating steps S'1 to S'2 until the robot moves to a final target point;
    wherein step S'1 comprises the following sub-steps:
    S'11: acquiring the coordinates of the initial position of the mobile robot and the coordinates of the final target point;
    S'12: judging whether there is a point Pi or point P'i in the rectangular area formed by the coordinates of the initial position and the coordinates of the final target point; if yes, proceed to step S'13, otherwise proceed to step S'14;
    S'13: the mobile robot moves to the point Pi or point P'i in the rectangular area closest to the initial position, and then proceeds to step S'15;
    S'14: the mobile robot moves to the point Pi or point P'i in the working area closest to the initial position, and then proceeds to step S'15;
    S'15: judging whether there is a point Pi or point P'i in the rectangular area formed by the coordinates of the current position and the coordinates of the final target point; if yes, return to step S'13, otherwise move to the final target point;
    the final target point includes an area not yet traversed by the mobile robot, the position of the charging dock, and the starting point of the mobile robot.
  14. A path planning method for a mobile robot, characterized in that the method is based on a map created by the method of claim 12 and comprises the following steps:
    S'1: the mobile robot moves from an initial position to a point Pi or point P'i;
    S'2: correcting the odometer and/or gyroscope data at point Pi or point P'i;
    S'3: repeating steps S'1 to S'2 until the robot moves to a final target point;
    wherein step S'2 comprises the following sub-steps:
    S'21: acquiring a picture of the mobile robot at point Pi or point P'i;
    S'22: performing feature extraction on the picture according to the ORB algorithm and matching the features against the picture saved at point Pi or point P'i;
    S'23: correcting the odometer and/or gyroscope data of the mobile robot at point Pi or point P'i according to the matching result.
PCT/CN2018/073930 2017-08-18 2018-01-24 Map creation method for mobile robot and path planning method based on the map WO2019033712A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP18846071.1A EP3671121A4 (en) 2017-08-18 2018-01-24 MOBILE ROBOT MAP CREATION PROCESS AND MAP-BASED JOURNEY PLANNING METHOD
JP2019569913A JP6915209B2 (ja) 2017-08-18 2018-01-24 移動ロボットの地図作成方法および当該地図に基づく経路計画方法
US16/712,977 US11385062B2 (en) 2017-08-18 2019-12-12 Map creation method for mobile robot and path planning method based on the map
US17/533,087 US20220082391A1 (en) 2017-08-18 2021-11-22 Map creation method for mobile robot and path planning method based on the map

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710713711.7 2017-08-18
CN201710713711.7A CN107314773B (zh) 2017-08-18 2017-08-18 移动机器人的地图创建方法及基于该地图的路径规划方法

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/712,977 Continuation US11385062B2 (en) 2017-08-18 2019-12-12 Map creation method for mobile robot and path planning method based on the map

Publications (1)

Publication Number Publication Date
WO2019033712A1 true WO2019033712A1 (zh) 2019-02-21

Family

ID=60176243

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/073930 WO2019033712A1 (zh) 2017-08-18 2018-01-24 移动机器人的地图创建方法及基于该地图的路径规划方法

Country Status (5)

Country Link
US (2) US11385062B2 (zh)
EP (1) EP3671121A4 (zh)
JP (1) JP6915209B2 (zh)
CN (1) CN107314773B (zh)
WO (1) WO2019033712A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109682382A (zh) * 2019-02-28 2019-04-26 电子科技大学 基于自适应蒙特卡洛和特征匹配的全局融合定位方法

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107314773B (zh) 2017-08-18 2019-10-01 广东宝乐机器人股份有限公司 移动机器人的地图创建方法及基于该地图的路径规划方法
CN107976998A (zh) * 2017-11-13 2018-05-01 河海大学常州校区 一种割草机器人地图创建与路径规划系统及方法
CN107807650A (zh) * 2017-11-29 2018-03-16 莱克电气股份有限公司 一种机器人的运动控制方法、装置及设备
CN108153301B (zh) * 2017-12-07 2021-02-09 深圳市杰思谷科技有限公司 一种基于极坐标的智能避障系统
CN108873880A (zh) * 2017-12-11 2018-11-23 北京石头世纪科技有限公司 智能移动设备及其路径规划方法、计算机可读存储介质
CN109974746B (zh) * 2017-12-27 2020-11-24 深圳市优必选科技有限公司 全向轮里程校准方法及机器人
CN108469826B (zh) * 2018-04-23 2021-06-08 宁波Gqy视讯股份有限公司 一种基于机器人的地图生成方法及系统
CN108873889A (zh) * 2018-05-14 2018-11-23 北京石头世纪科技有限公司 智能移动设备及其路径控制方法、计算机可读存储介质
CN109141395B (zh) * 2018-07-10 2020-06-09 深圳市无限动力发展有限公司 一种基于视觉回环校准陀螺仪的扫地机定位方法及装置
GB2576494B (en) * 2018-08-06 2022-03-23 Dyson Technology Ltd A mobile robot and method of controlling thereof
CN108937742A (zh) * 2018-09-06 2018-12-07 苏州领贝智能科技有限公司 一种扫地机的陀螺仪角度修正方法和扫地机
CN110174892B (zh) * 2019-04-08 2022-07-22 阿波罗智能技术(北京)有限公司 车辆朝向的处理方法、装置、设备及计算机可读存储介质
CN110207707B (zh) * 2019-05-30 2022-04-12 四川长虹电器股份有限公司 基于粒子滤波器的快速初始定位方法及机器人设备
CN110597249B (zh) * 2019-08-23 2022-08-05 深圳市优必选科技股份有限公司 一种机器人及其回充定位方法和装置
CN112783147A (zh) * 2019-11-11 2021-05-11 科沃斯机器人股份有限公司 一种轨迹规划方法、装置、机器人及存储介质
CN111085998B (zh) * 2019-12-17 2021-11-09 珠海市一微半导体有限公司 机器人记录运动轨迹的方法和显示机器人运动轨迹的方法
CN111329399B (zh) * 2020-04-09 2021-09-10 湖南格兰博智能科技有限责任公司 一种基于有限状态机的扫地机目标点导航方法
CN113899376B (zh) * 2020-07-06 2023-10-20 苏州宝时得电动工具有限公司 自移动设备地图生成方法、系统和自动工作系统
CN114111779A (zh) * 2020-08-26 2022-03-01 深圳市杉川机器人有限公司 一种建立工作区域地图的方法及自移动设备
CN112200850B (zh) * 2020-10-16 2022-10-04 河南大学 一种基于成熟特征点的orb提取方法
CN112352530B (zh) * 2020-10-27 2022-05-31 懿力创新(厦门)科技有限公司 一种自动除草机器人的工作路径优化方法
CN112433537B (zh) * 2020-11-11 2022-09-16 广西电网有限责任公司电力科学研究院 一种输电线路铁塔组立施工的可视化监督方法及系统
CN113589806A (zh) * 2021-07-21 2021-11-02 珠海一微半导体股份有限公司 机器人弓字型行走时的策略控制方法
CN113568435B (zh) * 2021-09-24 2021-12-24 深圳火眼智能有限公司 一种基于无人机自主飞行态势感知趋势的分析方法与系统
WO2023155160A1 (en) * 2022-02-18 2023-08-24 Beijing Smorobot Technology Co., Ltd Swimming pool map boundary construction and swimming pool cleaning methods and apparatuses, and electronic device

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102138769A (zh) * 2010-01-28 2011-08-03 深圳先进技术研究院 清洁机器人及其清扫方法
CN104390643A (zh) * 2014-11-24 2015-03-04 上海美琦浦悦通讯科技有限公司 基于多信息融合实现室内定位的方法
US9126338B2 (en) * 2012-08-16 2015-09-08 Hanwha Techwin Co., Ltd. Robot system and method for driving the same
CN106020207A (zh) * 2016-07-26 2016-10-12 广东宝乐机器人股份有限公司 自移动机器人行走方法与装置
CN106355197A (zh) * 2016-08-24 2017-01-25 广东宝乐机器人股份有限公司 基于K‑means聚类算法的导航图像匹配过滤方法
CN106708048A (zh) * 2016-12-22 2017-05-24 清华大学 机器人的天花板图像定位方法和系统
CN106767833A (zh) * 2017-01-22 2017-05-31 电子科技大学 一种融合rgbd深度传感器与编码器的机器人定位方法
CN107314773A (zh) * 2017-08-18 2017-11-03 广东宝乐机器人股份有限公司 移动机器人的地图创建方法及基于该地图的路径规划方法

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5610815A (en) * 1989-12-11 1997-03-11 Caterpillar Inc. Integrated vehicle positioning and navigation system, apparatus and method
AU2002304133A1 (en) * 2002-05-31 2003-12-19 Fujitsu Limited Remotely-operated robot, and robot self position identifying method
CN103251355A (zh) * 2012-02-16 2013-08-21 恩斯迈电子(深圳)有限公司 扫地机器人与充电系统
CN102866706B (zh) * 2012-09-13 2015-03-25 深圳市银星智能科技股份有限公司 一种采用智能手机导航的清扫机器人及其导航清扫方法
CN104972462B (zh) * 2014-04-14 2017-04-19 科沃斯机器人股份有限公司 自移动机器人避障行走方法
US9969337B2 (en) * 2014-09-03 2018-05-15 Sharp Laboratories Of America, Inc. Methods and systems for mobile-agent navigation
WO2016095965A2 (en) * 2014-12-16 2016-06-23 Aktiebolaget Electrolux Experience-based roadmap for a robotic cleaning device
US10518407B2 (en) * 2015-01-06 2019-12-31 Discovery Robotics Apparatus and methods for providing a reconfigurable robotic platform
CN105856230B (zh) * 2016-05-06 2017-11-24 简燕梅 一种可提高机器人位姿一致性的orb关键帧闭环检测slam方法
US20180348783A1 (en) * 2017-05-31 2018-12-06 Neato Robotics, Inc. Asynchronous image classification

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102138769A (zh) * 2010-01-28 2011-08-03 深圳先进技术研究院 清洁机器人及其清扫方法
US9126338B2 (en) * 2012-08-16 2015-09-08 Hanwha Techwin Co., Ltd. Robot system and method for driving the same
CN104390643A (zh) * 2014-11-24 2015-03-04 上海美琦浦悦通讯科技有限公司 基于多信息融合实现室内定位的方法
CN106020207A (zh) * 2016-07-26 2016-10-12 广东宝乐机器人股份有限公司 自移动机器人行走方法与装置
CN106355197A (zh) * 2016-08-24 2017-01-25 广东宝乐机器人股份有限公司 基于K‑means聚类算法的导航图像匹配过滤方法
CN106708048A (zh) * 2016-12-22 2017-05-24 清华大学 机器人的天花板图像定位方法和系统
CN106767833A (zh) * 2017-01-22 2017-05-31 电子科技大学 一种融合rgbd深度传感器与编码器的机器人定位方法
CN107314773A (zh) * 2017-08-18 2017-11-03 广东宝乐机器人股份有限公司 移动机器人的地图创建方法及基于该地图的路径规划方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3671121A4 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109682382A (zh) * 2019-02-28 2019-04-26 电子科技大学 基于自适应蒙特卡洛和特征匹配的全局融合定位方法

Also Published As

Publication number Publication date
EP3671121A1 (en) 2020-06-24
EP3671121A4 (en) 2020-12-02
US20200116501A1 (en) 2020-04-16
CN107314773B (zh) 2019-10-01
JP6915209B2 (ja) 2021-08-04
JP2020529061A (ja) 2020-10-01
US11385062B2 (en) 2022-07-12
US20220082391A1 (en) 2022-03-17
CN107314773A (zh) 2017-11-03

Similar Documents

Publication Publication Date Title
WO2019033712A1 (zh) 移动机器人的地图创建方法及基于该地图的路径规划方法
CN107796397B (zh) 一种机器人双目视觉定位方法、装置和存储介质
US11204247B2 (en) Method for updating a map and mobile robot
CN108873908B (zh) 基于视觉slam和网络地图结合的机器人城市导航系统
WO2018196450A1 (zh) 移动机器人在工作区域内的角度修正方法及移动机器人
Georgiev et al. Localization methods for a mobile robot in urban environments
CN103954275B (zh) 基于车道线检测和gis地图信息开发的视觉导航方法
Wang et al. Intelligent vehicle self-localization based on double-layer features and multilayer LIDAR
US20210229280A1 (en) Positioning method and device, path determination method and device, robot and storage medium
CN106840148A (zh) 室外作业环境下基于双目摄像机的可穿戴式定位与路径引导方法
CN105606102B (zh) 一种基于格网模型的pdr室内定位方法及系统
CN107167826B (zh) 一种自动驾驶中基于可变网格的图像特征检测的车辆纵向定位系统及方法
CN110187348A (zh) 一种激光雷达定位的方法
CN109460032A (zh) 一种基于激光对射的定位方法以及机器人自主充电方法
US20210372798A1 (en) Visual navigation method and system for mobile devices based on qr code signposts
JP2011150473A (ja) 自律型移動体
JP5147129B2 (ja) 自律型移動体
CN112729321A (zh) 一种机器人的扫图方法、装置、存储介质和机器人
CN115290097B (zh) 基于bim的实时精确地图构建方法、终端及存储介质
CN110597265A (zh) 一种扫地机器人回充方法和装置
JP6801269B2 (ja) 自律移動装置
CN112484746A (zh) 一种基于地平面的单目视觉辅助激光雷达里程计方法
CN208289901U (zh) 一种增强视觉的定位装置及机器人
WO2022179519A1 (zh) 基于地面纹理信息的地图构建方法和系统及移动机器人
CN112612034B (zh) 基于激光帧和概率地图扫描的位姿匹配方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18846071

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019569913

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018846071

Country of ref document: EP

Effective date: 20200318