WO2017038012A1 - Mapping method, localization method, robot system, and robot - Google Patents

Mapping method, localization method, robot system, and robot

Info

Publication number
WO2017038012A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
map information
map
mobile robot
location
Prior art date
Application number
PCT/JP2016/003634
Other languages
English (en)
Japanese (ja)
Inventor
洋平 中田
一真 竹内
齊藤 雅彦
原田 尚幸
修平 松井
健介 若杉
Original Assignee
Panasonic Intellectual Property Corporation of America
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2016134284A (JP6849330B2)
Application filed by Panasonic Intellectual Property Corporation of America
Priority to EP16841053.8A (EP3343307B1)
Priority to CN201680003184.4A (CN106796434B)
Publication of WO2017038012A1
Priority to US15/825,159 (US10549430B2)


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J13/08 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/028 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using a RF signal
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/206 Instruments for performing navigational calculations specially adapted for indoor navigation

Definitions

  • The present disclosure relates to a technique for estimating the position of a mobile robot and creating an environment map, and more particularly to a mobile robot system that cooperates with environment sensors installed in the environment.
  • SLAM (Simultaneous Localization And Mapping)
  • The mobile robot in SLAM technology includes an internal sensor, which is a sensor for knowing the internal state of the mobile robot, and an external sensor, which is a sensor for detecting the external state.
  • In SLAM, the current position and orientation are first estimated from the internal sensor information.
  • From this, a prediction is made of (i) the position, (ii) the posture, (iii) the position uncertainty, and (iv) the posture uncertainty.
  • Next, the observation information obtained by the external sensor is compared with the information predicted from the internal sensor information.
  • The weights of the internal sensor information and the external sensor information are determined from the likelihood of each piece of information calculated in this comparison.
  • Estimation of the position and of the map information is then performed using these weights on the internal sensor and external sensor data.
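The predict / compare / weight cycle described above can be sketched, purely as an illustration and not as part of the patent disclosure, as a one-dimensional Kalman filter in which the gain plays the role of the weight between internal and external sensor information. All numeric values below are assumptions.

```python
# Sketch of the predict/weight/update cycle as a 1-D Kalman filter.

def predict(x, var, u, motion_var):
    """Predict position from the internal sensor (odometry): the robot
    moved by u, which adds motion uncertainty to the estimate."""
    return x + u, var + motion_var

def update(x, var, z, obs_var):
    """Correct the prediction with the external sensor observation z.
    The gain k weights the two sources by their uncertainties."""
    k = var / (var + obs_var)          # weight given to the external sensor
    x_new = x + k * (z - x)            # blend prediction and observation
    var_new = (1.0 - k) * var          # uncertainty shrinks after fusion
    return x_new, var_new

x, var = 0.0, 1.0                      # initial position and variance
x, var = predict(x, var, u=1.0, motion_var=0.5)   # odometry reports +1.0 m
x, var = update(x, var, z=1.2, obs_var=0.5)       # range sensor reports 1.2 m
```

With these illustrative numbers the fused estimate lands between the odometry prediction (1.0) and the observation (1.2), closer to whichever source has the smaller variance.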
  • Patent Document 1 discloses a robot system that evaluates the reliability of a plurality of estimators that perform self-position estimation based on the outputs of a plurality of different sensors, and integrates the self-position estimation information from the plurality of estimators according to that reliability.
  • Patent Document 2 discloses a mobile robot that moves while avoiding a collision with a person or an obstacle without interfering with a human action in cooperation with an environmental sensor.
  • In Patent Document 2, a plurality of environmental sensors are arranged so as to detect the positions of all persons present in the environment.
  • JP 2012-248032 A; Japanese Patent No. 5617562
  • According to one aspect of the present disclosure, there is provided a map generation method for a mobile robot that performs map generation using at least one environmental sensor node, comprising: obtaining pre-created first map information including information on the surroundings of the mobile robot; obtaining second map information including information on the surroundings of the mobile robot with an external sensor mounted on the mobile robot; and receiving, from the environmental sensor node, third map information including information on the surroundings of the mobile robot. (i) When the third map information contains information on a location whose temporal change amount is equal to or greater than a predetermined threshold, a removal process is performed that removes, from the first map information and from the second map information, the location information corresponding to the location whose temporal change amount in the third map information is equal to or greater than the predetermined threshold, or a variance increasing process is performed that increases the variance of that location information in the first map information and in the second map information.
  • (ii) The first map information and the second map information after the removal process or the variance increasing process are matched, and map information is generated based on the matching result. (iii) The first map information is updated to the map information generated based on the matching result.
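As a rough illustration of steps (i) and (ii), not part of the disclosure itself, the removal process and the subsequent matching can be sketched over 2-D point sets. The point representation, the removal radius, and the matching tolerance are all assumptions chosen for clarity.

```python
# Illustrative sketch of steps (i)-(ii): remove locations flagged as
# dynamic by the environmental sensor node, then match the point sets.

def remove_dynamic(points, dynamic_spots, radius=0.5):
    """Removal process: drop every point within `radius` of a location
    whose temporal change amount exceeded the threshold (third map info)."""
    def near_dynamic(p):
        return any((p[0] - d[0])**2 + (p[1] - d[1])**2 <= radius**2
                   for d in dynamic_spots)
    return [p for p in points if not near_dynamic(p)]

def match_score(map_a, map_b, tol=0.3):
    """Crude matching: fraction of points in map_b that have a
    counterpart in map_a within `tol`."""
    if not map_b:
        return 0.0
    hits = sum(1 for q in map_b
               if any((q[0] - p[0])**2 + (q[1] - p[1])**2 <= tol**2
                      for p in map_a))
    return hits / len(map_b)

first_map  = [(0, 0), (1, 0), (2, 0), (5, 5)]   # pre-created map; a person once stood at (5, 5)
second_map = [(0, 0), (1, 0), (2, 0)]           # current external-sensor scan
dynamic    = [(5, 5)]                           # location flagged by the sensor node

cleaned = remove_dynamic(first_map, dynamic)
score = match_score(cleaned, second_map)
```

Without the removal step the stale point at (5, 5) would have no counterpart in the scan and would drag the matching score down; removing it lets the static structure match cleanly.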
  • FIG. 1 is a block diagram illustrating an example of the configuration of the robot system according to the first embodiment.
  • FIG. 2 is a schematic diagram illustrating an example of an environment in which an environment sensor node and a mobile robot exist.
  • FIG. 3A is a flowchart illustrating an example of processing of the robot system.
  • FIG. 3B is a flowchart illustrating an example of processing of the moving object movement / environment change information integration unit.
  • FIG. 4A is a diagram illustrating an example of an environment including a mobile robot at time T.
  • FIG. 4B is an illustrative view showing an example of an environment including a mobile robot at time T + ⁇ .
  • FIG. 4C is a diagram illustrating an example of sensor data acquired by the mobile robot at time T.
  • FIG. 4D is a diagram illustrating an example of sensor data acquired by the mobile robot at time T + ⁇ .
  • FIG. 5A is an illustrative view showing one example of data matching at the time of map generation / self-position estimation processing.
  • FIG. 5B is a diagram illustrating an example of matching results.
  • FIG. 5C is a diagram illustrating an example of matching evaluation values.
  • FIG. 6A is a diagram illustrating an example of data removal processing.
  • FIG. 6B is a diagram illustrating an example of a matching result of data after the removal process.
  • FIG. 6C is a diagram illustrating an example of matching evaluation values of data after the removal process.
  • FIG. 7A is a diagram illustrating an example of the variance increasing process for data.
  • FIG. 7B is a diagram illustrating an example of a matching result of data after the variance increasing process.
  • FIG. 7C is a diagram illustrating an example of matching evaluation values of data after the variance increasing process.
  • FIG. 8 is a diagram showing map data at time T.
  • FIG. 9 is a diagram for explaining an example of the variance increasing process.
  • FIG. 10 is a diagram illustrating an example of a result of the variance increasing process.
  • FIG. 11 is a flowchart illustrating an example of processing of the robot system according to the second embodiment.
  • In conventional methods such as those of Patent Document 1 and Patent Document 2, no consideration was given to performing map generation and self-position estimation with high accuracy and robustness even in a space with large environmental changes. For this reason, the increase in the amount of calculation required could not be suppressed.
  • The present disclosure provides a self-position estimation method, a robot system, and a mobile robot that perform map generation and self-position estimation with high accuracy and robustness even in a space with large environmental changes.
  • The map generation method according to the present disclosure is a map generation method for a mobile robot that performs map generation using at least one environment sensor node. Pre-created first map information including information on the surroundings of the mobile robot is acquired; second map information including information on the surroundings of the mobile robot is acquired by an external sensor mounted on the mobile robot; and third map information including information on the surroundings of the mobile robot is received from the environment sensor node.
  • (i) When the third map information contains information on a location whose temporal change amount is equal to or greater than a predetermined threshold, a removal process is performed that removes, from the first map information and from the second map information, the location information corresponding to the location whose temporal change amount in the third map information is equal to or greater than the predetermined threshold, or a variance increasing process is performed that increases the variance of that location information in the first map information and in the second map information. (ii) The first map information and the second map information after the removal process or the variance increasing process are matched, and map information is generated based on the matching result. (iii) The first map information is updated to the map information generated based on the matching result.
  • When the third map information contains information on a location whose temporal change amount between a first timing and a second timing is equal to or greater than the predetermined threshold, and the time difference between the first timing and the second timing is equal to or less than a first time, the first map information and the second map information may be matched without performing the removal process or the variance increasing process, and map information may be generated based on the matching result.
  • The variance increasing process may be a process of increasing the uncertainty of the information on the corresponding location in the first map information and on the corresponding location in the second map information.
  • The third map information may include information on the existence probability of objects around the environmental sensor node, and the temporal change amount may be the change amount of the existence probability.
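The "temporal change amount of the existence probability" can be sketched as follows; this is an illustrative interpretation, not part of the disclosure, and the cell layout and threshold value are assumptions.

```python
# Sketch: the temporal change amount as the change in occupancy
# (existence) probability between two detection timings.

THRESHOLD = 0.4   # assumed value of the predetermined threshold

def changed_cells(prob_t0, prob_t1, threshold=THRESHOLD):
    """Return the indices of cells whose existence probability changed
    by at least `threshold` between the two timings."""
    return [i for i, (a, b) in enumerate(zip(prob_t0, prob_t1))
            if abs(b - a) >= threshold]

# occupancy probabilities of five cells at time T and at time T + delta
p_t0 = [0.1, 0.9, 0.5, 0.2, 0.8]
p_t1 = [0.1, 0.2, 0.5, 0.9, 0.8]   # a person left cell 1 and entered cell 3

dynamic = changed_cells(p_t0, p_t1)
```

Cells flagged this way are the ones the removal process or the variance increasing process would then target.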
  • The first map information, the second map information, and the third map information are coordinate information in a two-dimensional coordinate system or coordinate information in a three-dimensional coordinate system, and before the matching is performed, a coordinate conversion process may be performed that converts the coordinate systems of the first map information, the second map information, and the third map information into a common coordinate system.
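For the two-dimensional case, such a coordinate conversion into a common coordinate system is a rigid transform. The sketch below is illustrative only; the pose values are assumptions.

```python
# Sketch of the coordinate conversion step: express sensor-frame
# coordinates in the common (map) frame with a 2-D rigid transform.

import math

def to_common_frame(points, tx, ty, theta):
    """Rotate by theta, then translate by (tx, ty): (tx, ty, theta) is
    the pose of the sensor's frame expressed in the common frame."""
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y + tx, s * x + c * y + ty) for x, y in points]

# a point 1 m ahead of a sensor located at (2, 3) and facing +90 degrees
pts = to_common_frame([(1.0, 0.0)], tx=2.0, ty=3.0, theta=math.pi / 2)
```

Applying the same conversion to the first, second, and third map information puts all three data sets into one frame before matching.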
  • The self-position estimation method according to one aspect is a self-position estimation method for a mobile robot that performs self-position estimation using at least one environment sensor node.
  • Pre-created first map information including information on the surroundings of the mobile robot is acquired; second map information including information on the surroundings of the mobile robot is acquired by an external sensor mounted on the mobile robot; and third map information including information on the surroundings of the mobile robot is received from the environment sensor node.
  • (i) When the third map information contains information on a location whose temporal change amount is equal to or greater than a predetermined threshold, a removal process is performed that removes, from the first map information and from the second map information, the location information corresponding to that location, or a variance increasing process is performed that increases the variance of that location information in the first map information and in the second map information. (ii) The first map information and the second map information after the removal process or the variance increasing process are matched, and map information is generated based on the matching result. (iii) The first map information is updated to the map information generated based on the matching result. (iv) Based on the updated first map information and the detection result of an internal sensor that detects at least one of the position and the orientation of the mobile robot, the self-position of the mobile robot on the updated first map information is estimated.
  • A moving route may be calculated based on the updated first map information and the estimated self-position, and the mobile robot may be moved along the moving route.
  • The robot system according to one aspect is a robot system including at least one environment sensor node and a mobile robot, in which the environment sensor node acquires third map information including information on the surroundings of the mobile robot.
  • The mobile robot includes a database in which pre-created first map information including information on the surroundings of the mobile robot is recorded, an external sensor that acquires second map information including information on the surroundings of the mobile robot, a communication unit that communicates with the environment sensor node to acquire the third map information, and an information processing unit.
  • The information processing unit (i) performs, when the third map information contains information on a location whose temporal change amount is equal to or greater than a predetermined threshold, a removal process that removes, from the first map information and from the second map information, the location information corresponding to that location, or a variance increasing process that increases the variance of that location information in the first map information and in the second map information, (ii) matches the first map information and the second map information after the removal process or the variance increasing process and generates map information based on the matching result, and (iii) updates the previously recorded first map information to the map information generated based on the matching result.
  • The mobile robot according to one aspect includes a database in which pre-created first map information including information on the surroundings of the mobile robot is recorded, an external sensor that acquires second map information including information on the surroundings of the mobile robot, a communication unit that acquires third map information including information on the surroundings of the mobile robot by communicating with at least one environment sensor node external to the mobile robot, and an information processing unit.
  • The information processing unit (i) performs, when the third map information contains information on a location whose temporal change amount is equal to or greater than a predetermined threshold, a removal process that removes, from the first map information and from the second map information, the location information corresponding to that location, or a variance increasing process that increases the variance of that location information in the first map information and in the second map information, (ii) matches the first map information and the second map information after the removal process or the variance increasing process and generates map information based on the matching result, and (iii) updates the previously recorded first map information to the map information generated based on the matching result.
  • FIG. 1 is a block diagram showing an example of the configuration of the robot system according to the first embodiment.
  • The robot system includes a mobile robot 100 and an environment sensor node 120.
  • The mobile robot 100 includes an external sensor 101, an internal sensor 102, an information processing unit (a moving object movement / environment change information integration unit 103, a map generation / self-position estimation unit 104, a map information database 105, and a route planning unit 106), a control unit 107, an actuator 108, and a communication unit 109.
  • The mobile robot 100 includes, for example, two or four wheels, or two or more legs, as means for moving the mobile robot 100. These wheels or legs are driven by power from the actuator 108, whereby the mobile robot 100 can be moved.
  • The above-described leg of the mobile robot 100 preferably includes one or more joints.
  • The arm of the mobile robot 100 described above preferably includes one or more joints.
  • The mobile robot 100 includes one or more CPUs, one or more main memories, and one or more auxiliary memories (not shown).
  • The CPU of the mobile robot 100 performs the various arithmetic processes of the moving object movement / environment change information integration unit 103, the map generation / self-position estimation unit 104, the route planning unit 106, the control unit 107, and the like, described below.
  • The main memory of the mobile robot 100 is a storage device that can be directly accessed by the CPU, and can be configured by a memory such as DRAM, SRAM, flash memory, ReRAM, FeRAM, MRAM, STT-RAM, or PCRAM.
  • The auxiliary memory of the mobile robot 100 is a storage device used for long-term storage, copying, or backup of the contents of the main memory, and may be configured by an HDD, flash memory, an optical disk, a magnetic disk, or the like.
  • The external sensor 101 is a sensor that detects information about the surrounding environment of the mobile robot 100, such as the two-dimensional or three-dimensional shape, color, and material of the surrounding environment.
  • The external sensor 101 may be configured by sensors such as an LRF (Laser Range Finder), LIDAR (Laser Imaging Detection and Ranging), a camera, a depth camera, a stereo camera, a sonar, a RADAR, or a combination thereof.
  • The internal sensor 102 is a sensor that detects the position and posture of the mobile robot 100.
  • The internal sensor 102 can detect at least one of the absolute values of the three directional components (for example, the X-axis, Y-axis, and Z-axis components of an XYZ orthogonal coordinate system) of the moving distance, speed, acceleration, angular velocity, and posture of the mobile robot 100. It may also be able to detect the absolute values of the velocity, acceleration, angular velocity, posture, and torque of each joint of the legs and arms of the mobile robot 100.
  • The internal sensor 102 can be configured by sensors such as an acceleration sensor, an angular velocity sensor, an encoder, a geomagnetic sensor, an atmospheric/water pressure sensor, a torque sensor, or a combination thereof.
  • The moving object movement / environment change information integration unit 103 acquires the information (second map information) detected by the external sensor 101 mounted on the mobile robot 100. It also acquires, via the communication unit 109, the moving object movement / environment change information (third map information) extracted by the environment sensor node 120. Using the moving object movement / environment change information extracted by the environment sensor node 120, it performs a process that removes, from the information detected by the external sensor 101, the information on locations that change dynamically in time series (the removal process), or a process that reduces the reliability of the information on those locations (the variance increasing process). The specific contents of the removal process and the variance increasing process are described later.
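The variance increasing process can be sketched as follows, purely as an illustration and not as the patented implementation: instead of deleting a dynamically changing location, the integration unit keeps it but inflates its variance so that later matching trusts it less. The point representation (x, y, variance), the radius, and the inflation factor are assumptions.

```python
# Sketch of the variance increasing process: points near a location
# flagged as dynamic keep their coordinates but get a larger variance.

def increase_variance(points, dynamic_spots, radius=0.5, factor=10.0):
    """Each point is (x, y, var). Points within `radius` of a flagged
    location get their variance multiplied by `factor`."""
    out = []
    for x, y, var in points:
        near = any((x - dx)**2 + (y - dy)**2 <= radius**2
                   for dx, dy in dynamic_spots)
        out.append((x, y, var * factor if near else var))
    return out

pts = [(0.0, 0.0, 0.01), (5.0, 5.0, 0.01)]       # (x, y, variance)
weakened = increase_variance(pts, dynamic_spots=[(5.0, 5.0)])
```

A matcher that weights each correspondence by the inverse of its variance would then effectively down-weight the point at (5, 5) rather than discard it outright.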
  • The map generation / self-position estimation unit 104 acquires, from the internal sensor 102, at least one of the absolute values of the three directional components of the moving distance, speed, acceleration, angular velocity, and posture of the mobile robot 100. It also acquires the information about the surrounding environment detected by the external sensor 101 and processed (removed or variance-increased) by the moving object movement / environment change information integration unit 103. The map generation / self-position estimation unit 104 then simultaneously estimates the self-position of the mobile robot 100 and generates map information from the acquired information by a filtering process. Note that self-position estimation and map information generation need not be performed simultaneously.
  • The filtering process can be configured by a Kalman filter, an extended Kalman filter, an unscented Kalman filter, a particle filter, or a Rao-Blackwellized particle filter.
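One cycle of a particle filter, one of the filters listed above, can be sketched as follows. This is an illustrative toy in one dimension, not the disclosed implementation; the noise levels and observation model are assumptions.

```python
# Minimal sketch of one particle-filter cycle: predict particles with
# motion noise, weight them by an observation likelihood, and resample.

import math
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def pf_step(particles, u, z, motion_std=0.1, obs_std=0.2):
    # predict: move each particle by the odometry input u plus noise
    moved = [p + u + random.gauss(0.0, motion_std) for p in particles]
    # weight: Gaussian likelihood of the external-sensor observation z
    w = [math.exp(-0.5 * ((z - p) / obs_std) ** 2) for p in moved]
    total = sum(w)
    w = [wi / total for wi in w]
    # resample particles in proportion to their weights
    return random.choices(moved, weights=w, k=len(moved))

particles = [0.0] * 100                    # all particles start at 0
particles = pf_step(particles, u=1.0, z=1.05)
estimate = sum(particles) / len(particles)
```

After one cycle the particle cloud concentrates between the odometry prediction (1.0) and the observation (1.05), and the mean serves as the position estimate.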
  • The map generation / self-position estimation unit 104 stores the generated map in the map information database 105. If a map of the corresponding environment already exists, the map information is updated using the map information stored in the map information database 105, and the result is stored in the map information database 105.
  • The map information database 105 stores the map information (first map information) generated and updated by the map generation / self-position estimation unit 104.
  • The stored map information is used by the map generation / self-position estimation unit 104 and the route planning unit 106.
  • The map information database 105 is arranged in the main memory or auxiliary memory of the mobile robot 100.
  • The route planning unit 106 uses the map information stored in the map information database 105 and the self-position of the mobile robot 100 estimated by the map generation / self-position estimation unit 104 to plan a route to travel through the map from the current self-position.
  • The planned route is a route that minimizes the cost of the route.
  • The cost of a route can be expressed by the total travel time, the total travel distance, the total energy used for the travel, the sum of the congestion degrees along the route, or a combination thereof.
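Minimizing such a route cost can be sketched with a standard shortest-path search; this is an illustration, not the disclosed planner. Here the per-step cost combines a unit distance term with a congestion term, and the grid values are assumptions.

```python
# Sketch: minimum-cost route on a grid with Dijkstra's algorithm, where
# entering a cell costs 1 (distance) plus that cell's congestion degree.

import heapq

def plan(grid_congestion, start, goal):
    """grid_congestion[y][x] is the congestion cost of entering cell
    (x, y). Returns the total cost of the cheapest route start -> goal."""
    h, w = len(grid_congestion), len(grid_congestion[0])
    dist = {start: 0}
    pq = [(0, start)]
    while pq:
        d, (x, y) = heapq.heappop(pq)
        if (x, y) == goal:
            return d
        if d > dist.get((x, y), float("inf")):
            continue  # stale queue entry
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < w and 0 <= ny < h:
                nd = d + 1 + grid_congestion[ny][nx]
                if nd < dist.get((nx, ny), float("inf")):
                    dist[(nx, ny)] = nd
                    heapq.heappush(pq, (nd, (nx, ny)))
    return float("inf")

congestion = [
    [0, 9, 0],   # a crowded corridor cell (congestion 9) in the middle column
    [0, 9, 0],
    [0, 0, 0],
]
cost = plan(congestion, start=(0, 0), goal=(2, 0))
```

With these values the planner detours around the congested middle column (total cost 6) instead of cutting straight through it (total cost 11).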
  • The control unit 107 performs control for executing the moving operation of the mobile robot 100 along the route planned by the route planning unit 106 in the environment where the mobile robot actually operates.
  • The actuator 108 drives, for example, the wheels based on a control command from the control unit 107, thereby actually moving the mobile robot 100.
  • The communication unit 109 has a function of performing one-to-many and many-to-many communication by wire or wirelessly.
  • The communication unit 109 inquires of the environment sensor node 120 whether moving object movement / environment change information exists at a specified location.
  • The environmental sensor node 120 includes an environmental sensor 121, a moving object movement / environment change information extraction unit 122, an environmental sensor information time-series database 123, a moving object movement / environment change information database 124, and a communication unit 125.
  • The environmental sensor 121 is a sensor that detects information about the surrounding environment of the environmental sensor node 120.
  • The information on the surrounding environment includes information on the two-dimensional or three-dimensional shape, color, and material of the surrounding environment.
  • FIG. 2 is a schematic diagram showing an example of an environment in which the environment sensor node 120, a plurality of environment sensor nodes 312 to 319 similar to the environment sensor node 120, and the mobile robot 100 exist.
  • The environmental sensor nodes 120 and 312 to 319 include environmental sensors 121 and 322 to 329, respectively.
  • The detection ranges of the environmental sensor nodes 120 and 312 are 331 and 332, respectively.
  • Although the detection ranges of the environmental sensor nodes 313 to 319 are not shown for simplicity of the drawing, the environmental sensor nodes and the environmental sensors are installed so that all the spaces in which the mobile robot can move can be detected.
  • The environmental sensors 121 and 322 to 329 of the environmental sensor nodes 120 and 312 to 319 may be configured by sensors such as an LRF (Laser Range Finder), LIDAR (Laser Imaging Detection and Ranging), a camera, a depth camera, a stereo camera, a sonar, a RADAR, a pyroelectric infrared sensor, an infrared ToF (Time of Flight) sensor, or a combination thereof.
  • The mobile robot 100 includes the external sensor 101 and moves in the movement direction 303.
  • The detection range of the external sensor 101 at the timing shown in FIG. 2 is 304.
  • FIG. 2 also shows the current position 341 of a person / moving object, the past position 342 of the person / moving object, and the current moving direction 343 of the person / moving object.
  • The time-series environmental change due to the movement of the person / moving object from the position 342 to the position 341 is extracted by the environmental sensor nodes 120, 312, 313, and 314 as moving object movement / environment change information, as will be described in detail later.
  • The information on the surrounding environment acquired by the environment sensor 121 includes coordinate information.
  • The moving object movement / environment change information extraction unit 122 stores the time-series information on the surrounding environment detected by the environment sensor 121 in the environmental sensor information time-series database 123 in the form of combinations of detection information and detection timing.
  • The detection timing is represented by time.
  • The moving object movement / environment change information extraction unit 122 also calculates the location of a change in the surrounding environment, the magnitude of the change, and the time taken for the change, based on a plurality of combinations of detection timing and detection information stored along the time series in the environmental sensor information time-series database 123.
  • The moving object movement / environment change information extraction unit 122 saves, to the moving object movement / environment change information database 124, the location of the change in the surrounding environment, the shape of the change, the magnitude of the change, and the time taken for the change, as moving object movement / environment change information. Note that the moving object movement / environment change information need not include all of the change location, change shape, change magnitude, and time taken for the change; it may include only some of these pieces of information, or may include other information.
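The record the extraction unit stores can be sketched as follows; this is an illustrative data shape, not the disclosed one, and the snapshot format and field names are assumptions.

```python
# Sketch: extract change records (location, magnitude, duration) from
# two timestamped snapshots of existence probabilities.

def extract_changes(snap_a, snap_b):
    """Each snapshot is (time, {location: existence_probability}).
    Returns one record per location whose probability changed."""
    t_a, probs_a = snap_a
    t_b, probs_b = snap_b
    records = []
    for loc in sorted(set(probs_a) | set(probs_b)):
        magnitude = abs(probs_b.get(loc, 0.0) - probs_a.get(loc, 0.0))
        if magnitude > 0.0:
            records.append({"location": loc,
                            "magnitude": magnitude,
                            "duration": t_b - t_a})
    return records

snap_t  = (10.0, {(1, 1): 0.9, (2, 2): 0.1})
snap_t2 = (12.5, {(1, 1): 0.1, (2, 2): 0.1})   # the object at (1, 1) left
records = extract_changes(snap_t, snap_t2)
```

Each record bundles exactly the fields described above (change location, magnitude, and time taken), ready to be stored in the change information database and served to the robot on inquiry.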
  • The environmental sensor information time-series database 123 stores the time-series information on the surrounding environment detected by the environmental sensor 121 and acquired by the moving object movement / environment change information extraction unit 122, as sets of detection information and detection timing. The stored information is used by the moving object movement / environment change information extraction unit 122 to extract moving object movement / environment change information.
  • The moving object movement / environment change information database 124 stores the moving object movement / environment change information calculated by the moving object movement / environment change information extraction unit 122.
  • The stored moving object movement / environment change information is transmitted to the mobile robot 100 by the communication unit 125 upon receiving an inquiry from the communication unit 109 of the mobile robot 100.
  • The communication unit 125 receives from the communication unit 109 of the mobile robot 100 an inquiry as to whether or not moving object movement / environment change information exists, searches the moving object movement / environment change information database 124, and transmits the moving object movement / environment change information to the mobile robot 100 by wireless or wired communication.
  • FIG. 3A is a flowchart illustrating an example of processing of the robot system.
  • T201 represents the processing flow of the mobile robot 100
  • T221A and T221B represent the processing flow of the environment sensor node 120.
• In step S201, the internal sensor 102 acquires information. The information to be acquired includes the absolute values of the three directional components of the moving distance, speed, acceleration, angular velocity, and posture of the mobile robot 100.
  • step S202 the external sensor 101 acquires information about the surrounding environment of the mobile robot 100.
  • the information on the surrounding environment at the timing shown in FIG. 2 is information detected by the external sensor 101 for the detection range 304.
  • the information to be acquired includes information on the two-dimensional or three-dimensional shape, color, and material of the surrounding environment.
• In step S203, the mobile robot 100 makes an inquiry, via the communication unit 109, to the environment sensor nodes 120 and 312 to 319 as to whether moving body movement / environment change information exists within a specified two-dimensional or three-dimensional area of arbitrary size and within a specified time zone, and waits a predetermined time for responses to the inquiry.
  • the location and time may be designated in advance by a user who manages the mobile robot 100 as an arbitrary (fixed) size location or an arbitrary length of time.
  • the mobile robot may change and specify the size of the place and the length of time according to the state.
  • the inquiry from the mobile robot 100 may be made to all other mobile robots and other environmental sensor nodes that can communicate.
  • the mobile robot 100 can receive responses and moving object / environment change information from all mobile robots and environmental sensor nodes. Inquiries from the mobile robot 100 may be performed simultaneously on a plurality of environmental sensor nodes (a plurality of other mobile robots) or sequentially.
  • step S204 the communication unit 109 determines whether there is a response from the environment sensor nodes 120 and 312 to 319 in response to the inquiry in step S203. If there is one or more responses, the process proceeds to step S205. If there is no response, the process proceeds to step S208.
  • step S205 the communication unit 109 receives moving body movement / environment change information from the environment sensor nodes 120, 312-319, all other environment sensor nodes, and all other mobile robots.
  • moving body movement / environment change information may be received only from a specific environmental sensor node among the environmental sensor nodes 120 and 312 to 319.
  • Information regarding the location specified in step S203 may be acquired only from the environmental sensor node having the environmental change information.
• In step S206, the moving body / environment change information integration unit 103 acquires, from the map information database 105, the map information (first map information) that is used in the next step S207 and that includes location information corresponding to the location where the moving body movement / environment change information received in step S205 exists.
  • FIG. 3B is a flowchart showing an example of the process of the robot system, and more specifically, a flowchart showing an example of the process of step S207.
• The specific contents of step S207, that is, the details of steps S20701 to S20706, will be described.
• In step S207, the moving body / environment change information integration unit 103 performs processing to integrate one or more pieces of moving body movement / environment change information received from the environment sensor nodes 120 and 312 to 319 into the information acquired by the external sensor 101 in step S202 and the map information acquired in step S206.
• In step S20701, the moving body / environment change information integration unit 103 performs coordinate conversion so that the coordinate system of the surrounding environment information acquired by the external sensor 101 in step S202 and the coordinate system of the moving body movement / environment change information transmitted from the environment sensor nodes 120 and 312 to 319 are converted into the coordinate system of the map information acquired in step S206, making all of them a common coordinate system.
• In step S20702, the moving body / environment change information integration unit 103 determines whether there is a place where the magnitude of the temporal change amount between different timings, regarding the shape in the moving body movement / environment change information, is greater than the threshold Th_ch.
• Here, the “temporal change amount” relates, for example, to the existence probability of each point, that is, the probability that an object exists at each point in a two-dimensional or three-dimensional space. Specifically, for each point, the absolute value of the difference between the existence probability at a first timing and the existence probability at a second timing that is temporally later than the first timing is obtained, and the sum of these absolute values over all points (Sum of Absolute Differences) is taken. That is, it is the amount by which the existence probability has changed in the time between the first timing and the second timing. The first timing and the second timing may be clock times, or may be elapsed times after the robot system starts to operate.
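As an illustrative sketch of this computation (not part of the embodiment; the grid contents and sizes are invented), the temporal change amount can be taken as the sum of absolute differences between existence-probability grids at the two timings:

```python
# Hypothetical sketch: temporal change amount as the Sum of Absolute
# Differences (SAD) between existence-probability grids at two timings.
def temporal_change_amount(grid_t1, grid_t2):
    """Sum over all points of |P_exist(t2) - P_exist(t1)|."""
    assert len(grid_t1) == len(grid_t2)
    return sum(
        abs(p2 - p1)
        for row1, row2 in zip(grid_t1, grid_t2)
        for p1, p2 in zip(row1, row2)
    )

# Example: a small 2-D grid in which one cell's occupancy has moved
# to a neighboring cell between the first and second timing.
grid_a = [[0.9, 0.1],
          [0.0, 0.0]]
grid_b = [[0.1, 0.9],
          [0.0, 0.0]]
change = temporal_change_amount(grid_a, grid_b)  # 0.8 + 0.8 = 1.6
```

An unchanged grid yields a change amount of zero, so this quantity behaves as the description requires: it grows with the amount of occupancy that moved between the two timings.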
• The temporal change amount related to the shape may be, for example, an SSD (Sum of Squared Differences), an SATD (Sum of Absolute Transformed Differences), an MAD (Mean Absolute Difference), or an MSD (Mean Squared Difference), or a difference between image feature vectors such as SIFT, SURF, and HoG. Further, the temporal change amount related to the shape may be, for example, a difference between BoF (Bag of Features) feature vectors generated using these feature vectors.
• The setting of the specific value of the threshold Th_ch differs depending on the scale of the change and the situation. As an example, the change amount when an object of 100 mm × 100 mm × 100 mm moves 20 mm in a space of 10 m × 10 m × 10 m may be set as the threshold Th_ch.
  • the setting of the specific value of the threshold Th_ch is not limited to the above example.
• If there is a place where the amount of temporal change between different timings is larger than the threshold Th_ch, the process proceeds to step S20703; otherwise, the process proceeds to step S20707.
• In step S20703, the moving body / environment change information integration unit 103 determines whether the time difference between the timings at which the temporal change amount related to the shape in the moving body movement / environment change information is greater than the threshold Th_ch is greater than the threshold Th_tu (first time). If it is greater than the threshold Th_tu, the process proceeds to step S20707. This is because, when the time difference is larger than the predetermined threshold Th_tu, the moving body movement / environment change information represents not a temporary environmental change due to a disturbance but a semi-permanent or permanent environmental change, such as the relocation of a static object, and the changed part of the environment is therefore treated as a new map. A layout change such as moving furniture in a room corresponds to this case; a change with a large time difference, such as a layout change, is treated as a new map. If the time difference is less than or equal to the threshold Th_tu, the process proceeds to step S20704.
• In step S20704, the moving body / environment change information integration unit 103 determines whether the time difference between the timings at which the temporal change amount related to the shape in the moving body movement / environment change information is larger than the threshold Th_ch is smaller than the threshold Th_tb (second time).
• If the time difference is smaller than the threshold Th_tb, then in step S20705 the moving body / environment change information integration unit 103 performs removal processing to remove the location information in the surrounding environment information (second map information) acquired by the external sensor 101 of the mobile robot 100 and the location information in the map information (first map information) acquired in step S206, at the place where the temporal change amount in the moving body movement / environment change information (third map information) is larger than the threshold Th_ch. Details of the information removal process will be described later.
• Otherwise, in step S20706 the moving body / environment change information integration unit 103 executes variance increase processing to increase the variance of the location information in the surrounding environment information (second map information) acquired by the external sensor 101 of the mobile robot 100 and of the location information in the map information (first map information) acquired in step S206, at the place where the temporal change amount in the moving body movement / environment change information (third map information) is greater than the threshold Th_ch.
• The threshold Th_tb is larger than the threshold Th_tu. The increase in variance is calculated, for example, in the form a × V, by multiplying the variance V by a constant a greater than 1.
• The “variance of information” here refers to an index that represents the variability and uncertainty of the information. When the variance is small, it means that there is a high probability that the detection target exists around the expected value in the space of the surrounding environment. When the variance is large, it means that the probability that the detection target exists around the expected value is low, and the distribution of the probability that the detection target can exist spreads widely in the space. Variance increase processing therefore means increasing the represented variability and uncertainty of the information, that is, reducing the reliability of the information. Details of this variance increase processing will be described later.
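A minimal sketch of the variance increase in the form a × V might look as follows (illustrative only; the point format, the region predicate, and the value of a are assumptions, not part of the embodiment):

```python
# Hypothetical variance increase processing: points whose coordinates fall
# in the changed region get their variance multiplied by a constant a > 1,
# lowering the reliability of that information; other points are untouched.
def increase_variance(points, in_changed_region, a=2.0):
    """points: list of {'xy': (x, y), 'var': V} dicts (assumed format)."""
    return [
        {'xy': p['xy'],
         'var': p['var'] * a if in_changed_region(p['xy']) else p['var']}
        for p in points
    ]

pts = [{'xy': (0.0, 0.0), 'var': 0.01},
       {'xy': (5.0, 5.0), 'var': 0.01}]
in_region = lambda xy: xy[0] > 1.0  # assumed changed region: x > 1
updated = increase_variance(pts, in_region, a=3.0)
# The second point's variance is tripled; the first is unchanged.
```

Because only the affected region is modified, information outside the changed place keeps its original reliability and continues to constrain the matching process.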
• In step S20704, it is determined whether the time difference between the timings related to the shape change is smaller than the threshold Th_tb, but the determination is not limited thereto. For example, it may instead be determined whether the magnitude of the temporal change amount related to the shape in the moving body movement / environment change information is larger than a new threshold different from the threshold Th_ch: if it is larger (Yes), the process proceeds to step S20705; otherwise (No), the process proceeds to step S20706. Here, the new threshold is higher than the threshold Th_ch. That is, in the case of Yes in step S20704 the amount of change is larger than in the case of No, so the process of removing the moving body movement / environment change information from the external sensor information and the map information is effective.
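The branch structure of steps S20702 to S20706 can be summarized by the following sketch (illustrative; the function name, return labels, and threshold values are invented, and the flow follows the description above):

```python
# Hypothetical summary of the decisions in steps S20702-S20706.
def decide_action(change_amount, time_diff, th_ch, th_tu, th_tb):
    if change_amount <= th_ch:
        return "skip"                # S20702: change too small, go to S20707
    if time_diff > th_tu:
        return "treat_as_new_map"    # S20703: long-lived change, new map
    if time_diff < th_tb:
        return "remove"              # S20704 Yes: removal process (S20705)
    return "increase_variance"       # S20704 No: variance increase (S20706)

# Made-up threshold values for illustration.
TH_CH, TH_TU, TH_TB = 1.0, 600.0, 10.0
action = decide_action(change_amount=2.0, time_diff=5.0,
                       th_ch=TH_CH, th_tu=TH_TU, th_tb=TH_TB)
# A large, short-lived change -> the removal process is chosen.
```

The same four outcomes correspond to the four exits of the flowchart in FIG. 3B: skip, treat as a new map, remove, or increase variance.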
• In step S20707, the moving body / environment change information integration unit 103 determines whether any of the received moving body movement / environment change information relating to one or more places remains unprocessed. If unprocessed moving body movement / environment change information exists, the process returns to step S20701 and is continued until none remains. If no unprocessed moving body movement / environment change information exists, the process proceeds to step S208.
• In step S208, map information is generated and the position of the mobile robot 100 is estimated (including the matching process). If there was a response from the environment sensor node 120 in step S204, the map generation / self-position estimation unit 104 generates map information and estimates the self-position of the mobile robot 100 using the information acquired by the external sensor 101, into which the moving body movement / environment change information was integrated in step S207, and the information acquired by the internal sensor 102 in step S201.
• In the matching process, the map information is compared with the sensor data acquired by the external sensor to search for the transformation with the highest similarity; from that transformation, the moving distance and moving direction of the mobile robot 100, that is, its current self-position, are estimated.
• If there was no response in step S204, the map generation / self-position estimation unit 104 generates map information and estimates the self-position of the mobile robot 100 using the information acquired by the external sensor 101 in step S202 and the information acquired by the internal sensor 102 in step S201. The generation of the map information and the estimation of the self-position are performed simultaneously by the filtering process in the map generation / self-position estimation unit 104.
• In step S209, the route planning unit 106 plans a route from the current self-position using the map information generated in step S208 and stored in the map information database 105.
  • the route planning method is as described above.
  • step S210 the control unit 107 generates a control command for performing the moving operation of the mobile robot 100 on the route planned in step S209.
  • step S211 the actuator 108 actually moves the mobile robot 100 in response to the control command generated in step S210.
• Next, steps S221 to S227, which are processes in the environment sensor node 120, will be described.
  • step S221 of the processing flow T221A the communication unit 125 of the environmental sensor node 120 receives an inquiry from the mobile robot 100.
• In step S223, whether moving body movement / environment change information that meets the conditions of the specified location and specified time zone (the contents of the inquiry from the mobile robot 100) exists is checked using the moving body movement / environment change information database 124. If moving body movement / environment change information matching the conditions exists, the process proceeds to step S225; if not, the process returns to step S221.
• In step S225, the communication unit 125 responds to the mobile robot 100 with the answer to the inquiry made in step S203, that is, with information on whether moving body movement / environment change information exists in the specified location and time zone.
  • step S226 the communication unit 125 transmits moving body movement / environment change information to the mobile robot 100 that has made the inquiry received in step S221. After the transmission, the process returns to step S221.
• In step S222 of the processing flow T221B, the moving body movement / environment change information extraction unit 122 acquires the information around the environment sensor node 120 detected by the environmental sensor 121, and saves it in the environmental sensor information time-series database 123 in chronological order as sets of detection time and detection information.
• In step S224, the moving body movement / environment change information extraction unit 122 determines whether the environment around the environment sensor node 120 has changed, based on the plurality of time-series sets of detection time and detection information relating to the periphery of the environment sensor node 120 stored in the environmental sensor information time-series database 123.
• In step S227, based on the plurality of time-series sets of detection time and detection information stored in the environmental sensor information time-series database 123, the moving body movement / environment change information extraction unit 122 calculates the change location, change shape, change magnitude, and time taken for the change in the surrounding environment. These are stored in the moving body movement / environment change information database 124 as moving body movement / environment change information.
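The node-side steps S222, S224, and S227 can be sketched as follows (illustrative only; the list-based databases, the detection format, and the threshold value are assumptions, not the embodiment's implementation):

```python
# Hypothetical sketch of the node-side pipeline: detections are stored in
# time order (S222), consecutive detections are compared (S224), and when
# they differ by more than a made-up threshold, a change record is
# extracted and stored (S227).
time_series_db = []   # stands in for the time-series database 123
change_db = []        # stands in for the change information database 124

def store_detection(t, info):
    time_series_db.append((t, info))  # set of detection time and info

def extract_changes(threshold=0.5):
    for (t1, d1), (t2, d2) in zip(time_series_db, time_series_db[1:]):
        magnitude = sum(abs(b - a) for a, b in zip(d1, d2))
        if magnitude > threshold:     # environment judged to have changed
            change_db.append({'magnitude': magnitude,
                              'duration': t2 - t1})

store_detection(0.0, [0.9, 0.1, 0.0])
store_detection(1.0, [0.1, 0.9, 0.0])  # occupancy moved between cells
extract_changes()
# change_db now holds one record with magnitude 1.6 and duration 1.0
```

A real node would also record the change location and shape, as described above; they are omitted here to keep the sketch short.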
  • the extracted moving body movement / environment change information is transmitted from the environment sensor node through the communication path 353 by the communication unit 351 mounted on the environment sensor node and the communication unit 352 mounted on the mobile robot 100.
• The moving body movement / environment change information sent to the mobile robot is used by the moving body / environment change information integration unit to remove, from the information (second map information) detected by the external sensor 101 of the mobile robot 100, the location information corresponding to the place where the environment change caused by a person or moving body exists. This makes it possible to remove a dynamic environmental change, which is a disturbance, from the information detected by the external sensor 101 of the mobile robot 100.
  • FIG. 4A shows that the mobile robot 100 exists in the environment 400 at an arbitrary time T, for example.
  • FIG. 4C shows information (sensor data) 430 acquired by the external sensor 101 of the mobile robot 100 at time T.
• FIG. 4B shows that the mobile robot 100 exists in the environment 400 at time T + Δ, when an arbitrary time interval Δ has elapsed from time T.
• FIG. 4D shows information (sensor data) 431 acquired by the external sensor 101 of the mobile robot 100 at time T + Δ, when an arbitrary time interval Δ has elapsed from time T.
  • the environment 400 includes the mobile robot 100, the static object A present at the position 401, the moving object H present at the position 402, and the environment sensor node 120.
  • the mobile robot 100 is moving in the traveling direction 405.
  • the moving body H is moving from the position 402 in the traveling direction 404.
  • the environment 400 includes the mobile robot 100, the static object A present at the position 401, the moving object H present at the position 403, and the environment sensor node 120.
  • the position of the mobile robot 100 has changed. Further, the position of the moving body H also changes from the position 402 to the position 403.
  • the external sensor 101 mounted on the mobile robot 100 acquires information around the mobile robot 100 in the environment 400.
  • FIG. 4C shows sensor data 430 (thick solid line) acquired by the external sensor 101 with respect to the environment 400 at time T.
  • the sensor data 430 includes the static object A present at the position 401, the moving object H present at the position 402, the shape of each wall W, and the distance from the external sensor 101 in the environment 400.
• FIG. 4D shows sensor data 431 (thick dashed line) acquired by the external sensor 101 for the environment 400 at time T + Δ, when an arbitrary time interval Δ has elapsed from time T.
  • the sensor data 431 includes, in the environment 400, the static object A present at the position 401, the moving object H present at the position 403, the shape of each wall W, and the distance from the external sensor 101.
• The sensor data 430 and the sensor data 431 may be in data formats such as two-dimensional distance data, three-dimensional distance data, two-dimensional shape data, three-dimensional shape data, and image feature points, or any combination thereof. That is, the information acquired by the external sensor 101 includes coordinate information.
• The environmental sensor node 120 extracts, as the moving body movement / environment change information, the environmental change caused by the moving body H moving from the position 402 to the position 403 between the time T and the time T + Δ.
• FIGS. 5A to 5C show a matching process between the map data at time T and the sensor data at time T + Δ, and an example of the result.
  • the sensor data matching process is executed by the map generation / self-position estimation unit 104 in step S208 described above.
• In the matching process, the map information (data) and the sensor data acquired by the external sensor are collated to search for a transformation with a high degree of similarity, and the moving distance and moving direction of the mobile robot 100 are estimated using the transformation with the highest degree of similarity.
• The matching process between the map data 501 at time T (that is, the map information in the map information database 105) created based on the sensor data 430, and the sensor data 431 acquired by the external sensor 101 of the mobile robot 100 at time T + Δ, will be described.
  • the matching process is a process of comparing a plurality of data and searching for conversion between a plurality of data having a low degree of difference, that is, a high degree of similarity.
  • the gradient of the dissimilarity is calculated, and a conversion that minimizes the dissimilarity is searched for.
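As a toy illustration of such a search (not the embodiment's algorithm; here an exhaustive scan over candidate X translations stands in for the gradient-based search, and all coordinates are invented):

```python
# Hypothetical matching sketch: slide the sensor points along X and pick
# the translation whose summed point-to-point distance (dissimilarity,
# i.e. the evaluation value) is lowest.
def dissimilarity(map_pts, sensor_pts, dx):
    return sum(abs(m - (s + dx)) for m, s in zip(map_pts, sensor_pts))

def best_translation(map_pts, sensor_pts, candidates):
    return min(candidates,
               key=lambda dx: dissimilarity(map_pts, sensor_pts, dx))

map_x = [0.0, 1.0, 2.0, 3.0]     # map data (X coordinates)
sensor_x = [0.5, 1.5, 2.5, 3.5]  # same shape, shifted by +0.5
dx = best_translation(map_x, sensor_x, [i / 10 for i in range(-10, 11)])
# dx == -0.5: the transformation with the lowest dissimilarity
```

A gradient method replaces the exhaustive scan with iterative steps down the slope of this evaluation value, which is why flat or multi-minimum evaluation curves, as discussed below for FIG. 5C, cause slow convergence or wrong answers.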
• The matching result of the map data 501 and the sensor data 431 by the matching process can be, for example, the result example 503 or the result example 504 shown in FIG. 5B.
  • Result example 504 shows an example that is very close to the correct answer (Ground truth) as the correspondence between the map data 501 and the sensor data 431.
• In the result example 504, the static object A, which does not move from the position 401, and the data portion based on the shape of the wall (left side portion) correspond to the correct answer with high accuracy. However, the correspondence is not taken well for the portion (right side portion) in which the shape of the sensor data has changed due to the moving body H moving from the position 402 to the position 403, and as a result the similarity of the sensor data as a whole is reduced.
  • FIG. 5C shows an evaluation value example 505 in matching.
• In the evaluation value example 505, the horizontal axis represents the distance in the X-axis direction between the two pieces of data 501 and 431 to be matched, and the vertical axis represents the evaluation value.
  • the evaluation value represents the degree of dissimilarity between data, and becomes a lower value as the data shapes match.
  • the evaluation value example 505 shows an example in which the sensor data 431 is translated only in the X-axis direction with respect to the map data 501.
• Translation here refers to parallel movement in one direction.
  • the evaluation value example 505 in matching shows an evaluation value 507 of matching calculation for the result example 504.
• Around the evaluation value 507, the change in the evaluation value is gentle, and when the result is obtained using a method such as the gradient method, a large number of repetitions of the matching operation are required before the result converges.
• The result example 503 shown in FIG. 5B shows an example in which the correspondence between the map data 501 and the sensor data 431 has a low degree of agreement with the correct answer (ground truth).
  • the evaluation value example 505 in matching shows the evaluation value 506 of the matching operation for the result example 503.
• The evaluation value 506 is a local minimum in the evaluation value example 505. Therefore, when a result is obtained using a technique such as the gradient method, the search may converge to the evaluation value 506, and the result example 503, which has a low degree of correspondence with the correct answer, may be erroneously determined to be the optimum result.
  • FIG. 6A is a diagram illustrating an example of data removal processing.
• FIG. 6A shows the map data 601 at time T and the sensor data 602 at time T + Δ after the removal process has been performed, that is, the process of removing the data of the place (region C) where moving body movement / environment change information exists due to the movement of the moving body H.
• The process shown in FIG. 6A is the same as the process of step S20705. That is, the region C shown in FIG. 6A corresponds to a place where the temporal change amount is larger than the threshold Th_ch.
  • the matching evaluation value example shows an example in which the sensor data 602 is translated only in the X-axis direction with respect to the map data 601. That is, in the graph of FIG. 6C, the horizontal axis indicates the distance in the X-axis direction between the two data 601 and 602, and the vertical axis indicates the evaluation value (difference between the two data 601 and 602).
  • the information of the area where the moving object exists in each of the two data to be matched (area C shown in FIG. 6A) is removed before the matching process is executed.
  • the accuracy of the matching process can be improved and the amount of calculation can be reduced.
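A minimal sketch of this removal (illustrative only; the point values and region bounds are invented):

```python
# Hypothetical removal process: points falling inside region C, where a
# moving body was detected, are dropped from both data sets before the
# matching process, so only static structure constrains the match.
def remove_region(points, x_min, x_max):
    return [p for p in points if not (x_min <= p[0] <= x_max)]

map_pts = [(0.0, 0.0), (1.0, 0.0), (5.0, 0.0)]     # (x, y) points
sensor_pts = [(0.0, 0.1), (1.0, 0.1), (6.0, 0.0)]  # moving body at 5 -> 6
region_c = (4.0, 7.0)                               # assumed region C bounds
map_clean = remove_region(map_pts, *region_c)       # drops (5.0, 0.0)
sensor_clean = remove_region(sensor_pts, *region_c) # drops (6.0, 0.0)
```

After removal, the remaining points in both data sets describe only the static environment, so the evaluation value has a sharper, unambiguous minimum.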
  • FIG. 7A is a diagram illustrating an example of a data distribution increase process.
• FIG. 7A shows the map data 603 at time T and the sensor data 604 at time T + Δ after the variance increase process has been performed, that is, the process of increasing the variance of the data at the place (region C) where moving body movement / environment change information exists due to the movement of the moving body H.
  • FIG. 8 shows map data 501 at time T.
  • the map data 501 includes actual measurement points (measurement coordinates) actually measured by the external sensor 101 of the mobile robot 100.
  • the variance of the information of the map data 501 can be regarded as the variance of the measured points, and represents the uncertainty of the measured points (the reliability of the coordinate information).
• The variance of an actual measurement point can be specifically defined by the width of the distribution of existence probabilities having the actual measurement point as its peak. Therefore, an increase in the variance of information means an increase in the width of the distribution of existence probabilities having the actual measurement point as its peak.
• FIG. 10 shows a state in which the variance increase has been executed on a part (right side portion) of the map data 501, that is, the width of the distribution of the existence probabilities of the actual measurement points has been increased.
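This can be illustrated with a one-dimensional Gaussian existence probability (a sketch, not the embodiment's model; the measured point and variance values are invented):

```python
import math

# Hypothetical sketch: the existence probability around a measured point is
# modeled as a Gaussian peaked at the measurement. Increasing the variance
# widens the distribution and lowers the peak, i.e. the same point now
# expresses less certainty about where the object is.
def existence_probability(x, measured, var):
    return (math.exp(-(x - measured) ** 2 / (2 * var))
            / math.sqrt(2 * math.pi * var))

peak_before = existence_probability(1.0, 1.0, 0.01)  # original variance
peak_after = existence_probability(1.0, 1.0, 0.04)   # variance increased 4x
# peak_after < peak_before: the widened distribution has a lower peak.
```

In the matching process, such low-confidence points contribute less to the evaluation value, which is how the variance increase softens, rather than deletes, the influence of the changed region.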
• The process shown in FIG. 7A is the same as the process of step S20706. That is, the region C shown in FIG. 7A corresponds to a place where the temporal change amount is larger than the threshold Th_ch.
• FIGS. 7B and 7C show examples of the matching result and the matching evaluation values of the map data 603 and the sensor data 604.
  • the matching evaluation value example shows an example in which the sensor data 604 is translated only in the X-axis direction with respect to the map data 603.
• In FIG. 7C, the change in the evaluation value becomes steep. Therefore, when a result is obtained using a method such as the gradient method, the local minimum corresponding to the evaluation value 506 of the evaluation value example 505 shown in FIG. 5C becomes sufficiently larger than the minimum at the evaluation value 608; compared to the evaluation value example 505 in FIG. 5C, it is easier to avoid misjudgment and to obtain a correspondence that matches the correct answer with high accuracy. Furthermore, the change in the evaluation value around the evaluation value 608 is steeper than around the evaluation value 507 shown in FIG. 5C, so the number of repetitions of the matching operation is reduced compared to the evaluation value example 505 shown in FIG. 5C.
• In this way, the variance of the information on the area where the moving body exists in each of the two data to be matched (for example, the region C shown in FIG. 7A) is increased before the matching process is executed. As a result, the accuracy of the matching process can be improved and the amount of calculation can be reduced.
• The information removal process shown in FIG. 6A and the information variance increase process shown in FIG. 7A are used selectively according to the conditions shown in FIG. 3B. Specifically, when there is a place where the amount of temporal change between the first timing and the second timing is greater than the threshold Th_ch (that is, when a place with such moving body movement / environment change information exists) and the time difference between the first timing and the second timing is smaller than the threshold Th_tb (second time), that is, when it is determined that the influence of the movement of the moving body or the change of the environment is relatively large, the removal process is executed in step S20705. Otherwise, the variance increase process is executed in step S20706.
• The removal process suppresses the influence of the movement of the moving body and the change of the environment more strongly than the variance increase process does. By using the removal process and the variance increase process selectively, depending on the degree of that influence, the matching process between the data is realized by the optimum method.
• According to the first embodiment, it is possible to perform map generation, and self-position estimation of a mobile robot that moves using the map, with high accuracy and robustness even in a space with large environmental changes.
  • the robot system according to the first embodiment described above includes one mobile robot and at least one environmental sensor node.
  • the robot system according to the second embodiment includes a plurality of mobile robots and at least one environmental sensor node.
  • Each of the plurality of mobile robots has the same function as the environment sensor node in the first embodiment. That is, the remaining mobile robots can function as environment sensor nodes for one mobile robot that performs map generation and self-position estimation.
• Each mobile robot of the second embodiment is configured to acquire moving body movement / environment change information using its external sensor, in the same way that the environmental sensor node of the first embodiment acquires moving body movement / environment change information using its environmental sensor.
• The mobile robot includes substantially the same components as the moving body movement / environment change information extraction unit, the environmental sensor information time-series database, and the moving body movement / environment change information database of the environment sensor node of the first embodiment.
  • FIG. 11 is a flowchart showing an example of processing of a plurality of mobile robots in the robot system according to the second embodiment.
  • the mobile robot according to the second embodiment is configured to operate according to a main process flow T701 and sub-process flows T702A and T702B.
  • FIG. 11 shows the exchange of information between a robot that executes the main processing flow (corresponding to the mobile robot 100 of the above-described first embodiment) and at least one other mobile robot that executes the sub-processing flows T702A and T702B (functioning as the environmental sensor node 120 of the above-described first embodiment).
  • The robot system according to the second embodiment, which performs the processing flow shown in FIG. 11, differs from the robot system according to the first embodiment shown in FIGS. 3A and 3B in that the moving-body movement / environment-change information is extracted by a mobile robot that performs the processing flows T702A and T702B and plays the role of an environmental sensor node, and is transmitted from that robot to the mobile robot that executes the main processing flow and performs map generation and self-position estimation. The mobile robot acting as a sensor node acquires information about its surroundings using an external sensor.
  • Also in the second embodiment, map generation, and self-position estimation of the mobile robot that moves using the map, can be performed with high accuracy and robustness even in a space with large environmental changes.
  • Since the movable mobile robots act as environmental sensor nodes, the area in which moving-body movement / environment change can be detected is expanded, and the amount of moving-body movement / environment-change information increases accordingly (compared with a robot system having only fixed environmental sensor nodes). As a result, the deterioration in accuracy of the estimated values in the map-generation / self-position-estimation process and the increase in the calculation amount due to disturbances can be further suppressed.
  • In this robot system, since any one of the plurality of mobile robots can play the role of the environmental sensor node, the dedicated environmental sensor node itself can be excluded from the robot system. In that case, the robot system does not include an environmental sensor node, but includes at least two mobile robots that can each play the role of one.
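The role sharing between the robot running the main processing flow T701 and the robots running the sub-processing flows T702A/T702B can be sketched roughly as below. The class names and the message schema (`ChangeReport`) are hypothetical; in the real system the reported values come from each robot's external sensor and its change-information extraction unit and databases.

```python
from dataclasses import dataclass, field

@dataclass
class ChangeReport:
    """Moving-body movement / environment-change information (hypothetical schema)."""
    sender_id: str
    location_id: str
    variation: float  # degree of temporal variation observed at the location

@dataclass
class MappingRobot:
    """Robot executing the main processing flow (map generation / localization)."""
    reports: list = field(default_factory=list)

    def receive(self, report: ChangeReport) -> None:
        # Collected reports feed the removal / dispersion-increasing processes.
        self.reports.append(report)

@dataclass
class SensorNodeRobot:
    """Mobile robot executing a sub-processing flow and thus playing the
    role of an environmental sensor node for the mapping robot."""
    robot_id: str

    def observe_and_send(self, mapper: MappingRobot,
                         location_id: str, variation: float) -> None:
        # In the real system the variation is extracted from external-sensor
        # data; here it is passed in directly for illustration.
        mapper.receive(ChangeReport(self.robot_id, location_id, variation))

mapper = MappingRobot()
for node, loc, var in [(SensorNodeRobot("r1"), "door", 0.9),
                       (SensorNodeRobot("r2"), "chair", 0.2)]:
    node.observe_and_send(mapper, loc, var)
# Because the sensor-node robots themselves move, the mapping robot
# aggregates change information over a wider area than fixed nodes cover.
```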
  • The technique described in the above aspects can be realized, for example, in the following types of robot systems.
  • However, the types of robot systems in which the technique described in the above aspects is realized are not limited to these.
  • The processing of the present disclosure has been described in the embodiments, but the subject and the device that perform each process are not particularly limited. Each process may be performed by a processor or the like (described below) embedded in a specific device located locally, or by a cloud server or the like arranged at a location different from the local device. The processes described in the present disclosure may also be shared by coordinating information between the local device and the cloud server.
  • the above apparatus is specifically a computer system including a microprocessor, ROM, RAM, a hard disk unit, a display unit, a keyboard, a mouse, and the like.
  • a computer program is stored in the RAM or hard disk unit.
  • Each device achieves its functions by the microprocessor operating according to the computer program.
  • the computer program is configured by combining a plurality of instruction codes indicating instructions for the computer in order to achieve a predetermined function.
  • a part or all of the constituent elements constituting the above-described apparatus may be configured by one system LSI (Large Scale Integration).
  • The system LSI is an ultra-multifunctional LSI manufactured by integrating a plurality of components on a single chip; specifically, it is a computer system including a microprocessor, a ROM, a RAM, and the like.
  • a computer program is stored in the RAM.
  • the system LSI achieves its functions by the microprocessor operating according to the computer program.
  • a part or all of the constituent elements constituting the above-described device may be constituted by an IC card or a single module that can be attached to and detached from each device.
  • the IC card or the module is a computer system including a microprocessor, a ROM, a RAM, and the like.
  • The IC card or the module may include the ultra-multifunctional LSI described above.
  • the IC card or the module achieves its function by the microprocessor operating according to the computer program. This IC card or this module may have tamper resistance.
  • The present disclosure may be the methods described above. The present disclosure may also be a computer program that realizes these methods by a computer, or a digital signal composed of the computer program.
  • The present disclosure may also be the computer program or the digital signal recorded on a computer-readable recording medium such as a flexible disk, a hard disk, a CD-ROM, an MO, a DVD, a DVD-ROM, a DVD-RAM, a BD (Blu-ray (registered trademark) Disc), or a semiconductor memory.
  • the digital signal may be recorded on these recording media.
  • the computer program or the digital signal may be transmitted via an electric communication line, a wireless or wired communication line, a network represented by the Internet, a data broadcast, or the like.
  • the present disclosure may be a computer system including a microprocessor and a memory, and the memory may store the computer program, and the microprocessor may operate according to the computer program.
  • The program or the digital signal may be recorded on the recording medium and transferred, or transferred via the network or the like, and may thereby be executed by another independent computer system.
  • the mobile robot and environmental sensor node can take various forms.
  • The mobile robot 100 and the environmental sensor nodes 120 (312 to 319) shown in FIG. 2 are used indoors, but the present disclosure is not limited thereto.
  • the mobile robot may be in the form of a vehicle that automatically travels on an outdoor road.
  • In that case, the environmental sensor nodes may be arranged along the road like road signs, or installed in buildings near the road.
  • the present disclosure is applicable to a map generation method, a self-position estimation method, a robot system, and a robot.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Human Computer Interaction (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention relates to a mapping method that: acquires pre-created first map information that includes information on the surroundings of a mobile robot; acquires, from sensors of the mobile robot, second map information that includes information on the surroundings of the mobile robot; receives, from environmental sensor nodes, third map information that includes information on the surroundings of the mobile robot; (i) if a location is present in the third map information whose degree of temporal variation is greater than or equal to a prescribed threshold, either removes the information on the corresponding location from each of the first and second map information, or increases the variance of the information on the corresponding location in each of the first and second map information; and (ii) creates map information using the first and second map information in which the removal or the variance increase has been performed.
PCT/JP2016/003634 2015-08-28 2016-08-08 Procédé de cartographie, procédé de localisation, système de robot, et robot WO2017038012A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP16841053.8A EP3343307B1 (fr) 2015-08-28 2016-08-08 Procédé de cartographie, procédé de localisation, système de robot, et robot
CN201680003184.4A CN106796434B (zh) 2015-08-28 2016-08-08 地图生成方法、自身位置推定方法、机器人系统和机器人
US15/825,159 US10549430B2 (en) 2015-08-28 2017-11-29 Mapping method, localization method, robot system, and robot

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2015169412 2015-08-28
JP2015-169412 2015-08-28
JP2016134284A JP6849330B2 (ja) 2015-08-28 2016-07-06 地図生成方法、自己位置推定方法、ロボットシステム、およびロボット
JP2016-134284 2016-07-06

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/825,159 Continuation US10549430B2 (en) 2015-08-28 2017-11-29 Mapping method, localization method, robot system, and robot

Publications (1)

Publication Number Publication Date
WO2017038012A1 true WO2017038012A1 (fr) 2017-03-09

Family

ID=58186807

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/003634 WO2017038012A1 (fr) 2015-08-28 2016-08-08 Procédé de cartographie, procédé de localisation, système de robot, et robot

Country Status (1)

Country Link
WO (1) WO2017038012A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018179649A1 (fr) * 2017-03-28 2018-10-04 株式会社日立産機システム Système de création de carte et système de robot
WO2019131198A1 (fr) * 2017-12-28 2019-07-04 ソニー株式会社 Dispositif de commande, procédé de commande, programme et corps mobile
CN110069586A (zh) * 2017-09-26 2019-07-30 卡西欧计算机株式会社 地图信息提供装置、系统及方法以及便携型地图发送装置
JP2020095435A (ja) * 2018-12-12 2020-06-18 株式会社日立製作所 移動体
CN113519019A (zh) * 2019-03-15 2021-10-19 日立安斯泰莫株式会社 自身位置推断装置、配备其的自动驾驶系统以及自身生成地图共享装置
US11216973B2 (en) 2019-03-07 2022-01-04 Mitsubishi Heavy Industries, Ltd. Self-localization device, self-localization method, and non-transitory computer-readable medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010511957A (ja) * 2006-12-08 2010-04-15 韓國電子通信研究院 周辺の環境変化に迅速に適応し、環境マップを作成することができる移動体の環境マップ作成装置及びその方法
WO2013002067A1 (fr) * 2011-06-29 2013-01-03 株式会社日立産機システム Robot mobile et système d'auto-estimation de position et d'attitude installé sur un corps mobile


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2018179649A1 (ja) * 2017-03-28 2019-11-07 株式会社日立産機システム 地図作成システム及びロボットシステム
WO2018179649A1 (fr) * 2017-03-28 2018-10-04 株式会社日立産機システム Système de création de carte et système de robot
CN110069586B (zh) * 2017-09-26 2023-06-27 卡西欧计算机株式会社 地图信息提供装置、系统及方法以及便携型地图发送装置
CN110069586A (zh) * 2017-09-26 2019-07-30 卡西欧计算机株式会社 地图信息提供装置、系统及方法以及便携型地图发送装置
JPWO2019131198A1 (ja) * 2017-12-28 2020-12-24 ソニー株式会社 制御装置、および制御方法、プログラム、並びに移動体
JP7151725B2 (ja) 2017-12-28 2022-10-12 ソニーグループ株式会社 制御装置、および制御方法、プログラム、並びに移動体
WO2019131198A1 (fr) * 2017-12-28 2019-07-04 ソニー株式会社 Dispositif de commande, procédé de commande, programme et corps mobile
US11822341B2 (en) 2017-12-28 2023-11-21 Sony Corporation Control device, control method, and mobile object to estimate the mobile object's self-position
JP2020095435A (ja) * 2018-12-12 2020-06-18 株式会社日立製作所 移動体
JP7302966B2 (ja) 2018-12-12 2023-07-04 株式会社日立製作所 移動体
US11216973B2 (en) 2019-03-07 2022-01-04 Mitsubishi Heavy Industries, Ltd. Self-localization device, self-localization method, and non-transitory computer-readable medium
CN113519019A (zh) * 2019-03-15 2021-10-19 日立安斯泰莫株式会社 自身位置推断装置、配备其的自动驾驶系统以及自身生成地图共享装置
CN113519019B (zh) * 2019-03-15 2023-10-20 日立安斯泰莫株式会社 自身位置推断装置、配备其的自动驾驶系统以及自身生成地图共享装置

Similar Documents

Publication Publication Date Title
JP6849330B2 (ja) 地図生成方法、自己位置推定方法、ロボットシステム、およびロボット
WO2017038012A1 (fr) Procédé de cartographie, procédé de localisation, système de robot, et robot
KR101776621B1 (ko) 에지 기반 재조정을 이용하여 이동 로봇의 위치를 인식하기 위한 장치 및 그 방법
KR101725060B1 (ko) 그래디언트 기반 특징점을 이용한 이동 로봇의 위치를 인식하기 위한 장치 및 그 방법
JP4942733B2 (ja) 物体認識及び認識された物体を含む周辺環境情報に基づいたロボットの自己位置推定方法
US8649557B2 (en) Method of mobile platform detecting and tracking dynamic objects and computer-readable medium thereof
Diosi et al. Interactive SLAM using laser and advanced sonar
US8467902B2 (en) Method and apparatus for estimating pose of mobile robot using particle filter
KR101503903B1 (ko) 이동 로봇의 지도 구성 장치 및 방법
KR101784183B1 (ko) ADoG 기반 특징점을 이용한 이동 로봇의 위치를 인식하기 위한 장치 및 그 방법
KR101708061B1 (ko) 제어 장치, 제어 방법 및 기록 매체
WO2019190395A1 (fr) Procédé et système de retour d'un robot mobile autonome déplacé vers sa trajectoire de navigation
JP2019525342A (ja) 自律移動ロボッを制御する方法
JP5892663B2 (ja) 自己位置推定装置、自己位置推定方法、自己位置推定プログラム、及び移動体
CN112639502A (zh) 机器人位姿估计
KR20120046974A (ko) 이동 로봇 및 이동 로봇의 위치인식 및 지도작성 방법
JP2006350776A (ja) 移動体の経路生成装置
JP5276931B2 (ja) 移動体および移動体の位置推定誤り状態からの復帰方法
JP2015055974A (ja) 三次元物体認識装置、三次元物体認識方法、及び移動体
Lee et al. Vision-based kidnap recovery with SLAM for home cleaning robots
Xie et al. A real-time robust global localization for autonomous mobile robots in large environments
JP2015215651A (ja) ロボットおよび自己位置推定方法
WO2019100354A1 (fr) Procédé de détection d'état et appareil associé
Luo et al. Autonomous mobile robot intrinsic navigation based on visual topological map
Berkvens et al. Feasibility of geomagnetic localization and geomagnetic RatSLAM

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16841053

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2016841053

Country of ref document: EP