JP2011209203A - Self-position estimating device and self-position estimating method - Google Patents

Self-position estimating device and self-position estimating method

Info

Publication number
JP2011209203A
JP2011209203A JP2010078891A JP2010078891A JP2011209203A JP 2011209203 A JP2011209203 A JP 2011209203A JP 2010078891 A JP2010078891 A JP 2010078891A JP 2010078891 A JP2010078891 A JP 2010078891A JP 2011209203 A JP2011209203 A JP 2011209203A
Authority
JP
Japan
Prior art keywords
self
estimation
estimation unit
value
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
JP2010078891A
Other languages
Japanese (ja)
Inventor
Atsushi Miyamoto
Kenichiro Nagasaka
敦史 宮本
憲一郎 長阪
Original Assignee
Sony Corp
ソニー株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp
Priority to JP2010078891A
Publication of JP2011209203A
Legal status: Withdrawn

Abstract

A self-position estimation apparatus capable of estimating a self-position with high sampling period and high accuracy is provided.
A self-position estimation apparatus according to the present invention includes: an internal sensor that detects operation information by the movement mechanism of an autonomous mobile device that includes a movement mechanism and can move in a predetermined space; a first self-position estimation unit that estimates the self-position at a first sampling period based on a detection result by an external sensor that detects spatial information about the predetermined space; and a second self-position estimation unit that estimates the self-position at a sampling period higher than the first sampling period based on the self-position estimation result by the first self-position estimation unit and the operation information of the autonomous mobile device detected by the internal sensor.
[Selection] Figure 4

Description

  The present invention relates to a self-position estimation apparatus and a self-position estimation method, and more particularly to a self-position estimation apparatus and a self-position estimation method of an autonomous mobile device that can autonomously move in a work space.

  In recent years, robots that move autonomously and perform various tasks have been realized. Such autonomous mobile robots are expected to perform work in environments that are difficult for humans to access, such as maintenance work at nuclear power plants, rescue work at disaster sites, and work in outer space.

  On the other hand, there is growing expectation that autonomous mobile robots will be introduced into the home to assist users there. For example, as life support for the elderly and for wheelchair users, the burden on the user can be reduced by having an autonomous mobile robot perform work that is physically difficult for the user.

  In order for a robot to move autonomously in a predetermined space such as a room, it must be able to recognize its surrounding environment, estimate its own position in the space, and determine its own movement route. As a method for estimating the self-position, Non-Patent Document 1 below, for example, describes a technique for dynamically generating an environment map representing the three-dimensional positions of objects existing in real space by applying a technique called SLAM (Simultaneous Localization And Mapping), which can simultaneously estimate the position and orientation of a camera and the positions of feature points appearing in the camera image. The basic principle of SLAM using a monocular camera is described in Non-Patent Document 1 below.

  Conventionally, high-accuracy self-position estimation has been performed using the above-described SLAM-based method and other methods. For example, Patent Document 1 discloses a method for recognizing the position of a mobile robot using a plurality of omnidirectional cameras. Patent Document 2 discloses a mobile robot that can recognize its position by detecting markers arranged in the environment. Patent Document 3 discloses a method for estimating the self-position using the temporal change of a posture rotation deviation.

JP 2008-229834 A
JP 2006-346767 A
Japanese Patent No. 4246696

Andrew J. Davison, "Real-Time Simultaneous Localization and Mapping with a Single Camera", Proceedings of the 9th IEEE International Conference on Computer Vision Volume 2, 2003, pp.1403-1410

  However, any of these methods either can acquire information only at a low sampling period due to the characteristics of the sensor, as in Patent Document 1 for example, or requires a long computation time due to the characteristics of the algorithm. For this reason, when self-position estimation is performed by these methods, the self-position estimate can be obtained only at a low sampling period and with a delay. On the other hand, it is also possible to obtain a self-position estimate at a relatively high sampling period by using the drive amount detected from the moving mechanism of the robot. However, since this method estimates the self-position from the difference with respect to the robot's initial position, the error from the initial position grows as the robot moves, and high-accuracy self-position estimation cannot be performed. In addition, slippage of the moving mechanism may cause a difference between the detected value and the actual movement, which may further reduce the estimation accuracy.

  Furthermore, in the control of a system that performs calculations with a period of 1 ms or less, such as robot motion control, if the self-position estimate obtained by the above methods is used as it is, it is input as a discontinuous value and impairs control accuracy. In addition, when an environmental object is recognized based on the self-position estimate, if the update of the estimate is delayed, the accurate position of the environmental object cannot be obtained.

  Accordingly, the present invention has been made in view of the above problems, and an object of the present invention is to provide a new and improved self-position estimation device and self-position estimation method capable of estimating the self-position at a high sampling period and with high accuracy.

  In order to solve the above-described problems, according to an aspect of the present invention, there is provided a self-position estimation device including: an internal sensor that detects operation information by the movement mechanism of an autonomous mobile device that includes a movement mechanism and can move in a predetermined space; a first self-position estimation unit that estimates the self-position at a first sampling period based on a detection result by an external sensor that detects spatial information about the predetermined space; and a second self-position estimation unit that estimates the self-position at a sampling period higher than the first sampling period based on the self-position estimation result by the first self-position estimation unit and the operation information of the autonomous mobile device detected by the internal sensor.

  The self-position estimation apparatus may further include a storage unit that stores spatial information about the space as an environment map. The first self-position estimation unit may include a position estimation unit that estimates the self-position of the autonomous mobile device in the space based on the spatial information of the environment map, the detection value of the internal sensor, and the detection value of the external sensor, and an update unit that updates the environment map in the storage unit based on the self-position estimated by the position estimation unit and the detection value of the external sensor.

  The external sensor may be a distance sensor that acquires the distance from the autonomous mobile device to an object existing in the space, and the first self-position estimation unit may dynamically generate the environment map using SLAM, which simultaneously estimates the position and orientation of the distance sensor and the position of an object recognized based on the distance measured by the distance sensor.

  The second self-position estimation unit may estimate the self-position of the autonomous mobile device in the space using an extended Kalman filter that includes a time evolution model based on the detection value of the internal sensor and an observation model based on the self-position of the autonomous mobile device estimated by the first self-position estimation unit.

  The second self-position estimation unit may generate the observation model using an addition value obtained by adding the detection value of the internal sensor to the self-position estimation result estimated by the first self-position estimation unit.

  The internal sensor may be a position sensor that detects the driving amount of the drive mechanism of the autonomous mobile device.

  The first sampling period may be a detection period of the external sensor, and the second sampling period may be a detection period of the internal sensor.

  In order to solve the above-described problem, according to another aspect of the present invention, there is provided a self-position estimation method including: a step in which a first self-position estimation unit estimates the self-position at a first sampling period based on a detection result by an internal sensor that detects operation information by the movement mechanism of an autonomous mobile device that includes a movement mechanism and can move in a predetermined space, and a detection result by an external sensor that detects spatial information about the predetermined space; and a step in which a second self-position estimation unit estimates the self-position at a sampling period higher than the first sampling period based on the self-position estimation result by the first self-position estimation unit and the operation information of the autonomous mobile device detected by the internal sensor.

  As described above, according to the present invention, it is possible to provide a self-position estimation apparatus and a self-position estimation method capable of estimating the self-position at a high sampling period and with high accuracy.

FIG. 1 is an explanatory diagram illustrating an outline of self-position estimation of an autonomous mobile robot by a self-position estimation apparatus according to an embodiment of the present invention. FIG. 2 is a block diagram showing the functional configuration of the self-position estimation apparatus according to the embodiment. FIG. 3 is a flowchart showing a self-position estimation method by the self-position estimation apparatus according to the embodiment. FIG. 4 is a sequence diagram showing the self-position estimation method by the self-position estimation apparatus according to the embodiment. FIG. 5 is an explanatory diagram showing how the EKF self-position estimation unit calculates the self-position estimate. FIG. 6 is a block diagram showing an example of the hardware configuration of the self-position estimation apparatus according to the embodiment.

  Exemplary embodiments of the present invention will be described below in detail with reference to the accompanying drawings. In this specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and redundant description is omitted.

The description will be made in the following order.
1. Outline of self-position estimation by the self-position estimation device
2. Configuration of the self-position estimation device
3. Self-position estimation method
4. Hardware configuration example

<1. Outline of self-position estimation by self-position estimation device>
First, an outline of self-position estimation in a space using the self-position estimation apparatus 100 according to the present embodiment will be described based on FIG. 1. FIG. 1 is an explanatory diagram illustrating an outline of self-position estimation of the autonomous mobile robot 10 by the self-position estimation apparatus 100 according to the present embodiment.

  The self-position estimation apparatus 100 according to the present embodiment is an apparatus that estimates its own position in the space where it exists. For example, as shown in FIG. 1, an autonomous mobile robot (hereinafter also simply referred to as a "robot") 10 in a room can estimate its position in the room with the self-position estimation apparatus 100. The autonomous mobile robot 10 includes a moving mechanism (for example, the wheels 14 shown in FIG. 2) and a distance sensor (for example, the laser range finder 12 shown in FIG. 2) that acquires the distance (distance measurement value) to objects in order to recognize the space. The robot 10 can move around the room autonomously by estimating its position with the self-position estimation apparatus 100 using the driving amount of the moving mechanism and the acquired distance measurement values.

  In addition, the self-position estimation apparatus 100 generates an environment map that represents the state of the room based on object information recognized by the distance sensor. The environment map stores information related to each object recognized in the room, such as the desk 1 or the chest 3, for example its position, shape, and weight. By referring to such an environment map, the robot 10 can recognize the state of the space in which it exists and determine a movement route according to its purpose.

  At this time, in order to increase the accuracy of the operation of the robot 10, it is necessary to correctly recognize the self-position in the space where the robot 10 exists and to accurately calculate the control amount for operating the robot 10. In the motion control of the robot 10 and in object recognition based on the self-position estimate, the control amount must be calculated at a high sampling period, and therefore the self-position estimate also needs to be updated at a high sampling period. The self-position estimation apparatus 100 according to the present embodiment can calculate a high-accuracy self-position estimate at a high sampling period that can be applied to the operation control of the robot 10. Hereinafter, the self-position estimation apparatus 100 according to the present embodiment and the self-position estimation method using it will be described in detail based on FIGS. 2 to 5.

<2. Configuration of self-position estimation device>
First, the functional configuration of the self-position estimation apparatus 100 according to the present embodiment will be described based on FIG. 2. FIG. 2 is a block diagram illustrating the functional configuration of the self-position estimation apparatus 100 according to the present embodiment.

  As shown in FIG. 2, the self-position estimation apparatus 100 includes a wheel encoder acquisition unit 110, a SLAM self-position estimation unit 120, an environment map storage unit 130, an EKF self-position estimation unit 140, and a motion control unit 150.

  The wheel encoder acquisition unit 110 is an internal sensor that detects operation information by the moving mechanism of the autonomous mobile robot 10. The autonomous mobile robot 10 according to the present embodiment includes a plurality of wheels 14 as its moving mechanism. Each wheel 14 is provided with an encoder that detects the rotation of the actuator that rotationally drives the wheel 14. The wheel encoder acquisition unit 110 acquires the detection value (encoder value) of the encoder provided on each wheel 14, and then outputs the detected values to the SLAM self-position estimation unit 120 and the EKF self-position estimation unit 140.

  The SLAM self-position estimation unit 120 calculates a self-position estimate using the above-described SLAM, based on the environment map and the detection results of the internal and external sensors. The SLAM self-position estimation unit 120 includes a position estimation unit 122 that performs the self-position estimation and an environment map update unit 124 that updates the environment map stored in the environment map storage unit 130.

  First, the position estimation unit 122 calculates the wheel speed from the value acquired by the wheel encoder acquisition unit 110, which reads the encoder provided on the wheels 14 (the internal sensor), and analytically updates the self-position. The position estimation unit 122 then performs matching between the distance measurement values detected by the laser range finder 12 (the external sensor) and the environment map in the environment map storage unit 130, and calculates the final self-position estimate of the SLAM self-position estimation unit 120. The self-position estimation by the position estimation unit 122 is performed at a low sampling period, for example 100 ms. The self-position estimate calculated by the position estimation unit 122 is output to the environment map update unit 124 and the EKF self-position estimation unit 140.

  The environment map update unit 124 updates the environment map based on the distance measurement values detected by the laser range finder 12 and the self-position estimate produced by the position estimation unit 122. Specifically, the environment map update unit 124 calculates the position and orientation of objects in the space from the input distance measurement values of the laser range finder 12 and the self-position estimate, and updates the environment map in the environment map storage unit 130.

  The environment map storage unit 130 stores an environment map that represents the three-dimensional positions of objects existing in the real space. The environment map stores, for example, information on objects (for example, the desk 1 or the chest 3) recognized from measurements by the laser range finder 12 or from an image captured by a camera (not shown). Examples of the object information include three-dimensional shape, position, and posture information.
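  As an illustration only (the patent does not specify a concrete data format), an environment-map entry of the kind described above might be represented as follows; the type name `MapObject` and its fields are hypothetical, a minimal sketch rather than the apparatus's actual storage layout.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class MapObject:
    """Hypothetical environment-map entry for one recognized object (e.g. desk 1 or chest 3)."""
    name: str                                # object label, e.g. "desk"
    position: Tuple[float, float, float]     # 3D position in the room frame [m]
    orientation: Tuple[float, float, float]  # posture as roll/pitch/yaw [rad]
    shape: Tuple[float, float, float]        # 3D shape approximated by a bounding box (w, d, h) [m]

# The environment map storage unit 130 could then hold a simple collection of such entries:
environment_map = [
    MapObject("desk", (1.0, 2.0, 0.0), (0.0, 0.0, 1.57), (1.2, 0.6, 0.7)),
    MapObject("chest", (3.5, 0.5, 0.0), (0.0, 0.0, 0.0), (0.8, 0.4, 1.0)),
]
```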

  The EKF self-position estimation unit 140 calculates a self-position estimate based on the self-position estimate calculated by the SLAM self-position estimation unit 120 and the value acquired by the wheel encoder acquisition unit 110, which reads the encoder provided on the wheels 14 (the internal sensor). The EKF self-position estimation unit 140 performs high-accuracy self-position estimation at a high sampling period using an extended Kalman filter (EKF).

  The extended Kalman filter has a time evolution model based on the detection value of the internal sensor and an observation model based on the self-position estimation value calculated by the SLAM self-position estimation unit 120. The EKF self-position estimation unit 140 can calculate a self-position estimation value by identifying the time evolution model and the observation model. The self-position estimation in the EKF self-position estimation unit 140 is performed at the acquisition timing (for example, 1 ms cycle) of the wheel encoder acquisition unit 110. Details of the self-position estimation processing by the EKF self-position estimation unit 140 will be described later. The self-position estimation value calculated by the EKF self-position estimation unit 140 is output to the motion control unit 150.

  The motion control unit 150 performs motion control of the autonomous mobile robot 10 using the self-position estimation value calculated by the EKF self-position estimation unit 140. The motion control unit 150 calculates a motion control value for causing the robot 10 to perform a desired motion, and a drive mechanism (not shown) such as a motor that rotates the wheels 14 of the robot 10 based on the motion control value. Drive.

<3. Self-position estimation method>
Next, a self-position estimation method by the self-position estimation apparatus 100 according to the present embodiment will be described based on FIGS. 3 to 5. FIG. 3 is a flowchart showing the self-position estimation method by the self-position estimation apparatus 100 according to the present embodiment. FIG. 4 is a sequence diagram showing the self-position estimation method by the self-position estimation apparatus 100 according to the present embodiment. FIG. 5 is an explanatory diagram illustrating how the EKF self-position estimation unit 140 calculates the self-position estimate.

  In the self-position estimation method performed by the self-position estimation apparatus 100, as shown in FIG. 3, first, a detection value (encoder value) of the encoder that measures the rotational position of the wheels 14 is acquired by the wheel encoder acquisition unit 110 (step S100). The wheel encoder acquisition unit 110 detects the rotational position of the wheels 14, which are the moving mechanism of the robot 10, by means of the encoder, and outputs the detected value as an encoder value to the SLAM self-position estimation unit 120 and the EKF self-position estimation unit 140.

  Here, as shown in FIG. 4, the wheel encoder acquisition unit 110 can acquire the encoder value at a high sampling period, for example every 1 ms. The wheel encoder acquisition unit 110 outputs the acquired encoder value in accordance with the self-position estimation processing period of each output destination. That is, in the present embodiment, the encoder value is output from the wheel encoder acquisition unit 110 to the SLAM self-position estimation unit 120 every 100 ms, because the SLAM self-position estimation unit 120 performs self-position estimation at a low sampling period (for example, 100 ms). On the other hand, the encoder value is output from the wheel encoder acquisition unit 110 to the EKF self-position estimation unit 140 every 1 ms, because the EKF self-position estimation unit 140 performs self-position estimation at a high sampling period (for example, 1 ms).

  Meanwhile, the laser range finder 12 acquires the distance from the robot 10 to objects existing in the space as distance measurement values (step S110). Unlike the encoder, the laser range finder 12 cannot acquire detection values at a high sampling period, and acquires distance measurement values at a relatively low sampling period, for example every 100 ms as shown in FIG. 4. The laser range finder 12 outputs the acquired distance measurement values to the SLAM self-position estimation unit 120.

  The SLAM self-position estimation unit 120 executes self-position estimation processing at the timing when the distance measurement values from the laser range finder 12 are input (step S120). As shown in FIG. 4, the SLAM self-position estimation unit 120 performs the self-position estimation processing based on the distance measurement values of the laser range finder 12 and the encoder value of the wheel encoder acquisition unit 110 at the time the distance measurement values were taken. The self-position estimation by SLAM can be performed using, for example, the technique described in Non-Patent Document 1. The self-position estimate calculated by the SLAM self-position estimation unit 120 is output to the EKF self-position estimation unit 140.

  Then, the EKF self-position estimation unit 140 calculates a self-position estimate based on the principle of the extended Kalman filter, using the self-position estimate calculated by the SLAM self-position estimation unit 120 and the encoder value obtained by the wheel encoder acquisition unit 110 (step S130). As described above, the encoder value can be acquired at a high sampling period, whereas the distance measurement values of the laser range finder 12 can be acquired only at a lower sampling period. Therefore, when SLAM is used, the self-position can be estimated with high accuracy, but in situations that require real-time performance, such as motion control of the robot 10, it is desirable to be able to estimate the self-position at a higher sampling period. The EKF self-position estimation unit 140 therefore uses the encoder value, which can be acquired at a high sampling period, together with the self-position estimate produced with high accuracy by the SLAM self-position estimation unit 120, so that a high-accuracy self-position estimate can be obtained at a high sampling period.

  In calculating the self-position estimate of the robot 10 using the extended Kalman filter, the EKF self-position estimation unit 140 first calculates an odometry output value based on the encoder value. The odometry output value is the speed and angular velocity of the wheels 14 calculated from the rotational positions of the wheels 14 of the robot 10 input from the wheel encoder acquisition unit 110. The EKF self-position estimation unit 140 can calculate the odometry output value at the sampling period of the encoder. The EKF self-position estimation unit 140 then constructs the time evolution model of the extended Kalman filter based on the odometry output value.
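  For illustration, the following is a minimal sketch of how an odometry output value (translation speed v and rotation speed w) could be computed from wheel-encoder values for a differential-drive platform. The wheel radius, track width, and encoder resolution are hypothetical parameters, not values given in the patent.

```python
import math

TICKS_PER_REV = 4096   # hypothetical encoder resolution [ticks/revolution]
WHEEL_RADIUS = 0.05    # hypothetical wheel radius [m]
TRACK_WIDTH = 0.30     # hypothetical distance between left and right wheels [m]

def odometry_output(d_ticks_left: int, d_ticks_right: int, dt: float):
    """Convert encoder increments over dt seconds into (v, w)."""
    # Wheel angular displacement [rad] from encoder ticks
    dphi_l = 2.0 * math.pi * d_ticks_left / TICKS_PER_REV
    dphi_r = 2.0 * math.pi * d_ticks_right / TICKS_PER_REV
    # Wheel surface speeds [m/s]
    v_l = WHEEL_RADIUS * dphi_l / dt
    v_r = WHEEL_RADIUS * dphi_r / dt
    v = (v_r + v_l) / 2.0          # translation speed of the robot body
    w = (v_r - v_l) / TRACK_WIDTH  # rotation speed of the robot body
    return v, w

# Example: encoder increments sampled every 1 ms
v, w = odometry_output(d_ticks_left=3, d_ticks_right=5, dt=0.001)
```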

  Furthermore, the EKF self-position estimation unit 140 performs a process of adding the odometry output value to the SLAM self-position estimate calculated by the SLAM self-position estimation unit 120. As shown in FIGS. 4 and 5, the EKF self-position estimation unit 140 fuses the odometry output value and the SLAM self-position estimate to calculate its self-position estimate, but these two values are acquired at different timings. For this reason, if the SLAM self-position estimate were used as it is, it would be input to the extended Kalman filter as a discontinuous value.

  Therefore, the same odometry output value of the encoder that is used for the self-position estimation process in the EKF self-position estimation unit 140 is added to the self-position estimate from the SLAM self-position estimation unit 120. In this way, a value that includes the movement of the robot 10 since the estimate was produced by the SLAM self-position estimation unit 120 is obtained at the same period as the sampling period of the encoder (for example, 1 ms), and the observation model is constructed based on this value. The EKF self-position estimation unit 140 can therefore calculate the self-position estimate from models based on values acquired at the same timing.
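  A minimal sketch of this synchronization step is shown below: the odometry displacement accumulated since the last SLAM estimate is composed with that estimate, so that the observation fed to the extended Kalman filter advances at the encoder period. The pose-composition helper and function names are assumptions about how the addition in FIG. 5 could be realized, not the patent's exact formulation.

```python
import math

def compose(pose, v, w, dt):
    """Advance a pose (x, y, theta) by one odometry step (v, w) over dt seconds."""
    x, y, theta = pose
    return (x + v * dt * math.cos(theta),
            y + v * dt * math.sin(theta),
            theta + w * dt)

slam_pose = (0.0, 0.0, 0.0)   # latest SLAM estimate, updated roughly every 100 ms
observation = slam_pose       # value handed to the EKF observation model every 1 ms

def on_encoder_sample(v, w, dt=0.001):
    """Called at the encoder period: add the new odometry step to the SLAM estimate."""
    global observation
    observation = compose(observation, v, w, dt)
    return observation

def on_slam_estimate(new_slam_pose):
    """Called when a new SLAM estimate arrives: restart accumulation from it."""
    global observation, slam_pose
    slam_pose = new_slam_pose
    observation = new_slam_pose
```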

The EKF self-position estimation unit 140 calculates the self-position estimate using an extended Kalman filter that has a time evolution model based on the detection values of the wheels 14 (the internal sensor) and an observation model based on the self-position estimate calculated by the SLAM self-position estimation unit 120. The extended Kalman filter is an observer for estimating the state of a dynamic system from observations containing errors. The extended Kalman filter consists of a prediction phase, which calculates the estimated state at the current time from the estimated state at the previous time, and an update phase, which corrects the estimate using the observation at the current time; the output values x_t and P_t are obtained from the input values x_{t-1}, P_{t-1}, u_t, and y_t. The prediction phase and the update phase are represented by the following Equations 1 to 5.
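  The equations themselves are not reproduced in this text; the following is the standard extended Kalman filter form consistent with the variable definitions below, given as a reconstruction (the labels (1) to (5) follow the patent's numbering) rather than a verbatim copy of the patent's formulas:

\hat{x}^-_t = g(\hat{x}_{t-1}, u_t)   (1)
P^-_t = A_t P_{t-1} A_t^\top + R   (2)
K_t = P^-_t C_t^\top (C_t P^-_t C_t^\top + Q)^{-1}   (3)
\hat{x}_t = \hat{x}^-_t + K_t (y_t - h(\hat{x}^-_t))   (4)
P_t = (I - K_t C_t) P^-_t   (5)

Here Equations 1 and 2 constitute the prediction phase and Equations 3 to 5 the update phase.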

  Here, x is the state estimate, P is the error covariance, u is the control value, y is the observed value, g() is the state prediction, h() is the observation prediction, A is the Jacobian matrix of the time evolution model, C is the Jacobian matrix of the observation model, R is the covariance of the time evolution model, Q is the covariance of the observation model, and K is the Kalman gain.

  Specifically, as shown in FIG. 5, the EKF self-position estimation unit 140 performs odometry self-position estimation using a time evolution model based on the wheel speed (odometry output value) calculated from the encoder value obtained by the wheel encoder acquisition unit 110. The time evolution model is expressed by the following Equation 6.
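  The equation is not reproduced in this text; one standard form consistent with the variable definitions below, given as a reconstruction of Equation 6, is a unicycle-type odometry update with process noise:

x_t = g(x_{t-1}, u_t) + \varepsilon = \begin{pmatrix} x_{t-1} + v_t \Delta t \cos\theta_{t-1} \\ y_{t-1} + v_t \Delta t \sin\theta_{t-1} \\ \theta_{t-1} + w_t \Delta t \end{pmatrix} + \varepsilon   (6)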

Here, x is the position in the x-axis direction, y is the position in the y-axis direction, θ is the turning angle, and x = (x, y, θ)^T. Further, v is the translation speed, w is the rotation speed, and ε is the variance. That is, Equation 6 represents the self-position estimate at time t estimated based on the odometry output value.

  Furthermore, the EKF self-position estimation unit 140 obtains the position and orientation of the robot 10 from the self-position estimate produced by the SLAM self-position estimation unit 120, and performs the SLAM-based self-position estimation using an observation model based on this estimated position and orientation of the robot 10 and the wheel speed. The observation model is expressed by the following Equation 7.
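  The equation is not reproduced in this text; consistent with the explanation below, one plausible reconstruction of Equation 7 is an observation that composes the most recent SLAM estimate with the odometry accumulated since it, plus observation noise:

y_t = \left( x^{SLAM}_{t_u} + \sum_{k=t_u+1}^{t} \Delta x^{odo}_k \right) + \delta, \qquad h(x_t) = x_t   (7)

where \Delta x^{odo}_k is the pose displacement computed from the wheel speed at encoder step k.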

  Here, u denotes the time of the immediately preceding SLAM self-position estimation, and δ is the variance. In other words, Equation 7 represents the self-position estimate at time t estimated based on the SLAM self-position estimate. As can be seen from the two terms inside the parentheses of the first term on the right-hand side of Equation 7, the travel distance produced by the wheels 14 from time t_u+1 to the current time t (the right term) is added to the SLAM self-position estimate calculated at time t_u (the left term) (the addition processing in the integrator of FIG. 5). In this way, the two models of the extended Kalman filter are synchronized, which improves the estimation accuracy.

  The EKF self-position estimation unit 140 calculates the estimated position of the robot 10 in the space according to Equations 1 to 5, using the models expressed by Equations 6 and 7. In this way, the EKF self-position estimation unit 140 can estimate the self-position with high accuracy and at a high sampling period by using the high-accuracy SLAM self-position estimate together with the odometry output value acquired at a high sampling period.
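  Putting the two models together, the following is a compact sketch of the fusion described by Equations 1 to 7: prediction at the encoder period with the odometry time evolution model, and correction whenever a synchronized SLAM-plus-odometry observation is available. The noise covariances and numerical values are hypothetical placeholders, and the code is an illustrative reading of the procedure rather than the patent's implementation.

```python
import numpy as np

R = np.diag([1e-5, 1e-5, 1e-6])   # hypothetical process-noise covariance (time evolution model)
Q = np.diag([1e-3, 1e-3, 1e-4])   # hypothetical observation-noise covariance (observation model)

def wrap(a):
    """Wrap an angle to (-pi, pi]."""
    return (a + np.pi) % (2.0 * np.pi) - np.pi

def ekf_predict(x, P, v, w, dt):
    """Prediction phase (Equations 1 and 2) using the odometry time evolution model (Equation 6)."""
    px, py, th = x
    x_pred = np.array([px + v * dt * np.cos(th),
                       py + v * dt * np.sin(th),
                       wrap(th + w * dt)])
    A = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                  [0.0, 1.0,  v * dt * np.cos(th)],
                  [0.0, 0.0,  1.0]])
    return x_pred, A @ P @ A.T + R

def ekf_update(x, P, y):
    """Update phase (Equations 3 to 5); the pose is observed directly, so C = I (Equation 7)."""
    C = np.eye(3)
    K = P @ C.T @ np.linalg.inv(C @ P @ C.T + Q)
    innovation = y - x
    innovation[2] = wrap(innovation[2])
    x_new = x + K @ innovation
    x_new[2] = wrap(x_new[2])
    return x_new, (np.eye(3) - K @ C) @ P

# Usage at each 1 ms encoder sample: predict with (v, w); correct with the synchronized
# SLAM-plus-odometry observation y (see the earlier sketch) when one is available.
x, P = np.zeros(3), np.eye(3) * 1e-3
x, P = ekf_predict(x, P, v=0.2, w=0.1, dt=0.001)
x, P = ekf_update(x, P, y=np.array([0.0002, 0.0, 0.0001]))
```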

  Thereafter, the EKF self-position estimation unit 140 outputs the calculated self-position estimation value to the motion control unit 150. Based on the self position estimated by the EKF self position estimation unit 140, the motion control unit 150 calculates a control value of a drive mechanism for the robot 10 to move and work, and controls the robot 10 (step S140).

  The self-position estimation method by the self-position estimation apparatus 100 according to the present embodiment has been described above. According to this method, the self-position estimation result using the SLAM at the low sampling period and the odometry output from the wheel encoder acquisition unit 110 that can be acquired at the high sampling period are fused by the extended Kalman filter. As a result, the self-position estimation result estimated in the low sampling period can be corrected to be closer to the actual position, and high-precision self-position estimation can be performed in the high sampling period.

  Further, in the self-position estimation method of the present embodiment, the odometry output from the wheel encoder acquisition unit 110 is added to the SLAM self-position estimation result before it is input to the EKF self-position estimation unit 140. As a result, the self-position estimate produced by the EKF self-position estimation unit 140 does not become a discontinuous value, and control such as motion control of the robot 10 can be performed using the self-position estimate with high accuracy.

<4. Hardware configuration example>
Part of the processing by the self-position estimation apparatus 100 according to the present embodiment can be executed by hardware, and part can be executed by software. In this case, the self-position estimation apparatus 100 can be configured as a computer as shown in FIG. 6. Hereinafter, a hardware configuration example of the self-position estimation apparatus 100 according to the present embodiment will be described with reference to FIG. 6.

  As described above, the self-position estimation apparatus 100 according to the present embodiment can be realized by an information processing apparatus such as a computer, and can be provided in the autonomous mobile robot 10. As shown in FIG. 6, the self-position estimation apparatus 100 includes a CPU (Central Processing Unit) 101, a ROM (Read Only Memory) 102, a RAM (Random Access Memory) 103, and a host bus 104a. The self-position estimation apparatus 100 further includes a bridge 104, an external bus 104b, an interface 105, an input device 106, an output device 107, a storage device (HDD) 108, a drive 109, a connection port 111, and a communication device 113.

  The CPU 101 functions as an arithmetic processing device and a control device, and controls the overall operation within the self-position estimation device 100 according to various programs. Further, the CPU 101 may be a microprocessor. The ROM 102 stores programs and calculation parameters used by the CPU 101. The RAM 103 temporarily stores programs used in the execution of the CPU 101, parameters that change as appropriate during the execution, and the like. These are connected to each other by a host bus 104a including a CPU bus.

  The host bus 104a is connected to an external bus 104b such as a PCI (Peripheral Component Interconnect/Interface) bus via the bridge 104. Note that the host bus 104a, the bridge 104, and the external bus 104b are not necessarily configured separately, and their functions may be implemented on a single bus.

  The input device 106 includes input means for the user to input information, such as a mouse, a keyboard, a touch panel, buttons, a microphone, switches, and levers, and an input control circuit that generates an input signal based on the user's input and outputs it to the CPU 101. By operating the input device 106, the user can input various data to the robot's self-position estimation device 100 and instruct it to perform processing operations.

  The output device 107 includes, for example, a display device such as a liquid crystal display (LCD) device, an OLED (Organic Light Emitting Diode) device, or a lamp, and an audio output device such as a speaker.

  The storage device 108 is an example of a storage unit of the self-position estimation device 100 and is a device for storing data. The storage device 108 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like. The storage device 108 is composed of, for example, an HDD (Hard Disk Drive). The storage device 108 drives a hard disk and stores programs executed by the CPU 101 and various data.

  The drive 109 is a storage medium reader / writer, and is built in or externally attached to the self-position estimation apparatus 100. The drive 109 reads information recorded on a mounted removable recording medium such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the information to the RAM 103.

  The connection port 111 is an interface connected to external devices, for example a connection port for an external device capable of transmitting data via USB (Universal Serial Bus). The communication device 113 is, for example, a communication interface configured with a communication device or the like for connecting to the communication network 15. The communication device 113 may be a wireless LAN (Local Area Network) compatible communication device, a wireless USB compatible communication device, or a wired communication device that performs wired communication.

  Note that the self-position estimation apparatus 100 according to the present embodiment does not necessarily include all of the hardware illustrated in FIG. 6. For example, the input device 106, the output device 107, the storage device 108, the drive 109, the connection port 111, the communication device 113, and the like may be connected to the self-position estimation device 100 and used as devices separate from the self-position estimation device 100.

  The preferred embodiments of the present invention have been described in detail above with reference to the accompanying drawings, but the present invention is not limited to these examples. It is obvious that a person having ordinary knowledge in the technical field to which the present invention pertains can conceive of various changes or modifications within the scope of the technical idea described in the claims, and it is naturally understood that these also belong to the technical scope of the present invention.

  For example, in the above embodiment, the self-position estimation value by SLAM and the odometry output value by the internal sensor are fused using the extended Kalman filter, but the present invention is not limited to such an example. For example, a particle filter or the like may be used instead of the extended Kalman filter. Note that the use of a particle filter greatly increases the amount of calculation, so it is preferable to use an extended Kalman filter.

  In the above embodiment, the laser range finder 12 is used as the external sensor, but the present invention is not limited to this example; for example, a stereo camera or the like may be used.

  Furthermore, in the above embodiment, the self-position estimation process by the SLAM self-position estimation unit 120 is performed at a 100 ms period and the self-position estimation process by the EKF self-position estimation unit 140 is performed at a 1 ms period, but the present invention is not limited to this example. The present invention is applicable as long as the sampling period of the EKF self-position estimation unit 140 is higher than the sampling period of the SLAM self-position estimation unit 120.

DESCRIPTION OF SYMBOLS 10 Autonomous mobile robot 12 Laser range finder 14 Wheel 100 Self-position estimation apparatus 110 Wheel encoder acquisition unit 120 SLAM self-position estimation unit 122 Position estimation unit 124 Environment map update unit 130 Environment map storage unit 140 EKF self-position estimation unit 150 Motion control unit

Claims (8)

  1. Based on a detection result by an internal sensor that detects operation information by the movement mechanism of an autonomous mobile device that includes a movement mechanism and can move in a predetermined space, and a detection result by an external sensor that detects spatial information in the predetermined space, a first self-position estimation unit that estimates the self-position at a first sampling period;
    Based on the self-position estimation result by the first self-position estimation unit and the operation information of the autonomous mobile device detected by the internal sensor, a second self-position estimation unit that estimates the self-position at a sampling period higher than the first sampling period;
    A self-position estimation apparatus comprising:
  2. A storage unit for storing space information in the space as an environment map;
    The first self-position estimation unit includes:
    A position estimation unit that estimates the self-position of the autonomous mobile device in the space based on the spatial information of the environment map, the detection value of the internal sensor, and the detection value of the external sensor;
    An update unit that updates the environment map of the storage unit based on the self-position estimated by the position estimation unit and the detection value of the external sensor,
    The self-position estimation apparatus according to claim 1, comprising:
  3. The external sensor is a distance sensor that acquires a distance from the autonomous mobile device to an object existing in the space,
    The first self-position estimation unit dynamically generates the environment map using SLAM that simultaneously estimates the position and orientation of the distance sensor and the position of an object recognized based on a distance measurement value by the distance sensor. The self-position estimation apparatus according to claim 2.
  4.   The second self-position estimation unit includes a time development model based on a detection value of the inner world sensor and an observation model based on the self-position of the autonomous mobile device estimated by the first self-position estimation unit. The self-position estimation apparatus according to any one of claims 1 to 3, wherein a self-position in the space of the autonomous mobile apparatus is estimated using an extended Kalman filter.
  5.   The second self-position estimation unit generates the observation model using an addition value obtained by adding the detection value of the inner world sensor to the self-position result estimated by the first self-position estimation unit. The self-position estimation apparatus according to claim 4.
  6.   The self-position estimation apparatus according to claim 1, wherein the inner world sensor is a position sensor that detects a driving amount of a driving mechanism of the autonomous mobile device.
  7. The first sampling period is a detection period of the external sensor,
    The self-position estimation apparatus according to claim 1, wherein the second sampling period is a detection period of the inner world sensor.
  8. Based on a detection result by an internal sensor that detects operation information by the movement mechanism of an autonomous mobile device that includes a movement mechanism and can move in a predetermined space, and a detection result by an external sensor that detects spatial information in the predetermined space, estimating, by a first self-position estimation unit, the self-position at a first sampling period;
    Based on the self-position estimation result by the first self-position estimation unit and the operation information of the autonomous mobile device detected by the internal sensor, estimating, by a second self-position estimation unit, the self-position at a sampling period higher than the first sampling period;
    A self-position estimation method including:
JP2010078891A 2010-03-30 2010-03-30 Self-position estimating device and self-position estimating method Withdrawn JP2011209203A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2010078891A JP2011209203A (en) 2010-03-30 2010-03-30 Self-position estimating device and self-position estimating method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2010078891A JP2011209203A (en) 2010-03-30 2010-03-30 Self-position estimating device and self-position estimating method

Publications (1)

Publication Number Publication Date
JP2011209203A true JP2011209203A (en) 2011-10-20

Family

ID=44940416

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2010078891A Withdrawn JP2011209203A (en) 2010-03-30 2010-03-30 Self-position estimating device and self-position estimating method

Country Status (1)

Country Link
JP (1) JP2011209203A (en)


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012103819A (en) * 2010-11-08 2012-05-31 Fujitsu Ltd Position estimation method, position estimation device and program
WO2012153629A1 (en) * 2011-05-12 2012-11-15 株式会社Ihi Device and method for controlling prediction of motion
JP2012236254A (en) * 2011-05-12 2012-12-06 Ihi Corp Device and method for holding moving body
US9108321B2 (en) 2011-05-12 2015-08-18 Ihi Corporation Motion prediction control device and method
JP2012247835A (en) * 2011-05-25 2012-12-13 Ihi Corp Robot movement prediction control method and device
JP2012245568A (en) * 2011-05-25 2012-12-13 Ihi Corp Device and method for controlling and predicting motion
KR101454824B1 (en) * 2013-04-03 2014-11-03 국방과학연구소 System and Method for estimating positions of an autonomous mobile vehicle
JP2015152991A (en) * 2014-02-12 2015-08-24 株式会社デンソーアイティーラボラトリ Self-location estimation device and self-location estimation method
WO2016006588A1 (en) * 2014-07-09 2016-01-14 パイオニア株式会社 Mobile object control device, mobile object control method, mobile object control program, and recording medium
JPWO2016006588A1 (en) * 2014-07-09 2017-04-27 パイオニア株式会社 Mobile body control device, mobile body control method, mobile body control program, and recording medium
US10119804B2 (en) 2014-11-12 2018-11-06 Murata Machinery, Ltd. Moving amount estimating apparatus, autonomous mobile body, and moving amount estimating method
WO2019138640A1 (en) * 2018-01-10 2019-07-18 ソニー株式会社 Information processing device, information processing method, and program
WO2020049945A1 (en) * 2018-09-04 2020-03-12 ソニー株式会社 Self-localization device, information processing device, and self-localization method

Similar Documents

Publication Publication Date Title
US10354396B1 (en) Visual-inertial positional awareness for autonomous and non-autonomous device
US10732647B2 (en) Multi-sensor fusion for robust autonomous flight in indoor and outdoor environments with a rotorcraft micro-aerial vehicle (MAV)
US10423832B1 (en) Visual-inertial positional awareness for autonomous and non-autonomous tracking
US10453213B2 (en) Mapping optimization in autonomous and non-autonomous platforms
Hilsenbeck et al. Graph-based data fusion of pedometer and WiFi measurements for mobile indoor positioning
JP2016157473A (en) Adaptive mapping using spatial concentration of sensor data
US10300603B2 (en) Methods and apparatus for early sensory integration and robust acquisition of real world knowledge
Shen et al. Multi-sensor fusion for robust autonomous flight in indoor and outdoor environments with a rotorcraft MAV
CN104236548B (en) Autonomous navigation method in a kind of MAV room
US10750155B2 (en) Mapping and tracking system with features in three-dimensional space
JP6436604B2 (en) Method and system implemented by a computing system
Santos et al. An evaluation of 2D SLAM techniques available in robot operating system
KR102016551B1 (en) Apparatus and method for estimating position
Weiss et al. Real-time metric state estimation for modular vision-inertial systems
Van den Bergh et al. Real-time 3D hand gesture interaction with a robot for understanding directions from humans
Heng et al. Self-calibration and visual slam with a multi-camera system on a micro aerial vehicle
JP6198230B2 (en) Head posture tracking using depth camera
Ligorio et al. Extended Kalman filter-based methods for pose estimation using visual, inertial and magnetic sensors: Comparative analysis and performance evaluation
KR20150119337A (en) Generation of 3d models of an environment
US8854594B2 (en) System and method for tracking
Schmid et al. Autonomous vision‐based micro air vehicle for indoor and outdoor navigation
Nishiwaki et al. The experimental humanoid robot H7: a research platform for autonomous behaviour
US9155675B2 (en) Portable robotic device
US9224043B2 (en) Map generation apparatus, map generation method, moving method for moving body, and robot apparatus
US10618165B1 (en) Tooltip stabilization

Legal Events

Date Code Title Description
A300 Withdrawal of application because of no request for examination

Free format text: JAPANESE INTERMEDIATE CODE: A300

Effective date: 20130604