WO2021153176A1 - Autonomous mobile device, autonomous movement control method, and program - Google Patents

Autonomous mobile device, autonomous movement control method, and program

Info

Publication number
WO2021153176A1
WO2021153176A1 (PCT/JP2021/000295)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
autonomous
image analysis
yaw angle
traveling direction
Prior art date
Application number
PCT/JP2021/000295
Other languages
English (en)
Japanese (ja)
Inventor
希彰 町中
中井 幹夫
Original Assignee
ソニーグループ株式会社 (Sony Group Corporation)
Priority date
Filing date
Publication date
Application filed by ソニーグループ株式会社 (Sony Group Corporation)
Publication of WO2021153176A1

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions

Definitions

  • This disclosure relates to an autonomous mobile device, an autonomous movement control method, and a program. More specifically, it relates to an autonomous mobile device that moves autonomously by analyzing captured images of a ceiling, an autonomous movement control method, and a program.
  • For example, autonomous mobile robots are used for parcel delivery processing.
  • An example of such related art is Patent Document 1 (Patent No. 4838824).
  • Patent Document 1 discloses a configuration in which a captured image of a marker installed above an autonomous mobile robot is analyzed, the relative position between the robot and the marker is calculated, and the robot is navigated based on that position and orientation.
  • The configuration of Patent Document 1 enables autonomous movement along an absolute direction based on a landmark, without depending on self-position information obtained from, for example, an inertial measurement unit (IMU). Since errors accumulate in IMU-based self-position information, such a landmark-based approach can avoid errors caused by this accumulation.
  • However, the configuration of Patent Document 1 requires markers to be installed in advance. When the robot's moving range is wide, markers must be installed over the entire area, which imposes a large installation burden. Furthermore, depending on the field of view of the camera mounted on the robot and the shooting distance, a large number of markers may be needed, which makes the installation effort even greater.
  • The present disclosure has been made in view of the above problems, and an object of the present disclosure is to provide an autonomous mobile device, an autonomous movement control method, and a program that enable autonomous traveling without preprocessing such as installing markers in advance.
  • The first aspect of the present disclosure is an autonomous mobile device that has: an image analysis unit that receives images captured by a camera and performs image analysis; a measurement unit that acquires observation information on the moving state of the autonomous mobile device for each unit time; and an own device absolute yaw angle calculation unit that receives the image analysis result of the image analysis unit and the observation information of the measurement unit and calculates the traveling direction of the autonomous mobile device.
  • The own device absolute yaw angle calculation unit collates a plurality of estimated traveling-direction candidates of the autonomous mobile device obtained from the image analysis result of the image analysis unit with integration-based estimated traveling-direction data obtained by integrating the per-unit-time observation information of the measurement unit, and calculates, based on the collation result, the own device absolute yaw angle corresponding to the traveling direction of the autonomous mobile device.
  • The second aspect of the present disclosure is an autonomous movement control method executed in an autonomous mobile device, which includes:
  • an image analysis step in which the image analysis unit receives images captured by the camera and performs image analysis;
  • a measurement step in which the measurement unit acquires observation information on the moving state of the autonomous mobile device for each unit time; and
  • a step in which the own device absolute yaw angle calculation unit receives the image analysis result of the image analysis unit and the observation information of the measurement unit and calculates the traveling direction of the autonomous mobile device.
  • In the step of calculating the own device absolute yaw angle, a plurality of estimated traveling-direction candidates of the autonomous mobile device obtained from the image analysis result of the image analysis unit are collated with integration-based estimated traveling-direction data obtained by integrating the per-unit-time observation information of the measurement unit, and the own device absolute yaw angle corresponding to the traveling direction of the autonomous mobile device is calculated based on the collation result.
  • The third aspect of the present disclosure is a program that executes autonomous movement control in an autonomous mobile device, the program causing:
  • the image analysis unit to execute an image analysis step of receiving images captured by a camera and performing image analysis;
  • the measurement unit to execute a measurement step of acquiring observation information on the moving state of the autonomous mobile device for each unit time; and
  • the own device absolute yaw angle calculation unit to execute a step of receiving the image analysis result of the image analysis unit and the observation information of the measurement unit and calculating the traveling direction of the autonomous mobile device.
  • The program of the present disclosure is, for example, a program that can be provided via a storage medium or a communication medium in a computer-readable format to an information processing device or a computer system capable of executing various program codes.
  • In this specification, a system is a logical collection of a plurality of devices, and the devices of each configuration are not limited to being in the same housing.
  • According to the configuration of one embodiment of the present disclosure, the device has an image analysis unit that analyzes a ceiling image captured by the camera, a measurement unit that acquires observation information on the moving state of the autonomous mobile device for each unit time, and an own device absolute yaw angle calculation unit that receives the ceiling image analysis result of the image analysis unit and the observation information of the measurement unit and calculates the traveling direction of the autonomous mobile device.
  • The own device absolute yaw angle calculation unit collates a plurality of estimated traveling-direction candidates of the autonomous mobile device obtained from the ceiling image analysis result with integration-based estimated traveling-direction data obtained by integrating the per-unit-time observation information of the measurement unit, and thereby calculates the own device absolute yaw angle corresponding to the traveling direction of the autonomous mobile device.
  • FIG. 1 shows an autonomous traveling robot 10 which is an example of the autonomous mobile device of the present disclosure.
  • the autonomous traveling robot 10 has a camera 11 and uses the camera 11 to take an image of the ceiling.
  • the data processing unit inside the autonomous traveling robot 10 analyzes the image of the ceiling taken by the camera 11 and performs autonomous movement while analyzing its own position and traveling direction.
  • The data processing unit inside the autonomous traveling robot 10 of the present disclosure analyzes images of the ceiling of the building in the robot's moving environment and, based on the analysis result of the ceiling image, calculates the yaw angle of the autonomous traveling robot 10.
  • The yaw angle of the autonomous traveling robot 10 will be described with reference to FIG. 2. As shown in FIG. 2, the yaw angle of the autonomous traveling robot 10 is an angle around a vertical axis perpendicular to the robot's traveling surface.
  • The yaw angle change amount, Δyaw, indicates the amount of change in the traveling direction of the autonomous traveling robot 10.
  • Suppose the traveling direction (t1) of the autonomous traveling robot 10 at time t1 changes to the traveling direction (t2) at time t2.
  • The yaw angle change amount Δyaw then indicates the difference between the traveling direction at time t1 and the traveling direction at time t2, that is, the change in the angle indicating the traveling direction.
  • The integrated value of the yaw angle change amounts from time t0 to tn, that is, Σ Δyaw, is the yaw angle (absolute yaw angle) indicating the traveling direction of the autonomous traveling robot 10 at the current time tn.
  • Therefore, if the yaw angle change amount Δyaw is continuously calculated and integrated, the change in the traveling direction of the autonomous traveling robot 10 can be obtained.
  • To do this, the yaw angle change amount Δyaw must be calculated sequentially for every predetermined unit time, for example every 1 or 2 seconds.
  • However, if the per-unit-time yaw angle change amount Δyaw contains even a small error, the error accumulates in the integrated value, and a large error may appear in the final integrated value.
  • The method of calculating the current position by integrating measured values obtained for each unit time is called dead reckoning, and it has been pointed out that dead reckoning is difficult to control accurately because of this accumulation error.
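  • As an illustration of this accumulation problem, the following is a minimal Python sketch (not taken from the disclosure) of dead-reckoning yaw integration; the constant bias value is a hypothetical example used only to show how a small per-step error grows in the integrated yaw angle.

```python
import math

def integrate_yaw(delta_yaws):
    """Dead reckoning: the absolute yaw angle is the sum of per-unit-time yaw changes."""
    yaw = 0.0
    for d in delta_yaws:
        yaw += d
    return yaw

# Hypothetical example: the robot actually turns 1 degree per step for 600 steps,
# but each measured delta-yaw carries a small constant bias of 0.05 degrees.
true_deltas = [math.radians(1.0)] * 600
biased_deltas = [d + math.radians(0.05) for d in true_deltas]

true_yaw = integrate_yaw(true_deltas)
estimated_yaw = integrate_yaw(biased_deltas)
print(math.degrees(estimated_yaw - true_yaw))  # accumulated error of about 30 degrees
```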
  • In view of this, the autonomous traveling robot 10 of the present disclosure performs the per-unit-time yaw angle change amount (Δyaw) calculation process and, in addition, calculates the yaw angle of the autonomous traveling robot 10 based on the analysis result of the ceiling image.
  • the autonomous traveling robot 10 of the present disclosure can calculate the traveling direction of the robot with high accuracy without acquiring prior information on the traveling environment.
  • The data processing unit inside the autonomous traveling robot 10 of the present disclosure calculates the yaw angle change amount Δyaw and then compares the integrated value of the yaw angle change amounts (Σ Δyaw) with the analysis result of the ceiling image.
  • From this comparison it calculates the absolute yaw angle, which indicates the current traveling-direction angle of the autonomous traveling robot 10.
  • The absolute yaw angle at the current time tx corresponds to the amount of change in the traveling-direction angle from the initial position (start position) of the autonomous traveling robot 10.
  • the reference direction is, for example, the initial travel start direction at the start of travel (time t0) of the autonomous travel robot 10.
  • The yaw angle change amount (Δyaw) caused by each change in the traveling direction during the robot's subsequent travel is calculated, and the absolute yaw angle can then be obtained by integrating these calculated change amounts (Δyaw).
  • That is, the absolute yaw angle at time tx corresponds to the integrated value of the traveling-direction change amounts measured from the initial traveling start direction at the start of travel (time t0).
  • Therefore, the traveling direction at time tx can be accurately determined from the absolute yaw angle at time tx.
  • Specifically, the data processing unit inside the autonomous traveling robot 10 of the present disclosure calculates the yaw angle change amount Δyaw using an inertial measurement unit (IMU). The integrated value of the yaw angle change amounts (Δyaw) calculated using the IMU is then compared and collated with the yaw angle of the autonomous traveling robot 10 estimated from the analysis result of the ceiling image, and the absolute yaw angle indicating the current traveling-direction angle of the autonomous traveling robot 10 is calculated.
  • The processing of the present disclosure uses the assumption that the objects (feature information) detectable from a ceiling image are arranged linearly along a specific direction. By using this assumption, the absolute yaw angle is estimated without the need for prior information about each environment.
  • the array of objects (feature information) that can be detected from the ceiling image is, for example, a row of fluorescent tube lighting, a row of spherical lighting, a sprinkler, a row of fire alarms, and the like. These objects are often arranged along a straight line.
  • the patterns and seams of the ceiling plate are often arranged along a predetermined straight line, and the arrangement of the patterns and seams of the ceiling plate also corresponds to the arrangement of objects (feature information) that can be detected from the ceiling image.
  • FIG. 4 shows a plurality of pattern examples corresponding to the object (feature information) array that can be detected from the ceiling image.
  • FIG. 4 shows 11 different pattern examples from patterns 1 to 11. These are some examples of a plurality of pattern examples relating to an object (feature information) array that can be detected from a ceiling image.
  • The data processing unit inside the autonomous traveling robot 10 of the present disclosure captures ceiling images while the robot is traveling and analyzes the object (feature information) arrangement direction from each captured ceiling image. By continuously executing this analysis process, the yaw angle change amount Δyaw corresponding to the traveling-direction change amount of the autonomous traveling robot 10 is calculated.
  • FIG. 5 is a diagram showing a configuration example of the autonomous traveling robot 10 which is an example of the autonomous mobile device of the present disclosure.
  • As shown in FIG. 5, the autonomous traveling robot 10 includes a camera (ceiling image capturing camera) 11, an image analysis unit (object (feature information) arrangement direction estimation unit in the ceiling image) 12, an inertial measurement unit (IMU) 13, an own device absolute yaw angle estimation unit 14, a robot drive unit 15, and a storage unit (general ceiling trained model) 18.
  • the camera (ceiling image capturing camera) 11 continuously captures an image of the ceiling in the vertical upward direction of the traveling surface of the autonomous traveling robot 10 while the autonomous traveling robot 10 is traveling. That is, a moving image of the ceiling image is taken.
  • the ceiling image 21 captured by the camera 11 is input to the image analysis unit (object (feature information) array direction estimation unit in the ceiling image) 12.
  • The image analysis unit (object (feature information) arrangement direction estimation unit in the ceiling image) 12 analyzes the ceiling image 21 captured by the camera 11 and estimates the arrangement direction of the objects (feature information) contained in that image.
  • The array of objects (feature information) is, for example, an array of fluorescent tube lights, an array of spherical lights, an array of sprinklers and fire alarms, or an array of ceiling-panel patterns and seams.
  • The image analysis unit 12 applies the general ceiling trained model stored in the storage unit 18 to estimate the arrangement direction of the objects (feature information) contained in the ceiling image 21.
  • Through this arrangement-direction estimation process, the image analysis unit 12 generates four arrangement-direction estimation results as the arrangement directions of the objects (feature information) contained in the ceiling image 21, that is, the object (feature information) arrangement direction candidates (4 candidates) 22 in the ceiling image shown in FIG. 5. A specific processing example will be described later.
  • The object (feature information) arrangement direction candidates (4 candidates) 22 in the ceiling image output by the image analysis unit 12 are input to the own device absolute yaw angle estimation unit 14.
  • The inertial measurement unit (IMU) 13 observes the movement of the autonomous traveling robot 10. Specifically, it consists of a gyro sensor and an accelerometer, and measures the angular velocity and acceleration along each of the X, Y, and Z axes according to the movement of the autonomous traveling robot 10.
  • From these measurements, the inertial measurement unit (IMU) 13 calculates the yaw angle change amount (Δyaw) 23 corresponding to the change in the traveling direction (traveling-direction change angle) of the autonomous traveling robot 10 per unit time, that is, the yaw angle change amount (Δyaw) described above with reference to FIG. 3.
  • The yaw angle change amount (Δyaw) 23 calculated by the inertial measurement unit (IMU) 13 is input to the own device absolute yaw angle estimation unit 14.
  • The own device absolute yaw angle estimation unit 14 receives two data, (a) and (b), described in detail below: the object (feature information) arrangement direction candidates (4 candidates) 22 in the ceiling image, and the yaw angle change amount (Δyaw) 23.
  • Using these two data (a) and (b), the own device absolute yaw angle estimation unit 14 estimates the absolute yaw angle corresponding to the angle indicating the current traveling direction of the own device, that is, the autonomous traveling robot 10.
  • the absolute yaw angle 24 estimated by the own device absolute yaw angle estimation unit 14 is input to the robot drive unit 15.
  • The robot drive unit 15 refers to the absolute yaw angle 24 calculated by the own device absolute yaw angle estimation unit 14, that is, the angle indicating the current traveling direction of the autonomous traveling robot 10, and controls the traveling direction and traveling speed of the robot. Through this control, the autonomous traveling robot 10 can travel accurately toward its destination; an illustrative sketch of such heading control is given below.
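  • As a rough illustration only (the disclosure does not specify the drive control law), the control of the traveling direction based on the absolute yaw angle can be pictured as a simple proportional heading controller; the gain and the angular-velocity command interface below are assumptions.

```python
import math

def heading_command(target_yaw, absolute_yaw, gain=1.5):
    """Return an angular-velocity command that steers the robot toward the target heading."""
    # Wrap the heading error into (-pi, pi] so the robot always turns the short way around.
    error = math.atan2(math.sin(target_yaw - absolute_yaw),
                       math.cos(target_yaw - absolute_yaw))
    return gain * error  # hypothetical proportional gain

# Example: current absolute yaw is 170 deg, the heading toward the destination is -170 deg.
omega = heading_command(math.radians(-170), math.radians(170))
print(math.degrees(omega))  # about 30: turn counterclockwise by the short (20 deg) way, scaled by the gain
```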
  • the camera (ceiling image capturing camera) 11 continuously captures an image of the ceiling in the vertically upward direction of the traveling surface of the autonomous traveling robot 10 while the autonomous traveling robot 10 is traveling. That is, a moving image of the ceiling image is taken.
  • FIG. 6 is a diagram showing an example of the ceiling image 21 captured by the camera (ceiling image capturing camera) 11.
  • the ceiling image 21 may include images of columns and walls in addition to the ceiling.
  • the ceiling image 21 captured by the camera (ceiling image capturing camera) 11 is input to the image analysis unit (object (feature information) arrangement direction estimation unit in the ceiling image) 12.
  • The image analysis unit (object (feature information) arrangement direction estimation unit in the ceiling image) 12 analyzes the ceiling image 21 captured by the camera 11 and estimates the arrangement direction of the objects (feature information) contained in that image.
  • The array of objects (feature information) is, for example, an array of fluorescent tube lights, an array of spherical lights, an array of sprinklers and fire alarms, or an array of ceiling-panel patterns and seams.
  • The image analysis unit 12 applies the general ceiling trained model stored in the storage unit 18 to estimate the arrangement direction of the objects (feature information) contained in the ceiling image 21.
  • Through this arrangement-direction estimation process, the image analysis unit 12 generates four arrangement-direction estimation results as the arrangement directions of the objects (feature information) contained in the ceiling image 21, that is, the object (feature information) arrangement direction candidates (4 candidates) 22 in the ceiling image shown in FIG. 5. A specific processing example will be described with reference to FIG. 7.
  • The input (a) shown in FIG. 7 is the ceiling image 21 captured by the camera (ceiling image capturing camera) 11, and corresponds to the ceiling image 21 shown in FIG. 6.
  • the image analysis unit (object (feature information) arrangement direction estimation unit in the ceiling image) 12 analyzes the ceiling image 21 captured by this camera (ceiling image capturing camera) 11.
  • A pre-trained model, that is, the general ceiling trained model stored in the storage unit 18, is applied to estimate the arrangement direction of the objects (feature information) contained in the ceiling image 21.
  • This trained model is a model for extracting objects (feature information) contained in a ceiling image, for example a checkerboard pattern, a row of fluorescent lamps, sprinklers, or fire alarms, and outputting their arrangement directions.
  • the general ceiling-learned model stored in the storage unit 18 is a trained model generated in advance by a learning process using various ceiling patterns as input data.
  • In the learning process, for example, a large number of ceiling images with different patterns, such as those described above with reference to FIG. 4, are input, and the arrangement direction of the objects (feature information) estimated from each input image is used as the output value.
  • the model (learned model) generated by such a learning process is stored in the storage unit 18.
  • The image analysis unit (object (feature information) arrangement direction estimation unit in the ceiling image) 12 applies the general ceiling trained model stored in the storage unit 18 to analyze the ceiling image 21 captured by the camera (ceiling image capturing camera) 11, and estimates the arrangement direction of the objects (feature information) contained in the ceiling image 21.
  • As the configuration for estimating the arrangement direction of objects (feature information) using the general ceiling trained model stored in the storage unit 18, the image analysis unit 12 can use a network such as a neural network. Various networks can be used here, for example ResNet, LeNet, AlexNet, or networks modified from these.
  • The image analysis unit (object (feature information) arrangement direction estimation unit in the ceiling image) 12 is not limited to using such a network for estimating the object (feature information) arrangement direction; the estimation may instead be executed as a rule-based process (Hough transform, RANSAC, Fourier transform, matching, etc.), for example, as sketched below.
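  • The following is a minimal sketch of such a rule-based estimate using OpenCV's probabilistic Hough transform. It is only an illustrative assumption about how the dominant arrangement direction, folded modulo 90 degrees, might be extracted from a ceiling image; it is not the implementation of the disclosure, and the threshold values are arbitrary.

```python
import numpy as np
import cv2  # OpenCV; assumed available in the robot's image pipeline

def ceiling_direction_mod90(ceiling_image_gray):
    """Estimate the dominant line direction in a ceiling image, folded into [0, 90) degrees."""
    edges = cv2.Canny(ceiling_image_gray, 50, 150)  # arbitrary illustrative thresholds
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                            threshold=80, minLineLength=40, maxLineGap=10)
    if lines is None:
        return None
    # Fold every detected line segment's angle into [0, 90) degrees, since
    # orthogonal ceiling structures are indistinguishable modulo 90 degrees.
    angles = [float(np.degrees(np.arctan2(y2 - y1, x2 - x1))) % 90.0
              for x1, y1, x2, y2 in lines[:, 0]]
    # Vote in 1-degree bins; the most voted bin is taken as the arrangement direction.
    hist, bin_edges = np.histogram(angles, bins=90, range=(0.0, 90.0))
    return float(bin_edges[np.argmax(hist)])
```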
  • In either case, the image analysis unit (object (feature information) arrangement direction estimation unit in the ceiling image) 12 analyzes, for example by applying the general ceiling trained model stored in the storage unit 18, the ceiling image 21 captured by the camera (ceiling image capturing camera) 11, and estimates the arrangement direction of the objects (feature information) contained in the ceiling image 21.
  • The estimation result of the object (feature information) arrangement direction output by the image analysis unit (object (feature information) arrangement direction estimation unit in the ceiling image) 12 consists of four different arrangement-direction candidates. This is because it is difficult to distinguish the ceiling image 21 captured by the camera (ceiling image capturing camera) 11 from the rotated image obtained by rotating the ceiling image 21 by 90 degrees [deg].
  • Accordingly, as shown in FIG. 7, the estimation result of the object (feature information) arrangement direction output by the image analysis unit 12 becomes four mutually orthogonal arrangement-direction candidates.
  • These four orthogonal object (feature information) arrangement-direction candidates are the object (feature information) arrangement direction candidates (4 candidates) 22 in the ceiling image that are the output of the image analysis unit (object (feature information) arrangement direction estimation unit in the ceiling image) 12 shown in FIG. 5 (see the sketch below).
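  • A minimal sketch of this 90-degree ambiguity follows: a single estimated arrangement direction is expanded into the four orthogonal candidates (the function and variable names are illustrative, not from the disclosure).

```python
def expand_to_four_candidates(direction_deg):
    """Because a ceiling image is hard to distinguish from its 90-degree rotations,
    one estimated arrangement direction yields four orthogonal direction candidates."""
    return [(direction_deg + k * 90.0) % 360.0 for k in range(4)]

print(expand_to_four_candidates(25.0))  # [25.0, 115.0, 205.0, 295.0]
```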
  • the object (feature information) arrangement direction candidates (4 candidates) 22 in the ceiling image are input to the own device absolute yaw angle estimation unit 14.
  • The own device absolute yaw angle estimation unit 14 receives the following two data, (a) and (b):
  • (a) The object (feature information) arrangement direction candidates (4 candidates) 22 in the ceiling image output by the image analysis unit (object (feature information) arrangement direction estimation unit in the ceiling image) 12.
  • (b) The yaw angle change amount (Δyaw) 23 calculated by the inertial measurement unit (IMU) 13.
  • Here, the inertial measurement unit (IMU) 13 measures the angular velocity and acceleration along each of the X, Y, and Z axes of the autonomous traveling robot 10 and calculates the yaw angle change amount (Δyaw) 23 corresponding to the change in the robot's traveling direction (traveling-direction change angle) per unit time, that is, the yaw angle change amount (Δyaw) described above with reference to FIG. 3.
  • Using these two data (a) and (b), the own device absolute yaw angle estimation unit 14 estimates the absolute yaw angle corresponding to the angle indicating the current traveling direction of the own device, that is, the autonomous traveling robot 10.
  • First, using data (a), the object (feature information) arrangement direction candidates (4 candidates) 22 in the ceiling image output by the image analysis unit 12, the own device absolute yaw angle estimation unit 14 calculates a probability density distribution indicating the certainty of the absolute yaw angle corresponding to the current traveling direction of the own device, that is, the autonomous traveling robot 10. This probability density distribution is the line shown as a solid line in FIG. 8.
  • FIG. 8 is a graph in which the horizontal axis is the absolute yaw angle corresponding to the traveling-direction angle indicating the current traveling direction of the own device, that is, the autonomous traveling robot 10, and the vertical axis is the probability density indicating its certainty.
  • The solid line in FIG. 8, that is, the "own device absolute yaw angle probability density distribution calculated from the object (feature information) arrangement-direction estimation result (4 candidates)", has four peaks (P1, P2, P3, P4). These four peaks are the own device absolute yaw angles calculated from each of the four orthogonal object (feature information) arrangement-direction candidates shown in FIG. 7(b), which are the arrangement-direction estimation result output by the image analysis unit (object (feature information) arrangement direction estimation unit in the ceiling image) 12 described above with reference to FIG. 7.
  • The absolute yaw angle corresponding to the current traveling direction of the own device, that is, the autonomous traveling robot 10, is one of these four peaks (P1, P2, P3, P4). In other words, exactly one of the four peaks is correct, and its position indicates the absolute yaw angle corresponding to the traveling-direction angle of the autonomous traveling robot 10 at the present time.
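  • As an illustrative sketch only (the disclosure does not specify the peak shape), such a four-peak probability density over the absolute yaw angle could be modeled as a mixture of wrapped Gaussians centered at the four candidate directions, with an assumed peak width.

```python
import math

def yaw_probability_density(yaw_deg, candidate_dirs_deg, sigma_deg=5.0):
    """Mixture density with one Gaussian-shaped peak per arrangement-direction candidate."""
    density = 0.0
    for c in candidate_dirs_deg:
        # Smallest angular difference between the yaw and this candidate, in degrees.
        diff = (yaw_deg - c + 180.0) % 360.0 - 180.0
        density += math.exp(-0.5 * (diff / sigma_deg) ** 2)
    # Approximate normalization for equally weighted components.
    norm = len(candidate_dirs_deg) * sigma_deg * math.sqrt(2.0 * math.pi)
    return density / norm

candidates = [25.0, 115.0, 205.0, 295.0]  # e.g. the four peaks P1..P4
print(yaw_probability_density(113.0, candidates))  # largest near one of the peaks
```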
  • To select the correct peak from the solid line in FIG. 8, that is, from the "own device absolute yaw angle probability density distribution calculated from the object (feature information) arrangement-direction estimation result (4 candidates)", the own device absolute yaw angle estimation unit 14 uses the input data from the inertial measurement unit (IMU) 13, that is, (b) the yaw angle change amount (Δyaw) 23 calculated by the IMU 13.
  • As described above, the inertial measurement unit (IMU) 13 measures the angular velocity and acceleration along each of the X, Y, and Z axes of the autonomous traveling robot 10 and calculates the yaw angle change amount (Δyaw) 23 corresponding to the change in the robot's traveling direction (traveling-direction change angle) per unit time, that is, the yaw angle change amount (Δyaw) described above with reference to FIG. 3.
  • The yaw angle change amount indicates the change in the traveling direction of the autonomous traveling robot 10 per unit time, calculated sequentially for each predetermined unit time, that is, the change in the angle indicating the traveling direction.
  • The integrated value of the yaw angle change amounts from time t0 to tn, that is, Σ Δyaw, is the yaw angle (absolute yaw angle) indicating the traveling direction of the autonomous traveling robot 10 at the current time tn.
  • The own device absolute yaw angle estimation unit 14 therefore first calculates this integrated value Σ Δyaw of the yaw angle change amounts (Δyaw) 23 calculated by the inertial measurement unit (IMU) 13.
  • The own device absolute yaw angle estimation unit 14 then compares and collates the above integrated value Σ Δyaw with the line shown by the solid line in FIG. 8, that is, the "own device absolute yaw angle probability density distribution calculated from the object (feature information) arrangement-direction estimation result (4 candidates)".
  • FIG. 8 also shows the result of applying a Kalman filter to the integrated value Σ Δyaw of the yaw angle change amounts (Δyaw) 23 calculated by the inertial measurement unit (IMU) 13 at two times: the current time (t) and the immediately preceding calculation timing (t-1).
  • The Kalman filter is a filter for reducing the error component contained in the integrated value Σ Δyaw.
  • That is, the probability distribution data "Kalman filter application data @ t" and "Kalman filter application data @ t-1" shown in FIG. 8 are calculated for the integrated value Σ Δyaw at the two times t and t-1.
  • The peak positions of these two curves, obtained by applying the Kalman filter to the integrated value Σ Δyaw at the consecutive times t and t-1, indicate the yaw angles (absolute yaw angles) representing the traveling direction of the autonomous traveling robot 10 at times t and t-1.
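  • The following is a minimal one-dimensional Kalman filter sketch for tracking the integrated yaw angle. The disclosure does not specify the filter design; the noise variances here are hypothetical, and the correction step assumes some external absolute-yaw observation (for example, the peak selected from the ceiling-image analysis).

```python
class YawKalmanFilter1D:
    """Tracks the integrated yaw angle and its variance from noisy delta-yaw inputs."""

    def __init__(self, process_var=1e-4, measurement_var=1e-2):
        self.yaw = 0.0                          # current yaw estimate (radians)
        self.var = 1.0                          # variance of the estimate
        self.process_var = process_var          # assumed per-step drift of the gyro integration
        self.measurement_var = measurement_var  # assumed noise of an external yaw observation

    def predict(self, delta_yaw):
        # Prediction step: add the measured per-unit-time yaw change; uncertainty grows.
        self.yaw += delta_yaw
        self.var += self.process_var

    def update(self, measured_yaw):
        # Correction step: blend in an external absolute-yaw observation; uncertainty shrinks.
        gain = self.var / (self.var + self.measurement_var)
        self.yaw += gain * (measured_yaw - self.yaw)
        self.var *= (1.0 - gain)
```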
  • The own device absolute yaw angle estimation unit 14 compares and collates the peak position of the curve obtained by applying the Kalman filter to the integrated value Σ Δyaw at the current time t with the peaks (P1, P2, P3, P4) of the solid line in FIG. 8, that is, the "own device absolute yaw angle probability density distribution calculated from the object (feature information) arrangement-direction estimation result (4 candidates)".
  • In the example shown in FIG. 8, the peak position of the Kalman-filtered curve for the integrated value Σ Δyaw at the current time t lies closest to the peak (P3) of that probability density distribution.
  • In this case, the peak (P3) of the "own device absolute yaw angle probability density distribution calculated from the object (feature information) arrangement-direction estimation result (4 candidates)" is determined to be the peak position indicating the absolute yaw angle of the own device, that is, the current traveling direction of the autonomous traveling robot 10, at the current time (t).
  • The deviation between the peak position of the Kalman-filtered curve for the integrated value Σ Δyaw and the peak (P3) of the "own device absolute yaw angle probability density distribution calculated from the object (feature information) arrangement-direction estimation result (4 candidates)" is due to the error produced by integrating the yaw angle change amount (Δyaw), as described above. That is, it is caused by the yaw angle integration error inherent in dead reckoning.
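  • A minimal sketch of this collation step follows: the Kalman-filtered integrated yaw is compared with the candidate peaks, and the nearest peak on the circle is adopted as the absolute yaw angle (the function names and numerical values are illustrative only).

```python
def select_absolute_yaw(kalman_yaw_deg, peak_dirs_deg):
    """Pick the candidate peak closest (on the circle) to the Kalman-filtered integrated yaw."""
    def angular_distance(a, b):
        return abs((a - b + 180.0) % 360.0 - 180.0)
    return min(peak_dirs_deg, key=lambda p: angular_distance(kalman_yaw_deg, p))

peaks = [25.0, 115.0, 205.0, 295.0]      # the four peaks from the ceiling-image analysis
print(select_absolute_yaw(118.5, peaks))  # 115.0, i.e. the nearest peak is adopted
```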
  • In this way, the own device absolute yaw angle estimation unit 14 uses the following two data: (a) the object (feature information) arrangement direction candidates (4 candidates) 22 in the ceiling image output by the image analysis unit (object (feature information) arrangement direction estimation unit in the ceiling image) 12, and (b) the yaw angle change amount (Δyaw) 23 calculated by the inertial measurement unit (IMU) 13, and calculates from them the absolute yaw angle corresponding to the angle indicating the current traveling direction of the own device, that is, the autonomous traveling robot 10.
  • the absolute yaw angle 24 estimated by the own device absolute yaw angle estimation unit 14 is input to the robot drive unit 15.
  • The robot drive unit 15 refers to the absolute yaw angle 24 calculated by the own device absolute yaw angle estimation unit 14, that is, the angle indicating the current traveling direction of the autonomous traveling robot 10, and controls the traveling direction and traveling speed of the robot. Through this control, the autonomous traveling robot 10 can travel accurately toward its destination.
  • FIG. 9 is a diagram showing a flowchart for explaining a processing sequence executed by the autonomous mobile device of the present disclosure.
  • The processing according to the flowchart shown in FIG. 9 can be executed, for example, by the control unit (data processing unit) of the autonomous traveling robot 10 having the configuration shown in FIG. 5, according to a program stored in the storage unit of the robot. For example, it can be performed as program execution by a processor such as a CPU having a program execution function.
  • the flow shown in FIG. 9 can also be executed as a process of a control device such as a control server capable of communicating with the autonomous traveling robot 10.
  • Step S101: The autonomous traveling robot 10 executes the initial position/posture acquisition process in step S101.
  • Step S102: After the start of traveling, the autonomous traveling robot 10 executes the current self-position/posture acquisition process in step S102.
  • The position and posture acquired in step S101 and stored in memory become the current self-position and posture.
  • Thereafter, the current self-position and posture are continuously updated based on the information obtained by the processing of steps S103 to S107, and the updated data are also recorded in memory.
  • Step S103: The processing of steps S103 to S104 and the processing of step S105 are executed in parallel.
  • In step S103, the ceiling image acquisition process is executed.
  • This process is executed by the camera 11 of the autonomous traveling robot 10 shown in FIG. 5.
  • the camera (ceiling image capturing camera) 11 continuously captures an image of the ceiling in the vertical upward direction of the traveling surface of the autonomous traveling robot 10 while the autonomous traveling robot 10 is traveling. That is, a moving image of the ceiling image is taken.
  • the ceiling image 21 as described above with reference to FIG. 6 is photographed.
  • the ceiling image 21 captured by the camera 11 is input to the image analysis unit (object (feature information) array direction estimation unit in the ceiling image) 12.
  • Step S104: The object (feature information) arrangement direction is estimated using the trained model.
  • This process is executed by the image analysis unit (object (feature information) arrangement direction estimation unit in the ceiling image) 12 of the autonomous traveling robot 10 shown in FIG. 5.
  • The image analysis unit 12 analyzes the ceiling image 21 captured by the camera 11 and estimates the arrangement direction of the objects (feature information) contained in that image.
  • The array of objects (feature information) is, for example, an array of fluorescent tube lights, an array of spherical lights, an array of sprinklers and fire alarms, or an array of ceiling-panel patterns and seams.
  • The image analysis unit 12 applies the general ceiling trained model stored in the storage unit 18 to estimate the arrangement direction of the objects (feature information) contained in the ceiling image 21.
  • Through this arrangement-direction estimation process, the image analysis unit 12 generates four arrangement-direction estimation results as the arrangement directions of the objects (feature information) contained in the ceiling image 21, as described above with reference to FIG. 7, that is, the object (feature information) arrangement direction candidates (4 candidates) 22 in the ceiling image shown in FIG. 5.
  • As the configuration for estimating the arrangement direction of objects (feature information) using the general ceiling trained model stored in the storage unit 18, the image analysis unit 12 can use a network such as a neural network, for example ResNet, LeNet, AlexNet, or a modified version thereof.
  • The process is not limited to using such a network and may instead be executed as a rule-based estimation process (Hough transform, RANSAC, Fourier transform, matching, etc.), for example.
  • In any case, the image analysis unit (object (feature information) arrangement direction estimation unit in the ceiling image) 12 analyzes the ceiling image 21 and estimates the arrangement direction of the objects (feature information) contained in it.
  • As shown in the output (b) of FIG. 7, the estimation result of the object (feature information) arrangement direction output by the image analysis unit 12 consists of four different arrangement-direction candidates. This is because it is difficult to distinguish the ceiling image 21 captured by the camera (ceiling image capturing camera) 11 from the rotated image obtained by rotating the ceiling image 21 by 90 degrees [deg].
  • Step S105: Next, the process of step S105, which can be executed in parallel with the processing of steps S103 to S104, will be described.
  • In step S105, the yaw angle change amount (Δyaw) is calculated based on the measured values of the inertial measurement unit (IMU).
  • This process is executed by the inertial measurement unit (IMU) 13 of the autonomous traveling robot 10 shown in FIG. 5.
  • The inertial measurement unit (IMU) 13 observes the movement of the autonomous traveling robot 10. Specifically, it consists of a gyro sensor and an accelerometer, measures the angular velocity and acceleration along each of the X, Y, and Z axes according to the movement of the robot, and calculates the yaw angle change amount (Δyaw) corresponding to the change in the robot's traveling direction (traveling-direction change angle) per unit time, that is, the yaw angle change amount (Δyaw) described above with reference to FIG. 3.
  • Instead of the IMU, visual odometry (VO) or LiDAR odometry may be used to calculate the yaw angle change amount (Δyaw); a sketch of the gyro-based calculation is shown below.
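  • A minimal sketch of the gyro-based calculation follows, assuming the yaw rate is simply the angular velocity about the vertical (Z) axis reported by the IMU multiplied by the unit time.

```python
def delta_yaw_from_gyro(gyro_z_rad_per_s, dt_s):
    """Per-unit-time yaw change: angular velocity about the vertical axis times the time step."""
    return gyro_z_rad_per_s * dt_s

# Example: the gyro reports 0.1 rad/s about the Z axis over a 0.5 s unit time.
print(delta_yaw_from_gyro(0.1, 0.5))  # 0.05 rad of yaw change in this step
```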
  • Step S106: The absolute yaw angle of the own device is calculated using the Kalman filter.
  • This process is executed by the own device absolute yaw angle estimation unit 14 of the autonomous traveling robot 10 shown in FIG. 5.
  • The own device absolute yaw angle estimation unit 14 receives the two data (a) and (b) described above and uses them to estimate the absolute yaw angle corresponding to the angle indicating the current traveling direction of the own device, that is, the autonomous traveling robot 10.
  • First, using data (a), the object (feature information) arrangement direction candidates (4 candidates) 22 in the ceiling image output by the image analysis unit 12, the own device absolute yaw angle estimation unit 14 calculates a probability density distribution indicating the certainty of the absolute yaw angle corresponding to the current traveling direction of the own device, that is, the autonomous traveling robot 10. This probability density distribution is the line shown as a solid line in FIG. 8.
  • The solid line in FIG. 8, that is, the "own device absolute yaw angle probability density distribution calculated from the object (feature information) arrangement-direction estimation result (4 candidates)", has four peaks (P1, P2, P3, P4). These four peaks are the own device absolute yaw angles calculated from each of the four orthogonal object (feature information) arrangement-direction candidates shown in FIG. 7(b), the arrangement-direction estimation result output by the image analysis unit 12 described above with reference to FIG. 7.
  • The absolute yaw angle corresponding to the traveling-direction angle indicating the current traveling direction of the autonomous traveling robot 10 is one of these four peaks (P1, P2, P3, P4); that is, exactly one of the four peaks is the correct one.
  • Next, the own device absolute yaw angle estimation unit 14 uses the data calculated by the inertial measurement unit (IMU) 13 in step S105, that is, (b) the yaw angle change amount (Δyaw) 23.
  • The own device absolute yaw angle estimation unit 14 calculates the integrated value Σ Δyaw of the yaw angle change amounts (Δyaw) 23 calculated by the inertial measurement unit (IMU) 13.
  • This integrated value is a yaw angle (absolute yaw angle) indicating the traveling direction of the autonomous traveling robot 10 at the current time tn. A Kalman filter is further applied as a filtering process to reduce the error component contained in the integrated value Σ Δyaw.
  • The own device absolute yaw angle estimation unit 14 compares and collates the peak position of the curve obtained by applying the Kalman filter to the integrated value Σ Δyaw at the current time t with the peaks (P1, P2, P3, P4) of the solid line in FIG. 8, that is, the "own device absolute yaw angle probability density distribution calculated from the object (feature information) arrangement-direction estimation result (4 candidates)".
  • In the example shown in FIG. 8, the peak position of the Kalman-filtered curve for the integrated value Σ Δyaw at the current time t lies closest to the peak (P3) of that probability density distribution.
  • In this case, the peak (P3) of the "own device absolute yaw angle probability density distribution calculated from the object (feature information) arrangement-direction estimation result (4 candidates)" is determined to be the peak position indicating the absolute yaw angle of the own device, that is, the current traveling direction of the autonomous traveling robot 10, at the current time (t).
  • In this way, the own device absolute yaw angle estimation unit 14 uses the following two data: (a) the object (feature information) arrangement direction candidates (4 candidates) 22 in the ceiling image generated by the image analysis unit (object (feature information) arrangement direction estimation unit in the ceiling image) 12 in step S104, and (b) the yaw angle change amount (Δyaw) 23 calculated by the inertial measurement unit (IMU) 13 in step S105, and calculates from them the absolute yaw angle corresponding to the angle indicating the current traveling direction of the own device, that is, the autonomous traveling robot 10.
  • the absolute yaw angle 24 estimated by the own device absolute yaw angle estimation unit 14 is input to the robot drive unit 15.
  • Step S107: The robot is driven and controlled based on the absolute yaw angle of the own device, that is, the autonomous traveling robot 10, calculated in step S106.
  • This process is executed by the robot drive unit 15 of the autonomous traveling robot 10 shown in FIG. 5.
  • The robot drive unit 15 refers to the absolute yaw angle 24 calculated by the own device absolute yaw angle estimation unit 14, that is, the angle indicating the current traveling direction of the autonomous traveling robot 10, and controls the traveling direction and traveling speed of the robot. Through this control, the autonomous traveling robot 10 can travel accurately toward its destination.
  • While the autonomous traveling robot 10 is traveling, the process returns to step S102 after step S107, and the processing of steps S102 to S107 is executed repeatedly.
  • When the traveling ends, the process ends.
  • In this way, the autonomous traveling robot 10 analyzes the ceiling images obtained while traveling and estimates its own position and direction in real time, in a configuration that can avoid errors caused by the integration error contained in the integrated value Σ Δyaw of the yaw angle change amounts (Δyaw) calculated by the inertial measurement unit (IMU).
  • The process of the present disclosure can also be applied, for example, to the creation of the pose graph 62 and the map 63 using graph-based SLAM, an offline SLAM method using the graph shown in FIG. 10.
  • the SLAM (simultaneous localization and mapping) process is a process of executing self-location identification (localization) and environment mapping (mapping) in parallel.
  • Graph-based SLAM is a process of generating, offline, a pose graph 62 composed of the position information of a moving body and a map 63 composed of a map (three-dimensional map) of the moving body's surrounding environment.
  • In the processing of the present disclosure, a yaw angle with a reliability, indicating the traveling-direction angle of the moving body, is added as a graph constraint. This makes it possible to correct distortion of the graph, and as a result the movement-route information of the moving body can be acquired and generated with higher accuracy; a sketch of such a constraint is given below.
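  • As a hedged illustration (the disclosure gives no equation for this constraint), a yaw observation with a reliability could enter a pose-graph optimization as a weighted residual such as the following; using the reliability directly as the weight is an assumption made only for this sketch.

```python
import math

def yaw_constraint_residual(node_yaw, observed_yaw, reliability):
    """Weighted residual added to the pose-graph cost for one node.

    node_yaw:     yaw of the pose-graph node being optimized (radians)
    observed_yaw: absolute yaw observed from the ceiling-image analysis (radians)
    reliability:  value in (0, 1]; a higher reliability means a stronger constraint
    """
    # Wrap the difference so the residual is continuous across +/- pi.
    error = math.atan2(math.sin(node_yaw - observed_yaw),
                       math.cos(node_yaw - observed_yaw))
    weight = reliability  # hypothetical: reliability used directly as the information weight
    return weight * error ** 2

# The optimizer would add this term, for each node with a yaw observation,
# to the usual odometry and loop-closure terms of the graph-based SLAM cost.
```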
  • FIG. 11 is a diagram illustrating an example of self-position estimation processing to which a particle filter is applied.
  • the self-position estimation process to which the particle filter is applied is a method of sequentially narrowing down the positions (particles) where the self-position is likely to exist as time passes.
  • Assume that the autonomous mobile robot moves within the travelable area shown in FIG. 11 while capturing images.
  • At first, the positions (particle positions) where the autonomous mobile robot is likely to exist are spread over the entire travelable area, but with the passage of time, the positions (particle positions) where the robot is likely to exist are progressively narrowed down.
  • This process of narrowing down the positions (particles) where the autonomous mobile robot is likely to exist and finally specifying its position is the particle-filter-based self-position estimation process. The narrowing down is performed by assigning a larger weight to particles at positions where the robot is likely to exist and keeping the particles with the larger weights.
  • In the processing of the present disclosure, the observation of the yaw angle corresponding to the traveling direction is applied to this weight setting.
  • This can be expected to accelerate the narrowing down (convergence) of the particles; a sketch is given below.
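  • The following is an illustrative sketch of how the yaw observation might enter the particle weights; the Gaussian weighting, its standard deviation, and the particle representation are assumptions, not values given in the disclosure.

```python
import math
import random

def reweight_particles(particles, observed_yaw, sigma=math.radians(10.0)):
    """Give larger weights to particles whose yaw agrees with the observed absolute yaw."""
    weights = []
    for p in particles:  # each particle is assumed to be a dict with keys 'x', 'y', 'yaw'
        diff = math.atan2(math.sin(p["yaw"] - observed_yaw),
                          math.cos(p["yaw"] - observed_yaw))
        weights.append(math.exp(-0.5 * (diff / sigma) ** 2))
    total = sum(weights) or 1.0
    return [w / total for w in weights]

def resample(particles, weights):
    """Keep particles in proportion to their weights (systematic resampling is also common)."""
    return random.choices(particles, weights=weights, k=len(particles))
```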
  • This is particularly effective when only landmarks that cannot be uniquely identified are available, or when a travelable area is provided as a prior map.
  • The image used for calculating the yaw angle is not limited to the ceiling image; for example, an image of the surface the robot travels on, such as a floor (carpet pattern, etc.), can be used. Processing may also use surrounding images such as an image of the road surface on which a vehicle travels, a front image of the road ahead, or a rear image.
  • Further, based on the calculated yaw angle, the relative angle between an autonomous mobile device such as an autonomous traveling robot and the building can be calculated and used in controlling the autonomous mobile device.
  • As noted above, the image used for calculating the yaw angle is not limited to the ceiling image; for an autonomous vehicle, processing may use surrounding images such as an image of the road surface on which the vehicle travels, a front image of the road ahead, or a rear image.
  • an autonomous driving vehicle which is an embodiment of the autonomous mobile device of the present disclosure, will be described.
  • FIG. 12 is a block diagram showing a configuration example of a schematic function of the vehicle control system 100 of the autonomous driving vehicle.
  • Hereinafter, when the vehicle provided with the vehicle control system 100 is to be distinguished from other vehicles, it is referred to as the own car or the own vehicle.
  • the vehicle control system 100 includes an input unit 101, a data acquisition unit 102, a communication unit 103, an in-vehicle device 104, an output control unit 105, an output unit 106, a drive system control unit 107, a drive system system 108, a body system control unit 109, and a body. It includes a system system 110, a storage unit 111, and an automatic operation control unit 112.
  • The input unit 101, the data acquisition unit 102, the communication unit 103, the output control unit 105, the drive system control unit 107, the body system control unit 109, the storage unit 111, and the automatic operation control unit 112 are interconnected via the communication network 121.
  • The communication network 121 is, for example, an in-vehicle communication network or bus conforming to an arbitrary standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark). Each part of the vehicle control system 100 may also be connected directly without going through the communication network 121.
  • In the following description, when the units of the vehicle control system 100 communicate via the communication network 121, mention of the communication network 121 is omitted. For example, when the input unit 101 and the automatic operation control unit 112 communicate via the communication network 121, it is simply stated that the input unit 101 and the automatic operation control unit 112 communicate with each other.
  • the input unit 101 includes a device used by the passenger to input various data, instructions, and the like.
  • For example, the input unit 101 includes operation devices such as a touch panel, buttons, a microphone, switches, and levers, as well as operation devices that allow input by a method other than manual operation, such as voice or gesture.
  • the input unit 101 may be a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile device or a wearable device corresponding to the operation of the vehicle control system 100.
  • the input unit 101 generates an input signal based on data, instructions, and the like input by the passenger, and supplies the input signal to each unit of the vehicle control system 100.
  • the data acquisition unit 102 includes various sensors and the like that acquire data used for processing of the vehicle control system 100, and supplies the acquired data to each unit of the vehicle control system 100.
  • the data acquisition unit 102 includes various sensors for detecting the state of the own vehicle and the like.
  • For example, the data acquisition unit 102 includes a gyro sensor, an acceleration sensor, an inertial measurement unit (IMU), and sensors for detecting the accelerator pedal operation amount, the brake pedal operation amount, the steering wheel steering angle, the engine speed, the motor speed, the wheel rotation speed, and the like.
  • the data acquisition unit 102 includes various sensors for detecting information outside the own vehicle.
  • the data acquisition unit 102 includes an image pickup device such as a ToF (Time Of Flight) camera, a visible light camera, a stereo camera, a monocular camera, a (far) infrared camera, and other cameras.
  • For example, the data acquisition unit 102 includes an environment sensor for detecting the weather or meteorological conditions, and a surrounding information detection sensor for detecting objects around the own vehicle.
  • the environmental sensor includes, for example, a raindrop sensor, a fog sensor, a sunshine sensor, a snow sensor, and the like.
  • The surrounding information detection sensor includes, for example, an ultrasonic sensor, a radar, LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), a sonar, or the like.
  • the data acquisition unit 102 includes various sensors for detecting the current position of the own vehicle.
  • the data acquisition unit 102 includes a GNSS receiver or the like that receives a GNSS signal from a GNSS (Global Navigation Satellite System) satellite.
  • the data acquisition unit 102 includes various sensors for detecting information in the vehicle.
  • the data acquisition unit 102 includes an imaging device that images the driver, a biosensor that detects the driver's biological information, a microphone that collects sound in the vehicle interior, and the like.
  • the biosensor is provided on, for example, the seat surface or the steering wheel, and detects the biometric information of the passenger sitting on the seat or the driver holding the steering wheel.
  • The communication unit 103 communicates with the in-vehicle device 104 and with various devices, servers, and base stations outside the vehicle, transmits data supplied from each unit of the vehicle control system 100, and supplies received data to each unit of the vehicle control system 100.
  • the communication protocol supported by the communication unit 103 is not particularly limited, and the communication unit 103 may support a plurality of types of communication protocols.
  • For example, the communication unit 103 wirelessly communicates with the in-vehicle device 104 by wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), WUSB (Wireless USB), or the like. Further, for example, the communication unit 103 performs wired communication with the in-vehicle device 104 via a connection terminal (and, if necessary, a cable), not shown, using USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), MHL (Mobile High-definition Link), or the like.
  • For example, the communication unit 103 communicates with a device (for example, an application server or a control server) existing on an external network (for example, the Internet, a cloud network, or an operator-specific network) via a base station or an access point. Further, for example, the communication unit 103 uses P2P (Peer To Peer) technology to communicate with a terminal (for example, a pedestrian or store terminal, or an MTC (Machine Type Communication) terminal) existing in the vicinity of the own vehicle.
  • The communication unit 103 performs V2X communication such as vehicle-to-vehicle (Vehicle to Vehicle) communication, road-to-vehicle (Vehicle to Infrastructure) communication, vehicle-to-home (Vehicle to Home) communication, and pedestrian-to-vehicle (Vehicle to Pedestrian) communication. Further, for example, the communication unit 103 includes a beacon receiving unit, receives radio waves or electromagnetic waves transmitted from radio stations or the like installed on the road, and acquires information such as the current position, traffic congestion, traffic regulations, or required time.
  • the in-vehicle device 104 includes, for example, a mobile device or a wearable device owned by a passenger, an information device carried in or attached to the own vehicle, a navigation device for searching a route to an arbitrary destination, and the like.
  • the output control unit 105 controls the output of various information to the passengers of the own vehicle or the outside of the vehicle.
  • the output control unit 105 generates an output signal including at least one of visual information (for example, image data) and auditory information (for example, audio data) and supplies the output signal to the output unit 106.
  • For example, the output control unit 105 synthesizes image data captured by different imaging devices of the data acquisition unit 102 to generate a bird's-eye view image, a panoramic image, or the like, and supplies an output signal including the generated image to the output unit 106.
  • the output control unit 105 generates voice data including a warning sound or a warning message for dangers such as collision, contact, and entry into a danger zone, and outputs an output signal including the generated voice data to the output unit 106.
  • the output unit 106 is provided with a device capable of outputting visual information or auditory information to the passengers of the own vehicle or the outside of the vehicle.
  • the output unit 106 includes a display device, an instrument panel, an audio speaker, headphones, a wearable device such as a spectacle-type display worn by a passenger, a projector, a lamp, and the like.
  • The display device included in the output unit 106 may be, in addition to a device having an ordinary display, a device that displays visual information within the driver's field of view, such as a head-up display, a transmissive display, or a device having an AR (Augmented Reality) display function.
  • The drive system control unit 107 controls the drive system 108 by generating various control signals and supplying them to the drive system 108. Further, the drive system control unit 107 supplies control signals to each unit other than the drive system 108 as necessary, and notifies them of the control state of the drive system 108.
  • The drive system 108 includes various devices related to the drive system of the own vehicle.
  • For example, the drive system 108 includes a driving force generator for generating the driving force of an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle, a braking device that generates braking force, an ABS (Antilock Brake System), an ESC (Electronic Stability Control), an electric power steering device, and the like.
  • the body system control unit 109 controls the body system 110 by generating various control signals and supplying them to the body system 110. Further, the body system control unit 109 supplies a control signal to each unit other than the body system 110 as necessary, and notifies the control state of the body system 110 and the like.
  • the body system 110 includes various body devices equipped on the vehicle body.
  • For example, the body system 110 includes a keyless entry system, a smart key system, a power window device, power seats, a steering wheel, an air conditioner, and various lamps (for example, head lamps, back lamps, brake lamps, turn signals (winkers), fog lamps, etc.).
  • The storage unit 111 includes, for example, a ROM (Read Only Memory), a RAM (Random Access Memory), a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, and the like.
  • the storage unit 111 stores various programs, data, and the like used by each unit of the vehicle control system 100.
  • For example, the storage unit 111 stores map data such as a three-dimensional high-precision map such as a dynamic map, a global map which is less accurate than the high-precision map but covers a wide area, and a local map including information around the own vehicle.
  • The automatic driving control unit 112 controls automatic driving such as autonomous driving or driving support. Specifically, for example, the automatic driving control unit 112 performs cooperative control for the purpose of realizing the functions of an ADAS (Advanced Driver Assistance System), including collision avoidance or impact mitigation of the own vehicle, follow-up traveling based on the inter-vehicle distance, vehicle speed maintenance traveling, collision warning of the own vehicle, lane deviation warning of the own vehicle, and the like. Further, for example, the automatic driving control unit 112 performs cooperative control for the purpose of automatic driving in which the vehicle travels autonomously without depending on the operation of the driver.
  • the automatic operation control unit 112 includes a detection unit 131, a self-position estimation unit 132, a situation analysis unit 133, a planning unit 134, and an operation control unit 135.
  • the detection unit 131 detects various types of information necessary for controlling automatic operation.
  • the detection unit 131 includes an outside information detection unit 141, an inside information detection unit 142, and a vehicle state detection unit 143.
  • the vehicle outside information detection unit 141 performs detection processing of information outside the own vehicle based on data or signals from each unit of the vehicle control system 100. For example, the vehicle outside information detection unit 141 performs detection processing, recognition processing, tracking processing, and distance detection processing for an object around the own vehicle. Objects to be detected include, for example, vehicles, people, obstacles, structures, roads, traffic lights, traffic signs, road signs, and the like. Further, for example, the vehicle outside information detection unit 141 performs detection processing of the environment around the own vehicle. The surrounding environment to be detected includes, for example, weather, temperature, humidity, brightness, road surface condition, and the like.
  • The vehicle outside information detection unit 141 supplies data indicating the result of the detection processing to the self-position estimation unit 132, the map analysis unit 151, the traffic rule recognition unit 152, and the situation recognition unit 153 of the situation analysis unit 133, the emergency situation avoidance unit 171 of the operation control unit 135, and the like.
  • the in-vehicle information detection unit 142 performs in-vehicle information detection processing based on data or signals from each unit of the vehicle control system 100.
  • the vehicle interior information detection unit 142 performs driver authentication processing and recognition processing, driver status detection processing, passenger detection processing, vehicle interior environment detection processing, and the like.
  • the state of the driver to be detected includes, for example, physical condition, arousal level, concentration level, fatigue level, line-of-sight direction, and the like.
  • the environment inside the vehicle to be detected includes, for example, temperature, humidity, brightness, odor, and the like.
  • the vehicle interior information detection unit 142 supplies data indicating the result of the detection process to the situational awareness unit 153 of the situational analysis unit 133, the emergency situation avoidance unit 171 of the motion control unit 135, and the like.
  • the vehicle state detection unit 143 performs the state detection process of the own vehicle based on the data or signals from each part of the vehicle control system 100.
  • The states of the own vehicle to be detected include, for example, speed, acceleration, steering angle, presence/absence and content of abnormality, driving operation state, power seat position and tilt, door lock state, and the states of other in-vehicle devices.
  • the vehicle state detection unit 143 supplies data indicating the result of the detection process to the situation recognition unit 153 of the situation analysis unit 133, the emergency situation avoidance unit 171 of the operation control unit 135, and the like.
  • The self-position estimation unit 132 performs estimation processing of the position and attitude of the own vehicle based on data or signals from each unit of the vehicle control system 100, such as the vehicle exterior information detection unit 141 and the situation recognition unit 153 of the situation analysis unit 133. In addition, the self-position estimation unit 132 generates a local map (hereinafter referred to as a self-position estimation map) used for self-position estimation, if necessary.
  • The map for self-position estimation is, for example, a high-precision map using a technique such as SLAM (Simultaneous Localization and Mapping).
  • the self-position estimation unit 132 supplies data indicating the result of the estimation process to the map analysis unit 151, the traffic rule recognition unit 152, the situation recognition unit 153, and the like of the situation analysis unit 133. Further, the self-position estimation unit 132 stores the self-position estimation map in the storage unit 111.
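  For intuition only, self-position estimation from on-board data can be reduced, in its simplest dead-reckoning form, to propagating a 2D pose from speed and yaw-rate observations; the drift that accumulates is one reason the estimate is corrected against map or image observations. The sketch below is a generic illustration under that assumption, not the SLAM-based estimation of this disclosure; the names Pose2D and dead_reckon are hypothetical.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose2D:
    x: float = 0.0      # metres
    y: float = 0.0      # metres
    yaw: float = 0.0    # radians

def dead_reckon(pose: Pose2D, speed: float, yaw_rate: float, dt: float) -> Pose2D:
    """Propagate the pose one time step from speed (m/s) and yaw rate (rad/s)."""
    yaw = pose.yaw + yaw_rate * dt
    return Pose2D(
        x=pose.x + speed * dt * math.cos(yaw),
        y=pose.y + speed * dt * math.sin(yaw),
        yaw=math.atan2(math.sin(yaw), math.cos(yaw)),  # wrap to [-pi, pi]
    )
```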
  • the situation analysis unit 133 analyzes the situation of the own vehicle and the surroundings.
  • the situation analysis unit 133 includes a map analysis unit 151, a traffic rule recognition unit 152, a situation recognition unit 153, and a situation prediction unit 154.
  • The map analysis unit 151 performs analysis processing of the various maps stored in the storage unit 111, using data or signals from each unit of the vehicle control system 100, such as the self-position estimation unit 132 and the vehicle exterior information detection unit 141, as necessary, and builds a map containing information necessary for the automatic driving processing.
  • The map analysis unit 151 supplies the constructed map to the traffic rule recognition unit 152, the situation recognition unit 153, the situation prediction unit 154, and the route planning unit 161, the action planning unit 162, and the operation planning unit 163 of the planning unit 134, and the like.
  • The traffic rule recognition unit 152 performs recognition processing of the traffic rules around the own vehicle based on data or signals from each unit of the vehicle control system 100, such as the self-position estimation unit 132, the vehicle outside information detection unit 141, and the map analysis unit 151. Through this recognition processing, for example, the position and state of traffic signals around the own vehicle, the content of traffic regulations around the own vehicle, the lanes in which the vehicle can travel, and the like are recognized.
  • the traffic rule recognition unit 152 supplies data indicating the result of the recognition process to the situation prediction unit 154 and the like.
  • The situation recognition unit 153 performs recognition processing of situations related to the own vehicle based on data or signals from each unit of the vehicle control system 100, such as the self-position estimation unit 132, the vehicle exterior information detection unit 141, the vehicle interior information detection unit 142, the vehicle state detection unit 143, and the map analysis unit 151. For example, the situation recognition unit 153 performs recognition processing of the situation of the own vehicle, the situation around the own vehicle, the situation of the driver of the own vehicle, and the like. In addition, the situation recognition unit 153 generates a local map (hereinafter referred to as a situational awareness map) used for recognizing the situation around the own vehicle, if necessary.
  • the situational awareness map is, for example, an occupied grid map (Occupancy Grid Map).
  • the status of the own vehicle to be recognized includes, for example, the position, posture, movement (for example, speed, acceleration, moving direction, etc.) of the own vehicle, and the presence / absence and contents of an abnormality.
  • The surrounding conditions of the own vehicle to be recognized include, for example, the type and position of surrounding stationary objects, the type, position, and movement (for example, speed, acceleration, moving direction, etc.) of surrounding moving objects, the composition of the surrounding roads and the road surface condition, as well as the surrounding weather, temperature, humidity, brightness, and the like.
  • the state of the driver to be recognized includes, for example, physical condition, arousal level, concentration level, fatigue level, line-of-sight movement, driving operation, and the like.
  • the situational awareness unit 153 supplies data indicating the result of the recognition process (including a situational awareness map, if necessary) to the self-position estimation unit 132, the situation prediction unit 154, and the like. Further, the situational awareness unit 153 stores the situational awareness map in the storage unit 111.
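  As a rough illustration of the occupancy grid map mentioned above: such a map can be thought of as a grid of cell occupancy estimates updated from sensor hits. The following is a generic log-odds toy, not the implementation of the situational awareness map in this disclosure; the class name OccupancyGridMap and the update weights are assumptions.

```python
import numpy as np

class OccupancyGridMap:
    """Toy occupancy grid (log-odds) for a local situational-awareness map."""

    def __init__(self, size: int = 200, resolution: float = 0.1):
        self.resolution = resolution                 # metres per cell
        self.log_odds = np.zeros((size, size))       # 0.0 = unknown (p = 0.5)

    def update_cell(self, x: float, y: float, occupied: bool) -> None:
        """Update one cell from a sensor observation at world position (x, y)."""
        i = int(x / self.resolution) + self.log_odds.shape[0] // 2
        j = int(y / self.resolution) + self.log_odds.shape[1] // 2
        if 0 <= i < self.log_odds.shape[0] and 0 <= j < self.log_odds.shape[1]:
            self.log_odds[i, j] += 0.85 if occupied else -0.4   # assumed sensor model

    def occupancy_probability(self) -> np.ndarray:
        """Convert log-odds back to occupancy probabilities in [0, 1]."""
        return 1.0 / (1.0 + np.exp(-self.log_odds))
```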
  • the situation prediction unit 154 performs a situation prediction process related to the own vehicle based on data or signals from each unit of the vehicle control system 100 such as the map analysis unit 151, the traffic rule recognition unit 152, and the situation recognition unit 153. For example, the situation prediction unit 154 performs prediction processing such as the situation of the own vehicle, the situation around the own vehicle, and the situation of the driver.
  • the situation of the own vehicle to be predicted includes, for example, the behavior of the own vehicle, the occurrence of an abnormality, the mileage, and the like.
  • The situation around the own vehicle to be predicted includes, for example, the behavior of moving objects around the own vehicle, changes in signal states, changes in the environment such as the weather, and the like.
  • the driver's situation to be predicted includes, for example, the driver's behavior and physical condition.
  • The situation prediction unit 154 supplies data indicating the result of the prediction processing, together with the data from the traffic rule recognition unit 152 and the situation recognition unit 153, to the route planning unit 161, the action planning unit 162, the operation planning unit 163 of the planning unit 134, and the like.
  • the route planning unit 161 plans a route to the destination based on data or signals from each unit of the vehicle control system 100 such as the map analysis unit 151 and the situation prediction unit 154. For example, the route planning unit 161 sets a route from the current position to the specified destination based on the global map. Further, for example, the route planning unit 161 appropriately changes the route based on the conditions such as traffic congestion, accidents, traffic restrictions, construction work, and the physical condition of the driver. The route planning unit 161 supplies data indicating the planned route to the action planning unit 162 and the like.
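  For intuition, route planning over a global map can be modeled as a shortest-path search on a weighted road graph, where congested or closed segments receive higher edge costs before re-planning. The generic Dijkstra sketch below is an illustrative assumption, not the route planning unit 161 itself.

```python
import heapq

def plan_route(graph: dict[str, dict[str, float]], start: str, goal: str) -> list[str]:
    """Shortest route on a weighted road graph (Dijkstra)."""
    queue = [(0.0, start, [start])]        # (accumulated cost, node, path so far)
    visited: set[str] = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for nxt, weight in graph.get(node, {}).items():
            if nxt not in visited:
                heapq.heappush(queue, (cost + weight, nxt, path + [nxt]))
    return []                              # no route found

# Example: raise the cost of a congested edge, then re-plan.
roads = {"A": {"B": 1.0, "C": 4.0}, "B": {"C": 1.0, "D": 5.0}, "C": {"D": 1.0}, "D": {}}
roads["B"]["C"] = 10.0                     # congestion penalty (assumed)
print(plan_route(roads, "A", "D"))         # -> ['A', 'C', 'D']
```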
  • The action planning unit 162 plans the actions of the own vehicle for safely traveling the route planned by the route planning unit 161 within the planned time, based on data or signals from each unit of the vehicle control system 100 such as the map analysis unit 151 and the situation prediction unit 154. For example, the action planning unit 162 plans starting, stopping, traveling direction (for example, forward, backward, left turn, right turn, turning, etc.), traveling lane, traveling speed, overtaking, and the like. The action planning unit 162 supplies data indicating the planned behavior of the own vehicle to the operation planning unit 163 and the like.
  • The operation planning unit 163 plans the operation of the own vehicle for realizing the action planned by the action planning unit 162, based on data or signals from each unit of the vehicle control system 100 such as the map analysis unit 151 and the situation prediction unit 154.
  • the motion planning unit 163 plans acceleration, deceleration, traveling track, and the like.
  • the motion planning unit 163 supplies data indicating the planned operation of the own vehicle to the acceleration / deceleration control unit 172 and the direction control unit 173 of the motion control unit 135.
  • the motion control unit 135 controls the motion of the own vehicle.
  • the operation control unit 135 includes an emergency situation avoidance unit 171, an acceleration / deceleration control unit 172, and a direction control unit 173.
  • The emergency situation avoidance unit 171 performs detection processing of emergencies such as collision, contact, entry into a danger zone, driver abnormality, and vehicle abnormality.
  • When the emergency situation avoidance unit 171 detects the occurrence of an emergency, it plans the operation of the own vehicle to avoid the emergency, such as a sudden stop or a sharp turn.
  • the emergency situation avoidance unit 171 supplies data indicating the planned operation of the own vehicle to the acceleration / deceleration control unit 172, the direction control unit 173, and the like.
  • the acceleration / deceleration control unit 172 performs acceleration / deceleration control for realizing the operation of the own vehicle planned by the motion planning unit 163 or the emergency situation avoidance unit 171.
  • For example, the acceleration/deceleration control unit 172 calculates a control target value of the driving force generator or the braking device for realizing the planned acceleration, deceleration, or sudden stop, and supplies a control command indicating the calculated control target value to the drive system control unit 107.
  • The direction control unit 173 performs direction control for realizing the operation of the own vehicle planned by the operation planning unit 163 or the emergency situation avoidance unit 171. For example, the direction control unit 173 calculates a control target value of the steering mechanism for realizing the traveling track or the sharp turn planned by the operation planning unit 163 or the emergency situation avoidance unit 171, and supplies a control command indicating the calculated control target value to the drive system control unit 107.
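  As a simplified illustration of the kind of computation a direction controller might perform, the sketch below derives a bounded steering command from the heading error toward a planned track. The proportional gain and steering limit are assumed values and the function steering_command is hypothetical; this is not the disclosed implementation of the direction control unit 173.

```python
import math

def steering_command(current_yaw: float, target_yaw: float,
                     gain: float = 1.5, max_steer: float = 0.6) -> float:
    """Proportional heading controller: steering angle (rad) toward the planned track."""
    # Wrap the heading error to [-pi, pi] so the vehicle turns the short way round.
    error = math.atan2(math.sin(target_yaw - current_yaw),
                       math.cos(target_yaw - current_yaw))
    # Saturate the command at the mechanical steering limit.
    return max(-max_steer, min(max_steer, gain * error))
```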
  • FIG. 13 is a diagram showing an example of the hardware configuration of the autonomous mobile device of the present disclosure, such as the autonomous traveling robot 10 described with reference to FIG.
  • the hardware configuration shown in FIG. 13 will be described.
  • the CPU (Central Processing Unit) 301 functions as a data processing unit that executes various processes according to a program stored in the ROM (Read Only Memory) 302 or the storage unit 308. For example, the process according to the sequence described in the above-described embodiment is executed.
  • the RAM (Random Access Memory) 303 stores programs and data executed by the CPU 301. These CPU 301, ROM 302, and RAM 303 are connected to each other by a bus 304.
  • The CPU 301 is connected to an input/output interface 305 via the bus 304. Connected to the input/output interface 305 are an input unit 306, which includes various switches, a keyboard, a touch panel, a mouse, a microphone, and status data acquisition units such as sensors, cameras, and GPS, and an output unit 307, which includes a display, a speaker, and the like.
  • the input information from the sensor 321 is also input to the input unit 306.
  • the output unit 307 also outputs drive information for the drive unit 322 of the mobile device.
  • the CPU 301 inputs commands, status data, and the like input from the input unit 306, executes various processes, and outputs the process results to, for example, the output unit 307.
  • the storage unit 308 connected to the input / output interface 305 is composed of, for example, a hard disk or the like, and stores a program executed by the CPU 301 and various data.
  • the communication unit 309 functions as a transmission / reception unit for data communication via a network such as the Internet or a local area network, and communicates with an external device.
  • the drive 310 connected to the input / output interface 305 drives a removable medium 311 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory such as a memory card, and records or reads data.
  • the technology disclosed in the present specification can have the following configuration.
  • An autonomous moving device including: an image analysis unit that inputs an image taken by a camera and performs image analysis; a measuring unit that acquires observation information for each unit time of the moving state of the autonomous moving device; and an own device absolute yaw angle calculation unit that inputs the image analysis result of the image analysis unit and the observation information of the measuring unit and calculates the traveling direction of the autonomous moving device, wherein the own device absolute yaw angle calculation unit collates a plurality of estimated traveling direction candidate data of the autonomous moving device obtained from the image analysis result of the image analysis unit with integrated value-based estimated traveling direction data of the autonomous moving device obtained from the integration result of the observation information for each unit time of the measuring unit, and calculates the own device absolute yaw angle corresponding to the traveling direction of the autonomous moving device based on the collation result.
  • The autonomous moving device according to (1), wherein the autonomous moving device has a camera that photographs the ceiling vertically above the traveling surface of the autonomous moving device, and the image analysis unit performs image analysis by inputting the ceiling image obtained by photographing the ceiling vertically above the traveling surface of the autonomous moving device.
  • The autonomous moving device according to (1) or (2), wherein the image taken by the camera is a ceiling image of the ceiling vertically above the traveling surface of the autonomous moving device, and the image analysis unit detects the arrangement direction of feature information from the ceiling image and uses the detected feature information arrangement direction to generate a plurality of estimated traveling direction candidate data of the autonomous moving device.
  • The autonomous moving device according to (3), wherein the arrangement direction of the feature information is at least one of the arrangement direction of lighting, the arrangement direction of sprinklers, the arrangement direction of fire alarms, the pattern of ceiling plates, or the arrangement direction of seams.
  • The autonomous moving device according to (3) or (4), wherein the image analysis unit generates a plurality of estimated traveling direction candidate data of the autonomous moving device by applying a trained model generated in advance.
  • The autonomous moving device according to (5), wherein the trained model is a trained model generated in advance by a learning process using various ceiling patterns as input data.
  • The autonomous moving device according to any one of (1) to (6), wherein the measuring unit calculates a yaw angle change amount (Δyaw) corresponding to the amount of change in the traveling direction of the autonomous moving device per unit time.
  • The autonomous moving device according to any one of (1) to (7), wherein the measuring unit is an inertial measurement unit (IMU).
  • The autonomous moving device according to any one of (1) to (8), wherein the own device absolute yaw angle calculation unit collates the plurality of estimated traveling direction candidate data of the autonomous moving device obtained from the image analysis result of the image analysis unit with the integrated value-based estimated traveling direction data of the autonomous moving device obtained from the integration result of the observation information for each unit time of the measuring unit, selects, from the plurality of estimated traveling direction candidate data, the estimated traveling direction candidate data closest to the integrated value-based estimated traveling direction data, and sets the selected estimated traveling direction candidate data as the own device absolute yaw angle corresponding to the traveling direction of the autonomous moving device.
  • The autonomous moving device according to any one of (1) to (9), wherein the own device absolute yaw angle calculation unit collates the result data of a filtering process applied to the plurality of estimated traveling direction candidate data of the autonomous moving device obtained from the image analysis result of the image analysis unit with the result data of a filtering process applied to the integrated value-based estimated traveling direction data of the autonomous moving device obtained from the integration result of the observation information for each unit time of the measuring unit, and calculates the own device absolute yaw angle corresponding to the traveling direction of the autonomous moving device based on the collation result.
  • The autonomous moving device according to any one of (1) to (11), wherein the autonomous moving device has a drive unit that controls the movement of the autonomous moving device, and the drive unit inputs the traveling direction data of the autonomous moving device from the own device absolute yaw angle calculation unit and controls the movement of the autonomous moving device.
  • An autonomous movement control method executed in an autonomous moving device, including: an image analysis step in which an image analysis unit inputs an image taken by a camera and performs image analysis; a measurement step in which a measuring unit acquires observation information for each unit time of the moving state of the autonomous moving device; and an own device absolute yaw angle calculation step in which an own device absolute yaw angle calculation unit inputs the image analysis result of the image analysis unit and the observation information of the measuring unit and calculates the traveling direction of the autonomous moving device, wherein in the own device absolute yaw angle calculation step, a plurality of estimated traveling direction candidate data of the autonomous moving device obtained from the image analysis result of the image analysis unit is collated with integrated value-based estimated traveling direction data of the autonomous moving device obtained from the integration result of the observation information for each unit time of the measuring unit, and the own device absolute yaw angle corresponding to the traveling direction of the autonomous moving device is calculated based on the collation result.
  • A program that causes an autonomous moving device to execute autonomous movement control, the program causing: an image analysis unit to execute an image analysis step of inputting an image taken by a camera and performing image analysis; a measuring unit to execute a measurement step of acquiring observation information for each unit time of the moving state of the autonomous moving device; and an own device absolute yaw angle calculation unit to execute an own device absolute yaw angle calculation step of inputting the image analysis result of the image analysis unit and the observation information of the measuring unit and calculating the traveling direction of the autonomous moving device.
  • the series of processes described in the specification can be executed by hardware, software, or a composite configuration of both.
  • the program can be pre-recorded on a recording medium.
  • the various processes described in the specification are not only executed in chronological order according to the description, but may also be executed in parallel or individually as required by the processing capacity of the device that executes the processes.
  • the system is a logical set configuration of a plurality of devices, and the devices having each configuration are not limited to those in the same housing.
  • The configuration has an image analysis unit that analyzes a ceiling image taken by a camera, a measuring unit that acquires observation information for each unit time of the moving state of the autonomous moving device, and an own device absolute yaw angle calculation unit that inputs the ceiling image analysis result of the image analysis unit and the observation information of the measuring unit and calculates the traveling direction of the autonomous moving device.
  • The own device absolute yaw angle calculation unit collates a plurality of estimated traveling direction candidate data of the autonomous moving device obtained from the ceiling image analysis result with integrated value-based estimated traveling direction data of the autonomous moving device obtained from the integration result of the observation information for each unit time of the measuring unit, and thereby calculates the own device absolute yaw angle corresponding to the traveling direction of the autonomous moving device; a simplified sketch of this pipeline is shown below.
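  The following is a minimal sketch, under assumptions, of the collation pipeline summarized above: per-unit-time yaw changes are integrated into an estimate, both the image-derived candidates and the integrated estimate can be smoothed, and the candidate closest in circular distance to the integrated estimate is taken as the absolute yaw. Function names (integrate_yaw, low_pass_angle, select_absolute_yaw), the smoothing factor, and the example values are illustrative, not the disclosed implementation.

```python
import math

def wrap_angle(a: float) -> float:
    """Wrap an angle to [-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

def integrate_yaw(initial_yaw: float, delta_yaws: list[float]) -> float:
    """Integrated value-based heading: accumulate per-unit-time yaw changes (delta-yaw)."""
    yaw = initial_yaw
    for d in delta_yaws:
        yaw += d
    return wrap_angle(yaw)

def low_pass_angle(previous: float, measurement: float, alpha: float = 0.2) -> float:
    """Exponential low-pass filter on an angle (illustrative smoothing before collation)."""
    return wrap_angle(previous + alpha * wrap_angle(measurement - previous))

def select_absolute_yaw(candidates: list[float], integrated_estimate: float) -> float:
    """Pick the image-derived candidate closest (circular distance) to the integrated estimate."""
    return min(candidates, key=lambda c: abs(wrap_angle(c - integrated_estimate)))

# Example: ceiling-image candidates 90 degrees apart, IMU estimate of roughly 95 degrees.
candidates = [math.radians(a) for a in (5, 95, 185, 275)]
imu_estimate = integrate_yaw(0.0, [math.radians(1.0)] * 95)    # 95 one-degree steps
absolute_yaw = select_absolute_yaw(candidates, imu_estimate)   # -> about 95 degrees
```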
  • 111: Storage unit, 112: Automatic operation control unit, 121: Communication network, 131: Detection unit, 132: Self-position estimation unit, 133: Situation analysis unit, 134: Planning unit, 135: Motion control unit, 141: Vehicle outside information detection unit, 142: Vehicle inside information detection unit, 143: Vehicle state detection unit, 151: Map analysis unit, 152: Traffic rule recognition unit, 153: Situation recognition unit, 154: Situation prediction unit, 161: Route planning unit, 162: Action planning unit, 163: Operation planning unit, 171: Emergency situation avoidance unit, 172: Acceleration/deceleration control unit, 173: Direction control unit
  • 301: CPU, 302: ROM, 303: RAM, 304: Bus, 305: Input/output interface, 306: Input unit, 307: Output unit, 308: Storage unit, 309: Communication unit, 310: Drive, 311: Removable media, 321: Sensor, 322: Drive unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Image Analysis (AREA)

Abstract

L'invention concerne un dispositif et un procédé qui permettent une commande de mouvement à haute précision d'un dispositif à déplacement autonome. La présente invention comprend une unité d'analyse d'image qui analyse une image de plafond qui a été capturée par une caméra, une unité de mesure qui acquiert des informations d'observation par unité de temps concernant l'état de mouvement d'un dispositif à déplacement autonome, et une unité de calcul d'angle de lacet absolu de dispositif qui reçoit en entrée les résultats de l'analyse d'image de plafond provenant de l'unité d'analyse d'image et les informations d'observation provenant de l'unité de mesure et calcule la direction de progression du dispositif à déplacement autonome. L'unité de calcul d'angle de lacet absolu de dispositif compare une pluralité d'éléments de données de direction de progression estimée candidate pour le dispositif à déplacement autonome, obtenues à partir des résultats de l'analyse d'image de plafond, à des données de direction de progression estimée basée sur une valeur intégrée pour le dispositif à déplacement autonome, obtenues par intégration des informations d'observation par unité de temps provenant de l'unité de mesure, et calcule ainsi un angle de lacet absolu de dispositif qui correspond à la direction de progression du dispositif à déplacement autonome.
PCT/JP2021/000295 2020-01-31 2021-01-07 Dispositif à déplacement autonome, procédé de commande de mouvement autonome, et programme WO2021153176A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020014375A JP2021120837A (ja) 2020-01-31 2020-01-31 自律移動装置、自律移動制御方法、並びにプログラム
JP2020-014375 2020-01-31

Publications (1)

Publication Number Publication Date
WO2021153176A1 true WO2021153176A1 (fr) 2021-08-05

Family

ID=77079333

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/000295 WO2021153176A1 (fr) 2020-01-31 2021-01-07 Dispositif à déplacement autonome, procédé de commande de mouvement autonome, et programme

Country Status (2)

Country Link
JP (1) JP2021120837A (fr)
WO (1) WO2021153176A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220300006A1 (en) * 2020-08-19 2022-09-22 Topcon Positioning Systems, Inc. System for Monitoring Stability of Operation of Autonomous Robots
WO2023124009A1 (fr) * 2021-12-31 2023-07-06 北京石头创新科技有限公司 Procédé et appareil de détermination d'état pour robot de nettoyage

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009098743A (ja) * 2007-10-12 2009-05-07 Sanyo Electric Co Ltd 点検システム、移動体、操作装置、及び点検プログラム
JP2018065171A (ja) * 2016-10-19 2018-04-26 三菱日立パワーシステムズ株式会社 配管内移動ロボットによる施工システムおよび施工方法
JP2018109849A (ja) * 2016-12-28 2018-07-12 本田技研工業株式会社 制御装置、監視装置及び制御用プログラム
US20190323845A1 (en) * 2016-11-09 2019-10-24 The Texas A&M University System Method and System for Accurate Long Term Simultaneous Localization and Mapping with Absolute Orientation Sensing

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009098743A (ja) * 2007-10-12 2009-05-07 Sanyo Electric Co Ltd 点検システム、移動体、操作装置、及び点検プログラム
JP2018065171A (ja) * 2016-10-19 2018-04-26 三菱日立パワーシステムズ株式会社 配管内移動ロボットによる施工システムおよび施工方法
US20190323845A1 (en) * 2016-11-09 2019-10-24 The Texas A&M University System Method and System for Accurate Long Term Simultaneous Localization and Mapping with Absolute Orientation Sensing
JP2018109849A (ja) * 2016-12-28 2018-07-12 本田技研工業株式会社 制御装置、監視装置及び制御用プログラム

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220300006A1 (en) * 2020-08-19 2022-09-22 Topcon Positioning Systems, Inc. System for Monitoring Stability of Operation of Autonomous Robots
WO2023124009A1 (fr) * 2021-12-31 2023-07-06 北京石头创新科技有限公司 Procédé et appareil de détermination d'état pour robot de nettoyage

Also Published As

Publication number Publication date
JP2021120837A (ja) 2021-08-19

Similar Documents

Publication Publication Date Title
JP7136106B2 (ja) 車両走行制御装置、および車両走行制御方法、並びにプログラム
US11661084B2 (en) Information processing apparatus, information processing method, and mobile object
US20200409387A1 (en) Image processing apparatus, image processing method, and program
US20200241549A1 (en) Information processing apparatus, moving apparatus, and method, and program
WO2019181284A1 (fr) Dispositif de traitement d'informations, dispositif de mouvement, procédé et programme
JP7143857B2 (ja) 情報処理装置、情報処理方法、プログラム、及び、移動体
JP6891753B2 (ja) 情報処理装置、移動装置、および方法、並びにプログラム
US20210027486A1 (en) Controller, control method, and program
JP7257737B2 (ja) 情報処理装置、自己位置推定方法、及び、プログラム
US20220253065A1 (en) Information processing apparatus, information processing method, and information processing program
WO2019150918A1 (fr) Dispositif de traitement d'information, procédé de traitement d'information, programme, et corps mobile
WO2021153176A1 (fr) Dispositif à déplacement autonome, procédé de commande de mouvement autonome, et programme
US20200230820A1 (en) Information processing apparatus, self-localization method, program, and mobile body
WO2020241303A1 (fr) Dispositif de commande de déplacement autonome, système de commande de déplacement autonome et procédé de commande de déplacement autonome
WO2022158185A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme et dispositif mobile
JP2023541322A (ja) 低信頼度の物体検出条件における車両動作のための注釈及びマッピング
WO2020213275A1 (fr) Dispositif, procédé et programme de traitement d'informations
JP7135690B2 (ja) 情報処理装置および方法、プログラム、並びに移動体制御システム
WO2023153083A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme de traitement d'informations et dispositif de déplacement
US11366237B2 (en) Mobile object, positioning system, positioning program, and positioning method
US20230260254A1 (en) Information processing device, information processing method, and program
WO2019176278A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme et corps mobile
WO2020261703A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
US20230206596A1 (en) Information processing device, information processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21748429

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21748429

Country of ref document: EP

Kind code of ref document: A1