WO2022138213A1 - Moving body, control method, and control program - Google Patents

Moving body, control method, and control program

Info

Publication number
WO2022138213A1
Authority
WO
WIPO (PCT)
Prior art keywords
moving body
self
unit
laser beam
control unit
Prior art date
Application number
PCT/JP2021/045398
Other languages
French (fr)
Japanese (ja)
Inventor
志門 鯵坂 (Shimon Ajisaka)
Original Assignee
京セラ株式会社 (Kyocera Corporation)
Priority date
Filing date
Publication date
Application filed by 京セラ株式会社 (Kyocera Corporation)
Publication of WO2022138213A1 publication Critical patent/WO2022138213A1/en


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S 17/933 Lidar systems specially adapted for specific applications for anti-collision purposes of aircraft or spacecraft
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/10 Simultaneous control of position or course in three dimensions

Definitions

  • This application relates to moving bodies, control methods, and control programs.
  • Some moving bodies, for example, estimate their own position and move autonomously based on that self-position.
  • In one known technique, a point group forming a plane is extracted, for each frame, from the point group acquired for that frame as a plane point group, and the position of the plane formed by the plane point group is specified for each frame.
  • The translation vector and rotation matrix of the moving body between frames are then calculated from the relationship between the position of the plane specified in a given frame and the position of the plane specified in the next frame; it is disclosed that by calculating the translation vector and rotation matrix sequentially for each frame, the trajectory of the moving body is estimated.
  • For example, when the surrounding environment does not change while the body is moving, the accuracy of the self-position estimated from the sensor measurement results of a conventional moving body decreases. Conventional moving bodies therefore have room for improvement in self-position estimation accuracy when the surrounding environment changes little.
  • The moving body according to one aspect includes a sensor that irradiates a laser beam and measures a distance based on the reflected light, an estimation unit that estimates the self-position and a surrounding map of the moving body using the sensor, a support portion that supports the sensor so that the irradiation direction of the laser beam can be changed, and a support control unit that controls the support portion so as to actively tilt the irradiation direction of the laser beam in a direction that suppresses a decrease in the estimation accuracy of the estimation unit.
  • The control method according to one aspect is a method of controlling a moving body that includes a sensor that irradiates a laser beam and measures a distance based on the reflected light, and a support portion that supports the sensor so that the irradiation direction of the laser beam can be changed. The method includes estimating the self-position and a surrounding map of the moving body using the sensor, and controlling the support portion so as to actively tilt the irradiation direction of the laser beam in a direction that suppresses a decrease in the estimation accuracy of the self-position and the surrounding map.
  • The control program according to one aspect is a program for a moving body that includes a sensor that irradiates a laser beam and measures a distance based on the reflected light, and a support portion that supports the sensor so that the irradiation direction of the laser beam can be changed. The program causes a computer to estimate the self-position and a surrounding map of the moving body using the sensor, and to control the support portion so as to actively tilt the irradiation direction of the laser beam in a direction that suppresses a decrease in the estimation accuracy of the self-position and the surrounding map.
  • FIG. 1 is a diagram for explaining an outline of a moving body according to an embodiment.
  • FIG. 2 is a diagram showing an example of the configuration of the moving body according to the embodiment.
  • FIG. 3 is a diagram showing an example of the functional configuration of the control unit shown in FIG. 2.
  • FIG. 4 is a diagram for explaining an example of an error ellipse showing the certainty of the self-position.
  • FIG. 5 is a diagram for explaining an example in which the moving body according to the embodiment tilts the irradiation direction.
  • FIG. 6 is a flowchart showing an example of a processing procedure executed by the mobile body 100 according to the embodiment.
  • FIG. 1 is a diagram for explaining an outline of the moving body according to the embodiment.
  • the mobile body 100 shown in FIG. 1 is, for example, a flying object capable of autonomously flying.
  • Flying objects include, for example, drones and airplanes.
  • The present disclosure describes the case where the moving body 100 is a flying object, but it is not limited to this; for example, the moving body 100 may be a vehicle, a robot, or the like that can move autonomously.
  • the moving body 100 includes a main body 110, a plurality of rotary wings 150, a plurality of legs 170, and a camera 190.
  • In the example shown in FIG. 1, the case where there are four rotary wings 150 is described, but the number is not limited to this.
  • When not in flight, the moving body 100 stands by with the legs 170 in contact with the ground.
  • The moving body 100 includes a LiDAR (Light Detection And Ranging / Laser Imaging Detection And Ranging) 124.
  • The LiDAR 124, for example, emits pulsed laser light and measures the reflected light to measure the distance, direction, and the like of a target.
  • The moving body 100 irradiates the laser beam of the LiDAR 124 in the horizontal direction, measures the light reflected by a target, and measures the target's distance, direction, and the like.
  • The moving body 100 estimates its self-position and a surrounding map using the measurement results of the LiDAR 124, for example by the SLAM (Simultaneous Localization And Mapping) method.
  • the surrounding map can be represented by, for example, point cloud data.
  • the point cloud data has information on three-dimensional coordinates and colors.
  • the moving body 100 estimates the likelihood of the estimated self-position based on the estimated surrounding map and the moving amount of the moving body 100.
  • The likelihood of the self-position has the characteristic that it becomes smaller as the moving body 100 moves. In other words, a large likelihood corresponds to a well-estimated self-position, and a small likelihood corresponds to a large error ellipse (variance).
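
How the likelihood and the error ellipse evolve with movement can be illustrated with a minimal Kalman-filter prediction sketch (the matrices and noise values below are illustrative assumptions, not taken from the patent): each motion step adds process noise to the position covariance, so the error ellipse widens until an informative measurement shrinks it again.

```python
import numpy as np

# Illustrative 2D position covariance and per-step motion (process) noise.
P = np.diag([0.01, 0.01])   # initial position covariance [m^2]
Q = np.diag([0.005, 0.02])  # process noise, larger along the moving direction M
F = np.eye(2)               # constant-position transition model for this sketch

for step in range(5):
    # Kalman prediction: the covariance grows by Q at every motion step.
    # Without an informative LiDAR measurement no update shrinks it back,
    # so the error ellipse widens and the self-position likelihood falls.
    P = F @ P @ F.T + Q
    print(f"step {step}: position variances = {np.diag(P)}")
```
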
  • the moving body 100 is flying inside the building 1000.
  • the building 1000 has a floor 1010, a wall 1020 provided on the floor 1010, and a ceiling 1030 facing the floor 1010.
  • the floor 1010, the wall 1020, and the ceiling 1030 form a space 2000 in which the moving body 100 can fly.
  • In the scene ST1, while flying at altitude H1 in the space 2000, the moving body 100 irradiates the laser beam in the irradiation direction L1 and measures the light reflected by the wall 1020 to measure the distance, direction, and the like of the target.
  • The irradiation direction L1 is, for example, horizontal with respect to the floor 1010.
  • The moving body 100 then rises to altitude H2, irradiates the laser beam in the irradiation direction L1, and again measures the light reflected by the wall 1020 to measure the distance, direction, and the like of the target.
  • In this case, the moving body 100 measures point cloud data indicating similar walls 1020 at altitudes H1 and H2 with the LiDAR 124. Because the point cloud data does not change between altitude H1 and altitude H2, the moving body 100 cannot estimate the change in altitude. As a result, the estimation accuracy of the self-position in the moving direction M and of the surrounding map may decrease.
  • the moving body 100 changes the LiDAR 124 from the irradiation direction L1 to the irradiation direction L2.
  • the LiDAR 124 irradiates the laser beam in the irradiation direction L2 while flying at the altitude H1 in the space 2000.
  • The irradiation direction L2, for example, intersects the irradiation direction L1 and points from the moving body 100 toward a characteristic region near the corner between the floor 1010 and the wall 1020.
  • the laser beam emitted by the LiDAR 124 illuminates a characteristic region from the floor 1010 to the vicinity of the corner between the floor 1010 and the wall 1020.
  • the moving body 100 measures the reflected light and measures the distance, direction, and the like of the target.
  • In this case, at altitudes H1 and H2, the moving body 100 obtains point cloud data that changes from the floor 1010 to the wall 1020.
  • Thus, when moving through a place where the surrounding environment changes little, the moving body 100 can estimate its change in altitude from the amount of change in the point cloud data by changing the irradiation direction of the LiDAR 124.
  • As a result, the moving body 100 can suppress the influence of the surrounding environment and improve the self-position estimation accuracy based on the estimated altitude change and the result of its actual movement.
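
The geometry behind this can be sketched in a simplified room cross-section (the dimensions and tilt angle are assumed for illustration; this is not the patent's algorithm): a horizontal beam returns the same wall distance at any altitude, while a beam tilted toward the floor-wall corner returns a range that depends on altitude, so the change in range between H1 and H2 encodes the altitude change.

```python
import math

def lidar_range(altitude_m, tilt_deg, wall_dist_m):
    """Range of a single beam in a room cross-section.

    tilt_deg = 0 is horizontal (irradiation direction L1); a positive tilt
    points the beam toward the floor. The beam hits whichever surface it
    reaches first: the floor or the wall.
    """
    tilt = math.radians(tilt_deg)
    if tilt > 0:
        return min(altitude_m / math.sin(tilt),   # range to the floor
                   wall_dist_m / math.cos(tilt))  # range to the wall
    return wall_dist_m  # horizontal beam: the wall distance only

WALL = 10.0  # assumed distance to the wall 1020 [m]
for h in (2.0, 4.0):  # altitudes H1 and H2 [m]
    print(f"H={h}: horizontal {lidar_range(h, 0, WALL):.2f} m, "
          f"tilted 30 deg {lidar_range(h, 30, WALL):.2f} m")
```

Running this prints the same horizontal range at both altitudes but different tilted ranges, which is exactly the altitude-dependent change that the tilted irradiation direction L2 provides.
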
  • FIG. 2 is a diagram showing an example of the configuration of the mobile body 100 according to the embodiment.
  • As shown in FIG. 2, the main body 110 of the moving body 100 includes a communication unit 121, an imaging control unit 122, a power control unit 123, the LiDAR 124, a gimbal 125, an IMU (Inertial Measurement Unit) 126, a storage unit 127, and a control unit 128.
  • The control unit 128 is electrically connected to the communication unit 121, the imaging control unit 122, the power control unit 123, the LiDAR 124, the gimbal 125, the IMU 126, the storage unit 127, and the like.
  • The communication unit 121 can communicate with devices external to the moving body 100 to exchange various data.
  • the communication unit 121 can receive radio signals in a predetermined frequency band from GPS satellites.
  • the communication unit 121 can perform demodulation processing of the received radio wave signal and send the processed signal to the control unit 128.
  • The imaging control unit 122 controls the capture of images using the camera 190.
  • Control by the imaging control unit 122 includes control of the imaging direction of the camera 190.
  • As the camera 190, for example, a monocular camera, an infrared camera, a depth camera, or the like can be used.
  • The camera 190 is provided on the main body 110 via a gimbal or the like so that the imaging direction can be changed.
  • The imaging control unit 122 can provide image information acquired from the camera 190 to the control unit 128.
  • the power control unit 123 controls the driving force of a plurality of motors 140.
  • The plurality of motors 140 rotate the rotary wings 150 of the moving body 100.
  • The plurality of motors 140 control the rotation speeds of the plurality of rotary wings 150.
  • The power control unit 123 realizes ascent, descent, flight, and the like of the moving body 100 by rotating the motors 140 under the control of the control unit 128.
  • In the present embodiment, the case where the four rotary wings 150 are driven by four motors 140 is described, but the configuration is not limited thereto.
  • As described above, the LiDAR 124 emits pulsed laser light and measures the reflected light to measure the distance, direction, and the like of a target.
  • the LiDAR 124 is provided on the main body 110, for example, so that the laser beam can be irradiated in the direction in which the nose of the moving body 100 faces.
  • the LiDAR 124 supplies the measured result to the control unit 128.
  • the gimbal 125 is a support mechanism provided on the main body 110 and supporting the LiDAR 124 so that the irradiation direction of the laser beam can be changed.
  • the gimbal 125 can be supported in a state where the irradiation direction of the LiDAR 124 is directed to a desired direction in the vertical direction of the main body 110 by causing an arc movement about an axis along the horizontal direction by rotation of a motor (not shown).
  • the gimbal 125 can support the LiDAR 124 so that the horizontal direction of the main body 110 is the reference irradiation direction.
  • the gimbal 125 can change the support angle of the LiDAR 124 so as to have the irradiation direction required by the control unit 128.
  • In the present embodiment, the case where the moving body 100 includes the gimbal 125 as the support portion, configured so that the LiDAR 124 can be moved in an arc in the vertical and horizontal directions of the main body 110, is described, but the support portion is not limited thereto.
  • For example, the support portion may be a rotation support mechanism or the like formed on the main body 110, as long as the angle of the irradiation direction L1 of the LiDAR 124 can be adjusted.
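
As one concrete reading of such a support portion (a hypothetical model with illustrative angle limits, not a specification from the patent), the gimbal can be reduced to a clamped tilt angle about the horizontal reference direction; the commanded angle doubles as the encoder value that the posture acquisition unit described below would read.

```python
from dataclasses import dataclass

@dataclass
class Gimbal:
    """Minimal LiDAR support sketch: one tilt axis about the horizontal.

    0 deg is the reference (horizontal) irradiation direction L1; positive
    angles tilt the beam downward. The limits are illustrative assumptions.
    """
    min_deg: float = -90.0
    max_deg: float = 90.0
    angle_deg: float = 0.0  # current support angle (stands in for an encoder)

    def command(self, target_deg: float) -> float:
        # Clamp the requested irradiation direction to the mechanical range.
        self.angle_deg = max(self.min_deg, min(self.max_deg, target_deg))
        return self.angle_deg

    def posture(self) -> float:
        # Posture information as read by the posture acquisition unit.
        return self.angle_deg
```
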
  • The IMU 126 senses the movement of the moving body 100.
  • the IMU 126 has, for example, a 3-axis gyro sensor, a 3-axis acceleration sensor, and the like, and supplies inertial information indicating the detected three-dimensional angular velocity, acceleration, and the like to the control unit 128.
  • the storage unit 127 can store programs and data.
  • The storage unit 127 may include any non-transitory storage medium, such as a semiconductor storage medium or a magnetic storage medium.
  • The storage unit 127 may include a combination of a storage medium, such as a memory card, an optical disk, or a magneto-optical disk, and a device for reading the storage medium.
  • the storage unit 127 may include a storage device used as a temporary storage area such as a RAM.
  • the storage unit 127 can store the control program 127a, the control data 127b, the self-position data 127c, and the point cloud data 127d.
  • The control program 127a can provide functions for realizing processing related to various operations of the moving body 100.
  • The control program 127a can provide various functions related to flight control of the moving body 100, including a function for controlling the driving force of the motors 140 based on the measurement results of the LiDAR 124.
  • The control program 127a can provide a function for estimating the self-position and the surrounding map of the moving body 100 based on the measurement results of, for example, the LiDAR 124 and the IMU 126.
  • the control data 127b includes data referred to for executing processing related to various operations of the mobile body 100.
  • the control data 127b includes, for example, data such as a planned movement route of the moving body 100 and an estimated self-position.
  • the self-position data 127c includes, for example, data showing the self-position of the moving body 100 in chronological order.
  • The point cloud data 127d includes point cloud data representing the surrounding map estimated using the measurement results of the LiDAR 124 and the SLAM method.
  • the point cloud data 127d shows, for example, a map around the moving body 100 by a plurality of points.
  • the control unit 128 includes one or more arithmetic units.
  • The arithmetic units include, for example, a CPU (Central Processing Unit), an SoC (System-on-a-Chip), an MCU (Micro Control Unit), an FPGA (Field-Programmable Gate Array), and a coprocessor, but are not limited thereto.
  • the control unit 128 realizes processing related to various operations of the mobile body 100 by causing an arithmetic unit (computer) to execute the control program 127a.
  • The control unit 128 may realize at least a part of the functions provided by the control program 127a with a dedicated IC (Integrated Circuit).
  • the control unit 128 controls the flight (movement) of the moving body 100 based on the estimated self-position by executing the control program 127a.
  • the control unit 128 realizes the flight of the moving body 100 by controlling the power control unit 123.
  • FIG. 3 is a diagram showing an example of the functional configuration of the control unit 128 shown in FIG. 2.
  • the control unit 128 of the moving body 100 includes a posture acquisition unit 128a, a point cloud acquisition unit 128b, an inertia acquisition unit 128c, an estimation unit 128d, a likelihood identification unit 128e, a determination unit 128f, and a support control unit 128g.
  • By executing the control program 127a, the control unit 128 functions as the posture acquisition unit 128a, the point cloud acquisition unit 128b, the inertia acquisition unit 128c, the estimation unit 128d, the likelihood specifying unit 128e, the determination unit 128f, the support control unit 128g, and the like.
  • the posture acquisition unit 128a acquires posture information indicating the posture of the gimbal 125.
  • The posture acquisition unit 128a acquires the posture of the gimbal 125, that is, information indicating the state in which the gimbal supports the LiDAR 124, based on, for example, the encoder of the motor that rotates the gimbal 125, control results, and the like.
  • the posture acquisition unit 128a supplies the acquired posture information to the estimation unit 128d.
  • The point cloud acquisition unit 128b acquires point cloud data 127d having information such as three-dimensional coordinate values and colors measured by the LiDAR 124.
  • The point cloud acquisition unit 128b stores the acquired point cloud data 127d in the storage unit 127 in chronological order.
  • The point cloud acquisition unit 128b supplies the acquired point cloud data 127d to the estimation unit 128d.
  • the inertia acquisition unit 128c acquires inertia information indicating the three-dimensional angular velocity, acceleration, etc. measured by the IMU 126 and stores it in the storage unit 127.
  • the inertia acquisition unit 128c supplies the acquired inertia information to the estimation unit 128d.
  • the estimation unit 128d estimates the latest self-position and surrounding map of the moving body 100 based on the point cloud data 127d, inertial information, and the like. For example, the estimation unit 128d estimates the self-position of the moving body 100 and the surrounding map by using the SLAM method. The estimation unit 128d supplies the estimated result to the determination unit 128f.
  • The likelihood specifying unit 128e specifies the likelihood of the self-position estimation by comparing the self-position estimated by the estimation unit 128d with the past self-position data 127c, the movement plan, and the like.
  • the likelihood specifying unit 128e specifies an error ellipse 300 obtained by calculating a region in which the moving body 100 exists with a predetermined probability by using, for example, a known Kalman filter or the like.
  • FIG. 4 is a diagram for explaining an example of the error ellipse 300 showing the certainty of the self-position.
  • In the example shown in FIG. 4, the likelihood specifying unit 128e identifies a likelihood whose error ellipse 300, indicating the certainty of the self-position, is larger at altitude H2 than at altitude H1 and whose error direction is elongated in the moving direction M.
  • the likelihood specifying unit 128e supplies the specified likelihood to the determination unit 128f.
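
One standard way to obtain such an error ellipse from an estimated position covariance (a textbook construction assumed here; the patent does not spell out the computation) is to take the eigenvectors of the covariance as the ellipse axes and scale them with a chi-square quantile for the chosen probability.

```python
import numpy as np

def error_ellipse(P, chi2_2dof=5.991):
    """Half-axes of the region holding the true position with ~95% probability.

    P: 2x2 position covariance; chi2_2dof: chi-square quantile for
    2 degrees of freedom (5.991 corresponds to 95%).
    """
    eigvals, eigvecs = np.linalg.eigh(P)      # eigh: P is symmetric
    half_axes = np.sqrt(chi2_2dof * eigvals)  # scale to the 95% region
    return half_axes, eigvecs                 # axis lengths, axis directions

# Covariance elongated along the moving direction M (illustrative values).
P = np.array([[0.04, 0.0],
              [0.0, 0.50]])
axes, dirs = error_ellipse(P)
print("half-axes [m]:", axes)                  # long axis = error direction
print("long-axis direction:", dirs[:, np.argmax(axes)])
```
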
  • The determination unit 128f determines the direction in which the irradiation direction L1 of the LiDAR 124 is tilted based on the likelihood of the self-position estimation. When the error ellipse 300 becomes larger than a threshold value, the determination unit 128f determines the direction and angle at which the irradiation direction L1 is tilted, for example, the direction and angle of inclination from the reference irradiation direction L1 to the irradiation direction L2. The determination unit 128f determines the direction and angle based on the error direction of the self-position estimation, and supplies the determined direction and angle to the support control unit 128g.
  • FIG. 5 is a diagram for explaining an example in which the moving body 100 according to the embodiment tilts the irradiation direction L1.
  • the moving body 100 moves (ascends) in the moving direction M from the altitude H1 to the altitude H2.
  • In this case, the certainty of the self-position indicated by the error ellipse 300 decreases according to the amount of movement. That is, the likelihood of the self-position estimation has an error ellipse 300 whose error direction is elongated in the moving direction M.
  • the determination unit 128f determines the direction in which the irradiation direction L1 of the LiDAR 124 is tilted in the vertical direction 410, which is the movement direction M.
  • the determination unit 128f determines the direction and angle toward the floor 1010, the ceiling 1030, etc., which are the characteristics of the building 1000, based on the self-position, the point cloud data 127d, and the like.
  • The determination unit 128f identifies a characteristic region around the self-position from the point cloud data 127d based on the self-position of the moving body 100, calculates the direction and angle from the self-position toward the characteristic region, and determines the tilting direction based on the calculation result.
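
A minimal sketch of this calculation (the coordinates and the pre-selected corner point are hypothetical stand-ins for the characteristic-region search in the point cloud data 127d):

```python
import numpy as np

def tilt_toward_feature(self_pos, feature_point):
    """Bearing and downward tilt from the self-position toward a
    characteristic region (e.g., a point near the floor-wall corner)."""
    d = np.asarray(feature_point, dtype=float) - np.asarray(self_pos, dtype=float)
    horizontal = np.hypot(d[0], d[1])
    bearing_deg = np.degrees(np.arctan2(d[1], d[0]))           # horizontal direction
    tilt_down_deg = np.degrees(np.arctan2(-d[2], horizontal))  # positive = downward
    return bearing_deg, tilt_down_deg

self_pos = (0.0, 0.0, 2.0)  # x, y, altitude [m] at H1 (assumed)
corner = (8.0, 0.0, 0.0)    # characteristic region at the floor-wall corner
print(tilt_toward_feature(self_pos, corner))  # -> (0.0, ~14.0 deg downward)
```
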
  • the moving body 100 keeps the altitude H3 constant and moves horizontally in the moving direction MH from the position P1 to the position P2.
  • In this case, the certainty of the self-position indicated by the error ellipse 300 decreases according to the amount of horizontal movement. That is, the likelihood of the self-position estimation has an error ellipse 300 whose error direction is elongated in the moving direction MH.
  • the determination unit 128f determines the direction in which the irradiation direction L1 of the LiDAR 124 is tilted in the horizontal direction 420, which is the movement direction MH.
  • the determination unit 128f determines the direction and angle toward the wall 1020, the feature on the floor 1010, etc., which are the features of the building 1000, based on the self-position, the point cloud data 127d, and the like.
  • In the present embodiment, the case where the determination unit 128f determines the direction in which the irradiation direction L1 is tilted to be the vertical direction 410 or the horizontal direction 420 has been described, but the present invention is not limited to this.
  • the determination unit 128f may determine the direction in which the irradiation direction L1 is tilted in the vertical direction 410 and the horizontal direction 420.
  • the support control unit 128g controls the gimbal 125 so as to actively tilt the irradiation direction L1 of the laser beam in the direction of suppressing the deterioration of the estimation accuracy of the estimation unit 128d.
  • the support control unit 128g controls the gimbal 125 so that the irradiation direction L1 of the LiDAR 124 is tilted based on the direction and angle at which the irradiation direction L1 determined by the determination unit 128f is tilted.
  • the support control unit 128g changes the irradiation direction L1 of the LiDAR 124 by rotating the gimbal 125 so that the angle of the gimbal 125 supporting the LiDAR 124 is tilted.
  • the functional configuration example of the mobile body 100 according to the present embodiment has been described above.
  • the above configuration described with reference to FIGS. 2 and 3 is merely an example, and the functional configuration of the mobile body 100 according to the present embodiment is not limited to such an example.
  • the functional configuration of the mobile body 100 according to the present embodiment can be flexibly modified according to specifications and operations.
  • FIG. 6 is a flowchart showing an example of a processing procedure executed by the mobile body 100 according to the embodiment.
  • the processing procedure shown in FIG. 6 is realized by the control unit 128 of the mobile body 100 executing the control program 127a.
  • the processing procedure shown in FIG. 6 is repeatedly executed by the control unit 128.
  • The control unit 128 of the moving body 100 acquires the posture information of the gimbal 125 (step S101). For example, the control unit 128 acquires posture information that identifies the current angle at which the gimbal 125 supports the LiDAR 124, using the encoder of the gimbal 125, control results, and the like.
  • When the control unit 128 has stored the acquired posture information in the storage unit 127, the process proceeds to step S102.
  • the control unit 128 acquires inertial information from the IMU 126 (step S102). For example, the control unit 128 acquires inertial information indicating the three-dimensional angular velocity, acceleration, etc. of the moving body 100 measured by the IMU 126. When the control unit 128 stores the acquired inertial information in the storage unit 127, the process proceeds to step S103.
  • The control unit 128 estimates the self-position and the surrounding map based on the measurement results of the LiDAR 124 (step S103). For example, the control unit 128 estimates the self-position and the surrounding map of the moving body 100 from the measurement results of the LiDAR 124 using the SLAM method. The control unit 128 reflects the estimated self-position in the self-position data 127c of the storage unit 127, and reflects the estimated surrounding map in the point cloud data 127d of the storage unit 127. When the process of step S103 is completed, the control unit 128 advances the process to step S104.
  • The control unit 128 specifies the likelihood of the self-position estimation (step S104). For example, the control unit 128 compares the estimated self-position with the past self-position data 127c, the movement plan, and the like to specify the likelihood of the self-position estimation. As described above, the control unit 128 specifies the error ellipse 300 obtained by calculating the region where the moving body 100 exists with a predetermined probability using, for example, a known Kalman filter or the like. When the control unit 128 has stored the specified likelihood in the storage unit 127, the process proceeds to step S105.
  • the control unit 128 determines whether or not the error ellipse 300 exceeds the threshold value (step S105). For example, the control unit 128 obtains the size of the error ellipse 300, and determines that the error ellipse 300 exceeds the threshold value when the size exceeds a preset threshold value for determination. When the control unit 128 determines that the error ellipse 300 exceeds the threshold value (Yes in step S105), the control unit 128 advances the process to step S106.
  • the control unit 128 determines the direction in which the irradiation direction L1 is tilted so as to suppress a decrease in estimation accuracy based on the likelihood of self-position estimation (step S106). For example, the control unit 128 determines the direction and angle at which the irradiation direction L1 is tilted in the error direction indicated by the error ellipse 300 of the likelihood of self-position estimation. When the control unit 128 stores the determined tilting direction and angle in the storage unit 127, the process proceeds to step S107.
  • the control unit 128 calculates the angle of the gimbal 125 corresponding to the tilting direction (step S107). For example, the control unit 128 calculates the angle of the gimbal 125 supporting the LiDAR 124 by using a table, a calculation program, or the like so that the irradiation direction L1 of the LiDAR 124 is tilted. When the control unit 128 stores the calculated angle of the gimbal 125 in the storage unit 127, the process proceeds to step S108.
  • the control unit 128 controls the angle of the gimbal 125 so that it is tilted (step S108). For example, the control unit 128 changes the angle of the gimbal 125 by rotating the gimbal 125 so as to have a calculated angle. As a result, the LiDAR 124 is changed in the direction in which the irradiation direction L1 is tilted.
  • When the process of step S108 is completed, the control unit 128 ends the processing procedure shown in FIG. 6.
  • When the control unit 128 determines that the error ellipse 300 does not exceed the threshold value (No in step S105), the control unit 128 determines whether or not the irradiation direction L1 of the LiDAR 124 has been changed (step S109). For example, the control unit 128 determines that the irradiation direction L1 has been changed when the gimbal 125 supports the LiDAR 124 so that it faces a direction different from the reference irradiation direction L1.
  • When the control unit 128 determines that the irradiation direction L1 has not been changed (No in step S109), the control unit 128 ends the processing procedure shown in FIG. 6.
  • When the control unit 128 determines that the irradiation direction L1 has been changed (Yes in step S109), the control unit 128 controls the angle of the gimbal 125 so as to return from the tilted direction to the reference irradiation direction L1 (step S110).
  • For example, the control unit 128 changes the angle of the gimbal 125 by rotating the gimbal 125 so as to return to the reference irradiation direction L1.
  • As a result, the LiDAR 124 is returned from the tilted direction to the irradiation direction L1, and the control unit 128 ends the processing procedure shown in FIG. 6.
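
Under the same assumptions as the sketches above (hypothetical gimbal, lidar, imu, and estimator objects standing in for the units described in this embodiment, and an assumed threshold), the FIG. 6 procedure can be condensed into one control-loop step:

```python
import numpy as np

ELLIPSE_THRESHOLD_M = 0.5  # assumed threshold on the error-ellipse half-axis
REFERENCE_ANGLE_DEG = 0.0  # horizontal reference irradiation direction L1

def tilt_for_error_direction(v):
    """Downward tilt angle [deg] aimed along a dominant 3D error direction v."""
    horizontal = np.hypot(v[0], v[1])
    return float(np.degrees(np.arctan2(abs(v[2]), horizontal)))

def control_step(gimbal, lidar, imu, estimator):
    """One pass of the FIG. 6 procedure (steps S101-S110)."""
    posture = gimbal.posture()                        # S101: posture information
    inertia = imu.read()                              # S102: inertial information
    estimator.update(lidar.scan(), inertia, posture)  # S103: SLAM self-position/map
    P = estimator.position_covariance()               # S104: 3x3 position covariance

    # 95% error-ellipsoid half-axes (7.815 = chi-square quantile, 3 DOF).
    half_axes = np.sqrt(7.815 * np.clip(np.linalg.eigvalsh(P), 0.0, None))
    if half_axes.max() > ELLIPSE_THRESHOLD_M:         # S105: ellipse too large
        _, vecs = np.linalg.eigh(P)
        error_dir = vecs[:, -1]                       # S106: dominant error direction
        angle = tilt_for_error_direction(error_dir)   # S107: gimbal angle for it
        gimbal.command(angle)                         # S108: actively tilt L1
    elif gimbal.posture() != REFERENCE_ANGLE_DEG:     # S109: direction was changed?
        gimbal.command(REFERENCE_ANGLE_DEG)           # S110: return to reference L1
```

The gimbal object here is compatible with the hypothetical Gimbal sketch shown earlier; the other three interfaces are likewise assumptions made for illustration.
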
  • the moving body 100 described above can control the gimbal 125 so as to actively tilt the irradiation direction L1 of the laser beam in the direction of suppressing the deterioration of the estimation accuracy of the estimation unit 128d.
  • As a result, even if the change in the surrounding environment is small, the moving body 100 changes the irradiation direction L1 of the LiDAR 124 in a direction that suppresses the decrease in estimation accuracy, so a decrease in the estimation accuracy of the self-position and the surrounding map can be suppressed.
  • the moving body 100 can suppress the deterioration of the safety of the moving body 100 by suppressing the deterioration of the estimation accuracy of the self-position and the surrounding map.
  • The moving body 100 can determine the direction in which the irradiation direction L1 is tilted based on the likelihood of the self-position estimation, and can control the gimbal 125 so that the irradiation direction L1 of the LiDAR 124 follows that tilting direction. Because the moving body 100 can tilt the LiDAR 124 according to the likelihood of the self-position estimation, a decrease in the estimation accuracy of the self-position and the surrounding map using the LiDAR 124 can be suppressed. As a result, the self-position estimation result is less likely to degrade in the direction indicated by the likelihood, so self-position estimation, which is important for the moving body, can be improved.
  • The moving body 100 can determine the direction in which the irradiation direction L1 is tilted based on the error direction of the self-position estimation. The moving body 100 can thus tilt the irradiation direction L1 toward the error direction of the self-position estimation, that is, the moving direction of the moving body 100, and change the direction of the laser beam emitted by the LiDAR 124. As a result, the self-position estimation result is less likely to degrade in the error direction, so self-position estimation, which is important for the moving body, can be improved.
  • the moving body 100 can determine the direction in which the irradiation direction L1 is tilted based on the direction of the characteristic area shown by the surrounding map. As a result, since the moving body 100 can measure the characteristic region of the surrounding environment by using the LiDAR 124, it is possible to suppress a decrease in the estimation accuracy of the self-position and the surrounding map using the LiDAR 124. As a result, the moving body 100 can improve the estimation accuracy of the self-position and the surrounding map, and thus can contribute to the improvement of the safety of the moving body 100.
  • When the moving body 100 is a flying object, similar terrain appears in the vertical direction in built-up areas and the like, so if altitude cannot be recovered from the information from the LiDAR 124, the altitude estimation accuracy drops significantly, and the estimated surrounding map becomes distorted in the altitude direction, lowering its accuracy. In contrast, since the moving body 100 according to the embodiment can change the irradiation direction L1 of the LiDAR 124 toward the moving direction, a decrease in the estimation accuracy of the self-position and the surrounding map using the LiDAR 124 can be suppressed.
  • the moving body 100 can improve the estimation accuracy of the self-position and the surrounding map, and thus can contribute to the improvement of the safety of the flying body.
  • Such an effect can be similarly obtained in a vehicle, a robot, or the like that can move autonomously.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A moving body according to one aspect of the present invention comprises: a sensor (124) that radiates a laser beam and measures a distance on the basis of the reflected light; an estimation section (128d) that estimates the own position of the moving body and a map of the surroundings using the sensor (124); a support section (125) that supports the sensor (124) such that the radiation direction of the laser beam can be changed; and a support control section (128g) that controls the support section (125) so as to actively tilt the radiation direction of the laser beam in a direction which suppresses a decrease in the estimation accuracy of the estimation section (128d).

Description

Moving body, control method, and control program
This application relates to moving bodies, control methods, and control programs.
Some moving bodies, for example, estimate their own position and move autonomously based on that self-position. For example, Patent Document 1 discloses that, from a point group acquired for each frame, a point group forming a plane is extracted for each frame as a plane point group; the position of the plane formed by the plane point group is specified for each frame; the translation vector and rotation matrix of the moving body between frames are calculated from the relationship between the position of the plane specified in a given frame and the position of the plane specified in the next frame; and the translation vector and rotation matrix are calculated sequentially for each frame to estimate the trajectory of the moving body.
Japanese Unexamined Patent Publication No. 2017-3363 (特開2017-3363号公報)
For example, if the surrounding environment does not change while the body is moving, the accuracy of the self-position estimated from the sensor measurement results of a conventional moving body decreases. Conventional moving bodies therefore have room for improvement in self-position estimation accuracy when the surrounding environment changes little.
A moving body according to one aspect includes: a sensor that irradiates a laser beam and measures a distance based on the reflected light; an estimation unit that estimates the self-position and a surrounding map of the moving body using the sensor; a support portion that supports the sensor so that the irradiation direction of the laser beam can be changed; and a support control unit that controls the support portion so as to actively tilt the irradiation direction of the laser beam in a direction that suppresses a decrease in the estimation accuracy of the estimation unit.
A control method according to one aspect is a method of controlling a moving body that includes a sensor that irradiates a laser beam and measures a distance based on the reflected light, and a support portion that supports the sensor so that the irradiation direction of the laser beam can be changed. The method includes: estimating the self-position and a surrounding map of the moving body using the sensor; and controlling the support portion so as to actively tilt the irradiation direction of the laser beam in a direction that suppresses a decrease in the estimation accuracy of the self-position and the surrounding map.
A control program according to one aspect is a program for a moving body that includes a sensor that irradiates a laser beam and measures a distance based on the reflected light, and a support portion that supports the sensor so that the irradiation direction of the laser beam can be changed. The program causes a computer to estimate the self-position and a surrounding map of the moving body using the sensor, and to control the support portion so as to actively tilt the irradiation direction of the laser beam in a direction that suppresses a decrease in the estimation accuracy of the self-position and the surrounding map.
FIG. 1 is a diagram for explaining an outline of a moving body according to an embodiment. FIG. 2 is a diagram showing an example of the configuration of the moving body according to the embodiment. FIG. 3 is a diagram showing an example of the functional configuration of the control unit shown in FIG. 2. FIG. 4 is a diagram for explaining an example of an error ellipse showing the certainty of the self-position. FIG. 5 is a diagram for explaining an example in which the moving body according to the embodiment tilts the irradiation direction. FIG. 6 is a flowchart showing an example of a processing procedure executed by the moving body 100 according to the embodiment.
A plurality of embodiments for carrying out the moving body and the like according to the present application will be described in detail with reference to the drawings. The following description does not limit the present application. The components in the following description include those that can be easily assumed by a person skilled in the art, those that are substantially the same, and those in a so-called range of equivalents. In the following description, similar components may be denoted by the same reference numerals, and duplicate description may be omitted. The following description of the moving body according to the embodiments also serves as a description of an embodiment of the control method and the control program according to the present application.
FIG. 1 is a diagram for explaining an outline of the moving body according to the embodiment. The moving body 100 shown in FIG. 1 is, for example, a flying object capable of flying autonomously. Flying objects include, for example, drones and airplanes. The present disclosure describes the case where the moving body 100 is a flying object, but it is not limited thereto. For example, the moving body 100 may be a vehicle, a robot, or the like that can move autonomously.
The moving body 100 includes a main body 110, a plurality of rotary wings 150, a plurality of legs 170, and a camera 190. In the example shown in FIG. 1, the case where there are four rotary wings 150 is described, but the number is not limited to this. When not in flight, the moving body 100 stands by with the legs 170 in contact with the ground.
The moving body 100 includes a LiDAR (Light Detection And Ranging / Laser Imaging Detection And Ranging) 124. The LiDAR 124, for example, emits pulsed laser light and measures the reflected light to measure the distance, direction, and the like of a target. The moving body 100 irradiates the laser beam of the LiDAR 124 in the horizontal direction, measures the light reflected by a target, and measures the target's distance, direction, and the like.
The moving body 100 estimates its self-position and a surrounding map using the measurement results of the LiDAR 124, for example by the SLAM (Simultaneous Localization And Mapping) method. The surrounding map can be represented by, for example, point cloud data, which has information on three-dimensional coordinates and colors. The moving body 100 estimates the likelihood of the estimated self-position based on the estimated surrounding map and the amount of movement of the moving body 100. The likelihood of the self-position has the characteristic that it becomes smaller as the moving body 100 moves. In other words, a large likelihood corresponds to a well-estimated self-position, and a small likelihood corresponds to a large error ellipse (variance).
In the example shown in FIG. 1, the moving body 100 is flying inside a building 1000. The building 1000 has a floor 1010, a wall 1020 provided on the floor 1010, and a ceiling 1030 facing the floor 1010. The floor 1010, the wall 1020, and the ceiling 1030 form a space 2000 in which the moving body 100 can fly.
In the scene ST1, while flying at altitude H1 in the space 2000, the moving body 100 irradiates the laser beam of the LiDAR 124 in the irradiation direction L1 and measures the light reflected by the wall 1020 to measure the distance, direction, and the like of the target. The irradiation direction L1 is, for example, horizontal with respect to the floor 1010. The moving body 100 then rises to altitude H2, irradiates the laser beam in the irradiation direction L1, and again measures the light reflected by the wall 1020. In this case, the moving body 100 measures point cloud data indicating similar walls 1020 at altitudes H1 and H2 with the LiDAR 124. Because the point cloud data does not change between altitude H1 and altitude H2, the moving body 100 cannot estimate the change in altitude. As a result, the estimation accuracy of the self-position in the moving direction M and of the surrounding map may decrease.
In the present disclosure, as shown in the scene ST2, the moving body 100 changes the LiDAR 124 from the irradiation direction L1 to the irradiation direction L2. While the moving body 100 flies at altitude H1 in the space 2000, the LiDAR 124 irradiates the laser beam in the irradiation direction L2. The irradiation direction L2, for example, intersects the irradiation direction L1 and points from the moving body 100 toward a characteristic region near the corner between the floor 1010 and the wall 1020. The laser beam emitted by the LiDAR 124 illuminates the characteristic region from the floor 1010 to the vicinity of the corner between the floor 1010 and the wall 1020, and the moving body 100 measures the reflected light to measure the distance, direction, and the like of the target. In this case, at altitudes H1 and H2, the moving body 100 obtains point cloud data that changes from the floor 1010 to the wall 1020. Thus, when moving through a place where the surrounding environment changes little, the moving body 100 can estimate its change in altitude from the amount of change in the point cloud data by changing the irradiation direction of the LiDAR 124. As a result, the moving body 100 can suppress the influence of the surrounding environment and improve the self-position estimation accuracy based on the estimated altitude change and the result of its actual movement.
FIG. 2 is a diagram showing an example of the configuration of the moving body 100 according to the embodiment. As shown in FIG. 2, the main body 110 of the moving body 100 includes a communication unit 121, an imaging control unit 122, a power control unit 123, the LiDAR 124, a gimbal 125, an IMU (Inertial Measurement Unit) 126, a storage unit 127, and a control unit 128. The control unit 128 is electrically connected to the communication unit 121, the imaging control unit 122, the power control unit 123, the LiDAR 124, the gimbal 125, the IMU 126, the storage unit 127, and the like.
The communication unit 121 can communicate with devices external to the moving body 100 to exchange various data. The communication unit 121 can receive radio signals in a predetermined frequency band from GPS satellites, demodulate the received radio signals, and send the processed signals to the control unit 128.
The imaging control unit 122 controls the capture of images using the camera 190, including control of the imaging direction of the camera 190. As the camera 190, for example, a monocular camera, an infrared camera, a depth camera, or the like can be used. The camera 190 is provided on the main body 110 via a gimbal or the like so that the imaging direction can be changed. The imaging control unit 122 can provide image information acquired from the camera 190 to the control unit 128.
The power control unit 123 controls the driving force of a plurality of motors 140. The plurality of motors 140 rotate the rotary wings 150 of the moving body 100 and control the rotation speeds of the plurality of rotary wings 150. The power control unit 123 realizes ascent, descent, flight, and the like of the moving body 100 by rotating the motors 140 under the control of the control unit 128. In the present embodiment, the case where the four rotary wings 150 are driven by four motors 140 is described, but the configuration is not limited thereto.
As described above, the LiDAR 124 emits pulsed laser light and measures its reflected light to measure the distance, direction, and the like of a target. The LiDAR 124 is provided on the main body 110, for example, so that the laser beam can be irradiated in the direction in which the nose of the moving body 100 faces. The LiDAR 124 supplies the measurement results to the control unit 128.
The gimbal 125 is a support mechanism provided on the main body 110 that supports the LiDAR 124 so that the irradiation direction of the laser beam can be changed. By moving in an arc about an axis along the horizontal direction through the rotation of a motor (not shown), the gimbal 125 can support the LiDAR 124 with its irradiation direction pointed in a desired direction in the vertical direction of the main body 110. The gimbal 125 can support the LiDAR 124 so that the horizontal direction of the main body 110 is the reference irradiation direction, and can change the support angle of the LiDAR 124 to the irradiation direction requested by the control unit 128. A configuration in which the gimbal 125 can move the LiDAR 124 in an arc in the vertical and horizontal directions of the main body 110 is described, but the gimbal is not limited thereto.
In the present embodiment, the case where the moving body 100 includes the gimbal 125 as the support portion is described, but the support portion is not limited to this. For example, the support portion may be a rotation support mechanism or the like formed on the main body 110, as long as the angle of the irradiation direction L1 of the LiDAR 124 can be adjusted.
The IMU 126 senses the movement of the moving body 100. The IMU 126 has, for example, a 3-axis gyro sensor, a 3-axis acceleration sensor, and the like, and supplies inertial information indicating the detected three-dimensional angular velocity, acceleration, and the like to the control unit 128.
The storage unit 127 can store programs and data. The storage unit 127 may include any non-transitory storage medium, such as a semiconductor storage medium or a magnetic storage medium. The storage unit 127 may include a combination of a storage medium, such as a memory card, an optical disk, or a magneto-optical disk, and a device for reading the storage medium. The storage unit 127 may include a storage device used as a temporary storage area, such as a RAM.
The storage unit 127 can store a control program 127a, control data 127b, self-position data 127c, and point cloud data 127d. The control program 127a can provide functions for realizing processing related to various operations of the moving body 100, including various functions related to flight control of the moving body 100, such as a function for controlling the driving force of the motors 140 based on the measurement results of the LiDAR 124. The control program 127a can provide a function for estimating the self-position and the surrounding map of the moving body 100 based on the measurement results of, for example, the LiDAR 124 and the IMU 126.
The control data 127b includes data referred to in order to execute processing related to various operations of the moving body 100, for example, data such as the planned movement route of the moving body 100 and the estimated self-position. The self-position data 127c includes, for example, data showing the self-position of the moving body 100 in chronological order. The point cloud data 127d includes point cloud data representing the surrounding map estimated using the measurement results of the LiDAR 124 and the SLAM method; for example, it represents the map around the moving body 100 as a plurality of points.
 The control unit 128 includes one or more arithmetic units. The arithmetic units include, for example, a CPU (Central Processing Unit), an SoC (System-on-a-Chip), an MCU (Micro Control Unit), an FPGA (Field-Programmable Gate Array), and a coprocessor, but are not limited to these. The control unit 128 realizes the processing related to the various operations of the moving body 100 by causing the arithmetic unit (computer) to execute the control program 127a. The control unit 128 may realize at least a part of the functions provided by the control program 127a with a dedicated IC (Integrated Circuit).
 By executing the control program 127a, the control unit 128 controls the flight (movement) of the moving body 100 based on the estimated self-position. The control unit 128 realizes the flight of the moving body 100 by controlling the power control unit 123.
 FIG. 3 is a diagram showing an example of the functional configuration of the control unit 128 shown in FIG. 2. As shown in FIG. 3, the control unit 128 of the moving body 100 has a posture acquisition unit 128a, a point cloud acquisition unit 128b, an inertia acquisition unit 128c, an estimation unit 128d, a likelihood specifying unit 128e, a determination unit 128f, and a support control unit 128g. The control unit 128 functions as these functional units by executing the control program 127a.
 The posture acquisition unit 128a acquires posture information indicating the posture of the gimbal 125, that is, the state in which the gimbal supports the LiDAR 124, based on, for example, the encoder of the motor that rotates the gimbal 125, control results, and the like. The posture acquisition unit 128a supplies the acquired posture information to the estimation unit 128d.
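 A minimal sketch of how posture information could be derived from motor encoder readings; the count resolution and the two-axis layout are illustrative assumptions, not values from the embodiment:

```python
import math

COUNTS_PER_REV = 4096  # assumed encoder resolution (counts per revolution)

def gimbal_posture(pitch_counts: int, yaw_counts: int) -> tuple[float, float]:
    """Convert raw encoder counts into gimbal pitch/yaw angles in radians."""
    def to_angle(counts: int) -> float:
        a = counts / COUNTS_PER_REV * 2.0 * math.pi
        return math.atan2(math.sin(a), math.cos(a))  # wrap into (-pi, pi]
    return to_angle(pitch_counts), to_angle(yaw_counts)

print(gimbal_posture(1024, -512))  # -> approximately (pi/2, -pi/4)
```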
 The point cloud acquisition unit 128b acquires the point cloud data 127d, which has information such as the three-dimensional coordinate values and colors measured by the LiDAR 124. The point cloud acquisition unit 128b stores the acquired point cloud data 127d in the storage unit 127 in chronological order and supplies it to the estimation unit 128d.
 The inertia acquisition unit 128c acquires the inertial information indicating the three-dimensional angular velocity, acceleration, and the like measured by the IMU 126, stores it in the storage unit 127, and supplies it to the estimation unit 128d.
 The estimation unit 128d estimates the latest self-position and surrounding map of the moving body 100 based on the point cloud data 127d, the inertial information, and the like. For example, the estimation unit 128d estimates the self-position and surrounding map of the moving body 100 using a SLAM technique. The estimation unit 128d supplies the estimation result to the determination unit 128f.
 The likelihood specifying unit 128e specifies the likelihood of the self-position estimation by comparing the self-position estimated by the estimation unit 128d with the past self-position data 127c, the movement plan, and the like. The likelihood specifying unit 128e specifies, for example, an error ellipse 300 that bounds the region in which the moving body 100 exists with a predetermined probability, computed using a known Kalman filter or the like.
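 A sketch of how an error ellipse could be derived from the 2x2 position covariance maintained by a Kalman filter; this is one standard construction (eigendecomposition scaled by a chi-square quantile), not necessarily the computation used in the embodiment:

```python
import numpy as np

def error_ellipse(cov_xy: np.ndarray, prob: float = 0.95):
    """Return (semi-major, semi-minor, orientation) of the error ellipse
    that bounds the position with the requested probability."""
    chi2 = {0.95: 5.991, 0.99: 9.210}[prob]    # chi-square quantiles, 2 DOF
    eigvals, eigvecs = np.linalg.eigh(cov_xy)  # eigenvalues in ascending order
    a = float(np.sqrt(chi2 * eigvals[1]))      # semi-major axis
    b = float(np.sqrt(chi2 * eigvals[0]))      # semi-minor axis
    theta = float(np.arctan2(eigvecs[1, 1], eigvecs[0, 1]))  # major-axis angle
    return a, b, theta

# Covariance elongated along y, as when motion along y is poorly observed:
cov = np.array([[0.04, 0.0],
                [0.0,  0.25]])
print(error_ellipse(cov))  # major axis along y (theta ~ pi/2)
```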
 FIG. 4 is a diagram for explaining an example of the error ellipse 300 indicating the certainty of the self-position. As shown in FIG. 4, the moving body 100 measures point cloud data 127d showing similar walls 1020 with the LiDAR 124 at both altitude H1 and altitude H2, so a sufficient change in the point cloud data 127d between the two altitudes is not obtained. In this case, the likelihood specifying unit 128e specifies a likelihood whose error ellipse 300, indicating the certainty of the self-position, is larger at altitude H2 than at altitude H1 and is elongated in the moving direction M. The likelihood specifying unit 128e supplies the specified likelihood to the determination unit 128f.
 The determination unit 128f determines the direction in which to tilt the irradiation direction L1 of the LiDAR 124 based on the likelihood of the self-position estimation. When the error ellipse 300 becomes larger than a threshold, the determination unit 128f determines the direction and angle by which to tilt the irradiation direction L1, for example, the direction and angle by which to tilt from the reference irradiation direction L1 to an irradiation direction L2. The determination unit 128f determines the tilt direction and angle based on the error direction of the self-position estimation, and supplies the determined tilt direction and angle to the support control unit 128g.
 FIG. 5 is a diagram for explaining an example in which the moving body 100 according to the embodiment tilts the irradiation direction L1. As shown in FIG. 5, in scene ST11 the moving body 100 moves (ascends) in the moving direction M from altitude H1 to altitude H2. In this case, because a sufficient change in the point cloud data 127d needed to estimate the self-position and surrounding map is not obtained, the certainty of the self-position decreases and the error ellipse 300 grows with the amount of movement. That is, the likelihood of the self-position estimation has an error ellipse 300 whose error direction is elongated in the moving direction M. Accordingly, the determination unit 128f determines a direction in which to tilt the irradiation direction L1 of the LiDAR 124 within the vertical direction 410, which corresponds to the moving direction M. In the example shown in FIG. 5, the determination unit 128f determines the direction and angle toward features of the building 1000, such as the floor 1010 and the ceiling 1030, based on the self-position, the point cloud data 127d, and the like. Specifically, the determination unit 128f identifies a feature region around the self-position from the point cloud data 127d based on the self-position of the moving body 100, calculates the direction and angle from the self-position toward that feature region, and determines the tilt direction based on the calculation result.
 In scene ST12, the moving body 100 keeps altitude H3 constant and moves horizontally in the moving direction MH from position P1 to position P2. In this case, because a sufficient change in the point cloud data 127d needed to estimate the self-position and surrounding map is not obtained, the certainty of the self-position decreases and the error ellipse 300 grows with the amount of horizontal movement. That is, the likelihood of the self-position estimation has an error ellipse 300 whose error direction is elongated in the moving direction MH. Accordingly, the determination unit 128f determines a direction in which to tilt the irradiation direction L1 of the LiDAR 124 within the horizontal direction 420, which corresponds to the moving direction MH. In the example shown in FIG. 5, the determination unit 128f determines the direction and angle toward features of the building 1000, such as the wall 1020 and objects on the floor 1010, based on the self-position, the point cloud data 127d, and the like.
 In the example shown in FIG. 5, the case where the determination unit 128f determines the tilt direction in either the vertical direction 410 or the horizontal direction 420 has been described, but the determination is not limited to this. For example, when the moving body 100 is moving diagonally upward or diagonally downward, the determination unit 128f may determine a tilt direction for the irradiation direction L1 in both the vertical direction 410 and the horizontal direction 420.
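 The following sketch shows one way the decision just described could be realized: align the tilt with the error ellipse's long axis and aim at the candidate feature region (floor, ceiling, wall) whose bearing best matches it. The function and the candidate list are illustrative assumptions, not the patented computation itself:

```python
import numpy as np

def choose_tilt(self_pos: np.ndarray, error_axis: np.ndarray,
                features: dict[str, np.ndarray]):
    """Pick the feature whose bearing from the self-position best aligns
    with the error direction; return its name, unit direction, and the
    elevation angle of that direction above the horizontal."""
    e = error_axis / np.linalg.norm(error_axis)
    best_name, best_dir, best_score = None, None, -1.0
    for name, point in features.items():
        v = point - self_pos
        v = v / np.linalg.norm(v)
        score = abs(float(np.dot(v, e)))  # alignment with the error direction
        if score > best_score:
            best_name, best_dir, best_score = name, v, score
    tilt = float(np.arcsin(np.clip(best_dir[2], -1.0, 1.0)))
    return best_name, best_dir, tilt

# Ascending drone: the error axis is vertical, so the ceiling wins.
features = {"floor": np.array([1.0, 0.0, 0.0]),
            "ceiling": np.array([1.0, 0.0, 8.0]),
            "wall": np.array([4.0, 0.0, 2.0])}
print(choose_tilt(np.array([0.0, 0.0, 2.0]), np.array([0.0, 0.0, 1.0]), features))
```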
 Returning to FIG. 3, the support control unit 128g controls the gimbal 125 so as to actively tilt the irradiation direction L1 of the laser beam in a direction that suppresses a decrease in the estimation accuracy of the estimation unit 128d. Based on the tilt direction and angle determined by the determination unit 128f, the support control unit 128g controls the gimbal 125 so that the irradiation direction L1 of the LiDAR 124 points in the tilt direction; that is, it changes the irradiation direction L1 of the LiDAR 124 by rotating the gimbal 125 supporting the LiDAR 124 to the corresponding angle.
 The functional configuration example of the moving body 100 according to the present embodiment has been described above. The configuration described with reference to FIGS. 2 and 3 is merely an example, and the functional configuration of the moving body 100 according to the present embodiment is not limited to this example; it can be flexibly modified according to specifications and operations.
 FIG. 6 is a flowchart showing an example of a processing procedure executed by the moving body 100 according to the embodiment. The processing procedure shown in FIG. 6 is realized by the control unit 128 of the moving body 100 executing the control program 127a, and is repeatedly executed by the control unit 128.
 As shown in FIG. 6, the control unit 128 of the moving body 100 acquires the posture information of the gimbal 125 (step S101). For example, the control unit 128 uses the encoder of the gimbal 125, control results, and the like to acquire posture information from which the current angle of the LiDAR 124 supported by the gimbal 125 can be identified. After storing the acquired posture information in the storage unit 127, the control unit 128 advances the processing to step S102.
 The control unit 128 acquires inertial information from the IMU 126 (step S102). For example, the control unit 128 acquires inertial information indicating the three-dimensional angular velocity, acceleration, and the like of the moving body 100 measured by the IMU 126. After storing the acquired inertial information in the storage unit 127, the control unit 128 advances the processing to step S103.
 The control unit 128 estimates the self-position and surrounding map based on the measurement results of the LiDAR 124 (step S103). For example, the control unit 128 uses a SLAM technique to estimate the self-position and surrounding map of the moving body 100 based on the measurement results of the LiDAR 124. The control unit 128 reflects the estimated self-position in the self-position data 127c of the storage unit 127 and the estimated surrounding map in the point cloud data 127d of the storage unit 127. When the processing of step S103 is completed, the control unit 128 advances the processing to step S104.
 The control unit 128 specifies the likelihood of the self-position estimation (step S104). For example, the control unit 128 compares the estimated self-position with the past self-position data 127c, the movement plan, and the like to specify the likelihood of the self-position estimation. As described above, the control unit 128 specifies the error ellipse 300 that bounds the region in which the moving body 100 exists with a predetermined probability, computed using a known Kalman filter or the like. After storing the specified likelihood in the storage unit 127, the control unit 128 advances the processing to step S105.
 The control unit 128 determines whether the error ellipse 300 exceeds a threshold (step S105). For example, the control unit 128 obtains the size of the error ellipse 300 and determines that the error ellipse 300 exceeds the threshold when that size exceeds a preset judgment threshold. When the control unit 128 determines that the error ellipse 300 exceeds the threshold (Yes in step S105), it advances the processing to step S106.
 Based on the likelihood of the self-position estimation, the control unit 128 determines the direction in which to tilt the irradiation direction L1 so as to suppress a decrease in estimation accuracy (step S106). For example, the control unit 128 determines the direction and angle by which to tilt the irradiation direction L1 toward the error direction indicated by the error ellipse 300 of the likelihood. After storing the determined tilt direction and angle in the storage unit 127, the control unit 128 advances the processing to step S107.
 The control unit 128 calculates the angle of the gimbal 125 corresponding to the tilt direction (step S107). For example, the control unit 128 calculates the angle of the gimbal 125 supporting the LiDAR 124 using a table, a calculation program, or the like, so that the irradiation direction L1 of the LiDAR 124 points in the tilt direction. After storing the calculated angle of the gimbal 125 in the storage unit 127, the control unit 128 advances the processing to step S108.
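 As a sketch of the angle calculation in step S107 (an illustrative formula standing in for the table or calculation program mentioned above), a desired direction vector can be converted into gimbal yaw and pitch with two arctangents:

```python
import math

def gimbal_angles(direction: tuple[float, float, float]) -> tuple[float, float]:
    """Convert a desired irradiation direction (unit vector in the body frame:
    x forward, y left, z up) into gimbal yaw and pitch angles in radians."""
    x, y, z = direction
    yaw = math.atan2(y, x)                    # rotation about the vertical axis
    pitch = math.atan2(z, math.hypot(x, y))   # elevation above the horizontal
    return yaw, pitch

# Tilting 45 degrees upward, straight ahead of the vehicle:
print(gimbal_angles((0.7071, 0.0, 0.7071)))  # -> (0.0, ~pi/4)
```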
 The control unit 128 controls the angle of the gimbal 125 so that the LiDAR 124 points in the tilt direction (step S108). For example, the control unit 128 changes the angle of the gimbal 125 by rotating it to the calculated angle, whereby the irradiation direction of the LiDAR 124 is changed to the tilt direction. When the processing of step S108 is completed, the control unit 128 ends the processing procedure shown in FIG. 6.
 When the control unit 128 determines that the error ellipse 300 does not exceed the threshold (No in step S105), it advances the processing to step S109 and determines whether the irradiation direction L1 of the LiDAR 124 has been changed (step S109). For example, the control unit 128 determines that the irradiation direction L1 has been changed when the gimbal 125 supports the LiDAR 124 so that it faces a direction different from the irradiation direction L1. When the control unit 128 determines that the irradiation direction L1 of the LiDAR 124 has not been changed (No in step S109), it ends the processing procedure shown in FIG. 6.
 When the control unit 128 determines that the irradiation direction L1 of the LiDAR 124 has been changed (Yes in step S109), it advances the processing to step S110 and controls the angle of the gimbal 125 so as to return from the tilt direction to the irradiation direction L1 (step S110). For example, the control unit 128 changes the angle of the gimbal 125 by rotating it back to the reference irradiation direction L1, whereby the LiDAR 124 is returned from the tilt direction to the irradiation direction L1. When the processing of step S110 is completed, the control unit 128 ends the processing procedure shown in FIG. 6.
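 Putting the flowchart together, the loop below is a minimal sketch of steps S101 through S110; every helper passed in (sensor readers, estimator, gimbal command) is a hypothetical stand-in for the corresponding unit described above, and the threshold value is an assumption:

```python
from dataclasses import dataclass, field

ELLIPSE_THRESHOLD = 0.5  # [m]; the judgment threshold is an assumed value

@dataclass
class LoopState:
    tilted: bool = False
    history: list = field(default_factory=list)

def control_step(state, read_posture, read_imu, estimate, command_gimbal):
    """One pass of FIG. 6: S101-S104 always run; S105 branches into tilting
    (S106-S108) or restoring the reference direction (S109-S110)."""
    posture = read_posture()                     # S101: gimbal posture info
    inertial = read_imu()                        # S102: angular velocity etc.
    pose, ellipse_major, tilt_dir = estimate(posture, inertial)  # S103-S104
    state.history.append(pose)
    if ellipse_major > ELLIPSE_THRESHOLD:        # S105: Yes
        command_gimbal(tilt_dir)                 # S106-S108: tilt the LiDAR
        state.tilted = True
    elif state.tilted:                           # S109: Yes
        command_gimbal((1.0, 0.0, 0.0))          # S110: back to the reference
        state.tilted = False

# Demo with trivial stand-ins: a degraded estimate triggers a tilt command.
state = LoopState()
control_step(state,
             read_posture=lambda: (0.0, 0.0),
             read_imu=lambda: (0.0, 0.0, 9.8),
             estimate=lambda p, i: ((0.0, 0.0, 1.0), 0.8, (0.0, 0.0, 1.0)),
             command_gimbal=print)               # prints (0.0, 0.0, 1.0)
```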
 As described above, the moving body 100 can control the gimbal 125 so as to actively tilt the irradiation direction L1 of the laser beam in a direction that suppresses a decrease in the estimation accuracy of the estimation unit 128d. As a result, even when the surrounding environment changes little, the moving body 100 changes the irradiation direction L1 of the LiDAR 124 in a direction that suppresses the decrease in estimation accuracy, so the decrease in the accuracy of the self-position and surrounding map estimated using the LiDAR 124 can be suppressed. Consequently, by suppressing the decrease in the estimation accuracy of the self-position and surrounding map, the moving body 100 can suppress a decrease in its own safety.
 The moving body 100 can determine the direction in which to tilt the irradiation direction L1 based on the likelihood of the self-position estimation, and can control the gimbal 125 so that the irradiation direction L1 of the LiDAR 124 points in that direction. Since the moving body 100 can thus tilt the LiDAR 124 according to the likelihood of the self-position estimation, it can suppress a decrease in the accuracy of the self-position and surrounding map estimated using the LiDAR 124. As a result, the self-position estimation result is less likely to degenerate in the direction indicated by the likelihood, so the self-position estimation, which is important for the moving body, can be improved.
 The moving body 100 can determine the direction in which to tilt the irradiation direction L1 based on the error direction of the self-position estimation. This allows the moving body 100 to tilt the irradiation direction L1 toward the error direction of the self-position estimation, that is, the moving direction of the moving body 100, thereby changing the direction of the laser beam emitted by the LiDAR 124. As a result, the self-position estimation result is less likely to degenerate in the error direction, so the self-position estimation, which is important for the moving body, can be improved.
 The moving body 100 can determine the direction in which to tilt the irradiation direction L1 based on the direction of a feature region indicated by the surrounding map. This allows the moving body 100 to measure feature regions of the surrounding environment with the LiDAR 124, so it can suppress a decrease in the accuracy of the self-position and surrounding map estimated using the LiDAR 124. As a result, the moving body 100 can improve the estimation accuracy of the self-position and surrounding map, contributing to improved safety of the moving body 100.
 For example, when the moving body 100 is a flying body, the same terrain appears in the vertical direction in districts of high-rise buildings and the like, so if the information from the LiDAR 124 degrades the altitude estimate, the accuracy of altitude estimation drops significantly. Likewise, the estimated surrounding map becomes distorted in the altitude direction and its accuracy decreases. In contrast, since the moving body 100 according to the embodiment can change the irradiation direction L1 of the LiDAR 124 toward the moving direction, it can suppress a decrease in the accuracy of the self-position and surrounding map estimated using the LiDAR 124. As a result, the moving body 100 can improve the estimation accuracy of the self-position and surrounding map, contributing to improved safety of the flying body. A similar effect can also be obtained with autonomously movable vehicles, robots, and the like.
 Characteristic embodiments have been described in order to disclose the technique according to the appended claims completely and clearly. However, the appended claims should not be limited to the above embodiments, and should be construed to embody all modifications and alternative configurations that a person skilled in the art could create within the scope of the fundamental matters set forth in this specification.
100 Moving body
110 Main body
121 Communication unit
122 Imaging control unit
123 Power control unit
124 LiDAR (sensor)
125 Gimbal (support portion)
126 IMU
127 Storage unit
127a Control program
127b Control data
127c Self-position data
127d Point cloud data
128 Control unit
128a Posture acquisition unit
128b Point cloud acquisition unit
128c Inertia acquisition unit
128d Estimation unit
128e Likelihood specifying unit
128f Determination unit
128g Support control unit

Claims (7)

  1.  A moving body comprising:
     a sensor that emits a laser beam and measures a distance based on the reflected light;
     an estimation unit that estimates a self-position of the moving body and a surrounding map using the sensor;
     a support portion that supports the sensor so that an irradiation direction of the laser beam can be changed; and
     a support control unit that controls the support portion so as to actively tilt the irradiation direction of the laser beam in a direction that suppresses a decrease in estimation accuracy of the estimation unit.
  2.  The moving body according to claim 1, further comprising:
     a determination unit that determines a direction in which to tilt the irradiation direction based on a likelihood of self-position estimation,
     wherein the support control unit controls the support portion so that the irradiation direction of the sensor becomes the tilt direction.
  3.  The moving body according to claim 2, wherein the determination unit determines the tilt direction based on an error direction of the self-position estimation.
  4.  The moving body according to claim 3, wherein the determination unit determines the tilt direction based on a direction of a feature region indicated by the surrounding map.
  5.  The moving body according to claim 4, wherein the moving body is a flying body.
  6.  A control method for a moving body including a sensor that emits a laser beam and measures a distance based on the reflected light, and a support portion that supports the sensor so that an irradiation direction of the laser beam can be changed, the control method comprising:
     estimating a self-position of the moving body and a surrounding map using the sensor; and
     controlling the support portion so as to actively tilt the irradiation direction of the laser beam in a direction that suppresses a decrease in estimation accuracy of the self-position and the surrounding map.
  7.  A control program for a moving body including a sensor that emits a laser beam and measures a distance based on the reflected light, and a support portion that supports the sensor so that an irradiation direction of the laser beam can be changed, the control program causing a computer to:
     estimate a self-position of the moving body and a surrounding map using the sensor; and
     control the support portion so as to actively tilt the irradiation direction of the laser beam in a direction that suppresses a decrease in estimation accuracy of the self-position and the surrounding map.
PCT/JP2021/045398 2020-12-24 2021-12-09 Moving body, control method, and control program WO2022138213A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-215749 2020-12-24
JP2020215749A JP2022101274A (en) 2020-12-24 2020-12-24 Moving object, control method and control program

Publications (1)

Publication Number Publication Date
WO2022138213A1 true WO2022138213A1 (en) 2022-06-30

Family

ID=82157907

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/045398 WO2022138213A1 (en) 2020-12-24 2021-12-09 Moving body, control method, and control program

Country Status (2)

Country Link
JP (1) JP2022101274A (en)
WO (1) WO2022138213A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015001820A (en) * 2013-06-14 2015-01-05 シャープ株式会社 Autonomous mobile body, control system of the same, and own position detection method
JP2018112830A (en) * 2017-01-10 2018-07-19 株式会社東芝 Self position estimation device and self position estimation method

Also Published As

Publication number Publication date
JP2022101274A (en) 2022-07-06

Similar Documents

Publication Publication Date Title
JP6235213B2 (en) Autonomous flying robot
JP6029446B2 (en) Autonomous flying robot
KR102159376B1 (en) Laser scanning system, laser scanning method, mobile laser scanning system and program
CN117369489A (en) Collision avoidance system, depth imaging system, vehicle, map generator, and method thereof
JP6375503B2 (en) Flight type inspection apparatus and inspection method
JP6195450B2 (en) Autonomous flying robot
US20180350086A1 (en) System And Method Of Dynamically Filtering Depth Estimates To Generate A Volumetric Map Of A Three-Dimensional Environment Having An Adjustable Maximum Depth
JP6140458B2 (en) Autonomous mobile robot
JP6527726B2 (en) Autonomous mobile robot
JP6014485B2 (en) Autonomous flying robot
WO2021087701A1 (en) Terrain prediction method and apparatus for undulating ground, and radar, unmanned aerial vehicle and operating control method
WO2021087702A1 (en) Sloped terrain prediction method and device, radar, unmanned aerial vehicle, and operation control method
JP6014484B2 (en) Autonomous mobile robot
JP2021117502A (en) Landing control device, landing control method and program
GB2571711A (en) Drone control system
JP2016181178A (en) Autonomous mobile robot
JP6900029B2 (en) Unmanned aerial vehicle, position estimation device, flight control device, position estimation method, control method and program
US20210229810A1 (en) Information processing device, flight control method, and flight control system
WO2022138213A1 (en) Moving body, control method, and control program
JP7351609B2 (en) Route searching device and program
Hou et al. Autonomous target localization using quadrotor
WO2021025568A2 (en) A lidar device, system and a control method of the same
WO2018180175A1 (en) Mobile body, signal processing device, and computer program
WO2022153392A1 (en) Self-position estimation system and method for uncrewed aircraft, uncrewed aircraft, program, and recording medium
WO2022153390A1 (en) Self-position estimation system for estimating self position of uncrewed aircraft, flight control system, uncrewed aircraft, program, and recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21910362

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21910362

Country of ref document: EP

Kind code of ref document: A1