WO2021131990A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2021131990A1
WO2021131990A1 (application PCT/JP2020/047053)
Authority
WO
WIPO (PCT)
Prior art keywords
point cloud
division
condition
clustering
processing unit
Prior art date
Application number
PCT/JP2020/047053
Other languages
French (fr)
Japanese (ja)
Inventor
Masaki Wakabayashi (若林正樹)
Original Assignee
Sony Group Corporation (ソニーグループ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corporation
Publication of WO2021131990A1 publication Critical patent/WO2021131990A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis

Definitions

  • The present disclosure relates to an information processing device that performs information processing, an information processing method used in such an information processing device, and a program that causes a computer to perform such information processing.
  • Non-Patent Document 1 discloses a technique for clustering a point cloud by searching for nearby points.
  • A short processing time is desirable, and a short processing time is expected in such point cloud clustering processing as well.
  • the information processing apparatus includes a first division determination unit and a first division processing unit.
  • The first division determination unit is configured to determine, based on a division condition, whether or not to divide a point cloud included in first point cloud data supplied from a first sensor. When the first division determination unit determines that the point cloud satisfies the division condition, the first division processing unit is configured to divide the point cloud included in the first point cloud data until the division condition is no longer satisfied.
  • The information processing method according to the embodiment of the present disclosure includes a first division determination process of determining, based on a division condition, whether or not to divide a point cloud included in first point cloud data supplied from a first sensor.
  • When it is determined that the point cloud satisfies the division condition, the method includes a first division process of dividing the point cloud included in the first point cloud data until the division condition is no longer satisfied.
  • The program according to the embodiment of the present disclosure causes a computer to execute a first division determination process of determining, based on a division condition, whether or not to divide a point cloud included in first point cloud data supplied from a first sensor.
  • When it is determined that the point cloud satisfies the division condition, the program causes the computer to execute a first division process of dividing the point cloud included in the first point cloud data until the division condition is no longer satisfied.
  • FIG. 1 shows a configuration example of a robot 100 provided with an information processing device according to an embodiment. The information processing method according to the embodiment of the present disclosure is embodied by this embodiment, and is therefore described together.
  • the robot 100 is configured to generate map data based on the captured image of the surroundings of the robot 100 and to perform autonomous movement using the map data.
  • the robot 100 may be, for example, a traveling robot traveling on a plane or a drone flying in space.
  • The robot 100 includes image sensors 11L and 11R, a depth estimation unit 12, a plane estimation unit 20, an object recognition unit 13, a self-position estimation unit 14, an action plan determination unit 15, an actuator 16, and a moving mechanism 17.
  • the plane estimation unit 20, the object recognition unit 13, the self-position estimation unit 14, and the action plan determination unit 15 can be configured by using a processor that performs processing based on a program.
  • the image sensors 11L and 11R are so-called stereo cameras, and are configured to generate a set of images having parallax with each other by taking an image of, for example, the front of the robot 100.
  • the image sensor 11L and the image sensor 11R are arranged horizontally separated by a predetermined distance.
  • the image sensor 11L is arranged to the left of the image sensor 11R to generate a left image PL, and the image sensor 11R is arranged to the right of the image sensor 11L to generate a right image PR.
  • the image sensors 11L and 11R are adapted to perform imaging operations in synchronization with each other at a predetermined frame rate (for example, 30 [fps]).
  • The depth estimation unit 12 is configured to generate a depth map by estimating the depth based on the left image PL and the right image PR. The depth estimation unit 12 then generates point cloud data DPC including data about the point cloud PC based on this depth map.
  • FIG. 2 schematically shows an example of a point cloud PC.
  • the robot 100 is schematically represented by using a rectangular parallelepiped.
  • the depth estimation unit 12 estimates the depth (distance) to the image points corresponding to each other in the left image PL and the right image PR. Then, the depth estimation unit 12 converts the image point into, for example, a point P in the three-dimensional coordinate system based on the information about the depth. In this way, the depth estimation unit 12 converts the plurality of image points into a plurality of points P in the three-dimensional coordinate system. As a result, the depth estimation unit 12 acquires a point cloud PC which is a collection of points P.
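The back-projection described above can be sketched as follows, assuming a pinhole camera model; the intrinsics `fx`, `fy`, `cx`, `cy`, the function name, and the use of NumPy are our illustrative assumptions, not details from the patent.

```python
import numpy as np

def pixels_to_points(us, vs, depths, fx, fy, cx, cy):
    """Convert image points with estimated depth (distance) into points P
    in a three-dimensional camera coordinate system (pinhole model)."""
    us, vs, depths = (np.asarray(a, dtype=float) for a in (us, vs, depths))
    xs = (us - cx) * depths / fx
    ys = (vs - cy) * depths / fy
    # The collection of points P forms the point cloud PC, shape (N, 3)
    return np.stack([xs, ys, depths], axis=-1)
```

A point at the principal point with a depth of 2 m maps to (0, 0, 2) in the camera frame.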
  • the plane estimation unit 20 is configured to generate a minute plane group by clustering the point cloud PCs based on the point cloud data DPC. Each cluster corresponds to one of a plurality of microplanes constituting the surface of an object around the robot 100.
  • the plane estimation unit 20 is adapted to generate cluster data DC including a plurality of clusters generated in this way.
  • the plane estimation unit 20 includes a division determination unit 21, a division processing unit 22, a clustering determination unit 23, and a clustering processing unit 24.
  • the division determination unit 21 is configured to determine whether or not to divide the point cloud PC based on the division condition C1.
  • the division condition C1 is a condition for determining whether or not to divide the point cloud PC, and when the point cloud PC satisfies the division condition C1, the point cloud PC is divided.
  • the division processing unit 22 is configured to divide the point cloud PC until the division condition C1 is no longer satisfied.
  • the clustering determination unit 23 is configured to determine whether or not to perform clustering of the point cloud PC based on the clustering condition C2 when the division determination unit 21 determines that the point cloud PC does not satisfy the division condition C1.
  • the clustering condition C2 is a condition for determining whether or not to perform clustering of the point cloud PC, and when the point cloud PC satisfies the clustering condition C2, the point cloud PC is clustered.
  • the clustering processing unit 24 is configured to perform clustering of the point cloud PC when the clustering determination unit 23 determines that the point cloud PC satisfies the clustering condition C2.
  • the object recognition unit 13 is configured to recognize an object by grouping one or a plurality of clusters based on the cluster data DC.
  • The objects correspond to the objects around the robot 100, specifically, a person, a wall, a floor, and the like around the robot 100.
  • the object recognition unit 13 is adapted to generate a map data MAP represented by a plurality of clusters based on the recognized object.
  • This map data MAP can be, for example, two-dimensional data when the robot 100 is a traveling robot traveling on a plane, and three-dimensional data when the robot 100 is, for example, a drone flying in space.
  • each cluster is represented using, for example, center coordinates and a probability distribution shape.
  • Compared with a configuration using, for example, a so-called grid map, this map data MAP can reduce the data size and improve accuracy. Since the robot 100 can operate based on map data MAP represented by such a plurality of clusters, it can perform processing in a short time using fewer computational resources.
  • The self-position estimation unit 14 is configured to generate position data POS indicating the position of the robot 100 by estimating the position of the robot 100 based on the left image PL and the map data MAP.
  • In this example, the position is estimated based on the left image PL, but the estimation is not limited to this; the position may be estimated based on, for example, the right image PR, or based on both the left image PL and the right image PR.
  • the action plan determination unit 15 is configured to determine the action plan of the robot 100 by grasping the surrounding situation of the robot 100 based on the map data MAP and the position data POS.
  • the actuator 16 is configured to generate power based on the action plan determined by the action plan determination unit 15 and drive the moving mechanism 17 based on the power.
  • the actuator 16 is configured to include, for example, one or more motors.
  • the moving mechanism 17 is configured to move the robot 100 based on the power generated by the actuator 16.
  • The moving mechanism 17 includes, for example, one or more wheels when the robot 100 is a traveling robot traveling on a plane, and one or more propellers when the robot 100 is, for example, a drone flying in space.
  • the image sensors 11L and 11R correspond to a specific example of the "first sensor” in the present disclosure.
  • the point cloud data DPC corresponds to a specific example of the "first point cloud data” in the present disclosure.
  • the division determination unit 21 corresponds to a specific example of the “first division determination unit” in the present disclosure.
  • the division condition C1 corresponds to a specific example of the “division condition” in the present disclosure.
  • the division processing unit 22 corresponds to a specific example of the “first division processing unit” in the present disclosure.
  • the clustering determination unit 23 corresponds to a specific example of the “first clustering determination unit” in the present disclosure.
  • the clustering condition C2 corresponds to a specific example of the “clustering condition” in the present disclosure.
  • the clustering processing unit 24 corresponds to a specific example of the “first clustering processing unit” in the present disclosure.
  • the image sensors 11L and 11R generate a left image PL and a right image PR having parallax with each other by photographing, for example, the front of the robot 100.
  • The depth estimation unit 12 generates a depth map by estimating the depth based on the left image PL and the right image PR, and generates point cloud data DPC including data about the point cloud PC based on the depth map.
  • The plane estimation unit 20 generates a microplane group by clustering the point cloud PC based on the point cloud data DPC, and generates cluster data DC including a plurality of clusters each corresponding to a microplane.
  • The object recognition unit 13 recognizes objects by grouping one or a plurality of clusters based on the cluster data DC, and generates map data MAP represented by the plurality of clusters based on the recognized objects.
  • the self-position estimation unit 14 generates position data POS by estimating the position of the robot 100 on the map based on the left image PL and the map data MAP.
  • the action plan determination unit 15 determines the action plan of the robot 100 by grasping the surrounding situation of the robot 100 based on the map data MAP and the position data POS.
  • the actuator 16 generates power based on the action plan determined by the action plan determination unit 15, and drives the moving mechanism 17 based on the power.
  • the moving mechanism 17 moves the robot 100 based on the power generated by the actuator 16.
  • FIG. 3 shows an operation example of the plane estimation unit 20.
  • The division determination unit 21 determines whether or not to divide the point cloud PC based on the division condition C1, and when the division determination unit 21 determines that the point cloud PC satisfies the division condition C1, the division processing unit 22 divides the point cloud PC until the division condition C1 is no longer satisfied.
  • The clustering determination unit 23 determines whether or not to cluster the point cloud PC based on the clustering condition C2, and the clustering processing unit 24 performs clustering of the point cloud PC when the clustering determination unit 23 determines that the point cloud PC satisfies the clustering condition C2. This operation will be described in detail below.
  • the division determination unit 21 pays attention to the entire point cloud PC included in the point cloud data DPC (step S101).
  • The division determination unit 21 obtains the number of points P included in the point cloud PC of interest (score N) (step S102).
  • The division determination unit 21 sets the bounding box B, which is the smallest area surrounding the entire point cloud PC of interest (step S103).
  • When generating two-dimensional map data MAP, the division determination unit 21 sets the bounding box B in a two-dimensional coordinate space. When the two-dimensional coordinate space is defined using a Cartesian coordinate system including an X axis and a Y axis, the division determination unit 21 can set, as the bounding box B, the smallest rectangular area surrounding the entire point cloud PC of interest, based on the X coordinate values of the points P at both ends in the X-axis direction and the Y coordinate values of the points P at both ends in the Y-axis direction among the plurality of points P included in the point cloud PC.
  • When generating three-dimensional map data MAP, the division determination unit 21 sets the bounding box B in a three-dimensional coordinate space. When the three-dimensional coordinate space is defined using a Cartesian coordinate system including an X axis, a Y axis, and a Z axis, the division determination unit 21 can set, as the bounding box B, the smallest rectangular-parallelepiped area surrounding the entire point cloud PC of interest, based on the X coordinate values of the points P at both ends in the X-axis direction, the Y coordinate values of the points P at both ends in the Y-axis direction, and the Z coordinate values of the points P at both ends in the Z-axis direction among the plurality of points P included in the point cloud PC.
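For illustration, the bounding box B of step S103 is just the per-axis minima and maxima of the point coordinates; this sketch (the function name is ours) handles the two-dimensional and three-dimensional cases uniformly.

```python
import numpy as np

def bounding_box(points):
    """Return the two opposite corners of the smallest axis-aligned box
    (bounding box B) enclosing all points P; works for (N, 2) or (N, 3)."""
    pts = np.asarray(points, dtype=float)
    return pts.min(axis=0), pts.max(axis=0)
```

The difference `hi - lo` of the two corners gives the side lengths used by the division condition below.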
  • the division determination unit 21 confirms whether or not the point cloud PC of interest satisfies the division condition C1 based on the score N obtained in step S102 and the bounding box B set in step S103 (step S104).
  • The division condition C1 includes a condition that the score N is a predetermined number or more (for example, 100 or more), and a condition that the length of the longest side among the plurality of sides of the bounding box B is a predetermined length or more (for example, 5 cm or more).
  • the division determination unit 21 determines that the point cloud PC of interest satisfies the division condition C1 when all the conditions included in the division condition C1 are satisfied.
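A minimal sketch of this check (step S104), using the example thresholds from the text (100 points, 5 cm); the function name and the metric units are our assumptions.

```python
import numpy as np

def satisfies_division_condition(points, min_points=100, min_side=0.05):
    """Division condition C1: the score N is at least min_points AND the
    longest side of bounding box B is at least min_side (in meters)."""
    pts = np.asarray(points, dtype=float)
    if len(pts) < min_points:
        return False
    sides = pts.max(axis=0) - pts.min(axis=0)   # side lengths of box B
    return bool(sides.max() >= min_side)
```

Both sub-conditions must hold, matching the text's "all the conditions included in the division condition C1".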
  • In step S104, when the point cloud PC of interest satisfies the division condition C1 ("Y" in step S104), the division processing unit 22 sets, as the division direction, the axial direction in which the longest side among the plurality of sides of the bounding box B extends (step S105).
  • the division processing unit 22 sets the center position of the bounding box B in the division direction as the division position (step S106).
  • the division processing unit 22 divides the point cloud PC of interest at the division position set in step S106 in the division direction set in step S105 (step S107).
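Steps S105 to S107 can be sketched as one split step; sending points exactly at the division position to the second half is an arbitrary tie-breaking choice of ours, not specified in the patent.

```python
import numpy as np

def split_point_cloud(points):
    """One division step: split along the axis of the longest bounding-box
    side (S105), at the center of the box on that axis (S106, S107)."""
    pts = np.asarray(points, dtype=float)
    lo, hi = pts.min(axis=0), pts.max(axis=0)
    axis = int(np.argmax(hi - lo))          # S105: division direction
    pos = (lo[axis] + hi[axis]) / 2.0       # S106: division position
    below = pts[:, axis] < pos              # S107: divide at that position
    return pts[below], pts[~below]
```

Because the split position lies strictly between the box's extremes on the chosen axis, both halves are non-empty whenever that side has positive length.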
  • Next, the division determination unit 21 pays attention to one of the two point clouds PC divided in step S107 (step S108), and the process returns to step S102.
  • In this way, the point cloud PC of interest is repeatedly divided. That is, the number of points P included in the point cloud PC of interest decreases, and the bounding box B of the point cloud PC of interest becomes smaller. Eventually, the point cloud PC of interest no longer satisfies the division condition C1.
  • In step S104, when the point cloud PC of interest does not satisfy the division condition C1 ("N" in step S104), the clustering determination unit 23 confirms, based on the score N, whether or not the point cloud PC of interest satisfies the clustering condition C2 (step S109).
  • the clustering condition C2 includes a condition that the score N is a predetermined number or more (for example, 50 or more).
  • When the point cloud PC of interest satisfies the clustering condition C2 ("Y" in step S109), the clustering processing unit 24 clusters the point cloud PC of interest into one cluster (step S110).
  • When the point cloud PC of interest does not satisfy the clustering condition C2 ("N" in step S109), the clustering processing unit 24 does not perform the processing of step S110. That is, in this case, the clustering processing unit 24 excludes the point cloud PC of interest from the target of clustering.
  • the division determination unit 21 confirms whether or not there is another point cloud PC that satisfies the division condition C1 (step S111).
  • When there is another point cloud PC satisfying the division condition C1 ("Y" in step S111), the division determination unit 21 pays attention to one of the point clouds PC satisfying the division condition C1 (step S108), and the process returns to step S102.
  • In step S111, when there is no other point cloud PC satisfying the division condition C1 ("N" in step S111), this flow ends.
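The whole flow of FIG. 3 (steps S101 to S111) can be sketched as an iterative worklist version of the recursive division; the thresholds are the example values from the text (100 points, 5 cm, 50 points), and the function and parameter names are our illustration.

```python
import numpy as np

def cluster_by_recursive_division(points, min_points=100, min_side=0.05,
                                  min_cluster_points=50):
    """Split the point cloud while division condition C1 holds; keep each
    remaining piece as one cluster if it meets clustering condition C2."""
    clusters = []
    stack = [np.asarray(points, dtype=float)]   # S101: whole cloud first
    while stack:
        pc = stack.pop()
        if len(pc) == 0:
            continue
        lo, hi = pc.min(axis=0), pc.max(axis=0)  # S103: bounding box B
        # S104, C1: enough points AND a long enough bounding-box side
        if len(pc) >= min_points and (hi - lo).max() >= min_side:
            axis = int(np.argmax(hi - lo))       # S105: division direction
            pos = (lo[axis] + hi[axis]) / 2.0    # S106: division position
            below = pc[:, axis] < pos            # S107: divide
            stack.append(pc[below])
            stack.append(pc[~below])
        # S109, C2: keep the undivided piece as one cluster
        elif len(pc) >= min_cluster_points:
            clusters.append(pc)                  # S110
        # else: excluded from clustering (sparse / inaccurate piece)
    return clusters
```

The worklist plays the role of steps S108 and S111: every divided piece is revisited until no piece satisfies C1.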
  • the plane estimation unit 20 generates a plurality of clusters, each of which corresponds to a minute plane, by clustering the point cloud PCs based on the point cloud data DPC.
  • FIG. 4 shows an example of the division processing of the point cloud PC in the plane estimation unit 20.
  • the point cloud PC is illustrated using the region where the points P are distributed.
  • The point cloud PC (point cloud PC1) shown in FIG. 4(A) has a score N equal to or greater than the predetermined number, the upper and lower sides of its bounding box B are equal to or longer than the predetermined length, and the division condition C1 is therefore satisfied. Accordingly, the division processing unit 22 sets the X-axis direction as the division direction, sets the center position of the bounding box B in the X-axis direction as the division position, and divides this point cloud PC into two point clouds PC (point clouds PC11 and PC12) (FIG. 4(B)).
  • Similarly, the point cloud PC11 is divided into two point clouds PC (point clouds PC111 and PC112) (FIG. 4(C)), and the point cloud PC112 is divided into two point clouds PC (point clouds PC1121 and PC1122) (FIG. 4(D)).
  • Likewise, the point cloud PC12 is divided into two point clouds PC (point clouds PC121 and PC122) (FIG. 4(C)), and the point cloud PC121 is divided into two point clouds PC (point clouds PC1211 and PC1212) (FIG. 4(D)).
  • the plane estimation unit 20 recursively divides the point cloud PC until the point cloud PC does not satisfy the division condition C1.
  • As described above, the division determination unit 21 determines whether or not to divide the point cloud PC based on the division condition C1, and when the division determination unit 21 determines that the point cloud PC satisfies the division condition C1, the division processing unit 22 divides the point cloud PC until the division condition C1 is no longer satisfied.
  • Thus, the divided point cloud PC can be treated as a cluster as it is, so that the processing time can be shortened.
  • In contrast, when a point cloud is clustered by a neighborhood point search such as that of Non-Patent Document 1, the clustering processing time may be long.
  • In the present embodiment, the processing time can be shortened. As a result, the response speed of the robot 100 is improved, so that the robot 100 can perform agile movements. Further, with such a configuration, the computing power required of the robot 100 can be reduced, so that the cost of parts such as a processor and a memory can be reduced.
  • When the division determination unit 21 determines that the point cloud PC does not satisfy the division condition C1, the clustering determination unit 23 determines whether or not to perform clustering of the point cloud PC based on the clustering condition C2. Then, when the clustering determination unit 23 determines that the point cloud PC satisfies the clustering condition C2, the clustering processing unit 24 clusters the point cloud PC.
  • the point cloud PC that does not satisfy the clustering condition C2 can be excluded from the clustering target. Therefore, for example, an inaccurate point cloud PC can be excluded from the target of clustering, so that the accuracy of clustering can be improved.
  • As described above, in the present embodiment, the division determination unit determines whether or not to divide the point cloud based on the division condition, and when the division determination unit determines that the point cloud satisfies the division condition, the division processing unit divides the point cloud until the division condition is no longer satisfied, so that the processing time can be shortened.
  • In the present embodiment, the clustering determination unit determines whether or not to perform clustering of the point cloud based on the clustering condition, and when the point cloud satisfies the clustering condition, the clustering processing unit clusters the point cloud, so that the accuracy of clustering can be improved.
  • In the above embodiment, the division condition C1 includes a condition that the score N is a predetermined number or more and a condition that the length of the longest side among the plurality of sides of the bounding box B is a predetermined length or more, but the division condition C1 is not limited to this.
  • the division condition C1 may include only one of these two conditions.
  • In this way, the division condition C1 may include a condition regarding the number of points P included in the point cloud PC, and may include a condition regarding the size of the bounding box B surrounding the point cloud PC. Further, the division condition C1 may include a condition regarding the density of the points P in the point cloud PC. Here, the "density" is the number of points P divided by the size of the bounding box B. Further, the division condition C1 may include a condition regarding the shape of the point cloud PC. Specifically, for example, the division determination unit 21 can determine that the division condition C1 is not satisfied when the point cloud PC represents a single plane.
  • Further, the division condition C1 may be changed according to the distance between the point cloud PC and the image sensors 11L and 11R of the robot 100. For example, when the division condition C1 includes the condition that the score N is a "predetermined number" or more, the "predetermined number" may be set to 100 when the distance is short, and may be set to 200 when the distance is long. That is, the closer the distance, the smaller the "predetermined number" and hence the smaller the clusters, taking advantage of the better position accuracy of the nearby point cloud PC; the farther the distance, the larger the "predetermined number" and hence the larger the clusters.
  • the division condition C1 may be set by combining these conditions.
  • For combining the conditions, a logical expression, a weighted sum, or the like may be used.
  • the clustering condition C2 includes, but is not limited to, a condition that the score N is a predetermined number or more.
  • In this way, the clustering condition C2 may include a condition regarding the number of points P included in the point cloud PC.
  • Alternatively, the clustering condition C2 may include a condition regarding the size of the bounding box B surrounding the point cloud PC, a condition regarding the density of the points P in the point cloud PC, or a condition regarding the shape of the point cloud PC.
  • the clustering condition C2 may be changed according to the distance between the point cloud PC and the image sensors 11L and 11R of the robot 100. Further, the clustering condition C2 may be set by combining these conditions.
  • In the above embodiment, when generating the two-dimensional map data MAP, the division determination unit 21 sets the bounding box B in the two-dimensional coordinate space corresponding to the horizontal plane, but the present disclosure is not limited to this.
  • the two-dimensional coordinate space in the map data MAP and the two-dimensional coordinate space for setting the bounding box B according to the division condition C1 may be different from each other.
  • Specifically, the two-dimensional coordinate space for setting the bounding box B related to the division condition C1 may be a coordinate space rotated by a predetermined angle (for example, 45 degrees) from the two-dimensional coordinate space in the map data MAP.
  • When the clustering condition C2 includes a condition regarding the size of the bounding box B, the two-dimensional coordinate space for setting the bounding box B related to the clustering condition C2 may be the same as the two-dimensional coordinate space in the map data MAP.
  • the object recognition unit 13 corresponds to a specific example of the "generation unit" in the present disclosure.
  • Similarly, in the above embodiment, when generating the three-dimensional map data MAP, the division determination unit 21 sets the bounding box B in the three-dimensional coordinate space, but the present disclosure is not limited to this. Instead, the three-dimensional coordinate space in the map data MAP and the three-dimensional coordinate space in which the bounding box B related to the division condition C1 is set may be different from each other. Specifically, the three-dimensional coordinate space for setting the bounding box B related to the division condition C1 may be a coordinate space rotated, in the horizontal plane, by a predetermined angle (for example, 45 degrees) from the three-dimensional coordinate space in the map data MAP. When the clustering condition C2 includes a condition regarding the size of the bounding box B, the three-dimensional coordinate space for setting the bounding box B related to the clustering condition C2 may be the same as the three-dimensional coordinate space in the map data MAP.
  • the division processing unit 22 sets the axial direction in which the longest side of the plurality of sides of the bounding box B extends as the division direction, but the division direction is not limited to this. Instead, for example, the division processing unit 22 may set all the axial directions as the division directions.
  • For example, when generating the two-dimensional map data MAP, the division processing unit 22 may set both the X-axis direction and the Y-axis direction as division directions. In this case, the division processing unit 22 sets the center position of the bounding box B in the X-axis direction as the division position in the X-axis direction, and sets the center position of the bounding box B in the Y-axis direction as the division position in the Y-axis direction. As a result, the division processing unit 22 divides one point cloud PC into four point clouds PC.
  • the division processing unit 22 may set the X-axis direction, the Y-axis direction, and the Z-axis direction as the division directions.
  • the division processing unit 22 sets the center position of the bounding box B in the X-axis direction as the division position in the X-axis direction, and sets the center position of the bounding box B in the Y-axis direction as the division position in the Y-axis direction.
  • the center position of the bounding box B in the Z-axis direction is set as the division position in the Z-axis direction.
  • the division processing unit 22 divides one point cloud PC into eight point cloud PCs.
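This all-axes variant can be sketched as follows: each point is assigned to a quadrant (2-D) or octant (3-D) relative to the bounding-box center, so one point cloud splits into up to 4 or 8 sub-clouds. The bit-encoding scheme is our illustration, not the patent's.

```python
import numpy as np

def split_all_axes(points):
    """Split a point cloud at the bounding-box center on every axis:
    a 2-D cloud yields up to 4 quadrant sub-clouds, a 3-D cloud up to 8."""
    pts = np.asarray(points, dtype=float)
    center = (pts.min(axis=0) + pts.max(axis=0)) / 2.0
    # One bit per axis: which side of the center each point lies on
    codes = (pts >= center).astype(int)
    keys = codes.dot(1 << np.arange(pts.shape[1]))
    return [pts[keys == k] for k in np.unique(keys)]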
  • the division processing unit 22 sets the center position of the bounding box B in the division direction as the division position, but the present invention is not limited to this. Instead, for example, the division processing unit 22 may set the average position of the positions of the plurality of points P included in the point cloud PC in the division direction as the division position.
  • FIG. 5 shows an example of a configuration of the robot 200.
  • The robot 200 includes image sensors 11L and 11R, a depth estimation unit 12, a plane estimation unit 20, image sensors 31L and 31R, a depth estimation unit 32, a plane estimation unit 40, a ToF (Time of Flight) sensor 51, a signal processing unit 52, a plane estimation unit 60, an integration unit 19, an object recognition unit 13, a self-position estimation unit 14, an action plan determination unit 15, an actuator 16, and a moving mechanism 17.
  • The plane estimation units 20, 40, and 60, the integration unit 19, the object recognition unit 13, the self-position estimation unit 14, and the action plan determination unit 15 can be configured by using a processor that performs processing based on a program.
  • the robot 200 includes three types of sensors: image sensors 11L and 11R, image sensors 31L and 31R, and ToF sensor 51.
  • the image sensor 11L generates the left image PL1
  • the image sensor 11R generates the right image PR1
  • the depth estimation unit 12 generates the point cloud data DPC1 based on the left image PL1 and the right image PR1.
  • the plane estimation unit 20 generates cluster data DC1 based on the point cloud data DPC1.
  • The image sensors 31L and 31R are so-called stereo cameras, and are configured to generate a set of images having parallax with each other (a left image PL2 and a right image PR2) by capturing an image of, for example, the front of the robot 200.
  • the angles of view of the image sensors 31L and 31R may be the same as or different from the angles of view of the image sensors 11L and 11R.
  • Similar to the depth estimation unit 12, the depth estimation unit 32 generates a depth map by estimating the depth based on the left image PL2 and the right image PR2, and is configured to generate point cloud data DPC2 including data about the point cloud PC based on this depth map.
  • Similar to the plane estimation unit 20, the plane estimation unit 40 generates a microplane group by clustering the point cloud PC based on the point cloud data DPC2, and is configured to generate cluster data DC2 including a plurality of clusters each corresponding to a microplane. Similar to the plane estimation unit 20, the plane estimation unit 40 includes a division determination unit 41, a division processing unit 42, a clustering determination unit 43, and a clustering processing unit 44.
  • the ToF sensor 51 is a distance measuring sensor, and is configured to measure the distance to, for example, an object in front of the robot 200.
  • the signal processing unit 52 is configured to generate point cloud data DPC3 including data about the point cloud PC based on the measurement result of the ToF sensor 51.
  • Similar to the plane estimation unit 20, the plane estimation unit 60 generates a microplane group by clustering the point cloud PC based on the point cloud data DPC3, and is configured to generate cluster data DC3 including a plurality of clusters each corresponding to a microplane. Similar to the plane estimation unit 20, the plane estimation unit 60 includes a division determination unit 61, a division processing unit 62, a clustering determination unit 63, and a clustering processing unit 64.
  • the integration unit 19 is configured to generate cluster data DC by integrating cluster data DC1, cluster data DC2, and cluster data DC3. Then, the integration unit 19 supplies the cluster data DC to the object recognition unit 13.
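The data flow above (one plane estimation per sensor path, followed by integration) can be sketched as follows. This is a toy illustration only: the function and variable names are assumptions, and `plane_estimation` is a stand-in that does not apply the actual division and clustering conditions.

```python
# A toy sketch of the robot 200 data flow described above: each sensor path
# runs its own plane estimation (divide, then cluster), and the integration
# unit 19 merges the resulting cluster data. All names are illustrative
# assumptions, not an API defined by the text.

def plane_estimation(point_cloud):
    """Stand-in for a plane estimation unit; a real one would apply the
    division condition C1 and clustering condition C2. Here the whole
    input is treated as a single cluster for illustration."""
    return [point_cloud] if point_cloud else []

def integrate(*cluster_lists):
    """Integration unit 19: merge per-sensor cluster lists into one set."""
    merged = []
    for clusters in cluster_lists:
        merged.extend(clusters)
    return merged

dpc1 = [(0.0, 0.0, 1.0)]   # point cloud data from the stereo pair 11L/11R
dpc2 = [(1.0, 0.0, 2.0)]   # point cloud data from the stereo pair 31L/31R
dpc3 = [(0.5, 0.5, 1.5)]   # point cloud data from the ToF sensor 51

dc = integrate(plane_estimation(dpc1), plane_estimation(dpc2), plane_estimation(dpc3))
print(len(dc))  # 3: one cluster per sensor path in this toy example
```

The point is structural: clustering happens independently per sensor, so the integration step only concatenates cluster lists rather than merging raw points.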
  • The image sensors 31L and 31R correspond to a specific example of the "second sensor" in the present disclosure.
  • The point cloud data DPC1 corresponds to a specific example of the "first point cloud data" in the present disclosure.
  • The point cloud data DPC2 corresponds to a specific example of the "second point cloud data" in the present disclosure.
  • The division determination unit 41 corresponds to a specific example of the "second division determination unit" in the present disclosure.
  • The division processing unit 42 corresponds to a specific example of the "second division processing unit" in the present disclosure.
  • The clustering determination unit 43 corresponds to a specific example of the "second clustering determination unit" in the present disclosure.
  • The clustering processing unit 44 corresponds to a specific example of the "second clustering processing unit" in the present disclosure.
  • The integration unit 19 corresponds to a specific example of the "integration unit" in the present disclosure.
  • As described above, the robot 200 is provided with additional sensors in addition to the image sensors 11L and 11R; specifically, the image sensors 31L and 31R and the ToF sensor 51 are provided.
  • With this configuration, the accuracy of the map data MAP can be improved, for example.
  • In addition, the robot 200 can obtain an accurate point cloud PC, so that the accuracy of clustering can be improved.
  • In this case, the processing amount is large and the processing time tends to be long; even so, the processing time can be shortened compared with the case where the point cloud PC is clustered by the neighborhood point search described in Non-Patent Document 1.
  • In the above example, the integration unit 19 is provided after the three plane estimation units 20, 40, and 60, but the present technology is not limited to this; for example, as in the robot 200A shown in FIG., one plane estimation unit 20 may be provided in the stage following an integration unit.
  • The robot 200A includes an integration unit 19A.
  • The integration unit 19A is configured to generate point cloud data DPC by integrating the three point cloud data DPC1, DPC2, and DPC3.
  • The plane estimation unit 20 generates the cluster data DC by clustering the point cloud PC based on the point cloud data DPC.
  • The integration unit 19A corresponds to a specific example of the "integration unit" in the present disclosure. With this configuration, the robot 200A requires only one plane estimation unit 20, so that the configuration can be simplified.
  • The robot 200B includes a ToF sensor 71, a signal processing unit 72, an integration unit 19B, a plane estimation unit 80, and an integration unit 19C.
  • The robot 200B thus includes three systems of sensors: the image sensors 11L and 11R, the ToF sensor 51, and the ToF sensor 71.
  • The ToF sensor 71 is a distance measuring sensor, and is configured to measure the distance to, for example, an object in front of the robot 200B.
  • The signal processing unit 72 is configured to generate point cloud data DPC4 including data about the point cloud PC based on the measurement result of the ToF sensor 71.
  • The integration unit 19B is configured to generate point cloud data DPC5 by integrating the point cloud data DPC3 and the point cloud data DPC4.
  • The plane estimation unit 80 is configured to generate a microplane group by clustering the point cloud PC based on the point cloud data DPC5, and to generate cluster data DC4 including a plurality of clusters each corresponding to a microplane.
  • The plane estimation unit 80 includes a division determination unit 81, a division processing unit 82, a clustering determination unit 83, and a clustering processing unit 84.
  • The integration unit 19C is configured to generate the cluster data DC by integrating the cluster data DC1 and the cluster data DC4.
  • The ToF sensor 51 corresponds to a specific example of the "second sensor" in the present disclosure.
  • The point cloud data DPC3 corresponds to a specific example of the "second point cloud data" in the present disclosure.
  • The ToF sensor 71 corresponds to a specific example of the "third sensor" in the present disclosure.
  • The point cloud data DPC4 corresponds to a specific example of the "third point cloud data" in the present disclosure.
  • The integration unit 19B corresponds to a specific example of the "first integration unit" in the present disclosure.
  • The division determination unit 81 corresponds to a specific example of the "second division determination unit" in the present disclosure.
  • The division processing unit 82 corresponds to a specific example of the "second division processing unit" in the present disclosure.
  • The clustering determination unit 83 corresponds to a specific example of the "second clustering determination unit" in the present disclosure.
  • The clustering processing unit 84 corresponds to a specific example of the "second clustering processing unit" in the present disclosure.
  • The integration unit 19C corresponds to a specific example of the "second integration unit" in the present disclosure.
  • This configuration is effective when the information obtained by the ToF sensor 51 and the information obtained by the ToF sensor 71 can be easily integrated, while it is difficult to integrate the information obtained by the two ToF sensors 51 and 71 with the information obtained by the image sensors 11L and 11R.
  • In the above example, the three plane estimation units 20, 40, and 60 are provided; the division conditions C1 in these three plane estimation units may be the same as each other or may be different from each other. Similarly, the clustering conditions C2 in these three plane estimation units may be the same as each other or may be different from each other.
  • In the above example, the image sensors 11L and 11R are provided, but a ToF sensor may be provided instead.
  • In addition, the configuration of the three systems of sensors in the second embodiment may be changed as appropriate.
The present technology can be configured as follows. According to the present technology having the following configuration, the processing time can be shortened.

(1) An information processing device including:
a first division determination unit that determines, based on a division condition, whether or not to divide a point cloud included in first point cloud data supplied from a first sensor; and
a first division processing unit that, when the first division determination unit determines that the point cloud satisfies the division condition, divides the point cloud included in the first point cloud data until the division condition is no longer satisfied.
(2) The information processing device according to (1), in which the division condition includes one or more of a condition regarding the number of points included in the point cloud, a condition regarding the size of a region surrounding the point cloud, a condition regarding the density of points in the point cloud, and a condition regarding the shape of the point cloud.
(3) The information processing device according to (1) or (2), in which the division condition can be changed according to the distance between the point cloud and the first sensor.
(4) The information processing device according to any one of (1) to (3), in which a region surrounding the point cloud is arranged in a coordinate space indicated by a plurality of coordinate axes, and the first division processing unit sets a division direction based on the axial direction, among the axial directions of the plurality of coordinate axes, in which the length of the region surrounding the point cloud is longest, sets a division position in the set division direction, and divides the point cloud at the division position.
(5) The information processing device according to (4), in which the first division processing unit sets, as the division position, the center position in the division direction of the region surrounding the point cloud.
(6) The information processing device according to (4), in which the first division processing unit sets, as the division position, the average position in the division direction of the plurality of points included in the point cloud.
(7) The information processing device according to any one of (4) to (6), further including a generation unit that generates map data based on the point cloud processed by the first division processing unit, in which the plurality of coordinate axes are obtained by rotating, in a predetermined direction, a plurality of map coordinate axes used in the coordinate space of the map data.
(8) The information processing device according to any one of (1) to (3), in which a region surrounding the point cloud is arranged in a coordinate space indicated by a plurality of coordinate axes, and the first division processing unit sets a division position in each of the axial directions of the plurality of coordinate axes and divides the point cloud at the plurality of division positions.
(9) The information processing device according to any one of (1) to (8), further including a first clustering determination unit that, when the first division determination unit determines that the point cloud does not satisfy the division condition, determines, based on a clustering condition, whether or not to cluster the point cloud processed by the first division processing unit.
(10) The information processing device according to (9), in which the clustering condition includes one or more of a condition regarding the number of points included in the point cloud, a condition regarding the size of a region surrounding the point cloud, a condition regarding the density of points in the point cloud, and a condition regarding the shape of the point cloud.
(11) The information processing device according to (9) or (10), in which the clustering condition can be changed according to the distance between the point cloud and the first sensor.
(12) The information processing device according to any one of (9) to (11), further including a first clustering processing unit that clusters the point cloud processed by the first division processing unit when the first clustering determination unit determines that the point cloud satisfies the clustering condition.
(13) The information processing device according to any one of (1) to (8), further including:
a second division determination unit that determines, based on the division condition, whether or not to divide a point cloud included in second point cloud data supplied from a second sensor;
a second division processing unit that, when the second division determination unit determines that the point cloud satisfies the division condition, divides the point cloud included in the second point cloud data until the division condition is no longer satisfied; and
an integration unit that integrates a processing result of the first division processing unit and a processing result of the second division processing unit.
(14) The information processing device according to (13), further including:
a first clustering determination unit that, when the first division determination unit determines that the point cloud does not satisfy the division condition, determines, based on a clustering condition, whether or not to cluster the point cloud processed by the first division processing unit;
a first clustering processing unit that clusters the point cloud processed by the first division processing unit when the first clustering determination unit determines that the point cloud satisfies the clustering condition;
a second clustering determination unit that, when the second division determination unit determines that the point cloud does not satisfy the division condition, determines, based on the clustering condition, whether or not to cluster the point cloud processed by the second division processing unit; and
a second clustering processing unit that clusters the point cloud processed by the second division processing unit when the second clustering determination unit determines that the point cloud satisfies the clustering condition,
in which the integration unit integrates a processing result of the first clustering processing unit and a processing result of the second clustering processing unit.
(15) The information processing device according to any one of (1) to (8), further including an integration unit that integrates the first point cloud data and second point cloud data supplied from a second sensor, in which the first division determination unit determines, based on the division condition, whether or not to divide the point cloud included in the first point cloud data and the second point cloud data integrated by the integration unit, and, when the first division determination unit determines that the point cloud satisfies the division condition, the first division processing unit divides the point cloud included in the first point cloud data and the second point cloud data until the division condition is no longer satisfied.
(16) The information processing device according to any one of (1) to (8), further including:
a first integration unit that integrates second point cloud data supplied from a second sensor and third point cloud data supplied from a third sensor;
a second division determination unit that determines, based on the division condition, whether or not to divide a point cloud included in the second point cloud data and the third point cloud data integrated by the first integration unit;
a second division processing unit that, when the second division determination unit determines that the point cloud satisfies the division condition, divides the point cloud included in the second point cloud data and the third point cloud data until the division condition is no longer satisfied; and
a second integration unit that integrates a processing result of the first division processing unit and a processing result of the second division processing unit.
(17) The information processing device according to (16), further including:
a first clustering determination unit that, when the first division determination unit determines that the point cloud does not satisfy the division condition, determines, based on a clustering condition, whether or not to cluster the point cloud processed by the first division processing unit;
a first clustering processing unit that clusters the point cloud processed by the first division processing unit when the first clustering determination unit determines that the point cloud satisfies the clustering condition;
a second clustering determination unit that, when the second division determination unit determines that the point cloud does not satisfy the division condition, determines, based on the clustering condition, whether or not to cluster the point cloud processed by the second division processing unit; and
a second clustering processing unit that clusters the point cloud processed by the second division processing unit when the second clustering determination unit determines that the point cloud satisfies the clustering condition,
in which the second integration unit integrates a processing result of the first clustering processing unit and a processing result of the second clustering processing unit.
(18) The information processing device according to any one of (1) to (17), in which the first sensor and the information processing device are provided in a mobile body.
(19) An information processing method including:
a first division determination process of determining, based on a division condition, whether or not to divide a point cloud included in first point cloud data supplied from a first sensor; and
a first division process of, when it is determined in the first division determination process that the point cloud satisfies the division condition, dividing the point cloud included in the first point cloud data until the division condition is no longer satisfied.
(20) A program that causes a computer to execute:
a first division determination process of determining, based on a division condition, whether or not to divide a point cloud included in first point cloud data supplied from a first sensor; and
a first division process of, when it is determined in the first division determination process that the point cloud satisfies the division condition, dividing the point cloud included in the first point cloud data until the division condition is no longer satisfied.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

This information processing device includes: a first division determination unit that determines, on the basis of a division condition, whether or not to divide a point cloud included in first point cloud data supplied from a first sensor; and a first division processing unit that, when the first division determination unit determines that the point cloud satisfies the division condition, divides the point cloud included in the first point cloud data until the division condition is no longer satisfied.

Description

Information processing device, information processing method, and program
 The present disclosure relates to an information processing device that performs information processing, an information processing method used in such an information processing device, and a program that causes a computer to perform such information processing.
 For example, a distance measuring sensor acquires a point cloud by measuring distances to surrounding objects. Such a point cloud is often clustered, and various kinds of processing are performed based on the clustered data. For example, Non-Patent Document 1 discloses a technique for clustering a point cloud by searching for neighboring points.
 In an information processing device, a short processing time is generally desired, and a short processing time is also expected in such point cloud clustering processing.
 It is desirable to provide an information processing device, an information processing method, and a program that can shorten the processing time.
 An information processing device according to an embodiment of the present disclosure includes a first division determination unit and a first division processing unit. The first division determination unit is configured to determine, based on a division condition, whether or not to divide a point cloud included in first point cloud data supplied from a first sensor. The first division processing unit is configured to, when the first division determination unit determines that the point cloud satisfies the division condition, divide the point cloud included in the first point cloud data until the division condition is no longer satisfied.
 An information processing method according to an embodiment of the present disclosure includes: a first division determination process of determining, based on a division condition, whether or not to divide a point cloud included in first point cloud data supplied from a first sensor; and a first division process of, when it is determined in the first division determination process that the point cloud satisfies the division condition, dividing the point cloud included in the first point cloud data until the division condition is no longer satisfied.
 A program according to an embodiment of the present disclosure is configured to cause a computer to execute: a first division determination process of determining, based on a division condition, whether or not to divide a point cloud included in first point cloud data supplied from a first sensor; and a first division process of, when it is determined in the first division determination process that the point cloud satisfies the division condition, dividing the point cloud included in the first point cloud data until the division condition is no longer satisfied.
 In the information processing device, the information processing method, and the program according to an embodiment of the present disclosure, whether or not to divide the point cloud included in the first point cloud data supplied from the first sensor is determined based on the division condition. Then, when it is determined that the point cloud satisfies the division condition, the point cloud included in the first point cloud data is divided until the division condition is no longer satisfied.
FIG. 1 is a block diagram showing a configuration example of a robot according to a first embodiment of the present disclosure.
FIG. 2 is an explanatory diagram showing an example of a point cloud.
FIG. 3 is a flowchart showing an operation example of the plane estimation unit shown in FIG. 1.
FIG. 4 is an explanatory diagram showing an example of point cloud division processing according to an embodiment.
FIG. 5 is a block diagram showing a configuration example of a robot according to a second embodiment.
FIG. 6 is a block diagram showing a configuration example of a robot according to a modification of the second embodiment.
FIG. 7 is a block diagram showing a configuration example of a robot according to a modification of the second embodiment.
 Hereinafter, embodiments of the present disclosure will be described in detail with reference to the drawings. The description will be given in the following order.
1. First embodiment
2. Second embodiment
<1. First Embodiment>
[Configuration example]
(Overall configuration example)
 FIG. 1 shows a configuration example of a robot 100 provided with an information processing device according to an embodiment. Since the information processing method according to an embodiment of the present disclosure is embodied by this embodiment, it will be described together.
 The robot 100 is configured to generate map data based on captured images of its surroundings and to move autonomously using the map data. The robot 100 may be, for example, a traveling robot that travels on a plane, or a drone that flies in space. The robot 100 includes image sensors 11L and 11R, a depth estimation unit 12, a plane estimation unit 20, an object recognition unit 13, a self-position estimation unit 14, an action plan determination unit 15, an actuator 16, and a movement mechanism 17. For example, the plane estimation unit 20, the object recognition unit 13, the self-position estimation unit 14, and the action plan determination unit 15 can be configured using a processor that performs processing based on a program.
 The image sensors 11L and 11R are so-called stereo cameras, and are configured to generate a set of images having parallax with each other by capturing, for example, the front of the robot 100. The image sensor 11L and the image sensor 11R are arranged horizontally, separated by a predetermined distance. The image sensor 11L is arranged to the left of the image sensor 11R and generates a left image PL, and the image sensor 11R is arranged to the right of the image sensor 11L and generates a right image PR. The image sensors 11L and 11R perform imaging operations in synchronization with each other at a predetermined frame rate (for example, 30 [fps]).
 The depth estimation unit 12 is configured to generate a depth map by estimating the depth based on the left image PL and the right image PR. Then, based on this depth map, the depth estimation unit 12 generates point cloud data DPC including data about a point cloud PC.
 FIG. 2 schematically shows an example of the point cloud PC. In FIG. 2, the robot 100 is schematically represented as a rectangular parallelepiped. The depth estimation unit 12 estimates the depth (distance) to image points corresponding to each other in the left image PL and the right image PR. Then, based on this depth information, the depth estimation unit 12 converts each image point into, for example, a point P in a three-dimensional coordinate system. In this way, the depth estimation unit 12 converts a plurality of image points into a plurality of points P in the three-dimensional coordinate system. As a result, the depth estimation unit 12 acquires the point cloud PC, which is a collection of points P.
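The conversion from an image point with an estimated depth to a point P can be sketched with a standard pinhole back-projection. The camera parameters below (focal lengths fx, fy and principal point cx, cy) are illustrative assumptions; the text does not specify camera intrinsics or a particular projection model.

```python
# A sketch of how an image point with an estimated depth can be converted
# into a point P of the point cloud PC, assuming a pinhole camera model.

def backproject(u, v, depth, fx, fy, cx, cy):
    """Convert an image point (u, v) with depth into a 3D point (x, y, z)."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# The point cloud PC is then the collection of back-projected points.
depth_map = {(320, 240): 2.0, (100, 80): 4.0}   # toy depth map: pixel -> depth
pc = [backproject(u, v, d, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
      for (u, v), d in depth_map.items()]
print(pc[0])  # (0.0, 0.0, 2.0): the principal-point pixel lies on the optical axis
```

In a stereo setup such as this one, the depth itself would come from the disparity between the left image PL and the right image PR before being back-projected.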
 The plane estimation unit 20 is configured to generate a microplane group by clustering the point cloud PC based on the point cloud data DPC. Each cluster corresponds to one of a plurality of microplanes constituting the surfaces of objects around the robot 100. The plane estimation unit 20 generates cluster data DC including the plurality of clusters generated in this way. The plane estimation unit 20 includes a division determination unit 21, a division processing unit 22, a clustering determination unit 23, and a clustering processing unit 24.
 The division determination unit 21 is configured to determine whether or not to divide the point cloud PC based on a division condition C1. The division condition C1 is a condition for determining whether or not to divide the point cloud PC; when the point cloud PC satisfies the division condition C1, the point cloud PC is divided.
 The division processing unit 22 is configured to, when the division determination unit 21 determines that the point cloud PC satisfies the division condition C1, divide the point cloud PC until the division condition C1 is no longer satisfied.
 The clustering determination unit 23 is configured to, when the division determination unit 21 determines that the point cloud PC does not satisfy the division condition C1, determine whether or not to cluster the point cloud PC based on a clustering condition C2. The clustering condition C2 is a condition for determining whether or not to cluster the point cloud PC; when the point cloud PC satisfies the clustering condition C2, the point cloud PC is clustered.
 The clustering processing unit 24 is configured to cluster the point cloud PC when the clustering determination unit 23 determines that the point cloud PC satisfies the clustering condition C2.
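Taken together, units 21 to 24 implement a divide-until-the-condition-fails, then-cluster flow. The sketch below is a minimal illustration under stated assumptions: a simple point-count threshold stands in for the division condition C1 and the clustering condition C2 (the text allows several condition types), and the longest-bounding-box-axis split with a center division position is one of the division strategies described later.

```python
# A minimal sketch of the divide-then-cluster flow of the plane estimation
# unit 20. The concrete conditions and thresholds are illustrative
# assumptions, not values fixed by the text.

def split_longest_axis(points):
    """Divide a point cloud at the center of its longest bounding-box axis."""
    mins = [min(p[i] for p in points) for i in range(3)]
    maxs = [max(p[i] for p in points) for i in range(3)]
    axis = max(range(3), key=lambda i: maxs[i] - mins[i])
    mid = (mins[axis] + maxs[axis]) / 2.0
    left = [p for p in points if p[axis] <= mid]
    right = [p for p in points if p[axis] > mid]
    return left, right

def estimate_planes(points, max_points=4, min_points=2):
    """Recursively divide while C1 holds, then keep clusters satisfying C2."""
    if not points:
        return []
    if len(points) > max_points:            # division condition C1 (point count)
        left, right = split_longest_axis(points)
        if not left or not right:           # guard against a degenerate split
            return [points]
        return estimate_planes(left, max_points, min_points) + \
               estimate_planes(right, max_points, min_points)
    if len(points) >= min_points:           # clustering condition C2 (point count)
        return [points]                     # this point cloud becomes one cluster
    return []                               # too few points: no cluster

pts = [(float(i), 0.0, 0.0) for i in range(8)]
clusters = estimate_planes(pts)
print(len(clusters))  # 2
```

Because the recursion only bisects bounding regions, the work per point stays close to logarithmic in the cloud size, which is the intuition behind the shorter processing time compared with a neighborhood point search.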
 The object recognition unit 13 is configured to recognize objects by grouping one or more clusters based on the cluster data DC. The objects correspond to the respective physical objects around the robot 100, specifically, people, walls, floors, and the like around the robot 100. Then, the object recognition unit 13 generates map data MAP represented by a plurality of clusters based on the recognized objects. This map data MAP can be two-dimensional data when, for example, the robot 100 is a traveling robot that travels on a plane, and can be three-dimensional data when, for example, the robot 100 is a drone that flies in space. In the map data MAP, each cluster is represented using, for example, center coordinates and a probability distribution shape. As a result, the map data MAP can keep the data size small and the accuracy high compared with, for example, a configuration using a so-called grid map. Since the robot 100 can operate based on the map data MAP represented by such a plurality of clusters, it can perform processing in a short time using fewer computational resources.
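Each cluster in the map data MAP is expressed by center coordinates and a probability distribution shape. One common realization of such a representation (an assumption here; the text does not fix the distribution model) is the mean and covariance of the cluster's points:

```python
# Summarize a cluster by center coordinates (mean) and a covariance matrix,
# which describes the spread of the points as a probability distribution shape.

def cluster_summary(points):
    """Return (center, covariance) of a list of equal-dimension points."""
    n = len(points)
    dim = len(points[0])
    center = [sum(p[i] for p in points) / n for i in range(dim)]
    cov = [[sum((p[i] - center[i]) * (p[j] - center[j]) for p in points) / n
            for j in range(dim)] for i in range(dim)]
    return center, cov

# A square of four 2D points: the center is (1, 1) with unit variance per axis.
center, cov = cluster_summary([(0.0, 0.0), (2.0, 0.0), (0.0, 2.0), (2.0, 2.0)])
print(center)     # [1.0, 1.0]
print(cov[0][0])  # 1.0
```

Storing only a center and a covariance per cluster is what keeps the map data small compared with a grid map, which must store a value per cell regardless of content.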
 自己位置推定部14は、左画像PLおよび地図データMAPに基づいて、ロボット100の位置を推定することにより、ロボット100の位置を示す位置データPOSを生成するように構成される。なお、この例では、左画像PLに基づいて位置を推定したが、これに限定されるものではなく、例えば右画像PRに基づいて位置を推定してもよいし、左画像PLおよび右画像PRに基づいて位置を推定してもよい。 The self-position estimation unit 14 is configured to generate a position data POS indicating the position of the robot 100 by estimating the position of the robot 100 based on the left image PL and the map data MAP. In this example, the position is estimated based on the left image PL, but the position is not limited to this, and the position may be estimated based on, for example, the right image PR, or the left image PL and the right image PR. The position may be estimated based on.
 行動計画決定部15は、地図データMAPおよび位置データPOSに基づいて、ロボット100の周囲の状況を把握することにより、ロボット100の行動計画を決定するように構成される。 The action plan determination unit 15 is configured to determine the action plan of the robot 100 by grasping the surrounding situation of the robot 100 based on the map data MAP and the position data POS.
 アクチュエータ16は、行動計画決定部15が決定した行動計画に基づいて動力を生成し、その動力に基づいて移動機構17を駆動するように構成される。アクチュエータ16は、例えば、1または複数のモータを含んで構成される。 The actuator 16 is configured to generate power based on the action plan determined by the action plan determination unit 15 and drive the moving mechanism 17 based on the power. The actuator 16 is configured to include, for example, one or more motors.
The moving mechanism 17 is configured to move the robot 100 using the power generated by the actuator 16. The moving mechanism 17 includes, for example, one or more wheels when the robot 100 is a traveling robot that runs on a plane, and one or more propellers when the robot 100 is a drone.
Here, the image sensors 11L and 11R correspond to a specific example of the "first sensor" in the present disclosure. The point cloud data DPC corresponds to a specific example of the "first point cloud data" in the present disclosure. The division determination unit 21 corresponds to a specific example of the "first division determination unit" in the present disclosure. The division condition C1 corresponds to a specific example of the "division condition" in the present disclosure. The division processing unit 22 corresponds to a specific example of the "first division processing unit" in the present disclosure. The clustering determination unit 23 corresponds to a specific example of the "first clustering determination unit" in the present disclosure. The clustering condition C2 corresponds to a specific example of the "clustering condition" in the present disclosure. The clustering processing unit 24 corresponds to a specific example of the "first clustering processing unit" in the present disclosure.
[Operation and Action]
Next, the operation and action of the robot 100 of the present embodiment will be described.
(Overview of Overall Operation)
First, an outline of the overall operation of the robot 100 will be described with reference to FIG. 1. The image sensors 11L and 11R generate a left image PL and a right image PR having parallax with each other by capturing, for example, the area in front of the robot 100. The depth estimation unit 12 generates a depth map by estimating depth based on the left image PL and the right image PR, and generates point cloud data DPC containing data about a point cloud PC based on this depth map. The plane estimation unit 20 generates a group of micro-planes by clustering the point cloud PC based on the point cloud data DPC, and generates cluster data DC containing a plurality of clusters each corresponding to a micro-plane. The object recognition unit 13 recognizes objects by grouping one or more clusters based on the cluster data DC, and generates map data MAP represented by a plurality of clusters based on the recognized objects. The self-position estimation unit 14 generates position data POS by estimating the position of the robot 100 on the map based on the left image PL and the map data MAP. The action plan determination unit 15 determines an action plan for the robot 100 by grasping the situation around the robot 100 based on the map data MAP and the position data POS. The actuator 16 generates power based on the action plan determined by the action plan determination unit 15 and drives the moving mechanism 17 with that power. The moving mechanism 17 moves the robot 100 using the power generated by the actuator 16.
(Detailed Operation)
Next, the operation of the plane estimation unit 20 will be described in detail.
FIG. 3 shows an operation example of the plane estimation unit 20. In the plane estimation unit 20, the division determination unit 21 determines whether to divide the point cloud PC based on the division condition C1, and when the division determination unit 21 determines that the point cloud PC satisfies the division condition C1, the division processing unit 22 divides the point cloud PC until the division condition C1 is no longer satisfied. When the division determination unit 21 determines that the point cloud PC does not satisfy the division condition C1, the clustering determination unit 23 determines whether to cluster the point cloud PC based on the clustering condition C2, and when the clustering determination unit 23 determines that the point cloud PC satisfies the clustering condition C2, the clustering processing unit 24 clusters the point cloud PC. This operation will be described in detail below.
First, the division determination unit 21 focuses on the entire point cloud PC contained in the point cloud data DPC (step S101).
Next, the division determination unit 21 obtains the number of points P contained in the point cloud PC of interest (the point count N) (step S102).
Next, the division determination unit 21 sets a bounding box B, which is the smallest region enclosing all of the point cloud PC of interest (step S103).
For example, when the robot 100 is a traveling robot that runs on a plane and generates two-dimensional map data MAP in a two-dimensional coordinate space corresponding to the horizontal plane, the division determination unit 21 sets the bounding box B in this two-dimensional coordinate space. When the two-dimensional coordinate space is defined using an orthogonal coordinate system with an X axis and a Y axis, the division determination unit 21 can set, as the bounding box B, the smallest rectangular region enclosing all of the point cloud PC of interest, based on the X coordinates of the two extreme points P in the X-axis direction and the Y coordinates of the two extreme points P in the Y-axis direction among the plurality of points P contained in the point cloud PC.
Further, for example, when the robot 100 is a drone that flies through space and generates three-dimensional map data MAP in a three-dimensional coordinate space, the division determination unit 21 sets the bounding box B in this three-dimensional coordinate space. When the three-dimensional coordinate space is defined using an orthogonal coordinate system with X, Y, and Z axes, the division determination unit 21 can set, as the bounding box B, the smallest rectangular-parallelepiped region enclosing all of the point cloud PC of interest, based on the X coordinates of the two extreme points P in the X-axis direction, the Y coordinates of the two extreme points P in the Y-axis direction, and the Z coordinates of the two extreme points P in the Z-axis direction.
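The bounding-box construction described above can be written compactly for either the 2-D or the 3-D case, since both reduce to per-axis minima and maxima. A minimal sketch (the helper name is our own):

```python
import numpy as np

def bounding_box(points):
    """Axis-aligned bounding box B: the smallest rectangle (2-D) or
    rectangular parallelepiped (3-D) enclosing every point. `points`
    is an (N, D) array with D = 2 or 3; returns two opposite corners."""
    pts = np.asarray(points, dtype=float)
    lo = pts.min(axis=0)  # per-axis minimum coordinates (one corner)
    hi = pts.max(axis=0)  # per-axis maximum coordinates (opposite corner)
    return lo, hi

lo, hi = bounding_box([[0.0, 2.0], [4.0, 1.0], [2.0, 5.0]])
print(hi - lo)  # side lengths of B along each axis
```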
Next, the division determination unit 21 checks whether the point cloud PC of interest satisfies the division condition C1, based on the point count N obtained in step S102 and the bounding box B set in step S103 (step S104). In this example, the division condition C1 includes a condition that the point count N is at least a predetermined number (for example, 100 or more) and a condition that the length of the longest side of the bounding box B is at least a predetermined length (for example, 5 cm or more). The division determination unit 21 determines that the point cloud PC of interest satisfies the division condition C1 when all the conditions included in the division condition C1 are satisfied.
When the point cloud PC of interest satisfies the division condition C1 in step S104 ("Y" in step S104), the division processing unit 22 sets, as the division direction, the axis along which the longest side of the bounding box B extends (step S105).
Next, the division processing unit 22 sets the center position of the bounding box B in the division direction as the division position (step S106).
Next, the division processing unit 22 divides the point cloud PC of interest at the division position set in step S106, along the division direction set in step S105 (step S107).
Next, the division determination unit 21 focuses on one of the two point clouds PC produced by the division in step S107 (step S108). The process then returns to step S102.
For example, by repeating the processing of steps S102 to S108, the point cloud PC of interest is divided repeatedly. That is, the number of points P contained in the point cloud PC of interest decreases, and the bounding box B of the point cloud PC of interest becomes smaller. Eventually, the point cloud PC of interest no longer satisfies the division condition C1.
When the point cloud PC of interest no longer satisfies the division condition C1 in step S104 ("N" in step S104), the clustering determination unit 23 checks, based on the point count N, whether the point cloud PC of interest satisfies the clustering condition C2 (step S109). In this example, the clustering condition C2 includes a condition that the point count N is at least a predetermined number (for example, 50 or more). When the point cloud PC of interest satisfies the clustering condition C2 ("Y" in step S109), the clustering processing unit 24 clusters the point cloud PC of interest into one cluster (step S110). When the point cloud PC of interest does not satisfy the clustering condition C2 ("N" in step S109), the clustering processing unit 24 skips the processing of step S110. That is, in this case, the clustering processing unit 24 excludes the point cloud PC of interest from the targets of clustering.
Next, the division determination unit 21 checks whether there is another point cloud PC that satisfies the division condition C1 (step S111). When there is another point cloud PC that satisfies the division condition C1 ("Y" in step S111), the division determination unit 21 focuses on one of the point clouds PC that satisfy the division condition C1 (step S108). The process then returns to step S102.
When there is no other point cloud PC that satisfies the division condition C1 in step S111 ("N" in step S111), this flow ends.
In this way, the plane estimation unit 20 generates a plurality of clusters, each corresponding to a micro-plane, by clustering the point cloud PC based on the point cloud data DPC.
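The whole loop of steps S101 to S111 can be summarized in a short recursive sketch. This is an illustrative reading of the flow, not the patent's code: the function and parameter names are our own, and the thresholds (100 points and 5 cm for the division condition C1, 50 points for the clustering condition C2) follow the examples given above.

```python
import numpy as np

def split_and_cluster(points, min_points=100, max_side=0.05, cluster_min=50):
    """Recursively divide a point cloud and collect clusters.
    C1 (divide): at least `min_points` points AND longest bounding-box
    side at least `max_side` (meters). C2 (cluster): at least
    `cluster_min` points; smaller clouds are discarded."""
    clusters = []

    def recurse(pts):
        n = len(pts)                               # S102: point count N
        lo, hi = pts.min(axis=0), pts.max(axis=0)  # S103: bounding box B
        sides = hi - lo
        if n >= min_points and sides.max() >= max_side:  # S104: condition C1
            axis = int(sides.argmax())             # S105: longest-side axis
            mid = (lo[axis] + hi[axis]) / 2.0      # S106: box-center position
            mask = pts[:, axis] < mid              # S107: divide the cloud
            left, right = pts[mask], pts[~mask]
            if len(left) and len(right):
                recurse(left)                      # S108/S111: process both halves
                recurse(right)
                return
        if n >= cluster_min:                       # S109: condition C2
            clusters.append(pts)                   # S110: emit one cluster
        # otherwise the cloud is excluded from clustering

    recurse(np.asarray(points, dtype=float))
    return clusters
```

Each level of recursion halves the bounding box along its longest side, so no nearest-neighbor search over the points is required.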
FIG. 4 shows an example of the division processing of a point cloud PC in the plane estimation unit 20. In FIG. 4, the point cloud PC is illustrated as the region over which the points P are distributed.
In this example, the point cloud PC (point cloud PC1) shown in FIG. 4(A) satisfies the division condition C1: its point count N is at least the predetermined number, and the upper and lower sides of its bounding box B are at least the predetermined length. Therefore, the division processing unit 22 sets the X-axis direction as the division direction, sets the center position of the bounding box B in the X-axis direction as the division position, and divides this point cloud PC into two point clouds PC (point clouds PC11 and PC12) (FIG. 4(B)).
Similarly, in this example, the point cloud PC11 is divided into two point clouds PC (point clouds PC111 and PC112) (FIG. 4(C)), and the point cloud PC112 is divided into two point clouds PC (point clouds PC1121 and PC1122) (FIG. 4(D)). Likewise, the point cloud PC12 is divided into two point clouds PC (point clouds PC121 and PC122) (FIG. 4(C)), and the point cloud PC121 is divided into two point clouds PC (point clouds PC1211 and PC1212) (FIG. 4(D)).
In this way, the plane estimation unit 20 recursively divides the point cloud PC until the point cloud PC no longer satisfies the division condition C1.
Thus, in the robot 100, the division determination unit 21 determines whether to divide the point cloud PC based on the division condition C1, and when the division determination unit 21 determines that the point cloud PC satisfies the division condition C1, the division processing unit 22 divides the point cloud PC until the division condition C1 is no longer satisfied. As a result, in the robot 100, when the point cloud PC becomes smaller than a certain predetermined size and no longer satisfies the division condition C1, for example, that point cloud PC can be treated as a cluster as it is, so the processing time can be shortened.
That is, when a point cloud PC is clustered by the nearest-neighbor search described in Non-Patent Document 1, for example, the number of nearest-neighbor searches must scale with the size of the point cloud PC, so the clustering processing time may become long. In contrast, the processing in the robot 100 according to the present embodiment is simple, so the processing time can be shortened. As a result, the response speed of the robot 100 improves, allowing the robot 100 to move agilely. This configuration also lowers the computing power required of the robot 100, so the cost of components such as the processor and memory can be reduced.
Further, in the robot 100, when the division determination unit 21 determines that the point cloud does not satisfy the division condition C1, the clustering determination unit 23 determines whether to cluster the point cloud PC based on the clustering condition C2, and when the clustering determination unit 23 determines that the point cloud PC satisfies the clustering condition C2, the clustering processing unit 24 clusters the point cloud PC. This allows the robot 100 to exclude, for example, point clouds PC that do not satisfy the clustering condition C2 from the targets of clustering. Inaccurate point clouds PC can thus be excluded from clustering, so the accuracy of clustering can be improved.
[Effects]
As described above, in the present embodiment, the division determination unit determines whether to divide the point cloud based on the division condition, and when the division determination unit determines that the point cloud satisfies the division condition, the division processing unit divides the point cloud until the division condition is no longer satisfied, so the processing time can be shortened.
In the present embodiment, when the division determination unit determines that the point cloud does not satisfy the division condition, the clustering determination unit determines whether to cluster the point cloud based on the clustering condition, and when the clustering determination unit determines that the point cloud satisfies the clustering condition, the clustering processing unit clusters the point cloud, so the accuracy of clustering can be improved.
[Modification 1-1]
In the above embodiment, the division condition C1 includes a condition that the point count N is at least a predetermined number and a condition that the length of the longest side of the bounding box B is at least a predetermined length, but this is not a limitation. For example, the division condition C1 may include only one of these two conditions.
Also, for example, the division condition C1 may thus include a condition on the number of points P contained in the point cloud PC, or a condition on the size of the bounding box B enclosing the point cloud PC. The division condition C1 may also include a condition on the density of the points P in the point cloud PC. Here, the "density" is the number of points P divided by the size of the bounding box B. The division condition C1 may also include a condition on the shape of the point cloud PC. Specifically, for example, the division determination unit 21 can determine that the division condition C1 is not satisfied when the point cloud PC represents a single plane.
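The density mentioned above (number of points divided by the size of the bounding box) could be computed as in the following sketch; treating the "size" as the box's area in 2-D or volume in 3-D is our own assumption, as is the helper name.

```python
import numpy as np

def point_density(points):
    """Density of a point cloud: the number of points P divided by the
    size of its bounding box B (assumed here to be the area in 2-D or
    the volume in 3-D)."""
    pts = np.asarray(points, dtype=float)
    sides = pts.max(axis=0) - pts.min(axis=0)  # bounding-box side lengths
    return len(pts) / float(np.prod(sides))

# Four points spanning a 2 x 1 box: density 4 / 2 = 2 points per unit area
print(point_density([[0.0, 0.0], [2.0, 0.0], [0.0, 1.0], [2.0, 1.0]]))  # → 2.0
```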
The division condition C1 may also be varied according to the distance between the point cloud PC and the image sensors 11L and 11R of the robot 100. Specifically, for example, when the division condition C1 includes a condition that the point count N is at least a "predetermined number", the "predetermined number" may be set to 100 when the distance is short and to 200 when the distance is long. That is, this exploits the fact that the shorter the distance, the better the positional accuracy of the point cloud PC: when the distance is short, reducing the "predetermined number" makes the clusters smaller, and when the distance is long, increasing the "predetermined number" makes the clusters larger.
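A distance-dependent threshold of this kind can be sketched as below. The 100 and 200 values follow the text; the 2 m cutoff and the function name are our own illustrative assumptions.

```python
def min_points_for_distance(distance_m, near=100, far=200, cutoff=2.0):
    """Choose the 'predetermined number' of the division condition C1 by
    distance: nearby point clouds are more accurate, so a smaller minimum
    yields smaller clusters; distant clouds get a larger minimum."""
    return near if distance_m < cutoff else far

print(min_points_for_distance(0.5))  # → 100 (near: divide into smaller clusters)
print(min_points_for_distance(5.0))  # → 200 (far: keep clusters larger)
```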
The division condition C1 may also be set by combining these conditions. When combining them, a logical expression, a weighted sum, or the like may be used.
[Modification 1-2]
In the above embodiment, the clustering condition C2 includes a condition that the point count N is at least a predetermined number, but this is not a limitation. For example, the clustering condition C2 may thus include a condition on the number of points P contained in the point cloud PC. The clustering condition C2 may also include a condition on the size of the bounding box B enclosing the point cloud PC, a condition on the density of the points P in the point cloud PC, or a condition on the shape of the point cloud PC. The clustering condition C2 may also be varied according to the distance between the point cloud PC and the image sensors 11L and 11R of the robot 100. Further, the clustering condition C2 may be set by combining these conditions.
[Modification 1-3]
In the above embodiment, when generating two-dimensional map data MAP in a two-dimensional coordinate space corresponding to the horizontal plane, the division determination unit 21 sets the bounding box B in this two-dimensional coordinate space, but this is not a limitation. For example, the two-dimensional coordinate space of the map data MAP and the two-dimensional coordinate space in which the bounding box B for the division condition C1 is set may differ from each other. Specifically, the two-dimensional coordinate space in which the bounding box B for the division condition C1 is set may be a coordinate space rotated by a predetermined angle (for example, 45 degrees) from the two-dimensional coordinate space of the map data MAP. When the clustering condition C2 includes a condition on the size of the bounding box B, the two-dimensional coordinate space in which the bounding box B for this clustering condition C2 is set may be the same as the two-dimensional coordinate space of the map data MAP. Here, the object recognition unit 13 corresponds to a specific example of the "generation unit" in the present disclosure.
Similarly, in the above embodiment, when generating three-dimensional map data MAP in a three-dimensional coordinate space, the division determination unit 21 sets the bounding box B in this three-dimensional coordinate space, but this is not a limitation. Instead, the three-dimensional coordinate space of the map data MAP and the three-dimensional coordinate space in which the bounding box B for the division condition C1 is set may differ from each other. Specifically, the three-dimensional coordinate space in which the bounding box B for the division condition C1 is set may be a coordinate space rotated, within the horizontal plane, by a predetermined angle (for example, 45 degrees) from the three-dimensional coordinate space of the map data MAP. When the clustering condition C2 includes a condition on the size of the bounding box B, the three-dimensional coordinate space in which the bounding box B for this clustering condition C2 is set may be the same as the three-dimensional coordinate space of the map data MAP.
[Modification 1-4]
In the above embodiment, the division processing unit 22 sets, as the division direction, the axis along which the longest side of the bounding box B extends, but this is not a limitation. Instead, for example, the division processing unit 22 may set all axis directions as division directions.
For example, in the case of a two-dimensional coordinate space, the division processing unit 22 may set both the X-axis and Y-axis directions as division directions. In this case, the division processing unit 22 sets the center position of the bounding box B in the X-axis direction as the division position in the X-axis direction, and sets the center position of the bounding box B in the Y-axis direction as the division position in the Y-axis direction. As a result, the division processing unit 22 divides one point cloud PC into four point clouds PC.
Also, for example, in the case of a three-dimensional coordinate space, the division processing unit 22 may set the X-axis, Y-axis, and Z-axis directions as division directions. In this case, the division processing unit 22 sets the center position of the bounding box B in the X-axis direction as the division position in the X-axis direction, the center position of the bounding box B in the Y-axis direction as the division position in the Y-axis direction, and the center position of the bounding box B in the Z-axis direction as the division position in the Z-axis direction. As a result, the division processing unit 22 divides one point cloud PC into eight point clouds PC.
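Splitting along every axis at once, as in this modification, partitions a cloud into up to 2^D children (4 in 2-D, 8 in 3-D), in the manner of a quadtree or octree. A minimal sketch (the helper name is our own):

```python
import numpy as np

def split_all_axes(points):
    """Divide a point cloud along every axis at the bounding-box center,
    yielding up to 2**D child clouds (4 in 2-D, 8 in 3-D)."""
    pts = np.asarray(points, dtype=float)
    mid = (pts.min(axis=0) + pts.max(axis=0)) / 2.0  # box center per axis
    # Encode each point's cell as a bit pattern, one bit per axis.
    codes = (pts >= mid).astype(int).dot(1 << np.arange(pts.shape[1]))
    return [pts[codes == c] for c in np.unique(codes)]

# Four corner points of a square fall into four different cells
children = split_all_axes([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
print(len(children))  # → 4
```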
[Modification 1-5]
In the above embodiment, the division processing unit 22 sets the center position of the bounding box B in the division direction as the division position, but this is not a limitation. Instead, for example, the division processing unit 22 may set, as the division position, the average position of the plurality of points P contained in the point cloud PC along the division direction.
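This variant replaces the bounding-box center with the mean coordinate of the points along the split axis, which adapts the division position to where the points actually lie. A minimal sketch (the helper name is our own):

```python
import numpy as np

def split_at_mean(points, axis):
    """Divide a point cloud at the average position of its points along
    the given axis, instead of at the bounding-box center."""
    pts = np.asarray(points, dtype=float)
    pos = pts[:, axis].mean()  # division position: mean along the axis
    mask = pts[:, axis] < pos
    return pts[mask], pts[~mask]

# An outlier at x = 10 pulls the division position to x = 11/3, so the
# two nearby points end up on one side and the outlier on the other
left, right = split_at_mean([[0.0, 0.0], [1.0, 0.0], [10.0, 0.0]], axis=0)
print(len(left), len(right))  # → 2 1
```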
[Other Modifications]
Two or more of these modifications may also be combined.
<2. Second Embodiment>
Next, the robot 200 according to the second embodiment will be described. This embodiment includes more sensors than the first embodiment. Components substantially identical to those of the robot 100 according to the first embodiment are denoted by the same reference numerals, and their description is omitted as appropriate.
FIG. 5 shows a configuration example of the robot 200. The robot 200 includes image sensors 11L and 11R, a depth estimation unit 12, a plane estimation unit 20, image sensors 31L and 31R, a depth estimation unit 32, a plane estimation unit 40, a ToF (Time of Flight) sensor 51, a signal processing unit 52, a plane estimation unit 60, an integration unit 19, an object recognition unit 13, a self-position estimation unit 14, an action plan determination unit 15, an actuator 16, and a moving mechanism 17. For example, the plane estimation units 20, 40, and 60, the integration unit 19, the object recognition unit 13, the self-position estimation unit 14, and the action plan determination unit 15 can be configured using a processor that performs processing based on a program. The robot 200 thus includes three sensor systems: the image sensors 11L and 11R, the image sensors 31L and 31R, and the ToF sensor 51.
In the robot 200, the image sensor 11L generates a left image PL1, the image sensor 11R generates a right image PR1, the depth estimation unit 12 generates point cloud data DPC1 based on the left image PL1 and the right image PR1, and the plane estimation unit 20 generates cluster data DC1 based on the point cloud data DPC1.
Like the image sensors 11L and 11R, the image sensors 31L and 31R form a so-called stereo camera, and are configured to generate a pair of images having parallax with each other (a left image PL2 and a right image PR2) by imaging, for example, the area in front of the robot 200. The angle of view of the image sensors 31L and 31R may be the same as or different from that of the image sensors 11L and 11R.
Like the depth estimation unit 12, the depth estimation unit 32 is configured to generate a depth map by estimating depth based on the left image PL2 and the right image PR2, and to generate, based on this depth map, point cloud data DPC2 including data about a point cloud PC.
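The disclosure does not specify the internals of the depth estimation units 12 and 32. As a purely illustrative sketch, assuming a rectified stereo pair and a pinhole camera model, a disparity-based depth map could be back-projected into point cloud data roughly as follows; all function and parameter names here are hypothetical and not part of the disclosure:

```python
import numpy as np

def disparity_to_point_cloud(disparity, focal_px, baseline_m, cx, cy):
    """Back-project a disparity map into an (N, 3) point cloud (pinhole model).

    Pixels with zero disparity (no stereo match) are discarded.
    """
    v, u = np.nonzero(disparity > 0)      # pixel coordinates of valid matches
    d = disparity[v, u].astype(float)
    z = focal_px * baseline_m / d         # stereo depth equation: z = f * B / d
    x = (u - cx) * z / focal_px           # back-project through the pinhole model
    y = (v - cy) * z / focal_px
    return np.stack([x, y, z], axis=1)    # one row per 3-D point

# A uniform 2-px disparity with f = 100 px and B = 0.1 m gives z = 5 m everywhere.
cloud = disparity_to_point_cloud(np.full((4, 4), 2.0), 100.0, 0.1, cx=2.0, cy=2.0)
```

The resulting (N, 3) array plays the role of the point cloud data DPC1 or DPC2 in the description above.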
Like the plane estimation unit 20, the plane estimation unit 40 is configured to generate a group of micro-planes by clustering the point cloud PC based on the point cloud data DPC2, and to generate cluster data DC2 including a plurality of clusters each corresponding to a micro-plane. Like the plane estimation unit 20, the plane estimation unit 40 includes a division determination unit 41, a division processing unit 42, a clustering determination unit 43, and a clustering processing unit 44.
The ToF sensor 51 is a distance-measuring sensor, and is configured to measure the distance to an object, for example, in front of the robot 200.
The signal processing unit 52 is configured to generate point cloud data DPC3 including data about the point cloud PC based on the measurement result of the ToF sensor 51.
Like the plane estimation unit 20, the plane estimation unit 60 is configured to generate a group of micro-planes by clustering the point cloud PC based on the point cloud data DPC3, and to generate cluster data DC3 including a plurality of clusters each corresponding to a micro-plane. Like the plane estimation unit 20, the plane estimation unit 60 includes a division determination unit 61, a division processing unit 62, a clustering determination unit 63, and a clustering processing unit 64.
The integration unit 19 is configured to generate cluster data DC by integrating the cluster data DC1, the cluster data DC2, and the cluster data DC3, and supplies this cluster data DC to the object recognition unit 13.
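The integration policy of the integration unit 19 is left open by the disclosure. As one minimal sketch, assuming each cluster is an (N, 3) point array and all sensors already share a common coordinate frame, integration could be a plain concatenation of the per-sensor cluster sets; names here are assumptions for illustration:

```python
import numpy as np

def integrate_cluster_data(*cluster_sets):
    """Merge per-sensor cluster sets into a single cluster set.

    A real integration stage would also align coordinate frames and merge
    overlapping micro-planes; this sketch only concatenates the sets.
    """
    merged = []
    for clusters in cluster_sets:
        merged.extend(clusters)
    return merged

dc1 = [np.zeros((5, 3))]                  # one cluster from the 11L/11R stream
dc2 = [np.ones((3, 3)), np.ones((2, 3))]  # two clusters from the 31L/31R stream
dc3 = []                                  # the ToF stream yielded none this frame
dc = integrate_cluster_data(dc1, dc2, dc3)
```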
Here, the image sensors 31L and 31R correspond to a specific example of the "second sensor" in the present disclosure. The point cloud data DPC1 corresponds to a specific example of the "first point cloud data" in the present disclosure. The point cloud data DPC2 corresponds to a specific example of the "second point cloud data" in the present disclosure. The division determination unit 41 corresponds to a specific example of the "second division determination unit" in the present disclosure. The division processing unit 42 corresponds to a specific example of the "second division processing unit" in the present disclosure. The clustering determination unit 43 corresponds to a specific example of the "second clustering determination unit" in the present disclosure. The clustering processing unit 44 corresponds to a specific example of the "second clustering processing unit" in the present disclosure. The integration unit 19 corresponds to a specific example of the "integration unit" in the present disclosure.
In this way, the robot 200 is provided with additional sensors in addition to the image sensors 11L and 11R; specifically, in this example, the image sensors 31L and 31R and the ToF sensor 51 are provided. As a result, even when the images captured by the image sensors 11L and 11R are unclear due to, for example, light reflection, an accurate position of the point cloud PC can be obtained as long as the images captured by the image sensors 31L and 31R are clear. Likewise, when it is difficult for the image sensors 11L and 11R to detect the position of, for example, a transparent object, the ToF sensor 51 can detect the position of that object accurately, so an accurate position of the point cloud PC can be obtained. The robot 200 can also, for example, improve the accuracy of the map data MAP. Since an accurate point cloud PC can thus be obtained, the robot 200 can improve the accuracy of clustering.
Further, since the robot 200 clusters the point cloud PC based on each of the three sets of point cloud data DPC1 to DPC3, the processing amount is large and the processing time becomes long. Even in this case, however, the processing time can be shorter than when the point cloud PC is clustered by the neighborhood point search described in Non-Patent Document 1.
As described above, in the present embodiment, more sensors are provided, so the accuracy of clustering can be improved. The other effects are the same as those of the first embodiment.
[Modification 2-1]
In the above embodiment, the integration unit 19 is provided downstream of the three plane estimation units 20, 40, and 60, but this is not limiting; for example, as in the robot 200A shown in FIG. 6, a single plane estimation unit 20 may be provided downstream of an integration unit. The robot 200A includes an integration unit 19A. The integration unit 19A is configured to generate point cloud data DPC by integrating the three sets of point cloud data DPC1, DPC2, and DPC3. The plane estimation unit 20 generates the cluster data DC by clustering the point cloud PC based on this point cloud data DPC. Here, the integration unit 19A corresponds to a specific example of the "integration unit" in the present disclosure. With this configuration, the robot 200A requires only one plane estimation unit 20, so the configuration can be simplified.
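Under this modification, raw point clouds are merged before a single plane-estimation pass. A minimal sketch, assuming each point cloud is an (N, 3) array already expressed in a common frame (the function and variable names are hypothetical):

```python
import numpy as np

def integrate_point_clouds(*clouds):
    """Stack raw per-sensor point clouds into one cloud (sketch of unit 19A)."""
    nonempty = [c for c in clouds if len(c) > 0]
    return np.concatenate(nonempty, axis=0)

dpc1 = np.random.rand(100, 3)   # stand-ins for DPC1, DPC2, DPC3
dpc2 = np.random.rand(50, 3)
dpc3 = np.random.rand(25, 3)
dpc = integrate_point_clouds(dpc1, dpc2, dpc3)  # one (175, 3) cloud for unit 20
```

A single downstream plane estimator then only ever sees one merged cloud, which is the simplification this modification describes.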
[Modification 2-2]
In the above embodiment, a single integration unit 19 is provided, but this is not limiting; a plurality of integration units may be provided, as in the robot 200B shown in FIG. 7. The robot 200B includes a ToF sensor 71, a signal processing unit 72, an integration unit 19B, a plane estimation unit 80, and an integration unit 19C. The robot 200B includes three sensor systems: the image sensors 11L and 11R, the ToF sensor 51, and the ToF sensor 71. Like the ToF sensor 51, the ToF sensor 71 is a distance-measuring sensor, and is configured to measure the distance to an object, for example, in front of the robot 200B. Like the signal processing unit 52, the signal processing unit 72 is configured to generate point cloud data DPC4 including data about the point cloud PC based on the measurement result of the ToF sensor 71. The integration unit 19B is configured to generate point cloud data DPC5 by integrating the point cloud data DPC3 and the point cloud data DPC4. Like the plane estimation unit 20, the plane estimation unit 80 is configured to generate a group of micro-planes by clustering the point cloud PC based on the point cloud data DPC5, and to generate cluster data DC4 including a plurality of clusters each corresponding to a micro-plane. Like the plane estimation unit 20, the plane estimation unit 80 includes a division determination unit 81, a division processing unit 82, a clustering determination unit 83, and a clustering processing unit 84. The integration unit 19C is configured to generate the cluster data DC by integrating the cluster data DC1 and the cluster data DC4. Here, the ToF sensor 51 corresponds to a specific example of the "second sensor" in the present disclosure. The point cloud data DPC3 corresponds to a specific example of the "second point cloud data" in the present disclosure. The ToF sensor 71 corresponds to a specific example of the "third sensor" in the present disclosure. The point cloud data DPC4 corresponds to a specific example of the "third point cloud data" in the present disclosure. The integration unit 19B corresponds to a specific example of the "first integration unit" in the present disclosure. The division determination unit 81 corresponds to a specific example of the "second division determination unit" in the present disclosure. The division processing unit 82 corresponds to a specific example of the "second division processing unit" in the present disclosure. The clustering determination unit 83 corresponds to a specific example of the "second clustering determination unit" in the present disclosure. The clustering processing unit 84 corresponds to a specific example of the "second clustering processing unit" in the present disclosure. The integration unit 19C corresponds to a specific example of the "second integration unit" in the present disclosure. The robot 200B is effective when, for example, the information obtained by the ToF sensor 51 and the information obtained by the ToF sensor 71 are easy to integrate with each other, while the information obtained by the two ToF sensors 51 and 71 and the information obtained by the image sensors 11L and 11R are difficult to integrate.
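This modification mixes the two integration levels: point-level merging of the two ToF clouds before one plane-estimation pass, and cluster-level merging of the resulting cluster sets. A non-authoritative sketch, assuming point clouds are (N, 3) arrays and cluster sets are lists of such arrays (the names mirror the reference numerals but are otherwise assumptions):

```python
import numpy as np

def integrate_points(*clouds):
    """Point-level merge (sketch of unit 19B): stack raw ToF clouds."""
    return np.concatenate(clouds, axis=0)

def integrate_clusters(*cluster_sets):
    """Cluster-level merge (sketch of unit 19C): concatenate cluster lists."""
    merged = []
    for clusters in cluster_sets:
        merged.extend(clusters)
    return merged

dpc3, dpc4 = np.ones((10, 3)), np.ones((7, 3))   # the two ToF clouds
dpc5 = integrate_points(dpc3, dpc4)              # fed to plane estimation unit 80
dc1 = [np.zeros((4, 3))]                         # clusters from the image stream
dc4 = [np.ones((6, 3)), np.ones((3, 3))]         # clusters from the merged ToF cloud
dc = integrate_clusters(dc1, dc4)                # final cluster data DC
```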
[Modification 2-3]
In the above embodiment, the three plane estimation units 20, 40, and 60 are provided. The division condition C1 in these three plane estimation units 20, 40, and 60 may be the same as or different from one another. Similarly, the clustering condition C2 in these three plane estimation units 20, 40, and 60 may be the same as or different from one another.
[Modification 2-4]
Each modification of the first embodiment described above may be applied to the robot 200 according to the above embodiment.
Although the present technology has been described above with reference to several embodiments and modifications, the present technology is not limited to these embodiments and the like, and various modifications are possible.
For example, in the first embodiment described above, the image sensors 11L and 11R are provided, but a ToF sensor may be provided instead. Further, the configuration of the three sensor systems in the second embodiment may be changed as appropriate.
Note that the effects described in this specification are merely examples and are not limiting; other effects may also be obtained.
Note that the present technology can have the following configurations. According to the present technology having the following configurations, the processing time can be shortened.
(1) An information processing device including: a first division determination unit that determines, based on a division condition, whether or not to divide a point cloud included in first point cloud data supplied from a first sensor; and a first division processing unit that, when the first division determination unit determines that the point cloud satisfies the division condition, divides the point cloud included in the first point cloud data until the division condition is no longer satisfied.
(2) The information processing device according to (1), in which the division condition includes one or more of a condition on the number of points included in the point cloud, a condition on the size of a region surrounding the point cloud, a condition on the density of points in the point cloud, and a condition on the shape of the point cloud.
(3) The information processing device according to (1) or (2), in which the division condition is changeable according to the distance between the point cloud and the first sensor.
(4) The information processing device according to any one of (1) to (3), in which the region surrounding the point cloud is arranged in a coordinate space indicated by a plurality of coordinate axes, and the first division processing unit sets a division direction based on the axial direction, among the axial directions of the plurality of coordinate axes, along which the region surrounding the point cloud is longest, sets a division position in the set division direction, and divides the point cloud at the division position.
(5) The information processing device according to (4), in which the first division processing unit sets, as the division position, the center position of the region surrounding the point cloud in the division direction.
(6) The information processing device according to (4), in which the first division processing unit sets, as the division position, the average position of a plurality of points included in the point cloud in the division direction.
(7) The information processing device according to any one of (4) to (6), further including a generation unit that generates map data based on the point cloud processed by the first division processing unit, in which the plurality of coordinate axes are obtained by rotating, in a predetermined direction, a plurality of map coordinate axes used in the coordinate space of the map data.
(8) The information processing device according to any one of (1) to (3), in which the region surrounding the point cloud is arranged in a coordinate space indicated by a plurality of coordinate axes, and the first division processing unit sets a division position in each of the axial directions of the plurality of coordinate axes and divides the point cloud at the plurality of division positions.
(9) The information processing device according to any one of (1) to (8), further including a first clustering determination unit that, when the first division determination unit determines that the point cloud does not satisfy the division condition, determines, based on a clustering condition, whether or not to cluster the point cloud processed by the first division processing unit.
(10) The information processing device according to (9), in which the clustering condition includes one or more of a condition on the number of points included in the point cloud, a condition on the size of a region surrounding the point cloud, a condition on the density of points in the point cloud, and a condition on the shape of the point cloud.
(11) The information processing device according to (9) or (10), in which the clustering condition is changeable according to the distance between the point cloud and the first sensor.
(12) The information processing device according to any one of (9) to (11), further including a first clustering processing unit that, when the first clustering determination unit determines that the point cloud satisfies the clustering condition, clusters the point cloud processed by the first division processing unit.
(13) The information processing device according to any one of (1) to (8), further including: a second division determination unit that determines, based on the division condition, whether or not to divide a point cloud included in second point cloud data supplied from a second sensor; a second division processing unit that, when the second division determination unit determines that the point cloud satisfies the division condition, divides the point cloud included in the second point cloud data until the division condition is no longer satisfied; and an integration unit that integrates a processing result of the first division processing unit and a processing result of the second division processing unit.
(14) The information processing device according to (13), including: a first clustering determination unit that, when the first division determination unit determines that the point cloud does not satisfy the division condition, determines, based on a clustering condition, whether or not to cluster the point cloud processed by the first division processing unit; a first clustering processing unit that, when the first clustering determination unit determines that the point cloud satisfies the clustering condition, clusters the point cloud processed by the first division processing unit; a second clustering determination unit that, when the second division determination unit determines that the point cloud does not satisfy the division condition, determines, based on the clustering condition, whether or not to cluster the point cloud processed by the second division processing unit; and a second clustering processing unit that, when the second clustering determination unit determines that the point cloud satisfies the clustering condition, clusters the point cloud processed by the second division processing unit, in which the integration unit integrates a processing result of the first clustering processing unit and a processing result of the second clustering processing unit.
(15) The information processing device according to any one of (1) to (8), further including an integration unit that integrates the first point cloud data supplied from the first sensor and second point cloud data supplied from a second sensor, in which the first division determination unit determines, based on the division condition, whether or not to divide the point cloud included in the first point cloud data and the second point cloud data integrated by the integration unit, and the first division processing unit, when the first division determination unit determines that the point cloud satisfies the division condition, divides the point cloud included in the first point cloud data and the second point cloud data until the division condition is no longer satisfied.
(16) The information processing device according to any one of (1) to (8), further including: a first integration unit that integrates second point cloud data supplied from a second sensor and third point cloud data supplied from a third sensor; a second division determination unit that determines, based on the division condition, whether or not to divide the point cloud included in the second point cloud data and the third point cloud data integrated by the first integration unit; a second division processing unit that, when the second division determination unit determines that the point cloud satisfies the division condition, divides the point cloud included in the second point cloud data until the division condition is no longer satisfied; and a second integration unit that integrates a processing result of the first division processing unit and a processing result of the second division processing unit.
(17) The information processing device according to (16), including: a first clustering determination unit that, when the first division determination unit determines that the point cloud does not satisfy the division condition, determines, based on a clustering condition, whether or not to cluster the point cloud processed by the first division processing unit; a first clustering processing unit that, when the first clustering determination unit determines that the point cloud satisfies the clustering condition, clusters the point cloud processed by the first division processing unit; a second clustering determination unit that, when the second division determination unit determines that the point cloud does not satisfy the division condition, determines, based on the clustering condition, whether or not to cluster the point cloud processed by the second division processing unit; and a second clustering processing unit that, when the second clustering determination unit determines that the point cloud satisfies the clustering condition, clusters the point cloud processed by the second division processing unit, in which the second integration unit integrates a processing result of the first clustering processing unit and a processing result of the second clustering processing unit.
(18) The information processing device according to any one of (1) to (17), in which the first sensor and the information processing device are provided on a mobile body.
(19) An information processing method including: first division determination processing of determining, based on a division condition, whether or not to divide a point cloud included in first point cloud data supplied from a first sensor; and first division processing of, when it is determined in the first division determination processing that the point cloud satisfies the division condition, dividing the point cloud included in the first point cloud data until the division condition is no longer satisfied.
(20) A program that causes a computer to execute: first division determination processing of determining, based on a division condition, whether or not to divide a point cloud included in first point cloud data supplied from a first sensor; and first division processing of, when it is determined in the first division determination processing that the point cloud satisfies the division condition, dividing the point cloud included in the first point cloud data until the division condition is no longer satisfied.
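Items (4) and (5) above describe splitting along the longest axis of the region surrounding the point cloud, at the center of that axis, until the division condition fails. As a non-authoritative sketch only, taking a simple point-count cap as the division condition of item (2) (the threshold, names, and recursion strategy are assumptions, not taken from the disclosure):

```python
import numpy as np

MAX_POINTS = 50  # assumed division condition: split while a cloud exceeds this count

def needs_division(points):
    """Division condition of item (2), here a cap on the number of points."""
    return len(points) > MAX_POINTS

def divide(points):
    """Items (4)-(5): split along the longest bounding-box axis at its center,
    recursing until the division condition is no longer satisfied."""
    if not needs_division(points):
        return [points]
    lo, hi = points.min(axis=0), points.max(axis=0)
    axis = int(np.argmax(hi - lo))            # longest axis of the surrounding region
    mid = (lo[axis] + hi[axis]) / 2.0         # center division position, item (5)
    left = points[points[:, axis] <= mid]
    right = points[points[:, axis] > mid]
    if len(left) == 0 or len(right) == 0:     # degenerate cloud: stop splitting
        return [points]
    return divide(left) + divide(right)

rng = np.random.default_rng(0)
pieces = divide(rng.random((500, 3)))         # every piece ends at or below the cap
```

Replacing the `mid` computation with the per-axis mean of the points would give the average-position variant of item (6).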
This application claims priority based on Japanese Patent Application No. 2019-231406 filed with the Japan Patent Office on December 23, 2019, the entire contents of which are incorporated herein by reference.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations, and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.

Claims (20)

  1.  第1のセンサから供給された第1の点群データに含まれる点群の分割を行うかどうかを分割条件に基づいて判定する第1の分割判定部と、
     前記第1の分割判定部において前記点群が前記分割条件を満たすと判定された場合に、前記分割条件を満たさなくなるまで前記第1の点群データに含まれる前記点群の分割を行う第1の分割処理部と
     を備えた情報処理装置。
    A first division determination unit that determines whether or not to divide the point cloud included in the first point cloud data supplied from the first sensor based on the division conditions.
    When the first division determination unit determines that the point cloud satisfies the division condition, the first division of the point cloud included in the first point cloud data is performed until the division condition is not satisfied. An information processing device equipped with a partition processing unit.
  2.  前記分割条件は、前記点群に含まれる点の数についての条件、前記点群を囲む領域のサイズについての条件、前記点群における点の密度についての条件、前記点群の形状についての条件のうちの1以上を含む
     請求項1に記載の情報処理装置。
    The division condition includes a condition for the number of points included in the point cloud, a condition for the size of the area surrounding the point cloud, a condition for the density of points in the point cloud, and a condition for the shape of the point cloud. The information processing apparatus according to claim 1, which includes one or more of them.
  3.  前記分割条件は、前記点群と前記第1のセンサとの間の距離に応じて変更可能である
     請求項1に記載の情報処理装置。
    The information processing apparatus according to claim 1, wherein the division condition can be changed according to a distance between the point cloud and the first sensor.
  4.  前記点群を囲む領域は、複数の座標軸により示される座標空間に配置され、
     前記第1の分割処理部は、前記複数の座標軸の軸方向のうちの、前記点群を囲む領域の長さが最も長い軸方向に基づいて分割方向を設定し、設定された前記分割方向において分割位置を設定し、前記分割位置で前記点群を分割する
     請求項1に記載の情報処理装置。
    The area surrounding the point cloud is arranged in a coordinate space indicated by a plurality of coordinate axes.
    The first division processing unit sets the division direction based on the axial direction in which the length of the region surrounding the point cloud is the longest among the axial directions of the plurality of coordinate axes, and in the set division direction. The information processing apparatus according to claim 1, wherein a division position is set and the point cloud is divided at the division position.
  5.  前記第1の分割処理部は、前記分割方向における、前記点群を囲む領域の中心位置を、前記分割位置として設定する
     請求項4に記載の情報処理装置。
    The information processing apparatus according to claim 4, wherein the first division processing unit sets the center position of the region surrounding the point cloud in the division direction as the division position.
  6.  前記第1の分割処理部は、前記分割方向における、前記点群に含まれる複数の点の平均位置を、前記分割位置として設定する
     請求項4に記載の情報処理装置。
    The information processing apparatus according to claim 4, wherein the first division processing unit sets the average position of a plurality of points included in the point cloud in the division direction as the division position.
  7.  The information processing apparatus according to claim 4, further comprising a generation unit that generates
     map data based on the point cloud processed by the first division processing unit,
     wherein the plurality of coordinate axes are obtained by rotating, in a predetermined direction, a plurality of
     map coordinate axes used in a coordinate space of the map data.
  8.  The information processing apparatus according to claim 1, wherein the region surrounding the point cloud is
     arranged in a coordinate space defined by a plurality of coordinate axes, and
     the first division processing unit sets a division position in each of the axial directions of the plurality of
     coordinate axes and divides the point cloud at the plurality of division positions.
  9.  The information processing apparatus according to claim 1, further comprising a first clustering determination
     unit that, when the first division determination unit determines that the point cloud does not satisfy the
     division condition, determines, based on a clustering condition, whether to perform clustering of the point
     cloud processed by the first division processing unit.
  10.  The information processing apparatus according to claim 9, wherein the clustering condition includes one or
     more of: a condition on the number of points included in the point cloud, a condition on the size of a region
     surrounding the point cloud, a condition on the density of points in the point cloud, and a condition on the
     shape of the point cloud.
  11.  The information processing apparatus according to claim 9, wherein the clustering condition is changeable in
     accordance with a distance between the point cloud and the first sensor.
  12.  The information processing apparatus according to claim 9, further comprising a first clustering processing
     unit that performs clustering of the point cloud processed by the first division processing unit when the first
     clustering determination unit determines that the point cloud satisfies the clustering condition.
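The overall pipeline of claims 1, 9, and 12 — divide recursively until the division condition no longer holds, then admit only sub-clouds that pass the clustering condition — can be sketched as below. The thresholds, names, and choice of a point-count condition for both stages are hypothetical simplifications.

```python
import numpy as np

def divide_then_cluster(points, max_points=4, min_cluster_points=2):
    """Recursively divide a point cloud while the (hypothetical) division
    condition holds, then keep only sub-clouds that satisfy the
    (hypothetical) clustering condition."""
    points = np.asarray(points, dtype=float)
    # division condition: too many points in this cloud
    if len(points) > max_points:
        lo, hi = points.min(axis=0), points.max(axis=0)
        axis = int(np.argmax(hi - lo))           # longest axis
        cut = (lo[axis] + hi[axis]) / 2.0        # bounding-box center
        mask = points[:, axis] <= cut
        left, right = points[mask], points[~mask]
        if len(left) and len(right):             # guard against degenerate splits
            return (divide_then_cluster(left, max_points, min_cluster_points)
                    + divide_then_cluster(right, max_points, min_cluster_points))
    # clustering condition: enough points to form a meaningful cluster
    return [points] if len(points) >= min_cluster_points else []
```

Because each sub-cloud is tested independently, the recursion replaces the global neighbor search of conventional clustering with cheap bounding-box checks, which is where the processing-time advantage described in the disclosure would come from.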
  13.  The information processing apparatus according to claim 1, further comprising:
     a second division determination unit that determines, based on the division condition, whether to divide a
     point cloud included in second point cloud data supplied from a second sensor;
     a second division processing unit that, when the second division determination unit determines that the point
     cloud satisfies the division condition, divides the point cloud included in the second point cloud data until
     the division condition is no longer satisfied; and
     an integration unit that integrates a processing result of the first division processing unit and a processing
     result of the second division processing unit.
  14.  The information processing apparatus according to claim 13, further comprising:
     a first clustering determination unit that, when the first division determination unit determines that the
     point cloud does not satisfy the division condition, determines, based on a clustering condition, whether to
     perform clustering of the point cloud processed by the first division processing unit;
     a first clustering processing unit that performs clustering of the point cloud processed by the first division
     processing unit when the first clustering determination unit determines that the point cloud satisfies the
     clustering condition;
     a second clustering determination unit that, when the second division determination unit determines that the
     point cloud does not satisfy the division condition, determines, based on the clustering condition, whether to
     perform clustering of the point cloud processed by the second division processing unit; and
     a second clustering processing unit that performs clustering of the point cloud processed by the second
     division processing unit when the second clustering determination unit determines that the point cloud
     satisfies the clustering condition,
     wherein the integration unit integrates a processing result of the first clustering processing unit and a
     processing result of the second clustering processing unit.
  15.  The information processing apparatus according to claim 1, further comprising an integration unit that
     integrates the first point cloud data supplied from the first sensor and second point cloud data supplied from
     a second sensor,
     wherein the first division determination unit determines, based on the division condition, whether to divide a
     point cloud included in the first point cloud data and the second point cloud data integrated by the
     integration unit, and
     the first division processing unit, when the first division determination unit determines that the point cloud
     satisfies the division condition, divides the point cloud included in the first point cloud data and the second
     point cloud data until the division condition is no longer satisfied.
  16.  The information processing apparatus according to claim 1, further comprising:
     a first integration unit that integrates second point cloud data supplied from a second sensor and third point
     cloud data supplied from a third sensor;
     a second division determination unit that determines, based on the division condition, whether to divide a
     point cloud included in the second point cloud data and the third point cloud data integrated by the first
     integration unit;
     a second division processing unit that, when the second division determination unit determines that the point
     cloud satisfies the division condition, divides the point cloud included in the second point cloud data until
     the division condition is no longer satisfied; and
     a second integration unit that integrates a processing result of the first division processing unit and a
     processing result of the second division processing unit.
  17.  The information processing apparatus according to claim 16, further comprising:
     a first clustering determination unit that, when the first division determination unit determines that the
     point cloud does not satisfy the division condition, determines, based on a clustering condition, whether to
     perform clustering of the point cloud processed by the first division processing unit;
     a first clustering processing unit that performs clustering of the point cloud processed by the first division
     processing unit when the first clustering determination unit determines that the point cloud satisfies the
     clustering condition;
     a second clustering determination unit that, when the second division determination unit determines that the
     point cloud does not satisfy the division condition, determines, based on the clustering condition, whether to
     perform clustering of the point cloud processed by the second division processing unit; and
     a second clustering processing unit that performs clustering of the point cloud processed by the second
     division processing unit when the second clustering determination unit determines that the point cloud
     satisfies the clustering condition,
     wherein the second integration unit integrates a processing result of the first clustering processing unit and
     a processing result of the second clustering processing unit.
  18.  The information processing apparatus according to claim 1, wherein the first sensor and the information
     processing apparatus are provided on a mobile body.
  19.  An information processing method including:
     a first division determination process of determining, based on a division condition, whether to divide a point
     cloud included in first point cloud data supplied from a first sensor; and
     a first division process of, when it is determined in the first division determination process that the point
     cloud satisfies the division condition, dividing the point cloud included in the first point cloud data until
     the division condition is no longer satisfied.
  20.  A program that causes a computer to execute:
     a first division determination process of determining, based on a division condition, whether to divide a point
     cloud included in first point cloud data supplied from a first sensor; and
     a first division process of, when it is determined in the first division determination process that the point
     cloud satisfies the division condition, dividing the point cloud included in the first point cloud data until
     the division condition is no longer satisfied.
PCT/JP2020/047053 2019-12-23 2020-12-16 Information processing device, information processing method, and program WO2021131990A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019231406A JP2021099689A (en) 2019-12-23 2019-12-23 Information processor, information processing method, and program
JP2019-231406 2019-12-23

Publications (1)

Publication Number Publication Date
WO2021131990A1 2021-07-01

Family

ID=76541946

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/047053 WO2021131990A1 (en) 2019-12-23 2020-12-16 Information processing device, information processing method, and program

Country Status (2)

Country Link
JP (1) JP2021099689A (en)
WO (1) WO2021131990A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023053723A1 (en) 2021-10-01 2023-04-06 ソニーグループ株式会社 Information processing device, information processing method, and moving body device

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024116817A1 (en) * 2022-11-30 2024-06-06 ソニーグループ株式会社 Information processing method, information processing device, and program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014169947A (en) * 2013-03-05 2014-09-18 Hitachi Ltd Shape inspection method and device thereof
WO2014155715A1 (en) * 2013-03-29 2014-10-02 株式会社日立製作所 Object recognition device, object recognition method, and program
JP2015099483A (en) * 2013-11-19 2015-05-28 キヤノン株式会社 Image processing apparatus, image processing apparatus control method, imaging device, and program
US20160012646A1 (en) * 2014-07-10 2016-01-14 Perfetch, Llc Systems and methods for constructing a three dimensional (3d) color representation of an object
JP2017126890A (en) * 2016-01-14 2017-07-20 キヤノン株式会社 Encoder and control method therefor

Also Published As

Publication number Publication date
JP2021099689A (en) 2021-07-01

Similar Documents

Publication Publication Date Title
US10948297B2 (en) Simultaneous location and mapping (SLAM) using dual event cameras
WO2020135446A1 (en) Target positioning method and device and unmanned aerial vehicle
EP2202672B1 (en) Information processing apparatus, information processing method, and computer program
EP2385496A1 (en) Extraction of 2D surfaces from a 3D point cloud
WO2021052403A1 (en) Obstacle information sensing method and device for mobile robot
US20100284572A1 (en) Systems and methods for extracting planar features, matching the planar features, and estimating motion from the planar features
KR20150144731A (en) Apparatus for recognizing location mobile robot using edge based refinement and method thereof
Karimi et al. LoLa-SLAM: low-latency LiDAR SLAM using continuous scan slicing
CN113985445A (en) 3D target detection algorithm based on data fusion of camera and laser radar
WO2021131990A1 (en) Information processing device, information processing method, and program
CN112070782B (en) Method, device, computer readable medium and electronic equipment for identifying scene contour
JP2010282615A (en) Object motion detection system based on combining 3d warping technique and proper object motion (pom) detection
Eynard et al. Real time UAV altitude, attitude and motion estimation from hybrid stereovision
JP2019504418A (en) Method and system for determining the position of a moving object
JP2015114954A (en) Photographing image analysis method
CN114766042A (en) Target detection method, device, terminal equipment and medium
US11504608B2 (en) 6DoF inside-out tracking game controller
EP3839817A2 (en) Generating and/or using training instances that include previously captured robot vision data and drivability labels
Asif et al. Real-time pose estimation of rigid objects using RGB-D imagery
CN114387462A (en) Dynamic environment sensing method based on binocular camera
US10134182B1 (en) Large scale dense mapping
CN113536959A (en) Dynamic obstacle detection method based on stereoscopic vision
CN113111787A (en) Target detection method, device, equipment and storage medium
Dubey et al. Droan-disparity-space representation for obstacle avoidance: Enabling wire mapping & avoidance
Saha et al. 3D LiDAR-based obstacle detection and tracking for autonomous navigation in dynamic environments

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20907481

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20907481

Country of ref document: EP

Kind code of ref document: A1