WO2022024602A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2022024602A1
Authority
WO
WIPO (PCT)
Prior art keywords
map
distance
point cloud
information processing
processor
Application number
PCT/JP2021/023757
Other languages
English (en)
Japanese (ja)
Inventor
幹夫 中井
Original Assignee
ソニーグループ株式会社
Application filed by ソニーグループ株式会社
Publication of WO2022024602A1

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02: Control of position or course in two dimensions

Definitions

  • This disclosure relates to information processing devices, information processing methods and programs.
  • Self-position estimation is executed by matching based on a map created in advance by SLAM (Simultaneous Localization and Mapping), or is executed in parallel with map creation by SLAM.
  • SLAM Simultaneous Localization and Mapping
  • the self-position is calculated by comparing the surrounding environment acquired by the sensor with the surrounding environment acquired in the past.
  • When the sensor is installed at an angle to the running surface, whether intentionally or through a mounting error, or when the obstacle measured by the sensor lies on an inclined surface, the measurement direction of the sensor and the measured surface (the running surface, an obstacle, or the like) are not parallel, and the observed depth deviates by an amount that depends on the distance to the obstacle. When such a phenomenon occurs, matching in self-position measurement is not performed properly, and the accuracy of self-position estimation deteriorates. In robot design, since this self-position estimation is used to judge obstacle avoidance, the distance measuring sensor is often aimed at the area near the feet, so such a situation can occur frequently.
  • an information processing device, an information processing method, and a program for reducing errors in self-position estimation are provided.
  • the information processing device includes a processor.
  • the processor generates at least two maps from different subsets of the point cloud data obtained from the distance measurement sensor mounted on the mobile device, based on at least one condition on the distance or direction to the distance measurement target indicated by the point cloud data.
  • the processor may execute a filtering process on the point cloud data based on the distance or direction, and generate the two or more maps based on the point cloud data subjected to the filtering process.
  • the processor may compare the point cloud data with a predetermined distance, generate a first map from the first point cloud data at distances closer than the predetermined distance, and generate a second map from the second point cloud data at distances farther than the predetermined distance.
  • the processor may generate a third map from the unfiltered point cloud data including both the first point cloud data and the second point cloud data.
  • the processor may correct the distortion of the first map and the distortion of the second map based on the third map.
  • the processor may extract the difference between the corrected first map and the corrected second map.
  • the processor may draw the extracted difference on the third map.
  • a display device which displays according to a request from the processor may be further provided; the processor may display the third map on which the difference is drawn on the display device and accept processing on the displayed third map from the user.
  • an input interface for inputting a request from the user may be further provided, and the processor may execute the process received via the input interface on the third map and update the third map.
  • the processor may perform self-position estimation of the mobile device based on the updated third map.
  • the processor may modify and update the third map based on the extracted differences.
  • the processor may at least integrate the self-position estimation result based on the first map and the self-position estimation result based on the second map to execute self-position estimation.
  • the processor may integrate the self-position estimation result based on the first map and the self-position estimation result based on the second map by using the extended Kalman filter.
  • the processor may set a distance at which the difference point can be detected as a label at a point on the third map corresponding to the difference point between the first map and the second map.
  • the processor may determine whether or not to perform self-position estimation based on the distance to the difference point and the label set at the difference point.
  • the information processing method is a method of generating the map described above by a processor.
  • the program causes the processor to execute the method of generating the map described above.
  • A block diagram of the information processing apparatus according to one embodiment; diagrams schematically showing the moving body and the distance ranges according to one embodiment.
  • A flowchart showing the operation of the information processing apparatus according to one embodiment.
  • A flowchart showing the operation of the information processing apparatus according to one embodiment.
  • A schematic diagram showing an example of the input interface according to one embodiment.
  • A flowchart showing the operation of the information processing apparatus according to one embodiment.
  • A diagram showing an example of the light beam region of the sensor according to one embodiment.
  • A diagram schematically showing the moving body and the angle ranges according to one embodiment.
  • A diagram showing an example of mounting positions of the information processing apparatus according to one embodiment.
  • A block diagram showing an example of the schematic configuration of a vehicle control system; an explanatory drawing showing an example of the installation positions of the vehicle exterior information detection units and the imaging units.
  • FIG. 1 is a block diagram showing a configuration of an information processing apparatus according to the first embodiment.
  • the information processing apparatus 1 includes a filter 100, a storage unit 102, a first generation unit 104, a second generation unit 106, a third generation unit 108, a correction unit 110, a comparison unit 112, a synthesis unit 114, an output unit 116, an input unit 118, and an update unit 120.
  • the information processing device 1 updates the map for measuring its own position based on the information acquired by the sensor 2, and outputs necessary information.
  • the sensor 2 is, for example, a distance measuring device such as LiDAR (Light Detection and Ranging), and outputs the acquired distance information to the information processing device 1.
  • the sensor 2 is attached to a moving body (moving device) and measures the distance to the environment around the moving body.
  • the filter 100 classifies the information acquired by the sensor 2 based on the distance information. For example, the filter 100 compares the threshold value with the distance information among the information acquired by the sensor 2, and classifies the cases according to the distance above the threshold value and the distance below the threshold value. For example, the filter 100 outputs the observation points whose distance is less than the threshold value to the first generation unit 104, and outputs the observation points whose distance is equal to or more than the threshold value to the second generation unit 106.
  • Alternatively, two threshold values may be used: point cloud data closer than a first threshold value for acquiring a short-distance point cloud is output to the first generation unit 104, and point cloud data farther than a second threshold value for acquiring a long-distance point cloud is output to the second generation unit 106.
  • In this case, the first threshold value ≤ the second threshold value, and the point cloud between the two thresholds may be output to neither the first generation unit 104 nor the second generation unit 106.
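  • As a non-limiting illustration (not part of the original disclosure), the distance-based filtering described above might look as follows in Python; the array layout, function name, and threshold values are assumptions made only for this sketch.

```python
import numpy as np

def split_by_distance(points, near_threshold, far_threshold):
    """Split point cloud data into near/far subsets by measured distance.

    points: (N, 3) array holding [x, y, measured_distance] per observation.
    near_threshold <= far_threshold; observations that fall between the two
    thresholds are assigned to neither map (the optional margin above).
    """
    assert near_threshold <= far_threshold
    d = points[:, 2]
    near_points = points[d < near_threshold]   # -> first generation unit (first map)
    far_points = points[d >= far_threshold]    # -> second generation unit (second map)
    return near_points, far_points

# With a single threshold, pass the same value twice so that every
# observation goes to exactly one of the two maps.
cloud = np.array([[0.1, 0.2, 0.5],
                  [1.0, 1.5, 2.4],
                  [3.0, 0.0, 6.1]])
near, far = split_by_distance(cloud, near_threshold=2.0, far_threshold=2.0)
print(len(near), len(far))  # 1 2
```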
  • FIG. 2 is a diagram showing an example of a moving body and a distance.
  • a range of distance is specified as shown in FIG. 2.
  • the filter 100 acquires an observation point closer to the threshold value as a point in the distance range 1 and outputs the observation point to the first generation unit 104.
  • the filter 100 acquires an observation point located at a position farther than the threshold value as a point in the distance range 2, and outputs the observation point to the second generation unit 106.
  • the limit distance is set for the distance range 2, but it is not limited to this.
  • the distance range 2 may be defined as an observable point that is simply a distance farther than the threshold value.
  • FIG. 3 is a diagram showing another example of a moving body and a distance. It shows a case where a sensor is provided on a moving body similar to that of FIG. 2.
  • the filter 100 may use a plurality of threshold values to filter observed points into three or more distance ranges. For example, the filter 100 may separate the observation points of distance range 1 from those of distance range 2 at the smaller threshold value, and separate the observation points of distance range 2 from those of distance range 3 at the larger threshold value.
  • filtering may be performed in two ranges using two threshold values.
  • the filter 100 may be in a form in which the observation point in the distance range 2 is not output to either the first generation unit 104 or the second generation unit 106.
  • the storage unit 102 stores the information necessary for the operation of the information processing device 1.
  • the information received from the sensor 2 may be stored in the storage unit 102.
  • Data such as map information acquired in advance and updated map information may be stored in the storage unit 102.
  • when information processing by software is concretely realized using hardware resources to execute the operation of the information processing apparatus 1, information such as the program required for this processing may be stored in the storage unit 102.
  • although the data connections to the storage unit 102 are omitted in the drawings, the storage unit 102 is appropriately accessed from each component as needed.
  • the first generation unit 104 creates a map using the data classified by the filter 100. For example, when the filter 100 outputs data about the point cloud closer than the threshold value to the first generation unit 104 as described above, the first generation unit 104 generates a short-distance map (first map). The created map may be stored in the storage unit 102.
  • the second generation unit 106 creates a map using the data classified by the filter 100. For example, when the filter 100 outputs data about the point cloud farther than the threshold value to the second generation unit 106 as described above, the second generation unit 106 generates a long-distance map (second map). The created map may be stored in the storage unit 102.
  • the third generation unit 108 generates an entire map (third map) using all the data acquired from the sensor 2.
  • each generation unit may create its respective map by parallel processing.
  • for example, the first generation unit 104, the second generation unit 106, and the third generation unit 108 execute self-position estimation and map creation at the same timing on the same processor or on different processors.
  • alternatively, the three generation units may execute serial processing.
  • for example, after the first map of the first generation unit 104 is generated, the second map of the second generation unit 106 may be generated, and then the processing of the third generation unit 108 may be executed. This operation may be performed by changing the point cloud data given to the same function.
  • alternatively, the processes of the first generation unit 104 and the second generation unit 106 may be executed first, and the third generation unit 108 may generate the third map using their execution results or numerical values calculated during their execution.
  • conversely, the process of the third generation unit 108 may be executed first, and the first generation unit 104 and the second generation unit 106 may execute their processes using this result or intermediate values.
  • the correction unit 110 corrects the distortion of the first map and the second map based on the third map. This distortion correction may be performed by a general method.
  • the correction unit 110 extracts, for example, the positions where the moving body existed at the same times from the first map, the second map, and the third map at a plurality of times, and corrects the distortion of the first map and the second map so that they match the third map based on the extracted positions.
  • the comparison unit 112 compares the distortion-corrected first map with the second map. By comparing the first map and the second map, the comparison unit 112 extracts, for example, points, places, areas, and the like where the map differs between the result measured at a short distance and the result measured at a long distance.
  • the synthesis unit 114 synthesizes the difference points, difference areas, etc. extracted by the comparison unit 112 on the third map.
  • the output unit 116 outputs the third map in which the difference points and the difference areas are combined.
  • the output unit 116 includes, for example, a display for displaying, and displays a third map in which the difference points and the difference areas are combined on this display.
  • the input unit 118 accepts input from the user. For example, after the output unit 116 outputs the third map via the display, it waits for input from the user. Then, when the user instructs the process, the input unit 118 accepts this instruction.
  • the update unit 120 executes processing on the synthesized third map based on the processing received via the input unit 118, and updates the third map. Then, based on this updated map, for example, the moving body makes an autonomous movement.
  • the first generation unit 104, the second generation unit 106, and the third generation unit 108 may improve the accuracy of the map by inputting from another sensor 3 in addition to the input from the sensor 2.
  • as another sensor 3, for example, an inertial sensor (IMU: Inertial Measurement Unit), a wheel encoder, or the like may be assumed.
  • the information processing apparatus 1 may be provided with a sensor 2 for acquiring at least information including distance information among the information required for generating a map, but may be provided with other sensors 3.
  • the sensor 2 may be a sensor that acquires only distance information.
  • the sensor 2 and the sensor 3 may work together to add the distance information to the information acquired by the sensor 3.
  • each generation unit may execute the generation of the map.
  • the information processing apparatus 1 may be implemented as long as it can form a map from the information acquired in situations where the distances are different.
  • based on the information acquired by the sensor 2 and the sensor 3, each generation unit generates an environmental map together with self-position estimation by, for example, a SLAM (Simultaneous Localization and Mapping) method.
  • the third map as a whole may be composed of, for example, all the acquired information.
  • each of the above components may be implemented by a processor such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit). Further, at least a part of them may be implemented by an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or the like; an analog circuit or a digital circuit such as an ASIC or FPGA is also referred to as a processor in the following. That is, all components may be realized by a processor.
  • the relationship between the mobile body and the information processing device 1 is not drawn, but this may be in any form.
  • At least the sensor 2 (and the sensor 3) needs to be mounted on the moving body, but other configurations can be arbitrarily arranged.
  • the information processing device 1 is provided inside the mobile body, and the output unit may output via a wired or wireless network.
  • the information processing device 1 may be provided in the housing.
  • the housing may be a server or the like provided via the Internet. It is also possible to use a server or the like existing in the cloud.
  • some of the components may be mounted on the mobile body, and other components may be mounted on another server or the like. In this way, each component can be arbitrarily arranged in a moving body or other housing.
  • FIG. 4 is a flowchart showing the overall processing according to the present embodiment.
  • the information processing device 1 acquires advance map data based on the information acquired by the sensor 2 (and the sensor 3) (S10).
  • the prior map data is map data generated by, for example, a moving body traveling autonomously or remotely controlled in advance to acquire various environmental information. During this run, self-position estimation such as SLAM and environmental map generation are executed.
  • the surrounding environment information is acquired by a sensor 2 or the like.
  • the environmental information is acquired as point data, for example, depending on the position of the moving object and the direction of the sensor 2.
  • the moving body acquires distance information at a predetermined angle as point data based on the position of the moving body and the angle of the sensor 2 while moving.
  • the point data is acquired as the position of the moving body, the measured distance, and the direction information of the sensor 2.
  • in this way, point cloud data, which is the data of a plurality of points, is acquired. There may be a plurality of these predetermined angles depending on the performance of the sensor 2.
  • the sensor 2 transmits the acquired data and the distance information to the information processing device 1.
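  • As a non-limiting sketch (not from the disclosure itself), the conversion of a measured distance and sensor direction at a given position of the moving body into world-frame point data might look as follows; the function name and coordinate conventions are assumptions.

```python
import math

def measurement_to_world(pose, ranges_and_bearings):
    """Convert range/bearing measurements taken at one robot pose into
    2D points in the world frame.

    pose: (x, y, yaw) of the moving body in the world frame.
    ranges_and_bearings: list of (distance, angle) pairs, where angle is the
    beam direction relative to the robot heading.
    The measured distance is kept with each point so that a later stage
    can still filter by it.
    """
    x, y, yaw = pose
    points = []
    for distance, angle in ranges_and_bearings:
        theta = yaw + angle
        px = x + distance * math.cos(theta)
        py = y + distance * math.sin(theta)
        points.append((px, py, distance))
    return points

# Example: robot at the origin facing +x, two beams at +/-30 degrees.
print(measurement_to_world((0.0, 0.0, 0.0),
                           [(2.0, math.radians(30)), (2.0, math.radians(-30))]))
```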
  • FIG. 5 is a diagram schematically showing how the moving body acquires point cloud data by the sensor 2.
  • the sensor 2 is provided at the tip of the moving body.
  • the arrow indicates the direction of travel of the moving object.
  • the moving body measures a distance in a plurality of directions at a certain position, and acquires point cloud data which is a set of point data such as obstacles.
  • the points P1, P2, P3, and P4 are acquired as the positions of the wall W1
  • the data of the points P5 and P6 are acquired as the positions of the wall W2 together with the distance data.
  • the information processing apparatus 1 acquires one or a plurality of point cloud data at each position via the sensor 2.
  • FIG. 6 shows a state in which the time has advanced from the state of FIG.
  • the sensor 2 mounted on the moving body acquires the points P'1, P'2, P'3, P'4, and P'5 as positions of the wall W1 and the point P'6 as a position of the wall W2, as point cloud data.
  • the moving body acquires point data at each position in this way while moving, and the information processing apparatus 1 acquires point cloud data as a whole.
  • the information processing device 1 executes SLAM (self-position estimation and map generation) to acquire pre-map data while the moving object acquires point cloud data in the area where it moves automatically / autonomously.
  • FIG. 7 is a flowchart showing the process of map generation (S10) according to the embodiment.
  • the information processing device 1 acquires point cloud data from the sensor 2 (S100).
  • the point cloud data is, for example, data acquired by a sensor 2 mounted on a moving body, and is data indicating a distance and a direction with respect to a point.
  • the point cloud data may be stored in the storage unit 102.
  • the filter 100 filters the point cloud data by comparing it with a predetermined distance (S102).
  • in the example of FIG. 5, the points P1, P2, P5, and P6 are observed at distances shorter than the predetermined distance shown in the figure, so they are filtered as point cloud data (first point cloud data) for generating the short-distance map (first map). On the other hand, since the points P3 and P4 are observed at distances longer than the predetermined distance, they are filtered as point cloud data (second point cloud data) for generating the long-distance map (second map).
  • in the example of FIG. 6, the points P'1, P'2, P'3, P'4, and P'6 are observed at distances shorter than the predetermined distance, so they are filtered as point cloud data for generating the first map.
  • since the point P'5 is observed at a distance longer than the predetermined distance, it is filtered as point cloud data for generating the second map.
  • the data filtered in this way may be stored in the storage unit 102 as data for generating each map.
  • the first generation unit 104 generates the first map using the first point cloud data
  • the second generation unit 106 generates the second map using the second point cloud data (S104).
  • FIG. 8 is a diagram showing an example of the first map generated by the first generation unit 104.
  • the solid line indicates the boundary in the map generated from the first point cloud data.
  • the dotted line is the position of the wall to be drawn for reference.
  • the first generation unit 104 uses the short-distance point cloud data acquired up to that point together with the point cloud data acquired in the state of FIG. 5 to create the positions of obstacles as the first map.
  • the first generation unit 104 generates this map by using, for example, the method of SLAM.
  • the first generation unit 104 determines the point cloud data as a point on the boundary between the shaded area indicating an obstacle and the area indicating a passage, and generates such a map.
  • FIG. 9 is a diagram showing an example of the first map generated by the first generation unit 104.
  • the first generation unit 104 generates the first map shown in FIG. 9 by the same processing as described above.
  • FIG. 10 is a diagram showing an example of a second map generated at the timing of FIG. 6 by the second generation unit 106.
  • the second generation unit 106 generates the second map by the same processing as the first generation unit 104 described above using the second point cloud data.
  • the third generation unit 108 generates a third map using unfiltered point cloud data, that is, point cloud data including both short-distance and long-distance data. (S106).
  • FIG. 11 is a diagram showing an example of the third map generated at the timing of FIG. 6 by the third generation unit 108.
  • the third generation unit 108 generates the third map from the unfiltered point cloud data, without separating the short-distance point cloud and the long-distance point cloud.
  • the examples in FIGS. 8 to 11 show an ideal case in which there is no difference between a short distance and a long distance. In such a case, a composite of the first map of FIG. 9 and the second map of FIG. 10 is acquired as the third map.
  • the process of S10 in FIG. 4 is executed.
  • the information processing apparatus 1 generates and acquires the first map, the second map, and the third map as advance maps.
  • this map generation process may be repeatedly executed until the necessary map information is acquired. For example, for each acquisition frame of the point cloud data of the sensor 2, the information processing apparatus 1 repeatedly executes the map generation until the map generation is completed.
  • the map generation may be repeatedly executed up to an appropriate end condition, for example, until the moving object has finished moving within the movable range, or until the moving body has finished moving for a predetermined time.
  • the information processing apparatus 1 generates pre-maps and then adjusts these pre-maps (S20).
  • the adjustment of the pre-map is, for example, the processing of correcting the distortion of the map and correcting the difference between the first map and the second map.
  • FIG. 12 is a flowchart showing the process of S20 in FIG. 4 more concretely.
  • the correction unit 110 acquires a preliminary map from the storage unit 102 (S200).
  • the correction unit 110 executes distortion correction of the first map and the second map based on the acquired third map (S202). Creating maps with SLAM from different point cloud data may involve different optimization calculations, so when the filtered point cloud data described above is used, the resulting maps may be distorted relative to one another. In this process, this distortion is corrected.
  • the correction unit 110 calculates the distortion of the first map and the second map for the third map using a lot of point cloud data, and corrects the first map and the second map, respectively.
  • This correction can be performed by using any method such as similarity transformation, affine transformation, projective transformation, Rubber-Sheeting, or non-linear transformation. Moreover, these methods are given as an example, and are not limited to these.
  • similarity transformations include scale, rotation, translation, and mirror transformations; affine transformations include those obtained by adding shear transformations to similarity transformations; Rubber-Sheeting includes methods such as Polynomial and Thin Plate Spline. These too are given as examples and are not limited thereto.
  • FIG. 13 is a diagram schematically showing an example of correction by the correction unit 110. As shown in this figure, point data at the same time is acquired on the first map and the third map, and control points are set. The control points are indicated by circles in the figure. By performing the above conversion so that the control points match, the distortion of the first map is corrected to match the third map.
  • the second map is also corrected in the same way.
  • the same control points are used for correction with the first map and the second map, but the control points are not limited to this.
  • the point cloud data that can be acquired differs between the first map and the second map. Therefore, it is not always possible to obtain the same control point. Therefore, a control point is set between the first map and the third map to correct the first map, and a control point is set between the second map and the third map to correct the second map.
  • the correction unit 110 sets control points for the first map and the second map, respectively, acquires the corresponding points on the third map for those control points, and corrects the first map and the second map.
  • the information processing apparatus 1 can use the first map and the second map as maps having the same scale.
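  • As one concrete, non-limiting example of the control-point correction above, an affine transform can be fitted to the control-point correspondences by least squares and then applied to the first (or second) map; the function names below are assumptions for this sketch, and the other transform families mentioned above (similarity, projective, Rubber-Sheeting) would be fitted analogously.

```python
import numpy as np

def fit_affine(src_points, dst_points):
    """Least-squares 2D affine transform mapping control points on the
    first (or second) map onto the corresponding points on the third map.

    src_points, dst_points: (N, 2) arrays with N >= 3 correspondences.
    Returns a 2x3 matrix A such that dst ~= A @ [x, y, 1]^T.
    """
    ones = np.ones((src_points.shape[0], 1))
    X = np.hstack([src_points, ones])                      # (N, 3)
    A_t, *_ = np.linalg.lstsq(X, dst_points, rcond=None)   # solves X @ A^T ~= dst
    return A_t.T                                           # (2, 3)

def apply_affine(A, points):
    ones = np.ones((points.shape[0], 1))
    return np.hstack([points, ones]) @ A.T

# Example: undo a small rotation and shift of one map relative to the third map.
src = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], dtype=float)
theta = np.radians(5.0)
R = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
dst = src @ R.T + np.array([0.3, -0.1])
A = fit_affine(src, dst)
print(np.allclose(apply_affine(A, src), dst))              # True
```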
  • the comparison unit 112 then extracts the difference between the first map corrected by the correction unit 110 and the second map (S204).
  • if the data indicating the same point is projected at the same position on the map for both the first point cloud data and the second point cloud data, these differences do not occur. However, in general, the following differences occur.
  • FIG. 14 is a diagram showing an example of point data acquired at a short distance and point data for detecting the same position acquired at a long distance. As shown in Fig. 14, consider the case where there is an obstacle on the floor. The moving object is provided with a sensor 2, which has a small angle with respect to the floor, either intentionally or unintentionally.
  • when the distance is measured by the sensor 2, for example, in the measurement at a long distance, the sensor 2 can obtain the distance between the moving object and the obstacle almost accurately. On the other hand, in the measurement at a short distance, the measured distance to the obstacle differs significantly from the correct distance. For example, in the case of FIG. 14, the sensor 2 measures a distance of about 3 to 4 times the original distance.
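  • The magnitude of this deviation can be illustrated with a toy geometric model (not the patent's actual figure): assume the sensor sits at some height above the floor and is pitched slightly downward; at close range the beam passes over a low obstacle and hits the floor behind it, so the reported range is several times the true distance, while at longer range the beam hits the obstacle face and the range is nearly correct. All numbers below are illustrative assumptions.

```python
import math

def measured_range(true_distance, sensor_height, pitch_down_deg, obstacle_height):
    """Toy model of a range sensor mounted at sensor_height and pitched down
    by pitch_down_deg, aimed at an obstacle of obstacle_height standing at
    true_distance (horizontal)."""
    pitch = math.radians(pitch_down_deg)
    beam_height_at_obstacle = sensor_height - true_distance * math.tan(pitch)
    if beam_height_at_obstacle > obstacle_height:
        # The beam clears the obstacle and hits the floor behind it.
        return sensor_height / math.sin(pitch)
    # The beam hits the obstacle face; the range is close to the true distance.
    return true_distance / math.cos(pitch)

for d in (0.5, 1.0, 3.0):
    print(d, round(measured_range(d, sensor_height=0.3,
                                  pitch_down_deg=5.0, obstacle_height=0.1), 2))
# 0.5 -> 3.44, 1.0 -> 3.44, 3.0 -> 3.01 (near measurements grossly inflated)
```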
  • FIG. 15 is a diagram showing another example of the point data acquired at a short distance and the point data acquired at a long distance to detect the same position.
  • in the example of FIG. 14, the long-distance ranging data was more accurate, but this is not always the case.
  • in the example of FIG. 15, the short-distance ranging data is more accurate than the long-distance ranging data. Thus, it is not possible to determine uniquely whether the long-distance or the short-distance measurement is more accurate.
  • likewise, when the reflective surface of the obstacle is glass, has a high reflectance, or has an extremely low reflectance, it cannot be determined unconditionally which data is the most accurate.
  • the comparison unit 112 extracts an area such that the first map and the second map show different points by comparing the first map and the second map.
  • FIG. 16 is a schematic diagram showing an example in which there is a difference between the acquired first map and the second map. As shown in FIG. 16, for example, there is a difference between the position of the obstacle on the first map and the position of the obstacle on the second map.
  • the comparison unit 112 extracts such an area on the map by comparing the first map and the second map.
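  • As a non-limiting sketch of the comparison and synthesis steps (S204, S206), if the corrected maps are represented as aligned occupancy grids, the difference region can be obtained cell-wise and marked on the whole map; the grid representation and label value are assumptions made only for this example.

```python
import numpy as np

def extract_difference(first_map, second_map):
    """Boolean mask of cells where the short-distance map and the
    long-distance map disagree about the presence of an obstacle.
    Both grids are assumed to be already distortion-corrected and aligned."""
    return first_map != second_map

def draw_difference(third_map, diff_mask, label=2):
    """Mark the differing cells on a copy of the whole (third) map with a
    label value so that the synthesized map can be displayed or edited."""
    out = third_map.copy()
    out[diff_mask] = label
    return out

first = np.array([[0, 1, 0], [0, 1, 0], [0, 1, 0]])   # boundary seen up close
second = np.array([[0, 1, 0], [0, 0, 1], [0, 0, 1]])  # boundary seen from afar
third = np.array([[0, 1, 0], [0, 1, 0], [0, 1, 0]])   # unfiltered whole map
print(draw_difference(third, extract_difference(first, second)))
```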
  • the synthesis unit 114 then synthesizes the region extracted by the comparison unit 112 into the third map (S206).
  • the synthesis unit 114 may draw a difference region extracted by the comparison unit 112 on the third map and output it via the output unit 116.
  • the output unit 116 may include a display as a display device, and the information processing device 1 may display the third map synthesized by the synthesis unit 114 on this display.
  • FIG. 17 is a diagram showing an example of the synthesized third map displayed on the display.
  • the information processing apparatus 1 outputs, for example, via the output unit 116, the area in which a difference is detected by the comparison unit 112, shown as the area hatched with diagonal lines rising to the left.
  • the information processing apparatus 1 corrects the third map and updates the third map (S208).
  • the third map synthesized by the synthesis unit 114 is output via the output unit 116, and the user is requested to process.
  • FIG. 18 is a diagram showing an example of a user interface displayed on the display device when the display device is provided as the output unit 116. For example, a third map is displayed on the display device. Then, while looking at this output device, the user can instruct, for example, to delete the area. As shown in FIG. 18, as an example, the display device may be provided with a button for deleting an area.
  • when the user presses the button for deleting this area by using an arbitrary input interface serving as the input unit 118, the information processing apparatus 1 may, for example, change the cursor into an eraser shape as shown in the figure.
  • the user can also delete this area from the third map by setting the area using the input unit 118.
  • the input interface is, for example, a mouse, a trackball, a keyboard, a touch panel, or the like.
  • in the case of a touch panel, it may be a touch display in which the display and the touch panel are integrated.
  • the update unit 120 may delete the area requested to be deleted from the third map. Further, this area may be stored on the third map as an unknown area. Alternatively, when a deletion request is made, the output unit 116 may output so as to set a new boundary. In this case, the user may optionally set a new boundary. When the boundary is not set by the user, the update unit 120 may store the boundary as an unknown area as described above, or may set the innermost boundary as a temporary boundary.
  • the information processing device 1 may show the user other options for processing related to the third map.
  • an automatic deletion button may be prepared, and when automatic deletion is selected, the area where the difference is generated extracted by the comparison unit 112 may be automatically deleted.
  • other processes such as manipulating the area and manipulating the boundary may be selectable.
  • the update unit 120 may execute the process of the third map based on the process selected by the user.
  • the update unit 120 may store the updated third map in, for example, the storage unit 102. Further, as another example, it may be output to the outside via the output unit 116.
  • the moving object executes self-position estimation using the third map (S30).
  • the moving body can estimate its own position accurately and at high speed.
  • the moving body may estimate its own position without matching the information acquired by the sensor 2 with the unknown area. If the innermost boundary is a tentative boundary, autonomous movement may be executed based on the tentative boundary. If the correct boundary is specified by the user and synthesized on the third map, autonomous movement may be performed based on this synthesized boundary. In this way, it is possible to move autonomously using the updated third map. In this case, self-position estimation may be performed by the SLAM method that simultaneously executes self-position estimation and map creation.
  • as described above, it is possible to update the third map, which is the entire map, based on two maps: the first map created from the first point cloud data whose measured distances were short, and the second map created from the second point cloud data whose measured distances were long. By updating the third map in this way, it is possible to improve the accuracy of self-position estimation using the third map.
  • the point cloud data is divided into a short distance and a long distance according to a predetermined distance, but the division is not limited to this.
  • the point cloud data may be divided into three ranges of short distance, medium distance, and long distance, or the point cloud data may be further divided into four or more ranges.
  • a map is created for each range in which the point cloud data is filtered, and separately, a map using the entire point cloud data (the above-mentioned third map) is created. Then, the other processing described above may be executed.
  • the configuration is such that the output unit 116 and the input unit 118 are provided, but the configuration is not limited to this.
  • the third map may be automatically updated based on the map information synthesized by the synthesis unit 114. That is, the output unit 116 and the input unit 118 are not indispensable configurations, and may be configured to automatically update the map in a configuration without them.
  • the processing flow may be the order of processing shown in FIG. 4, as in the above-described embodiment.
  • the moving body may determine that it is a step that can be crossed by itself when the boundary of the first map is in front as described above. In this case, self-position estimation may be performed in these areas without matching at the time of movement.
  • this judgment may be made based on, for example, the positions of the boundary on the first map and the boundary on the second map relative to the moving object, and the distance between these boundaries. For example, if the boundary on the second map is closer to the moving object and the distance between the boundary on the first map and that on the second map is more than a predetermined distance, it is judged as a step that can be crossed; if it is within the predetermined distance, it may be judged as a step that cannot be crossed.
  • an area including such a boundary may be designated as an unknown area on the third map in the information processing apparatus 1, and in the unknown area, the first map and the second map may be referred to at the stage of estimating the self-position of the moving object. Of course, it may instead be stored as information in which the boundaries of the first map and the second map are added to the third map. Further, as another example, the update unit 120 may modify the third map so that an area satisfying the above conditions is marked as an area having a step that can be crossed in the process of updating the third map.
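  • A minimal sketch of the crossable-step judgment described above follows; the threshold value and function name are assumptions introduced only for illustration, not values from the disclosure.

```python
def can_cross_step(dist_to_first_boundary, dist_to_second_boundary,
                   boundary_gap_threshold=0.5):
    """Follows the rule described above: if the boundary on the second
    (long-distance) map is closer to the moving body and the gap between the
    two boundaries exceeds a threshold, judge it as a step that can be
    crossed; otherwise judge it as not crossable."""
    if dist_to_second_boundary >= dist_to_first_boundary:
        return False  # the second-map boundary is not the nearer one
    gap = dist_to_first_boundary - dist_to_second_boundary
    return gap >= boundary_gap_threshold

print(can_cross_step(1.6, 0.9))  # True: 0.7 m gap exceeds the threshold
print(can_cross_step(1.1, 0.9))  # False: 0.2 m gap is within the threshold
```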
  • the third map may be automatically modified or updated.
  • the correction is not limited to automatically adding information such as steps to the third map; the boundaries of the first map and the second map may also be automatically added to the third map. In this case, the information on the differing boundaries of the first map and the second map added to the third map can be used at the timing when the moving body moves.
  • the update unit 120 may update automatically when automatic processing is possible, and may request processing from the user when automatic processing is not possible.
  • the user may also be asked what kind of processing to perform. For example, if the position where the obstacle exists differs by a distance equal to or greater than a threshold value between the first map and the second map, the output unit 116 may output an alert.
  • the filter 100 may divide the distance into a plurality of distance zones instead of dividing it into two.
  • the processing flow may be the same as in FIG.
  • a generation unit corresponding to each distance zone is provided, and each map is generated from the point cloud data acquired in each distance zone by the processing of the corresponding generation unit.
  • whether or not to create a third map using unfiltered point cloud data depends on the algorithm.
  • the process of S106 that generates the third map may be omitted.
  • this process is, of course, executed without omission.
  • FIG. 19 is a flowchart showing the process of S20 in FIG. 4 according to the present embodiment.
  • the information processing device 1 reads the created map of each distance band (S200), and then the correction unit 110 corrects the map acquired from these plurality of generation units (S210).
  • the method of correction is as shown in the first embodiment. This correction process can be omitted depending on the algorithm of the synthesis process. Therefore, in the present embodiment, the correction unit 110 is not an indispensable configuration depending on the method used.
  • the comparison unit 112 performs a comparison of each map (S212). This process is not essential and can be omitted depending on the algorithm. Therefore, in the present embodiment, the comparison unit 112 may not be provided in the information processing apparatus 1 depending on the method used.
  • the synthesizing unit 114 synthesizes the data acquired at the various distances (S214). This synthesis is performed, for example, by processing with an extended Kalman filter.
  • the information that can be acquired by the sensor 2 may be divided into an arbitrary number for each distance, and SLAM may be executed independently in each range. Then, after creating the map, the map may be synthesized based on a predetermined algorithm.
  • alternatively, self-position estimation may be performed independently for each distance, and map creation may be performed by synthesizing the results of self-position estimation using an algorithm such as an extended Kalman filter.
  • in this embodiment, it is possible to create a map based on the self-position estimation result for each distance zone. Further, in this case, information from another sensor 3 may also be reflected in an algorithm such as an extended Kalman filter.
  • the extended Kalman filter was used, but it is not limited to this.
  • various linear or non-linear methods such as simple mean value, multiple regression analysis, covariance analysis, Unscented Kalman filter, and ensemble Kalman filter can be used.
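  • As a simplified, non-limiting stand-in for the extended Kalman filter update described above, independent per-distance-zone pose estimates can be fused by inverse-covariance weighting; the pose parameterization and the numbers below are assumptions, and angle wrap-around of the yaw component is ignored for brevity.

```python
import numpy as np

def fuse_pose_estimates(estimates):
    """Information-filter style fusion of independent (x, y, yaw) estimates,
    one per distance zone.  Each estimate is a Gaussian given as
    (mean (3,), covariance (3, 3))."""
    info = np.zeros((3, 3))
    info_vec = np.zeros(3)
    for mu, cov in estimates:
        inv_cov = np.linalg.inv(cov)
        info += inv_cov
        info_vec += inv_cov @ mu
    fused_cov = np.linalg.inv(info)
    fused_mu = fused_cov @ info_vec
    return fused_mu, fused_cov

# Example: a tight near-range estimate and a looser far-range estimate.
near = (np.array([1.00, 2.00, 0.10]), np.diag([0.01, 0.01, 0.001]))
far = (np.array([1.10, 1.90, 0.12]), np.diag([0.09, 0.09, 0.004]))
mu, cov = fuse_pose_estimates([near, far])
print(mu)  # pulled towards the tighter near-range estimate
```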
  • a third map created without filtering and a map for each distance zone are created.
  • the correction unit 110 corrects the distortion of the map in each distance zone based on the third map (S210).
  • the comparison unit 112 extracts the differences between the maps of the distance zones (S212). For example, the boundaries between the first map and the second map as shown in FIG. 16 are extracted for each distance zone. The boundaries may differ in all distance zones, or some distance zones may share the same boundary.
  • the compositing unit 114 executes compositing in which the position of the boundary in each distance zone is embedded in the third map (S214). That is, in the third map, information on the position (at least the boundary) where an obstacle or the like is detected in each distance zone is embedded in the map in the unfiltered state.
  • the embedding of the boundary for each distance zone may be performed only in the areas where a difference between the distance zones is detected by the comparison unit 112.
  • the synthesis unit 114 stores the data of the third map in which the information is embedded in the storage unit 102. For example, in the third map, data indicating boundaries such as obstacles labeled for each distance zone may be embedded.
  • the mobile body can perform self-position estimation (and map creation) by referring to this updated third map.
  • when the data of a point captured by the sensor 2 falls in a region with a difference in the third map, the data of the distance band embedded in the third map is used. More specifically, self-position estimation (and map creation) can be executed by matching the data of the measured position against the boundary position (the labeled data) of the distance zone that includes the measured distance.
  • the process may not execute matching at the position.
  • the information processing apparatus 1 can determine whether or not to perform matching in the position estimation process based on the result of distance measurement.
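  • A non-limiting sketch of this decision follows; the cell data layout (a dictionary mapping distance-band labels to stored boundary positions) is an assumption introduced only for illustration.

```python
def select_boundary_for_matching(cell, measured_distance):
    """Decide how to use a measured point that falls on a cell of the third map.

    cell: either {"boundary": ...} for ordinary cells, or, in difference
    regions, {"bands": {(lo, hi): boundary, ...}} mapping a distance band to
    the boundary position stored for that band.
    Returns the boundary to match against, or None to skip matching here."""
    bands = cell.get("bands")
    if bands is None:
        return cell.get("boundary")        # no difference here: match normally
    for (lo, hi), boundary in bands.items():
        if lo <= measured_distance < hi:   # pick the band containing the measurement
            return boundary
    return None                            # no band matches: skip matching

# A difference-region cell where the near band and far band stored
# different boundary positions for the same place.
cell = {"bands": {(0.0, 2.0): (3.1, 0.4), (2.0, float("inf")): (2.8, 0.4)}}
print(select_boundary_for_matching(cell, 1.2))  # near-band boundary (3.1, 0.4)
print(select_boundary_for_matching(cell, 5.0))  # far-band boundary (2.8, 0.4)
```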
  • FIG. 20 is a diagram showing an example of a region where the point data of the sensor 2 according to the embodiment can be acquired.
  • the searchable range of the sensor 2 may be deformed (light beam deformation) depending on the direction of the optical axis or the receiving sensor (light receiving sensor).
  • in such a case, the filter 100 may filter not by distance but by the positional relationship between the sensor 2 and the target point, more specifically, by the angle relative to the installation direction of the sensor surface.
  • FIG. 21 is a diagram schematically showing each range when the filter 100 filters the range by an angle.
  • this angle range may be set based on the mounting direction of the sensor 2.
  • the filter 100 defines a range of a predetermined angle to the left and right of the center (or up and down, etc.) as angle range 1, and may output the observation points in this range to the first generation unit 104. Then, the filter 100 may output the observation points in angle range 2 and angle range 3, which lie outside that angle range, to another generation unit, for example, the second generation unit 106.
  • angle range 2 and angle range 3 may be output to the same generation unit, or each may be output to a different generation unit. That is, data may be output to different generation units for the left side and the right side as viewed from the sensor 2, and maps may be generated separately.
  • the angle range is divided into three, but it is not limited to this. Similar to the distance range described above, a margin may be set between the respective angle ranges, or may be further divided into more angle ranges.
  • points P1 and P6, points P2 and P5, and points P3 and P4 are deviated from the sensor surface by the same angle in opposite directions.
  • the observation results in the directions of points P1 and P6, the observation results in the directions of points P2 and P5, and the observation results in the directions of points P3 and P4 may be filtered separately.
  • a map may be created according to the angle to extract areas with differences.
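  • As a non-limiting counterpart to the distance filter, an angle-based filter might look as follows; the band width and function name are assumptions made only for this sketch.

```python
import math

def split_by_angle(points, half_width_deg=30.0):
    """Split observations by beam angle relative to the sensor-surface normal:
    a central band (angle range 1) and two outer bands (angle ranges 2 and 3).

    points: list of (distance, angle_rad) measurements in the sensor frame."""
    half_width = math.radians(half_width_deg)
    center, left, right = [], [], []
    for distance, angle in points:
        if abs(angle) <= half_width:
            center.append((distance, angle))  # -> first generation unit
        elif angle > 0:
            left.append((distance, angle))    # -> e.g. second generation unit
        else:
            right.append((distance, angle))   # or a separate generation unit
    return center, left, right

beams = [(1.0, math.radians(a)) for a in (-70, -40, -10, 10, 40, 70)]
c, l, r = split_by_angle(beams)
print(len(c), len(l), len(r))  # 2 2 2
```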
  • the composition with the third map may be based on the user's request or may be automatically executed.
  • the area may be divided for each angle band, a map corresponding to a plurality of angle bands may be created, and this may be synthesized by a method such as an extended Kalman filter. Further, SLAM may be executed for each observation data acquired by the angle band to create a plurality of maps, and if there is a difference, labeling for each angle band may be performed.
  • filtering may be performed by considering both distance and direction. For example, a short distance and a long distance may be filtered by a predetermined distance, and the front and the sides may be filtered by a predetermined angle. In this case, five maps (four maps plus the entire map) may be created, and the same processing as in each of the above-described embodiments may be executed for each. Of course, the division is not limited to front and sides, and may be into three or more.
  • the present invention is not limited to this.
  • the light receiving surface of the sensor 2 may be tilted.
  • Different angle bands may be assigned to the left and right in the horizontal direction with respect to the optical axis of the sensor 2.
  • the filter 100 filters, for example, the right side surface, the front surface, and the left side surface using two predetermined angles.
  • combined with short-distance and long-distance filtering, six different maps can be generated. These, together with the third map, which is the whole map, may be used to apply each of the above-described embodiments.
  • FIG. 22 is a diagram showing a non-limiting implementation example of the information processing apparatus according to the embodiment.
  • the information processing device 1 is mounted on, for example, a mobile robot 4 and a terminal 5.
  • the terminal 5 may be mounted in the robot 4, and in this case, the robot 4 includes all the components.
  • the robot 4 includes a sensor 2, a sensor 3, a filter 100, a storage unit 102, a first generation unit 104, a second generation unit 106, a third generation unit 108, and a pre-map holding unit 122.
  • components having the same reference numerals as in the above-described embodiments perform the same operations.
  • Each generation unit stores the generated map in the advance map holding unit 122.
  • the advance map holding unit 122 may be provided in, for example, the storage unit 102. When the robot 4 and the terminal 5 communicate with each other, the advance map holding unit 122 may be provided in an area easily accessible from the communication interface.
  • the advance map holding unit 122 outputs the map output from each generation unit to the terminal 5.
  • the terminal 5 includes a storage unit 102, a pre-map management unit 124, a correction unit 110, a comparison unit 112, a synthesis unit 114, an output unit 116, an input unit 118, an update unit 120, and a pre-map output unit 126.
  • the storage unit has the same reference numeral as that of the robot 4, but it may be a different unit or the same one. Further, as with the components of the robot 4, components having the same reference numerals as in the above-described embodiments perform the same operations.
  • the advance map management unit 124 acquires the advance map from the advance map holding unit 122. Then, each component after the correction unit 110 executes each of the above processes based on the advance map received by the advance map management unit 124.
  • the pre-map output unit 126 outputs the updated third map as a pre-map to the pre-map management unit 124.
  • the advance map management unit 124 outputs the advance map information acquired from the advance map output unit 126 to the advance map holding unit 122 of the robot 4.
  • the robot 4 may execute SLAM based on the information acquired from the sensors 2 and 3 by using this preliminary map by a control unit or the like (not shown), and acquire the map information necessary for movement.
  • the control unit may realize the movement processing of the moving body based on this map information.
  • the robot 4 and the terminal 5 communicate with each other using, for example, a wired or wireless communication path. Both may be provided with a communication interface for performing this communication.
  • the terminal 5 may be, for example, a computer or a portable terminal such as a smartphone or a tablet terminal. Further, not all the components of the terminal 5 need to be provided in the terminal 5; for example, the output unit 116 and the input unit 118 may be in a portable terminal, while the other components may be in a built-in computer or in any computer existing on a communication network.
  • the information processing apparatus 1 can be implemented by being divided into arbitrary operation units in any form.
  • the technology related to this disclosure can be applied to various products.
  • the technology according to the present disclosure may be realized as a device mounted on any kind of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
  • FIG. 23 is a block diagram showing a schematic configuration example of a vehicle control system 7000, which is an example of a mobile control system to which the technique according to the present disclosure can be applied.
  • the vehicle control system 7000 includes a plurality of electronic control units connected via a communication network 7010.
  • the vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, an outside information detection unit 7400, an in-vehicle information detection unit 7500, and an integrated control unit 7600.
  • the communication network 7010 connecting these multiple control units may be an in-vehicle communication network conforming to any standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).
  • each control unit includes a microcomputer that performs arithmetic processing according to various programs, a storage unit that stores the programs executed by the microcomputer or the parameters used for various calculations, and a drive circuit that drives the various devices to be controlled.
  • each control unit is provided with a network I/F for communicating with other control units via the communication network 7010, and with a communication I/F for communicating by wired or wireless communication with devices or sensors inside or outside the vehicle. In FIG. 23, as the functional configuration of the integrated control unit 7600, a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning unit 7640, a beacon receiving unit 7650, an in-vehicle device I/F 7660, an audio image output unit 7670, a vehicle-mounted network I/F 7680, and a storage unit 7690 are illustrated.
  • Other control units also include a microcomputer, a communication I / F, a storage unit, and the like.
  • the drive system control unit 7100 controls the operation of the device related to the drive system of the vehicle according to various programs.
  • the drive system control unit 7100 functions as a control device for a driving force generator for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
  • the drive system control unit 7100 may have a function as a control device such as ABS (Antilock Brake System) or ESC (Electronic Stability Control).
  • the vehicle state detection unit 7110 is connected to the drive system control unit 7100.
  • the vehicle state detection unit 7110 includes, for example, at least one of a gyro sensor that detects the angular velocity of the axial rotation motion of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting the accelerator pedal operation amount, the brake pedal operation amount, the steering wheel steering angle, the engine speed, the wheel speed, and the like.
  • the drive system control unit 7100 performs arithmetic processing using a signal input from the vehicle state detection unit 7110, and controls an internal combustion engine, a drive motor, an electric power steering device, a brake device, and the like.
  • the body system control unit 7200 controls the operation of various devices mounted on the vehicle body according to various programs.
  • the body system control unit 7200 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as headlamps, back lamps, brake lamps, turn signals or fog lamps.
  • a radio wave transmitted from a portable device that substitutes for a key or signals of various switches may be input to the body system control unit 7200.
  • the body system control unit 7200 receives inputs of these radio waves or signals and controls a vehicle door lock device, a power window device, a lamp, and the like.
  • the battery control unit 7300 controls the secondary battery 7310, which is the power supply source of the drive motor, according to various programs. For example, information such as the battery temperature, the battery output voltage, or the remaining capacity of the battery is input to the battery control unit 7300 from the battery device including the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals, and controls the temperature control of the secondary battery 7310 or the cooling device provided in the battery device.
  • the vehicle outside information detection unit 7400 detects information outside the vehicle equipped with the vehicle control system 7000.
  • the image pickup unit 7410 and the vehicle exterior information detection unit 7420 are connected to the vehicle exterior information detection unit 7400.
  • the image pickup unit 7410 includes at least one of a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras.
  • the vehicle exterior information detection unit 7420 includes, for example, at least one of an environment sensor for detecting the current weather or meteorological conditions and a surrounding information detection sensor for detecting other vehicles, obstacles, pedestrians, and the like around the vehicle equipped with the vehicle control system 7000.
  • the environment sensor may be, for example, at least one of a raindrop sensor that detects rainy weather, a fog sensor that detects fog, a sunshine sensor that detects the degree of sunshine, and a snow sensor that detects snowfall.
  • the surrounding information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device.
  • the image pickup unit 7410 and the vehicle exterior information detection unit 7420 may be provided as independent sensors or devices, or may be provided as a device in which a plurality of sensors or devices are integrated.
  • FIG. 24 shows an example of the installation positions of the image pickup unit 7410 and the vehicle exterior information detection unit 7420.
  • the image pickup units 7910, 7912, 7914, 7916, and 7918 are provided, for example, at at least one of the following positions: the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 7900.
  • the image pickup unit 7910 provided on the front nose and the image pickup unit 7918 provided at the upper part of the windshield in the vehicle interior mainly acquire images of the area in front of the vehicle 7900.
  • the image pickup units 7912 and 7914 provided in the side mirrors mainly acquire images of the side of the vehicle 7900.
  • the image pickup unit 7916 provided in the rear bumper or the back door mainly acquires an image of the rear of the vehicle 7900.
  • the image pickup unit 7918 provided on the upper part of the windshield in the vehicle interior is mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
  • FIG. 24 shows an example of the shooting range of each of the imaging units 7910, 7912, 7914, 7916.
  • the imaging range a indicates the imaging range of the imaging unit 7910 provided on the front nose, the imaging ranges b and c indicate the imaging ranges of the imaging units 7912 and 7914 provided on the side mirrors, respectively, and the imaging range d indicates the imaging range of the imaging unit 7916 provided on the rear bumper or the back door.
  • for example, by superimposing the image data captured by the image pickup units 7910, 7912, 7914, and 7916, a bird's-eye view image of the vehicle 7900 can be obtained.
  • the vehicle exterior information detection units 7920, 7922, 7924, 7926, 7928, and 7930 provided on the front, rear, sides, corners, and the upper part of the windshield in the vehicle interior of the vehicle 7900 may be, for example, ultrasonic sensors or radar devices.
  • the vehicle exterior information detection units 7920, 7926, and 7930 provided on the front nose, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 7900 may be, for example, LIDAR devices.
  • these vehicle exterior information detection units 7920 to 7930 are mainly used for detecting preceding vehicles, pedestrians, obstacles, and the like.
  • the vehicle exterior information detection unit 7400 causes the image pickup unit 7410 to capture an image of the outside of the vehicle and receives the captured image data. Further, the vehicle exterior information detection unit 7400 receives detection information from the connected vehicle exterior information detection unit 7420.
  • when the vehicle exterior information detection unit 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the vehicle exterior information detection unit 7400 causes it to transmit ultrasonic waves, electromagnetic waves, or the like, and receives information on the received reflected waves.
  • the vehicle exterior information detection unit 7400 may perform object detection processing or distance detection processing for persons, vehicles, obstacles, signs, characters on the road surface, and the like based on the received information.
  • the vehicle exterior information detection unit 7400 may perform environment recognition processing for recognizing rainfall, fog, road surface conditions, and the like based on the received information.
  • the vehicle exterior information detection unit 7400 may calculate the distance to an object outside the vehicle based on the received information.
  • the vehicle exterior information detection unit 7400 may perform image recognition processing or distance detection processing for recognizing persons, vehicles, obstacles, signs, characters on the road surface, and the like based on the received image data.
  • the vehicle exterior information detection unit 7400 may perform processing such as distortion correction or alignment on the received image data and synthesize image data captured by different image pickup units 7410 to generate a bird's-eye view image or a panoramic image.
  • the vehicle exterior information detection unit 7400 may perform viewpoint conversion processing using image data captured by different image pickup units 7410.
  • the in-vehicle information detection unit 7500 detects information about the inside of the vehicle.
  • a driver state detection unit 7510 for detecting the state of the driver is connected to the in-vehicle information detection unit 7500.
  • the driver state detection unit 7510 may include a camera that captures the driver, a biosensor that detects the driver's biological information, a microphone that collects sound in the vehicle interior, and the like.
  • the biosensor is provided on, for example, a seat surface or a steering wheel, and detects biometric information of a passenger sitting on the seat or a driver holding the steering wheel.
  • the in-vehicle information detection unit 7500 may calculate the degree of fatigue or the degree of concentration of the driver based on the detection information input from the driver state detection unit 7510, and may determine whether the driver is dozing off.
  • the in-vehicle information detection unit 7500 may perform processing such as noise canceling processing on the collected audio signal.
  • the integrated control unit 7600 controls the overall operation in the vehicle control system 7000 according to various programs.
  • An input unit 7800 is connected to the integrated control unit 7600.
  • the input unit 7800 is realized by a device that a passenger can operate for input, such as a touch panel, buttons, a microphone, switches, or levers. Data obtained by performing voice recognition on speech input through the microphone may be input to the integrated control unit 7600.
  • the input unit 7800 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or a PDA (Personal Digital Assistant) that supports operation of the vehicle control system 7000.
  • the input unit 7800 may be, for example, a camera, in which case a passenger can input information by gestures. Alternatively, data obtained by detecting the movement of a wearable device worn by a passenger may be input. Further, the input unit 7800 may include, for example, an input control circuit that generates an input signal based on information input by a passenger or the like using the input unit 7800 and outputs the input signal to the integrated control unit 7600. By operating the input unit 7800, a passenger or the like inputs various data to the vehicle control system 7000 and instructs it to perform processing operations.
  • the storage unit 7690 may include a ROM (Read Only Memory) that stores various programs executed by the microcomputer, and a RAM (Random Access Memory) that stores various parameters, calculation results, sensor values, and the like. Further, the storage unit 7690 may be realized by a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • the general-purpose communication I / F 7620 is a general-purpose communication I / F that mediates communication with various devices existing in the external environment 7750.
  • the general-purpose communication I/F 7620 may implement a cellular communication protocol such as GSM (registered trademark) (Global System of Mobile communications), WiMAX (registered trademark), LTE (registered trademark) (Long Term Evolution), or LTE-A (LTE-Advanced), or another wireless communication protocol such as Bluetooth (registered trademark).
  • the general-purpose communication I/F 7620 may connect to a device (for example, an application server or a control server) on an external network (for example, the Internet, a cloud network, or an operator-specific network) via a base station or an access point. Further, the general-purpose communication I/F 7620 may connect to a terminal in the vicinity of the vehicle (for example, a terminal of a driver, a pedestrian, or a store, or an MTC (Machine Type Communication) terminal) using, for example, P2P (Peer To Peer) technology.
  • the dedicated communication I / F 7630 is a communication I / F that supports a communication protocol formulated for use in a vehicle.
  • the dedicated communication I/F 7630 may implement a standard protocol such as WAVE (Wireless Access in Vehicle Environment), which is a combination of IEEE 802.11p for the lower layer and IEEE 1609 for the upper layer, DSRC (Dedicated Short Range Communications), or a cellular communication protocol.
  • the dedicated communication I/F 7630 typically carries out V2X communication, a concept that includes one or more of vehicle-to-vehicle (Vehicle to Vehicle) communication, road-to-vehicle (Vehicle to Infrastructure) communication, vehicle-to-home (Vehicle to Home) communication, and vehicle-to-pedestrian (Vehicle to Pedestrian) communication.
  • the positioning unit 7640 receives, for example, a GNSS signal from a GNSS (Global Navigation Satellite System) satellite (for example, a GPS signal from a GPS (Global Positioning System) satellite), executes positioning, and generates position information including the latitude, longitude, and altitude of the vehicle.
  • the positioning unit 7640 may specify the current position by exchanging signals with the wireless access point, or may acquire position information from a terminal such as a mobile phone, PHS, or smartphone having a positioning function.
  • the beacon receiving unit 7650 receives radio waves or electromagnetic waves transmitted from a radio station or the like installed on the road, and acquires information such as the current position, traffic jam, road closure, or required time.
  • the function of the beacon receiving unit 7650 may be included in the above-mentioned dedicated communication I / F 7630.
  • the in-vehicle device I / F 7660 is a communication interface that mediates the connection between the microcomputer 7610 and various in-vehicle devices 7760 existing in the vehicle.
  • the in-vehicle device I / F7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication) or WUSB (Wireless USB).
  • the in-vehicle device I/F 7660 may also establish a wired connection such as USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), or MHL (Mobile High-Definition Link) via a connection terminal (and, if necessary, a cable) not shown.
  • the in-vehicle device 7760 includes, for example, at least one of a passenger's mobile device or wearable device, or an information device carried in or attached to the vehicle. Further, the in-vehicle device 7760 may include a navigation device for searching a route to an arbitrary destination.
  • the in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
  • the in-vehicle network I / F7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010.
  • the in-vehicle network I/F 7680 transmits and receives signals and the like in accordance with a predetermined protocol supported by the communication network 7010.
  • the microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 according to various programs based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680. For example, the microcomputer 7610 may calculate a control target value for the driving force generator, the steering mechanism, or the braking device based on the acquired information about the inside and outside of the vehicle, and output a control command to the drive system control unit 7100.
  • the microcomputer 7610 may perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions including vehicle collision avoidance or impact mitigation, follow-up driving based on the inter-vehicle distance, vehicle speed maintenance driving, vehicle collision warning, vehicle lane departure warning, and the like.
  • the microcomputer 7610 may perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without relying on the driver's operation, by controlling the driving force generator, the steering mechanism, the braking device, and the like based on the acquired information about the surroundings of the vehicle.
  • the microcomputer 7610 may generate three-dimensional distance information between the vehicle and objects such as surrounding structures or persons based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680, and may create local map information including the surrounding information of the current position of the vehicle. Further, the microcomputer 7610 may predict dangers such as a vehicle collision, a pedestrian or the like approaching, or entry into a closed road based on the acquired information, and generate a warning signal.
  • the warning signal may be, for example, a signal for generating a warning sound or lighting a warning lamp.
  • the audio image output unit 7670 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying information to the passenger or the outside of the vehicle.
  • an audio speaker 7710, a display unit 7720, and an instrument panel 7730 are exemplified as output devices.
  • the display unit 7720 may include, for example, at least one of an onboard display and a head-up display.
  • the display unit 7720 may have an AR (Augmented Reality) display function.
  • the output device may be a device other than these, such as headphones, a wearable device such as an eyeglass-type display worn by a passenger, a projector, or a lamp.
  • the display device visually displays results obtained by various processes performed by the microcomputer 7610 or information received from other control units in various formats such as text, images, tables, and graphs.
  • the audio output device converts an audio signal composed of reproduced audio data, acoustic data, or the like into an analog signal and outputs the audio signal audibly.
  • At least two control units connected via the communication network 7010 may be integrated as one control unit.
  • each control unit may be composed of a plurality of control units.
  • the vehicle control system 7000 may include another control unit (not shown).
  • the other control unit may have a part or all of the functions carried out by any of the control units. That is, as long as information is transmitted and received via the communication network 7010, predetermined arithmetic processing may be performed by any of the control units.
  • a sensor or device connected to any of the control units may be connected to another control unit, and a plurality of control units may send and receive detection information to and from each other via the communication network 7010.
  • a computer program for realizing each function of the information processing apparatus 1 according to the present embodiment described with reference to FIG. 1 can be implemented in any control unit or the like. It is also possible to provide a computer-readable recording medium in which such a computer program is stored.
  • the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like. Further, the above computer program may be distributed, for example, via a network without using a recording medium.
  • the information processing apparatus 1 can be applied to the integrated control unit 7600 of the application example shown in FIG. 23.
  • each component of the information processing apparatus 1 corresponds to a microcomputer 7610, a storage unit 7690, and an in-vehicle network I / F 7680 of the integrated control unit 7600.
  • the integrated control unit 7600 can assist automatic driving and autonomous movement by generating a map and estimating a self-position.
  • the components of the information processing apparatus 1 described with reference to FIG. 1 may be realized in a module for the integrated control unit 7600 shown in FIG. 23 (for example, an integrated circuit module composed of one die). Alternatively, the information processing apparatus 1 described with reference to FIGS. 1, 22, and the like may be realized by a plurality of control units of the vehicle control system 7000 shown in FIG. 23. Further, the display unit 7720, for example, may be used as the output unit 116.
  • the processor generates at least two or more maps using different point cloud data, based on the point cloud data obtained from the distance measurement sensor mounted on the mobile device and on at least one or more conditions of the distance or direction to the distance measurement target indicated by the point cloud data. An information processing device.
  • the processor executes a filtering process on the point cloud data based on the distance or direction, and generates the two or more maps based on the point cloud data subjected to the filtering process. The information processing device according to (1).
  • the processor compares the point cloud data with a predetermined distance, generates a first map from the first point cloud data at a distance closer than the predetermined distance, and generates a second map from the second point cloud data at a distance farther than the predetermined distance (an illustrative code sketch of this distance-based split is given after this list).
  • the processor generates a third map from the unfiltered point cloud data including both the first point cloud data and the second point cloud data. The information processing device according to (3).
  • the processor corrects the distortion of the first map and the distortion of the second map based on the third map. The information processing device according to (4).
  • the processor extracts the difference between the corrected first map and the corrected second map (see the difference-extraction sketch after this list). The information processing device according to (5).
  • the processor draws the extracted difference on the third map. The information processing device according to (6).
  • further comprising a display device that performs display in response to a request from the processor, wherein the processor causes the display device to display the third map on which the difference is drawn, and accepts processing from a user. The information processing device according to (7).
  • further comprising an input interface through which a request from a user is input, wherein the processor updates the third map by executing the processing received via the input interface.
  • the processor corrects and updates the third map based on the extracted difference.
  • the information processing device according to any one of (6) to (10).
  • the processor executes self-position estimation by integrating at least the self-position estimation result based on the first map and the self-position estimation result based on the second map. The information processing device according to (3).
  • the processor integrates the self-position estimation result based on the first map and the self-position estimation result based on the second map using an extended Kalman filter (a simplified fusion sketch is given after this list). The information processing device according to (12).
  • the processor sets, at a point on the third map corresponding to a difference point between the first map and the second map, a label indicating a distance at which the difference point can be detected.
  • when self-position estimation is performed based on the difference point, the processor determines whether or not to perform self-position estimation based on the distance to the difference point and the label set at the difference point (see the label-gating sketch after this list). The information processing device according to (14).
  • the processor generates the maps described in any one of (1) to (15). An information processing method.
  • the aspects of the present disclosure are not limited to the above-described embodiments and include various conceivable modifications, and the effects of the present disclosure are not limited to the contents described above.
  • the components in each embodiment may be applied in appropriate combinations. That is, various additions, changes and partial deletions are possible without departing from the conceptual idea and purpose of the present disclosure derived from the contents specified in the claims and their equivalents.
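The distance-based split described in (2) to (4) above can be illustrated with a short sketch. This is a minimal example, assuming 2D points in the sensor frame and a simple occupancy grid; the threshold, grid resolution, and random stand-in scan are assumptions for illustration only, not values from the disclosure.

```python
import numpy as np

def split_by_distance(points, threshold=2.0):
    """Split sensor points (N x 2, sensor frame) into near and far sets by
    comparing each point's range to a predetermined distance threshold."""
    ranges = np.linalg.norm(points, axis=1)
    near = points[ranges < threshold]    # first point cloud data
    far = points[ranges >= threshold]    # second point cloud data
    return near, far

def to_occupancy_grid(points, resolution=0.1, size=10.0):
    """Rasterize points into a simple 2D occupancy grid centred on the sensor."""
    cells = int(size / resolution)
    grid = np.zeros((cells, cells), dtype=np.uint8)
    idx = np.floor((points + size / 2.0) / resolution).astype(int)
    idx = idx[(idx >= 0).all(axis=1) & (idx < cells).all(axis=1)]
    grid[idx[:, 1], idx[:, 0]] = 1
    return grid

# Build the first, second, and third maps from one (stand-in) scan.
scan = np.random.uniform(-4.0, 4.0, size=(500, 2))
near_pts, far_pts = split_by_distance(scan, threshold=2.0)
first_map = to_occupancy_grid(near_pts)    # map from close-range points
second_map = to_occupancy_grid(far_pts)    # map from long-range points
third_map = to_occupancy_grid(scan)        # map from the unfiltered point cloud
```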
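The difference extraction and drawing of (6) and (7) can be sketched as follows, assuming the corrected first and second maps are already aligned binary occupancy grids of the same shape; the tiny example grids and the marker value are hypothetical.

```python
import numpy as np

first_map = np.array([[1, 0], [0, 1]], dtype=np.uint8)   # corrected near-range map
second_map = np.array([[1, 1], [0, 0]], dtype=np.uint8)  # corrected far-range map
third_map = np.array([[1, 1], [0, 1]], dtype=np.uint8)   # map from the unfiltered cloud

def extract_difference(map_a, map_b):
    """Cells occupied in one aligned binary map but not in the other."""
    return np.logical_xor(map_a.astype(bool), map_b.astype(bool))

def draw_difference(base_map, difference, mark=2):
    """Return a copy of the base map with difference cells marked for display."""
    annotated = base_map.copy()
    annotated[difference] = mark
    return annotated

diff = extract_difference(first_map, second_map)
third_map_with_diff = draw_difference(third_map, diff)
print(third_map_with_diff)
```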
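For the integration of the two self-position estimation results in (12) and (13), a full extended Kalman filter is beyond a short example, so the sketch below shows only the covariance-weighted fusion of two Gaussian pose estimates, which is what an EKF-style update reduces to under linear assumptions. The pose vectors (x, y, yaw) and covariances are made-up values, and wrap-around of the yaw angle is ignored for brevity.

```python
import numpy as np

def fuse_estimates(x1, P1, x2, P2):
    """Information-form fusion of two pose estimates treated as Gaussian
    observations of the same pose; the estimate with smaller covariance
    receives more weight."""
    info1 = np.linalg.inv(P1)
    info2 = np.linalg.inv(P2)
    P = np.linalg.inv(info1 + info2)
    x = P @ (info1 @ x1 + info2 @ x2)
    return x, P

# Pose estimated by matching against the first (near-range) map.
x_near = np.array([1.02, 0.48, 0.10])
P_near = np.diag([0.02, 0.02, 0.01])
# Pose estimated by matching against the second (far-range) map.
x_far = np.array([0.98, 0.52, 0.12])
P_far = np.diag([0.05, 0.05, 0.02])

x_fused, P_fused = fuse_estimates(x_near, P_near, x_far, P_far)
print(x_fused)
```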
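The label-based gating of (14) and (15) can be read as: record, for each difference point on the third map, the distance from which the discrepancy between the first and second maps was actually observable, and consult that label before using the point for self-position estimation. The sketch below is one possible interpretation; the label table, point identifiers, and distances are hypothetical.

```python
# Hypothetical labels: difference-point id -> maximum distance at which the
# discrepancy between the first and second maps could be detected.
difference_labels = {17: 1.5, 42: 3.0}

def usable_for_localization(point_id, current_distance, labels=difference_labels):
    """Use a difference point for self-position estimation only when the robot
    is close enough that the labelled discrepancy could actually be observed."""
    max_range = labels.get(point_id)
    if max_range is None:
        return True           # no recorded discrepancy; use the point normally
    return current_distance <= max_range

print(usable_for_localization(17, 2.4))   # False: too far for this labelled point
print(usable_for_localization(42, 2.4))   # True: still within the labelled range
```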

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The problem addressed by the present invention is to reduce self-position measurement errors. The solution according to the invention is an information processing device provided with a processor. Based on point cloud data acquired from a distance measurement sensor mounted on a mobile device and on at least one condition of distance or direction to a measurement target indicated by the point cloud data, the processor generates at least two maps, each using a different set of the point cloud data.
PCT/JP2021/023757 2020-07-28 2021-06-23 Dispositif de traitement d'informations, procédé de traitement d'informations et programme WO2022024602A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-127444 2020-07-28
JP2020127444A JP2023122597A (ja) 2020-07-28 2020-07-28 情報処理装置、情報処理方法及びプログラム

Publications (1)

Publication Number Publication Date
WO2022024602A1 true WO2022024602A1 (fr) 2022-02-03

Family

ID=80035472

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/023757 WO2022024602A1 (fr) 2020-07-28 2021-06-23 Dispositif de traitement d'informations, procédé de traitement d'informations et programme

Country Status (2)

Country Link
JP (1) JP2023122597A (fr)
WO (1) WO2022024602A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7424438B1 (ja) 2022-09-22 2024-01-30 いすゞ自動車株式会社 車両位置推定装置

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018017900A (ja) * 2016-07-28 2018-02-01 シャープ株式会社 地図作成方法及び地図作成装置
JP2018206038A (ja) * 2017-06-02 2018-12-27 株式会社Ihi 点群データ処理装置、移動ロボット、移動ロボットシステム、および点群データ処理方法
JP2019175136A (ja) * 2018-03-28 2019-10-10 日本電産シンポ株式会社 移動体
JP2020038498A (ja) * 2018-09-04 2020-03-12 株式会社Ihi 自己位置推定装置

Also Published As

Publication number Publication date
JP2023122597A (ja) 2023-09-04

Similar Documents

Publication Publication Date Title
US11531354B2 (en) Image processing apparatus and image processing method
US11860640B2 (en) Signal processing device and signal processing method, program, and mobile body
US10970877B2 (en) Image processing apparatus, image processing method, and program
JP2019045892A (ja) 情報処理装置、情報処理方法、プログラム、及び、移動体
JP7320001B2 (ja) 情報処理装置、情報処理方法、プログラム、移動体制御装置、及び、移動体
JP7294148B2 (ja) キャリブレーション装置とキャリブレーション方法およびプログラム
WO2019130945A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme, et corps mobile
US20200241549A1 (en) Information processing apparatus, moving apparatus, and method, and program
US10587863B2 (en) Image processing apparatus, image processing method, and program
CN113692521A (zh) 信息处理装置、信息处理方法和信息处理程序
US20230230368A1 (en) Information processing apparatus, information processing method, and program
US20200349367A1 (en) Image processing device, image processing method, and program
WO2019073795A1 (fr) Dispositif de traitement d'informations, procédé d'estimation de position propre, programme et corps mobile
WO2022024602A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
US20220276655A1 (en) Information processing device, information processing method, and program
US11436706B2 (en) Image processing apparatus and image processing method for improving quality of images by removing weather elements
WO2022044830A1 (fr) Dispositif de traitement d'informations et procédé de traitement d'informations
JP7363890B2 (ja) 情報処理装置、情報処理方法及びプログラム
CN115128566A (zh) 雷达数据确定电路及雷达数据确定方法
US20210295563A1 (en) Image processing apparatus, image processing method, and program
WO2022059489A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, et programme
WO2020195969A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, et programme
WO2022196316A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme associé
WO2021065510A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, système de traitement d'informations, et programme
US20230119187A1 (en) Circuitry and method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21849146

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21849146

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP