WO2022059489A1 - Information processing device, information processing method, and program - Google Patents


Info

Publication number
WO2022059489A1 (PCT/JP2021/032172)
Authority
WO
WIPO (PCT)
Prior art keywords
self
accuracy
information processing
processor
algorithm
Application number
PCT/JP2021/032172
Other languages
French (fr)
Japanese (ja)
Inventor
Mikio Nakai (中井 幹夫)
Original Assignee
Sony Group Corporation (ソニーグループ株式会社)
Application filed by Sony Group Corporation
Publication of WO2022059489A1


Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/02: Control of position or course in two dimensions

Definitions

  • This disclosure relates to an information processing device, an information processing method, and a program.
  • For self-position estimation, two kinds of method are used: star reckoning, which estimates the absolute self-position, and dead reckoning, which estimates the relative self-position.
  • For autonomous movement, it is common to perform the slow star reckoning in a predetermined area and then, starting from this absolute position, perform relative self-position estimation while moving using the fast dead reckoning.
  • Since dead reckoning depends on past results, any error in the estimation accumulates over time.
  • To avoid this, at a predetermined timing (for example, after moving a predetermined distance), the moving body moves to an area where star reckoning can be performed, performs star reckoning, and re-estimates its self-position as an absolute position, thereby resetting the error accumulated by dead reckoning.
  • However, the timing at which star reckoning can be used is often limited by the surrounding environment and by the sensors and algorithms used. In such a case, it becomes necessary to move the moving body to a predetermined position at a predetermined timing, or to change its behavior, for example by interrupting the work being performed in order to make this movement. Further, since the length of the path for returning to the predetermined position is not constant, the return must be made with a considerable margin. In addition, the self-position estimation accuracy may decrease for various reasons unrelated to dead reckoning; in such cases as well, the behavior must be changed to resolve the decrease in self-position estimation accuracy.
  • the present disclosure provides an information processing device, an information processing method, and a program that suppresses a decrease in the accuracy of self-position estimation and improves the accuracy.
  • the information processing device includes a processor for controlling a mobile body.
  • the processor calculates the accuracy of self-position estimation based on information obtained from one or more sensors; when the accuracy has decreased, or is expected to decrease, or both, it updates parameters based on the algorithm that performs the self-position estimation and creates an action plan based on the updated parameters.
  • the processor may control the moving body based on the created action plan.
  • the processor may estimate the self-position using a self-position estimation algorithm set for each sensor.
  • the processor may estimate the self-position using one algorithm based on the outputs from the plurality of sensors.
  • the processor may update the parameters based on the sensor or algorithm used to estimate the self-position.
  • the processor may calculate the result of self-position estimation by the sensor or the algorithm, and calculate the accuracy.
  • the processor may calculate the accuracy using one algorithm based on the outputs from the plurality of sensors.
  • the processor may acquire the peripheral situation of the moving body based on the information acquired from the sensor, and may update the parameter based on the acquired peripheral situation.
  • a memory that stores information regarding the update of the parameters may further be provided.
  • the processor may update the parameters based on the information stored in the memory.
  • the memory may store a table describing, in association with one another, the sensor and type of algorithm used for estimating the self-position, the parameter to be updated, the condition, and the update data.
  • the processor may update the parameters with reference to the table.
  • the processor may update the parameters for the algorithm that has or is likely to have reduced accuracy.
  • the processor may update the parameter so as to select a route that suppresses the decrease in accuracy or a route that improves the accuracy.
  • the parameter to be updated may be a cost map.
  • the information processing method is a method of controlling the information processing apparatus described above.
  • the program causes the processor to execute the process described above.
  • FIG. 1 is a block diagram of an information processing device according to an embodiment; FIGS. 2 and 3 are flowcharts showing the processing of the information processing device according to the embodiment.
  • FIG. 1 is a block diagram showing a configuration of an information processing device according to an embodiment.
  • the information processing device 1 is a device that estimates the self-position and the self-position accuracy based on the information acquired from one or a plurality of sensors 2, and controls the movement of the moving body based on the estimation results.
  • the information processing device 1 includes a storage unit 100, a self-position estimation unit 102, a self-position accuracy estimation unit 104, a recognizer 106, a parameter update unit 108, a change table holding unit 110, an action planning unit 112, and a control unit 114.
  • the information processing device 1 may be mounted on the moving body together with the sensor 2, for example. As another example, the information processing device 1 may be provided at a location separate from the moving body on which the sensor 2 is mounted, connected via a communication path such as a network.
  • the storage unit 100 stores data and the like required for the operation of the information processing device 1. In addition to input information and output information, data such as the progress of calculation may be temporarily stored.
  • the change table holding unit 110 which will be described later, may be a part of the storage unit 100, and in this case, the change table is stored in the storage unit 100.
  • some or all of the components of the information processing device 1 are implemented by a processor (processing circuit) such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit) operating on data stored in the storage unit 100.
  • alternatively, they may be implemented by a dedicated analog or digital circuit, for example an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
  • the self-position estimation unit 102 executes self-position estimation based on the information acquired by the sensor 2.
  • this self-position estimation may use, for example, star reckoning, dead reckoning, or an appropriate combination of these methods according to the situation.
  • the self-position estimation unit 102 may execute a plurality of self-position estimation algorithms based on information acquired from a plurality of sensors and appropriately fuse them.
  • as the star reckoning, GPS (Global Positioning System), image feature point maps, NDT (Normal Distributions Transform) matching of LiDAR (Light Detection and Ranging) data, and the like may be used.
  • as the dead reckoning, vehicle odometry, visual SLAM (Simultaneous Localization and Mapping), depth SLAM, an IMU (Inertial Measurement Unit), and the like may be used. Each method requires the corresponding sensor 2, but any of these may be used, and the forms in the present disclosure are not limited to the above examples.
  • the self-position estimation unit 102 may estimate the self-position by using an algorithm set for each sensor based on the information acquired from one or a plurality of sensors 2.
  • the position estimation results based on the information acquired from each sensor 2 may be fused by an appropriate algorithm.
  • the self-position estimation unit 102 may estimate the self-position by using one algorithm for the information acquired from the plurality of sensors 2.
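  • As a concrete illustration of such fusion, the per-sensor estimates could be combined by inverse-covariance weighting, so that more confident estimates contribute more. The sketch below assumes this particular fusion rule and 2D positions; the disclosure itself does not prescribe a specific fusion algorithm.

```python
import numpy as np

def fuse_estimates(positions, covariances):
    # Information-filter style fusion: each estimate is weighted by the
    # inverse of its covariance (its "information"), then normalized.
    info_sum = np.zeros((2, 2))
    weighted = np.zeros(2)
    for pos, cov in zip(positions, covariances):
        info = np.linalg.inv(cov)
        info_sum += info
        weighted += info @ pos
    fused_cov = np.linalg.inv(info_sum)
    return fused_cov @ weighted, fused_cov

# Hypothetical example: a confident LiDAR estimate and a drifting odometry one.
p_lidar, c_lidar = np.array([10.2, 5.1]), np.diag([0.04, 0.04])
p_odom, c_odom = np.array([10.6, 5.4]), np.diag([0.25, 0.25])
fused_pos, fused_cov = fuse_estimates([p_lidar, p_odom], [c_lidar, c_odom])
```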
  • the self-position accuracy estimation unit 104 calculates the accuracy of the self-position estimated by the self-position estimation unit 102.
  • to calculate this accuracy, a general method using, for example, a variance-covariance matrix may be used.
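  • For instance, the variance-covariance matrix of the estimated position can be reduced to a single accuracy score. The following is a minimal sketch; the mapping from covariance to score is an illustrative assumption, not a formula given in the disclosure.

```python
import numpy as np

def accuracy_from_covariance(cov):
    # RMS positional uncertainty from the trace of the covariance matrix,
    # mapped so that a tighter estimate yields a score closer to 1.
    spread = np.sqrt(np.trace(cov))
    return 1.0 / (1.0 + spread)

cov = np.array([[0.09, 0.01],
                [0.01, 0.16]])  # e.g. output of a localization filter
score = accuracy_from_covariance(cov)  # ~0.67 here
```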
  • the accuracy of self-position estimation may differ depending on the method used to perform self-position estimation. In this case, the accuracy of self-position estimation suitable for the method may be calculated.
  • the self-position accuracy estimation unit 104 may detect a case where the accuracy of the self-position estimation is expected to decrease in the near future based on the result of the current self-position estimation.
  • a situation in which the self-position accuracy is expected to decrease may be detected from the results of self-position estimation several times in the past, for example. Further, as another example, it may be detected based on statistical information, or it may be detected by inputting the result of past self-position estimation into a trained model by machine learning.
  • depending on the method used, the accuracy can be calculated from the total mileage, the matching rate, and the like, or a likely decrease in accuracy can be detected from such quantities.
  • accordingly, the parts described below as calculating the accuracy may be read equally as detecting the possibility that the accuracy will decrease.
  • in the case of vehicle odometry, for example, the accuracy can be calculated using the total mileage, or using the total absolute number of revolutions.
  • when the self-position estimation method is star reckoning, the covariance obtained by the above dead reckoning is available as a numerical value, and the accuracy may be calculated using this covariance.
  • for matching-based methods, the accuracy may be calculated using the mileage, or the distance traveled since the last matching, or the matching rate may be used.
  • in the case of GPS, the accuracy may be calculated using the DOP (Dilution of Precision).
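  • Pulled together, such per-method indicators might look like the sketch below. Every threshold and decay constant here is an illustrative assumption; the disclosure only names the quantities (mileage, matching rate, covariance, DOP), not how to scale them.

```python
def heuristic_accuracy(method, **obs):
    if method == "vehicle_odometry":
        # Accuracy decays with the mileage accumulated since the last
        # absolute (star reckoning) fix; 100 m to zero is an assumption.
        return max(0.0, 1.0 - obs["mileage_since_fix_m"] / 100.0)
    if method == "scan_matching":
        # The matching rate (0..1) is used directly as the indicator.
        return obs["matching_rate"]
    if method == "gps":
        # Lower DOP (Dilution of Precision) means a better satellite
        # geometry, hence higher accuracy.
        return min(1.0, 1.0 / obs["dop"])
    raise ValueError(f"unknown method: {method}")

acc = heuristic_accuracy("scan_matching", matching_rate=0.42)
```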
  • the self-position accuracy estimation unit 104 calculates the self-position accuracy based on the method used by the self-position estimation unit 102.
  • the self-position accuracy estimation unit 104 may calculate the accuracy of self-position estimation for each sensor or for each algorithm.
  • when the self-position estimation unit 102 calculates a self-position estimation result for each sensor or algorithm, the self-position accuracy estimation unit 104 may calculate the accuracy of the result for each sensor or algorithm respectively.
  • the self-position accuracy estimation unit 104 may also calculate the accuracy of the final fused self-position.
  • the self-position accuracy estimation unit 104 may calculate the accuracy of the self-position estimated by different algorithms. In this case, it may be determined whether or not to update the parameters for each algorithm, or whether or not to update the parameters as a result of the fusion may be determined. When the update is determined as a result of the fusion, the parameters for some of the algorithms may be selected and updated.
  • the self-position estimation unit 102 and the self-position accuracy estimation unit 104 are separate components, but these may be one component in some cases. For example, when a method of calculating self-position estimation and self-position accuracy together is used, these components execute self-position estimation and accuracy calculation as one component.
  • the recognizer 106 executes various recognition processes based on the output from the sensor 2.
  • this recognition processing covers recognition other than that used for self-position estimation, such as detection of people and detection of obstacles.
  • for this recognition and detection, any feature descriptor may be used, such as HOG (Histogram of Oriented Gradients), SIFT (Scale-Invariant Feature Transform), or optical flow.
  • the recognition and detection are not limited to those using these features, and recognition may be performed using a recognition model trained in advance by machine learning. In this way, the recognizer 106 acquires information and a situation regarding the environment around the moving body based on the information acquired from the sensor.
  • the parameter update unit 108 updates the parameters used for autonomous movement, using the table held in the change table holding unit 110, based on the self-position accuracy estimated by the self-position accuracy estimation unit 104 and the result recognized by the recognizer 106.
  • the change table holding unit 110 stores, for example, an action plan parameter change table for each algorithm and sensor.
  • the action plan parameter change table stores, per algorithm and sensor, the parameters to be changed and the changed values together with the conditions under which they apply.
  • the parameter update unit 108 updates the score for, for example, the grid on the map, each area, and the like.
  • the action planning unit 112 generates an action plan based on the parameters updated by the parameter update unit 108. For determining this action plan, known methods such as A* or Dijkstra's algorithm may be used, but the determination is not limited to these, and other methods may be used.
  • the parameters are updated for moving without changing the action performed by the moving body. By determining the action plan in this way, it becomes possible to move the moving body appropriately while it continues the action it is carrying out.
  • the action planning unit 112 can, for example, create an action plan that moves along a route that does not reduce the accuracy of self-position estimation while performing the action currently being executed.
  • alternatively, in order to recover the accuracy of self-position estimation, the action planning unit 112 may determine an action plan that moves the body to a place suitable for star reckoning, without reducing the current accuracy, while continuing the action currently being executed.
  • if an action-planning algorithm different from the above is used, the parameters updated by the parameter update unit 108 and the contents of the table held in the change table holding unit 110 will of course differ; any configuration that allows the action plan to be determined appropriately for that algorithm is sufficient.
  • the control unit 114 controls the moving body based on the action plan generated by the action planning unit 112.
  • in the above, the sensor 2 is mounted on the moving body, but the configuration is not limited to this.
  • it may be a fixed sensor that monitors an area where the moving body can move.
  • FIG. 2 is a flowchart showing the overall processing of the information processing apparatus 1 according to the embodiment.
  • the information processing apparatus 1 estimates the self-position by the self-position estimation unit 102 based on the information acquired from the sensor 2, and calculates the accuracy of this self-position by the self-position accuracy estimation unit 104 (S10).
  • the self-position estimation and the accuracy calculation may be executed at the same timing.
  • next, when the accuracy of the self-position has decreased, the information processing device 1 updates, via the parameter update unit 108, the parameters corresponding either to behavior that suppresses the self-position error or to behavior that improves the self-position accuracy (S20).
  • the parameter update unit 108 may select and output which of these behaviors to target based on the self-position accuracy.
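  • One iteration of this S10 -> S20 -> S30 flow could be organized as below. The unit interfaces and the threshold value are assumptions made for the sketch; FIG. 2 specifies only the order of the steps.

```python
ACCURACY_THRESHOLD = 0.5  # illustrative trigger for the S20 update

def control_step(sensors, estimator, accuracy_estimator,
                 parameter_updater, planner, controller):
    readings = [sensor.read() for sensor in sensors]
    pose = estimator.estimate(readings)                    # S10: estimate
    accuracy = accuracy_estimator.calculate(pose, readings)
    if accuracy < ACCURACY_THRESHOLD:                      # decreased, or
        parameter_updater.update(accuracy, readings)       # likely to: S20
    plan = planner.plan(pose, parameter_updater.parameters())  # S30
    controller.execute(plan)                               # move the body
```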
  • FIG. 3 is a flowchart showing the update process of this action plan parameter more concretely.
  • the parameter update unit 108 acquires the self-position accuracy calculated by the self-position accuracy estimation unit 104 (S200).
  • the parameter update unit 108 acquires data related to the action plan parameters based on the sensor and algorithm whose accuracy has deteriorated (S202). For example, the parameter update unit 108 refers to the parameter change table and acquires data such as the parameters to be updated, the conditions under which to update them, and the values to be applied.
  • This update may use, for example, the results of recognizer 106. As an example of this update, there may be the following.
  • from the result of person recognition, a parameter that increases the cost around a person is acquired, because moving objects degrade recognition accuracy. Parameters that reduce the cost near walls and obstacles may also be acquired, because matching accuracy improves near walls and obstacles that do not move.
  • based on the map and the recognition results, the cost of areas under roofs, in tunnels, near buildings, and so on is increased. This makes it easier to receive radio waves from the satellites used for GPS.
  • the parameter update unit 108 acquires the data related to the parameter update by referring to the table in which the parameter update as described above is described.
  • FIG. 4 is a diagram showing an example of a change table held in the change table holding unit 110 according to the embodiment.
  • a part of the parameter update related to LiDAR scan matching is extracted and shown.
  • the change table is, for example, a table containing the sensor/algorithm, the target in which the parameters are reflected, conditions, and update data. If the sensor/algorithm is LiDAR scan matching, the target in which the parameter update is reflected is the cost map. The conditions may be, for example, the distance to a person or the distance to a wall. The data used to apply the parameter update is described in the update data column. In the processing of S202, the update target parameters and update data matching the conditions are acquired from this change table.
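  • Expressed as data, the LiDAR scan matching rows of such a change table might look like the sketch below, with conditions and factors taken from the example described later (double the cost near a person, halve it near a wall); the concrete column values are illustrative.

```python
# Rows of a hypothetical change table for one sensor/algorithm pair.
CHANGE_TABLE = [
    {"sensor_algorithm": "lidar_scan_matching", "target": "cost_map",
     "condition": "within 0.5 m of a person", "update": {"factor": 2.0}},
    {"sensor_algorithm": "lidar_scan_matching", "target": "cost_map",
     "condition": "0.5 m to 1.0 m from a person", "update": {"factor": 2.0}},
    {"sensor_algorithm": "lidar_scan_matching", "target": "cost_map",
     "condition": "within 1.0 m of a wall", "update": {"factor": 0.5}},
]

def lookup_updates(sensor_algorithm):
    # S202: fetch the update rules for the degraded sensor/algorithm.
    return [row for row in CHANGE_TABLE
            if row["sensor_algorithm"] == sensor_algorithm]
```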
  • in the above, a table was used, but the format is not limited to this.
  • for example, a file storing different conversion data may be prepared for each algorithm or the like, and the parameter update data may be acquired based on the contents of the file.
  • the data may also be hard-coded in the program or encoded in a circuit.
  • alternatively, the data may be stored in a DB (database).
  • This DB may be updated appropriately by the administrator or automatically by a trained model or the like at an appropriate timing.
  • the information processing apparatus 1 updates the parameters by the parameter update unit 108 (S204). For example, consider the case where the sensor algorithm used for self-position estimation is LiDAR scan matching, and the accuracy of this LiDAR scan matching is reduced.
  • in the cost map, the parameter update unit 108 doubles the cost within 0.5 m around a person and likewise doubles the cost in the range of 0.5 m to 1.0 m. If the recognizer 106 determines that there is a wall, the parameter update unit 108 halves the cost of the cost map within 1 m of the wall. In this way, the parameter update unit 108 updates the cost map.
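  • As a sketch, applying those factors to a grid cost map could look like the following. The per-cell distance fields are assumed to be derived from the output of the recognizer 106; how they are computed is not specified here.

```python
import numpy as np

def update_cost_map(cost_map, dist_to_person, dist_to_wall):
    # Scale costs using the factors from the example above: x2 within
    # 1.0 m of a person (both the 0-0.5 m and 0.5-1.0 m bands are doubled),
    # and x0.5 within 1.0 m of a static wall, where matching improves.
    updated = cost_map.astype(float).copy()
    updated[dist_to_person <= 1.0] *= 2.0
    updated[dist_to_wall <= 1.0] *= 0.5
    return updated
```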
  • the parameter update unit 108 updates the parameters for executing the self-position estimation based on the information stored in the change table.
  • the self-position accuracy estimation unit 104 may output the self-position accuracy as a numerical value, and the parameter update unit 108 may execute the above parameter update when this value falls below a predetermined threshold. Conversely, for measures where values closer to 0 mean higher accuracy, the parameter update may be executed when the value exceeds a predetermined threshold.
  • the recognition result of the recognizer 106 may be used, or depending on the combination of the sensor and the algorithm, the update may be executed without using the recognition result of the recognizer 106.
  • the parameter update unit 108 may update the parameters by extracting from the change table only the data related to the sensor and algorithm whose accuracy is found to have deteriorated. Executing the update in this way makes it possible to suppress the decrease in, or improve, the self-position estimation of the sensor or algorithm whose self-position accuracy has deteriorated.
  • when there is a sensor or algorithm found to be degraded, the parameter update unit 108 may also update parameters related to other sensors and algorithms being executed in parallel. In this way, the parameters may be updated for the sensor or algorithm whose self-position accuracy has deteriorated, while the self-position accuracy of other sensors or algorithms is improved to compensate for the decrease. As a result, the deterioration of the overall self-position accuracy may be suppressed or reduced.
  • the information processing apparatus 1 then creates an action plan by the action planning unit 112 based on the parameters updated by the parameter updating unit 108 (S30).
  • when the parameter update unit 108 updates the costs of the cost map as in the above example, an action plan is created over the updated cost map using an appropriate algorithm such as A* or Dijkstra's algorithm.
  • control unit 114 controls the moving body based on this action plan.
  • Figure 5 is a diagram showing the action plan before updating the parameters.
  • the figure shows the route along which the moving body moves while working in a passage sandwiched between walls. Costs are indicated by the shaded areas; diagonal hatching with the same direction and spacing means the same cost.
  • the walls may be given a cost of infinity, or a value sufficiently larger than the other costs, for example 100.
  • the center of the passage is set to cost 1, and the cost rises to 2 and then 4 approaching the walls. In this case, the moving body's action plan goes straight and then turns right toward the target point, as indicated by the arrow.
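  • A minimal planner over such a corridor illustrates the mechanism: with the FIG. 5 costs (walls 100, then 4, 2, 1 toward the center), Dijkstra's algorithm keeps the route in the cheap center lane; re-running it on the updated map of FIG. 6 would pull the route toward the walls and away from the person. The grid below is an illustrative stand-in for the figure.

```python
import heapq

def dijkstra(cost, start, goal):
    # Minimum-cost path on a 4-connected grid; entering a cell costs its value.
    rows, cols = len(cost), len(cost[0])
    dist, prev = {start: 0}, {}
    queue = [(0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal:
            break
        if d > dist[node]:
            continue
        r, c = node
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols:
                nd = d + cost[nxt[0]][nxt[1]]
                if nd < dist.get(nxt, float("inf")):
                    dist[nxt], prev[nxt] = nd, node
                    heapq.heappush(queue, (nd, nxt))
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1]

corridor = [[100, 100, 100, 100, 100],
            [4, 4, 4, 4, 4],
            [2, 2, 2, 2, 2],
            [1, 1, 1, 1, 1],   # cheap center lane
            [2, 2, 2, 2, 2],
            [4, 4, 4, 4, 4],
            [100, 100, 100, 100, 100]]
route = dijkstra(corridor, start=(3, 0), goal=(3, 4))  # stays in the center
```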
  • FIG. 6 is a diagram showing an action plan when there is a parameter update.
  • FIG. 6 shows a cost map that is updated when a decrease in the accuracy of self-position estimation is observed.
  • based on the recognition result of the person in the passage and the recognition result of the walls, the parameter update unit 108 updates the cost map from the state of FIG. 5 to the state of FIG. 6 according to the change table. Based on the change table shown in FIG. 4, the cost around the person is raised and the cost near the walls is lowered.
  • the information processing apparatus 1 can select a route that suppresses a decrease in self-position accuracy or a route that is expected to recover self-position accuracy by updating the parameters.
  • the action planning unit 112 recreates the action plan based on this cost map. The action planning unit 112 then creates, for example, the action plan shown by the solid line in place of the original action plan shown by the dotted line. For example, as shown in FIG. 6, when a target point is determined, the action planning unit 112 may create the new action plan as a route toward that target point. When the body is moving freely rather than toward a specific target point, an action plan for moving appropriately based on the surrounding environment is created instead.
  • in this way, it becomes possible to create an action plan that takes the accuracy of self-position estimation into consideration, for example changing the route while driving without changing the purpose of driving to the destination.
  • the technology related to this disclosure can be applied to various products.
  • the technology according to the present disclosure may be realized as a device mounted on any kind of moving body, such as an automobile, electric vehicle, hybrid electric vehicle, motorcycle, bicycle, personal mobility device, airplane, drone, ship, robot, construction machine, or agricultural machine (tractor).
  • FIG. 7 is a block diagram showing a schematic configuration example of a vehicle control system 7000, which is an example of a mobile control system to which the technique according to the present disclosure can be applied.
  • the vehicle control system 7000 includes a plurality of electronic control units connected via a communication network 7010.
  • the vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, a vehicle exterior information detection unit 7400, an in-vehicle information detection unit 7500, and an integrated control unit 7600.
  • the communication network 7010 connecting these control units may be an in-vehicle communication network conforming to any standard, such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).
  • each control unit includes a microcomputer that performs arithmetic processing according to various programs, a storage unit that stores the programs executed by the microcomputer and the parameters used for various calculations, and a drive circuit that drives the devices under its control.
  • each control unit includes a network I/F for communicating with other control units via the communication network 7010, and a communication I/F for communicating with devices or sensors inside and outside the vehicle by wired or wireless communication. In FIG. 7, as the functional configuration of the integrated control unit 7600, a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning unit 7640, a beacon receiving unit 7650, an in-vehicle device I/F 7660, an audio/image output unit 7670, an in-vehicle network I/F 7680, and a storage unit 7690 are illustrated.
  • Other control units also include a microcomputer, a communication I / F, a storage unit, and the like.
  • the drive system control unit 7100 controls the operation of the device related to the drive system of the vehicle according to various programs.
  • for example, the drive system control unit 7100 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, and a braking device for generating the braking force of the vehicle.
  • the drive system control unit 7100 may have a function as a control device such as ABS (Antilock Brake System) or ESC (Electronic Stability Control).
  • the vehicle state detection unit 7110 is connected to the drive system control unit 7100.
  • the vehicle state detection unit 7110 includes, for example, at least one of a gyro sensor that detects the angular velocity of the axial rotation of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting the accelerator pedal operation amount, the brake pedal operation amount, the steering wheel angle, the engine speed, the wheel rotation speed, and the like.
  • the drive system control unit 7100 performs arithmetic processing using a signal input from the vehicle state detection unit 7110, and controls an internal combustion engine, a drive motor, an electric power steering device, a brake device, and the like.
  • the body system control unit 7200 controls the operation of various devices mounted on the vehicle body according to various programs.
  • the body system control unit 7200 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as headlamps, back lamps, brake lamps, turn signals or fog lamps.
  • a radio wave transmitted from a portable device that substitutes for a key or signals of various switches may be input to the body system control unit 7200.
  • the body system control unit 7200 receives inputs of these radio waves or signals and controls a vehicle door lock device, a power window device, a lamp, and the like.
  • the battery control unit 7300 controls the secondary battery 7310, which is the power supply source of the drive motor, according to various programs. For example, information such as the battery temperature, the battery output voltage, or the remaining capacity of the battery is input to the battery control unit 7300 from the battery device including the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals, and controls the temperature control of the secondary battery 7310 or the cooling device provided in the battery device.
  • the vehicle outside information detection unit 7400 detects information outside the vehicle equipped with the vehicle control system 7000.
  • the image pickup unit 7410 and the vehicle exterior information detection unit 7420 is connected to the vehicle exterior information detection unit 7400.
  • the image pickup unit 7410 includes at least one of a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras.
  • the vehicle exterior information detection unit 7420 includes, for example, at least one of an environment sensor for detecting the current weather or meteorological phenomena, and an ambient information detection sensor for detecting other vehicles, obstacles, pedestrians, and the like around the vehicle equipped with the vehicle control system 7000.
  • the environment sensor may be, for example, at least one of a raindrop sensor that detects rainy weather, a fog sensor that detects fog, a sunshine sensor that detects the degree of sunshine, and a snow sensor that detects snowfall.
  • the ambient information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device.
  • the image pickup unit 7410 and the vehicle exterior information detection unit 7420 may be provided as independent sensors or devices, or may be provided as a device in which a plurality of sensors or devices are integrated.
  • FIG. 8 shows an example of the installation position of the image pickup unit 7410 and the vehicle exterior information detection unit 7420.
  • the imaging units 7910, 7912, 7914, 7916, 7918 are provided, for example, at at least one of the positions on the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 7900.
  • the image pickup unit 7910 provided in the front nose and the image pickup section 7918 provided in the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 7900.
  • the image pickup units 7912 and 7914 provided in the side mirrors mainly acquire images of the side of the vehicle 7900.
  • the image pickup unit 7916 provided in the rear bumper or the back door mainly acquires an image of the rear of the vehicle 7900.
  • the image pickup unit 7918 provided on the upper part of the windshield in the vehicle interior is mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
  • FIG. 8 shows an example of the imaging ranges of the imaging units 7910, 7912, 7914, and 7916. Imaging range a indicates the imaging range of the imaging unit 7910 provided on the front nose, imaging ranges b and c indicate the imaging ranges of the imaging units 7912 and 7914 provided on the side mirrors, respectively, and imaging range d indicates the imaging range of the imaging unit 7916 provided on the rear bumper or back door. For example, by superimposing the image data captured by the imaging units 7910, 7912, 7914, and 7916, a bird's-eye view image of the vehicle 7900 can be obtained.
  • the vehicle exterior information detection unit 7920, 7922, 7924, 7926, 7928, 7930 provided at the front, rear, side, corner and the upper part of the windshield of the vehicle interior of the vehicle 7900 may be, for example, an ultrasonic sensor or a radar device.
  • the vehicle exterior information detection units 7920, 7926, 7930 provided on the front nose, rear bumper, back door, and upper part of the windshield in the vehicle interior of the vehicle 7900 may be, for example, a lidar device.
  • These out-of-vehicle information detection units 7920 to 7930 are mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, or the like.
  • the vehicle exterior information detection unit 7400 causes the imaging unit 7410 to capture an image of the outside of the vehicle and receives the captured image data. It also receives detection information from the connected vehicle exterior information detection unit 7420. When the vehicle exterior information detection unit 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the vehicle exterior information detection unit 7400 transmits ultrasonic waves, electromagnetic waves, or the like, and receives information on the received reflected waves.
  • the out-of-vehicle information detection unit 7400 may perform object detection processing or distance detection processing such as a person, a vehicle, an obstacle, a sign, or a character on a road surface based on the received information.
  • the out-of-vehicle information detection unit 7400 may perform an environment recognition process for recognizing rainfall, fog, road surface conditions, etc. based on the received information.
  • the out-of-vehicle information detection unit 7400 may calculate the distance to an object outside the vehicle based on the received information.
  • the vehicle outside information detection unit 7400 may perform image recognition processing or distance detection processing for recognizing a person, a vehicle, an obstacle, a sign, a character on the road surface, or the like based on the received image data.
  • the vehicle exterior information detection unit 7400 may perform processing such as distortion correction or alignment on the received image data, and may synthesize image data captured by different imaging units 7410 to generate a bird's-eye view image or a panoramic image.
  • the vehicle exterior information detection unit 7400 may perform the viewpoint conversion process using the image data captured by different image pickup units 7410.
  • the in-vehicle information detection unit 7500 detects the in-vehicle information.
  • a driver state detection unit 7510 that detects the state of the driver is connected to the in-vehicle information detection unit 7500.
  • the driver state detection unit 7510 may include a camera that captures the driver, a biosensor that detects the driver's biological information, a microphone that collects sound in the vehicle interior, and the like.
  • the biosensor is provided on, for example, on the seat surface or the steering wheel, and detects the biometric information of the passenger sitting on the seat or the driver holding the steering wheel.
  • the in-vehicle information detection unit 7500 may calculate the degree of fatigue or concentration of the driver based on the detection information input from the driver state detection unit 7510, and may determine whether the driver is dozing off.
  • the in-vehicle information detection unit 7500 may perform processing such as noise canceling processing on the collected audio signal.
  • the integrated control unit 7600 controls the overall operation in the vehicle control system 7000 according to various programs.
  • An input unit 7800 is connected to the integrated control unit 7600.
  • the input unit 7800 is realized by a device that can be input-operated by the occupant, such as a touch panel, a button, a microphone, a switch, or a lever. Data obtained by recognizing the voice input by the microphone may be input to the integrated control unit 7600.
  • the input unit 7800 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or a PDA (Personal Digital Assistant) that supports operation of the vehicle control system 7000.
  • the input unit 7800 may be, for example, a camera, in which case the passenger can input information by gesture. Alternatively, data obtained by detecting the movement of the wearable device worn by the passenger may be input. Further, the input unit 7800 may include, for example, an input control circuit that generates an input signal based on the information input by the passenger or the like using the above input unit 7800 and outputs the input signal to the integrated control unit 7600. By operating the input unit 7800, the passenger or the like inputs various data to the vehicle control system 7000 and instructs the processing operation.
  • the storage unit 7690 may include a ROM (Read Only Memory) for storing various programs executed by the microcomputer, and a RAM (Random Access Memory) for storing various parameters, calculation results, sensor values, and the like. Further, the storage unit 7690 may be realized by a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, an optical magnetic storage device, or the like.
  • the general-purpose communication I / F 7620 is a general-purpose communication I / F that mediates communication with various devices existing in the external environment 7750.
  • the general-purpose communication I/F 7620 may implement a cellular communication protocol such as GSM (registered trademark) (Global System for Mobile communications), WiMAX (registered trademark), LTE (registered trademark) (Long Term Evolution), or LTE-A (LTE-Advanced), or another wireless communication protocol such as Bluetooth (registered trademark).
  • the general-purpose communication I/F 7620 may connect, for example, to a device (for example, an application server or a control server) on an external network (for example, the Internet, a cloud network, or an operator-specific network) via a base station or an access point. Further, the general-purpose communication I/F 7620 may connect, using, for example, P2P (Peer To Peer) technology, to a terminal existing near the vehicle (for example, a terminal of a driver, a pedestrian, or a store, or an MTC (Machine Type Communication) terminal).
  • the dedicated communication I / F 7630 is a communication I / F that supports a communication protocol formulated for use in a vehicle.
  • the dedicated communication I/F 7630 may implement a standard protocol such as WAVE (Wireless Access in Vehicle Environment), which is a combination of IEEE 802.11p for the lower layer and IEEE 1609 for the upper layer, DSRC (Dedicated Short Range Communications), or a cellular communication protocol.
  • the dedicated communication I/F 7630 typically carries out V2X communication, a concept that includes one or more of vehicle-to-vehicle (Vehicle to Vehicle) communication, road-to-vehicle (Vehicle to Infrastructure) communication, vehicle-to-home (Vehicle to Home) communication, and pedestrian-to-vehicle (Vehicle to Pedestrian) communication.
  • the positioning unit 7640 receives, for example, a GNSS signal from a GNSS (Global Navigation Satellite System) satellite (for example, a GPS signal from a GPS (Global Positioning System) satellite), executes positioning, and generates position information including the latitude, longitude, and altitude of the vehicle.
  • the positioning unit 7640 may specify the current position by exchanging signals with the wireless access point, or may acquire position information from a terminal such as a mobile phone, PHS, or smartphone having a positioning function.
  • the beacon receiving unit 7650 receives, for example, a radio wave or an electromagnetic wave transmitted from a radio station or the like installed on a road, and acquires information such as a current position, a traffic jam, a road closure, or a required time.
  • the function of the beacon receiving unit 7650 may be included in the above-mentioned dedicated communication I / F 7630.
  • the in-vehicle device I / F 7660 is a communication interface that mediates the connection between the microcomputer 7610 and various in-vehicle devices 7760 existing in the vehicle.
  • the in-vehicle device I / F7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication) or WUSB (Wireless USB).
  • further, the in-vehicle device I/F 7660 may establish a wired connection such as USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), or MHL (Mobile High-Definition Link) via a connection terminal (and, if necessary, a cable), not shown.
  • the in-vehicle device 7760 includes, for example, at least one of a passenger's mobile device or wearable device, or an information device carried in or attached to the vehicle. Further, the in-vehicle device 7760 may include a navigation device for searching a route to an arbitrary destination.
  • the in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
  • the in-vehicle network I / F7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010.
  • the vehicle-mounted network I / F7680 transmits / receives signals and the like according to a predetermined protocol supported by the communication network 7010.
  • the microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 according to various programs, based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680. For example, the microcomputer 7610 may calculate a control target value for the driving force generating device, the steering mechanism, or the braking device based on the acquired information about the inside and outside of the vehicle, and output a control command to the drive system control unit 7100.
  • for example, the microcomputer 7610 may perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions, including collision avoidance or impact mitigation of the vehicle, follow-up traveling based on the inter-vehicle distance, vehicle speed maintenance traveling, vehicle collision warning, lane departure warning, and the like.
  • the microcomputer 7610 may also perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without relying on the driver's operation, by controlling the driving force generating device, the steering mechanism, the braking device, and so on based on the acquired information about the surroundings of the vehicle.
  • the microcomputer 7610 may generate three-dimensional distance information between the vehicle and objects such as surrounding structures and persons, based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680, and may create local map information including peripheral information on the current position of the vehicle. Further, the microcomputer 7610 may predict dangers such as a vehicle collision, an approaching pedestrian or the like, or entry into a closed road based on the acquired information, and generate a warning signal.
  • the warning signal may be, for example, a signal for generating a warning sound or lighting a warning lamp.
  • the audio image output unit 7670 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying information to the passenger or the outside of the vehicle.
  • an audio speaker 7710, a display unit 7720, and an instrument panel 7730 are exemplified as output devices.
  • the display unit 7720 may include, for example, at least one of an onboard display and a head-up display.
  • the display unit 7720 may have an AR (Augmented Reality) display function.
  • the output device may be other devices such as headphones, wearable devices such as eyeglass-type displays worn by passengers, projectors or lamps other than these devices.
  • the display device visually displays the results obtained by the various processes performed by the microcomputer 7610, or the information received from other control units, in various formats such as text, images, tables, and graphs.
  • the audio output device converts an audio signal composed of reproduced voice data, acoustic data, or the like into an analog signal and outputs it audibly.
  • At least two control units connected via the communication network 7010 may be integrated as one control unit.
  • each control unit may be composed of a plurality of control units.
  • the vehicle control system 7000 may include another control unit (not shown).
  • the other control unit may have a part or all of the functions carried out by any of the control units. That is, as long as information is transmitted and received via the communication network 7010, predetermined arithmetic processing may be performed by any of the control units.
  • a sensor or device connected to one of the control units may be connected to another control unit, and a plurality of control units may send and receive detection information to and from each other via the communication network 7010.
  • a computer program for realizing each function of the information processing apparatus 1 according to the present embodiment described with reference to FIG. 1 can be implemented in any control unit or the like. It is also possible to provide a computer-readable recording medium in which such a computer program is stored.
  • the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like. Further, the above computer program may be distributed, for example, via a network without using a recording medium.
  • the information processing device 1 can be applied to the integrated control unit 7600 of the application example shown in FIG.
  • for example, the components of the information processing device 1 correspond to the microcomputer 7610, the storage unit 7690, and the in-vehicle network I/F 7680 of the integrated control unit 7600.
  • the integrated control unit 7600 can assist automatic driving and autonomous movement by generating a map and estimating a self-position.
  • at least some of the components of the information processing device 1 described with reference to FIG. 1 may be realized in a module (for example, an integrated circuit module composed of one die) for the integrated control unit 7600 shown in FIG. 7. Alternatively, the information processing device 1 described with reference to FIG. 1 may be realized by a plurality of control units of the vehicle control system 7000 shown in FIG. 7.
  • (1) An information processing device comprising a processor for controlling a moving body, wherein the processor calculates the accuracy of self-position estimation based on information acquired from one or more sensors; when the accuracy has decreased, or is expected to decrease, or both, updates parameters based on the algorithm that performs the self-position estimation; and creates an action plan based on the updated parameters.
  • (2) The information processing device according to (1), wherein the processor controls the moving body based on the created action plan.
  • (3) The information processing device according to (1) or (2), wherein the processor estimates the self-position using a self-position estimation algorithm set for each sensor.
  • (4) The information processing device according to (1) or (2), wherein the processor estimates the self-position using one algorithm based on the outputs from a plurality of the sensors.
  • (5) The information processing device according to any one of (1) to (4), wherein the processor updates the parameters based on the sensor or algorithm used to estimate the self-position.
  • (6) The information processing device according to any one of (1) to (4), wherein the processor calculates the result of self-position estimation for each sensor or algorithm, and calculates the accuracy.
  • (7) The information processing device according to any one of (1) to (6), wherein the processor calculates the accuracy using one algorithm based on the outputs from a plurality of the sensors.
  • (8) The information processing device according to any one of (1) to (7), wherein the processor acquires the surrounding situation of the moving body based on the information acquired from the sensor, and updates the parameters based on the acquired surrounding situation.
  • (9) The information processing device according to any one of (1) to (8), further comprising a memory that stores information regarding the update of the parameters, wherein the processor updates the parameters based on the information stored in the memory.
  • (10) The information processing device according to (9), wherein the memory stores a table describing, in association with one another, the sensor and type of algorithm used for estimating the self-position, the parameter to be updated, the condition, and the update data, and the processor updates the parameters with reference to the table.
  • (11) The information processing device according to (9) or (10), wherein the processor updates the parameters for the algorithm whose accuracy has decreased or is expected to decrease.
  • (12) The information processing device according to any one of (9) to (11), wherein the processor updates the parameters so as to select a route that suppresses the decrease in accuracy or a route that improves the accuracy.
  • (13) The information processing device according to any one of (1) to (12), wherein the parameter to be updated is a cost map.
  • the aspect of the present disclosure is not limited to the above-mentioned embodiment, but also includes various possible modifications, and the effect of the present disclosure is not limited to the above-mentioned contents.
  • further, the components in each embodiment may be applied in appropriate combinations. Various additions, changes, and partial deletions are possible without departing from the conceptual idea and purpose of the present disclosure derived from the contents specified in the claims and their equivalents.

Abstract

[Problem] To suppress a reduction in precision when measuring a self-position, and to improve precision. [Solution] An information processing device comprises a processor for controlling a moving body. The processor estimates the self-position on the basis of information acquired from a sensor, calculates the precision of the self-position, updates a parameter based on an algorithm that estimates the self-position when the precision is reduced, and creates a behavior plan on the basis of the updated parameter.

Description

情報処理装置、情報処理方法及びプログラムInformation processing equipment, information processing methods and programs
 本開示は、情報処理装置、情報処理方法及びプログラムに関する。 This disclosure relates to information processing devices, information processing methods and programs.
 自律して移動する移動体を用いて種々の処理を実行しようとする場合、自己位置推定を実行する必要がある。自己位置推定は、絶対的な自己位置を推定するスターレコニングと、相対的な自己位置を推定するデッドレコニングの手法が用いられる。自律移動するための自己位置推定としては、所定の領域において低速であるスターレコニングをし、この絶対的な位置に基づいて、高速に実行できるデッドレコニングにより移動しながら相対的な自己位置推定を実行することが一般的に行われる。 When trying to execute various processes using a moving object that moves autonomously, it is necessary to execute self-position estimation. For self-position estimation, a star reckoning method for estimating the absolute self-position and a dead reckoning method for estimating the relative self-position are used. As self-position estimation for autonomous movement, low-speed star reckoning is performed in a predetermined area, and based on this absolute position, relative self-position estimation is performed while moving by dead reckoning that can be executed at high speed. Is commonly done.
 デッドレコニングは、過去の結果に依存するため、推定に誤差が生じると、この誤差が蓄積されてしまう問題がある。これを回避するために、所定のタイミング、例えば、所定距離の移動をした後に、スターレコニングが実行できる領域等に移動してスターレコニングを実行し、絶対的な位置で自己位置を推定し直すことでデッドレコニングに起因する誤差の蓄積をリセットする。 Since dead reckoning depends on past results, there is a problem that if an error occurs in the estimation, this error will be accumulated. In order to avoid this, after moving at a predetermined timing, for example, a predetermined distance, move to an area where star reckoning can be performed, perform star reckoning, and re-estimate the self-position at an absolute position. Resets the accumulation of errors due to dead reckoning.
 しかしながら、スターレコニングは周囲環境や利用するセンサ及びアルゴリズムによって、利用できるタイミングが限られることが多い。このような場合には、所定のタイミングで所定の位置に移動体を移動させることや、この移動のために、実行している作業を中断して移動するといった行動を変えるといった処置が必要となる。また、所定位置まで戻るための経路においても、経路長は一定ではないため、かなり余裕を持った状態で戻らなくてはいけないといった問題もある。また、デッドレコニングによらず、種々の原因に起因して自己位置推定精度が低下することがある。このような場合にも、上記と同様に自己位置推定精度の低下を解決するべく行動を変化させる必要がある。 However, the timing at which Star Reckoning can be used is often limited depending on the surrounding environment and the sensors and algorithms used. In such a case, it is necessary to move the moving body to a predetermined position at a predetermined timing, or to change the behavior such as interrupting the work being performed and moving for this movement. .. Further, even in the path for returning to a predetermined position, since the path length is not constant, there is a problem that the path must be returned with a considerable margin. In addition, the self-position estimation accuracy may decrease due to various causes regardless of dead reckoning. Even in such a case, it is necessary to change the behavior in order to solve the decrease in the self-position estimation accuracy as described above.
特開2008-59218号公報Japanese Unexamined Patent Publication No. 2008-59218
Therefore, the present disclosure provides an information processing device, an information processing method, and a program that suppress a decrease in the accuracy of self-position estimation and improve that accuracy.
According to one embodiment, an information processing device includes a processor for controlling a mobile body. The processor calculates the accuracy of self-position estimation based on information acquired from one or more sensors; in at least one of the case where the accuracy has decreased and the case where the accuracy is expected to decrease, the processor updates parameters based on the algorithm that performs the self-position estimation, and creates an action plan based on the updated parameters.
The processor may control the mobile body based on the created action plan.
The processor may estimate the self-position using a self-position estimation algorithm set for each sensor.
The processor may estimate the self-position using a single algorithm based on the outputs from a plurality of the sensors.
The processor may update the parameters based on the sensor or the algorithm used for the self-position estimation.
The processor may calculate the result of the self-position estimation with the sensor or the algorithm, and calculate the accuracy of that result.
The processor may calculate the accuracy using a single algorithm based on the outputs from a plurality of the sensors.
The processor may acquire the surrounding situation of the mobile body based on the information acquired from the sensors, and may update the parameters based on the acquired surrounding situation.
The information processing device may further include a memory that stores information about updating the parameters, and the processor may update the parameters based on the information stored in the memory.
The memory may store a table describing information that associates the sensor and the type of algorithm used for the self-position estimation, the parameter to be updated, a condition, and update data, and the processor may update the parameters with reference to the table.
The processor may update the parameters for the algorithm whose accuracy has decreased or is expected to decrease.
The processor may update the parameters so as to select a route that suppresses the decrease in the accuracy, or a route that improves the accuracy.
The parameter to be updated may be a cost map.
According to one embodiment, an information processing method is a method of controlling the information processing device described above.
According to one embodiment, a program causes a processor to execute the processing described above.
Fig. 1 is a block diagram of an information processing device according to an embodiment.
Fig. 2 is a flowchart showing processing of the information processing device according to the embodiment.
Fig. 3 is a flowchart showing processing of the information processing device according to the embodiment.
Fig. 4 is a diagram showing an example of an update table according to the embodiment.
Fig. 5 is a diagram showing an example of a cost map according to the embodiment.
Fig. 6 is a diagram showing an example of a cost map according to the embodiment.
Fig. 7 is a block diagram showing an example of the schematic configuration of a vehicle control system.
Fig. 8 is an explanatory diagram showing an example of the installation positions of vehicle exterior information detection units and imaging units.
Hereinafter, embodiments of the present disclosure will be described with reference to the drawings. The drawings are used for explanation, and the shape and size of each component in an actual device, and its size ratio to other components, need not be as shown in the figures. Since the drawings are simplified, configurations required for implementation other than those shown are assumed to be provided as appropriate. Note that, when comparing numerical values, expressions such as "or less" and "or more" can be rewritten as "less than" and "greater than", respectively, as appropriate.
Fig. 1 is a block diagram showing the configuration of an information processing device according to an embodiment. The information processing device 1 estimates the self-position and the self-position accuracy based on information acquired from one or more sensors 2, and controls the movement of a mobile body based on these estimation results. The information processing device 1 includes a storage unit 100, a self-position estimation unit 102, a self-position accuracy estimation unit 104, a recognizer 106, a parameter update unit 108, a change table holding unit 110, an action planning unit 112, and a control unit 114. The information processing device 1 may, for example, be mounted on the mobile body together with the sensors 2. As another example, the information processing device 1 may be provided at a location separate from the mobile body on which the sensors 2 are mounted, connected via a communication path such as a network.
The storage unit 100 stores data and the like required for the operation of the information processing device 1. In addition to input and output information, it may temporarily store data such as intermediate results of calculations. When information processing by software in the information processing device 1 is concretely executed using hardware resources, programs and the like related to this software may be stored. The change table holding unit 110, described later, may be a part of the storage unit 100; in that case, the change table is stored in the storage unit 100.
Some or all of the components of the information processing device 1 may be executed by a processor (processing circuit) such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit) operating on data stored in the storage unit 100, or may be implemented by a dedicated analog or digital circuit, for example an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
The self-position estimation unit 102 performs self-position estimation based on the information acquired by the sensors 2. This self-position estimation may be, for example, star reckoning and dead reckoning, and these methods may be combined appropriately according to the situation. For example, the self-position estimation unit 102 may execute a plurality of self-position estimation algorithms based on information acquired from a plurality of sensors and fuse the results appropriately.
For star reckoning, GPS (Global Positioning System), an image feature point map, NDT (Normal Distributions Transform) matching of LiDAR (Light Detection and Ranging) data, and the like may be used. For dead reckoning, vehicle odometry, visual SLAM (Simultaneous Localization and Mapping), depth SLAM, an IMU (Inertial Measurement Unit), and the like may be used. Although the corresponding sensor 2 for each method needs to be provided, any of these may be used, and the embodiments of the present disclosure are not limited to the above examples.
For example, the self-position estimation unit 102 may estimate the self-position based on information acquired from one or more sensors 2 by using the algorithm set for each sensor. In this case, the position estimation results based on the information acquired from the respective sensors 2 may be fused by an appropriate algorithm.
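As one illustration of such fusion, per-sensor estimates can be combined by inverse-covariance (information-filter style) weighting, so that more certain estimates dominate. The following is a minimal sketch under that assumption; the function name and the 2D position state are illustrative, not part of the disclosure.

```python
import numpy as np

def fuse_estimates(positions, covariances):
    """Fuse per-sensor 2D position estimates by inverse-covariance weighting.

    positions: list of shape-(2,) arrays, one estimate per sensor/algorithm.
    covariances: list of 2x2 covariance matrices for those estimates.
    Returns the fused position and its covariance.
    """
    info = np.zeros((2, 2))      # accumulated information matrix
    info_vec = np.zeros(2)       # accumulated information vector
    for p, cov in zip(positions, covariances):
        w = np.linalg.inv(cov)   # tighter covariance -> larger weight
        info += w
        info_vec += w @ p
    fused_cov = np.linalg.inv(info)
    fused_pos = fused_cov @ info_vec
    return fused_pos, fused_cov
```

The fused covariance returned here can also serve directly as an input to the accuracy calculation described below.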
As another example, the self-position estimation unit 102 may estimate the self-position by applying a single algorithm to the information acquired from a plurality of sensors 2.
The self-position accuracy estimation unit 104 calculates the accuracy of the self-position estimated by the self-position estimation unit 102. This accuracy may be calculated by a general method using, for example, a variance-covariance matrix. The accuracy of self-position estimation may be obtained differently depending on the method used to perform the estimation; in that case, an accuracy measure suitable for that method may be calculated.
As another example, the self-position accuracy estimation unit 104 may detect, based on the results of the current self-position estimation, a case in which the accuracy of the self-position estimation is expected to decrease in the near future. A situation in which the self-position accuracy is expected to decrease may be detected, for example, from the results of the last several self-position estimations. As further examples, it may be detected based on statistical information, or by inputting past self-position estimation results into a model trained by machine learning.
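One simple way to flag an expected decrease from the last several estimations is to fit a line to the recent accuracy scores and check its slope. The sketch below assumes a scalar accuracy score where higher is better; the window handling and threshold value are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def accuracy_likely_to_drop(recent_scores, slope_threshold=-0.05):
    """Return True if the accuracy score is trending downward.

    recent_scores: accuracy values from the last several estimation cycles.
    slope_threshold: per-cycle decline that triggers the flag (assumed value).
    """
    if len(recent_scores) < 3:
        return False  # not enough history to judge a trend
    t = np.arange(len(recent_scores))
    slope, _intercept = np.polyfit(t, np.asarray(recent_scores), 1)
    return slope < slope_threshold
```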
For example, when the self-position estimation method is LiDAR odometry, the accuracy can be calculated from the total travel distance, the matching rate, and the like, or it can be detected that the accuracy is likely to decrease. Likewise, in the following, passages described as calculating the accuracy may be read as detecting that the accuracy is likely to decrease. In the case of a wheel encoder, the accuracy can be calculated using the total travel distance. In the case of an IMU, the accuracy may be calculated using the total absolute amount of rotation.
For example, when the self-position estimation method is star reckoning, the covariance from the dead reckoning described above can be obtained as a numerical value, and the accuracy may be calculated using this covariance. In the case of scan matching by LiDAR, visual SLAM, or the like, the accuracy may be calculated using the travel distance or travel time since the last successful match, or the matching rate may be used. In the case of GPS, the decrease in accuracy can also be calculated using the DOP (Dilution of Precision).
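A minimal sketch of such per-method accuracy scores, assuming simple heuristics of the kind listed above; all constants, decay rates, and function names are illustrative assumptions rather than values from the disclosure.

```python
import numpy as np

def covariance_accuracy(cov):
    """Scalar accuracy from a position covariance: smaller spread = higher score."""
    return 1.0 / (1.0 + np.sqrt(np.trace(cov)))

def lidar_odometry_accuracy(total_distance_m, matching_rate, drift_per_m=0.01):
    """Accuracy decays with accumulated travel and improves with matching rate."""
    return matching_rate / (1.0 + drift_per_m * total_distance_m)

def gps_accuracy(dop, dop_limit=10.0):
    """Map DOP (lower is better) onto a 0..1 score, clipped at dop_limit."""
    return max(0.0, 1.0 - min(dop, dop_limit) / dop_limit)
```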
Thus, the method of calculating the accuracy may change depending on the technique adopted by the self-position estimation unit 102. The self-position accuracy estimation unit 104 calculates the self-position accuracy based on the method used by the self-position estimation unit 102.
The self-position accuracy estimation unit 104 may calculate the accuracy of the self-position estimation for each sensor or for each algorithm.
Further, the self-position estimation unit 102 may calculate a self-position estimation result for each sensor or for each algorithm, and the self-position accuracy estimation unit 104 may calculate the accuracy of each of these results.
When a plurality of self-position estimation algorithms are used, the self-position accuracy estimation unit 104 may calculate the accuracy of the fused final self-position. As another example, the self-position accuracy estimation unit 104 may calculate the accuracy of the self-positions estimated by the respective algorithms separately. In this case, whether to update the parameters may be determined for each algorithm, or whether to update the parameters may be determined from the fused result. When the update is determined from the fused result, the parameters for only some of the plurality of algorithms may be selected and updated.
Although the self-position estimation unit 102 and the self-position accuracy estimation unit 104 are shown as separate components in the drawing, they may be a single component in some cases. For example, when a method that calculates the self-position estimate and its accuracy together is used, these components execute the self-position estimation and the accuracy calculation as a single component.
The recognizer 106 executes various recognition processes based on the outputs from the sensors 2. These are recognition processes other than those used for self-position estimation, such as detection of people and detection of obstacles. For this recognition, arbitrary features such as HoG (Histograms of Oriented Gradients), SIFT (Scale-Invariant Feature Transform), or optical flow may be used. The recognition and detection are not limited to those using such features; recognition may also be performed using a recognition model trained in advance by machine learning. In this way, the recognizer 106 acquires information on the environment and situation around the mobile body based on the information acquired from the sensors.
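As a concrete illustration of HoG-based person detection, OpenCV ships a pedestrian detector that could stand in for this part of the recognizer 106. The use of OpenCV here is an assumption for illustration, not the disclosed implementation.

```python
import cv2

# OpenCV's built-in HOG descriptor with its default pedestrian SVM.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_people(bgr_image):
    """Return bounding boxes (x, y, w, h) of people found in a camera frame."""
    boxes, _weights = hog.detectMultiScale(
        bgr_image, winStride=(8, 8), padding=(8, 8), scale=1.05)
    return boxes
```

The resulting boxes, projected into map coordinates, would give the person positions consumed by the parameter update described next.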
The parameter update unit 108 updates the parameters for autonomous movement, using the table held in the change table holding unit 110, based on the self-position accuracy estimated by the self-position accuracy estimation unit 104 and the results recognized by the recognizer 106.
The change table holding unit 110 stores, for example, an action plan parameter change table for each algorithm and sensor. The action plan parameter change table stores, per algorithm and sensor, the parameters to be changed and their changed values based on conditions.
For example, when the action plan is determined by a method such as A* or Dijkstra's algorithm, the change table holding unit 110 stores how the scores for each grid cell, each route, and so on are to be changed depending on the self-position accuracy and the conditions of the recognition results. By referring to this table, the parameter update unit 108 updates the scores for, for example, the grid cells on the map, each area, and the like.
The action planning unit 112 generates an action plan based on the parameters updated by the parameter update unit 108. This action plan is determined using, for example, the above-mentioned methods such as A* or Dijkstra's algorithm, but it is not limited to these, and other methods may be used. For example, the action planning unit 112 updates the parameters for movement without changing the action being executed by the mobile body. By determining the action plan in this way, the mobile body can be moved appropriately while continuing the action it is carrying out.
The action planning unit 112 can, for example, create an action plan that moves along a route where the accuracy of self-position estimation does not fall, while the action currently being executed continues. As another example, the action planning unit 112 may determine the action plan so that, while continuing the current action, the mobile body moves to a place suitable for star reckoning in order to restore the accuracy of the self-position estimation, without degrading the current accuracy.
When the action planning algorithm differs from the above, the parameters updated by the parameter update unit 108 and the contents of the table held in the change table holding unit 110 will of course differ, but any configuration that can determine an action plan appropriate to the algorithm is sufficient.
The control unit 114 controls the mobile body based on the action plan generated by the action planning unit 112.
In the above, the sensors 2 are mounted on the mobile body, but the sensors are not limited to this. For example, a sensor may be a fixed sensor that monitors an area in which the mobile body can move.
Fig. 2 is a flowchart showing the overall processing of the information processing device 1 according to the embodiment.
First, the information processing device 1 estimates the self-position with the self-position estimation unit 102 based on the information acquired from the sensors 2, and calculates the accuracy of this self-position with the self-position accuracy estimation unit 104 (S10). As described above, depending on the type of sensor and the algorithm used, the self-position estimation and its accuracy calculation can be executed together; in this case, the self-position estimation and the accuracy calculation may be executed at the same timing.
Next, for the behavior in which the self-position accuracy has decreased, the information processing device 1 uses the parameter update unit 108 to update the parameters based on an operation that suppresses the error in the self-position accuracy or an operation that improves it (S20). Depending on the type of sensor and the algorithm used, there are operations that suppress the error of the self-position estimation and operations that reduce the error. The parameter update unit 108 may select and output such an operation based on the self-position accuracy.
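Putting the S10 to S30 flow together, one cycle of the processing of Fig. 2 could be sketched as below. The class and method names are hypothetical placeholders for the units in Fig. 1, and a 0..1 accuracy score checked against a threshold is an assumption for illustration.

```python
def control_cycle(sensors, estimator, accuracy_estimator,
                  recognizer, updater, planner, controller,
                  accuracy_threshold=0.5):
    """One iteration of the S10 -> S20 -> S30 flow of Fig. 2 (illustrative)."""
    data = [s.read() for s in sensors]

    # S10: estimate self-position and its accuracy.
    pose = estimator.estimate(data)
    accuracy = accuracy_estimator.estimate(pose, data)

    # S20: if accuracy has dropped (or is expected to), update plan parameters
    # using the surrounding situation from the recognizer.
    if accuracy < accuracy_threshold:
        surroundings = recognizer.recognize(data)
        updater.update_parameters(surroundings)

    # S30: create an action plan from the (possibly updated) parameters
    # and drive the mobile body accordingly.
    plan = planner.plan(pose, updater.parameters)
    controller.execute(plan)
```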
Fig. 3 is a flowchart showing this action plan parameter update processing in more detail.
In the action plan parameter update processing, first, the parameter update unit 108 acquires the self-position accuracy calculated by the self-position accuracy estimation unit 104 (S200).
Next, the parameter update unit 108 acquires data on the parameters of the action plan based on the sensor and the algorithm that are causing the decrease in accuracy (S202). For example, the parameter update unit 108 acquires data such as the parameters to be updated, the conditions for updating, and the values to be applied, by referring to the parameter change table. This update may use, for example, the results of the recognizer 106. Examples of this update include the following.
For example, when a wheel encoder is used, a parameter that increases the cost with respect to the total travel distance is acquired.
For example, when an IMU is used, a parameter is acquired that reduces rotation of the mobile body in the same direction as much as possible. This has the effect of suppressing the influence of the calibration deviation of the Scale Factor.
For example, when LiDAR matching is used, a parameter that increases the cost around people is acquired from the result of human recognition. This is because moving objects affect the matching accuracy. Parameters that lower the cost near walls and obstacles may also be acquired. This is because matching accuracy can be improved near walls and stationary obstacles.
For example, when GPS is used, the cost of areas under roofs, in tunnels, near groups of buildings, and the like is increased based on the map and the recognition results. This is to make it easier to receive the radio waves from the satellites used for GPS.
The parameter update unit 108 acquires the data related to the parameter update by referring to a table in which parameter updates such as the above are described.
Fig. 4 is a diagram showing an example of the change table held in the change table holding unit 110 according to the embodiment. In Fig. 4, as an example, a part of the parameter updates related to LiDAR scan matching is extracted and shown.
The change table is, for example, a table with columns for the sensor/algorithm, the parameter to which the update is reflected, conditions, and update data. When the sensor/algorithm is LiDAR scan matching, the target to which the parameter update is reflected is the cost map. The conditions may be, for example, the distance to a person or the distance to a wall. The update data describes the data that reflects the parameter update. In the processing of S202, the parameters to be updated and the update data matching the conditions are acquired from this change table.
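A minimal sketch of such a change table as a data structure, loosely following the LiDAR scan matching rows of Fig. 4; the field names and the way conditions are encoded as distance bands are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class ChangeRule:
    sensor_algorithm: str  # e.g. "lidar_scan_matching"
    target_parameter: str  # e.g. "cost_map"
    condition: str         # e.g. "distance_to_person"
    min_distance_m: float  # band where the rule applies...
    max_distance_m: float  # ...measured from the detected object
    cost_factor: float     # multiplier applied to the matching cells

# Rows loosely mirroring the Fig. 4 example for LiDAR scan matching.
CHANGE_TABLE = [
    ChangeRule("lidar_scan_matching", "cost_map", "distance_to_person", 0.0, 0.5, 4.0),
    ChangeRule("lidar_scan_matching", "cost_map", "distance_to_person", 0.5, 1.0, 2.0),
    ChangeRule("lidar_scan_matching", "cost_map", "distance_to_wall",   0.0, 1.0, 0.5),
]

def rules_for(sensor_algorithm):
    """S202: fetch the rules for the sensor/algorithm whose accuracy dropped."""
    return [r for r in CHANGE_TABLE if r.sensor_algorithm == sensor_algorithm]
```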
Although a table is prepared as shown in Fig. 4, the configuration is not limited to this. For example, a file storing separate conversion data for each algorithm or the like may be prepared, and the parameter update data may be acquired based on the contents of that file. As another example, the data may be hard-coded in a program or coded into a circuit. It may also be described in, for example, a DB (database) provided in the cloud, and the update may be based on this description. This DB may be updated at appropriate timings, either by an administrator or automatically, for example by a trained model.
Returning to Fig. 3, the information processing device 1 executes the parameter update with the parameter update unit 108 (S204). For example, consider a case where the sensor/algorithm used for the self-position estimation is LiDAR scan matching and the accuracy of this LiDAR scan matching has decreased.
Suppose the recognition result of the recognizer 106 determines that a person is present. When the table of Fig. 4 is used, the parameter update unit 108 multiplies the cost within 0.5 m around the person by four, and the cost in the range of 0.5 m to 1.0 m by two, in the cost map. Furthermore, when the recognizer 106 determines that there is a wall, the parameter update unit 108 updates the cost map within 1 m of the wall to 1/2. In this way, the parameter update unit 108 updates the cost map.
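A minimal sketch of this S204 cost map update on a 2D grid, applying the Fig. 4 factors as concentric bands around each detected cell; the grid resolution and the shape of the `detections` input are illustrative assumptions.

```python
import numpy as np

def update_cost_map(cost_map, detections, resolution_m=0.1):
    """Apply the Fig. 4 rules: x4 within 0.5 m of a person, x2 from 0.5-1.0 m,
    and x0.5 within 1.0 m of a wall.

    cost_map: 2D float array of cell costs (modified in place).
    detections: maps labels ("person"/"wall") to lists of (row, col) cells.
    """
    h, w = cost_map.shape
    rows, cols = np.mgrid[0:h, 0:w]
    for label, bands in (("person", [(0.5, 4.0), (1.0, 2.0)]),
                         ("wall",   [(1.0, 0.5)])):
        for r, c in detections.get(label, []):
            dist = np.hypot(rows - r, cols - c) * resolution_m
            inner = 0.0
            for outer, factor in bands:  # concentric distance bands
                band = (dist >= inner) & (dist < outer)
                cost_map[band] *= factor
                inner = outer
    return cost_map
```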
In this way, when a decrease in the self-position accuracy is recognized, the parameter update unit 108 updates the parameters for executing the self-position estimation based on the information stored in the change table.
For example, the self-position accuracy estimation unit 104 may output the self-position accuracy as a numerical value, and the parameter update unit 108 may execute the above parameter update when this value falls below a predetermined threshold. Alternatively, if the accuracy measure is higher the closer it is to 0, the parameter update may be executed when a predetermined threshold is exceeded. In this case, as described above, the recognition results of the recognizer 106 may be used; depending on the combination of sensor and algorithm, the update may also be executed without using the recognition results of the recognizer 106.
When a plurality of sensors and algorithms are executed in parallel, the judgment based on the self-position accuracy may be executed for each sensor and each algorithm. In this case, the parameter update unit 108 may extract from the change table only the data related to the sensor or algorithm whose accuracy has decreased, and update those parameters. By doing so, it is possible to suppress the degradation of, or improve the accuracy of, the self-position estimation by the sensor or algorithm whose self-position accuracy has decreased.
As another example, when there is a sensor or algorithm whose accuracy has decreased, the parameter update unit 108 may also update the parameters related to the other sensors and algorithms being executed in parallel. In this way, the parameter update may be executed for the sensor or algorithm whose self-position accuracy has decreased, while the self-position accuracy of the other sensors and algorithms is improved to compensate for this decrease. As a result, the decrease in the overall self-position accuracy may be suppressed or reduced.
Returning to Fig. 2, the information processing device 1 then creates an action plan with the action planning unit 112 based on the parameters updated by the parameter update unit 108 (S30). When the parameter update unit 108 updates the costs of a cost map as in the above example, an action plan is created from this updated cost map using an appropriate algorithm such as A* or Dijkstra's algorithm.
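For reference, a compact Dijkstra search over such a grid cost map, treating each cell's cost as the price of entering it, might look as follows. This is a sketch of the standard algorithm, not the disclosed planner.

```python
import heapq

def dijkstra_plan(cost_map, start, goal):
    """Cheapest path on a 2D grid where moving into a cell adds its cost.

    cost_map: 2D list/array of non-negative cell costs.
    start, goal: (row, col) tuples. Returns the path as a list of cells.
    """
    h, w = len(cost_map), len(cost_map[0])
    best = {start: 0.0}
    came_from = {}
    frontier = [(0.0, start)]
    while frontier:
        dist, cell = heapq.heappop(frontier)
        if cell == goal:
            break
        if dist > best[cell]:
            continue  # stale queue entry
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < h and 0 <= nc < w:
                nd = dist + cost_map[nr][nc]
                if nd < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = nd
                    came_from[(nr, nc)] = cell
                    heapq.heappush(frontier, (nd, (nr, nc)))
    if goal != start and goal not in came_from:
        return []  # no route found
    path, cell = [goal], goal
    while cell != start:  # walk back from goal to start
        cell = came_from[cell]
        path.append(cell)
    return path[::-1]
```

Replanning after the cost map update of Figs. 5 and 6 then amounts to calling this search again on the updated map.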
After this, the control unit 114 controls the mobile body based on this action plan.
An example of the parameter update and the creation of an action plan when a cost map is used will now be described.
Fig. 5 is a diagram showing the action plan before the parameter update. It shows a route along which the mobile body moves while working in a passage between walls. Costs are shown as hatched areas; hatching with the same direction and spacing means the same cost. The walls may have a cost of infinity, or a value sufficiently larger than the other costs, for example 100. For example, the center of the passage is set to cost 1, and the costs are set to 2 and 4 as the walls are approached. In this case, the action plan of the mobile body, as indicated by the arrow, goes straight and then turns right toward the target point.
Fig. 6 is a diagram showing the action plan when there is a parameter update. For example, Fig. 6 shows the cost map updated when a decrease in the accuracy of the self-position estimation is recognized. In this case, from the recognition result of a person in the passage and the recognition result of the walls, the parameter update unit 108 updates the cost map from the state of Fig. 5 to the state of Fig. 6 based on the change table. Based on the change table shown in Fig. 4, the cost around the person is raised and the cost near the walls is lowered. As a result, it is possible to avoid routes along which the self-position accuracy would decrease by approaching the person, and to select routes along which the self-position accuracy can easily be restored by staying close to the walls. Consequently, by updating the parameters, the information processing device 1 can select a route that suppresses the decrease in self-position accuracy, or a route along which the self-position accuracy is expected to recover.
The action planning unit 112 recreates the action plan based on this cost map. The action planning unit 112 then creates, for example, the action plan shown by the solid line in place of the original action plan shown by the dotted line. For example, as shown in Fig. 6, when a target point has been determined, the action planning unit 112 may create a new action plan as the route toward the target point. When the mobile body is moving freely, an action plan is created that moves appropriately based on the surrounding environment, rather than aiming at a specific target point.
As described above, according to the present embodiment, it is possible to create an action plan in consideration of the accuracy of the self-position estimation. In creating this action plan, only the movement route can be corrected appropriately, without interrupting the work the mobile body is currently executing. For example, when a robot distributes something at a venue, an action plan can be created appropriately so as not to lower the accuracy of the self-position estimation, without interrupting the distribution work. In the case of automated driving of an automobile, the action plan, such as the route, can be corrected appropriately while driving, without changing the objective of traveling to the destination.
In this way, in an environment where the accuracy of the self-position estimation may decrease, the decrease in accuracy can be suppressed, or the accuracy restored, without changing the target action itself. As a result, more robust self-position estimation can be realized. Furthermore, since this eliminates the need for processing such as switching the robot's behavior in response to a decrease in accuracy, it is also possible to improve the operating efficiency of the robot and to improve user interaction and the like.
Although the above description has used self-position estimation that performs matching, the technique is not limited to this, and can be applied to updating the parameters for setting the action plan in various kinds of self-position estimation. Likewise, although a form using a cost map has been described, the technique is not limited to a cost map either, and can be applied to updating the parameters for creating action plans of various forms.
The technology according to the present disclosure can be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any kind of mobile body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
Fig. 7 is a block diagram showing a schematic configuration example of a vehicle control system 7000, which is an example of a mobile body control system to which the technology according to the present disclosure can be applied. The vehicle control system 7000 includes a plurality of electronic control units connected via a communication network 7010. In the example shown in Fig. 7, the vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, a vehicle exterior information detection unit 7400, an in-vehicle information detection unit 7500, and an integrated control unit 7600. The communication network 7010 connecting these control units may be an in-vehicle communication network conforming to an arbitrary standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).
Each control unit includes a microcomputer that performs arithmetic processing according to various programs, a storage unit that stores the programs executed by the microcomputer or the parameters used for various calculations, and a drive circuit that drives the devices to be controlled. Each control unit includes a network I/F for communicating with the other control units via the communication network 7010, and a communication I/F for communicating with devices or sensors inside and outside the vehicle by wired or wireless communication. In Fig. 7, a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning unit 7640, a beacon receiving unit 7650, an in-vehicle device I/F 7660, an audio/image output unit 7670, an in-vehicle network I/F 7680, and a storage unit 7690 are illustrated as the functional configuration of the integrated control unit 7600. The other control units likewise include a microcomputer, a communication I/F, a storage unit, and the like.
The drive system control unit 7100 controls the operation of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 7100 functions as a control device for a driving force generator for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like. The drive system control unit 7100 may have a function as a control device such as an ABS (Antilock Brake System) or ESC (Electronic Stability Control).
A vehicle state detection unit 7110 is connected to the drive system control unit 7100. The vehicle state detection unit 7110 includes, for example, at least one of a gyro sensor that detects the angular velocity of the axial rotational motion of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting the operation amount of the accelerator pedal, the operation amount of the brake pedal, the steering angle of the steering wheel, the engine speed, the rotational speed of the wheels, and the like. The drive system control unit 7100 performs arithmetic processing using signals input from the vehicle state detection unit 7110 and controls the internal combustion engine, the drive motor, the electric power steering device, the brake device, and the like.
The body system control unit 7200 controls the operation of various devices mounted on the vehicle body according to various programs. For example, the body system control unit 7200 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, turn signals, and fog lamps. In this case, radio waves transmitted from a portable device substituting for a key, or signals of various switches, may be input to the body system control unit 7200. The body system control unit 7200 receives the input of these radio waves or signals and controls the door lock device, the power window device, the lamps, and the like of the vehicle.
The battery control unit 7300 controls a secondary battery 7310, which is the power supply source of the drive motor, according to various programs. For example, information such as the battery temperature, the battery output voltage, or the remaining battery capacity is input to the battery control unit 7300 from a battery device including the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals and controls the temperature regulation of the secondary battery 7310, a cooling device provided in the battery device, and the like.
The vehicle exterior information detection unit 7400 detects information outside the vehicle on which the vehicle control system 7000 is mounted. For example, at least one of an imaging unit 7410 and a vehicle exterior information detection unit 7420 is connected to the vehicle exterior information detection unit 7400. The imaging unit 7410 includes at least one of a ToF (Time of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras. The vehicle exterior information detection unit 7420 includes, for example, at least one of an environment sensor for detecting the current weather or meteorological conditions, and an ambient information detection sensor for detecting other vehicles, obstacles, pedestrians, and the like around the vehicle on which the vehicle control system 7000 is mounted.
The environment sensor may be, for example, at least one of a raindrop sensor that detects rainy weather, a fog sensor that detects fog, a sunshine sensor that detects the degree of sunshine, and a snow sensor that detects snowfall. The ambient information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device. The imaging unit 7410 and the vehicle exterior information detection unit 7420 may each be provided as an independent sensor or device, or may be provided as a device in which a plurality of sensors or devices are integrated.
Fig. 8 shows an example of the installation positions of the imaging unit 7410 and the vehicle exterior information detection unit 7420. Imaging units 7910, 7912, 7914, 7916, and 7918 are provided at, for example, at least one of the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of a vehicle 7900. The imaging unit 7910 provided on the front nose and the imaging unit 7918 provided on the upper part of the windshield in the vehicle interior mainly acquire images ahead of the vehicle 7900. The imaging units 7912 and 7914 provided on the side mirrors mainly acquire images of the sides of the vehicle 7900. The imaging unit 7916 provided on the rear bumper or the back door mainly acquires images behind the vehicle 7900. The imaging unit 7918 provided on the upper part of the windshield in the vehicle interior is mainly used for detecting preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
Fig. 8 also shows an example of the imaging ranges of the respective imaging units 7910, 7912, 7914, and 7916. The imaging range a indicates the imaging range of the imaging unit 7910 provided on the front nose, the imaging ranges b and c indicate the imaging ranges of the imaging units 7912 and 7914 provided on the side mirrors, respectively, and the imaging range d indicates the imaging range of the imaging unit 7916 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 7910, 7912, 7914, and 7916, a bird's-eye view image of the vehicle 7900 viewed from above is obtained.
 車両7900のフロント、リア、サイド、コーナ及び車室内のフロントガラスの上部に設けられる車外情報検出部7920,7922,7924,7926,7928,7930は、例えば超音波センサ又はレーダ装置であってよい。車両7900のフロントノーズ、リアバンパ、バックドア及び車室内のフロントガラスの上部に設けられる車外情報検出部7920,7926,7930は、例えばLIDAR装置であってよい。これらの車外情報検出部7920~7930は、主として先行車両、歩行者又は障害物等の検出に用いられる。 The vehicle exterior information detection unit 7920, 7922, 7924, 7926, 7928, 7930 provided at the front, rear, side, corner and the upper part of the windshield of the vehicle interior of the vehicle 7900 may be, for example, an ultrasonic sensor or a radar device. The vehicle exterior information detection units 7920, 7926, 7930 provided on the front nose, rear bumper, back door, and upper part of the windshield in the vehicle interior of the vehicle 7900 may be, for example, a lidar device. These out-of-vehicle information detection units 7920 to 7930 are mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, or the like.
 図7に戻って説明を続ける。車外情報検出ユニット7400は、撮像部7410に車外の画像を撮像させるとともに、撮像された画像データを受信する。また、車外情報検出ユニット7400は、接続されている車外情報検出部7420から検出情報を受信する。車外情報検出部7420が超音波センサ、レーダ装置又はLIDAR装置である場合には、車外情報検出ユニット7400は、超音波又は電磁波等を発信させるとともに、受信された反射波の情報を受信する。車外情報検出ユニット7400は、受信した情報に基づいて、人、車、障害物、標識又は路面上の文字等の物体検出処理又は距離検出処理を行ってもよい。車外情報検出ユニット7400は、受信した情報に基づいて、降雨、霧又は路面状況等を認識する環境認識処理を行ってもよい。車外情報検出ユニット7400は、受信した情報に基づいて、車外の物体までの距離を算出してもよい。 Return to Fig. 7 and continue the explanation. The vehicle outside information detection unit 7400 causes the image pickup unit 7410 to capture an image of the outside of the vehicle and receives the captured image data. Further, the vehicle exterior information detection unit 7400 receives detection information from the connected vehicle exterior information detection unit 7420. When the vehicle exterior information detection unit 7420 is an ultrasonic sensor, a radar device, or a lidar device, the vehicle exterior information detection unit 7400 transmits ultrasonic waves, electromagnetic waves, or the like, and receives received reflected wave information. The out-of-vehicle information detection unit 7400 may perform object detection processing or distance detection processing such as a person, a vehicle, an obstacle, a sign, or a character on a road surface based on the received information. The out-of-vehicle information detection unit 7400 may perform an environment recognition process for recognizing rainfall, fog, road surface conditions, etc. based on the received information. The out-of-vehicle information detection unit 7400 may calculate the distance to an object outside the vehicle based on the received information.
 また、車外情報検出ユニット7400は、受信した画像データに基づいて、人、車、障害物、標識又は路面上の文字等を認識する画像認識処理又は距離検出処理を行ってもよい。車外情報検出ユニット7400は、受信した画像データに対して歪補正又は位置合わせ等の処理を行うとともに、異なる撮像部7410により撮像された画像データを合成して、俯瞰画像又はパノラマ画像を生成してもよい。車外情報検出ユニット7400は、異なる撮像部7410により撮像された画像データを用いて、視点変換処理を行ってもよい。 Further, the vehicle outside information detection unit 7400 may perform image recognition processing or distance detection processing for recognizing a person, a vehicle, an obstacle, a sign, a character on the road surface, or the like based on the received image data. The vehicle exterior information detection unit 7400 performs processing such as distortion correction or alignment on the received image data, and synthesizes image data captured by different image pickup units 7410 to generate a bird's-eye view image or a panoramic image. May be good. The vehicle exterior information detection unit 7400 may perform the viewpoint conversion process using the image data captured by different image pickup units 7410.
 車内情報検出ユニット7500は、車内の情報を検出する。車内情報検出ユニット7500には、例えば、運転者の状態を検出する運転者状態検出部7510が接続される。運転者状態検出部7510は、運転者を撮像するカメラ、運転者の生体情報を検出する生体センサ又は車室内の音声を集音するマイク等を含んでもよい。生体センサは、例えば、座面又はステアリングホイール等に設けられ、座席に座った搭乗者又はステアリングホイールを握る運転者の生体情報を検出する。車内情報検出ユニット7500は、運転者状態検出部7510から入力される検出情報に基づいて、運転者の疲労度合い又は集中度合いを算出してもよいし、運転者が居眠りをしていないかを判別してもよい。車内情報検出ユニット7500は、集音された音声信号に対してノイズキャンセリング処理等の処理を行ってもよい。 The in-vehicle information detection unit 7500 detects the in-vehicle information. For example, a driver state detection unit 7510 that detects the state of the driver is connected to the in-vehicle information detection unit 7500. The driver state detection unit 7510 may include a camera that captures the driver, a biosensor that detects the driver's biological information, a microphone that collects sound in the vehicle interior, and the like. The biosensor is provided on, for example, on the seat surface or the steering wheel, and detects the biometric information of the passenger sitting on the seat or the driver holding the steering wheel. The in-vehicle information detection unit 7500 may calculate the degree of fatigue or concentration of the driver based on the detection information input from the driver state detection unit 7510, and may determine whether the driver is asleep. You may. The in-vehicle information detection unit 7500 may perform processing such as noise canceling processing on the collected audio signal.
 統合制御ユニット7600は、各種プログラムにしたがって車両制御システム7000内の動作全般を制御する。統合制御ユニット7600には、入力部7800が接続されている。入力部7800は、例えば、タッチパネル、ボタン、マイクロフォン、スイッチ又はレバー等、搭乗者によって入力操作され得る装置によって実現される。統合制御ユニット7600には、マイクロフォンにより入力される音声を音声認識することにより得たデータが入力されてもよい。入力部7800は、例えば、赤外線又はその他の電波を利用したリモートコントロール装置であってもよいし、車両制御システム7000の操作に対応した携帯電話又はPDA(Personal Digital Assistant)等の外部接続機器であってもよい。入力部7800は、例えばカメラであってもよく、その場合搭乗者はジェスチャにより情報を入力することができる。あるいは、搭乗者が装着したウェアラブル装置の動きを検出することで得られたデータが入力されてもよい。さらに、入力部7800は、例えば、上記の入力部7800を用いて搭乗者等により入力された情報に基づいて入力信号を生成し、統合制御ユニット7600に出力する入力制御回路などを含んでもよい。搭乗者等は、この入力部7800を操作することにより、車両制御システム7000に対して各種のデータを入力したり処理動作を指示したりする。 The integrated control unit 7600 controls the overall operation in the vehicle control system 7000 according to various programs. An input unit 7800 is connected to the integrated control unit 7600. The input unit 7800 is realized by a device that can be input-operated by the occupant, such as a touch panel, a button, a microphone, a switch, or a lever. Data obtained by recognizing the voice input by the microphone may be input to the integrated control unit 7600. The input unit 7800 may be, for example, a remote control device using infrared rays or other radio waves, or an external connection device such as a mobile phone or a PDA (Personal Digital Assistant) corresponding to the operation of the vehicle control system 7000. You may. The input unit 7800 may be, for example, a camera, in which case the passenger can input information by gesture. Alternatively, data obtained by detecting the movement of the wearable device worn by the passenger may be input. Further, the input unit 7800 may include, for example, an input control circuit that generates an input signal based on the information input by the passenger or the like using the above input unit 7800 and outputs the input signal to the integrated control unit 7600. By operating the input unit 7800, the passenger or the like inputs various data to the vehicle control system 7000 and instructs the processing operation.
 記憶部7690は、マイクロコンピュータにより実行される各種プログラムを記憶するROM(Read Only Memory)、及び各種パラメータ、演算結果又はセンサ値等を記憶するRAM(Random Access Memory)を含んでいてもよい。また、記憶部7690は、HDD(Hard Disc Drive)等の磁気記憶デバイス、半導体記憶デバイス、光記憶デバイス又は光磁気記憶デバイス等によって実現してもよい。 The storage unit 7690 may include a ROM (Read Only Memory) for storing various programs executed by the microcomputer, and a RAM (Random Access Memory) for storing various parameters, calculation results, sensor values, and the like. Further, the storage unit 7690 may be realized by a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, an optical magnetic storage device, or the like.
 汎用通信I/F7620は、外部環境7750に存在する様々な機器との間の通信を仲介する汎用的な通信I/Fである。汎用通信I/F7620は、GSM(登録商標)(Global System of Mobile communications)、WiMAX(登録商標)、LTE(登録商標)(Long Term Evolution)若しくはLTE-A(LTE-Advanced)などのセルラー通信プロトコル、又は無線LAN(Wi-Fi(登録商標)ともいう)、Bluetooth(登録商標)などのその他の無線通信プロトコルを実装してよい。汎用通信I/F7620は、例えば、基地局又はアクセスポイントを介して、外部ネットワーク(例えば、インターネット、クラウドネットワーク又は事業者固有のネットワーク)上に存在する機器(例えば、アプリケーションサーバ又は制御サーバ)へ接続してもよい。また、汎用通信I/F7620は、例えばP2P(Peer To Peer)技術を用いて、車両の近傍に存在する端末(例えば、運転者、歩行者若しくは店舗の端末、又はMTC(Machine Type Communication)端末)と接続してもよい。 The general-purpose communication I / F 7620 is a general-purpose communication I / F that mediates communication with various devices existing in the external environment 7750. General-purpose communication I / F7620 is a cellular communication protocol such as GSM (registered trademark) (Global System of Mobile communications), WiMAX (registered trademark), LTE (registered trademark) (Long Term Evolution) or LTE-A (LTE-Advanced). , Or other wireless communication protocols such as wireless LAN (also referred to as Wi-Fi®), Bluetooth® may be implemented. The general-purpose communication I / F7620 connects to a device (for example, an application server or a control server) existing on an external network (for example, the Internet, a cloud network, or a business-specific network) via a base station or an access point, for example. You may. Further, the general-purpose communication I / F7620 uses, for example, P2P (Peer To Peer) technology, and is a terminal existing in the vicinity of the vehicle (for example, a driver, a pedestrian or a store terminal, or an MTC (Machine Type Communication) terminal). May be connected with.
 The dedicated communication I/F 7630 is a communication I/F that supports a communication protocol formulated for use in vehicles. The dedicated communication I/F 7630 may implement a standard protocol such as WAVE (Wireless Access in Vehicle Environment), which combines the lower-layer IEEE 802.11p with the upper-layer IEEE 1609, DSRC (Dedicated Short Range Communications), or a cellular communication protocol. The dedicated communication I/F 7630 typically carries out V2X communication, a concept that includes one or more of vehicle-to-vehicle (Vehicle to Vehicle) communication, road-to-vehicle (Vehicle to Infrastructure) communication, vehicle-to-home (Vehicle to Home) communication, and vehicle-to-pedestrian (Vehicle to Pedestrian) communication.
 The positioning unit 7640 receives, for example, GNSS signals from GNSS (Global Navigation Satellite System) satellites (for example, GPS signals from GPS (Global Positioning System) satellites), executes positioning, and generates position information including the latitude, longitude, and altitude of the vehicle. The positioning unit 7640 may instead specify the current position by exchanging signals with a wireless access point, or may acquire position information from a terminal having a positioning function, such as a mobile phone, a PHS, or a smartphone.
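 As a minimal sketch (not part of the original disclosure), the Python below shows one conventional way position information of the kind the positioning unit 7640 generates can be obtained: decoding a standard NMEA GGA sentence, as emitted by common GNSS receivers, into a latitude/longitude/altitude triple. The function name and the sample sentence are assumptions made for the example.

```python
# Sketch: decode a standard NMEA $GPGGA sentence into the
# latitude/longitude/altitude triple that a positioning unit reports.
def parse_gga(sentence: str) -> tuple[float, float, float]:
    fields = sentence.split(",")
    # Latitude is ddmm.mmmm and longitude is dddmm.mmmm; convert both
    # to signed decimal degrees.
    lat = float(fields[2][:2]) + float(fields[2][2:]) / 60.0
    if fields[3] == "S":
        lat = -lat
    lon = float(fields[4][:3]) + float(fields[4][3:]) / 60.0
    if fields[5] == "W":
        lon = -lon
    alt = float(fields[9])  # antenna altitude above mean sea level, in metres
    return lat, lon, alt

# Example sentence decodes to roughly 48.1173 N, 11.5167 E, 545.4 m.
lat, lon, alt = parse_gga(
    "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47")
```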
 The beacon receiving unit 7650 receives, for example, radio waves or electromagnetic waves transmitted from wireless stations or the like installed on roads, and acquires information such as the current position, traffic congestion, road closures, or required travel time. The function of the beacon receiving unit 7650 may be included in the dedicated communication I/F 7630 described above.
 The in-vehicle device I/F 7660 is a communication interface that mediates connections between the microcomputer 7610 and various in-vehicle devices 7760 present in the vehicle. The in-vehicle device I/F 7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), or WUSB (Wireless USB). The in-vehicle device I/F 7660 may also establish a wired connection such as USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), or MHL (Mobile High-definition Link) via a connection terminal (and, if necessary, a cable) not shown. The in-vehicle devices 7760 may include, for example, at least one of a mobile device or wearable device possessed by a passenger, or an information device carried into or attached to the vehicle. The in-vehicle devices 7760 may also include a navigation device that searches for a route to an arbitrary destination. The in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
 The in-vehicle network I/F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010. The in-vehicle network I/F 7680 transmits and receives signals and the like in accordance with a predetermined protocol supported by the communication network 7010.
 The microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 according to various programs based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680. For example, the microcomputer 7610 may calculate control target values for the driving force generation device, the steering mechanism, or the braking device based on acquired information about the inside and outside of the vehicle, and output control commands to the drive system control unit 7100. For example, the microcomputer 7610 may perform cooperative control aimed at realizing ADAS (Advanced Driver Assistance System) functions, including collision avoidance or impact mitigation for the vehicle, following travel based on inter-vehicle distance, vehicle-speed-maintaining travel, vehicle collision warning, vehicle lane departure warning, and the like. The microcomputer 7610 may also perform cooperative control aimed at automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on acquired information about the vehicle's surroundings.
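 As a minimal sketch of the kind of control target value computation mentioned above (not taken from the disclosure), the code below implements a classical constant-time-gap following law for inter-vehicle distance keeping: the target acceleration is a feedback on the gap error and the closing speed. All gains, limits, and names are illustrative assumptions.

```python
# Sketch: constant-time-gap inter-vehicle following, one classical way to
# derive a drive/brake target for ADAS-style follow-up travel.
def follow_target_accel(gap_m: float, ego_speed_mps: float,
                        closing_speed_mps: float,
                        time_gap_s: float = 2.0) -> float:
    desired_gap = ego_speed_mps * time_gap_s     # keep an assumed 2 s headway
    k_gap, k_rel = 0.2, 0.6                      # illustrative feedback gains
    accel = k_gap * (gap_m - desired_gap) - k_rel * closing_speed_mps
    return max(-3.0, min(1.5, accel))            # clamp to comfortable limits
```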
 The microcomputer 7610 may generate three-dimensional distance information between the vehicle and objects such as surrounding structures and people based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680, and may create local map information including information on the surroundings of the vehicle's current position. The microcomputer 7610 may also predict dangers such as a vehicle collision, the approach of a pedestrian or the like, or entry into a closed road based on the acquired information, and generate a warning signal. The warning signal may be, for example, a signal for generating a warning sound or lighting a warning lamp.
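 The danger prediction and warning signal generation described above could, as one illustrative possibility not specified by the disclosure, be keyed to time-to-collision. In the sketch below, the 1.5 s threshold and the flag names are assumptions.

```python
# Sketch: raise a warning signal when the time-to-collision (TTC) to a
# detected object falls below an assumed threshold.
def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    if closing_speed_mps <= 0.0:        # gap is opening: no collision course
        return float("inf")
    return distance_m / closing_speed_mps

def warning_signal(distance_m: float, closing_speed_mps: float,
                   threshold_s: float = 1.5) -> dict:
    danger = time_to_collision(distance_m, closing_speed_mps) < threshold_s
    # Flags corresponding to "generate a warning sound" / "light a warning lamp".
    return {"sound_alarm": danger, "light_lamp": danger}
```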
 The audio/image output unit 7670 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying information to passengers of the vehicle or to the outside of the vehicle. In the example of FIG. 7, an audio speaker 7710, a display unit 7720, and an instrument panel 7730 are illustrated as output devices. The display unit 7720 may include, for example, at least one of an on-board display and a head-up display. The display unit 7720 may have an AR (Augmented Reality) display function. The output device may be a device other than these, such as headphones, a wearable device such as an eyeglass-type display worn by a passenger, a projector, or a lamp. When the output device is a display device, the display device visually displays results obtained by various processes performed by the microcomputer 7610, or information received from other control units, in various formats such as text, images, tables, and graphs. When the output device is an audio output device, the audio output device converts an audio signal composed of reproduced audio data, acoustic data, or the like into an analog signal and outputs it audibly.
 In the example shown in FIG. 7, at least two control units connected via the communication network 7010 may be integrated into one control unit. Alternatively, an individual control unit may be composed of a plurality of control units. Further, the vehicle control system 7000 may include another control unit not shown. In the above description, some or all of the functions performed by any control unit may be given to another control unit. That is, as long as information is transmitted and received via the communication network 7010, predetermined arithmetic processing may be performed by any control unit. Similarly, a sensor or device connected to one control unit may be connected to another control unit, and a plurality of control units may mutually transmit and receive detection information via the communication network 7010.
 A computer program for realizing each function of the information processing device 1 according to the present embodiment described with reference to FIG. 1 can be implemented in any control unit or the like. It is also possible to provide a computer-readable recording medium in which such a computer program is stored. The recording medium is, for example, a magnetic disk, an optical disc, a magneto-optical disk, a flash memory, or the like. The above computer program may also be distributed, for example, via a network without using a recording medium.
 In the vehicle control system 7000 described above, the information processing device 1 according to the present embodiment described with reference to FIG. 1 can be applied to the integrated control unit 7600 of the application example shown in FIG. 7. For example, the components of the information processing device 1 correspond to the microcomputer 7610, the storage unit 7690, and the in-vehicle network I/F 7680 of the integrated control unit 7600. For example, the integrated control unit 7600 can assist automated driving and autonomous movement by performing map generation and self-position estimation.
 Further, at least some of the components of the information processing device 1 described with reference to FIG. 1 may be realized in a module for the integrated control unit 7600 shown in FIG. 7 (for example, an integrated circuit module composed of one die). Alternatively, the information processing device 1 described with reference to FIG. 1 may be realized by a plurality of control units of the vehicle control system 7000 shown in FIG. 7.
 The embodiments described above may also take the following forms.
(1)
 An information processing device comprising a processor for controlling a mobile body, wherein
 the processor
  calculates the accuracy of self-position estimation based on information acquired from one or more sensors,
  updates a parameter based on the algorithm that performs the self-position estimation in at least one of a case where the accuracy has decreased and a case where the accuracy is expected to decrease, and
  creates an action plan based on the updated parameter.
(2)
 The information processing device according to (1), wherein
 the processor
  controls the mobile body based on the created action plan.
(3)
 The information processing device according to (1) or (2), wherein
 the processor
  estimates the self-position using a self-position estimation algorithm set for each of the sensors.
(4)
 The information processing device according to (1) or (2), wherein
 the processor
  estimates the self-position using one algorithm based on outputs from a plurality of the sensors.
(5)
 The information processing device according to any one of (1) to (4), wherein
 the processor
  updates the parameter based on the sensor or the algorithm used for the self-position estimation.
(6)
 The information processing device according to any one of (1) to (4), wherein
 the processor
  calculates a result of the self-position estimation with the sensor or the algorithm, and calculates the accuracy.
(7)
 The information processing device according to any one of (1) to (6), wherein
 the processor
  calculates the accuracy using one algorithm based on outputs from a plurality of the sensors.
(8)
 The information processing device according to any one of (1) to (7), wherein
 the processor
  acquires a surrounding situation of the mobile body based on information acquired from the sensors, and
  updates the parameter based on the acquired surrounding situation.
(9)
 The information processing device according to any one of (1) to (8), further comprising a memory that stores information on updating of the parameter, wherein
 the processor
  updates the parameter based on the information stored in the memory.
(10)
 The information processing device according to (9), wherein
 the memory
  stores a table describing information that links the type of sensor and algorithm used for the self-position estimation, the parameter to be updated, a condition, and update data, and
 the processor
  updates the parameter by referring to the table.
(11)
 The information processing device according to (9) or (10), wherein
 the processor
  updates the parameter for the algorithm whose accuracy has decreased or is expected to decrease.
(12)
 The information processing device according to any one of (9) to (11), wherein
 the processor
  updates the parameter so that a route that suppresses the decrease in the accuracy, or a route that improves the accuracy, is selected.
(13)
 The information processing device according to any one of (1) to (12), wherein
 the parameter to be updated is a cost map.
(14)
 An information processing method in which a processor for controlling a mobile body
  executes the method according to any one of (1) to (13).
(15)
 A program that causes a processor for controlling a mobile body
  to execute the method according to any one of (1) to (13).
 Aspects of the present disclosure are not limited to the embodiments described above and include various conceivable modifications, and the effects of the present disclosure are not limited to the contents described above. The components of the embodiments may be applied in appropriate combinations. That is, various additions, changes, and partial deletions are possible without departing from the conceptual idea and spirit of the present disclosure derived from the contents defined in the claims and their equivalents.
1: Information processing device
100: Storage unit
102: Self-position estimation unit
104: Self-position accuracy estimation unit
106: Recognizer
108: Parameter update unit
110: Change table holding unit
112: Action planning unit
114: Control unit
2: Sensor

Claims (15)

  1.  An information processing device comprising a processor for controlling a mobile body, wherein
      the processor
       calculates the accuracy of self-position estimation based on information acquired from one or more sensors,
       updates a parameter based on the algorithm that performs the self-position estimation in at least one of a case where the accuracy has decreased and a case where the accuracy is expected to decrease, and
       creates an action plan based on the updated parameter.
  2.  The information processing device according to claim 1, wherein
      the processor
       controls the mobile body based on the created action plan.
  3.  The information processing device according to claim 1, wherein
      the processor
       estimates the self-position using a self-position estimation algorithm set for each of the sensors.
  4.  The information processing device according to claim 1, wherein
      the processor
       estimates the self-position using one algorithm based on outputs from a plurality of the sensors.
  5.  The information processing device according to claim 1, wherein
      the processor
       updates the parameter based on the sensor or the algorithm used for the self-position estimation.
  6.  The information processing device according to claim 1, wherein
      the processor
       calculates a result of the self-position estimation with the sensor or the algorithm, and calculates the accuracy.
  7.  The information processing device according to claim 1, wherein
      the processor
       calculates the accuracy using one algorithm based on outputs from a plurality of the sensors.
  8.  The information processing device according to claim 1, wherein
      the processor
       acquires a surrounding situation of the mobile body based on information acquired from the sensors, and
       updates the parameter based on the acquired surrounding situation.
  9.  The information processing device according to claim 1, further comprising a memory that stores information on updating of the parameter, wherein
      the processor
       updates the parameter based on the information stored in the memory.
  10.  The information processing device according to claim 9, wherein
      the memory
       stores a table describing information that links the type of sensor and algorithm used for the self-position estimation, the parameter to be updated, a condition, and update data, and
      the processor
       updates the parameter by referring to the table.
  11.  The information processing device according to claim 9, wherein
      the processor
       updates the parameter for the algorithm whose accuracy has decreased or is expected to decrease.
  12.  The information processing device according to claim 9, wherein
      the processor
       updates the parameter so that a route that suppresses the decrease in the accuracy, or a route that improves the accuracy, is selected.
  13.  The information processing device according to claim 1, wherein the parameter to be updated is a cost map.
  14.  An information processing method in which a processor for controlling a mobile body
       calculates the accuracy of self-position estimation based on information acquired from one or more sensors,
       updates a parameter based on the algorithm that performs the self-position estimation in at least one of a case where the accuracy has decreased and a case where the accuracy is expected to decrease, and
       creates an action plan based on the updated parameter.
  15.  A program that causes a processor for controlling a mobile body to
       calculate the accuracy of self-position estimation based on information acquired from one or more sensors,
       update a parameter based on the algorithm that performs the self-position estimation in at least one of a case where the accuracy has decreased and a case where the accuracy is expected to decrease, and
       create an action plan based on the updated parameter.
PCT/JP2021/032172 2020-09-15 2021-09-01 Information processing device, information processing method, and program WO2022059489A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020154858A JP2023159898A (en) 2020-09-15 2020-09-15 Information processing system, information processing method, and program
JP2020-154858 2020-09-15

Publications (1)

Publication Number Publication Date
WO2022059489A1 true WO2022059489A1 (en) 2022-03-24

Family

ID=80776009

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/032172 WO2022059489A1 (en) 2020-09-15 2021-09-01 Information processing device, information processing method, and program

Country Status (2)

Country Link
JP (1) JP2023159898A (en)
WO (1) WO2022059489A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016048464A (en) * 2014-08-27 2016-04-07 本田技研工業株式会社 Autonomously acting robot and control method of autonomously acting robot
JP2019130997A (en) * 2018-01-30 2019-08-08 マツダ株式会社 Vehicle controlling apparatus
JP2020038498A (en) * 2018-09-04 2020-03-12 株式会社Ihi Apparatus for estimating self-location
JP2020079997A (en) * 2018-11-12 2020-05-28 ソニー株式会社 Information processing apparatus, information processing method, and program

Also Published As

Publication number Publication date
JP2023159898A (en) 2023-11-02


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
 Ref document number: 21869176
 Country of ref document: EP
 Kind code of ref document: A1
NENP Non-entry into the national phase
 Ref country code: DE
122 Ep: pct application non-entry in european phase
 Ref document number: 21869176
 Country of ref document: EP
 Kind code of ref document: A1
NENP Non-entry into the national phase
 Ref country code: JP