WO2019176278A1 - Information processing device, information processing method, program, and mobile body

Information processing device, information processing method, program, and mobile body

Info

Publication number
WO2019176278A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
unit
determination
information
height position
Application number
PCT/JP2019/001369
Other languages
French (fr)
Japanese (ja)
Inventor
遼 高橋
雅貴 豊浦
倫之 鈴鹿
Original Assignee
ソニー株式会社 (Sony Corporation)
Application filed by ソニー株式会社 (Sony Corporation)
Publication of WO2019176278A1


Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02: Control of position or course in two dimensions

Definitions

  • the present technology relates to an information processing apparatus, an information processing method, a program, and a moving body that can be applied to autonomous movement control of the moving body.
  • Patent Document 1 describes an obstacle determination device that enables a vehicle or the like that performs autonomous movement to avoid an obstacle.
  • in this obstacle determination device, the shape of the measurement range (measurement space region) of the distance measuring device is changed based on the inclination of the distance measuring device. This prevents erroneous determination of an obstacle (paragraphs [0041] to [0054], FIG. 4 of Patent Document 1).
  • an object of the present technology is to provide an information processing apparatus, an information processing method, a program, and a moving body capable of easily and accurately determining the situation around the moving body.
  • an information processing apparatus includes an acquisition unit, a calculation unit, and a determination unit.
  • the acquisition unit acquires information related to a height position of a sensor capable of detecting peripheral information of the moving body.
  • the calculation unit calculates a determination region for determining a situation around the moving body based on the acquired information on the height position of the sensor.
  • the determination unit determines the situation around the moving body based on the calculated determination region and the peripheral information detected by the sensor.
  • a determination region for determining the situation around the moving body is calculated based on information on the height position of the sensor. This makes it possible to easily and accurately determine the situation around the moving body.
  • the determination unit may determine whether there is an obstacle around the moving body.
  • the acquisition unit may acquire shape data related to the periphery of the moving body.
  • the determination unit may determine a situation around the moving body based on the relationship between the detected shape data and the determination region.
  • the acquisition unit may acquire three-dimensional point cloud data related to the periphery of the moving body.
  • the determination unit may determine the situation around the moving body by determining whether or not the point data included in the three-dimensional point cloud data is included in the determination region.
  • the calculation unit may change the height position of the determination region according to a change in the height position of the sensor.
  • the calculation unit may calculate one or more determination planes that define the determination area.
  • the one or more determination surfaces may include at least one of a first determination surface corresponding to the ground and a second determination surface corresponding to the ceiling surface.
  • the calculation unit may change the height position of at least one of the first determination surface and the second determination surface according to a change in the height position of the sensor.
  • the calculation unit may calculate the determination area based on a predetermined reference determination area based on the acquired information on the height position of the sensor.
  • the calculation unit may calculate the determination region so that the height position is the same as the height position of the reference determination region.
  • the calculation unit may correct the size of the calculated determination area in the height direction based on the acquired information on the height position of the sensor.
  • the acquisition unit may acquire information related to the tilt of the sensor.
  • the calculation unit may calculate the determination area based on information on the tilt of the sensor.
  • the calculation unit may change the inclination of the determination region according to a change in the inclination of the sensor.
  • An information processing method is an information processing method executed by a computer system, and includes acquiring information related to a height position of a sensor capable of detecting peripheral information of a moving object. Based on the acquired information on the height position of the sensor, a determination region for determining a situation around the moving body is calculated. Based on the calculated determination area and the surrounding information detected by the sensor, a situation around the moving body is determined.
  • a program causes a computer system to execute the following steps: acquiring information related to the height position of a sensor capable of detecting peripheral information of a moving body; calculating a determination region for determining the situation around the moving body based on the acquired information on the height position of the sensor; and determining the situation around the moving body based on the calculated determination region and the peripheral information detected by the sensor.
  • a moving body includes a drive unit, a sensor, a detection unit, a calculation unit, a determination unit, and a drive control unit.
  • the sensor can detect surrounding information.
  • the detection unit detects information related to the height position of the sensor.
  • the calculation unit calculates a determination region for determining a surrounding situation based on the detected information on the height position of the sensor.
  • the determination unit determines a surrounding situation based on the calculated determination region and the peripheral information detected by the sensor.
  • the drive control unit controls the drive unit based on a determination result by the determination unit.
  • the drive unit may be a leg having a multi-joint structure.
  • FIG. 1 is a block diagram illustrating a schematic configuration example of a moving body 10 according to an embodiment of the present technology.
  • the moving body 10 is, for example, a robot, and includes a sensor group 11, an autonomous movement control unit 12, and an actuator group 13.
  • the sensor group 11 includes sensors 11a-1 to 11a-n that detect various kinds of information necessary for recognizing the state inside the moving body 10 and the surroundings of the moving body 10, and outputs the detection results to the autonomous movement control unit 12. In the following, when there is no need to particularly distinguish the sensors 11a-1 to 11a-n, they are simply referred to as the sensor 11a, and other configurations are referred to in the same manner.
  • the sensors 11a-1 to 11a-n include, for example, the following:
    • a camera that images the surroundings of the moving body 10;
    • an acceleration sensor that detects the movement of the moving body 10;
    • a LiDAR (Light Detection and Ranging / Laser Imaging Detection and Ranging) sensor or ToF (Time of Flight) sensor that detects objects existing around the moving body 10;
    • a geomagnetic sensor that detects direction;
    • a gyro sensor and an acceleration sensor;
    • a pressure sensor that detects changes in ambient pressure;
    • a contact sensor that detects the presence or absence of contact;
    • a temperature sensor that detects temperature;
    • a humidity sensor that detects humidity;
    • a PSD (Position Sensitive Detector);
    • a GNSS (Global Navigation Satellite System) receiver.
  • the autonomous movement control unit 12 recognizes the surrounding situation from the various detection results of the sensor group 11, generates an action plan based on the recognition result, and operates the various actuators 13a-1 to 13a-n of the actuator group 13 that drive the robot according to the action plan. In the following, when it is not necessary to distinguish the actuators 13a-1 to 13a-n, they are simply referred to as the actuator 13a, and other configurations are referred to in the same manner.
  • the autonomous movement control unit 12 includes a recognition processing unit 14, an action plan processing unit 15, and an action control processing unit 16.
  • the recognition processing unit 14 executes recognition processing based on the detection results supplied from the sensor group 11, recognizing, for example, images, people, objects, types of facial expressions, positions, attributes, and the positions of itself and obstacles, and outputs the recognition results to the action plan processing unit 15. Further, the recognition processing unit 14 estimates the self-position based on the detection results supplied from the sensor group 11, using a predetermined model. When the self-position cannot be estimated using the predetermined model due to the influence of an external force, the recognition processing unit 14 estimates the self-position using a model different from the predetermined model.
  • based on the recognition result, the action plan processing unit 15 generates an action plan, such as a movement trajectory for the devices related to the movement of the moving body 10, state changes, and speed or acceleration, and supplies it to the action control processing unit 16.
  • the action control processing unit 16 generates control signals for controlling the specific movements of the actuators 13a-1 to 13a-n of the actuator group 13 based on the action plan supplied from the action plan processing unit 15, and thereby operates the actuator group 13.
  • the actuator group 13 operates the actuators 13a-1 to 13a-n that specifically move the moving body 10 based on the control signals supplied from the action control processing unit 16. More specifically, based on the control signals, the actuators 13a-1 to 13a-n operate motors, servo motors, brakes, and the like that realize the specific movements of the moving body 10.
  • the actuators 13a-1 to 13a-n include configurations that realize expansion/contraction motion, bending/extension motion, turning motion, and the like, and may further include a display unit, such as an LED (Light Emitting Diode) or an LCD (Liquid Crystal Display), that displays information, and a speaker that outputs sound. Therefore, by controlling the actuator group 13 based on the control signals, the operations of the various devices that drive the moving body 10 are realized, information is displayed, and sound is output.
  • that is, by controlling the actuators 13a-1 to 13a-n of the actuator group 13, the operations related to the movement of the moving body 10 are controlled, and the presentation of various information, such as information display and sound output, is also controlled.
  • a moving body control system for controlling the moving body 10 for realizing the functions described above will be described.
  • FIG. 2 is a block diagram illustrating a schematic configuration example of a function of the moving body control system 100 that controls the moving body 10 of the present disclosure.
  • the moving body control system 100 of FIG. 2 is an example of a moving body control system that controls the moving body 10, such as a robot, to which the present technology can be applied; however, it can also be applied as a system that controls other moving bodies such as aircraft, ships, and multirotor copters (drones).
  • the robot may be a wheel type robot, an autonomous driving vehicle that can be boarded, or a multi-legged walking type robot.
  • the present invention can also be applied to a robot having a leg portion having a multi-joint structure as a drive unit.
  • the mobile control system 100 includes an input unit 101, a data acquisition unit 102, a communication unit 103, a mobile internal device 104, an output control unit 105, an output unit 106, a drive system control unit 107, a drive system 108, a storage unit 109, and an autonomous movement control unit 110.
  • the input unit 101, data acquisition unit 102, communication unit 103, output control unit 105, drive system control unit 107, storage unit 109, and autonomous movement control unit 110 are connected to each other via a communication network 111.
  • the communication network 111 is, for example, a communication network or bus conforming to an arbitrary standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), a LAN (Local Area Network) such as IEEE 802.3, or FlexRay (registered trademark); alternatively, a non-standardized proprietary communication method may be used.
  • each part of the mobile body control system 100 may be directly connected without going through the communication network 111.
  • in the following, when each unit of the mobile control system 100 communicates via the communication network 111, the description of the communication network 111 is omitted. For example, when the input unit 101 and the autonomous movement control unit 110 communicate via the communication network 111, it is simply described that the input unit 101 and the autonomous movement control unit 110 communicate.
  • the input unit 101 includes a device used by the passenger for inputting various data and instructions.
  • the input unit 101 includes operation devices such as a touch panel, buttons, a microphone, switches, and levers, as well as operation devices that allow input by methods other than manual operation, such as voice or gestures.
  • the input unit 101 may be a remote control device using infrared rays or other radio waves, or an external connection device such as a mobile device or a wearable device corresponding to the operation of the mobile control system 100.
  • the input unit 101 generates an input signal based on data and instructions input by the passenger and supplies the input signal to each unit of the mobile control system 100.
  • the data acquisition unit 102 includes various sensors that acquire data used for processing of the mobile body control system 100, and supplies the acquired data to each part of the mobile body control system 100.
  • the data acquisition unit 102 includes various sensors for detecting the state of the moving body and the like; these sensors constitute the sensor group 112 and correspond to the sensor group 11 including the sensors 11a-1 to 11a-n in FIG. 1.
  • specifically, for example, the data acquisition unit 102 includes a gyro sensor, an acceleration sensor, an inertial measurement unit (IMU), and sensors for detecting an acceleration input operation amount such as an accelerator, a deceleration input operation amount, a direction instruction input operation amount, the rotational speed and input/output energy/fuel amount of a driving device such as an engine or a motor, the torque of the engine or motor, the rotational speed and torque of the wheels or joints, and the like.
  • the data acquisition unit 102 includes various sensors for detecting information outside the moving body.
  • the data acquisition unit 102 includes an imaging device such as a ToF camera, a stereo camera, a monocular camera, an infrared camera, a polarization camera, and other cameras.
  • the data acquisition unit 102 includes an environmental sensor for detecting weather or meteorological conditions and a surrounding information detection sensor for detecting objects around the moving body.
  • the environmental sensor includes, for example, a raindrop sensor, a fog sensor, a sunshine sensor, a snow sensor, and the like.
  • the ambient information detection sensor includes, for example, a laser range sensor, an ultrasonic sensor, a radar, a LiDAR, a sonar, and the like.
  • the data acquisition unit 102 includes various sensors for detecting the current position of the moving object.
  • the data acquisition unit 102 includes a GNSS receiver that receives a GNSS signal from a GNSS satellite.
  • the communication unit 103 communicates with the mobile internal device 104 and with various devices, servers, base stations, and the like outside the moving body, transmits data supplied from each unit of the mobile control system 100, and supplies received data to each unit of the mobile control system 100.
  • the communication protocol supported by the communication unit 103 is not particularly limited, and the communication unit 103 can support a plurality of types of communication protocols.
  • for example, the communication unit 103 performs wireless communication with the mobile internal device 104 using wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), WUSB (Wireless USB), or the like. Further, for example, the communication unit 103 performs wired communication with the mobile internal device 104 via a connection terminal (and a cable if necessary) using USB (Universal Serial Bus), HDMI (High-Definition Multimedia Interface), MHL (Mobile High-definition Link), or the like.
  • the communication unit 103 communicates with a device (for example, an application server or a control server) that exists on an external network (for example, the Internet, a cloud network, or an operator-specific network) via a base station or an access point.
  • the communication unit 103 communicates with a terminal that exists in the vicinity of the moving body (for example, a terminal of a pedestrian or a store, or an MTC (Machine Type Communication) terminal) using P2P (Peer to Peer) technology.
  • when the mobile body 10 is a vehicle, the communication unit 103 performs V2X communication such as vehicle-to-vehicle (Vehicle to Vehicle) communication, road-to-vehicle (Vehicle to Infrastructure) communication, vehicle-to-home (Vehicle to Home) communication, and vehicle-to-pedestrian (Vehicle to Pedestrian) communication.
  • the communication unit 103 includes a beacon receiving unit that receives radio waves or electromagnetic waves transmitted from wireless stations or the like installed on the road and acquires information such as the current position, traffic congestion, traffic restrictions, or required time.
  • the mobile internal device 104 includes, for example, a mobile device or wearable device possessed by a passenger, an information device that is carried in or attached to the mobile body, and a navigation device that searches for a route to an arbitrary destination.
  • the output control unit 105 controls the output of various information to the passenger of the moving body or the outside of the moving body.
  • the output control unit 105 generates an output signal including at least one of visual information (for example, image data) and auditory information (for example, audio data) and supplies it to the output unit 106, thereby controlling the output of visual information and auditory information from the output unit 106.
  • specifically, for example, the output control unit 105 synthesizes image data captured by different imaging devices of the data acquisition unit 102 to generate an overhead image, a panoramic image, or the like, and supplies an output signal including the generated image to the output unit 106.
  • for example, the output control unit 105 generates sound data including a warning sound or a warning message for a danger such as a collision, contact, or entry into a dangerous zone, and supplies an output signal including the generated sound data to the output unit 106.
  • the output unit 106 includes a device that can output visual information or auditory information to a passenger of the moving body or the outside of the moving body.
  • the output unit 106 includes a display device, an instrument panel, an audio speaker, headphones, a wearable device such as a glasses-type display worn by a passenger, a projector, a lamp, and the like.
  • the display device included in the output unit 106 may be a device that displays visual information in the driver's field of view, such as a head-up display, a transmissive display, or a device having an AR (Augmented Reality) display function, in addition to a device having an ordinary display.
  • the output control unit 105 and the output unit 106 are not essential components for the autonomous movement process, and may be omitted as necessary.
  • the drive system control unit 107 controls the drive system 108 by generating various control signals and supplying them to the drive system 108. Further, the drive system control unit 107 supplies a control signal to each unit other than the drive system 108 as necessary, and notifies the control state of the drive system 108 and the like.
  • the drive system 108 includes various devices related to the drive system of the moving body.
  • for example, in the case of a four-legged robot, the drive system 108 includes servo motors capable of specifying the angle and torque of each joint of the four legs, a motion controller that decomposes the movement of the robot itself into the movements of the four legs, and a feedback control device using sensors in each motor and sensors on the soles of the feet.
  • in another example, the drive system 108 includes four or six motors with upward-facing propellers and a motion controller that decomposes the movement of the robot itself into the rotation amounts of the respective motors.
  • in yet another example, the drive system 108 includes a driving force generator for generating driving force, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle, a braking device for generating braking force, an ABS (Antilock Brake System), an ESC (Electronic Stability Control), an electric power steering device, and the like.
  • the output control unit 105, the output unit 106, the drive system control unit 107, and the drive system 108 constitute an actuator group 113 and correspond to the actuator group 13 including the actuators 13a-1 to 13a-n in FIG. 1.
  • the drive system control unit 107 and the drive system 108 (actuator group 13) function as a drive unit.
  • the storage unit 109 includes, for example, a ROM (Read Only Memory), a RAM (Random Access Memory), a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, and the like.
  • the storage unit 109 stores various programs and data used by each unit of the mobile control system 100.
  • the storage unit 109 also stores map data such as a three-dimensional high-accuracy map such as a dynamic map, a global map that is less accurate than the high-accuracy map but covers a wider area, and a local map that includes information around the moving body.
  • the autonomous movement control unit 110 performs control related to autonomous movement such as automatic driving or driving support. Specifically, for example, the autonomous movement control unit 110 performs cooperative control intended to realize functions such as collision avoidance or impact mitigation for the moving body, following movement based on the distance between moving bodies, movement while maintaining the movement speed, or collision warning for the moving body. Further, for example, the autonomous movement control unit 110 performs cooperative control intended for autonomous movement in which the moving body moves autonomously without depending on the operation of an operator or user.
  • the autonomous movement control unit 110 corresponds to the information processing apparatus according to the present embodiment, and includes hardware necessary for a computer such as a CPU, a RAM, and a ROM.
  • the information processing method according to the present technology is executed when the CPU loads a program according to the present technology recorded in advance in the ROM into the RAM and executes the program.
  • the specific configuration of the autonomous movement control unit 110 is not limited, and a PLD (Programmable Logic Device) such as an FPGA (Field Programmable Gate Array), or another device such as an ASIC (Application Specific Integrated Circuit), may be used.
  • the autonomous movement control unit 110 includes a detection unit 131, a self-position estimation unit 132, a situation analysis unit 133, a planning unit 134, and an operation control unit 135.
  • the detection unit 131, the self-position estimation unit 132, and the situation analysis unit 133 constitute a recognition processing unit 121 and correspond to the recognition processing unit 14 in FIG. 1.
  • the planning unit 134 constitutes an action plan processing unit 122 and corresponds to the action plan processing unit 15 in FIG. 1.
  • the detecting unit 131 detects various information necessary for controlling autonomous movement.
  • the detection unit 131 includes a mobile body external information detection unit 141, a mobile body internal information detection unit 142, and a mobile body state detection unit 143.
  • the mobile object external information detection unit 141 performs a process of detecting information outside the mobile object based on data or signals from each part of the mobile object control system 100.
  • the mobile object external information detection unit 141 performs detection processing, recognition processing, tracking processing, and distance detection processing for an object around the mobile object.
  • objects to be detected include moving objects, people, obstacles, structures, roads, traffic lights, traffic signs, road markings, and the like.
  • the mobile object external information detection unit 141 performs an environment detection process around the mobile object.
  • the surrounding environment to be detected includes, for example, weather, temperature, humidity, brightness, road surface condition, and the like.
  • the moving body external information detection unit 141 supplies data indicating the results of the detection processing to the self-position estimation unit 132, the map analysis unit 151 and situation recognition unit 152 of the situation analysis unit 133, the operation control unit 135, and the like.
  • the mobile body internal information detection unit 142 performs a process of detecting information inside the mobile body based on data or signals from each part of the mobile body control system 100.
  • the mobile body internal information detection unit 142 performs driver authentication processing and recognition processing, driver state detection processing, passenger detection processing, environment detection processing inside the mobile body, and the like.
  • the state of the driver to be detected includes, for example, physical condition, arousal level, concentration level, fatigue level, gaze direction, and the like.
  • Examples of the environment inside the moving object to be detected include temperature, humidity, brightness, smell, and the like.
  • the mobile object internal information detection unit 142 supplies data indicating the result of the detection process to the situation recognition unit 152 of the situation analysis unit 133, the operation control unit 135, and the like.
  • the moving body state detection unit 143 performs a state detection process of the moving body based on data or signals from each part of the moving body control system 100.
  • the state of the moving body to be detected includes, for example, speed, acceleration, steering angle, the presence or absence and content of an abnormality, the driving operation state, the position and tilt of the power seat, the door lock state, and the states of other devices mounted on the moving body.
  • the moving body state detection unit 143 supplies data indicating the result of the detection process to the situation recognition unit 152 of the situation analysis unit 133, the operation control unit 135, and the like.
  • the self-position estimation unit 132 estimates the position, posture, and the like of the moving body based on data or signals from each part of the mobile control system 100, such as the moving body external information detection unit 141 and the situation recognition unit 152 of the situation analysis unit 133. In addition, the self-position estimation unit 132 generates a local map (hereinafter referred to as a self-position estimation map) used for self-position estimation as necessary.
  • the self-position estimation map is, for example, a highly accurate map created using a technique such as SLAM (Simultaneous Localization and Mapping).
  • the self-position estimation unit 132 supplies data indicating the result of the estimation process to the map analysis unit 151 of the situation analysis unit 133, the situation recognition unit 152, and the like.
  • the self-position estimating unit 132 stores the self-position estimating map in the storage unit 109.
  • specifically, the self-position estimation unit 132 accumulates, in a database, time-series information supplied in time series based on the detection results from the sensor group 112, estimates the self-position based on the accumulated time-series information, and outputs it as the time-series information self-position. Further, the self-position estimation unit 132 estimates the self-position based on the current detection results supplied from the sensor group 112 and outputs it as the current information self-position. Then, the self-position estimation unit 132 outputs the self-position estimation result by integrating or switching between the time-series information self-position and the current information self-position.
  • for example, when the self-position estimation unit 132 detects a change in the posture of the moving body 10 based on the detection results supplied from the sensor group 112 and the self-position changes greatly, so that the estimation accuracy of the time-series information self-position is considered to be lowered, the self-position is estimated only from the current information self-position.
  • further, even when a change in the posture of the moving body 10 is not detected based on the detection results supplied from the sensor group 112, if the self-position changes greatly, the estimation accuracy of the time-series information self-position is assumed to be lowered, and the self-position is estimated only from the current information self-position.
  • the moving body 10 may be a vehicle and mounted on a car ferry boat to move.
  • in this way, the self-position is estimated only from the current information self-position even when there is a posture change that cannot be predicted in advance, or when the self-position changes greatly regardless of the influence of external force, so the self-position can be estimated with a predetermined accuracy.
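  • as a rough illustration of this integrating/switching behavior, the following Python sketch blends the time-series information self-position with the current information self-position and falls back to the current estimate alone when a large, unpredicted jump is detected; the thresholds, the simple averaging, and all names are assumptions made for this example, not the implementation described here.

```python
# Illustrative sketch (not the implementation described here) of integrating or
# switching between the time-series information self-position and the current
# information self-position. Thresholds and the averaging rule are assumptions.
from dataclasses import dataclass
import numpy as np


@dataclass
class Pose:
    xyz: np.ndarray   # position [m]
    rpy: np.ndarray   # roll, pitch, yaw [rad]


class SelfPositionEstimator:
    def __init__(self, posture_jump_thresh=0.3, position_jump_thresh=0.5):
        self.posture_jump_thresh = posture_jump_thresh    # [rad], assumed value
        self.position_jump_thresh = position_jump_thresh  # [m], assumed value

    def estimate(self, time_series_pose: Pose, current_pose: Pose) -> Pose:
        """Return the self-position estimate from the two candidate estimates."""
        posture_jump = float(np.max(np.abs(time_series_pose.rpy - current_pose.rpy)))
        position_jump = float(np.linalg.norm(time_series_pose.xyz - current_pose.xyz))
        if posture_jump > self.posture_jump_thresh or position_jump > self.position_jump_thresh:
            # Large, unpredicted change: the time-series estimate is assumed to be
            # degraded, so only the current information self-position is used.
            return current_pose
        # Otherwise integrate the two estimates (a simple average as a placeholder).
        return Pose(xyz=(time_series_pose.xyz + current_pose.xyz) / 2.0,
                    rpy=(time_series_pose.rpy + current_pose.rpy) / 2.0)
```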
  • the situation analysis unit 133 performs analysis processing of the moving body and the surrounding situation.
  • the situation analysis unit 133 includes a map analysis unit 151, a situation recognition unit 152, and a situation prediction unit 153.
  • the map analysis unit 151 analyzes the various maps stored in the storage unit 109 while using, as necessary, data or signals from each part of the mobile control system 100, such as the self-position estimation unit 132 and the moving body external information detection unit 141, and constructs a map including information necessary for autonomous movement processing.
  • the map analysis unit 151 supplies the constructed map to the situation recognition unit 152, the situation prediction unit 153, and the route planning unit 161, action planning unit 162, motion planning unit 163, and the like of the planning unit 134.
  • the situation recognition unit 152 performs recognition processing of the situation regarding the moving body based on data or signals from each part of the mobile body control system 100, such as the self-position estimation unit 132, the moving body external information detection unit 141, the moving body internal information detection unit 142, the moving body state detection unit 143, and the map analysis unit 151. For example, the situation recognition unit 152 performs recognition processing of the situation of the moving body, the situation around the moving body, the situation of the driver of the moving body, and the like. In addition, the situation recognition unit 152 generates, as necessary, a local map (hereinafter referred to as a situation recognition map) used for recognizing the situation around the moving body.
  • the situation recognition map is, for example, an occupancy grid map (Occupancy Grid Map), a lane map (Lane Map), or a point cloud map (Point Cloud Map).
  • the situation of the moving body to be recognized includes, for example, the position, posture, movement (for example, speed, acceleration, moving direction, etc.) of the moving body, presence / absence and contents of the abnormality.
  • the situation around the moving body to be recognized includes, for example, the types and positions of surrounding stationary objects, the types, positions, and movements (for example, speed, acceleration, and moving direction) of surrounding moving objects, the configuration of the surrounding roads, the road surface condition, and the surrounding weather, temperature, humidity, brightness, and the like.
  • the state of the driver to be recognized includes, for example, physical condition, arousal level, concentration level, fatigue level, line of sight movement, and driving operation.
  • the situation recognition unit 152 supplies data indicating the result of recognition processing (including a situation recognition map as necessary) to the self-position estimation unit 132, the situation prediction unit 153, and the like. Further, the situation recognition unit 152 stores the situation recognition map in the storage unit 109.
  • the situation prediction unit 153 performs a process for predicting a situation related to the moving body based on data or signals from each part of the moving body control system 100 such as the map analysis unit 151 and the situation recognition unit 152. For example, the situation prediction unit 153 performs prediction processing such as the situation of the moving body, the situation around the moving body, and the situation of the driver.
  • the situation of the mobile object to be predicted includes, for example, the behavior of the mobile object, the occurrence of an abnormality, and the movable distance.
  • the situation around the moving object to be predicted includes, for example, the behavior of the moving object around the moving object, the change in the signal state, the change in the environment such as the weather, and the like.
  • the situation of the driver to be predicted includes, for example, the behavior and physical condition of the driver.
  • the situation prediction unit 153 supplies data indicating the result of the prediction processing, together with the data from the situation recognition unit 152, to the route planning unit 161, the action planning unit 162, the motion planning unit 163, and the like of the planning unit 134.
  • the route planning unit 161 plans a route to the destination based on data or signals from each part of the mobile control system 100 such as the map analysis unit 151 and the situation prediction unit 153. For example, the route planning unit 161 sets a route from the current position to the designated destination based on the global map. In addition, for example, the route planning unit 161 changes the route as appropriate based on conditions such as traffic jams, accidents, traffic restrictions, construction, and the physical condition of the driver. The route planning unit 161 supplies data indicating the planned route to the action planning unit 162 and the like.
  • the action planning unit 162 plans actions of the moving body for safely traveling the route planned by the route planning unit 161 within the planned time. For example, the action planning unit 162 plans start, stop, traveling direction (for example, forward, backward, left turn, right turn, and direction change), moving speed, overtaking, and the like.
  • the action planning unit 162 supplies data indicating the planned actions of the moving body to the motion planning unit 163 and the like.
  • more specifically, the action planning unit 162 generates, as action plan candidates, candidate actions of the moving body for safely moving within the planned time for each route planned by the route planning unit 161. For example, the action planning unit 162 generates action plan candidates using an A* algorithm (A-star search algorithm) that divides the environment into a grid and optimizes arrival determination and route weights to generate the best path, a Lane algorithm that sets the route along the road center line, an RRT (Rapidly-exploring Random Tree) algorithm that incrementally extends a path from the self-position toward reachable locations while appropriately pruning it, and the like; a sketch of the grid-based A* search follows below.
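  • as a point of reference for the grid-based A* search mentioned above, the following sketch shows a generic A-star implementation over an occupancy grid; the grid encoding, 4-connected moves, uniform step costs, and Manhattan heuristic are assumptions for illustration and are not taken from this description (the Lane and RRT algorithms are not shown).

```python
# Generic grid A* sketch (A-star search). 0 = free cell, 1 = blocked cell (assumed).
import heapq


def a_star(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])   # Manhattan heuristic
    open_set = [(h(start), 0, start, None)]                   # (f, g, node, parent)
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, node, parent = heapq.heappop(open_set)
        if node in came_from:                                 # already expanded
            continue
        came_from[node] = parent
        if node == goal:                                      # arrival determination
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):     # 4-connected moves
            nxt = (node[0] + dr, node[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                ng = g + 1                                    # route weight (uniform)
                if ng < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = ng
                    heapq.heappush(open_set, (ng + h(nxt), ng, nxt, node))
    return None


# Example usage: path = a_star([[0, 0, 0], [1, 1, 0], [0, 0, 0]], (0, 0), (2, 0))
```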
  • the motion planning unit 163 plans the motion of the moving body for realizing the actions planned by the action planning unit 162, based on data or signals from each part of the mobile control system 100 such as the map analysis unit 151 and the situation prediction unit 153. For example, the motion planning unit 163 plans acceleration, deceleration, movement trajectories, and the like. The motion planning unit 163 supplies data indicating the planned motion of the moving body to the operation control unit 135 and the like.
  • the operation control unit 135 controls the operation of the moving object.
  • for example, the operation control unit 135 detects emergency situations such as collision, contact, entry into a danger zone, driver abnormality, and moving body abnormality based on the detection results of the moving body external information detection unit 141, the moving body internal information detection unit 142, and the moving body state detection unit 143. When the occurrence of an emergency situation is detected, the operation control unit 135 plans an operation of the moving body for avoiding the emergency, such as a sudden stop or a sudden turn.
  • the operation control unit 135 performs acceleration/deceleration control for realizing the motion of the moving body planned by the motion planning unit 163. For example, the operation control unit 135 calculates a control target value of the driving force generator or braking device for realizing the planned acceleration, deceleration, or sudden stop, and supplies a control command indicating the calculated control target value to the drive system control unit 107.
  • the operation control unit 135 also performs direction control for realizing the motion of the moving body planned by the motion planning unit 163. For example, the operation control unit 135 calculates a control target value of the steering mechanism for realizing the planned movement trajectory or sudden turn, and supplies a control command indicating the calculated control target value to the drive system control unit 107.
  • FIG. 3 is a schematic diagram illustrating an appearance of a robot 20 that is an example of the moving body 10.
  • FIG. 4 is a block diagram illustrating a functional configuration example for executing an obstacle determination process.
  • a quadruped walking type robot having an articulated leg 21 is taken as an example.
  • the illustration of the multi-joint structure is simplified.
  • a distance sensor 25 (not shown in FIG. 3) is disposed on the front side of the head 22 of the robot 20, making it possible to measure the distance to an object existing in the measurement range M extending on the front side.
  • LiDAR is used as the distance sensor 25, and the distance to the object existing in the measurement range M is acquired as three-dimensional point cloud data.
  • the shape within the measurement range M can be determined from the three-dimensional point cloud data.
  • as the distance sensor 25, other sensors such as a laser distance sensor, an ultrasonic sensor, a radar, or a sonar may be used. For example, any type of sensor may be used as long as it can determine how far from the robot 20 each pixel of an image acquired by an imaging apparatus is, such as stereo matching using a stereo camera, distance measurement using an IR camera, or a laser range finder. Further, any sensor that can determine the shape within the measurement range M may be used.
  • a measurement range M is set on the front side which is a part of the periphery of the robot 20.
  • the present technology is not limited to this, and an arbitrary range around the robot 20 can be set as the measurement range M.
  • a distance sensor 25 is disposed at a position serving as a base point of the measurement range M.
  • the distance sensor 25 corresponds to a sensor that can detect the peripheral information of the robot 20.
  • the three-dimensional point cloud data in the measurement range M corresponds to the peripheral information of the robot 20 and shape data related to the periphery of the robot 20.
  • the robot 20 has a height position sensor 26 (not shown in FIG. 3).
  • the height position sensor 26 detects information related to the height position of the distance sensor 25.
  • the height position of the distance sensor 25 is typically calculated based on a predetermined reference position. That is, the amount of displacement in the height direction from the reference position is calculated as the height position.
  • the relative position from the floor surface 1 may be calculated using the position of the floor surface (ground) 1 on which the robot 20 is placed as a reference position.
  • the surface of another object that actually exists such as the ceiling surface 2 (see FIG. 5 and the like) may be adopted as the reference position.
  • the height position of the distance sensor 25 calculated in the reference state is set as the reference position. Then, the relative position from the reference position may be calculated as the height position of the distance sensor 25.
  • Which state of the robot 20 is set as the reference state may be arbitrarily set. For example, the initial state of the robot 20 arranged on the floor 1 and the standard state of the robot 20 can be set as the reference state.
  • the height position sensor 26 for example, a pressure sensor (atmospheric pressure sensor) is used.
  • the relative position from the floor surface 1 may be calculated using the Doppler effect of the ultrasonic sensor. It is also possible to calculate the height position by integrating the detection results of the acceleration sensor or the like.
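  • as hedged illustrations of how such detection results might be turned into a height position, the following sketch shows a hypsometric (barometric) estimate relative to a reference pressure and a simple double integration of vertical acceleration; the constants, function names, and the assumption that gravity has already been removed from the acceleration samples are illustrative only.

```python
# Two generic ways to obtain the relative height position of the sensor, as
# assumed illustrations of the barometric and inertial approaches mentioned above.
import math


def height_from_pressure(p_hpa, p_ref_hpa, temp_c=15.0):
    """Hypsometric approximation: height change [m] relative to the reference pressure."""
    R, g, M = 8.314462, 9.80665, 0.0289644   # gas constant, gravity, molar mass of air
    t_kelvin = temp_c + 273.15
    return (R * t_kelvin) / (g * M) * math.log(p_ref_hpa / p_hpa)


def height_from_acceleration(accel_z, dt, v0=0.0, h0=0.0):
    """Integrate vertical acceleration samples [m/s^2] (gravity already removed)
    twice over fixed time steps dt to track height; drifts without correction."""
    v, h = v0, h0
    for a in accel_z:
        v += a * dt
        h += v * dt
    return h
```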
  • the information on the height position includes various information that can calculate the height position, and includes, for example, a detection signal included in the output result of the sensor.
  • the information on the height position includes information on the calculated height position itself.
  • the robot 20 has an internal sensor 27 (not shown in FIG. 3).
  • the internal sensor 27 is a generic name for an acceleration sensor, a gyro sensor, an inertial measurement device (IMU), a geomagnetic sensor, and the like, and can detect the acceleration, angle, angular velocity, geomagnetic direction, and the like of the robot 20.
  • the distance sensor 25, the height position sensor 26, and the internal sensor 27 are included in the sensor group 11 (112) shown in FIGS. 1 and 2.
  • the robot 20 includes a height position calculation unit 30, a posture calculation unit 31, a determination region calculation unit 32, and an obstacle determination unit 33.
  • these blocks are configured in the autonomous movement control unit 110, which corresponds to the information processing apparatus according to the present embodiment. Typically, they are configured as part of the recognition processing unit 14 shown in FIG. 1 and the recognition processing unit 121 shown in FIG. 2. Of course, the configuration is not limited to this.
  • FIG. 5 is a flowchart showing an example of the obstacle determination process, and FIGS. 6 to 9 are diagrams for explaining the steps shown in FIG. 5. By executing the processing shown in FIG. 5, it is possible to determine whether there is an obstacle around the robot 20.
  • the cycle for executing the flow is not limited and may be set arbitrarily.
  • the height position calculation unit 30 calculates the height position of the distance sensor 25 based on the detection result of the height position sensor 26 (step 101).
  • the determination area calculation unit 32 calculates the determination area D based on the calculated height position of the distance sensor 25 (step 102).
  • the determination area D is an area for determining the situation around the robot 20, and is an area that serves as an identification reference when executing an obstacle determination process. In FIGS. 3 and 6 to 9, the area of the space illustrated in gray is the determination area D.
  • a range with a substantially quadrangular pyramid shape, having its apex on the front side of the head 22 of the robot 20 where the distance sensor 25 is arranged, is illustrated as the measurement range M.
  • the determination area D is calculated as an area included in the measurement range M.
  • the determination area D is defined by the first identification surface 35 set on the floor surface 1 side and the second identification surface 36 set on the ceiling surface 2 side.
  • a space area from the first identification surface 35 to the second identification surface 36 is set as the determination region D.
  • the first and second identification surfaces 35 and 36 correspond to the first and second determination surfaces.
  • a sensor coordinate system is set with the position of the distance sensor 25 as the origin and the direction of the detection axis L (detection direction) of the distance sensor 25 as the axial direction (Y-axis in FIG. 6).
  • each axial direction can be calculated by the internal sensor 27.
  • the method for setting the sensor coordinate system is not limited.
  • the first and second identification surfaces 35 and 36 are set based on the sensor coordinate system.
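  • a minimal sketch of such a sensor coordinate system is shown below: the sensor position is the origin, the detection axis L is the Y axis, and the orientation obtained from the internal sensor 27 is used to map sensor-frame points into a gravity-aligned world frame; the Z-Y-X Euler convention and all function names are assumptions for this illustration.

```python
# Sketch of the sensor coordinate system described above: origin at the distance
# sensor, Y along the detection axis L. The Z-Y-X (yaw-pitch-roll) convention and
# the gravity-aligned world frame are assumptions for this example.
import numpy as np


def rotation_from_rpy(roll, pitch, yaw):
    """Rotation matrix taking sensor-frame vectors to the world frame."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return rz @ ry @ rx


def sensor_to_world(points_sensor, sensor_origin_world, roll, pitch, yaw):
    """Transform Nx3 points measured in the sensor frame into the world frame."""
    rot = rotation_from_rpy(roll, pitch, yaw)
    return points_sensor @ rot.T + sensor_origin_world
```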
  • FIG. 6 illustrates a state (attitude) in which the detection axis L of the distance sensor 25 extends in the horizontal direction toward the front side.
  • in this example, this state is the reference state, and the height position H0 of the distance sensor 25 in this state is the reference position.
  • the determination area calculation unit 32 sets the first and second identification surfaces 35 and 36 to be parallel to the XY plane (that is, the horizontal plane) based on the sensor coordinate system. A region sandwiched between the first and second identification surfaces 35 and 36 is calculated as a determination region D.
  • the determination area D in the reference state will be described as the reference determination area D0.
  • the first identification surface 35 is set at a position higher than the floor surface 1 by an offset h floor (hereinafter referred to as hf).
  • the second identification surface 36 is set at a position lower than the ceiling surface 2 by an offset h ceiling (hereinafter referred to as hc).
  • the positions of the floor surface 1 and the ceiling surface 2 can be calculated by the internal sensor 27.
  • the specific values of the offsets hf and hc are not limited and may be set as appropriate. For example, by setting the offsets to be larger than the maximum error amount of the distance sensor 25, it is possible to improve the obstacle determination accuracy.
  • via the offset hf, the first identification surface 35 may be set at a position lower or higher than a specified height; for example, the offset hf may be set according to a height that the robot 20 cannot step over, a height that it can easily step over, or the like.
  • likewise, via the offset hc, the second identification surface 36 may be set at a position higher or lower than a specified height; for example, the offset hc may be set according to the maximum height of the head 22 of the robot 20 during walking.
  • the heights of the first and second identification surfaces 35 and 36 are set based on the moving environment, the performance of the robot 20, and the like so as to be advantageous for autonomous movement.
  • the heights of the first and second identification surfaces 35 and 36 are set so that an object to be avoided in movement can be extracted as an obstacle.
  • of course, the first identification surface 35 may be set at a position substantially equal to the floor surface 1, and the second identification surface 36 may be set at a position substantially equal to the ceiling surface 2 (that is, both offsets hf and hc may be substantially zero).
  • when setting the reference determination area D0, processing with a high calculation amount and high accuracy may be temporarily executed.
  • the reference determination area D0 can be set with high accuracy.
  • the setting of the reference determination area D0 may be executed by a user operation or may be automatically executed.
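  • the reference determination region D0 can be represented very simply, for example by the world heights of its two identification surfaces; the following sketch (with placeholder numbers, not values from this description) builds such a representation from the floor and ceiling heights and the offsets hf and hc.

```python
# Minimal representation of the reference determination region D0: bounded below
# by the first identification surface (floor + h_floor) and above by the second
# identification surface (ceiling - h_ceiling). Heights in metres; the concrete
# numbers below are placeholders, not values from the description.
from dataclasses import dataclass


@dataclass
class DeterminationRegion:
    lower_height: float   # first identification surface (floor side), world z [m]
    upper_height: float   # second identification surface (ceiling side), world z [m]


def reference_region(floor_z, ceiling_z, h_floor, h_ceiling):
    """Build D0 from the floor/ceiling heights and the offsets hf and hc."""
    return DeterminationRegion(lower_height=floor_z + h_floor,
                               upper_height=ceiling_z - h_ceiling)


# Example: floor at 0 m, ceiling at 2.4 m, hf = 0.05 m, hc = 0.10 m (placeholders)
d0 = reference_region(0.0, 2.4, 0.05, 0.10)
```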
  • the obstacle determination unit 33 acquires three-dimensional point cloud data detected by the distance sensor 25 (step 103). The obstacle determination unit 33 determines an obstacle based on the calculated determination region D and the detected three-dimensional point cloud data (step 104).
  • the obstacle determination unit 33 determines an obstacle based on the relationship between the three-dimensional point cloud data and the determination area D. Specifically, it is determined whether or not an obstacle exists by determining whether or not the point data included in the three-dimensional point cloud data is included in the determination region D.
  • points included in the determination area D (D0) are determined as points constituting an obstacle. Then, an object including a point constituting the obstacle is determined as the obstacle 3.
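  • a minimal sketch of this filtering step (step 104) is shown below, assuming the three-dimensional point cloud has already been transformed into a gravity-aligned world frame; the min_points threshold used to suppress single-point noise is an assumption for this example.

```python
# Sketch of step 104: a point is treated as part of an obstacle if its world-frame
# height lies between the first (lower) and second (upper) identification surfaces.
import numpy as np


def obstacle_points(points_world, lower_height, upper_height):
    """Return the subset of an Nx3 world-frame point cloud whose height lies
    between the first and second identification surfaces."""
    z = points_world[:, 2]
    return points_world[(z >= lower_height) & (z <= upper_height)]


def has_obstacle(points_world, lower_height, upper_height, min_points=5):
    """Declare an obstacle when enough points fall inside the determination region;
    the min_points threshold is an assumption to suppress single-point noise."""
    return obstacle_points(points_world, lower_height, upper_height).shape[0] >= min_points
```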
  • the object 4a existing on the floor surface 1 and the object 4b installed on the ceiling surface 2 are both determined as the obstacle 3.
  • the three-dimensional point cloud data obtained from the distance sensor 25 is filtered based on the first and second identification surfaces 35 and 36, and when at least a part of an object is contained in the determination region D, an avoidance operation such as a detour or a stop, output of a warning sound or a warning message, and the like are executed.
  • These processes are executed, for example, by the cooperation of the action plan processing unit 15 and the action control processing unit 16 illustrated in FIG.
  • these blocks correspond to a drive control unit that controls the drive unit based on the determination result by the determination unit.
  • the height position calculation unit 30 functions as an acquisition unit that acquires information about the height position of the sensor that can detect the peripheral information of the moving object.
  • the posture calculation unit 31 functions as an acquisition unit that acquires information regarding the tilt of the sensor. The attitude of the sensor will be described later.
  • the determination region calculation unit 32 functions as a calculation unit that calculates a determination region for determining the situation around the moving body based on the acquired information on the height position of the sensor.
  • the obstacle determination unit 33 functions as a determination unit that determines the situation around the moving body based on the calculated determination region and the peripheral information detected by the sensor.
  • FIGS. 7 and 8 are diagrams illustrating an example in which the leg 21 having a multi-joint structure is driven and the robot 20 moves.
  • the height position of the distance sensor 25 in this state is H1, and the amount of variation from the reference position H0 is Δt.
  • the height position of the measurement range M is also lowered by the variation amount Δt.
  • suppose that a determination region D′ similar to the reference determination region D0 shown in FIG. 6 is set in the sensor coordinate system based on the height position H1 of the distance sensor 25, that is, that planes defined by the same plane equations are set as the first and second identification surfaces 35′ and 36′. Then, the determination region D′ and the first and second identification surfaces 35′ and 36′ are set at positions lower by the variation amount Δt than the reference determination region D0 and the first and second identification surfaces 35 and 36 shown in FIG. 6.
  • the first identification surface 35′ becomes lower than the floor surface 1, and the floor surface 1 is included in the determination region D′.
  • the floor surface 1 is determined as the obstacle 3.
  • an object that can be traversed and should be ignored may be determined as the obstacle 3.
  • the second identification surface 36′ is significantly lower than the ceiling surface 2, and the object 4b existing on the ceiling surface 2 falls outside the determination region D′. As a result, the object 4b that should be determined as an obstacle cannot be seen, and the obstacle is overlooked.
  • an avoidance operation such as sudden braking is inadvertently performed, and proper autonomous movement is hindered. Further, the head 22 may collide with an object and the robot 20 may be damaged.
  • the determination region D is calculated based on the height position H1 of the distance sensor 25. Specifically, the height position of the determination region D is changed according to the change in the height position of the distance sensor 25.
  • specifically, the determination region D is calculated, with reference to the reference determination region D0 shown in FIG. 6, so that its height position is the same as the height position of the reference determination region D0. That is, the first identification surface 35 is calculated so as to have the same height position as the first identification surface 35 in the reference determination region D0. Similarly, the second identification surface 36 is calculated so as to have the same height position as the second identification surface 36 in the reference determination region D0.
  • the first identification surface 35 is set at a position higher than the floor surface 1 by the offset hf.
  • the second identification surface 36 is set at a position lower than the ceiling surface 2 by the offset hc.
  • in the sensor coordinate system, the plane equations of the first and second identification surfaces 35 and 36 are therefore different from those of the first and second identification surfaces 35 and 36 in the reference state.
  • the first and second identification surfaces 35 and 36 are offset upward along the vertical direction by the amount of variation ⁇ t of the distance sensor 25.
  • As a result, the objects 4a and 4b can be appropriately determined to be the obstacle 3, as in the reference state shown in FIG. 6.
  • The influence of swinging of the distance sensor 25 in the height direction is thus sufficiently suppressed, and objects that do not obstruct movement, such as the floor surface 1 and the ceiling surface 2, can be distinguished from objects that do obstruct movement (obstacles 3).
  • Since only the height position of the determination region D (the first and second identification surfaces 35 and 36) needs to be changed based on the height position of the distance sensor 25, no complicated algorithm is required and the amount of calculation is small, so the obstacle 3 can be determined easily (a minimal sketch of this step is given below).
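A minimal, hypothetical Python sketch of this height-only correction follows. It assumes that both identification surfaces are horizontal planes expressed as heights in the sensor coordinate system; the function and variable names (h0, h_sensor, hf, hc, and so on) are illustrative assumptions and not taken from this description.

```python
# Hypothetical sketch of the height-only correction described above (not the
# patented implementation). All names and conventions are illustrative.

def identification_surface_heights(h0, h_sensor, ceiling_height, hf, hc):
    """Return the sensor-frame heights of the first/second identification surfaces.

    h0            : reference height of the distance sensor above the floor
    h_sensor      : current height of the distance sensor above the floor
    ceiling_height: height of the ceiling above the floor
    hf, hc        : offsets above the floor / below the ceiling
    """
    delta_t = h0 - h_sensor             # > 0 when the sensor has moved down by delta_t
    floor_z_ref = -h0                   # floor height in the reference sensor frame
    ceil_z_ref = ceiling_height - h0    # ceiling height in the reference sensor frame
    first = floor_z_ref + hf + delta_t  # first identification surface, offset upward by delta_t
    second = ceil_z_ref - hc + delta_t  # second identification surface, offset upward by delta_t
    return first, second


def is_obstacle_candidate(point, first, second):
    """A measured point (x, y, z) in the sensor frame is an obstacle candidate
    when it lies inside the determination region between the two surfaces."""
    return first < point[2] < second
```

With this kind of shift, a point on the floor always falls just below the first identification surface regardless of how far the sensor has dipped, which is the effect described above.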
  • FIG. 9 is a diagram schematically illustrating another example of changing the height position of the determination region D according to the variation amount Δt.
  • When the measurement range M is sufficiently large, it is possible to change only the height position while maintaining the shape of the determination region D. That is, as shown in FIG. 9, when the height position of the head 22 (the height position of the distance sensor 25) is lowered, it is assumed that the reference determination region D0 is still included in the measurement range M. In this case, the reference determination region D0 may be used as the determination region D as it is. This can be regarded as processing that changes the height position of the determination region D relative to the sensor in accordance with the variation amount Δt.
  • the posture calculation unit 31 can calculate the posture of the robot 20 based on the detection result of the internal sensor 27.
  • the inclination of the distance sensor 25 (the inclination of the head 22) is detected as the posture of the robot 20.
  • For example, the inclination of the detection axis L of the distance sensor 25 and the rotation angle about the detection axis L are calculated as the inclination of the distance sensor 25. That is, when the X axis schematically shown in FIG. 6 is the pitch axis, the Y axis (detection axis) is the roll axis, and the Z axis is the yaw axis, the pitch angle, the roll angle, and the yaw angle are calculated as the inclination of the distance sensor 25.
  • Of course, the parameters expressing the inclination of the distance sensor 25 are not limited to these; any parameter that defines the relative posture with respect to the floor surface 1 may be calculated as the inclination of the distance sensor 25 (one way of obtaining such parameters from an IMU is sketched below).
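One possible way to obtain the pitch and roll of the detection axis from an IMU gravity estimate is sketched below. This is an illustrative assumption, not something prescribed by this description; the function and variable names are hypothetical, and the axis convention follows the text (X = pitch axis, Y = detection axis L, Z = yaw axis).

```python
import numpy as np

def detection_axis_inclination(gravity_in_sensor):
    """Estimate the inclination of the distance sensor from the gravity direction
    measured in the sensor frame (e.g., from a static accelerometer reading).
    Returns (pitch, roll) in radians; yaw is not observable from gravity alone
    and would come from a gyro or magnetometer.
    """
    g = np.asarray(gravity_in_sensor, dtype=float)
    g = g / np.linalg.norm(g)                       # unit vector pointing "down" in the sensor frame
    pitch = np.arcsin(np.clip(g[1], -1.0, 1.0))     # how far the detection axis Y dips below horizontal
    roll = np.arcsin(np.clip(g[0], -1.0, 1.0))      # how far the lateral X axis dips below horizontal
    return pitch, roll
```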
  • Hereinafter, the detection axis L of the distance sensor 25 may also be referred to as the position reference axis of the determination region D.
  • the determination area D may be calculated based on the inclination of the distance sensor 25 in addition to the height position of the distance sensor 25. In this case, a step of acquiring the inclination of the distance sensor 25 is added.
  • FIGS. 10 to 12 are diagrams showing an example in which the height position of the head 22 (the height position of the distance sensor 25) of the robot 20 is lowered and the inclination of the head 22 (the inclination of the distance sensor 25) varies from the horizontal direction. In FIGS. 10 to 12, the head 22 (distance sensor 25) is inclined downward from the horizontal direction (the pitch angle varies).
  • The measurement range M is lowered by the variation amount Δt in the height direction and is inclined downward by the inclination amount θ with respect to the horizontal direction.
  • The tilt amount θ can also be referred to as a posture displacement θ.
  • Suppose that a determination region D′ similar to the reference determination region D0 shown in FIG. 6 is set in the sensor coordinate system based on the height position H2 of the distance sensor 25.
  • In that case, the first and second identification surfaces 35′ and 36′ are also lowered by the variation amount Δt and are further tilted downward by the tilt amount θ.
  • As a result, the floor surface 1 is included in the determination region D′ and is determined to be an obstacle.
  • In addition, the object 4b existing on the ceiling surface 2 falls outside the determination region D′, and the obstacle is missed.
  • Suppose instead that only the inclination is corrected, that is, that first and second identification surfaces 35″ and 36″ are each corrected to be a horizontal plane and a determination region D″ is calculated. Even in this case, the floor surface 1 is included in the determination region D″ and is determined to be an obstacle, and the object 4b existing on the ceiling surface 2 falls outside the determination region D″, so the obstacle is missed. That is, erroneous detection due to the variation amount Δt in the height direction remains.
  • the determination region D is calculated based on the height position H2 of the distance sensor 25 and the inclination of the distance sensor 25. That is, the height position and the inclination of the determination region D are changed according to the change in the height position and the inclination of the distance sensor 25.
  • the inclination of the determination area D corresponds to the inclination of the position reference axis of the determination area D (the same axis as the detection axis L of the distance sensor 25).
  • Specifically, the determination region D is calculated so as to have the same height position as the reference determination region D0 and the same inclination as the reference determination region D0.
  • the first identification surface 35 is calculated so as to coincide with the height position and inclination of the first identification surface 35 in the reference determination area D0.
  • the second identification surface 36 is calculated so as to coincide with the height position and inclination of the second identification surface 36 in the reference determination region D0.
  • the first identification surface 35 is set at a position higher than the floor surface 1 by the offset hf.
  • the second identification surface 36 is set at a position lower than the ceiling surface 2 by the offset hc.
  • In the sensor coordinate system, the plane equations of the first and second identification surfaces 35 and 36 therefore differ from those of the first and second identification surfaces 35 and 36 in the reference state.
  • Specifically, the first and second identification surfaces 35 and 36 are offset upward by the variation amount Δt of the distance sensor 25 and are rotated in the reverse direction (−θ direction) by the inclination amount θ.
  • As a result, the objects 4a and 4b can be appropriately determined to be the obstacle 3, as in the reference state shown in FIG. 6.
  • the influence of the swing of the distance sensor 25 in the height direction can be sufficiently prevented, and the floor surface 1, the ceiling surface 2, and the obstacle 3 can be identified with high accuracy.
  • Since only the height position and inclination of the determination region D need to be changed based on the height position and inclination of the distance sensor 25, no complicated algorithm is required and the amount of calculation is small, so the obstacle 3 can be determined easily (see the sketch after this paragraph).
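A hypothetical sketch of the combined correction follows. Instead of rewriting the plane equations directly, it transforms the measured points into a gravity-aligned frame at the reference sensor height and then applies the fixed reference determination region D0, which is equivalent to offsetting the identification surfaces upward by Δt and rotating them by −θ. The rotation matrix, the sign convention for Δt, and all names are illustrative assumptions rather than the patented implementation.

```python
import numpy as np

def to_reference_frame(points_sensor, rotation_sensor_to_level, delta_t):
    """Transform measured points from the current (tilted, lowered) sensor frame into
    a gravity-aligned frame located at the reference sensor height.

    rotation_sensor_to_level: 3x3 rotation matrix from the tilted sensor frame to a
                              level frame (e.g., built from the IMU pitch/roll).
    delta_t                 : downward variation of the sensor height from H0.
    """
    pts = np.asarray(points_sensor, dtype=float)
    pts_level = pts @ rotation_sensor_to_level.T   # undo the sensor inclination
    pts_level[:, 2] -= delta_t                     # undo the height variation (sensor moved down)
    return pts_level


def classify_with_reference_region(points_sensor, rotation, delta_t, z_first, z_second):
    """Report as obstacle candidates the points falling between the first and second
    identification surfaces of the reference determination region D0."""
    pts = to_reference_frame(points_sensor, rotation, delta_t)
    z = pts[:, 2]
    return (z > z_first) & (z < z_second)
```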
  • FIG. 13 is a diagram schematically showing another example of changing the height position and inclination of the determination region D according to the variation amount Δt and the inclination amount θ.
  • When the measurement range M is sufficiently large, the height position and inclination of the determination region D can be changed while the shape of the determination region D is maintained. That is, as shown in FIG. 13, when the height position and inclination of the head 22 (the height position and inclination of the distance sensor 25) vary, it is assumed that the reference determination region D0 is still included in the measurement range M. In this case, the reference determination region D0 may be used as the determination region D as it is. This can be regarded as processing that changes the height position and inclination of the determination region D relative to the sensor in accordance with the variation amount Δt and the inclination amount θ.
  • The calculation of the determination region D based on the height position and inclination of the distance sensor 25 is not limited to the case where the calculation is performed with the reference determination region D0 as a reference. Even when the calculation is based on the reference determination region D0, it is not limited to the case where the determination region D is calculated so as to have the same height position and the same inclination as the reference determination region D0.
  • the size of the determination region D in the height direction may be corrected based on the height position of the distance sensor 25.
  • That is, the size in the height direction, i.e., the distance between the first and second identification surfaces 35 and 36, may be corrected according to the variation amount Δt in the height position.
  • For example, the correction amount for the determination region D is increased as the variation amount Δt increases. Thereby, the influence of the detection error of each sensor can be suppressed.
  • Whether the size of the determination region D in the height direction is to be increased or decreased may be set as appropriate so that effective autonomous movement and a safe design are realized, according to the moving environment and the performance and use of the robot 20.
  • Likewise, based on the inclination of the distance sensor 25, the size of the determination region D in the height direction may be corrected. For example, the correction amount for the determination region D is increased as the inclination amount θ increases. Thereby, the influence of the detection error of each sensor can be suppressed (a hypothetical sketch of such a correction follows).
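The following hypothetical sketch shows one way such a size correction could be parameterized. The gains and the direction of the correction (shrinking the region) are assumptions; as stated above, they would be chosen according to the environment, the robot's performance and use, and the desired safety design.

```python
def corrected_surface_heights(z_first, z_second, delta_t, theta,
                              k_height=0.5, k_tilt=0.2):
    """Adjust the determination region in the height direction according to the
    magnitudes of the height variation delta_t and the inclination theta, so that
    sensor measurement errors during fast swinging are less likely to cause
    misdetection. k_height and k_tilt are illustrative gains.
    """
    margin = k_height * abs(delta_t) + k_tilt * abs(theta)
    # Example policy: shrink the region (raise the lower surface, lower the upper one)
    # so that floor/ceiling points near the surfaces are not falsely reported.
    return z_first + margin, z_second - margin
```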
  • the determination region D for determining the situation around the robot 20 is calculated based on the information regarding the height position of the distance sensor 25. As a result, the situation around the robot 20 can be easily and accurately determined, and the obstacle 3 can be determined.
  • That is, an autonomous mobile robot that uses a distance sensor to recognize obstacles in the outside world can stably identify the floor/ceiling surfaces and obstacles even if the height and posture of the distance sensor change during movement.
  • In order for an autonomous mobile robot to move while avoiding obstacles, it is common to measure the environment with a distance sensor.
  • The distance sensor allows the robot to know the distances to external objects centered on itself.
  • A distance sensor generally measures the straight-line distance to an object using the reflection of a wave phenomenon that travels in straight lines, such as sound or light, so it detects both objects that become obstacles to movement and objects that do not obstruct movement, such as the floor and ceiling surfaces.
  • An autonomous mobile robot therefore needs to distinguish obstacles to movement from the floor/ceiling surfaces based on the signals obtained from the distance sensor.
  • If the robot can always move while maintaining a posture parallel to the ground, it is easy to identify the floor/ceiling surfaces and obstacles: from the obtained sensor signal, only the information in the region bounded by planes of appropriate heights fixed to the sensor coordinate system needs to be identified as obstacles (see the sketch below).
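As an illustration of this baseline case, the following hypothetical Python sketch keeps only the measured points that fall between a plane slightly above the floor and a plane slightly below the ceiling, both fixed in the sensor coordinate system. The parameter names are assumptions used only for this sketch.

```python
def obstacles_when_level(points_sensor, sensor_height, ceiling_height, hf, hc):
    """Baseline case: the sensor always stays level at a known height, so obstacles
    can be identified by a fixed height window in the sensor coordinate system."""
    z_floor = -sensor_height                  # floor height in the sensor frame
    z_ceil = ceiling_height - sensor_height   # ceiling height in the sensor frame
    lower, upper = z_floor + hf, z_ceil - hc
    return [p for p in points_sensor if lower < p[2] < upper]
```

This simple test breaks down as soon as the sensor height or posture varies, which is the problem the correction described above addresses.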
  • In practice, however, the distance sensor swings when the robot moves (walks), and its posture and height change with respect to the ground.
  • In particular, a distance sensor, which corresponds to an “eye” of the robot, is often installed at a high position on the robot body in order to secure a wide field of view, and its position and posture move greatly due to the swinging that accompanies movement.
  • a method of extracting a plane from a three-dimensional point group obtained from a distance sensor and recognizing an object other than the plane as an obstacle is conceivable.
  • With this method, the floor surface can be removed from the signal alone, without depending on the posture of the sensor with respect to the outside world.
  • However, this method requires a complicated procedure for detecting the floor surface from the point cloud and is difficult to apply to a small robot with scarce computational resources.
  • a method of discriminating between the floor and the obstacle based on the reflected light intensity of the distance sensor can be considered.
  • The reflection intensity from an obstacle directly facing the sensor is stronger than the reflection intensity from the floor/ceiling surfaces, which lie nearly parallel to the sensor optical axis; this technique makes use of that difference.
  • However, the reflection intensity is affected by the degree of light scattering due to the surface properties of the object and by the light absorption rate of the material, so the identification accuracy ends up being low in an environment surrounded by objects of various materials.
  • In addition, when the distance sensor is tilted greatly up or down and directly faces the floor surface, the reflection intensity from the floor surface becomes stronger than that from obstacles, so this method cannot be used.
  • a method is conceivable in which the distance sensor output of the outside world when there is no obstacle is acquired in advance, and the difference compared with the current observation value is identified as an obstacle.
  • this method has a problem in that it requires calculation to match the sensor signal acquired in advance and the coordinates of the current sensor output, and cannot be used in a case where an unknown area is to be searched.
  • Each of the above methods thus increases the calculation cost and cannot cope with changes in the height of the distance sensor.
  • In contrast, the present technology can be implemented only by coordinate comparison processing in a simple three-dimensional space as long as an inexpensive height sensor and attitude sensor are provided, and can therefore be realized at lower cost than high-load processing such as plane detection. Thus, the present technology can also be applied to a small device with limited calculation capability, such as a domestic pet robot. In addition, by adding the height position information from the height measurement unit, this technology can demonstrate robust identification capability against high-speed swinging of any three-dimensional sensor, including changes in height position and orientation.
  • the determination area is defined by the first and second identification surfaces (determination surfaces).
  • the determination area is not limited to this, and the determination area may be defined by one identification plane or three or more identification planes.
  • the determination area may be defined by a curved identification surface.
  • the shape of the determination region (the shape of the identification surface) may be set according to the shape of the floor surface or the ceiling surface. Note that the identification surface (determination surface) can also be said to be a threshold for determination.
  • the determination of the presence or absence of obstacles was performed as a determination of the surrounding situation.
  • The present technology is not limited to this, and may also be applied to counting predetermined objects existing in the vicinity, detecting a change in the size of an object, and the like. That is, any determination process may be executed as the determination of the surrounding situation.
  • FIG. 14 is an external view showing a configuration example of a vehicle equipped with an automatic driving control unit according to an embodiment of the present technology.
  • FIG. 14A is a perspective view illustrating a configuration example of the vehicle 290
  • FIG. 14B is a schematic view of the vehicle 290 as viewed from above.
  • the vehicle 290 has an automatic driving function capable of automatic traveling (autonomous movement) to a destination.
  • the vehicle 290 is an example of a moving body.
  • the vehicle 290 includes various sensors 291 used for automatic driving.
  • FIG. 14A schematically illustrates an imaging device 292 and a distance sensor 293 that are directed to the front of the vehicle 290.
  • the imaging device 292 and the distance sensor 293 function as an external sensor.
  • FIG. 14B schematically shows a wheel encoder 294 that detects the rotation and the like of each wheel.
  • the wheel encoder 294 functions as an internal sensor.
  • various sensors 291 are mounted on the vehicle 290, and movement control of the vehicle 290 is performed based on the output from the sensor 291.
  • FIG. 15 is a block diagram illustrating a configuration example of the vehicle control system 200 that controls the vehicle 290.
  • the vehicle control system 200 is a system that is provided in the vehicle 290 and performs various controls of the vehicle 290.
  • The input unit 201, data acquisition unit 202, communication unit 203, in-vehicle device 204, output control unit 205, output unit 206, drive system control unit 207, drive system 208, storage unit 209, and automatic driving control unit 210 shown in FIG. 15 are blocks corresponding to the input unit 101, data acquisition unit 102, communication unit 103, mobile internal device 104, output control unit 105, output unit 106, drive system control unit 107, drive system 108, storage unit 109, and autonomous movement control unit 110, respectively.
  • The detection unit 231, self-position estimation unit 232, situation analysis unit 233, planning unit 234, and operation control unit 235 in the automatic driving control unit 210 shown in FIG. 15 correspond to the detection unit 131, self-position estimation unit 132, situation analysis unit 133, planning unit 134, and operation control unit 135 included in the autonomous movement control unit 110 shown in FIG. 2.
  • the data acquisition unit 202 may include an imaging device that images the driver, a biological sensor that detects the driver's biological information, a microphone that collects sound in the passenger compartment, and the like.
  • the biometric sensor is provided, for example, on a seat surface or a steering wheel, and detects biometric information of a passenger sitting on the seat or a driver holding the steering wheel.
  • the body system control unit 280 controls the body system 281 by generating various control signals and supplying them to the body system 281. Further, the body system control unit 280 supplies a control signal to each unit other than the body system 281 as necessary, and notifies the control state of the body system 281 and the like.
  • the body system 281 includes various body-related devices mounted on the vehicle body.
  • the body system 281 includes a keyless entry system, a smart key system, a power window device, a power seat, a steering wheel, an air conditioner, and various lamps (for example, a head lamp, a back lamp, a brake lamp, a blinker, a fog lamp, etc.) Etc.
  • The traffic rule recognition unit 282 performs recognition processing of traffic rules around the vehicle 290 based on data or signals from each part of the vehicle control system 200, such as the self-position estimation unit 232, the vehicle outside information detection unit 241, and the map analysis unit 251. By this recognition processing, for example, the positions and states of traffic signals around the vehicle 290, the contents of traffic restrictions around the vehicle 290, and the lanes in which the vehicle can travel are recognized.
  • the traffic rule recognition unit 282 supplies data indicating the result of the recognition process to the situation prediction unit 253 and the like.
  • the operation control unit 235 controls the operation of the vehicle 290.
  • the operation control unit 235 includes an emergency situation avoiding unit 283, an acceleration / deceleration control unit 284, and a direction control unit 285.
  • The emergency situation avoiding unit 283 detects emergency situations such as a collision, contact, entry into a dangerous zone, a driver abnormality, and an abnormality of the vehicle 290, based on the detection results of the vehicle outside information detecting unit 241, the vehicle interior information detecting unit 242, and the vehicle state detecting unit 243. When the emergency situation avoiding unit 283 detects the occurrence of an emergency, it plans an operation of the vehicle 290, such as a sudden stop or a sudden turn, to avoid the emergency. The emergency situation avoiding unit 283 supplies data indicating the planned operation of the vehicle 290 to the acceleration/deceleration control unit 284, the direction control unit 285, and the like.
  • The acceleration/deceleration control unit 284 performs acceleration/deceleration control for realizing the operation of the vehicle 290 planned by the operation planning unit 263 or the emergency situation avoiding unit 283. For example, the acceleration/deceleration control unit 284 calculates a control target value of a driving force generator or a braking device for realizing the planned acceleration, deceleration, or sudden stop, and supplies a control command indicating the calculated control target value to the drive system control unit 207.
  • The direction control unit 285 performs direction control for realizing the operation of the vehicle 290 planned by the operation planning unit 263 or the emergency situation avoiding unit 283. For example, the direction control unit 285 calculates a control target value of the steering mechanism for realizing the traveling track or the sudden turn planned by the operation planning unit 263 or the emergency situation avoiding unit 283, and supplies a control command indicating the calculated control target value to the drive system control unit 207.
  • the present technology can be applied to the vehicle 290 and the vehicle control system 200 having the above-described configuration. That is, it is possible to cause the vehicle control system 200 to function as an information processing apparatus according to the present technology and to execute a determination process for the situation around the vehicle 290 including determination of an obstacle.
  • the technology according to the present disclosure can be applied to various products.
  • The technology according to the present disclosure may be realized as a device mounted on any kind of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
  • the information processing method according to the present technology including determination of an obstacle and the like is executed by the autonomous movement control unit mounted on the moving body.
  • the information processing method according to the present technology may be executed by the cloud server.
  • the cloud server operates as an information processing apparatus according to the present technology.
  • The information processing method and the program according to the present technology may be executed by interlocking a computer mounted on the vehicle with another computer (such as a cloud server) capable of communicating via a network or the like, and an information processing apparatus according to the present technology may thereby be constructed.
  • the information processing method and the program according to the present technology can be executed not only in a computer system configured by a single computer but also in a computer system in which a plurality of computers operate in conjunction with each other.
  • the system means a set of a plurality of components (devices, modules (parts), etc.), and it does not matter whether all the components are in the same housing. Accordingly, a plurality of devices housed in separate housings and connected via a network and a single device housing a plurality of modules in one housing are all systems.
  • The execution of the information processing method and the program according to the present technology by the computer system includes, for example, both the case where the acquisition of information on the sensor height position, the calculation of the determination region, the determination of the surrounding situation, and the like are executed by a single computer, and the case where each process is executed by a different computer.
  • the execution of each process by a predetermined computer includes causing another computer to execute a part or all of the process and acquiring the result.
  • the information processing method and program according to the present technology can be applied to a configuration of cloud computing in which one function is shared by a plurality of devices via a network and is processed jointly.
  • In the present disclosure, “same height position”, “same inclination”, “center”, “equal position”, “equal axis”, “parallel”, and the like include “substantially the same height position”, “substantially the same inclination”, “substantially the center”, “substantially equal position”, “substantially equal axis”, and “substantially parallel”. For example, states included within a predetermined range (for example, ±10%) with respect to “completely the same height position” and the like are also included.
  • Note that the present technology can also adopt the following configurations.
  • An information processing apparatus including: an acquisition unit that acquires information about the height position of a sensor capable of detecting peripheral information of a moving body; a calculation unit that calculates, based on the acquired information on the height position of the sensor, a determination region for determining a situation around the moving body; and a determination unit that determines the situation around the moving body based on the calculated determination region and the peripheral information detected by the sensor.
  • the calculation unit determines whether there is an obstacle around the moving body.
  • The information processing apparatus, wherein the acquisition unit acquires shape data related to the periphery of the moving body, and the determination unit determines a situation around the moving body based on a relationship between the detected shape data and the determination region.
  • The information processing apparatus, wherein the acquisition unit acquires three-dimensional point cloud data related to the periphery of the moving body, and the determination unit determines a situation around the moving body by determining whether or not point data included in the three-dimensional point cloud data is included in the determination region.
  • The information processing apparatus, wherein the calculation unit calculates one or more determination surfaces that define the determination region, and the one or more determination surfaces include at least one of a first determination surface corresponding to the ground and a second determination surface corresponding to a ceiling surface.
  • The information processing apparatus according to (6), wherein the calculation unit changes a height position of at least one of the first determination surface and the second determination surface according to a change in the height position of the sensor.
  • The information processing apparatus, wherein the calculation unit calculates the determination region with a predetermined reference determination region as a reference, based on the acquired information on the height position of the sensor.
  • The information processing apparatus, wherein the calculation unit calculates the determination region so that it has the same height position as the height position of the reference determination region.
  • The information processing apparatus, wherein the calculation unit corrects a size of the calculated determination region in the height direction based on the acquired information on the height position of the sensor.
  • The information processing apparatus according to any one of (1) to (10), wherein the acquisition unit acquires information on the tilt of the sensor, and the calculation unit calculates the determination region based on the information related to the tilt of the sensor.
  • The information processing apparatus according to (11), wherein the calculation unit changes an inclination of the determination region according to a change in the inclination of the sensor.
  • A moving body including: a drive unit; a sensor capable of detecting surrounding information; a detection unit that detects information on a height position of the sensor; a calculation unit that calculates a determination region for determining a surrounding situation based on the detected information on the height position of the sensor; a determination unit that determines the surrounding situation based on the calculated determination region and the peripheral information detected by the sensor; and a drive control unit that controls the drive unit based on a determination result by the determination unit.
  • the driving unit is a leg having a multi-joint structure.
  • Reference signs: D: determination region; D0: reference determination region; 1: floor surface; 2: ceiling surface; 3: obstacle; 10: moving body; 20: robot; 21: leg; 25: distance sensor; 26: height position sensor; 27: internal sensor; 30: position calculation unit; 31: posture calculation unit; 32: determination region calculation unit; 33: obstacle determination unit; 35: first identification surface; 36: second identification surface; 100: mobile body control system; 200: vehicle control system; 290: vehicle; 293: distance sensor

Abstract

To achieve the objective of this invention, an information processing device according to an embodiment of the present technology comprises an acquisition part, a computation part, and an assessment part. The acquisition part acquires information relating to the height position of a sensor capable of detecting mobile body periphery information. The computation part uses the acquired sensor height position information to compute an assessment region for assessing a situation in the periphery of the mobile body. The assessment part assesses the situation in the periphery of the mobile body on the basis of the computed assessment region and the periphery information having been detected by the sensor.

Description

Information processing apparatus, information processing method, program, and moving body
 The present technology relates to an information processing apparatus, an information processing method, a program, and a moving body that can be applied to autonomous movement control of the moving body.
 In recent years, technologies for realizing autonomous movement of vehicles and robots have been developed. Patent Document 1 describes an obstacle determination device that enables a vehicle or the like that performs autonomous movement to avoid an obstacle. In this obstacle determination device, the shape of the measurement range (measurement space region) by the distance measuring device is changed based on the inclination of the distance measuring device. This prevents erroneous determination of an obstacle (paragraphs [0041] to [0054] and FIG. 4 of Patent Document 1).
Patent Document 1: JP 2017-59150 A
 In the future, systems using autonomous movement control for vehicles, robots, and the like are expected to become widespread, and a technology that makes it possible to easily and accurately determine the situation around a moving body without using complicated algorithms is required.
 In view of the circumstances described above, an object of the present technology is to provide an information processing apparatus, an information processing method, a program, and a moving body that can easily and accurately determine the situation around the moving body.
In order to achieve the above object, an information processing apparatus according to an embodiment of the present technology includes an acquisition unit, a calculation unit, and a determination unit.
The acquisition unit acquires information related to a height position of a sensor capable of detecting peripheral information of the moving body.
The calculation unit calculates a determination region for determining a situation around the moving body based on the acquired information on the height position of the sensor.
The determination unit determines a state of the periphery of the moving body based on the calculated determination area and the peripheral information detected by the sensor.
 In this information processing apparatus, a determination region for determining the situation around the moving body is calculated based on the information on the height position of the sensor. This makes it possible to easily and accurately determine the situation around the moving body.
 The calculation unit may determine whether there is an obstacle around the moving body.
 The acquisition unit may acquire shape data related to the periphery of the moving body. In this case, the determination unit may determine a situation around the moving body based on the relationship between the detected shape data and the determination region.
 The acquisition unit may acquire three-dimensional point cloud data related to the periphery of the moving body. In this case, the determination unit may determine the situation around the moving body by determining whether or not the point data included in the three-dimensional point cloud data is included in the determination region.
 The calculation unit may change the height position of the determination region according to a change in the height position of the sensor.
 The calculation unit may calculate one or more determination surfaces that define the determination region. In this case, the one or more determination surfaces may include at least one of a first determination surface corresponding to the ground and a second determination surface corresponding to the ceiling surface.
 The calculation unit may change the height position of at least one of the first determination surface and the second determination surface according to a change in the height position of the sensor.
 The calculation unit may calculate the determination region with a predetermined reference determination region as a reference, based on the acquired information on the height position of the sensor.
 The calculation unit may calculate the determination region so that it has the same height position as the reference determination region.
 The calculation unit may correct the size of the calculated determination region in the height direction based on the acquired information on the height position of the sensor.
 The acquisition unit may acquire information related to the tilt of the sensor. In this case, the calculation unit may calculate the determination region based on the information related to the tilt of the sensor.
 The calculation unit may change the inclination of the determination region according to a change in the inclination of the sensor.
An information processing method according to an embodiment of the present technology is an information processing method executed by a computer system, and includes acquiring information related to a height position of a sensor capable of detecting peripheral information of a moving object.
Based on the acquired information on the height position of the sensor, a determination region for determining a situation around the moving body is calculated.
Based on the calculated determination area and the surrounding information detected by the sensor, a situation around the moving body is determined.
A program according to an embodiment of the present technology causes a computer system to execute the following steps.
Obtaining information related to a height position of a sensor capable of detecting peripheral information of the moving body;
Calculating a determination region for determining a situation around the moving body based on the acquired information on the height position of the sensor;
A step of determining a situation around the moving body based on the calculated determination region and the peripheral information detected by the sensor;
A moving body according to an embodiment of the present technology includes a drive unit, a sensor, a detection unit, a calculation unit, and a drive control unit.
The sensor can detect surrounding information.
The detection unit detects information regarding the height position of the sensor.
The calculation unit calculates a determination region for determining a surrounding situation based on the detected information on the height position of the sensor.
The determination unit determines a surrounding situation based on the calculated determination region and the peripheral information detected by the sensor.
The drive control unit controls the drive unit based on a determination result by the determination unit.
 The drive unit may be a leg having a multi-joint structure.
 As described above, according to the present technology, it is possible to easily and accurately determine the situation around the moving body. Note that the effects described here are not necessarily limited, and any of the effects described in the present disclosure may be obtained.
FIG. 1 is a block diagram showing a configuration example outlining a moving body according to an embodiment of the present technology.
FIG. 2 is a block diagram showing a schematic functional configuration example of a mobile body control system that controls the moving body.
FIG. 3 is a schematic diagram showing the appearance of a robot, which is an example of the moving body.
FIG. 4 is a block diagram showing a functional configuration example for executing obstacle determination processing.
FIG. 5 is a flowchart showing an example of the obstacle determination processing.
FIG. 6 is a schematic diagram showing an example of the reference state of the robot.
FIG. 7 is a schematic diagram showing an example of variation in the height position of the distance sensor.
FIG. 8 is a diagram schematically showing an example of changing the height position of the determination region according to the variation amount.
FIG. 9 is a diagram schematically showing another example of changing the height position of the determination region according to the variation amount.
FIG. 10 is a schematic diagram showing an example of variation in the height position and inclination of the distance sensor.
FIG. 11 is a schematic diagram showing an example of variation in the height position and inclination of the distance sensor.
FIG. 12 is a diagram schematically showing an example of setting the determination region according to the variation amount and the inclination amount.
FIG. 13 is a diagram schematically showing another example of setting the determination region according to the variation amount and the inclination amount.
FIG. 14 is an external view showing a configuration example of a vehicle equipped with an automatic driving control unit according to an embodiment of the present technology.
FIG. 15 is a block diagram showing a configuration example of a vehicle control system that controls the vehicle.
 Hereinafter, embodiments of the present technology will be described with reference to the drawings.
[Outline of this disclosure]
FIG. 1 is a block diagram illustrating a configuration example that is an outline of a moving object 10 according to an embodiment of the present technology.
 The moving body 10 is, for example, a robot, and includes a sensor group 11, an autonomous movement control unit 12, and an actuator group 13.
 The sensor group 11 includes sensors 11a-1 to 11a-n that detect various kinds of information necessary for recognizing the inside of the moving body 10 and the surroundings of the moving body 10, and outputs the detection results to the autonomous movement control unit 12. When there is no need to particularly distinguish the sensors 11a-1 to 11a-n, they are simply referred to as the sensor 11a, and the other configurations are referred to in the same manner.
 More specifically, the sensors 11a-1 to 11a-n include, for example, a camera that images the surroundings of the moving body 10, an acceleration sensor that detects the movement of the moving body 10, LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) and ToF (Time of Flight) sensors that measure the distance to objects existing around the moving body 10, a geomagnetic sensor that detects direction, a gyro sensor, an acceleration sensor, an atmospheric pressure sensor that detects changes in ambient pressure, a contact sensor that detects the presence or absence of contact, a temperature sensor that detects temperature, a humidity sensor that detects humidity, a PSD (Position Sensitive Detector) distance sensor, and a GNSS (Global Navigation Satellite System) receiver that detects the position on the earth.
 The autonomous movement control unit 12 recognizes the surrounding situation from the various detection results of the sensor group 11, generates an action plan based on the recognition result, and operates the various actuators 13a-1 to 13a-n of the actuator group 13 that drive the robot according to the action plan. When it is not necessary to distinguish the actuators 13a-1 to 13a-n, they are simply referred to as the actuator 13a, and the other configurations are referred to in the same manner.
 More specifically, the autonomous movement control unit 12 includes a recognition processing unit 14, an action plan processing unit 15, and an action control processing unit 16.
 The recognition processing unit 14 executes recognition processing based on the detection results supplied from the sensor group 11, recognizes, for example, images, people, objects, types of facial expressions, positions, attributes, and the positions of itself and obstacles, and outputs the results to the action plan processing unit 15 as recognition results. The recognition processing unit 14 also estimates the self-position based on the detection results supplied from the sensor group 11. At this time, the recognition processing unit 14 estimates the self-position using a predetermined model. Furthermore, when the self-position cannot be estimated using the predetermined model due to the influence of an external force, the recognition processing unit 14 estimates the self-position with a model different from the predetermined model, based on the detection results supplied from the sensor group 11.
 Based on the recognition result, the action plan processing unit 15 generates an action plan, which is the overall behavior of the moving body 10, such as the trajectory of movement of the devices related to the movement of the moving body 10, state changes, and speed or acceleration, and supplies it to the action control processing unit 16.
 The action control processing unit 16 generates control signals for controlling the specific movements of the actuators 13a-1 to 13a-n of the actuator group 13 based on the action plan supplied from the action plan processing unit 15, and operates the actuator group 13.
 The actuator group 13 operates the actuators 13a-1 to 13a-n that specifically operate the moving body 10 based on the control signals supplied from the action control processing unit 16. More specifically, the actuators 13a-1 to 13a-n operate motors, servo motors, brakes, and the like that realize the specific movements of the moving body 10 based on the control signals.
 In addition, the actuators 13a-1 to 13a-n include configurations that realize expansion/contraction motion, bending/stretching motion, turning motion, and the like, and can further include a display unit such as an LED (Light Emitting Diode) or LCD (Liquid Crystal Display) that displays information, and a speaker that outputs sound. Therefore, by controlling the actuator group 13 based on the control signals, the operations of the various devices that drive the moving body 10 are realized, information is displayed, and sound is output.
 That is, by controlling the actuators 13a-1 to 13a-n of the actuator group 13, the operations related to the movement of the moving body 10 are controlled, and the presentation of various kinds of information, such as information display and sound output, is also controlled.
[Configuration example of mobile control system]
A moving body control system for controlling the moving body 10 for realizing the functions described above will be described.
 図2は、本開示の移動体10を制御する移動体制御システム100の概略的な機能の構成例を示すブロック図である。なお、図2の移動体制御システム100は、本技術が適用され得るロボットからなる移動体10を制御する移動体制御システムの一例であるが、他の移動体、例えば、航空機、船舶、及びマルチローターコプター(ドローン)などを制御するシステムとして適用することもできる。また、ロボットについても、車輪型のロボットや搭乗可能な自動運転車でもよいし、多足歩行型のロボットでもよい。もちろん駆動部として、多関節構造を有する脚部を備えるロボットにも適用可能である。 FIG. 2 is a block diagram illustrating a schematic configuration example of a function of the moving body control system 100 that controls the moving body 10 of the present disclosure. 2 is an example of a mobile body control system that controls the mobile body 10 including a robot to which the present technology can be applied, but other mobile bodies such as an aircraft, a ship, and a multi It can also be applied as a system for controlling a rotor copter (drone) or the like. Also, the robot may be a wheel type robot, an autonomous driving vehicle that can be boarded, or a multi-legged walking type robot. Of course, the present invention can also be applied to a robot having a leg portion having a multi-joint structure as a drive unit.
 移動体制御システム100は、入力部101、データ取得部102、通信部103、移動体内部機器104、出力制御部105、出力部106、駆動系制御部107、駆動系システム108、記憶部109、及び自律移動制御部110を備える。入力部101、データ取得部102、通信部103、出力制御部105、駆動系制御部107、記憶部109、及び、自律移動制御部110は、通信ネットワーク111を介して、相互に接続されている。通信ネットワーク111は、例えば、CAN(Controller Area Network)、LIN(Local Interconnect Network)、IEEE802.3等のLAN(Local Area Network)、又は、FlexRay(登録商標)等の任意の規格に準拠した通信ネットワークやバス、あるいは規格化されていない独自の通信方式等からなる。なお、移動体制御システム100の各部は、通信ネットワーク111を介さずに、直接接続される場合もある。 The mobile control system 100 includes an input unit 101, a data acquisition unit 102, a communication unit 103, a mobile internal device 104, an output control unit 105, an output unit 106, a drive system control unit 107, a drive system system 108, a storage unit 109, And an autonomous movement control unit 110. The input unit 101, data acquisition unit 102, communication unit 103, output control unit 105, drive system control unit 107, storage unit 109, and autonomous movement control unit 110 are connected to each other via a communication network 111. . The communication network 111 is, for example, a communication network conforming to any standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network) such as IEEE 802.3, or FlexRay (registered trademark). Or a bus, or a non-standardized communication method. In addition, each part of the mobile body control system 100 may be directly connected without going through the communication network 111.
 なお以下、移動体制御システム100の各部が、通信ネットワーク111を介して通信を行う場合、通信ネットワーク111の記載を省略するものとする。例えば、入力部101と自律移動制御部110が、通信ネットワーク111を介して通信を行う場合、単に入力部101と自律移動制御部110が通信を行うと記載する。 In the following, when each unit of the mobile control system 100 performs communication via the communication network 111, the description of the communication network 111 is omitted. For example, when the input unit 101 and the autonomous movement control unit 110 communicate via the communication network 111, it is simply described that the input unit 101 and the autonomous movement control unit 110 communicate.
 入力部101は、搭乗者が各種のデータや指示等の入力に用いる装置を備える。例えば、入力部101は、タッチパネル、ボタン、マイクロフォン、スイッチ、及び、レバー等の操作デバイス、並びに、音声やジェスチャ等により手動操作以外の方法で入力可能な操作デバイス等を備える。また、例えば、入力部101は、赤外線若しくはその他の電波を利用したリモートコントロール装置、又は、移動体制御システム100の操作に対応したモバイル機器若しくはウェアラブル機器等の外部接続機器であってもよい。入力部101は、搭乗者により入力されたデータや指示等に基づいて入力信号を生成し、移動体制御システム100の各部に供給する。 The input unit 101 includes a device used by the passenger for inputting various data and instructions. For example, the input unit 101 includes an operation device such as a touch panel, a button, a microphone, a switch, and a lever, and an operation device that can be input by a method other than manual operation by voice, gesture, or the like. In addition, for example, the input unit 101 may be a remote control device using infrared rays or other radio waves, or an external connection device such as a mobile device or a wearable device corresponding to the operation of the mobile control system 100. The input unit 101 generates an input signal based on data and instructions input by the passenger and supplies the input signal to each unit of the mobile control system 100.
 データ取得部102は、移動体制御システム100の処理に用いるデータを取得する各種のセンサ等を備え、取得したデータを、移動体制御システム100の各部に供給する。 The data acquisition unit 102 includes various sensors that acquire data used for processing of the mobile body control system 100, and supplies the acquired data to each part of the mobile body control system 100.
 例えば、データ取得部102は、移動体の状態等を検出するための各種のセンサを備えることでセンサ群112を構成し、図1のセンサ11a-1~11a-nより構成されるセンサ群11に対応する。具体的には、例えば、データ取得部102は、ジャイロセンサ、加速度センサ、慣性計測装置(IMU)、及び、アクセル等の加速入力の操作量、減速入力の操作量、方向指示入力の操作量、エンジンやモータ等の駆動装置の回転数や入出力エネルギー・燃料量、エンジンやモータ等のトルク量、若しくは、車輪や関節の回転速度やトルク等を検出するためのセンサ等を備える。 For example, the data acquisition unit 102 includes various sensors for detecting the state of the moving body and the like, thereby configuring the sensor group 112, and the sensor group 11 including the sensors 11a-1 to 11a-n in FIG. Corresponding to Specifically, for example, the data acquisition unit 102 includes a gyro sensor, an acceleration sensor, an inertial measurement device (IMU), an acceleration input operation amount such as an accelerator, a deceleration input operation amount, a direction instruction input operation amount, A sensor or the like is provided for detecting the rotational speed, input / output energy / fuel amount of the driving device such as the engine or motor, the torque amount of the engine or motor, or the rotational speed or torque of the wheel or joint.
 また例えば、データ取得部102は、移動体の外部の情報を検出するための各種のセンサを備える。具体的には、例えば、データ取得部102は、ToFカメラ、ステレオカメラ、単眼カメラ、赤外線カメラ、偏光カメラ、及び、その他のカメラ等の撮像装置を備える。また、例えば、データ取得部102は、天候又は気象等を検出するための環境センサ、及び、移動体の周囲の物体を検出するための周囲情報検出センサを備える。環境センサは、例えば、雨滴センサ、霧センサ、日照センサ、雪センサ等からなる。周囲情報検出センサは、例えば、レーザ測距センサ、超音波センサ、レーダ、LiDAR、ソナー等からなる。 For example, the data acquisition unit 102 includes various sensors for detecting information outside the moving body. Specifically, for example, the data acquisition unit 102 includes an imaging device such as a ToF camera, a stereo camera, a monocular camera, an infrared camera, a polarization camera, and other cameras. Further, for example, the data acquisition unit 102 includes an environmental sensor for detecting weather or weather and a surrounding information detection sensor for detecting an object around the moving body. The environmental sensor includes, for example, a raindrop sensor, a fog sensor, a sunshine sensor, a snow sensor, and the like. The ambient information detection sensor includes, for example, a laser range sensor, an ultrasonic sensor, a radar, a LiDAR, a sonar, and the like.
 さらに例えば、データ取得部102は、移動体の現在位置を検出するための各種のセンサを備える。具体的には、例えば、データ取得部102は、GNSS衛星からのGNSS信号を受信するGNSS受信機等を備える。 Further, for example, the data acquisition unit 102 includes various sensors for detecting the current position of the moving object. Specifically, for example, the data acquisition unit 102 includes a GNSS receiver that receives a GNSS signal from a GNSS satellite.
The communication unit 103 communicates with the moving body internal device 104 as well as various devices, servers, and base stations outside the moving body, transmitting data supplied from each unit of the moving body control system 100 and supplying received data to each unit of the moving body control system 100. The communication protocol supported by the communication unit 103 is not particularly limited, and the communication unit 103 may support a plurality of types of communication protocols.
For example, the communication unit 103 performs wireless communication with the moving body internal device 104 using a wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), WUSB (Wireless USB), or the like. Also, for example, the communication unit 103 performs wired communication with the moving body internal device 104 via a connection terminal (and a cable if necessary), not shown, using USB (Universal Serial Bus), HDMI (High-Definition Multimedia Interface), MHL (Mobile High-definition Link), or the like.
Further, for example, the communication unit 103 communicates, via a base station or an access point, with a device (for example, an application server or a control server) on an external network (for example, the Internet, a cloud network, or an operator-specific network). Also, for example, the communication unit 103 communicates with a terminal near the moving body (for example, a terminal of a pedestrian or a store, or an MTC (Machine Type Communication) terminal) using P2P (Peer to Peer) technology. Further, for example, when the moving body 10 is a vehicle, the communication unit 103 performs V2X communication such as vehicle-to-vehicle communication, vehicle-to-infrastructure communication, vehicle-to-home communication, and vehicle-to-pedestrian communication. In addition, for example, the communication unit 103 includes a beacon receiving unit that receives radio waves or electromagnetic waves transmitted from wireless stations or the like installed on the road to acquire information such as the current position, traffic congestion, traffic restrictions, or required time.
The moving body internal device 104 includes, for example, a mobile device or wearable device possessed by a passenger, an information device carried into or attached to the moving body, and a navigation device that searches for a route to an arbitrary destination.
The output control unit 105 controls the output of various kinds of information to the passengers of the moving body or to the outside of the moving body. For example, the output control unit 105 generates an output signal including at least one of visual information (for example, image data) and auditory information (for example, audio data) and supplies it to the output unit 106, thereby controlling the output of visual and auditory information from the output unit 106. Specifically, for example, the output control unit 105 combines image data captured by different imaging devices of the data acquisition unit 102 to generate a bird's-eye image or a panoramic image, and supplies an output signal including the generated image to the output unit 106. Also, for example, the output control unit 105 generates audio data including a warning sound or a warning message for dangers such as collision, contact, or entry into a danger zone, and supplies an output signal including the generated audio data to the output unit 106.
The output unit 106 includes a device capable of outputting visual or auditory information to the passengers of the moving body or to the outside of the moving body. For example, the output unit 106 includes a display device, an instrument panel, an audio speaker, headphones, a wearable device such as a glasses-type display worn by a passenger, a projector, and a lamp. The display device included in the output unit 106 may be, besides a device having an ordinary display, a device that displays visual information within the driver's field of view, such as a head-up display, a transmissive display, or a device having an AR (Augmented Reality) display function. Note that the output control unit 105 and the output unit 106 are not essential to the autonomous movement processing and may be omitted as necessary.
The drive-system control unit 107 controls the drive system 108 by generating various control signals and supplying them to the drive system 108. The drive-system control unit 107 also supplies control signals to units other than the drive system 108 as necessary, for example to notify them of the control state of the drive system 108.
The drive system 108 includes various devices related to the drive system of the moving body. For example, the drive system 108 includes servo motors provided at each joint of the four legs, capable of specifying an angle and a torque, a motion controller that decomposes the movement of the robot itself into movements of the four legs, and feedback control devices using the sensors in each motor and the sensors on the soles of the feet.
In another example, the drive system 108 includes motors with four to six upward-facing propellers, and a motion controller that decomposes the movement of the robot itself into rotation amounts of the respective motors.
In yet another example, the drive system 108 includes a driving-force generation device for generating a driving force, such as an internal combustion engine or a drive motor, a driving-force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle, a braking device for generating a braking force, an ABS (Antilock Brake System), an ESC (Electronic Stability Control), and an electric power steering device. Note that the output control unit 105, the output unit 106, the drive-system control unit 107, and the drive system 108 constitute the actuator group 113, which corresponds to the actuator group 13 composed of the actuators 13a-1 to 13a-n in FIG. 1.
In the present embodiment, the drive-system control unit 107 and the drive system (actuator group 13) function as a drive unit.
The storage unit 109 includes, for example, magnetic storage devices such as a ROM (Read Only Memory), a RAM (Random Access Memory), and an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, and a magneto-optical storage device. The storage unit 109 stores various programs and data used by each unit of the moving body control system 100. For example, the storage unit 109 stores map data such as a three-dimensional high-precision map like a dynamic map, a global map that is lower in precision than the high-precision map but covers a wide area, and a local map that includes information around the moving body.
The autonomous movement control unit 110 performs control related to autonomous movement, such as automatic driving or driving assistance. Specifically, for example, the autonomous movement control unit 110 performs cooperative control aimed at realizing functions such as collision avoidance or impact mitigation of the moving body, following movement based on the distance between moving bodies, movement that maintains the speed of the moving body, or collision warning of the moving body. Also, for example, the autonomous movement control unit 110 performs cooperative control aimed at autonomous movement in which the moving body moves autonomously without depending on the operation of an operator or user.
The autonomous movement control unit 110 corresponds to the information processing apparatus according to the present embodiment and includes hardware required for a computer, such as a CPU, a RAM, and a ROM. The information processing method according to the present technology is executed by the CPU loading a program according to the present technology, recorded in advance in the ROM, into the RAM and executing it.
The specific configuration of the autonomous movement control unit 110 is not limited; for example, a PLD (Programmable Logic Device) such as an FPGA (Field Programmable Gate Array), or another device such as an ASIC (Application Specific Integrated Circuit), may be used.
As shown in FIG. 2, the autonomous movement control unit 110 includes a detection unit 131, a self-position estimation unit 132, a situation analysis unit 133, a planning unit 134, and an operation control unit 135. Of these, the detection unit 131, the self-position estimation unit 132, and the situation analysis unit 133 constitute the recognition processing unit 121, which corresponds to the recognition processing unit 14 in FIG. 1. The planning unit 134 constitutes the action plan processing unit 122, which corresponds to the action plan processing unit 15 in FIG. 1. Further, the operation control unit 135 constitutes the action control processing unit 123, which corresponds to the action control processing unit 16 in FIG. 1.
The detection unit 131 detects various kinds of information necessary for controlling autonomous movement. The detection unit 131 includes a moving body external information detection unit 141, a moving body internal information detection unit 142, and a moving body state detection unit 143.
The moving body external information detection unit 141 performs processing for detecting information outside the moving body based on data or signals from each unit of the moving body control system 100. For example, the moving body external information detection unit 141 performs detection processing, recognition processing, and tracking processing for objects around the moving body, as well as processing for detecting the distance to objects. Objects to be detected include, for example, moving bodies, people, obstacles, structures, roads, traffic lights, traffic signs, and road markings. Also, for example, the moving body external information detection unit 141 performs processing for detecting the environment around the moving body. The surrounding environment to be detected includes, for example, weather, temperature, humidity, brightness, and the condition of the road surface. The moving body external information detection unit 141 supplies data indicating the results of the detection processing to the self-position estimation unit 132, the map analysis unit 151 and the situation recognition unit 152 of the situation analysis unit 133, the operation control unit 135, and the like.
The moving body internal information detection unit 142 performs processing for detecting information inside the moving body based on data or signals from each unit of the moving body control system 100. For example, the moving body internal information detection unit 142 performs driver authentication and recognition processing, driver state detection processing, passenger detection processing, and processing for detecting the environment inside the moving body. The driver state to be detected includes, for example, physical condition, wakefulness, concentration, fatigue, and gaze direction. The environment inside the moving body to be detected includes, for example, temperature, humidity, brightness, and odor. The moving body internal information detection unit 142 supplies data indicating the results of the detection processing to the situation recognition unit 152 of the situation analysis unit 133, the operation control unit 135, and the like.
The moving body state detection unit 143 performs processing for detecting the state of the moving body based on data or signals from each unit of the moving body control system 100. The state of the moving body to be detected includes, for example, speed, acceleration, steering angle, presence or absence and content of abnormality, state of driving operation, position and tilt of the power seat, state of the door locks, and the state of other equipment mounted on the moving body. The moving body state detection unit 143 supplies data indicating the results of the detection processing to the situation recognition unit 152 of the situation analysis unit 133, the operation control unit 135, and the like.
The self-position estimation unit 132 performs processing for estimating the position, attitude, and the like of the moving body based on data or signals from each unit of the moving body control system 100, such as the moving body external information detection unit 141 and the situation recognition unit 152 of the situation analysis unit 133. The self-position estimation unit 132 also generates, as necessary, a local map used for estimating the self-position (hereinafter referred to as a self-position estimation map). The self-position estimation map is, for example, a highly accurate map using a technique such as SLAM (Simultaneous Localization and Mapping). The self-position estimation unit 132 supplies data indicating the results of the estimation processing to the map analysis unit 151 and the situation recognition unit 152 of the situation analysis unit 133, and the like. The self-position estimation unit 132 also stores the self-position estimation map in the storage unit 109.
Furthermore, based on the detection results supplied from the sensor group 112, the self-position estimation unit 132 accumulates time-series information, supplied in time series, in a database, estimates the self-position based on the accumulated time-series information, and outputs it as the time-series-information self-position. The self-position estimation unit 132 also estimates the self-position based on the current detection results supplied from the sensor group 112 and outputs it as the current-information self-position. The self-position estimation unit 132 then outputs a self-position estimation result by integrating or switching between the time-series-information self-position and the current-information self-position. Furthermore, the self-position estimation unit 132 detects the attitude of the moving body 10 based on the detection results supplied from the sensor group 112, and when a change in attitude is detected, the self-position changes greatly, and the estimation accuracy of the time-series-information self-position is considered to deteriorate, the self-position is estimated from the current-information self-position alone. Also, for example, when the moving body 10 moves while mounted on another moving body, the self-position changes greatly even if no change in the attitude of the moving body 10 is detected from the detection results supplied by the sensor group 112; in that case the estimation accuracy of the time-series-information self-position is likewise considered to deteriorate, and the self-position is estimated from the current-information self-position alone. This can occur, for example, when the moving body 10 is a vehicle carried on a car ferry. In this way, even when there is an unpredictable change in attitude and the self-position changes greatly, regardless of whether an external force is involved, the self-position is estimated from the current-information self-position alone, so the self-position can be estimated with a predetermined accuracy.
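The switching between the time-series-information self-position and the current-information self-position described above can be illustrated by the following minimal sketch. The class name, threshold, and blending rule are assumptions introduced for illustration only and are not taken from the present disclosure.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    yaw: float

def fuse_self_position(time_series_pose: Pose,
                       current_pose: Pose,
                       attitude_change: float,
                       attitude_threshold: float = 0.3,
                       blend: float = 0.5) -> Pose:
    """Return the self-position estimate.

    When a large attitude change is detected, the time-series estimate is
    treated as unreliable and only the current-information estimate is used;
    otherwise the two estimates are blended (one possible integration rule).
    """
    if attitude_change > attitude_threshold:
        return current_pose  # rely on current information only
    return Pose(
        x=blend * time_series_pose.x + (1.0 - blend) * current_pose.x,
        y=blend * time_series_pose.y + (1.0 - blend) * current_pose.y,
        yaw=blend * time_series_pose.yaw + (1.0 - blend) * current_pose.yaw,
    )
```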
The situation analysis unit 133 performs processing for analyzing the moving body and its surrounding situation. The situation analysis unit 133 includes a map analysis unit 151, a situation recognition unit 152, and a situation prediction unit 153.
The map analysis unit 151 performs processing for analyzing the various maps stored in the storage unit 109, using data or signals from each unit of the moving body control system 100, such as the self-position estimation unit 132 and the moving body external information detection unit 141, as necessary, and constructs a map containing the information required for autonomous movement processing. The map analysis unit 151 supplies the constructed map to the situation recognition unit 152, the situation prediction unit 153, and the route planning unit 161, action planning unit 162, and operation planning unit 163 of the planning unit 134, among others.
The situation recognition unit 152 performs processing for recognizing the situation relating to the moving body based on data or signals from each unit of the moving body control system 100, such as the self-position estimation unit 132, the moving body external information detection unit 141, the moving body internal information detection unit 142, the moving body state detection unit 143, and the map analysis unit 151. For example, the situation recognition unit 152 performs recognition processing for the situation of the moving body, the situation around the moving body, the situation of the driver of the moving body, and the like. The situation recognition unit 152 also generates, as necessary, a local map used for recognizing the situation around the moving body (hereinafter referred to as a situation recognition map). The situation recognition map is, for example, an occupancy grid map, a lane map, or a point cloud map.
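As an illustration of the occupancy grid map mentioned above as one form of the situation recognition map, the following is a minimal sketch that simply marks grid cells containing detected points as occupied. The ranges, resolution, and function name are illustrative assumptions, not part of the present disclosure.

```python
import numpy as np

def build_occupancy_grid(points_xy, x_range=(-5.0, 5.0), y_range=(0.0, 10.0),
                         resolution=0.1):
    """Mark grid cells that contain at least one detected point as occupied."""
    nx = int((x_range[1] - x_range[0]) / resolution)
    ny = int((y_range[1] - y_range[0]) / resolution)
    grid = np.zeros((nx, ny), dtype=bool)
    for x, y in points_xy:
        ix = int((x - x_range[0]) / resolution)
        iy = int((y - y_range[0]) / resolution)
        if 0 <= ix < nx and 0 <= iy < ny:
            grid[ix, iy] = True
    return grid
```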
The situation of the moving body to be recognized includes, for example, the position, attitude, and motion (for example, speed, acceleration, and direction of movement) of the moving body, and the presence or absence and content of any abnormality. The situation around the moving body to be recognized includes, for example, the types and positions of surrounding stationary objects, the types, positions, and motions (for example, speed, acceleration, and direction of movement) of surrounding moving objects, the configuration of the surrounding roads and the condition of the road surface, and the surrounding weather, temperature, humidity, and brightness. The driver state to be recognized includes, for example, physical condition, wakefulness, concentration, fatigue, gaze movement, and driving operation.
The situation recognition unit 152 supplies data indicating the results of the recognition processing (including the situation recognition map as necessary) to the self-position estimation unit 132, the situation prediction unit 153, and the like. The situation recognition unit 152 also stores the situation recognition map in the storage unit 109.
The situation prediction unit 153 performs processing for predicting the situation relating to the moving body based on data or signals from each unit of the moving body control system 100, such as the map analysis unit 151 and the situation recognition unit 152. For example, the situation prediction unit 153 performs prediction processing for the situation of the moving body, the situation around the moving body, the situation of the driver, and the like.
The situation of the moving body to be predicted includes, for example, the behavior of the moving body, the occurrence of an abnormality, and the travelable distance. The situation around the moving body to be predicted includes, for example, the behavior of moving objects around the moving body, changes in traffic-signal states, and changes in the environment such as the weather. The driver situation to be predicted includes, for example, the driver's behavior and physical condition.
The situation prediction unit 153 supplies data indicating the results of the prediction processing, together with the data from the situation recognition unit 152, to the route planning unit 161, the action planning unit 162, the operation planning unit 163 of the planning unit 134, and the like.
The route planning unit 161 plans a route to the destination based on data or signals from each unit of the moving body control system 100, such as the map analysis unit 151 and the situation prediction unit 153. For example, the route planning unit 161 sets a route from the current position to a designated destination based on the global map. Also, for example, the route planning unit 161 changes the route as appropriate based on conditions such as traffic congestion, accidents, traffic restrictions, and construction, and on the driver's physical condition. The route planning unit 161 supplies data indicating the planned route to the action planning unit 162 and the like.
The action planning unit 162 plans actions of the moving body for moving safely within the planned time along the route planned by the route planning unit 161, based on data or signals from each unit of the moving body control system 100, such as the map analysis unit 151 and the situation prediction unit 153. For example, the action planning unit 162 plans starting, stopping, the direction of travel (for example, forward, backward, left turn, right turn, or change of direction), moving speed, and overtaking. The action planning unit 162 supplies data indicating the planned actions of the moving body to the operation planning unit 163 and the like.
More specifically, for each of the routes planned by the route planning unit 161, the action planning unit 162 generates candidate action plans of the moving body for moving safely within the planned time. More specifically, the action planning unit 162 generates the candidate action plans using, for example, the A* algorithm (A-star search algorithm), which divides the environment into a grid and optimizes arrival determination and path weights to generate the best path; the Lane algorithm, which sets a route along the road center line; and the RRT (Rapidly-exploring Random Tree) algorithm, which extends a path from the self-position toward incrementally reachable locations while pruning it appropriately.
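As an illustration of the grid-based search mentioned above, the following is a minimal, generic A* sketch on a 4-connected grid with a Manhattan-distance heuristic. It is a textbook implementation shown for explanation only and is not the action planning unit 162 itself.

```python
import heapq

def a_star(grid, start, goal):
    """Minimal A* on a 4-connected grid; grid[r][c] == 1 means blocked.
    start and goal are (row, col) tuples."""
    rows, cols = len(grid), len(grid[0])

    def h(p):  # Manhattan-distance heuristic
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    open_set = [(h(start), 0, start, None)]  # (f, g, node, parent)
    came_from = {}
    g_cost = {start: 0}
    while open_set:
        _, g, node, parent = heapq.heappop(open_set)
        if node in came_from:          # already expanded with a better cost
            continue
        came_from[node] = parent
        if node == goal:               # reconstruct the path back to start
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc), node))
    return None  # no path found
```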
The operation planning unit 163 plans the operation of the moving body for realizing the actions planned by the action planning unit 162, based on data or signals from each unit of the moving body control system 100, such as the map analysis unit 151 and the situation prediction unit 153. For example, the operation planning unit 163 plans acceleration, deceleration, and the movement trajectory. The operation planning unit 163 supplies data indicating the planned operation of the moving body to the operation control unit 135 and the like.
The operation control unit 135 controls the operation of the moving body.
More specifically, the operation control unit 135 performs detection processing for emergencies such as collision, contact, entry into a danger zone, driver abnormality, and abnormality of the moving body, based on the detection results of the moving body external information detection unit 141, the moving body internal information detection unit 142, and the moving body state detection unit 143. When the occurrence of an emergency is detected, the operation control unit 135 plans an operation of the moving body, such as a sudden stop or a sharp turn, for avoiding the emergency.
The operation control unit 135 also performs acceleration/deceleration control for realizing the operation of the moving body planned by the operation planning unit 163. For example, the operation control unit 135 computes a control target value for the driving-force generation device or the braking device to realize the planned acceleration, deceleration, or sudden stop, and supplies a control command indicating the computed control target value to the drive-system control unit 107.
The operation control unit 135 performs direction control for realizing the operation of the moving body planned by the operation planning unit 163. For example, the operation control unit 135 computes a control target value for the steering mechanism to realize the movement trajectory or sharp turn planned by the operation planning unit 163, and supplies a control command indicating the computed control target value to the drive-system control unit 107.
[Obstacle determination]
The obstacle determination processing performed by the autonomous movement control unit 110 will now be described. FIG. 3 is a schematic diagram showing the appearance of a robot 20, which is an example of the moving body 10. FIG. 4 is a block diagram showing an example of a functional configuration for executing the obstacle determination processing.
As shown in FIG. 3, in the present embodiment, a quadruped walking robot having legs 21 with a multi-joint structure is taken as an example of the robot 20. In FIG. 3, the illustration of the multi-joint structure is simplified.
A distance sensor 25 (not shown in FIG. 3) is disposed on the front side of the head 22 of the robot 20 and can measure the distance to an object present in a measurement range M extending forward. In the present embodiment, a LiDAR is used as the distance sensor 25, and the distance to objects present in the measurement range M is acquired as three-dimensional point cloud data. The shape within the measurement range M can be determined from this three-dimensional point cloud data.
As the distance sensor 25, other sensors such as a laser ranging sensor, an ultrasonic sensor, a radar, or a sonar may be used. Any type of sensor may be used, such as stereo matching using a stereo camera, distance measurement using an IR camera, or a laser range finder, as long as it can be determined how far from the robot 20 the pixels of an image acquired by an imaging device are located. Any sensor capable of determining the shape within the measurement range M may also be used.
In FIG. 3, the measurement range M is set on the front side, which is part of the periphery of the robot 20. This is not limiting, and an arbitrary range around the robot 20 can be set as the measurement range M. For example, the entire 360-degree circumference of the robot 20 can be set as the measurement range M. The distance sensor 25 is disposed at the position serving as the base point of the measurement range M.
In the present embodiment, the distance sensor 25 corresponds to a sensor capable of detecting peripheral information of the robot 20. The three-dimensional point cloud data within the measurement range M corresponds to the peripheral information of the robot 20 and to shape data relating to the periphery of the robot 20.
As shown in FIG. 4, the robot 20 also has a height position sensor 26 (not shown in FIG. 3). The height position sensor 26 detects information relating to the height position of the distance sensor 25. The height position of the distance sensor 25 is typically calculated with respect to a predetermined reference position. That is, the amount of displacement in the height direction from the reference position is calculated as the height position.
For example, the relative position from the floor surface 1 may be calculated using the position of the floor surface (ground) 1 on which the robot 20 is placed as the reference position. Alternatively, the surface of another actually existing object, such as the ceiling surface 2 (see FIG. 5 and the like), may be adopted as the reference position.
Alternatively, a state in which the robot 20 assumes a predetermined state (predetermined posture) may be taken as a reference state, and the height position of the distance sensor 25 calculated in that reference state may be set as the reference position. The relative position from that reference position may then be calculated as the height position of the distance sensor 25. Which state of the robot 20 is used as the reference state may be set arbitrarily. For example, the initial state of the robot 20 placed on the floor surface 1, or a standard state of the robot 20, can be set as the reference state.
As the height position sensor 26, for example, a pressure sensor (barometric sensor) is used. This is not limiting; the relative position from the floor surface 1 may be calculated using the Doppler effect of an ultrasonic sensor. It is also possible to calculate the height position by integrating the detection results of an acceleration sensor or the like. The information relating to the height position includes various kinds of information from which the height position can be calculated, for example a detection signal included in the output of a sensor. The information relating to the height position also includes information on the calculated height position itself.
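The following sketch illustrates, under stated assumptions, two of the ways mentioned above for obtaining a height position: a barometric estimate relative to a reference pressure reading, and naive double integration of vertical acceleration. The constants and function names are illustrative assumptions and are not taken from the present disclosure.

```python
def height_from_pressure(p_hpa: float, p_ref_hpa: float) -> float:
    """Relative height [m] from the standard barometric formula,
    using a reference pressure reading taken at the reference position."""
    return 44330.0 * (1.0 - (p_hpa / p_ref_hpa) ** (1.0 / 5.255))

def integrate_height(accel_z_samples, dt: float) -> float:
    """Naive double integration of vertical acceleration (gravity removed).
    This drifts over time, so in practice it would be fused with other sources."""
    velocity = 0.0
    height = 0.0
    for a in accel_z_samples:
        velocity += a * dt
        height += velocity * dt
    return height
```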
As shown in FIG. 4, the robot 20 also has an internal sensor 27 (not shown in FIG. 3). The internal sensor 27 is a general term for an acceleration sensor, a gyro sensor, an inertial measurement unit (IMU), a geomagnetic sensor, and the like, and can detect the acceleration, angle, angular velocity, geomagnetic direction, and the like of the robot 20.
In the present embodiment, the distance sensor 25, the height position sensor 26, and the internal sensor 27 are included in the sensor group 11 (112) shown in FIGS. 1 and 2.
As shown in FIG. 4, the robot 20 also has a height position calculation unit 30, a posture calculation unit 31, a determination region calculation unit 32, and an obstacle determination unit 33. These blocks are configured within the autonomous movement control unit 110, which corresponds to the information processing apparatus according to the present embodiment. Typically, they are configured as part of the recognition processing unit 14 shown in FIG. 1 and the recognition processing unit 121 shown in FIG. 2. Of course, this is not limiting.
FIG. 5 is a flowchart showing an example of the obstacle determination processing. FIGS. 6 to 9 are diagrams for explaining the steps shown in FIG. 5. By executing the processing shown in FIG. 5, it is possible to determine whether an obstacle exists around the robot 20. The period at which the flow is executed is not limited and may be set arbitrarily.
The height position calculation unit 30 calculates the height position of the distance sensor 25 based on the detection result of the height position sensor 26 (step 101). The determination region calculation unit 32 calculates the determination region D based on the calculated height position of the distance sensor 25 (step 102).
The determination region D is a region for determining the situation around the robot 20 and serves as the identification reference when executing the obstacle determination processing. In FIGS. 3 and 6 to 9, the region of space shown in gray is the determination region D.
In the examples shown in FIGS. 3 and 6 to 9, a range shaped like a substantially quadrangular pyramid whose apex is at the front side of the head 22 of the robot 20, where the distance sensor 25 is disposed, is illustrated as the measurement range M. The determination region D is calculated as a region included in the measurement range M.
In the present embodiment, the determination region D is defined by a first identification surface 35 set on the floor surface 1 side and a second identification surface 36 set on the ceiling surface 2 side. That is, within the measurement range M, the region of space from the first identification surface 35 to the second identification surface 36 is set as the determination region D. The first and second identification surfaces 35 and 36 correspond to first and second determination surfaces.
As schematically shown in FIG. 6, a sensor coordinate system is set with the position of the distance sensor 25 as its origin and the direction of the detection axis L (detection direction) of the distance sensor 25 as one of its axes (the Y-axis in FIG. 6). Each axis direction can be calculated by the internal sensor 27. The method of setting the sensor coordinate system is not limited. The first and second identification surfaces 35 and 36 are set based on the sensor coordinate system.
FIG. 6 illustrates a state (posture) in which the detection axis L of the distance sensor 25 extends horizontally toward the front. In the present embodiment, this state is taken as the reference state, and the height position H0 of the distance sensor 25 is taken as the reference position.
Based on the sensor coordinate system, the determination region calculation unit 32 sets the first and second identification surfaces 35 and 36 so that each is parallel to the XY plane (that is, a horizontal plane). The region sandwiched between the first and second identification surfaces 35 and 36 is calculated as the determination region D. Hereinafter, the determination region D in the reference state is described as the reference determination region D0.
As shown in FIG. 6, in the reference state, the first identification surface 35 is set at a position higher than the floor surface 1 by an offset h_floor (hereinafter written hf). The second identification surface 36 is set at a position lower than the ceiling surface 2 by an offset h_ceiling (hereinafter written hc). The positions of the floor surface 1 and the ceiling surface 2 can be calculated by the internal sensor 27.
The specific values of the offsets hf and hc are not limited and may be set as appropriate. For example, setting them to values at least as large as the maximum error of the distance sensor 25 can improve the accuracy of obstacle determination.
When the height of objects that should be determined to be obstacles is specified, the offset hf is set lower than that specified height. When the height of objects that do not need to be determined to be obstacles is specified, the offset hf is set higher than that specified height. For example, the offset hf may be set according to a height that the robot 20 cannot climb over, a height it can easily climb over, or the like.
Similarly, on the ceiling surface 2 side, when the height from the ceiling surface 2 of objects that should be determined to be obstacles is specified, the offset hc is set higher than that specified height. When the height from the ceiling surface 2 of objects that do not need to be determined to be obstacles is specified, the offset hc is set lower than that specified height. For example, the offset hc may be set according to the maximum height of the head 22 of the robot 20 during walking, or the like.
In this way, the heights of the first and second identification surfaces 35 and 36 are typically set based on the movement environment, the capabilities of the robot 20, and the like so as to be advantageous for autonomous movement. For example, the heights of the first and second identification surfaces 35 and 36 are set so that objects that should be avoided during movement can be extracted as obstacles. Of course, the first identification surface 35 may be set at substantially the same position as the floor surface 1, and the second identification surface 36 at substantially the same position as the ceiling surface 2 (both offsets hf and hc substantially zero).
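The following is a minimal sketch of how the reference determination region D0 bounded by the two identification surfaces might be represented, assuming the surfaces are expressed as absolute heights in a gravity-aligned frame. The names and data structure are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class DeterminationRegion:
    """Determination region D bounded by two horizontal identification surfaces,
    expressed as absolute heights (e.g. in a gravity-aligned world frame)."""
    lower: float  # height of the first identification surface 35
    upper: float  # height of the second identification surface 36

def reference_region(floor_z: float, ceiling_z: float,
                     hf: float, hc: float) -> DeterminationRegion:
    """First surface hf above the floor, second surface hc below the ceiling."""
    return DeterminationRegion(lower=floor_z + hf, upper=ceiling_z - hc)
```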
When the first and second identification surfaces 35 and 36 in the reference state are calculated (when the reference determination region D0 is calculated), the processing may temporarily be executed with a high computational load and high precision. This makes it possible to set the reference determination region D0 with high accuracy. The setting of the reference determination region D0 may be executed by a user operation or may be executed automatically.
The obstacle determination unit 33 acquires the three-dimensional point cloud data detected by the distance sensor 25 (step 103). The obstacle determination unit 33 performs obstacle determination based on the calculated determination region D and the detected three-dimensional point cloud data (step 104).
The obstacle determination unit 33 performs obstacle determination based on the relationship between the three-dimensional point cloud data and the determination region D. Specifically, it determines whether an obstacle exists by determining whether point data included in the three-dimensional point cloud data is included in the determination region D.
As shown in FIG. 6, for example, points included in the determination region D (D0) are determined to be points constituting an obstacle. An object including such points is then determined to be an obstacle 3. In the example shown in FIG. 6, the object 4a present on the floor surface 1 and the object 4b installed on the ceiling surface 2 are both determined to be obstacles 3.
That is, in the present embodiment, the three-dimensional point cloud data obtained from the distance sensor 25 is filtered based on the first and second identification surfaces 35 and 36. When at least part of an object is included in the determination region D (D0), it is determined to be an obstacle 3. Alternatively, only the points included in the determination region D (D0) may be determined to be the obstacle 3.
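The filtering of the three-dimensional point cloud by the two identification surfaces can be sketched as follows, assuming the points have already been expressed in a frame whose z-axis is vertical (an assumption of this sketch); any point whose height falls between the two surfaces is treated as an obstacle point.

```python
import numpy as np

def find_obstacle_points(points_world: np.ndarray,
                         lower: float, upper: float) -> np.ndarray:
    """Return the subset of 3D points whose height lies inside the
    determination region bounded by the two identification surfaces.

    points_world: (N, 3) array of points in a frame whose z-axis is vertical.
    """
    z = points_world[:, 2]
    mask = (z >= lower) & (z <= upper)
    return points_world[mask]

# An obstacle is considered present if any point falls inside the region:
# obstacle_detected = find_obstacle_points(pts, lower, upper).size > 0
```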
When it is determined that an obstacle 3 exists, for example, an avoidance operation such as a detour or a stop, or the output of a warning sound or warning message, is executed. These processes are executed, for example, by the cooperation of the action plan processing unit 15 and the action control processing unit 16 shown in FIG. 1, or by the cooperation of the situation analysis unit 133, the planning unit 134, and the operation control unit 135 shown in FIG. 2. In the present embodiment, these blocks correspond to a drive control unit that controls the drive unit based on the determination result of the determination unit.
In the present embodiment, the height position calculation unit 30 functions as an acquisition unit that acquires information relating to the height position of a sensor capable of detecting peripheral information of the moving body. The posture calculation unit 31 functions as an acquisition unit that acquires information relating to the tilt of the sensor. The attitude of the sensor will be described later. The determination region calculation unit 32 functions as a calculation unit that calculates a determination region for determining the situation around the moving body based on the acquired information relating to the height position of the sensor. The obstacle determination unit 33 functions as a determination unit that determines the situation around the moving body based on the calculated determination region and the peripheral information detected by the sensor.
FIGS. 7 and 8 are diagrams showing an example in which the legs 21 having the multi-joint structure are driven and the robot 20 moves. The four legs spread forward and backward, and the height position of the head 22 of the robot 20, that is, the height position of the distance sensor 25, becomes lower. Let the height position of the distance sensor 25 in this state be H1, and the amount of variation from the reference position H0 be Δt. As shown in FIGS. 7 and 8, the height position of the measurement range M also becomes lower by the variation Δt.
As shown in FIG. 7, suppose that a determination region D' similar to the reference determination region D0 shown in FIG. 6 is set in the sensor coordinate system based on the height position H1 of the distance sensor 25; that is, planes defined by the same plane equations are set as first and second identification surfaces 35' and 36'. The determination region D' and the first and second identification surfaces 35' and 36' are then positioned lower, by the variation Δt, than the reference determination region D0 and the first and second identification surfaces 35 and 36 shown in FIG. 6.
As a result, as shown in FIG. 7, the first identification surface 35' falls below the floor surface 1, and the floor surface 1 is included in the determination region D'. Consequently, the floor surface 1 is determined to be an obstacle 3. It may also happen that an object that is actually traversable and should be ignored is determined to be an obstacle 3.
In addition, the second identification surface 36' falls considerably below the ceiling surface 2, and the object 4b present on the ceiling surface 2 falls outside the determination region D'. As a result, the object 4b, which should be determined to be an obstacle, can no longer be seen, and an obstacle is missed.
Consequently, for example, an avoidance operation such as sudden braking may be executed needlessly, hindering proper autonomous movement. The head 22 may also collide with an object, damaging the robot 20.
As shown in FIG. 8, in the obstacle determination processing according to the present embodiment, the determination region D is calculated in step 102 based on the height position H1 of the distance sensor 25. Specifically, the height position of the determination region D is changed in accordance with the variation in the height position of the distance sensor 25.
In the present embodiment, with the reference determination region D0 shown in FIG. 6 as the reference, the determination region D is calculated so as to have the same height position as the reference determination region D0. That is, the first identification surface 35 is calculated so as to be at the same height position as the first identification surface 35 of the reference determination region D0. Similarly, the second identification surface 36 is calculated so as to be at the same height position as the second identification surface 36 of the reference determination region D0.
Accordingly, as shown in FIG. 8, the first identification surface 35 is set at a position higher than the floor surface 1 by the offset hf, and the second identification surface 36 is set at a position lower than the ceiling surface 2 by the offset hc. The plane equations of the first and second identification surfaces 35 and 36 differ from those of the first and second identification surfaces 35 and 36 in the reference state.
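A minimal sketch of this compensation is shown below: when the sensor is lowered by Δt, the surface heights expressed in the sensor coordinate system are shifted upward by Δt so that their absolute heights (floor + hf and ceiling − hc) remain unchanged. The sign convention (positive Δt for a lowered sensor) is an assumption of this sketch.

```python
def compensated_surfaces_sensor_frame(lower0: float, upper0: float,
                                      delta_t: float) -> tuple:
    """Return the identification-surface heights expressed in the sensor
    coordinate system after the sensor has moved down by delta_t.

    lower0 / upper0: surface heights in the sensor frame in the reference
    state (height H0). Shifting them up by delta_t keeps their absolute
    heights (floor + hf, ceiling - hc) unchanged."""
    return lower0 + delta_t, upper0 + delta_t
```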
In this way, compared with the state shown in FIG. 7, the first and second identification surfaces 35 and 36 are offset upward along the vertical direction by the variation Δt of the distance sensor 25. As a result, in the obstacle determination processing of step 104, the objects 4a and 4b can be correctly determined to be obstacles 3, just as in the reference state shown in FIG. 6. Consequently, the influence of the vertical swinging of the distance sensor 25 is sufficiently suppressed, and objects that do not hinder movement, such as the floor surface 1 and the ceiling surface 2, can be distinguished from objects that do hinder movement (obstacles 3).
Moreover, since it is only necessary to change the height position of the determination region D (the first and second identification surfaces 35 and 36) based on the height position of the distance sensor 25, no complicated algorithm is required, and the obstacle 3 can be determined easily with a small amount of computation.
FIG. 9 schematically shows another example of changing the height position of the determination region D according to the variation Δt. For example, when the measurement range M is sufficiently large, it is also possible to change only the height position of the determination region D while maintaining its shape. That is, as shown in FIG. 9, suppose that when the height position of the head 22 (the height position of the distance sensor 25) becomes lower, the reference determination region D0 is still included in the measurement range M. In this case, the reference determination region D0 may be used as the determination region D as it is. This can also be regarded as processing that changes the height position of the determination region D according to the variation Δt.
As shown in FIG. 4, in the present embodiment, the posture calculation unit 31 can calculate the posture of the robot 20 based on the detection result of the internal sensor 27. In the present embodiment, the tilt of the distance sensor 25 (the tilt of the head 22) is detected as the posture of the robot 20.
Specifically, the tilt of the detection axis L of the distance sensor 25 and the rotation angle about the detection axis L are calculated as the tilt of the distance sensor 25. That is, with the X-axis schematically shown in FIG. 6 as the pitch axis, the Y-axis (detection axis) as the roll axis, and the Z-axis as the yaw axis, the pitch angle, roll angle, and yaw angle are calculated as the tilt of the distance sensor 25.
The tilt of the distance sensor 25 is not limited to these parameters. Any parameter that defines the relative posture with respect to the floor surface 1 can be calculated as the tilt of the distance sensor 25. Hereinafter, the detection axis L of the distance sensor 25 may be described as the position reference axis of the determination region D.
In the calculation of the determination region D in step 102 shown in FIG. 5, the determination region D may be calculated based on the tilt of the distance sensor 25 in addition to its height position. In this case, a step of acquiring the tilt of the distance sensor 25 is added.
 FIGS. 10 to 12 show an example in which the height position of the head 22 of the robot 20 (the height position of the distance sensor 25) has become lower and the inclination of the head 22 (the inclination of the distance sensor 25) has deviated from the horizontal direction. In FIGS. 10 to 12, the head 22 (distance sensor 25) is tilted downward from the horizontal direction (the pitch angle has changed).
 As shown in FIGS. 10 to 12, the measurement range M is lowered by the variation Δt in the height direction and is also tilted downward by the inclination Δθ with respect to the horizontal direction. The inclination Δθ can also be referred to as the posture displacement Δθ.
 As shown in FIG. 10, suppose that a determination region D' similar to the reference determination region D0 shown in FIG. 6 is set in the sensor coordinate system based on the height position H2 of the distance sensor 25. In this case, the first and second identification surfaces 35' and 36' are also lowered by the variation Δt and further tilted downward by the inclination Δθ.
 As a result, as shown in FIG. 10, the floor surface 1 is included in the determination region D' and is determined to be an obstacle. In addition, the object 4b present on the ceiling surface 2 falls outside the determination region D', so that an obstacle is missed.
 As shown in FIG. 11, suppose that the first and second identification surfaces 35'' and 36'' are corrected based on the inclination Δθ so that each becomes a horizontal plane, and a determination region D'' is calculated. Even in this case, the floor surface 1 is included in the determination region D'' and is determined to be an obstacle, and the object 4b present on the ceiling surface 2 falls outside the determination region D'', so that an obstacle is missed. That is, erroneous detection caused by the variation Δt in the height direction remains.
 As shown in FIG. 12, the determination region D is calculated based on the height position H2 of the distance sensor 25 and the inclination of the distance sensor 25. That is, the height position and inclination of the determination region D are changed according to the variation in the height position and inclination of the distance sensor 25. The inclination of the determination region D corresponds to the inclination of the position reference axis of the determination region D (the axis equal to the detection axis L of the distance sensor 25).
 For example, with the reference determination region D0 shown in FIG. 6 as a reference, the determination region D is calculated so as to have the same height position as the reference determination region D0 and the same inclination as the reference determination region D0.
 That is, the first identification surface 35 is calculated so as to coincide with the height position and inclination of the first identification surface 35 of the reference determination region D0. Similarly, the second identification surface 36 is calculated so as to coincide with the height position and inclination of the second identification surface 36 of the reference determination region D0.
 As a result, as shown in FIG. 12, the first identification surface 35 is set at a position higher than the floor surface 1 by the offset hf, and the second identification surface 36 is set at a position lower than the ceiling surface 2 by the offset hc. Note that the plane equations of the first and second identification surfaces 35 and 36 differ from those of the first and second identification surfaces 35 and 36 in the reference state.
 In this way, compared with the state shown in FIG. 10, the first and second identification surfaces 35 and 36 are offset upward by the variation Δt of the distance sensor 25 and offset in the reverse rotation direction (-θ direction) by the inclination Δθ. As a result, in the obstacle determination processing of step 104, the objects 4a and 4b can be properly determined to be obstacles 3, just as in the reference state shown in FIG. 6. The influence of the swing of the distance sensor 25 in the height direction is thus sufficiently suppressed, and the floor surface 1 and the ceiling surface 2 can be distinguished from the obstacle 3 with high accuracy.
 Moreover, since it suffices to change the height position and inclination of the determination region D (the first and second identification surfaces 35 and 36) based on the height position and inclination of the distance sensor 25, no complicated algorithm is required, and the obstacle 3 can be determined easily with a small amount of computation.
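 One way to realize this combined compensation, sketched below under assumptions not stated in the patent (an x-forward/z-up sensor frame, a single pitch angle measured positive when the sensor tilts downward, and illustrative environment values), is to rotate the measured points back by the pitch, lift them by the measured sensor height, and then apply the same world-height thresholds as in the reference state; this is equivalent to tilting and shifting the identification surfaces themselves:

```python
import numpy as np

def classify_with_height_and_tilt(points_sensor, sensor_height, pitch,
                                  h_floor=0.0, h_ceiling=2.4,
                                  offset_hf=0.05, offset_hc=0.05):
    """Obstacle mask using the current sensor height and pitch.

    points_sensor : (N, 3) points in sensor coordinates (x forward, z up).
    sensor_height : current height of the sensor above the floor, from the
                    height position sensor.
    pitch         : downward tilt of the detection axis from horizontal,
                    in radians, from the attitude (internal) sensor.
    h_floor, h_ceiling, offset_hf, offset_hc : assumed environment values;
        the identification surfaces sit at h_floor + offset_hf and
        h_ceiling - offset_hc above the floor.
    """
    # Rotation that maps sensor coordinates to a gravity-level frame
    # (undoing a downward pitch about the y axis).
    c, s = np.cos(pitch), np.sin(pitch)
    rot = np.array([[c, 0.0, s],
                    [0.0, 1.0, 0.0],
                    [-s, 0.0, c]])
    pts_level = points_sensor @ rot.T

    # Height of each point above the floor.
    z_world = pts_level[:, 2] + sensor_height

    # Same fixed world-height thresholds as in the reference state.
    return (z_world > h_floor + offset_hf) & (z_world < h_ceiling - offset_hc)
```

 The per-point work is still only one small matrix product and two comparisons, so the low computational cost claimed above is preserved even when tilt is taken into account.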
 FIG. 13 is a diagram schematically showing another example of changing the height position and inclination of the determination region D according to the variation Δt and the inclination Δθ. For example, when the measurement range M is sufficiently large, it is also possible to change the height position and inclination of the determination region D while keeping its shape. That is, as shown in FIG. 13, suppose that the reference determination region D0 is still contained in the measurement range M when the height position and inclination of the head 22 (the height position and inclination of the distance sensor 25) have varied. In this case, the reference determination region D0 may be used as the determination region D as it is. This can also be regarded as processing that changes the height position and inclination of the determination region D according to the variations Δt and Δθ.
 Note that the calculation of the determination region D based on the height position and inclination of the distance sensor 25 is not limited to being performed with the reference determination region D0 as a reference. Even when the calculation is performed with the reference determination region D0 as a reference, it is not limited to calculating the determination region D so as to have the same height position and the same inclination as the reference determination region D0.
 For example, the size of the determination region D in the height direction may be corrected based on the height position of the distance sensor 25. For example, for the determination region D shown in FIG. 8, the size in the height direction, that is, the distance between the first and second identification surfaces 35 and 36, may be corrected according to the variation Δt in the height position. For example, the larger the variation Δt, the larger the correction amount applied to the determination region D. This makes it possible to suppress the influence of the detection error of each sensor. Whether the size of the determination region D in the height direction is increased or decreased may be set as appropriate, according to the movement environment and the performance and use of the robot 20, so that autonomous movement remains effective and a safe design is realized.
 The size of the determination region D in the height direction may also be corrected based on the inclination of the distance sensor 25. For example, the larger the inclination Δθ, the larger the correction amount applied to the determination region D. This makes it possible to suppress the influence of the detection error of each sensor.
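 One conceivable way to apply such a correction, shown below purely as an assumption-laden sketch (the gains, and the choice to shrink rather than enlarge the region, are design parameters that the description above deliberately leaves open), is to widen the floor and ceiling margins in proportion to the measured variations:

```python
def corrected_offsets(offset_hf, offset_hc, delta_t, delta_theta,
                      k_height=0.5, k_tilt=0.2):
    """Widen the floor/ceiling margins with the measured variations.

    delta_t and delta_theta are the measured height and tilt variations;
    k_height and k_tilt are illustrative gains.  Larger variations give a
    larger correction, here making the determination region smaller and the
    obstacle check more conservative against sensor error.
    """
    extra = k_height * abs(delta_t) + k_tilt * abs(delta_theta)
    return offset_hf + extra, offset_hc + extra
```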
 As described above, in the moving body control system 100 according to this embodiment, the determination region D for determining the situation around the robot 20 is calculated based on the information about the height position of the distance sensor 25. This makes it possible to determine the situation around the robot 20 easily and accurately, and to determine the obstacle 3.
 In other words, by using the present technology, an autonomous mobile robot that recognizes obstacles in the outside world with a distance sensor can stably distinguish the floor and ceiling surfaces from obstacles even when the height and posture of the distance sensor change during movement.
 For an autonomous mobile robot to move while avoiding obstacles, it is common to measure the environment with a distance sensor. The distance sensor tells the robot the distance to objects in the outside world around itself. Since a distance sensor generally measures the straight-line distance to a target using the reflection of a rectilinearly propagating wave phenomenon such as sound or light, it detects both objects that obstruct movement and objects that do not, such as the floor and ceiling surfaces. From the signals obtained from the distance sensor, an autonomous mobile robot therefore needs to distinguish obstacles to movement from the floor and ceiling surfaces.
 If the robot can always move while maintaining a posture parallel to the ground, distinguishing the floor and ceiling surfaces from obstacles is easy: from the obtained sensor signal, only the information inside a region bounded by surfaces of appropriate height fixed in the sensor coordinate system needs to be identified as obstacles.
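 For reference, such a baseline check fixed in the sensor coordinate system might look like the following sketch (the numeric thresholds are placeholders, not values from the patent); it is the check that the height- and tilt-compensated sketches earlier in this section extend:

```python
import numpy as np

def naive_obstacle_mask(points_sensor, z_low=-0.45, z_high=1.9):
    """Baseline check with surfaces fixed in the sensor coordinate system.

    Valid only while the sensor stays level at its nominal height; points
    below z_low are taken as floor, points above z_high as ceiling.
    """
    z = points_sensor[:, 2]
    return (z > z_low) & (z < z_high)
```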
 However, for a mobile robot with legs or the like, the distance sensor swings while the robot moves (walks), and its posture and height change with respect to the ground. In particular, the distance sensor that serves as the robot's "eyes" is often installed at a high position on the robot body in order to obtain a wide field of view, so its position and posture are moved considerably by the swinging that accompanies movement.
 With the identification method described above, which performs obstacle determination using a determination region fixed in the sensor coordinate system, erroneous recognition can occur for a sensor whose posture and height change. For example, when the height of the sensor becomes lower than normal, the floor surface is erroneously recognized as an obstacle (see FIG. 7). Conversely, when the sensor position is higher than normal, the ceiling surface may be erroneously recognized as an obstacle. The same kind of problem occurs when the sensor is tilted in addition to the change in height (see FIGS. 10 and 11).
 If the floor or ceiling is erroneously recognized as an obstacle during movement, the robot must detour around places that are actually passable, and a robot that erroneously judges that an obstacle has suddenly appeared nearby is led to an unnecessary sudden stop. It has therefore been very important to devise a new floor/ceiling-versus-obstacle identification method that is not affected by changes in the position and posture of the distance sensor.
 For example, a method is conceivable in which a plane is extracted from the three-dimensional point cloud obtained from the distance sensor and objects other than the plane are recognized as obstacles. By extracting the floor and ceiling surfaces from the sensor signal alone, the floor surface can be removed from the signal regardless of the posture of the sensor with respect to the outside world. However, detecting the floor surface from a point cloud requires complicated processing, so this method is difficult to apply to a small robot with limited computing resources.
 A method of distinguishing the floor from obstacles by the reflected light intensity of the distance sensor is also conceivable. In a distance sensor that measures distance using the reflection of light or the like, the reflection intensity of an obstacle facing the sensor head-on is stronger than that of floor or ceiling surfaces parallel to the sensor's optical axis; this method exploits that fact. However, since the reflection intensity is affected by the degree of light scattering due to the surface properties of the object and by the absorptance of the material, the identification accuracy becomes low in an environment surrounded by objects of various materials. Moreover, when the distance sensor is pointed sharply up or down and faces the floor surface directly, the reflection from the floor surface exceeds that from obstacles, so this method cannot be used.
 A method is also conceivable in which the distance sensor output of the outside world when there is no obstacle is acquired in advance and the difference from the current observation is identified as an obstacle. However, this method requires computation to align the coordinates of the previously acquired sensor signal with the current sensor output, and it cannot be used in cases where an unknown area is to be explored.
 That is, the above methods incur a high computational cost and cannot cope with changes in the height of the distance sensor.
 A small mobile robot does not have sufficient computing resources, so a method that can distinguish the floor and ceiling surfaces from obstacles at a lower load is required. In particular, for legged robots and the like, it is necessary to reduce erroneous obstacle recognition caused by changes in sensor height.
 As explained in the above embodiment, the present technology can be implemented with nothing more than a simple coordinate comparison in three-dimensional space, provided only that an inexpensive height sensor and attitude sensor are available, and it can be realized at low cost compared with high-load processing such as the plane detection of conventional methods. For this reason, the present technology can also be applied to small targets with limited computing power, such as household pet robots. Furthermore, by taking into account the height position information from the height measurement unit, the present technology exhibits robust identification capability against fast three-dimensional swinging of the sensor, including changes in height position and posture.
 <Other embodiments>
 The present technology is not limited to the embodiments described above, and various other embodiments can be realized.
 In the above, the determination region was defined by the first and second identification surfaces (determination surfaces). The determination region is not limited to this and may be defined by a single identification surface or by three or more identification surfaces. For example, only the first identification surface corresponding to the floor surface may be set, or only the second identification surface corresponding to the ceiling surface may be set. The shape of the determination region is also not limited; the determination region may be defined by a curved identification surface, and the shape of the determination region (the shape of the identification surface) may be set according to the shape of the floor surface or the ceiling surface. Note that the identification surface (determination surface) can also be regarded as a threshold for the determination.
 In the above, the determination of the presence or absence of an obstacle was executed as the determination of the surrounding situation. The present technology is not limited to this and may also be applied, for example, to counting predetermined objects existing in the surroundings or to detecting changes in the size of an object. That is, any determination processing may be executed as the determination of the surrounding situation.
 FIG. 14 is an external view showing a configuration example of a vehicle equipped with an automatic driving control unit according to an embodiment of the present technology. FIG. 14A is a perspective view showing a configuration example of the vehicle 290, and FIG. 14B is a schematic view of the vehicle 290 as seen from above. The vehicle 290 has an automatic driving function capable of automatic traveling (autonomous movement) to a destination. The vehicle 290 is an example of a moving body.
 The vehicle 290 includes various sensors 291 used for automatic driving. As an example, FIG. 14A schematically shows an imaging device 292 and a distance sensor 293 directed toward the front of the vehicle 290. The imaging device 292 and the distance sensor 293 function as external sensors. FIG. 14B schematically shows wheel encoders 294 that detect the rotation and the like of each wheel. The wheel encoders 294 function as internal sensors. In addition, various other sensors 291 are mounted on the vehicle 290, and movement control of the vehicle 290 is performed based on the outputs from the sensors 291.
 FIG. 15 is a block diagram showing a configuration example of a vehicle control system 200 that controls the vehicle 290. The vehicle control system 200 is provided in the vehicle 290 and performs various kinds of control of the vehicle 290.
 The input unit 201, data acquisition unit 202, communication unit 203, in-vehicle device 204, output control unit 205, output unit 206, drive system control unit 207, drive system 208, storage unit 209, and automatic driving control unit 210 shown in FIG. 15 are blocks corresponding to the input unit 101, data acquisition unit 102, communication unit 103, moving-body internal device 104, output control unit 105, output unit 106, drive system control unit 107, drive system 108, storage unit 109, and autonomous movement control unit 110 shown in FIG. 2.
 The detection unit 231, self-position estimation unit 232, situation analysis unit 233, planning unit 234, and operation control unit 235 in the automatic driving control unit 210 shown in FIG. 15 correspond to the detection unit 131, self-position estimation unit 132, situation analysis unit 133, planning unit 134, and operation control unit 135 in the autonomous movement control unit 110 shown in FIG. 2.
 The following description focuses on differences from the autonomous movement control unit 110 shown in FIG. 2. First, the data acquisition unit 202 may include an imaging device that images the driver, a biometric sensor that detects the driver's biometric information, a microphone that collects sound inside the vehicle cabin, and the like. The biometric sensor is provided, for example, on the seat surface or the steering wheel, and detects the biometric information of a passenger sitting in the seat or of the driver holding the steering wheel.
 The body system control unit 280 controls the body system 281 by generating various control signals and supplying them to the body system 281. The body system control unit 280 also supplies control signals to units other than the body system 281 as necessary, for example to notify them of the control state of the body system 281.
 The body system 281 includes various body-related devices mounted on the vehicle body. For example, the body system 281 includes a keyless entry system, a smart key system, a power window device, power seats, a steering wheel, an air conditioner, and various lamps (for example, headlamps, back lamps, brake lamps, turn signals, fog lamps, and the like).
 The traffic rule recognition unit 282 performs processing for recognizing the traffic rules around the vehicle 290 based on data or signals from each unit of the vehicle control system 200, such as the self-position estimation unit 232, the outside-vehicle information detection unit 241, and the map analysis unit 251. Through this recognition processing, for example, the positions and states of traffic signals around the vehicle 290, the contents of traffic restrictions around the vehicle 290, and the lanes in which the vehicle can travel are recognized. The traffic rule recognition unit 282 supplies data indicating the result of the recognition processing to the situation prediction unit 253 and the like.
 The operation control unit 235 controls the operation of the vehicle 290. The operation control unit 235 includes an emergency avoidance unit 283, an acceleration/deceleration control unit 284, and a direction control unit 285.
 The emergency avoidance unit 283 performs processing for detecting emergencies such as a collision, contact, entry into a danger zone, an abnormality of the driver, or an abnormality of the vehicle 290, based on the detection results of the outside-vehicle information detection unit 241, the in-vehicle information detection unit 242, and the vehicle state detection unit 243. When the emergency avoidance unit 283 detects the occurrence of an emergency, it plans an operation of the vehicle 290 for avoiding the emergency, such as a sudden stop or a sharp turn. The emergency avoidance unit 283 supplies data indicating the planned operation of the vehicle 290 to the acceleration/deceleration control unit 284, the direction control unit 285, and the like.
 The acceleration/deceleration control unit 284 performs acceleration/deceleration control for realizing the operation of the vehicle 290 planned by the operation planning unit 263 or the emergency avoidance unit 283. For example, the acceleration/deceleration control unit 284 calculates a control target value of the driving force generation device or the braking device for realizing the planned acceleration, deceleration, or sudden stop, and supplies a control command indicating the calculated control target value to the drive system control unit 207.
 The direction control unit 285 performs direction control for realizing the operation of the vehicle 290 planned by the operation planning unit 263 or the emergency avoidance unit 283. For example, the direction control unit 285 calculates a control target value of the steering mechanism for realizing the travel trajectory or sharp turn planned by the operation planning unit 263 or the emergency avoidance unit 283, and supplies a control command indicating the calculated control target value to the drive system control unit 207.
 The present technology can be applied to the vehicle 290 and the vehicle control system 200 having the above configuration. That is, the vehicle control system 200 can be caused to function as an information processing apparatus according to the present technology and to execute determination processing of the situation around the vehicle 290, including the determination of obstacles.
 In addition, the technology according to the present disclosure can be applied to various products. For example, the technology according to the present disclosure may be realized as an apparatus mounted on any kind of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
 In the above embodiment, the information processing method according to the present technology, including the determination of obstacles and the like, was executed by the autonomous movement control unit mounted on the moving body. The present technology is not limited to this, and the information processing method according to the present technology may be executed by a cloud server. In that case, the cloud server operates as the information processing apparatus according to the present technology.
 The information processing method and the program according to the present technology may also be executed, and the information processing apparatus according to the present technology constructed, by a computer mounted on the vehicle working in cooperation with another computer (cloud server) that can communicate with it via a network or the like.
 That is, the information processing method and the program according to the present technology can be executed not only in a computer system constituted by a single computer but also in a computer system in which a plurality of computers operate in cooperation. In the present disclosure, a system means a set of a plurality of components (devices, modules (parts), and the like), and it does not matter whether or not all the components are housed in the same housing. Accordingly, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
 Execution of the information processing method and the program according to the present technology by a computer system includes both the case where, for example, the acquisition of information about the height position of the sensor, the calculation of the determination region, and the determination of the surrounding situation are executed by a single computer, and the case where each process is executed by a different computer. Execution of each process by a given computer also includes causing another computer to execute part or all of the process and acquiring the result.
 That is, the information processing method and the program according to the present technology can also be applied to a cloud computing configuration in which one function is shared and jointly processed by a plurality of devices via a network.
 The configurations of the moving bodies (robot, vehicle), the autonomous movement control unit, the automatic driving control unit, and the like described with reference to the drawings, and the control flow of the autonomous movement control unit and the like, are merely embodiments and can be modified arbitrarily without departing from the spirit of the present technology. That is, any other configuration, algorithm, or the like for carrying out the present technology may be adopted.
 In the present disclosure, expressions such as "same height position", "same inclination", "center", "equal position", "equal axis", and "parallel" are concepts that include "substantially the same height position", "substantially the same inclination", "substantially the center", "substantially equal position", "substantially equal axis", and "substantially parallel". For example, states within a predetermined range (for example, a range of ±10%) based on "completely the same height position", "completely the same inclination", "completely the center", "completely equal position", "completely equal axis", "completely parallel", and the like are also included.
 Of the characteristic portions according to the present technology described above, at least two characteristic portions can be combined. That is, the various characteristic portions described in the embodiments may be combined arbitrarily without distinction between the embodiments. The various effects described above are merely examples and are not limiting, and other effects may be exhibited.
 Note that the present technology can also adopt the following configurations.
 (1) An information processing apparatus including:
     an acquisition unit that acquires information about a height position of a sensor capable of detecting peripheral information of a moving body;
     a calculation unit that calculates, based on the acquired information about the height position of the sensor, a determination region for determining a situation around the moving body; and
     a determination unit that determines the situation around the moving body based on the calculated determination region and the peripheral information detected by the sensor.
 (2) The information processing apparatus according to (1), in which
     the calculation unit determines whether or not an obstacle exists around the moving body.
 (3) The information processing apparatus according to (1) or (2), in which
     the acquisition unit acquires shape data relating to the surroundings of the moving body, and
     the determination unit determines the situation around the moving body based on the relationship between the detected shape data and the determination region.
 (4) The information processing apparatus according to (3), in which
     the acquisition unit acquires three-dimensional point cloud data relating to the surroundings of the moving body, and
     the determination unit determines the situation around the moving body by determining whether or not point data included in the three-dimensional point cloud data is included in the determination region.
 (5) The information processing apparatus according to any one of (1) to (4), in which
     the calculation unit changes the height position of the determination region according to the variation in the height position of the sensor.
 (6) The information processing apparatus according to any one of (1) to (5), in which
     the calculation unit calculates one or more determination surfaces that define the determination region, and
     the one or more determination surfaces include at least one of a first determination surface corresponding to the ground and a second determination surface corresponding to a ceiling surface.
 (7) The information processing apparatus according to (6), in which
     the calculation unit changes the height position of at least one of the first determination surface and the second determination surface according to the variation in the height position of the sensor.
 (8) The information processing apparatus according to any one of (1) to (7), in which
     the calculation unit calculates the determination region with a predetermined reference determination region as a reference, based on the acquired information about the height position of the sensor.
 (9) The information processing apparatus according to (8), in which
     the calculation unit calculates the determination region so as to have the same height position as the height position of the reference determination region.
 (10) The information processing apparatus according to any one of (1) to (9), in which
     the calculation unit corrects the size of the calculated determination region in the height direction, based on the acquired information about the height position of the sensor.
 (11) The information processing apparatus according to any one of (1) to (10), in which
     the acquisition unit acquires information about an inclination of the sensor, and
     the calculation unit calculates the determination region based on the information about the inclination of the sensor.
 (12) The information processing apparatus according to (11), in which
     the calculation unit changes the inclination of the determination region according to the variation in the inclination of the sensor.
 (13) An information processing method executed by a computer system, including:
     acquiring information about a height position of a sensor capable of detecting peripheral information of a moving body;
     calculating, based on the acquired information about the height position of the sensor, a determination region for determining a situation around the moving body; and
     determining the situation around the moving body based on the calculated determination region and the peripheral information detected by the sensor.
 (14) A program that causes a computer system to execute the steps of:
     acquiring information about a height position of a sensor capable of detecting peripheral information of a moving body;
     calculating, based on the acquired information about the height position of the sensor, a determination region for determining a situation around the moving body; and
     determining the situation around the moving body based on the calculated determination region and the peripheral information detected by the sensor.
 (15) A moving body including:
     a drive unit;
     a sensor capable of detecting peripheral information;
     a detection unit that detects information about a height position of the sensor;
     a calculation unit that calculates, based on the detected information about the height position of the sensor, a determination region for determining a surrounding situation;
     a determination unit that determines the surrounding situation based on the calculated determination region and the peripheral information detected by the sensor; and
     a drive control unit that controls the drive unit based on a determination result of the determination unit.
 (16) The moving body according to (15), in which
     the drive unit is a leg having a multi-joint structure.
 D … determination region
 D0 … reference determination region
 1 … floor surface
 2 … ceiling surface
 3 … obstacle
 10 … moving body
 20 … robot
 21 … leg
 25 … distance sensor
 26 … height position sensor
 27 … internal sensor
 30 … position calculation unit
 31 … posture calculation unit
 32 … determination region calculation unit
 33 … obstacle determination unit
 35 … first identification surface
 36 … second identification surface
 100 … moving body control system
 200 … vehicle control system
 290 … vehicle
 293 … distance sensor

Claims (16)

  1.  An information processing apparatus comprising:
      an acquisition unit that acquires information about a height position of a sensor capable of detecting peripheral information of a moving body;
      a calculation unit that calculates, based on the acquired information about the height position of the sensor, a determination region for determining a situation around the moving body; and
      a determination unit that determines the situation around the moving body based on the calculated determination region and the peripheral information detected by the sensor.
  2.  The information processing apparatus according to claim 1, wherein
      the calculation unit determines whether or not an obstacle exists around the moving body.
  3.  The information processing apparatus according to claim 1, wherein
      the acquisition unit acquires shape data relating to the surroundings of the moving body, and
      the determination unit determines the situation around the moving body based on the relationship between the detected shape data and the determination region.
  4.  The information processing apparatus according to claim 3, wherein
      the acquisition unit acquires three-dimensional point cloud data relating to the surroundings of the moving body, and
      the determination unit determines the situation around the moving body by determining whether or not point data included in the three-dimensional point cloud data is included in the determination region.
  5.  The information processing apparatus according to claim 1, wherein
      the calculation unit changes the height position of the determination region according to the variation in the height position of the sensor.
  6.  The information processing apparatus according to claim 1, wherein
      the calculation unit calculates one or more determination surfaces that define the determination region, and
      the one or more determination surfaces include at least one of a first determination surface corresponding to the ground and a second determination surface corresponding to a ceiling surface.
  7.  The information processing apparatus according to claim 6, wherein
      the calculation unit changes the height position of at least one of the first determination surface and the second determination surface according to the variation in the height position of the sensor.
  8.  The information processing apparatus according to claim 1, wherein
      the calculation unit calculates the determination region with a predetermined reference determination region as a reference, based on the acquired information about the height position of the sensor.
  9.  The information processing apparatus according to claim 8, wherein
      the calculation unit calculates the determination region so as to have the same height position as the height position of the reference determination region.
  10.  The information processing apparatus according to claim 1, wherein
      the calculation unit corrects the size of the calculated determination region in the height direction, based on the acquired information about the height position of the sensor.
  11.  The information processing apparatus according to claim 1, wherein
      the acquisition unit acquires information about an inclination of the sensor, and
      the calculation unit calculates the determination region based on the information about the inclination of the sensor.
  12.  The information processing apparatus according to claim 11, wherein
      the calculation unit changes the inclination of the determination region according to the variation in the inclination of the sensor.
  13.  An information processing method executed by a computer system, comprising:
      acquiring information about a height position of a sensor capable of detecting peripheral information of a moving body;
      calculating, based on the acquired information about the height position of the sensor, a determination region for determining a situation around the moving body; and
      determining the situation around the moving body based on the calculated determination region and the peripheral information detected by the sensor.
  14.  A program that causes a computer system to execute the steps of:
      acquiring information about a height position of a sensor capable of detecting peripheral information of a moving body;
      calculating, based on the acquired information about the height position of the sensor, a determination region for determining a situation around the moving body; and
      determining the situation around the moving body based on the calculated determination region and the peripheral information detected by the sensor.
  15.  A moving body comprising:
      a drive unit;
      a sensor capable of detecting peripheral information;
      a detection unit that detects information about a height position of the sensor;
      a calculation unit that calculates, based on the detected information about the height position of the sensor, a determination region for determining a surrounding situation;
      a determination unit that determines the surrounding situation based on the calculated determination region and the peripheral information detected by the sensor; and
      a drive control unit that controls the drive unit based on a determination result of the determination unit.
  16.  The moving body according to claim 15, wherein
      the drive unit is a leg having a multi-joint structure.
PCT/JP2019/001369 2018-03-12 2019-01-18 Information processing device, information processing method, program, and mobile body WO2019176278A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-043789 2018-03-12
JP2018043789 2018-03-12

Publications (1)

Publication Number Publication Date
WO2019176278A1 true WO2019176278A1 (en) 2019-09-19

Family

ID=67907687

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/001369 WO2019176278A1 (en) 2018-03-12 2019-01-18 Information processing device, information processing method, program, and mobile body

Country Status (1)

Country Link
WO (1) WO2019176278A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3916426A1 (en) * 2020-05-29 2021-12-01 Kabushiki Kaisha Toshiba Movable object, distance measurement method, and distance measurement program

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09146633A (en) * 1995-11-16 1997-06-06 Hitachi Ltd Method and system for guiding robot
JP2004034272A (en) * 2002-07-08 2004-02-05 Mitsubishi Heavy Ind Ltd Self-position identification device for movable body
JP2006185438A (en) * 2004-12-03 2006-07-13 Matsushita Electric Ind Co Ltd Robot control device
JP2008090575A (en) * 2006-10-02 2008-04-17 Honda Motor Co Ltd Mobile robot
JP2016139389A (en) * 2015-01-29 2016-08-04 シャープ株式会社 Autonomous travel control system, autonomous travel apparatus and autonomous travel control method using the same, control program and storage medium
US20170217021A1 (en) * 2014-01-07 2017-08-03 Orin P.F. Hoffman Remotely operating a mobile robot
JP2018010412A (en) * 2016-07-12 2018-01-18 株式会社リコー Information processing equipment, information processing method and information processing program


Similar Documents

Publication Publication Date Title
JP7136106B2 (en) VEHICLE DRIVING CONTROL DEVICE, VEHICLE DRIVING CONTROL METHOD, AND PROGRAM
US20200241549A1 (en) Information processing apparatus, moving apparatus, and method, and program
US11661084B2 (en) Information processing apparatus, information processing method, and mobile object
US11592829B2 (en) Control device and control method, program, and mobile object
US11537131B2 (en) Control device, control method, and mobile body
JP7143857B2 (en) Information processing device, information processing method, program, and mobile object
JP7320001B2 (en) Information processing device, information processing method, program, mobile body control device, and mobile body
WO2019181284A1 (en) Information processing device, movement device, method, and program
US11501461B2 (en) Controller, control method, and program
WO2020226085A1 (en) Information processing device, information processing method, and program
JP6891753B2 (en) Information processing equipment, mobile devices, and methods, and programs
JP7257737B2 (en) Information processing device, self-position estimation method, and program
WO2020031812A1 (en) Information processing device, information processing method, information processing program, and moving body
WO2019203022A1 (en) Moving body, information processing device, information processing method, and program
WO2019073795A1 (en) Information processing device, own-position estimating method, program, and mobile body
WO2019150918A1 (en) Information processing device, information processing method, program, and moving body
WO2021153176A1 (en) Autonomous movement device, autonomous movement control method, and program
US20210026356A1 (en) Control apparatus, control method, program, and mobile object
WO2021010083A1 (en) Information processing device, information processing method, and information processing program
WO2019176278A1 (en) Information processing device, information processing method, program, and mobile body
WO2021187299A1 (en) Information processing device, information processing method, and program
US20230260254A1 (en) Information processing device, information processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19766453

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19766453

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP