WO2021140916A1 - Moving body, information processing device, information processing method, and program - Google Patents

Moving body, information processing device, information processing method, and program

Info

Publication number
WO2021140916A1
Authority
WO
WIPO (PCT)
Prior art keywords
observation
unit
action plan
influence
moving body
Prior art date
Application number
PCT/JP2020/048172
Other languages
French (fr)
Japanese (ja)
Inventor
啓輔 前田
Original Assignee
ソニーグループ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニーグループ株式会社
Publication of WO2021140916A1 publication Critical patent/WO2021140916A1/en

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10Simultaneous control of position or course in three dimensions

Definitions

  • This technology relates to mobile objects, information processing devices, information processing methods, and programs applicable to autonomous movement and the like.
  • For example, the autonomous mobile robot described in Patent Document 1 predicts the destinations of the target and of an obstacle from a captured image, determines from the prediction result whether the target will be shielded by the obstacle, and, based on that determination, changes its field of view so that a large area of the target enters the field of view of the captured image. It is thereby disclosed that the target is not lost (paragraphs [0024] and [0026] of Patent Document 1, and the like).
  • The purpose of this technology is to provide a moving body, an information processing device, an information processing method, and a program capable of exhibiting high stability in observation.
  • the moving body includes a sensor unit and an action plan generation unit.
  • the sensor unit can observe the observation target.
  • The action plan generation unit generates an action plan for observing the observation target based on the degree of influence that an object in the influence space set for the observation target has on the observation by the sensor unit.
  • In this moving body, an action plan for observing the observation target is generated based on the degree of influence that an object in the influence space, which is set for the observation target and affects the observation by the sensor unit, has on the observation. This makes it possible to exhibit high stability in observation.
  • the moving body may be an unmanned flying body.
  • The moving body may further include a space specifying unit that specifies the influence space based on the position information of the observation target.
  • the space identification unit may specify the influence space based on the observation information regarding the observation of the sensor unit.
  • the observation information may include at least one of the angle of view of the sensor unit or the observable distance of the sensor unit.
  • The space specifying unit may specify the influence space based on a line segment connecting the moving body and the observation target.
  • The moving body may further include a calculation unit that calculates the degree of influence of the object based on at least one of the shape, size, position, or speed of the object.
  • the calculation unit may calculate the degree of influence based on the position information of the object in the influence space.
  • The moving body may move based on the action plan generated by the action plan generation unit.
  • the action plan generation unit may generate the action plan for reducing the degree of influence.
  • The action plan generation unit may treat the action plan for observing the observation target as an observation action plan and integrate the observation action plan into an action plan given in advance for observing the observation target.
  • The action plan generation unit may restrict the action plan given in advance for observing the observation target and execute the observation action plan instead.
  • When there are a plurality of the objects, the action plan generation unit may generate the action plan by adding up the degrees of influence of the respective objects.
  • The moving body may further include a behavior prediction unit that predicts the behavior of at least one of the observation target or the object based on the sensing result acquired by the sensor unit.
  • the action plan generation unit may generate the action plan based on a predetermined action of the observation target predicted by the action prediction unit.
  • the action plan generation unit may generate the action plan based on the predetermined action when the observation target moves out of the space observable by the sensor unit.
  • the information processing device includes an action plan generation unit.
  • The action plan generation unit generates an action plan for observing the observation target based on the degree of influence that an object in the influence space, which is set for the observation target and affects the observation by the sensor unit, has on the observation.
  • the information processing device may further include a GUI output unit that outputs a GUI (Graphical User Interface) in which the influence space is identifiable.
  • the GUI output unit may output a GUI in which the degree of influence is identifiable.
  • The information processing method is an information processing method executed by a computer system, and includes generating an action plan for observing the observation target based on the degree of influence that an object in the influence space, which is set for the observation target and affects the observation by the sensor unit, has on the observation.
  • A program causes a computer system to execute the step of generating such an action plan.
  • FIG. 1 is a schematic diagram for explaining an outline of an observation mobile body according to the present technology.
  • FIG. 1A is a schematic diagram showing how the observation moving body 1 follows the observation target 2.
  • FIG. 1B is a schematic view showing a state in which the observation target 2 is viewed from the observation moving body 1.
  • The observation moving body 1 can generate an action plan for observing the observation target 2 based on the degree of influence that the object 5 in the influence space 4, which affects the observation by the sensor unit 3, has on the observation.
  • the influence space 4 is a space set for the observation target 2 observed by the sensor unit 3.
  • the observation mobile body 1 is a drone capable of autonomous flight.
  • the observation mobile body 1 has a sensor unit 3 capable of observing the observation target 2.
  • the sensor unit 3 includes an imaging device such as a stereo camera, a digital camera, or a monocular camera.
  • sensor devices such as laser distance measuring sensors, contact sensors, ultrasonic sensors, LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), and sonar may be used.
  • the observation moving body 1 can maintain the state in which the observation target 2 is observed by the sensor unit 3 and can continuously track the observation target 2.
  • the observation moving body 1 is not limited to the drone, and may be, for example, a wheel-type robot, a multi-legged walking type robot, or a robot having legs having an articulated structure.
  • The observation target 2 and the object 5 are not limited, and any object may be set as the observation target 2. For example, other moving bodies such as drones, as well as people, structures, roads, traffic lights, traffic signs, road markings, and the like are included.
  • the object 5 existing in the influence space 4 may be described as an obstacle.
  • the influence space 4 is set for the observation target 2 as a space that affects the observation by the sensor unit 3.
  • Observation typically includes acquiring information about the observation target 2 from the various sensor devices included in the sensor unit 3 of the observation moving body 1. For example, image information of the observation target 2 captured by a camera, or sound data such as the voice and footsteps of the observation target 2 detected by a microphone, is included in the observation result. A state in which such information (data) can be acquired is a state in which the observation target 2 can be observed. In the present disclosure, a state in which the observation target 2 can be kept within the angle of view of the camera mounted on the observation moving body 1 is taken as an example of the observable state.
  • The influence space 4 is a space in which, when the object 5 or the like exists within it, the observation of the observation target 2 may be hindered or become impossible.
  • The degree of influence of the object 5 on the observation is the degree to which the object 5 hinders the observation. For example, the larger the area (volume) of the influence space 4 occupied by the object 5, the greater the degree of influence on the observation.
  • the degree of influence on the observation may change depending on other parameters such as the position of the observation moving body 1.
  • a cylindrical influence space 4 is set as an example of the influence space 4.
  • the shape of the influence space 4 is not limited, and an arbitrary shape may be set according to the observation information regarding the observation of the sensor unit 3.
  • The observation information includes at least one of the angle of view (viewing angle) and observable distances such as the minimum and maximum observable distances.
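  • As an illustration only, the cylinder parameters might be derived from such observation information as sketched below; the specific heuristic (choosing the radius from the angle of view projected at the target distance) and the helper name are assumptions and are not described in this publication.

```python
import math

def cylinder_influence_space(target_distance_m: float,
                             view_angle_deg: float,
                             max_observable_m: float) -> dict:
    """Illustrative derivation of a cylindrical influence space.

    The cylinder axis is the line of sight from the moving body to the
    observation target; the radius is chosen so that the cylinder stays
    inside the camera's field of view at the target distance (an assumed
    heuristic, not the method of the publication).
    """
    # Half of the field of view projected to the target distance.
    radius = target_distance_m * math.tan(math.radians(view_angle_deg) / 2.0)
    # The cylinder cannot be longer than the observable distance.
    length = min(target_distance_m, max_observable_m)
    return {"radius": radius, "length": length}

# Example: a 10.8 m line of sight with a 20-degree field of view.
print(cylinder_influence_space(10.8, 20.0, 30.0))
```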
  • The action plan is various kinds of information for controlling the observation moving body 1.
  • the speed of the observation moving body 1, the path (trajectory) in which the observation moving body 1 moves, the waypoint (position) through which the observation moving body 1 passes, the posture of the observation moving body 1, and the like are included in the action plan.
  • In the present embodiment, the moving direction and speed of the observation moving body 1 are generated as the action plan.
  • The moving direction and speed of the observation moving body 1 at each time can be said to be the path of the observation moving body 1.
  • the observation moving body 1 observes the observation target 2 according to an action plan that follows the observation target 2.
  • When the object 5 exists in the influence space 4, the observation moving body 1 can observe the observation target 2 according to an action plan generated based on the degree of influence of the object 5 on the observation.
  • For example, the observation moving body 1 moves to the position 8, from which the observation target 2 can be observed, according to the action plan 7 that goes around the object 5.
  • the observation moving body 1 continues following according to the action plan for following the observation target 2 given in advance. That is, the observation mobile body 1 can execute the action plan without losing sight of the observation target 2.
  • FIG. 2 is a block diagram showing a configuration example of a schematic function of the mobile body control system 100 that controls the observation mobile body 1 of the present disclosure.
  • The mobile body control system 100 includes an input unit 101, a data acquisition unit 102, a communication unit 103, a mobile body internal device 104, an output control unit 105, an output unit 106, a drive system control unit 107, a drive system 108, a storage unit 109, and an autonomous movement control unit 110.
  • the input unit 101, the data acquisition unit 102, the communication unit 103, the output control unit 105, the drive system control unit 107, the storage unit 109, and the autonomous movement control unit 110 are connected to each other via the communication network 111.
  • The communication network 111 is composed of a communication network or bus conforming to any standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), a LAN (Local Area Network) such as IEEE 802.3, or FlexRay (registered trademark), or of a proprietary, non-standardized communication method. In addition, each part of the mobile body control system 100 may be directly connected without going through the communication network 111.
  • Hereinafter, when each unit of the mobile body control system 100 communicates via the communication network 111, the description of the communication network 111 is omitted. For example, when the input unit 101 and the autonomous movement control unit 110 communicate via the communication network 111, it is simply described that the input unit 101 and the autonomous movement control unit 110 communicate with each other.
  • the input unit 101 includes a device used for inputting various data, instructions, and the like to the observation mobile body 1.
  • The input unit 101 includes operation devices such as a touch panel, buttons, a microphone, switches, and levers, as well as operation devices that allow input by a method other than manual operation, such as voice or gesture.
  • the input unit 101 may be a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile device or a wearable device that supports the operation of the mobile control system 100.
  • The input unit 101 generates an input signal based on data, instructions, and the like input by the operator (hereinafter referred to as the user) who gives an action plan to the observation moving body 1, and supplies the input signal to each unit of the mobile body control system 100.
  • the data acquisition unit 102 includes various sensors and the like for acquiring data used for processing of the mobile control system 100, and supplies the acquired data to each unit of the mobile control system 100.
  • the data acquisition unit 102 constitutes the sensor group 112 by including various sensors for detecting the state of the observation moving body 1, and corresponds to the sensor unit 3 in FIG.
  • For example, the data acquisition unit 102 includes a gyro sensor, an acceleration sensor, an inertial measurement unit (IMU), and sensors for detecting the operation amount of an acceleration input such as an accelerator, the operation amount of a deceleration input, and the operation amount of a direction instruction input.
  • the data acquisition unit 102 includes various sensors for detecting information outside the observation moving body 1 such as the observation target 2 and the object 5.
  • the data acquisition unit 102 includes an imaging device such as a ToF (Time of Flight) camera, a stereo camera, a monocular camera, an infrared camera, a polarized camera, and other cameras.
  • Further, for example, the data acquisition unit 102 includes an environment sensor for detecting the weather or meteorological conditions, and a surrounding information detection sensor for detecting objects around the observation moving body 1.
  • the environmental sensor includes, for example, a raindrop sensor, a fog sensor, a sunshine sensor, a snow sensor, and the like.
  • Ambient information detection sensors include, for example, laser distance measuring sensors, ultrasonic sensors, radars, LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), sonar, and the like.
  • the data acquisition unit 102 includes various sensors for detecting the current position of the observation moving body 1.
  • the data acquisition unit 102 includes a GNSS receiver or the like that receives a GNSS signal from a GNSS satellite.
  • The communication unit 103 communicates with the mobile body internal device 104 and with various devices, servers, base stations, and the like outside the observation moving body 1, such as other drones; it transmits data supplied from each unit of the mobile body control system 100 and supplies received data to each unit of the mobile body control system 100.
  • the communication protocol supported by the communication unit 103 is not particularly limited, and the communication unit 103 may support a plurality of types of communication protocols.
  • the communication unit 103 wirelessly communicates with the mobile internal device 104 by wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), WUSB (Wireless USB), or the like.
  • Further, for example, the communication unit 103 performs wired communication with the mobile body internal device 104 by USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), MHL (Mobile High-definition Link), or the like via a connection terminal (and a cable if necessary) (not shown). Further, for example, the communication unit 103 communicates with a device (for example, an application server or a control server) existing on an external network (for example, the Internet, a cloud network, or a network peculiar to a business operator) via a base station or an access point.
  • Further, for example, the communication unit 103 uses P2P (Peer To Peer) technology to communicate with a terminal existing in the vicinity of the observation moving body 1 (for example, a terminal of a pedestrian or a store, or an MTC (Machine Type Communication) terminal).
  • the mobile internal device 104 includes, for example, a mobile device or wearable device owned by the user, an information device carried in or attached to the observation mobile body 1, a navigation device for searching a route to an arbitrary destination, and the like.
  • the output control unit 105 controls the output of various information to the user or the outside of the observation mobile body 1.
  • the output control unit 105 generates an output signal including at least one of visual information (for example, image data) and auditory information (for example, audio data) and supplies the output signal to the output unit 106.
  • For example, the output control unit 105 synthesizes image data captured by different imaging devices of the data acquisition unit 102 to generate a bird's-eye view image, a panoramic image, or the like, and supplies an output signal including the generated image to the output unit 106.
  • the output control unit 105 generates voice data including a warning sound or a warning message for dangers such as collision, contact, and entry into a danger zone, and outputs an output signal including the generated voice data to the output unit 106.
  • the output unit 106 includes a device capable of outputting visual information or auditory information to the user or the outside of the observation mobile body 1.
  • the output unit 106 includes a display device, an instrument panel, an audio speaker, headphones, a wearable device such as a spectacle-type display worn by a user, a projector, a lamp, and the like.
  • The display device included in the output unit 106 may be, in addition to a device having a normal display, a device that displays visual information in the user's field of view, such as a head-up display, a transmissive display, or a device having an AR (Augmented Reality) display function. Since the output control unit 105 and the output unit 106 are not indispensable for autonomous movement processing, they may be omitted if necessary.
  • the drive system control unit 107 controls the drive system system 108 by generating various control signals and supplying them to the drive system system 108. Further, the drive system control unit 107 supplies control signals to each unit other than the drive system system 108 as necessary, and notifies the control state of the drive system system 108.
  • the drive system system 108 includes various devices related to the drive system of the observation mobile body 1.
  • For example, in the case of a four-legged walking robot, the drive system 108 includes servomotors capable of specifying the angle and torque provided in each joint of the four legs, a motion controller that decomposes and replaces the movement of the robot itself into the movements of the four legs, and a feedback control device using sensors in each motor and sensors on the soles of the feet.
  • In the case of a drone, the drive system 108 includes four or six motors each having an upward-facing propeller, and a motion controller that decomposes and replaces the movement of the robot itself into the amount of rotation of each motor.
  • In the case of a vehicle, the drive system 108 includes a driving force generating device for generating a driving force, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle, a braking device for generating a braking force, an ABS (Antilock Brake System), an ESC (Electronic Stability Control), an electric power steering device, and the like.
  • The storage unit 109 includes, for example, a ROM (Read Only Memory), a RAM (Random Access Memory), a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, and the like.
  • the storage unit 109 stores various programs, data, and the like used by each unit of the mobile control system 100.
  • The storage unit 109 stores map data such as a three-dimensional high-precision map such as a dynamic map, a global map that is less accurate than the high-precision map but covers a wide area, and a local map that includes information around the observation moving body 1.
  • The autonomous movement control unit 110 controls autonomous movement such as automatic driving or driving support. Specifically, for example, the autonomous movement control unit 110 performs cooperative control for the purpose of realizing functions such as collision avoidance or impact mitigation of the observation moving body 1, follow-up movement based on the distance between moving bodies, movement at a maintained speed, or a collision warning for the observation moving body 1. Further, for example, the autonomous movement control unit 110 performs cooperative control for the purpose of autonomous movement in which the body moves autonomously without depending on the operation of the user.
  • the autonomous movement control unit 110 corresponds to the information processing device according to the present embodiment, and has hardware necessary for a computer such as a CPU, RAM, and ROM.
  • the information processing method according to the present technology is executed by the CPU loading the program according to the present technology recorded in the ROM in advance into the RAM and executing the program.
  • the specific configuration of the autonomous movement control unit 110 is not limited, and for example, a PLD (Programmable Logic Device) such as an FPGA (Field Programmable Gate Array) or another device such as an ASIC (Application Specific Integrated Circuit) may be used.
  • the autonomous movement control unit 110 includes a detection unit 131, a self-position estimation unit 132, a situation analysis unit 133, a planning unit 134, and an operation control unit 135.
  • the detection unit 131, the self-position estimation unit 132, and the situation analysis unit 133 constitute the recognition processing unit 121.
  • the planning unit 134 constitutes an action plan processing unit 122.
  • the motion control unit 135 constitutes the behavior control processing unit 123.
  • the influence space identification unit 151, the influence degree calculation unit 152, and the GUI (Graphical User Interface) output unit 153 constitute the influence space processing unit 124.
  • the detection unit 131 detects various types of information necessary for controlling autonomous movement.
  • the detection unit 131 includes a mobile body external information detection unit 141 and a mobile body state detection unit 142.
  • the mobile external information detection unit 141 performs detection processing of external information of the observation mobile body 1 based on data or signals from each unit of the mobile control system 100.
  • the moving body external information detection unit 141 performs detection processing, recognition processing, and tracking processing of the observation target 2 and the object 5 around the observation moving body 1, and detection processing of the distance to the observation target 2 and the object 5.
  • the mobile body external information detection unit 141 performs detection processing of the environment around the mobile body.
  • the surrounding environment to be detected includes, for example, weather, temperature, humidity, brightness, road surface condition, and the like.
  • the mobile external information detection unit 141 supplies data indicating the result of the detection process to the self-position estimation unit 132, the map analysis unit 145 of the situation analysis unit 133, the situation recognition unit 146, the operation control unit 135, and the like.
  • the mobile body state detection unit 142 performs a state detection process of the observation mobile body 1 based on data or signals from each unit of the mobile body control system 100.
  • the state of the observation moving body 1 to be detected includes, for example, speed, acceleration, steering angle, presence / absence and content of abnormality, state of driving operation, state of other moving body-mounted equipment, and the like.
  • the mobile body state detection unit 142 supplies data indicating the result of the detection process to the situation awareness unit 146 of the situation analysis unit 133, the operation control unit 135, and the like.
  • The self-position estimation unit 132 performs estimation processing of the position, posture, and the like of the observation moving body 1 based on data or signals from each unit of the mobile body control system 100, such as the mobile body external information detection unit 141 and the situation recognition unit 146 of the situation analysis unit 133. In addition, the self-position estimation unit 132 generates a local map (hereinafter referred to as a self-position estimation map) used for self-position estimation as necessary.
  • the map for self-position estimation is, for example, a highly accurate map using a technique such as SLAM (Simultaneous Localization and Mapping).
  • The self-position estimation unit 132 supplies data indicating the result of the estimation processing to the map analysis unit 145, the situation recognition unit 146, and the like of the situation analysis unit 133. Further, the self-position estimation unit 132 stores the self-position estimation map in the storage unit 109. Further, the self-position estimation unit 132 accumulates, in a database, time-series information supplied in time series based on the detection results supplied from the sensor group 112, estimates the self-position based on the accumulated time-series information, and outputs it as the time-series-information self-position. Further, the self-position estimation unit 132 estimates the self-position based on the current detection result supplied from the sensor group 112 and outputs it as the current-information self-position.
  • The self-position estimation unit 132 outputs the self-position estimation result by integrating or switching between the time-series-information self-position and the current-information self-position. Further, when a change in the posture of the observation moving body 1 is detected based on the detection results supplied from the sensor group 112 and the self-position changes significantly, the estimation accuracy of the time-series-information self-position is considered to decrease, so the self-position estimation unit 132 estimates the self-position only from the current-information self-position. Further, for example, when the observation moving body 1 moves while mounted on another moving body, the self-position changes significantly even though no change in the posture of the observation moving body 1 is detected based on the detection results supplied from the sensor group 112; therefore, the estimation accuracy of the time-series-information self-position is considered to decrease, and the self-position is estimated only from the current-information self-position.
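  • The switching logic described above can be sketched as follows; the fixed blending weights, the boolean flags, and the function name are simplifying assumptions and are not taken from this publication.

```python
def fuse_self_position(time_series_pose, current_pose,
                       posture_changed: bool, large_jump: bool):
    """Sketch of integrating or switching the two self-position estimates.

    When a posture change or a large positional jump suggests that the
    accumulated time-series estimate has degraded, only the estimate from
    the current sensor data is used; otherwise the two are blended.
    """
    if posture_changed or large_jump:
        return current_pose  # trust the current-information self-position only
    # Simple fixed-weight integration (the publication does not specify weights).
    return tuple(0.7 * t + 0.3 * c for t, c in zip(time_series_pose, current_pose))

# Example: a large jump in position forces the current-information estimate.
print(fuse_self_position((1.0, 2.0, 0.5), (1.4, 2.1, 0.5), False, True))
```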
  • The situation analysis unit 133 analyzes the situation of the observation moving body 1 and its surroundings.
  • The situation analysis unit 133 includes a map analysis unit 145, a situation recognition unit 146, and a situation prediction unit 147.
  • The map analysis unit 145 analyzes various maps stored in the storage unit 109 while using, as necessary, data or signals from each unit of the mobile body control system 100 such as the self-position estimation unit 132 and the mobile body external information detection unit 141, and constructs a map containing the information necessary for autonomous movement processing. The map analysis unit 145 supplies the constructed map to the situation recognition unit 146, the situation prediction unit 147, and the route planning unit 161, the action planning unit 162, and the operation planning unit 163 of the planning unit 134.
  • The situation recognition unit 146 performs recognition processing of the situation of the observation moving body 1 and the situation around it based on data or signals from each unit of the mobile body control system 100, such as the self-position estimation unit 132, the mobile body external information detection unit 141, the mobile body state detection unit 142, and the map analysis unit 145. Further, the situation recognition unit 146 generates a local map (hereinafter referred to as a situation recognition map) used for recognizing the situation around the observation moving body 1 as needed.
  • the situation recognition map is, for example, an Occupancy Grid Map, a Road Map, or a Point Cloud Map.
  • the situation of the observation moving body 1 to be recognized includes, for example, the position, posture, movement (for example, speed, acceleration, moving direction, etc.) of the observation moving body 1, and the presence / absence and contents of an abnormality.
  • the surrounding conditions of the observation moving object 1 to be recognized include, for example, the type, position, and movement (for example, velocity, acceleration, moving direction, etc.) of surrounding objects such as the observation target 2 and the object 5. Further, for example, the composition of the surrounding road and the condition of the road surface, and the surrounding weather, temperature, humidity, brightness, and the like are included.
  • the situational awareness unit 146 supplies data indicating the result of the recognition process (including a situational awareness map, if necessary) to the self-position estimation unit 132, the situation prediction unit 147, and the like. Further, the situational awareness unit 146 stores the situational awareness map in the storage unit 109. The situational awareness unit 146 supplies data indicating the position of the object 5 to the influence space identification unit 151. Further, the situational awareness unit 146 supplies data indicating the position of the observation target 2 to the influence degree calculation unit 152. For example, the position information of the observation target 2 and the object 5 is supplied based on the position of the observation moving body 1. As the position information, coordinate values (for example, XYZ coordinate values) defined by the absolute coordinate system (world coordinate system) may be used.
  • Alternatively, coordinate values (for example, xyz coordinate values or uvd coordinate values) defined by a relative coordinate system with a predetermined point as a reference (origin) may be used. In this case, the reference origin may be set arbitrarily.
  • the situation prediction unit 147 performs a situation prediction process regarding the observation moving body 1 based on data or signals from each part of the moving body control system 100 such as the map analysis unit 145 and the situation recognition unit 146.
  • the situation prediction unit 147 performs prediction processing such as the situation of the observation moving body 1 and the situation around the observation moving body 1.
  • the situation of the observation mobile body 1 to be predicted includes, for example, the behavior of the observation mobile body 1, the occurrence of an abnormality, the movable distance, and the like.
  • The situation around the moving body to be predicted includes, for example, the behavior of moving objects around the observation moving body 1, changes in signal states, changes in the environment such as the weather, and the like.
  • the situation prediction unit 147 supplies data indicating the result of the prediction processing and data from the situation recognition unit 146 to the route planning unit 161, the action planning unit 162, the operation planning unit 163, and the like of the planning unit 134.
  • the influence space specifying unit 151 specifies the influence space 4 set for the observation target 2.
  • the influence space 4 is specified based on the position information of the observation target 2 input from the situational awareness unit 146.
  • a specific method for specifying the influence space 4 will be described with reference to FIG.
  • the influence space specifying unit 151 determines whether or not the object 5 exists in the influence space 4.
  • a specific determination method will be described with reference to FIG.
  • Data indicating the result of the identification processing is supplied by the influence space specifying unit 151 to the influence degree calculation unit 152 and the GUI output unit 153.
  • the influence degree calculation unit 152 calculates the influence degree based on the possibility that the object 5 existing in the influence space 4 interferes with the observation of the observation target 2.
  • the possibility of hindering observation is, for example, information about various objects 5 such as the shape, size, position, and velocity of the object 5.
  • the degree of influence is calculated based on the position information of the object 5 input from the situational awareness unit 146 and the shape and position of the influence space 4 input from the influence space identification unit 151.
  • the degree of influence is calculated based on the position of the object 5 in the influence space 4 and the traveling direction of the observation moving body 1. A specific method for calculating the degree of influence will be described with reference to FIG. Data indicating the result of the calculation process is supplied by the influence degree calculation unit 152 to the GUI output unit 153, the route planning unit 161, the action planning unit 162, and the operation planning unit 163.
  • the GUI output unit 153 outputs a GUI in which the influence space 4 is identifiable.
  • the user can input an action plan for following the observation target 2 to the observation moving body 1 via the GUI displayed on the output unit 106.
  • the user can identify the shape and position of the influence space 4 via the GUI.
  • the user can identify the degree of influence of the object 5 existing in the influence space 4 via the GUI.
  • The route planning unit 161 plans a route to the destination based on data or signals from each unit of the mobile body control system 100, such as the map analysis unit 145, the situation prediction unit 147, and the influence degree calculation unit 152. For example, the route planning unit 161 sets a route from the current position to the specified destination based on the global map. The route planning unit 161 supplies data indicating the planned route to the action planning unit 162 and the like.
  • The action planning unit 162 plans the action of the observation moving body 1 for safely moving along the route planned by the route planning unit 161 within the planned time, based on data or signals from each unit of the mobile body control system 100, such as the map analysis unit 145, the situation prediction unit 147, and the influence degree calculation unit 152. For example, the action planning unit 162 plans start, stop, direction of travel (for example, forward, backward, change of direction, etc.), movement speed, and the like. The action planning unit 162 supplies data indicating the planned action of the observation moving body 1 to the operation planning unit 163 and the like. More specifically, the action planning unit 162 generates, as action plan candidates, candidates of the action of the observation moving body 1 for safely moving within the planned time for each of the routes planned by the route planning unit 161.
  • For example, the action planning unit 162 generates action plan candidates by an A* algorithm (A-star search algorithm), which divides the environment into a grid and optimizes arrival determination and route weights to generate the best path, or by an RRT (Rapidly-exploring Random Tree) algorithm, which extends a path from the self-position to incrementally reachable points while pruning appropriately, as sketched below.
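  • For reference, the following is a minimal sketch of the grid-based A* search named above; it illustrates the general technique only and is not the planner actually implemented in the action planning unit 162.

```python
import heapq
import itertools

def astar_grid(grid, start, goal):
    """A* search on a 2-D occupancy grid (0 = free, 1 = obstacle)."""
    def h(p):  # Manhattan-distance heuristic (consistent on a unit grid)
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    counter = itertools.count()          # tie-breaker so the heap never compares nodes
    open_set = [(h(start), next(counter), start)]
    g_cost = {start: 0}
    came_from = {start: None}
    closed = set()
    while open_set:
        _, _, node = heapq.heappop(open_set)
        if node in closed:
            continue
        closed.add(node)
        if node == goal:                 # reconstruct the best path
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        x, y = node
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < len(grid) and 0 <= ny < len(grid[0]) and grid[nx][ny] == 0:
                ng = g_cost[node] + 1
                if ng < g_cost.get((nx, ny), float("inf")):
                    g_cost[(nx, ny)] = ng
                    came_from[(nx, ny)] = node
                    heapq.heappush(open_set, (ng + h((nx, ny)), next(counter), (nx, ny)))
    return None  # no path exists

# Example: plan around a single obstacle cell.
print(astar_grid([[0, 0, 0], [0, 1, 0], [0, 0, 0]], (0, 0), (2, 2)))
```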
  • The operation planning unit 163 plans the operation of the observation moving body 1 for realizing the action planned by the action planning unit 162, based on data or signals from each unit of the mobile body control system 100, such as the map analysis unit 145, the situation prediction unit 147, and the influence degree calculation unit 152. For example, the operation planning unit 163 plans acceleration, deceleration, rotation speed, and the like. The operation planning unit 163 supplies data indicating the planned operation of the moving body to the operation control unit 135 and the like.
  • The action plan and the operation plan also include the flight pattern of the observation moving body 1. That is, a trajectory and speed defined as a pattern, such as a turn or a figure eight, are also included in the action plan and the operation plan.
  • For example, the speed and curvature of the observation moving body 1 when a turn or a figure eight is performed are planned as the action plan and the operation plan.
  • Parameters such as speed and attitude associated with the flight pattern may be set by default. That is, how to move a predetermined flight pattern may be set by default.
  • The operation control unit 135 controls the operation of the observation moving body 1. More specifically, the operation control unit 135 performs detection processing of emergencies such as collision, contact, entry into a danger zone, and abnormality of the observation moving body 1, based on the detection results of the mobile body external information detection unit 141 and the mobile body state detection unit 142. When the operation control unit 135 detects the occurrence of an emergency, it plans an operation of the observation moving body 1 for avoiding the emergency, such as a sudden stop or a sharp turn. Further, the operation control unit 135 performs acceleration / deceleration control for realizing the operation of the observation moving body 1 planned by the operation planning unit 163.
  • For example, the operation control unit 135 calculates a control target value of the driving force generating device or the braking device for realizing the planned acceleration, deceleration, or sudden stop, and supplies a control command indicating the calculated control target value to the drive system control unit 107.
  • the motion control unit 135 performs direction control for realizing the motion of the observation moving body 1 planned by the motion planning unit 163.
  • For example, the operation control unit 135 calculates a control target value of the steering mechanism for realizing the movement trajectory or sharp turn planned by the operation planning unit 163, and supplies a control command indicating the calculated control target value to the drive system control unit 107.
  • the sensor group 112 corresponds to a sensor unit capable of observing an observation target.
  • In the present embodiment, the influence space processing unit 124 and the planning unit 134 function as an action plan generation unit that generates an action plan for observing the observation target based on the degree of influence that an object in the influence space, which is set for the observation target and affects the observation by the sensor unit, has on the observation.
  • The situation prediction unit 147 corresponds to the behavior prediction unit that predicts the behavior of at least one of the observation target or the object based on the sensing result acquired by the sensor unit.
  • the influence space identification unit 151 corresponds to the space identification unit that specifies the influence space based on the position information of the observation target.
  • The influence degree calculation unit 152 corresponds to the calculation unit that calculates the degree of influence based on at least one of the shape, size, position, or speed of the object.
  • the GUI output unit 153 corresponds to a GUI output unit that outputs a GUI in which the influence space is identifiable.
  • FIG. 3 is a flowchart showing control of specifying the influence space and calculating the degree of influence.
  • the observation target 2 is detected based on the sensing result acquired by the sensor unit 3 (step 101).
  • step 101 is executed when an instruction to track the observation target 2 is input to the observation moving body 1 by the user.
  • the situational awareness unit 146 estimates the position of the detected observation target 2 (step 102).
  • the relative position of the observation target 2 is estimated with respect to the observation moving body 1.
  • the influence space identification unit 151 specifies the influence space based on the estimated position of the observation target 2 (step 103).
  • the object 5 is detected based on the sensing result acquired by the sensor unit 3 (step 104).
  • the situational awareness unit 146 estimates the position of the detected object 5 (step 105).
  • the influence space specifying unit 151 determines whether or not an obstacle (object 5) exists in the influence space 4 (step 106). When there is an obstacle in the influence space 4 (YES in step 106), the influence degree calculation unit 152 calculates the influence degree that the obstacle has on the observation (step 107).
  • The action planning unit 162 generates an action plan based on the calculated degree of influence (step 108). For example, an action plan that reduces the calculated degree of influence is generated.
  • the action plan for observing the observation target 2 generated in step 108 is described as an observation action plan. That is, the observation action plan is an action plan for controlling the observation moving body 1 so that the observation is not obstructed by obstacles in order to continue the observation of the observation target 2.
  • the action plan for observing (tracking) the observation target 2 given in advance is described as the advance action plan.
  • The advance action plan includes, for example, an action plan in which the observation moving body performs a circular motion around the observation target 2 while keeping a predetermined distance from the observation target 2.
  • the motion control unit 135 controls the observation mobile body 1 based on the planned action plan (step 109).
  • For example, the observation action plan is integrated with the advance action plan. That is, in addition to the advance action plan, control is executed so that the obstacle comes out of the influence space 4. For example, assume that the advance action plan wraps around to the right with respect to the observation target 2 and that there is an obstacle in the influence space on the movement path of that action plan.
  • In this case, the observation moving body 1 may move upward with respect to the observation target 2 while wrapping around to the right so that the observation is not obstructed by the obstacle.
  • Alternatively, the control may be switched from the advance action plan to the observation action plan.
  • For example, assume that the advance action plan goes straight toward the observation target 2 and follows it, and that there is an obstacle on the right side of the influence space. In this case, the observation moving body 1 may move to the left, in the direction opposite to the obstacle, instead of going straight.
  • When there is no obstacle in the influence space 4 (NO in step 106), the observation moving body 1 is controlled based on the advance action plan. That is, in the present embodiment, the action plan is generated based on the degree of influence of the obstacle in the influence space 4 on the observation, regardless of whether or not the observation can be continued. Further, when the observation moving body 1 is moved based on the action plan and there is no longer any obstacle in the influence space 4, the observation moving body 1 is again controlled based on the advance action plan.
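  • The two handling policies described above, integrating the observation action plan into the advance action plan or switching to it, can be sketched as follows; the function and parameter names are illustrative assumptions, not taken from this publication.

```python
def combine_plans(prior_velocity, observation_velocity, obstacle_in_space,
                  mode="integrate"):
    """Sketch of how the advance and observation action plans could be combined.

    - "integrate": add the observation action plan on top of the advance plan
      so the obstacle is pushed out of the influence space while following.
    - "switch": suspend the advance plan and execute only the observation plan.
    When no obstacle is in the influence space, the advance plan is used as-is.
    """
    if not obstacle_in_space:
        return prior_velocity
    if mode == "switch":
        return observation_velocity
    return tuple(p + o for p, o in zip(prior_velocity, observation_velocity))

# Example: following to the right while an obstacle sits in the influence space;
# the observation plan adds an upward component so the view is not obstructed.
print(combine_plans((0.3, 0.0, 0.0), (0.0, 0.0, 0.2), True))            # integrate
print(combine_plans((0.3, 0.0, 0.0), (0.0, 0.0, 0.2), True, "switch"))  # switch
```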
  • FIG. 4 is a schematic diagram for explaining a specific example of the method of specifying the influence space.
  • In the present embodiment, the shape of the influence space 20 is a cylinder. Further, the following information is input (units are omitted): the relative position 21 of the observation target 2 with the observation moving body 1 as the origin, (10.5, 2.3, -0.4), and the radius of the influence space 20, 2.0.
  • The influence space specifying unit 151 executes the following calculation based on the above input information to specify the influence space. From the relative distance to the observation target 2 (the length of the center line 22 connecting the observation moving body 1 and the observation target 2) and the vector toward the observation target 2, the normalized direction vector 23 toward the observation target 2 is calculated: relative distance 10.8, direction vector 23 (0.98, 0.21, -0.04).
  • The influence space specifying unit 151 outputs the radius 2.0 of the influence space 20, the relative distance 10.8, and the direction vector 23 to the influence degree calculation unit 152.
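  • The calculation described above for FIG. 4 can be sketched as follows; the function name and the returned structure are illustrative assumptions.

```python
import math

def specify_cylinder_influence_space(target_rel_pos, radius):
    """Compute the relative distance and normalized direction vector that,
    together with the radius, define the cylindrical influence space."""
    distance = math.sqrt(sum(c * c for c in target_rel_pos))
    direction = tuple(c / distance for c in target_rel_pos)
    return {"radius": radius, "relative_distance": distance, "direction": direction}

# Values from the example: target at (10.5, 2.3, -0.4), radius 2.0.
print(specify_cylinder_influence_space((10.5, 2.3, -0.4), 2.0))
# -> relative distance of about 10.8 and direction of about (0.98, 0.21, -0.04)
```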
  • In the present disclosure, "smaller" may include both "less than or equal to" and "less than", and "larger" may include both "greater than or equal to" and "greater than".
  • FIG. 5 is a schematic diagram for explaining a specific example of the method of calculating the degree of influence. In the present embodiment, the coordinates of the three objects, the radius of the influence space 20, the relative distance, and the direction vector 23 are input: the relative position of the object 31 is (7.4, 0.2, 0.1), that of the object 32 is (3.1, -1.9, 1.1), and that of the object 33 is (5.3, 2.4, -0.3).
  • the influence degree calculation unit 152 obtains the influence degree as an amount inversely proportional to the distance between each object and the center line. Further, the direction of the degree of influence is obtained as a vector in the direction of the perpendicular line drawn from each object to the center line 22.
  • the influence degree calculation unit 152 executes the following calculation based on the above input information, and calculates the influence degree vector. It is determined whether the coordinates of each object are included in the influence space 20. For the sake of brevity, only the calculation of the object 31 will be described.
  • The inner product of the direction vector 35 toward the object 31 and the direction vector 23 is written as the following equation (Equation 3).
  • the influence degree vectors of the object 32 and the object 33 are calculated.
  • the object 32 is not included in the influence space 20 because the distance from the center line 22 is 2.8.
  • the influence vector of the object 32 becomes (0, 0, 0).
  • the object 33 is included in the influence space 20 because the inner product of the direction vector to the object 33 and the direction vector 23 is 5.7 and the distance from the center line 22 is 1.2.
  • the influence vector 38 of the object 33 becomes (0.18, 0.8, 0.06).
  • the influence degree calculation unit 152 supplies the total influence degree vector to the GUI output unit 153, the route planning unit 161, the action planning unit 162, and the operation planning unit 163.
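  • The influence-degree calculation described for FIG. 5 can be sketched as follows: the magnitude is taken to be inversely proportional to the distance from the center line and the direction to lie along the perpendicular between the object and the center line. The sign convention and scaling are assumptions, so the output is not intended to reproduce the figures of the example exactly.

```python
def influence_vector(obj_rel_pos, direction, relative_distance, radius):
    """Influence-degree vector of one object for a cylindrical influence space.

    An object inside the cylinder contributes a vector along the perpendicular
    between the object and the center line, with magnitude inversely
    proportional to its distance from the center line (assumed convention).
    """
    # Projection of the object onto the center line (inner product with the
    # normalized direction vector toward the observation target).
    proj = sum(p * d for p, d in zip(obj_rel_pos, direction))
    if proj < 0 or proj > relative_distance:
        return (0.0, 0.0, 0.0)          # outside the cylinder along its axis
    foot = tuple(proj * d for d in direction)
    perp = tuple(p - f for p, f in zip(obj_rel_pos, foot))
    dist = max(sum(c * c for c in perp) ** 0.5, 1e-6)
    if dist >= radius:
        return (0.0, 0.0, 0.0)          # outside the cylinder radially
    # Magnitude inversely proportional to the distance from the center line.
    return tuple(c / (dist * dist) for c in perp)

space = {"radius": 2.0, "relative_distance": 10.8,
         "direction": (0.98, 0.21, -0.04)}
objects = [(7.4, 0.2, 0.1), (3.1, -1.9, 1.1), (5.3, 2.4, -0.3)]
total = [0.0, 0.0, 0.0]
for obj in objects:
    v = influence_vector(obj, space["direction"],
                         space["relative_distance"], space["radius"])
    total = [t + c for t, c in zip(total, v)]
print(total)   # the object 32 contributes (0, 0, 0): it lies outside the cylinder
```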
  • FIG. 6 is a schematic diagram showing a specific example of the observation action plan.
  • FIG. 6A is a schematic diagram showing the path of the observation mobile body 1.
  • FIG. 6B is a schematic view showing the case where the observation target 2 is viewed from the position 41.
  • the observation moving body 1 moves according to the observation action plan generated based on the influence degree vector that each object has on the observation.
  • the following information is input.
  • Relative position of observation target 2: (10.5, 2.3, -0.4); influence degree vector: (0.03, 1.46, 0.24)
  • Target distance to observation target 2: 8.0
  • Maximum velocity of observation moving body 1: 0.5; P gain: 0.1
  • the action plan processing unit 122 executes the following calculation based on the above input information to generate an observation action plan.
  • the observation action plan is generated based on the velocity vector for keeping the distance from the observation target 2, the velocity vector for moving away from the object, and the velocity vector for reducing the degree of influence.
  • The velocity vector for moving away from the object 31, which is inversely proportional to the distance 7.4 between the observation moving body 1 and the object 31, is written as the following equation (Equation 8).
  • Similarly, the velocity vector for moving away from the object 32, (-0.2, 0.1, -0.1), and the velocity vector for moving away from the object 33, (-0.2, -0.1, 0), are calculated.
  • The velocity vectors for moving away from the object 31, the object 32, and the object 33 are added together, and the resulting velocity vector for moving away from the objects, multiplied by the P gain of 0.1, is written as the following equation (Equation 9).
  • The velocity vector for reducing the degree of influence is (0, 0.15, 0.02).
  • The velocity vector 43, which is the observation action plan output from the action plan processing unit 122, is (0.25, 0.21, 0).
  • The observation moving body 1 moves from the position 41 to the position 42 by switching from the advance action plan that follows the observation target 2 to the newly generated observation action plan, so that the degree of influence of each object on the observation can be reduced.
  • FIG. 6C is a schematic view showing the case where the observation target 2 is viewed from the position 42. As shown in FIG. 6C, when the observation moving body 1 moves to the position 42, each object does not exist in the influence space 20. That is, the observation action plan can be said to be an action plan for moving the observation moving body 1 until there are no more objects inside the influence space 20.
  • The radius of the influence space 20, the target distance to the observation target 2, the maximum speed of the observation moving body 1, the P gain, and the like are not limited and may be set arbitrarily. For example, they may be set by the user, or may be appropriately set according to the observation information of the sensor unit 3.
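  • The combination of the three velocity components described above (keeping the target distance, moving away from each object, and reducing the degree of influence) can be sketched as follows; the scaling by the P gain and the clamping to the maximum velocity follow the spirit of the example, but the exact formulas and the function name are assumptions.

```python
import math

def observation_action_plan(target_rel_pos, target_distance, influence_vec,
                            object_rel_positions, p_gain=0.1, max_speed=0.5):
    """Sketch of generating the observation action plan as a velocity vector."""
    def norm(v):
        return math.sqrt(sum(c * c for c in v))

    # 1) P control on the distance error toward/away from the observation target.
    dist = norm(target_rel_pos)
    direction = tuple(c / dist for c in target_rel_pos)
    keep = tuple(p_gain * (dist - target_distance) * d for d in direction)

    # 2) Repulsion from each object, inversely proportional to its distance.
    away = [0.0, 0.0, 0.0]
    for obj in object_rel_positions:
        d = norm(obj)
        away = [a - p_gain * (o / (d * d)) for a, o in zip(away, obj)]

    # 3) Velocity component that reduces the degree of influence.
    reduce_influence = tuple(p_gain * c for c in influence_vec)

    v = [k + a + r for k, a, r in zip(keep, away, reduce_influence)]
    speed = norm(v)
    if speed > max_speed:                      # clamp to the maximum velocity
        v = [c * max_speed / speed for c in v]
    return tuple(v)

# Values from the example in FIG. 6.
print(observation_action_plan((10.5, 2.3, -0.4), 8.0, (0.03, 1.46, 0.24),
                              [(7.4, 0.2, 0.1), (3.1, -1.9, 1.1), (5.3, 2.4, -0.3)]))
```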
  • FIG. 7 is a schematic diagram showing an observation GUI for inputting an instruction to the observation mobile body 1.
  • FIG. 7A is a schematic diagram showing how the observation target 2 is selected. As shown in FIG. 7A, the user can set the observation target 2 via the observation GUI45.
  • the observation GUI 45 includes a display unit 46, a screen transition instruction unit 47, a movement instruction unit 48, an object selection unit 49, a position designation unit 50, and a landing instruction unit 51.
  • the display unit 46 displays a moving image acquired from the camera (sensor unit) of the observation moving body 1. In FIG. 7A, a state in which two people are moving within the angle of view of the camera mounted on the observation moving body 1 is displayed.
  • the display mode displayed on the display unit 46 before each unit is selected will be referred to as a home screen.
  • the screen transition instruction unit 47 can change the display mode of the observation GUI 45. For example, when the movement instruction unit 48 is selected from the home screen, the display unit 46 shifts to the display mode of the movement instruction unit 48. When the screen transition instruction unit 47 is selected from this state, the display unit 46 transitions from the display mode of the movement instruction unit 48 to the home screen.
  • the movement instruction unit 48 can input an instruction regarding movement to the observation moving body 1. For example, it is possible to input various movement instructions such as forward, backward, turn, figure eight, ascending, and descending.
  • The target selection unit 49 can select any object or person displayed on the display unit 46 as the observation target. In the present embodiment, a broken line 52 is displayed so as to surround a person, and the observation target 2 can be set by the user touching the inside of the broken line 52. In FIG. 7A, the observation GUI is in a mode in which the observation target 2 can be set because the target selection unit 49 has been selected by the user.
  • the position designation unit 50 can select a predetermined position, a landmark, or the like. For example, it is possible to move with a predetermined position specified by the user as a target point.
  • the landing instruction unit 51 can input an instruction regarding landing to the observation mobile body 1. For example, an instruction to land the observation mobile body 1 at a predetermined position selected by the user is input.
  • the attitude, orientation, speed, and the like of the aircraft when the observation mobile body 1 lands may be input.
  • the observation GUI 45 is not limited, and for example, a mode for taking a picture using the camera of the observation moving body 1 or a mode for returning to a charging position designated for charging may be set.
  • FIG. 7B is a schematic diagram showing how the observation target 2 is followed.
  • the person 53 is set as an observation target by the user.
  • the target icon 54 indicating that the observation target 2 has been set is displayed on the display unit 46.
  • the observation GUI 45 includes a display unit 46, a reselection unit 55, a movement instruction unit 48, a follow-up instruction unit 56, a position designation unit 50, and a landing instruction unit 51.
  • the reselection unit 55 can switch the observation target 2. For example, when the reselection unit 55 is selected, it switches to the observation GUI 45 shown in FIG. 7A.
  • the follow-up instruction unit 56 can input an action plan related to follow-up to the observation moving body 1 for the selected observation target 2. For example, it is possible to input an action plan such as making the observation moving object 1 follow the observation target 2 at a distance of 3 m while performing a circular motion around the observation target 2.
  • FIG. 7C is a schematic diagram showing how the observation target 2 continues to follow.
  • the observation GUI 45 includes a display unit 46, a return unit 57, a movement instruction unit 48, a follow-up stop unit 58, a position designation unit 50, and a landing instruction unit 51.
  • The return unit 57 can move the observation moving body 1 to a predetermined position. For example, by selecting the return unit 57, it is possible to return the observation moving body 1 to the point where it took off or to the current position of the user. When the follow-up stop unit 58 is selected during following, information to the effect that the action plan for following is to be restricted can be input to the observation moving body 1.
  • FIG. 8 is a schematic diagram showing an observation GUI in which the influence space is displayed.
  • FIG. 8A is a schematic view showing how the influence space 60 set for the observation target 2 is displayed.
  • a predetermined color is shown in the area corresponding to the influence space 60 so that the influence space 60 can be identified by the user.
  • FIG. 8B is a schematic diagram showing the influence space and the influence on the observation. As shown in FIG. 8B, a part of the person 62 overlaps in the influence space 60.
  • the degree of influence on the observation by the person 62 in the influence space 60 is shown in a color different from that of the influence space 60 so that the user can identify it.
  • the region 64 where the broken line 63 surrounding the person 62 and the influence space 60 overlap is shown as the degree of influence.
  • the method of displaying the degree of influence is not limited, and for example, the area 64 may extend along the moving direction of the person 62. Further, for example, the shade of color may be set according to the magnitude of the degree of influence.
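As a non-limiting illustration of the shading idea just mentioned, the sketch below (Python; the value range and hue are hypothetical) maps a degree of influence to an overlay color whose opacity grows with the magnitude of the degree. The actual rendering of the region 64 on the observation GUI is not specified by the present disclosure.

```python
def influence_to_rgba(degree, max_degree=10.0):
    """Map a degree of influence to an RGBA overlay color; the shade (opacity)
    grows with the magnitude of the degree."""
    ratio = max(0.0, min(1.0, degree / max_degree))
    return (255, 64, 0, int(64 + 191 * ratio))   # fixed hue, opacity grows with the degree

if __name__ == "__main__":
    print(influence_to_rgba(2.5))   # lighter, more transparent overlay
    print(influence_to_rgba(9.0))   # darker, more opaque overlay
```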
  • as described above, in the observation moving body 1, an action plan for observing the observation target 2 is generated based on the degree of influence that the object 5 in the influence space 4, which is set for the observation target 2 and affects observation by the sensor unit 3, has on the observation. This makes it possible to demonstrate high stability in observation.
  • the influence space that affects the observation of the observation target by the sensor is specified.
  • the degree of influence that obstacles in the specified influence space have on the observation is estimated, and an action plan is generated so that the degree of influence is reduced.
  • since the action plan is generated so as to avoid situations in which the target is lost or the observation becomes only partial, the continuity of the follow-up is improved.
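The following is a minimal sketch, under simplifying assumptions (2D positions, a toy influence function, candidate headings sampled at 45-degree steps), of the planning idea described above: evaluate the degree of influence at the position reachable with each candidate velocity and pick the velocity that reduces it. The function names are hypothetical and do not appear in the embodiment.

```python
import math

def toy_influence_degree(position, obstacle=(5.0, 0.5)):
    """Toy stand-in for the influence degree: inversely proportional to the
    distance between a hypothetical obstacle and the evaluated position."""
    d = math.hypot(position[0] - obstacle[0], position[1] - obstacle[1])
    return 1.0 / max(d, 1e-3)

def plan_velocity(drone, influence_degree, speed=1.0, dt=0.5):
    """Evaluate a few candidate headings and return the velocity whose next
    position has the smallest influence degree, i.e. an action plan that
    reduces the degree of influence."""
    candidates = [(math.cos(k * math.pi / 4), math.sin(k * math.pi / 4))
                  for k in range(8)]
    best = min(candidates,
               key=lambda v: influence_degree((drone[0] + v[0] * speed * dt,
                                               drone[1] + v[1] * speed * dt)))
    return (best[0] * speed, best[1] * speed)

if __name__ == "__main__":
    print(plan_velocity((4.0, 0.0), toy_influence_degree))
```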
  • the shape of the influence space is set as a cylinder. In addition to this, various shapes may be set as the influence space.
  • FIG. 9 is a schematic view showing another shape of the influence space.
  • FIG. 9A is a schematic view showing the influence space of the cone.
  • the influence space 70 is specified by the influence space identification unit 151 based on the following information.
  • Starting point 71 based on the position of the observation moving body 1
  • Direction vector 72 from the observation moving body 1 to the observation target 2
  • Length from the observation moving body 1 to the observation target 2 (height of the cone 70)
  • Radius of the bottom face 73 or angle 75 of the generatrix 74
  • Based on this information, the conical influence space 70 can be uniquely determined, and the influence space can be specified.
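As one way to make the cone specification above concrete, the sketch below (Python with NumPy; names are hypothetical) tests whether a point lies inside a cone given the start point (apex) at the drone, the direction vector toward the target, the height, and the base radius. The angle of the generatrix could be used instead, since base_radius = height * tan(half_angle).

```python
import numpy as np

def inside_cone(point, apex, axis_dir, height, base_radius):
    """True if `point` lies inside a cone whose apex is the drone position,
    whose axis points toward the observation target, and whose height and
    base radius are given."""
    point, apex, axis_dir = (np.asarray(x, dtype=float) for x in (point, apex, axis_dir))
    axis = axis_dir / np.linalg.norm(axis_dir)
    v = point - apex
    along = float(np.dot(v, axis))                     # distance along the cone axis
    if along < 0.0 or along > height:
        return False
    radial = float(np.linalg.norm(v - along * axis))   # distance from the axis
    return radial <= base_radius * (along / height)    # the cone widens linearly toward the base

if __name__ == "__main__":
    apex = (0.0, 0.0, 1.5)        # drone position (start point)
    target = (6.0, 0.0, 1.5)      # observation target
    axis = np.subtract(target, apex)
    print(inside_cone((3.0, 0.4, 1.5), apex, axis, height=6.0, base_radius=1.5))
```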
  • FIG. 9B is a schematic view showing the influence space of the quadrangular pyramid.
  • the influence space 80 is specified by the influence space identification unit 151 based on the following information.
  • Starting point 81 based on the position of the observation moving body 1
  • Bottom vector 83 perpendicular to bottom 82
  • Direction vector 84 perpendicular to each side face
  • Boundary point 85 of each face
  • since the influence space 80 has four side faces, it is possible to specify the influence space 80 by determining four boundary points 85 and four direction vectors 84, one pair for each side face.
  • FIG. 9C is a schematic view showing the influence space of the quadrangular prism.
  • the influence space 90 is specified by the influence space identification unit 151 based on the following information.
  • Boundary point 91 of each boundary surface and direction vector 92 at that boundary point
  • for example, in FIG. 9C, since the influence space 90 has six faces, it is possible to specify the influence space 90 by determining six sets of a boundary point 91 and a direction vector 92, one set for each face. In FIG. 9C, the boundary points and direction vectors of only three of the boundary surfaces are shown for simplicity.
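The face-based specifications of FIG. 9B and FIG. 9C can both be read as half-space tests: a point is inside the influence space if it lies on the inner side of every bounding face, each face being given by a boundary point and a direction vector (taken here as the outward normal, which is an assumption made for this sketch).

```python
import numpy as np

def inside_polyhedral_space(point, faces):
    """`faces` is a list of (boundary_point, outward_direction_vector) pairs,
    one per bounding face, as in FIG. 9B and FIG. 9C. The point is inside the
    influence space if it is on the inner side of every face."""
    p = np.asarray(point, dtype=float)
    for boundary_point, direction in faces:
        n = np.asarray(direction, dtype=float)
        if np.dot(p - np.asarray(boundary_point, dtype=float), n) > 0.0:
            return False          # outside this face's half-space
    return True

if __name__ == "__main__":
    # Axis-aligned box as a six-face example (a stand-in for the six-face space of FIG. 9C)
    faces = [((1, 0, 0), (1, 0, 0)), ((-1, 0, 0), (-1, 0, 0)),
             ((0, 1, 0), (0, 1, 0)), ((0, -1, 0), (0, -1, 0)),
             ((0, 0, 1), (0, 0, 1)), ((0, 0, -1), (0, 0, -1))]
    print(inside_polyhedral_space((0.2, -0.5, 0.9), faces))   # True
    print(inside_polyhedral_space((1.5, 0.0, 0.0), faces))    # False
```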
  • the shape of the affected space may be changed as appropriate depending on the situation.
  • the influence space may be set to a cylinder when the distance between the observation moving body 1 and the observation target 2 is short, and the influence space may be set to a cone when the distance is long.
  • the size of the influence space may be changed so as to be proportional to the distance between the observation moving object 1 and the observation target 2.
  • the parameter for specifying the influence space (for example, the radius of the cylinder) is determined by the user.
  • the influence space may be set to a quadrangular pyramid according to the angle of view of the camera.
  • a quadrangular frustum, obtained by excluding the region within a predetermined distance from the start point of the quadrangular pyramid, may be set as the influence space.
  • the space between the ceiling and the floor may be set as the influence space.
  • the degree of influence is calculated as an amount inversely proportional to the distance between the object and the center line.
  • the degree of influence may be calculated by various methods. For example, with respect to the line segment L connecting the observation moving body 1 and the observation target 2, the space may be divided into four regions by two planes: a plane P spanned by the line segment L and the gravity vector, and a plane Q perpendicular to the plane P and including the line segment L.
  • in this case, the degree of influence may be calculated by referring to a degree of influence defined for the case where an object is present in each region. Further, for example, the calculated degree of influence may be weighted so as to be inversely proportional to the distance between the line segment L and the object.
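A hedged sketch of the region-based calculation just described, assuming that the plane P is spanned by the line segment L and the gravity vector, that L is not vertical, and that the per-region weights are arbitrary placeholders rather than values from the embodiment:

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -1.0])

def region_and_influence(p, m, t, weights=(1.0, 1.0, 0.5, 0.5), eps=1e-3):
    """Classify point p into one of the four regions around the segment L
    (from drone m to target t) cut by plane P (containing L and the gravity
    vector) and plane Q (containing L, perpendicular to P), then return the
    region index and a weighted inverse-distance influence degree."""
    p, m, t = (np.asarray(x, dtype=float) for x in (p, m, t))
    u = (t - m) / np.linalg.norm(t - m)        # direction of L
    n_p = np.cross(u, GRAVITY)                 # normal of plane P (assumes L is not vertical)
    n_p = n_p / np.linalg.norm(n_p)
    n_q = np.cross(u, n_p)                     # normal of plane Q (P and Q both contain L, P is perpendicular to Q)
    side_p = np.dot(p - m, n_p) >= 0.0
    side_q = np.dot(p - m, n_q) >= 0.0
    region = (2 if side_q else 0) + (1 if side_p else 0)   # 0..3
    d = np.linalg.norm(np.cross(p - m, u))     # distance from p to the center line through L
    return region, weights[region] / max(d, eps)

if __name__ == "__main__":
    print(region_and_influence(p=(5.0, 0.8, 1.2), m=(0.0, 0.0, 1.0), t=(10.0, 0.0, 1.0)))
```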
  • the observation action plan is generated by adding up the velocity vector for keeping the distance from the observation target 2, the velocity vector for moving away from the object, and the velocity vector for reducing the degree of influence.
  • the velocity vector of the observation action plan may be added to the velocity vector related to various controls such as the velocity vector for avoiding the collision with the object 5.
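A minimal sketch of the velocity-vector summation described above; the inputs are hypothetical, and the clipping to a maximum speed is an added assumption for illustration, not part of the embodiment.

```python
import numpy as np

def observation_velocity(v_keep_distance, v_avoid_object, v_reduce_influence,
                         extra_terms=(), max_speed=2.0):
    """Add up the velocity vector for keeping the distance to the target, the
    vector for moving away from the object, the vector for reducing the degree
    of influence, and any other control terms (e.g. collision avoidance), then
    clip the result to a maximum speed."""
    v = (np.asarray(v_keep_distance, dtype=float)
         + np.asarray(v_avoid_object, dtype=float)
         + np.asarray(v_reduce_influence, dtype=float))
    for term in extra_terms:
        v = v + np.asarray(term, dtype=float)
    speed = np.linalg.norm(v)
    if speed > max_speed:
        v = v * (max_speed / speed)
    return v

if __name__ == "__main__":
    print(observation_velocity([0.5, 0.0, 0.0], [0.0, 0.3, 0.0], [-0.2, 0.4, 0.1]))
```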
  • the degree of influence for each position of the observation moving body 1 may be calculated, and an action plan may be generated such that the integrated value of the degree of influence on the route planned to pass by the observation moving body 1 is minimized.
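One simple way to realize the route-based variant above is to sum the degree of influence at the positions each candidate route passes through and select the route with the smallest integrated value. The sketch below uses a toy influence field and hypothetical names.

```python
import math

def route_cost(route, influence_at):
    """Integrate (sum) the degree of influence over the positions on a planned route."""
    return sum(influence_at(p) for p in route)

def select_route(candidate_routes, influence_at):
    """Pick the candidate route whose integrated influence degree is smallest."""
    return min(candidate_routes, key=lambda r: route_cost(r, influence_at))

if __name__ == "__main__":
    influence_at = lambda p: 1.0 / (math.hypot(p[0] - 5.0, p[1]) + 0.1)  # toy field around an obstacle at (5, 0)
    routes = [[(float(x), 0.0) for x in range(10)],   # passes right next to the obstacle
              [(float(x), 3.0) for x in range(10)]]   # detours at y = 3
    best = select_route(routes, influence_at)
    print(best[0], "...", best[-1])
```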
  • the situation prediction unit 147 may predict the velocity vector (movement direction and velocity) of the object, and calculate the degree of influence based on the prediction result.
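A sketch of how a predicted velocity vector could enter the calculation, assuming a constant-velocity prediction over a short horizon and taking the worst-case influence along it; both of these are assumptions for illustration, not details given in the embodiment.

```python
def predicted_influence(obj_pos, obj_vel, influence_at, horizon=2.0, steps=4):
    """Propagate the object along its predicted velocity vector and take the
    worst-case influence degree over the prediction horizon."""
    worst = 0.0
    for k in range(steps + 1):
        t = horizon * k / steps
        p = (obj_pos[0] + obj_vel[0] * t, obj_pos[1] + obj_vel[1] * t)
        worst = max(worst, influence_at(p))
    return worst

if __name__ == "__main__":
    influence_at = lambda p: 1.0 / (abs(p[1]) + 0.1)   # toy field: high near the line of sight at y = 0
    print(predicted_influence(obj_pos=(2.0, 3.0), obj_vel=(0.0, -1.0), influence_at=influence_at))
```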
  • the degree of influence is calculated based on the position of the object in the influence space.
  • the degree of influence may be calculated based on the size and type of the object. For example, when the object is a circular road sign, the coordinates corresponding to the edge of the road sign may be set as the position of the object.
  • the degree of influence is not limited to the gradient vector, and a collision risk indicating the possibility of colliding with each obstacle, a shielding risk indicating the possibility of becoming unobservable, or the like may be treated as the degree of influence. Further, for example, the degree of influence may be calculated based on the position information of an object existing outside the influence space.
  • an action plan may be generated based on the moving path, speed, or the like of the moving object.
  • the communication unit 103 may perform mutual communication with other moving bodies, acquisition of the action plans of other moving bodies, and the like.
  • the observation mobile body 1 made an autonomous movement according to the action plan.
  • not limited to this, an observation action plan may be generated in response to a user operation (for example, determination of the moving direction or speed).
  • in this case, the user's operation may be restricted so that the observation moving body 1 moves according to the observation action plan.
  • the operation of moving the observation moving body 1 in the right direction may be restricted, and the observation moving body 1 may be moved in the upward or left direction.
  • an observation action plan is generated based on the calculated degree of influence regardless of whether or not the observation of the observation target 2 can be continued.
  • the observation of the observation target 2 may be prioritized.
  • for example, depending on the route of the action plan given in advance, an observation action plan that reduces the degree of influence more slowly may be adopted instead of the observation action plan that reduces the degree of influence earliest.
  • the degree of influence is calculated from the object in the influence space.
  • the degree of influence may be calculated from various factors such as wind, rain, surrounding animals and light sources.
  • FIG. 10 is a block diagram showing a hardware configuration example of the autonomous movement control unit 110.
  • the autonomous movement control unit 110 includes a CPU 201, a ROM 202, a RAM 203, an input / output interface 205, and a bus 204 that connects them to each other.
  • a display unit 206, an input unit 207, a storage unit 208, a communication unit 209, a drive unit 210, and the like are connected to the input / output interface 205.
  • the display unit 206 is a display device using, for example, a liquid crystal or an EL.
  • the input unit 207 is, for example, a keyboard, a pointing device, a touch panel, or other operating device. When the input unit 207 includes a touch panel, the touch panel can be integrated with the display unit 206.
  • the storage unit 208 is a non-volatile storage device, for example, an HDD, a flash memory, or other solid-state memory.
  • the drive unit 210 is a device capable of driving a removable recording medium 211 such as an optical recording medium or a magnetic recording tape.
  • the communication unit 209 is a modem, router, or other communication device for communicating with another device that can be connected to a LAN, WAN, or the like.
  • the communication unit 209 may communicate using either wired or wireless.
  • the communication unit 209 is often used separately from the autonomous movement control unit 110. In the present embodiment, the communication unit 209 enables communication with other devices via the network.
  • Information processing by the autonomous movement control unit 110 having the hardware configuration as described above is realized by the cooperation between the software stored in the storage unit 208 or the ROM 202 or the like and the hardware resources of the autonomous movement control unit 110.
  • the information processing method according to the present technology is realized by loading and executing the program constituting the software stored in the ROM 202 or the like into the RAM 203.
  • the program is installed in the autonomous movement control unit 110 via, for example, the recording medium 211.
  • the program may be installed in the autonomous mobile control unit 110 via a global network or the like.
  • as the recording medium, any non-transitory storage medium that can be read by a computer may be used.
  • by linking a computer mounted on a communication terminal with another computer capable of communicating via a network or the like, the moving body, information processing device, information processing method, and program according to the present technology may be executed, and the information processing device according to the present technology may be constructed.
  • the mobile body, information processing device, information processing method, and program according to the present technology can be executed not only in a computer system composed of a single computer but also in a computer system in which a plurality of computers operate in conjunction with each other.
  • the system means a set of a plurality of components (devices, modules (parts), etc.), and it does not matter whether or not all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and one device in which a plurality of modules are housed in one housing are both systems.
  • execution of the moving body, information processing device, information processing method, and program according to the present technology by a computer system includes, for example, both the case where the identification of the influence space, the calculation of the degree of influence, the generation of the action plan, and the like are performed by a single computer and the case where each process is performed by a different computer. Further, the execution of each process by a predetermined computer includes causing another computer to execute a part or all of the process and acquiring the result.
  • the mobile body, information processing device, information processing method, and program related to the present technology can be applied to a cloud computing configuration in which one function is shared and jointly processed by a plurality of devices via a network. It is possible.
  • the effects described in this disclosure are merely examples and are not limited, and other effects may be obtained.
  • the description of the plurality of effects described above does not necessarily mean that those effects are exerted at the same time. It means that at least one of the above-mentioned effects can be obtained depending on the conditions and the like, and of course, there is a possibility that an effect not described in the present disclosure may be exhibited.
  • the present technology can also adopt the following configurations.
  • A moving body in which the space specifying unit specifies the influence space based on observation information related to the observation of the sensor unit.
  • A moving body in which the observation information includes at least one of the angle of view of the sensor unit or the observable distance of the sensor unit.
  • A moving body in which the space specifying unit specifies the influence space based on a line segment connecting the moving body and the observation target.
  • (6) The moving body according to any one of (1) to (5), further including a calculation unit that calculates the degree of influence of the object based on at least one of the shape, size, position, or velocity of the observation target.
  • (7) The moving body according to (6), in which the calculation unit calculates the degree of influence based on the position information of the object in the influence space.
  • (8) The moving body according to any one of (1) to (7), in which the action plan generation unit generates the action plan for reducing the degree of influence.
  • A moving body in which the action plan generation unit, with the action plan for observing the observation target as the observation action plan, integrates the observation action plan into a predetermined action plan for observing the observation target.
  • A moving body in which the action plan generation unit regulates the action plan for observing the observation target given in advance and executes the observation action plan.
  • A moving body in which, when a plurality of objects exist in the influence space, the action plan generation unit adds up the respective degrees of influence and generates the action plan.
  • A moving body in which the action plan generation unit generates the action plan based on a predetermined action of the observation target predicted by the action prediction unit.
  • A moving body in which the action plan generation unit generates the action plan based on the predetermined action when the observation target moves out of the space observable by the sensor unit.
  • An information processing device in which the GUI output unit outputs a GUI on which the degree of influence is displayed in an identifiable manner.
  • An information processing method executed by a computer system. (20) A program that causes a computer system to execute a step of generating an action plan for observing the observation target based on the degree of influence that an object in the influence space has on the observation.

Abstract

In order to achieve the above purpose, a moving body according to an embodiment of the present technology includes a sensor unit and an action plan generation unit. The sensor unit can observe an object to be observed. The action plan generation unit generates an action plan for observing the object to be observed on the basis of the degree of influence that an object in an influence space, which is set for the object to be observed and affects observation by the sensor unit, has on the observation. Accordingly, high stability in the observation can be exhibited.

Description

Moving body, information processing device, information processing method, and program
This technology relates to moving bodies, information processing devices, information processing methods, and programs applicable to autonomous movement and the like.
The autonomous mobile robot described in Patent Document 1 predicts the destinations of the target and the obstacle from the captured image. From the prediction result, it is determined whether the target is shielded by the obstacle. Based on the determination result, the field of view is changed so that the area of the target that enters the field of view of the captured image is large. It is thereby disclosed that the target is not lost (paragraphs [0024] [0026] and FIG. 1 of Patent Document 1, and the like).
Japanese Unexamined Patent Publication No. 2018-147337
Regarding autonomous movement for observing such a target, there is a need for technology that can demonstrate high stability in observation.
In view of the above circumstances, the purpose of this technology is to provide moving bodies, information processing devices, information processing methods, and programs that can exhibit high stability in observation.
In order to achieve the above object, the moving body according to one embodiment of the present technology includes a sensor unit and an action plan generation unit.
The sensor unit can observe the observation target.
The action plan generation unit generates an action plan for observing the observation target based on the degree of influence that an object in the influence space set for the observation target has on the observation by the sensor unit.
In this moving object, an action plan for observing the observation target is generated based on the degree of influence that the object in the influence space that affects the observation by the sensor unit set for the observation target has on the observation. This makes it possible to demonstrate high stability in observation.
The moving body may be an unmanned flying body.
The moving body may further include a space specifying unit that specifies the affected space based on the position information of the observation target.
The space identification unit may specify the influence space based on the observation information regarding the observation of the sensor unit.
The observation information may include at least one of the angle of view of the sensor unit or the observable distance of the sensor unit. In this case, the space specifying unit may specify the affected space based on a line segment connecting the moving body and the observation target.
The moving body may further include a calculation unit that calculates the degree of influence of the object based on at least one of the shape, size, position, or speed of the observation target.
The calculation unit may calculate the degree of influence based on the position information of the object in the influence space.
The moving body may move based on the action plan generated by the action plan generation unit.
The moving body may move based on the action plan generated by the action plan generation unit. In this case, the action plan generation unit may generate the action plan for reducing the degree of influence.
The action plan generation unit may integrate the observation action plan into the action plan for observing the observation target given in advance, using the action plan for observing the observation target as the observation action plan.
The action plan generation unit may regulate the action plan for observing the observation target given in advance and execute the observation action plan.
When a plurality of objects exist in the influence space, the action plan generation unit may generate the action plan by adding up the influence degrees of each.
The moving body may further include a behavior prediction unit that predicts at least one behavior of the observation target or the object based on the sensing result acquired by the sensor unit.
The action plan generation unit may generate the action plan based on a predetermined action of the observation target predicted by the action prediction unit.
The action plan generation unit may generate the action plan based on the predetermined action when the observation target moves out of the space observable by the sensor unit.
The information processing device according to one form of the present technology includes an action plan generation unit.
The action plan generation unit generates an action plan for observing the observation target based on the degree of influence on the observation by an object in the influence space that affects the observation by the sensor unit set for the observation target.
The information processing device may further include a GUI output unit that outputs a GUI (Graphical User Interface) on which the influence space is displayed in an identifiable manner.
The GUI output unit may output a GUI on which the degree of influence is displayed in an identifiable manner.
The information processing method according to one form of the present technology is an information processing method executed by a computer system, and includes generating an action plan for observing the observation target based on the degree of influence that an object in the influence space, which is set for the observation target and affects observation by the sensor unit, has on the observation.
A program according to a form of the present technology causes a computer system to perform the following steps.
A step of generating an action plan for observing the observation target based on the degree of influence on the observation by an object in the influence space that affects the observation by the sensor unit set for the observation target.
FIG. 1 is a schematic diagram for explaining an outline of the observation moving body.
FIG. 2 is a block diagram showing a configuration example of schematic functions of a mobile body control system that controls the observation moving body.
FIG. 3 is a flowchart showing control of the identification of the influence space and the calculation of the degree of influence.
FIG. 4 is a schematic diagram for explaining a specific example of a method of specifying the influence space.
FIG. 5 is a schematic diagram for explaining a specific example of a method of calculating the degree of influence.
FIG. 6 is a schematic diagram showing a specific example of an observation action plan.
FIG. 7 is a schematic diagram showing an observation GUI for inputting instructions to the observation moving body.
FIG. 8 is a schematic diagram showing an observation GUI on which the influence space is displayed.
FIG. 9 is a schematic diagram showing other shapes of the influence space.
FIG. 10 is a block diagram showing a hardware configuration example of the autonomous movement control unit.
Hereinafter, embodiments relating to the present technology will be described with reference to the drawings.
[Observation mobile]
FIG. 1 is a schematic diagram for explaining an outline of an observation mobile body according to the present technology. FIG. 1A is a schematic diagram showing how the observation moving body 1 follows the observation target 2. FIG. 1B is a schematic view showing a state in which the observation target 2 is viewed from the observation moving body 1.
In the present embodiment, the observation moving body 1 can generate an action plan for observing the observation target 2 based on the degree of influence that the object 5 in the influence space 4, which affects the observation by the sensor unit 3, has on the observation. The influence space 4 is a space set for the observation target 2 observed by the sensor unit 3.
The observation mobile body 1 is a drone capable of autonomous flight.
In the present embodiment, the observation mobile body 1 has a sensor unit 3 capable of observing the observation target 2. For example, the sensor unit 3 includes an imaging device such as a stereo camera, a digital camera, or a monocular camera. In addition to this, sensor devices such as laser distance measuring sensors, contact sensors, ultrasonic sensors, LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), and sonar may be used.
Further, in the present embodiment, the observation moving body 1 can maintain the state in which the observation target 2 is observed by the sensor unit 3 and can continuously track the observation target 2.
The observation moving body 1 is not limited to the drone, and may be, for example, a wheel-type robot, a multi-legged walking type robot, or a robot having legs having an articulated structure.
The observation target 2 and the object 5 are not limited, and any object is set as the observation target 2. For example, other moving objects such as drones, people, structures, roads, traffic lights, traffic signs, road markings, etc. are included. Hereinafter, the object 5 existing in the influence space 4 may be described as an obstacle.
The influence space 4 is set for the observation target 2 as a space that affects the observation by the sensor unit 3.
Observation typically includes acquiring information about the observation target 2 from various sensor devices included in the sensor unit 3 of the observation mobile body 1. For example, the image information of the observation target 2 captured by the camera, or the voice data such as the voice and footsteps of the observation target 2 detected by the microphone are included in the observation result. Further, the state in which these information (data) can be acquired is the state in which the observation target 2 can be observed.
In the present disclosure, as an observable state, a state in which the observation target 2 can be contained within the angle of view of the camera mounted on the observation moving body 1 is taken as an example. For example, a state in which the observation target 2 is obscured by the object 5 and the observation target 2 is lost is a state in which observation is impossible.
Therefore, in the present embodiment, the influence space 4 is a space in which the presence of the object 5 or the like makes the observation of the observation target 2 impossible, or may make the observation impossible. The degree of influence of the object 5 on the observation is the degree to which the object 5 hinders the observation. For example, the larger the area (volume) of the influence space 4 occupied by the object 5, the larger the degree of influence on the observation. Of course, the degree of influence on the observation may change depending on other parameters such as the position of the observation moving body 1.
In the present embodiment, a cylindrical influence space 4 is set as an example of the influence space 4. The shape of the influence space 4 is not limited, and an arbitrary shape may be set according to the observation information regarding the observation of the sensor unit 3.
In the present embodiment, the observation information includes at least one of observable distances such as an angle of view (viewing angle), an observable minimum distance, and a maximum distance.
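As a minimal illustration of the cylindrical influence space, the sketch below (Python with NumPy; parameter names are hypothetical) treats the segment connecting the observation moving body 1 and the observation target 2 as the cylinder axis and tests whether a point lies within a given radius of it. This is only one way such a space could be parameterized.

```python
import numpy as np

def inside_cylindrical_influence_space(point, drone_pos, target_pos, radius):
    """True if `point` lies inside a cylinder whose axis is the segment from
    the observation moving body to the observation target."""
    p, a, b = (np.asarray(x, dtype=float) for x in (point, drone_pos, target_pos))
    axis = b - a
    length = np.linalg.norm(axis)
    if length == 0.0:
        return False                      # degenerate case: drone and target coincide
    u = axis / length
    along = float(np.dot(p - a, u))       # position along the drone-target segment
    if along < 0.0 or along > length:
        return False
    radial = float(np.linalg.norm((p - a) - along * u))   # distance from the axis
    return radial <= radius

if __name__ == "__main__":
    print(inside_cylindrical_influence_space((3.0, 0.5, 1.0), (0.0, 0.0, 1.0), (6.0, 0.0, 1.0), radius=1.0))
```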
The action plan is various information that controls the observation mobile body 1. For example, the speed of the observation moving body 1, the path (trajectory) in which the observation moving body 1 moves, the waypoint (position) through which the observation moving body 1 passes, the posture of the observation moving body 1, and the like are included in the action plan.
In the present embodiment, the moving direction and speed of the observed moving body 1 are generated as an action plan. The moving direction and speed of the observed moving body 1 at each time can be said to be the path of the observed moving body 1.
In FIG. 1A, the observation moving body 1 observes the observation target 2 according to an action plan that follows the observation target 2. As shown in FIGS. 1A and 1B, when the observation target 2 is moving toward a position where it is shielded by the object 5, the observation moving body 1 can observe the observation target 2 according to an action plan generated based on the degree of influence that the object 5 has on the observation.
For example, the observation moving body 1 moves the observation target 2 to the observable position 8 according to the action plan 7 that moves around the object 5. After the movement to the position 8 is completed, the observation moving body 1 continues following according to the action plan for following the observation target 2 given in advance.
That is, the observation mobile body 1 can execute the action plan without losing sight of the observation target 2.
[Configuration example of mobile control system]
A mobile control system that controls the observation mobile 1 to realize the above-mentioned functions will be described.
FIG. 2 is a block diagram showing a configuration example of a schematic function of the mobile body control system 100 that controls the observation mobile body 1 of the present disclosure.
The mobile control system 100 includes an input unit 101, a data acquisition unit 102, a communication unit 103, a mobile internal device 104, an output control unit 105, an output unit 106, a drive system control unit 107, a drive system system 108, a storage unit 109, and an autonomous movement control unit 110. The input unit 101, the data acquisition unit 102, the communication unit 103, the output control unit 105, the drive system control unit 107, the storage unit 109, and the autonomous movement control unit 110 are connected to each other via the communication network 111. The communication network 111 is, for example, a communication network or bus conforming to an arbitrary standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network) such as IEEE802.3, or FlexRay (registered trademark), or a proprietary communication method that is not standardized. In addition, each part of the mobile control system 100 may be directly connected without going through the communication network 111.
Hereinafter, when each part of the mobile control system 100 communicates via the communication network 111, the description of the communication network 111 shall be omitted. For example, when the input unit 101 and the autonomous movement control unit 110 communicate with each other via the communication network 111, it is described simply that the input unit 101 and the autonomous movement control unit 110 communicate with each other.
The input unit 101 includes a device used for inputting various data, instructions, and the like to the observation mobile body 1. For example, the input unit 101 includes an operation device such as a touch panel, a button, a microphone, a switch, and a lever, and an operation device capable of inputting by a method other than manual operation by voice or gesture. Further, for example, the input unit 101 may be a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile device or a wearable device that supports the operation of the mobile control system 100.
The input unit 101 generates an input signal based on data, instructions, and the like input by an operator (hereinafter referred to as a user) who gives an action plan to the observation moving body 1, and supplies the input signal to each part of the moving body control system 100.
The data acquisition unit 102 includes various sensors and the like for acquiring data used for processing of the mobile control system 100, and supplies the acquired data to each unit of the mobile control system 100.
For example, the data acquisition unit 102 constitutes the sensor group 112 by including various sensors for detecting the state of the observation moving body 1 and the like, and corresponds to the sensor unit 3 in FIG. 1. Specifically, for example, the data acquisition unit 102 includes a gyro sensor, an acceleration sensor, an inertial measurement unit (IMU), and sensors for detecting the operation amount of an acceleration input such as an accelerator, the operation amount of a deceleration input, the operation amount of a direction instruction input, the rotation speed, input/output energy, and fuel amount of a drive device such as an engine or a motor, the torque of an engine or a motor, or the rotation speed and torque of wheels and joints.
Further, for example, the data acquisition unit 102 includes various sensors for detecting information outside the observation moving body 1 such as the observation target 2 and the object 5. Specifically, for example, the data acquisition unit 102 includes an imaging device such as a ToF (Time of Flight) camera, a stereo camera, a monocular camera, an infrared camera, a polarized camera, and other cameras. Further, for example, the data acquisition unit 102 includes an environment sensor for detecting the weather or the weather, and a surrounding information detection sensor for detecting an object around the observation moving body 1. The environmental sensor includes, for example, a raindrop sensor, a fog sensor, a sunshine sensor, a snow sensor, and the like. Ambient information detection sensors include, for example, laser distance measuring sensors, ultrasonic sensors, radars, LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), sonar, and the like.
Further, for example, the data acquisition unit 102 includes various sensors for detecting the current position of the observation moving body 1. Specifically, for example, the data acquisition unit 102 includes a GNSS receiver or the like that receives a GNSS signal from a GNSS satellite.
The communication unit 103 communicates with the mobile internal device 104 and with various devices, servers, base stations, and the like outside the observation mobile body 1 such as other drones, transmits data supplied from each unit of the mobile control system 100, and supplies the received data to each unit of the mobile control system 100. The communication protocol supported by the communication unit 103 is not particularly limited, and the communication unit 103 may support a plurality of types of communication protocols.
For example, the communication unit 103 wirelessly communicates with the mobile internal device 104 by wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), WUSB (Wireless USB), or the like. Further, for example, the communication unit 103 performs wired communication with the mobile internal device 104 by USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), MHL (Mobile High-definition Link), or the like via a connection terminal (and a cable if necessary) (not shown).
Further, for example, the communication unit 103 communicates with a device (for example, an application server or a control server) existing on an external network (for example, the Internet, a cloud network, or an operator-specific network) via a base station or an access point. Further, for example, the communication unit 103 uses P2P (Peer To Peer) technology to communicate with a terminal (for example, a pedestrian or store terminal, or an MTC (Machine Type Communication) terminal) existing in the vicinity of the observation mobile body 1.
The mobile internal device 104 includes, for example, a mobile device or wearable device owned by the user, an information device carried in or attached to the observation mobile body 1, a navigation device for searching a route to an arbitrary destination, and the like.
The output control unit 105 controls the output of various types of information to the user or to the outside of the observation mobile body 1. For example, the output control unit 105 generates an output signal including at least one of visual information (for example, image data) and auditory information (for example, audio data) and supplies it to the output unit 106, thereby controlling the output of visual information and auditory information from the output unit 106. Specifically, for example, the output control unit 105 synthesizes image data captured by different imaging devices of the data acquisition unit 102 to generate a bird's-eye view image, a panoramic image, or the like, and supplies an output signal including the generated image to the output unit 106. Further, for example, the output control unit 105 generates voice data including a warning sound or a warning message for dangers such as collision, contact, and entry into a danger zone, and supplies an output signal including the generated voice data to the output unit 106.
The output unit 106 includes a device capable of outputting visual information or auditory information to the user or to the outside of the observation mobile body 1. For example, the output unit 106 includes a display device, an instrument panel, an audio speaker, headphones, a wearable device such as a spectacle-type display worn by the user, a projector, a lamp, and the like. The display device included in the output unit 106 may be, in addition to a device having a normal display, a device that displays visual information within the user's field of view, such as a head-up display, a transmissive display, or a device having an AR (Augmented Reality) display function. Since the output control unit 105 and the output unit 106 are not indispensable for the processing of autonomous movement, they may be omitted if necessary.
The drive system control unit 107 controls the drive system system 108 by generating various control signals and supplying them to the drive system system 108. Further, the drive system control unit 107 supplies control signals to each unit other than the drive system system 108 as necessary, and notifies the control state of the drive system system 108.
The drive system system 108 includes various devices related to the drive system of the observation mobile body 1. For example, the drive system system 108 includes servomotors capable of specifying the angle and torque provided at each joint of the four legs, a motion controller that decomposes and replaces the movement of the robot itself into the movements of the four legs, and a feedback control device using sensors in each motor and sensors on the soles of the feet.
In another example, the drive system system 108 includes motors having four or six propellers facing upward from the airframe, and a motion controller that decomposes and replaces the movement of the robot itself into the amount of rotation of each motor.
Further, in another example, the drive system system 108 includes a driving force generating device for generating a driving force, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle, a braking device for generating a braking force, an ABS (Antilock Brake System), an ESC (Electronic Stability Control), an electric power steering device, and the like.
The storage unit 109 includes, for example, a magnetic storage device such as a ROM (Read Only Memory), a RAM (Random Access Memory), or an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, and the like. The storage unit 109 stores various programs, data, and the like used by each unit of the mobile control system 100. For example, the storage unit 109 stores map data such as a three-dimensional high-precision map such as a dynamic map, a global map that is less accurate than the high-precision map and covers a wide area, and a local map that includes information around the observation moving body 1.
The autonomous movement control unit 110 performs control related to autonomous movement such as automatic driving or driving support. Specifically, for example, the autonomous movement control unit 110 performs cooperative control for the purpose of realizing functions such as collision avoidance or impact mitigation of the observation moving body 1, follow-up movement based on the distance between moving bodies, speed-maintaining movement, or a collision warning of the observation moving body 1. Further, for example, the autonomous movement control unit 110 performs cooperative control for the purpose of autonomous movement that moves autonomously without depending on the operation of the user.
The autonomous movement control unit 110 corresponds to the information processing device according to the present embodiment, and has hardware necessary for a computer such as a CPU, RAM, and ROM. The information processing method according to the present technology is executed by the CPU loading the program according to the present technology recorded in the ROM in advance into the RAM and executing the program.
The specific configuration of the autonomous movement control unit 110 is not limited, and for example, a PLD (Programmable Logic Device) such as an FPGA (Field Programmable Gate Array) or another device such as an ASIC (Application Specific Integrated Circuit) may be used.
As shown in FIG. 2, the autonomous movement control unit 110 includes a detection unit 131, a self-position estimation unit 132, a situation analysis unit 133, a planning unit 134, and an operation control unit 135. Of these, the detection unit 131, the self-position estimation unit 132, and the situation analysis unit 133 constitute the recognition processing unit 121. In addition, the planning unit 134 constitutes an action plan processing unit 122. Further, the motion control unit 135 constitutes the behavior control processing unit 123. Furthermore, the influence space identification unit 151, the influence degree calculation unit 152, and the GUI (Graphical User Interface) output unit 153 constitute the influence space processing unit 124.
The detection unit 131 detects various types of information necessary for controlling autonomous movement. The detection unit 131 includes a mobile body external information detection unit 141 and a mobile body state detection unit 142.
 移動体外部情報検出部141は、移動体制御システム100の各部からのデータ又は信号に基づいて、観測移動体1の外部の情報の検出処理を行う。例えば、移動体外部情報検出部141は、観測移動体1の周囲の観測対象2や物体5の検出処理、認識処理、及び追跡処理、並びに、観測対象2や物体5までの距離の検出処理を行う。
 また、例えば、移動体外部情報検出部141は、移動体の周囲の環境の検出処理を行う。検出対象となる周囲の環境には、例えば、天候、気温、湿度、明るさ、及び路面の状態等が含まれる。移動体外部情報検出部141は、検出処理の結果を示すデータを自己位置推定部132、状況分析部133のマップ解析部145、及び状況認識部146、並びに、動作制御部135等に供給する。
The mobile external information detection unit 141 performs detection processing of information outside the observation mobile body 1 based on data or signals from each unit of the mobile control system 100. For example, the moving body external information detection unit 141 performs detection processing, recognition processing, and tracking processing of the observation target 2 and the object 5 around the observation moving body 1, and detection processing of the distance to the observation target 2 and the object 5.
Further, for example, the mobile body external information detection unit 141 performs detection processing of the environment around the mobile body. The surrounding environment to be detected includes, for example, weather, temperature, humidity, brightness, road surface condition, and the like. The mobile external information detection unit 141 supplies data indicating the result of the detection process to the self-position estimation unit 132, the map analysis unit 145 of the situation analysis unit 133, the situation recognition unit 146, the operation control unit 135, and the like.
The mobile body state detection unit 142 performs a state detection process of the observation mobile body 1 based on data or signals from each unit of the mobile body control system 100. The state of the observation moving body 1 to be detected includes, for example, speed, acceleration, steering angle, presence / absence and content of abnormality, state of driving operation, state of other moving body-mounted equipment, and the like. The mobile body state detection unit 142 supplies data indicating the result of the detection process to the situation awareness unit 146 of the situation analysis unit 133, the operation control unit 135, and the like.
The self-position estimation unit 132 performs estimation processing of the position, posture, and the like of the observation moving body 1 based on data or signals from each unit of the mobile control system 100, such as the mobile external information detection unit 141 and the situation recognition unit 146 of the situation analysis unit 133. In addition, the self-position estimation unit 132 generates a local map used for self-position estimation (hereinafter referred to as a self-position estimation map) as necessary. The self-position estimation map is, for example, a highly accurate map using a technique such as SLAM (Simultaneous Localization and Mapping). The self-position estimation unit 132 supplies data indicating the result of the estimation processing to the map analysis unit 145 of the situation analysis unit 133, the situation recognition unit 146, and the like. Further, the self-position estimation unit 132 stores the self-position estimation map in the storage unit 109.
Further, the self-position estimation unit 132 accumulates in a database the time-series information supplied in time series based on the detection results supplied from the sensor group 112, estimates the self-position based on the accumulated time-series information, and outputs it as the time-series-information self-position. The self-position estimation unit 132 also estimates the self-position based on the current detection result supplied from the sensor group 112 and outputs it as the current-information self-position. Then, the self-position estimation unit 132 outputs a self-position estimation result by integrating or switching between the time-series-information self-position and the current-information self-position. Further, the self-position estimation unit 132 detects the posture of the observation moving body 1 based on the detection results supplied from the sensor group 112, and when a change in posture is detected, the self-position changes significantly, and the estimation accuracy of the time-series-information self-position is considered to decrease, the self-position is estimated only from the current-information self-position. Further, for example, when the observation moving body 1 is mounted on another moving body and moves, the self-position changes significantly even if no change in the posture of the observation moving body 1 is detected based on the detection results supplied from the sensor group 112, so the estimation accuracy of the time-series-information self-position is considered to decrease, and the self-position is estimated only from the current-information self-position.
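A toy sketch of the integration/switching behavior described above, with a fixed blending weight and a boolean posture-change flag standing in for the actual detection logic; both are assumptions for illustration only and do not reflect the internal method of the self-position estimation unit 132.

```python
def fuse_self_position(ts_estimate, current_estimate, posture_changed, ts_weight=0.7):
    """Blend the time-series self-position estimate with the current-observation
    estimate; when a large posture change is detected the time-series estimate
    is treated as unreliable and only the current estimate is used."""
    if posture_changed:
        return current_estimate
    return tuple(ts_weight * a + (1.0 - ts_weight) * b
                 for a, b in zip(ts_estimate, current_estimate))

if __name__ == "__main__":
    print(fuse_self_position((1.0, 2.0, 0.5), (1.2, 1.9, 0.5), posture_changed=False))
    print(fuse_self_position((1.0, 2.0, 0.5), (1.2, 1.9, 0.5), posture_changed=True))
```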
The situation analysis unit 133 analyzes the situation of the observation moving body 1 and its surroundings. The situation analysis unit 133 includes a map analysis unit 145, a situation recognition unit 146, and a situation prediction unit 147.
The map analysis unit 145 analyzes the various maps stored in the storage unit 109 while using, as necessary, data or signals from each unit of the moving body control system 100, such as the self-position estimation unit 132 and the moving body external information detection unit 141, and constructs a map containing the information necessary for autonomous movement processing. The map analysis unit 145 supplies the constructed map to the situation recognition unit 146, the situation prediction unit 147, and the route planning unit 161, the action planning unit 162, and the operation planning unit 163 of the planning unit 134, and the like.
The situation recognition unit 146 recognizes the situation regarding the observation moving body 1 on the basis of data or signals from each unit of the moving body control system 100, such as the self-position estimation unit 132, the moving body external information detection unit 141, the moving body state detection unit 142, and the map analysis unit 145. For example, the situation recognition unit 146 recognizes the situation of the observation moving body 1 and the situation around the observation moving body 1. The situation recognition unit 146 also generates, as necessary, a local map used for recognizing the situation around the observation moving body 1 (hereinafter referred to as a situation recognition map). The situation recognition map is, for example, an occupancy grid map, a lane map, or a point cloud map.
The situation of the observation moving body 1 to be recognized includes, for example, the position, attitude, and movement (for example, speed, acceleration, moving direction, and the like) of the observation moving body 1, as well as the presence or absence and details of an abnormality. The situation around the observation moving body 1 to be recognized includes, for example, the types, positions, and movements (for example, speed, acceleration, moving direction, and the like) of surrounding objects such as the observation target 2 and the object 5. It also includes, for example, the configuration of the surrounding roads and the condition of the road surface, as well as the surrounding weather, temperature, humidity, brightness, and the like.
The situation recognition unit 146 supplies data indicating the result of the recognition processing (including the situation recognition map, as necessary) to the self-position estimation unit 132, the situation prediction unit 147, and the like. The situation recognition unit 146 also stores the situation recognition map in the storage unit 109.
The situation recognition unit 146 supplies data indicating the position of the object 5 to the influence space specification unit 151, and supplies data indicating the position of the observation target 2 to the influence degree calculation unit 152. For example, the position information of the observation target 2 and the object 5 relative to the position of the observation moving body 1 is supplied.
As the position information, coordinate values (for example, XYZ coordinate values) defined by an absolute coordinate system (world coordinate system) may be used. Alternatively, coordinate values (for example, xyz coordinate values or uvd coordinate values) defined by a relative coordinate system with a predetermined point as the reference (origin) may be used. When a relative coordinate system is used, the reference origin may be set arbitrarily.
The situation prediction unit 147 performs prediction processing regarding the observation moving body 1 on the basis of data or signals from each unit of the moving body control system 100, such as the map analysis unit 145 and the situation recognition unit 146. For example, the situation prediction unit 147 predicts the situation of the observation moving body 1, the situation around the observation moving body 1, and the like.
The situation of the observation moving body 1 to be predicted includes, for example, the behavior of the observation moving body 1, the occurrence of an abnormality, the movable distance, and the like. The situation around the moving body to be predicted includes, for example, the behavior of moving objects around the observation moving body 1, changes in traffic signal states, and changes in the environment such as the weather.
The situation prediction unit 147 supplies data indicating the result of the prediction processing, together with the data from the situation recognition unit 146, to the route planning unit 161, the action planning unit 162, the operation planning unit 163, and the like of the planning unit 134.
The influence space specification unit 151 specifies the influence space 4 set for the observation target 2. In the present embodiment, the influence space 4 is specified on the basis of the position information of the observation target 2 input from the situation recognition unit 146. A specific method for specifying the influence space 4 will be described with reference to FIG. 4.
The influence space specification unit 151 also determines whether or not the object 5 exists within the influence space 4. A specific determination method will be described with reference to FIG. 4.
The influence space specification unit 151 supplies data indicating the result of the specification processing to the influence degree calculation unit 152 and the GUI output unit 153.
The influence degree calculation unit 152 calculates the degree of influence on the basis of the possibility that the object 5 existing in the influence space 4 obstructs observation of the observation target 2. The possibility of obstructing observation is evaluated from various pieces of information regarding the object 5, such as its shape, size, position, and velocity.
In the present embodiment, the degree of influence is calculated on the basis of the position information of the object 5 input from the situation recognition unit 146 and the shape and position of the influence space 4 input from the influence space specification unit 151. For example, the degree of influence is calculated on the basis of the position of the object 5 within the influence space 4 and the traveling direction of the observation moving body 1. A specific method for calculating the degree of influence will be described with reference to FIG. 5.
The influence degree calculation unit 152 supplies data indicating the result of the calculation processing to the GUI output unit 153, the route planning unit 161, the action planning unit 162, and the operation planning unit 163.
The GUI output unit 153 outputs a GUI in which the influence space 4 is displayed in an identifiable manner. For example, the user can input, to the observation moving body 1, an action plan for following the observation target 2 via the GUI displayed on the output unit 106. At that time, the user can identify the shape and position of the influence space 4 via the GUI. The user can also identify, via the GUI, the degree of influence of the object 5 existing in the influence space 4.
The route planning unit 161 plans a route to the destination on the basis of data or signals from each unit of the moving body control system 100, such as the map analysis unit 145, the situation prediction unit 147, and the influence degree calculation unit 152. For example, the route planning unit 161 sets a route from the current position to the designated destination on the basis of the global map. The route planning unit 161 supplies data indicating the planned route to the action planning unit 162 and the like.
The action planning unit 162 plans actions of the observation moving body 1 for safely traveling the route planned by the route planning unit 161 within the planned time, on the basis of data or signals from each unit of the moving body control system 100, such as the map analysis unit 145, the situation prediction unit 147, and the influence degree calculation unit 152. For example, the action planning unit 162 plans starting, stopping, the traveling direction (for example, forward, backward, turning, and the like), the moving speed, and the like. The action planning unit 162 supplies data indicating the planned actions of the observation moving body 1 to the operation planning unit 163 and the like.
More specifically, the action planning unit 162 generates, for each of the routes planned by the route planning unit 161, candidates for an action plan of the observation moving body 1 for moving safely within the planned time. For example, the action planning unit 162 generates action plan candidates by the A* algorithm (A-star search algorithm), which divides the environment into a grid and optimizes reachability determination and path weights to generate the best path, or by the RRT (Rapidly-exploring Random Tree) algorithm, which incrementally extends a path from the self-position toward reachable locations while appropriately pruning it.
The operation planning unit 163 plans operations of the observation moving body 1 for realizing the actions planned by the action planning unit 162, on the basis of data or signals from each unit of the moving body control system 100, such as the map analysis unit 145, the situation prediction unit 147, and the influence degree calculation unit 152. For example, the operation planning unit 163 plans acceleration, deceleration, rotation speed, and the like. The operation planning unit 163 supplies data indicating the planned operations of the moving body to the operation control unit 135 and the like.
The action plan and the operation plan also include flight patterns of the observation moving body 1. That is, trajectories, speeds, and the like defined as patterns, such as turning and figure-eight flight, are also included in the action plan and the operation plan. For example, for a flight pattern such as a turn or a figure eight, the speed, curvature, and the like of the observation moving body 1 when performing the turn or the figure eight are planned as the action plan and the operation plan.
Parameters such as speed and attitude associated with a flight pattern may be set by default. That is, how the moving body moves along a predetermined flight pattern may be set in advance by default.
The operation control unit 135 controls the operation of the observation moving body 1.
More specifically, the operation control unit 135 detects emergencies such as a collision, contact, entry into a danger zone, and an abnormality of the observation moving body 1 on the basis of the detection results of the moving body external information detection unit 141 and the moving body state detection unit 142. When the occurrence of an emergency is detected, the operation control unit 135 plans an operation of the observation moving body 1 for avoiding the emergency, such as a sudden stop or a sharp turn.
The operation control unit 135 also performs acceleration/deceleration control for realizing the operation of the observation moving body 1 planned by the operation planning unit 163. For example, the operation control unit 135 calculates a control target value of the driving force generation device or the braking device for realizing the planned acceleration, deceleration, or sudden stop, and supplies a control command indicating the calculated control target value to the drive system control unit 107.
The operation control unit 135 performs direction control for realizing the operation of the observation moving body 1 planned by the operation planning unit 163. For example, the operation control unit 135 calculates a control target value of the steering mechanism for realizing the moving trajectory or sharp turn planned by the operation planning unit 163, and supplies a control command indicating the calculated control target value to the drive system control unit 107.
In the present embodiment, the sensor group 112 (sensor unit 3) corresponds to the sensor unit capable of observing the observation target.
In the present embodiment, the influence space processing unit 124 and the planning unit 134 function as the action plan generation unit that generates an action plan for observing the observation target on the basis of the degree of influence that an object within the influence space, which is set for the observation target and affects observation by the sensor unit, has on the observation.
In the present embodiment, the situation prediction unit 147 corresponds to the behavior prediction unit that predicts the behavior of at least one of the observation target or the object on the basis of the sensing result acquired by the sensor unit.
In the present embodiment, the influence space specification unit 151 corresponds to the space specification unit that specifies the influence space on the basis of the position information of the observation target.
In the present embodiment, the influence degree calculation unit 152 corresponds to the calculation unit that calculates the degree of influence on the basis of at least one of the shape, size, position, or velocity of the object.
In the present embodiment, the GUI output unit 153 corresponds to the GUI output unit that outputs a GUI in which the influence space is displayed in an identifiable manner.
FIG. 3 is a flowchart showing the control of specifying the influence space and calculating the degree of influence.
The observation target 2 is detected on the basis of the sensing result acquired by the sensor unit 3 (step 101). In the present embodiment, step 101 is executed when the user inputs, to the observation moving body 1, an instruction to track the observation target 2.
The situation recognition unit 146 estimates the position of the detected observation target 2 (step 102). In the present embodiment, the relative position of the observation target 2 with respect to the observation moving body 1 is estimated.
The influence space specification unit 151 specifies the influence space on the basis of the estimated position of the observation target 2 (step 103).
The object 5 is detected on the basis of the sensing result acquired by the sensor unit 3 (step 104).
The situation recognition unit 146 estimates the position of the detected object 5 (step 105).
The influence space specification unit 151 determines whether an obstacle (object 5) exists within the influence space 4 (step 106).
When there is an obstacle in the influence space 4 (YES in step 106), the influence degree calculation unit 152 calculates the degree of influence that the obstacle has on the observation (step 107).
The action planning unit 162 generates an action plan on the basis of the calculated degree of influence (step 108). For example, an action plan that reduces the calculated degree of influence is generated.
In the present embodiment, the action plan for observing the observation target 2 generated in step 108 is referred to as an observation action plan. That is, the observation action plan is an action plan for controlling the observation moving body 1 so that observation is not obstructed by obstacles and observation of the observation target 2 can be continued. In the present embodiment, an action plan given in advance for observing (tracking) the observation target 2 is referred to as a prior action plan.
For example, the prior action plan includes an action plan of performing a circular motion around the observation target 2 while keeping a predetermined distance from the observation target 2.
The operation control unit 135 controls the observation moving body 1 on the basis of the planned action plan (step 109).
In the present embodiment, when an obstacle exists in the influence space 4, the observation action plan is integrated with the prior action plan. That is, in addition to the prior action plan, control is executed so that the obstacle moves out of the influence space 4.
For example, suppose the prior action plan is an action plan of moving around to the right of the observation target 2, and an obstacle is present in the influence space on the movement path of that action plan. In this case, the observation moving body 1 may move upward while going around to the right of the observation target 2 so that the observation is not obstructed by the obstacle.
The present technology is not limited to this; when an obstacle exists in the influence space, control may be switched from the prior action plan to the observation action plan.
For example, suppose the prior action plan is an action plan of following the observation target 2 by moving straight ahead, and an obstacle is present on the right side of the influence space. In this case, the observation moving body 1 may move to the left, the direction away from the obstacle, instead of going straight.
When, as a result of the observation action plan, no obstacle remains in the influence space, the observation moving body 1 is controlled on the basis of the prior action plan.
When there is no obstacle in the influence space 4 (NO in step 106), the observation moving body 1 is controlled on the basis of the prior action plan.
That is, in the present embodiment, the action plan is generated on the basis of the degree of influence that an obstacle in the influence space 4 has on the observation, regardless of whether or not observation can be continued. Further, when the observation moving body 1 has moved on the basis of the action plan and no obstacle remains in the influence space 4, the observation moving body 1 is controlled on the basis of the prior action plan.
FIG. 4 is a schematic diagram for explaining a specific example of the method of specifying the influence space.
As shown in FIG. 4, in the present embodiment, the shape of the influence space 20 is a cylinder.
In the present embodiment, the following information is input (units are omitted).
Coordinates of the relative position 21 of the observation target 2 with the observation moving body 1 as the origin: (10.5, 2.3, -0.4)
Radius of the influence space 20: 2.0
The influence space specification unit 151 executes the following calculation on the basis of the above input information to specify the influence space.
From the relative distance to the observation target 2 (the length of the center line 22 connecting the observation moving body 1 and the observation target 2) and the direction toward the observation target 2, the normalized direction vector 23 toward the observation target 2 is calculated.
Relative distance: 10.8
Direction vector 23: (0.98, 0.21, -0.04)
In the present embodiment, whether the point 24 (x, y, z) exists in the influence space 20 is determined from the following conditions.
The inner product of the direction vector 23 and the vector 25 to the point 24 (x, y, z) is greater than 0 and smaller than the relative distance. This condition is expressed as the following equation (Equation 1).
0 < (x, y, z) · (0.98, 0.21, -0.04) < 10.8   (Equation 1)
The distance between the center line 22 and the point 24 (x, y, z) (the magnitude of the vector 26) is smaller than the radius of the cylinder, where
Vector 26 = (x, y, z) - ((x, y, z) · (0.98, 0.21, -0.04)) (0.98, 0.21, -0.04).
This condition is expressed as the following equation (Equation 2).
|Vector 26| < 2.0   (Equation 2)
By the above calculation, it is determined whether the point (x, y, z) at each position exists in the influence space 20.
The influence space specification unit 151 outputs the radius 2.0 of the influence space 20, the relative distance 10.8, and the direction vector 23 to the influence degree calculation unit 152.
Note that "smaller than" may include "equal to or smaller than" and "less than". Similarly, "greater than" may include "equal to or greater than".
FIG. 5 is a schematic diagram for explaining a specific example of the method of calculating the degree of influence.
In the present embodiment, the coordinates of three objects, the radius of the influence space 20, the relative distance, and the direction vector 23 are input.
Relative position of the object 31: (7.4, 0.2, 0.1)
Relative position of the object 32: (3.1, -1.9, 1.1)
Relative position of the object 33: (5.3, 2.4, -0.3)
In the present embodiment, the influence degree calculation unit 152 obtains the magnitude of the degree of influence as an amount inversely proportional to the distance between each object and the center line 22. The direction of the degree of influence is obtained as a vector along the perpendicular drawn from each object to the center line 22.
The influence degree calculation unit 152 executes the following calculation on the basis of the above input information to calculate the influence degree vector.
It is determined whether the coordinates of each object are included in the influence space 20. For simplicity, only the calculation for the object 31 is described.
The inner product of the direction vector 35 toward the object 31 and the direction vector 23 is expressed as the following equation (Equation 3).
(7.4, 0.2, 0.1) · (0.98, 0.21, -0.04) = 7.3   (Equation 3)
The vector 36 from the center line 22 to the object 31 is expressed as the following equation (Equation 4).
Vector 36 = (7.4, 0.2, 0.1) - 7.3 × (0.98, 0.21, -0.04)   (Equation 4)
Since the inner product of the direction vector 35 toward the object 31 and the direction vector 23 is 7.3, it is greater than 0 and smaller than the relative distance of 10.8. Further, since the magnitude of the vector 36 obtained from Equation 4 is 1.4, it is smaller than the radius 2.0 of the cylinder. As a result, it is determined that the object 31 exists in the influence space 20.
The influence degree vector 37 of the object 31 existing in the influence space 20 (a vector directed toward the center line 22) is expressed as the following equation (Equation 5).
Influence degree vector 37 = (unit vector from the object 31 toward the center line 22) / 1.4   (Equation 5)
In the same manner as the above calculation, the influence degree vectors of the object 32 and the object 33 are calculated.
The object 32 is not included in the influence space 20 because its distance from the center line 22 is 2.8. As a result, the influence degree vector of the object 32 is (0, 0, 0).
The object 33 is included in the influence space 20 because the inner product of the direction vector toward the object 33 and the direction vector 23 is 5.7 and its distance from the center line 22 is 1.2. As a result, the influence degree vector 38 of the object 33 is (0.18, 0.8, 0.06).
By adding up the influence degree vectors of the objects, the total influence degree vector output from the influence degree calculation unit 152 is written as the following equation (Equation 6).
Influence degree vector 37 + (0, 0, 0) + (0.18, 0.8, 0.06) = (0.03, 1.46, 0.24)   (Equation 6)
By the above calculation, the influence degree vector that each object gives to the observation is calculated.
The influence degree calculation unit 152 supplies the total influence degree vector to the GUI output unit 153, the route planning unit 161, the action planning unit 162, and the operation planning unit 163.
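Building on the membership test above, the per-object influence degree calculation of FIG. 5 can be sketched as follows. The direction-toward-the-center-line and inverse-distance weighting follow the description above; the function name is an illustrative assumption, and because the patent's equation images (Equations 5 and 6) are not reproduced here, the printed result may differ slightly from the figures' rounded values.

import numpy as np

# Sketch of the influence-degree calculation of FIG. 5.
# Magnitude: inversely proportional to the distance between the object and the center line 22.
# Direction: along the perpendicular from the object toward the center line 22.

target = np.array([10.5, 2.3, -0.4])
radius = 2.0
relative_distance = np.linalg.norm(target)
direction = target / relative_distance      # direction vector 23

def influence_vector(obj):
    p = np.asarray(obj, dtype=float)
    along = np.dot(p, direction)
    if not (0.0 < along < relative_distance):
        return np.zeros(3)                  # outside the influence space along the axis
    offset = p - along * direction          # from the center line 22 to the object
    dist = np.linalg.norm(offset)
    if dist >= radius:
        return np.zeros(3)                  # outside the influence space radially
    return -offset / (dist * dist)          # toward the center line, magnitude 1/dist

objects = [np.array([7.4, 0.2, 0.1]),       # object 31 (inside, approx. 1.4 from the center line)
           np.array([3.1, -1.9, 1.1]),      # object 32 (outside -> zero vector)
           np.array([5.3, 2.4, -0.3])]      # object 33 (inside, approx. 1.2 from the center line)

total_influence = sum(influence_vector(o) for o in objects)
print(total_influence)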
FIG. 6 is a schematic diagram showing a specific example of the observation action plan. FIG. 6A is a schematic diagram showing the path of the observation moving body 1. FIG. 6B is a schematic diagram showing the view of the observation target 2 from the position 41.
As shown in FIG. 6A, the observation moving body 1 moves in accordance with the observation action plan generated on the basis of the influence degree vectors that the objects give to the observation.
In the present embodiment, the following information is input.
Relative position of the observation target 2: (10.5, 2.3, -0.4)
Influence degree vector: (0.03, 1.46, 0.24)
Relative position of the object 31: (7.4, 0.2, 0.1)
Relative position of the object 32: (3.1, -1.9, 1.1)
Relative position of the object 33: (5.3, 2.4, -0.3)
Target distance to the observation target 2: 8.0
Maximum speed of the observation moving body 1: 0.5
P gain: 0.1
As shown in FIG. 6B, the object 31 and the object 33 exist in the influence space 20. In this case, the action plan processing unit 122 executes the following calculation on the basis of the above input information to generate the observation action plan. In the present embodiment, the observation action plan is generated on the basis of a velocity vector for keeping the distance to the observation target 2, a velocity vector for moving away from the objects, and a velocity vector for reducing the degree of influence.
Since the distance between the observation moving body 1 and the observation target 2 is 10.8 and the target distance to the observation target 2 is 8.0, the distance error is 2.8. Since the P gain is 0.1, the velocity vector for the observation moving body 1 to keep the distance to the observation target 2, which compensates for this distance error, is written as the following equation (Equation 7).
Velocity vector for keeping the distance = 0.1 × 2.8 × (0.98, 0.21, -0.04)   (Equation 7)
The velocity vector for moving away from the object 31, which is inversely proportional to the distance 7.4 between the observation moving body 1 and the object 31, is written as the following equation (Equation 8).
Velocity vector for moving away from the object 31 = -(7.4, 0.2, 0.1) / 7.4²   (Equation 8)
By the same calculation method, the velocity vector for moving away from the object 32, (-0.2, 0.1, -0.1), and the velocity vector for moving away from the object 33, (-0.2, -0.1, 0), are calculated.
The velocity vector for moving away from the object 31, the velocity vector for moving away from the object 32, and the velocity vector for moving away from the object 33 are added together and multiplied by the P gain of 0.1 to obtain the velocity vector for moving away from the objects, written as the following equation (Equation 9).
Velocity vector for moving away from the objects = 0.1 × (sum of the velocity vectors for moving away from the objects 31, 32, and 33)   (Equation 9)
By multiplying the influence degree vector (0.03, 1.46, 0.24) calculated by the influence degree calculation unit 152 by the P gain of 0.1, the velocity vector for reducing the degree of influence becomes (0, 0.15, 0.02).
When the velocity vector for keeping the distance to the observation target 2, the velocity vector for moving away from the objects, and the velocity vector for reducing the degree of influence are added together, the velocity vector 43 serving as the observation action plan becomes (0.25, 0.21, 0).
Further, since the magnitude 0.33 of the velocity vector 43 is smaller than the maximum speed 0.5 of the observation moving body 1, the velocity vector 43 of the observation action plan output from the action plan processing unit 122 is (0.25, 0.21, 0).
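The composition of the observation action plan in FIG. 6 (distance-keeping, object-avoidance, and influence-reduction velocity terms, followed by a maximum-speed clamp) can be sketched as follows. The variable names and the use of numpy are assumptions, and the rounded values in the figures are only approximately reproduced.

import numpy as np

# Sketch of the velocity composition used for the observation action plan of FIG. 6.

target = np.array([10.5, 2.3, -0.4])
influence = np.array([0.03, 1.46, 0.24])    # total influence degree vector
objects = [np.array([7.4, 0.2, 0.1]),
           np.array([3.1, -1.9, 1.1]),
           np.array([5.3, 2.4, -0.3])]
target_distance = 8.0
max_speed = 0.5
p_gain = 0.1

distance = np.linalg.norm(target)
direction = target / distance

# Velocity for keeping the target distance (Equation 7): P control on the distance error.
v_keep = p_gain * (distance - target_distance) * direction

# Velocity for moving away from each object, inversely proportional to its distance (Equations 8, 9).
v_away = p_gain * sum(-(o / np.linalg.norm(o)) / np.linalg.norm(o) for o in objects)

# Velocity for reducing the degree of influence.
v_influence = p_gain * influence

velocity = v_keep + v_away + v_influence
if np.linalg.norm(velocity) > max_speed:    # clamp to the maximum speed 0.5
    velocity = velocity / np.linalg.norm(velocity) * max_speed
print(velocity)                             # roughly (0.22, 0.21, 0.0)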
As shown in FIG. 6A, the observation moving body 1 moves from the position 41 to the position 42 in accordance with the newly generated observation action plan, rather than only the prior action plan of following the observation target 2, and can thereby reduce the degree of influence that each object gives to the observation.
FIG. 6C is a schematic diagram showing the view of the observation target 2 from the position 42.
As shown in FIG. 6C, when the observation moving body 1 has moved to the position 42, none of the objects exists in the influence space 20. That is, the observation action plan can also be regarded as an action plan for moving the observation moving body 1 until no object remains inside the influence space 20.
Note that the radius of the influence space 20, the target distance to the observation target 2, the maximum speed of the observation moving body 1, the P gain, and the like are not limited and may be set arbitrarily. For example, they may be set by the user, or may be set appropriately in accordance with the observation information of the sensor unit 3.
FIG. 7 is a schematic diagram showing an observation GUI for inputting instructions to the observation moving body 1.
FIG. 7A is a schematic diagram showing how the observation target 2 is selected.
As shown in FIG. 7A, the user can set the observation target 2 via the observation GUI 45.
The observation GUI 45 includes a display unit 46, a screen transition instruction unit 47, a movement instruction unit 48, a target selection unit 49, a position designation unit 50, and a landing instruction unit 51.
The display unit 46 displays a moving image acquired from the camera (sensor unit) of the observation moving body 1. In FIG. 7A, two persons moving within the angle of view of the camera mounted on the observation moving body 1 are displayed. In the present embodiment, when the user selects the movement instruction unit 48, the target selection unit 49, the position designation unit 50, or the landing instruction unit 51, a dedicated display mode in which instructions related to that unit can be input is displayed on the display unit 46. Hereinafter, the display mode displayed on the display unit 46 before any unit is selected is referred to as the home screen.
The screen transition instruction unit 47 can switch the display mode of the observation GUI 45. For example, when the movement instruction unit 48 is selected from the home screen, the display unit 46 transitions to the display mode of the movement instruction unit 48. When the screen transition instruction unit 47 is selected in this state, the display unit 46 transitions from the display mode of the movement instruction unit 48 back to the home screen. That is, by selecting the screen transition instruction unit 47, it is possible to transition from the current display mode to the previous or next display mode. Of course, it is also possible to transition to the display mode selected first, the display mode selected last, and the like.
The movement instruction unit 48 can input movement-related instructions to the observation moving body 1. For example, various movement instructions such as forward, backward, turning, figure-eight flight, ascending, and descending can be input.
The target selection unit 49 can select any object or person displayed on the display unit 46 as the observation target. In the present embodiment, a broken line 52 is displayed so as to surround a person, and the observation target 2 can be set by the user touching the inside of the broken line 52. In FIG. 7A, the observation GUI is in a mode in which the observation target 2 can be set because the target selection unit 49 has been selected by the user.
The position designation unit 50 can select a predetermined position, a landmark, or the like. For example, the observation moving body 1 can move with a predetermined position designated by the user as the target point.
The landing instruction unit 51 can input landing-related instructions to the observation moving body 1. For example, an instruction to land the observation moving body 1 at a predetermined position selected by the user is input. In addition, the attitude, orientation, speed, and the like of the aircraft when the observation moving body 1 lands may be input.
Note that the observation GUI 45 is not limited to this; for example, a mode for taking pictures using the camera of the observation moving body 1, a mode for returning to a charging position designated for charging, and the like may be set.
FIG. 7B is a schematic diagram showing how the observation target 2 is followed.
As shown in FIG. 7B, the person 53 has been set as the observation target by the user. In this case, a target icon 54 indicating that the person has been set as the observation target 2 is displayed on the display unit 46.
In FIG. 7B, the observation GUI 45 includes the display unit 46, a reselection unit 55, the movement instruction unit 48, a follow-up instruction unit 56, the position designation unit 50, and the landing instruction unit 51.
The reselection unit 55 can switch the observation target 2. For example, when the reselection unit 55 is selected, the GUI switches to the observation GUI 45 shown in FIG. 7A.
The follow-up instruction unit 56 can input, to the observation moving body 1, an action plan related to following the selected observation target 2. For example, it is possible to input an action plan such as causing the observation moving body 1 to follow the observation target 2 at a distance of 3 m while performing a circular motion around the observation target 2.
FIG. 7C is a schematic diagram showing how following of the observation target 2 is continued.
In FIG. 7C, the observation GUI 45 includes the display unit 46, a return unit 57, the movement instruction unit 48, a follow-up stop unit 58, the position designation unit 50, and the landing instruction unit 51.
The return unit 57 can move the observation moving body 1 to a predetermined position. For example, by selecting the return unit 57, the observation moving body 1 can return to the point where it took off or to the user's current position.
The follow-up stop unit 58, when selected during following, can input, to the observation moving body 1, information to the effect that the action plan for following is to be restricted.
FIG. 8 is a schematic diagram showing an observation GUI in which the influence space is displayed.
FIG. 8A is a schematic diagram showing how the influence space 60 set for the observation target 2 is displayed. In the present embodiment, the area corresponding to the influence space 60 is shown in a predetermined color so that the user can identify the influence space 60.
Note that, for simplicity of description, only the display unit 46 of the observation GUI is illustrated in FIG. 8. As shown in FIG. 8A, the circular influence space 60 is displayed centering on the person 61 set as the observation target 2.
FIG. 8B is a schematic diagram showing the influence space and the influence on the observation.
As shown in FIG. 8B, a part of the person 62 overlaps the influence space 60. That is, this part of the person 62 becomes an obstacle that affects the observation.
In FIG. 8B, the degree of influence that the person 62 in the influence space 60 has on the observation is shown in a color different from that of the influence space 60 so that the user can identify it. In the present embodiment, a region 64 where the broken line 63 surrounding the person 62 and the influence space 60 overlap is illustrated as the degree of influence.
Note that the method of displaying the degree of influence is not limited; for example, the region 64 may extend along the moving direction of the person 62. Further, for example, the shade of the color may be set in accordance with the magnitude of the degree of influence.
As described above, in the observation moving body 1 according to the present embodiment, an action plan for observing the observation target 2 is generated on the basis of the degree of influence that the object 5 within the influence space 4, which is set for the observation target 2 and affects observation by the sensor unit 3, has on the observation. This makes it possible to exhibit high stability in observation.
Conventionally, in autonomous movement that requires a target to be continuously captured by a sensor, such as following a moving target, the target could be lost when it went around behind a shield such as a wall or a pillar. In addition, since prediction of the target's movement has a large error, the target may be lost and following may become impossible to continue.
Therefore, in the present technology, the influence space that affects observation of the target by the sensor is specified. The degree of influence that an obstacle within the specified influence space has on the observation is estimated, and an action plan that reduces that degree of influence is generated. Thus, when the target goes around behind a shield, the moving body can also go around and keep tracking it. In addition, since the action plan is generated so as to avoid situations in which the target is lost or the observation becomes partial, the continuity of following is improved.
<Other Embodiments>
The present technology is not limited to the embodiments described above, and various other embodiments can be realized.
In the above embodiment, the shape of the influence space is set as a cylinder. Besides this, various other shapes may be set as the influence space.
FIG. 9 is a schematic diagram showing other shapes of the influence space.
FIG. 9A is a schematic diagram showing a conical influence space.
As shown in FIG. 9A, when the influence space 70 is a cone, the influence space specification unit 151 specifies the influence space 70 on the basis of the following information.
The starting point 71 based on the position of the observation moving body 1
The direction vector 72 from the observation moving body 1 to the observation target 2
The length from the observation moving body 1 to the observation target 2 (the height of the cone 70)
The radius of the bottom surface 73 or the angle 75 of the generatrix 74
On the basis of this information, the conical influence space 70 can be uniquely determined, and the influence space can be specified.
FIG. 9B is a schematic diagram showing a quadrangular pyramid influence space.
As shown in FIG. 9B, when the influence space 80 is a quadrangular pyramid, the influence space specification unit 151 specifies the influence space 80 on the basis of the following information.
The starting point 81 based on the position of the observation moving body 1
The bottom surface vector 83 perpendicular to the bottom surface 82
The direction vector 84 perpendicular to each side surface
The boundary point 85 of each surface
For example, in FIG. 9B, since the influence space 80 has four side surfaces, the influence space 80 can be specified by determining the four boundary points 85 and the four direction vectors 84 of the side surfaces.
FIG. 9C is a schematic diagram showing a quadrangular frustum influence space.
As shown in FIG. 9C, when the influence space 90 is a quadrangular frustum, the influence space specification unit 151 specifies the influence space 90 on the basis of the following information.
The direction vector 92 at the boundary point 91 of each boundary surface
For example, in FIG. 9C, since the influence space 90 has six surfaces, the influence space 90 can be specified by determining six pairs of boundary points 91 and direction vectors 92, one for each surface. Note that, in FIG. 9C, the boundary points and direction vectors of only three boundary surfaces are illustrated for simplification.
Note that the shape of the influence space may be changed as appropriate depending on the situation. For example, the influence space may be set as a cylinder when the distance between the observation moving body 1 and the observation target 2 is short, and as a cone when the distance is long.
Further, the size of the influence space may be changed so as to be proportional to the distance between the observation moving body 1 and the observation target 2.
In the above embodiment, the parameter for specifying the influence space (for example, the radius of the cylinder) is determined by the user. The present technology is not limited to this; for example, the influence space may be set as a quadrangular pyramid matching the angle of view of the camera. Further, for example, since the observation moving body 1 itself is captured in the image within a predetermined distance, the influence space may be set as a quadrangular frustum obtained by excluding the region within the predetermined distance from the starting point of the quadrangular pyramid. Further, for example, the space between the ceiling and the floor may be set as the influence space.
In the above embodiment, the degree of influence is calculated as an amount inversely proportional to the distance between the object and the center line. The present technology is not limited to this, and the degree of influence may be calculated by various other methods.
For example, with respect to the line segment L connecting the observation moving body 1 and the observation target 2, the space may be divided into four regions by two planes: a plane P spanned by the line segment L and the gravity vector, and a plane Q perpendicular to the plane P and containing the line segment L. In this case, the degree of influence may be calculated by referring to a degree of influence defined for the case where an object is present in each region, as in the sketch below.
Further, for example, the degree of influence calculated on the basis of these per-region degrees may be modified so as to be inversely proportional to the distance between the line segment L and the object.
In the above embodiment, the observation action plan is generated by adding together the velocity vector for keeping the distance to the observation target 2, the velocity vector for moving away from the objects, and the velocity vector for reducing the degree of influence. The present technology is not limited to this, and velocity vectors related to various other controls, such as a velocity vector for avoiding a collision with the object 5, may be added to the velocity vector of the observation action plan.
Further, for example, the degree of influence may be calculated for each position of the observation moving body 1, and an action plan may be generated such that the integrated value of the degree of influence along the route that the observation moving body 1 is scheduled to pass is minimized. Specifically, a large number of trajectories that can be taken over 10 seconds from each time may be generated, and for each trajectory, the influence space and the degree of influence at each time may be calculated on the basis of the predicted positions of the objects and summed over the 10 seconds. The trajectory with the minimum summed degree of influence among all of these trajectories may then be adopted and generated as the action plan.
In the above embodiment, it is assumed that the object does not move from a predetermined position. The present technology is not limited to this; the situation prediction unit 147 may predict the velocity vector (moving direction and speed) of the object and the like, and the degree of influence may be calculated on the basis of the prediction result.
In the above embodiment, the degree of influence is calculated on the basis of the position of the object within the influence space. The present technology is not limited to this, and the degree of influence may be calculated on the basis of the size, type, and the like of the object. For example, when the object is a circular road sign, the coordinates corresponding to the edge of the road sign may be set as the position of the object. Further, the degree of influence is not limited to a gradient vector; a collision risk indicating the possibility of colliding with each obstacle, a shielding risk indicating the possibility of the observation becoming impossible, or the like may be treated as the degree of influence.
Further, the degree of influence may be calculated on the basis of, for example, the position information of an object existing outside the influence space. For example, when a moving body other than the observation moving body 1, such as another drone, approaches, an action plan may be generated on the basis of the movement path, speed, and the like of that moving body. At that time, the communication unit 103 may perform mutual communication with that moving body, acquisition of the action plan of that moving body, and the like.
In the above embodiment, the observation moving body 1 moves autonomously in accordance with the action plan. The present technology is not limited to this, and an observation action plan may be generated in response to a user operation (for example, determination of a movement direction or a speed). For example, when an observation action plan is generated, the user operation is restricted and the observation moving body 1 moves in accordance with the observation action plan. Further, for example, when there is a shielding object to the right, the operation by which the user moves the observation moving body 1 to the right may be restricted, and the observation moving body 1 may be moved upward or to the left instead.
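Restricting user operations could be implemented, for example, by filtering the commanded velocity before it is executed, as in the sketch below: if following the command would push the degree of influence past a threshold, the component of the command directed toward the object is removed, so that upward or lateral motion remains possible. The threshold, the time step, and the influence function passed in (for instance `combined_influence` from the previous sketch) are assumptions for illustration.

```python
import numpy as np

def filter_user_command(user_vel, body_pos, obj_pos, target_pos,
                        influence_fn, threshold=0.5, dt=0.1):
    """Restrict a user velocity command when it would raise the degree of
    influence above the threshold: strip the velocity component directed
    toward the object, leaving the remaining (e.g. upward or lateral) motion."""
    next_pos = body_pos + user_vel * dt
    if influence_fn(next_pos, obj_pos, target_pos) <= threshold:
        return user_vel                              # command accepted as-is
    toward = obj_pos - body_pos
    toward = toward / (np.linalg.norm(toward) + 1e-9)
    toward_component = float(np.dot(user_vel, toward))
    if toward_component > 0.0:
        user_vel = user_vel - toward_component * toward
    return user_vel

# Usage (with combined_influence from the previous sketch):
# safe_vel = filter_user_command(np.array([1.0, 0.0, 0.0]), body_pos,
#                                obj_pos, target_pos, combined_influence)
```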
In the above embodiment, the observation action plan is generated based on the calculated degree of influence regardless of whether or not the observation of the observation target 2 can be continued. The present technology is not limited to this, and the observation of the observation target 2 may be prioritized. For example, an observation action plan that reduces the degree of influence to a lesser extent than the plan that reduces it fastest, but that conforms to the route of the prior action plan, may be adopted.
Further, in the above embodiment, the degree of influence is calculated from the object in the influence space. The present technology is not limited to this, and the degree of influence may be calculated from various factors such as wind, rain, surrounding moving objects, and light sources.
FIG. 10 is a block diagram showing a hardware configuration example of the autonomous movement control unit 110.
The autonomous movement control unit 110 includes a CPU 201, a ROM 202, a RAM 203, an input/output interface 205, and a bus 204 that connects these components to one another. A display unit 206, an input unit 207, a storage unit 208, a communication unit 209, a drive unit 210, and the like are connected to the input/output interface 205.
The display unit 206 is a display device using, for example, liquid crystal, EL, or the like. The input unit 207 is, for example, a keyboard, a pointing device, a touch panel, or another operating device. When the input unit 207 includes a touch panel, the touch panel can be integrated with the display unit 206.
The storage unit 208 is a non-volatile storage device, for example, an HDD, a flash memory, or another solid-state memory. The drive unit 210 is a device capable of driving a removable recording medium 211 such as an optical recording medium or a magnetic recording tape.
The communication unit 209 is a modem, a router, or another communication device connectable to a LAN, a WAN, or the like for communicating with other devices. The communication unit 209 may communicate by either a wired or a wireless connection. The communication unit 209 is often used separately from the autonomous movement control unit 110.
In the present embodiment, the communication unit 209 enables communication with other devices via a network.
Information processing by the autonomous movement control unit 110 having the hardware configuration described above is realized by cooperation between software stored in the storage unit 208, the ROM 202, or the like and the hardware resources of the autonomous movement control unit 110. Specifically, the information processing method according to the present technology is realized by loading a program constituting the software, stored in the ROM 202 or the like, into the RAM 203 and executing it.
The program is installed in the autonomous movement control unit 110 via, for example, the recording medium 211. Alternatively, the program may be installed in the autonomous movement control unit 110 via a global network or the like. In addition, any computer-readable non-transitory storage medium may be used.
The moving body, the information processing device, the information processing method, and the program according to the present technology may be executed, and the information processing device according to the present technology may be constructed, by linking a computer mounted on a communication terminal with another computer capable of communicating with it via a network or the like.
That is, the moving body, the information processing device, the information processing method, and the program according to the present technology can be executed not only in a computer system configured by a single computer but also in a computer system in which a plurality of computers operate in conjunction with one another. In the present disclosure, a system means a set of a plurality of components (devices, modules (parts), and the like), and it does not matter whether or not all the components are housed in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
Execution of the moving body, the information processing device, the information processing method, and the program according to the present technology by a computer system includes, for example, both the case where identification of the influence space, calculation of the degree of influence, generation of the action plan, and the like are executed by a single computer and the case where the respective processes are executed by different computers. Execution of each process by a predetermined computer also includes causing another computer to execute part or all of that process and acquiring the result.
That is, the moving body, the information processing device, the information processing method, and the program according to the present technology can also be applied to a cloud computing configuration in which a single function is shared and jointly processed by a plurality of devices via a network.
The configurations of the influence space identification unit, the influence degree calculation unit, the GUI output unit, the action planning unit, and the like, the control flow of the communication system, and so on described with reference to the drawings are merely embodiments, and can be arbitrarily modified without departing from the spirit of the present technology. That is, any other configurations, algorithms, and the like for implementing the present technology may be adopted.
Note that the effects described in the present disclosure are merely examples and are not limited, and other effects may be obtained. The description of the plurality of effects above does not necessarily mean that those effects are exhibited simultaneously; it means that at least one of the effects described above is obtained depending on conditions and the like, and, of course, effects not described in the present disclosure may also be exhibited.
It is also possible to combine at least two of the characteristic portions of the embodiments described above. That is, the various characteristic portions described in the embodiments may be combined arbitrarily without distinction between the embodiments.
In the present disclosure, concepts that define shapes, sizes, positional relationships, states, and the like, such as "center", "middle", "uniform", "equal", "same", "orthogonal", "parallel", "symmetric", "extending", "axial", "columnar shape", "cylindrical shape", "ring shape", and "annular shape", include "substantially center", "substantially middle", "substantially uniform", "substantially equal", "substantially the same", "substantially orthogonal", "substantially parallel", "substantially symmetric", "substantially extending", "substantially axial", "substantially columnar shape", "substantially cylindrical shape", "substantially ring shape", "substantially annular shape", and the like.
For example, states included in a predetermined range (for example, a range of ±10%) based on "completely center", "completely middle", "completely uniform", "completely equal", "completely the same", "completely orthogonal", "completely parallel", "completely symmetric", "completely extending", "completely axial", "completely columnar shape", "completely cylindrical shape", "completely ring shape", "completely annular shape", and the like are also included.
Note that the present technology can also adopt the following configurations.
(1) A moving body including:
a sensor unit capable of observing an observation target; and
an action plan generation unit that generates an action plan for observing the observation target based on a degree of influence that an object in an influence space, which is set for the observation target and affects observation by the sensor unit, has on the observation.
(2) The moving body according to (1), which is an unmanned aerial vehicle.
(3) The moving body according to (1) or (2), further including
a space identification unit that identifies the influence space based on position information of the observation target.
(4) The moving body according to (3), in which
the space identification unit identifies the influence space based on observation information regarding observation by the sensor unit.
(5) The moving body according to (4), in which
the observation information includes at least one of an angle of view of the sensor unit or an observable distance of the sensor unit, and
the space identification unit identifies the influence space based on a line segment connecting the moving body and the observation target.
(6) The moving body according to any one of (1) to (5), further including
a calculation unit that calculates the degree of influence of the object based on at least one of a shape, a size, a position, or a speed of the observation target.
(7) The moving body according to (6), in which
the calculation unit calculates the degree of influence based on position information of the object in the influence space.
(8) The moving body according to any one of (1) to (7), which
moves based on the action plan generated by the action plan generation unit.
(9) The moving body according to any one of (1) to (8), which
moves based on the action plan generated by the action plan generation unit, in which
the action plan generation unit generates the action plan for reducing the degree of influence.
(10) The moving body according to any one of (1) to (9), in which
the action plan generation unit integrates the action plan for observing the observation target, as an observation action plan, into an action plan given in advance for observing the observation target.
(11) The moving body according to (10), in which
the action plan generation unit restricts the action plan given in advance for observing the observation target and causes the observation action plan to be executed.
(12) The moving body according to any one of (1) to (11), in which,
when a plurality of objects exist in the influence space, the action plan generation unit adds up the degrees of influence of the respective objects and generates the action plan.
(13) The moving body according to any one of (1) to (12), further including
a behavior prediction unit that predicts at least one behavior of the observation target or the object based on a sensing result acquired by the sensor unit.
(14) The moving body according to (13), in which
the action plan generation unit generates the action plan based on a predetermined behavior of the observation target predicted by the behavior prediction unit.
(15) The moving body according to (14), in which
the action plan generation unit generates the action plan based on the predetermined behavior when the observation target moves out of a space observable by the sensor unit.
(16) An information processing device including
an action plan generation unit that generates an action plan for observing an observation target based on a degree of influence that an object in an influence space, which is set for the observation target and affects observation by a sensor unit, has on the observation.
(17) The information processing device according to (16), further including
a GUI output unit that outputs a GUI (Graphical User Interface) on which the influence space is identifiably displayed.
(18) The information processing device according to (17), in which
the GUI output unit outputs a GUI on which the degree of influence is identifiably displayed.
(19) An information processing method executed by a computer system, including
generating an action plan for observing an observation target based on a degree of influence that an object in an influence space, which is set for the observation target and affects observation by a sensor unit, has on the observation.
(20) A program that causes a computer system to execute
a step of generating an action plan for observing an observation target based on a degree of influence that an object in an influence space, which is set for the observation target and affects observation by a sensor unit, has on the observation.
1 … Observation moving body
2 … Observation target
3 … Sensor unit
4 … Influence space
5 … Object
100 … Moving body control system
110 … Autonomous movement control unit
124 … Influence space processing unit
134 … Planning unit
151 … Influence space identification unit
152 … Influence degree calculation unit
153 … GUI output unit

Claims (20)

1.  A moving body comprising:
a sensor unit capable of observing an observation target; and
an action plan generation unit that generates an action plan for observing the observation target based on a degree of influence that an object in an influence space, which is set for the observation target and affects observation by the sensor unit, has on the observation.
2.  The moving body according to claim 1, which is an unmanned aerial vehicle.
3.  The moving body according to claim 1, further comprising
a space identification unit that identifies the influence space based on position information of the observation target.
4.  The moving body according to claim 3, wherein
the space identification unit identifies the influence space based on observation information regarding observation by the sensor unit.
5.  The moving body according to claim 4, wherein
the observation information includes at least one of an angle of view of the sensor unit or an observable distance of the sensor unit, and
the space identification unit identifies the influence space based on a line segment connecting the moving body and the observation target.
6.  The moving body according to claim 1, further comprising
a calculation unit that calculates the degree of influence of the object based on at least one of a shape, a size, a position, or a speed of the observation target.
7.  The moving body according to claim 6, wherein
the calculation unit calculates the degree of influence based on position information of the object in the influence space.
8.  The moving body according to claim 1, wherein
the moving body moves based on the action plan generated by the action plan generation unit.
9.  The moving body according to claim 1, wherein
the moving body moves based on the action plan generated by the action plan generation unit, and
the action plan generation unit generates the action plan for reducing the degree of influence.
10.  The moving body according to claim 1, wherein
the action plan generation unit integrates the action plan for observing the observation target, as an observation action plan, into an action plan given in advance for observing the observation target.
11.  The moving body according to claim 10, wherein
the action plan generation unit restricts the action plan given in advance for observing the observation target and causes the observation action plan to be executed.
12.  The moving body according to claim 1, wherein,
when a plurality of objects exist in the influence space, the action plan generation unit adds up the degrees of influence of the respective objects and generates the action plan.
13.  The moving body according to claim 1, further comprising
a behavior prediction unit that predicts at least one behavior of the observation target or the object based on a sensing result acquired by the sensor unit.
14.  The moving body according to claim 13, wherein
the action plan generation unit generates the action plan based on a predetermined behavior of the observation target predicted by the behavior prediction unit.
15.  The moving body according to claim 14, wherein
the action plan generation unit generates the action plan based on the predetermined behavior when the observation target moves out of a space observable by the sensor unit.
16.  An information processing device comprising
an action plan generation unit that generates an action plan for observing an observation target based on a degree of influence that an object in an influence space, which is set for the observation target and affects observation by a sensor unit, has on the observation.
17.  The information processing device according to claim 16, further comprising
a GUI output unit that outputs a GUI (Graphical User Interface) on which the influence space is identifiably displayed.
18.  The information processing device according to claim 17, wherein
the GUI output unit outputs a GUI on which the degree of influence is identifiably displayed.
19.  An information processing method executed by a computer system, comprising
generating an action plan for observing an observation target based on a degree of influence that an object in an influence space, which is set for the observation target and affects observation by a sensor unit, has on the observation.
20.  A program that causes a computer system to execute
a step of generating an action plan for observing an observation target based on a degree of influence that an object in an influence space, which is set for the observation target and affects observation by a sensor unit, has on the observation.
PCT/JP2020/048172 2020-01-07 2020-12-23 Moving body, information processing device, information processing method, and program WO2021140916A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-000801 2020-01-07
JP2020000801 2020-01-07

Publications (1)

Publication Number Publication Date
WO2021140916A1 true WO2021140916A1 (en) 2021-07-15

Family

ID=76788650

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/048172 WO2021140916A1 (en) 2020-01-07 2020-12-23 Moving body, information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2021140916A1 (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006296760A (en) * 2005-04-20 2006-11-02 Seventh Dimension Design:Kk Optical transmitter-receiver control system
JP2009217448A (en) * 2008-03-10 2009-09-24 Mitsubishi Electric Corp Human tracking system using image information
JP2010015194A (en) * 2008-06-30 2010-01-21 Ihi Corp Autonomous moving robot device and control method for autonomous moving robot device
JP2014123306A (en) * 2012-12-21 2014-07-03 Secom Co Ltd Autonomous flight robot
JP2018147337A (en) * 2017-03-08 2018-09-20 日本電気株式会社 Autonomous Mobile Robot, Autonomous Mobile Robot Control Method and Control Program
JP2019114036A (en) * 2017-12-22 2019-07-11 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Information processing apparatus, flight control instruction method, program, and recording medium
JP2019146087A (en) * 2018-02-22 2019-08-29 キヤノン株式会社 Information processing device, control method of imaging device, computer program, and storage medium
JP2019168886A (en) * 2018-03-23 2019-10-03 カシオ計算機株式会社 Detection body region detecting device, imaging device, flying device, detection body region detecting method, imaging method and program
WO2019244626A1 (en) * 2018-06-18 2019-12-26 ソニー株式会社 Mobile unit and control method

Similar Documents

Publication Publication Date Title
US11787543B2 (en) Image space motion planning of an autonomous vehicle
US11537131B2 (en) Control device, control method, and mobile body
US11427218B2 (en) Control apparatus, control method, program, and moving body
JP7259749B2 (en) Information processing device, information processing method, program, and moving body
EP3722904B1 (en) Moving body, control method and control program
WO2017143588A1 (en) Systems and methods for adjusting uav trajectory
Omari et al. Visual industrial inspection using aerial robots
US11822341B2 (en) Control device, control method, and mobile object to estimate the mobile object&#39;s self-position
JP2018535487A (en) System and method for planning and controlling UAV paths
US20210365038A1 (en) Local sensing based autonomous navigation, and associated systems and methods
WO2019098002A1 (en) Information processing device, information processing method, program, and moving body
WO2020226085A1 (en) Information processing device, information processing method, and program
KR20140144921A (en) Simulation system for autonomous vehicle using virtual reality
US11500386B2 (en) Control apparatus, control method, program, and mobile object
WO2021140916A1 (en) Moving body, information processing device, information processing method, and program
US20230107289A1 (en) Information processing method, information processor, and program
WO2019176278A1 (en) Information processing device, information processing method, program, and mobile body
WO2021187110A1 (en) Moving object, information processing device, information processing method, and program
US20230150543A1 (en) Systems and methods for estimating cuboid headings based on heading estimations generated using different cuboid defining techniques

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20911520

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20911520

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP