WO2021140916A1 - Mobile body, information processing device, information processing method, and program - Google Patents

Mobile body, information processing device, information processing method, and program

Info

Publication number
WO2021140916A1
Authority
WO
WIPO (PCT)
Prior art keywords
observation
unit
action plan
influence
moving body
Prior art date
Application number
PCT/JP2020/048172
Other languages
English (en)
Japanese (ja)
Inventor
啓輔 前田 (Keisuke Maeda)
Original Assignee
ソニーグループ株式会社 (Sony Group Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニーグループ株式会社 (Sony Group Corporation)
Publication of WO2021140916A1

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions

Definitions

  • This technology relates to mobile objects, information processing devices, information processing methods, and programs applicable to autonomous movement and the like.
  • The autonomous mobile robot described in Patent Document 1 predicts the future positions of the target and of obstacles from captured images, determines from the prediction whether the target will be shielded by an obstacle, and changes its field of view based on the determination result so that a large portion of the target remains within the captured image. Patent Document 1 thereby discloses that the target is not lost (see paragraphs [0024] and [0026] of Patent Document 1, for example).
  • The purpose of this technology is to provide moving bodies, information processing devices, information processing methods, and programs that can achieve high stability in observation.
  • the moving body includes a sensor unit and an action plan generation unit.
  • the sensor unit can observe the observation target.
  • The action plan generation unit generates an action plan for observing the observation target, based on the degree of influence that an object in the influence space set for the observation target has on the observation by the sensor unit.
  • That is, an action plan for observing the observation target is generated based on the degree of influence that an object in the influence space, which is set for the observation target and affects observation by the sensor unit, has on the observation. This makes it possible to achieve high stability in observation.
  • the moving body may be an unmanned flying body.
  • The moving body may further include a space specifying unit that specifies the influence space based on position information of the observation target.
  • the space identification unit may specify the influence space based on the observation information regarding the observation of the sensor unit.
  • the observation information may include at least one of the angle of view of the sensor unit or the observable distance of the sensor unit.
  • The space specifying unit may specify the influence space based on a line segment connecting the moving body and the observation target.
  • the moving body may further include a calculation unit that calculates the degree of influence of the object based on at least one of the shape, size, position, or speed of the observation target.
  • the calculation unit may calculate the degree of influence based on the position information of the object in the influence space.
  • the moving body may move based on the action plan generated by the action plan generation unit.
  • the action plan generation unit may generate the action plan for reducing the degree of influence.
  • The action plan generation unit may treat the generated action plan for observing the observation target as an observation action plan and integrate it into an action plan given in advance for observing the observation target.
  • The action plan generation unit may restrict the action plan given in advance for observing the observation target and execute the observation action plan instead.
  • The action plan generation unit may generate the action plan by adding up the degrees of influence of the respective objects.
  • The moving body may further include a behavior prediction unit that predicts the behavior of at least one of the observation target or the object based on the sensing result acquired by the sensor unit.
  • the action plan generation unit may generate the action plan based on a predetermined action of the observation target predicted by the action prediction unit.
  • the action plan generation unit may generate the action plan based on the predetermined action when the observation target moves out of the space observable by the sensor unit.
  • the information processing device includes an action plan generation unit.
  • The action plan generation unit generates an action plan for observing the observation target based on the degree of influence that an object in the influence space, which is set for the observation target and affects observation by the sensor unit, has on the observation.
  • the information processing device may further include a GUI output unit that outputs a GUI (Graphical User Interface) in which the influence space is identifiable.
  • the GUI output unit may output a GUI in which the degree of influence is identifiable.
  • The information processing method is an information processing method executed by a computer system, and includes generating an action plan for observing the observation target based on the degree of influence that an object in the influence space, which is set for the observation target and affects observation by the sensor unit, has on the observation.
  • A program causes a computer system to execute the step of generating the action plan described above.
  • FIG. 1 is a schematic diagram for explaining an outline of an observation mobile body according to the present technology.
  • FIG. 1A is a schematic diagram showing how the observation moving body 1 follows the observation target 2.
  • FIG. 1B is a schematic view showing a state in which the observation target 2 is viewed from the observation moving body 1.
  • The observation moving body 1 can generate an action plan for observing the observation target 2 based on the degree of influence that an object 5 in the influence space 4, which affects observation by the sensor unit 3, has on the observation.
  • the influence space 4 is a space set for the observation target 2 observed by the sensor unit 3.
  • the observation mobile body 1 is a drone capable of autonomous flight.
  • the observation mobile body 1 has a sensor unit 3 capable of observing the observation target 2.
  • the sensor unit 3 includes an imaging device such as a stereo camera, a digital camera, or a monocular camera.
  • sensor devices such as laser distance measuring sensors, contact sensors, ultrasonic sensors, LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), and sonar may be used.
  • the observation moving body 1 can maintain the state in which the observation target 2 is observed by the sensor unit 3 and can continuously track the observation target 2.
  • the observation moving body 1 is not limited to the drone, and may be, for example, a wheel-type robot, a multi-legged walking type robot, or a robot having legs having an articulated structure.
  • the observation target 2 and the object 5 are not limited, and any object is set as the observation target 2.
  • Examples include other moving bodies such as drones, as well as people, structures, roads, traffic lights, traffic signs, road markings, and the like.
  • the object 5 existing in the influence space 4 may be described as an obstacle.
  • the influence space 4 is set for the observation target 2 as a space that affects the observation by the sensor unit 3.
  • Observation typically includes acquiring information about the observation target 2 from the various sensor devices included in the sensor unit 3 of the observation moving body 1. For example, image information of the observation target 2 captured by a camera, or audio data such as the voice and footsteps of the observation target 2 detected by a microphone, is included in the observation result. A state in which such information (data) can be acquired is a state in which the observation target 2 can be observed. In the present disclosure, a state in which the observation target 2 can be kept within the angle of view of the camera mounted on the observation moving body 1 is taken as an example of an observable state.
  • The influence space 4 is a space in which the presence of an object 5 or the like makes it impossible, or potentially impossible, to observe the observation target 2.
  • The degree of influence of the object 5 on the observation is the degree to which the object 5 hinders the observation. For example, the larger the area (volume) of the influence space 4 occupied by the object 5, the greater the degree of influence on the observation.
  • the degree of influence on the observation may change depending on other parameters such as the position of the observation moving body 1.
  • a cylindrical influence space 4 is set as an example of the influence space 4.
  • the shape of the influence space 4 is not limited, and an arbitrary shape may be set according to the observation information regarding the observation of the sensor unit 3.
  • The observation information includes, for example, at least one of the angle of view (viewing angle) and the observable distances, such as the minimum and maximum observable distance, of the sensor unit 3.
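  • As a rough illustration of how such observation information could be used, the sketch below checks whether a target point lies within the camera's angle of view and within its minimum and maximum observable distances. It is a minimal sketch assuming a simple symmetric field-of-view model; the names (ObservationInfo, is_observable) are illustrative and not taken from the disclosure.

```python
import math
from dataclasses import dataclass

@dataclass
class ObservationInfo:
    """Observation information of the sensor unit (illustrative names)."""
    view_angle_deg: float  # full angle of view of the camera
    min_distance: float    # minimum observable distance
    max_distance: float    # maximum observable distance

def is_observable(target_rel, heading, info):
    """True if a target at relative position `target_rel` lies within the
    sensor heading's angle of view and the observable distance range."""
    dist = math.sqrt(sum(c * c for c in target_rel))
    if not (info.min_distance <= dist <= info.max_distance):
        return False
    h_norm = math.sqrt(sum(c * c for c in heading))
    cos_angle = sum(t * h for t, h in zip(target_rel, heading)) / (dist * h_norm)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    return angle <= info.view_angle_deg / 2.0

# Target 10.8 ahead and slightly off-axis, 90-degree angle of view: observable.
print(is_observable((10.5, 2.3, -0.4), (1.0, 0.0, 0.0),
                    ObservationInfo(90.0, 0.5, 20.0)))
```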
  • the action plan is various information that controls the observation mobile body 1.
  • the speed of the observation moving body 1, the path (trajectory) in which the observation moving body 1 moves, the waypoint (position) through which the observation moving body 1 passes, the posture of the observation moving body 1, and the like are included in the action plan.
  • In the present embodiment, the moving direction and speed of the observation moving body 1 are generated as the action plan.
  • The moving direction and speed of the observation moving body 1 at each time can be regarded as the path of the observation moving body 1.
  • the observation moving body 1 observes the observation target 2 according to an action plan that follows the observation target 2.
  • The observation moving body 1 can observe the observation target 2 according to an action plan generated based on the degree of influence that the object 5 has on the observation.
  • For example, the observation moving body 1 moves, according to the action plan 7 of going around the object 5, to the position 8 from which the observation target 2 can be observed.
  • the observation moving body 1 continues following according to the action plan for following the observation target 2 given in advance. That is, the observation mobile body 1 can execute the action plan without losing sight of the observation target 2.
  • FIG. 2 is a block diagram showing a configuration example of a schematic function of the mobile body control system 100 that controls the observation mobile body 1 of the present disclosure.
  • The mobile body control system 100 includes an input unit 101, a data acquisition unit 102, a communication unit 103, a mobile internal device 104, an output control unit 105, an output unit 106, a drive system control unit 107, a drive system 108, a storage unit 109, and an autonomous movement control unit 110.
  • the input unit 101, the data acquisition unit 102, the communication unit 103, the output control unit 105, the drive system control unit 107, the storage unit 109, and the autonomous movement control unit 110 are connected to each other via the communication network 111.
  • The communication network 111 is a communication network or bus conforming to an arbitrary standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), a LAN (Local Area Network) such as IEEE 802.3, or FlexRay (registered trademark), or a proprietary communication method that is not standardized. Each part of the mobile body control system 100 may also be directly connected without going through the communication network 111.
  • Hereinafter, when each part of the mobile body control system 100 communicates via the communication network 111, the description of the communication network 111 is omitted.
  • For example, when the input unit 101 and the autonomous movement control unit 110 communicate via the communication network 111, it is simply described that the input unit 101 and the autonomous movement control unit 110 communicate with each other.
  • the input unit 101 includes a device used for inputting various data, instructions, and the like to the observation mobile body 1.
  • The input unit 101 includes operation devices such as a touch panel, buttons, a microphone, switches, and levers, as well as operation devices that allow input by a method other than manual operation, such as by voice or gesture.
  • the input unit 101 may be a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile device or a wearable device that supports the operation of the mobile control system 100.
  • The input unit 101 generates an input signal based on data, instructions, and the like input by an operator (hereinafter referred to as the user) who gives the action plan to the observation moving body 1, and supplies the input signal to each part of the mobile body control system 100.
  • the data acquisition unit 102 includes various sensors and the like for acquiring data used for processing of the mobile control system 100, and supplies the acquired data to each unit of the mobile control system 100.
  • The data acquisition unit 102 includes various sensors for detecting the state of the observation moving body 1, which constitute the sensor group 112 corresponding to the sensor unit 3 in FIG. 1.
  • For example, the data acquisition unit 102 includes a gyro sensor, an acceleration sensor, an inertial measurement unit (IMU), and sensors for detecting the operation amounts of acceleration input such as an accelerator, deceleration input, and direction instruction input.
  • the data acquisition unit 102 includes various sensors for detecting information outside the observation moving body 1 such as the observation target 2 and the object 5.
  • the data acquisition unit 102 includes an imaging device such as a ToF (Time of Flight) camera, a stereo camera, a monocular camera, an infrared camera, a polarized camera, and other cameras.
  • Further, for example, the data acquisition unit 102 includes an environment sensor for detecting weather or meteorological conditions, and a surrounding information detection sensor for detecting objects around the observation moving body 1.
  • the environmental sensor includes, for example, a raindrop sensor, a fog sensor, a sunshine sensor, a snow sensor, and the like.
  • Ambient information detection sensors include, for example, laser distance measuring sensors, ultrasonic sensors, radars, LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), sonar, and the like.
  • the data acquisition unit 102 includes various sensors for detecting the current position of the observation moving body 1.
  • the data acquisition unit 102 includes a GNSS receiver or the like that receives a GNSS signal from a GNSS satellite.
  • The communication unit 103 communicates with the mobile internal device 104 and with various devices, servers, base stations, and the like outside the observation moving body 1, such as other drones; it transmits data supplied from each unit of the mobile body control system 100 and supplies received data to each part of the mobile body control system 100.
  • the communication protocol supported by the communication unit 103 is not particularly limited, and the communication unit 103 may support a plurality of types of communication protocols.
  • the communication unit 103 wirelessly communicates with the mobile internal device 104 by wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), WUSB (Wireless USB), or the like.
  • Further, for example, the communication unit 103 performs wired communication with the mobile internal device 104 via a connection terminal (and a cable if necessary), not shown, using USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), MHL (Mobile High-definition Link), or the like. Further, for example, the communication unit 103 communicates with a device (for example, an application server or a control server) existing on an external network (for example, the Internet, a cloud network, or an operator-specific network) via a base station or an access point.
  • Further, for example, the communication unit 103 communicates with a terminal existing in the vicinity of the observation moving body 1 (for example, a pedestrian's or store's terminal, or an MTC (Machine Type Communication) terminal) using P2P (Peer To Peer) technology.
  • the mobile internal device 104 includes, for example, a mobile device or wearable device owned by the user, an information device carried in or attached to the observation mobile body 1, a navigation device for searching a route to an arbitrary destination, and the like.
  • the output control unit 105 controls the output of various information to the user or the outside of the observation mobile body 1.
  • the output control unit 105 generates an output signal including at least one of visual information (for example, image data) and auditory information (for example, audio data) and supplies the output signal to the output unit 106.
  • For example, the output control unit 105 synthesizes image data captured by different imaging devices of the data acquisition unit 102 to generate a bird's-eye view image, a panoramic image, or the like, and supplies an output signal including the generated image to the output unit 106.
  • the output control unit 105 generates voice data including a warning sound or a warning message for dangers such as collision, contact, and entry into a danger zone, and outputs an output signal including the generated voice data to the output unit 106.
  • the output unit 106 includes a device capable of outputting visual information or auditory information to the user or the outside of the observation mobile body 1.
  • the output unit 106 includes a display device, an instrument panel, an audio speaker, headphones, a wearable device such as a spectacle-type display worn by a user, a projector, a lamp, and the like.
  • The display device included in the output unit 106 may be, in addition to a device having a normal display, a device that displays visual information within the user's field of view, such as a head-up display, a transmissive display, or a device having an AR (Augmented Reality) display function. Since the output control unit 105 and the output unit 106 are not indispensable for autonomous movement processing, they may be omitted as necessary.
  • The drive system control unit 107 controls the drive system 108 by generating various control signals and supplying them to the drive system 108. Further, the drive system control unit 107 supplies control signals to units other than the drive system 108 as necessary to notify them of the control state of the drive system 108.
  • The drive system 108 includes various devices related to the drive system of the observation moving body 1.
  • For example, in the case of a four-legged robot, the drive system 108 includes servomotors, provided in each joint of the four legs, whose angle and torque can be specified, a motion controller that decomposes the movement of the robot itself into the movements of the four legs, and a feedback control device using sensors in each motor and sensors on the soles of the feet.
  • In another example, the drive system 108 includes motors that drive four or six upward-facing propellers, and a motion controller that decomposes the movement of the robot itself into the rotation amount of each motor.
  • In yet another example, the drive system 108 includes a driving force generator, such as an internal combustion engine or a drive motor, for generating driving force, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle, a braking device for generating braking force, an ABS (Antilock Brake System), an ESC (Electronic Stability Control), an electric power steering device, and the like.
  • The storage unit 109 includes, for example, a ROM (Read Only Memory), a RAM (Random Access Memory), a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, and the like.
  • the storage unit 109 stores various programs, data, and the like used by each unit of the mobile control system 100.
  • For example, the storage unit 109 stores map data such as a three-dimensional high-precision map such as a dynamic map, a global map that is less accurate than the high-precision map but covers a wider area, and a local map that includes information around the observation moving body 1.
  • The autonomous movement control unit 110 controls autonomous movement such as automatic driving or driving support. Specifically, for example, the autonomous movement control unit 110 performs cooperative control aimed at realizing functions such as collision avoidance or impact mitigation for the observation moving body 1, follow-up movement based on the distance between moving bodies, speed-maintaining movement, or a collision warning for the observation moving body 1. Further, for example, the autonomous movement control unit 110 performs cooperative control aimed at autonomous movement that does not depend on user operation.
  • the autonomous movement control unit 110 corresponds to the information processing device according to the present embodiment, and has hardware necessary for a computer such as a CPU, RAM, and ROM.
  • the information processing method according to the present technology is executed by the CPU loading the program according to the present technology recorded in the ROM in advance into the RAM and executing the program.
  • the specific configuration of the autonomous movement control unit 110 is not limited, and for example, a PLD (Programmable Logic Device) such as an FPGA (Field Programmable Gate Array) or another device such as an ASIC (Application Specific Integrated Circuit) may be used.
  • the autonomous movement control unit 110 includes a detection unit 131, a self-position estimation unit 132, a situation analysis unit 133, a planning unit 134, and an operation control unit 135.
  • the detection unit 131, the self-position estimation unit 132, and the situation analysis unit 133 constitute the recognition processing unit 121.
  • the planning unit 134 constitutes an action plan processing unit 122.
  • the motion control unit 135 constitutes the behavior control processing unit 123.
  • the influence space identification unit 151, the influence degree calculation unit 152, and the GUI (Graphical User Interface) output unit 153 constitute the influence space processing unit 124.
  • the detection unit 131 detects various types of information necessary for controlling autonomous movement.
  • the detection unit 131 includes a mobile body external information detection unit 141 and a mobile body state detection unit 142.
  • the mobile external information detection unit 141 performs detection processing of external information of the observation mobile body 1 based on data or signals from each unit of the mobile control system 100.
  • the moving body external information detection unit 141 performs detection processing, recognition processing, and tracking processing of the observation target 2 and the object 5 around the observation moving body 1, and detection processing of the distance to the observation target 2 and the object 5.
  • the mobile body external information detection unit 141 performs detection processing of the environment around the mobile body.
  • the surrounding environment to be detected includes, for example, weather, temperature, humidity, brightness, road surface condition, and the like.
  • the mobile external information detection unit 141 supplies data indicating the result of the detection process to the self-position estimation unit 132, the map analysis unit 145 of the situation analysis unit 133, the situation recognition unit 146, the operation control unit 135, and the like.
  • the mobile body state detection unit 142 performs a state detection process of the observation mobile body 1 based on data or signals from each unit of the mobile body control system 100.
  • the state of the observation moving body 1 to be detected includes, for example, speed, acceleration, steering angle, presence / absence and content of abnormality, state of driving operation, state of other moving body-mounted equipment, and the like.
  • the mobile body state detection unit 142 supplies data indicating the result of the detection process to the situation awareness unit 146 of the situation analysis unit 133, the operation control unit 135, and the like.
  • The self-position estimation unit 132 performs estimation processing of the position, posture, and the like of the observation moving body 1 based on data or signals from each unit of the mobile body control system 100, such as the mobile body external information detection unit 141 and the situation recognition unit 146 of the situation analysis unit 133. In addition, the self-position estimation unit 132 generates a local map used for self-position estimation (hereinafter referred to as a self-position estimation map) as necessary.
  • the map for self-position estimation is, for example, a highly accurate map using a technique such as SLAM (Simultaneous Localization and Mapping).
  • The self-position estimation unit 132 supplies data indicating the result of the estimation process to the map analysis unit 145 and the situation recognition unit 146 of the situation analysis unit 133, and stores the self-position estimation map in the storage unit 109. Further, the self-position estimation unit 132 accumulates, in a database, time-series information supplied in time series based on the detection results from the sensor group 112, estimates the self-position based on the accumulated time-series information, and outputs it as the time-series-information self-position. The self-position estimation unit 132 also estimates the self-position based on the current detection result supplied from the sensor group 112 and outputs it as the current-information self-position.
  • The self-position estimation unit 132 outputs a self-position estimation result obtained by integrating, or switching between, the time-series-information self-position and the current-information self-position. For example, when a large change in the posture of the observation moving body 1 is detected from the detection results supplied from the sensor group 112, the self-position may change significantly and the estimation accuracy of the time-series-information self-position is considered to decrease, so the self-position is estimated only from the current-information self-position. Further, for example, when the observation moving body 1 moves while mounted on another moving body, the self-position may change significantly even if no change in posture is detected from the sensor group 112; in this case as well, the estimation accuracy of the time-series-information self-position is considered to decrease, and the self-position is estimated only from the current-information self-position.
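  • The switching logic described above can be summarized in a short sketch: when a large posture change (or a known carried-by-another-body condition) suggests that the accumulated time-series estimate has become unreliable, only the current-information self-position is used; otherwise the two estimates are blended. This is a simplified reading with a hypothetical fixed blending weight, not the integration actually performed by the self-position estimation unit 132.

```python
def fuse_self_position(time_series_pos, current_pos,
                       large_posture_change: bool,
                       carried_by_other_body: bool,
                       weight_current: float = 0.5):
    """Integrate or switch between the time-series-information self-position
    and the current-information self-position (illustrative only)."""
    if large_posture_change or carried_by_other_body:
        # Time-series estimate is considered unreliable: use current info only.
        return current_pos
    # Otherwise blend the two estimates with a fixed weight (assumed).
    return tuple(weight_current * c + (1.0 - weight_current) * t
                 for t, c in zip(time_series_pos, current_pos))

print(fuse_self_position((1.0, 2.0, 0.5), (1.2, 2.1, 0.5), False, False))
print(fuse_self_position((1.0, 2.0, 0.5), (1.2, 2.1, 0.5), True, False))
```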
  • the situation analysis unit 133 analyzes the observation moving body 1 and the surrounding situation.
  • The situation analysis unit 133 includes a map analysis unit 145, a situation recognition unit 146, and a situation prediction unit 147.
  • The map analysis unit 145 analyzes the various maps stored in the storage unit 109, using data or signals from each unit of the mobile body control system 100, such as the self-position estimation unit 132 and the mobile body external information detection unit 141, as necessary, and constructs a map containing the information needed for autonomous movement processing. The map analysis unit 145 supplies the constructed map to the situation recognition unit 146, the situation prediction unit 147, and the route planning unit 161, action planning unit 162, and operation planning unit 163 of the planning unit 134.
  • The situation recognition unit 146 performs situation recognition processing regarding the observation moving body 1 based on data or signals from each unit of the mobile body control system 100, such as the self-position estimation unit 132, the mobile body external information detection unit 141, the mobile body state detection unit 142, and the map analysis unit 145.
  • For example, the situation recognition unit 146 recognizes the situation of the observation moving body 1 and the situation around it. Further, the situation recognition unit 146 generates a local map used for recognizing the situation around the observation moving body 1 (hereinafter referred to as a situation recognition map) as needed.
  • the situation recognition map is, for example, an Occupancy Grid Map, a Road Map, or a Point Cloud Map.
  • the situation of the observation moving body 1 to be recognized includes, for example, the position, posture, movement (for example, speed, acceleration, moving direction, etc.) of the observation moving body 1, and the presence / absence and contents of an abnormality.
  • the surrounding conditions of the observation moving object 1 to be recognized include, for example, the type, position, and movement (for example, velocity, acceleration, moving direction, etc.) of surrounding objects such as the observation target 2 and the object 5. Further, for example, the composition of the surrounding road and the condition of the road surface, and the surrounding weather, temperature, humidity, brightness, and the like are included.
  • the situational awareness unit 146 supplies data indicating the result of the recognition process (including a situational awareness map, if necessary) to the self-position estimation unit 132, the situation prediction unit 147, and the like. Further, the situational awareness unit 146 stores the situational awareness map in the storage unit 109. The situational awareness unit 146 supplies data indicating the position of the object 5 to the influence space identification unit 151. Further, the situational awareness unit 146 supplies data indicating the position of the observation target 2 to the influence degree calculation unit 152. For example, the position information of the observation target 2 and the object 5 is supplied based on the position of the observation moving body 1. As the position information, coordinate values (for example, XYZ coordinate values) defined by the absolute coordinate system (world coordinate system) may be used.
  • Alternatively, coordinate values (for example, xyz coordinate values or uvd coordinate values) in a relative coordinate system with a predetermined point as the reference (origin) may be used; the reference origin may be set arbitrarily.
  • the situation prediction unit 147 performs a situation prediction process regarding the observation moving body 1 based on data or signals from each part of the moving body control system 100 such as the map analysis unit 145 and the situation recognition unit 146.
  • the situation prediction unit 147 performs prediction processing such as the situation of the observation moving body 1 and the situation around the observation moving body 1.
  • the situation of the observation mobile body 1 to be predicted includes, for example, the behavior of the observation mobile body 1, the occurrence of an abnormality, the movable distance, and the like.
  • The situation around the moving body to be predicted includes, for example, the behavior of moving objects around the observation moving body 1, changes in traffic signal states, changes in the environment such as the weather, and the like.
  • the situation prediction unit 147 supplies data indicating the result of the prediction processing and data from the situation recognition unit 146 to the route planning unit 161, the action planning unit 162, the operation planning unit 163, and the like of the planning unit 134.
  • the influence space specifying unit 151 specifies the influence space 4 set for the observation target 2.
  • the influence space 4 is specified based on the position information of the observation target 2 input from the situational awareness unit 146.
  • A specific method for specifying the influence space 4 will be described with reference to FIG. 4.
  • the influence space specifying unit 151 determines whether or not the object 5 exists in the influence space 4.
  • A specific determination method will be described with reference to FIG. 5.
  • The influence space specifying unit 151 supplies data indicating the result of the identification processing to the influence degree calculation unit 152 and the GUI output unit 153.
  • the influence degree calculation unit 152 calculates the influence degree based on the possibility that the object 5 existing in the influence space 4 interferes with the observation of the observation target 2.
  • The possibility of hindering the observation is determined from, for example, information about the object 5 such as its shape, size, position, and velocity.
  • the degree of influence is calculated based on the position information of the object 5 input from the situational awareness unit 146 and the shape and position of the influence space 4 input from the influence space identification unit 151.
  • In the present embodiment, the degree of influence is calculated based on the position of the object 5 in the influence space 4 and the traveling direction of the observation moving body 1. A specific method for calculating the degree of influence will be described with reference to FIG. 5. The influence degree calculation unit 152 supplies data indicating the result of the calculation process to the GUI output unit 153, the route planning unit 161, the action planning unit 162, and the operation planning unit 163.
  • the GUI output unit 153 outputs a GUI in which the influence space 4 is identifiable.
  • the user can input an action plan for following the observation target 2 to the observation moving body 1 via the GUI displayed on the output unit 106.
  • the user can identify the shape and position of the influence space 4 via the GUI.
  • the user can identify the degree of influence of the object 5 existing in the influence space 4 via the GUI.
  • The route planning unit 161 plans a route to the destination based on data or signals from each unit of the mobile body control system 100, such as the map analysis unit 145, the situation prediction unit 147, and the influence degree calculation unit 152. For example, the route planning unit 161 sets a route from the current position to the specified destination based on the global map, and supplies data indicating the planned route to the action planning unit 162 and the like.
  • The action planning unit 162 plans actions of the observation moving body 1 for safely traveling the route planned by the route planning unit 161 within the planned time, based on data or signals from each unit of the mobile body control system 100 such as the map analysis unit 145, the situation prediction unit 147, and the influence degree calculation unit 152. For example, the action planning unit 162 plans start, stop, traveling direction (for example, forward, backward, change of direction, etc.), movement speed, and the like, and supplies data indicating the planned actions of the observation moving body 1 to the operation planning unit 163 and the like. More specifically, the action planning unit 162 generates, for each route planned by the route planning unit 161, action plan candidates of the observation moving body 1 for moving safely within the planned time.
  • For example, the action planning unit 162 generates action plan candidates using an A* algorithm (A-star search algorithm), which divides the environment into a grid and optimizes arrival judgments and route weights to generate the best path, or an RRT (Rapidly-exploring Random Tree) algorithm, which incrementally extends reachable paths from the self-position while pruning them appropriately.
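  • For reference, the sketch below is a compact, generic grid-based A* search of the kind referred to above (4-connected grid, Manhattan-distance heuristic). It is a textbook illustration under assumed data structures, not the planner actually used by the action planning unit 162.

```python
import heapq

def a_star(grid, start, goal):
    """A* on a 4-connected occupancy grid: grid[y][x] == 1 means blocked.
    Returns a list of (x, y) cells from start to goal, or None."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start, None)]
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, node, parent = heapq.heappop(open_set)
        if node in came_from:
            continue                      # already expanded with a better cost
        came_from[node] = parent
        if node == goal:                  # reconstruct the path back to start
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        x, y = node
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= ny < len(grid) and 0 <= nx < len(grid[0]) and grid[ny][nx] == 0:
                ng = g + 1
                if ng < g_cost.get((nx, ny), float("inf")):
                    g_cost[(nx, ny)] = ng
                    heapq.heappush(open_set, (ng + h((nx, ny)), ng, (nx, ny), node))
    return None

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0]]
# Goes around the blocked row: [(0,0), (1,0), (2,0), (2,1), (2,2), (1,2), (0,2)]
print(a_star(grid, (0, 0), (0, 2)))
```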
  • The operation planning unit 163 plans operations of the observation moving body 1 for realizing the actions planned by the action planning unit 162, based on data or signals from each unit of the mobile body control system 100 such as the map analysis unit 145, the situation prediction unit 147, and the influence degree calculation unit 152. For example, the operation planning unit 163 plans acceleration, deceleration, rotation speed, and the like, and supplies data indicating the planned operations of the moving body to the motion control unit 135 and the like.
  • The action plan and the operation plan also include the flight pattern of the observation moving body 1. That is, trajectories and speeds defined as patterns, such as a turn or a figure eight, are also included in the action plan and the operation plan.
  • For example, the speed and curvature of the observation moving body 1 when a turn or a figure eight is performed are planned as the action plan and the operation plan.
  • Parameters such as the speed and attitude associated with a flight pattern may be set by default. That is, how the moving body moves in a predetermined flight pattern may be set by default.
  • The motion control unit 135 controls the motion of the observation moving body 1. More specifically, based on the detection results of the mobile body external information detection unit 141 and the mobile body state detection unit 142, the motion control unit 135 performs detection processing for emergencies such as collision, contact, entry into a danger zone, or an abnormality of the observation moving body 1. When the occurrence of an emergency is detected, the motion control unit 135 plans a motion of the observation moving body 1 for avoiding the emergency, such as a sudden stop or a sharp turn. The motion control unit 135 also performs acceleration/deceleration control for realizing the motion of the observation moving body 1 planned by the operation planning unit 163.
  • For example, the motion control unit 135 calculates a control target value of the driving force generator or the braking device for realizing the planned acceleration, deceleration, or sudden stop, and supplies a control command indicating the calculated control target value to the drive system control unit 107.
  • the motion control unit 135 performs direction control for realizing the motion of the observation moving body 1 planned by the motion planning unit 163.
  • For example, the motion control unit 135 calculates a control target value of the steering mechanism for realizing the movement trajectory or sharp turn planned by the operation planning unit 163, and supplies a control command indicating the calculated control target value to the drive system control unit 107.
  • the sensor group 112 corresponds to a sensor unit capable of observing an observation target.
  • The influence space processing unit 124 and the planning unit 134 function as an action plan generation unit that generates an action plan for observing the observation target based on the degree of influence that an object in the influence space, which is set for the observation target and affects observation by the sensor unit, has on the observation.
  • The situation prediction unit 147 corresponds to a behavior prediction unit that predicts the behavior of at least one of the observation target or the object based on the sensing result acquired by the sensor unit.
  • the influence space identification unit 151 corresponds to the space identification unit that specifies the influence space based on the position information of the observation target.
  • the influence degree calculation unit 152 corresponds to a calculation unit that calculates the influence degree based on at least one of the shape, size, position, or speed of the object to be observed.
  • the GUI output unit 153 corresponds to a GUI output unit that outputs a GUI in which the influence space is identifiable.
  • FIG. 3 is a flowchart showing control of specifying the influence space and calculating the degree of influence.
  • the observation target 2 is detected based on the sensing result acquired by the sensor unit 3 (step 101).
  • step 101 is executed when an instruction to track the observation target 2 is input to the observation moving body 1 by the user.
  • the situational awareness unit 146 estimates the position of the detected observation target 2 (step 102).
  • the relative position of the observation target 2 is estimated with respect to the observation moving body 1.
  • the influence space identification unit 151 specifies the influence space based on the estimated position of the observation target 2 (step 103).
  • the object 5 is detected based on the sensing result acquired by the sensor unit 3 (step 104).
  • the situational awareness unit 146 estimates the position of the detected object 5 (step 105).
  • the influence space specifying unit 151 determines whether or not an obstacle (object 5) exists in the influence space 4 (step 106). When there is an obstacle in the influence space 4 (YES in step 106), the influence degree calculation unit 152 calculates the influence degree that the obstacle has on the observation (step 107).
  • The action planning unit 162 generates an action plan based on the calculated degree of influence (step 108). For example, an action plan that reduces the calculated degree of influence is generated.
  • the action plan for observing the observation target 2 generated in step 108 is described as an observation action plan. That is, the observation action plan is an action plan for controlling the observation moving body 1 so that the observation is not obstructed by obstacles in order to continue the observation of the observation target 2.
  • the action plan for observing (tracking) the observation target 2 given in advance is described as the advance action plan.
  • the prior action plan includes an action plan in which a circular motion is performed around the observation target 2 at a predetermined distance from the observation target 2.
  • the motion control unit 135 controls the observation mobile body 1 based on the planned action plan (step 109).
  • In the present embodiment, the observation action plan is integrated with the advance action plan. That is, in addition to the advance action plan, control is executed so that the obstacle moves out of the influence space 4. For example, assume that the advance action plan wraps around to the right of the observation target 2, and that an obstacle exists in the influence space along that movement path.
  • the observation moving body 1 may move upward with respect to the observation target 2 while wrapping around to the right so that the observation is not obstructed by obstacles.
  • the control may be switched from the preliminary action plan to the observation action plan.
  • For example, assume that the advance action plan goes straight toward the observation target 2 and follows it, and that an obstacle exists on the right side of the influence space. In this case, as the observation action plan, the observation moving body 1 may move to the left, in the direction away from the obstacle, instead of going straight.
  • When no obstacle exists in the influence space 4 (NO in step 106), the observation moving body 1 is controlled based on the advance action plan. That is, in the present embodiment, the action plan is generated based on the degree of influence that an obstacle in the influence space 4 has on the observation, regardless of whether or not the observation can actually be continued. Further, when the observation moving body 1 has moved based on the observation action plan and no obstacle remains in the influence space 4, the observation moving body 1 is again controlled based on the advance action plan.
  • FIG. 4 is a schematic diagram for explaining a specific example of the method of specifying the influence space.
  • In the present embodiment, the shape of the influence space 20 is a cylinder. Further, the following information is input (units are omitted).
  Coordinates of the relative position 21 of the observation target 2, with the observation moving body 1 as the origin: (10.5, 2.3, -0.4)
  Radius of the influence space 20: 2.0
  • The influence space specifying unit 151 executes the following calculation based on the above input information to specify the influence space. From the relative position of the observation target 2, the relative distance (the length of the center line 22 connecting the observation moving body 1 and the observation target 2) and the normalized direction vector 23 toward the observation target 2 are calculated.
  Relative distance: 10.8
  Direction vector 23: (0.98, 0.21, -0.04)
  • The influence space specifying unit 151 outputs the radius 2.0 of the influence space 20, the relative distance 10.8, and the direction vector 23 to the influence degree calculation unit 152.
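  • The calculation above can be reproduced with a few lines of vector arithmetic: the cylindrical influence space is described by its radius together with the relative distance to the observation target and the normalized direction vector toward it. The sketch below reproduces the published numerical example; the function and variable names are illustrative.

```python
import math

def specify_cylindrical_influence_space(target_rel, radius):
    """Return (radius, relative distance, unit direction vector) describing a
    cylindrical influence space whose center line runs from the observation
    moving body (at the origin) to the observation target."""
    distance = math.sqrt(sum(c * c for c in target_rel))
    direction = tuple(c / distance for c in target_rel)
    return radius, distance, direction

radius, distance, direction = specify_cylindrical_influence_space(
    (10.5, 2.3, -0.4), radius=2.0)
print(round(distance, 1))                     # 10.8
print(tuple(round(c, 2) for c in direction))  # (0.98, 0.21, -0.04)
```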
  • In the present disclosure, "less than" may include both "less than" and "less than or equal to."
  • Similarly, "greater than" may include both "greater than" and "greater than or equal to."
  • FIG. 5 is a schematic diagram for explaining a specific example of the method of calculating the degree of influence. Further, in the present embodiment, the coordinates of three objects, the radius of the influence space 20, the relative distance, and the direction vector 23 are input.
  Relative position of object 31: (7.4, 0.2, 0.1)
  Relative position of object 32: (3.1, -1.9, 1.1)
  Relative position of object 33: (5.3, 2.4, -0.3)
  • the influence degree calculation unit 152 obtains the influence degree as an amount inversely proportional to the distance between each object and the center line. Further, the direction of the degree of influence is obtained as a vector in the direction of the perpendicular line drawn from each object to the center line 22.
  • the influence degree calculation unit 152 executes the following calculation based on the above input information, and calculates the influence degree vector. It is determined whether the coordinates of each object are included in the influence space 20. For the sake of brevity, only the calculation of the object 31 will be described.
  • The inner product of the direction vector 35 toward the object 31 and the direction vector 23 is approximately 7.3 (Equation 3), which is smaller than the relative distance 10.8, and the distance between the object 31 and the center line 22 is approximately 1.4, which is smaller than the radius 2.0; therefore, the object 31 is included in the influence space 20 and its influence degree vector is calculated.
  • Similarly, the influence degree vectors of the object 32 and the object 33 are calculated.
  • the object 32 is not included in the influence space 20 because the distance from the center line 22 is 2.8.
  • the influence vector of the object 32 becomes (0, 0, 0).
  • the object 33 is included in the influence space 20 because the inner product of the direction vector to the object 33 and the direction vector 23 is 5.7 and the distance from the center line 22 is 1.2.
  • the influence vector 38 of the object 33 becomes (0.18, 0.8, 0.06).
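  • One plausible reading of the calculation in FIG. 5 is sketched below: an object belongs to the cylindrical influence space if its projection onto the center line 22 lies between the moving body and the target and its perpendicular distance to the line is smaller than the radius, and its influence degree vector points along the perpendicular from the object toward the center line with a magnitude inversely proportional to that distance. The sign convention and rounding of the published vectors may differ from this reading, so treat it as an assumption-laden illustration rather than the patented formula.

```python
import math

def influence_vector(obj_rel, direction, relative_distance, radius):
    """Influence degree vector of one object for a cylindrical influence space;
    returns the zero vector if the object is outside the space. Positions are
    relative to the observation moving body; `direction` is the unit vector
    toward the observation target (illustrative sketch)."""
    proj = sum(o * d for o, d in zip(obj_rel, direction))   # position along the center line
    foot = tuple(proj * d for d in direction)               # closest point on the center line
    perp = tuple(o - f for o, f in zip(obj_rel, foot))      # offset of the object from the line
    dist_to_line = math.sqrt(sum(p * p for p in perp))
    if proj < 0.0 or proj > relative_distance or dist_to_line >= radius or dist_to_line == 0.0:
        return (0.0, 0.0, 0.0)
    # Direction: along the perpendicular from the object toward the center line;
    # magnitude: inversely proportional to the distance to the line (assumed).
    return tuple(-p / (dist_to_line ** 2) for p in perp)

direction = (0.98, 0.21, -0.04)
for name, obj in (("object 31", (7.4, 0.2, 0.1)),
                  ("object 32", (3.1, -1.9, 1.1)),
                  ("object 33", (5.3, 2.4, -0.3))):
    v = influence_vector(obj, direction, relative_distance=10.8, radius=2.0)
    print(name, tuple(round(c, 2) for c in v))
# Object 32 lies outside the space (distance to the line is about 2.8 > 2.0),
# so its influence vector is zero, matching the example in the text.
```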
  • the influence degree calculation unit 152 supplies the total influence degree vector to the GUI output unit 153, the route planning unit 161, the action planning unit 162, and the operation planning unit 163.
  • FIG. 6 is a schematic diagram showing a specific example of the observation action plan.
  • FIG. 6A is a schematic diagram showing the path of the observation mobile body 1.
  • FIG. 6B is a schematic view showing the case where the observation target 2 is viewed from the position 41.
  • the observation moving body 1 moves according to the observation action plan generated based on the influence degree vector that each object has on the observation.
  • the following information is input.
  • Relative position of observation target 2: (10.5, 2.3, -0.4)
  • Influence degree vector: (0.03, 1.46, 0.24)
  • Target distance to observation target 2: 8.0
  • Maximum velocity of observation moving body 1: 0.5
  • P gain: 0.1
  • the action plan processing unit 122 executes the following calculation based on the above input information to generate an observation action plan.
  • the observation action plan is generated based on the velocity vector for keeping the distance from the observation target 2, the velocity vector for moving away from the object, and the velocity vector for reducing the degree of influence.
  • The velocity vector for moving away from the object 31, whose magnitude is inversely proportional to the distance 7.4 between the observation moving body 1 and the object 31, is written as the following equation (Equation 8); with the values above it is approximately (-0.1, 0, 0).
  • Similarly, the velocity vector for moving away from the object 32, (-0.2, 0.1, -0.1), and the velocity vector for moving away from the object 33, (-0.2, -0.1, 0), are calculated.
  • The velocity vectors for moving away from the objects 31, 32, and 33 are added together, and the combined velocity vector for moving away from the objects, multiplied by the P gain of 0.1, is written as the following equation (Equation 9).
  • The velocity vector for reducing the degree of influence, obtained by multiplying the influence degree vector by the P gain, is (0, 0.15, 0.02).
  • By adding the velocity vector for keeping the distance from the observation target 2, the velocity vector for moving away from each object, and the velocity vector for reducing the degree of influence, the velocity vector 43 that constitutes the observation action plan is obtained. The velocity vector 43 of the observation action plan output from the action plan processing unit 122 is (0.25, 0.21, 0).
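  • Putting the above together, the observation action plan at this step is essentially the sum of three proportional-control terms (keep the target distance, back away from nearby objects, and move along the influence degree vector), clipped to the maximum velocity. The sketch below reproduces that structure with the published inputs; the exact gains and rounding used in the document may differ slightly, so its output is only approximately equal to the published velocity vector 43.

```python
import math

def observation_velocity(target_rel, influence_vec, obstacle_rels,
                         target_distance, max_speed, p_gain):
    """Combine the three velocity terms described above (illustrative sketch)."""
    dist = math.sqrt(sum(c * c for c in target_rel))
    direction = tuple(c / dist for c in target_rel)
    # 1) Keep the target distance (approach if too far, back off if too close).
    keep = tuple(p_gain * (dist - target_distance) * d for d in direction)
    # 2) Move away from each object, inversely proportional to its distance.
    away = [0.0, 0.0, 0.0]
    for obj in obstacle_rels:
        d2 = sum(c * c for c in obj)
        for i in range(3):
            away[i] -= p_gain * obj[i] / d2
    # 3) Reduce the degree of influence by moving along the influence vector.
    reduce_inf = tuple(p_gain * c for c in influence_vec)
    v = tuple(k + a + r for k, a, r in zip(keep, away, reduce_inf))
    speed = math.sqrt(sum(c * c for c in v))
    if speed > max_speed:                     # clip to the maximum velocity
        v = tuple(c * max_speed / speed for c in v)
    return v

v = observation_velocity(
    target_rel=(10.5, 2.3, -0.4),
    influence_vec=(0.03, 1.46, 0.24),
    obstacle_rels=[(7.4, 0.2, 0.1), (3.1, -1.9, 1.1), (5.3, 2.4, -0.3)],
    target_distance=8.0, max_speed=0.5, p_gain=0.1)
# Roughly (0.22, 0.21, 0.01); the text reports (0.25, 0.21, 0) for the velocity vector 43.
print(tuple(round(c, 2) for c in v))
```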
  • The observation moving body 1 switches from the advance action plan of following the observation target 2 to the newly generated observation action plan and moves from the position 41 to the position 42, so that the degree of influence that each object has on the observation can be reduced.
  • FIG. 6C is a schematic view showing the case where the observation target 2 is viewed from the position 42. As shown in FIG. 6C, when the observation moving body 1 moves to the position 42, each object does not exist in the influence space 20. That is, the observation action plan can be said to be an action plan for moving the observation moving body 1 until there are no more objects inside the influence space 20.
  • the radius of the influence space 20, the target distance to the observation target 2, the maximum speed of the observation moving body 1, the P gain, and the like are not limited and may be set arbitrarily. For example, it may be set by the user, or may be appropriately set according to the observation information of the sensor unit 3.
  • FIG. 7 is a schematic diagram showing an observation GUI for inputting an instruction to the observation mobile body 1.
  • FIG. 7A is a schematic diagram showing how the observation target 2 is selected. As shown in FIG. 7A, the user can set the observation target 2 via the observation GUI45.
  • the observation GUI 45 includes a display unit 46, a screen transition instruction unit 47, a movement instruction unit 48, an object selection unit 49, a position designation unit 50, and a landing instruction unit 51.
  • the display unit 46 displays a moving image acquired from the camera (sensor unit) of the observation moving body 1. In FIG. 7A, a state in which two people are moving within the angle of view of the camera mounted on the observation moving body 1 is displayed.
  • the display mode displayed on the display unit 46 before each unit is selected will be referred to as a home screen.
  • the screen transition instruction unit 47 can change the display mode of the observation GUI 45. For example, when the movement instruction unit 48 is selected from the home screen, the display unit 46 shifts to the display mode of the movement instruction unit 48. When the screen transition instruction unit 47 is selected from this state, the display unit 46 transitions from the display mode of the movement instruction unit 48 to the home screen.
  • the movement instruction unit 48 can input an instruction regarding movement to the observation moving body 1. For example, it is possible to input various movement instructions such as forward, backward, turn, figure eight, ascending, and descending.
  • The target selection unit 49 can select any object or person displayed on the display unit 46 as the observation target. In the present embodiment, a broken line 52 is displayed so as to surround each person, and the observation target 2 can be set by the user touching the inside of the broken line 52.
  • In FIG. 7A, the observation GUI is in a mode in which the observation target 2 can be set because the target selection unit 49 has been selected by the user.
  • the position designation unit 50 can select a predetermined position, a landmark, or the like. For example, it is possible to move with a predetermined position specified by the user as a target point.
  • the landing instruction unit 51 can input an instruction regarding landing to the observation mobile body 1. For example, an instruction to land the observation mobile body 1 at a predetermined position selected by the user is input.
  • the attitude, orientation, speed, and the like of the aircraft when the observation mobile body 1 lands may be input.
  • the observation GUI 45 is not limited, and for example, a mode for taking a picture using the camera of the observation moving body 1 or a mode for returning to a charging position designated for charging may be set.
  • FIG. 7B is a schematic diagram showing how the observation target 2 is followed.
  • the person 53 is set as an observation target by the user.
  • the target icon 54 indicating that the observation target 2 has been set is displayed on the display unit 46.
  • the observation GUI 45 includes a display unit 46, a reselection unit 55, a movement instruction unit 48, a follow-up instruction unit 56, a position designation unit 50, and a landing instruction unit 51.
  • the reselection unit 55 can switch the observation target 2. For example, when the reselection unit 55 is selected, it switches to the observation GUI 45 shown in FIG. 7A.
  • the follow-up instruction unit 56 can input an action plan related to follow-up to the observation moving body 1 for the selected observation target 2. For example, it is possible to input an action plan such as making the observation moving object 1 follow the observation target 2 at a distance of 3 m while performing a circular motion around the observation target 2.
  • FIG. 7C is a schematic diagram showing the state in which the observation target 2 continues to be followed.
  • the observation GUI 45 includes a display unit 46, a return unit 57, a movement instruction unit 48, a follow-up stop unit 58, a position designation unit 50, and a landing instruction unit 51.
  • The return unit 57 can move the observation moving body 1 to a predetermined position. For example, by selecting the return unit 57, it is possible to return the observation moving body 1 to the point where it took off or to the current position of the user. When the follow-up stop unit 58 is selected while following is in progress, information indicating that the follow-up action plan is to be suspended can be input to the observation moving body 1.
  • FIG. 8 is a schematic diagram showing an observation GUI in which the influence space is displayed.
  • FIG. 8A is a schematic view showing how the influence space 60 set for the observation target 2 is displayed.
  • a predetermined color is shown in the area corresponding to the influence space 60 so that the influence space 60 can be identified by the user.
  • FIG. 8B is a schematic diagram showing the influence space and the influence on the observation. As shown in FIG. 8B, a part of the person 62 overlaps the influence space 60.
  • the degree of influence on the observation by the person 62 in the influence space 60 is shown in a color different from that of the influence space 60 so that the user can identify it.
  • the region 64 where the broken line 63 surrounding the person 62 and the influence space 60 overlap is shown as the degree of influence.
  • the method of displaying the degree of influence is not limited to this; for example, the area 64 may extend along the moving direction of the person 62. Further, for example, the shade of the color may be set according to the magnitude of the degree of influence.
  • in the observation moving body 1 according to the present embodiment, an action plan for observing the observation target 2 is generated based on the degree of influence that the object 5 in the influence space 4, which is set for the observation target 2 and affects the observation by the sensor unit 3, has on the observation. This makes it possible to exhibit high stability in observation.
  • the influence space that affects the observation of the target object by the sensor is specified.
  • the degree of influence of obstacles in the specified influence space on the observation is estimated, and an action plan is generated so that the degree of influence is reduced.
  • since the action plan is generated so as to avoid situations where the target is lost or the observation becomes partial, the continuity of the follow-up is improved.
  • the shape of the influence space is set as a cylinder. In addition to this, various shapes may be set as the influence space.
  • FIG. 9 is a schematic view showing another shape of the influence space.
  • FIG. 9A is a schematic view showing the influence space of the cone.
  • the influence space 70 is specified by the influence space identification unit 151 based on the following information.
  • Starting point 71 based on the position of the observation moving body 1
  • Direction vector 72 from the observation moving body 1 to the observation target 2
  • Length from the observation moving body 1 to the observation target 2 (height of the cone 70)
  • Radius of the bottom 73 or angle 75 of the generatrix 74
  • Based on this information, the conical influence space 70 can be uniquely determined, and the influence space can be specified.
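  • with the cone specified this way, whether a point (for example, an object position) falls inside the influence space can be checked with simple vector arithmetic. The sketch below is an illustration under the assumptions above (apex at the moving body, axis toward the observation target, a given height and half-angle of the generatrix); the function and parameter names are illustrative only.

```python
import numpy as np

def point_in_cone(point, apex, axis_dir, height, half_angle_rad):
    """Return True if `point` lies inside a cone-shaped influence space.

    apex: starting point of the cone (position of the observation moving body).
    axis_dir: direction vector from the moving body toward the observation target.
    height: length from the moving body to the target (height of the cone).
    half_angle_rad: angle between the axis and the generatrix.
    """
    p = np.asarray(point, dtype=float) - np.asarray(apex, dtype=float)
    axis = np.asarray(axis_dir, dtype=float)
    axis = axis / np.linalg.norm(axis)
    along = float(np.dot(p, axis))               # distance along the cone axis
    if along < 0.0 or along > height:
        return False                             # behind the apex or beyond the target
    radial = np.linalg.norm(p - along * axis)    # distance from the axis
    return radial <= along * np.tan(half_angle_rad)
```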
  • FIG. 9B is a schematic view showing the influence space of the quadrangular pyramid.
  • the influence space 80 is specified by the influence space identification unit 151 based on the following information.
  • Starting point 81 based on the position of the observation moving body 1
  • Bottom vector 83 perpendicular to the bottom 82
  • Direction vector 84 perpendicular to each side surface
  • Boundary point 85 of each surface
  • since the influence space 80 has four side surfaces, it is possible to specify the influence space 80 by determining four boundary points 85 and four direction vectors 84, one for each side surface.
  • FIG. 9C is a schematic view showing an influence space in the shape of a quadrangular frustum (truncated quadrangular pyramid).
  • the influence space 90 is specified by the influence space identification unit 151 based on the following information.
  • Direction vector 92 at the boundary point 91 of each boundary surface
  • For example, in FIG. 9C, since the influence space 90 has six faces, it is possible to specify the influence space 90 by determining six sets of boundary points 91 and direction vectors 92, one set for each face. In FIG. 9C, the boundary points and direction vectors of only three of the boundary surfaces are shown for simplicity.
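  • both the quadrangular pyramid of FIG. 9B and the frustum of FIG. 9C can be treated with the same half-space test: each face is given by one boundary point and one direction vector, and a point lies inside the influence space only if it is on the inner side of every face. The sketch below illustrates this idea, assuming the direction vectors are taken as inward-pointing normals; it is not code from the embodiment.

```python
import numpy as np

def point_in_convex_region(point, faces, tol=1e-9):
    """Half-space containment test for a convex influence space.

    faces: iterable of (boundary_point, inward_normal) pairs, one per face,
           e.g. four side faces for the pyramid in FIG. 9B or six faces for
           the frustum in FIG. 9C.
    Returns True if `point` lies on the inner side of every face.
    """
    p = np.asarray(point, dtype=float)
    for boundary_point, inward_normal in faces:
        b = np.asarray(boundary_point, dtype=float)
        n = np.asarray(inward_normal, dtype=float)
        if float(np.dot(p - b, n)) < -tol:   # negative signed distance -> outside this face
            return False
    return True
```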
  • the shape of the influence space may be changed as appropriate depending on the situation.
  • the influence space may be set to a cylinder when the distance between the observation moving body 1 and the observation target 2 is short, and the influence space may be set to a cone when the distance is long.
  • the size of the influence space may be changed so as to be proportional to the distance between the observation moving body 1 and the observation target 2.
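  • a minimal sketch of these two variations, with an assumed distance threshold and scale factor (near_threshold, radius_per_metre) chosen purely for illustration:

```python
import numpy as np

def choose_influence_space(drone_pos, target_pos,
                           near_threshold=5.0, radius_per_metre=0.2):
    """Pick a shape and size for the influence space from the drone-target distance."""
    d = float(np.linalg.norm(np.asarray(drone_pos, dtype=float)
                             - np.asarray(target_pos, dtype=float)))
    shape = "cylinder" if d < near_threshold else "cone"
    # Both the length and the radius of the space grow with the distance.
    return {"shape": shape, "length": d, "radius": radius_per_metre * d}
```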
  • the parameter for specifying the influence space (for example, the radius of the cylinder) is determined by the user.
  • the influence space may be set to a quadrangular pyramid according to the angle of view of the camera.
  • a quadrangular frustum may be set by excluding the region within a predetermined distance from the start point of the quadrangular pyramid.
  • the space between the ceiling and the floor may be set as the influence space.
  • the degree of influence is calculated as an amount inversely proportional to the distance between the object and the center line.
  • the degree of influence may be calculated by various other methods. For example, with respect to the line segment L connecting the observation moving body 1 and the observation target 2, the space may be divided into four regions by two planes: a plane P spanned by the line segment L and the gravity vector, and a plane Q perpendicular to the plane P and including the line segment L.
  • the degree of influence may then be calculated by referring to a degree of influence defined in advance for the region in which the object is located. Further, for example, the degree of influence calculated in this way may be adjusted so as to be inversely proportional to the distance between the line segment L and the object.
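  • the sketch below illustrates the calculation described here: the degree of influence of an object is taken to be inversely proportional to its distance from the line segment L connecting the observation moving body 1 and the observation target 2, optionally scaled by a weight assigned to the region in which the object lies. The region weight and the small epsilon guard are assumptions made for the illustration.

```python
import numpy as np

def distance_point_to_segment(point, seg_start, seg_end):
    """Shortest distance from `point` to the segment seg_start-seg_end (the center line L)."""
    p = np.asarray(point, dtype=float)
    a = np.asarray(seg_start, dtype=float)
    b = np.asarray(seg_end, dtype=float)
    ab = b - a
    denom = float(np.dot(ab, ab))
    t = 0.0 if denom == 0.0 else float(np.clip(np.dot(p - a, ab) / denom, 0.0, 1.0))
    return float(np.linalg.norm(p - (a + t * ab)))

def influence_degree(obj_pos, drone_pos, target_pos, region_weight=1.0, eps=1e-3):
    """Degree of influence: a region-dependent weight divided by the distance to the center line."""
    d = distance_point_to_segment(obj_pos, drone_pos, target_pos)
    return region_weight / max(d, eps)
```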
  • the observation action plan is generated by adding up the velocity vector for keeping the distance from the observation target 2, the velocity vector for moving away from the object, and the velocity vector for reducing the degree of influence.
  • the velocity vector of the observation action plan may be added to the velocity vector related to various controls such as the velocity vector for avoiding the collision with the object 5.
  • the degree of influence may be calculated for each position of the observation moving body 1, and an action plan may be generated such that the integrated value of the degree of influence along the route that the observation moving body 1 is planned to pass is minimized.
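  • the sketch below illustrates the velocity-vector composition described above: a term that keeps the distance to the observation target, a term that moves away from the object, and a term intended to reduce the degree of influence by moving the line of sight away from the object. The gains and the lateral-escape heuristic are assumptions for illustration, not the method of the embodiment.

```python
import numpy as np

def observation_velocity(drone_pos, target_pos, obj_pos,
                         keep_distance=3.0, k_keep=0.8, k_avoid=0.5, k_influence=0.5):
    """Sum of velocity terms used to build an observation action plan (illustrative only)."""
    drone = np.asarray(drone_pos, dtype=float)
    target = np.asarray(target_pos, dtype=float)
    obj = np.asarray(obj_pos, dtype=float)

    # 1) Keep the desired distance from the observation target.
    to_target = target - drone
    dist = float(np.linalg.norm(to_target))
    los = to_target / max(dist, 1e-6)                     # line-of-sight direction
    v_keep = k_keep * (dist - keep_distance) * los

    # 2) Move away from the object (weaker the farther away it is).
    from_obj = drone - obj
    v_avoid = k_avoid * from_obj / max(float(np.linalg.norm(from_obj)) ** 2, 1e-6)

    # 3) Reduce the degree of influence: shift sideways so the line of sight clears the object.
    obj_offset = obj - drone
    lateral = obj_offset - float(np.dot(obj_offset, los)) * los
    lat_norm = float(np.linalg.norm(lateral))
    v_influence = -k_influence * lateral / lat_norm if lat_norm > 1e-6 else np.zeros_like(drone)

    return v_keep + v_avoid + v_influence
```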
  • the situation prediction unit 147 may predict the velocity vector (movement direction and velocity) of the object, and calculate the degree of influence based on the prediction result.
  • the degree of influence is calculated based on the position of the object in the influence space.
  • the degree of influence may be calculated based on the size and type of the object. For example, when the object is a circular road sign, the coordinates corresponding to the edge of the road sign may be set as the position of the object.
  • the degree of influence is not limited to the gradient vector; a collision risk indicating the possibility of colliding with each obstacle, a shielding risk indicating the possibility that the observation target becomes unobservable, or the like may be treated as the degree of influence. Further, for example, the degree of influence may be calculated based on the position information of an object existing outside the influence space.
  • an action plan may be generated based on the moving path, speed, or the like of the moving object.
  • the communication unit 103 may perform mutual communication with another mobile body, acquire the action plan of the other mobile body, and the like.
  • in the above embodiment, the observation moving body 1 moves autonomously according to the action plan.
  • in addition to this, an observation action plan may be generated in response to a user operation (for example, specification of a moving direction or speed).
  • in that case, the user's operation is restricted, and the observation moving body 1 moves according to the observation action plan.
  • for example, an operation to move the observation moving body 1 to the right may be restricted, and the observation moving body 1 may instead be moved upward or to the left.
  • an observation action plan is generated based on the calculated degree of influence regardless of whether or not the observation of the observation target 2 can be continued.
  • the observation of the observation target 2 may be prioritized.
  • for example, depending on the route of the action plan given in advance, an observation action plan that reduces the degree of influence less than the observation action plan that reduces it earliest may be adopted.
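  • one way to read this trade-off is as a scoring problem: among candidate observation action plans, prefer one whose route stays close to the action plan given in advance, even if it reduces the degree of influence less. The sketch below illustrates that idea with an assumed weighting; the field names and weights are hypothetical.

```python
def select_observation_plan(candidates, w_influence=1.0, w_deviation=0.5):
    """Pick a candidate plan balancing influence reduction against route deviation.

    candidates: list of dicts with
      'influence_integral' - integrated degree of influence along the candidate route
      'deviation'          - how far the candidate route departs from the prior action plan
    Returns the candidate with the lowest combined cost.
    """
    def cost(plan):
        return w_influence * plan["influence_integral"] + w_deviation * plan["deviation"]
    return min(candidates, key=cost)
```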
  • the degree of influence is calculated from the object in the influence space.
  • the degree of influence may be calculated from various factors such as wind, rain, surrounding animals and light sources.
  • FIG. 10 is a block diagram showing a hardware configuration example of the autonomous movement control unit 110.
  • the autonomous movement control unit 110 includes a CPU 201, a ROM 202, a RAM 203, an input / output interface 205, and a bus 204 that connects them to each other.
  • a display unit 206, an input unit 207, a storage unit 208, a communication unit 209, a drive unit 210, and the like are connected to the input / output interface 205.
  • the display unit 206 is a display device using, for example, a liquid crystal or an EL.
  • the input unit 207 is, for example, a keyboard, a pointing device, a touch panel, or other operating device. When the input unit 207 includes a touch panel, the touch panel can be integrated with the display unit 206.
  • the storage unit 208 is a non-volatile storage device, for example, an HDD, a flash memory, or other solid-state memory.
  • the drive unit 210 is a device capable of driving a removable recording medium 211 such as an optical recording medium or a magnetic recording tape.
  • the communication unit 209 is a modem, router, or other communication device for communicating with another device that can be connected to a LAN, WAN, or the like.
  • the communication unit 209 may communicate using either wired or wireless.
  • the communication unit 209 is often used separately from the autonomous movement control unit 110. In the present embodiment, the communication unit 209 enables communication with other devices via the network.
  • Information processing by the autonomous movement control unit 110 having the hardware configuration as described above is realized by the cooperation between the software stored in the storage unit 208 or the ROM 202 or the like and the hardware resources of the autonomous movement control unit 110.
  • the information processing method according to the present technology is realized by loading and executing the program constituting the software stored in the ROM 202 or the like into the RAM 203.
  • the program is installed in the autonomous movement control unit 110 via, for example, the recording medium 211.
  • the program may be installed in the autonomous mobile control unit 110 via a global network or the like.
  • in addition, any computer-readable non-transitory storage medium may be used as the medium for installing the program.
  • by linking a computer mounted on a communication terminal with another computer capable of communicating via a network or the like, the mobile body, information processing device, information processing method, and program according to the present technology may be executed, and an information processing device according to the present technology may be constructed.
  • the mobile body, information processing device, information processing method, and program according to the present technology can be executed not only in a computer system composed of a single computer but also in a computer system in which a plurality of computers operate in conjunction with each other.
  • the system means a set of a plurality of components (devices, modules (parts), etc.), and it does not matter whether or not all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and one device in which a plurality of modules are housed in one housing are both systems.
  • execution of the mobile body, information processing device, information processing method, and program according to the present technology by a computer system includes, for example, both the case where the identification of the influence space, the calculation of the degree of influence, the generation of the action plan, and the like are performed by a single computer, and the case where each process is performed by a different computer. Further, the execution of each process by a predetermined computer includes causing another computer to execute a part or all of the process and acquiring the result.
  • the mobile body, information processing device, information processing method, and program according to the present technology can also be applied to a cloud computing configuration in which one function is shared and jointly processed by a plurality of devices via a network.
  • the effects described in this disclosure are merely examples and are not limited, and other effects may be obtained.
  • the description of the plurality of effects described above does not necessarily mean that those effects are exerted at the same time. It means that at least one of the above-mentioned effects can be obtained depending on the conditions and the like, and of course, there is a possibility that an effect not described in the present disclosure may be exhibited.
  • the present technology can also adopt the following configurations.
  • the space specifying unit is a mobile body that specifies the affected space based on observation information related to the observation of the sensor unit.
  • the observation information includes at least one of the angle of view of the sensor unit or the observable distance of the sensor unit.
  • the space specifying unit is a moving body that specifies the affected space based on a line segment connecting the moving body and the observation target.
  • (6) The moving body according to any one of (1) to (5), further comprising a calculation unit that calculates the degree of influence based on at least one of the shape, size, position, or velocity of the object.
  • (7) The moving body according to (6), in which the calculation unit calculates the degree of influence based on the position information of the object in the influence space.
  • (8) The moving body according to any one of (1) to (7), in which the action plan generation unit generates the action plan so as to reduce the degree of influence.
  • the action plan generation unit is a moving body that integrates the observation action plan into a predetermined action plan, using the action plan for observing the observation target as the observation action plan.
  • the action plan generation unit is a moving body that restricts the action plan for observing the observation target given in advance and executes the observation action plan.
  • the action plan generation unit is a moving body that totals the respective degrees of influence and generates the action plan.
  • the action plan generation unit is a mobile body that generates the action plan based on a predetermined action of the observation target predicted by the action prediction unit.
  • the action plan generation unit is a moving body that generates the action plan based on the predetermined action when the observation target moves out of the space observable by the sensor unit.
  • the GUI output unit is an information processing device that outputs a GUI in which the degree of influence is identifiable.
  • Processing method.
  • (20) A program that causes a computer system to execute a step of generating an action plan for observing the observation target based on the degree of influence that an object in the influence space has on the observation.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

According to one embodiment, the present invention provides a moving body comprising a sensor unit and an action plan generation unit. The sensor unit can observe an object to be observed. The action plan generation unit generates an action plan for observing the object to be observed, according to the degree of influence with which an object in an influence space affects the observation of the object to be observed by the sensor unit. As a result, high stability in observation can be exhibited.
PCT/JP2020/048172 2020-01-07 2020-12-23 Corps mobile, dispositif de traitement d'informations, procédé de traitement d'informations et programme WO2021140916A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-000801 2020-01-07
JP2020000801 2020-01-07

Publications (1)

Publication Number Publication Date
WO2021140916A1 true WO2021140916A1 (fr) 2021-07-15

Family

ID=76788650

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/048172 WO2021140916A1 (fr) 2020-01-07 2020-12-23 Corps mobile, dispositif de traitement d'informations, procédé de traitement d'informations et programme

Country Status (1)

Country Link
WO (1) WO2021140916A1 (fr)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006296760A (ja) * 2005-04-20 2006-11-02 Seventh Dimension Design:Kk 光送受信装置制御システム
JP2009217448A (ja) * 2008-03-10 2009-09-24 Mitsubishi Electric Corp 画像情報を用いた人物追跡システム
JP2010015194A (ja) * 2008-06-30 2010-01-21 Ihi Corp 自律移動ロボット装置及び自律移動ロボット装置の制御方法
JP2014123306A (ja) * 2012-12-21 2014-07-03 Secom Co Ltd 自律飛行ロボット
JP2018147337A (ja) * 2017-03-08 2018-09-20 日本電気株式会社 自律移動ロボット、自律移動ロボットの制御方法および制御プログラム
JP2019114036A (ja) * 2017-12-22 2019-07-11 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd 情報処理装置、飛行制御指示方法、プログラム、及び記録媒体
JP2019146087A (ja) * 2018-02-22 2019-08-29 キヤノン株式会社 情報処理装置、撮像装置の制御方法、コンピュータプログラム、及び記憶媒体
JP2019168886A (ja) * 2018-03-23 2019-10-03 カシオ計算機株式会社 検出体領域検出装置、撮像装置、飛行装置、検出体領域検出方法、撮像方法及びプログラム
WO2019244626A1 (fr) * 2018-06-18 2019-12-26 ソニー株式会社 Unité mobile et procédé de commande

Similar Documents

Publication Publication Date Title
US11787543B2 (en) Image space motion planning of an autonomous vehicle
US11537131B2 (en) Control device, control method, and mobile body
US11427218B2 (en) Control apparatus, control method, program, and moving body
JP7259749B2 (ja) 情報処理装置、および情報処理方法、プログラム、並びに移動体
EP3722904B1 (fr) Vehicule mobile, procédé de commande et programme de commande
WO2017143588A1 (fr) Systèmes et procédés de réglage de trajectoire d'uav
Omari et al. Visual industrial inspection using aerial robots
US11822341B2 (en) Control device, control method, and mobile object to estimate the mobile object's self-position
US20210365038A1 (en) Local sensing based autonomous navigation, and associated systems and methods
JP2018535487A (ja) Uav経路を計画し制御するシステム及び方法
WO2019098002A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme et corps mobile
KR20140144921A (ko) 가상현실을 이용한 무인 자동차의 자율 주행 시뮬레이션 시스템
WO2020226085A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
EP3682306B1 (fr) Production de plan d'action lorsque la position propre est inconnue
WO2021140916A1 (fr) Corps mobile, dispositif de traitement d'informations, procédé de traitement d'informations et programme
US20230107289A1 (en) Information processing method, information processor, and program
WO2019176278A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme et corps mobile
WO2021187110A1 (fr) Objet mobile, dispositif de traitement d'informations, procédé de traitement d'informations, et programme
US20230150543A1 (en) Systems and methods for estimating cuboid headings based on heading estimations generated using different cuboid defining techniques

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20911520

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20911520

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP