WO2020144936A1 - Information processing device, information processing method, and program

Info

Publication number: WO2020144936A1 (PCT/JP2019/044630)
Authority: WIPO (PCT)
Prior art keywords: information, information processing, sensor, moving body, processing device
Application number: PCT/JP2019/044630
Other languages: French (fr), Japanese (ja)
Inventor: 駿 李
Original assignee: Sony Corporation (ソニー株式会社)

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C: AEROPLANES; HELICOPTERS
    • B64C 13/00: Control systems or transmitting systems for actuating flying-control surfaces, lift-increasing flaps, air brakes, or spoilers
    • B64C 13/02: Initiating means
    • B64C 13/16: Initiating means actuated automatically, e.g. responsive to gust detectors
    • B64C 13/20: Initiating means actuated automatically, e.g. responsive to gust detectors, using radiated signals
    • B64C 27/00: Rotorcraft; Rotors peculiar thereto
    • B64C 27/04: Helicopters
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02: Control of position or course in two dimensions
    • G05D 1/10: Simultaneous control of position or course in three dimensions
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present disclosure relates to an information processing device, an information processing method, and a program.
  • Patent Document 1 discloses, among other things, a control device that obtains information about the overall condition of a structure and, based on that information, generates flight information for a flying body that flies around the structure and images it.
  • the present disclosure proposes an information processing device, an information processing method, and a program capable of suppressing the power consumed by a plurality of sensors mounted on a mobile body.
  • According to the present disclosure, an information processing device includes a sensor control unit that controls the detection condition of each of a plurality of sensors mounted on a moving body, based on control information for controlling the moving body and environment information indicating the surroundings of the moving body, and an acquisition unit that acquires the sensor information detected by each of the plurality of sensors under the detection conditions.
  • According to the present disclosure, in an information processing method, a computer controls the detection condition of each of a plurality of sensors mounted on a moving body, based on control information for controlling the moving body and environment information indicating the surroundings of the moving body, and acquires the sensor information detected by each of the plurality of sensors under the detection conditions.
  • According to the present disclosure, a program causes a computer to function as a sensor control unit that controls the detection condition of each of a plurality of sensors mounted on a moving body, based on control information for controlling the moving body and environment information indicating the surroundings of the moving body, and as an acquisition unit that acquires the sensor information detected by each of the plurality of sensors under the detection conditions.
  • FIG. 1 is a diagram for explaining an example of a flying body provided with the information processing device according to the first embodiment.
  • FIG. 2 is a diagram showing a configuration example of the flying body provided with the information processing device according to the first embodiment.
  • FIG. 3 is a diagram showing an example of recognition of a dynamic object by the information processing device according to the first embodiment.
  • FIG. 4 is a diagram showing an example of recognition of the external environment by the information processing device according to the first embodiment.
  • FIG. 5 is a diagram showing an example of map information of the information processing device according to the first embodiment.
  • FIG. 6 is a diagram for explaining an example of processing of sensor information by the information processing device according to the first embodiment.
  • FIG. 7 is a flowchart showing an example of a processing procedure executed by the information processing device according to the first embodiment.
  • FIG. 8 is a diagram showing an example of determining the frame rate by the information processing device according to the first embodiment.
  • FIG. 9 is a diagram showing an example of the configuration of the information processing device according to the second embodiment.
  • FIG. 10 is a flowchart showing an example of a processing procedure executed by the information processing device according to the second embodiment.
  • FIG. 11 is a diagram for explaining an example of determining a detection condition of a sensor by the information processing device according to the second embodiment.
  • FIG. 12 is a hardware block diagram showing an example of a computer that implements the functions of the information processing device.
  • FIG. 1 is a diagram for explaining an example of an air vehicle provided with the information processing device according to the first embodiment.
  • the air vehicle 100 includes, for example, a drone capable of autonomous movement, an unmanned aerial vehicle (UAV), and the like.
  • the flying body 100 is an example of a moving body that can move autonomously.
  • Since the flying object 100 can fly in all directions, for example, forward and backward, left and right, and up and down, it is desirable to mount a plurality of sensors so as to cover all directions in order to fly without colliding with the external environment.
  • However, power consumption increases with the number of sensors, and suppressing power consumption is desired in the air vehicle 100, in which the weight of the battery that can be mounted is limited.
  • The flying object 100 processes the detection results of the plurality of sensors with a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and the heat generated by the CPU, GPU, and the like may rise as the processing load increases.
  • the flying body 100 may be referred to as “own aircraft”.
  • the air vehicle 100 includes a plurality of imaging devices 111 and an information processing device 10.
  • the plurality of imaging devices 111 and the information processing device 10 are electrically connected.
  • the flying vehicle 100 includes a plurality of imaging devices 111 so that different areas around the aircraft can be imaged.
  • the plurality of imaging devices 111 are examples of a plurality of sensors that detect the external environment of the own device.
  • the imaging device 111 includes, for example, a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, a Depth camera, and other cameras. Although the present disclosure describes a case where the flying object 100 has four imaging devices 111, the number is not limited.
  • the four image pickup devices 111 are appropriately referred to as an image pickup device 111a, an image pickup device 111b, an image pickup device 111c, and an image pickup device 111d.
  • the flying object 100 is flying (moving) in the traveling direction A.
  • the traveling direction A changes according to the flight direction of the flying object 100.
  • the imaging device 111a captures an image of the imaging region D1 of the flying object 100.
  • the imaging area D1 is an area in which the traveling direction A of the flying object 100 is imaged.
  • the imaging device 111b captures an image of the imaging region D2 of the flying object 100.
  • the imaging region D2 is a region in which the right direction of the aircraft 100 moving in the traveling direction A is imaged.
  • the imaging device 111c captures an image of the imaging region D3 of the flying object 100.
  • the imaging region D3 is a region in which a direction opposite to the traveling direction A of the flying object 100 is imaged.
  • the imaging device 111d captures an image of the imaging region D4 of the flying object 100.
  • the imaging region D4 is a region in which the left direction of the aircraft 100 moving in the traveling direction A is imaged.
  • When the information processing device 10 acquires sensor information (for example, an image) from the plurality of imaging devices 111, it executes algorithm processing on the sensor information.
  • the algorithm process includes, for example, a process of recognizing the state of the flying object 100, a process of recognizing the external environment of the flying object 100, a process of creating a map around the flying object 100, and the like.
  • the information processing device 10 has a function of correcting the action plan of the flying object 100 based on the sensor information and the processing result of the algorithm processing.
  • the action plan includes information such as a flight route, a flight speed, and a posture, for example.
  • the information processing device 10 dynamically controls the imaging condition (detection condition) of each of the plurality of imaging devices 111 of the flying object 100 that is flying based on the action plan.
  • the information processing device 10 dynamically changes the respective detection conditions of the plurality of sensors mounted on the air vehicle 100 according to, for example, the operating state of the own device, the state of the external environment of the own device, and the like.
  • FIG. 2 is a diagram illustrating a configuration example of an air vehicle 100 including the information processing device 10 according to the first embodiment.
  • the air vehicle 100 includes a sensor unit 110, a communication unit 120, a drive unit 130, and an information processing device 10.
  • the sensor unit 110, the communication unit 120, the drive unit 130, and the information processing device 10 are connected to each other so that data and signals can be exchanged.
  • the sensor unit 110 includes various sensors that detect sensor information used for the processing of the flying object 100, and supplies the detected data to each unit of the flying object 100.
  • the sensor unit 110 includes, for example, a plurality of imaging devices 111 and a state sensor 112.
  • the plurality of image pickup devices 111 include, for example, the above-described image pickup devices 111a, 111b, 111c, and 111d.
  • the state sensor 112 includes, for example, a gyro sensor, an acceleration sensor, a surrounding information detection sensor, a sensor for detecting the motor rotation speed, and the like.
  • the ambient information detection sensor detects, for example, an object around the flying object 100.
  • the ambient information detection sensor includes, for example, an ultrasonic sensor, radar, LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), sonar, and the like.
  • a plurality of state sensors 112 may be provided, like the image pickup device 111, for example.
  • the sensor unit 110 may include various sensors for detecting the current position of the flying object 100.
  • the sensor unit 110 may include a GPS (Global Positioning System) receiver, a GNSS receiver that receives a GNSS signal from a GNSS (Global Navigation Satellite System) satellite, and the like.
  • the sensor unit 110 may include a microphone that collects sounds around the air vehicle 100.
  • the communication unit 120 communicates with other flying objects different from its own, as well as various external electronic devices, information processing servers, base stations, and the like.
  • the communication unit 120 outputs the data received from the information processing server or the like to the information processing apparatus 10 or transmits the data from the information processing apparatus 10 to the information processing server or the like.
  • the communication protocol supported by the communication unit 120 is not particularly limited, and the communication unit 120 can support a plurality of types of communication protocols.
  • the drive unit 130 includes various devices related to the drive system of the aircraft 100.
  • the drive unit 130 includes, for example, a drive force generation device for generating drive force of a plurality of drive motors and the like.
  • the drive motor rotates, for example, the rotary wings of the flying object 100.
  • the drive unit 130 rotates the drive motor based on control information including a command from the information processing device 10, for example, so that the flying object 100 floats and flies.
  • The information processing device 10 includes an acquisition unit 11, a state recognition unit 12, an environment recognition unit 13, a map creation unit 14, a storage unit 15, a sensor control unit 16, and an operation control unit 17.
  • Each functional unit of the acquisition unit 11, the state recognition unit 12, the environment recognition unit 13, the map creation unit 14, the sensor control unit 16, and the operation control unit 17 is realized by, for example, a CPU or MPU executing a program stored inside the information processing device 10, using a RAM or the like as a work area. Each processing unit may also be realized by an integrated circuit such as an ASIC or FPGA.
  • the acquisition unit 11 acquires sensor information detected by the sensor unit 110 mounted on the aircraft 100. For example, the acquisition unit 11 acquires sensor information output from each of the four imaging devices 111a, 111b, 111c and 111d, and the state sensor 112. The acquisition unit 11 outputs the acquired sensor information to the state recognition unit 12, the environment recognition unit 13, the map creation unit 14, the operation control unit 17, and the like.
  • the acquisition unit 11 has a function of changing the detection condition for each sensor of the sensor unit 110, for example.
  • the detection condition includes, for example, the number of times of detection per unit time.
  • the detection condition includes the frame rate of each of the imaging device 111a, the imaging device 111b, the imaging device 111c, and the imaging device 111d.
  • the frame rate of the imaging device 111 means the number of frames processed per unit time in a moving image.
  • the detection condition may include a condition indicating whether to drive each of the plurality of sensors.
  • the acquisition unit 11 changes the detection condition in the sensor unit 110 by outputting information indicating the detection condition to the sensor unit 110.
  • the acquisition unit 11 can acquire the sensor information detected by each of the plurality of sensors of the sensor unit 110 based on the detection condition.
  • The sensor information may include the detection results of all of the plurality of sensors, or may include only the individual detection result of each sensor.
  • the state recognition unit 12 recognizes the state of the flying vehicle 100.
  • the state of the flying object 100 includes, for example, whether or not the vehicle is flying, the speed, the traveling direction A, and the altitude.
  • the state recognition unit 12 recognizes the speed and the traveling direction of the aircraft 100 based on the control information that controls the flight of the aircraft 100.
  • In other words, regardless of whether the air vehicle 100 moves autonomously or is operated by a person, the information processing apparatus 10 can acquire, immediately beforehand, control information such as the direction in which the air vehicle 100 will move and the attitude of the airframe.
  • Therefore, the state recognition unit 12 can recognize the future operation of the aircraft 100 in advance based on the control information.
  • the state recognition unit 12 can acquire the control information from the operation control unit 17 and recognize the state of the aircraft 100 according to the future operation based on the control information. Further, the state recognition unit 12 can recognize the speed and the traveling direction of the flying body 100 that actually flew, based on the sensor information of the state sensor 112. Then, the state recognition unit 12 outputs the recognized result and the like to the sensor control unit 16.
  • the environment recognition unit 13 recognizes environment information indicating the surroundings of the air vehicle 100 based on the sensor information acquired by the acquisition unit 11.
  • the environment information includes, for example, information indicating the external environment such as the surroundings of the flying vehicle 100 and a specific direction.
  • the environment information includes, for example, the presence or absence of obstacles such as a stationary object and a moving object, and the distance to the object.
  • the environment recognition unit 13 analyzes, for example, sensor information such as an image and a distance image including a depth value, and recognizes a distance to an object in the traveling direction A of the flying object 100 based on the analysis result. Then, the environment recognition unit 13 outputs the recognized result and the like to the sensor control unit 16.
  • FIG. 3 is a diagram showing an example of recognition of a dynamic object by the information processing device 10 according to the first embodiment.
  • the environment recognition unit 13 has a function of analyzing an image captured by the imaging device 111, a distance image, and the like to recognize the presence or absence of a dynamic object.
  • the dynamic object includes, for example, a person, an animal, another moving body, and the like.
  • the environment recognition unit 13 recognizes at least the object 200 in the forward direction A of the flying object 100 in the imaging region D1.
  • the object 200 is a dynamic object, but is not limited thereto.
  • FIG. 4 is a diagram showing an example of recognition of the external environment of the information processing device 10 according to the first embodiment.
  • the imaging device 111 outputs sensor information including an image G obtained by imaging the imaging region D1 in the traveling direction A of the flying object 100 to the information processing device 10.
  • the environment recognition unit 13 creates the attribute information C such as the semantic map of the image G by executing the semantic segmentation, for example.
  • the attribute information C includes a map displayed so that the user can recognize the attributes of the object shown in the image G.
  • Based on the attribute information C, the environment recognition unit 13 recognizes that an area such as the sky (the sky portion in the image G) in the imaging area D1 indicated by the image G is less likely to contain an object than other areas. Further, the environment recognition unit 13 recognizes that there is no object in the sky region in front of the flying object 100.
  • the map creation unit 14 creates map information around the air vehicle 100 based on the sensor information acquired by the acquisition unit 11.
  • the map information is, for example, information indicating a map of the surrounding environment.
  • the map creating unit 14 creates map information such as a semantic map by executing semantic segmentation, for example.
  • the semantic map is a bird's-eye view map in which the attributes of objects appearing in an image are displayed so that the user can recognize them.
  • FIG. 5 is a diagram showing an example of map information of the information processing device 10 according to the first embodiment.
  • For example, the map creation unit 14 creates map information M representing a bird's-eye view looking down obliquely from the sky.
  • The map information M includes information indicating a bird's-eye view map to which information (for example, color) indicating the attribute of each object is added.
  • the map creation unit 14 acquires an image captured by the imaging device 111 of the sensor unit 110.
  • the map creation unit 14 generates a texture map by converting the image into a bird's eye view. Various methods can be used for bird's-eye view conversion.
  • the map creation unit 14 acquires the depth data (sensor information) detected by the imaging device 111.
  • the depth data is data including the distance (depth value) to the object.
  • The map creation unit 14 creates an Occupancy Grid Map based on the depth data and adds semantics to the Occupancy Grid Map to create a semantic map. Semantics are, for example, the attributes (meanings) of the objects and image areas shown in an image.
  • the map creation unit 14 recognizes the attribute of an object (image area) shown in an image by performing semantic segmentation on the image captured by the image capturing device 111.
  • the map creation unit 14 creates map information M indicating the integrated map by integrating (for example, overlaying) the texture map and the semantic map.
  • The map information M is a bird's-eye view map in a three-dimensional voxel grid representation (a representation that expresses space with cubes).
  • In this way, the information processing device 10 generates the map information M so that information (for example, color or pattern) indicating the attribute of each image area and the three-dimensional shape of each object are included in the map. Then, the map creation unit 14 outputs the created map information M to the sensor control unit 16. A rough sketch of this voxel accumulation is shown below.
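As a rough illustration of the pipeline above, the following sketch accumulates labeled depth points into a semantic voxel grid and counts the newly created voxels (the quantity N used later as the degree-of-construction signal). The voxel size, label source, and data layout are illustrative assumptions, not the implementation of this disclosure.

```python
# Minimal sketch, not this disclosure's implementation: labeled depth points
# are quantized into a voxel grid; each occupied voxel stores a semantic label.
# The voxel size and the point/label format are assumptions for illustration.

VOXEL = 0.5  # assumed voxel edge length in meters

def to_voxel(x: float, y: float, z: float) -> tuple[int, int, int]:
    """Quantize a metric point to integer voxel coordinates."""
    return (int(x // VOXEL), int(y // VOXEL), int(z // VOXEL))

semantic_map: dict[tuple[int, int, int], str] = {}

def integrate(points: list[tuple[float, float, float, str]]) -> int:
    """Insert labeled depth points and return the number of newly created
    voxels -- the quantity N used below to gauge map construction."""
    new = 0
    for x, y, z, label in points:
        key = to_voxel(x, y, z)
        if key not in semantic_map:
            new += 1
        semantic_map[key] = label
    return new

# Two nearby "tree" points fall into one voxel; one "building" point is new.
n = integrate([(1.2, 0.3, 4.0, "tree"), (1.3, 0.4, 4.1, "tree"),
               (8.0, 0.0, 2.0, "building")])
print(n, len(semantic_map))  # 2 2
```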
  • the storage unit 15 stores various data and programs.
  • the storage unit 15 can store various kinds of information such as sensor information detected by the sensor unit 110, an action plan, control information of the flying object 100, and map information M.
  • the storage unit 15 can store information such as an action plan received from an external electronic device or the like via the communication unit 120 of the aircraft 100.
  • the storage unit 15 is electrically connected to, for example, the sensor control unit 16, the operation control unit 17, and the like.
  • the storage unit 15 is, for example, a RAM, a semiconductor memory device such as a flash memory, a hard disk, an optical disk, or the like.
  • the storage unit 15 may be provided in a cloud server connected to the information processing device 10 via the communication unit 120.
  • the sensor control unit 16 controls the detection conditions of each of the plurality of sensors in the sensor unit 110 based on the control information for controlling the flying object 100 and the environmental information indicating the surroundings of the flying object 100.
  • the sensor control unit 16 has a configuration capable of exchanging information with the acquisition unit 11, the state recognition unit 12, the environment recognition unit 13, the map creation unit 14, the storage unit 15, the operation control unit 17, and the like.
  • the sensor control unit 16 can acquire the recognition result from the state recognition unit 12 and the environment recognition unit 13.
  • the sensor control unit 16 can acquire map information from the map creation unit 14.
  • the sensor control unit 16 can acquire control information from the operation control unit 17.
  • In the following, a case will be described in which the sensor control unit 16 controls the detection conditions via the acquisition unit 11 by instructing the acquisition unit 11 about the detection condition of each of the plurality of sensors.
  • However, the present disclosure is not limited to this.
  • the sensor control unit 16 may be configured to directly instruct the sensor unit 110 about the detection conditions of each of the plurality of sensors and control the detection conditions.
  • the sensor control unit 16 controls the imaging conditions of each of the imaging devices 111a, 111b, 111c, and 111d based on the control information and the environment information.
  • the sensor control unit 16 controls the detection conditions of each of the plurality of sensors based on the control information, the environment information, and the map information.
  • the sensor control unit 16 controls the respective detection conditions of the imaging devices 111a, 111b, 111c, 111d and the state sensor 112.
  • The sensor control unit 16 controls the respective detection conditions of the plurality of imaging devices 111, the state sensor 112, and the like based on at least one of the speed and the traveling direction of the aircraft 100 recognized by the state recognition unit 12 and on the environment information recognized by the environment recognition unit 13. For example, when the flying object 100 is in the moving state, the sensor control unit 16 raises the frame rate of the imaging device 111a, which images the traveling direction A of the flying object 100, above the frame rates of the imaging devices 111b, 111c, and 111d, which image the other directions.
  • the moving state includes, for example, a state in which the aircraft 100 is moving in the air.
  • the sensor control unit 16 controls to extract an area necessary for detecting an obstacle in the imaging area D1 indicated by the sensor information of the imaging device 111a that images the traveling direction A of the flying object 100.
  • the sensor control unit 16 performs control to extract a region necessary for detecting an obstacle based on the attitude.
  • FIG. 6 is a diagram for explaining a processing example of sensor information of the information processing device 10 according to the first embodiment.
  • the attitude of the flying object 100 may change during flight.
  • For example, when the flying object 100 is flying in the traveling direction A, the flying object 100 changes between the posture P1 and the posture P2.
  • the posture P1 is a posture in which the flying body 100 flies horizontally with respect to a horizontal plane such as the ground.
  • the posture P2 is a posture in which the flying body 100 inclines with respect to the horizontal plane and flies.
  • the posture P2 is a posture in which the image capturing apparatus 111 captures a downward image.
  • the image captured by the image capturing apparatus 111a of the aircraft 100 differs between the posture P1 and the posture P2.
  • In the posture P1, the attention area R focused on by the information processing apparatus 10 is near the center of the imaging area D1.
  • In the posture P2, the attention area R focused on by the information processing apparatus 10 is near the upper portion of the imaging area D1. That is, the information processing apparatus 10 does not always need to pay attention to the entire image captured by the imaging device 111 while the flying object 100 is flying. Therefore, the information processing device 10 changes the attention area in the imaging region D1 captured by the imaging device 111a according to whether the flying object 100 is flying in the posture P1 or the posture P2, as illustrated in the sketch below.
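A minimal sketch of this attention-area switching follows; the band boundaries (thirds of the image height) and the boolean posture flag are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def region_of_interest(image: np.ndarray, tilted: bool) -> np.ndarray:
    """Return the image band to analyze: the central band when the airframe
    flies level (posture P1), the upper band when it pitches forward and the
    camera looks downward (posture P2)."""
    h = image.shape[0]
    if tilted:                             # posture P2
        return image[: h // 3]             # travel direction appears near the top
    return image[h // 3 : 2 * h // 3]      # posture P1: near the center

frame = np.zeros((480, 640, 3), dtype=np.uint8)  # dummy camera frame
print(region_of_interest(frame, tilted=False).shape)  # (160, 640, 3)
print(region_of_interest(frame, tilted=True).shape)   # (160, 640, 3)
```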
  • The sensor control unit 16 sets the frame rate of the imaging device 111a, which captures the traveling direction A, lower when no object exists in the traveling direction A of the flying object 100 than when an object exists there.
  • the sensor control unit 16 controls the detection conditions in the sensor unit 110 based on the degree of construction of map information. For example, the detection condition of the sensor of the sensor unit 110 that detects a portion having a high degree of construction of map information is relaxed.
  • the sensor control unit 16 controls the detection condition such that the frame rate of the imaging device 111 that images the direction required for creating the map information is prioritized.
  • The sensor control unit 16 performs control to increase the frame rate of the imaging device 111 when the speed of the flying object 100 increases, and to decrease the frame rate when the speed decreases.
  • In this way, the sensor control unit 16 determines the frame rates (imaging conditions) of the plurality of imaging devices 111.
  • For example, the sensor control unit 16 can change the frame rate between the minimum and the maximum according to the speed of the air vehicle 100, using Expression (1).
  • In Expression (1), FR_min indicates the minimum frame rate in the specifications of the imaging device 111, FR_max indicates the maximum frame rate, and V indicates the speed of the air vehicle 100; FR_min and FR_max differ depending on the specifications of the imaging device 111 and the like.
  • Using Expression (1), the sensor control unit 16 changes the frame rate FR according to the traveling speed in the direction the imaging device 111 faces. For example, the sensor control unit 16 raises the frame rate FR of the imaging device 111 facing the traveling direction A of the flying object 100, and lowers or stops the frame rates of the other imaging devices 111 relative to it, as in the sketch below.
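Expression (1) itself is not reproduced in this text, so the sketch below assumes a simple linear interpolation between FR_min and FR_max by speed; the top speed V_max and the sample rates are illustrative.

```python
def frame_rate_for_speed(v: float, v_max: float,
                         fr_min: float, fr_max: float) -> float:
    """Assumed stand-in for Expression (1): interpolate linearly between the
    camera's minimum and maximum frame rate according to speed v."""
    ratio = max(0.0, min(1.0, v / v_max))  # clamp speed to [0, v_max]
    return fr_min + (fr_max - fr_min) * ratio

# A camera rated 1-60 fps on a vehicle at half of its assumed 10 m/s top speed.
print(frame_rate_for_speed(v=5.0, v_max=10.0, fr_min=1.0, fr_max=60.0))  # 30.5
```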
  • The sensor control unit 16 relaxes the sensor detection conditions of the sensor unit 110 as the degree of construction of the map information created by the map creation unit 14 increases. For example, a portion of the map whose construction is well advanced need not be imaged at a high frame rate, and a completed portion need not be imaged by the imaging device 111 at all.
  • the degree of construction of the map information can be obtained, for example, based on the result of comparison between the created map information and the map information stored in the storage unit 15. Specifically, in the case of a map with a three-dimensional Voxel grid representation, the degree of construction of map information can be estimated by focusing on the number of newly created Voxels.
  • the degree of construction of the map information may be estimated based on the coordinates in the space indicated by the map information and the comparison result for each area.
  • the information processing device 10 calculates so as to reduce the frame rate of the imaging device 111 according to the degree of construction of the map information.
  • the sensor control unit 16 calculates the frame rate FR using the following formula (2) when the map information is the three-dimensional Voxel grid representation.
  • FR = FR_max * (1 - 1/(N + 1)) ... Expression (2)
  • Here, FR_max indicates the maximum frame rate in the specifications of the imaging device 111, and N indicates the number of newly created voxels.
  • The sensor control unit 16 calculates the frame rate FR of the imaging device 111 using Expression (2).
  • For example, as the degree of construction of the map information increases, the sensor control unit 16 lowers the frame rate of the imaging device 111 or stops it, as in the sketch below.
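Expression (2) can be exercised directly; only the camera's maximum rate and the voxel counts in this sketch are illustrative.

```python
def frame_rate_for_map_progress(fr_max: float, n_new_voxels: int) -> float:
    """Expression (2): FR = FR_max * (1 - 1 / (N + 1)). As the map nears
    completion, N (newly created voxels) approaches 0 and FR falls toward 0,
    at which point the camera may be stopped."""
    return fr_max * (1.0 - 1.0 / (n_new_voxels + 1))

print(frame_rate_for_map_progress(60.0, 0))  # 0.0  -> region fully built
print(frame_rate_for_map_progress(60.0, 9))  # 54.0 -> much still unmapped
```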
  • The sensor control unit 16 controls the frame rates so as to give priority to the imaging device 111 that captures the direction required to create the map information. For example, the sensor control unit 16 preferentially raises the frame rate of the imaging device 111 that captures the direction necessary for creating the map information, and lowers the frame rates of the other imaging devices 111. For example, when creating map information on the left side of the traveling direction A of the air vehicle 100, the information processing apparatus 10 sets the frame rates of the imaging devices 111a, 111c, and 111d, which capture the front, rear, and left sides, higher than the frame rate of the imaging device 111b, which captures the right side.
  • The information processing apparatus 10 may also set the frame rate of the imaging device 111d, which captures the left side, higher than the frame rates of the front and rear imaging devices 111a and 111c, and set the frame rate of the imaging device 111b, which captures the right side, the lowest.
  • the case of setting the frame rate to the lowest includes the case of setting the frame rate to 0 for a predetermined time, for example.
  • the operation control unit 17 controls the operation of the air vehicle 100 based on the sensor information acquired by the acquisition unit 11.
  • the operation control unit 17 controls the drive unit 130 of the aircraft 100.
  • The operation control unit 17 performs flight control of the flying object 100 to realize the action plan stored in the storage unit 15.
  • the operation control unit 17 outputs an operation command or the like for driving the flying object 100 to the drive unit 130.
  • The operation control unit 17 may have a function of planning and correcting an action plan regarding the movement of the flying object 100 based on, for example, the sensor information of the sensor unit 110.
  • the example of the functional configuration of the information processing device 10 according to the present embodiment has been described above.
  • the configuration described above with reference to FIG. 2 is merely an example, and the functional configuration of the information processing device 10 according to the present embodiment is not limited to this example.
  • the functional configuration of the information processing device 10 according to the present embodiment can be flexibly modified according to specifications and operation.
  • FIG. 7 is a flowchart showing an example of a processing procedure executed by the information processing apparatus 10 according to the first embodiment.
  • the processing procedure illustrated in FIG. 7 is realized by the information processing device 10 executing a program.
  • the processing procedure shown in FIG. 7 shows an example of the processing procedure for determining the detection conditions of the plurality of imaging devices 111 of the sensor unit 110.
  • the information processing device 10 acquires the sensor information by the acquisition unit 11 (step S101). For example, the information processing device 10 acquires the sensor information of each of the imaging device 111 and the state sensor 112 by the acquisition unit 11.
  • the information processing device 10 acquires control information from the operation control unit 17 (step S102). For example, the information processing device 10 acquires the control information immediately before controlling the operation of the flying object 100, the latest control information, and the like from the operation control unit 17.
  • the information processing device 10 executes three processes of step S110, step S120, and step S130 in parallel.
  • the information processing device 10 may be configured to sequentially execute the three processes of step S110, step S120, and step S130.
  • the information processing device 10 recognizes the state of the flying object 100 based on the acquired sensor information (step S110). For example, the information processing device 10 analyzes the sensor information of the state sensor 112 to recognize the states such as the speed, the traveling direction, and the altitude of the flying object 100, and outputs the recognition result to the sensor control unit 16. Further, the information processing apparatus 10 recognizes the state of the flying body 100 such as the speed, the traveling direction, and the altitude based on the control information for controlling the driving unit 130. That is, the information processing device 10 enables prediction of the operation of the flying object 100 by recognizing the current state and the future state of the flying object 100. The information processing device 10 functions as the state recognition unit 12 by executing the process of step S110.
  • the information processing device 10 calculates the frame rate of each of the plurality of imaging devices based on the state of the flying vehicle 100 and the control information in step S110 (step S111). For example, the information processing device 10 calculates the frame rate of the imaging device 111 that images the traveling direction A of the flying object 100 based on the speed, attitude, and the like of the flying object 100. For example, the information processing apparatus 10 calculates the frame rate of the image capturing apparatus 111 other than the traveling direction A of the flying object 100 so as to be lower than that of the image capturing apparatus 111 capturing the traveling direction A.
  • the information processing apparatus 10 calculates so that the frame rate of the imaging device 111 increases when the speed of the flying object 100 increases, and the frame rate of the imaging device 111 decreases when the speed of the flying object 100 decreases. Then, when the processing of step S111 ends, the information processing apparatus 10 advances the processing to step S141 described below.
  • The information processing device 10 recognizes the environmental information of the external environment based on the acquired sensor information (step S120). For example, the information processing device 10 analyzes the images, distance images, and the like captured by each of the plurality of imaging devices 111, and recognizes environmental information such as the presence or absence of an obstacle around the flying object 100 and the distance to the obstacle. For example, the information processing device 10 analyzes an image captured by the imaging device 111, a distance image, and the like, and recognizes environment information indicating the presence or absence of a dynamic object in the traveling direction A of the air vehicle 100. The information processing device 10 functions as the environment recognition unit 13 by executing the process of step S120.
  • The information processing device 10 calculates the frame rate of each of the plurality of imaging devices 111 based on the environment information in step S120 (step S121). For example, the information processing device 10 calculates the frame rate of the imaging device 111 based on the presence or absence of a dynamic object, an obstacle, or the like in the traveling direction A of the flying object 100. For example, the information processing apparatus 10 performs the calculation so that the frame rate is higher when a dynamic object, an obstacle, or the like is present in the imaging areas D1, D2, D3, and D4 captured by the imaging devices 111 than when no such object is present.
  • Further, when a dynamic object is recognized, the information processing apparatus 10 may calculate the frame rate of the imaging device 111 capturing the dynamic object to be higher than the frame rates of the other imaging devices 111.
  • Then, when the processing of step S121 ends, the information processing device 10 advances the process to step S141 described below.
  • The information processing device 10 creates map information based on the acquired sensor information (step S130). For example, when map information under creation is stored in the storage unit 15, the information processing device 10 creates the map information by updating it based on the sensor information acquired from the sensor unit 110. For example, when no map information under creation is stored in the storage unit 15, the information processing device 10 creates new map information based on the sensor information acquired from the sensor unit 110.
  • the information processing device 10 functions as the map creation unit 14 by executing the process of step S130.
  • The information processing device 10 calculates the frame rate of each of the plurality of imaging devices 111 based on the map information (step S131). For example, based on the map information being created, the information processing apparatus 10 determines which imaging devices 111 are necessary for creating the map information and calculates their frame rates. For example, the information processing apparatus 10 calculates the frame rate of an imaging device 111 to be lower as the degree of construction of the map information increases. For example, the information processing apparatus 10 controls the frame rates so that the imaging device 111 that captures the direction necessary for creating the map information is prioritized. When the process of step S131 ends, the information processing device 10 advances the process to step S141 described below.
  • the information processing device 10 determines the frame rate of each of the plurality of imaging devices based on the calculated frame rate (step S141).
  • the process of step S141 is executed by the information processing device 10 when the processes of step S111, step S121, and step S131 are completed.
  • For example, the information processing apparatus 10 compares the frame rate calculated from the state of the flying object 100, the frame rate calculated from the environment information, and the frame rate calculated from the degree of construction of the map information against the determination criterion, and determines the frame rate of each of the plurality of imaging devices 111.
  • In the present embodiment, a case will be described in which the determination criterion is to adopt the highest of the frame rates calculated for each imaging device 111; however, the criterion is not limited to this.
  • a criterion may be set according to the state of the air vehicle 100.
  • the information processing device 10 changes each of the frame rates of the plurality of imaging devices 111 based on the determined frame rate (step S142). For example, the information processing device 10 instructs each of the plurality of imaging devices 111 via the acquisition unit 11 to change the frame rate. As a result, each of the plurality of imaging devices 111 changes the existing frame rate in use to the frame rate instructed to be changed.
  • When step S142 ends, the information processing apparatus 10 advances the processing to step S150 described below.
  • The information processing device 10 functions as the sensor control unit 16 by executing the processes of step S111, step S121, step S131, and steps S141 to S142. A sketch of the step S141 decision is shown below.
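A minimal sketch of the decision in step S141, assuming the highest-rate criterion described above; the camera names are illustrative, and the candidate values are taken from the FIG. 8 example discussed below.

```python
# Per camera: candidate frame rates from the vehicle state (S111), the
# environment (S121), and the map construction degree (S131); the highest
# candidate wins (the criterion described for step S141).

candidates = {
    "cam_front": [60, 20, 20],  # imaging device 111a
    "cam_right": [0, 10, 1],    # imaging device 111b
    "cam_rear":  [0, 10, 1],    # imaging device 111c
    "cam_left":  [0, 10, 20],   # imaging device 111d
}

decided = {cam: max(rates) for cam, rates in candidates.items()}
print(decided)
# {'cam_front': 60, 'cam_right': 10, 'cam_rear': 10, 'cam_left': 20}
```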
  • The information processing device 10 determines whether or not to end the processing (step S150). For example, the information processing apparatus 10 determines to end when a stop of the operation of the flying object 100, a change of the flying object 100 to the standby state, or the like is detected. When the information processing apparatus 10 determines not to end the processing (No in step S150), it returns the processing to step S101 described above and executes the series of processes again. When the information processing apparatus 10 determines to end the processing (Yes in step S150), the processing procedure illustrated in FIG. 7 ends.
  • FIG. 8 is a diagram showing an example of determining the frame rate of the information processing device 10 according to the first embodiment.
  • the flying body 100 is flying in the traveling direction A by driving the driving unit 130 under the control of the information processing device 10.
  • the information processing device 10 acquires the sensor information including the image captured by the imaging device 111 by the acquisition unit 11.
  • the state recognition unit 12 recognizes the state of the flying object 100
  • the environment recognition unit 13 recognizes the environment information
  • the map creation unit 14 creates the map information.
  • the information processing device 10 calculates the frame rate for each of the four image pickup devices 111a, 111b, 111c, and 111d, as shown in FIG.
  • the frame rate calculated from the state of the flying object 100 is calculated as 60 fps (frames per second) for the imaging device 111a that images the front of the flying object 100 and 0 fps for the other imaging devices 111b, 111c, and 111d.
  • the frame rate calculated from the environment information is calculated as 20 fps for the image pickup apparatus 111a and 10 fps for the other image pickup apparatuses 111b, 111c, and 111d.
  • the frame rate calculated from the degree of construction of the map information is calculated as 20 fps for the image pickup devices 111a and 111d and 1 fps for the image pickup devices 111b and 111c.
  • the highest frame rate determined for each imaging device 111 is applied as the determination criterion.
  • the information processing apparatus 10 changes the frame rate of the image pickup apparatus 111a to 60 fps, the frame rate of the image pickup apparatus 111b to 10 fps, the frame rate of the image pickup apparatus 111c to 10 fps, and the frame rate of the image pickup apparatus 111d to 20 fps.
  • As a result, the imaging device 111a images the front in the traveling direction A at 60 fps, the imaging devices 111b and 111c image at 10 fps, and the imaging device 111d images at 20 fps. Therefore, the flying object 100 can suppress the power consumption of the imaging devices 111b, 111c, and 111d relative to that of the imaging device 111a.
  • Further, the information processing device 10 acquires from the sensor unit 110 sensor information including the images captured at the respective frame rates of the imaging devices 111a, 111b, 111c, and 111d. That is, the information processing apparatus 10 reduces the data amount of the sensor information of the imaging devices 111b, 111c, and 111d while maintaining the data amount of the sensor information of the imaging device 111a, which images the front of the flying vehicle 100. As a result, the amount of sensor information from the four imaging devices 111a, 111b, 111c, and 111d is suppressed, so the processing that uses the sensor information is reduced and the processing load on the information processing apparatus 10 can be mitigated. Then, in the information processing device 10, the operation control unit 17 controls the operation of the air vehicle 100 based on the sensor information acquired from the sensor unit 110 of the air vehicle 100.
  • As described above, the information processing apparatus 10 controls the detection condition of each of the plurality of sensors of the sensor unit 110 mounted on the air vehicle 100, based on the control information for controlling the air vehicle 100 and the environmental information indicating the surroundings of the air vehicle 100.
  • the information processing device 10 acquires sensor information detected by a plurality of sensors under detection conditions.
  • Thereby, the information processing device 10 can cause each of the plurality of sensors of the aircraft 100 to perform detection under individual detection conditions according to the control information and the environment information of the aircraft 100.
  • the information processing apparatus 10 can suppress the power consumption of the plurality of sensors mounted on the air vehicle 100 and the data amount of the acquired sensor information.
  • Further, the information processing device 10 creates map information of the surroundings of the flying object 100 based on the acquired sensor information, and controls the detection conditions of each of the plurality of sensors of the sensor unit 110 based on the control information, the environment information, and the map information. Thereby, the information processing apparatus 10 can cause each of the plurality of sensors of the air vehicle 100 to perform detection under individual detection conditions based on the control information, the environment information, and the map information. As a result, when creating the map information, the information processing apparatus 10 can suppress the power consumption of the plurality of sensors mounted on the air vehicle 100 and the data amount of the acquired sensor information.
  • the information processing device 10 recognizes at least one of the speed and the traveling direction A of the flying object 100 and the environmental information.
  • The information processing device 10 controls the detection conditions of each of the plurality of sensors of the sensor unit 110 based on the recognized speed and/or traveling direction A and the environment information. Accordingly, the information processing device 10 can cause each of the plurality of sensors of the flying object 100 to perform detection under individual detection conditions based on at least one of the speed and the traveling direction of the flying object 100.
  • As a result, the information processing apparatus 10 can suppress the power consumption of the plurality of sensors mounted on the air vehicle 100 and the data amount of the acquired sensor information without lowering the safety of the air vehicle 100.
  • The information processing device 10 controls the respective imaging conditions of the four imaging devices 111a, 111b, 111c, and 111d mounted on the air vehicle 100, based on the control information for controlling the air vehicle 100 and the environmental information indicating the surroundings of the air vehicle 100.
  • the information processing device 10 acquires the sensor information detected by the four imaging devices 111a, 111b, 111c, and 111d under the detection conditions.
  • Thereby, the information processing apparatus 10 can cause each of the four imaging devices 111a, 111b, 111c, and 111d of the flying object 100 to capture images under individual imaging conditions according to the control information and the environment information of the flying object 100.
  • As a result, the information processing device 10 can suppress the power consumption of the four imaging devices 111a, 111b, 111c, and 111d mounted on the air vehicle 100, and suppress the data amount of the acquired sensor information.
  • The information processing apparatus 10 raises the frame rate of the imaging device 111 that images the traveling direction A of the flying object 100 above the frame rates of the imaging devices 111 that image other directions. With this, the information processing apparatus 10 concentrates on imaging the traveling direction A of the flying object 100, so safety is not lowered even when the frame rates of the imaging devices 111 facing other directions are suppressed. As a result, the information processing device 10 can suppress the power consumption of the plurality of imaging devices 111 without lowering the safety of the flying object 100.
  • The information processing device 10 controls the imaging conditions of each of the plurality of imaging devices 111 based on the environmental information in the traveling direction A of the flying object 100. Thereby, the information processing apparatus 10 can control the imaging conditions of the imaging device 111 that images the traveling direction A separately from those of the imaging devices 111 that image other directions, in consideration of the external environment in the flight direction of the flying object 100. As a result, the information processing apparatus 10 can suppress the power consumption of the plurality of imaging devices 111 by changing the imaging conditions of the imaging devices 111 that capture directions other than the traveling direction A of the flying object 100.
  • The information processing apparatus 10 sets the frame rate of the imaging device 111 that captures the traveling direction A lower when no object exists in the traveling direction A of the flying object 100 than when an object exists there. Thereby, the information processing apparatus 10 can change the imaging condition of the imaging device 111 that captures the traveling direction A based on the presence or absence of an object in the direction in which the flying object 100 is flying. As a result, the information processing device 10 can suppress the power consumption of the imaging device 111 that images the traveling direction A of the flying object 100.
  • The information processing apparatus 10 performs control to increase the frame rate of the imaging device 111 when the speed of the flying object 100 increases, and to decrease the frame rate when the speed decreases. Thereby, the information processing apparatus 10 can reduce the power consumption of the imaging device 111 by lowering its frame rate when the speed of the flying object 100 decreases. As a result, the information processing device 10 can suppress the power consumption of the plurality of imaging devices 111 without lowering the safety of the flying object 100.
  • the information processing device 10 also controls the detection condition of the sensor of the sensor unit 110 based on the degree of construction of the map information M. As a result, the information processing apparatus 10 can change the detection condition of the sensor that detects a portion of the created map information having a high degree of construction. As a result, the information processing device 10 can suppress the power consumption of the sensor as the degree of construction of the map information being created increases.
  • the information processing apparatus 10 controls the imaging conditions such that the imaging device 111 that takes an image in the direction required to create the map information has priority.
  • the information processing apparatus 10 can relax the imaging conditions of the imaging apparatus 111 that captures an image in a direction other than the direction required for creating map information.
  • As a result, the information processing device 10 can suppress the power consumption of the imaging devices 111 that are not necessary for creating the map information, without reducing the efficiency of creating the map information.
  • the operation control unit 17 controls the operation of the aircraft 100 based on the sensor information acquired by the acquisition unit 11 from the sensor unit 110 of the aircraft 100.
  • Thereby, the information processing device 10 acquires the sensor information detected by the sensors of the sensor unit 110 under individual detection conditions according to the control information and the environment information of the flying object 100, and can control the flying object 100 based on that sensor information.
  • the information processing device 10 can suppress the power consumption of the flying body 100 that controls flight by controlling the detection conditions of the plurality of sensors mounted on the flying body 100.
  • the information processing device 10 can change the imaging condition of the imaging device 111 included in the sensor unit 110 of the flying object 100.
  • For example, the sensor control unit 16 of the information processing device 10 may control the imaging conditions of the imaging device 111 based on the attribute information of the imaging region D1 indicated by the sensor information of the imaging device 111 that images the traveling direction A of the flying object 100.
  • the sensor control unit 16 of the information processing device 10 executes the semantic segmentation of the image G included in the sensor information of the imaging device 111 acquired by the acquisition unit 11 to obtain the image. It is possible to classify the area in the imaging area indicated by G.
  • the sensor control unit 16 performs control for lowering the frame rate of the imaging device 111. For example, when the area in the traveling direction A indicated by the image G has the sky attribute, the sensor control unit 16 can lower the frame rate of the image pickup apparatus 111 below the frame rate at the time of detecting an object.
  • the information processing apparatus 10 controls the imaging conditions of the imaging device 111 based on the attribute information of the imaging region D1 indicated by the sensor information of the imaging device 111 that captures the traveling direction A of the flying object 100. Thereby, the information processing device 10 can change the imaging condition of the imaging device 111 in the traveling direction A of the flying object 100 based on the possibility that an object exists. As a result, the information processing device 10 can suppress the power consumption of the aircraft 100 without lowering its safety in the traveling direction A.
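For illustration only, the following is a minimal sketch of the attribute-driven frame rate selection described above; it is not part of the original disclosure, and the attribute labels, rates, and function name are assumptions.

```python
# Hypothetical sketch: choose the forward camera's frame rate from the
# dominant attribute recognized in the traveling direction. The labels
# and rate values are illustrative assumptions, not values from the text.

FR_OBJECT_DETECTION = 60  # rate used while objects may be present (assumed)
FR_SKY = 10               # reduced rate when only sky lies ahead (assumed)

def forward_frame_rate(attribute: str) -> int:
    """Return a frame rate for the imaging device facing the traveling
    direction, given the dominant attribute of its imaging region."""
    if attribute == "sky":
        # No obstacle is likely ahead, so a lower rate saves power.
        return FR_SKY
    return FR_OBJECT_DETECTION

print(forward_frame_rate("sky"))       # -> 10
print(forward_frame_rate("building"))  # -> 60
```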
  • modification (1) of the first embodiment may be applied to the information processing device 10 of another embodiment or modification.
  • the information processing apparatus 10 can change the imaging condition of the imaging device 111 included in the sensor unit 110 of the flying object 100 according to the attitude of the flying object 100.
  • when the sensor information acquired by the acquisition unit 11 includes information indicating the attitude of the flying object 100, the sensor control unit 16 may perform control for changing the imaging area of the imaging device 111 necessary for detecting an obstacle based on that attitude.
  • when the flying object 100 flies without tilting in the attitude P1, the sensor control unit 16 of the information processing device 10 sets the attention area R near the center of the imaging area D1 in the image included in the sensor information of the imaging device 111 acquired by the acquisition unit 11. Then, the sensor control unit 16 causes the environment recognition unit 13 to execute recognition processing based on the attention area R. Further, when the flying body 100 tilts and flies in the attitude P2, the sensor control unit 16 sets the attention area R near the upper portion of the imaging area D1 in the image included in the sensor information of the imaging device 111 acquired by the acquisition unit 11.
  • the sensor control unit 16 causes the environment recognition unit 13 to execute recognition processing based on the attention area R.
  • the environment recognition unit 13 does not need to analyze the entire image included in the sensor information, so that it is possible to reduce processing such as image analysis and image recognition.
  • when the sensor information has information indicating the attitude of the flying object 100, the information processing apparatus 10 according to the present embodiment performs control for changing the imaging area of the imaging device 111 used for detecting obstacles based on that attitude. As a result, the information processing apparatus 10 can change the imaging area of the imaging device 111 necessary for detecting an obstacle according to a change in the attitude of the flying object 100. Since the information processing apparatus 10 then only has to process the information of a partial area of the imaging area according to the attitude of the flying object 100 during flight, the processing load can be reduced.
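As a rough illustration of shifting the attention area with attitude, a sketch follows; the linear pitch-to-offset mapping and all parameter values are assumptions and not taken from the disclosure.

```python
# Hypothetical sketch: shift the attention area R within the captured
# image according to the body's pitch. Level flight (attitude P1) keeps
# the region of interest centered; a forward tilt (attitude P2) moves it
# toward the top of the imaging area.

def attention_region(image_h: int, image_w: int, pitch_deg: float,
                     roi_h: int, max_pitch_deg: float = 45.0):
    """Return (top, left, height, width) of the attention area R."""
    # 0 deg -> ROI centered; max_pitch_deg -> ROI at the top edge.
    frac = min(max(pitch_deg / max_pitch_deg, 0.0), 1.0)
    centered_top = (image_h - roi_h) // 2
    top = int(centered_top * (1.0 - frac))
    return top, 0, roi_h, image_w

print(attention_region(480, 640, pitch_deg=0.0, roi_h=200))   # centered (P1)
print(attention_region(480, 640, pitch_deg=30.0, roi_h=200))  # shifted up (P2)
```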
  • the information processing apparatus 10 has been described as focusing on an area of the image in the acquired sensor information as the control for changing the imaging area of the imaging apparatus 111, but the present invention is not limited to this.
  • the sensor control unit 16 of the information processing device 10 may cause the acquisition unit 11 and the environment recognition unit 13 to extract only the information of the imaging region of the imaging device 111 necessary for detecting an obstacle from the image of the sensor information acquired from the sensor unit 110.
  • the sensor control unit 16 of the information processing device 10 may be configured to cause the sensor unit 110 to output sensor information indicating only the information of the imaging region of the imaging device 111 necessary for detecting an obstacle.
  • modified example (2) of the first embodiment may be applied to the information processing device 10 of other embodiments and modified examples.
  • FIG. 9 is a diagram illustrating an example of the configuration of the information processing device 10A according to the second embodiment.
  • the aircraft 100 includes a sensor unit 110, a communication unit 120, a drive unit 130, and an information processing device 10A.
  • the sensor unit 110, the communication unit 120, the drive unit 130, and the information processing device 10A are connected to each other so that data and signals can be exchanged.
  • the information processing device 10A includes an acquisition unit 11, a state recognition unit 12, an environment recognition unit 13, a storage unit 15, a sensor control unit 16, and an operation control unit 17. That is, the information processing apparatus 10A does not include the map creation unit 14 of the information processing apparatus 10 according to the first embodiment.
  • each functional unit is realized by, for example, a CPU, MPU, or the like executing a program stored inside the information processing device 10A using a RAM or the like as a work area. Further, each processing unit may be realized by an integrated circuit such as an ASIC or FPGA, for example.
  • FIG. 10 is a flowchart showing an example of a processing procedure executed by the information processing device 10A according to the second embodiment.
  • the processing procedure illustrated in FIG. 10 is realized by the information processing device 10A executing a program.
  • the processing procedure shown in FIG. 10 shows an example of the processing procedure for determining the detection conditions of the plurality of imaging devices 111 of the sensor unit 110.
  • the information processing apparatus 10A acquires sensor information by the acquisition unit 11 (step S101).
  • the information processing apparatus 10A acquires control information from the operation control unit 17 (step S102).
  • the information processing apparatus 10A executes two processes of step S110 and step S120 in parallel.
  • the information processing device 10A may be configured to sequentially execute the two processes of step S110 and step S120.
  • the information processing device 10A recognizes the state of the flying object 100 based on the acquired sensor information (step S110). When the process of step S110 ends, the information processing device 10A advances the process to step S161. Further, the information processing device 10A recognizes the environmental information of the external environment based on the acquired sensor information (step S120). When the process of step S120 ends, the information processing apparatus 10A advances the process to step S161.
  • the information processing apparatus 10A determines the detection conditions of each of the plurality of sensors of the sensor unit 110 based on the state of the flying object 100, control information, and environment information (step S161).
  • the process of step S161 is executed by the information processing apparatus 10A when the processes of step S110 and step S120 are completed.
  • the information processing apparatus 10A specifies, from among the plurality of sensors, the sensors to be used for detecting an object based on the current speed of the air vehicle 100, the traveling direction A, and the like, and determines the detection conditions of the specified sensors.
  • the information processing apparatus 10A determines the sensors to operate normally and the sensors to operate in a power-saving mode or to stop.
  • the information processing device 10A advances the process to step S162.
  • the information processing device 10A changes the detection condition of each of the plurality of sensors based on the determined detection conditions (step S162). For example, the information processing device 10A instructs each of the plurality of sensors of the sensor unit 110 to change its detection condition via the acquisition unit 11. As a result, each of the imaging devices 111, the state sensor 112, and the other sensors of the sensor unit 110 changes its current detection condition to the instructed detection condition.
  • when step S162 ends, the information processing device 10A advances the process to step S150 described below.
  • the information processing device 10A functions as the sensor control unit 16 by executing the processes of step S161 and step S162.
  • the information processing device 10A determines whether or not to end (step S150). When the information processing apparatus 10A determines that the processing is not to be ended (No in step S150), the processing is returned to step S101 described above, and the series of processing is executed. If the information processing apparatus 10A determines to end the processing (Yes in step S150), the information processing apparatus 10A ends the processing procedure illustrated in FIG.
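To make the step structure of FIG. 10 concrete, here is a hypothetical sketch of the loop using the sequential variant mentioned above; the class names, placeholder rule in step S161, and all values are assumptions, and only the step ordering follows the text.

```python
# Hypothetical sketch of the processing loop in FIG. 10 (steps S101, S102,
# S110, S120, S161, S162, S150). Steps S110 and S120 run sequentially here,
# which the text allows as an alternative to parallel execution.

class DummySensorUnit:
    def read_all(self):
        return {"speed": 2.0}                          # step S101 (stub)

    def apply(self, conditions):
        print("new detection conditions:", conditions)  # step S162

def control_loop(sensor_unit, iterations=3):
    for _ in range(iterations):
        info = sensor_unit.read_all()                  # step S101
        control = {"next_heading": "front-right"}      # step S102 (stub)
        state = {"speed": info["speed"]}               # step S110
        environment = {"obstacle_ahead": False}        # step S120
        # Step S161: keep the sensors needed for object detection active
        # and power-save the rest (simplified placeholder rule).
        conditions = {
            "front": "high_rate" if environment["obstacle_ahead"] else "active",
            "right": "active" if "right" in control["next_heading"] else "power_save",
            "rear": "power_save",
            "left": "power_save",
        }
        sensor_unit.apply(conditions)                  # step S162
    # Loop exit corresponds to the Yes branch of step S150.

control_loop(DummySensorUnit())
```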
  • FIG. 11 is a diagram for explaining an example of determining the detection condition of the sensor of the information processing apparatus 10A according to the second embodiment.
  • the flying body 100 is flying in the traveling direction A by driving the driving unit 130 under the control of the information processing device 10A.
  • the information processing apparatus 10A causes the image pickup apparatus 111a to function as a sensor used for detecting an object, and causes the other sensors to operate with low power consumption.
  • the information processing apparatus 10A recognizes that the traveling direction A is changed to the traveling direction A′ based on the action plan of the aircraft 100 immediately before controlling the driving unit 130.
  • the information processing apparatus 10A recognizes that it is necessary to recognize an object in the front and right directions of the air vehicle 100 based on the future speed and the traveling direction A′ of the air vehicle 100.
  • based on the control information for controlling the moving body and the environment information indicating the surroundings of the moving body, the information processing apparatus 10A determines detection conditions under which the sensors of the sensor unit 110 that detect the front and right sides are used for detecting an object and the other sensors operate with power saving. Then, the information processing device 10A changes the detection condition of each of the plurality of sensors based on the determined detection conditions.
  • the flying body 100 is changing from the traveling direction A to the traveling direction A′ as the driving unit 130 is driven under the control of the information processing device 10A.
  • the information processing apparatus 10A causes the image pickup apparatus 111a and the image pickup apparatus 111b to function as sensors used for detecting an object, and causes the other sensors to operate with low power consumption.
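As a rough sketch of selecting which imaging devices to keep active for a planned heading, consider the following; the camera azimuths and field-of-view half-angle are assumptions for illustration, not values from the disclosure.

```python
# Hypothetical sketch: given the planned traveling direction, keep only the
# imaging devices whose fields of view cover that direction active, and
# power-save the rest (as with 111a and 111b in the example above).

CAMERA_AZIMUTH_DEG = {"111a": 0, "111b": 90, "111c": 180, "111d": 270}

def active_cameras(planned_heading_deg: float, half_fov_deg: float = 60.0):
    """Return the set of cameras needed to watch the planned heading."""
    active = set()
    for name, azimuth in CAMERA_AZIMUTH_DEG.items():
        # Smallest signed angular difference between heading and camera axis.
        diff = abs((planned_heading_deg - azimuth + 180) % 360 - 180)
        if diff <= half_fov_deg:
            active.add(name)
    return active

# Heading straight ahead: only the front camera is needed.
print(active_cameras(0))   # -> {'111a'}
# Turning toward the front-right (45 deg): front and right cameras.
print(active_cameras(45))  # -> {'111a', '111b'}
```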
  • the information processing apparatus 10A controls the detection condition of each of the plurality of sensors of the sensor unit 110 mounted on the aircraft 100 based on the control information for controlling the aircraft 100 and the environmental information indicating the surroundings of the aircraft 100.
  • the information processing apparatus 10A acquires sensor information detected by a plurality of sensors under the detection conditions.
  • the information processing apparatus 10A can cause each of the plurality of sensors of the air vehicle 100 to perform detection under individual detection conditions according to the control information and the environment information of the air vehicle 100.
  • the information processing apparatus 10A can suppress the power consumption of the plurality of sensors mounted on the air vehicle 100 and the data amount of the acquired sensor information.
  • the control information is information for controlling the flying body 100 based on the action plan of the flying body 100. Accordingly, the information processing apparatus 10A can control the detection conditions of the plurality of sensors of the sensor unit 110 based on the control information immediately before controlling the flying object 100. As a result, since the information processing apparatus 10A can change the detection condition of the sensor immediately before the operation of the flying object 100, the power consumption of the flying object 100 can be further suppressed. In addition, the information processing apparatus 10A can perform control without lowering the safety of the flying object 100 by detecting the sensor under the detection condition suitable for the operation of the flying object 100.
  • the information processing devices 10 and 10A have been described as mounted on the flying body 100, but the present invention is not limited to this.
  • the information processing devices 10 and 10A may be provided outside the flying body 100 so as to be able to communicate with it, acquire sensor information by communicating with the flying body 100 via the communication unit 120, and remotely control the detection conditions of the plurality of sensors of the flying body 100.
  • the moving body may be, for example, an autonomously moving body that needs to monitor its surroundings, such as a vehicle (a motorcycle, a four-wheeled vehicle, a bicycle, etc.), a robot, or a dolly.
  • FIG. 12 is a hardware configuration diagram illustrating an example of a computer 1000 that realizes the functions of the information processing device 10.
  • the computer 1000 includes a CPU 1100, a RAM 1200, a ROM (Read Only Memory) 1300, an HDD (Hard Disk Drive) 1400, a communication interface 1500, and an input/output interface 1600.
  • the respective units of the computer 1000 are connected by a bus 1050.
  • the CPU 1100 operates based on a program stored in the ROM 1300 or the HDD 1400, and controls each part. For example, the CPU 1100 expands a program stored in the ROM 1300 or the HDD 1400 into the RAM 1200 and executes processing corresponding to various programs.
  • the ROM 1300 stores a boot program such as a BIOS (Basic Input Output System) executed by the CPU 1100 when the computer 1000 is started, a program dependent on the hardware of the computer 1000, and the like.
  • the HDD 1400 is a computer-readable recording medium that non-temporarily records a program executed by the CPU 1100, data used by the program, and the like.
  • the HDD 1400 is a recording medium that records a program according to the present disclosure, which is an example of the program data 1450.
  • the communication interface 1500 is an interface for connecting the computer 1000 to an external network 1550 (for example, the Internet).
  • the CPU 1100 receives data from another device or transmits the data generated by the CPU 1100 to another device via the communication interface 1500.
  • the input/output interface 1600 is an interface for connecting the input/output device 1650 and the computer 1000.
  • the CPU 1100 receives data from an input device such as a keyboard or a mouse via the input/output interface 1600.
  • the CPU 1100 also transmits data to an output device such as a display, a speaker, a printer, etc. via the input/output interface 1600.
  • the input/output interface 1600 may also function as a media interface for reading a program or the like recorded in a predetermined recording medium (medium).
  • the medium is, for example, an optical recording medium such as a DVD (Digital Versatile Disc), a magneto-optical recording medium such as an MO (Magneto-Optical disk), a tape medium, a magnetic recording medium, or a semiconductor memory.
  • the CPU 1100 of the computer 1000 executes the program loaded on the RAM 1200 to realize the functions of the acquisition unit 11, the state recognition unit 12, the environment recognition unit 13, the map creation unit 14, the sensor control unit 16, the operation control unit 17, and the like.
  • the HDD 1400 stores the program according to the present disclosure and the data in the storage unit 15. Note that the CPU 1100 reads the program data 1450 from the HDD 1400 and executes the program data, but as another example, these programs may be acquired from another device via the external network 1550.
  • the effects described in the present specification are merely explanatory or exemplifying ones, and are not limiting. That is, the technique according to the present disclosure may have other effects that are apparent to those skilled in the art from the description of the present specification, in addition to or instead of the above effects.
  • each step related to the processing of the information processing apparatus in this specification does not necessarily have to be processed in time series in the order described in the flowchart.
  • the steps related to the processing of the information processing device may be processed in a different order from the order described in the flowchart or may be processed in parallel.
  • An information processing apparatus including: a sensor control unit that controls the detection conditions of each of a plurality of sensors mounted on a moving body based on control information for controlling the moving body and environment information indicating the surroundings of the moving body; and an acquisition unit that acquires sensor information that each of the plurality of sensors has detected under the detection conditions.
  • The information processing device according to (1), further including a creation unit that creates map information around the moving body, wherein the sensor control unit controls the detection condition of each of the plurality of sensors based on the control information, the environment information, and the map information.
  • The information processing apparatus according to (2) above, further including: a state recognition unit that recognizes at least one of the speed and the traveling direction of the moving body based on the control information; and an environment recognition unit that recognizes the environment information based on the sensor information acquired by the acquisition unit, wherein the sensor control unit controls the detection conditions of each of the plurality of sensors based on the environment information recognized by the environment recognition unit and at least one of the speed and traveling direction of the moving body recognized by the state recognition unit.
  • The information processing device according to (2) or (3), wherein the plurality of sensors include a plurality of imaging devices provided to image different regions around the moving body, and the sensor control unit controls the imaging conditions of each of the plurality of imaging devices based on the control information and the environment information.
  • The information processing device according to any one of (4) to (7), wherein the sensor control unit performs control to increase the frame rate of the imaging device when the speed of the moving body increases, and performs control to decrease the frame rate of the imaging device when the speed of the moving body decreases.
  • The information processing device according to (6), wherein the sensor control unit controls the imaging condition so as to give priority to the imaging device that images a direction necessary for creating the map information.
  • The information processing apparatus according to any one of (1) to (13), wherein the control information is information for controlling the mobile body based on an action plan of the mobile body.
  • A mobile body including a plurality of sensors and an information processing device, the information processing device including: a sensor control unit that controls the detection conditions of each of the plurality of sensors based on control information for controlling the mobile body and environment information indicating the surroundings of the mobile body; and an acquisition unit that acquires sensor information that each of the plurality of sensors has detected under the detection conditions.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

Provided are an information processing device, an information processing method, and a program that make it possible to minimize the power consumed by a plurality of sensors mounted on a mobile body. The information processing device (10) comprises: a sensor control unit (16) that controls the respective detection conditions of a plurality of sensors mounted on a mobile body (100) on the basis of control information for controlling the mobile body (100) and environmental information indicating the surroundings of the mobile body (100); and an acquisition unit (11) that acquires the sensor information detected by the plurality of sensors under the respective detection conditions.

Description

Information processing apparatus, information processing method, and program
The present disclosure relates to an information processing device, an information processing method, and a program.
Patent Document 1 discloses a control device and the like that obtains information about the general condition of a structure and flies the periphery of the structure based on the information to generate flight information of a flying body that images the structure.
International Publication No. 2015/163106
In the above conventional technology, when a moving body such as a flying body can travel in all directions, for example front-rear, left-right, and up-down, it is desirable for its sensors to cover all directions. However, when the weight of the battery that can be mounted on the moving body is limited, there is a demand to suppress the power consumed by the plurality of sensors that image in all directions.
Therefore, the present disclosure proposes an information processing device, an information processing method, and a program capable of suppressing the power consumed by a plurality of sensors mounted on a mobile body.
In order to solve the above problems, an information processing device according to an aspect of the present disclosure includes: a sensor control unit that controls the detection conditions of each of a plurality of sensors mounted on a moving body based on control information for controlling the moving body and environment information indicating the surroundings of the moving body; and an acquisition unit that acquires sensor information that each of the plurality of sensors detected under the detection conditions.
Further, in an information processing method according to an aspect of the present disclosure, a computer controls the detection conditions of each of a plurality of sensors mounted on a moving body based on control information for controlling the moving body and environment information indicating the surroundings of the moving body, and acquires sensor information that each of the plurality of sensors detected under the detection conditions.
In addition, a program according to an aspect of the present disclosure causes a computer to function as: a sensor control unit that controls the detection conditions of each of a plurality of sensors mounted on a moving body based on control information for controlling the moving body and environment information indicating the surroundings of the moving body; and an acquisition unit that acquires sensor information that each of the plurality of sensors detected under the detection conditions.
FIG. 1 is a diagram for explaining an example of a flying object provided with the information processing device according to the first embodiment.
FIG. 2 is a diagram showing a configuration example of a flying object provided with the information processing device according to the first embodiment.
FIG. 3 is a diagram showing an example of recognition of a dynamic object by the information processing device according to the first embodiment.
FIG. 4 is a diagram showing an example of recognition of the external environment by the information processing device according to the first embodiment.
FIG. 5 is a diagram showing an example of map information of the information processing device according to the first embodiment.
FIG. 6 is a diagram for explaining a processing example of sensor information of the information processing device according to the first embodiment.
FIG. 7 is a flowchart showing an example of a processing procedure executed by the information processing device according to the first embodiment.
FIG. 8 is a diagram showing an example of frame rate determination by the information processing device according to the first embodiment.
FIG. 9 is a diagram showing an example of the configuration of the information processing device according to the second embodiment.
FIG. 10 is a flowchart showing an example of a processing procedure executed by the information processing device according to the second embodiment.
FIG. 11 is a diagram for explaining an example of determining the detection conditions of the sensors of the information processing device according to the second embodiment.
FIG. 12 is a hardware configuration diagram showing an example of a computer that realizes the functions of the information processing device.
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the drawings. In each of the following embodiments, the same portions are denoted by the same reference numerals, and duplicate description will be omitted.
(First embodiment)
[Outline of system according to first embodiment]
FIG. 1 is a diagram for explaining an example of an air vehicle provided with the information processing device according to the first embodiment.
As shown in FIG. 1, the air vehicle 100 includes, for example, a drone capable of autonomous movement, an unmanned aerial vehicle (UAV), and the like. The flying body 100 is an example of a moving body that can move autonomously. When the flying object 100 can fly in all directions, for example front-rear, left-right, and up-down, it is desirable to mount a plurality of sensors so as to cover all directions in order to fly without colliding with the external environment. However, when the air vehicle 100 is provided with sensors in all directions, the power consumption increases according to the number of sensors, and suppressing power consumption is desired in the air vehicle 100, in which the weight of the battery to be mounted is limited. In addition, when the flying object 100 processes the detection results of a plurality of sensors with a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), or the like, the heat of the CPU, GPU, and so on may rise due to the increased load. It is desired that the aircraft 100 suppress such a rise in heat without using a cooling device or the like, in order to limit the mounted weight. The present disclosure aims to address these circumstances of the air vehicle 100 equipped with a plurality of sensors. In the following description, the flying body 100 may be referred to as the "own aircraft".
The air vehicle 100 includes a plurality of imaging devices 111 and an information processing device 10. The plurality of imaging devices 111 and the information processing device 10 are electrically connected. The flying vehicle 100 includes a plurality of imaging devices 111 so that different areas around the aircraft can be imaged. The plurality of imaging devices 111 are an example of a plurality of sensors that detect the external environment of the own aircraft. The imaging device 111 includes, for example, a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, a depth camera, and other cameras. Although the present disclosure describes a case where the flying object 100 has four imaging devices 111, the number is not limited. Further, in the following description, when the four imaging devices 111 are distinguished from each other, they are referred to as the imaging device 111a, the imaging device 111b, the imaging device 111c, and the imaging device 111d as appropriate.
In the example shown in FIG. 1, it is assumed that the flying object 100 is flying (moving) in the traveling direction A. The traveling direction A changes according to the flight direction of the flying object 100. The imaging device 111a captures an image of the imaging region D1 of the flying object 100. The imaging region D1 is a region in which the traveling direction A of the flying object 100 is imaged. The imaging device 111b captures an image of the imaging region D2 of the flying object 100. The imaging region D2 is a region in which the right direction of the aircraft 100 moving in the traveling direction A is imaged. The imaging device 111c captures an image of the imaging region D3 of the flying object 100. The imaging region D3 is a region in which the direction opposite to the traveling direction A of the flying object 100 is imaged. The imaging device 111d captures an image of the imaging region D4 of the flying object 100. The imaging region D4 is a region in which the left direction of the aircraft 100 moving in the traveling direction A is imaged.
When the information processing device 10 acquires sensor information (for example, an image) from the plurality of imaging devices 111, the information processing device 10 executes algorithm processing on the sensor information. The algorithm processing includes, for example, processing for recognizing the state of the flying object 100, processing for recognizing the external environment of the flying object 100, processing for creating a map around the flying object 100, and the like. The information processing device 10 has a function of correcting the action plan of the flying object 100 based on the sensor information and the results of the algorithm processing. The action plan includes information such as a flight route, a flight speed, and an attitude, for example. The information processing device 10 dynamically controls the imaging condition (detection condition) of each of the plurality of imaging devices 111 of the flying object 100 that is flying based on the action plan. The information processing device 10 dynamically changes the respective detection conditions of the plurality of sensors mounted on the air vehicle 100 according to, for example, the operating state of the own aircraft, the state of its external environment, and the like.
[Configuration example of aircraft according to first embodiment]
Next, an example of the configuration of the information processing device 10 mounted on the aircraft 100 according to the first embodiment will be described. FIG. 2 is a diagram illustrating a configuration example of an air vehicle 100 including the information processing device 10 according to the first embodiment.
As shown in FIG. 2, the air vehicle 100 includes a sensor unit 110, a communication unit 120, a drive unit 130, and an information processing device 10. The sensor unit 110, the communication unit 120, the drive unit 130, and the information processing device 10 are connected to each other so that data and signals can be exchanged.
The sensor unit 110 includes various sensors that detect sensor information used for the processing of the flying object 100, and supplies the detected data to each unit of the flying object 100. In the present embodiment, the sensor unit 110 includes, for example, a plurality of imaging devices 111 and a state sensor 112. The plurality of imaging devices 111 include, for example, the above-described imaging devices 111a, 111b, 111c, and 111d. The state sensor 112 includes, for example, a gyro sensor, an acceleration sensor, a surrounding information detection sensor, a sensor for detecting the motor rotation speed, and the like. The surrounding information detection sensor detects, for example, objects around the flying object 100. The surrounding information detection sensor includes, for example, an ultrasonic sensor, radar, LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), sonar, and the like. A plurality of state sensors 112 may be provided, like the imaging devices 111, for example.
For example, the sensor unit 110 may include various sensors for detecting the current position of the flying object 100. Specifically, the sensor unit 110 may include, for example, a GPS (Global Positioning System) receiver, a GNSS receiver that receives GNSS signals from GNSS (Global Navigation Satellite System) satellites, and the like. The sensor unit 110 may also include a microphone that collects sounds around the air vehicle 100.
The communication unit 120 communicates with other flying objects different from the own aircraft, as well as various external electronic devices, information processing servers, base stations, and the like. The communication unit 120 outputs data received from an information processing server or the like to the information processing apparatus 10, and transmits data from the information processing apparatus 10 to the information processing server or the like. The communication protocol supported by the communication unit 120 is not particularly limited, and the communication unit 120 can support a plurality of types of communication protocols.
The drive unit 130 includes various devices related to the drive system of the aircraft 100. The drive unit 130 includes, for example, a driving force generation device for generating the driving force of a plurality of drive motors and the like. The drive motors rotate, for example, the rotary wings of the flying object 100. The drive unit 130 rotates the drive motors based on control information including commands from the information processing device 10, for example, so that the flying object 100 floats and flies.
Next, an example of the functional configuration of the information processing device 10 according to the present embodiment will be described. As illustrated in FIG. 2, the information processing device 10 includes an acquisition unit 11, a state recognition unit 12, an environment recognition unit 13, a map creation unit 14, a storage unit 15, a sensor control unit 16, and an operation control unit 17. Each functional unit of the acquisition unit 11, the state recognition unit 12, the environment recognition unit 13, the map creation unit 14, the sensor control unit 16, and the operation control unit 17 is realized by, for example, a CPU, MPU, or the like executing a program stored inside the information processing device 10 using a RAM or the like as a work area. Each processing unit may also be realized by an integrated circuit such as an ASIC or FPGA, for example.
The acquisition unit 11 acquires the sensor information detected by the sensor unit 110 mounted on the aircraft 100. For example, the acquisition unit 11 acquires the sensor information output by each of the four imaging devices 111a, 111b, 111c, and 111d and the state sensor 112. The acquisition unit 11 outputs the acquired sensor information to the state recognition unit 12, the environment recognition unit 13, the map creation unit 14, the operation control unit 17, and the like. The acquisition unit 11 has, for example, a function of changing the detection condition for each sensor of the sensor unit 110. The detection condition includes, for example, the number of detections per unit time. For example, the detection condition includes the frame rate of each of the imaging devices 111a, 111b, 111c, and 111d. The frame rate of the imaging device 111 means the number of frames processed per unit time in a moving image. The detection condition may also include a condition indicating whether to drive each of the plurality of sensors. For example, the acquisition unit 11 changes the detection condition in the sensor unit 110 by outputting information indicating the detection condition to the sensor unit 110. As a result, the acquisition unit 11 can acquire the sensor information detected by each of the plurality of sensors of the sensor unit 110 based on the detection condition. The sensor information may include the detection results of the plurality of sensors, or may include only the detection result of each individual sensor.
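For illustration, a detection condition pushed to each sensor might be represented as follows; this sketch is not part of the disclosure, and the record layout, field names, and rate values are assumptions.

```python
# Hypothetical sketch of a per-sensor detection condition the acquisition
# unit could send: an enable flag and a detection count per unit time
# (a frame rate, in the case of the imaging devices).

from dataclasses import dataclass

@dataclass
class DetectionCondition:
    sensor_id: str
    enabled: bool = True
    rate_hz: float = 30.0  # detections (frames) per second

def build_conditions(front_id, all_ids):
    """Full rate for the forward camera, reduced rate for the others."""
    return [
        DetectionCondition(sid, True, 60.0 if sid == front_id else 10.0)
        for sid in all_ids
    ]

for cond in build_conditions("111a", ["111a", "111b", "111c", "111d"]):
    print(cond)
```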
The state recognition unit 12 recognizes the state of the flying vehicle 100. The state of the flying object 100 includes, for example, whether or not it is flying, its speed, the traveling direction A, its altitude, and the like. The state recognition unit 12 recognizes the speed and traveling direction of the aircraft 100 based on the control information that controls the flight of the aircraft 100. For example, in the case of the air vehicle 100, the information processing apparatus 10 can acquire, immediately beforehand, control information such as the direction in which the air vehicle 100 will move and the attitude of its airframe, regardless of whether it moves autonomously or under human operation. In this case, the state recognition unit 12 can recognize the upcoming operation of the aircraft 100 in advance based on that control information. For example, the state recognition unit 12 can acquire the control information from the operation control unit 17 and recognize the state of the aircraft 100 corresponding to its upcoming operation based on the control information. The state recognition unit 12 can also recognize the actual speed and traveling direction of the flying body 100 based on the sensor information of the state sensor 112. The state recognition unit 12 then outputs the recognized results and the like to the sensor control unit 16.
The environment recognition unit 13 recognizes environment information indicating the surroundings of the air vehicle 100 based on the sensor information acquired by the acquisition unit 11. The environment information includes, for example, information indicating the external environment such as the surroundings of the flying vehicle 100 or a specific direction. The environment information includes, for example, the presence or absence of obstacles such as stationary or moving objects, and the distance to such objects. The environment recognition unit 13 analyzes, for example, sensor information such as images and distance images including depth values, and recognizes the distance to an object in the traveling direction A of the flying object 100 based on the analysis result. The environment recognition unit 13 then outputs the recognized results and the like to the sensor control unit 16.
FIG. 3 is a diagram showing an example of recognition of a dynamic object by the information processing device 10 according to the first embodiment. As illustrated in FIG. 3, the environment recognition unit 13 has a function of analyzing images, distance images, and the like captured by the imaging device 111 to recognize the presence or absence of a dynamic object. Dynamic objects include, for example, people, animals, other moving bodies, and the like. The environment recognition unit 13 recognizes at least the object 200 ahead of the flying object 100 in the traveling direction A in the imaging region D1. In the example shown in FIG. 3, the object 200 is a dynamic object, but the object is not limited to this.
FIG. 4 is a diagram showing an example of recognition of the external environment by the information processing device 10 according to the first embodiment. In the example illustrated in FIG. 4, the imaging device 111 outputs sensor information including an image G obtained by imaging the imaging region D1 in the traveling direction A of the flying object 100 to the information processing device 10. The environment recognition unit 13 creates attribute information C such as a semantic map of the image G, for example by executing semantic segmentation. The attribute information C includes a map that displays the attributes of the objects shown in the image G so that the user can recognize them. In the example illustrated in FIG. 4, based on the attribute information C, the environment recognition unit 13 recognizes that an area such as the sky (the sky portion in the image G) in the imaging region D1 indicated by the image G is less likely to contain an object than other areas. The environment recognition unit 13 also recognizes that no object exists in the sky region in front of the flying object 100.
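As a rough sketch of deriving attribute information like C from a segmentation result, the following computes the dominant attribute of the forward region from a per-pixel label map; the label ids and the example data are assumptions for illustration only.

```python
# Hypothetical sketch: from a per-pixel label map produced by semantic
# segmentation, determine which attribute dominates the forward imaging
# region (e.g., mostly "sky" means an object ahead is unlikely).

from collections import Counter

SKY, BUILDING, TREE = 0, 1, 2  # assumed label ids

def dominant_attribute(label_rows):
    """Return (label, fraction) of the most frequent label in the map."""
    counts = Counter(px for row in label_rows for px in row)
    label, n = counts.most_common(1)[0]
    return label, n / sum(counts.values())

# 5 rows of sky above 1 row of trees: the sky attribute dominates.
labels = [[SKY] * 8 for _ in range(5)] + [[TREE] * 8]
label, frac = dominant_attribute(labels)
print(label == SKY, round(frac, 2))  # True 0.83
```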
Returning to FIG. 2, the map creation unit 14 creates map information around the air vehicle 100 based on the sensor information acquired by the acquisition unit 11. The map information is, for example, information indicating a map of the surrounding environment. The map creation unit 14 creates map information such as a semantic map, for example by executing semantic segmentation. The semantic map is a bird's-eye view map in which the attributes of objects appearing in an image are displayed so that the user can recognize them.
FIG. 5 is a diagram showing an example of the map information of the information processing device 10 according to the first embodiment. As shown in FIG. 5, the map creation unit 14 creates map information M showing a bird's-eye view looking down obliquely from the sky. In this embodiment, the map information M is, for example, a bird's-eye view looking down obliquely from the sky. The map information M includes information indicating a bird's-eye view map to which information indicating the attributes of objects (for example, colors) is added.
For example, the map creation unit 14 acquires an image captured by the imaging device 111 of the sensor unit 110. The map creation unit 14 generates a texture map by converting the image into a bird's-eye view. Various methods can be used for the bird's-eye view conversion. The map creation unit 14 acquires the depth data (sensor information) detected by the imaging device 111. The depth data is data including the distance (depth value) to an object. The map creation unit 14 generates an occupancy grid map based on the depth data and creates a semantic map by adding semantics to the occupancy grid map. Semantics are, for example, the attributes of the objects shown in an image (the meaning of image regions). The map creation unit 14 recognizes the attributes of the objects (image regions) shown in an image by performing semantic segmentation on the image captured by the imaging device 111. The map creation unit 14 generates map information M indicating an integrated map by integrating (for example, overlaying) the texture map and the semantic map. In the present embodiment, the map information M is a type of bird's-eye view map in a three-dimensional voxel grid representation (a representation that expresses space with cubes). The information processing device 10 generates the map information M so that information indicating the attributes of image regions (for example, colors or patterns) and the three-dimensional shapes of objects are included in the map. The map creation unit 14 then outputs the created map information M to the sensor control unit 16.
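For illustration, the integration step might be sketched as follows; the dict-of-voxels storage, voxel size, and label values are assumptions, and the disclosure describes the concept rather than this particular data layout.

```python
# Hypothetical sketch: annotate an occupancy grid built from depth data
# with semantics to form a labeled voxel map (a simplified stand-in for
# overlaying the semantic map on the occupancy grid map).

def build_semantic_voxel_map(depth_points, labels):
    """depth_points: list of (x, y, z) hits; labels: parallel list of
    semantic labels for each point. Returns {voxel index: label}."""
    voxel_size = 0.5  # metres per voxel edge (assumed)
    semantic_map = {}
    for (x, y, z), label in zip(depth_points, labels):
        voxel = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        semantic_map[voxel] = label  # last observation wins (simplification)
    return semantic_map

points = [(0.1, 0.2, 0.0), (0.2, 0.1, 0.1), (2.0, 0.0, 1.0)]
labels = ["tree", "tree", "building"]
print(build_semantic_voxel_map(points, labels))
```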
Returning to FIG. 2, the storage unit 15 stores various data and programs. For example, the storage unit 15 can store various kinds of information such as the sensor information detected by the sensor unit 110, the action plan, the control information of the flying object 100, and the map information M. The storage unit 15 can store information such as an action plan received from an external electronic device or the like via the communication unit 120 of the aircraft 100. The storage unit 15 is electrically connected to, for example, the sensor control unit 16, the operation control unit 17, and the like. The storage unit 15 is, for example, a semiconductor memory element such as a RAM or flash memory, a hard disk, an optical disk, or the like. The storage unit 15 may also be provided in a cloud server connected to the information processing device 10 via the communication unit 120.
The sensor control unit 16 controls the detection conditions of each of the plurality of sensors in the sensor unit 110 based on the control information for controlling the flying object 100 and the environment information indicating the surroundings of the flying object 100. The sensor control unit 16 is configured to be able to exchange information with the acquisition unit 11, the state recognition unit 12, the environment recognition unit 13, the map creation unit 14, the storage unit 15, the operation control unit 17, and the like. The sensor control unit 16 can acquire recognition results from the state recognition unit 12 and the environment recognition unit 13. The sensor control unit 16 can acquire map information from the map creation unit 14. The sensor control unit 16 can acquire control information from the operation control unit 17.
In the present embodiment, the case where the sensor control unit 16 controls the detection conditions via the acquisition unit 11, by instructing the acquisition unit 11 about the detection condition of each of the plurality of sensors, will be described, but the present invention is not limited to this. For example, the sensor control unit 16 may be configured to directly instruct the sensor unit 110 about the detection condition of each of the plurality of sensors.
For example, the sensor control unit 16 controls the imaging conditions of each of the imaging devices 111a, 111b, 111c, and 111d based on the control information and the environment information. The sensor control unit 16 controls the detection conditions of each of the plurality of sensors based on the control information, the environment information, and the map information. In the present embodiment, the sensor control unit 16 controls the respective detection conditions of the imaging devices 111a, 111b, 111c, and 111d and the state sensor 112.
The sensor control unit 16 controls the respective detection conditions of the plurality of imaging devices 111, the state sensor 112, and the like based on at least one of the speed and traveling direction of the aircraft 100 recognized by the state recognition unit 12 and the environment information recognized by the environment recognition unit 13. For example, when the flying object 100 is in a moving state, the sensor control unit 16 raises the frame rate of the imaging device 111a that images the traveling direction A of the flying object 100 above the frame rates of the imaging devices 111b, 111c, and 111d that image the other directions. The moving state includes, for example, a state in which the aircraft 100 is moving through the air.
The sensor control unit 16 performs control to extract, from the imaging region D1 indicated by the sensor information of the imaging device 111a that images the traveling direction A of the flying object 100, the region necessary for detecting obstacles. When the sensor information includes information indicating the attitude of the flying object 100, the sensor control unit 16 performs control to extract the region necessary for detecting obstacles based on that attitude.
FIG. 6 is a diagram for explaining a processing example of the sensor information of the information processing device 10 according to the first embodiment. For example, the attitude of the flying object 100 may change during flight. In the example shown in FIG. 6, when flying in the traveling direction A, the flying object 100 changes between the attitude P1 and the attitude P2. The attitude P1 is an attitude in which the flying body 100 flies level with respect to a horizontal plane such as the ground. The attitude P2 is an attitude in which the flying body 100 flies tilted with respect to the horizontal plane. In the attitude P2, the imaging device 111 images a downward view. In this case, the image captured by the imaging device 111a of the aircraft 100 differs between the attitude P1 and the attitude P2. For example, in the attitude P1, the attention area R that the information processing device 10 focuses on is near the center of the imaging region D1 in the image captured by the imaging device 111a. In the attitude P2, the attention area R that the information processing device 10 focuses on is near the upper portion of the imaging region D1. That is, when the flying object 100 is flying, the information processing device 10 may not need to pay attention to the entire image captured by the imaging device 111. For this reason, the information processing device 10 changes the region of interest in the imaging region D1 captured by the imaging device 111a based on whether the flying object 100 is flying in the attitude P1 or the attitude P2.
 When no object is present in the traveling direction A of the flying object 100, the sensor control unit 16 also lowers the frame rate of the imaging device 111a, which images the traveling direction A, below the frame rate used when an object is present in the traveling direction A. The sensor control unit 16 further controls the detection conditions in the sensor unit 110 based on the degree of construction of the map information. For example, the detection conditions of sensors of the sensor unit 110 that observe portions where the map information is already well constructed are relaxed. When the flying object 100 creates map information, the sensor control unit 16 controls the detection conditions so that priority is given to the frame rates of the imaging devices 111 that image the directions required for creating the map information. The sensor control unit 16 also performs control to raise the frame rate of the imaging device 111 when the speed of the flying object 100 increases, and to lower it when the speed decreases.
 Returning to FIG. 2, an example in which the sensor control unit 16 determines the frame rates (imaging conditions) of the plurality of imaging devices 111 will be described. For example, the sensor control unit 16 can vary the frame rate from its minimum to its maximum according to the speed of the flying object 100. The sensor control unit 16 calculates the frame rate FR using the following Equation (1).

 FR = FRmin + (FRmax - FRmin) * (1 - 1/(V + 1)) ... Equation (1)
 In Equation (1), FRmin denotes the minimum frame rate in the specifications of the imaging device 111, and FRmax denotes the maximum frame rate in those specifications. V denotes the speed of the flying object 100. FRmin and FRmax differ depending on the specifications and the like of each imaging device 111. Using Equation (1), the sensor control unit 16 varies the frame rate FR depending on the speed of travel in the direction that each imaging device 111 faces. For example, the sensor control unit 16 raises the frame rate FR of the imaging device 111 facing the traveling direction A of the flying object 100, and lowers the frame rates of the other imaging devices 111 below that of the imaging device 111 facing the traveling direction A, or stops them.
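 As a minimal sketch of Equation (1) (the concrete rate limits are assumptions, not values from the patent), note that FR equals FRmin when the vehicle is stationary and approaches FRmax asymptotically as the speed V grows:

```python
def frame_rate_from_speed(v_mps, fr_min=1.0, fr_max=60.0):
    """Equation (1): FR = FRmin + (FRmax - FRmin) * (1 - 1/(V + 1)).

    v_mps is the speed component toward the direction the camera faces;
    V = 0 yields FRmin, and FR -> FRmax as V -> infinity.
    """
    v = max(0.0, v_mps)  # only forward speed raises the rate
    return fr_min + (fr_max - fr_min) * (1.0 - 1.0 / (v + 1.0))

print(frame_rate_from_speed(0.0))  # 1.0   (hovering)
print(frame_rate_from_speed(5.0))  # ~50.2 (moving at 5 m/s)
```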
 The sensor control unit 16 relaxes the detection conditions of the sensors of the sensor unit 110 as the degree of construction of the map information created by the map creation unit 14 increases. For example, portions of the map information with a high degree of construction do not need to be imaged at a high frame rate, and completed portions do not need to be imaged by the imaging device 111 at all. The degree of construction of the map information can be obtained, for example, based on a comparison between the map information being created and the map information stored in the storage unit 15. Specifically, in the case of a map in a three-dimensional voxel grid representation, the degree of construction of the map information can be estimated by focusing on the number of newly created voxels. The degree of construction may also be estimated based on comparison results for individual coordinates or regions in the space indicated by the map information. The information processing device 10 performs the calculation so as to lower the frame rate of the imaging device 111 as the degree of construction of the map information increases.
 For example, in the case of map information in a three-dimensional voxel grid representation, the sensor control unit 16 calculates the frame rate FR using the following Equation (2).

 FR = FRmax * (1 - 1/(N + 1)) ... Equation (2)
 In Equation (2), FRmax denotes the maximum frame rate in the specifications of the imaging device 111, and N denotes the number of newly created voxels. The sensor control unit 16 calculates the frame rate FR of the imaging device 111 using Equation (2). In addition, when it is known from semantic segmentation or the like that part of the image captured by an imaging device 111, such as a sky region, does not need to be processed in the first place, the sensor control unit 16 lowers the frame rate of that imaging device 111 or stops it.
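 A minimal sketch of Equation (2), again with an assumed maximum rate: when no new voxels are being added (N = 0) the camera's frame rate drops to zero, and it rises toward FRmax while the map in that direction is still growing:

```python
def frame_rate_from_map_growth(new_voxels, fr_max=60.0):
    """Equation (2): FR = FRmax * (1 - 1/(N + 1)).

    N is the number of voxels newly created from this camera's view;
    a fully built (static) map region gives N = 0 and hence FR = 0.
    """
    n = max(0, new_voxels)
    return fr_max * (1.0 - 1.0 / (n + 1.0))

print(frame_rate_from_map_growth(0))   # 0.0   (map already built)
print(frame_rate_from_map_growth(59))  # 59.0  (map still growing)
```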
 When the flying object 100 is flown to create map information, the sensor control unit 16 controls the frame rates so that priority is given to the imaging devices 111 that image the directions required for creating the map information. For example, among the plurality of imaging devices 111, the sensor control unit 16 preferentially raises the frame rates of the imaging devices 111 that image the directions required for creating the map information, and lowers the frame rates of the other imaging devices 111. For example, when creating map information on the left side of the traveling direction A of the flying object 100, the information processing device 10 makes the frame rate of the imaging device 111b, which images the right side, lower than those of the imaging devices 111a, 111c, and 111d, which image the front, rear, and left sides. Alternatively, the information processing device 10 may set the frame rate of the imaging device 111d, which images the left side, higher than the frame rates of the front and rear imaging devices 111a and 111c, and set the frame rate of the imaging device 111b, which images the right side, lowest. Note that setting the lowest frame rate includes, for example, setting the frame rate to 0 for a predetermined time. A sketch of this direction-priority assignment follows.
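 The following sketch assigns per-camera frame rates for the alternative described above, where map information is being created on the left of the traveling direction; the camera naming, the three rate tiers, and the opposite-side rule are illustrative assumptions rather than values from the disclosure:

```python
DIRECTION_TO_CAMERA = {'front': '111a', 'right': '111b',
                       'rear': '111c', 'left': '111d'}
OPPOSITE = {'front': 'rear', 'rear': 'front',
            'left': 'right', 'right': 'left'}

def map_priority_rates(needed, high=60.0, mid=20.0, low=1.0):
    """Frame rates per camera when map creation needs one direction.

    The camera imaging the needed direction gets the highest rate,
    the opposite camera the lowest, and the remaining two a middle rate.
    """
    rates = {}
    for direction, camera in DIRECTION_TO_CAMERA.items():
        if direction == needed:
            rates[camera] = high
        elif direction == OPPOSITE[needed]:
            rates[camera] = low
        else:
            rates[camera] = mid
    return rates

# Map being built on the left of the traveling direction A:
print(map_priority_rates('left'))
# {'111a': 20.0, '111b': 1.0, '111c': 20.0, '111d': 60.0}
```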
 The operation control unit 17 controls the operation of the flying object 100 based on the sensor information acquired by the acquisition unit 11, and controls the drive unit 130 of the flying object 100. For example, the operation control unit 17 performs flight control of the flying object 100 to realize the action plan stored in the storage unit 15, and outputs operation commands and the like for driving the flying object 100 to the drive unit 130. Note that the operation control unit 17 may also have a function of planning and revising an action plan for the movement of the flying object 100 based on the sensor information from the sensor unit 110.
 The example of the functional configuration of the information processing device 10 according to the present embodiment has been described above. The configuration described with reference to FIG. 2 is merely an example, and the functional configuration of the information processing device 10 according to the present embodiment is not limited to this example; it can be flexibly modified according to specifications and operation.
[Procedure of information processing according to the first embodiment]
 Next, the procedure of information processing according to the first embodiment will be described with reference to FIG. 7. FIG. 7 is a flowchart showing an example of a processing procedure executed by the information processing device 10 according to the first embodiment. The processing procedure illustrated in FIG. 7 is realized by the information processing device 10 executing a program, and shows an example of a procedure for determining the detection conditions of the plurality of imaging devices 111 of the sensor unit 110.
 As shown in FIG. 7, the information processing device 10 acquires sensor information with the acquisition unit 11 (step S101). For example, the information processing device 10 acquires the sensor information of each of the imaging devices 111 and the state sensor 112 with the acquisition unit 11. The information processing device 10 then acquires control information from the operation control unit 17 (step S102). For example, the information processing device 10 acquires from the operation control unit 17 the control information issued immediately before controlling the operation of the flying object 100, the latest control information, and the like.
 The information processing device 10 executes the three processes of step S110, step S120, and step S130 in parallel. Note that the information processing device 10 may be configured to execute these three processes sequentially instead.
 The information processing device 10 recognizes the state of the flying object 100 based on the acquired sensor information (step S110). For example, the information processing device 10 analyzes the sensor information of the state sensor 112 to recognize states such as the speed, traveling direction, and altitude of the flying object 100, and outputs the recognition result to the sensor control unit 16. The information processing device 10 also recognizes the upcoming speed, traveling direction, altitude, and other states of the flying object 100 based on the control information for controlling the drive unit 130. That is, by recognizing both the current and upcoming states of the flying object 100, the information processing device 10 makes it possible to predict the operation of the flying object 100. By executing the process of step S110, the information processing device 10 functions as the state recognition unit 12.
 The information processing device 10 calculates the frame rate of each of the plurality of imaging devices based on the state of the flying object 100 from step S110 and the control information (step S111). For example, the information processing device 10 calculates the frame rate of the imaging device 111 that images the traveling direction A of the flying object 100 based on the speed, attitude, and so on of the flying object 100, and calculates the frame rates of the imaging devices 111 facing directions other than the traveling direction A so that they are lower than that of the imaging device 111 imaging the traveling direction A. For example, the calculation raises the frame rate of the imaging device 111 when the speed of the flying object 100 increases and lowers it when the speed decreases. When the process of step S111 ends, the information processing device 10 advances the process to step S141, described below.
 The information processing device 10 also recognizes environment information about the external environment based on the acquired sensor information (step S120). For example, the information processing device 10 analyzes the images, distance images, and the like captured by each of the plurality of imaging devices 111, and recognizes environment information such as the presence or absence of obstacles around the flying object 100 and the distances to those obstacles. For example, the information processing device 10 analyzes the images, distance images, and the like captured by the imaging devices 111 and recognizes environment information indicating the presence or absence of a dynamic object in the traveling direction A of the flying object 100. By executing the process of step S120, the information processing device 10 functions as the environment recognition unit 13.
 The information processing device 10 calculates the frame rate of each of the plurality of imaging devices 111 based on the environment information from step S120 (step S121). For example, the information processing device 10 calculates the frame rates of the imaging devices 111 based on the presence or absence of dynamic objects, obstacles, and the like in the traveling direction A of the flying object 100. For example, when a dynamic object, obstacle, or the like is present in one of the imaging regions D1, D2, D3, and D4 captured by the imaging devices 111, the calculated frame rate is higher than when none is present. Since a dynamic object may approach the flying object 100, the information processing device 10 may also calculate the frame rate of the imaging device 111 capturing the dynamic object to be higher than the frame rates of the other imaging devices 111. When the process of step S121 ends, the information processing device 10 advances the process to step S141, described below.
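 A small hedged sketch of this step (the three rate tiers are assumptions): the per-camera rate can be derived from whether anything was detected in that camera's imaging region, with a further boost for dynamic objects:

```python
def frame_rate_from_environment(has_object, is_dynamic,
                                base=10.0, obstacle=20.0, dynamic=30.0):
    """Step S121: pick a frame rate from what the camera's region shows.

    Empty region -> base rate; static obstacle -> higher rate;
    dynamic object (which may approach the vehicle) -> highest rate.
    """
    if not has_object:
        return base
    return dynamic if is_dynamic else obstacle

print(frame_rate_from_environment(False, False))  # 10.0
print(frame_rate_from_environment(True, False))   # 20.0
print(frame_rate_from_environment(True, True))    # 30.0
```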
 The information processing device 10 also creates map information based on the acquired sensor information (step S130). For example, when map information under creation is stored in the storage unit 15, the information processing device 10 creates the map information by updating it based on the sensor information acquired from the sensor unit 110. When no map information under creation is stored in the storage unit 15, the information processing device 10 creates new map information based on the sensor information acquired from the sensor unit 110. By executing the process of step S130, the information processing device 10 functions as the map creation unit 14.
 The information processing device 10 calculates the frame rate of each of the plurality of imaging devices 111 based on the map information (step S131). For example, based on the map information being created, the information processing device 10 identifies the imaging devices 111 required for creating the map information and calculates their frame rates. For example, the information processing device 10 calculates the frame rates so that the frame rate of an imaging device 111 decreases as the degree of construction of the map information increases, and controls the frame rates so that priority is given to the imaging devices 111 that image the directions required for creating the map information. When the process of step S131 ends, the information processing device 10 advances the process to step S141, described below.
 The information processing device 10 determines the frame rate of each of the plurality of imaging devices based on the calculated frame rates (step S141). The process of step S141 is executed by the information processing device 10 when the processes of steps S111, S121, and S131 have all finished. For example, the information processing device 10 compares, against a decision criterion, the frame rate calculated from the state of the flying object 100, the frame rate calculated from the environment information, and the frame rate calculated from the degree of construction of the map information, and determines the frame rates of the plurality of imaging devices 111. In the present embodiment, the case where the criterion is to adopt the highest frame rate determined for each imaging device 111 will be described, but the criterion is not limited to this; for example, a criterion may be set according to the state of the flying object 100. When the process of step S141 ends, the information processing device 10 advances the process to step S142, described below.
 The information processing device 10 changes the frame rate of each of the plurality of imaging devices 111 based on the determined frame rates (step S142). For example, the information processing device 10 instructs each of the plurality of imaging devices 111 to change its frame rate via the acquisition unit 11. As a result, each of the plurality of imaging devices 111 changes its current frame rate to the instructed frame rate. When the process of step S142 ends, the information processing device 10 advances the process to step S150, described below.
 Note that the information processing device 10 functions as the sensor control unit 16 by executing the processes of steps S111, S121, S131, S141, and S142.
 The information processing device 10 determines whether or not to end the processing (step S150). For example, the information processing device 10 determines to end when it detects the end of the operation of the flying object 100, a transition of the flying object 100 to standby, or the like. When the information processing device 10 determines not to end (No in step S150), it returns the process to step S101 described above and executes the series of processes again. When the information processing device 10 determines to end (Yes in step S150), it ends the processing procedure shown in FIG. 7.
[Operation of the information processing device according to the first embodiment]
 Next, an example of the operation of the information processing device 10 according to the first embodiment will be described with reference to FIG. 8. FIG. 8 is a diagram showing an example of how the information processing device 10 according to the first embodiment determines frame rates.
 The flying object 100 flies in the traveling direction A as the drive unit 130 is driven under the control of the information processing device 10. In this case, the information processing device 10 acquires sensor information including the images captured by the imaging devices 111 with the acquisition unit 11. In the information processing device 10, the state recognition unit 12 recognizes the state of the flying object 100, the environment recognition unit 13 recognizes the environment information, and the map creation unit 14 creates the map information.
 As shown in FIG. 8, the information processing device 10 calculates a frame rate for each of the four imaging devices 111a, 111b, 111c, and 111d. From the state of the flying object 100, the frame rate is calculated as 60 fps (frames per second) for the imaging device 111a, which images the area ahead of the flying object 100, and 0 fps for the other imaging devices 111b, 111c, and 111d. From the environment information, the frame rate is calculated as 20 fps for the imaging device 111a and 10 fps for the other imaging devices 111b, 111c, and 111d. From the degree of construction of the map information, the frame rate is calculated as 20 fps for the imaging devices 111a and 111d and 1 fps for the imaging devices 111b and 111c. Here, the decision criterion is to apply the highest frame rate determined for each imaging device 111. In this case, the information processing device 10 changes the frame rate of the imaging device 111a to 60 fps, that of the imaging device 111b to 10 fps, that of the imaging device 111c to 10 fps, and that of the imaging device 111d to 20 fps. As a result, in the flying object 100, the imaging device 111a images the area ahead in the traveling direction A at 60 fps, the imaging devices 111b and 111c image at 10 fps, and the imaging device 111d images at 20 fps. The flying object 100 can therefore keep the power consumption of the imaging devices 111b, 111c, and 111d lower than that of the imaging device 111a.
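 A minimal sketch of the step-S141 decision under the maximum-rate criterion, reproducing the numbers in the FIG. 8 example; the dictionaries below simply restate the per-source rates given above:

```python
def decide_frame_rates(*rate_tables):
    """Step S141: combine per-source frame rates by taking, for each
    camera, the highest rate any source requested."""
    cameras = rate_tables[0].keys()
    return {c: max(t[c] for t in rate_tables) for c in cameras}

state_rates = {'111a': 60, '111b': 0,  '111c': 0,  '111d': 0}
env_rates   = {'111a': 20, '111b': 10, '111c': 10, '111d': 10}
map_rates   = {'111a': 20, '111b': 1,  '111c': 1,  '111d': 20}

print(decide_frame_rates(state_rates, env_rates, map_rates))
# {'111a': 60, '111b': 10, '111c': 10, '111d': 20}
```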
 The information processing device 10 acquires from the sensor unit 110 sensor information including the images captured at the respective frame rates of the imaging devices 111a, 111b, 111c, and 111d. That is, the data amount of the sensor information from the imaging devices 111b, 111c, and 111d decreases while the data amount of the sensor information from the imaging device 111a, which images the area ahead of the flying object 100 in flight, is maintained. Since the sensor information from the four imaging devices 111a, 111b, 111c, and 111d is thus reduced, the processing that uses the sensor information is reduced as well, and the processing load on the information processing device 10 can be lightened. The operation control unit 17 of the information processing device 10 then controls the operation of the flying object 100 based on the sensor information acquired from the sensor unit 110 of the flying object 100.
 As described above, the information processing device 10 according to the present embodiment controls the detection conditions of each of the plurality of sensors of the sensor unit 110 mounted on the flying object 100, based on the control information for controlling the flying object 100 and the environment information indicating the surroundings of the flying object 100, and acquires the sensor information detected by the plurality of sensors under those detection conditions. This allows the information processing device 10 to cause each of the plurality of sensors of the flying object 100 to perform detection under individual detection conditions according to the control information and the environment information of the flying object 100. As a result, the information processing device 10 can suppress the power consumption of the plurality of sensors mounted on the flying object 100 and suppress the data amount of the acquired sensor information.
 The information processing device 10 also creates map information of the flying object 100 based on the acquired sensor information, and controls the detection conditions of each of the plurality of sensors of the sensor unit 110 based on the control information, the environment information, and the map information. This allows the information processing device 10 to cause each of the plurality of sensors of the flying object 100 to perform detection under individual detection conditions based on the control information, the environment information, and the map information of the flying object 100. As a result, when creating map information, the information processing device 10 can suppress the power consumption of the plurality of sensors mounted on the flying object 100 and suppress the data amount of the acquired sensor information.
 The information processing device 10 also recognizes at least one of the speed and the traveling direction A of the flying object 100 together with the environment information, and controls the detection conditions of each of the plurality of sensors of the sensor unit 110 based on at least one of the recognized speed and traveling direction A and the environment information. This allows the information processing device 10 to cause each of the plurality of sensors of the flying object 100 to perform detection under individual detection conditions based on at least one of the speed and the traveling direction of the flying object 100. As a result, the information processing device 10 can suppress the power consumption of the plurality of sensors mounted on the flying object 100 and the data amount of the acquired sensor information without lowering the safety of the flying object 100.
 The information processing device 10 also controls the imaging conditions of each of the four imaging devices 111a, 111b, 111c, and 111d mounted on the flying object 100, based on the control information for controlling the flying object 100 and the environment information indicating the surroundings of the flying object 100, and acquires the sensor information that the four imaging devices 111a, 111b, 111c, and 111d detect under those conditions. This allows the information processing device 10 to cause each of the four imaging devices 111a, 111b, 111c, and 111d of the flying object 100 to capture images under individual imaging conditions according to the control information and the environment information of the flying object 100. As a result, the information processing device 10 can suppress the power consumption of the four imaging devices 111a, 111b, 111c, and 111d mounted on the flying object 100 and suppress the data amount of the acquired sensor information.
 The information processing device 10 also raises the frame rate of the imaging device 111 that images the traveling direction A of the flying object 100 above the frame rates of the imaging devices 111 that image other directions. By concentrating imaging on the traveling direction A of the flying object 100, the information processing device 10 can prevent a reduction in safety even while suppressing the frame rates of the imaging devices 111 facing other directions. As a result, the information processing device 10 can suppress the power consumption of the plurality of imaging devices 111 without lowering the safety of the flying object 100.
 The information processing device 10 also controls the imaging conditions of each of the plurality of imaging devices 111 based on the environment information in the traveling direction A of the flying object 100. This allows the information processing device 10 to take the external environment in the flight direction of the flying object 100 into account and control the imaging conditions of the imaging device 111 imaging the traveling direction A separately from those of the imaging devices 111 imaging other directions. As a result, by changing the imaging conditions of the imaging devices 111 that image directions other than the traveling direction A of the flying object 100, the information processing device 10 can also suppress the power consumption of the plurality of imaging devices 111.
 When no object is present in the traveling direction A of the flying object 100, the information processing device 10 also lowers the frame rate of the imaging device 111 that images the traveling direction A below the frame rate used when an object is present in the traveling direction A. This allows the information processing device 10 to change the imaging conditions of the imaging device 111 imaging the traveling direction A based on the presence or absence of objects in the direction in which the flying object 100 is flying. As a result, the information processing device 10 can suppress the power consumption of the imaging device 111 that images the traveling direction A of the flying object 100.
 The information processing device 10 also performs control to raise the frame rate of the imaging device 111 when the speed of the flying object 100 increases and to lower it when the speed decreases. By lowering the frame rate of the imaging device 111 when the speed of the flying object 100 decreases, the information processing device 10 can suppress the power consumption of that imaging device 111. As a result, the information processing device 10 can suppress the power consumption of the plurality of imaging devices 111 without lowering the safety of the flying object 100.
 The information processing device 10 also controls the detection conditions of the sensors of the sensor unit 110 based on the degree of construction of the map information M. This allows the information processing device 10 to change the detection conditions of sensors that observe portions of the map information under creation that already have a high degree of construction. As a result, the information processing device 10 can suppress the power consumption of the sensors as the degree of construction of the map information being created increases.
 When the flying object 100 is moved to create map information, the information processing device 10 also controls the imaging conditions so that priority is given to the imaging devices 111 that image the directions required for creating the map information. This allows the information processing device 10 to relax the imaging conditions of the imaging devices 111 that image directions other than those required for creating the map information. As a result, the information processing device 10 can suppress the power consumption of the imaging devices 111 that are not needed for creating the map information without reducing the efficiency of map creation.
 In the information processing device 10, the operation control unit 17 also controls the operation of the flying object 100 based on the sensor information acquired by the acquisition unit 11 from the sensor unit 110 of the flying object 100. This allows the information processing device 10 to acquire the sensor information detected by the sensors of the sensor unit 110 under individual detection conditions according to the control information and the environment information of the flying object 100, and to control the operation of the flying object 100 based on that sensor information. As a result, by controlling the detection conditions of the plurality of sensors mounted on the flying object 100, the information processing device 10 can suppress the power consumption of the flying object 100 whose flight it controls.
 The first embodiment described above is merely an example, and various modifications and applications are possible.
[Modification (1) of the first embodiment]
 For example, the information processing device 10 according to the first embodiment can change the imaging conditions of the imaging devices 111 included in the sensor unit 110 of the flying object 100.
 In the information processing device 10 according to Modification (1) of the first embodiment, the sensor control unit 16 may control the imaging conditions of the imaging device 111 based on attribute information of the imaging region D1 indicated by the sensor information of the imaging device 111 that images the traveling direction A of the flying object 100. For example, as shown in FIG. 4 above, the sensor control unit 16 of the information processing device 10 can classify the regions within the imaging region indicated by an image G included in the sensor information of the imaging device 111 acquired by the acquisition unit 11, by executing semantic segmentation on the image G. When the classified regions include a region with an attribute indicating that an object is unlikely to be present in the traveling direction A, the sensor control unit 16 performs control to lower the frame rate of the imaging device 111. For example, when the region in the traveling direction A indicated by the image G has the attribute sky, the sensor control unit 16 can lower the frame rate of the imaging device 111 below the frame rate used when an object is detected.
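 A hedged sketch of this rule (the label set, the share threshold, and the two rates are assumptions, not part of the disclosure): given a per-pixel label map produced by some semantic segmentation model, the frame rate is lowered when the region ahead is dominated by labels, such as sky, where no obstacle is expected:

```python
from collections import Counter

# Labels assumed unlikely to contain obstacles in the traveling direction.
OBSTACLE_FREE_LABELS = {'sky'}

def frame_rate_from_attributes(label_map, normal=30.0, reduced=5.0,
                               free_share=0.9):
    """Lower the frame rate when the view ahead is mostly obstacle-free.

    label_map is a flat list of per-pixel semantic labels (e.g. from a
    segmentation model run on image G of the forward camera).
    """
    counts = Counter(label_map)
    total = sum(counts.values())
    free = sum(counts[label] for label in OBSTACLE_FREE_LABELS)
    return reduced if free / total >= free_share else normal

labels = ['sky'] * 95 + ['building'] * 5
print(frame_rate_from_attributes(labels))  # 5.0 (mostly sky ahead)
```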
 As described above, the information processing device 10 according to the present embodiment controls the imaging conditions of the imaging device 111 based on the attribute information of the imaging region D1 indicated by the sensor information of the imaging device 111 that images the traveling direction A of the flying object 100. This allows the information processing device 10 to change the imaging conditions of the imaging device 111 facing the traveling direction A of the flying object 100 based on the likelihood that an object is present. As a result, the information processing device 10 can suppress the power consumption of the flying object 100 without lowering safety in the traveling direction A of the flying object 100.
 Note that Modification (1) of the first embodiment may also be applied to the information processing devices 10 of the other embodiments and modifications.
[Modification (2) of the first embodiment]
 For example, the information processing device 10 according to the first embodiment can change the imaging conditions of the imaging devices 111 included in the sensor unit 110 of the flying object 100 according to the attitude of the flying object 100.
 In the information processing device 10 according to Modification (2) of the first embodiment, when the sensor information acquired by the acquisition unit 11 includes information indicating the attitude of the flying object 100, the sensor control unit 16 may perform control to change, based on that attitude, the imaging region of the imaging device 111 required for detecting obstacles.
 For example, as shown in FIG. 6 above, when the flying object 100 flies without tilting in the attitude P1, the sensor control unit 16 of the information processing device 10 sets the attention region R near the center of the imaging region D1 in the image included in the sensor information of the imaging device 111 acquired by the acquisition unit 11, and causes the environment recognition unit 13 to execute recognition processing based on the attention region R. When the flying object 100 flies tilted in the attitude P2, the sensor control unit 16 sets the attention region R near the top of the imaging region D1 in the image included in the sensor information of the imaging device 111 acquired by the acquisition unit 11, and again causes the environment recognition unit 13 to execute recognition processing based on the attention region R. As a result, the environment recognition unit 13 no longer needs to analyze the entire image included in the sensor information, which reduces processing such as image analysis and image recognition.
 As described above, when the sensor information includes information indicating the attitude of the flying object 100, the information processing device 10 according to the present embodiment performs control to change, based on that attitude, the imaging region of the imaging device 111 required for detecting obstacles. This allows the information processing device 10 to change the imaging region of the imaging device 111 required for obstacle detection according to changes in the attitude of the flying object 100. As a result, the information processing device 10 only needs to process information for the part of the imaging region corresponding to the in-flight attitude of the flying object 100, which reduces the processing load.
 The information processing device 10 according to Modification (2) of the first embodiment has been described for the case where the control for changing the imaging region of the imaging device 111 focuses on the image region of the acquired sensor information, but the control is not limited to this. For example, the sensor control unit 16 of the information processing device 10 may cause the acquisition unit 11, the environment recognition unit 13, or the like to extract, from the image of the sensor information acquired from the sensor unit 110, only the information of the imaging region of the imaging device 111 required for detecting obstacles. Alternatively, the sensor control unit 16 of the information processing device 10 may be configured to cause the sensor unit 110 to output sensor information indicating only the information of the imaging region of the imaging device 111 required for detecting obstacles.
 Note that Modification (2) of the first embodiment may also be applied to the information processing devices 10 of the other embodiments and modifications.
(Second embodiment)
[Example of the configuration of the information processing device according to the second embodiment]
 Next, the second embodiment will be described. FIG. 9 is a diagram showing an example of the configuration of the information processing device 10A according to the second embodiment. As shown in FIG. 9, the flying object 100 includes the sensor unit 110, the communication unit 120, the drive unit 130, and the information processing device 10A. The sensor unit 110, the communication unit 120, the drive unit 130, and the information processing device 10A are connected so as to be able to exchange data and signals.
 The information processing device 10A includes the acquisition unit 11, the state recognition unit 12, the environment recognition unit 13, the storage unit 15, the sensor control unit 16, and the operation control unit 17. That is, the information processing device 10A does not include the map creation unit 14 of the information processing device 10 according to the first embodiment. Each of the functional units, namely the acquisition unit 11, the state recognition unit 12, the environment recognition unit 13, the sensor control unit 16, and the operation control unit 17, is realized by, for example, a CPU, an MPU, or the like executing a program stored inside the information processing device 10A with a RAM or the like as a work area. Each processing unit may also be realized by an integrated circuit such as an ASIC or an FPGA.
[Procedure of information processing according to the second embodiment]
 Next, the procedure of information processing according to the second embodiment will be described with reference to FIG. 10. FIG. 10 is a flowchart showing an example of a processing procedure executed by the information processing device 10A according to the second embodiment. The processing procedure illustrated in FIG. 10 is realized by the information processing device 10A executing a program, and shows an example of a procedure for determining the detection conditions of the plurality of imaging devices 111 of the sensor unit 110.
 As shown in FIG. 10, the information processing device 10A acquires sensor information with the acquisition unit 11 (step S101), and then acquires control information from the operation control unit 17 (step S102).
 The information processing device 10A executes the two processes of step S110 and step S120 in parallel. Note that the information processing device 10A may be configured to execute these two processes sequentially instead.
 The information processing device 10A recognizes the state of the flying object 100 based on the acquired sensor information (step S110), and when the process of step S110 ends, advances the process to step S161. The information processing device 10A also recognizes environment information about the external environment based on the acquired sensor information (step S120), and when the process of step S120 ends, advances the process to step S161.
 The information processing device 10A determines the detection conditions of each of the plurality of sensors of the sensor unit 110 based on the state of the flying object 100, the control information, and the environment information (step S161). The process of step S161 is executed by the information processing device 10A when the processes of step S110 and step S120 have both finished. For example, based on the current speed, traveling direction A, and so on of the flying object 100 and on its upcoming speed, traveling direction, and so on, the information processing device 10A identifies, among the plurality of sensors, the sensors to be used for detecting objects, and determines the detection conditions of the identified sensors. For example, the information processing device 10A determines which sensors to operate normally and which sensors to operate in a power-saving mode or to stop. When the process of step S161 ends, the information processing device 10A advances the process to step S162.
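 A minimal sketch of this selection step (the heading names and the two-state power model are assumptions): the cameras covering the current and planned traveling directions are kept active, and the rest are put into a power-saving state:

```python
DIRECTION_TO_CAMERA = {'front': '111a', 'right': '111b',
                       'rear': '111c', 'left': '111d'}

def select_detection_conditions(current_dir, planned_dir):
    """Step S161: keep cameras covering the current and upcoming
    traveling directions active; power-save the others."""
    active = {current_dir, planned_dir}
    return {camera: ('active' if direction in active else 'power_save')
            for direction, camera in DIRECTION_TO_CAMERA.items()}

# Scenes SN1 -> SN2: heading changes from front (A) toward the right (A').
print(select_detection_conditions('front', 'right'))
# {'111a': 'active', '111b': 'active',
#  '111c': 'power_save', '111d': 'power_save'}
```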
 The information processing device 10A changes the detection conditions of each of the plurality of sensors based on the determined detection conditions (step S162). For example, the information processing device 10A instructs each of the plurality of sensors of the sensor unit 110 to change its detection conditions via the acquisition unit 11. As a result, each of the imaging devices 111, the state sensor 112, and so on of the sensor unit 110 changes its current detection conditions to the instructed detection conditions. When the process of step S162 ends, the information processing device 10A advances the process to step S150, described below.
 Note that the information processing device 10A functions as the sensor control unit 16 by executing the processes of step S161 and step S162.
 The information processing device 10A determines whether or not to end the processing (step S150). When the information processing device 10A determines not to end (No in step S150), it returns the process to step S101 described above and executes the series of processes again. When the information processing device 10A determines to end (Yes in step S150), it ends the processing procedure shown in FIG. 10.
[Operation of the information processing device according to the second embodiment]
 Next, an example of the operation of the information processing device 10A according to the second embodiment will be described with reference to FIG. 11. FIG. 11 is a diagram for explaining an example of how the information processing device 10A according to the second embodiment determines the detection conditions of the sensors.
 In the scene SN1 shown in FIG. 11, the flying object 100 flies in the traveling direction A as the drive unit 130 is driven under the control of the information processing device 10A. In this case, the information processing device 10A operates the imaging device 111a as the sensor used for detecting objects and operates the other sensors in a power-saving mode. The information processing device 10A then recognizes, immediately before controlling the drive unit 130, that the traveling direction will change from the traveling direction A to the traveling direction A' based on the action plan of the flying object 100. In this case, the information processing device 10A recognizes, based on the upcoming speed and traveling direction A' of the flying object 100, that it needs to recognize objects in the directions ahead of and to the right of the flying object 100. As a result, based on the control information for controlling the moving body and the environment information indicating the surroundings of the moving body, the information processing device 10A sets the detection conditions of the sensors of the sensor unit 110 that detect the front and right to conditions for object detection, and sets the detection conditions of the other sensors to conditions for power-saving operation. The information processing device 10A then changes the detection conditions of each of the plurality of sensors based on the determined detection conditions.
 図11に示す場面SN2では、飛行体100は、情報処理装置10Aの制御によって駆動部130が駆動されることで、進行方向Aから進行方向A’に向かって飛行している。この場合、情報処理装置10Aは、撮像装置111a及び撮像装置111bを物体の検出に用いるセンサとして機能させ、それ以外のセンサを省電力で動作させている。 In the scene SN2 shown in FIG. 11, the flying body 100 is flying from the traveling direction A to the traveling direction A′ by the driving unit 130 being driven by the control of the information processing device 10A. In this case, the information processing apparatus 10A causes the image pickup apparatus 111a and the image pickup apparatus 111b to function as sensors used for detecting an object, and causes the other sensors to operate with low power consumption.
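 As an illustration of scenes SN1 and SN2, the selection of active imaging devices can be sketched as follows; this is a hedged example under assumed names and frame rates, not the disclosed implementation.

```python
# Hypothetical sketch: keep the imaging devices that cover the current and
# upcoming traveling directions in an object-detection condition, and put
# the remaining sensors into a power-saving condition.

OBJECT_DETECTION = {"frame_rate_hz": 30}   # assumed value
POWER_SAVING = {"frame_rate_hz": 1}        # assumed value

def decide_camera_conditions(cameras, current_dir, next_dir):
    """cameras maps a facing ('front', 'right', ...) to a device name."""
    needed = {current_dir, next_dir}
    return {
        facing: (OBJECT_DETECTION if facing in needed else POWER_SAVING)
        for facing in cameras
    }

# Scene SN1 -> SN2: just before the turn from direction A to A', the front
# camera 111a and the right camera 111b become active; 111c and 111d save power.
conditions = decide_camera_conditions(
    {"front": "111a", "right": "111b", "rear": "111c", "left": "111d"},
    current_dir="front",
    next_dir="right",
)
```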
 As described above, the information processing device 10A according to the present embodiment controls the detection condition of each of the plurality of sensors of the sensor unit 110 mounted on the flying body 100, based on the control information for controlling the flying body 100 and the environment information indicating the surroundings of the flying body 100. The information processing device 10A then acquires the sensor information detected by the plurality of sensors under those detection conditions. This allows the information processing device 10A to have each of the plurality of sensors of the flying body 100 perform detection under an individual detection condition corresponding to the control information and the environment information of the flying body 100. As a result, the information processing device 10A can suppress the power consumed by the plurality of sensors mounted on the flying body 100 and can also suppress the amount of sensor information data to be acquired.
 Further, in the information processing device 10A, the control information is information for controlling the flying body 100 based on the action plan of the flying body 100. This allows the information processing device 10A to control the detection conditions of the plurality of sensors of the sensor unit 110 based on the control information obtained immediately before the flying body 100 is controlled. Since the detection conditions of the sensors can thus be changed immediately before the flying body 100 operates, the power consumption of the flying body 100 can be suppressed even further. In addition, by having the sensors perform detection under detection conditions suited to the operation of the flying body 100, the information processing device 10A can perform control without reducing the safety of the flying body 100.
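 The later-described configurations (5), (7), and (8) also vary the frame rate with the traveling direction, the speed, and whether an object lies ahead. A minimal sketch of such a policy might look as follows; the base rates and scaling factors are assumptions, not values from the disclosure.

```python
# Hypothetical frame-rate policy: raise the rate for the traveling direction
# and with speed, and lower it when no object exists ahead.

def frame_rate_hz(is_traveling_direction, speed_mps, object_ahead,
                  base_hz=10.0, max_hz=60.0):
    if not is_traveling_direction:
        return base_hz                 # other directions stay at the base rate
    rate = base_hz + 5.0 * speed_mps   # higher speed -> higher frame rate
    if not object_ahead:
        rate *= 0.5                    # nothing ahead -> the rate can drop
    return min(rate, max_hz)
```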
 In the embodiments above, the information processing devices 10 and 10A are described as being mounted on the flying body 100, but the present disclosure is not limited to this. For example, the information processing devices 10 and 10A may be provided outside the flying body 100 so as to be able to communicate with it, acquiring sensor information through communication with the flying body 100 via the communication unit 120 and remotely controlling the detection conditions of the plurality of sensors of the flying body 100.
 Also, in the embodiments above, the moving body is described as the flying body 100, but the moving body is not limited to this. The moving body may be, for example, any autonomously moving body that needs to monitor its own surroundings, such as a vehicle (a motorcycle, a four-wheeled vehicle, a bicycle, or the like), a robot, or a cart.
[Hardware Configuration]
 The information processing devices according to the embodiments described above are realized by, for example, a computer 1000 configured as shown in FIG. 12. The information processing device 10 according to the embodiment is described below as an example. FIG. 12 is a hardware configuration diagram illustrating an example of the computer 1000 that realizes the functions of the information processing device 10. The computer 1000 includes a CPU 1100, a RAM 1200, a ROM (Read Only Memory) 1300, an HDD (Hard Disk Drive) 1400, a communication interface 1500, and an input/output interface 1600. The units of the computer 1000 are connected to one another by a bus 1050.
 The CPU 1100 operates based on programs stored in the ROM 1300 or the HDD 1400 and controls each unit. For example, the CPU 1100 loads a program stored in the ROM 1300 or the HDD 1400 into the RAM 1200 and executes the processing corresponding to each of the various programs.
 The ROM 1300 stores a boot program such as the BIOS (Basic Input Output System) executed by the CPU 1100 when the computer 1000 starts up, programs that depend on the hardware of the computer 1000, and the like.
 The HDD 1400 is a computer-readable recording medium that non-transitorily records the programs executed by the CPU 1100, the data used by those programs, and the like. Specifically, the HDD 1400 is a recording medium that records the program according to the present disclosure, which is an example of the program data 1450.
 The communication interface 1500 is an interface for connecting the computer 1000 to an external network 1550 (for example, the Internet). For example, the CPU 1100 receives data from other devices and transmits data generated by the CPU 1100 to other devices via the communication interface 1500.
 The input/output interface 1600 is an interface for connecting an input/output device 1650 to the computer 1000. For example, the CPU 1100 receives data from an input device such as a keyboard or a mouse via the input/output interface 1600. The CPU 1100 also transmits data to an output device such as a display, a speaker, or a printer via the input/output interface 1600. The input/output interface 1600 may also function as a media interface for reading a program or the like recorded on a predetermined recording medium. The medium is, for example, an optical recording medium such as a DVD (Digital Versatile Disc), a magneto-optical recording medium such as an MO (Magneto-Optical disk), a tape medium, a magnetic recording medium, or a semiconductor memory.
 For example, when the computer 1000 functions as the information processing device 10 according to the embodiment, the CPU 1100 of the computer 1000 realizes the functions of the acquisition unit 11, the state recognition unit 12, the environment recognition unit 13, the map creation unit 14, the sensor control unit 16, the operation control unit 17, and so on by executing the program loaded onto the RAM 1200. The HDD 1400 also stores the program according to the present disclosure and the data of the storage unit 15. Note that although the CPU 1100 reads the program data 1450 from the HDD 1400 and executes it, as another example these programs may be acquired from another device via the external network 1550.
 The preferred embodiments of the present disclosure have been described above in detail with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to these examples. It is obvious that a person having ordinary knowledge in the technical field of the present disclosure can conceive of various changes or modifications within the scope of the technical ideas described in the claims, and it is understood that these also naturally belong to the technical scope of the present disclosure.
 The effects described in this specification are merely explanatory or illustrative and are not limiting. That is, the technology according to the present disclosure may exhibit other effects that are apparent to those skilled in the art from the description of this specification, in addition to or instead of the effects described above.
 It is also possible to create a program for causing hardware such as the CPU, ROM, and RAM built into a computer to exhibit functions equivalent to the configuration of the information processing device 10, and a computer-readable recording medium on which the program is recorded may also be provided.
 Each step of the processing of the information processing device described in this specification does not necessarily have to be processed in time series in the order described in the flowcharts. For example, the steps of the processing of the information processing device may be processed in an order different from the order described in the flowcharts, or may be processed in parallel.
 The following configurations also belong to the technical scope of the present disclosure.
(1)
 An information processing device comprising:
 a sensor control unit that controls a detection condition of each of a plurality of sensors mounted on a moving body, based on control information for controlling the moving body and environment information indicating the surroundings of the moving body; and
 an acquisition unit that acquires sensor information detected by each of the plurality of sensors under the detection condition.
(2)
 The information processing device according to (1), further comprising a creation unit that creates map information of the surroundings of the moving body based on the sensor information acquired by the acquisition unit,
 wherein the sensor control unit controls the detection condition of each of the plurality of sensors based on the control information, the environment information, and the map information.
(3)
 The information processing device according to (2), further comprising:
 a state recognition unit that recognizes at least one of the speed and the traveling direction of the moving body based on the control information; and
 an environment recognition unit that recognizes the environment information based on the sensor information acquired by the acquisition unit,
 wherein the sensor control unit controls the detection condition of each of the plurality of sensors based on at least one of the speed and the traveling direction of the moving body recognized by the state recognition unit and the environment information recognized by the environment recognition unit.
(4)
 The information processing device according to (2) or (3), wherein the plurality of sensors include a plurality of imaging devices provided to image different regions around the moving body, and
 the sensor control unit controls an imaging condition of each of the plurality of imaging devices based on the control information and the environment information.
(5)
 The information processing device according to (4), wherein the sensor control unit raises the frame rate of the imaging device that images the traveling direction of the moving body above the frame rate of the imaging devices that image other directions.
(6)
 The information processing device according to (4) or (5), wherein the sensor control unit controls the imaging condition of each of the plurality of imaging devices based on the environment information in the traveling direction of the moving body.
(7)
 The information processing device according to any one of (4) to (6), wherein, when no object exists in the traveling direction of the moving body, the sensor control unit lowers the frame rate of the imaging device that images the traveling direction below the frame rate used when an object exists in the traveling direction.
(8)
 The information processing device according to any one of (4) to (7), wherein the sensor control unit performs control to raise the frame rate of the imaging device when the speed of the moving body increases, and performs control to lower the frame rate of the imaging device when the speed of the moving body decreases.
(9)
 The information processing device according to (2), wherein the sensor control unit controls the detection conditions of the plurality of sensors based on the degree of construction of the map information.
(10)
 The information processing device according to (4), wherein, when the moving body is moved to create the map information, the sensor control unit controls the imaging conditions so as to give priority to the imaging device that images the direction necessary for creating the map information.
(11)
 The information processing device according to (4), wherein the sensor control unit controls the imaging condition of the imaging device that images the traveling direction of the moving body based on attribute information of the imaging region indicated by the sensor information of that imaging device.
(12)
 The information processing device according to (6), wherein, when the sensor information includes information indicating the attitude of the moving body, the sensor control unit performs control to change the imaging region of the imaging device necessary for detecting obstacles based on that attitude.
(13)
 The information processing device according to any one of (1) to (12), further comprising an operation control unit that controls the operation of the moving body based on the sensor information acquired by the acquisition unit.
(14)
 The information processing device according to any one of (1) to (13), wherein the control information is information for controlling the moving body based on an action plan of the moving body.
(15)
 An information processing method comprising, by a computer:
 controlling a detection condition of each of a plurality of sensors mounted on a moving body, based on control information for controlling the moving body and environment information indicating the surroundings of the moving body; and
 acquiring sensor information detected by each of the plurality of sensors under the detection condition.
(16)
 A program for causing a computer to function as:
 a sensor control unit that controls a detection condition of each of a plurality of sensors mounted on a moving body, based on control information for controlling the moving body and environment information indicating the surroundings of the moving body; and
 an acquisition unit that acquires sensor information detected by each of the plurality of sensors under the detection condition.
(17)
 A moving body comprising a plurality of sensors and an information processing device,
 wherein the information processing device includes:
 a sensor control unit that controls a detection condition of each of the plurality of sensors based on control information for controlling the moving body and environment information indicating the surroundings of the moving body; and
 an acquisition unit that acquires sensor information detected by each of the plurality of sensors under the detection condition.
 10, 10A Information processing device
 11 Acquisition unit
 12 State recognition unit
 13 Environment recognition unit
 14 Map creation unit
 15 Storage unit
 16 Sensor control unit
 17 Operation control unit
 100 Flying body
 110 Sensor unit
 111 Imaging device
 111a Imaging device
 111b Imaging device
 111c Imaging device
 111d Imaging device
 120 Communication unit
 130 Drive unit
 A Traveling direction

Claims (16)

  1.  An information processing device comprising:
     a sensor control unit that controls a detection condition of each of a plurality of sensors mounted on a moving body, based on control information for controlling the moving body and environment information indicating the surroundings of the moving body; and
     an acquisition unit that acquires sensor information detected by each of the plurality of sensors under the detection condition.
  2.  The information processing device according to claim 1, further comprising a map creation unit that creates map information of the surroundings of the moving body based on the sensor information acquired by the acquisition unit,
     wherein the sensor control unit controls the detection condition of each of the plurality of sensors based on the control information, the environment information, and the map information.
  3.  The information processing device according to claim 2, further comprising:
     a state recognition unit that recognizes at least one of the speed and the traveling direction of the moving body based on the control information; and
     an environment recognition unit that recognizes the environment information based on the sensor information acquired by the acquisition unit,
     wherein the sensor control unit controls the detection condition of each of the plurality of sensors based on at least one of the speed and the traveling direction of the moving body recognized by the state recognition unit and the environment information recognized by the environment recognition unit.
  4.  The information processing device according to claim 2, wherein the plurality of sensors include a plurality of imaging devices provided to image different regions around the moving body, and
     the sensor control unit controls an imaging condition of each of the plurality of imaging devices based on the control information and the environment information.
  5.  The information processing device according to claim 4, wherein the sensor control unit raises the frame rate of the imaging device that images the traveling direction of the moving body above the frame rate of the imaging devices that image other directions.
  6.  The information processing device according to claim 4, wherein the sensor control unit controls the imaging condition of each of the plurality of imaging devices based on the environment information in the traveling direction of the moving body.
  7.  The information processing device according to claim 4, wherein, when no object exists in the traveling direction of the moving body, the sensor control unit lowers the frame rate of the imaging device that images the traveling direction below the frame rate used when an object exists in the traveling direction.
  8.  The information processing device according to claim 4, wherein the sensor control unit performs control to raise the frame rate of the imaging device when the speed of the moving body increases, and performs control to lower the frame rate of the imaging device when the speed of the moving body decreases.
  9.  The information processing device according to claim 2, wherein the sensor control unit controls the detection conditions of the plurality of sensors based on the degree of construction of the map information.
  10.  The information processing device according to claim 4, wherein, when the moving body is moved to create the map information, the sensor control unit controls the imaging conditions so as to give priority to the imaging device that images the direction necessary for creating the map information.
  11.  The information processing device according to claim 4, wherein the sensor control unit controls the imaging condition of the imaging device that images the traveling direction of the moving body based on attribute information of the imaging region indicated by the sensor information of that imaging device.
  12.  The information processing device according to claim 6, wherein, when the sensor information includes information indicating the attitude of the moving body, the sensor control unit performs control to change the imaging region of the imaging device necessary for detecting obstacles based on that attitude.
  13.  The information processing device according to claim 1, further comprising an operation control unit that controls the operation of the moving body based on the sensor information acquired by the acquisition unit.
  14.  The information processing device according to claim 1, wherein the control information is information for controlling the moving body based on an action plan of the moving body.
  15.  An information processing method comprising, by a computer:
     controlling a detection condition of each of a plurality of sensors mounted on a moving body, based on control information for controlling the moving body and environment information indicating the surroundings of the moving body; and
     acquiring sensor information detected by each of the plurality of sensors under the detection condition.
  16.  A program for causing a computer to function as:
     a sensor control unit that controls a detection condition of each of a plurality of sensors mounted on a moving body, based on control information for controlling the moving body and environment information indicating the surroundings of the moving body; and
     an acquisition unit that acquires sensor information detected by each of the plurality of sensors under the detection condition.
PCT/JP2019/044630 2019-01-09 2019-11-14 Information processing device, information processing method, and program WO2020144936A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019001946 2019-01-09
JP2019-001946 2019-01-09

Publications (1)

Publication Number Publication Date
WO2020144936A1 true WO2020144936A1 (en) 2020-07-16

Family

ID=71521146

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/044630 WO2020144936A1 (en) 2019-01-09 2019-11-14 Information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2020144936A1 (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6324649B2 (en) * 1983-03-30 1988-05-21 Konuma Nogyo Kyodokumiai
JP2009027564A (en) * 2007-07-20 2009-02-05 Fujifilm Corp Image processor, image processing method, and program
JP2013510377A (en) * 2009-11-06 2013-03-21 エボリューション ロボティックス インコーポレイテッド Method and system for completely covering a surface with an autonomous robot
JP2017516415A (en) * 2014-05-21 2017-06-15 クアルコム,インコーポレイテッド System and method for determining image resolution
JP2017111606A (en) * 2015-12-16 2017-06-22 カシオ計算機株式会社 Autonomous mobile apparatus, autonomous mobile method, and program
JP2018019182A (en) * 2016-07-26 2018-02-01 トヨタ自動車株式会社 Remote maneuver system of mobile
US20180032042A1 (en) * 2016-08-01 2018-02-01 Qualcomm Incorporated System And Method Of Dynamically Controlling Parameters For Processing Sensor Output Data
US20180322348A1 (en) * 2017-05-02 2018-11-08 Qualcomm Incorporated Dynamic sensor operation and data processing based on motion information

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114494848A (en) * 2021-12-21 2022-05-13 重庆特斯联智慧科技股份有限公司 Robot sight distance path determining method and device
CN114494848B (en) * 2021-12-21 2024-04-16 重庆特斯联智慧科技股份有限公司 Method and device for determining vision path of robot

Similar Documents

Publication Publication Date Title
US11726498B2 (en) Aerial vehicle touchdown detection
US10776939B2 (en) Obstacle avoidance system based on embedded stereo vision for unmanned aerial vehicles
US20220234733A1 (en) Aerial Vehicle Smart Landing
US11760484B2 (en) Detecting optical discrepancies in captured images
US10802509B2 (en) Selective processing of sensor data
JP2023090817A (en) Flying body control device, flying body control method and program
KR101350242B1 (en) Method and apparatus for searching a landing site of aircraft using a depth map
JPWO2019098082A1 (en) Control devices, control methods, programs, and mobiles
JP2020079997A (en) Information processing apparatus, information processing method, and program
KR20220020804A (en) Information processing devices and information processing methods, and programs
KR20200136398A (en) Exposure control device, exposure control method, program, photographing device, and moving object
WO2021131685A1 (en) Information processing apparatus, information processing method, and program
EP3682306B1 (en) Action plan generation when the own position is unknown
WO2020144936A1 (en) Information processing device, information processing method, and program
JP6900029B2 (en) Unmanned aerial vehicle, position estimation device, flight control device, position estimation method, control method and program
JP7044147B2 (en) Information processing equipment, information processing methods, and information processing programs
CN116540776B (en) Unmanned aerial vehicle vision obstacle avoidance method and system
JP2021154857A (en) Operation support device, operation support method, and program
Elawady et al. Detecting and avoiding frontal obstacles from monocular camera for micro unmanned aerial vehicles
WO2022004385A1 (en) Information processing device, information processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19909105

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19909105

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP